WO2024029634A1 - Broadcast receiving device, content protection method, processing method for adding reverberation sound, and control method for broadcast receiving device - Google Patents

Info

Publication number
WO2024029634A1
WO2024029634A1 (PCT/JP2023/029279, JP2023029279W)
Authority
WO
WIPO (PCT)
Prior art keywords
broadcast
signal
broadcast receiving
receiving device
layer
Prior art date
Application number
PCT/JP2023/029279
Other languages
English (en)
Japanese (ja)
Inventor
信夫 益岡
拓也 清水
康宣 橋本
和彦 吉澤
仁 秋山
展明 甲
Original Assignee
マクセル株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2022123799A
Priority claimed from JP2022151952A
Priority claimed from JP2022181025A
Application filed by マクセル株式会社
Publication of WO2024029634A1

Classifications

    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10KSOUND-PRODUCING DEVICES; METHODS OR DEVICES FOR PROTECTING AGAINST, OR FOR DAMPING, NOISE OR OTHER ACOUSTIC WAVES IN GENERAL; ACOUSTICS NOT OTHERWISE PROVIDED FOR
    • G10K15/00Acoustics not otherwise provided for
    • G10K15/02Synthesis of acoustic waves
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10KSOUND-PRODUCING DEVICES; METHODS OR DEVICES FOR PROTECTING AGAINST, OR FOR DAMPING, NOISE OR OTHER ACOUSTIC WAVES IN GENERAL; ACOUSTICS NOT OTHERWISE PROVIDED FOR
    • G10K15/00Acoustics not otherwise provided for
    • G10K15/08Arrangements for producing a reverberation or echo sound
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04HBROADCAST COMMUNICATION
    • H04H20/00Arrangements for broadcast or for distribution combined with broadcast
    • H04H20/53Arrangements specially adapted for specific applications, e.g. for traffic information or for mobile receivers
    • H04H20/59Arrangements specially adapted for specific applications, e.g. for traffic information or for mobile receivers for emergency or urgency
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04HBROADCAST COMMUNICATION
    • H04H40/00Arrangements specially adapted for receiving broadcast information
    • H04H40/18Arrangements characterised by circuits or components specially adapted for receiving
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/233Processing of audio elementary streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/439Processing of audio elementary streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/485End-user interface for client configuration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/835Generation of protective data, e.g. certificates
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04SSTEREOPHONIC SYSTEMS 
    • H04S3/00Systems employing more than two channels, e.g. quadraphonic
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04SSTEREOPHONIC SYSTEMS 
    • H04S7/00Indicating arrangements; Control arrangements, e.g. balance control

Definitions

  • the present invention relates to a broadcast receiving device, a content protection method, a reverberation sound addition processing method, and a control method for a broadcast receiving device.
  • Digital broadcasting services began in various countries in the late 1990s to replace conventional analog broadcasting services.
  • Digital broadcasting services have brought improvements such as better broadcast quality through error-correction technology, multi-channel and HD (High Definition) broadcasting through compression-coding technology, and data broadcasting services using BML (Broadcast Markup Language) and HTML5 (HyperText Markup Language version 5).
  • As a technology for realizing UHD broadcasting in digital broadcasting services, there is the system described in Patent Document 1.
  • However, the system described in Patent Document 1 is intended to replace current digital broadcasting, and does not take into consideration maintaining the viewing environment of the current digital broadcasting service.
  • An object of the present invention is to provide a technology for more appropriately transmitting or receiving advanced, higher-functionality digital broadcasting services while taking compatibility with current digital broadcasting services into consideration.
  • As an example, a broadcast receiving device capable of receiving a signal for each sound source via broadcast waves may be used, the device including: a broadcast receiving section that receives the broadcast waves; an audio output unit that includes a plurality of speakers and outputs audio according to the signal for each sound source; and a control unit.
  • The control unit determines the playback position of the audio based on the signal for each sound source according to the arrangement information of the plurality of speakers, generates a signal corresponding to a 22.2ch audio channel based on the signal for each sound source, and converts the signal corresponding to the 22.2ch audio channel into a signal for the audio output unit.
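The two-stage audio path described above (per-sound-source signals rendered into an intermediate 22.2ch representation, then converted to the signals actually driven by the available speakers) can be sketched roughly as follows. This is an illustrative assumption only: the channel count, gains, and mixing matrix are placeholders, not values defined by the broadcast specification.

```python
# Hypothetical sketch: render object (per-sound-source) signals into an
# intermediate channel bed, then convert the bed to speaker signals with
# a coefficient matrix. All numeric values here are illustrative.

def render_objects_to_channels(objects, num_channels=24):
    """Mix object signals into an intermediate channel bed.

    objects: list of (samples, gains), where gains has one entry per
    intermediate channel, derived from the object's playback position.
    """
    length = max(len(samples) for samples, _ in objects)
    bed = [[0.0] * length for _ in range(num_channels)]
    for samples, gains in objects:
        for ch, g in enumerate(gains):
            if g == 0.0:
                continue
            for i, x in enumerate(samples):
                bed[ch][i] += g * x
    return bed

def downmix(bed, matrix):
    """Convert intermediate channels to speaker signals.

    matrix[out][ch] is the coefficient from intermediate channel ch to
    output speaker out (e.g. a 22.2ch-to-stereo conversion table).
    """
    length = len(bed[0])
    out = [[0.0] * length for _ in matrix]
    for o, row in enumerate(matrix):
        for ch, c in enumerate(row):
            if c == 0.0:
                continue
            for i, x in enumerate(bed[ch]):
                out[o][i] += c * x
    return out
```

In this sketch the gains passed per object stand in for the playback position that the control unit derives from the speaker arrangement information; a real receiver would compute them from metadata and the speaker layout.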
  • FIG. 1 is a system configuration diagram of a broadcasting system according to an embodiment of the present invention.
  • FIG. 1 is a block diagram of a broadcast receiving device according to an embodiment of the present invention.
  • FIG. 2 is a detailed block diagram of a first tuner/demodulator of a broadcast receiving device according to an embodiment of the present invention.
  • FIG. 3 is a detailed block diagram of a second tuner/demodulator of a broadcast receiving device according to an embodiment of the present invention.
  • FIG. 3 is a detailed block diagram of a third tuner/demodulator of a broadcast receiving device according to an embodiment of the present invention.
  • FIG. 3 is a detailed block diagram of a fourth tuner/demodulator of a broadcast receiving device according to an embodiment of the present invention.
  • FIG. 1 is a block diagram of a broadcast receiving device according to an embodiment of the present invention.
  • FIG. 2 is a detailed block diagram of a first tuner/demodulator of a broadcast receiving device according to an embodiment of the present invention.
  • FIG. 1 is a software configuration diagram of a broadcast receiving device according to an embodiment of the present invention.
  • FIG. 1 is a configuration diagram of a broadcasting station server according to an embodiment of the present invention.
  • FIG. 1 is a configuration diagram of a service provider server according to an embodiment of the present invention.
  • FIG. 1 is a block diagram of a portable information terminal according to an embodiment of the present invention.
  • FIG. 1 is a software configuration diagram of a portable information terminal according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a segment configuration related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating layer allocation in layered transmission related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating the generation process of OFDM transmission waves related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating the basic configuration of a transmission line encoding unit related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating segment parameters of an OFDM system related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating transmission signal parameters related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating the arrangement of pilot signals of synchronous modulation segments in digital broadcasting according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating the arrangement of pilot signals of differential modulation segments in digital broadcasting according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating bit allocation of a TMCC carrier related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating bit allocation of TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating transmission parameter information of TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating system identification of TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a carrier modulation mapping method for TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating frequency conversion processing identification of TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating physical channel number identification of TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating main signal identification of TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating 4K signal transmission layer identification of TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating additional layer transmission identification of TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating identification of a coding rate of an inner code of TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating bit allocation of an AC signal related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating configuration identification of an AC signal related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating seismic motion warning information of an AC signal related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating signal identification of seismic motion warning information of an AC signal related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating additional information regarding transmission control of a modulated wave of an AC signal related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating additional transmission parameter information of an AC signal related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating an AC signal error correction method for digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a constellation format of an AC signal related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a dual polarization transmission system according to an embodiment of the present invention.
  • 1 is a system configuration diagram of a broadcasting system using a dual polarization transmission system according to an embodiment of the present invention.
  • 1 is a system configuration diagram of a broadcasting system using a dual polarization transmission system according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating frequency conversion processing according to an embodiment of the present invention.
  • 1 is a diagram illustrating the configuration of a pass-through transmission method according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a pass-through transmission band according to an embodiment of the present invention.
  • 1 is a diagram illustrating the configuration of a pass-through transmission method according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a pass-through transmission band according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a pass-through transmission band according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a single polarization transmission method according to an embodiment of the present invention.
  • 1 is a system configuration diagram of a broadcasting system using a single polarization transmission method according to an embodiment of the present invention.
  • 1 is a system configuration diagram of a broadcasting system using a single polarization transmission method according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a layer division multiplex transmission system according to an embodiment of the present invention.
  • FIG. 1 is a system configuration diagram of a broadcasting system using a hierarchical division multiplex transmission system according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating frequency conversion amplification processing according to an embodiment of the present invention.
  • 1 is a system configuration diagram of a broadcasting system using a hierarchical division multiplex transmission system according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a protocol stack of MPEG-2 TS.
  • FIG. 2 is a diagram explaining the names and functions of tables used in MPEG-2 TS.
  • FIG. 2 is a diagram explaining the names and functions of tables used in MPEG-2 TS.
  • FIG. 2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS.
  • FIG. 2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS.
  • FIG. 2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS.
  • FIG. 2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS.
  • FIG. 2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS.
  • FIG. 2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS.
  • FIG. 2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS.
  • FIG. 2 is a diagram illustrating a protocol stack in an MMT broadcast transmission path.
  • FIG. 2 is a diagram illustrating a protocol stack in an MMT communication line.
  • FIG. 2 is a diagram explaining the names and functions of tables used in TLV-SI of MMT.
  • FIG. 2 is a diagram illustrating the names and functions of descriptors used in MMT TLV-SI.
  • FIG. 2 is a diagram illustrating the names and functions of messages used in MMT-SI of MMT.
  • FIG. 2 is a diagram illustrating the names and functions of tables used in MMT-SI of MMT.
  • FIG. 2 is a diagram illustrating the names and functions of descriptors used in MMT-SI of MMT.
  • FIG. 2 is a diagram illustrating the names and functions of descriptors used in MMT-SI of MMT.
  • FIG. 2 is a diagram illustrating the names and functions of descriptors used in MMT-SI of MMT.
  • FIG. 3 is a diagram illustrating the relationship between MMT data transmission and each table.
  • FIG. 3 is an operation sequence diagram of channel setting processing of the broadcast receiving device according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating the data structure of a network information table.
  • FIG. 2 is a diagram illustrating the data structure of a ground distribution system descriptor.
  • FIG. 3 is a diagram illustrating the data structure of a service list descriptor.
  • FIG. 3 is a diagram illustrating the data structure of a TS information descriptor.
  • FIG. 1 is an external view of a remote controller according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a banner display when selecting a channel according to an embodiment of the present invention.
  • This is a diagram explaining a speaker arrangement.
  • This is a diagram explaining a speaker arrangement.
  • This is a diagram explaining a speaker arrangement.
  • This is a diagram explaining a speaker arrangement.
  • FIG. 3 is a diagram illustrating the positional relationship when headphones are used.
  • FIG. 3 is a diagram illustrating the positional relationship when headphones are used.
  • This is an example of the configuration of an audio decoder for an audio signal consisting of only channel-based signals.
  • This is an example of the configuration of an audio decoder for advanced audio signals.
  • This is an example of placement information of a speaker system.
  • This is the default value of the downmix coefficient from the 22.2ch signal to the 5.1ch signal.
  • This is the default value of the downmix coefficient from a 5.1ch signal to a 2ch signal.
  • This is an example of a screen for selecting a speaker system used for audio reproduction.
  • FIG. 3 is a diagram illustrating metadata of an object-based signal.
  • Metadata specifies the playback position of an object-based signal.
  • Metadata specifies the playback position of an object-based signal.
  • Metadata specifies the playback position of an object-based signal.
  • This is an example of a screen for selecting a playback position of an object-based signal.
  • This is an example of a screen for setting the playback position of an object-based signal.
  • Metadata specifies the playback position of an object-based signal.
  • Stream data specifies the playback position of an object-based signal.
  • Stream data specifies the playback position of an object-based signal.
  • Stream data specifies the playback position of an object-based signal.
  • Stream data specifies the playback position of an object-based signal.
  • Stream data specifies the playback position of an object-based signal.
  • FIG. 3 is a diagram of a selection screen for selecting an audio signal for each output device.
  • FIG. 2 is a diagram illustrating audio playback in a cooperating device.
  • FIG. 3 is a diagram illustrating parameters describing the number of audio signals to be transmitted and the acquisition destination.
  • FIG. 3 is a diagram illustrating the data structure of an audio component descriptor.
  • FIG. 3 is a diagram illustrating audio component type data.
  • This is an example in which transmitted audio signals are displayed in an electronic program guide.
  • This is an example in which transmitted audio signals are displayed in an electronic program guide.
  • This is an example of displaying a signal source and an output device.
  • FIG. 3 is a diagram illustrating an example of control of content protection processing according to the present embodiment.
  • FIG. 3 is a diagram of control of content protection processing according to the present embodiment.
  • FIG. 3 is a diagram illustrating an example of control of content protection processing according to the present embodiment.
  • FIG. 3 is a diagram illustrating an example of control of content protection processing according to the present embodiment.
  • FIG. 3 is a diagram illustrating an example of control of content protection processing according to the present embodiment.
  • FIG. 6 is a diagram illustrating an example of a reverberation sound processing flow when using headphones.
  • FIG. 3 is a diagram showing an example of an audio output setting menu.
  • FIG. 7 is a diagram showing an example of a detailed menu for reverberation sound settings.
  • FIG. 3 is a diagram illustrating an example of a banner display indicating a reverberation processing state.
  • FIG. 7 is a diagram illustrating an example of a banner display in partially enlarged display of a broadcast image.
  • FIG. 3 is a diagram illustrating an example of control of content protection processing according to the present embodiment.
  • FIG. 3 is a diagram illustrating an example of control of content protection processing according to the present embodiment.
  • FIG. 6 is a diagram for explaining viewing conditions before and after partially enlarged display of a broadcast image.
  • 5 is a flowchart illustrating a sound image localization mode switching operation according to the present embodiment.
  • FIG. 2 is a diagram showing an example of the appearance of the remote controller of the present embodiment.
  • FIG. 7 is a diagram showing an example of a banner display for setting the sound image localization mode of the present embodiment.
  • 12 is a flowchart showing an operation for correcting ⁇ when the sound image localization mode of the present modification is non-fixed.
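Several of the figures above concern default downmix coefficients (22.2ch to 5.1ch, and 5.1ch to 2ch). A cascaded downmix of that kind can be sketched as below; the coefficient tables and channel counts are invented placeholders for illustration, not the defaults defined by the broadcast standard.

```python
# Illustrative two-stage downmix: the channel bed is reduced with one
# coefficient table (e.g. 22.2ch -> 5.1ch), then the result is reduced
# again with a second table (e.g. 5.1ch -> 2ch). Tables are placeholders.

def apply_downmix(channels, table):
    """table is a list with one entry per output channel; each entry is a
    dict {input_channel_index: coefficient}."""
    length = len(channels[0])
    out = []
    for coeffs in table:
        mixed = [0.0] * length
        for ch, c in coeffs.items():
            for i, x in enumerate(channels[ch]):
                mixed[i] += c * x
        out.append(mixed)
    return out

def cascade_downmix(channels, first_table, second_table):
    """Apply two downmix stages in sequence (e.g. 22.2 -> 5.1 -> 2)."""
    return apply_downmix(apply_downmix(channels, first_table), second_table)
```

Representing each output channel as a sparse coefficient dict mirrors how downmix tables are usually published: most input channels contribute zero to a given output.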
  • FIG. 1 is a system configuration diagram showing an example of the configuration of a broadcasting system.
  • The broadcasting system includes, for example, a broadcast receiving device 100 and an antenna 200, a radio tower 300 and a broadcasting station server 400 of a broadcasting station, a service provider server 500, a mobile phone communication server 600 and a base station 600B of a mobile phone communication network, a mobile information terminal 700, a broadband network 800 such as the Internet, a router device 800R, an information server 900, headphones 910, and an HMD (Head Mounted Display) 920. Note that the broadcasting system may include only one of the headphones 910 and the HMD 920, or may include a speaker system (not shown) instead of them. Further, various server devices and communication devices may be connected to the Internet 800.
  • The broadcast receiving device 100 is a television receiver equipped with a reception function for advanced digital broadcasting services. The broadcast receiving device 100 may further include a reception function for existing digital broadcasting services. Furthermore, it is compatible with broadcast-communication cooperation systems that combine a digital broadcasting service (an existing or advanced digital broadcasting service) with functions that use the broadband network, such as acquiring additional content via the broadband network, performing calculation processing on server devices, and performing presentation processing in cooperation with mobile terminal devices. The broadcast receiving device 100 receives digital broadcast waves transmitted from the radio tower 300 via the antenna 200. The digital broadcast waves may be transmitted directly from the radio tower 300 to the antenna 200, or may be transmitted via a broadcasting satellite, a communication satellite, etc. (not shown). A broadcast signal retransmitted by a cable television station may also be received via a cable line or the like. Further, the broadcast receiving device 100 can be connected to the Internet 800 via the router device 800R, and can transmit and receive data through communication with each server device on the Internet 800.
  • the router device 800R is connected to the Internet 800 by wireless or wired communication, and is also connected to the broadcast receiving device 100 by wired communication and to the mobile information terminal 700 by wireless communication. This enables each server device, broadcast receiving device 100, and portable information terminal 700 on the Internet 800 to mutually transmit and receive data via the router device 800R.
  • the router device 800R, the broadcast receiving device 100, and the mobile information terminal 700 constitute a LAN (Local Area Network). Note that communication between the broadcast receiving device 100 and the mobile information terminal 700 may be performed directly using a method such as Bluetooth (registered trademark) or NFC (Near Field Communication) without going through the router device 800R.
  • the radio tower 300 is a broadcasting facility of a broadcasting station, and transmits digital broadcast waves containing various control information related to digital broadcasting services, content data of broadcast programs (video content, audio content, etc.), and the like.
  • the broadcast station also includes a broadcast station server 400.
  • Broadcasting station server 400 stores content data of broadcast programs and metadata of each broadcast program, such as program title, program ID, program summary, performers, broadcast date and time, and the like.
  • the broadcasting station server 400 provides the content data and metadata to the service provider based on a contract. Content data and metadata are provided to the service provider through an API (Application Programming Interface) included in the broadcast station server 400.
  • the service provider server 500 is a server device prepared by a service provider to provide services using a broadcasting and communication cooperation system.
  • The service provider server 500 stores, manages, and distributes content data and metadata provided by the broadcasting station server 400, as well as content data and applications (operating programs and/or various data, etc.) created for the broadcast-communication cooperation system. It also has a function of searching for available applications and providing a list of them in response to inquiries from television receivers. Note that the storage, management, distribution, etc. of the content data and metadata and of the applications may be performed by different server devices.
  • the broadcasting station and the service provider may be the same or different providers.
  • a plurality of service provider servers 500 may be provided for different services.
  • the functions of the service provider server 500 may also be provided by the broadcasting station server 400.
  • The mobile telephone communication server 600 is connected to the Internet 800, and is also connected to the mobile information terminal 700 via the base station 600B.
  • The mobile phone communication server 600 manages telephone communication (calls) and data transmission/reception over the mobile phone communication network for the mobile information terminal 700, and enables the mobile information terminal 700 to transmit and receive data through communication with each server device on the Internet 800.
  • communication between the mobile information terminal 700 and the broadcast receiving device 100 may be performed via the base station 600B, the mobile telephone communication server 600, the Internet 800, and the router device 800R.
  • the information server 900 is a server device that provides information such as the sound field environment of a concert hall or theater.
  • The information server 900 provides information to supplement sound field environment metadata when the broadcast content has no such metadata or the metadata is incomplete. For example, if the name of the theater where a performance takes place can be identified from the title or metadata of the broadcast content, information about the sound field environment of that theater can be obtained by searching the information stored in the information server based on the theater name.
  • the information server 900 may provide not only environmental information of the performance location but also a head-related transfer function that takes sound reproduction through headphones 910 into consideration.
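The reverberation-addition idea above amounts to convolving the program audio with an impulse response that describes a hall's sound field (or, for headphone playback, a head-related transfer function). The sketch below is a minimal illustration under that assumption; the impulse-response values are made up, and a real receiver would obtain them from a source such as the information server 900.

```python
# Minimal reverberation-addition sketch: convolve the dry program audio
# with a sound-field impulse response and blend it back in. Values are
# illustrative placeholders, not data from the specification.

def convolve(signal, impulse_response):
    """Direct-form convolution: out[n+k] += signal[n] * ir[k]."""
    out = [0.0] * (len(signal) + len(impulse_response) - 1)
    for n, x in enumerate(signal):
        for k, h in enumerate(impulse_response):
            out[n + k] += x * h
    return out

def add_reverberation(dry, impulse_response, wet_mix=0.3):
    """Blend the dry signal with its reverberant (convolved) version."""
    wet = convolve(dry, impulse_response)
    padded = dry + [0.0] * (len(wet) - len(dry))
    return [(1.0 - wet_mix) * d + wet_mix * w for d, w in zip(padded, wet)]
```

For binaural headphone reproduction the same convolution step would be applied per ear with left- and right-ear head-related impulse responses instead of a single room response.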
  • the sound field environment and head-related transfer functions may be provided including transfer functions that take hearing-impaired people into consideration.
  • the information server 900 may have a function of acquiring a viewer's brain waves and arranging a sound field environment and head-related transfer function suitable for the viewer.
  • the HMD 920 receives video and audio data from the broadcast receiving device 100, displays the video to the viewer, and reproduces the audio.
  • FIG. 2A is a block diagram showing an example of the internal configuration of broadcast receiving device 100.
  • The broadcast receiving apparatus 100 includes a main control section 101, a system bus 102, a ROM 103, a RAM 104, a storage section 110, a LAN communication section 121, an expansion interface section 124, a digital interface section 125, a first tuner/demodulation section 130C, and the like.
  • the main control section 101 is a microprocessor unit that controls the entire broadcast receiving apparatus 100 according to a predetermined operation program.
  • the system bus 102 is a communication path for transmitting and receiving various data, commands, etc. between the main control unit 101 and each operational block within the broadcast receiving apparatus 100.
  • the ROM (Read Only Memory) 103 is a non-volatile memory in which basic operation programs such as an operating system and other operation programs are stored. The ROM 103 also stores operation setting values and the like necessary for the operation of the broadcast receiving apparatus 100.
  • a RAM (Random Access Memory) 104 serves as a work area when basic operation programs and other operation programs are executed.
  • the ROM 103 and the RAM 104 may be integrated with the main control unit 101. Further, the ROM 103 may not have an independent configuration as shown in FIG. 2A, but may use a part of the storage area within the storage section 110.
  • the storage unit 110 stores operating programs and operating settings of the broadcast receiving device 100, personal information of the user of the broadcast receiving device 100, and the like. Further, it is possible to store operating programs downloaded via the Internet 800 and various data created using the operating programs. It is also possible to store content such as moving images, still images, audio, etc., obtained from broadcast waves or downloaded via the Internet 800. All or part of the functions of the ROM 103 may be replaced by a partial area of the storage section 110. Further, the storage unit 110 needs to hold stored information even when power is not supplied to the broadcast receiving apparatus 100 from the outside. Therefore, for example, devices such as semiconductor element memories such as flash ROMs and SSDs (Solid State Drives), and magnetic disk drives such as HDDs (Hard Disc Drives) are used.
  • each of the operating programs stored in the ROM 103 and the storage unit 110 can be added, updated, and expanded in function by downloading from each server device or broadcast wave on the Internet 800.
  • the LAN communication unit 121 is connected to the Internet 800 via the router device 800R, and sends and receives data to and from each server device and other communication devices on the Internet 800. It also acquires the content data (or part of it) of the program transmitted via the communication line.
  • the connection to the router device 800R may be a wired connection or a wireless connection such as Wi-Fi (registered trademark).
  • the LAN communication unit 121 includes an encoding circuit, a decoding circuit, and the like.
  • the broadcast receiving device 100 may further include other communication units such as a Bluetooth (registered trademark) communication unit, an NFC communication unit, an infrared communication unit, and the like.
  • the first tuner/demodulator 130C, the second tuner/demodulator 130T, the third tuner/demodulator 130L, and the fourth tuner/demodulator 130B each receive broadcast waves of a digital broadcasting service and perform tuning processing (channel selection) by tuning to a predetermined service channel based on the control of the main control unit 101. Furthermore, each performs demodulation processing and waveform shaping processing of the modulated wave of the received signal, reconstruction processing of the frame structure and hierarchical structure, energy despreading processing, error correction decoding processing, and the like, and reproduces the packet stream. Each also extracts and decodes a TMCC (Transmission and Multiplexing Configuration Control) signal from the received signal.
  • the first tuner/demodulator 130C can input digital broadcast waves of the current digital terrestrial broadcasting service received by the antenna 200C, which is the current digital terrestrial broadcast receiving antenna.
  • the first tuner/demodulator 130C can also input a broadcast signal of one polarization, either the horizontal (H) polarization signal or the vertical (V) polarization signal, of the dual-polarization terrestrial digital broadcasting described later, and demodulate the segments of the layer that uses the same modulation method as the current terrestrial digital broadcasting service.
  • the first tuner/demodulator 130C can input a broadcast signal of the single-polarized terrestrial digital broadcasting described later and demodulate the segments of the layer that uses the same modulation method as the current terrestrial digital broadcasting service.
  • the first tuner/demodulator 130C can receive a broadcast signal of the layer-division-multiplexed terrestrial digital broadcasting described later and demodulate the segments of the layer that uses the same modulation method as the current terrestrial digital broadcasting service.
  • the second tuner/demodulator 130T inputs the digital broadcast wave of the advanced digital terrestrial broadcasting service received by the antenna 200T, which is a dual polarization digital terrestrial broadcast receiving antenna, via the converter 201T. Further, the second tuner/demodulator 130T may input digital broadcast waves of the advanced digital terrestrial broadcasting service received by a single-polarized digital terrestrial broadcast receiving antenna (not shown). When the second tuner/demodulator 130T inputs the digital broadcast wave of the advanced digital terrestrial broadcasting service from a single-polarized digital terrestrial broadcast receiving antenna (not shown), the converter 201T may not be used. Note that the antenna 200T that receives digital broadcast waves of dual-polarization terrestrial digital broadcasting includes an element that receives horizontally polarized signals and an element that receives vertically polarized signals.
  • a single-polarized terrestrial digital broadcast receiving antenna includes either an element for receiving a horizontally polarized signal or an element for receiving a vertically polarized signal.
  • the single-polarized digital terrestrial broadcast reception antenna may be used in common with the antenna 200C, which is the current antenna for terrestrial digital broadcast reception.
  • the third tuner/demodulator 130L inputs the digital broadcast wave of the advanced digital terrestrial broadcasting service received by the antenna 200L, which is a hierarchical division multiplexing digital terrestrial broadcast receiving antenna, via the converter 201L.
  • the fourth tuner/demodulator 130B converts digital broadcast waves of an advanced BS (Broadcasting Satellite) digital broadcasting service or an advanced CS (Communication Satellite) digital broadcasting service received by the antenna 200B, which is a BS/CS shared receiving antenna, into a converter. 201B.
  • the antenna 200C, the antenna 200T, the antenna 200L, the antenna 200B, the converter 201T, the converter 201L, and the converter 201B do not constitute part of the broadcast receiving device 100, but belong to the equipment side, such as the building in which the broadcast receiving device 100 is installed.
  • the current terrestrial digital broadcasting described above is a broadcast signal of a terrestrial digital broadcasting service that transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels.
  • Dual-polarization terrestrial digital broadcasting is terrestrial digital broadcasting that uses multiple polarizations, horizontal (H) and vertical (V); in some of its segments, it transmits a terrestrial digital broadcasting service capable of transmitting video with a maximum resolution exceeding 1920 horizontal pixels x 1080 vertical pixels.
  • Single-polarized terrestrial digital broadcasting is terrestrial digital broadcasting that uses either horizontal (H) polarization or vertical (V) polarization; in some of its divided segments, it transmits a terrestrial digital broadcasting service capable of transmitting video with a maximum resolution exceeding 1920 horizontal pixels x 1080 vertical pixels.
  • In the dual-polarization terrestrial digital broadcasting according to each embodiment of the present invention, it is possible to simultaneously transmit, in multiple segments with different polarizations, the current terrestrial digital broadcasting service that transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels and a terrestrial digital broadcasting service capable of transmitting video with a maximum resolution exceeding 1920 horizontal pixels x 1080 vertical pixels.
  • single-polarized terrestrial digital broadcasting can transmit video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels using the same modulation method as the above-mentioned current terrestrial digital broadcasting in some of its divided segments.
  • In the single-polarized terrestrial digital broadcasting according to each embodiment of the present invention, it is possible to simultaneously transmit, in different segments, the current terrestrial digital broadcasting service that transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels and a terrestrial digital broadcasting service capable of transmitting video with a maximum resolution exceeding 1920 horizontal pixels x 1080 vertical pixels.
  • Layer division multiplexing terrestrial digital broadcasting (advanced terrestrial digital broadcasting that employs the layer division multiplexing transmission method), which will be described later, is a broadcast signal of a terrestrial digital broadcasting service capable of transmitting video with a maximum resolution exceeding 1920 horizontal pixels x 1080 vertical pixels.
  • Hierarchical division multiplexing terrestrial digital broadcasting multiplexes a plurality of digital broadcasting signals with different signal levels. Note that digital broadcast signals with different signal levels mean that the power for transmitting the digital broadcast signals is different.
  • In the hierarchical division multiplexing terrestrial digital broadcasting according to each embodiment of the present invention, it is possible to hierarchically multiplex and transmit, in the frequency band of the same physical channel and as a plurality of digital broadcast signals with different signal levels, the broadcast signal of the current terrestrial digital broadcasting service that transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels and the broadcast signal of a terrestrial digital broadcasting service capable of transmitting video with a maximum resolution exceeding 1920 horizontal pixels x 1080 vertical pixels.
  • That is, it is possible to simultaneously transmit, in multiple layers with different signal levels, the current terrestrial digital broadcasting service that transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels and a terrestrial digital broadcasting service capable of transmitting video with a maximum resolution exceeding 1920 horizontal pixels x 1080 vertical pixels.
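The superposition of broadcast signals with different power levels described above can be sketched as follows. This is a minimal illustration, not the transmission scheme of the embodiments: the 16 dB injection level and the real-valued symbol streams are assumed values chosen only to demonstrate the idea of multiplexing an upper layer and a weaker lower layer in the same channel.

```python
import math

def combine_ldm(upper, lower, injection_level_db=16.0):
    """Superpose an upper-layer (UL) and a lower-layer (LL) symbol stream.

    The LL is injected at lower power than the UL; the 16 dB injection
    level used here is an assumed illustrative value.
    """
    g = 10 ** (-injection_level_db / 20)   # LL amplitude scale factor
    norm = 1 / math.sqrt(1 + g * g)        # keep total transmit power constant
    return [norm * (u + g * l) for u, l in zip(upper, lower)]

combined = combine_ldm([1.0, -1.0], [1.0, 1.0])
```

The stronger upper layer can be decoded by legacy-class receivers, while receivers that can cancel the upper layer also recover the weaker lower layer from the same frequency band.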
  • the broadcast receiving device in each embodiment of the present invention may have any configuration as long as it can suitably receive advanced digital broadcasting; it is not essential to include all of the first tuner/demodulator 130C, the second tuner/demodulator 130T, the third tuner/demodulator 130L, and the fourth tuner/demodulator 130B. For example, it is sufficient to include at least one of the second tuner/demodulator 130T and the third tuner/demodulator 130L. Further, in order to realize more advanced functions, one or more of the other tuners/demodulators may be provided in addition to either the second tuner/demodulator 130T or the third tuner/demodulator 130L.
  • the antenna 200C, the antenna 200T, and the antenna 200L may be used in combination as appropriate. Further, among the first tuner/demodulator 130C, the second tuner/demodulator 130T, and the third tuner/demodulator 130L, a plurality of tuners/demodulators may be used in common (or integrated) as appropriate.
  • the first decoder section 140S and the second decoder section 140U each receive, as input, the packet stream output from the first tuner/demodulator 130C, the second tuner/demodulator 130T, the third tuner/demodulator 130L, or the fourth tuner/demodulator 130B, or a packet stream obtained from each server device on the Internet 800 via the LAN communication unit 121.
  • the packet streams input to the first decoder unit 140S and the second decoder unit 140U are, for example, an MPEG (Moving Picture Experts Group)-2 TS (Transport Stream), an MPEG-2 PS (Program Stream), an MMT (MPEG Media Transport) packet stream, or a TLV (Type Length Value) stream including an MMT packet stream.
  • the first decoder section 140S and the second decoder section 140U each perform, based on various control information included in the packet stream, conditional access (CA) processing, separation and extraction of video data, audio data, and various information data from the packet stream, decoding processing of the video data and audio data, acquisition of program information and EPG (Electronic Program Guide) generation processing, playback processing of data broadcasting screens and multimedia data, and the like. They also perform a process of superimposing the generated EPG and the reproduced multimedia data on the decoded video data and audio data.
  • the video selection unit 191 inputs the video data output from the first decoder unit 140S and the video data output from the second decoder unit 140U, and selects and/or superimposes the data as appropriate based on the control of the main control unit 101. Process. Further, the video selection unit 191 appropriately performs scaling processing, OSD (On Screen Display) data superimposition processing, and the like.
  • the monitor unit 192 is, for example, a display device such as a liquid crystal panel, and displays the video data selected and/or superimposed by the video selection unit 191, and provides the video data to the user of the broadcast receiving apparatus 100.
  • the video output unit 193 is a video output interface that outputs the video data selected and/or superimposed by the video selection unit 191 to the outside.
  • the video output interface is, for example, HDMI (High-Definition Multimedia Interface) (registered trademark).
  • the audio selection unit 194 inputs the audio data output from the first decoder unit 140S and the audio data output from the second decoder unit 140U, and selects and/or mixes the audio data as appropriate based on the control of the main control unit 101. Process.
  • the speaker unit 195 outputs the audio data selected and/or mixed by the audio selection unit 194 and provides the audio data to the user of the broadcast receiving device 100.
  • the audio output unit 196 is an audio output interface that outputs the audio data selected and/or mixed by the audio selection unit 194 to the outside. Examples of the audio output interface include an analog headphone jack, an optical digital interface, Bluetooth, and an ARC (Audio Return Channel) assigned to an HDMI input terminal.
  • the digital interface unit 125 is an interface that outputs or inputs a packet stream containing encoded digital video data and/or digital audio data.
  • the digital interface section 125 can output, as is, the packet stream input to the first decoder section 140S and the second decoder section 140U from the first tuner/demodulator 130C, the second tuner/demodulator 130T, the third tuner/demodulator 130L, and the fourth tuner/demodulator 130B. A packet stream input from the outside via the digital interface section 125 may be input to the first decoder section 140S or the second decoder section 140U, or may be controlled to be stored in the storage section 110.
  • alternatively, the digital interface section 125 may output the video data and audio data separated and extracted by the first decoder section 140S and the second decoder section 140U. Further, video data and audio data input from the outside via the digital interface unit 125 may be controlled to be input to the first decoder unit 140S or the second decoder unit 140U, or to be stored in the storage unit 110.
  • the expansion interface unit 124 is a group of interfaces for expanding the functions of the broadcast receiving device 100, and includes a video/audio interface, a USB (Universal Serial Bus) interface, a memory interface, and the like.
  • the video/audio interface inputs video signals/audio signals from an external video/audio output device, outputs video signals/audio signals to the external video/audio input device, and so on.
  • Examples of the video/audio interface include pin jacks and D terminals that handle analog signals, and HDMI, which handles digital signals.
  • the USB interface is connected to a PC or the like to send and receive data. Broadcast programs and other content data may be recorded by connecting an HDD. Additionally, a keyboard or other USB devices may be connected.
  • the memory interface connects a memory card or other memory medium to send and receive data.
  • the operation input unit 180 is an instruction input unit for inputting operation instructions to the broadcast receiving apparatus 100, and includes, for example, a remote control reception unit that receives commands transmitted from a remote controller (not shown) and operation keys in which button switches are arranged. Only one of these may be provided. Further, the operation input section 180 may be replaced with a touch panel or the like arranged over the monitor section 192, or a keyboard or the like connected to the expansion interface unit 124 may be used instead.
  • the remote controller can be replaced by the mobile information terminal 700 equipped with a remote control command transmission function. Note that any of the "keys" provided on the remote controller described in the following embodiments may also be expressed as "buttons".
  • the broadcast receiving device 100 when the broadcast receiving device 100 is a television receiver or the like, the video output section 193 and the audio output section 196 are not essential components. Further, the broadcast receiving device 100 may be an optical disk drive recorder such as a DVD (Digital Versatile Disc) recorder, a magnetic disk drive recorder such as an HDD recorder, an STB (Set Top Box), or the like. It may be a PC (Personal Computer), a tablet terminal, or the like that is equipped with a function of receiving a digital broadcasting service. When the broadcast receiving device 100 is a DVD recorder, an HDD recorder, an STB, or the like, the monitor section 192 and the speaker section 195 are not essential components. By connecting an external monitor and external speakers to the video output section 193 and the audio output section 196 or the digital interface section 125, operations similar to those of a television receiver or the like are possible.
  • FIG. 2B is a block diagram showing an example of a detailed configuration of the first tuner/demodulator 130C.
  • the channel selection/detection unit 131C inputs the current digital broadcast wave received by the antenna 200C, and selects a channel based on the channel selection control signal.
  • the TMCC decoding section 132C extracts the TMCC signal from the output signal of the channel selection/detection section 131C and obtains various TMCC information.
  • the acquired TMCC information is used to control each subsequent process. Details of the TMCC signal and TMCC information will be described later.
  • Based on the TMCC information and the like, the demodulator 133C receives a modulated wave modulated by a method such as QPSK (Quadrature Phase Shift Keying), DQPSK (Differential QPSK), or 16QAM (Quadrature Amplitude Modulation), and performs demodulation processing including frequency deinterleaving, time deinterleaving, carrier demapping processing, and the like.
  • the demodulation unit 133C may be capable of supporting a modulation method different from each of the above-mentioned modulation methods.
  • the stream playback unit 134C performs layer division processing, inner code error correction processing such as Viterbi decoding, energy despreading processing, stream playback processing, outer code error correction processing such as RS (Reed Solomon) decoding, and the like. Note that as the error correction process, a method different from each of the above-mentioned methods may be used. Further, the packet stream reproduced and output by the stream reproduction unit 134C is, for example, MPEG-2 TS. Other formats of packet streams may also be used.
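The energy despreading step mentioned above can be sketched as follows: energy dispersal XORs the bit stream with a pseudo-random binary sequence, and despreading applies the same XOR again. The generator polynomial x^15 + x^14 + 1 and the 15-bit initial state shown here follow common digital-broadcast usage (e.g. ISDB-T), but this sketch omits the byte framing and sync-byte handling of the actual standard.

```python
def energy_despread(bits, seed=0b100101010000000):
    """XOR a bit sequence with the PRBS generated by x^15 + x^14 + 1.

    Energy dispersal and despreading are the same XOR operation, so
    applying this function twice returns the original sequence.
    """
    state = seed                  # 15-bit linear feedback shift register
    out = []
    for b in bits:
        fb = ((state >> 14) ^ (state >> 13)) & 1   # taps: x^15 and x^14
        out.append(b ^ fb)
        state = ((state << 1) | fb) & 0x7FFF
    return out
```

Because the operation is its own inverse, the same routine serves both the transmitter's spreading and the receiver's despreading.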
  • FIG. 2C is a block diagram showing an example of a detailed configuration of the second tuner/demodulator 130T.
  • the channel selection/detection unit 131H inputs the horizontal (H) polarized signal of the digital broadcast wave received by the antenna 200T, and selects a channel based on the channel selection control signal.
  • the channel selection/detection unit 131V inputs the vertical (V) polarized wave signal of the digital broadcast wave received by the antenna 200T, and selects a channel based on the channel selection control signal. Note that the operation of the channel selection process in the channel selection/detection section 131H and the operation of the channel selection process in the channel selection/detection section 131V may be controlled in conjunction with each other, or may be controlled independently.
  • by treating the channel selection/detection section 131H and the channel selection/detection section 131V as one channel selection/detection section, it is possible to perform control to select one channel of a digital broadcasting service transmitted using both horizontal and vertical polarization; by treating the channel selection/detection section 131H and the channel selection/detection section 131V as two independent channel selection/detection sections, it is possible to perform control to select two different channels of digital broadcasting services transmitted using horizontally polarized waves only (or vertically polarized waves only).
  • the horizontal (H) polarized signal and the vertical (V) polarized signal received by the second tuner/demodulator 130T of the broadcast receiving device in each embodiment of the present invention may be any pair of broadcast-wave polarization signals whose polarization directions differ by approximately 90 degrees, and the configurations described below for the horizontal (H) polarization signal, the vertical (V) polarization signal, and their reception may be reversed.
  • the TMCC decoding unit 132H extracts the TMCC signal from the output signal of the channel selection/detection unit 131H and obtains various TMCC information.
  • the TMCC decoding unit 132V extracts the TMCC signal from the output signal of the channel selection/detection unit 131V and obtains various TMCC information. Only one of the TMCC decoding section 132H and the TMCC decoding section 132V may be provided. The acquired TMCC information is used to control each subsequent process.
  • Based on the TMCC information and the like, the demodulation unit 133H and the demodulation unit 133V each receive a modulated wave modulated by a method such as BPSK (Binary Phase Shift Keying), DBPSK (Differential BPSK), QPSK, DQPSK, 8PSK (Phase Shift Keying), 16APSK (Amplitude and Phase Shift Keying), 32APSK, 16QAM, 64QAM, 256QAM, or 1024QAM, and perform demodulation processing including frequency deinterleaving, time deinterleaving, carrier demapping processing, and the like.
  • the demodulating section 133H and the demodulating section 133V may be capable of supporting a modulation method different from each of the above-mentioned modulation methods.
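To illustrate the carrier mapping that the demodulation units undo during carrier demapping, the following sketch shows a Gray-mapped QPSK constellation with hard-decision demapping. The bit-to-quadrant assignment below is one common convention chosen purely for illustration; each broadcast standard fixes its own mapping, and the higher-order schemes listed above (16QAM, 16APSK, etc.) follow the same principle with larger constellations.

```python
import math

# One common Gray-mapped QPSK constellation: each 2-bit pair maps to a
# unit-energy complex carrier symbol. The assignment is illustrative.
QPSK = {
    (0, 0): complex(1, 1) / math.sqrt(2),
    (0, 1): complex(1, -1) / math.sqrt(2),
    (1, 0): complex(-1, 1) / math.sqrt(2),
    (1, 1): complex(-1, -1) / math.sqrt(2),
}

def qpsk_modulate(bits):
    """Carrier mapping: pairs of bits to constellation points."""
    return [QPSK[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

def qpsk_demap(symbols):
    """Hard-decision carrier demapping: pick the nearest constellation point."""
    out = []
    for s in symbols:
        pair = min(QPSK, key=lambda b: abs(s - QPSK[b]))
        out.extend(pair)
    return out
```

In the absence of noise, demapping exactly inverts the mapping; a real demodulator makes the same nearest-point decision on noisy received symbols.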
  • the stream playback unit 134H and the stream playback unit 134V each perform layer division processing, inner code error correction processing such as Viterbi decoding and LDPC (Low Density Parity Check) decoding, energy despreading processing, stream playback processing, outer code error correction processing such as RS decoding and BCH decoding, and the like.
  • the packet stream reproduced and output by the stream reproduction unit 134H is, for example, MPEG-2 TS.
  • the packet stream reproduced and output by the stream reproduction unit 134V is, for example, a TLV including an MPEG-2 TS or an MMT packet stream. Each of these may be a packet stream in another format.
  • the channel selection/detection unit 131V, the TMCC decoding unit 132V, and the demodulation unit 133V may not be provided. In this configuration, the signal of the segments that transmit the current terrestrial digital broadcasting service is stream-reproduced, and the signal of the segments that transmit the advanced terrestrial digital broadcasting service is input to the stream playback unit 134V.
  • FIG. 2D is a block diagram showing an example of a detailed configuration of the third tuner/demodulator 130L.
  • the channel selection/detection unit 131L receives digital broadcast waves that have been subjected to layered division multiplexing (LDM) processing from the antenna 200L, and selects a channel based on a channel selection control signal.
  • Digital broadcast waves that have been subjected to layer division multiplexing processing can be used to transmit different digital broadcasting services (or different channels of the same broadcasting service) on the modulated wave of the upper layer (UL) and the modulated wave of the lower layer (LL).
  • the modulated wave of the upper layer is output to the demodulator 133S, and the modulated wave of the lower layer is output to the demodulator 133L.
  • the TMCC decoding unit 132L inputs the upper layer modulated wave and the lower layer modulated wave output from the channel selection/detection unit 131L, extracts the TMCC signal, and obtains various TMCC information. Note that the signal input to the TMCC decoding unit 132L may be only the modulated wave of the upper layer or only the modulated wave of the lower layer.
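A typical layer-division-multiplexing receiver recovers the lower layer by first demodulating the stronger upper layer and then cancelling it from the composite signal. The following noise-free sketch illustrates this two-stage idea with real-valued BPSK-like symbols and an assumed 16 dB injection level; it is an illustration of the general technique, not the demodulation algorithm of the embodiments.

```python
def ldm_receive(composite):
    """Two-stage LDM reception sketch (noise-free, BPSK-like symbols).

    Step 1: hard-decide the stronger upper layer (UL).
    Step 2: cancel the re-modulated UL from the composite signal and
            decide the residual, weaker lower layer (LL).
    """
    ul_bits = [0 if x >= 0 else 1 for x in composite]          # UL decision
    ul_symbols = [1.0 if b == 0 else -1.0 for b in ul_bits]    # re-modulate UL
    residual = [x - u for x, u in zip(composite, ul_symbols)]  # cancel UL
    ll_bits = [0 if r >= 0 else 1 for r in residual]           # LL decision
    return ul_bits, ll_bits
```

This corresponds to the split in FIG. 2D, where the upper-layer modulated wave goes to the demodulator 133S and the lower-layer modulated wave, after cancellation, to the demodulator 133L.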
  • FIG. 2E is a block diagram showing an example of a detailed configuration of the fourth tuner/demodulator 130B.
  • the channel selection/detection unit 131B inputs the digital broadcast waves of the advanced BS digital broadcasting service and the advanced CS digital broadcasting service received by the antenna 200B, and selects a channel based on the channel selection control signal. Other operations are the same as those of the channel selection/detection section 131H and the channel selection/detection section 131V, so detailed explanation will be omitted.
  • the TMCC decoding unit 132B, the demodulation unit 133B, and the stream playback unit 134B also operate in the same manner as the TMCC decoding unit 132H, the TMCC decoding unit 132V, the demodulation unit 133H, the demodulation unit 133V, and the stream playback unit 134V, so detailed explanation will be omitted.
  • FIG. 2F is a block diagram showing an example of a detailed configuration of the first decoder section 140S.
  • the selection unit 141S, under the control of the main control unit 101, selects and outputs one of the packet stream input from the first tuner/demodulation unit 130C, the packet stream input from the second tuner/demodulation unit 130T, and the packet stream input from the third tuner/demodulation unit 130L.
  • the packet streams input from the first tuner/demodulator 130C, the second tuner/demodulator 130T, and the third tuner/demodulator 130L are, for example, MPEG-2 TS.
  • the CA descrambler 142S performs descrambling processing that cancels the scrambling applied with a predetermined encryption algorithm, based on various control information related to conditional access superimposed on the packet stream.
  • the demultiplexer 143S is a stream decoder, and separates and extracts video data, audio data, superimposed text data, subtitle data, program information data, etc. based on various control information included in the input packet stream.
  • the separated and extracted video data is distributed to the video decoder 145S
  • the separated and extracted audio data is distributed to the audio decoder 146S
  • the separated and extracted character superimposition data, subtitle data, program information data, etc. are distributed to the data decoder 144S.
  • a packet stream (eg, MPEG-2 PS, etc.) obtained from a server device on the Internet 800 via the LAN communication unit 121 may be input to the demultiplexing unit 143S.
  • the demultiplexer 143S can output the packet stream input from the first tuner/demodulator 130C, the second tuner/demodulator 130T, or the third tuner/demodulator 130L to the outside via the digital interface unit 125, and can also receive a packet stream obtained from the outside via the digital interface section 125.
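The PID-based separation performed by the demultiplexer can be sketched as follows for MPEG-2 TS input. This is a minimal illustration: it only groups 188-byte packets by their 13-bit packet identifier and omits PAT/PMT interpretation, adaptation-field handling, and resynchronization after sync loss.

```python
TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def demux_by_pid(ts):
    """Group the 188-byte packets of an MPEG-2 TS byte stream by PID."""
    streams = {}
    for off in range(0, len(ts) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        pkt = ts[off:off + TS_PACKET_SIZE]
        if pkt[0] != SYNC_BYTE:
            continue  # a real demultiplexer would re-acquire sync here
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]   # 13-bit packet identifier
        streams.setdefault(pid, []).append(pkt)
    return streams
```

In a real decoder, the PAT/PMT tables indicate which PID carries video, which carries audio, and which carries program information, and each per-PID stream is then routed to the corresponding decoder, as described above for the video decoder 145S, the audio decoder 146S, and the data decoder 144S.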
  • the video decoder 145S performs decoding processing of the compression-encoded video information on the video data input from the demultiplexer 143S, and performs colorimetry conversion processing, dynamic range conversion processing, and the like on the decoded video information.
  • Further, resolution conversion (up/down conversion) processing is performed as appropriate based on the control of the main control unit 101, and video data is output at a resolution such as UHD (3840 horizontal pixels x 2160 vertical pixels), HD (1920 horizontal pixels x 1080 vertical pixels), or SD (720 horizontal pixels x 480 vertical pixels). Video data may be output at other resolutions.
  • the audio decoder 146S performs decoding processing of compressed and encoded audio information.
  • a plurality of video decoders 145S and audio decoders 146S may be provided in order to simultaneously perform decoding processing of video data and audio data.
  • the data decoder 144S performs processes such as generating an EPG based on program information data, generating a data broadcasting screen based on BML data, and controlling a cooperative application based on a broadcast communication cooperative function.
  • the data decoder 144S has a BML browser function that executes a BML document, and data broadcasting screen generation processing is executed by the BML browser function.
  • the data decoder 144S also performs processes such as decoding character superimposition data to generate character superimposition information and decoding subtitle data to generate subtitle information.
  • the superimposing section 147S, the superimposing section 148S, and the superimposing section 149S perform superimposition processing on the video data output from the video decoder 145S and the EPG, data broadcast screen, etc. output from the data decoder 144S, respectively.
  • the synthesis unit 151S performs a process of synthesizing the audio data output from the audio decoder 146S and the audio data reproduced by the data decoder 144S.
  • the selection unit 150S selects the resolution of video data based on the control of the main control unit 101. Note that the functions of the superimposing section 147S, the superimposing section 148S, the superimposing section 149S, and the selection section 150S may be integrated with the video selection section 191. The functions of the synthesis section 151S may be integrated with the voice selection section 194.
  • FIG. 2G is a block diagram showing an example of a detailed configuration of the second decoder section 140U.
  • the selection unit 141U selects one of the packet stream input from the second tuner/demodulation unit 130T, the packet stream input from the third tuner/demodulation unit 130L, and the packet stream input from the fourth tuner/demodulation unit 130B, and outputs the selected packet stream.
  • the packet streams input from the second tuner/demodulator 130T, the third tuner/demodulator 130L, and the fourth tuner/demodulator 130B may be, for example, an MMT packet stream, a TLV packet stream including an MMT packet stream, or an MPEG-2 TS format packet stream that uses HEVC (High Efficiency Video Coding) or the like as the video compression method.
  • the CA descrambler 142U performs processing for canceling the encryption of a predetermined scrambling algorithm based on various control information related to conditional access superimposed on the packet stream.
  • the demultiplexer 143U is a stream decoder, and separates and extracts video data, audio data, superimposed text data, subtitle data, program information data, etc. based on various control information included in the input packet stream.
  • the separated and extracted video data is distributed to the video decoder 145U
  • the separated and extracted audio data is distributed to the audio decoder 146U
  • the separated and extracted character superimposition data, subtitle data, program information data, etc. are distributed to the multimedia decoder 144U.
  • a packet stream (e.g., an MPEG-2 PS, an MMT packet stream, etc.) acquired from a server device on the Internet 800 via the LAN communication unit 121 may also be input to the demultiplexing unit 143U.
  • the demultiplexer 143U can output the packet stream input from the second tuner/demodulator 130T, the third tuner/demodulator 130L, or the fourth tuner/demodulator 130B to the outside via the digital interface unit 125, and can also receive as input a packet stream obtained from the outside via the digital interface unit 125.
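The separation and distribution performed by the demultiplexer 143U can be sketched as a simple dispatch by payload type. The dictionary-based packet format and the type names below are hypothetical simplifications; real MMT/TS demultiplexing operates on packet identifiers defined by the transport format.

```python
# Minimal sketch of demultiplexer routing: packets carrying different
# payload types are distributed to the queues of the corresponding
# decoders, mirroring the distribution described above. The packet
# representation and type names are illustrative assumptions.
from collections import defaultdict

ROUTES = {
    "video": "video_decoder_145U",
    "audio": "audio_decoder_146U",
    # Character superimposition data, subtitle data, and program
    # information data all go to the multimedia decoder 144U.
    "superimpose": "multimedia_decoder_144U",
    "subtitle": "multimedia_decoder_144U",
    "program_info": "multimedia_decoder_144U",
}

def demultiplex(packet_stream):
    """Separate a mixed packet stream into per-decoder payload queues."""
    queues = defaultdict(list)
    for packet in packet_stream:
        dest = ROUTES.get(packet["type"])
        if dest is not None:
            queues[dest].append(packet["payload"])
    return queues

stream = [
    {"type": "video", "payload": b"\x00\x01"},
    {"type": "audio", "payload": b"\x02"},
    {"type": "subtitle", "payload": b"\x03"},
]
queues = demultiplex(stream)
print(sorted(queues))
# ['audio_decoder_146U', 'multimedia_decoder_144U', 'video_decoder_145U']
```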
  • the multimedia decoder 144U performs a process of generating an EPG based on program information data, a process of generating a multimedia screen based on multimedia data, a process of controlling a cooperative application based on a broadcast communication cooperative function, and the like.
  • the multimedia decoder 144U has an HTML browser function that executes HTML documents, and multimedia screen generation processing is executed by the HTML browser function.
  • the video decoder 145U, the audio decoder 146U, the superimposing section 147U, the superimposing section 148U, the superimposing section 149U, the combining section 151U, and the selecting section 150U are components having the same functions as the video decoder 145S, the audio decoder 146S, the superimposing section 147S, the superimposing section 148S, the superimposing section 149S, the combining section 151S, and the selecting section 150S, respectively. The descriptions of the latter components given above apply equally when read with the reference numerals replaced accordingly, so separate detailed explanations are omitted.
  • FIG. 2H is a software configuration diagram of the broadcast receiving apparatus 100, and shows an example of the software configuration in the storage unit 110 (or ROM 103, hereinafter the same) and RAM 104.
  • the storage unit 110 stores a basic operation program 1001, a reception function program 1002, a browser program 1003, a content management program 1004, and other operation programs 1009.
  • the storage unit 110 also includes a content storage area 1011 that stores content data such as videos, still images, and audio, an authentication information storage area 1012 that stores authentication information used for communication and cooperation with external mobile terminal devices, server devices, etc., and a various information storage area 1019 that stores various other information.
  • the basic operation program 1001 stored in the storage unit 110 is expanded into the RAM 104, and the main control unit 101 further executes the expanded basic operation program to configure the basic operation control unit 1101. Further, the reception function program 1002, browser program 1003, and content management program 1004 stored in the storage unit 110 are each expanded into the RAM 104, and the main control unit 101 executes each of the expanded operation programs. This configures a reception function control unit 1102, a browser engine 1103, and a content management unit 1104. Furthermore, the RAM 104 is provided with a temporary storage area 1200 that temporarily holds data created when each operating program is executed, as needed.
  • in the following, the process in which the main control unit 101 controls each operation block by expanding the basic operation program 1001 stored in the storage unit 110 into the RAM 104 and executing it is described as the basic operation control unit 1101 controlling each operation block. Similar descriptions are used for the other operating programs.
  • the reception function control unit 1102 performs basic control of the broadcast reception function, broadcast communication cooperation function, etc. of the broadcast reception device 100.
  • the channel selection/demodulation section 1102a mainly controls processes such as channel selection processing and TMCC information acquisition processing in the first tuner/demodulation section 130C, the second tuner/demodulation section 130T, the third tuner/demodulation section 130L, the fourth tuner/demodulation section 130B, etc.
  • the stream playback control unit 1102b mainly controls processes such as layer division processing, error correction decoding processing, and energy dispersal release processing in the first tuner/demodulation unit 130C, the second tuner/demodulation unit 130T, the third tuner/demodulation unit 130L, the fourth tuner/demodulation unit 130B, etc.
  • the AV decoder section 1102c mainly controls demultiplexing processing (stream decoding processing), video data decoding processing, audio data decoding processing, etc. in the first decoder section 140S, the second decoder section 140U, and the like.
  • the multimedia (MM) data playback unit 1102d mainly controls processes such as BML data playback processing, character superimposition data decoding processing, subtitle data decoding processing, and communication cooperation application control processing in the first decoder unit 140S, and HTML data playback processing, multimedia screen generation processing, and communication cooperation application control processing in the second decoder unit 140U.
  • the EPG generation unit 1102e mainly controls EPG generation processing and display processing of the generated EPG in the first decoder unit 140S and the second decoder unit 140U.
  • the presentation processing unit 1102f controls colorimetry conversion processing, dynamic range conversion processing, resolution conversion processing, audio downmix processing, etc. in the first decoder unit 140S and the second decoder unit 140U, and also controls the video selection unit 191, the audio selection unit 194, etc.
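The audio downmix processing mentioned above can be illustrated with a conventional 5.1-channel to stereo downmix. The 1/sqrt(2) coefficients follow the commonly used ITU-R BS.775 downmix equations and are an assumption for this sketch, not necessarily the coefficients used by the presentation processing unit 1102f.

```python
import math

# Sketch of a 5.1-channel to stereo downmix. The 1/sqrt(2) weights for
# the center and surround channels follow the commonly used ITU-R
# BS.775 downmix equations (an illustrative assumption).
A = 1.0 / math.sqrt(2.0)

def downmix_51_to_stereo(l, r, c, lfe, ls, rs):
    """Return (left, right) stereo samples from one 5.1 sample frame.
    The LFE channel is discarded, as is common practice."""
    left = l + A * c + A * ls
    right = r + A * c + A * rs
    return left, right

left, right = downmix_51_to_stereo(0.5, 0.25, 0.2, 0.9, 0.1, 0.0)
```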
  • the BML browser 1103a and HTML browser 1103b of the browser engine 1103 interpret BML documents and HTML documents during the aforementioned BML data playback processing and HTML data playback processing, and perform data broadcasting screen generation processing and multimedia screen generation processing, respectively.
  • the content management unit 1104 manages time schedules and execution control when making recording reservations and viewing reservations for broadcast programs, manages copyrights when outputting broadcast programs, recorded programs, etc. from the digital interface unit 125, the LAN communication unit 121, etc., and manages the expiration dates, etc. of cooperative applications acquired based on the broadcast communication cooperation function.
  • Each of the operating programs may be stored in advance in the storage unit 110 and/or the ROM 103 at the time of product shipment.
  • alternatively, they may be obtained from a server device on the Internet 800 via the LAN communication unit 121 or the like after the product is shipped.
  • further, each of the operating programs stored on a memory card, an optical disc, or the like may be acquired via the expansion interface unit 124 or the like, or may be newly acquired or updated via broadcast waves.
  • FIG. 3A is an example of an internal configuration of the broadcast station server 400.
  • the broadcast station server 400 includes a main control section 401, a system bus 402, a RAM 404, a storage section 410, a LAN communication section 421, and a digital broadcast signal transmission section 460 as components.
  • the main control unit 401 is a microprocessor unit that controls the entire broadcasting station server 400 according to a predetermined operating program.
  • the system bus 402 is a communication path for transmitting and receiving various data, commands, etc. between the main control unit 401 and each operational block within the broadcast station server 400.
  • the RAM 404 serves as a work area when each operating program is executed.
  • the storage unit 410 stores a basic operation program 4001, a content management/distribution program 4002, and a content transmission program 4003, and further includes a content data storage area 4011 and a metadata storage area 4012.
  • the content data storage area 4011 stores content data of each broadcast program broadcast by a broadcast station.
  • the metadata storage area 4012 stores metadata such as the program title, program ID, program summary, performers, broadcast date and time of each of the broadcast programs.
  • the basic operation program 4001, content management/distribution program 4002, and content transmission program 4003 stored in the storage unit 410 are each expanded into the RAM 404, and the main control unit 401 executes each of the expanded operating programs, thereby configuring a basic operation control section 4101, a content management/distribution control section 4102, and a content transmission control section 4103.
  • the content management/distribution control unit 4102 manages the content data, metadata, etc. stored in the content data storage area 4011 and the metadata storage area 4012, and controls the provision of the content data, metadata, etc. to service providers based on contracts. Furthermore, when providing content data, metadata, etc. to a service provider, the content management/distribution control unit 4102 also performs authentication processing of the service provider server 500 as necessary.
  • the content transmission control unit 4103 performs time schedule management and the like when transmitting a stream, which is composed of the content data of each broadcast program stored in the content data storage area 4011 and the program title, program ID, copy control information, etc. of each broadcast program stored in the metadata storage area 4012, via the digital broadcast signal transmitting section 460.
  • the LAN communication unit 421 is connected to the Internet 800 and communicates with the service provider server 500 and other communication devices on the Internet 800.
  • the LAN communication unit 421 includes an encoding circuit, a decoding circuit, and the like.
  • the digital broadcast signal transmitting unit 460 performs processing such as modulation on the stream composed of the content data and program information data of each broadcast program stored in the content data storage area 4011, and sends it out as a digital broadcast wave via the radio tower 300.
  • FIG. 3B is an example of the internal configuration of the service provider server 500.
  • the service provider server 500 includes a main control section 501, a system bus 502, a RAM 504, a storage section 510, and a LAN communication section 521.
  • the main control unit 501 is a microprocessor unit that controls the entire service provider server 500 according to a predetermined operating program.
  • the system bus 502 is a communication path for transmitting and receiving various data, commands, etc. between the main control unit 501 and each operation block in the service provider server 500.
  • the RAM 504 serves as a work area when each operating program is executed.
  • the storage unit 510 stores a basic operation program 5001, a content management/distribution program 5002, and an application management/distribution program 5003, and further includes a content data storage area 5011, a metadata storage area 5012, and an application storage area 5013.
  • the content data storage area 5011 and the metadata storage area 5012 store content data and metadata provided by the broadcasting station server 400, content produced by a service provider, metadata related to the content, and the like.
  • the application storage area 5013 stores applications (operating programs and/or various data, etc.) necessary for realizing each service of the broadcasting and communication cooperation system, which are distributed in response to requests from each television receiver.
  • the basic operation program 5001, content management/distribution program 5002, and application management/distribution program 5003 stored in the storage unit 510 are each expanded into the RAM 504, and the main control unit 501 executes the expanded basic operation program, content management/distribution program, and application management/distribution program, thereby configuring a basic operation control section 5101, a content management/distribution control section 5102, and an application management/distribution control section 5103.
  • the content management/distribution control unit 5102 acquires content data, metadata, etc. from the broadcasting station server 400, manages the content data, metadata, etc. stored in the content data storage area 5011 and the metadata storage area 5012, and controls the distribution of the content data, metadata, etc. to each television receiver. Further, the application management/distribution control unit 5103 manages each application stored in the application storage area 5013 and controls the distribution of each application in response to a request from each television receiver. Furthermore, when distributing each application to each television receiver, the application management/distribution control unit 5103 also performs authentication processing of the television receiver, etc., as necessary.
  • the LAN communication unit 521 is connected to the Internet 800 and communicates with the broadcasting station server 400 and other communication devices on the Internet 800. It also communicates with the broadcast receiving device 100 and the mobile information terminal 700 via the router device 800R.
  • the LAN communication unit 521 includes an encoding circuit, a decoding circuit, and the like.
  • FIG. 3C is a block diagram showing an example of the internal configuration of mobile information terminal 700.
  • the mobile information terminal 700 is composed of a main control section 701, a system bus 702, a ROM 703, a RAM 704, a storage section 710, a communication processing section 720, an expansion interface section 724, an operation section 730, an image processing section 740, an audio processing section 750, and a sensor section 760.
  • the main control unit 701 is a microprocessor unit that controls the entire mobile information terminal 700 according to a predetermined operating program.
  • the system bus 702 is a communication path for transmitting and receiving various data, commands, etc. between the main control unit 701 and each operational block within the mobile information terminal 700.
  • the ROM 703 is a nonvolatile memory in which basic operating programs such as an operating system and other operating programs are stored, and for example, a rewritable ROM such as an EEPROM or a flash ROM is used. Further, the ROM 703 stores operation setting values and the like necessary for the operation of the mobile information terminal 700.
  • the RAM 704 serves as a work area when executing the basic operation program and other operation programs.
  • the ROM 703 and the RAM 704 may be integrated with the main control unit 701. Further, the ROM 703 may not have an independent configuration as shown in FIG. 3C, but may use a part of the storage area within the storage unit 710.
  • the storage unit 710 stores the operating program and operation setting values of the portable information terminal 700, the personal information of the user of the portable information terminal 700, and the like. Further, it is possible to store operating programs downloaded via the Internet 800 and various data created using the operating programs. Further, content downloaded via the Internet 800, such as moving images, still images, and audio, can also be stored. All or part of the functions of the ROM 703 may be replaced by a partial area of the storage section 710. Furthermore, the storage unit 710 needs to retain stored information even when power is not supplied to the portable information terminal 700 from the outside. Therefore, for example, devices such as semiconductor element memories such as flash ROMs and SSDs, and magnetic disk drives such as HDDs are used.
  • each of the operating programs stored in the ROM 703 and the storage unit 710 can be added, updated, and expanded in functionality by downloading from each server device on the Internet 800.
  • the communication processing unit 720 includes a LAN communication unit 721, a mobile telephone network communication unit 722, and an NFC communication unit 723.
  • the LAN communication unit 721 is connected to the Internet 800 via the router device 800R, and sends and receives data to and from each server device and other communication devices on the Internet 800. Connection with the router device 800R is performed by wireless connection such as Wi-Fi (registered trademark).
  • the mobile telephone network communication unit 722 performs telephone communication (call) and data transmission/reception through wireless communication with the base station 600B of the mobile telephone communication network.
  • the NFC communication unit 723 performs wireless communication when in close proximity to a corresponding reader/writer.
  • the LAN communication section 721, mobile telephone network communication section 722, and NFC communication section 723 each include an encoding circuit, a decoding circuit, an antenna, and the like. Further, the communication processing unit 720 may further include other communication units such as a Bluetooth (registered trademark) communication unit and an infrared communication unit.
  • the expansion interface unit 724 is a group of interfaces for expanding the functions of the mobile information terminal 700, and in this embodiment, it is assumed to be composed of a video/audio interface, a USB interface, a memory interface, etc.
  • the video/audio interface inputs video signals/audio signals from an external video/audio output device, outputs video signals/audio signals to the external video/audio input device, and so on.
  • the USB interface is connected to a PC or the like to send and receive data. Additionally, a keyboard or other USB devices may be connected.
  • the memory interface connects a memory card or other memory medium to send and receive data.
  • the operation unit 730 is an instruction input unit for inputting operation instructions to the mobile information terminal 700, and in this embodiment is composed of a touch panel 730T arranged over the display unit 741 and operation keys 730K in which button switches are arranged. It may be only one of them.
  • the portable information terminal 700 may be operated using a keyboard or the like connected to the expansion interface section 724.
  • the portable information terminal 700 may be operated using a separate terminal device connected by wired communication or wireless communication. That is, the portable information terminal 700 may be operated from the broadcast receiving device 100. Further, the touch panel function may be provided in the display section 741.
  • the image processing section 740 includes a display section 741, an image signal processing section 742, a first image input section 743, and a second image input section 744.
  • the display unit 741 is, for example, a display device such as a liquid crystal panel, and provides image data processed by the image signal processing unit 742 to the user of the mobile information terminal 700.
  • the image signal processing section 742 includes a video RAM (not shown), and the display section 741 is driven based on image data input to the video RAM. Further, the image signal processing unit 742 has a function of performing format conversion, superimposition processing of menus and other OSD (On Screen Display) signals, etc. as necessary.
  • the first image input unit 743 and the second image input unit 744 are camera units that convert light input through a lens into an electrical signal using an electronic device such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, and input image data of the surroundings and objects.
  • the audio processing section 750 includes an audio output section 751, an audio signal processing section 752, and an audio input section 753.
  • the audio output unit 751 is a speaker, and provides the user of the mobile information terminal 700 with an audio signal processed by the audio signal processing unit 752.
  • the voice input unit 753 is a microphone, and converts the user's voice into voice data and inputs the voice data.
  • the sensor unit 760 is a group of sensors for detecting the state of the mobile information terminal 700, and in this embodiment is composed of a GPS receiving unit 761, a gyro sensor 762, a geomagnetic sensor 763, an acceleration sensor 764, an illuminance sensor 765, and a proximity sensor 766. These sensor groups make it possible to detect the position, inclination, direction, movement, surrounding brightness, proximity of surrounding objects, etc. of the mobile information terminal 700. Furthermore, the mobile information terminal 700 may further include other sensors such as an atmospheric pressure sensor.
  • the mobile information terminal 700 may be a mobile phone, a smart phone, a tablet terminal, or the like. It may be a PDA (Personal Digital Assistant) or a notebook PC. Further, it may be a digital still camera, a video camera capable of shooting moving images, a portable game machine, a navigation device, or other portable digital equipment.
  • the configuration example of the mobile information terminal 700 shown in FIG. 3C includes many components that are not essential to this embodiment, such as the sensor section 760, but a configuration lacking these does not impair the effectiveness of this embodiment. Furthermore, configurations not shown, such as a digital broadcast reception function and an electronic money payment function, may be further added.
  • FIG. 3D is a software configuration diagram of the mobile information terminal 700, and shows an example of the software configuration in the ROM 703, RAM 704, and storage unit 710.
  • the ROM 703 stores a basic operation program 7001 and other operation programs.
  • the storage unit 710 stores a cooperation control program 7002 and other operating programs.
  • the storage unit 710 also includes a content storage area 7200 that stores content data such as videos, still images, and audio, an authentication information storage area 7300 that stores authentication information necessary for accessing the television receiver and each server device, and a various information storage area that stores various other information.
  • the basic operation program 7001 stored in the ROM 703 is expanded to the RAM 704, and the main control unit 701 further executes the expanded basic operation program to configure the basic operation execution unit 7101.
  • the cooperation control program 7002 stored in the storage unit 710 is similarly expanded to the RAM 704, and further, the main control unit 701 configures the cooperation control execution unit 7102 by executing the expanded cooperation control program.
  • the RAM 704 is provided with a temporary storage area that temporarily holds data created when each operating program is executed, as needed.
  • the cooperation control execution unit 7102 manages device authentication and connection, transmission and reception of each data, etc. when the mobile information terminal 700 performs a cooperation operation with the television receiver. Further, the cooperation control execution unit 7102 is provided with a browser engine function for executing an application that works in conjunction with the television receiver.
  • Each of the operating programs may be stored in advance in the ROM 703 and/or the storage unit 710 at the time of product shipment. After the product is shipped, the information may be obtained from a server device on the Internet 800 via the LAN communication section 721 or the mobile telephone network communication section 722. Further, each of the operating programs stored in a memory card, an optical disk, etc. may be acquired via the expansion interface unit 724 or the like.
  • the broadcast receiving device 100 can receive digital broadcasting services of terrestrial broadcasting systems that share at least some specifications with the ISDB-T (Integrated Services Digital Broadcasting) system.
  • dual-polarization terrestrial digital broadcasting and single-polarization terrestrial digital broadcasting that can be received by the second tuner/demodulator 130T are advanced terrestrial digital broadcasting that shares some specifications with the ISDB-T system.
  • the hierarchical division multiplexing terrestrial digital broadcasting that can be received by the third tuner/demodulator 130L is an advanced terrestrial digital broadcasting that shares some specifications with the ISDB-T system.
  • the current terrestrial digital broadcasting that can be received by the first tuner/demodulator 130C is ISDB-T type terrestrial digital broadcasting.
  • advanced BS digital broadcasting and advanced CS digital broadcasting that can be received by the fourth tuner/demodulator 130B are digital broadcasting that is different from the ISDB-T system.
  • the dual-polarization terrestrial digital broadcasting, the single-polarization terrestrial digital broadcasting, and the hierarchical division multiplexing terrestrial digital broadcasting according to this embodiment use OFDM (Orthogonal Frequency Division Multiplexing), which is one of the multicarrier systems, as the transmission method, similar to the ISDB-T system.
  • in OFDM, the symbol length is long, and it is effective to add a redundant part in the time axis direction called a guard interval, which can reduce the effects of multipath whose delay falls within the range of the guard interval. Therefore, it is possible to realize an SFN (Single Frequency Network), and frequencies can be used effectively.
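The guard interval described above is typically realized as a cyclic prefix: a copy of the tail of each OFDM symbol prepended to the symbol, so that echoes delayed by less than the guard length fall on redundant samples. A minimal sketch follows; the symbol length and the 1/8 guard ratio are illustrative assumptions.

```python
# Sketch of guard-interval insertion as a cyclic prefix. The last
# 1/guard_ratio of each OFDM symbol is copied and prepended; multipath
# echoes shorter than the guard length corrupt only these redundant
# samples. Symbol length and guard ratio are illustrative assumptions.
def add_guard_interval(symbol, guard_ratio=8):
    """Prepend a cyclic prefix of len(symbol)/guard_ratio samples."""
    guard_len = len(symbol) // guard_ratio
    return symbol[-guard_len:] + symbol

def strip_guard_interval(tx_symbol, guard_ratio=8):
    """Remove the cyclic prefix, recovering the useful symbol."""
    guard_len = len(tx_symbol) // (guard_ratio + 1)
    return tx_symbol[guard_len:]

symbol = list(range(16))          # one (illustrative) OFDM symbol
tx = add_guard_interval(symbol)   # guard = last 2 samples copied in front
assert tx[:2] == symbol[-2:]      # cyclic prefix equals the symbol tail
assert strip_guard_interval(tx) == symbol
```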
  • the OFDM carriers are divided into groups called segments, as in the ISDB-T system. As shown in FIG. 4A, one channel bandwidth of the digital broadcasting service is composed of 13 segments. The segment at the center of the band is designated segment 0, and segment numbers (0 to 12) are assigned sequentially to the segments above and below it.
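One plausible reading of this alternating numbering (segment 0 at the band center, numbers 1 to 12 assigned alternately below and above), consistent with the ISDB-T-style segment arrangement, can be sketched as follows; the exact spectral order is defined by the transmission standard, and this is only an illustration.

```python
# Sketch of the 13-segment numbering: segment 0 sits at the band
# center, and numbers 1..12 are assigned alternately below and above
# it, which places odd-numbered segments on one side of the band and
# even-numbered segments on the other (an illustrative reading; the
# exact arrangement is defined by the standard).
NUM_SEGMENTS = 13

def segment_order():
    """Return segment numbers from the lower band edge to the upper."""
    positions = {0: 0}                 # segment number -> offset from center
    for n in range(1, NUM_SEGMENTS):
        offset = (n + 1) // 2
        positions[n] = -offset if n % 2 == 1 else offset
    return [seg for seg, _ in sorted(positions.items(), key=lambda kv: kv[1])]

print(segment_order())
# [11, 9, 7, 5, 3, 1, 0, 2, 4, 6, 8, 10, 12]
```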
  • Transmission path encoding of dual-polarization terrestrial digital broadcasting, single-polarization terrestrial digital broadcasting, and hierarchical division multiplexing terrestrial digital broadcasting according to this embodiment is performed in units of OFDM segments.
  • each layer is composed of one or more OFDM segments, and parameters such as carrier modulation method, inner code coding rate, time interleave length, etc. can be set for each layer.
  • the number of layers may be set arbitrarily; for example, it may be set to a maximum of three layers.
  • FIG. 4B shows an example of layer allocation of OFDM segments when the number of layers is 3 or 2. In the example of FIG. 4B(1), the number of layers is three, and there are an A layer, a B layer, and a C layer.
  • the A layer consists of one segment (segment 0), the B layer consists of seven segments (segments 1 to 7), and the C layer consists of five segments (segments 8 to 12).
  • the number of layers is three, and there are an A layer, a B layer, and a C layer.
  • the A layer consists of one segment (segment 0), the B layer consists of five segments (segments 1 to 5), and the C layer consists of seven segments (segments 6 to 12).
  • the number of layers is 2, and there are an A layer and a B layer.
  • the A layer consists of one segment (segment 0), and the B layer consists of 12 segments (segments 1 to 12).
  • the number of OFDM segments, transmission path coding parameters, etc. of each layer are determined according to configuration information, and are transmitted by a TMCC (Transmission and Multiplexing Configuration Control) signal, which is control information for assisting the operation of the receiver.
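The per-layer parameterization carried by such TMCC-style control information can be sketched as a simple configuration record. The segment counts follow the 1/7/5 three-layer example above; the modulation, coding-rate, and interleave values shown are illustrative assumptions, not values prescribed by any standard.

```python
from dataclasses import dataclass

# Sketch of per-layer transmission parameters as signaled by TMCC-style
# control information. Segment counts follow the 1/7/5 three-layer
# example above; the modulation / coding-rate / interleave values are
# illustrative assumptions.
@dataclass
class Layer:
    name: str
    segments: int
    carrier_modulation: str
    inner_code_rate: str
    time_interleave_length: int

def validate(layers, total_segments=13):
    """Check that the layers together fill the 13-segment channel."""
    return sum(l.segments for l in layers) == total_segments

config = [
    Layer("A", 1, "QPSK", "2/3", 4),    # mobile reception service
    Layer("B", 7, "64QAM", "3/4", 2),   # current fixed-reception service
    Layer("C", 5, "4096QAM", "5/6", 1), # advanced high-resolution service
]
assert validate(config)
```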
  • the layer assignment in FIG. 4B(1) can be used in the dual-polarization terrestrial digital broadcasting according to this embodiment, and the same segment layer assignment may be used for both horizontal and vertical polarization. Specifically, the current mobile reception service of digital terrestrial broadcasting may be transmitted as layer A in the above-mentioned one segment of horizontally polarized waves. (In addition, the same service of the current mobile reception service of digital terrestrial broadcasting may also be transmitted using the above-mentioned one segment of vertically polarized waves. In this case, this is also treated as layer A.)
  • as layer B, it is sufficient to transmit a terrestrial digital broadcasting service that transmits video with a maximum resolution of 1920 pixels horizontally by 1080 pixels vertically, which is the current digital terrestrial broadcasting, using the above-mentioned seven segments of horizontally polarized waves.
  • (In addition, the digital terrestrial broadcasting service that transmits video with a maximum resolution of 1920 pixels horizontally x 1080 pixels vertically may also transmit the same service using the above-mentioned seven segments of vertically polarized waves. In this case, this is also treated as layer B.)
  • as layer C, an advanced terrestrial digital broadcasting service capable of transmitting video with a maximum resolution exceeding 1920 pixels horizontally x 1080 pixels vertically may be transmitted using the above-mentioned five segments of both horizontally polarized and vertically polarized waves, a total of 10 segments. Details of the transmission will be described later.
  • the transmission wave assigned to the segment hierarchy can be received by, for example, the second tuner/demodulator 130T of the broadcast receiving apparatus 100.
  • the hierarchy assignment in FIG. 4B(1) can be used in the single-polarized digital terrestrial broadcasting according to this embodiment.
  • the current mobile reception service of digital terrestrial broadcasting may be transmitted in the above-mentioned one segment as layer A.
  • the above seven segments may be used to transmit a terrestrial digital broadcasting service that transmits video having a maximum resolution of 1920 pixels horizontally by 1080 pixels vertically, which is the current digital terrestrial broadcasting.
  • the C layer may be configured to transmit an advanced terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal pixels x 1080 vertical pixels in the five segments described above.
  • the C layer uses a carrier modulation method, an error correction code method, a video coding method, etc. that are more efficient than the current terrestrial digital broadcasting. Details of the transmission will be described later.
  • the transmission wave assigned to the segment hierarchy can be received by, for example, the second tuner/demodulator 130T of the broadcast receiving apparatus 100.
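The 1 + 7 + 5 segment split described above can be sketched as a small table. This is an illustrative sketch only; the layer names and the service strings follow the description of FIG. 4B(1) in the text, and `check_allocation` is a hypothetical helper, not part of any standard.

```python
# Illustrative sketch of the Fig. 4B(1)-style hierarchy: the 13 OFDM segments
# of one channel split into layers A (1 segment), B (7), and C (5).
LAYER_SEGMENTS = {"A": 1, "B": 7, "C": 5}

LAYER_SERVICES = {
    "A": "current mobile reception service (one segment)",
    "B": "current service, video up to 1920 x 1080",
    "C": "advanced service, video exceeding 1920 x 1080",
}

def check_allocation(layer_segments, total_segments=13):
    """Verify a layer allocation uses exactly the 13 segments of one channel."""
    return sum(layer_segments.values()) == total_segments
```

The same helper checks the alternative 1 + 8 + 4 and 1 + 5 + 7 allocations mentioned below, since all of them must sum to the 13 segments of a physical channel.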
  • One segment of the A layer may transmit the current mobile reception service of terrestrial digital broadcasting.
  • Eight segments of the B layer may transmit the current terrestrial digital broadcasting service, which transmits broadcast video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels.
  • Four segments of the C layer may transmit an advanced terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal pixels x 1080 vertical pixels.
  • the C layer uses a carrier modulation method, an error correction code method, a video coding method, etc. that are more efficient than the current terrestrial digital broadcasting. Details of the transmission will be described later.
  • the transmission wave assigned to the segment hierarchy can be received by, for example, the second tuner/demodulator 130T of the broadcast receiving apparatus 100.
  • The hierarchy allocation shown in FIG. 4B(2) can be used in the dual-polarization terrestrial digital broadcasting according to this embodiment as an example different from that shown in FIG. 4B(1); the same segment hierarchy assignment may be used for both horizontal and vertical polarization. Specifically, it is sufficient to transmit the current mobile reception service of digital terrestrial broadcasting in the above-mentioned one segment of horizontally polarized waves as layer A. (The same service may also be transmitted using the above-mentioned one segment of vertically polarized waves; in this case, it is also treated as layer A.) Furthermore, as layer B, an advanced terrestrial digital broadcasting service capable of transmitting video with a maximum resolution exceeding 1920 horizontal pixels x 1080 vertical pixels may be transmitted in the above-mentioned five segments of both horizontally and vertically polarized waves, ten segments in total.
  • As the C layer, it is sufficient to transmit the current terrestrial digital broadcasting service, which transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels, using the above-mentioned seven segments of horizontally polarized waves.
  • The same service may also be transmitted using the above-mentioned seven segments of vertically polarized waves; in this case, it is also treated as the C layer.
  • the details of this transmission will be described later.
  • the transmission wave assigned to the segment hierarchy can be received by, for example, the second tuner/demodulator 130T of the broadcast receiving apparatus 100 of this embodiment.
  • the hierarchy assignment in FIG. 4B(2) can be used as an example different from FIG. 4B(1) in the single-polarized digital terrestrial broadcasting according to this embodiment.
  • the current mobile reception service of digital terrestrial broadcasting may be transmitted in the above-mentioned one segment as layer A.
  • the B layer may be configured to transmit an advanced terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal pixels x 1080 vertical pixels in the five segments.
  • the B layer uses a carrier modulation method, an error correction code method, a video coding method, etc. that are more efficient than the current terrestrial digital broadcasting.
  • the transmitted wave assigned to the segment hierarchy can be received by, for example, the second tuner/demodulator 130T of the broadcast receiving apparatus 100 of this embodiment.
  • the layer allocation shown in FIG. 4B (3) can be used in the layer division multiplexing terrestrial digital broadcasting according to this embodiment and the current terrestrial digital broadcasting.
  • As layer B, the twelve segments shown in the figure may be configured to transmit either an advanced terrestrial digital broadcasting service capable of transmitting video with a maximum resolution exceeding 1920 horizontal pixels x 1080 vertical pixels, or the current terrestrial digital broadcasting service that transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels.
  • the transmitted wave assigned to the segment hierarchy can be received by, for example, the third tuner/demodulator 130L of the broadcast receiving apparatus 100 of this embodiment.
  • When used in current terrestrial digital broadcasting, it is sufficient to transmit the current mobile reception service of terrestrial digital broadcasting in the one segment shown in the figure as the A layer, and to transmit the terrestrial digital broadcasting service that transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels in the twelve segments shown in the figure as the B layer.
  • the transmitted wave assigned to the segment hierarchy can be received by, for example, the first tuner/demodulator 130C of the broadcast receiving apparatus 100 of this embodiment.
  • FIG. 4C shows a system on the broadcasting station side that realizes generation processing of OFDM transmission waves, which are digital broadcast waves for dual-polarized terrestrial digital broadcasting, single-polarized terrestrial digital broadcasting, and hierarchical division multiplexed terrestrial digital broadcasting according to this embodiment.
  • the information source encoding unit 411 encodes video/audio/various data, etc., respectively.
  • The multiplexing unit/conditional reception processing unit 415 multiplexes the video/audio/various data encoded by the information source encoding unit 411, performs appropriate processing for conditional access, and outputs the result as a packet stream.
  • a plurality of information source encoding units 411 and multiplexing units/conditional access processing units 415 can exist in parallel, and generate a plurality of packet streams.
  • the transmission path encoding unit 416 re-multiplexes the plurality of packet streams into one packet stream, performs transmission path encoding processing, and outputs it as an OFDM transmission wave.
  • the configuration shown in FIG. 4C is the same as the ISDB-T system as a configuration for realizing OFDM transmission wave generation processing, although the details of the information source encoding and transmission path encoding methods are different.
  • some of the plurality of information source encoding units 411 and multiplexing units/conditional access processing units 415 are configured for ISDB-T digital terrestrial broadcasting services, and some are configured for advanced digital terrestrial broadcasting services.
  • the transmission path encoding unit 416 may multiplex packet streams of a plurality of different digital terrestrial broadcasting services.
  • When the multiplexing unit/conditional access processing unit 415 is configured for the current ISDB-T digital terrestrial broadcasting service, it may generate an MPEG-2 TS, which is a TSP (Transport Stream Packet) stream defined by MPEG-2 Systems.
  • When the multiplexing unit/conditional access processing unit 415 is configured for advanced terrestrial digital broadcasting services, it may generate an MMT packet stream, a TLV stream containing MMT packets, or a TSP stream specified by another system.
  • Alternatively, all of the multiple information source encoding units 411 and multiplexing units/conditional access processing units 415 may be configured for advanced terrestrial digital broadcasting services, so that all packet streams multiplexed by the transmission path encoding unit 416 are packet streams for advanced digital terrestrial broadcasting services.
  • FIG. 4D shows an example of the configuration of the transmission path encoding section 416.
  • FIG. 4D (1) shows the configuration of the transmission path encoding unit 416 when generating only OFDM transmission waves for digital broadcasting of the current digital terrestrial broadcasting service.
  • the OFDM transmission wave transmitted with this configuration has, for example, the segment configuration shown in FIG. 4B (3).
  • The packet stream input from the multiplexing unit/conditional access processing unit 415 and subjected to re-multiplexing processing has redundancy for error correction added, and various interleaving processes such as byte interleaving, bit interleaving, time interleaving, and frequency interleaving are performed.
  • the signal is processed by IFFT (Inverse Fast Fourier Transform) together with the pilot signal, TMCC signal, and AC signal, and after a guard interval is added, it becomes an OFDM transmission wave through orthogonal modulation.
  • outer code processing, power spreading processing, byte interleaving, inner code processing, bit interleaving processing, and mapping processing are configured so that they can be processed separately for each layer such as the A layer and the B layer.
  • Figure 4D (1) shows an example of three layers.
  • The mapping process is the modulation process carried out on the carriers.
  • the packet stream inputted from the multiplexing unit/conditional access processing unit 415 may be multiplexed with information such as TMCC information, mode, guard interval ratio, and the like.
  • the packet stream input to the transmission path encoding unit 416 may be a TSP stream defined by MPEG-2 Systems, as described above.
  • the OFDM transmission wave generated with the configuration of FIG. 4D(1) can be received by, for example, the first tuner/demodulator 130C of the broadcast receiving apparatus 100 of this embodiment.
  • FIG. 4D (2) shows the configuration of the transmission path encoding unit 416 when generating OFDM transmission waves for dual-polarization terrestrial digital broadcasting according to this embodiment.
  • the OFDM transmission wave transmitted with this configuration has, for example, the segment configuration shown in FIG. 4B (1) or (2).
  • The packet stream input from the multiplexing unit/conditional access processing unit 415 and subjected to re-multiplexing processing has redundancy for error correction added, and various interleaving processes such as byte interleaving, bit interleaving, time interleaving, and frequency interleaving are performed. Thereafter, it is processed by IFFT together with the pilot signal, TMCC signal, and AC signal, subjected to guard interval addition processing, and then orthogonally modulated to become an OFDM transmission wave.
  • Outer code processing, power spreading processing, byte interleaving, inner code processing, bit interleaving processing, mapping processing, and time interleaving are configured so that they can be processed separately for each layer, such as layer A, layer B, and layer C.
  • In FIG. 4D(2), not only a horizontally polarized (H) OFDM transmission wave but also a vertically polarized (V) OFDM transmission wave is generated, so the processing flow branches into two systems.
  • For the outer code, inner code, mapping, and similar processing shown in the configuration of FIG. 4D(2), more advanced processing not adopted in FIG. 4D(1) can be used in addition to processing compatible with the configuration of FIG. 4D(1).
  • In the part where processing is performed for each layer, the layers that transmit the current mobile reception service of terrestrial digital broadcasting and video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels perform outer code, inner code, mapping, and similar processing compatible with the configuration of FIG. 4D(1).
  • The layer that transmits the advanced terrestrial digital broadcasting service, which is capable of transmitting video with a maximum resolution exceeding 1920 horizontal pixels x 1080 vertical pixels, may be configured to use more advanced outer code, inner code, mapping, and similar processing not adopted in the configuration of FIG. 4D(1).
  • the allocation of the hierarchy and the transmitted terrestrial digital broadcasting service can be switched using the TMCC information described later. It is desirable that processing such as coding and mapping be configured to be switchable using TMCC information.
  • For byte interleaving, bit interleaving, and time interleaving, processing compatible with the current terrestrial digital broadcasting service may be performed, or different, more advanced processing may be performed. Alternatively, for layers that transmit advanced digital terrestrial broadcasting services, some interleaving may be omitted.
  • Among the packet streams input to the transmission path encoding unit 416, the source input stream may be a TSP stream defined by MPEG-2 Systems, which is currently used in digital terrestrial broadcasting.
  • The input stream that is the source of the layer that transmits the advanced terrestrial digital broadcasting service in the configuration of FIG. 4D(2) may be a stream defined by a system other than MPEG-2 Systems, such as an MMT packet stream or a TLV stream containing MMT packets, or a TSP stream defined by MPEG-2 Systems may be adopted.
  • In the layers that transmit the current mobile reception service of digital terrestrial broadcasting and video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels, stream formats and processing compatible with current digital terrestrial broadcasting are maintained.
  • As a result, an existing receiving device for the current digital terrestrial broadcasting service can correctly receive and demodulate the broadcast signals of that service from either the horizontally polarized or the vertically polarized OFDM transmission wave generated with the configuration shown in FIG. 4D(2).
  • At the same time, it is possible to transmit an advanced digital terrestrial broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal pixels x 1080 vertical pixels, and the broadcast signal of this advanced service can be received and demodulated by the broadcast receiving apparatus 100 according to the embodiment of the present invention.
  • In other words, it is possible to generate digital broadcast waves that both a broadcast receiving device compatible with advanced terrestrial digital broadcasting services and an existing receiving device for current terrestrial digital broadcasting services can favorably receive and demodulate.
  • When generating OFDM transmission waves for single-polarized terrestrial digital broadcasting according to this embodiment, the transmission path encoding unit 416 of FIG. 4D(2) may consist of only one of the system for generating horizontally polarized (H) OFDM transmission waves and the system for generating vertically polarized (V) OFDM transmission waves.
  • The OFDM transmission wave transmitted with this configuration has, for example, the segment configuration shown in FIG. 4B(1) or (2); unlike the dual-polarization case, only one of the horizontally polarized and vertically polarized OFDM transmission waves is transmitted.
  • Other configurations, operations, etc. are the same as in the case of generating OFDM transmission waves for dual-polarization terrestrial digital broadcasting described above.
  • FIG. 4D (3) shows the configuration of the transmission path encoding unit 416 when generating OFDM transmission waves for hierarchical division multiplexing digital terrestrial broadcasting according to this embodiment.
  • the packet stream input from the multiplexing unit/conditional access processing unit 415 and subjected to re-multiplexing processing is added with redundancy for error correction, byte interleaving, bit interleaving, time Various interleaving processes such as interleaving and frequency interleaving are performed.
  • Thereafter, the stream is processed by IFFT together with the pilot signal, TMCC signal, and AC signal, and after a guard interval is added, it undergoes orthogonal modulation to become an OFDM transmission wave.
  • a modulated wave transmitted in the upper layer and a modulated wave transmitted in the lower layer are respectively generated, and after multiplexing, an OFDM transmission wave that is a digital broadcast wave is generated.
  • The processing system shown in the upper part of the configuration of FIG. 4D(3) generates the modulated wave transmitted in the upper layer, and the processing system shown in the lower part generates the modulated wave transmitted in the lower layer.
  • The data transmitted by the processing system for the upper layer of FIG. 4D(3) is the current mobile reception service of terrestrial digital broadcasting and the current service that transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels.
  • the modulated wave transmitted in the upper layer of FIG. 4D(3) has, for example, the segment configuration of FIG. 4B(3) similarly to the transmitted wave of FIG. 4D(1). Therefore, the modulated waves transmitted in the upper layer of FIG. 4D (3) are used for the current mobile reception service of digital terrestrial broadcasting and for the current digital terrestrial broadcasting service that transmits video with a maximum resolution of 1920 pixels horizontally x 1080 pixels vertically.
  • The modulated wave transmitted in the lower layer of FIG. 4D(3) may, for example, be allocated entirely, with all 13 segments as the A layer, to an advanced terrestrial digital broadcasting service capable of transmitting video with a maximum resolution exceeding 1920 horizontal pixels x 1080 vertical pixels. Alternatively, with the segment configuration of FIG. 4B(3), the current mobile reception service of digital terrestrial broadcasting may be transmitted in the one-segment A layer, and an advanced terrestrial digital broadcasting service capable of transmitting video with a maximum resolution exceeding 1920 horizontal pixels x 1080 vertical pixels may be transmitted in the twelve-segment B layer. In the latter case, as in FIG. 4D(2), the configuration may be such that the processing from outer code processing to time interleaving processing can be switched for each layer, such as the A layer and the B layer. In the layer that transmits the current mobile reception service of digital terrestrial broadcasting, processing compatible with current digital terrestrial broadcasting must be maintained, as explained for FIG. 4D(2).
  • Since the technology for separating the modulated wave transmitted in the upper layer from the OFDM transmission wave, which is a terrestrial digital broadcast wave, is also installed in the existing receiving equipment of the current digital terrestrial broadcasting service, the broadcast signals of the current mobile reception service of terrestrial digital broadcasting and of the current service that transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels are correctly received and demodulated by that existing receiving equipment.
  • Furthermore, the broadcast signals of the advanced digital terrestrial broadcasting service, which can transmit video with a maximum resolution exceeding 1920 horizontal pixels x 1080 vertical pixels and are included in the modulated wave transmitted in the lower layer, can be received and demodulated by the broadcast receiving apparatus 100 according to the embodiment of the present invention.
  • In other words, it is possible to generate digital broadcast waves that both a broadcast receiving device compatible with advanced terrestrial digital broadcasting services and an existing receiving device for current terrestrial digital broadcasting services can suitably receive and demodulate.
  • Moreover, with the configuration of FIG. 4D(3), unlike that of FIG. 4D(2), there is no need to use a plurality of polarized waves, so it is possible to generate an OFDM transmission wave that can be received more easily.
  • In the OFDM transmission wave generation processing of FIGS. 4D(1), 4D(2), and 4D(3) of this embodiment, three types of modes with different numbers of carriers are prepared in consideration of the distance between SFN stations and resistance to Doppler shift in mobile reception. Another mode with a different number of carriers may be further prepared.
  • In a mode with a large number of carriers, the effective symbol length becomes longer, and with the same guard interval ratio (guard interval length / effective symbol length) the guard interval length becomes longer, providing resistance to multipath with long delay time differences.
  • Conversely, in a mode with a small number of carriers, the carrier spacing becomes wide, making the system less susceptible to inter-carrier interference caused by Doppler shift in mobile reception and the like.
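The trade-off between carrier count and guard interval length can be made concrete with a small calculation. The mode-1 carrier spacing below is an assumed ISDB-T-like value for a 6 MHz channel, not a figure stated in this description; the halving-per-mode relationship is what the text describes.

```python
# Illustrative OFDM timing: doubling the carrier count per mode step halves the
# carrier spacing, doubling the effective symbol length -- so the same guard
# interval *ratio* yields a longer absolute guard interval.
MODE1_CARRIER_SPACING_HZ = 250_000 / 63  # assumed ISDB-T-like value (~3.968 kHz)

def ofdm_timing(mode, guard_ratio):
    spacing = MODE1_CARRIER_SPACING_HZ / 2 ** (mode - 1)  # halves each mode step
    t_effective = 1.0 / spacing                # effective symbol length (s)
    t_guard = t_effective * guard_ratio        # guard interval length (s)
    return t_effective, t_guard

# Mode 3 with guard ratio 1/4: 1.008 ms effective symbol, 252 us guard interval.
t_eff, t_g = ofdm_timing(3, 1 / 4)
```

Under these assumptions, mode 3 tolerates four times the multipath delay spread of mode 1 at the same guard ratio, while mode 1 keeps four times the carrier spacing and is thus less affected by Doppler spread.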
  • FIG. 4E shows an example of transmission parameters for each segment of OFDM segments identified by the mode of the system according to the present embodiment.
  • the carrier modulation method in the figure refers to the modulation method of the "data" carrier.
  • the SP signal, CP signal, TMCC signal, and AC signal employ a different modulation method than that of the "data" carrier.
  • These signals prioritize noise immunity over information capacity, so a modulation method that maps onto a constellation with few states (BPSK or DBPSK, i.e., two states) is adopted to improve resistance to noise.
  • In the figure, the value on the left side of the diagonal line applies when QPSK, 16QAM, 64QAM, or the like is set as the carrier modulation method, and the value on the right side applies when DQPSK is set.
  • the underlined parameters are incompatible with the current mobile reception service of digital terrestrial broadcasting.
  • For example, "data" carrier modulation methods such as 256QAM, 1024QAM, and 4096QAM are not adopted in current digital terrestrial broadcasting services. Therefore, in the processing of the layers that require compatibility with the current digital terrestrial broadcasting service in the OFDM broadcast wave generation processing of FIGS. 4D(1), 4D(2), and 4D(3), 256QAM, 1024QAM, and 4096QAM are not used.
  • QPSK (4 states) and 16QAM (16 states), for example, are compatible with current digital terrestrial broadcasting services.
  • For data transmitted in layers corresponding to advanced terrestrial digital broadcasting services, further multi-level modulation methods such as 256QAM (256 states), 1024QAM (1024 states), and 4096QAM (4096 states) may be applied.
  • Alternatively, a modulation method different from these modulation methods may be employed.
  • BPSK (number of states: 2) is used for pilot symbols such as SP and CP, and DBPSK (number of states: 2) is likewise used for such control signals.
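The relationship between the number of constellation states and bits per carrier symbol is simply a base-2 logarithm; the sketch below restates the methods listed above (illustrative only, not normative parameters).

```python
from math import log2

# Number of constellation states for each carrier modulation method mentioned
# above; the bits carried per carrier symbol is log2(states).
MODULATION_STATES = {
    "BPSK": 2, "DBPSK": 2, "QPSK": 4, "DQPSK": 4,
    "16QAM": 16, "64QAM": 64,
    "256QAM": 256, "1024QAM": 1024, "4096QAM": 4096,  # advanced-service options
}

BITS_PER_SYMBOL = {name: int(log2(states))
                   for name, states in MODULATION_STATES.items()}
```

Moving from 64QAM (6 bits per symbol) to 4096QAM (12 bits per symbol) doubles the raw capacity per carrier, which is the motivation for the more efficient carrier modulation mentioned for the advanced-service layers.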
  • the LDPC code is not adopted in current digital terrestrial broadcasting services. Therefore, in the processing in the layer that requires compatibility with the current digital terrestrial broadcasting service in the OFDM broadcast wave generation processing according to FIG. 4D(1), FIG. 4D(2), and FIG. 4D(3) of this embodiment, LDPC code is not used.
  • An LDPC code may be applied as an inner code to data transmitted in a layer corresponding to advanced digital terrestrial broadcasting services.
  • the BCH code is not adopted in current digital terrestrial broadcasting services. Therefore, in the processing in the layer that requires compatibility with the current digital terrestrial broadcasting service in the OFDM broadcast wave generation processing according to FIG. 4D(1), FIG. 4D(2), and FIG. 4D(3) of this embodiment, BCH code is not used.
  • the BCH code may be applied as an outer code to data transmitted in a layer corresponding to advanced terrestrial digital broadcasting services.
  • FIG. 4F shows an example of the transmission signal parameters for each physical channel (6 MHz bandwidth) in the OFDM broadcast wave generation processing of FIGS. 4D(1), 4D(2), and 4D(3) of this embodiment.
  • the parameters shown in FIG. 4F are compatible with current digital terrestrial broadcasting services.
  • When all segments of the modulated wave transmitted in the lower layer of FIG. 4D(3) are assigned to advanced digital terrestrial broadcasting services, compatibility with current digital terrestrial broadcasting services need not be maintained in that modulated wave. Therefore, in this case, parameters other than those shown in FIG. 4F may be used for the modulated wave transmitted in the lower layer of FIG. 4D(3).
  • The carriers of the OFDM transmission wave according to this embodiment include carriers that transmit data such as video and audio, carriers that transmit the pilot signals (SP, CP, AC1, AC2) serving as demodulation references, and carriers that transmit the TMCC signal, which carries information such as the carrier modulation format and convolutional coding rate. A number of carriers corresponding to 1/9 of the carriers in each segment is used for these pilot and TMCC transmissions.
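Using the per-segment carrier counts implied by the carrier-number ranges cited later in this description (0-107, 0-215, 0-431 for modes 1-3), the 1/9 share reserved for pilots and TMCC works out as follows (an illustrative sketch):

```python
# Carriers per segment by mode, from the carrier-number ranges cited for
# Figs. 4G/4H (0-107, 0-215, 0-431 in modes 1-3 respectively).
SEGMENT_CARRIERS = {1: 108, 2: 216, 3: 432}

def control_carriers(mode):
    """Carriers per segment used for pilots/TMCC: 1/9 of the segment total."""
    return SEGMENT_CARRIERS[mode] // 9

def data_carriers(mode):
    """Remaining carriers available for data in one segment (sketch)."""
    return SEGMENT_CARRIERS[mode] - control_carriers(mode)
```

So in mode 3, 48 of the 432 carriers in each segment would carry pilot and TMCC signals, leaving 384 for data, with proportionally smaller numbers in modes 1 and 2.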
  • For error correction, a concatenated code is adopted: a shortened Reed-Solomon (204,188) code as the outer code, and a punctured convolutional code with a constraint length of 7 and a mother code rate of 1/2 as the inner code. Encoding different from the above may also be used for both the outer code and the inner code.
  • the information rate varies depending on parameters such as carrier modulation format, convolutional coding rate, and guard interval ratio.
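The dependence of the information rate on these parameters can be sketched for one mode-3 segment. The 384 data carriers per segment and the exact effective symbol length are assumed ISDB-T-like values, not figures stated in this passage.

```python
# Net information rate of one mode-3 segment after the concatenated code:
# RS(204,188) outer code and a punctured convolutional inner code.
T_EFFECTIVE_MODE3 = 63 * 4 / 250_000   # assumed: 1.008 ms effective symbol length
DATA_CARRIERS_MODE3 = 384              # assumed data carriers per mode-3 segment

def segment_info_rate(bits_per_symbol, conv_rate, guard_ratio):
    """Approximate net bit rate (bps) of one segment for the given parameters."""
    t_symbol = T_EFFECTIVE_MODE3 * (1 + guard_ratio)          # symbol incl. guard
    raw = DATA_CARRIERS_MODE3 * bits_per_symbol / t_symbol    # gross bit rate
    return raw * conv_rate * (188 / 204)                      # inner + outer code

# 13 segments, 64QAM (6 bits/symbol), inner rate 7/8, guard ratio 1/32:
total_bps = 13 * segment_info_rate(6, 7 / 8, 1 / 32)
```

With these assumptions the 13-segment total comes to roughly 23.2 Mbps, illustrating how carrier modulation, convolutional coding rate, and guard interval ratio jointly determine the information rate.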
  • 204 symbols constitute one frame, and one frame includes an integer number of TSPs. Transmission parameter switching is performed at this frame boundary.
  • Pilot signals that serve as demodulation standards include SP (Scattered Pilot), CP (Continual Pilot), AC (Auxiliary Channel) 1, and AC2.
  • FIG. 4G shows an example of how pilot signals and the like are arranged within a segment in the case of synchronous modulation (QPSK, 16QAM, 64QAM, 256QAM, 1024QAM, 4096QAM, etc.).
  • the SP is inserted into a synchronous modulation segment and is transmitted once every 12 carriers in the carrier number (frequency axis) direction and once every 4 symbols in the OFDM symbol number (time axis) direction. Since the amplitude and phase of SP are known, they can be used as a reference for synchronous demodulation.
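The SP lattice described above (every 12 carriers in frequency, every 4 symbols in time) can be sketched as follows. The 3-carrier shift per symbol is the standard ISDB-T-style arrangement and is an assumption here, not a figure stated in this passage.

```python
def sp_carriers(symbol_index, n_carriers=432):
    """Carriers holding an SP in one mode-3 synchronous-modulation segment.

    The pattern repeats every 4 symbols; within each symbol the SPs sit every
    12 carriers, offset by 3 carriers per symbol (assumed ISDB-T-like lattice),
    so every carrier position is visited once per 4 symbols.
    """
    offset = 3 * (symbol_index % 4)
    return [k for k in range(n_carriers) if k % 12 == offset]
```

Because the known amplitude and phase of each SP are revisited on every twelfth carrier each symbol, a receiver can interpolate the channel response over both frequency and time for synchronous demodulation.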
  • FIG. 4H shows an example of how pilot signals and the like are arranged within a segment in the case of differential modulation (DQPSK, etc.).
  • CP is a continuous signal inserted at the left end of a differential modulation segment and is used for demodulation.
  • AC1 and AC2 carry information on the CP, and in addition to playing the role of pilot signals, they are also used to transmit information for broadcasters. AC1 and AC2 may also be used to transmit other information.
  • FIGS. 4G and 4H are examples for mode 3, in which carrier numbers range from 0 to 431; in mode 1 and mode 2, carrier numbers range from 0 to 107 and from 0 to 215, respectively. Furthermore, the carriers for transmitting AC1, AC2, and TMCC may be determined in advance for each segment. Note that the carriers transmitting AC1, AC2, and TMCC are arranged randomly in the frequency direction in order to reduce the influence of periodic dips in transmission path characteristics caused by multipath.
  • the TMCC signal transmits information (TMCC information) related to the demodulation operation of the receiver, such as the layer configuration and the transmission parameters of the OFDM segment.
  • the TMCC signal is transmitted on a carrier for TMCC transmission defined within each segment.
  • FIG. 5A shows an example of bit allocation for TMCC carriers.
  • the TMCC carrier consists of 204 bits (B0 to B203).
  • B0 is the demodulation reference signal for the TMCC symbol and has predetermined amplitude and phase references.
  • B1 to B16 are synchronization signals, each consisting of a 16-bit word. Two types of synchronization signals, w0 and w1, are defined, and w0 and w1 are sent out alternately for each frame.
  • B17 to B19 are used to identify the segment type, and identify whether each segment is a differential modulation section or a synchronous modulation section.
  • TMCC information is written in B20 to B121.
  • B122 to B203 are parity bits.
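The B0-B203 layout above can be captured as a small field map; `slice_tmcc` is a hypothetical helper for illustration, not part of any standard API.

```python
# Field layout of the 204-bit TMCC carrier, as half-open bit ranges,
# mirroring the B0-B203 allocation described above.
TMCC_LAYOUT = {
    "demod_reference": (0, 1),     # B0: amplitude/phase reference
    "sync_word":       (1, 17),    # B1-B16: w0/w1, alternating each frame
    "segment_type":    (17, 20),   # B17-B19: differential vs synchronous
    "tmcc_info":       (20, 122),  # B20-B121: TMCC information
    "parity":          (122, 204), # B122-B203: parity bits
}

def slice_tmcc(bits):
    """Split a 204-bit TMCC word into its fields (hypothetical helper)."""
    assert len(bits) == 204, "TMCC carrier is 204 bits (B0-B203)"
    return {name: bits[lo:hi] for name, (lo, hi) in TMCC_LAYOUT.items()}
```

Slicing this way yields a 16-bit synchronization word, 102 bits of TMCC information, and 82 parity bits, matching the bit counts given above.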
  • the TMCC information of the OFDM transmission wave includes, for example, system identification, transmission parameter switching index, activation control signal (startup flag for emergency warning broadcasting), current information, next information, frequency conversion process identification, It may be configured to include information to assist demodulation and decoding operations of the receiver, such as physical channel number identification, main signal identification, 4K signal transmission layer identification, and additional layer transmission identification.
  • Current information indicates the current hierarchical configuration and transmission parameters, while next information indicates the hierarchical configuration and transmission parameters after switching. Transmission parameter switching is performed on a frame-by-frame basis.
  • FIG. 5B shows an example of bit allocation of TMCC information.
  • FIG. 5C shows an example of the configuration of transmission parameter information included in the current information/next information.
  • The connected transmission phase correction amount is control information used for terrestrial digital audio broadcasting ISDB-TSB (ISDB for Terrestrial Sound Broadcasting) and the like, which share a common transmission method. A detailed explanation of the connected transmission phase correction amount is omitted here.
  • FIG. 5D shows an example of bit allocation for system identification. Two bits are allocated to the system identification signal.
  • Depending on the broadcasting system to be identified, "00", "01", or "10" is set.
  • The advanced digital terrestrial television broadcasting system can simultaneously transmit, within the same service, 2K broadcast programs (video of 1920 horizontal pixels x 1080 vertical pixels) and 4K broadcast programs (broadcast programs with video exceeding 1920 horizontal pixels x 1080 vertical pixels, not limited to those with video exceeding 3840 horizontal pixels x 2160 vertical pixels).
  • The transmission parameter switching index is used to notify the receiver of the switching timing by counting down when transmission parameters are switched. This index normally has the value "1111"; when transmission parameters are switched, it is decremented by 1 for each frame, starting 15 frames before the switch. The switching timing is the frame synchronization following the frame in which "0000" is sent. After "0000", the index returns to "1111". A countdown is performed when switching any one or more of the parameters included in the current information/next information shown in FIG. 5B, such as the system identification, transmission parameter information, frequency conversion processing identification, main signal identification, 4K signal transmission layer identification, and additional layer transmission identification of the TMCC information. When only the activation control signal of the TMCC information is switched, no countdown is performed.
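The countdown behavior described above can be modeled as a small sketch. This Python example is illustrative only (it is not part of the patent); the function name and framing are assumptions.

```python
# Hypothetical sketch of the TMCC transmission parameter switching index.
# The 4-bit index idles at "1111" (15); during a switch it counts down once
# per frame, and the new parameters take effect at the frame synchronization
# following the frame in which "0000" is sent.

IDLE = 0b1111

def frames_until_switch(index):
    """Return the number of frames until the new parameters apply,
    or None while the index idles at "1111"."""
    if not 0 <= index <= 15:
        raise ValueError("switching index is a 4-bit field")
    if index == IDLE:
        return None  # no parameter switch announced
    return index + 1  # "0000" -> effective at the next frame synchronization

print(frames_until_switch(14))  # → 15 (15 frames before the switch)
```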
  • The activation control signal (activation flag for emergency warning broadcasting) is set to "1" when activation control of receivers is being performed during emergency warning broadcasting, and is set to "0" when activation control is not being performed.
  • the partial reception flag for each current information/next information is set to "1" when the segment at the center of the transmission band is set for partial reception, and to "0" otherwise.
  • When segment 0 is configured for partial reception, its layer is defined as layer A. If next information does not exist, the partial reception flag is set to "1".
  • FIG. 5E shows an example of bit allocation for the carrier modulation mapping method (data carrier modulation method) in each layer transmission parameter for each current information/next information.
  • When this parameter is "000", it indicates that the modulation method is DQPSK; "001" indicates QPSK; "010" indicates 16QAM; "011" indicates 64QAM; "100" indicates 256QAM; "101" indicates 1024QAM; and "110" indicates 4096QAM. For an unused layer, or when no next information exists, "111" is set in this parameter.
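The mapping just listed can be expressed as a small lookup table. This Python sketch is illustrative only; the treatment of "111" follows the unused-layer/no-next-information convention stated above.

```python
# Sketch of decoding the 3-bit carrier modulation mapping field of FIG. 5E,
# transcribed from the allocation described in the text.
CARRIER_MODULATION = {
    0b000: "DQPSK",
    0b001: "QPSK",
    0b010: "16QAM",
    0b011: "64QAM",
    0b100: "256QAM",
    0b101: "1024QAM",
    0b110: "4096QAM",
    0b111: "unused layer / no next information",
}

def decode_modulation(bits):
    """Decode the 3-bit data carrier modulation field."""
    return CARRIER_MODULATION[bits & 0b111]

print(decode_modulation(0b100))  # → 256QAM
```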
  • each parameter may be set according to the organization information of each layer for each current information/next information.
  • The number of segments indicates the number of segments in each layer using a 4-bit value. For an unused layer, or when no next information exists, "1111" is set. Note that settings such as the mode and guard interval ratio are independently detected on the receiver side and therefore do not need to be transmitted using the TMCC information.
  • FIG. 5F shows an example of bit allocation for frequency conversion processing identification.
  • Frequency conversion processing identification indicates whether frequency conversion processing (in the case of the dual-polarization transmission method) or frequency conversion amplification processing (in the case of the layer division multiplexing transmission method), described later, has been performed in the conversion unit 201T or conversion unit 201L in FIG. 2A. If such processing has been performed, "0" is set; if it has not, "1" is set.
  • This parameter is set to "1" when transmitted from the broadcasting station, and may be rewritten to "0" by the conversion unit 201T or conversion unit 201L when that unit executes frequency conversion processing or frequency conversion amplification processing. In this way, when the second tuner/demodulator 130T or the third tuner/demodulator 130L of the broadcast receiving device 100 receives the frequency conversion processing identification bit as "0", it can identify that frequency conversion processing or the like was performed after the OFDM transmission wave was sent out from the broadcasting station.
  • The frequency conversion processing identification bit may be set or rewritten for each of a plurality of polarized waves. For example, if neither of the plurality of polarized waves is frequency-converted by the conversion unit 201T in FIG. 2A, the frequency conversion processing identification bits included in both OFDM transmission waves may be left as "1". If only one of the plurality of polarized waves is frequency-converted by the conversion unit 201T, the frequency conversion processing identification bit included in the OFDM transmission wave of the frequency-converted polarized wave is rewritten to "0" in the conversion unit 201T. If both of the plurality of polarized waves are frequency-converted, the frequency conversion processing identification bits included in the frequency-converted OFDM transmission waves of both polarized waves are rewritten to "0" in the conversion unit 201T. In this way, the broadcast receiving apparatus 100 can identify, for each of the plurality of polarized waves, whether or not frequency conversion has been performed.
  • The frequency conversion processing identification bit is not defined in current digital terrestrial broadcasting, so it is ignored by digital terrestrial broadcasting receiving devices already in use by users.
  • These bits may also be introduced into a new terrestrial digital broadcasting service, improved from the current terrestrial digital broadcasting, that transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels. In that case, the first tuner/demodulator 130C of the broadcast receiving apparatus 100 may be configured as a first tuner/demodulator compatible with the new terrestrial digital broadcasting service. Also, when the conversion unit 201T or conversion unit 201L in FIG. 2A performs frequency conversion processing or frequency conversion amplification processing on the OFDM transmission wave, this bit may be set to "0" in advance. Note that if the received broadcast wave is not an advanced terrestrial digital broadcasting service, this parameter may be configured to be set to "1".
  • FIG. 5G shows an example of bit allocation for physical channel number identification.
  • the physical channel number identification consists of a 6-bit code, and identifies the physical channel number (13 to 52 ch) of the received broadcast wave. If the received broadcast wave is not an advanced terrestrial digital broadcasting service, this parameter is set to "111111".
  • The physical channel number identification bit is not defined in current digital terrestrial broadcasting, and current digital terrestrial broadcasting receiving devices could not acquire, from the TMCC signal, AC signal, or the like, the physical channel number specified by the broadcasting station for a broadcast wave.
  • By using the physical channel number identification bit of a received OFDM transmission wave, the broadcast receiving apparatus 100 can grasp the physical channel number set for that OFDM transmission wave by the broadcasting station.
  • Physical channels 13ch to 52ch each have a bandwidth of 6 MHz and are assigned in advance to the frequency band of 470 to 710 MHz. Therefore, the fact that the broadcast receiving device 100 can grasp the physical channel number of an OFDM transmission wave from the physical channel number identification bit means that it can also grasp the frequency band in which that OFDM transmission wave was transmitted over the air as a terrestrial digital broadcast wave.
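Since channels 13ch to 52ch occupy 6 MHz each starting at 470 MHz, the band edges follow directly by arithmetic. A minimal Python sketch (illustrative only; the function name and the use of the "111111" sentinel are assumptions based on the text):

```python
# Sketch: mapping the 6-bit physical channel number identification to a UHF
# frequency band. Channels 13-52 occupy 6 MHz each within 470-710 MHz.

UNDEFINED = 0b111111  # per the text: not an advanced terrestrial service

def channel_band_mhz(channel):
    """Return the (lower, upper) band edge in MHz for physical channels 13-52."""
    if not 13 <= channel <= 52:
        raise ValueError("physical channel number must be 13-52")
    lower = 470 + (channel - 13) * 6
    return (lower, lower + 6)

print(channel_band_mhz(13))  # → (470, 476)
print(channel_band_mhz(52))  # → (704, 710)
```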
  • In the dual-polarization transmission method, it suffices to arrange the physical channel number identification bits in each of the pair of polarized waves that originally constitute one physical channel within the bandwidth, and to give them the same physical channel number. For example, the converter 201T in FIG. 2A may convert the frequency of only one of the plurality of polarized waves. In that case, even if the frequencies of the plurality of polarized waves received by the broadcast receiving apparatus 100 differ from each other, the polarized waves with different frequencies can still be recognized as having originally been a pair. Without such recognition, the broadcast receiving device would be unable to demodulate advanced digital terrestrial broadcasting using both polarizations of dual-polarization terrestrial digital broadcasting. That is, when a plurality of received transmission waves exhibit the same physical channel number identification value, the broadcast receiving device can identify them as transmission waves that the broadcasting station transmitted as a polarized-wave pair constituting one physical channel. This makes it possible to realize demodulation of advanced dual-polarization terrestrial digital broadcasting using the plurality of transmission waves exhibiting the same value.
  • FIG. 5H shows an example of bit allocation for main signal identification.
  • the main signal identification bit is placed in bit B117.
  • When the OFDM transmission wave to be transmitted is a transmission wave of dual-polarization terrestrial digital broadcasting, this parameter is set to "1" in the TMCC information of the transmission wave transmitted with the main polarization, and to "0" in the TMCC information of the transmission wave transmitted with the secondary polarization. A transmission wave transmitted with the main polarization refers to the vertically polarized signal or horizontally polarized signal that is polarized in the same direction as the polarization direction used for transmission of the current digital terrestrial broadcasting service.
  • That is, when horizontal polarization is used for the current digital terrestrial broadcasting service, horizontal polarization is the main polarization and vertical polarization is the secondary polarization in the dual-polarization terrestrial digital broadcasting service; when vertical polarization is used, vertical polarization is the main polarization and horizontal polarization is the secondary polarization.
  • In the broadcast receiving device 100 that receives the transmission waves of the dual-polarization terrestrial digital broadcasting according to the embodiment of the present invention, by using the main signal identification bit, it is possible to identify whether a received transmission wave was transmitted with the main polarization or with the secondary polarization at the time of transmission. For example, using this identification of main and secondary polarization, during the initial scan described later, the transmission waves transmitted with the main polarization can be scanned first, and after the initial scan of those transmission waves is completed, an initial scan of the transmission waves transmitted with the secondary polarization can be performed.
  • In this way, the initial scan of the advanced digital terrestrial broadcasting service can be performed after the initial scan of the current digital terrestrial broadcasting service is completed, which is suitable because the settings made by the initial scan of the current digital terrestrial broadcasting service can be reflected in the settings based on the initial scan of the advanced broadcasting service.
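The scan ordering described above can be sketched as follows; the record layout and function names are hypothetical illustrations, not structures from the patent.

```python
# Sketch of ordering an initial scan by the main signal identification bit:
# waves marked "1" (main polarization) are scanned before waves marked "0"
# (secondary polarization), preserving channel order within each group.
from dataclasses import dataclass

@dataclass
class TransmissionWave:
    channel: int
    main_signal_bit: int  # 1 = main polarization, 0 = secondary polarization

def initial_scan_order(waves):
    # Negating the bit sorts main-polarization waves first.
    return sorted(waves, key=lambda w: (-w.main_signal_bit, w.channel))

waves = [TransmissionWave(27, 0), TransmissionWave(27, 1), TransmissionWave(13, 0)]
print([(w.channel, w.main_signal_bit) for w in initial_scan_order(waves)])
# → [(27, 1), (13, 0), (27, 0)]
```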
  • the meanings of the main signal identification bits "1" and "0" may be defined in the opposite way to the above explanation.
  • Instead of the main signal identification bit described above, a polarization direction identification bit may be used as one parameter of the TMCC information. Specifically, the broadcasting station sets the polarization direction identification bit to "1" for transmission waves transmitted with horizontal polarization, and to "0" for transmission waves transmitted with vertical polarization. In the broadcast receiving apparatus 100 that receives the transmission waves of the dual-polarization terrestrial digital broadcasting according to the embodiment of the present invention, by using the polarization direction identification bit, it is possible to identify with which polarization direction a received transmission wave was transmitted.
  • Alternatively, a first signal/second signal identification bit may be used as one parameter of the TMCC information. Specifically, one of the horizontally polarized wave and the vertically polarized wave is defined as the first polarized wave, and the broadcast signal of the transmission wave transmitted with the first polarized wave is defined as the first signal; for this signal, the broadcasting station sets the first signal/second signal identification bit to "1". The other polarized wave is defined as the second polarized wave, and the broadcast signal of the transmission wave transmitted with the second polarized wave is defined as the second signal; for this signal, the broadcasting station sets the first signal/second signal identification bit to "0".
  • In the broadcast receiving apparatus 100 that receives the transmission waves of the dual-polarization terrestrial digital broadcasting according to the embodiment of the present invention, by using the first signal/second signal identification bit, it is possible to identify with which polarized wave a received transmission wave was transmitted.
  • The first signal/second signal identification bit differs from the concepts of "main polarization" and "secondary polarization" in the definition of the main signal identification bit described above. However, the processing and effects in the broadcast receiving apparatus 100 are the same if, in the parts of the explanation of the main signal identification bit related to the processing of the broadcast receiving apparatus 100, "main polarization" is read as "first polarized wave" and "secondary polarization" as "second polarized wave", so that explanation is not repeated here.
  • Note that main signal identification, polarization direction identification, and first signal/second signal identification are unnecessary when the broadcast wave is the single-polarization digital terrestrial broadcasting service according to this embodiment or is not an advanced digital terrestrial broadcasting service; in those cases this parameter may be set to "1".
  • the upper and lower layer identification bits may be used as one parameter of the TMCC information instead of the above-mentioned main signal identification bits.
  • For example, the above-mentioned upper/lower layer identification bit is set to "1" for modulated waves transmitted in the upper layer, and to "0" for modulated waves transmitted in the lower layer. If the broadcast wave is not an advanced terrestrial digital broadcasting service, this parameter may be set to "1".
  • When the broadcast receiving apparatus 100 receives transmission waves of layer-division-multiplexed terrestrial digital broadcasting, it can use the above-mentioned upper/lower layer identification bit to identify whether a modulated wave was originally transmitted in the upper layer or in the lower layer.
  • In this way, the initial scan of the advanced digital terrestrial broadcasting service transmitted in the lower layer can be performed after the initial scan of the current digital terrestrial broadcasting service transmitted in the upper layer is completed, and the settings made by the initial scan of the current digital terrestrial broadcasting service can be reflected in the settings made by the initial scan of the advanced digital terrestrial broadcasting service.
  • In the third tuner/demodulator 130L of the broadcast receiving apparatus 100, the identification result can also be used to switch the processing between the demodulator 133S and the demodulator 133L.
  • FIG. 5I shows an example of bit allocation for 4K signal transmission layer identification.
  • When the broadcast wave to be transmitted is the dual-polarization terrestrial digital broadcasting service of this embodiment, the 4K signal transmission layer identification bits need only indicate, for each of the B layer and the C layer, whether or not a 4K broadcast program is transmitted using both the horizontally polarized signal and the vertically polarized signal. One bit is assigned to each of the B layer and C layer settings. For example, if the 4K signal transmission layer identification bit for a layer is "0", it indicates that a 4K broadcast program is transmitted using both the horizontally polarized signal and the vertically polarized signal in that layer.
  • If the 4K signal transmission layer identification bit for a layer is "1", it indicates that no 4K broadcast program using both the horizontally polarized signal and the vertically polarized signal is transmitted in that layer. In this way, the broadcast receiving apparatus 100 can use the 4K signal transmission layer identification bits to identify, for each of the B layer and the C layer, whether or not a 4K broadcast program is transmitted using both the horizontally polarized signal and the vertically polarized signal.
  • Alternatively, the 4K signal transmission layer identification bits need only indicate, for each of the B layer and the C layer, whether or not a 4K broadcast program is transmitted in that layer. One bit is assigned to each of the B layer and C layer settings. For example, if the 4K signal transmission layer identification bit for a layer is "0", it may indicate that a 4K broadcast program is transmitted in that layer; if the bit is "1", it may indicate that a 4K broadcast program is not transmitted in that layer. In this way, the broadcast receiving apparatus 100 can use the 4K signal transmission layer identification bits to identify whether or not a 4K broadcast program is transmitted in each of the B layer and the C layer.
  • When the broadcast wave to be transmitted is the layer-division-multiplexed terrestrial digital broadcasting service of this embodiment, the 4K signal transmission layer identification bit need only indicate whether or not a 4K broadcast program is transmitted in the lower layer. If parameter B119 is "0", a 4K broadcast program is transmitted in the lower layer; if parameter B119 is "1", a 4K broadcast program is not transmitted in the lower layer. In this way, the broadcast receiving apparatus 100 can use the 4K signal transmission layer identification bit to identify whether or not a 4K broadcast program is transmitted in the lower layer.
  • In this case, parameter B118 may be left undefined. If the broadcast wave is not an advanced terrestrial digital broadcasting service, each of these parameters may be set to "1".
  • FIG. 5J shows an example of bit allocation for additional layer transmission identification.
  • When the broadcast wave to be transmitted is the dual-polarization terrestrial digital broadcasting service of this embodiment, the additional layer transmission identification bits need only indicate, for each of the B layer and C layer of the transmission wave transmitted with the secondary polarization, whether that layer is used as the virtual D layer or the virtual E layer.
  • The bit placed in B120 is the D layer transmission identification bit; if this parameter is "0", the B layer transmitted with the secondary polarization is used as the virtual D layer, and if it is "1", the B layer transmitted with the secondary polarization is not used as the virtual D layer but is used as the B layer. The bit placed in B121 is the E layer transmission identification bit; if this parameter is "0", the C layer transmitted with the secondary polarization is used as the virtual E layer, and if it is "1", the C layer transmitted with the secondary polarization is not used as the virtual E layer but is used as the C layer.
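A minimal sketch of interpreting the B120/B121 bits described above (the function name and dictionary layout are assumptions for illustration):

```python
# Sketch: additional layer transmission identification. "0" means the
# secondary-polarization B/C layer is treated as the virtual D/E layer;
# "1" means it remains an ordinary B/C layer.

def secondary_layer_roles(b120, b121):
    """Map the D/E layer transmission identification bits to layer roles."""
    return {
        "B": "virtual D layer" if b120 == 0 else "B layer",
        "C": "virtual E layer" if b121 == 0 else "C layer",
    }

print(secondary_layer_roles(0, 1))  # → {'B': 'virtual D layer', 'C': 'C layer'}
```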
  • When the virtual D layer/virtual E layer are used, parameters such as the carrier modulation mapping method, coding rate, and time interleaving length shown in FIG. 5C can be made different between the virtual D layer/virtual E layer and the B layer/C layer. In this case, if the current/next information of parameters such as the carrier modulation mapping method, convolutional coding rate, and time interleaving length for the virtual D layer/virtual E layer is transmitted using AC information (for example, AC1), the broadcast receiving apparatus 100 side can grasp those parameters for the virtual D layer/virtual E layer. Alternatively, when the virtual D layer and/or virtual E layer are used, the meaning of the transmission parameters of the B layer and/or C layer in the current information/next information of the TMCC information transmitted with the secondary polarization may be switched to the transmission parameters of the virtual D layer and/or virtual E layer.
  • That is, when the virtual D layer and/or virtual E layer are used, the A layer, B layer, and C layer are used in the main polarization, and the transmission parameters of these layers are transmitted using the current information/next information of the TMCC information transmitted with the main polarization. In the secondary polarization, the A layer, D layer, and E layer are used, and the transmission parameters of these layers may be transmitted using the current information/next information of the TMCC information transmitted with the secondary polarization. In this case as well, the broadcast receiving apparatus 100 side can grasp parameters such as the carrier modulation mapping method, convolutional coding rate, and time interleaving length for the virtual D layer/virtual E layer.
  • If the broadcast wave is not an advanced terrestrial digital broadcasting service, this parameter may be configured to be set to "1".
  • The additional layer transmission identification parameter may be stored in both the TMCC information of the main polarization and the TMCC information of the secondary polarization; as long as it is stored at least in the TMCC information of the secondary polarization, both of the processes described above are possible.
  • When the 4K signal transmission layer identification parameter indicates that a 4K broadcast program is transmitted in the B layer, the broadcast receiving apparatus 100 may ignore the D layer transmission identification bit. Likewise, when the 4K signal transmission layer identification parameter indicates that a 4K broadcast program is transmitted in the C layer, even if the E layer transmission identification bit indicates that the C layer is used as the virtual E layer, the broadcast receiving apparatus 100 may be configured to ignore the E layer transmission identification bit.
  • If the system identification parameter is not "10", all of these bits are set to "1". If the system identification parameter is not "10" but, due to some problem, the frequency conversion processing identification bit, physical channel number identification bit, main signal identification bit, 4K signal transmission identification bit, or additional layer transmission identification bit is received with a value other than "1", the broadcast receiving device 100 may be configured to ignore any bit that is not "1" and determine that all of these bits are "1".
  • FIG. 5K shows an example of the "coding rate" bits shown in FIG. 5C, that is, bit allocation for error correction coding rate identification.
  • In the advanced terrestrial digital broadcasting system, the advanced digital terrestrial broadcasting service of 4K broadcasting can be broadcast together with the terrestrial digital broadcasting service of 2K broadcasting. In 4K broadcasting, the LDPC code can be used as the inner code. Therefore, the error correction coding rate identification bits according to the present embodiment shown in FIG. 5K are not coding rate identification bits dedicated to convolutional codes, but are configured to correspond to LDPC codes as well.
  • That is, the coding rate can be set independently depending on whether the inner code of the target digital terrestrial broadcasting service is a convolutional code or an LDPC code, so that a group of coding rate options suitable for each coding method can be adopted in the digital broadcasting system.
  • When the identification bit is "000", it indicates a coding rate of 1/2 if the inner code is a convolutional code, and a coding rate of 2/3 if the inner code is an LDPC code. When the identification bit is "001", it indicates a coding rate of 2/3 for a convolutional code and 3/4 for an LDPC code. When it is "010", it indicates 3/4 for a convolutional code and 5/6 for an LDPC code. When it is "011", it indicates 5/6 for a convolutional code and 2/16 for an LDPC code. When it is "100", it indicates 7/8 for a convolutional code and 6/16 for an LDPC code. When it is "101", the value is undefined for a convolutional code, and it indicates 10/16 for an LDPC code. When it is "110", the value is undefined for a convolutional code, and it indicates 14/16 for an LDPC code. For an unused layer, or when no next information exists, this parameter is set to "111".
  • Note that the above-mentioned coding rate 2/3 may be replaced by a coding rate of 81/120, the coding rate 3/4 by a coding rate of 89/120, and the coding rate 5/6 by a coding rate of 101/120. A coding rate of 8/16, a coding rate of 12/16, or the like may also be assigned.
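The FIG. 5K allocation described above can be transcribed as a lookup keyed by the 3-bit code and the inner-code type. This sketch is illustrative only; the rates are taken verbatim from the text (including its 2/16 and 6/16 entries), and None marks values the text leaves undefined.

```python
# Sketch of the dual-meaning coding rate identification table: one 3-bit code
# maps to a convolutional-code rate or an LDPC-code rate depending on the
# inner code of the layer.
RATE_TABLE = {
    #  bits : (convolutional, LDPC)
    0b000: ("1/2", "2/3"),
    0b001: ("2/3", "3/4"),
    0b010: ("3/4", "5/6"),
    0b011: ("5/6", "2/16"),
    0b100: ("7/8", "6/16"),
    0b101: (None, "10/16"),   # undefined for convolutional codes
    0b110: (None, "14/16"),   # undefined for convolutional codes
    0b111: (None, None),      # unused layer / no next information
}

def coding_rate(bits, inner_code):
    """Look up the coding rate for the given inner code ("convolutional"/"ldpc")."""
    conv, ldpc = RATE_TABLE[bits & 0b111]
    return conv if inner_code == "convolutional" else ldpc

print(coding_rate(0b001, "convolutional"))  # → 2/3
print(coding_rate(0b001, "ldpc"))           # → 3/4
```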
  • Whether the inner code of the target digital terrestrial broadcasting service is a convolutional code or an LDPC code may be identified using the result of identifying whether the service is a current digital terrestrial broadcasting service or an advanced digital terrestrial broadcasting service. This identification may be performed using the identification bits described with reference to FIG. 5D or FIG. 5I.
  • If the target digital terrestrial broadcasting service is the current digital terrestrial broadcasting service, the inner code can be identified as a convolutional code; if it is an advanced digital terrestrial broadcasting service, the inner code can be identified as an LDPC code.
  • As another example, whether the inner code of the target digital terrestrial broadcasting service is a convolutional code or an LDPC code may be identified based on the error correction method identification bits described later with reference to FIG. 6I.
  • In the advanced digital terrestrial broadcasting service using the dual-polarization transmission method, the TMCC information of the transmission wave transmitted with horizontal polarization and the TMCC information of the transmission wave transmitted with vertical polarization may be the same or may be different. Likewise, in the advanced digital terrestrial broadcasting service using the layer division multiplexing transmission method, the TMCC information of the transmission wave transmitted in the upper layer and the TMCC information of the transmission wave transmitted in the lower layer may be the same or may be different.
  • For example, the aforementioned frequency conversion processing identification parameter, main signal identification parameter, additional layer transmission identification, and the like may be described only in the TMCC information of the transmission wave transmitted with the secondary polarization and of the transmission wave transmitted in the lower layer.
  • In the above explanation, the frequency conversion processing identification parameter, main signal identification parameter, polarization direction identification parameter, first signal/second signal identification parameter, upper/lower layer identification parameter, 4K signal transmission layer identification parameter, and additional layer transmission identification parameter are included in the TMCC signal (TMCC carrier) and transmitted. However, these parameters may instead be included in an AC signal (AC carrier) and transmitted. That is, these parameters may be transmitted using the signal of a carrier (TMCC carrier, AC carrier, etc.) that is modulated with a modulation method whose mapping has a smaller number of states than the data carrier modulation method.
  • The AC signal is a signal for transmitting additional information related to broadcasting, such as additional information related to transmission control of modulated waves or seismic motion warning information.
  • additional information regarding transmission control of modulated waves can be transmitted using any AC carrier.
  • FIG. 6A shows an example of bit allocation for an AC signal.
  • the AC signal consists of 204 bits (B0 to B203).
  • B0 is the demodulation reference signal for the AC symbol and has predetermined amplitude and phase references.
  • B1 to B3 are signals for identifying the configuration of the AC signal.
  • B4 to B203 are used to transmit additional information related to modulated wave transmission control or seismic motion warning information.
  • FIG. 6B shows an example of bit allocation for AC signal configuration identification.
  • When seismic motion warning information is transmitted using B4 to B203 of the AC signal, this parameter is set to "001" or "110". The configuration identification parameter ("001" or "110") has the same code as the first 3 bits (B1 to B3) of the synchronization signal of the TMCC signal, and the two values are transmitted alternately, frame by frame, at the same timing as the TMCC signal. When this parameter has a value other than the above, it indicates that additional information regarding transmission control of modulated waves is transmitted using B4 to B203 of the AC signal.
  • Alternatively, the parameters for identifying the configuration of the AC signal may be "000" and "111", "010" and "101", or "011" and "100", transmitted alternately for each frame. In these cases as well, B4 to B203 of the AC signal are used to transmit additional information related to modulated wave transmission control or seismic motion warning information.
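Each configuration identification pair above consists of a 3-bit code and its bitwise complement, alternating frame by frame. An illustrative Python sketch (function names are assumptions; only the "001"/"110" pair is tied to seismic warning information in the text):

```python
# Sketch: checking the frame-alternating AC signal configuration
# identification (bits B1-B3). Each pair ("001"/"110", "000"/"111",
# "010"/"101", "011"/"100") is a code and its 3-bit complement.

SEISMIC_PAIR = {0b001, 0b110}  # pair indicating seismic motion warning info

def is_valid_alternation(prev, curr):
    """The configuration code must flip to its 3-bit complement each frame."""
    return curr == (~prev & 0b111)

def carries_seismic_warning(code):
    return code in SEISMIC_PAIR

print(is_valid_alternation(0b001, 0b110))  # → True
print(carries_seismic_warning(0b010))      # → False
```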
  • Transmission of additional information regarding modulated wave transmission control may be performed using various bit configurations. For example, instead of or in addition to the TMCC signal, bits may be assigned within the additional information regarding transmission control of modulated waves for the frequency conversion processing identification, physical channel number identification, main signal identification, 4K signal transmission layer identification, additional layer transmission identification, and the like mentioned in the explanation of the TMCC signal. In this way, the broadcast receiving apparatus 100 can use these parameters to perform the various identification processes already described in the explanation of the TMCC signal.
  • Furthermore, current/next information of transmission parameters regarding the virtual D layer/virtual E layer may be assigned. In this way, the broadcast receiving apparatus 100 can acquire the transmission parameters of each layer using these parameters and control the demodulation processing of each layer.
  • the seismic motion warning information includes a synchronization signal, start/end flag, update flag, signal identification, seismic motion warning detailed information, CRC, parity bit, and the like.
  • the synchronization signal is composed of a 13-bit code, and is the same code as the 13 bits (B4 to B16) excluding the first three bits of the synchronization signal of the TMCC signal.
  • the 16-bit code that combines the configuration identification and the synchronization signal becomes a 16-bit synchronization word that is the same as the TMCC synchronization signal.
  • the start/end flag is composed of a 2-bit code as a flag for the start timing/end timing of the seismic motion warning information.
  • The start/end flag changes from "11" to "00" at the start of transmission of seismic motion warning information, and from "00" back to "11" at the end of transmission of seismic motion warning information.
  • The update flag consists of a 2-bit code. Its initial value is "00", and it is incremented by 1 each time the contents of the series of seismic motion warning detailed information transmitted while the start/end flag is "00" change; after "11" it returns to "00". When the start/end flag is "11", the update flag is also "11".
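The wrap-around behavior of the 2-bit update flag described above can be sketched as follows (illustrative only; the function name is not from the specification):

```python
def next_update_flag(flag: str) -> str:
    """Advance the 2-bit update flag: "00" -> "01" -> "10" -> "11" -> "00".

    The flag is advanced each time the contents of the seismic motion
    warning detailed information change while the start/end flag is "00".
    """
    return format((int(flag, 2) + 1) % 4, "02b")  # wrap after "11"
```
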
  • FIG. 6D shows an example of bit allocation for signal identification.
  • the signal identification consists of a 3-bit code, and is used to identify the type of detailed seismic motion warning information.
  • If this parameter is "000", it means "seismic motion warning detailed information (applicable area exists)".
  • If this parameter is "001", it means "seismic motion warning detailed information (no applicable area)".
  • If this parameter is "010", it means "test signal of seismic motion warning detailed information (applicable area exists)".
  • If this parameter is "011", it means "test signal of seismic motion warning detailed information (no applicable area)".
  • If this parameter is "111", it means "no seismic motion warning detailed information".
  • While the start/end flag is "00", the signal identification is "000", "001", "010", or "011".
  • While the start/end flag is "11", the signal identification is "111".
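The mapping of the 3-bit signal identification to its meaning, and its consistency with the start/end flag, can be sketched as follows (a minimal sketch; names are illustrative, not from the specification):

```python
# Meanings of the 3-bit signal identification, as listed above.
SIGNAL_IDENTIFICATION = {
    "000": "seismic motion warning detailed information (applicable area exists)",
    "001": "seismic motion warning detailed information (no applicable area)",
    "010": "test signal of seismic motion warning detailed information (applicable area exists)",
    "011": "test signal of seismic motion warning detailed information (no applicable area)",
    "111": "no seismic motion warning detailed information",
}

def expected_start_end_flag(signal_id: str) -> str:
    """Return the start/end flag value consistent with the signal identification."""
    if signal_id in ("000", "001", "010", "011"):
        return "00"   # warning (or test) information is being sent
    if signal_id == "111":
        return "11"   # no warning information
    raise ValueError(f"reserved signal identification: {signal_id}")
```
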
  • the seismic motion warning detailed information is composed of an 88-bit code.
  • The seismic motion warning detailed information carries information such as the current time at which the seismic motion warning information is sent, the area targeted by the seismic motion warning, and the latitude, longitude, and intensity of the epicenter of the earthquake that is the subject of the warning.
  • FIG. 6E shows an example of bit allocation of the seismic motion warning detailed information when the signal identification is "000", “001", “010", or "011".
  • FIG. 6F shows an example of bit allocation of the seismic motion warning detailed information when the signal identification is "111".
  • CRC is a code generated using a predetermined generating polynomial for B21 to B111 of the seismic motion warning information.
  • the parity bit is a code generated by the shortened code (187, 105) of the difference set cyclic code (273, 191) for B17 to B121 of the seismic motion warning information.
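A CRC of this kind is computed bitwise over the protected bit range (B21 to B111). The generator polynomial is fixed by the broadcast specification and is not given in this excerpt, so the sketch below uses CRC-16-CCITT (0x1021) purely as a placeholder assumption:

```python
def crc_bits(bits: str, poly: int = 0x1021, init: int = 0xFFFF) -> int:
    """Bitwise CRC over a bit string such as B21..B111 of the warning information.

    NOTE: the actual generator polynomial and initial value are defined by
    the broadcast specification; CRC-16-CCITT here is only a placeholder.
    """
    crc = init
    for bit in bits:
        crc ^= (int(bit) & 1) << 15                    # feed one message bit
        if crc & 0x8000:
            crc = ((crc << 1) ^ poly) & 0xFFFF         # shift and reduce
        else:
            crc = (crc << 1) & 0xFFFF
    return crc
```
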
  • The broadcast receiving device 100 can perform various controls for dealing with an emergency situation using the parameters related to the seismic motion warning described in FIGS. 6C, 6D, 6E, and 6F. For example, it can control the presentation of information related to seismic motion warnings, switch low-priority display content to a display related to seismic motion warnings, or terminate an application display and switch to a display related to seismic motion warnings or to the broadcast program video.
  • FIG. 6G shows an example of bit allocation of additional information regarding modulated wave transmission control.
  • Additional information regarding transmission control of modulated waves includes a synchronization signal, current information, next information, parity bit, and the like.
  • the synchronization signal is composed of a 13-bit code, and is the same code as the 13 bits (B4 to B16) excluding the first three bits of the synchronization signal of the TMCC signal.
  • the synchronization signal does not have to have the same code as the 13 bits (B4 to B16) excluding the first three bits of the synchronization signal of the TMCC signal.
  • In that case, the 16-bit code that combines the configuration identification and the synchronization signal becomes a 16-bit synchronization word matching the TMCC synchronization signal; a 16-bit synchronization word different from the TMCC synchronization signal may also be used.
  • The current information indicates the current values of the transmission parameter additional information used when transmitting a 4K broadcast program in the B layer or C layer, and of the transmission parameters for the virtual D layer and the virtual E layer.
  • The next information indicates the values after switching of the transmission parameter additional information used when transmitting a 4K broadcast program in the B layer or C layer, and of the transmission parameters for the virtual D layer and the virtual E layer.
  • current information B18 to B30 is the current information of the B layer transmission parameter additional information, and indicates the current information of the transmission parameter additional information when transmitting a 4K broadcast program in the B layer.
  • current information B31 to B43 is current information of the C layer transmission parameter additional information, and indicates the current information of the transmission parameter additional information when transmitting a 4K broadcast program in the C layer.
  • B70 to B82 of the next information are the B layer transmission parameter additional information after switching, indicating the transmission parameter additional information that applies after switching when transmitting a 4K broadcast program in the B layer.
  • B83 to B95 of the next information are the C layer transmission parameter additional information after switching, indicating the transmission parameter additional information that applies after switching when transmitting a 4K broadcast program in the C layer.
  • the transmission parameter additional information is a transmission parameter related to modulation that is added to the transmission parameter of the TMCC information shown in FIG. 5C and whose specifications are expanded. The specific contents of the transmission parameter additional information will be described later.
  • current information B44 to B56 is current information on transmission parameters for the virtual D layer when the virtual D layer is operated.
  • Current information B57 to B69 is current information on transmission parameters for the virtual E layer when operating the virtual E layer.
  • B96 to B108 of the next information are information after the transmission parameters for the virtual D layer are switched when the virtual D layer is operated.
  • B109 to B121 of the next information are the transmission parameters for the virtual E layer after switching, when the virtual E layer is operated.
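The bit positions listed above can be captured in a small parser. This is a minimal sketch that treats the AC additional information as a bit string indexed from B0; the field names are illustrative, not from the specification:

```python
# Inclusive (start, end) bit spans of the 13-bit parameter fields, per the
# listing above (B18..B121 of the AC additional information).
AC_FIELDS = {
    "b_layer_current":   (18, 30),
    "c_layer_current":   (31, 43),
    "virtual_d_current": (44, 56),
    "virtual_e_current": (57, 69),
    "b_layer_next":      (70, 82),
    "c_layer_next":      (83, 95),
    "virtual_d_next":    (96, 108),
    "virtual_e_next":    (109, 121),
}

def extract_ac_fields(bits: str) -> dict:
    """Slice a bit string indexed from B0 into the 13-bit parameter fields."""
    return {name: bits[lo:hi + 1] for name, (lo, hi) in AC_FIELDS.items()}
```
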
  • the parameters stored in the transmission parameters for the virtual D layer and the transmission parameters for the virtual E layer may be the same as those shown in FIG. 5C.
  • the virtual D layer and the virtual E layer are layers that do not exist in current digital terrestrial broadcasting.
  • the TMCC information in FIG. 5B needs to maintain compatibility with current terrestrial digital broadcasting, so it is not easy to increase the number of bits. Therefore, in the embodiment of the present invention, the transmission parameters for the virtual D layer and the virtual E layer are stored in the AC information, as shown in FIG. 6G, instead of in the TMCC information.
  • the broadcast receiving device 100 may be configured to ignore any value contained in the transmission parameters shown in FIG. 6G for the unused virtual D layer or virtual E layer.
  • FIG. 6H shows a specific example of transmission parameter additional information.
  • the transmission parameter additional information can include error correction method parameters, constellation format parameters, and the like.
  • The error correction method parameter indicates which encoding methods are used as the inner code and outer code for error correction when transmitting a 4K broadcast program (advanced terrestrial digital broadcasting service) in the B layer or C layer.
  • FIG. 6I shows an example of bit allocation for the error correction method.
  • If this parameter is "000", a convolutional code is used as the inner code and a shortened RS code as the outer code when transmitting a 4K broadcast program in the B layer or C layer.
  • If this parameter is "001", an LDPC code is used as the inner code and a BCH code as the outer code.
  • other combinations may be set and selected.
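The bit allocation for the error correction method can be represented as a simple lookup, as sketched below (the dictionary and function names are illustrative; only the "000" and "001" combinations are given in the text, and other patterns are left open):

```python
# 3-bit parameter -> (inner code, outer code), per FIG. 6I as described above.
ERROR_CORRECTION = {
    "000": ("convolutional code", "shortened RS code"),
    "001": ("LDPC code", "BCH code"),
}

def decode_error_correction(param: str):
    """Return (inner code, outer code) for the 3-bit error correction parameter."""
    try:
        return ERROR_CORRECTION[param]
    except KeyError:
        # Other combinations may be defined by operation; not covered here.
        raise ValueError(f"combination not defined in this sketch: {param}")
```
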
  • FIG. 6J shows an example of bit allocation in a constellation format.
  • If this parameter is "000", the carrier modulation mapping method selected by the transmission parameters of the TMCC information is applied with a uniform constellation.
  • If this parameter is one of "001" to "111", the carrier modulation mapping method selected by the transmission parameters of the TMCC information is applied with a non-uniform constellation. Note that when a non-uniform constellation is applied, its optimal form differs depending on the type of error correction method, its coding rate, and so on.
  • The broadcast receiving apparatus 100 of this embodiment may therefore determine the non-uniform constellation used in the demodulation process based on the carrier modulation mapping method parameter together with the error correction method parameter and its coding rate. This determination may be made by referring to a predetermined table stored in advance in the broadcast receiving apparatus 100.
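Such a table-driven selection could look as follows. This is a sketch only: the table entries and constellation labels are hypothetical stand-ins for the predetermined table the receiver would hold; nothing here is from the specification:

```python
# Hypothetical predetermined table: (error correction method, coding rate)
# -> non-uniform constellation to use. Entries are illustrative only.
NUC_TABLE = {
    ("LDPC", "2/3"): "NUC-A",
    ("LDPC", "3/4"): "NUC-B",
}

def select_constellation(format_param: str, method: str, rate: str) -> str:
    """Choose the constellation per the 3-bit constellation format parameter."""
    if format_param == "000":
        return "uniform"                  # uniform constellation applies
    # "001".."111": non-uniform; the concrete shape depends on the error
    # correction method and coding rate (assumed fallback: uniform).
    return NUC_TABLE.get((method, rate), "uniform")
```
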
  • the dual polarization transmission system is a system that shares some specifications with the current digital terrestrial broadcasting system. For example, by dividing 13 segments within the approximately 6 MHz band, which corresponds to one physical channel, 7 segments are used to transmit a 2K (horizontal 1920 pixels x vertical 1080 pixels) broadcast program, and 5 segments are used to transmit a 4K broadcast program.
  • One segment of each polarization is allocated to mobile reception (so-called one-segment broadcasting). Furthermore, the five segments for 4K broadcasting use not only the horizontally polarized signal but also the vertically polarized signal, securing a total transmission capacity of 10 segments through MIMO (Multiple-Input Multiple-Output) technology.
  • 2K broadcast programs maintain image quality through optimization of the latest MPEG-2 Video compression technology so that they can still be received by current television receivers, while 4K broadcast programs secure image quality through HEVC compression technology, which is more efficient than MPEG-2 Video, together with optimization of the modulation and use of higher-order constellations. Note that the number of segments allocated to each broadcast may differ from the above.
  • FIG. 7A shows an example of a dual-polarization transmission system in an advanced terrestrial digital broadcasting service according to an embodiment of the present invention.
  • a frequency band of 470 MHz to 710 MHz is used to transmit broadcast waves for digital terrestrial broadcasting services.
  • The number of physical channels in the frequency band of 470 MHz to 710 MHz is 40, numbered 13 to 52, and each physical channel has a bandwidth of 6 MHz.
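Following the band plan stated above (channels 13 to 52 packed back to back at 6 MHz each, starting at 470 MHz), the band edges of a physical channel can be computed as in this sketch (function name illustrative):

```python
def channel_band_mhz(ch: int) -> tuple:
    """Lower/upper band edge (MHz) of a physical channel in the 470-710 MHz plan.

    Channel 13 starts at 470 MHz; channels occupy 6 MHz each, so
    channel 52 ends at 710 MHz.
    """
    if not 13 <= ch <= 52:
        raise ValueError("physical channels are numbered 13 to 52")
    lower = 470 + (ch - 13) * 6
    return lower, lower + 6
```
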
  • a dual polarization transmission system according to an embodiment of the present invention uses both horizontally polarized signals and vertically polarized signals within one physical channel.
  • FIG. 7A shows two examples (1) and (2) regarding the allocation example of 13 segments.
  • a 2K broadcast program is transmitted using segments 1 to 7 (B layer) of the horizontally polarized signal.
  • a 4K broadcast program is transmitted using a total of 10 segments: horizontally polarized signal segments 8 to 12 (C layer) and vertically polarized signal segments 8 to 12 (C layer).
  • Segments 1 to 7 (layer B) of the vertically polarized signal may be used to transmit the same broadcast program as the 2K broadcast program transmitted by segments 1 to 7 (layer B) of the horizontally polarized signal.
  • the vertically polarized signal segments 1 to 7 (B layer) may be used to transmit a broadcast program different from the 2K broadcast program transmitted in the horizontally polarized signal segments 1 to 7 (B layer).
  • segments 1 to 7 (layer B) of the vertically polarized signal may be used for other data transmission or may be left unused.
  • Identification information on how segments 1 to 7 (B layer) of the vertically polarized signal are used can be transmitted to the receiving device by means of the 4K signal transmission layer identification parameter, the additional layer transmission identification parameter, and the like, of the TMCC signal, which have already been explained. Using these parameters, the broadcast receiving apparatus 100 can identify how to handle segments 1 to 7 (B layer) of the vertically polarized signal.
  • The 2K broadcast program transmitted using the B layer of the horizontally polarized signal and the 4K broadcast program transmitted using the C layer of both the horizontally and vertically polarized signals may be a simulcast in which broadcast programs with the same content are transmitted at different resolutions, or they may be broadcast programs with different contents. Segment 0 of both the horizontally and vertically polarized signals transmits the same one-segment broadcast program.
  • the example (2) in FIG. 7A is a modification different from (1).
  • a 4K broadcast program is transmitted using a total of 10 segments, segments 1 to 5 of horizontally polarized signals (layer B) and segments 1 to 5 of vertically polarized signals (layer B).
  • a 2K broadcast program is transmitted using segments 6 to 12 (layer C) of horizontally polarized signals.
  • Segments 6 to 12 (C layer) of the vertically polarized signal may be used to transmit the same 2K broadcast program as that transmitted in segments 6 to 12 (C layer) of the horizontally polarized signal.
  • Segments 6 to 12 (layer C) of the vertically polarized signal may be used to transmit a broadcast program different from the 2K broadcast program transmitted by segments 6 to 12 (layer C) of the horizontally polarized signal. Further, segments 6 to 12 (layer C) of the vertically polarized signal may be used for other data transmission or may be left unused. These pieces of identification information are also the same as in the example (1), and therefore will not be explained again.
  • FIG. 7B shows an example of the configuration of a broadcasting system for an advanced terrestrial digital broadcasting service using a dual-polarization transmission system according to an embodiment of the present invention.
  • This shows both the transmitting side system and the receiving side system of an advanced terrestrial digital broadcasting service using a dual-polarization transmission system.
  • The configuration of the broadcasting system for the advanced terrestrial digital broadcasting service using the dual-polarization transmission method is basically the same as the configuration of the broadcasting system shown in FIG. 1, except that the radio tower 300T, which is equipment of the broadcasting station, becomes a dual-polarization transmitting antenna capable of simultaneously transmitting a horizontally polarized signal and a vertically polarized signal.
  • The horizontally polarized signal sent from the radio tower 300T is received by the horizontally polarized receiving element of the antenna 200T, which is a dual-polarization receiving antenna, and is input from the connector section 100F1 to the channel selection/detection section 131H via the coaxial cable 202T1.
  • the vertically polarized signal transmitted from the radio tower 300T is received by the vertically polarized wave receiving element of the antenna 200T, and is inputted from the connector section 100F2 to the channel selection/detection section 131V via the coaxial cable 202T2.
  • An F-type connector is generally used for a connector section that connects an antenna (coaxial cable) to a television receiver. However, one of the two connections that carry the antenna signals, for example the coaxial cable 202T1 for transmitting the horizontally polarized signal and the connector section 100F1, may be provided with a connector section having a shape different from the F-type connector.
  • The channel selection/detection section 131H and the channel selection/detection section 131V may each identify whether the input broadcast signal is a horizontally polarized signal or a vertically polarized signal by referring to the main signal identification in the TMCC information of the input signal, and control themselves accordingly.
  • antenna 200T and broadcast receiving apparatus 100 may be connected by one multicore coaxial cable.
  • FIG. 7C shows an example of a configuration different from the above-mentioned configuration of a broadcasting system for an advanced terrestrial digital broadcasting service using a dual-polarization transmission system according to an embodiment of the present invention.
  • The configuration shown in FIG. 7B, in which the broadcast receiving device 100 includes two broadcast signal input connector sections and two coaxial cables connect the antenna 200T and the broadcast receiving device 100, is not necessarily advantageous in terms of equipment cost, handling during cable wiring, and so on. Therefore, in the configuration shown in FIG. 7C, a conversion unit (converter) 201T is provided, and the connection between the converter 201T and the broadcast receiving device 100 is made with a single coaxial cable 202T3.
  • the broadcast signal input from the connector section 100F3 is demultiplexed and input to the channel selection/detection section 131H and the channel selection/detection section 131V.
  • the connector section 100F3 may have a function of supplying operating power to the conversion section 201T.
  • the conversion unit 201T may belong to equipment in an environment (for example, an apartment complex, etc.) in which the broadcast receiving device 100 is installed. Alternatively, it may be configured as a device integrated with the antenna 200T and installed in a house or the like.
  • The conversion unit 201T performs frequency conversion processing on either the horizontally polarized signal received by the horizontally polarized receiving element of the antenna 200T or the vertically polarized signal received by the vertically polarized receiving element of the antenna 200T. Through this processing, the horizontally polarized and vertically polarized signals, which were transmitted from the radio tower 300T to the antenna 200T in the same frequency band, are separated into different frequency bands, so that they can be multiplexed and transmitted simultaneously to the broadcast receiving apparatus 100 over the single coaxial cable 202T3. If necessary, frequency conversion processing may be performed on both the horizontally polarized signal and the vertically polarized signal, but in this case as well, the frequency bands of the two signals after frequency conversion must differ from each other. Furthermore, the broadcast receiving apparatus 100 then only needs to include the single broadcast signal input connector section 100F3.
  • FIG. 7D shows an example of frequency conversion processing.
  • frequency conversion processing is performed on the vertically polarized signal.
  • The frequency band of the vertically polarized signal, 470 MHz to 710 MHz, is converted into the frequency band of 770 MHz to 1010 MHz.
  • As a result, signals transmitted using horizontally polarized waves and vertically polarized waves in the same frequency band can be transmitted simultaneously to the broadcast receiving apparatus 100 over the single coaxial cable 202T3 without mutual interference. Note that the frequency conversion processing may instead be performed on the horizontally polarized signal.
  • the frequency conversion process is performed on the signal transmitted with the secondary polarization according to the result of referring to the main signal identification of the TMCC information.
  • The signal transmitted using the main polarization is more likely than the signal transmitted using the secondary polarization to carry the current digital terrestrial broadcasting service. Therefore, in order to better maintain compatibility with current digital terrestrial broadcasting services, it is preferable to frequency-convert the signal transmitted with the secondary polarization while leaving the signal transmitted with the main polarization unconverted.
  • It is also desirable that, in the converted signal, the frequency band of the signal transmitted with the secondary polarization be higher than the frequency band of the signal transmitted with the main polarization. As a result, if the initial scan of the broadcast receiving device 100 starts from the low frequency side and advances to the high frequency side, the signal transmitted with the main polarization is scanned before the signal transmitted with the secondary polarization. This makes it possible to more appropriately reflect the settings based on the initial scan of the current digital terrestrial broadcasting service in the settings based on the initial scan of the advanced terrestrial digital broadcasting service.
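Using the concrete values of FIG. 7D described above (470-710 MHz shifted to 770-1010 MHz, i.e. a +300 MHz offset on the secondary polarization), the conversion and the resulting low-to-high scan order can be sketched as follows (names and the fixed offset are taken from this example only):

```python
CONVERSION_OFFSET_MHZ = 300   # 470-710 MHz -> 770-1010 MHz, as in FIG. 7D

def convert_secondary_mhz(freq_mhz: float) -> float:
    """Shift a secondary-polarization signal above the main-polarization band."""
    return freq_mhz + CONVERSION_OFFSET_MHZ

def scan_order(main_mhz, secondary_mhz):
    """Scan from the low side: main-polarization channels come first,
    because the converted secondary band lies entirely above them."""
    return sorted(main_mhz + [convert_secondary_mhz(f) for f in secondary_mhz])
```
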
  • Frequency conversion processing may be performed on all physical channels used in the advanced terrestrial digital broadcasting service, or only on the physical channels that carry signals using the dual-polarization transmission method.
  • The frequency band after conversion by the frequency conversion processing is preferably between 710 MHz and 1032 MHz. When attempting to receive a terrestrial digital broadcasting service and a BS/CS digital broadcasting service at the same time, it is conceivable to mix the broadcast signal of the terrestrial digital broadcasting service received by the antenna 200T with the broadcast signal of the BS/CS digital broadcasting service received by the antenna 200B and transmit them to the broadcast receiving apparatus 100 over a single coaxial cable. Since the BS/CS-IF signal uses a frequency band of approximately 1032 MHz to 2150 MHz, setting the frequency band after conversion between 710 MHz and 1032 MHz makes it possible to avoid interference between the horizontally polarized signal and the vertically polarized signal while also avoiding interference between the broadcast signal of the terrestrial digital broadcasting service and the broadcast signal of the BS/CS digital broadcasting service.
  • Furthermore, since the frequency band of 770 MHz or less (corresponding to UHF channel 62 or less) is used for television broadcast distribution by cable television stations, it is even more preferable that the frequency band after the frequency conversion processing be between 770 MHz and 1032 MHz, above the band corresponding to UHF channel 62.
  • It is preferable to set the bandwidth of the region between the frequency band before conversion and the frequency band after conversion (part a in the figure) to an integral multiple of the bandwidth of one physical channel (6 MHz).
  • This makes frequency setting control easy when, for example, frequency scanning is performed over both the broadcast signals in the frequency band before conversion and the broadcast signals in the frequency band after conversion.
  • In the dual-polarization transmission method, both horizontally polarized and vertically polarized signals are used to transmit a 4K broadcast program. Therefore, in order to reproduce a 4K broadcast program correctly, the receiving side must correctly grasp the combination of physical channels of the broadcast signal transmitted with horizontal polarization and the broadcast signal transmitted with vertical polarization. Even when frequency conversion processing has been performed and the broadcast signals transmitted with horizontal polarization and with vertical polarization on the same physical channel are input to the receiving device as signals in different frequency bands, the broadcast receiving apparatus 100 of this embodiment can grasp this combination by appropriately referring to the parameters of the TMCC information (for example, the main signal identification and the physical channel number identification), and can therefore suitably receive, demodulate, and reproduce the 4K broadcast program.
  • Although FIGS. 7B, 7C, and 7D illustrate cases in which horizontal polarization is the main polarization, horizontal polarization and vertical polarization may be reversed depending on the operation.
  • The broadcast waves of digital terrestrial broadcasting transmitted using the dual-polarization transmission method described above can be received and reproduced by the second tuner/demodulator 130T of the broadcast receiving apparatus 100, as described above, and can also be received by the first tuner/demodulator 130C of the broadcast receiving apparatus 100. When the first tuner/demodulator 130C receives the broadcast wave of the digital terrestrial broadcast, the broadcast signals transmitted in the layers of the advanced terrestrial digital broadcasting service are ignored, and the broadcast signals transmitted in the layers of the current terrestrial digital broadcasting service are reproduced.
  • Broadcast receiving device 100 is capable of receiving signals transmitted using a pass-through transmission method.
  • the pass-through transmission method is a method in which a broadcast signal received by a cable television station or the like is transmitted to a CATV distribution system using the same signal method or after frequency conversion.
  • The pass-through method includes two methods: (1) extracting the transmission signal band of each terrestrial digital broadcast signal output from the terrestrial receiving antenna, adjusting its level, and transmitting it to the CATV facility at the same frequency as the transmission signal frequency; and (2) extracting the transmission signal band of each terrestrial digital broadcast signal output from the terrestrial receiving antenna, adjusting its level, and transmitting it after frequency conversion to a frequency set by the CATV facility operator.
  • The device constituting the receiving amplifier that performs the signal processing of the first method, or the device constituting the receiving amplifier and frequency converter that performs the signal processing of the second method, is called an OFDM signal processor (OFDM-SP).
  • FIG. 7E shows an example of a system configuration when the first method of the pass-through transmission method is applied to the advanced terrestrial digital broadcasting service of the dual polarization transmission method.
  • FIG. 7E shows head end equipment 400C of a cable television station and broadcast receiving apparatus 100.
  • FIG. 7F shows an example of frequency conversion processing at that time.
  • The notation (H・V) in FIG. 7F indicates a state in which both the broadcast signal transmitted with horizontal polarization and the broadcast signal transmitted with vertical polarization exist in the same frequency band; (H) indicates a broadcast signal transmitted with horizontal polarization, and (V) indicates a broadcast signal transmitted with vertical polarization.
  • the notations in FIGS. 7H and 7I below also have the same meaning.
  • When applying the first method of pass-through transmission to the advanced terrestrial digital broadcasting service of the dual-polarization transmission method according to the embodiment of the present invention, the head end equipment 400C of the cable television station performs signal band extraction and level adjustment, and sends the signal out at the same frequency as the transmission signal frequency.
  • Specifically, signal band extraction and level adjustment are performed at the head end equipment 400C of the cable television station, and frequency conversion processing similar to that explained with reference to FIG. 7D converts the vertically polarized broadcast signal into a frequency band higher than the 470 MHz to 770 MHz band corresponding to UHF channels 13 to 62 before transmission.
  • This processing prevents the frequency bands of the horizontally polarized broadcast signals and the vertically polarized broadcast signals from overlapping, allowing signal transmission over a single coaxial cable (or optical fiber cable).
  • the transmitted signal can be received by the broadcast receiving device 100 of this embodiment.
  • the process of receiving and demodulating the broadcast signal transmitted with horizontal polarization and the broadcast signal transmitted with vertical polarization included in the signal in the broadcast receiving apparatus 100 of this embodiment is similar to the explanation of FIG. 7D. Therefore, further explanation will be omitted.
  • FIG. 7G shows an example of a system configuration when the second method of the pass-through transmission method is applied to the advanced terrestrial digital broadcasting service of the dual polarization transmission method.
  • FIG. 7G shows head end equipment 400C of a cable television station and broadcast receiving apparatus 100.
  • FIG. 7H shows an example of frequency conversion processing at that time.
  • When applying the second method of pass-through transmission to the advanced terrestrial digital broadcasting service of the dual-polarization transmission method according to the embodiment of the present invention, the head end equipment 400C of the cable television station performs signal band extraction and level adjustment, and transmits the signal after frequency conversion processing to the frequency set by the CATV facility operator.
  • Specifically, signal band extraction and level adjustment are performed at the head end equipment 400C of the cable television station, and frequency conversion processing similar to that explained with reference to FIG. 7D converts the vertically polarized broadcast signal into a frequency band higher than the 470 MHz to 770 MHz band corresponding to UHF channels 13 to 62 before transmission.
  • However, the frequency conversion processing shown in FIG. 7H differs in that the broadcast signal transmitted with horizontal polarization is not confined to the 470 MHz to 770 MHz band corresponding to UHF channels 13 to 62; the usable range is extended to lower frequency bands, and the signals are rearranged within the range of 90 MHz to 770 MHz.
  • This processing prevents the frequency bands of the broadcast signals transmitted with horizontal polarization and the broadcast signals transmitted with vertical polarization from overlapping, allowing signal transmission over a single coaxial cable (or optical fiber cable).
  • the transmitted signal can be received by the broadcast receiving device 100 of this embodiment.
  • The process by which the broadcast receiving apparatus 100 of this embodiment receives and demodulates the horizontally polarized and vertically polarized broadcast signals included in this signal is similar to that explained for FIG. 7D, so further explanation is omitted.
  • Alternatively, the broadcast signal at the time of pass-through output after frequency conversion may be changed from the state shown in FIG. 7H to the state shown in FIG. 7I.
  • In this case, signal band extraction and level adjustment are performed for both the horizontally polarized broadcast signals and the vertically polarized broadcast signals, and transmission is performed after frequency conversion processing to the frequency set by the CATV facility manager.
  • In this frequency conversion, both the broadcast signals transmitted with horizontal polarization and the broadcast signals transmitted with vertical polarization are rearranged within the range of 90 MHz to 770 MHz (the range from VHF channel 1 to UHF channel 62). Since no frequency band exceeding UHF channel 62 is used, the frequency band usage efficiency of the broadcast signal is higher than in FIG. 7H.
  • Although the frequency band in which the broadcast signals are rearranged becomes wider than the 470 MHz to 710 MHz band (the band of UHF channels 13 to 52 at the time of antenna reception), it is also possible, as shown in the example of the figure, to alternately rearrange the horizontally polarized broadcast signals and the vertically polarized broadcast signals.
  • If the pairs of a broadcast signal transmitted with horizontal polarization and a broadcast signal transmitted with vertical polarization that occupied the same physical channel at the time of antenna reception are rearranged alternately in order, then when the broadcast receiving apparatus 100 of this embodiment performs an initial scan from the low frequency side, initial settings can be performed sequentially, in units of originally identical physical channels, for each pair of broadcast signals transmitted with horizontal and vertical polarization, and the initial scan can be performed efficiently.
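  • The alternate rearrangement described above can be sketched as follows. This is an illustrative sketch only (the function name and channel list are ours, not part of the embodiment): each horizontal/vertical pair of a physical channel is mapped to two adjacent 6 MHz slots starting at 90 MHz, so an initial scan from the low-frequency side encounters the pairs in order.

```python
# Illustrative sketch of the alternate H/V rearrangement of FIG. 7I.
# The 6 MHz channel bandwidth, the 90 MHz lower limit, and the 770 MHz
# upper limit follow the text; the function itself is hypothetical.

CHANNEL_BW_MHZ = 6

def rearrange_alternately(physical_channels, start_mhz=90):
    """Map each (H, V) pair of a physical channel to consecutive 6 MHz slots.

    Returns {(channel, polarization): lower edge frequency in MHz}.
    """
    plan = {}
    freq = start_mhz
    for ch in physical_channels:
        for pol in ("H", "V"):          # keep each H/V pair adjacent
            plan[(ch, pol)] = freq
            freq += CHANNEL_BW_MHZ
    # every rearranged slot must stay within the 90-770 MHz range
    assert freq <= 770, "rearranged signals exceed the 90-770 MHz range"
    return plan

plan = rearrange_alternately(range(13, 23))   # ten physical channels as an example
```

  • Scanning the resulting plan from the lowest frequency upward yields the H and V signals of each original physical channel back to back, which is why the initial scan can configure them pair by pair.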
  • The broadcast waves of terrestrial digital broadcasting using the dual polarization transmission method via the pass-through transmission method described above can be received and reproduced by the second tuner/demodulator 130T of the broadcast receiving apparatus 100, as described above, and can also be received by the first tuner/demodulator 130C of the broadcast receiving apparatus 100.
  • When the first tuner/demodulator 130C receives these broadcast waves of terrestrial digital broadcasting, among the broadcast signals, the broadcast signals transmitted in the layer for the advanced terrestrial digital broadcasting service are ignored, while the broadcast signals transmitted in the layer for the current terrestrial digital broadcasting service are reproduced.
  • The single polarization transmission system according to the embodiment of the present invention shares some specifications with the current terrestrial digital broadcasting system, and transmits data using either a horizontally polarized signal or a vertically polarized signal with SISO (Single-Input Single-Output) technology.
  • FIG. 7J shows an example of a single polarization transmission system in the advanced digital terrestrial broadcasting service according to the embodiment of the present invention.
  • a frequency band of 470 MHz to 710 MHz is used to transmit broadcast waves for digital terrestrial broadcasting services.
  • The number of physical channels in this frequency band is 40, from channel 13 to channel 52, and each physical channel has a bandwidth of 6 MHz.
  • 2K broadcasting service and 4K broadcasting service are transmitted simultaneously within one physical channel.
  • FIG. 7J shows two examples (1) and (2) regarding the allocation example of 13 segments.
  • segments 1 to 4 (layer B) are used to transmit a 4K broadcast program.
  • 2K broadcast programs are transmitted using segments 5 to 12 (C layer).
  • The 4K broadcast program transmitted using the B layer and the 2K broadcast program transmitted using the C layer may form a simulcast in which the same content is transmitted at different resolutions, or they may be broadcast programs with different content.
  • Example (2) is a modification different from (1).
  • segments 1 to 8 (layer B) are used to transmit a 2K broadcast program.
  • 4K broadcast programs are transmitted using segments 9 to 12 (C layer).
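  • The two segment allocation examples above can be captured as a small lookup structure. This is an illustrative sketch only: the dictionary shape and function name are ours, and segment 0 and any remaining layer assignment are outside the quoted examples, so only segments 1 to 12 are modeled.

```python
# Sketch of the two 13-segment allocation examples of FIG. 7J.
# Layer names (B, C) and segment ranges follow the text; the data
# structure itself is a hypothetical illustration.

ALLOCATION_EXAMPLES = {
    1: {"B": {"segments": range(1, 5),  "service": "4K"},   # segments 1-4
        "C": {"segments": range(5, 13), "service": "2K"}},  # segments 5-12
    2: {"B": {"segments": range(1, 9),  "service": "2K"},   # segments 1-8
        "C": {"segments": range(9, 13), "service": "4K"}},  # segments 9-12
}

def service_of_segment(example, segment):
    """Return which service (2K or 4K) a given segment carries in an example."""
    for layer in ALLOCATION_EXAMPLES[example].values():
        if segment in layer["segments"]:
            return layer["service"]
    return None
```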
  • FIG. 7K shows an example of the configuration of a broadcasting system for advanced digital terrestrial broadcasting service using a single polarization transmission method according to an embodiment of the present invention.
  • This shows both the transmitting side system and the receiving side system of an advanced digital terrestrial broadcasting service using a single polarization transmission method.
  • The configuration of the broadcasting system for the advanced terrestrial digital broadcasting service using the single polarization transmission method is basically the same as the configuration of the broadcasting system shown in Figure 1, but the radio tower 300S, which is the equipment of the broadcasting station, is a single polarization transmitting antenna that transmits either a horizontally polarized signal or a vertically polarized signal.
  • In the broadcast receiving apparatus 100, only the channel selection/detection section 131H of the second tuner/demodulation section 130T is excerpted and described; other operating sections are omitted.
  • the single-polarized signal transmitted from the radio tower 300S is received by the antenna 200S, which is a single-polarized receiving antenna, and is input to the channel selection/detection section 131H from the connector section 100F3 via the coaxial cable 202S.
  • An F-type connector is generally used for a connector section that connects an antenna (coaxial cable) and a television receiver.
  • The broadcast waves of terrestrial digital broadcasting transmitted by the single polarization transmission method described above can be received and reproduced by the second tuner/demodulator 130T of the broadcast receiving apparatus 100, as described above, and can also be received by the first tuner/demodulator 130C of the broadcast receiving apparatus 100.
  • When the first tuner/demodulator 130C receives these broadcast waves of terrestrial digital broadcasting, among the broadcast signals, the broadcast signals transmitted in the layer for the advanced terrestrial digital broadcasting service are ignored, while the broadcast signals transmitted in the layer for the current terrestrial digital broadcasting service are reproduced.
  • Because the broadcast waves of terrestrial digital broadcasting transmitted using the single polarization transmission method include the layer of the current terrestrial digital broadcasting service (the layer that transmits 2K broadcasting in FIG. 7J), the broadcast signal can also be received by the first tuner/demodulator 130C. Therefore, by adopting a double tuner configuration in which the second tuner/demodulator 130T and the first tuner/demodulator 130C are used simultaneously, it becomes possible to simultaneously receive and reproduce the broadcast signals transmitted in the layer for the advanced terrestrial digital broadcasting service and the broadcast signals transmitted in the layer for the current terrestrial digital broadcasting service.
  • FIG. 7L shows an example of the configuration of a broadcasting system for an advanced terrestrial digital broadcasting service using the single polarization transmission method according to an embodiment of the present invention, which provides the aforementioned double tuner configuration.
  • This shows both the transmitting side system and the receiving side system of an advanced digital terrestrial broadcasting service using a single polarization transmission method.
  • The configuration of the broadcasting system for the advanced terrestrial digital broadcasting service using the single polarization transmission method is basically the same as the configuration of the broadcasting system shown in Figure 1, but the radio tower 300S, which is the equipment of the broadcasting station, is a single polarization transmitting antenna that transmits either a horizontally polarized signal or a vertically polarized signal.
  • the single-polarized signal transmitted from the radio tower 300S is received by the antenna 200S, which is a single-polarized receiving antenna, and is input to the broadcast receiving device 100 from the connector section 100F3 via the coaxial cable 202S.
  • the single polarized wave signal input to the broadcast receiving apparatus 100 is demultiplexed and input to the channel selection/detection section 131C and the channel selection/detection section 131H, respectively.
  • the tuning/detection unit 131C performs tuning/detection processing for broadcast waves of current digital terrestrial broadcasting services
  • The tuning/detection unit 131H performs tuning/detection processing on the broadcast waves of the advanced terrestrial digital broadcasting service.
  • When both the current terrestrial digital broadcasting service and the advanced terrestrial digital broadcasting service are provided, it therefore becomes possible to receive the two services simultaneously. In particular, efficient processing becomes possible in the channel setting section and the like.
  • Note that the current terrestrial digital broadcasting service and the advanced terrestrial digital broadcasting service may be transmitted on the same physical channel or on different physical channels. Further, the two services may or may not form a simulcast pair.
  • FIG. 7L is an example of receiving an advanced terrestrial digital broadcasting service using the single polarization transmission method, but a similar configuration can also be applied to receiving an advanced terrestrial digital broadcasting service using the dual polarization transmission method.
  • In that case, the dual polarization signal received by the antenna 200T, which is a dual polarization receiving antenna, and input to the broadcast receiving apparatus 100 from the connector 100F3 via the converter 201T is demultiplexed and input to the respective channel selection/detection sections.
  • The tuning/detection unit 131C performs tuning/detection processing on the broadcast waves of the current terrestrial digital broadcasting service transmitted as either the horizontally polarized signal or the vertically polarized signal, and the tuning/detection units 131H and 131V perform tuning/detection processing on the broadcast waves of the advanced terrestrial digital broadcasting service transmitted as horizontally polarized and vertically polarized signals.
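  • The double-tuner flow described above can be sketched in simplified form. This is an illustrative sketch only: the class and method names are hypothetical stand-ins for the tuning/detection sections 131C and 131H of FIG. 7L, and real hardware would of course demodulate signals rather than return strings.

```python
# Hypothetical model of the double-tuner configuration: the single input
# signal is branched (demultiplexed) and fed to both tuning/detection
# sections so the current and advanced services are received at once.

class TuningDetectionSection:
    def __init__(self, service):
        self.service = service  # "current" (2K) or "advanced" (4K)

    def tune_and_detect(self, signal):
        # A real section would select the channel and demodulate here.
        return f"{self.service} stream from {signal}"

def receive_simultaneously(signal):
    section_131c = TuningDetectionSection("current")    # stands in for 131C
    section_131h = TuningDetectionSection("advanced")   # stands in for 131H
    # The same branched input reaches both sections simultaneously.
    return (section_131c.tune_and_detect(signal),
            section_131h.tune_and_detect(signal))
```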
  • the hierarchical division multiplex transmission system according to the embodiment of the present invention is a system that shares some specifications with the current terrestrial digital broadcasting system.
  • In this system, the broadcast waves of a 4K broadcasting service are multiplexed at a low signal level onto the same channel as the broadcast waves of the current 2K broadcasting service and transmitted.
  • For 2K broadcasting, reception continues as before because the reception level of the 4K broadcast waves is suppressed below the required C/N.
  • For 4K broadcasting, transmission capacity is expanded through techniques such as multi-level modulation, and reception is performed by cancelling the 2K broadcast waves and receiving the remaining 4K broadcast waves using reception technology compatible with LDM (layer division multiplexing) technology.
  • FIG. 8A shows an example of a hierarchical division multiplexing transmission system in the advanced digital terrestrial broadcasting service according to the embodiment of the present invention.
  • the upper layer is made up of modulated waves of current 2K broadcasting
  • the lower layer is made up of modulated waves of 4K broadcasting
  • the upper layer and lower layer are multiplexed and output as a composite wave in the same frequency band.
  • the upper layer may use 64QAM or the like as a modulation method
  • the lower layer may use 256QAM or the like as a modulation method.
  • The 2K broadcast program transmitted using the upper layer and the 4K broadcast program transmitted using the lower layer may form a simulcast in which the same content is transmitted at different resolutions, or they may be broadcast programs with different content.
  • the upper layer is transmitted with high power
  • the lower layer is transmitted with low power.
  • the difference (difference in power) between the modulated wave level of the upper layer and the modulated wave level of the lower layer is called an injection level (IL).
  • the injection level is a value set by the broadcasting station.
  • The injection level is generally expressed as a relative ratio (dB), in logarithmic terms, of the difference between the modulated wave levels (the difference in power).
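  • As a numerical illustration of the injection level relationship described above (the helper names and example values are ours, not part of the embodiment), the dB ratio between the upper-layer and lower-layer powers can be computed as follows.

```python
import math

# Sketch of the injection level (IL): the power difference between the
# upper-layer (2K) modulated wave and the lower-layer (4K) modulated wave,
# expressed as a relative ratio in dB.

def injection_level_db(upper_power_watts, lower_power_watts):
    """Return the injection level in dB (upper layer relative to lower layer)."""
    return 10 * math.log10(upper_power_watts / lower_power_watts)

def lower_layer_power(upper_power_watts, injection_db):
    """Given the upper-layer power and an IL set by the broadcasting station,
    recover the lower-layer transmit power."""
    return upper_power_watts / (10 ** (injection_db / 10))
```

  • For example, an upper layer at 100 W with a lower layer at 1 W corresponds to an injection level of 20 dB, i.e. the lower layer is transmitted at one hundredth of the upper layer's power.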
  • FIG. 8B shows an example of the configuration of a broadcasting system for an advanced digital terrestrial broadcasting service using a hierarchical division multiplex transmission system according to an embodiment of the present invention.
  • The configuration of the broadcasting system for the advanced terrestrial digital broadcasting service using the hierarchical division multiplex transmission method is basically the same as the configuration of the broadcasting system shown in Figure 1, but the radio tower 300L, which is the equipment of the broadcasting station, is a transmitting antenna that sends out a broadcast signal in which 2K broadcasting in the upper layer and 4K broadcasting in the lower layer are multiplexed.
  • In the broadcast receiving apparatus 100, only the channel selection/detection section 131L of the third tuner/demodulation section 130L is excerpted and described; the description of other operating sections is omitted.
  • the broadcast signal received by the antenna 200L is input to the channel selection/detection unit 131L from the connector unit 100F4 via the converter 201L and the coaxial cable 202L.
  • The converter 201L may perform frequency conversion amplification processing on the broadcast signal.
  • For example, when the antenna 200L is installed on the roof of an apartment building or the like and the broadcast signal is transmitted to the broadcast receiving apparatus 100 in each room via a long coaxial cable 202L, the broadcast signal is attenuated, and a problem may occur in which the channel selection/detection section 131L cannot correctly receive the 4K broadcast waves of the lower layer.
  • In such a case, the converter 201L performs frequency conversion and amplification processing on the lower-layer 4K broadcast signal.
  • The frequency conversion amplification processing converts the frequency band of the lower-layer 4K broadcast signal from, for example, the 470 to 710 MHz band (the band corresponding to UHF channels 13 to 52) to the 770 to 1010 MHz band, which exceeds the band corresponding to UHF channel 62.
  • In addition, the lower-layer 4K broadcast signal is amplified to a signal level at which the influence of cable attenuation is not a problem.
  • In this case, the tuning/detection sections included in the third tuner/demodulation section 130L of the broadcast receiving apparatus 100 may be configured so that the tuning/detection section 131L1 performs processing such as tuning and detection on the modulated wave of the upper layer (2K broadcasting), and the tuning/detection section 131L2 performs processing such as tuning and detection on the modulated wave of the lower layer (4K broadcasting).
  • Note that it is preferable to set the frequency conversion so that the frequency band after conversion by the frequency conversion amplification processing falls within 710 to 1032 MHz (exceeding the band corresponding to UHF channel 52) or within 770 to 1032 MHz (exceeding the band corresponding to UHF channel 62, in consideration of retransmission by cable television stations and the like), and so that the gap between the frequency band before conversion and the frequency band after conversion is an integral multiple of the bandwidth of one physical channel (6 MHz). Since the reason is the same as explained earlier in this embodiment, the explanation is not repeated.
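  • The conversion constraint above can be sketched as follows. This is an illustrative sketch only (the function name and default offset are ours): the shift applied by the converter must be an integral multiple of the 6 MHz physical channel bandwidth, and an offset of 300 MHz moves the 470 to 710 MHz band to the 770 to 1010 MHz band of the example.

```python
# Sketch of the frequency conversion amplification constraint: the shift
# between the pre- and post-conversion bands is an integral multiple of
# one physical channel (6 MHz). Band limits follow the text; the function
# is a hypothetical stand-in for the converter 201L.

CHANNEL_BW_MHZ = 6

def convert_lower_layer(freq_mhz, offset_mhz=300):
    """Shift a lower-layer 4K carrier frequency from the 470-710 MHz band upward.

    offset_mhz=300 maps 470-710 MHz onto 770-1010 MHz, and 300 is an
    integral multiple of the 6 MHz channel bandwidth.
    """
    if offset_mhz % CHANNEL_BW_MHZ != 0:
        raise ValueError("offset must be an integral multiple of 6 MHz")
    if not 470 <= freq_mhz <= 710:
        raise ValueError("input outside the UHF 13ch-52ch band")
    return freq_mhz + offset_mhz
```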
  • The broadcast receiving apparatus 100 of this embodiment can identify whether a received broadcast signal is transmitted in the lower layer or in the upper layer by using the upper/lower layer identification bits of the TMCC information explained in FIG. 5H. Furthermore, the broadcast receiving apparatus 100 of this embodiment can identify whether the received broadcast signal has undergone frequency conversion processing by using the frequency conversion process identification bit of the TMCC information. Furthermore, the broadcast receiving apparatus 100 of this embodiment can identify whether the received broadcast signal transmits a 4K program in the lower layer by using the 4K signal transmission layer identification bit of the TMCC information described in FIG. 5I.
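  • The three identification bits above can be read out as in the sketch below. This is an illustrative sketch only: the actual bit positions and encodings are defined in FIGS. 5H and 5I, which are not reproduced here, so the positions and polarities below are hypothetical placeholders.

```python
# Hypothetical bit positions within an integer holding the relevant TMCC
# field (placeholders -- the real layout is defined in FIGS. 5H/5I).
UPPER_LOWER_BIT = 0       # assumed: 0 = upper layer, 1 = lower layer
FREQ_CONVERTED_BIT = 1    # assumed: 1 = frequency conversion applied
FOUR_K_LOWER_BIT = 2      # assumed: 1 = 4K program in the lower layer

def parse_tmcc_flags(tmcc_field):
    """Decode the three identification bits the receiver branches on."""
    return {
        "lower_layer": bool((tmcc_field >> UPPER_LOWER_BIT) & 1),
        "frequency_converted": bool((tmcc_field >> FREQ_CONVERTED_BIT) & 1),
        "4k_in_lower_layer": bool((tmcc_field >> FOUR_K_LOWER_BIT) & 1),
    }
```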
  • the channel selection/detection section 131L of the third tuner/demodulation section 130L of the broadcast receiving apparatus 100 has a reception function compatible with LDM (layer division multiplexing) technology. Therefore, the conversion unit 201L shown in FIG. 8B is not necessarily required between the antenna 200L and the broadcast receiving apparatus 100.
  • The broadcast waves of terrestrial digital broadcasting transmitted by the hierarchical division multiplexing transmission method described above can be received and reproduced by the third tuner/demodulator 130L of the broadcast receiving apparatus 100, as described above, and can also be received by the first tuner/demodulator 130C of the broadcast receiving apparatus 100.
  • When the first tuner/demodulator 130C receives these broadcast waves of terrestrial digital broadcasting, among the broadcast signals, the broadcast signals transmitted in the layer for the advanced terrestrial digital broadcasting service are ignored, while the broadcast signals transmitted in the layer for the current terrestrial digital broadcasting service are reproduced.
  • the broadcasting system of this embodiment is compatible with MPEG-2 TS, which is used in current digital terrestrial broadcasting services, as a media transport method for transmitting data such as video and audio.
  • the stream format transmitted by the OFDM transmission wave in Figure 4D (1) is MPEG-2 TS
  • the stream format transmitted in the layer where digital broadcasting services are transmitted is MPEG-2 TS.
  • the stream system obtained by demodulating the transmission wave in the first tuner/demodulator 130C of the broadcast receiving apparatus 100 in FIG. 2 is MPEG-2 TS.
  • the stream format corresponding to the layer in which the current digital terrestrial broadcasting service is transmitted is MPEG-2 TS.
  • MPEG-2 TS is characterized by multiplexing components such as video and audio that make up a program into one packet stream along with control signals and clocks.
  • Because MPEG-2 TS handles the components and the clock as a single packet stream, it is suitable for transmitting one content over one transmission path with guaranteed transmission quality, and it is used in many current digital broadcasting systems.
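  • As background to the multiplexing described above, MPEG-2 TS carries all components as fixed-size 188-byte packets, each starting with the 0x47 sync byte and identified by a 13-bit PID, so one packet stream can interleave video, audio, and control sections. The framing constants below are standard MPEG-2 Systems facts; the parsing code itself is our illustrative sketch.

```python
# Minimal sketch of demultiplexing an MPEG-2 TS byte stream by PID.
TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def demux_pids(ts_bytes):
    """Group TS packets by PID from a byte string of concatenated packets."""
    streams = {}
    for off in range(0, len(ts_bytes) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        packet = ts_bytes[off:off + TS_PACKET_SIZE]
        if packet[0] != SYNC_BYTE:
            raise ValueError("lost TS sync")
        pid = (packet[1] & 0x1F) << 8 | packet[2]   # 13-bit packet identifier
        streams.setdefault(pid, []).append(packet)
    return streams
```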
  • In a broadcasting system using MPEG-2 TS, it is also possible to realize two-way communication via a two-way network such as a fixed network or a mobile network.
  • A broadcasting system using MPEG-2 TS is also compatible with broadcast-communication cooperation systems, in which functions that utilize a broadband network, such as acquisition of additional content via the broadband network, arithmetic processing in a server device, and presentation processing in cooperation with a mobile terminal device, are combined with the digital broadcasting service.
  • FIG. 9A shows an example of a protocol stack of a transmission signal in a broadcasting system using MPEG-2 TS.
  • In the MPEG-2 TS system, control signals such as PSI and SI are transmitted in section format.
  • The control information of the broadcasting system using the MPEG-2 TS system mainly consists of tables used for program sequence information and tables used for purposes other than program sequence information; the tables are transmitted in section format, and descriptors are placed within the tables.
  • FIG. 9B shows a list of tables used in the program sequence information of the MPEG-2 TS broadcasting system.
  • In this embodiment, the tables shown below are used in the program sequence information.
  • FIG. 9C shows a list of tables used for purposes other than program sequence information in the MPEG-2 TS broadcasting system.
  • In this embodiment, the tables shown below are used for purposes other than program sequence information.
  • ECM (Entitlement Control Message)
  • EMM (Entitlement Management Message)
  • DCT (Download Control Table)
  • DLT (Download Table)
  • DIT (Discontinuity Information Table)
  • SIT (Selection Information Table)
  • SDTT (Software Download Trigger Table)
  • CDT (Common Data Table)
  • DSM-CC section
  • AIT (Application Information Table)
  • DCM (Download Control Message)
  • DMM (Download Management Message)
  • FIGS. 9D, 9E, and 9F show a list of descriptors used in the program sequence information of the MPEG-2 TS broadcasting system. In this embodiment, the following descriptors are used in the program sequence information.
  • (1) Conditional Access Descriptor (2) Copyright Descriptor (3) Network Name Descriptor (4) Service List Descriptor (5) Stuffing Descriptor (6) Satellite Delivery System Descriptor (7) Terrestrial Delivery System Descriptor (8) Bouquet Name Descriptor (9) Service Descriptor (10) Country Availability Descriptor
  • (11) Linkage Descriptor (12) NVOD Reference Descriptor (13) Time Shifted Service Descriptor (14) Short Event Descriptor (15) Extended Event Descriptor (16) Time Shifted Event Descriptor (17) Component Descriptor (18) Mosaic Descriptor (19) Stream Identifier Descriptor (20) CA Identifier Descriptor
  • (31) Hyperlink Descriptor (32) Data Content Descriptor (33) Video Decode Control Descriptor (34) Basic Local Event Descriptor (35) Reference Descriptor (36) Node Relation Descriptor (37) Short Node Information Descriptor (38) STC Reference Descriptor (39) Partial Reception Descriptor (40) Series Descriptor
  • (41) Event Group Descriptor (42) SI Transmission Parameter Descriptor (43) Broadcaster Name Descriptor (44) Component Group Descriptor (45) SI Prime TS Descriptor (46) Board Information Descriptor (47) LDT Linkage Descriptor (48) Connected Transmission Descriptor (49) TS Information Descriptor (50) Extended Broadcaster Descriptor
  • FIG. 9G shows a list of descriptors used in other than program sequence information of the MPEG-2 TS broadcasting system. In this embodiment, the following descriptors are used for purposes other than program sequence information.
  • (1) Partial Transport Stream Descriptor (2) Network Identification Descriptor (3) Partial Transport Stream Time Descriptor (4) Download Content Descriptor (5) CA EMM TS Descriptor (6) CA Contract Information Descriptor (7) CA Service Descriptor (8) Carousel Identifier Descriptor (9) Association Tag Descriptor (10) Deferred Association Tags Descriptor (11) Network Download Content Descriptor (12) Download Protection Descriptor (13) CA Startup Descriptor (14) Descriptors set by the operator
  • FIG. 9H shows a list of descriptors used in the INT of the MPEG-2 TS broadcasting system. In this embodiment, the following descriptors are used in INT. Note that the descriptors used in the above-mentioned program sequence information and descriptors used in other than the program sequence information are not used in INT.
  • (1) Target Smartcard Descriptor (2) Target IP Address Descriptor (3) Target IPv6 Address Descriptor (4) IP/MAC Platform Name Descriptor (5) IP/MAC Platform Provider Name Descriptor (6) IP/MAC Stream Location Descriptor (7) Descriptors set by the operator
  • FIG. 9I shows a list of descriptors used in the AIT of the MPEG-2 TS broadcasting system. In this embodiment, the following descriptors are used in the AIT. Note that the descriptors used in the above-mentioned program sequence information and the descriptors used in other than the program sequence information are not used in the AIT.
  • (1) Application Descriptor (2) Transport Protocol Descriptor (3) Simple Application Location Descriptor (4) Application Boundary and Permission Descriptor (5) Autostart Priority Descriptor (6) Cache Control Info Descriptor (7) Randomized Latency Descriptor (8) External Application Control Descriptor (9) Playback Application Descriptor (10) Simple Playback Application Location Descriptor (11) Application Expiration Descriptor (12) Descriptors set by the operator
  • the broadcasting system of this embodiment can also support the MMT system as a media transport system for transmitting data such as video and audio.
  • That is, the stream format transmitted in the layer in which the advanced terrestrial digital broadcasting service is transmitted is, in principle, the MMT format.
  • the method of the stream obtained by demodulating the transmission wave in the fourth tuner/demodulator 130B is also the MMT method.
  • an MPEG-2 TS stream may be operated in an advanced terrestrial digital broadcasting service.
  • The MMT system is a media transport method newly developed in response to changes in the environment surrounding content distribution, such as the recent diversification of content, of devices that use content, of transmission paths for distributing content, and of content storage environments, which could not be fully addressed due to the limitations of the MPEG-2 TS system.
  • the video and audio signals of the broadcast program are coded as MFU (Media Fragment Unit)/MPU (Media Processing Unit), placed on an MMTP (MMT Protocol) payload, converted into MMTP packets, and transmitted as IP packets.
  • data content and subtitle signals related to broadcast programs are also in MFU/MPU format, put on an MMTP payload, converted into MMTP packets, and transmitted as IP packets.
  • IP packets are transmitted using UDP/IP (User Datagram Protocol/Internet Protocol) or TCP/IP (Transmission Control Protocol/Internet Protocol), and a TLV multiplexing method may be used for efficient transmission of IP packets.
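  • The carriage chain described above (MFU/MPU placed on an MMTP payload, converted into MMTP packets, and transmitted as IP packets) can be sketched in simplified form. This is an illustrative sketch only: the field layout is a conceptual simplification of ours, not the actual wire format.

```python
# Conceptual model of MFU/MPU -> MMTP payload -> MMTP packet -> IP packet.
from dataclasses import dataclass, field

@dataclass
class MPU:                     # Media Processing Unit: one coded media unit
    sequence_number: int
    mfus: list = field(default_factory=list)   # Media Fragment Units (bytes)

@dataclass
class MMTPPacket:              # MMT Protocol packet carrying an MPU payload
    packet_id: int
    payload: MPU

def to_ip_packets(mpus, packet_id, transport="UDP/IP"):
    """Wrap each MPU into an MMTP packet, then into a (simplified) IP packet."""
    return [{"transport": transport,
             "mmtp": MMTPPacket(packet_id=packet_id, payload=mpu)}
            for mpu in mpus]
```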
  • FIG. 10A shows the MMT protocol stack in the broadcast transmission path. Further, FIG. 10B shows an MMT protocol stack in a communication line.
  • the MMT system provides a mechanism for transmitting two types of control information: MMT-SI and TLV-SI.
  • MMT-SI is control information indicating the structure of a broadcast program. It is in the format of an MMT control message, is placed on an MMTP payload, converted into an MMTP packet, and transmitted as an IP packet.
  • TLV-SI is control information regarding multiplexing of IP packets, and provides information for channel selection and correspondence information between IP addresses and services.
  • TLV-SI and MMT-SI are prepared as control information.
  • TLV-SI consists of tables and descriptors. Tables are transmitted in section format, and descriptors are placed within the tables.
  • MMT-SI consists of three layers: messages that store tables and descriptors, tables that have elements and attributes that indicate specific information, and descriptors that indicate more detailed information.
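  • The three-layer MMT-SI structure described above (messages storing tables, tables carrying descriptors) can be modeled as nested containers. This is an illustrative sketch only: the class shapes and the example identifier values are ours, not the binary syntax of the standard.

```python
# Conceptual model of the three MMT-SI layers: message > table > descriptor.
from dataclasses import dataclass, field

@dataclass
class Descriptor:              # most detailed level of control information
    tag: int
    data: bytes = b""

@dataclass
class Table:                   # elements/attributes indicating specific info
    table_id: int
    descriptors: list = field(default_factory=list)

@dataclass
class Message:                 # outermost container transmitted as MMT-SI
    message_id: int
    tables: list = field(default_factory=list)

# Hypothetical example values, for illustration only.
pa_message = Message(message_id=0,
                     tables=[Table(table_id=0x20,
                                   descriptors=[Descriptor(tag=0x8010)])])
```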
  • FIG. 10C shows a list of tables used in TLV-SI of the MMT broadcasting system.
  • In this embodiment, the tables shown below are used as TLV-SI tables.
  • tables having the same meaning as the tables shown in FIGS. 9B and 9C may be further used.
  • FIG. 10D shows a list of descriptors used in TLV-SI of the MMT broadcasting system.
  • In this embodiment, the following descriptors are used as TLV-SI descriptors.
  • descriptors having the same meaning as the descriptors shown in FIGS. 9D, 9E, 9F, 9G, 9H, and 9I may also be used.
  • (1) Service List Descriptor (2) Satellite Delivery System Descriptor (3) System Management Descriptor (4) Network Name Descriptor (5) Remote Control Key Descriptor (6) Descriptors set by the operator
  • FIG. 10E shows a list of messages used in MMT-SI of the MMT broadcasting system. In this embodiment, the following messages are used as MMT-SI messages.
  • (1) PA (Package Access) message (2) M2 section message (3) CA message (4) M2 short section message (5) Data transmission message (6) Messages set by the operator
  • FIG. 10F shows a list of tables used in MMT-SI of the MMT broadcasting system.
  • In this embodiment, the following MMT-SI tables are used.
  • tables having the same meaning as the tables shown in FIGS. 9B and 9C may be further used.
  • MPT (MMT Package Table)
  • PLT (Package List Table)
  • LCT (Layer Control Table)
  • ECM (Entitlement Control Message)
  • EMM (Entitlement Management Message)
  • CAT (MH)
  • DCM (Download Control Message)
  • DMM (Download Management Message)
  • MH-EIT (MH-Event Information Table)
  • MH-AIT (MH-Application Information Table)
  • FIGS. 10G, 10H, and 10I show a list of descriptors used in MMT-SI of the MMT broadcasting system. In this embodiment, the following is used as the MMT-SI descriptor. Furthermore, descriptors having the same meaning as the descriptors shown in FIGS. 9D, 9E, 9F, 9G, 9H, and 9I may also be used.
  • (1) Asset Group Descriptor (2) Event Package Descriptor (3) Background Color Descriptor (4) MPU Presentation Region Descriptor (5) MPU Timestamp Descriptor (6) Dependency Descriptor (7) Access Control Descriptor (8) Scrambler Descriptor (9) Message Authentication Method Descriptor (10) Emergency Information Descriptor
  • (11) MH-MPEG-4 Audio Descriptor (12) MH-MPEG-4 Audio Extension Descriptor (13) MH-HEVC Descriptor (14) MH-Linkage Descriptor (15) MH-Event Group Descriptor (16) MH-Service List Descriptor (17) MH-Short Event Descriptor (18) MH-Extended Event Descriptor (19) Video Component Descriptor (20) MH-Stream Identifier Descriptor
  • (41) MPU Extended Timestamp Descriptor (42) MPU Download Content Descriptor (43) MH-Network Download Content Descriptor (44) Application descriptor (MH-Application Descriptor) (45) MH-Transport Protocol Descriptor (46) MH-Simple Application Location Descriptor (47) Application Boundary and Permission Descriptor (MH-Application Boundary and Permission Descriptor) (48) MH-Autostart Priority Information Descriptor (MH-Autostart Priority Descriptor) (49) MH-Cache Control Info Descriptor (50) MH-Randomized Latency Descriptor
  • FIG. 10J shows the relationship between data transmission and typical tables in an MMT broadcasting system.
  • In the MMT broadcasting system, data can be transmitted through multiple routes, such as a TLV stream via a broadcast transmission path and an IP data flow via a communication line.
  • The TLV stream includes TLV-SI such as the TLV-NIT and the AMT, as well as an IP data flow, which is a stream of IP packets.
  • The IP data flow includes a video asset consisting of a series of video MPUs and an audio asset consisting of a series of audio MPUs.
  • The IP data flow may also include a subtitle asset consisting of a series of subtitle MPUs, a text super asset consisting of a series of text superimpose MPUs, a data asset consisting of a series of data MPUs, and the like.
  • The MPT (MMT Package Table) describes the configuration of a package; the package ID and the asset ID of each asset included in the package are written in the MPT in association with each other.
  • The assets that make up a package may be only those in the TLV stream, but as shown in FIG. 10J, they may also include assets transmitted in the IP data flow of the communication line.
  • This is realized by including location information for each asset in the MPT, so that the broadcast receiving device 100 can identify the reference destination of each asset.
  • The location information for each asset can specify data transmitted via various transmission routes: (1) data multiplexed in the same IP data flow as the MPT (2) data multiplexed in an IPv4 data flow (3) data multiplexed in an IPv6 data flow (4) data multiplexed in a broadcast MPEG2-TS (5) data multiplexed in MPEG2-TS format within an IP data flow (6) data at a specified URL.
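The six location types above can be modeled as a simple tagged value. The following sketch is illustrative only (the enum names, the routing of each type, and the function are assumptions, not part of any standard receiver API): it maps an asset reference to a human-readable description of where the receiver must fetch it.

```python
from enum import Enum

class AssetLocationType(Enum):
    """Location types for assets referenced from the MPT; the numbering
    follows the six cases listed above (names are illustrative)."""
    SAME_IP_DATA_FLOW = 1    # multiplexed in the same IP data flow as the MPT
    IPV4_DATA_FLOW = 2       # multiplexed in an IPv4 data flow
    IPV6_DATA_FLOW = 3       # multiplexed in an IPv6 data flow
    BROADCAST_MPEG2_TS = 4   # multiplexed in a broadcast MPEG2-TS
    MPEG2_TS_IN_IP_FLOW = 5  # MPEG2-TS format within an IP data flow
    URL = 6                  # data at a specified URL

def describe_location(loc_type: AssetLocationType, detail: str = "") -> str:
    """Return a short description of the asset's reference destination.
    Which types travel via broadcast vs. communication is a simplifying
    assumption for illustration."""
    broadcast = {AssetLocationType.SAME_IP_DATA_FLOW,
                 AssetLocationType.BROADCAST_MPEG2_TS}
    route = ("broadcast transmission path" if loc_type in broadcast
             else "communication line")
    return f"{loc_type.name} via {route}" + (f" ({detail})" if detail else "")
```

A receiver implementation would dispatch on such a type per asset when resolving the references listed in the MPT.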
  • the MMT broadcasting system also has the concept of an event.
  • An event is a concept indicating a so-called program, and is handled by the MH-EIT, which is transmitted in an M2 section message.
  • A series of data included in the period from the start time stored in the MH-EIT through its duration is included in the concept of the event.
  • In the broadcast receiving device 100, the MH-EIT can be used for various processing on an event-by-event basis (for example, program guide generation processing, control of recording reservations and viewing reservations, and copyright management processing such as temporary storage).
  • The broadcast receiving apparatus 100 according to the embodiment of the present invention, which is compatible with advanced terrestrial digital broadcasting (or with advanced terrestrial digital broadcasting and current terrestrial digital broadcasting transmitted simultaneously on separate layers), needs to have a function of searching (scanning) all receivable channels at the receiving point and creating a service list (receivable frequency table) based on the service ID.
  • MFN: Multi Frequency Network
  • For broadcast services in which the broadcast receiving device 100 can acquire the service list stored in the TLV-NIT, there is no need to create a service list by scanning. Therefore, for advanced BS digital broadcasting or advanced CS digital broadcasting received by the fourth tuner/demodulator 130B, the initial scan and rescan described later are not necessary.
  • the broadcast receiving apparatus 100 has a rescan function in preparation for the opening of a new station, installation of a new relay station, change of reception point of a television receiver, and the like.
  • When a rescan detects a change in the receivable channels, the broadcast receiving apparatus 100 can notify the user to that effect.
  • FIG. 11A shows an example of an operation sequence of channel setting processing (initial scan/rescan) of the broadcast receiving apparatus 100 according to the embodiment of the present invention. Note that although the figure shows an example where MPEG-2 TS is adopted as the media transport method, the processing is basically the same when the MMT method is adopted.
  • the reception function control unit 1102 first sets the residential area (selects the area where the broadcast receiving device 100 is installed) based on the user's instruction (S101). At this time, instead of the user's instructions, the residential area may be automatically set based on the installation position information of the broadcast receiving device 100 acquired through predetermined processing.
  • the installation position information acquisition process information may be acquired from a network connected to the LAN communication unit 121, or information regarding the installation position may be acquired from an external device connected to the digital interface unit 125.
  • Next, the initial value of the frequency range to be scanned is set in the tuner/demodulator (when the first tuner/demodulator 130C, the second tuner/demodulator 130T, and the third tuner/demodulator 130L are not distinguished, they are referred to collectively in this way; the same applies hereinafter) (S102).
  • the tuner/demodulator executes tuning based on the instruction (S103), and if it succeeds in locking to the set frequency (S103: Yes), the process proceeds to S104. If the lock is not successful (S103: No), the process advances to S111. In the process of S104, the C/N is confirmed (S104), and if the C/N is greater than or equal to a predetermined value (S104: Yes), the process proceeds to S105 and a reception confirmation process is performed. If the C/N is not higher than the predetermined value (S104: No), the process proceeds to S111.
  • the reception function control unit 1102 first obtains the BER of the received broadcast wave (S105). Next, by acquiring and comparing the NIT, it is confirmed whether the NIT is valid data or not (S106). If the NIT acquired in the process of S106 is valid data, the reception function control unit 1102 acquires information such as the transport stream ID and original network ID from the NIT. Furthermore, distribution system information regarding the physical conditions of the broadcast transmission path corresponding to each transport stream ID/original network ID is acquired from the terrestrial distribution system descriptor. Additionally, a list of service IDs is acquired from the service list descriptor.
  • Next, the reception function control unit 1102 checks the service list stored in the receiving device to determine whether the transport stream ID acquired in the process of S106 has already been acquired (S107). If the transport stream ID acquired in the process of S106 has not already been acquired (S107: No), the various information acquired in the process of S106 is associated with the transport stream ID and added to the service list (S108). If the transport stream ID acquired in the process of S106 has already been acquired (S107: Yes), the BER acquired in the process of S105 is compared with the BER at the time the transport stream ID already recorded in the service list was acquired (S109). As a result, if the BER acquired in S105 is better (S109: Yes), the service list is updated using the various information acquired in S106 (S110). If the BER acquired in S105 is not better (S109: No), the various information acquired in S106 is discarded.
  • the remote control key ID may be acquired from the TS information descriptor and the remote control key may be associated with a typical service for each transport stream. This process enables one-touch channel selection, which will be described later.
  • the reception function control unit 1102 confirms whether the current frequency setting is the final value of the frequency range to be scanned (S111). If the current frequency setting is not the final value of the frequency range to be scanned (S111: No), the frequency value set in the tuner/demodulator is increased (S112), and the processes of S103 to S110 are repeated. If the current frequency setting is the final value of the frequency range to be scanned (S111: Yes), the process advances to S113.
  • the service list created (added/updated) in the above process is presented to the user as a result of the channel setting process (S113). Further, if there is a duplication of remote control keys, the user may be notified of this fact and urged to change the remote control key settings (S114).
  • the service list created/updated through the above processing is stored in a nonvolatile memory such as the ROM 103 or the storage unit 110 of the broadcast receiving device 100.
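The channel-setting sequence above (S101 to S113) amounts to a loop over candidate frequencies that keeps, for each transport stream ID, the entry received with the best (lowest) BER. The following Python sketch summarizes that control flow under stated assumptions: `tuner` is a hypothetical interface with `tune()`, `cn()`, `ber()`, and `nit()` methods, not an actual API of the broadcast receiving device 100.

```python
def channel_scan(tuner, start_mhz=473.0, stop_mhz=767.0, step_mhz=6.0,
                 cn_threshold=20.0):
    """Sketch of the initial-scan/rescan loop (S102-S112).

    `tuner` is a hypothetical object: tune(freq) returns True on lock (S103),
    cn() returns the carrier-to-noise ratio (S104), ber() the bit error rate
    (S105), and nit() the decoded NIT or None if invalid (S106)."""
    service_list = {}  # transport_stream_id -> (ber, nit info)
    freq = start_mhz
    while freq <= stop_mhz:
        if tuner.tune(freq) and tuner.cn() >= cn_threshold:  # S103, S104
            ber = tuner.ber()                                # S105
            nit = tuner.nit()                                # S106
            if nit is not None:
                ts_id = nit["transport_stream_id"]
                prev = service_list.get(ts_id)
                # S107-S110: add new entries; replace an existing entry
                # only when the new reception has a better (lower) BER.
                if prev is None or ber < prev[0]:
                    service_list[ts_id] = (ber, nit)
        freq += step_mhz                                     # S111, S112
    return service_list                                      # presented in S113
```

The defaults correspond to the current-service case described below (center frequencies 473 MHz to 767 MHz in 6 MHz steps); the C/N threshold is an arbitrary placeholder.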
  • FIG. 11B shows an example of the data structure of NIT.
  • “transport_stream_id” in the figure corresponds to the above-mentioned transport stream ID
  • “original_network_id” corresponds to the original network ID.
  • FIG. 11C shows an example of the data structure of the terrestrial distribution system descriptor. “guard_interval”, “transmission_mode”, “frequency”, etc. in the figure correspond to the above-mentioned distribution system information.
  • FIG. 11D shows an example of the data structure of a service list descriptor.
  • “service_id” in the figure corresponds to the above-mentioned service ID.
  • FIG. 11E shows an example of the data structure of the TS information descriptor.
  • "remote_control_key_id” in the figure corresponds to the above-mentioned remote control key ID.
  • The broadcast receiving device 100 may control the frequency range to be scanned as described above so that it is changed as appropriate depending on the broadcast service to be received. For example, when the broadcast receiving device 100 receives broadcast waves of the current digital terrestrial broadcasting service, it is controlled to scan the frequency range of 470 MHz to 770 MHz (corresponding to physical channels 13 ch to 62 ch). That is, control is performed so that the initial value of the frequency range is set to 470 MHz to 476 MHz (center frequency 473 MHz), the final value of the frequency range is set to 764 MHz to 770 MHz (center frequency 767 MHz), and the frequency value is increased by +6 MHz in the process of S112.
  • On the other hand, when the broadcast receiving device 100 receives broadcast waves of the advanced digital terrestrial broadcasting service, it is controlled to scan the frequency range of 470 MHz to 1010 MHz (when the frequency conversion processing shown in FIG. 7D or the frequency conversion amplification processing shown in FIG. 8C is applied). That is, the initial value of the frequency range is set to 470 MHz to 476 MHz (center frequency 473 MHz), the final value of the frequency range is set to 1004 MHz to 1010 MHz (center frequency 1007 MHz), and the frequency value is increased by +6 MHz in the process of S112.
  • the broadcast receiving device 100 can control the selection of the frequency range to be scanned based on the system identification, frequency conversion processing identification, etc. of the TMCC information.
  • When the broadcast receiving apparatus 100 receives an advanced terrestrial digital broadcasting service using a polarized wave multiplex transmission method, one of the channel selection/detection section 131H and the channel selection/detection section 131V may scan the frequency range of 470 MHz to 770 MHz, and the other may scan the frequency range of 770 MHz to 1010 MHz (when frequency conversion processing is applied to the polarized transmission wave). By controlling in this manner, it is possible to reduce the time required for channel setting.
  • Alternatively, the operation sequence of FIG. 11A may be advanced in parallel in both the channel selection/detection section 131H and the channel selection/detection section 131V, synchronizing the frequency-up loop of S112 in the operation sequence of FIG. 11A.
  • In this configuration, the pair of a horizontally polarized signal and a vertically polarized signal transmitted on the same physical channel is received in parallel.
  • In this case, control information and the like inside the packet stream of the advanced terrestrial digital service transmitted as a pair of the horizontally polarized signal and the vertically polarized signal can be decoded and acquired during the loop processing, which is preferable because scanning and creation of the service list proceed efficiently.
  • When the broadcast receiving apparatus 100 has the configuration shown in FIG. 8B and further has a so-called double tuner configuration in which a plurality of tuners/demodulators (channel selection/detection units) are provided (for example, a plurality of third tuner/demodulators 130L), the configuration shown in FIG. 8D may also be used. When receiving an advanced terrestrial digital broadcasting service using a hierarchical division multiplex transmission method, one of the double tuners may scan the frequency range of 470 MHz to 770 MHz, and the other may scan the frequency range of 770 MHz to 1010 MHz (when frequency conversion amplification processing is performed). By controlling in this manner, it is possible to reduce the time required for channel setting, as described above.
  • In the configuration shown in FIG. 8B, the terrestrial digital broadcasting service transmitted in either the upper layer or the lower layer is the current terrestrial digital broadcasting service. Therefore, for example, among the frequency range of 470 MHz to 770 MHz and the frequency range of 770 MHz to 1010 MHz, the first tuner/demodulator 130C may scan the frequency range in which the current digital terrestrial broadcasting service is transmitted, and the third tuner/demodulator 130L may scan the other frequency range in parallel. In this case as well, it is possible to reduce the time required for channel setting, similar to the above-described parallel scanning using the double tuner of the third tuner/demodulator 130L.
  • Which frequency range carries the current service can be identified by having the third tuner/demodulator 130L perform reception at two points in total, one in each frequency range, for example 470 MHz to 476 MHz (center frequency 473 MHz) and 770 MHz to 776 MHz (center frequency 773 MHz), acquiring the TMCC information transmitted on each frequency, and referring to parameters (for example, the system identification parameter) stored in the TMCC information.
  • In the polarized wave multiplex transmission method, both horizontally polarized signals and vertically polarized signals are transmitted. A channel carrying a broadcast program transmitted using both polarized signals may be written in the service list as a single channel. In addition, in the case of the 2K broadcast program on the B layer shown in the same figure, if the same broadcast program is transmitted on the B layer of the horizontally polarized signal and the B layer of the vertically polarized signal, it is sufficient to store it in the service list as one channel even if the same transport stream ID is detected for both.
  • The broadcast receiving device 100 has program selection functions such as one-touch tuning using the one-touch keys on the remote controller, channel up/down tuning using the channel up/down key on the remote controller, and direct tuning by directly inputting a 3-digit number. Any of these channel selection functions may be performed using the information stored in the service list generated by the above-mentioned initial scan/rescan. In addition, after a channel is selected, information on the selected channel is displayed on a banner or the like.
  • <Direct channel selection processing example> (1) When direct channel selection is selected, the system waits for input of a 3-digit number. (2-1) If the input of the 3-digit number is not completed within a predetermined time (about 5 seconds), the device returns to the normal mode and displays the channel information of the currently selected service. (2-2) When the input of the 3-digit number is completed, it is determined whether the channel exists in the service list of the receivable frequency table; if not, a message such as "This channel does not exist" is displayed. (3) If the channel exists, channel selection processing is performed, last mode is set, and the channel information is displayed after channel selection.
  • The channel selection operation is performed based on SI, and if it is determined that broadcasting is suspended, the device may also have a function of displaying this fact to notify the user.
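The direct channel-selection steps above reduce to a small decision function. The sketch below is illustrative only (the timeout is abstracted as a flag, and all names are assumptions rather than receiver APIs):

```python
def direct_select(digits: str, service_list: dict, timed_out: bool = False):
    """Decision logic for direct tuning with a 3-digit number.

    service_list maps 3-digit channel numbers to service info (from the
    receivable frequency table). Returns an (action, payload) pair."""
    if timed_out or len(digits) < 3:
        # (2-1) input not completed within the predetermined time:
        # return to normal mode, showing the currently selected service
        return ("return_to_normal", None)
    if digits not in service_list:
        # (2-2) channel not in the receivable frequency table
        return ("message", "This channel does not exist")
    # (3) perform channel selection, set last mode, display channel info
    return ("tuned", service_list[digits])
```

A real receiver would drive this from key events, restarting the 5-second timer on each digit.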
  • FIG. 12A shows an example of an external view of a remote controller used to input operation instructions to the broadcast receiving apparatus 100 according to the embodiment of the present invention.
  • The remote control 180R includes a power key 180R1 for powering the broadcast receiving device 100 on/off (standby on/off), cursor keys (up, down, left, right) 180R2 for moving the cursor, a determination key 180R3 for confirming the item at the cursor position as the selected item, and a return key 180R4.
  • The remote control 180R also includes network switching keys (advanced terrestrial digital, terrestrial digital, advanced BS, BS, CS) 180R5 for switching the broadcast network received by the broadcast receiving apparatus 100.
  • In addition, the remote control 180R has one-touch keys (1 to 12) 180R6 used for one-touch tuning, a channel up/down key 180R7 used for channel up/down tuning, and a 10-key pad used for inputting the 3-digit number for direct tuning. In the example shown in the figure, the 10-key pad is shared with the one-touch keys 180R6, and during direct channel selection, a 3-digit number can be input by operating the one-touch keys 180R6 after pressing the direct key 180R8.
  • the remote control 180R also includes an EPG key 180R9 for displaying a program guide and a menu key 180RA for displaying a system menu.
  • the program guide and system menu can be operated in detail using the cursor key 180R2, enter key 180R3, and return key 180R4.
  • Furthermore, the remote control 180R includes a d key 180RB used for data broadcasting services and multimedia services, a cooperation key 180RC for displaying a list of broadcasting/communication cooperation services and their compatible applications, and color keys (blue, red, green, yellow) 180RD.
  • For data broadcasting services, multimedia services, broadcasting/communication cooperation services, and the like, detailed operations are possible using the cursor keys 180R2, the determination key 180R3, the return key 180R4, and the color keys 180RD.
  • The remote control 180R also has a video key 180RE for selecting related video, an audio key 180RF for switching the audio ES and bilingual audio, and a subtitle key 180RG for switching subtitles on/off and switching the subtitle language.
  • the remote controller 180R also includes a volume key 180RH for increasing/decreasing the volume of audio output, and a mute key 180RI for switching on/off the audio output.
  • The remote control 180R of the broadcast receiving apparatus 100 includes an "advanced terrestrial digital key", "terrestrial digital key", "advanced BS key", "BS key", and "CS key" as the network switching keys 180R5.
  • The "advanced terrestrial digital key" and the "terrestrial digital key" are used in the advanced terrestrial digital broadcasting service, for example, when a 4K broadcast program and a 2K broadcast program are simultaneously broadcast in different layers: when the "advanced terrestrial digital key" is pressed, priority is given to selecting the 4K broadcast program when selecting a channel, and when the "terrestrial digital key" is pressed, priority is given to selecting the 2K broadcast program when selecting a channel.
  • When performing channel selection by one-touch tuning, channel up/down tuning, direct tuning, or the like, the broadcast receiving device 100 has a function of displaying information on the selected channel by means of a banner display or the like.
  • FIG. 12B shows an example of a banner display when selecting a channel.
  • Banner display 192A1 is an example of a banner display shown when a 2K broadcast program is selected. For example, the program name, program start/end time, network type, remote control direct channel selection key number, service logo, and 3-digit number may be displayed.
  • Banner display 192A2 is an example of a banner display shown when a 4K broadcast program is selected. A mark symbolizing "advanced" is also displayed.
  • When conversion processing has been performed on the selected program, a display indicating this may be shown. In the example of the banner display 192A2, for example, it is displayed that down-conversion processing from UHD resolution to HD resolution and downmix processing from 22.2ch to 5.1ch have been performed.
  • As described above, according to the present embodiment, it becomes possible to provide transmission technology and reception technology for a more sophisticated advanced digital broadcasting service that takes into consideration compatibility with the current digital broadcasting service. That is, it is possible to provide a technique for more suitably transmitting or receiving advanced digital broadcasting services.
  • Example 2 [Advanced audio signal]
  • the audio signal in current systems is a channel-based signal that corresponds to a speaker.
  • Channel-based signals include those of 5.1ch and those of 22.2ch (here, "ch” is an abbreviation for "channel”).
  • In the advanced system, audio signals including object-based signals and HOA (Higher Order Ambisonics) signals are handled.
  • An object-based signal is an audio signal that allows the receiver to change the playback position, such as the voice of a narrator, such as placing it on the right or left side.
  • the playback position is not fixed and may be changed dynamically.
  • The HOA method signal is a signal that expands the sound field as a sum of spherical harmonics. Since there is an upper limit to the transmission capacity, the expansion is truncated at a finite order. Since the channel-based signal is basically recorded at microphone positions corresponding to a standard speaker arrangement, it is suitable for audio reproduction using a group of speakers placed at or near the standard arrangement. On the other hand, the HOA method records spatial sound field information independently of a specific speaker arrangement, so it is suitable for supporting arbitrary speaker arrangements.
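To illustrate why the spherical-harmonic expansion must be truncated at a finite order: the number of coefficient signals grows quadratically with the order, so the transmission capacity bounds the usable order. The following helper (not part of the specification; a standard ambisonics fact) computes the signal count for a full 3D HOA representation.

```python
def hoa_channel_count(order: int) -> int:
    """Number of spherical-harmonic coefficient signals for a 3D sound-field
    expansion truncated at `order`: each degree n contributes 2n + 1
    harmonics, so the total is (order + 1)**2."""
    return sum(2 * n + 1 for n in range(order + 1))
```

For example, first-order ambisonics needs 4 signals, while fourth order already needs 25, which shows how quickly the required capacity rises.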
  • Examples of standard speaker placement are shown in FIGS. 13A, 13B, and 13C.
  • the speaker group is divided into three groups, upper layer, middle layer, and lower layer, depending on the height of the installation position.
  • the arrangement of each group is as shown in FIGS. 13B and 13C.
  • FIG. 13B shows the arrangement of a 22.2ch speaker system
  • FIG. 13C shows the arrangement of a 7.1ch speaker system.
  • the number below the decimal point in the channel number display is the number of channels of the low frequency signal
  • the corresponding speakers are LFE1, LFE2, and LFE.
  • the other signal channels are called main channels.
  • a 5.1ch speaker system is obtained by removing the upper layer speaker from the 7.1ch speaker system.
  • In the current system, if the speaker system has the same number of speakers as the number of channels of the channel-based audio signal, the signal is played back as is; if the number of speakers differs, the format is converted to match the number of speakers in the speaker system before playback. In particular, when the number of speakers is smaller than the number of audio signal channels, this format conversion is called downmixing. Format conversion is also performed when the speaker positions assumed when the audio signal was created differ from the actual positions of the speaker system. The sound to be output from each actual speaker position is synthesized by weighting and adding the audio signals of the channels.
  • If each speaker is placed at the same distance from the expected standard viewing position of the viewer, there is no need to adjust the playback timing; however, if the actual speakers are not placed at the same distance from the viewer's position, the playback timing may also be adjusted.
  • the format conversion is as shown in Equation 1 below.
  • s (ch) n (t) is a channel-based audio signal transmitted by broadcasting/communication
  • n is a signal number
  • the number of channel-based signals is N (ch) .
  • t is time.
  • p (ch) m (t) is an audio signal input to a speaker
  • m is a speaker number
  • the number of speakers is M.
  • g (ch) mn is a weighting coefficient for the channel base signal.
  • Δt m is the delay adjustment time corresponding to the deviation of each speaker's distance from R o , the distance between the speaker farthest from the standard viewing position and the standard viewing position.
  • When the speakers are not equidistant, the weighting coefficient g (ch) mn may be corrected to (R m /R o ) times the weighting coefficient g (ch) mn for the equidistant case, thereby balancing the volume between the speakers.
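The channel-based format conversion of Equation 1 (published as an image, so only its symbol definitions appear in the text above) is a weighted sum over the channel signals with a per-speaker delay Δt m. The sketch below is a minimal illustration under those definitions; all function and variable names are assumptions, and the delay is expressed in whole samples for simplicity.

```python
def format_convert(s, g, delay_samples):
    """Channel-based format conversion in the form described for Equation 1:
    p_m(t) = sum_n g_mn * s_n(t - dt_m).

    s: list of N channel-based signals (each a list of T samples),
    g: M x N weighting coefficients g_mn (one row per speaker),
    delay_samples: per-speaker delay dt_m, in whole samples."""
    t_len = len(s[0])
    p = []
    for m, row in enumerate(g):
        # weighted sum over the channel-based signals at each time index
        mix = [sum(w * ch[t] for w, ch in zip(row, s)) for t in range(t_len)]
        # apply the delay dt_m, zero-padding the start of the output
        d = delay_samples[m]
        p.append(([0.0] * d + mix[: t_len - d]) if d > 0 else mix)
    return p
```

The distance correction mentioned above would simply scale row m of `g` by (R m /R o ) before calling this function.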
  • For object-based signals and HOA method signals as well, the transmitted signals are weighted and summed, similar to the format conversion formula for channel-based signals, and converted into signals input to the speakers, as shown in Equations 3 and 4.
  • the meanings of the symbols are the same as those for channel-based signals; the superscript (ch) indicates a symbol for a channel-based signal, (obj) indicates a symbol for an object-based signal, and (HOA) indicates that the signal corresponds to the HOA system signal.
  • the audio signal p m (t) input to the speaker system is given by the following equation 5.
  • The weighting coefficient g (*) mn is determined by the relationship between the speaker arrangement and the standard viewing position, but the weighting coefficient g (obj) mn for the object-based signal is determined by also taking into account the playback position of each individual object. Note that content common to all signals is expressed using the superscript (*).
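Equations 1 through 5 appear as images in the published application. Based on the symbol definitions given above, Equation 5 and the per-component conversions can plausibly be reconstructed as follows (a reconstruction consistent with the text, not the published figures themselves):

```latex
p_m(t) = p^{(\mathrm{ch})}_m(t) + p^{(\mathrm{obj})}_m(t) + p^{(\mathrm{HOA})}_m(t),
\qquad
p^{(*)}_m(t) = \sum_{n=1}^{N^{(*)}} g^{(*)}_{mn}\, s^{(*)}_n\!\bigl(t - \Delta t_m\bigr)
```

Here the superscript (*) stands for any of (ch), (obj), and (HOA), as stated above.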
  • FIG. 14A shows the positional relationship with the broadcast receiving apparatus 100 when headphones are used.
  • the listening position is the midpoint of the line segment connecting the left and right audio output sections of the headphones.
  • the sound field created by the audio signal is based on a reference coordinate system in which the center direction of the receiver screen is the reference direction.
  • the audio output section of the headphones changes its position within this reference coordinate system due to the rotation of the user's head (FIG. 14B). Therefore, the weighting coefficient g (*) mn for synthesizing the input signals to the audio output section of the headphone is calculated by taking into account the position of the audio output section of the headphone within the reference coordinate system at that time.
  • The position of the audio output section of the headphones can be obtained, for example, by recording the position at which the user faces the center of the receiver screen based on user input, and then detecting subsequent changes in the direction of the user's face using a gyro sensor or the like installed in the headphones. Note that although FIG. 14B shows the arrangement in a plane, the position in the height direction may also be considered.
  • FIG. 15A shows a configuration example of the audio decoder 10000 when the audio signal to be transmitted is only a channel base signal.
  • a core decoder 10001 decodes an audio bitstream multiplexed and transmitted through broadcasting or communication into signals for each channel.
  • the format converter 10002 performs the above format conversion and outputs an audio signal for speakers and an audio signal for headphones. Output to external equipment may be performed wirelessly.
  • FIG. 15B is a configuration example of an audio decoder 10100 that supports advanced audio signals.
  • the core decoder 10101 decodes an audio bitstream that has been multiplexed and transmitted through broadcasting or communication into individual signals.
  • each signal is a channel base signal, an object base signal, and an HOA method signal. Even in the case of advanced audio signals, output to external equipment may be performed wirelessly.
  • the channel base signal is converted by the format converter 10102 into a signal for each speaker according to Equation 1 according to the speaker arrangement.
  • the channel base signal is also converted to a signal for headphones.
  • For this conversion, speaker arrangement information stored in the receiver is used.
  • FIG. 16 shows an example of speaker arrangement information.
  • The placement information consists of, for each number distinguishing a speaker, the speaker type (main channel or low-frequency channel), the azimuth position, the height position (elevation angle or depression angle), and the distance from the viewer's head position.
  • The azimuth position is the angular position measured from the front direction, which is defined as 0° when viewed from the viewing position; a positive value is an angular position toward the left, and a negative value is an angular position toward the right.
  • The position in the height direction is defined as 0° in the horizontal direction when viewed from the position of the viewer's head; a positive value represents an elevation angle, and a negative value represents a depression angle.
  • The above-mentioned weighting coefficient g (ch) mn is set based on this information and the configuration of the channel base signal. Further, the delay time adjustment Δt m is performed based on the distance information; if there is no distance information, the delay time adjustment is not performed.
  • Although the speaker arrangement information here is expressed in polar coordinates, it may also be expressed in orthogonal coordinates.
  • This speaker placement information may be standard placement information such as a 5.1ch speaker system, or may be speaker placement information specific to the receiver. Alternatively, the location information of the speaker system customized by the receiver user may be used. At this time, the user-customized arrangement information is registered before viewing the program, so that the user can set which arrangement information is to be used. Furthermore, it may be possible to switch between the speaker system provided in the receiver and the speaker system customized by the user. Furthermore, the speaker system used for each program may be reserved. Alternatively, a speaker system may be set for each type of program, each time slot, and each viewer. It becomes possible to reproduce audio according to the program content and the viewing environment at the time, improving convenience for the user.
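The arrangement-information record of FIG. 16 can be modeled as a simple data structure. The field and function names below are illustrative assumptions (the figure defines only the conceptual columns), sketching the entry per speaker and the rule that delay adjustment applies only when distance information is available.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SpeakerPlacement:
    """One entry of the speaker arrangement information of FIG. 16
    (field names are illustrative, not from the specification)."""
    index: int                          # number distinguishing the speaker
    kind: str                           # "main" or "lfe" (low-frequency channel)
    azimuth_deg: float                  # 0 = front; positive = left, negative = right
    elevation_deg: float                # 0 = horizontal; positive = elevation,
                                        # negative = depression
    distance_m: Optional[float] = None  # distance from the viewer's head, if known

def needs_delay_adjustment(placements: List[SpeakerPlacement]) -> bool:
    """Per the text, the delay adjustment (the dt_m term) is performed only
    when distance information exists, and equidistant speakers need none."""
    ds = [p.distance_m for p in placements if p.distance_m is not None]
    return len(ds) == len(placements) and len(set(ds)) > 1
```

User-customized arrangements registered before viewing, as described above, would simply be stored as lists of such records.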
  • In the downmix conversion formulas above, g1, g2, g3, g4, g5, and g6 are weighting coefficients (downmix coefficients), and their default values are shown in FIG. 17A.
  • This downmix coefficient is transmitted as metadata of the audio signal, but the default value is used until it is received.
  • This conversion formula and the default values of the weighting coefficients are the same as those used in a system that handles audio signals with only channel-based signals.
  • When converting an audio signal with a larger number of channels than 5.1ch to a signal to be output to a 2ch speaker system, the signal is first downmixed to a 5.1ch signal and then downmixed to a 2ch signal.
  • An example of a conversion formula for downmixing from 5.1ch to 2ch is shown below.
  • Lt' = L + g7*C + g8*Ls (Formula 12)
  • Rt' = R + g7*C + g8*Rs (Formula 13)
  • the above g7 and g8 are weighting coefficients (downmix coefficients), and default values are shown in FIG. 17B. This downmix coefficient is transmitted as metadata of the audio signal, but the default value is used until it is received.
  • This conversion formula and the default values of the weighting coefficients are the same as those used in a system that handles audio signals with only channel-based signals.
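As an illustration, Formulas 12 and 13 can be applied per sample as follows; the function name is ours, and the coefficient values used below are placeholders rather than the actual FIG. 17B defaults:

```python
def downmix_5_1_to_2ch(L, R, C, Ls, Rs, g7, g8):
    """Downmix one sample of a 5.1ch signal (L, R, C, Ls, Rs) to 2ch.
    g7 and g8 are the downmix coefficients transmitted as audio
    metadata (the FIG. 17B defaults are used until metadata arrives).
    The LFE channel does not appear in Formulas 12 and 13."""
    Lt = L + g7 * C + g8 * Ls   # Formula 12
    Rt = R + g7 * C + g8 * Rs   # Formula 13
    return Lt, Rt
```

For a signal with more than 5.1 channels, this step would follow an initial downmix to 5.1ch, as described above.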
  • By sharing processing between systems in this way, signal processing units can also be shared, which reduces the overall size of a receiver that supports both systems.
  • the speaker systems used include a built-in speaker built into the receiver, an external speaker connected by wire, and an external speaker connected wirelessly.
  • the speaker to be used may be selected by remote control operation (for example, selection using arrow buttons) or by a linked terminal such as a smart phone.
  • FIG. 18 shows an example of a selection menu for speaker settings.
  • FIG. 18 shows that the "external speaker 1" system is selected.
  • the connection with the external speaker system may be a combination of wired connection and wireless connection.
  • the user definition is a speaker system that combines speakers of each system. For example, it is a system that combines a built-in speaker with an external speaker for expansion. By preparing a selection menu, the optimal speaker system can be easily selected according to the viewer's preference and viewing environment at the time, improving convenience for the viewer.
  • the arrangement information shown in FIG. 16 is required.
  • This placement information may be input by the viewer himself or may be downloaded from the site of the receiver manufacturer or speaker manufacturer.
  • This arrangement information may be downloaded by the receiver upon receiving identification information such as the model number of the speaker system.
  • placement information recorded on the speaker body may be transmitted to the receiver.
  • the arrangement information may be created and corrected by measuring the actual arrangement state of the speakers through a cooperative operation between the receiver and the speakers.
  • the arrangement state may be measured using, for example, a distance measuring device such as a camera or UWB (Ultra Wideband) provided in the receiver or the speaker or both.
  • the weighting coefficients used for format conversion may be provided externally. This weighting coefficient may be inputted by the user himself, inputted into the receiver through communication from the speaker system, or acquired by the receiver from the server.
  • the speaker system may have an audio conversion function.
  • the receiver may adjust the signal output to the speaker system in response to a request from the speaker system. Adjustments in the signals to be output include, for example, adjustments to the number of channels of channel base signals, the number of object base signals, the number of HOA system signals, the range of metadata necessary for their reproduction, and the like. If there is a limit to the signals that can be output depending on the program (for example, the number of channels), the output signal may be adjusted for each program.
  • If the external speaker system also includes an audio bitstream decoder, the audio bitstream may be output to the externally connected speaker system through the bitstream output controller 10106 shown in FIG. 15B. By appropriately adjusting the output signal, correct operation of the audio output is ensured.
  • the object base signal is a signal for each sound source, and is separated from the bitstream signal into a signal for each sound source by the core decoder 10101.
  • the object renderer 10103 calculates an output signal for each speaker based on the accompanying playback position information.
  • the actual placement information of the speaker system is also taken into consideration.
  • Alternatively, the object-based signal may first be converted into a 22.2ch signal, and the output signal to the speaker system may then be calculated in the same way as for the channel-based signal. This allows some processing to be shared.
  • weighting coefficients for calculating the output signal to the speaker may be given from the outside.
  • the weighting coefficients may be set by the user, may be obtained through communication with the speaker system, or may be obtained from the server. At this time, since the weighting coefficients vary depending on the sound source position, they are given in a table format or a function format.
  • Since the 22.2ch signal that takes the sound source position into account is processed within the receiver, it is also possible to provide only the conversion coefficients for converting the 22.2ch signal into the external speaker signals.
  • Object-based signals are divided into those that allow the viewer to specify the playback position of the sound source and those that do not.
  • This position designation and whether or not the designated position can be changed are transmitted as metadata for each sound source, and the receiver changes processing depending on the designation of whether or not the designated position can be changed.
  • a parameter is transmitted to the receiver that describes information as to whether or not the user is permitted to set the playback position for each object-based signal.
  • FIG. 19 is an example of metadata.
  • FIG. 19 shows a case where there are three types of sound sources: narration, vocals, and guitar.
  • the position of the narration voice can be changed, and a replacement sound source is also prepared.
  • Replacement sound sources are distinguished by sub-ID.
  • For example, sub-ID a is Japanese, b is English, and c is French.
  • the user instruction may be given for each program, or may be set to always change to a specific language.
  • the replacement signal may be transmitted using a general-purpose user area.
  • The default playback position of the narration audio may be set by the output levels of the speakers of a standard speaker system, such as a 22.2ch system, or may be set as a direction as seen from the standard viewing position.
  • Examples of playback position data are shown in FIGS. 20A, 20B, and 20C.
  • The playback position can be expressed as speaker output levels (FIG. 20A), polar coordinates with the standard viewing position as the origin (FIG. 20B), or orthogonal coordinates with the standard viewing position as the origin (FIG. 20C).
  • Although FIG. 20A shows the output levels of two speakers, the output levels of three or more speakers may be set.
  • In the orthogonal coordinate display of FIG. 20C, the direction parallel to the screen toward the right is the positive X-axis direction, the direction perpendicular to the screen from the origin toward the screen is the positive Y-axis direction, and the vertically upward direction is the positive Z-axis direction.
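As an illustration of the two coordinate displays, a conversion from the polar form of FIG. 20B to the orthogonal axes of FIG. 20C might look like this; the angle conventions (azimuth measured from the +Y axis toward +X, elevation from the horizontal plane) are assumptions for illustration, as the actual conventions are defined by the metadata format:

```python
import math

def polar_to_xyz(azimuth_deg, elevation_deg, distance):
    """Convert a playback position in polar coordinates (FIG. 20B
    style) to the orthogonal coordinates of FIG. 20C: X positive to
    the right parallel to the screen, Y positive from the viewing
    position toward the screen, Z positive vertically upward."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance * math.cos(el) * math.sin(az)
    y = distance * math.cos(el) * math.cos(az)
    z = distance * math.sin(el)
    return x, y, z
```

A position straight ahead of the viewer (azimuth 0, elevation 0) thus maps onto the positive Y-axis.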
  • preset data 1 is the default playback position, and it is changed to other preset data according to a user instruction.
  • a configuration may be adopted in which a playback position defined by the user can be used in response to a user instruction.
  • FIG. 21 shows an example of selection of the playback position of the narration audio by the user.
  • preset positions are indicated by buttons and selected using the arrow keys on the remote control.
  • the example in FIG. 21 shows a state where the left position is selected.
  • the playback position of the narration audio may be selected in a far or near direction.
  • a narration sound corresponding to the button selection state may be played.
  • the volume of the narration may also be set independently. For example, on the narration volume setting screen, use the volume change button on the remote control to set the volume.
  • FIG. 22 shows an example of setting the narration position by the user.
  • the playback position is not selected from preset positions, but is set freely within a predetermined range.
  • the black circles in the figure represent the set positions, and the user adjusts the positions using the arrow buttons on the remote control.
  • the narration volume setting is the same as in FIG. 21.
  • the range that can be set by the user may be set for each type of sound source, and the display of the setting scale may be limited to the range that can be set by the user. Furthermore, if there is a possibility that the volume may become too loud depending on the playback position, the range of volume settings may also be limited. By being able to change the reproduction position and volume of the object-based signal as described above, it is possible to realize audio reproduction more suitable for the user.
  • FIG. 24A is an example of stream data in which the playback position is specified by the output level of the speaker
  • FIG. 24B is an example of stream data in which the playback position is specified in polar coordinates
  • FIG. 24C is an example of stream data in which the playback position is specified in orthogonal coordinates.
  • the playback time on the stream data may be set arbitrarily or may be synchronized with the frame of the program image.
  • the volume may be adjusted for each sound source separately from the overall volume adjustment. In this way, by specifying the playback position and volume for each sound source, it is possible to achieve excellent sound image localization and to playback audio that matches the user's preferences.
  • the signal of the HOA system is a signal in which a sound field is expanded by spherical harmonic functions from the 0th order to a certain order.
  • The order of the spherical harmonics is a non-negative integer n, and there are 2n+1 spherical harmonics of order n. Therefore, the number of signals in an HOA-method signal expanded by spherical harmonics up to order n is (n+1)².
  • the highest order of the spherical harmonics used for expansion and the number of HOA system signals are summarized in FIG. 25.
  • As can be seen from the figure, the number of HOA signals increases rapidly as the order increases. Since the number of channel-based signals corresponding to the 22.2ch speaker system is 24, a 4th-order HOA-method signal consisting of 25 signals has a comparable amount of information.
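The relation stated above, namely 2n+1 harmonics of order n and hence (n+1)² signals up to order n, can be checked with a few lines:

```python
def hoa_signal_count(max_order):
    """Number of HOA signals when the sound field is expanded by
    spherical harmonics up to max_order: there are 2n+1 harmonics
    of order n, so the total is sum_{n=0}^{N}(2n+1) = (N+1)**2."""
    return sum(2 * n + 1 for n in range(max_order + 1))

# A 4th-order HOA signal has 25 component signals, comparable to
# the 24 channel-based signals of a 22.2ch system.
assert hoa_signal_count(4) == 25 == (4 + 1) ** 2
```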
  • The HOA-method signal is first separated into the signals of the spherical harmonic expansion by the core decoder 10101, and is then converted into the signals to be output to the speakers by the HOA-method dedicated decoder 10104 based on the speaker arrangement information.
  • This conversion may be performed by directly generating the output signal for the speaker system to be used, or by first converting the signal into a 22.2ch signal and then performing the conversion in the same way as for the channel-based signal. The latter allows some processing to be shared.
  • weighting coefficients for calculating the output signal to the speaker may be given from the outside.
  • the weighting coefficients may be set by the user, may be obtained through communication with the speaker system, or may be obtained from the server.
  • the difference between this HOA system signal and the channel-based signal is the difference in the density of information in the viewing space.
  • the information density of the HOA system signal is isotropic and uniform, whereas the information density of the channel base signal is high in the front direction where many speakers are arranged.
  • With a standard speaker arrangement there is no problem with audio playback using channel-based signals, but with a speaker system that differs significantly from the standard arrangement, or with headphones whose orientation in space can change significantly, more suitable playback sound can be enjoyed by using HOA-method signals.
  • FIG. 26 shows an audio signal selection screen for each output device.
  • the channel base signal is expressed as "front-oriented type” and the HOA method signal is expressed as "omnidirectional type".
  • In this example, the channel-based signal is selected as the audio signal output from the speaker, and the HOA-method signal is selected as the audio signal output from the headphones.
  • This setting may be performed for each program, or may be a common setting regardless of the program. Further, as a default setting, a channel base signal may be used for the speaker, and an HOA signal may be used for the headphone. Furthermore, if the signal set for the output device is not provided for the program, another signal may be used.
  • high-order signals may be transmitted via the Internet.
  • For example, signals from the 0th order to the 4th order may be transmitted by broadcasting, and the signals of the 5th order and above via the Internet. Even if the Internet signal is interrupted for some reason, the audio is not completely lost, so this is not a major problem. Further, the signals delivered via the Internet may be downloaded all at once to the receiver before the start of broadcasting. In this way, the influence of Internet failures during viewing can be eliminated.
  • Alternatively, the HOA-method signal may not be transmitted by broadcasting at all, and may be transmitted only via the Internet. If the HOA-method signal is interrupted due to an Internet failure, the channel-based signal can be used instead, so the audio is not completely lost. In this case as well, the signals delivered via the Internet may be downloaded all at once to the receiver before the start of broadcasting. In this way, the influence of Internet failures during viewing can be eliminated.
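The order-based split described above (orders 0 through 4 by broadcast, order 5 and above via the Internet) can be sketched as follows, assuming a flat indexing of the expansion signals; the function name and the index representation are ours:

```python
def split_hoa_signal_indices(total_order, broadcast_max_order=4):
    """Split HOA signal indices between broadcast and Internet
    delivery: the (broadcast_max_order + 1)**2 signals of orders
    0..broadcast_max_order go over broadcast, and the remaining
    higher-order signals go over the Internet."""
    n_broadcast = (broadcast_max_order + 1) ** 2
    n_total = (total_order + 1) ** 2
    broadcast = list(range(n_broadcast))
    internet = list(range(n_broadcast, n_total))
    return broadcast, internet
```

Setting `broadcast_max_order` below zero would model the alternative case in which all HOA signals are delivered via the Internet.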
  • the method of transmitting part of the audio signal via the Internet may be used for channel-based signals and object-based signals.
  • signals of 24 channels corresponding to 22.2ch may be transmitted by broadcasting, and additional channels may be transmitted over the Internet. This makes the information density more isotropic and can also be used as an alternative to HOA type signals.
  • For object-based signals, some of the signals for the plurality of sound sources may be transmitted over the Internet.
  • The audio output described above is premised on outputting audio to the primarily used system, whether an internal speaker system or an external speaker system. However, when multiple users use the system, some users may wish to hear different sub-audio for themselves. To meet such a request, another output device may be added to the main system to perform separate audio playback; for example, a user can listen to secondary audio using a smartphone as a linked device.
  • the output device at this time may be selected as appropriate, such as a speaker, normal earphones, or open-air earphones.
  • FIG. 27 shows an example of selecting audio to be played back on the smart phone 10300, which is a linked device.
  • On represents playback, and off represents non-playback; playback and non-playback are switched according to the user's preferences.
  • Here, the overall audio is the audio of the channel-based signal, and the individual audio is the audio of the object-based signal.
  • The overall audio, such as the sound of the audience, may be played back by the receiver or by the separate linked device, depending on the user's preference.
  • the selectable audio settings are not limited to this example, and may be selected as appropriate. By being able to freely select audio playback using playback devices near the user in this way, it becomes possible to play back audio that better suits the user's preferences.
  • FIG. 28A shows an example of parameters regarding the number of signals to be transmitted and the signal acquisition destination.
  • Each of the channel-based signals, object-based signals, and HOA-method signals has parameters describing the number of signals and, in the case of Internet acquisition, the acquisition destination address (URL).
  • FIG. 29A shows an example in which audio signals to be transmitted are displayed for each program in the electronic program guide.
  • FIG. 29A shows the number of signals for each type of audio signal.
  • Signals in parentheses represent signals obtained via the Internet. If the information cannot be obtained from the Internet, a display indicating this fact is shown.
  • FIG. 29B shows an example in which a strikethrough line is displayed superimposed on an audio signal transmitted via the Internet when information cannot be obtained from the Internet. Alternatively, if information cannot be acquired, audio signals transmitted via the Internet may not be displayed.
  • the electronic program guide may include explanatory information regarding the sound sources that the user can select for reproduction.
  • the user can understand the degree of richness of the program audio, and this can be used as auxiliary information for program selection. Furthermore, it may be possible to reserve settings for audio playback to be used when viewing a program before the program starts. Furthermore, for frequently used settings, the setting contents may be recorded in the receiver and the recording may be called up to easily perform audio playback settings when viewing a program or making a reservation. This improves usability for the user.
  • the selection of the speaker system, the type of signal source being reproduced, and the number of signals may be displayed on the screen (FIG. 30).
  • the display may be performed all the time, or may be displayed for a predetermined time only when the power is turned on, when changing the channel, or when changing the state. Further, this display may be prohibited by the user. In this way, by displaying the current state, the user can appropriately grasp the state.
  • Some special headphones perform processing on audio signals taking into account the sound transmission characteristics of the user's head. This processing may be performed by an external device, or may be performed by a receiver by adding a processing function to the mixer and distributor 10105 shown in FIG. 15B. At this time, the programs, parameters, etc. necessary for the processing may be obtained from the headphone manufacturer's server. This makes it possible to achieve more precise audio playback.
  • Control information indicating copy control of the content, such as the first control information (digital_recording_control_data) and the second control information (copy_restriction_mode), is stored in control information such as the program sequence information of MPEG-2 TS (for example, the PMT) or the MPT of MMT, and is transmitted from the broadcast station side to the broadcast receiving apparatus 100.
  • The first control information is, for example, 2-bit data, and may be configured so that 00 indicates "copy is possible without any restrictions", 10 indicates "copy is possible for only one generation", and 11 indicates "copy prohibited".
  • When the first control information (digital_recording_control_data) is 00, indicating "copy is possible without any restrictions", it may be combined with another 1-bit control information to identify two states: "copy possible without restrictions, with encryption processing required during storage and output" and "copy possible without restrictions, with no encryption processing required during storage and output".
  • The second control information (copy_restriction_mode) is, for example, 1-bit data, and may be configured so that 0 indicates "only one generation can be copied" and 1 indicates "copy with a limited number is allowed".
  • "Copy with a limited number allowed" is a copy control state that permits copying a predetermined number of times; for example, allowing nine copies plus one move corresponds to the so-called "dubbing 10".
  • The second control information functions only when the first control information (digital_recording_control_data) is 10, indicating "copy is possible for only one generation". In other words, when the first control information (digital_recording_control_data) is 10, the copy control state of the content is identified as either "copy with a limited number allowed" or "only one generation can be copied", according to the value of the second control information.
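The interaction of the two control fields described above can be summarized in a small decoder sketch; the function name and return strings are ours, and the handling of the value 01, which is not described here, is left as undefined:

```python
def copy_control_state(digital_recording_control_data, copy_restriction_mode):
    """Interpret the first control information (2-bit) together with
    the second control information (1-bit). The second field functions
    only when the first is 10 ('copy is possible for only one
    generation')."""
    if digital_recording_control_data == 0b00:
        return "copy possible without restriction"
    if digital_recording_control_data == 0b10:
        if copy_restriction_mode == 0:
            return "only one generation can be copied"
        return "copy with a limited number allowed (e.g. dubbing 10)"
    if digital_recording_control_data == 0b11:
        return "copy prohibited"
    return "undefined"
```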
  • The broadcast receiving apparatus 100 of the present embodiment may be configured to control, according to the first control information (digital_recording_control_data) and the second control information (copy_restriction_mode), storage of content in the storage unit 110, recording on a removable recording medium, output to an external device, copying to an external device, moving to an external device, and the like.
  • The target of the storage processing is not limited to the storage unit 110 inside the broadcast receiving device 100; it also includes recording to external recording devices that have been subjected to protection processing, such as encryption processing, so that recording and playback are possible only by the broadcast receiving device 100.
  • Examples of control according to control information such as the first control information (digital_recording_control_data) and the second control information (copy_restriction_mode) will be described below.
  • For content indicated as "copyable without restrictions" by the first control information (digital_recording_control_data) included in the PMT of MPEG-2 TS or the MPT of MMT, the broadcast receiving apparatus 100 of this embodiment may perform storage in the storage unit 110, recording on a removable recording medium, output to an external device, copying to an external device, and moving to an external device without restriction.
  • For content indicated as "only one generation can be copied" by a combination of the first control information (digital_recording_control_data) and the second control information (copy_restriction_mode) included in the PMT of MPEG-2 TS or the MPT of MMT, the broadcast receiving apparatus 100 enables encrypted storage of the content in the storage unit 110, but when outputting the stored content to an external device for viewing, the content is encrypted and output together with copy control information indicating "re-copying prohibited". Note that so-called move processing to an external device (processing of copying the content to the external device and rendering the content in the storage unit 110 of the broadcast receiving device 100 unplayable by erasure processing or the like) is possible.
  • For content indicated as "copy with a limited number allowed" by a combination of the first control information (digital_recording_control_data) and the second control information (copy_restriction_mode) included in the PMT of MPEG-2 TS or the MPT of MMT, the broadcast receiving apparatus 100 enables encrypted storage of the content in the storage unit 110, but when outputting the stored content to an external device for viewing, the content is encrypted and output together with copy control information indicating "re-copying prohibited". However, a predetermined number of copies and moves to an external device may be permitted; under the so-called "dubbing 10" rule, nine copies and one move to an external device may be performed.
  • The broadcast receiving apparatus 100 of this embodiment prohibits storage in the storage unit 110 of content indicated as "copy prohibited" by the first control information (digital_recording_control_data) included in the PMT of MPEG-2 TS or the MPT of MMT.
  • However, the broadcast receiving apparatus 100 may have a "temporary storage" mode in which data is held in the storage unit 110 only for a predetermined time or for a time specified by control information included in the broadcast signal. In this mode, even content for which the first control information (digital_recording_control_data) included in the PMT of MPEG-2 TS or the MPT of MMT indicates "copy prohibited" can be temporarily held in the storage unit 110.
  • output for viewing to the aforementioned external device may be performed via the video output unit 193 in FIG. 2A, the digital I/F unit 125, the LAN communication unit 121, or the like.
  • the copying or moving process to the external device described above may be performed via the digital I/F section 125, the LAN communication section 121, etc. in FIG. 2A.
  • The broadcast receiving apparatus 100 of the present embodiment is equipped with a function for performing more suitable copyright protection processing (content protection processing) for audio content that includes advanced audio signals.
  • FIG. 28B shows the data structure of the audio component descriptor (audio_component_descriptor) included in the PMT of MPEG-2 TS and MPT of MMT in the digital broadcasting system of this embodiment.
  • the audio component descriptor in FIG. 28B stores data describing various information regarding the audio component of the content transmitted by the digital broadcasting system of this embodiment.
  • information on the type of audio component included in the target content can be stored in the component_type (component type) data shown in FIG. 28B.
  • FIG. 28C shows a list of audio signal types that can be indicated by some bits in the audio component type data. For example, definitions similar to those used in conventional digital broadcasting are used for 00000 to 10001. These definitions include definitions of relatively popular audio signals such as single mono, stereo, 5.1ch, 7.1ch, and 22.2ch. Note that LFE (Low Frequency Effect) shown in FIG. 28C indicates a bass enhancement channel.
  • definitions of 10010 to 11010 are newly added as audio signal types that can be indicated by some bits in the audio component type data.
  • 7.1.4ch, which is 7.1ch plus four channels arranged above, is defined as data 10010; 7.1.4ch+4obj, which adds four object signals to 7.1.4ch, as data 10011; 7.1.4ch+6obj, which adds six object signals to 7.1.4ch, as data 10100; and 22.2ch+4obj, which adds four object signals to 22.2ch, as data 10101.
  • Further, HOA1, which is an HOA-method signal with one signal, is defined as data 10110; HOA4 (four signals) as data 10111; HOA9 (nine signals) as data 11000; HOA16 (16 signals) as data 11001; and HOA25 (25 signals) as data 11010.
  • the digital broadcasting system of this embodiment is capable of transmitting and receiving audio content that includes advanced audio signals.
  • audio signals ranging from a single monaural signal with audio component type data of 00001 to a 7.1.4ch signal with audio component type data of 10010 are channel-based signals.
  • the audio signal from the 7.1.4ch+4obj signal with audio component type data 10011 to the 22.2ch+4obj signal with audio component type data 10101 is the audio signal of the object-based signal.
  • the object-based signal is included in the advanced audio signal in this embodiment.
  • the HOA1 signal with audio component type data of 10110 to the HOA25 signal with audio component type data of 11010 are audio signals of the HOA system.
  • the HOA audio signal is included in the advanced audio signal in this embodiment.
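Based on the value ranges just described, a receiver-side classification of the audio component type data could be sketched as follows; the function name and the boolean "advanced" flag are ours for illustration:

```python
def classify_audio_component(component_type):
    """Classify the audio component type data of FIG. 28C and report
    whether it is an advanced audio signal. Values 00001-10010 are
    channel-based (single mono up to 7.1.4ch), 10011-10101 are
    object-based, and 10110-11010 are HOA-method signals; object-based
    and HOA-method signals count as advanced audio signals."""
    if 0b00001 <= component_type <= 0b10010:
        return ("channel-based", False)
    if 0b10011 <= component_type <= 0b10101:
        return ("object-based", True)
    if 0b10110 <= component_type <= 0b11010:
        return ("HOA", True)
    return ("undefined", False)
```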
  • The audio component type data is stored in the audio component descriptor included in the PMT of MPEG-2 TS or the MPT of MMT and transmitted to the broadcast receiving device 100, so that the broadcast receiving device 100 can identify whether or not the audio signal of the transmitted content is an advanced audio signal.
  • Next, a control example of copyright protection processing (content protection processing) for audio content in the broadcast receiving apparatus 100 will be described using FIGS. 31A to 32B.
  • The first control information (digital_recording_control_data) and the second control information (copy_restriction_mode) shown in FIGS. 31A to 32B are control information indicating copy control of the content; their definitions are as already explained.
  • The decoded audio output (analog) shown in FIGS. 31A to 32B is output processing of an audio signal in a decoded analog signal state. This is a process in which, for example, the audio signal decoded by the core decoder in FIG. 15B is converted into an analog signal by a D/A converter (digital-to-analog converter), not shown, and is output via the analog audio/video signal interface provided in the expansion interface section 124 in FIG. 2A.
  • the decoded audio output (digital) shown in FIGS. 31A to 32B is output processing of an audio signal in a decoded digital signal state. This is a process in which, for example, the audio signal decoded by the core decoder in FIG. 15B is outputted as a digital signal to an external device or the like via the audio output unit 196 in FIG. 2A. Note that the decoded audio signal may be output as a digital signal to an external device or the like via the digital I/F section 125, the LAN communication section 121, or the like.
  • stream output (IP interface) shown in FIGS. 31A to 32B is output processing of an audio signal in a stream format digital signal state. This is, for example, a process in which the core decoder in FIG. 15B does not perform decoding and outputs a digital signal in a stream format to an external device via the bitstream output controller 10106.
  • the broadcast receiving device 100 may output to the external device as an IP interface output via the LAN communication unit 121 in FIG. 2A, for example.
  • FIG. 31A is a table showing a control example of copyright protection processing when the broadcast receiving apparatus 100 outputs the audio component of the content as it is without accumulating it.
  • the definitions of the first control information (digital_recording_control_data) and the second control information (copy_restriction_mode) regarding the content have already been described.
  • The third control information (audio component_type) is the audio component type information explained with reference to FIG. 28B. Based on the third control information, the broadcast receiving apparatus 100 identifies whether the audio signal type of the content is a channel-based signal, an object-based signal, or an HOA-method signal. Furthermore, when the audio component type information indicates an undefined value, the broadcast receiving apparatus 100 also identifies that fact.
  • When the third control information indicates an advanced audio signal, that is, when it indicates an object-based signal or an HOA-method signal, the output is controlled so that the signal is output in a copy-prohibited state, regardless of the combination of the first control information value and the second control information value.
  • This is because the content of an advanced audio signal is considered more valuable than the content of a channel-based audio signal. Therefore, in the control example of FIG. 31A, copy control of the decoded audio output in the digital signal state is more restrictive for content of object-based or HOA-based audio signals than for content of channel-based signals. This makes it possible to implement copy control that corresponds to the value of the content according to the type of audio signal.
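  • The decoded digital output control just described can be condensed into a small decision function. The following sketch is illustrative only: the state names are hypothetical stand-ins, not the actual bit values of digital_recording_control_data or copy_restriction_mode, and the handling of the one-generation case is an assumption drawn from the surrounding description.

```python
# Illustrative state names (not the actual signaling bit values).
ADVANCED_TYPES = {"object_based", "hoa"}

def decoded_digital_output_state(first: str, second: str, third: str) -> str:
    """Copy state for decoded audio output in the digital signal state,
    following the FIG. 31A control example."""
    if third in ADVANCED_TYPES:
        # Advanced audio signals are always output copy-prohibited,
        # regardless of the first/second control information.
        return "copy_prohibited"
    # Channel-based or "undefined" audio: follow the first control
    # information (the second value is not needed in this simplified sketch).
    if first == "copy_prohibited":
        return "copy_prohibited"
    if first == "one_generation":
        return "one_generation"   # assumed handling, for illustration
    return "copy_free"
```

For example, HOA content comes out copy-prohibited even when the first control information permits unrestricted copying, while channel-based content does not.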
  • For content where the third control information indicates a channel-based audio signal, and for content where the third control information indicates "undefined", the decoded audio output in the digital signal state is controlled as shown in FIG. 31A: when the first control information and the second control information indicate "copyable without restriction", the output is controlled so that copying is possible without restriction, and when the first control information indicates "copy prohibited", the output is controlled so that copying is prohibited.
  • The control of decoded audio output in the digital signal state shown in FIG. 31A is performed when the output unit for the decoded audio output supports a copy control method such as SCMS (Serial Copy Management System). If the output unit does not support copy control, output without copy restriction is permitted regardless of the combination of the first, second, and third control information values.
  • In the control example for audio signal output in a stream-format digital signal state via the IP interface, the copy control state settings are the same as in the control example for the decoded audio output in the digital signal state shown in FIG. 31A, so repeated explanation is omitted.
  • That is, even when the combination of the first control information value and the second control information value indicates "copyable without restriction", copy control is more restrictive for content of object-based or HOA-based audio signals than for content of channel-based signals. This makes it possible to implement copy control that corresponds to the value of the content according to the type of audio signal.
  • In the audio signal output in the stream-format digital signal state via the IP interface shown in FIG. 31A, DTCP or DTCP2 is used as the copy control method. For content where the values of the first and second control information indicate copy restrictions such as "only one generation can be copied", "copying with a limited number of copies", or "copy prohibited", control is performed to prohibit output to external devices that do not support DTCP or DTCP2. In this respect, the control differs from the control example for decoded audio output in the digital signal state.
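  • The DTCP/DTCP2 gating described above amounts to a simple admission check before streaming to an external device. The following sketch is hypothetical; in practice the sink's DTCP/DTCP2 capability would be established through the protocol's own authentication and key exchange, not a boolean flag.

```python
# Copy states (illustrative names) that require a protected link.
RESTRICTED_STATES = {"one_generation", "limited_copies", "copy_prohibited"}

def may_output_ip_stream(copy_state: str, sink_supports_dtcp: bool) -> bool:
    """Permit stream-format IP output only when the content is unrestricted
    or the external device supports DTCP or DTCP2 (FIG. 31A example)."""
    return copy_state not in RESTRICTED_STATES or sink_supports_dtcp
```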
  • the control example in FIG. 31B is a control example of copyright protection processing when the broadcast receiving apparatus 100 outputs the audio component of the content as it is without storing it, and is a modification of the control example in FIG. 31A.
  • the explanation of the points similar to the control example of FIG. 31A will be omitted, and only the points that are different from the control example of FIG. 31A will be explained.
  • The control of decoded audio output in the analog signal state and the control of audio signal output in the stream-format digital signal state via the IP interface are the same as in the control example of FIG. 31A, but the control of decoded audio output in the digital signal state differs from the control example of FIG. 31A.
  • Specifically, when the combination of the first control information value and the second control information value indicates "copyable without restriction", the decoded audio output in the digital signal state is output in a state where copying is possible without restriction, regardless of the value of the third control information. Likewise, when the combination indicates "only one generation can be copied" or "copying with a limited number of copies", the decoded audio output in the digital signal state is output in a state where only one generation can be copied, regardless of the value of the third control information. When the combination indicates "copy prohibited", the decoded audio output in the digital signal state is output with copying prohibited. That is, in the control of decoded audio output in the digital signal state in the control example of FIG. 31B, the copy restriction at output does not differ depending on the type of audio signal content. This is because the decoded audio output in the digital signal state, even for content of an advanced audio signal such as an object-based or HOA-based audio signal, may, for example, already have been processed by the mixer and distributor 10105 in FIG. 15B.
  • Even in this case, it is preferable that copy control for content of object-based or HOA-based audio signals be made more restrictive than for content of channel-based audio signals in the audio signal output via the IP interface. Accordingly, the control example shown in FIG. 31B of this embodiment described above also makes it possible to realize suitable copy control that corresponds to the value of the content according to the type of audio signal.
  • FIG. 32A is a table showing a control example of copyright protection processing when audio components of content are stored in the broadcast receiving apparatus 100 and then output.
  • the storage process is performed by storing content including audio components in the storage unit 110 of the broadcast receiving device 100.
  • The copy control state of the content in the stored state is controlled differently depending on the combination of the first control information value and the second control information value. Specifically, when the first control information indicates "copyable without restriction", the content is stored in a state where it can be copied without restriction, regardless of the value of the second control information. At this time, the content may be stored in that state regardless of the value of the third control information. When the first control information indicates "only one generation can be copied" and the second control information indicates "only one generation can be copied", the content is stored in a state where re-copying is prohibited; this too may be done regardless of the value of the third control information. When the first control information indicates "only one generation can be copied" and the second control information indicates "copying with a limited number of copies", the content is stored in a state where copying with a limited number of copies is permitted, again regardless of the value of the third control information. When the first control information indicates "copy prohibited", the content may be stored in a temporary storage state regardless of the values of the second and third control information.
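  • The accumulation rules above reduce to a lookup on the first and second control information; the third control information does not affect the stored state. The state names in this sketch are hypothetical stand-ins for the values in the FIG. 32A example.

```python
def storage_copy_state(first: str, second: str) -> str:
    """Copy control state applied when storing content (FIG. 32A example)."""
    if first == "copy_free":
        return "copy_free"              # copyable without restriction
    if first == "one_generation":
        if second == "limited_copies":
            return "limited_copies"     # copying with a limited number of copies
        return "no_more_copies"         # re-copying prohibited
    if first == "copy_prohibited":
        return "temporary_storage"      # temporary storage only
    raise ValueError(f"unknown first control value: {first}")
```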
  • Regardless of how the copy control state of the content in the stored state is controlled by the combination of the first and second control information values, the decoded audio output in the analog signal state of the stored content is controlled so that it is output in a state where copying is possible without restriction. This is because analog signals are more susceptible to degradation through copy processing than digital signals, and copying therefore need not be restricted as severely as for digital signals.
  • In the decoded audio output in the digital signal state for stored content, when the third control information indicates an advanced audio signal, that is, an object-based signal or the HOA method, control is performed so that the output is in a copy-prohibited state, regardless of the copy control state in the stored state.
  • This is because the content of an advanced audio signal is considered more valuable than the content of a channel-based audio signal. Therefore, in the control example of FIG. 32A, copy control in the decoded audio output in the digital signal state for stored content is more restrictive for content of object-based or HOA-based audio signals than for content of channel-based signals. This makes it possible to implement copy control that corresponds to the value of the content according to the type of audio signal.
  • As shown in FIG. 32A, for content where the third control information indicates a channel-based audio signal and content where it indicates "undefined", when the copy control state in the stored state is a state where copying is possible without restriction, control is performed to output the decoded audio in a state where copying is possible without restriction.
  • Similarly, in the decoded audio output in the digital signal state for stored content, for content where the third control information indicates a channel-based audio signal and content where it indicates "undefined", when the copy control state in the stored state is a state where re-copying is prohibited or a state where copying with a limited number of copies is permitted, control is performed so that the output is in a state where only one generation can be copied.
  • As shown in FIG. 32A, for content where the third control information indicates a channel-based audio signal and content where it indicates "undefined", when the copy control state in the stored state is the temporary storage state, control is performed to output in a copy-prohibited state.
  • The above control of decoded audio output in the digital signal state for stored content is performed when the output unit for the decoded audio output supports a copy control method such as SCMS (Serial Copy Management System). If the output unit does not support copy control, output without copy restriction is permitted regardless of the copy control state in the stored state and the value of the third control information.
  • In the audio signal output in the stream-format digital signal state via the IP interface for stored content, when the copy control state in the stored state is a state with copy restrictions, such as re-copying prohibited, copying with a limited number of copies, or temporary storage, the output is performed with re-copying prohibited, regardless of the value of the third control information.
  • When the copy control state in the stored state is a state where copying is possible without restriction, the copy control state at the time of output is switched according to the third control information: content where the third control information indicates a channel-based audio signal or "undefined" is output in a state where copying is possible without restriction, whereas content where the third control information indicates an advanced audio signal, that is, an object-based or HOA-based audio signal, is controlled to be output in a state where re-copying is prohibited.
  • In this output via the IP interface, DTCP or DTCP2 is used as the copy control method. For content whose copy control state in the stored state has copy restrictions, such as re-copying prohibited, copying with a limited number of copies, or temporary storage, control is performed to prohibit output to external devices that do not support DTCP or DTCP2.
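  • The stream-format IP output control for stored content described above can be sketched as follows (hypothetical state names, mirroring the FIG. 32A example):

```python
ADVANCED_TYPES = {"object_based", "hoa"}
RESTRICTED_STORED_STATES = {"no_more_copies", "limited_copies", "temporary_storage"}

def stored_content_stream_state(stored_state: str, third: str) -> str:
    """Copy state for stream-format IP output of stored content."""
    if stored_state in RESTRICTED_STORED_STATES:
        # Any restricted stored state is output with re-copying prohibited.
        return "no_more_copies"
    # Stored as copyable without restriction: switch on the audio type.
    if third in ADVANCED_TYPES:
        return "no_more_copies"
    return "copy_free"
```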
  • the control example in FIG. 32B is a control example of copyright protection processing when the audio component of content is stored in the broadcast receiving apparatus 100 and then output, and is a modification of the control example in FIG. 32A.
  • the explanation of the points similar to the control example of FIG. 32A will be omitted, and only the points that are different from the control example of FIG. 32A will be explained.
  • The control of decoded audio output in the analog signal state for stored content and the control of audio signal output in the stream-format digital signal state via the IP interface for stored content are the same as in the control example of FIG. 32A. However, the control of decoded audio output in the digital signal state for stored content in the control example of FIG. 32B differs from the control example of FIG. 32A.
  • Specifically, when the copy control state in the stored state is a state where copying is possible without restriction, the decoded audio output in the digital signal state of the stored content is output in a state where copying is possible without restriction, regardless of the value of the third control information. When the copy control state in the stored state is a state where re-copying is prohibited or a state where copying with a limited number of copies is permitted, the decoded audio output in the digital signal state of the stored content is output in a state where only one generation can be copied, regardless of the value of the third control information. When the copy control state in the stored state is the temporary storage state, the decoded audio output in the digital signal state of the stored content is performed in a copy-prohibited state, regardless of the value of the third control information. That is, in the control of decoded audio output in the digital signal state in the control example of FIG. 32B, the copy restriction at output does not differ depending on the type of audio signal content.
  • The reason for this control is the same as the reason, already explained, for the control of decoded audio output in the digital signal state in the control example of FIG. 31B, so repeated explanation is omitted.
  • the following control may be performed when outputting via the IP interface.
  • When the first control information (digital_recording_control_data) and the second control information (copy_restriction_mode) included in the PMT of the MPEG-2 TS or the MPT of the MMT indicate copy restrictions such as "only one generation can be copied", "copying with a limited number of copies", "copy prohibited", or "copying is possible without restriction but encryption processing is required when storing and outputting", the video and audio output for viewing to an external device via the LAN communication unit 121 may be permitted only when the IP address of the external device that is the destination of the transmission packet from the broadcast receiving device 100 is within the same subnet as the IP address of the broadcast receiving device 100, and prohibited when the IP address of the external device is outside that subnet. However, if the external device has been connected within the same subnet as the IP address of the broadcast receiving device 100 within a predetermined period and has been registered as a device permitted to view content even outside that subnet, the content stored in the storage unit 110 of the broadcast receiving device 100 may be output to that external device as video and audio for viewing. In this case, the video and audio output for viewing is performed after encrypting the content.
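  • The same-subnet check described above can be implemented, for example, with the standard ipaddress module. This is only a sketch: the /24 prefix length is an illustrative assumption, and in practice the prefix would come from the receiver's actual network configuration.

```python
import ipaddress

def in_same_subnet(receiver_ip: str, sink_ip: str, prefix_len: int = 24) -> bool:
    """True if the external device's IP address lies in the same subnet as
    the broadcast receiving device, for the given prefix length."""
    network = ipaddress.ip_network(f"{receiver_ip}/{prefix_len}", strict=False)
    return ipaddress.ip_address(sink_ip) in network

def may_output_for_viewing(receiver_ip: str, sink_ip: str, registered_remote: bool) -> bool:
    """Permit viewing output inside the subnet, or outside it only for a
    device previously registered while connected in the same subnet."""
    return in_same_subnet(receiver_ip, sink_ip) or registered_remote
```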
  • FIG. 33 is a diagram illustrating an example of a reverberation sound processing flow when headphones are used.
  • In this example, the broadcast receiving apparatus 100 obtains the reverberant sound signal of each sound based on the sound field characteristics, generates reverberation-added audio signals in which the reverberant sound is added to the original sound, mixes the reverberation-added audio signals, and transmits the result to headphones for sound reproduction.
  • the sound field characteristics are the acoustic characteristics of the sound field where audio is recorded, that is, the spatial transfer function of sound including reverberant sound.
  • The sound field characteristics are not limited to a method that directly indicates the spatial transfer function of sound; they may also be reference data for estimating the transfer function, such as the shape of structures such as walls at the recording location and the sound reflection and absorption characteristics of each structure.
  • Information representing the sound field characteristics and the position of each sound source in the sound field is described in broadcast metadata included in the broadcast wave, for example, in the metadata of each object-based audio signal included in the broadcast wave. The broadcast receiving device 100 acquires the sound field characteristics from the broadcast metadata of the received broadcast wave, reflects them during audio rendering to generate reverberation-added audio signals, and thereby enables sound reproduction with a high sense of presence.
  • Audio received and reproduced via broadcast waves ordinarily does not readily reflect the reverberation characteristics of concert halls, theaters, and the like, and may therefore lack a sense of realism. This control example is particularly effective in solving such problems.
  • Alternatively, the broadcast receiving device 100 may use information that can specify the name of the recording location, such as a hall or theater, obtained from the broadcast program name and its additional information, to acquire the sound field characteristics via a network such as the Internet. For example, information associating hall or theater names with their sound field characteristics is stored in advance in the information server 900 shown in FIG. 1, and the broadcast receiving device 100 acquires the sound field characteristics associated with the specified hall or theater name from the information server 900 via the network through the communication unit. Furthermore, the broadcast receiving device 100 may receive the reverberant sound as an object-based sound source, adjust its intensity, frequency characteristics, and so on according to the viewer's preferences and viewing environment, and synthesize it with the other sounds.
  • headphones are an example of an audio output device having multiple speakers.
  • the audio output device may be a head mounted display, a speaker system, or the like, as described later.
  • An audio recording location is a place where audio is converted into an electrical signal using a microphone, pickup, or the like and recorded, such as a concert hall or theater where a performance or play is held.
  • 20010 corresponds to the object renderer 10103.
  • The processing when using headphones described above in (Example 2) [Advanced Audio Signal] corresponds to spatial position correction 20012 and 20022. In addition, reverberation addition 20013 and 20023 and head-related transfer correction 20014 and 20024 are performed.
  • The background sound 20021 may already contain reverberation. Therefore, in this control example, compared with the reverberation added to the individual sound 20011, the amount of reverberation added to the background sound 20021 may be reduced, or reverberation may not be added to it at all.
  • Reverberation additions 20013 and 20023 are performed by calculation based on the sound field characteristics extracted from broadcast metadata, such as the metadata of the object audio (individual audio for each sound source) signals, or by calculation using a spatial transfer function simulated from the structure of the recording location. More specifically, the reverberation additions 20013 and 20023 are performed, for example, by convolution of impulse responses with each object audio and background sound, filtering with frequency attenuation characteristics for each delay time, and the like. Note that if the sound field characteristics are not described in the metadata of the object audio signal, the theater name may be extracted from the additional information of the program and the sound field characteristics of that theater obtained from the information server 900.
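  • As a rough illustration of the convolution-based reverberation addition mentioned above, the following sketch (hypothetical helper names, not taken from the embodiment) convolves a dry object-audio signal with a room impulse response and mixes the reverberant copy back with the direct sound; the wet_gain parameter plays the role of the effect-strength setting described later.

```python
def convolve(signal, impulse_response):
    """Discrete convolution: the reverberant copy of `signal` as heard
    through a room whose acoustics are described by `impulse_response`."""
    out = [0.0] * (len(signal) + len(impulse_response) - 1)
    for i, s in enumerate(signal):
        for j, h in enumerate(impulse_response):
            out[i + j] += s * h
    return out

def add_reverb(dry, impulse_response, wet_gain=0.5):
    """Mix the dry (direct) sound with a convolution-reverberated copy."""
    wet = [wet_gain * v for v in convolve(dry, impulse_response)]
    for i, s in enumerate(dry):
        wet[i] += s          # add the original sound on top of the reverb
    return wet
```

For example, a unit impulse through a one-echo impulse response yields the direct sound followed by its attenuated echo.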
  • The broadcast receiving device 100 acquires the sound source position information in the theater from the metadata of the object audio signals, and when a viewing position, that is, the virtual viewer position in the theater, is specified, the spatial transfer function can be determined based on this information. Based on the obtained spatial transfer function, the broadcast receiving device 100 generates the reverberant sound signals so that the reverberation approaches the sound that would be heard when listening to each sound source at the viewing position. By generating such reverberation-added audio signals, audio can be reproduced with an even higher sense of presence. Note that if the sound field characteristics cannot be obtained directly, the spatial transfer function may be determined by simulation from information on the internal structure and materials of the theater.
  • the theater's internal structure/material information includes, for example, elements such as room size, shape, ceiling height, and interior material.
  • The head-related transfer function expresses the relationship between sound arriving from each direction and the sound reaching the entrance of the external auditory canal of each ear. Head-related transfer characteristics vary from person to person depending on the shape of the pinna and other factors. If no personally registered head-related transfer function exists, the audio is converted by head-related transfer correction 20014 based on a standard head-related transfer function.
  • the sound emitted from the headphones reaches the eardrum through the ear canal, and the listener recognizes the sound.
  • The ear canal transfer function from the headphones to the eardrum may vary not only among individual listeners but also depending on whether the headphones are of a closed type covering the pinna or of an earphone type inserted into the ear canal.
  • Therefore, the head-related transfer correction 20014 may perform not only correction processing based on the head-related transfer function but also correction processing based on the ear canal transfer function. In this case, it is desirable, for example, to specify the applicable transfer functions by acquiring the type information of the headphones and the identification information of the listener, and to perform the correction processing based on the specified functions.
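  • The head-related transfer correction can be sketched as binaural filtering: the sound of each source is convolved with the head-related impulse responses (HRIRs) of the two ears for the source's direction. The helper names and toy HRIR values below are illustrative, not from the embodiment.

```python
def convolve(signal, ir):
    """Discrete convolution of a mono signal with an impulse response."""
    out = [0.0] * (len(signal) + len(ir) - 1)
    for i, s in enumerate(signal):
        for j, h in enumerate(ir):
            out[i + j] += s * h
    return out

def apply_hrtf(mono, hrir_left, hrir_right):
    """Binaural rendering: filter one sound source with the HRIR of each
    ear and return (left, right) sample pairs for headphone playback."""
    left = convolve(mono, hrir_left)
    right = convolve(mono, hrir_right)
    return list(zip(left, right))

# Toy HRIRs for a source on the listener's right: the right ear hears it
# immediately at full level, the left ear one sample later and attenuated.
binaural = apply_hrtf([1.0, 1.0], hrir_left=[0.0, 0.5], hrir_right=[1.0, 0.0])
```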
  • FIG. 33 illustrates the case of one individual sound and one background sound; if there are multiple individual sounds or background sounds, similar processing is performed on each and the processed sounds are then mixed. Furthermore, for individual voices such as narration, the strength or the presence or absence of the reverberation addition processing may be changed depending on the type of individual voice, for example reducing or eliminating reverberation to make the narration easier to hear. In addition, although an example was described in which spatial position correction 20012, reverberation addition 20013, and head-related transfer correction 20014 are processed sequentially, a composite transfer function combining them may be calculated and the individual sound processed with that composite transfer function.
  • The head-related transfer correction may instead be processed all at once after the mixing 20015, or may be processed by the headphones 910. When the processing is performed by the headphones 910, the broadcast receiving apparatus 100 does not need to know the type and orientation of the headphones 910, so when there are multiple viewers it becomes easy to handle each viewer individually.
  • The reverberant sound may also be approximated and transmitted as one object-audio sound source. Furthermore, the intensity of the reverberant sound may be adjusted separately from the background sound according to the viewer's preference.
  • FIG. 34A is a diagram showing an example of the audio output setting menu.
  • each tag indicates a setting target. By selecting a tag, the viewer can open the setting menu for the setting target associated with that tag.
  • The tag "Built-in SP" is associated with the settings of the TV's built-in speaker, the tag "Optical IF" with the settings of the optical digital interface, the tag "ARC" with the settings of the HDMI Audio Return Channel, the tag "HP1" with the settings of Head Phone 1 (assuming a Bluetooth (registered trademark) connection), and the tag "HP2" with the settings of Head Phone 2 (assuming a 3.5 mm mini jack).
  • FIG. 34A shows a state in which the tag "HP1" is selected and the settings menu 20032 for Head Phone 1 is opened.
  • "model number hhh” represents the model number of the headphones.
  • the broadcast receiving apparatus 100 may obtain standard head-related transfer function information used for the head-related transfer correction 20014 in FIG. 33 from the information server 900 based on the model number of this headphone.
  • "Channel base On” indicates that the channel base audio signal is turned on.
  • Next to each item, a mark for setting On/Off is displayed; when the cursor is moved over this mark, the options "On" and "Off" appear, allowing the viewer to set On/Off.
  • “>Details” is a button for opening a detailed settings menu, and when this button is pressed, a detailed settings menu for the corresponding setting target is opened.
  • Object-based On indicates that the object-based audio signal is turned on.
  • Below this, a summary display shows how many object sounds there are in total and how many of them are turned on. In the example of FIG. 34A, "3 out of 5 objects selected" is displayed, indicating that there are five object sounds in total and three of them are turned on. If "Object Base Off" is displayed, all object sounds are turned off. An example of the detailed setting menu for object-based signals is the detailed menu shown in FIG. 27, in which On/Off can be set for each individual voice.
  • "Theatre AAA" is the name of the theater whose reverberation sound is added.
  • FIG. 34B is a diagram showing an example of a detailed menu for reverberation sound settings.
  • Normally, the viewing position for viewing the content, that is, the position for viewing the video, is the same as the listening point, which is the position for listening to the sound; however, they may differ, for example when the display size changes. In the following description, the term listening point is used instead of viewing position.
  • In FIG. 34B, a reverberation detail setting field 20033 is displayed on the left side of the screen, and a seating layout diagram of the theater where the performance is held, viewed from above, is displayed on the right side of the screen.
  • the reverberation sound detailed setting field 20033 is provided with a reverberation sound setting area for setting reverberation sound and a listening point setting area for setting listening point.
  • In the reverberation setting area, the name of the selected theater and the effect strength of the reverberant sound are displayed, and the desired values can be selected from the options using the selection marks displayed beside them. If the theater name can be extracted from program information, such as program metadata, the theater name is automatically selected based on the program information.
  • FIG. 34B shows an example in which theater AAA is set. It is also possible to select an alternative theater other than theater AAA.
  • An alternative theater is a theater other than theater AAA, where the performance is actually held, such as theater BBB. When an alternative theater is selected, the acoustics of that alternative theater are simulated and reproduced. Since the sound field characteristics of an alternative theater such as theater BBB cannot be obtained from the broadcast program, a theater name obtainable from the information server 900 can be searched for and set.
  • the effect strength of the reverberant sound is initially set to the default or stored previous setting.
  • the effect strength of the reverberation sound can be selected from, for example, 10 levels, and the default value recommended by the program or audio metadata is 5.
  • An effect strength value of 0 means no reverberation.
  • FIG. 34B shows an example in which 5 is set as the value of the effect strength. Note that if the metadata of the reverberation sound information cannot be obtained, for example, "-" is displayed.
  • the settings of the selected listening point are displayed in the listening point setting area.
  • The setting of the listening point can be selected from, for example, three types: "Standard (A)", "Synchronize with video", and "(B) setting". "Standard (A)" is a setting in which the default indicated by program information or audio metadata is selected. "Synchronize with video" is a setting in which the sound field is also synchronized when a portion of the video is enlarged as shown in FIG. 34D, described later; this setting is also referred to as the "image and sound image linkage" mode. "(B) setting" is a setting in which the user determines the listening point by moving the cursor (B) on the seating layout diagram on the right.
  • FIG. 34B shows an example in which the default listening point 20034 (represented by mark A) is displayed at the center of the seating arrangement map. Further, FIG. 34B shows an example in which "(B) setting" is selected and the listening point 20035 (represented by mark B) in this case is displayed at the front of the seating arrangement map.
  • The listening point can be moved, for example, by operating the cursor of the listening point using the direction marks displayed near the cursor.
  • FIG. 34C is a diagram illustrating an example of a banner display indicating the reverberation processing state.
  • FIG. 34C shows an example in which the entire stage of a program called Maxell Performance is displayed as a broadcast image, and shows a banner display 20041 when reverberation sound assuming an AAA theater is added.
  • FIG. 34D is a diagram illustrating an example of a banner display when a portion of a broadcast image is enlarged and displayed by a predetermined remote control operation or the like.
  • FIG. 34D shows an example in which the broadcast image of FIG. 34C is partially enlarged so that the dancers included in the broadcast image are displayed larger. The enlarged portion may be moved using the up, down, left, and right buttons on the remote control. If "Synchronize with video" is not selected as the listening point setting, that is, if the "image and sound image linkage" mode is Off, the spatial position correction of the sound is not linked even when the broadcast image is partially enlarged, and the sound field remains unchanged.
  • The "image and sound image linkage" mode may be set independently for the case of enlarging a portion of the image and the case of displaying the entire image.
  • For example, the "image and sound image linkage" mode may be set to On, and switched to Off when the magnification exceeds a predetermined value (for example, 2x).
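  • The magnification-threshold behavior described above can be sketched as follows; the 2x threshold, the function name, and the default-On convention are illustrative assumptions, not requirements of the embodiment:

```python
def linkage_mode_enabled(magnification, threshold=2.0, default_on=True):
    """Decide whether the "image and sound image linkage" mode is active.

    Assumption: the mode defaults to On and is switched Off once the
    partial-enlargement magnification exceeds the threshold (e.g. 2x).
    """
    if not default_on:
        return False
    # Above the threshold the sound field stays fixed, giving the
    # "viewing through binoculars" behavior described in the text.
    return magnification <= threshold
```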
  • FIG. 34E is a diagram for explaining viewing conditions before and after partially enlarged display of a broadcast image.
  • FIG. 34E shows an example of a view from above of a performance venue for a program to be broadcast.
  • FIG. 34E shows, as an example, an area 20051 displayed on the screen before partial enlargement and an area 20061 displayed on the screen after partial enlargement. Note that the performer 20052 is shown in front view to facilitate understanding of the explanation.
  • The screen area 20061 after partial enlargement corresponds to a smaller area in real space than the screen area 20051 before partial enlargement. This can be regarded as the viewing point moving closer, from point 20053 (represented by mark C) to point 20063 (represented by mark D). That is, the distance from the viewer to the position corresponding to the display area can be considered to shorten from distance 20054 before partial enlargement to distance 20064 after partial enlargement.
  • Accordingly, the viewing angle 20065 after partial enlargement of the broadcast image is larger than the viewing angle 20055 before partial enlargement.
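  • Under a simple pinhole-geometry assumption, m-times partial enlargement can be modeled as the virtual viewing point moving from distance d to d/m, which widens the viewing angle subtended by an on-stage object. The model and names below are illustrative, not the embodiment's exact formulation:

```python
import math

def virtual_viewing(distance, magnification, object_width):
    """Return (virtual distance, viewing angle in degrees) after m-times
    partial enlargement, modeled as the viewpoint approaching: d -> d/m."""
    d_virtual = distance / magnification
    angle = 2.0 * math.degrees(math.atan(object_width / (2.0 * d_virtual)))
    return d_virtual, angle
```

For example, at 10 m from a 2 m-wide performer, 2x enlargement corresponds to viewing from 5 m, roughly doubling the subtended angle.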
  • A state in which processing changes the direction from which the sound comes in accordance with changes in the viewing angle is referred to as the "image and sound image linkage" mode being On.
  • In this state, partially enlarging the broadcast image makes the performer appear larger, and the direction from which the sound comes changes in conjunction with this. The viewer thus gets a sense of approaching the stage, that is, the performers, recreating the experience of moving within a theater.
  • A state in which the processing that changes the direction of the sound in accordance with changes in the viewing angle is not in effect is referred to as the "image and sound image linkage" mode being Off.
  • In this state, the performer appears larger when the broadcast image is partially enlarged, but the direction from which the sound comes does not change. The viewer thus gets the feeling of viewing the stage through binoculars, recreating the experience of watching from a fixed location in a theater.
  • switching the "image and sound image linkage" mode between On and Off may be performed directly with a remote control button, or may be performed through menu settings.
  • Furthermore, the sound field characteristics of the viewing environment may be measured, and head-related transfer correction may be performed to suppress the reverberant sound of the viewing environment based on the measured sound field characteristics. This makes it possible to further enhance the sense of being in the theater.
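  • One textbook way to suppress the viewing room's own reverberation, given a measured impulse response, is a regularized inverse filter applied to the playback signal. This NumPy sketch is an illustrative assumption, not the embodiment's actual correction algorithm:

```python
import numpy as np

def dereverb_filter(room_ir, n_fft=1024, reg=1e-2):
    """Regularized frequency-domain inverse H*/(|H|^2 + reg) of the
    measured room impulse response."""
    H = np.fft.rfft(room_ir, n_fft)
    H_inv = np.conj(H) / (np.abs(H) ** 2 + reg)
    return np.fft.irfft(H_inv, n_fft)

def precompensate(signal, room_ir, reg=1e-2):
    """Pre-filter the playback signal so that, after passing through the
    room, its reverberant coloration is reduced."""
    inv = dereverb_filter(room_ir, reg=reg)
    return np.convolve(signal, inv)[: len(signal)]
```

The regularization term `reg` keeps the inverse stable at frequencies where the room response is weak, at the cost of incomplete equalization.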
  • <HMD viewing> Similar processing can be performed even when viewing using an HMD.
  • The HMD 920 receives video and audio data from the broadcast receiving device 100 and plays the video and sound, providing a viewing environment as if the user were watching the monitor unit 192 of the broadcast receiving device 100 while listening to the sound output from the headphones 910.
  • the standard viewing state for television broadcasting may be established, that is, the state in which the video is displayed in front of the viewer.
  • The broadcast receiving device 100 may be provided with two modes for setting the viewing state of the viewer: a mode in which attitude information of the HMD 920 is obtained from the HMD 920 and video generated based on that attitude information is transmitted to the HMD 920, and a mode in which the viewing state is fixed to the standard viewing state.
  • the mode settings may be switched by menu settings or remote control button instructions.
  • the broadcast receiving apparatus 100 may be provided with a mode in which the display of the monitor unit 192 is synchronized with the HMD display, and may be configured to switch between synchronous and asynchronous modes in accordance with menu operations.
  • The broadcast receiving device 100 may also receive position information or acceleration information of the HMD 920, detect changes in the position of the HMD 920 to understand the viewer's movements, and enlarge or reduce the displayed image according to those movements to bring out a sense of realism. For example, if a movement of the viewer approaching or stepping toward the displayed image is detected, the displayed image is enlarged, and if a movement of the viewer moving away or stepping backwards is detected, the displayed image is reduced.
  • When performing such enlargement/reduction control of the displayed image, the user may be allowed to set the enlargement/reduction ratio of the displayed image according to the distance by which the viewer approaches or moves away from the displayed image.
  • A user interface for this setting (for example, remote control operation buttons or gesture movement detection) may be provided.
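  • The distance-to-zoom mapping could look like the following sketch, where `gain` plays the role of the user-set enlargement/reduction ratio; the power-law form and clamping limits are assumptions for illustration:

```python
def zoom_from_movement(d_start, d_now, gain=1.0, z_min=0.5, z_max=4.0):
    """Map the viewer's approach/retreat (from HMD position or acceleration
    data) to a display zoom ratio: moving closer zooms in, moving away
    zooms out."""
    if d_now <= 0:
        raise ValueError("distance must be positive")
    zoom = (d_start / d_now) ** gain
    # Clamp to keep the display usable at extreme movements.
    return min(max(zoom, z_min), z_max)
```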
  • The broadcast receiving device 100 may have a mode that changes the direction and sense of distance of the sound in accordance with the enlargement/reduction of the displayed image, that is, a mode in which the image and the sound image are linked, and a user interface for turning this mode On/Off may also be provided.
  • The video and sound image conversion processing according to the position information, acceleration information, posture information, etc. of the HMD 920 may be performed on the HMD 920 side instead of on the broadcast receiving device 100 side. If the HMD 920 performs the video and audio conversion processing, then even if the broadcast receiving device 100 provides the same video and audio data to each HMD 920 of multiple viewers, each viewer can independently view images and sounds that match his or her own position and posture.
  • Example 3 [Switching sound image localization mode] This embodiment relates to handling of switching of sound image localization mode.
  • The HMD (Head Mounted Display) and earphones worn by the user are configured such that the sound image is localized to the image displayed on the screen of the broadcast receiving device 100 in response to changes in the front direction of the user's face.
  • the broadcast receiving apparatus 100 takes in information regarding the front direction of the user's face and performs audio processing so that sound is emitted from an audio output unit such as headphones.
  • A mode in which the sound image is localized to the image displayed on the screen of the broadcast receiving apparatus 100 in response to changes in the front direction of the user's face is referred to as the variable sound image localization mode.
  • the mode in which the broadcast receiving apparatus 100 executes audio output processing assuming that the front direction of the user's face is fixed is referred to as the fixed sound image localization mode.
  • FIG. 14A is a diagram showing a planar positional relationship between the broadcast receiving device 100, the HMD, an audio output unit such as headphones and earphones, and the user's head.
  • the standard viewing position in this case is the midpoint of the line segment connecting the left and right audio output sections.
  • The front direction of the user's face is the forward direction of the straight line that passes through the midpoint of, and is perpendicular to, the line segment connecting the left and right audio output units.
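  • In two dimensions, this front direction and the azimuth angle "θ" of FIG. 14B can be computed from the positions of the left/right audio output units and the screen center as below; the sign convention chosen for "forward" is an assumption:

```python
import math

def azimuth_theta(left_pos, right_pos, screen_center):
    """Angle (degrees) between the face's front direction, taken as the
    normal of the left/right output-unit segment through its midpoint,
    and the direction toward the screen center (2-D sketch of FIG. 14)."""
    mx = (left_pos[0] + right_pos[0]) / 2.0
    my = (left_pos[1] + right_pos[1]) / 2.0
    ex, ey = right_pos[0] - left_pos[0], right_pos[1] - left_pos[1]
    fx, fy = -ey, ex  # forward = normal of the ear axis (sign assumed)
    sx, sy = screen_center[0] - mx, screen_center[1] - my
    theta = math.degrees(math.atan2(sy, sx) - math.atan2(fy, fx))
    # Normalize to (-180, 180].
    while theta <= -180.0:
        theta += 360.0
    while theta > 180.0:
        theta -= 360.0
    return theta
```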
  • FIG. 14B shows a case where the standard viewing position is on a line (Y-axis) from the center of the screen of the broadcast receiving device 100 orthogonal to the longitudinal direction (X-axis) of the screen.
  • The angle "θ" is the angle between the front direction of the user's face described in FIG. 14A and the direction of the center of the screen of the broadcast receiving device 100, and indicates how far the user's face is turned sideways from the screen center direction. "θ" is an example of the "azimuth angle" in the present invention.
  • The position of the audio output unit such as the HMD, earphones, or headphones can be obtained by, for example, recording the position at which the user faces the center of the receiver screen based on user input, and thereafter detecting changes with a gyro sensor or the like mounted on the audio output unit. Furthermore, the position of the audio output unit of the HMD, earphones, headphones, etc. can be calculated by the broadcast receiving apparatus 100 receiving position information from a system such as GPS via the left and right audio output units.
  • Alternatively, the audio output unit may calculate its own position from position information provided by a system such as GPS and output the calculated position information to the broadcast receiving device 100. The broadcast receiving device 100 and the audio output unit can exchange position information by being equipped with other communication units such as a Bluetooth (registered trademark) communication unit, an NFC communication unit, or an infrared communication unit.
  • In the fixed sound image localization mode, the broadcast receiving device 100 sets "θ" in FIG. 14B to zero degrees and calculates and outputs sound information including audio information. That is, assuming that the user's front direction faces the center of the screen, the broadcast receiving device 100 calculates sound information including audio information and outputs the calculated sound information to an audio output unit such as earphones or headphones.
  • In this way, the signal to be output to the audio output unit is determined.
  • FIG. 35 is a flowchart showing the sound image localization mode switching operation of this embodiment.
  • In step S201, it is determined whether the angle "θ" formed by the screen center direction and the front direction of the user's face is fixed or non-fixed. If the angle "θ" is fixed (step S201: Yes), the process proceeds to step S202, and if the angle "θ" is not fixed (step S201: No), the process proceeds to step S204.
  • Whether the angle "θ" formed by the screen center direction and the front direction of the user's face is fixed or non-fixed can be set and determined by various methods. Details will be described later.
  • In step S203, the broadcast receiving device 100 outputs the computed information calculated in the audio decoders 146S, 146U, etc. to an audio output unit such as an HMD, earphones, or headphones, and the audio output unit emits the corresponding sound to the user.
  • In step S204, the broadcast receiving device 100 receives information indicating that the angle "θ" formed by the screen center direction and the front direction of the user's face is not fixed, calculates "θ", and adds the calculated value of "θ" to the attribute information of the sound information including the audio information to calculate the computed information to be output to an audio output unit such as earphones or headphones.
  • the calculation may be performed in the audio decoders 146S and 146U of FIGS. 2F and 2G in the broadcast receiving apparatus 100. More specifically, the calculation may be performed in the audio decoder 10100 of FIG. 15B, which is to be incorporated into the audio decoders 146S, 146U.
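  • The branch structure of FIG. 35 (S201, then S202 or S204, then S203) can be sketched as follows; `render` stands in for the audio-decoder computation in 146S/146U and is an assumed callback, not an API of the embodiment:

```python
def render_frame(theta_sensor, mode_fixed, render):
    """One pass of the FIG. 35 flow: S201 checks whether theta is fixed;
    the fixed branch renders with theta = 0 (user assumed to face the
    screen center), the non-fixed branch (S204) renders with the measured
    head angle; S203 outputs the result to the audio output unit."""
    if mode_fixed:            # S201: Yes
        theta = 0.0
    else:                     # S201: No -> S204
        theta = theta_sensor  # add measured theta to the sound attributes
    return render(theta)      # S203: output to HMD/earphones/headphones
```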
  • FIG. 36A shows an example of an external view of a remote control 180R used to input a sound image localization mode setting instruction to the broadcast receiving apparatus 100 of this embodiment.
  • the remote controller 180R shown in FIG. 36A and the remote controller 180R shown in FIG. 12B have the same key arrangement.
  • The remote control 180R includes a power key 180R1 for powering the broadcast receiving device 100 on/off (standby on/off), cursor keys (up, down, left, right) 180R2 for moving the cursor up, down, left, and right, an enter key 180R3 for confirming the item at the cursor position as the selected item, and a return key 180R4. To switch the audio mode, the menu key 180RA is pressed.
  • FIG. 36B is a diagram showing an example of a banner display for setting the sound image localization mode displayed on the monitor unit 192 of the broadcast receiving device 100 when the menu key 180RA is pressed.
  • a banner display 192B1 indicating the type of menu is displayed on the monitor unit 192.
  • the shape of the banner display 192B1, the display position on the monitor unit 192, and the items of the banner display 192B1 can be arbitrarily set by the broadcast receiving apparatus 100.
  • The items of the banner display 192B1 include at least an item for setting the sound image localization mode (for example, an item labeled "Audio Settings", though the name is not limited to this).
  • With the banner display 192B1 displayed, the user moves the cursor with the cursor keys (up, down, left, right) 180R2 to select "Audio Settings" and presses the enter key 180R3, whereupon banner display 192B2 appears. Items related to audio settings are displayed on the banner display 192B2, including at least an item for setting the sound image localization mode (for example, an item labeled "sound image localization", though the name is not limited to this). With the banner display 192B2 displayed, the user similarly moves the cursor with the cursor keys 180R2 to select "sound image localization" and presses the enter key 180R3, whereupon banner display 192B3 appears.
  • The items in the banner display 192B3 include at least items for setting the sound image localization mode to "fixed" or "non-fixed" (for example, item information for "fixed" and "non-fixed"; the names are not limited, and "non-fixed" could be renamed "variable", for example, with the function of the sound image localization mode remaining the same).
  • When "fixed" is selected, the remote control 180R transmits information (hereinafter referred to as "fixed information") for fixing the angle "θ" formed by the center direction of the screen and the front direction of the user's face to the broadcast receiving device 100.
  • The "fixed information" can be communicated by equipping the remote control 180R and the broadcast receiving apparatus 100 with other communication units such as a Bluetooth (registered trademark) communication unit, an NFC communication unit, or an infrared communication unit.
  • When "non-fixed" is selected, the remote control 180R transmits information (hereinafter referred to as "non-fixed information") indicating that the angle "θ" formed by the screen center direction and the front direction of the user's face is non-fixed to the broadcast receiving device 100. The "non-fixed information" can likewise be communicated by equipping the remote control 180R and the broadcast receiving apparatus 100 with other communication units such as a Bluetooth (registered trademark) communication unit, an NFC communication unit, or an infrared communication unit.
  • The broadcast receiving device 100 that has received the "fixed information" or "non-fixed information" sets the angle "θ" formed by the center direction of the screen and the front direction of the user's face either to a fixed value (for example, 0 degrees) or to the measured angle "θ", and calculates the computed information to be output to an audio output unit such as an HMD, earphones, or headphones.
  • the calculation may be performed in the audio decoders 146S and 146U of FIGS. 2F and 2G in the broadcast receiving apparatus 100. More specifically, the calculation may be performed in the audio decoder 10100 of FIG. 15B, which is to be incorporated into the audio decoders 146S, 146U.
  • In this way, the broadcast receiving apparatus 100 can calculate and output sound information including audio information with "θ" in FIG. 14B fixed (at zero degrees, as an example). Therefore, when the front direction of the user's face cannot be accurately obtained, or when the user feels uncomfortable with the sound emission direction, the sound image localization direction can be fixed.
  • FIG. 37 is a flowchart showing the "θ" correction operation when the sound image localization mode of this modification is non-fixed.
  • In step S301, the broadcast receiving device 100 determines whether the reset button has been pressed. If the reset button has been pressed (step S301: Yes), the process proceeds to step S302. If the reset button has not been pressed (step S301: No), the process proceeds to step S303.
  • A reset button may be newly provided on the remote control 180R (not shown). Alternatively, as described above, it may be determined that the reset button has been pressed when the user presses the menu key 180RA, selects a reset area (not shown) displayed on the monitor unit 192 of the broadcast receiving device 100 using the cursor keys (up, down, left, right) 180R2, and presses the enter key 180R3.
  • One of the color keys (blue, red, green, yellow) 180RD may be configured to serve as the reset button.
  • the reset button may be provided on the HMD, earphones, or headphones, and information on the press of the reset button may be output to the broadcast receiving device 100.
  • Transmission of the information that the reset button has been pressed can be realized by equipping each device with other communication units such as a Bluetooth (registered trademark) communication unit, an NFC communication unit, or an infrared communication unit.
  • In step S302, the broadcast receiving device 100 sets the direction in which the user's face is oriented at the time the reset button is pressed as the new front direction of the user's face, and "θ" is thereafter measured relative to this new front direction.
  • In step S303, the broadcast receiving device 100 uses the audio decoders 146S, 146U, etc. to calculate the audio information using "θ" based on the newly set front direction of the user's face, outputs the computed information to an audio output unit such as an HMD, earphones, or headphones, and the audio output unit emits the corresponding sound to the user.
  • Even if the broadcast receiving device 100 incorrectly recognizes the front direction of the user's face, the reset button allows the user to correct the front direction to the direction in which the user is actually facing.
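  • The reset behavior of FIG. 37 amounts to re-zeroing an offset: the sensor angle at the moment of the reset press becomes the new front direction, and subsequent "θ" values are taken relative to it. A minimal sketch (class and method names are illustrative assumptions):

```python
class FrontDirectionTracker:
    """Track the face angle reported by the HMD/earphone gyro and let a
    reset button re-zero the front direction (FIG. 37, steps S301-S303)."""

    def __init__(self):
        self.offset = 0.0  # sensor angle currently regarded as "front"

    def reset(self, sensor_angle):
        # S302: the direction faced at the moment of the reset press
        # becomes the new front direction.
        self.offset = sensor_angle

    def theta(self, sensor_angle):
        # S303: theta used by the audio decoders, relative to the new front.
        return sensor_angle - self.offset
```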
  • As described above, the broadcast receiving device identifies the sound collection location of the program from the metadata and content of the program information, obtains the sound field information of that location from a server, generates reverberation sound, generates a synthesized sound by adding the reverberation sound to the original sound, and outputs the synthesized sound to the sound output device. Through this operation, the viewer can experience the same sense of presence as if present at the sound collection location. In addition, when a listening point in the sound collection location is specified, the broadcast receiving device generates a synthesized sound corresponding to listening to the sound from the sound source at that listening point, so that the sound heard at any position in the sound collection location can be reproduced.
  • Furthermore, the broadcast receiving device moves the listening point at the sound collection location in accordance with the partial enlargement operation on the broadcast image and virtually changes the direction from which the sound comes, so that the video and audio remain synchronized and the viewer can enjoy an experience with a heightened sense of realism.
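  • The reverberation-addition operation summarized above can be sketched as convolving the original (dry) sound with an impulse response for the sound collection location and mixing the result back in; the impulse-response representation and the wet gain are assumptions, since the format of the embodiment's sound field information is not specified here:

```python
import numpy as np

def add_reverberation(dry, impulse_response, wet_gain=0.5):
    """Synthesize 'original sound + reverberation' by convolution with a
    venue impulse response, keeping the direct sound at full level."""
    wet = np.convolve(dry, impulse_response)
    out = wet_gain * wet
    out[: len(dry)] += dry  # direct sound plus scaled reverberant tail
    return out
```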
  • Some or all of the functions of the present invention described above may be realized in hardware by, for example, designing an integrated circuit.
  • the functions may be realized by software by having a microprocessor unit or the like interpret and execute operating programs for realizing the respective functions.
  • Hardware and software may be used together.
  • the software for controlling the broadcast receiving apparatus 100 may be stored in advance in the ROM 103 and/or the storage unit 110 of the broadcast receiving apparatus 100 at the time of product shipment.
  • Alternatively, the software may be acquired from a server device on the Internet 800 via the LAN communication unit 121 after product shipment. Further, software stored in a memory card, an optical disc, or the like may be acquired via the expansion interface unit 124 or the like.
  • the software for controlling the mobile information terminal 700 may be stored in advance in the ROM 703 and/or the storage unit 710 of the mobile information terminal 700 at the time of product shipment.
  • Alternatively, the software may be acquired from a server device on the Internet 800 via the LAN communication unit 721 or the mobile telephone network communication unit 722 after product shipment. Further, software stored in a memory card, an optical disc, or the like may be acquired via the expansion interface unit 724 or the like.
  • control lines and information lines shown in the figures are those considered necessary for the explanation, and do not necessarily show all control lines and information lines on the product. In reality, almost all components may be considered to be interconnected.
  • 100: Broadcast receiving device, 101: Main control unit, 102: System bus, 103: ROM, 104: RAM, 110: Storage unit, 121: LAN communication unit, 124: Expansion interface unit, 125: Digital interface unit, 130C, 130T, 130L, 130B: Tuner/demodulation unit, 140S, 140U: Decoder unit, 180: Operation input unit, 191: Video selection unit, 192: Monitor unit, 193: Video output unit, 194: Audio selection unit, 195: Speaker unit, 196: Audio output unit, 180R: Remote controller, 200, 200C, 200T, 200S, 200L, 200B: Antenna, 201T, 201L, 201B: Conversion unit, 300, 300T, 300S, 300L: Radio tower, 400C: Head end of cable television station, 400: Broadcasting station server, 500: Service provider server, 600: Mobile telephone communication server, 600B: Base station, 700: Mobile information terminal, 800: Internet, 800R: Router device, 900: Information server, 9

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Human Computer Interaction (AREA)
  • Computer Security & Cryptography (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention provides a technology for transmitting or receiving an advanced digital broadcasting service more appropriately. This broadcast receiving device comprises: a broadcast receiving unit that can receive signals of separated sound sources via broadcast waves and receives the broadcast waves; an audio output unit that is composed of a plurality of speakers and outputs audio according to the signals of the separated sound sources output by the broadcast receiving device; and a control unit. The control unit determines the playback positions of the audio resulting from the signals of the separated sound sources according to arrangement information concerning the plurality of speakers, calculates signals corresponding to 22.2ch audio channels on the basis of the signals of the separated sound sources, and then converts the signals corresponding to the 22.2ch audio channels into signals for the audio output unit.
PCT/JP2023/029279 2022-08-03 2023-08-10 Dispositif de réception de diffusion, procédé de protection de contenu, procédé de traitement pour ajouter un son de réverbération et procédé de commande pour dispositif de réception de diffusion WO2024029634A1 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2022123799A JP2024021157A (ja) 2022-08-03 2022-08-03 Broadcast receiving device and content protection method
JP2022-123799 2022-08-03
JP2022151952A JP2024046518A (ja) 2022-09-22 2022-09-22 Broadcast receiving device and reverberation sound addition processing method
JP2022-151952 2022-09-22
JP2022-181025 2022-11-11
JP2022181025A JP2024070494A (ja) 2022-11-11 2022-11-11 Broadcast receiving device and control method for broadcast receiving device

Publications (1)

Publication Number Publication Date
WO2024029634A1 true WO2024029634A1 (fr) 2024-02-08

Family

ID=89849523

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/029279 WO2024029634A1 (fr) 2022-08-03 2023-08-10 Dispositif de réception de diffusion, procédé de protection de contenu, procédé de traitement pour ajouter un son de réverbération et procédé de commande pour dispositif de réception de diffusion

Country Status (1)

Country Link
WO (1) WO2024029634A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6229327A (ja) * 1985-07-31 1987-02-07 Toshiba Electric Equip Corp Multiple dwelling housing information system
JP2003255955A (ja) * 2002-02-28 2003-09-10 Pioneer Electronic Corp Sound field control method and sound field control system
JP2013157680A (ja) * 2012-01-27 2013-08-15 Hitachi Consumer Electronics Co Ltd Receiving device and digital broadcast transmission/reception system
JP2014045282A (ja) * 2012-08-24 2014-03-13 Nippon Hoso Kyokai <Nhk> Reverberation adding device and reverberation adding program
WO2016002738A1 (fr) * 2014-06-30 2016-01-07 ソニー株式会社 Information processing device and information processing method
JP2018046319A (ja) * 2016-09-12 2018-03-22 マクセル株式会社 Broadcast receiving device
JP2019161672A (ja) * 2019-06-27 2019-09-19 マクセル株式会社 System
JP2021136465A (ja) * 2020-02-21 2021-09-13 日本放送協会 Receiving device, content transmission system, and program


Similar Documents

Publication Publication Date Title
JP7389214B2 (ja) Method for transmitting digital broadcast modulated waves
JP7401603B2 (ja) Transmission wave processing method
JP2023126310A (ja) Broadcast receiving device
JP2024026573A (ja) Display control method
JP2024026228A (ja) Broadcast receiving device
JP2024026572A (ja) Display control method
WO2019221080A1 (fr) Broadcast receiving device and carrier wave processing method
WO2024029634A1 (fr) Broadcast receiving device, content protection method, processing method for adding reverberation sound, and control method for broadcast receiving device
JP2022132341A (ja) Transmission wave processing method
JP2024046518A (ja) Broadcast receiving device and reverberation sound addition processing method
JP2024021157A (ja) Broadcast receiving device and content protection method
JP2024070494A (ja) Broadcast receiving device and control method for broadcast receiving device
JP2024053891A (ja) Broadcasting system, broadcast receiving device, broadcast station device, and broadcasting method
JP2019201330A (ja) Transmission wave processing method
JP7428507B2 (ja) Broadcast receiving device
JP7460740B2 (ja) Method for transmitting digital broadcast modulated waves
JP7448340B2 (ja) Time control method
JP7388891B2 (ja) Display control method
WO2023112666A1 (fr) Broadcast receiving device, setting method, transmission method, display control method, and recording medium
JP7414489B2 (ja) Display control method
JP7405579B2 (ja) Display control method
WO2022168525A1 (fr) Digital broadcast receiving device
JP7197288B2 (ja) Broadcast receiving device
JP2022174771A (ja) Digital broadcast receiving device and display method
JP2022163274A (ja) Display device and playback method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23850187

Country of ref document: EP

Kind code of ref document: A1