WO2017090456A1 - Data processing device and data processing method - Google Patents

Data processing device and data processing method

Info

Publication number
WO2017090456A1
WO2017090456A1 (PCT/JP2016/083466)
Authority
WO
WIPO (PCT)
Prior art keywords
time
information
time information
frame
physical layer
Prior art date
Application number
PCT/JP2016/083466
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
Lachlan Bruce Michael
Kazuyuki Takahashi
Original Assignee
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Priority to CA3004847A priority Critical patent/CA3004847C/en
Priority to US15/765,861 priority patent/US11265141B2/en
Priority to MX2018006176A priority patent/MX2018006176A/es
Priority to KR1020187013710A priority patent/KR102379544B1/ko
Priority to JP2017552355A priority patent/JPWO2017090456A1/ja
Priority to CN201680067326.3A priority patent/CN108352980B/zh
Priority to EP16868404.1A priority patent/EP3382927B1/en
Publication of WO2017090456A1 publication Critical patent/WO2017090456A1/ja
Priority to US17/648,628 priority patent/US20220224506A1/en

Classifications

    • H04L 7/04: Arrangements for synchronising receiver with transmitter; speed or phase control by synchronisation signals
    • H04L 7/041: Speed or phase control by synchronisation signals using special codes as synchronising signal
    • H04N 21/2343: Processing of video elementary streams involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • G04G 7/00: Electronic time-pieces; synchronisation
    • H04N 21/236: Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data; remultiplexing of multiplex streams; insertion of stuffing bits into the multiplex stream; assembling of a packetised elementary stream
    • H04N 21/4305: Synchronising client clock from received content stream, e.g. locking decoder clock with encoder clock, extraction of the PCR packets
    • H04N 21/4402: Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N 21/6547: Transmission by server directed to the client comprising parameters, e.g. for client setup

Definitions

  • the present technology relates to a data processing device and a data processing method, and more particularly to a data processing device and a data processing method capable of reducing an error of time due to the accuracy of time information.
  • Development of Advanced Television Systems Committee (ATSC) 3.0 is underway as one of the next-generation broadcasting systems (see, for example, Non-Patent Document 1).
  • ATSC 3.0 does not use the currently widespread MPEG2-TS (Transport Stream) method as its transmission method; instead, by introducing an IP (Internet Protocol) transmission method that applies IP packets used in the field of communication to digital broadcasting, it is expected to provide more advanced services.
  • Incidentally, a time error may occur due to the accuracy of the time information included in the signaling transmitted in the physical layer frame, and proposals for reducing the time error caused by the accuracy of the time information have been required.
  • the present technology has been made in view of such a situation, and is intended to be able to reduce an error of time due to the accuracy of time information.
  • A data processing device according to a first aspect of the present technology includes: a generation unit configured to generate signaling including time information having a time accuracy according to the frame length of a physical layer frame; and a processing unit configured to process the signaling so that it is included in the preamble of the physical layer frame.
  • the data processing device may be an independent device or an internal block that constitutes one device.
  • a data processing method according to a first aspect of the present technology is a data processing method corresponding to the data processing device according to the first aspect of the present technology described above.
  • In the first aspect of the present technology, signaling including time information having a time accuracy according to the frame length of a physical layer frame is generated, and the signaling is processed so that it is included in the preamble of the physical layer frame.
  • A data processing device according to a second aspect of the present technology is a data processing device including a processing unit that processes signaling that is included in the preamble of a physical layer frame and that includes time information having a time accuracy according to the frame length of the physical layer frame.
  • the data processing device of the second aspect of the present technology may be an independent device or an internal block that constitutes one device.
  • a data processing method according to a second aspect of the present technology is a data processing method corresponding to the data processing device according to the second aspect of the present technology described above.
  • In the second aspect of the present technology, signaling that is included in the preamble of a physical layer frame and that includes time information having a time accuracy according to the frame length of the physical layer frame is processed.
  • FIG. 1 is a diagram showing the configuration of an embodiment of a transmission system to which the present technology is applied.
  • a system is a system in which a plurality of devices are logically gathered.
  • the transmission system 1 includes a transmitting device 10 and a receiving device 20.
  • data transmission conforming to a broadcasting system such as ATSC 3.0 is performed.
  • the transmitting device 10 is a transmitter compatible with a broadcasting method such as ATSC 3.0, and transmits content via the transmission path 30.
  • the transmitting apparatus 10 transmits, as a broadcast wave, a broadcast stream including video and audio (components of the video and audio) constituting content such as a broadcast program and the like via the transmission path 30.
  • the receiving device 20 is a receiver compatible with a broadcasting scheme such as ATSC 3.0, and receives and outputs content transmitted from the transmitting device 10 via the transmission path 30.
  • the receiving device 20 receives a broadcast wave from the transmitting device 10, processes video and audio (components of the content), and signaling included in the broadcast stream, and outputs the video of the content such as a broadcast program. And play audio.
  • The transmission path 30 may be, in addition to terrestrial broadcasting, for example, satellite broadcasting using a broadcasting satellite (BS: Broadcasting Satellite) or a communications satellite (CS: Communications Satellite), or cable broadcasting (CATV).
  • BS Broadcasting Satellite
  • CS Communications Satellite
  • CATV cable broadcasting
  • FIG. 2 is a diagram showing the structure of a physical layer frame used in data transmission conforming to a broadcasting scheme such as ATSC 3.0.
  • the physical layer frame is composed of a bootstrap, a preamble, and a payload.
  • The physical layer frame is configured with a predetermined frame length on the order of milliseconds. In the physical layer frame, the subsequent payload can be obtained after the bootstrap and the preamble have been acquired.
  • The bootstrap corresponds to, for example, the P1 symbol constituting a T2 frame of DVB-T2 (Digital Video Broadcasting - Second Generation Terrestrial), and the preamble corresponds to, for example, the P2 symbol constituting a T2 frame of DVB-T2.
  • the bootstrap can also be said to be a preamble.
  • a time-aligned mode and a symbol-aligned mode are defined as frame modes according to the frame length of the physical layer frame.
  • The time aligned mode is a mode in which extra (excess) samples are inserted in the guard interval (GI) portion so that the frame length (frame time) of the physical layer frame is always adjusted to an integer number of milliseconds before transmission.
  • If the frame length (frame time) of the physical layer frame is an integer number of milliseconds, it conveniently aligns with millisecond time boundaries. However, in the time aligned mode, data transmission efficiency is degraded because the extra samples (meaningless data) must also be transmitted.
  • The symbol aligned mode is a mode in which transmission is performed without inserting extra samples. In this mode, efficient data transmission can be performed because no extra samples are transmitted.
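  • The difference between the two frame modes can be illustrated with a short sketch. This is not taken from the patent: the 6.912 MHz baseband sample rate, the 8K FFT size, the 1024-sample guard interval, and the omission of the bootstrap are assumptions, so the printed durations only loosely track the figures discussed below.

```python
# Minimal sketch (not from the patent): comparing the two frame modes.
# Assumptions: a 6.912 MHz baseband sample rate, an 8K FFT (8192 samples),
# a guard interval of 1024 samples; the bootstrap is ignored here.
SAMPLE_RATE_HZ = 6_912_000          # assumed elementary sample rate
FFT_SIZE = 8192                      # 8K mode
GI_SAMPLES = 1024                    # guard interval length in samples

def frame_samples(num_symbols: int) -> int:
    """Payload samples of one physical layer frame (preamble folded in for brevity)."""
    return num_symbols * (FFT_SIZE + GI_SAMPLES)

def excess_samples_time_aligned(num_symbols: int) -> int:
    """Extra samples the time aligned mode inserts so the frame ends on a 1 ms boundary."""
    samples_per_ms = SAMPLE_RATE_HZ // 1000
    remainder = frame_samples(num_symbols) % samples_per_ms
    return 0 if remainder == 0 else samples_per_ms - remainder

if __name__ == "__main__":
    for n in (10, 12):
        base = frame_samples(n)
        extra = excess_samples_time_aligned(n)
        print(f"{n} symbols: {base} samples "
              f"({base / SAMPLE_RATE_HZ * 1e3:.3f} ms), "
              f"time aligned mode pads {extra} samples, "
              f"symbol aligned mode pads 0")
```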
  • DVB-T Digital Video Broadcasting-Terrestrial
  • DVB-T2 Digital Video Broadcasting - Second Generation Terrestrial
  • ISDB-T Integrated Services Digital Broadcasting-Terrestrial
  • That is, in the symbol aligned mode, there are physical layer frames whose frame length is an integer number of milliseconds and which fall on millisecond boundaries, and physical layer frames whose frame length is not an integer number of milliseconds and which do not fall on millisecond boundaries.
  • FIG. 3 shows simulation results of the frame time of the physical layer frame when the number of OFDM symbols included in one physical layer frame is varied, for the case where the FFT mode is the 8K mode and the guard interval (GI) is 1024. In FIG. 3, shading indicates whether the frame time is an integer number of milliseconds.
  • In this case, the OFDM symbol time including the GI is 1.333 ms.
  • the frame time of the first physical layer frame is 15.333 ms
  • the frame time of the second physical layer frame is 30.667 ms.
  • The frame time of the third physical layer frame is 46.000 ms (30.667 ms + 15.333 ms), and the frame time of the fourth physical layer frame is 61.333 ms (46.000 ms + 15.333 ms).
  • Thereafter, the frame time of each physical layer frame can be obtained by adding 15.333 ms to the frame time of the immediately preceding physical layer frame.
  • Similarly, the frame times of the fifth to 25th physical layer frames are obtained as 76.667 ms, 92.000 ms, 107.333 ms, 122.667 ms, 138.000 ms, 153.333 ms, 168.667 ms, 184.000 ms, 199.333 ms, 214.667 ms, 230.000 ms, 245.333 ms, 260.667 ms, 276.000 ms, 291.333 ms, 306.667 ms, 322.000 ms, 337.333 ms, 352.667 ms, 368.000 ms, and 383.333 ms.
  • In this case, the frame times of physical layer frames at regular intervals, such as 46.000 ms, 92.000 ms, 138.000 ms, 184.000 ms, 230.000 ms, 276.000 ms, 322.000 ms, and 368.000 ms, are integer numbers of milliseconds, but the other frame times are not integer numbers of milliseconds.
  • In another case, the OFDM symbol time including the GI is likewise 1.333 ms, and the total OFDM time is 16.000 ms.
  • the frame time of the first physical layer frame is 18.000 ms
  • The frame time of the second physical layer frame is 36.000 ms (18.000 ms + 18.000 ms).
  • The frame time of the third physical layer frame is 54.000 ms (36.000 ms + 18.000 ms), and the frame time of the fourth physical layer frame is 72.000 ms (54.000 ms + 18.000 ms).
  • Thereafter, the frame time of each physical layer frame can be obtained by adding 18.000 ms to the frame time of the immediately preceding physical layer frame.
  • Similarly, the frame times of the fifth to 25th physical layer frames are obtained as 90.000 ms, 108.000 ms, 126.000 ms, 144.000 ms, 162.000 ms, 180.000 ms, 198.000 ms, 216.000 ms, 234.000 ms, 252.000 ms, 270.000 ms, 288.000 ms, 306.000 ms, 324.000 ms, 342.000 ms, 360.000 ms, 378.000 ms, 396.000 ms, 414.000 ms, 432.000 ms, and 450.000 ms.
  • In this case, the frame times of all physical layer frames are integer numbers of milliseconds, such as 18.000 ms, 36.000 ms, ..., 432.000 ms, and 450.000 ms.
  • FIG. 4 shows simulation results of the frame time of the physical layer frame when the number of OFDM symbols included in one physical layer frame is varied, for the case where the FFT mode is the 8K mode and the guard interval (GI) is 768. In FIG. 4 as well, shading indicates whether the frame time is an integer number of milliseconds.
  • In this case, the frame time of the physical layer frame is an integer number of milliseconds only every eighth frame, and the other frame times are not integer numbers of milliseconds. Further, when the number of OFDM symbols is 16, 17, 19, 20, 22, 23, 25, 26, 28, or 29, the frame times of all physical layer frames are not integer numbers of milliseconds. On the other hand, when the number of OFDM symbols is 27, the frame times of all physical layer frames are integer numbers of milliseconds.
  • In this way, whether the frame time of the physical layer frame becomes an integer number of milliseconds is determined by the combination of the FFT mode, the number of OFDM symbols, the guard interval (GI), and the symbol time. For example, in the simulation result of FIG. 3 the frame time is an integer number of milliseconds at a certain rate, whereas in the simulation result of FIG. 4 the situation differs depending on the number of OFDM symbols.
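  • The kind of frame-time check behind FIG. 3 and FIG. 4 can be sketched as follows. This is not the patent's own simulation: the 6.912 MHz sample rate, the 8K FFT size, and the 2 ms bootstrap duration are assumptions chosen so that the 10-symbol and 12-symbol cases land on the 15.333 ms and 18.000 ms frame times quoted above.

```python
# Sketch of the frame-time check behind FIG. 3 and FIG. 4 (assumptions, not the
# patent's own code): 6.912 MHz sample rate, 8K FFT, 2 ms bootstrap per frame.
from fractions import Fraction

SAMPLE_RATE = 6_912_000
FFT_SIZE = 8192
BOOTSTRAP_MS = Fraction(2)           # assumed bootstrap duration

def frame_times_ms(num_symbols: int, gi_samples: int, num_frames: int = 25):
    """Cumulative end times (in ms) of successive physical layer frames."""
    symbol_ms = Fraction((FFT_SIZE + gi_samples) * 1000, SAMPLE_RATE)
    frame_ms = BOOTSTRAP_MS + num_symbols * symbol_ms
    total = Fraction(0)
    out = []
    for _ in range(num_frames):
        total += frame_ms
        out.append(total)
    return out

def report(num_symbols: int, gi_samples: int):
    times = frame_times_ms(num_symbols, gi_samples)
    integral = sum(t.denominator == 1 for t in times)
    print(f"{num_symbols} symbols, GI={gi_samples}: "
          f"{integral}/{len(times)} frame times are integer ms "
          f"(first = {float(times[0]):.3f} ms)")

if __name__ == "__main__":
    report(10, 1024)   # first frame 15.333 ms, integer ms every third frame
    report(12, 1024)   # first frame 18.000 ms, all frame times integer ms
    report(12, 768)    # guard interval of 768 samples, as in FIG. 4
```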
  • Here, it is assumed that the time information transmitted by signaling is transmitted with millisecond accuracy.
  • In that case, even when the symbol aligned mode is set, there is no problem for physical layer frames whose frame time is an integer number of milliseconds, because there is no error from the time indicated by the time information.
  • For physical layer frames whose frame time is not an integer number of milliseconds, however, an error (jitter) relative to the time indicated by the time information occurs.
  • Therefore, in the present technology, the accuracy of the time information transmitted by signaling is made higher than the current millisecond accuracy, so that even when the symbol aligned mode is set as the frame mode, an error between the time indicated by the time information and the frame time does not occur for any physical layer frame (the error can be reduced).
  • the time information transmitted by signaling represents an absolute time of a predetermined position in the stream of physical layer frames.
  • the time at a predetermined position in the stream is the time at a predetermined timing during which the bit at the predetermined position is being processed by the transmission device 10.
  • a predetermined position in the stream of the physical layer frame in which the time information represents the time is referred to as a time position.
  • As the time position, for example, the position of the beginning of the physical layer frame having the preamble that includes the time information (the position of the beginning of the bootstrap) can be adopted. It is also possible to adopt, for example, the position of the boundary between the bootstrap and the preamble of that physical layer frame (the last position of the bootstrap or the position of the beginning of the preamble).
  • Further, as the time position, for example, the last position of the preamble of the physical layer frame having the preamble that includes the time information can be adopted.
  • That is, any position of the physical layer frame can be adopted as the time position.
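  • As a hedged illustration of the time position concept, the sketch below derives the absolute time of a chosen position (for example, the first sample of a bootstrap) from an epoch and a sample counter; the 6.912 MHz sample rate and the helper names are assumptions, not part of the described signaling.

```python
# Sketch (hypothetical names): the absolute time of a chosen time position,
# derived from a PTP-style epoch and the sample index at which that position
# (e.g. the first sample of the bootstrap) leaves the modulator.
from fractions import Fraction

SAMPLE_RATE_HZ = 6_912_000            # assumed baseband sample rate

def time_position_ns(epoch_ns: int, sample_index: int) -> int:
    """Absolute time, in integer nanoseconds, of the given sample index."""
    offset_ns = Fraction(sample_index, SAMPLE_RATE_HZ) * 1_000_000_000
    return epoch_ns + round(offset_ns)

if __name__ == "__main__":
    epoch_ns = 1_700_000_000 * 10**9   # example PTP seconds expressed in ns
    # Time position chosen as the first sample of the third frame's bootstrap,
    # assuming each of the first two frames occupied 105_984 samples (15.333 ms).
    print(time_position_ns(epoch_ns, 2 * 105_984))
```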
  • FIG. 5 is a diagram for explaining the outline of L1 basic information and L1 detailed information.
  • L1 basic information L1-Basic
  • L1 detailed information L1-Detail
  • L1 basic information consists of about 200 bits, whereas L1 detailed information consists of 400 to several thousand bits, so they differ in size. Further, as indicated by the arrows in the figure, the L1 basic information and the L1 detailed information are read out from the preamble in that order, so the L1 basic information is read out earlier than the L1 detailed information. Furthermore, the L1 basic information differs in that it is transmitted more robustly than the L1 detailed information.
  • FIG. 6 is a diagram showing an example of the syntax of L1 basic information (L1-Basic) of FIG. The detailed content of L1 basic information is described in “Table 9.2 L1-Basic signaling fields and syntax” of Non-Patent Document 1 described above.
  • the 2-bit L1B_content_tag represents a tag value that identifies content.
  • the 3-bit L1B_version represents a version of L1 basic information.
  • The 1-bit L1B_slt_flag indicates whether or not an SLT (Service List Table) exists.
  • One bit L1B_time_info_flag indicates whether time information exists.
  • 2-bit L1B_papr represents application of PAPR (Peak to Average Power Reduction).
  • L1B_frame_length_mode represents a frame mode.
  • the 10-bit L1B_frame_length represents the frame length of the physical layer frame. However, this L1B_frame_length is used only when the frame mode is in the time-aligned mode, and is not used when in the symbol-aligned mode.
  • L1B_num_subframes represents the number of subframes included in the physical layer frame.
  • the 3-bit L1B_preamble_num_symbols represents the number of OFDM symbols included in the preamble.
  • the 3-bit L1B_preamble_reduced_carriers represents the number of control units according to the reduction of the maximum number of carriers of FFT size used in the preamble.
  • the 16-bit L1B_L1_Detail_size_bits represents the size of L1 detailed information (L1-Detail).
  • the 3-bit L1B_L1_Detail_fec_type represents the FEC type of L1 detailed information.
  • the 2-bit L1B_L1_Detail_additional_parity_mode represents an additional parity mode of L1 detailed information.
  • the 19-bit L1B_L1_Detail_total_cells represents the total size of L1 detailed information.
  • L1B_First_Sub_mimo represents the usage status of MIMO (Multiple Input and Multiple Output) of the first subframe.
  • L1B_First_Sub_miso represents the usage status of Multiple Input and Single Output (MISO) of the first subframe.
  • the 2-bit L1B_First_Sub_fft_size represents the FFT size of the first subframe.
  • the 3-bit L1B_First_Sub_reduced_carriers represents the number of control units according to the decrease in the maximum number of FFT size carriers used in the first subframe.
  • the 4-bit L1B_First_Sub_guard_interval represents the guard interval length of the first subframe.
  • the 13-bit L1B_First_Sub_excess_samples represents the number of extra samples inserted in the guard interval part in the first subframe. However, this L1B_First_Sub_excess_samples is used only when the frame mode is in the time-aligned mode, and is not used when in the symbol-aligned mode.
  • the 11-bit L1B_First_Sub_num_ofdm_symbols represents the number of OFDM symbols included in the first subframe.
  • the 5-bit L1B_First_Sub_scattered_pilot_pattern represents an SP pattern (Scattered Pilot Pattern) used in the first subframe.
  • the 3-bit L1B_First_Sub_scattered_pilot_boost represents a value for increasing the size of the SP pattern.
  • L1B_First_Sub_sbs_first represents the beginning of the Subframe Boundary Symbol (SBS) of the first subframe.
  • L1B_First_Sub_sbs_last represents the end of the SBS of the first subframe.
  • L1B_Reserved is a reserved area (Reserved).
  • the number of bits of L1B_Reserved is undetermined (TBD: To Be Determined), but is currently 49 bits.
  • The 32-bit L1B_crc contains a CRC (Cyclic Redundancy Check) value.
  • uimsbf unsigned integer most significant bit first
  • bslbf bit string, left bit first
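  • The sketch below models a subset of the L1 basic information fields listed above as a simple structure, using the bit widths quoted in the text; the packing order is illustrative only and is not a bit-exact ATSC 3.0 encoder, and the width of L1B_frame_length_mode is assumed since the text does not give it.

```python
# Schematic sketch of a subset of the L1-Basic fields quoted above.
# Bit widths follow the text; the packing order is illustrative only and is
# not a bit-exact ATSC 3.0 encoder.
from dataclasses import dataclass

@dataclass
class L1Basic:
    l1b_content_tag: int          # 2 bits
    l1b_version: int              # 3 bits
    l1b_slt_flag: int             # 1 bit
    l1b_time_info_flag: int       # 1 bit
    l1b_frame_length_mode: int    # 1 bit assumed; the text above does not give its width
    l1b_frame_length: int         # 10 bits, used only in time aligned mode
    l1b_l1_detail_size_bits: int  # 16 bits

    def pack(self) -> int:
        """Concatenate the fields MSB-first (uimsbf) into one integer."""
        fields = [
            (self.l1b_content_tag, 2),
            (self.l1b_version, 3),
            (self.l1b_slt_flag, 1),
            (self.l1b_time_info_flag, 1),
            (self.l1b_frame_length_mode, 1),
            (self.l1b_frame_length, 10),
            (self.l1b_l1_detail_size_bits, 16),
        ]
        value = 0
        for field, width in fields:
            assert 0 <= field < (1 << width), "field exceeds its bit width"
            value = (value << width) | field
        return value

if __name__ == "__main__":
    sig = L1Basic(0, 0, 1, 1, 0, 0, 400)
    print(f"packed subset: {sig.pack():#x}")
```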
  • FIG. 7 is a diagram showing an example of syntax of L1 detailed information (L1-Detail) of FIG. However, in the syntax of FIG. 7, a part particularly related to the present technology in the L1 detailed information is extracted and described. The detailed contents of L1 detailed information are described in “Table 9.12 L1-Detail signaling fields and syntax” of Non-Patent Document 1 described above.
  • the 4-bit L1D_version represents a version of L1 detailed information.
  • the 19-bit L1D_rf_frequency represents the frequency of the RF channel coupled by channel bonding.
  • When L1B_time_info_flag in the L1 basic information of FIG. 6 is 1, it indicates that time information exists, so L1D_time_info is arranged as time information in the L1 detailed information. Note that the number of bits of L1D_time_info is treated as undetermined (TBD).
  • In L1D_time_info, a 32-bit L1D_time_sec and a 10-bit L1D_time_msec are arranged.
  • L1D_time_sec represents time information in seconds.
  • L1D_time_msec represents time information in milliseconds.
  • That is, in addition to the time information in seconds (sec) and milliseconds (msec), time information in microseconds (usec) and nanoseconds (nsec), i.e., time information with precision higher than the current millisecond precision, is transmitted.
  • Time information with even higher precision could be transmitted, but when providing a broadcast service in the transmission system 1 of FIG. 1, transmitting time information with accuracy higher than necessary for the broadcast is not effective because it affects the transmission band and the like.
  • For example, PTP (Precision Time Protocol) defined in IEEE 1588-2008 is used as the time information.
  • PTP is composed of a seconds field and a nanoseconds field and has nanosecond accuracy. Here, it is assumed that time information with precision higher than nanosecond precision is not transmitted, and that the time information transmitted in addition to the seconds does not exceed 30 bits.
  • The 10-bit millisecond time information and the 20-bit microsecond and nanosecond time information are merely an example, and other bit precisions may be adopted.
  • In the MPEG2-TS method, the clock accuracy is determined by the standard (for example, 27 MHz, 30 ppm). FIG. 9 compares the error (rounding error) between the time indicated by the signaling time information and the frame time when the frame mode is the symbol aligned mode with the accuracy of the MPEG2-TS method. According to the table in FIG. 9, with time information in milliseconds the accuracy is degraded compared with the accuracy of the MPEG2-TS method, but with time information in microseconds or nanoseconds it is clear that the accuracy is much better than that of the MPEG2-TS method.
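  • The sketch below splits a PTP-style (seconds plus nanoseconds) timestamp into the seconds, millisecond, microsecond, and nanosecond fields discussed above and prints the rounding error left at each precision, next to the roughly 37 ns tick of a 27 MHz MPEG2-TS clock for comparison; the helper itself is hypothetical.

```python
# Sketch (hypothetical helper): splitting a PTP-style timestamp into the
# seconds / milliseconds / microseconds / nanoseconds fields discussed above,
# and the rounding error left if transmission stops at each precision.
def split_time_fields(ptp_seconds: int, ptp_nanoseconds: int) -> dict:
    msec, rem = divmod(ptp_nanoseconds, 1_000_000)   # 10-bit field (0..999)
    usec, nsec = divmod(rem, 1_000)                   # 10-bit fields (0..999)
    return {"time_sec": ptp_seconds, "time_msec": msec,
            "time_usec": usec, "time_nsec": nsec}

if __name__ == "__main__":
    fields = split_time_fields(1_700_000_000, 123_456_789)
    print(fields)   # {'time_sec': 1700000000, 'time_msec': 123, 'time_usec': 456, 'time_nsec': 789}

    # Residual (rounding) error if only the coarser fields are signalled.
    print("ms-only error:", fields["time_usec"] * 1_000 + fields["time_nsec"], "ns")
    print("us-only error:", fields["time_nsec"], "ns")
    print("ns error     :", 0, "ns")
    print("27 MHz tick  :", round(1e9 / 27e6, 1), "ns")   # MPEG2-TS clock granularity
```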
  • In the present technology, such time information is included in the L1 basic information and the L1 detailed information as signaling. The time information only needs to be included in at least one of the L1 basic information and the L1 detailed information; below, four transmission methods are described: the L1B+L1D transmission method, which transmits time information in both the L1 basic information and the L1 detailed information, the L1B transmission schemes a and b, which transmit time information only in the L1 basic information, and the L1D transmission method, which transmits time information only in the L1 detailed information.
  • FIG. 10 is a diagram illustrating an example of syntax of L1 basic information of the L1B + L1D transmission method. However, in the syntax of FIG. 10, only characteristic portions are extracted and described.
  • In the L1 basic information of the L1B+L1D transmission method, time information in microseconds (L1B_time_usec) and time information in nanoseconds (L1B_time_nsec) are arranged.
  • FIG. 11 is a diagram illustrating an example of syntax of L1 detailed information of the L1B + L1D transmission method. However, in the syntax of FIG. 11, only characteristic portions are extracted and described.
  • In the L1 detailed information of the L1B+L1D transmission method, time information in seconds (L1D_time_sec) and time information in milliseconds (L1D_time_msec) are arranged.
  • That is, in the L1B+L1D transmission method, time information in seconds (L1D_time_sec) and milliseconds (L1D_time_msec) is transmitted in the L1 detailed information, and time information in microseconds (L1B_time_usec) and nanoseconds (L1B_time_nsec) is transmitted in the L1 basic information.
  • Since the time obtained from this time information has nanosecond accuracy, even for a physical layer frame whose frame length (frame time) is not an integer number of milliseconds when the frame mode is the symbol aligned mode, an error (jitter) relative to the time indicated by the time information can be suppressed.
  • In addition, the L1B+L1D transmission method can be realized by using the current L1 detailed information structure as it is and only slightly modifying the current L1 basic information structure (L1B_frame_length and L1B_First_Sub_excess_samples are not used in the symbol aligned mode), thereby reducing the cost of modification.
  • The L1B+L1D transmission method is also efficient because it reuses much of the structure of the current L1 basic information and L1 detailed information.
  • Although the L1 basic information in FIG. 10 shows an example in which time information in microseconds (L1B_time_usec) and time information in nanoseconds (L1B_time_nsec) are included in the symbol aligned mode, only the time information in microseconds (L1B_time_usec) may be included; even in that case, time information with accuracy higher than millisecond units is transmitted.
  • FIG. 12 is a diagram illustrating an example of syntax of L1 basic information of the L1B transmission scheme a. However, in the syntax of FIG. 12, only characteristic portions are extracted and described.
  • In the L1 basic information of the L1B transmission scheme a, when L1B_time_info_flag is 1, a 32-bit L1B_time_sec and a 10-bit L1B_time_msec are arranged.
  • The notation that L1B_Reserved is 7 bits or 49 bits means that a 7-bit reserved area (Reserved) is secured when time information exists, and a 49-bit reserved area (Reserved) is secured when time information does not exist.
  • Further, when the frame mode is the symbol aligned mode, time information in microseconds (L1B_time_usec) and nanoseconds (L1B_time_nsec) is arranged in addition to the time information in seconds (L1B_time_sec) and milliseconds (L1B_time_msec).
  • FIG. 13 is a diagram illustrating an example of a syntax of L1 detailed information of the L1B transmission scheme a. However, in the syntax of FIG. 13, only characteristic portions are extracted and described.
  • In the L1 detailed information of the L1B transmission scheme a, the time information (L1D_time_info) is not arranged.
  • That is, in the L1B transmission scheme a, time information in seconds (L1B_time_sec), milliseconds (L1B_time_msec), microseconds (L1B_time_usec), and nanoseconds (L1B_time_nsec) is transmitted using only the L1 basic information.
  • Since the time obtained from this time information has nanosecond accuracy, even for a physical layer frame whose frame length is not an integer number of milliseconds when the frame mode is the symbol aligned mode, an error (jitter) relative to the time indicated by the time information can be suppressed.
  • In the L1B transmission scheme a, since the time information is transmitted only in the robust L1 basic information, all of the time information can be sufficiently protected. Further, since all of the time information is transmitted in the L1 basic information, it can be transmitted collectively on the L1 basic information side; thus, for example, the receiving device 20 can decode the time information included in the L1 basic information more quickly.
  • Although the L1 basic information in FIG. 12 shows an example in which time information in microseconds (L1B_time_usec) and time information in nanoseconds (L1B_time_nsec) are included in the symbol aligned mode, only the time information in microseconds (L1B_time_usec) may be included; even in that case, time information with accuracy higher than millisecond units is transmitted.
  • FIG. 14 is a diagram illustrating an example of syntax of L1 basic information of the L1B transmission scheme b. However, in the syntax of FIG. 14, only characteristic portions are extracted and described.
  • In the L1 basic information of the L1B transmission scheme b, the 1-bit L1B_time_info_flag is deleted, and a 32-bit L1B_time_sec and a 10-bit L1B_time_msec are always arranged.
  • When the frame mode is the symbol aligned mode, time information in microseconds (L1B_time_usec) and nanoseconds (L1B_time_nsec) is included in addition to the time information in seconds (L1B_time_sec) and milliseconds (L1B_time_msec).
  • FIG. 15 is a diagram illustrating an example of syntax of L1 detailed information of the L1B transmission scheme b. However, in the syntax of FIG. 15, only characteristic portions are extracted and described.
  • In the L1 detailed information of the L1B transmission scheme b, the time information is not arranged.
  • That is, in the L1B transmission scheme b, time information in seconds (L1B_time_sec), milliseconds (L1B_time_msec), microseconds (L1B_time_usec), and nanoseconds (L1B_time_nsec) is transmitted using only the L1 basic information.
  • Since the time obtained from this time information has nanosecond accuracy, even for a physical layer frame whose frame length is not an integer number of milliseconds when the frame mode is the symbol aligned mode, an error (jitter) relative to the time indicated by the time information can be suppressed.
  • In the L1B transmission scheme b, since the time information is transmitted only in the robust L1 basic information, all of the time information can be sufficiently protected. Further, since all of the time information is transmitted in the L1 basic information, it can be transmitted collectively on the L1 basic information side; thus, for example, the receiving device 20 can decode the time information included in the L1 basic information more quickly. Furthermore, in the L1B transmission scheme b, the time information can always be transmitted regardless of whether the frame mode is the time aligned mode or the symbol aligned mode.
  • Although the L1 basic information in FIG. 14 shows an example in which time information in microseconds (L1B_time_usec) and time information in nanoseconds (L1B_time_nsec) are included in the symbol aligned mode, only the time information in microseconds (L1B_time_usec) may be included; even in that case, time information with accuracy higher than millisecond units is transmitted.
  • FIG. 16 is a diagram illustrating an example of syntax of L1 basic information of the L1D transmission scheme. However, in the syntax of FIG. 16, only characteristic portions are extracted and described.
  • In the L1 basic information of the L1D transmission scheme, L1B_time_info_flag is secured as 2 bits instead of 1 bit.
  • Depending on the value of L1B_time_info_flag, time information in microseconds, or time information in microseconds and nanoseconds, is arranged in addition to the time information in seconds and milliseconds.
  • Because L1B_time_info_flag is 2 bits, L1B_Reserved is 48 bits.
  • FIG. 17 is a diagram illustrating an example of a syntax of L1 detailed information of the L1D transmission scheme. However, in the syntax of FIG. 17, only characteristic portions are extracted and described.
  • When L1B_time_info_flag is '01', time information in seconds (L1D_time_sec) and time information in milliseconds (L1D_time_msec) is arranged in the L1 detailed information.
  • When L1B_time_info_flag is '10', time information in microseconds (L1D_time_usec) is arranged in addition to the time information in seconds (L1D_time_sec) and milliseconds (L1D_time_msec).
  • When L1B_time_info_flag is '11', time information in microseconds (L1D_time_usec) and nanoseconds (L1D_time_nsec) is arranged in addition to the time information in seconds (L1D_time_sec) and milliseconds (L1D_time_msec).
  • That is, in the L1D transmission scheme, time information in seconds, milliseconds, microseconds, and nanoseconds is transmitted using only the L1 detailed information, according to the value of L1B_time_info_flag.
  • Since the time obtained from this time information has nanosecond accuracy, even for a physical layer frame whose frame length is not an integer number of milliseconds when the frame mode is the symbol aligned mode, an error (jitter) relative to the time indicated by the time information can be suppressed.
  • In the L1D transmission scheme, since the time information is transmitted only in the L1 detailed information, all of the time information can be protected at the same level. Further, since all of the time information is transmitted in the L1 detailed information, it can be transmitted collectively on the L1 detailed information side; thus, for example, the receiving device 20 can easily analyze the time information (which is simply structured) included in the L1 detailed information.
  • Although the L1 detailed information in FIG. 17 shows an example in which three types of time information are arranged according to the value of L1B_time_info_flag, only time information in seconds (L1D_time_sec), milliseconds (L1D_time_msec), and microseconds (L1D_time_usec) may be arranged; even in that case, time information with accuracy higher than millisecond units is transmitted.
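  • As a hedged illustration of the L1D transmission scheme, the sketch below parses L1D_time_info according to the 2-bit L1B_time_info_flag. The '01'/'10'/'11' cases follow the description above; treating '00' as meaning no time information, the 10-bit widths of the finer fields, and the simple MSB-first bit reader are assumptions.

```python
# Sketch of parsing L1D_time_info in the L1D transmission scheme.
# The 01/10/11 flag values follow the description above; treating '00' as
# "no time information" and the simple MSB-first bit reader are assumptions.
class BitReader:
    def __init__(self, data: bytes):
        self.bits = "".join(f"{b:08b}" for b in data)
        self.pos = 0

    def read(self, n: int) -> int:
        value = int(self.bits[self.pos:self.pos + n], 2)
        self.pos += n
        return value

def parse_l1d_time_info(reader: BitReader, l1b_time_info_flag: int) -> dict:
    if l1b_time_info_flag == 0b00:
        return {}                                   # assumed: no time information
    info = {"L1D_time_sec": reader.read(32),        # seconds
            "L1D_time_msec": reader.read(10)}       # milliseconds
    if l1b_time_info_flag >= 0b10:
        info["L1D_time_usec"] = reader.read(10)     # microseconds
    if l1b_time_info_flag == 0b11:
        info["L1D_time_nsec"] = reader.read(10)     # nanoseconds
    return info

if __name__ == "__main__":
    # 32+10+10+10 = 62 bits of example fields, left-aligned into 8 bytes (2 pad bits).
    payload = (1_700_000_000 << 30 | 123 << 20 | 456 << 10 | 789) << 2
    reader = BitReader(payload.to_bytes(8, "big"))
    print(parse_l1d_time_info(reader, 0b11))
```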
  • By adopting the L1B+L1D transmission method, the L1B transmission scheme a, the L1B transmission scheme b, or the L1D transmission method, time information such as the following is transmitted. That is, the time information is represented in binary coded decimal (BCD); for example, in the L1B+L1D transmission method, the time information in milliseconds (L1D_time_msec), microseconds (L1B_time_usec), and nanoseconds (L1B_time_nsec) can together represent a sub-second value such as "0.123456789" (123 ms, 456 us, 789 ns).
  • BCD binary coded decimal
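  • The BCD representation mentioned above can be sketched as follows; the three-digit (12-bit) layout per field is an assumption, since the text does not fix the BCD field widths.

```python
# Sketch of the BCD idea mentioned above: each decimal digit of the
# sub-second fields is carried as a 4-bit nibble. The 3-digit (12-bit)
# layout per field is an assumption; the text does not fix the widths.
def to_bcd(value: int, digits: int = 3) -> int:
    """Encode a decimal value (e.g. 123) as packed BCD (e.g. 0x123)."""
    bcd = 0
    for ch in f"{value:0{digits}d}":
        bcd = (bcd << 4) | int(ch)
    return bcd

def from_bcd(bcd: int, digits: int = 3) -> int:
    value = 0
    for shift in range(4 * (digits - 1), -1, -4):
        value = value * 10 + ((bcd >> shift) & 0xF)
    return value

if __name__ == "__main__":
    msec, usec, nsec = 123, 456, 789          # "0.123456789" seconds
    packed = (to_bcd(msec) << 24) | (to_bcd(usec) << 12) | to_bcd(nsec)
    print(hex(packed))                        # 0x123456789
    print(from_bcd(packed >> 24), from_bcd((packed >> 12) & 0xFFF), from_bcd(packed & 0xFFF))
```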
  • In this way, by transmitting the time information with accuracy higher than millisecond units, an error (jitter) between the time indicated by the time information and the frame time can be suppressed.
  • Further, even when the frame length of the physical layer frame is not an integer number of milliseconds and the timing of transmitting the physical layer frame is not on a millisecond boundary, the physical layer frame can be transmitted freely, which makes implementation easier.
  • FIG. 18 is a diagram showing a configuration example of the transmission device 10 on the transmission side and the reception device 20 on the reception side.
  • In FIG. 18, the transmission device 10 includes an input format processing unit (Input Format) 101, a bit interleaved coding and modulation (BICM) processing unit 102, a frame interleaving processing unit (Frame and Interleave) 103, and a waveform processing unit (Waveform) 104.
  • Input Format input format processing unit
  • BICM bit interleaved coding and modulation
  • Frame and Interleave frame interleaving processing unit
  • Waveform Waveform processing unit
  • the input format processing unit 101 performs necessary processing on an input stream to be input, and distributes packets obtained by storing the data to PLP (Physical Layer Pipe).
  • PLP Physical Layer Pipe
  • the data processed by the input format processing unit 101 is output to the BICM processing unit 102.
  • the BICM processing unit 102 performs processing such as error correction processing, bit interleaving, and quadrature modulation on data input from the input format processing unit 101.
  • the data processed by the BICM processing unit 102 is output to the frame interleaving processing unit 103.
  • the frame interleaving processing unit 103 performs processing such as interleaving in the time direction or frequency direction on the data input from the BICM processing unit 102.
  • the data processed by the frame interleaving processing unit 103 is output to the waveform processing unit 104.
  • the waveform processing unit 104 generates an orthogonal frequency division multiplexing (OFDM) signal based on the data input from the frame interleaving processing unit 103, and transmits the signal via the transmission path 30.
  • OFDM orthogonal frequency division multiplexing
  • In FIG. 18, the receiving device 20 includes a waveform processing unit (Waveform) 201, a frame de-interleaving processing unit (Frame and De-Interleave) 202, a De-BICM processing unit 203, and an output format processing unit (Output Format) 204.
  • Waveform waveform
  • Frame and De-Interleave frame de-interleave processing unit
  • De-BICM De-BICM processing unit
  • Output Format Output Format
  • the waveform processing unit 201 receives an OFDM signal transmitted from the transmission apparatus 10 via the transmission path 30, and performs signal processing on the OFDM signal.
  • the data processed by the waveform processing unit 201 is output to the frame deinterleave processing unit 202.
  • the detailed configuration of the waveform processing unit 201 will be described later with reference to FIG.
  • the frame / de-interleaving processing unit 202 performs processing such as de-interleaving in the frequency direction or the time direction on the data input from the waveform processing unit 201.
  • the data processed by the frame / de-interleaving processing unit 202 is output to the De-BICM processing unit 203.
  • the De-BICM processing unit 203 performs processing such as orthogonal demodulation, bit deinterleaving, and error correction processing on the data input from the frame deinterleaving processing unit 202.
  • the data processed by the De-BICM processing unit 203 is output to the output format processing unit 204.
  • the output format processing unit 204 performs necessary processing on the data input from the De-BICM processing unit 203, and outputs an output stream obtained thereby.
  • FIG. 19 is a diagram showing a configuration example of the waveform processing unit 104 of the transmission device 10 of FIG.
  • the waveform processing unit 104 includes a data processing unit (Data) 131, a preamble processing unit (Preamble) 132, and a bootstrap processing unit (Bootstrap) 133.
  • Data data processing unit
  • Preamble preamble processing unit
  • Bootstrap bootstrap processing unit
  • the data processing unit 131 performs processing on data included in the payload of the physical layer frame.
  • the preamble processing unit 132 performs processing related to signaling included in a preamble of the physical layer frame.
  • This signaling includes L1 basic information (L1-Basic) and L1 detailed information (L1-Detail).
  • In the case of the L1B+L1D transmission method, the preamble processing unit 132 generates L1 basic information (FIG. 10) including time information in microseconds and nanoseconds (L1B_time_usec, L1B_time_nsec) and L1 detailed information (FIG. 11) including time information in seconds and milliseconds (L1D_time_sec, L1D_time_msec), and includes them in the physical layer frame as signaling.
  • In the case of the L1B transmission scheme a or b, the preamble processing unit 132 generates L1 basic information (FIG. 12 or FIG. 14) including time information in seconds, milliseconds, microseconds, and nanoseconds (L1B_time_sec, L1B_time_msec, L1B_time_usec, L1B_time_nsec) and includes it in the physical layer frame as signaling. In this case, time information is not included in the L1 detailed information (FIG. 13 or FIG. 15).
  • In the case of the L1D transmission scheme, the preamble processing unit 132 generates L1 detailed information (FIG. 17) including time information in seconds, milliseconds, microseconds, and nanoseconds (L1D_time_sec, L1D_time_msec, L1D_time_usec, L1D_time_nsec) and includes it in the physical layer frame as signaling. In this case, time information is not included in the L1 basic information (FIG. 16).
  • the bootstrap processing unit 133 performs processing on data and signaling included in a bootstrap of a physical layer frame.
  • In addition, the waveform processing unit 104 includes processing units that perform pilot (PILOTS) symbol insertion, processing for MISO, IFFT (Inverse Fast Fourier Transform) processing, processing for PAPR, and processing relating to the guard interval, and those processes are performed.
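  • A minimal sketch of the last two steps named above, the IFFT and the guard interval insertion (as a cyclic prefix), is shown below; the 8K FFT, the 1024-sample guard interval, and the QPSK mapping are assumptions, and pilot insertion, MISO processing, and PAPR reduction are omitted.

```python
# Minimal sketch of the IFFT and guard-interval steps named above.
# The 8K FFT, 1024-sample GI, and QPSK mapping are assumptions; pilot
# insertion, MISO processing and PAPR reduction are omitted.
import numpy as np

FFT_SIZE = 8192
GI_SAMPLES = 1024

def ofdm_symbol(cells: np.ndarray) -> np.ndarray:
    """One time-domain OFDM symbol with a cyclic-prefix guard interval."""
    assert cells.shape == (FFT_SIZE,)
    time_domain = np.fft.ifft(cells) * np.sqrt(FFT_SIZE)   # frequency -> time
    guard = time_domain[-GI_SAMPLES:]                        # cyclic prefix
    return np.concatenate([guard, time_domain])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    bits = rng.integers(0, 2, size=(FFT_SIZE, 2))
    qpsk = ((1 - 2 * bits[:, 0]) + 1j * (1 - 2 * bits[:, 1])) / np.sqrt(2)
    symbol = ofdm_symbol(qpsk)
    print(symbol.shape)        # (9216,) = FFT_SIZE + GI_SAMPLES samples
```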
  • step S101 the input format processing unit 101 performs input data processing.
  • necessary processing is performed on an input stream to be input, and a packet storing data obtained thereby is distributed to one or more PLPs.
  • step S102 the BICM processing unit 102 performs encoding / modulation processing.
  • processing such as error correction processing, bit interleaving, orthogonal modulation and the like is performed.
  • step S103 the frame interleaving processing unit 103 performs frame interleaving processing.
  • processing such as interleaving in the time direction or frequency direction is performed.
  • step S104 the waveform processing unit 104 performs waveform processing.
  • In this waveform processing, an OFDM signal is generated and transmitted via the transmission path 30. In addition, data and signaling are processed by the data processing unit 131, the preamble processing unit 132, and the bootstrap processing unit 133.
  • In the case of the L1B+L1D transmission method, L1 basic information including time information in microseconds and nanoseconds (L1B_time_usec, L1B_time_nsec) and L1 detailed information including time information in seconds and milliseconds (L1D_time_sec, L1D_time_msec) are generated and included in the preamble of the physical layer frame.
  • In the case of the L1B transmission scheme a or b, L1 basic information (FIG. 12 or FIG. 14) including time information in seconds, milliseconds, microseconds, and nanoseconds (L1B_time_sec, L1B_time_msec, L1B_time_usec, L1B_time_nsec) is generated and included in the preamble of the physical layer frame.
  • In the case of the L1D transmission scheme, L1 detailed information (FIG. 17) including time information in seconds, milliseconds, microseconds, and nanoseconds (L1D_time_sec, L1D_time_msec, L1D_time_usec, L1D_time_nsec) is generated and included in the preamble of the physical layer frame.
  • As a result, even for a physical layer frame whose frame length (frame time) is not an integer number of milliseconds when the frame mode is the symbol aligned mode, an error (jitter) relative to the time indicated by the time information can be suppressed.
  • Further, on the transmission side, the time information can be included in the signaling and transmitted without being aware of whether the frame length (frame time) of the physical layer frame is an integer number of milliseconds (for example, without being aware of the frame number of the physical layer frame).
  • FIG. 21 is a diagram showing a configuration example of the waveform processing unit 201 of the receiving device 20 of FIG.
  • the waveform processing unit 201 includes a bootstrap processing unit (Bootstrap) 231, a preamble processing unit (Preamble) 232, and a data processing unit (Data) 233.
  • Bootstrap bootstrap processing unit
  • Preamble preamble processing unit
  • Data data processing unit
  • the bootstrap processing unit 231 performs processing on data and signaling included in a bootstrap of a physical layer frame.
  • the preamble processing unit 232 performs processing on signaling included in a preamble of the physical layer frame.
  • This signaling includes L1 basic information (L1-Basic) and L1 detailed information (L1-Detail).
  • In the case of the L1B+L1D transmission method, L1 basic information (FIG. 10) including time information in microseconds and nanoseconds (L1B_time_usec, L1B_time_nsec) and L1 detailed information (FIG. 11) including time information in seconds and milliseconds (L1D_time_sec, L1D_time_msec) are included in the preamble of the physical layer frame as signaling, so the preamble processing unit 232 processes that time information.
  • In the case of the L1B transmission scheme a or b, L1 basic information (FIG. 12 or FIG. 14) including time information in seconds, milliseconds, microseconds, and nanoseconds (L1B_time_sec, L1B_time_msec, L1B_time_usec, L1B_time_nsec) is included in the preamble of the physical layer frame as signaling, so the preamble processing unit 232 processes that time information. In this case, time information is not included in the L1 detailed information (FIG. 13 or FIG. 15).
  • In the case of the L1D transmission scheme, L1 detailed information (FIG. 17) including time information in seconds, milliseconds, microseconds, and nanoseconds (L1D_time_sec, L1D_time_msec, L1D_time_usec, L1D_time_nsec) is included in the preamble of the physical layer frame as signaling, so the preamble processing unit 232 processes that time information. In this case, time information is not included in the L1 basic information (FIG. 16).
  • the data processing unit 233 performs processing on data included in the payload of the physical layer frame.
  • In addition, the waveform processing unit 201 includes processing units that perform processing relating to the guard interval, processing for PAPR, FFT (Fast Fourier Transform) processing, processing for MISO, and processing on pilot symbols, and those processes are performed.
  • step S201 the waveform processing unit 201 performs waveform processing.
  • In this waveform processing, the OFDM signal transmitted from the transmission device 10 (FIG. 18) via the transmission path 30 is received, and signal processing is performed on the OFDM signal. In addition, data and signaling are processed by the bootstrap processing unit 231, the preamble processing unit 232, and the data processing unit 233.
  • In the case of the L1B+L1D transmission method, L1 basic information including time information in microseconds and nanoseconds (L1B_time_usec, L1B_time_nsec) and L1 detailed information including time information in seconds and milliseconds (L1D_time_sec, L1D_time_msec) are included in the preamble of the physical layer frame as signaling, so the preamble processing unit 232 processes that time information.
  • In the case of the L1B transmission scheme a or b, L1 basic information (FIG. 12 or FIG. 14) including time information in seconds, milliseconds, microseconds, and nanoseconds (L1B_time_sec, L1B_time_msec, L1B_time_usec, L1B_time_nsec) is included in the preamble of the physical layer frame as signaling, so the preamble processing unit 232 processes that time information.
  • In the case of the L1D transmission scheme, L1 detailed information including time information in seconds, milliseconds, microseconds, and nanoseconds (L1D_time_sec, L1D_time_msec, L1D_time_usec, L1D_time_nsec) is included in the preamble of the physical layer frame as signaling, so the preamble processing unit 232 processes that time information.
  • step S202 the frame deinterleave processing unit 202 performs frame deinterleave processing.
  • this frame de-interleaving process processes such as de-interleaving in the frequency direction or in the time direction are performed.
  • step S203 the De-BICM processing unit 203 performs demodulation and decoding processing.
  • demodulation and decoding process processes such as orthogonal demodulation, bit deinterleaving, and error correction are performed.
  • step S204 the output format processing unit 204 performs output data processing.
  • this output data processing necessary processing is performed on the input data, and the data is output as an output stream.
  • In this way, in the reception-side data processing, by adopting the L1B+L1D transmission method, the L1B transmission scheme a, the L1B transmission scheme b, or the L1D transmission method, time information in seconds, milliseconds, microseconds, and nanoseconds included in at least one of the L1 basic information and the L1 detailed information is acquired from the preamble of the physical layer frame and processed.
  • As a result, even for a physical layer frame whose frame length (frame time) is not an integer number of milliseconds when the frame mode is the symbol aligned mode, an error (jitter) relative to the time indicated by the time information can be suppressed.
  • Further, on the reception side, the time information included in the signaling can be processed without being aware of whether the frame length (frame time) of the physical layer frame is an integer number of milliseconds (for example, without being aware of the frame number of the physical layer frame).
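  • On the reception side, whichever of the signalled fields are present can be combined into a single absolute time, as in the sketch below; the dictionary-based interface and the zero default for absent fields are assumptions.

```python
# Sketch: combining whichever time fields the chosen transmission method
# carried into one absolute time in nanoseconds. The dictionary interface
# and the zero default for absent fields are assumptions.
def absolute_time_ns(fields: dict) -> int:
    sec = fields.get("time_sec", 0)
    msec = fields.get("time_msec", 0)
    usec = fields.get("time_usec", 0)
    nsec = fields.get("time_nsec", 0)
    return ((sec * 1_000 + msec) * 1_000 + usec) * 1_000 + nsec

if __name__ == "__main__":
    # L1B+L1D method: seconds/milliseconds from L1-Detail, finer fields from L1-Basic.
    signalled = {"time_sec": 1_700_000_000, "time_msec": 123,
                 "time_usec": 456, "time_nsec": 789}
    print(absolute_time_ns(signalled))   # 1700000000123456789
    # Millisecond-only signalling leaves up to ~1 ms of jitter relative to the frame time.
    print(absolute_time_ns({"time_sec": 1_700_000_000, "time_msec": 123}))
```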
  • In the above description, ATSC (in particular, ATSC 3.0), which is a system adopted in the United States and the like, was taken as an example, but the present technology may also be applied to other broadcasting systems such as DVB (Digital Video Broadcasting).
  • Further, ATSC 3.0, in which the IP transmission method is adopted, has been described as an example, but the present technology is not limited to the IP transmission method and may be applied to other methods such as the MPEG2-TS (Transport Stream) method.
  • BS Broadcasting Satellite
  • CS Communications Satellite
  • CATV cable television
  • In addition, time defined by PTP has been described as an example of the time information, but the time information is not limited to PTP; for example, information on the time defined by NTP (Network Time Protocol), information on the time defined in 3GPP (Third Generation Partnership Project), information on the time included in GPS (Global Positioning System) information, or information on a time in another uniquely determined format can be adopted.
  • NTP Network Time Protocol
  • 3GPP Third Generation Partnership Project
  • GPS Global Positioning System
  • The present technology can also be applied to a predetermined standard that assumes the use of a transmission path other than a broadcast network, that is, a communication line (communication network) such as the Internet or a telephone network.
  • In that case, a communication line such as the Internet or a telephone network may be used as the transmission path 30 of the transmission system 1 (FIG. 1), and the transmitting device 10 may be a server provided on the Internet. The receiving device 20 then processes data transmitted from the transmitting device 10 (server) via the transmission path 30 (communication line).
  • the names such as signaling described above are an example, and other names may be used. However, the difference between these names is a formal difference, and the substantive content such as target signaling is not different.
  • FIG. 23 is a diagram showing an example of a hardware configuration of a computer that executes the series of processes described above according to a program.
  • a central processing unit (CPU) 1001, a read only memory (ROM) 1002, and a random access memory (RAM) 1003 are mutually connected by a bus 1004.
  • An input / output interface 1005 is further connected to the bus 1004.
  • An input unit 1006, an output unit 1007, a recording unit 1008, a communication unit 1009, and a drive 1010 are connected to the input / output interface 1005.
  • the input unit 1006 includes a keyboard, a mouse, a microphone and the like.
  • the output unit 1007 includes a display, a speaker, and the like.
  • the recording unit 1008 includes a hard disk, a non-volatile memory, and the like.
  • the communication unit 1009 includes a network interface or the like.
  • the drive 1010 drives removable media 1011 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
  • In the computer 1000 configured as described above, the CPU 1001 loads a program stored in the ROM 1002 or the recording unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004 and executes it, whereby the series of processes described above is performed.
  • the program executed by the computer 1000 can be provided by being recorded on, for example, a removable medium 1011 as a package medium or the like. Also, the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
  • the program can be installed in the recording unit 1008 via the input / output interface 1005 by mounting the removable media 1011 in the drive 1010. Also, the program can be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the recording unit 1008. In addition, the program can be installed in advance in the ROM 1002 or the recording unit 1008.
  • the processing performed by the computer according to the program does not necessarily have to be performed chronologically in the order described as the flowchart. That is, the processing performed by the computer according to the program includes processing executed in parallel or separately (for example, parallel processing or processing by an object). Further, the program may be processed by one computer (processor) or may be distributed and processed by a plurality of computers.
  • the present technology can have the following configurations.
  • a generator configured to generate signaling including time information having a precision of time according to a frame length of the physical layer frame;
  • a processing unit configured to process the signaling to be included in a preamble of the physical layer frame.
  • the data processing device according to (1), wherein the signaling includes first information and second information to be read after the first information, and the time information is included in at least one of the first information and the second information.
  • the data processing device according to (2), wherein the frame length of the physical layer frame has a precision higher than millisecond units, and the time information has a precision higher than millisecond units.
  • the data processing device in which the second information includes time information in units of seconds and time information in units of milliseconds, and the first information includes at least one of time information in units of microseconds, or time information in units of microseconds and time information in units of nanoseconds.
  • the data processing device according to (3), wherein the first information includes time information in units of seconds, time information in units of milliseconds, and time information in units of microseconds, or time information in units of microseconds and time information in units of nanoseconds.
  • the data processing device according to (3), wherein the second information includes time information in units of seconds, time information in units of milliseconds, and time information in units of microseconds, or time information in units of microseconds and time information in units of nanoseconds.
  • the data processing device according to (2), wherein the frame length of the physical layer frame has a precision in millisecond units, and the time information has a precision higher than millisecond units.
  • the data processing device according to any one of (2) to (7), wherein the physical layer frame is a physical layer frame defined in Advanced Television Systems Committee (ATSC) 3.0, the first information is L1 basic information (L1-Basic) included in a preamble defined in ATSC 3.0, and the second information is L1 detail information (L1-Detail) included in a preamble defined in ATSC 3.0.
  • the data processing device according to any one of (3) to (8), wherein there are a first mode for adjusting the frame length of the physical layer frame in millisecond units and a second mode for not adjusting the frame length of the physical layer frame, and, when the second mode is set, the time information has a precision higher than millisecond units.
  • a data processing method of a data processing device, including: generating signaling including time information having a time precision corresponding to the frame length of a physical layer frame; and processing the signaling so that it is included in a preamble of the physical layer frame (see the illustrative sketch below).
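The following minimal sketch, written in Python, is offered only as a reading aid for the transmitting-side configurations above; it is not the claimed implementation. The field names (time_sec, time_msec, time_usec, time_nsec, time_info_precision), the dictionary and dataclass containers, and the millisecond_aligned flag are assumptions introduced for this sketch. It merely illustrates the idea that the precision of the time information carried in the preamble signaling follows the frame-length mode: millisecond precision when the frame length is adjusted to whole milliseconds (first mode), and finer precision when it is not (second mode).

    from dataclasses import dataclass

    NS_PER_SEC = 1_000_000_000


    @dataclass
    class PreambleSignaling:
        l1_basic: dict    # first information (read first)
        l1_detail: dict   # second information (read after the first information)


    def generate_signaling(tx_time_ns, millisecond_aligned):
        """Split an absolute time (nanoseconds since an epoch) into the time
        fields carried in the preamble signaling of one physical layer frame."""
        sec, rem_ns = divmod(tx_time_ns, NS_PER_SEC)
        msec, rem_ns = divmod(rem_ns, 1_000_000)
        usec, nsec = divmod(rem_ns, 1_000)

        detail = {"time_sec": sec, "time_msec": msec}
        if millisecond_aligned:
            # First mode: the frame length is adjusted in millisecond units,
            # so millisecond precision is sufficient.
            basic = {"time_info_precision": "ms"}
        else:
            # Second mode: the frame length is not adjusted, so time
            # information finer than milliseconds is carried as well.
            detail["time_usec"] = usec
            detail["time_nsec"] = nsec
            basic = {"time_info_precision": "ns"}

        return PreambleSignaling(l1_basic=basic, l1_detail=detail)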
  • a data processing device including a processing unit configured to process signaling that includes time information having a time precision corresponding to the frame length of a physical layer frame and that is included in a preamble of the physical layer frame.
  • the data processing device according to (11), wherein the signaling includes first information and second information to be read after the first information, and the time information is included in at least one of the first information and the second information.
  • the data processing device according to (12), wherein the frame length of the physical layer frame has a precision higher than millisecond units, and the time information has a precision higher than millisecond units.
  • the second information includes time information in units of seconds and time information in units of milliseconds.
  • the first information includes time information in units of seconds, time information in units of milliseconds, and time information in units of microseconds, or time information in units of microseconds and time information in units of nanoseconds.
  • the second information includes time information in units of seconds, time information in units of milliseconds, and time information in units of microseconds, or time information in units of microseconds and time information in units of nanoseconds.
  • the data processing device in which the frame length of the physical layer frame has a precision in millisecond units, and the time information has a precision higher than millisecond units.
  • the data processing device in which the physical layer frame is a physical layer frame defined in ATSC 3.0, the first information is L1 basic information (L1-Basic) included in a preamble defined in ATSC 3.0, and the second information is L1 detail information (L1-Detail) included in a preamble defined in ATSC 3.0.
  • the data processing device according to any one of (13) to (18), wherein there are a first mode for adjusting the frame length of the physical layer frame in millisecond units and a second mode for not adjusting the frame length of the physical layer frame, and, when the second mode is set, the time information has a precision higher than millisecond units.
  • a data processing method of a data processing device, including processing signaling that includes time information having a time precision corresponding to the frame length of a physical layer frame and that is included in a preamble of the physical layer frame (see the receiving-side sketch below).
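Continuing the sketch above, a receiving-side counterpart might recombine the time fields read from the preamble into a single time value as follows. Again, all names are illustrative assumptions for this sketch rather than terms from the specification.

    def recover_time_ns(signaling):
        """Recombine the time fields read from the preamble into one value
        expressed in nanoseconds since the epoch."""
        detail = signaling.l1_detail
        time_ns = detail["time_sec"] * 1_000_000_000 + detail["time_msec"] * 1_000_000
        if signaling.l1_basic["time_info_precision"] == "ns":
            # Sub-millisecond fields are present only in the second mode.
            time_ns += detail.get("time_usec", 0) * 1_000 + detail.get("time_nsec", 0)
        return time_ns


    # Example: a frame whose time is 1.234567891 s past the epoch, signalled
    # in the second mode (frame length not adjusted to millisecond units).
    sig = generate_signaling(1_234_567_891, millisecond_aligned=False)
    assert recover_time_ns(sig) == 1_234_567_891

The point the sketch tries to capture is that the coarse fields (seconds and milliseconds) suffice when frames are millisecond-aligned, while the finer fields restore full precision when they are not.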
  • Reference Signs List: 1 transmission system, 10 transmitting apparatus, 20 receiving apparatus, 30 transmission path, 101 input format processing unit, 102 BICM processing unit, 103 frame interleaving processing unit, 104 waveform processing unit, 131 data processing unit, 132 preamble processing unit, 133 bootstrap processing unit, 201 waveform processing unit, 202 frame deinterleaving processing unit, 203 de-BICM processing unit, 204 output format processing unit, 231 bootstrap processing unit, 232 preamble processing unit, 233 data processing unit, 1000 computer, 1001 CPU

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Synchronisation In Digital Transmission Systems (AREA)
  • Time-Division Multiplex Systems (AREA)
  • Mobile Radio Communication Systems (AREA)
PCT/JP2016/083466 2015-11-25 2016-11-11 データ処理装置、及び、データ処理方法 WO2017090456A1 (ja)

Priority Applications (8)

Application Number Priority Date Filing Date Title
CA3004847A CA3004847C (en) 2015-11-25 2016-11-11 Data processing apparatus and data processing method
US15/765,861 US11265141B2 (en) 2015-11-25 2016-11-11 Data processing device and data processing method
MX2018006176A MX2018006176A (es) 2015-11-25 2016-11-11 Aparato de procesamiento de datos y metodo de procesamiento de datos.
KR1020187013710A KR102379544B1 (ko) 2015-11-25 2016-11-11 데이터 처리 장치 및 데이터 처리 방법
JP2017552355A JPWO2017090456A1 (ja) 2015-11-25 2016-11-11 データ処理装置、及び、データ処理方法
CN201680067326.3A CN108352980B (zh) 2015-11-25 2016-11-11 数据处理装置与数据处理方法
EP16868404.1A EP3382927B1 (en) 2015-11-25 2016-11-11 Data-processing device and data-processing method
US17/648,628 US20220224506A1 (en) 2015-11-25 2022-01-21 Data processing device and data processing method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-229768 2015-11-25
JP2015229768 2015-11-25

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US15/765,861 A-371-Of-International US11265141B2 (en) 2015-11-25 2016-11-11 Data processing device and data processing method
US17/648,628 Continuation US20220224506A1 (en) 2015-11-25 2022-01-21 Data processing device and data processing method

Publications (1)

Publication Number Publication Date
WO2017090456A1 true WO2017090456A1 (ja) 2017-06-01

Family

ID=58763131

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/083466 WO2017090456A1 (ja) 2015-11-25 2016-11-11 データ処理装置、及び、データ処理方法

Country Status (9)

Country Link
US (2) US11265141B2 (ko)
EP (1) EP3382927B1 (ko)
JP (1) JPWO2017090456A1 (ko)
KR (1) KR102379544B1 (ko)
CN (1) CN108352980B (ko)
CA (1) CA3004847C (ko)
MX (1) MX2018006176A (ko)
TW (1) TWI756194B (ko)
WO (1) WO2017090456A1 (ko)

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3857057B2 (ja) 2001-02-05 2006-12-13 株式会社日立製作所 動画像データの記録再生方法および装置
JP2005094348A (ja) 2003-09-17 2005-04-07 Sony Corp 通信システムおよび方法、情報処理装置および方法、並びにプログラム
CN103634077B (zh) * 2009-02-12 2017-09-08 Lg电子株式会社 发送和接收广播信号的装置及发送和接收广播信号的方法
KR20100101998A (ko) 2009-03-10 2010-09-20 엘에스산전 주식회사 전력기기 간의 로컬시각 동기 방법 및 그 장치
KR20150016735A (ko) 2013-08-05 2015-02-13 한국전자통신연구원 중앙 집중 제어 평면을 이용한 네트워크 시각 동기 시스템
JP6326213B2 (ja) * 2013-10-04 2018-05-16 サターン ライセンシング エルエルシーSaturn Licensing LLC 受信装置、受信方法、送信装置、及び、送信方法
JP6505413B2 (ja) 2013-11-08 2019-04-24 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America 送信方法、受信方法、送信装置、及び受信装置
CN105794219B (zh) * 2013-12-03 2020-03-03 Lg 电子株式会社 发送广播信号的装置、接收广播信号的装置、发送广播信号的方法以及接收广播信号的方法
KR20150112721A (ko) 2014-03-27 2015-10-07 엘지전자 주식회사 이동 단말기 및 그 제어 방법
KR102465856B1 (ko) 2015-03-27 2022-11-11 한국전자통신연구원 코어 레이어의 피지컬 레이어 파이프들의 경계를 이용한 방송 신호 프레임 생성 장치 및 방송 신호 프레임 생성 방법
EP3314463A1 (en) * 2015-06-29 2018-05-02 British Telecommunications public limited company Real time index generation
WO2017029794A1 (en) * 2015-08-14 2017-02-23 Sharp Kabushiki Kaisha Systems and methods for communicating time representations

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007104085A (ja) * 2005-09-30 2007-04-19 Toshiba Corp 通信回線を用いるデジタル放送方法およびその装置
WO2014196336A1 (ja) * 2013-06-07 2014-12-11 ソニー株式会社 送信装置、伝送ストリームの送信方法および処理装置
WO2015068352A1 (ja) * 2013-11-08 2015-05-14 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ 送信方法、受信方法、送信装置、及び受信装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3382927A4 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108989841A (zh) * 2017-06-02 2018-12-11 上海数字电视国家工程研究中心有限公司 适用于高速运动接收的数据帧的设计方法和传输系统
CN108989842A (zh) * 2017-06-02 2018-12-11 上海数字电视国家工程研究中心有限公司 适用于高速运动接收的数据帧的设计方法和传输系统
CN108989843A (zh) * 2017-06-02 2018-12-11 上海数字电视国家工程研究中心有限公司 适用于高速运动接收的数据帧的设计方法和传输系统
CN108989843B (zh) * 2017-06-02 2020-10-16 上海数字电视国家工程研究中心有限公司 适用于高速运动接收的数据帧的设计方法和传输系统
CN108989841B (zh) * 2017-06-02 2020-12-18 上海数字电视国家工程研究中心有限公司 适用于高速运动接收的数据帧的设计方法和传输系统
CN108989842B (zh) * 2017-06-02 2020-12-22 上海数字电视国家工程研究中心有限公司 适用于高速运动接收的数据帧的设计方法和传输系统

Also Published As

Publication number Publication date
CA3004847A1 (en) 2017-06-01
US20220224506A1 (en) 2022-07-14
KR102379544B1 (ko) 2022-03-29
MX2018006176A (es) 2018-09-05
EP3382927A1 (en) 2018-10-03
JPWO2017090456A1 (ja) 2018-09-13
TW201724830A (zh) 2017-07-01
CA3004847C (en) 2023-01-24
US11265141B2 (en) 2022-03-01
CN108352980A (zh) 2018-07-31
EP3382927A4 (en) 2018-11-14
CN108352980B (zh) 2021-07-13
KR20180084786A (ko) 2018-07-25
EP3382927B1 (en) 2022-03-02
TWI756194B (zh) 2022-03-01
US20180287777A1 (en) 2018-10-04

Similar Documents

Publication Publication Date Title
US20230179363A1 (en) Transmitting apparatus and receiving apparatus and controlling method thereof
US20220224506A1 (en) Data processing device and data processing method
US10313492B2 (en) Layer one signaling for physical layer pipes (PLPS)
US10389777B2 (en) Transmitting apparatus, receiving apparatus, and control methods thereof
US11595169B2 (en) Transmitting apparatus and receiving apparatus and controlling method thereof
KR20160074532A (ko) 방송 신호 송신 장치, 방송 신호 수신 장치, 방송 신호 송신 방법, 및 방송 신호 수신 방법
JP2017135557A (ja) データ処理装置、及び、データ処理方法
CA3101301C (en) Transmitting apparatus, receiving apparatus, and control methods thereof
JP6838003B2 (ja) データ処理装置、及び、データ処理方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16868404

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15765861

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2017552355

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 3004847

Country of ref document: CA

ENP Entry into the national phase

Ref document number: 20187013710

Country of ref document: KR

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: MX/A/2018/006176

Country of ref document: MX

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2016868404

Country of ref document: EP