WO2015132957A1 - Video device and video processing method - Google Patents


Info

Publication number
WO2015132957A1
WO2015132957A1 (PCT/JP2014/055980)
Authority
WO
WIPO (PCT)
Prior art keywords
video signal
video
display
scanning
frame
Prior art date
Application number
PCT/JP2014/055980
Other languages
French (fr)
Japanese (ja)
Inventor
甲 展明
浩朗 伊藤
稲田 圭介
健 木佐貫
吉澤 和彦
益岡 信夫
Original Assignee
Hitachi Maxell, Ltd. (日立マクセル株式会社)
Priority date
Filing date
Publication date
Application filed by Hitachi Maxell, Ltd. (日立マクセル株式会社)
Priority to PCT/JP2014/055980
Publication of WO2015132957A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/01: Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N 7/0127: Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level, by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30: Image reproducers
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 3/00: Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G 3/20: Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G 3/34: Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, by control of light from an independent source
    • G09G 3/3406: Control of illumination source
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/003: Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
    • G09G 5/005: Adapting incoming signals to the display format of the display terminal
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/66: Transforming electric information into light information
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/01: Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2310/00: Command of the display device
    • G09G 2310/02: Addressing, scanning or driving the display screen or processing steps related thereto
    • G09G 2310/0202: Addressing of scan or signal lines
    • G09G 2310/0205: Simultaneous scanning of several lines in flat panels
    • G09G 2310/021: Double addressing, i.e. scanning two or more lines, e.g. lines 2 and 3; 4 and 5, at a time in a first field, followed by scanning two or more lines in another combination, e.g. lines 1 and 2; 3 and 4, in a second field
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2340/00: Aspects of display data processing
    • G09G 2340/04: Changes in size, position or resolution of an image
    • G09G 2340/0407: Resolution change, inclusive of the use of different resolutions for different screen areas
    • G09G 2340/0435: Change or adaptation of the frame rate of the video stream
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2370/00: Aspects of data communication
    • G09G 2370/10: Use of a protocol of communication by packets in interfaces along the display data pipeline
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2370/00: Aspects of data communication
    • G09G 2370/12: Use of DVI or HDMI protocol in interfaces along the display data pipeline

Definitions

  • The present invention relates to a video device and a video processing method for transmitting a video signal from a camera or the like and performing video processing so that the video signal is displayed with good timing on a display device.
  • Patent Document 1 discloses performing motion determination using "a video signal having a frame rate higher than the frame rate of a video signal to be encoded and transmitted" and transmitting the motion-determination result separately, which enables higher-accuracy FRC (frame rate conversion) processing. Patent Document 2 discloses superimposing and transmitting reference control information, including a motion vector, in a blanking period of the video signal. Specifically, when "the playback device 1 and the display device 5" input and output signals in accordance with the HDMI (registered trademark) standard, it describes that "the encoding unit 3 superimposes the reference control information as a kind of AUX data in the data island period of FIG. 5, and the decoding unit 6 separates the reference control information from the data island period".
  • Patent Document 1: JP 2013-174798 A (see paragraph 0049)
  • Patent Document 2: JP 2007-274679 A (see paragraph 0039)
  • The delay time may vary within the screen, and image distortion may occur.
  • Such delay and its variation within the screen become obstacles when performing precision work while watching fast-moving camera images, and must be reduced as much as possible.
  • The size of the delay time includes portions that depend on the scanning method, the video-signal transmission method, and the frame-interpolation method of the camera and of the display device, but these were not particularly taken into consideration in the prior art, including the above patent documents.
  • An object of the present invention is to reduce the size of the delay time, and its variation within the screen, from the imaging timing of the camera to the display timing of the display device.
  • To achieve this object, the configuration described in the claims is adopted.
  • As one example, a video device that displays the output video of a video output device comprises: an input unit that receives, from the video output device, a first video signal, interpolation data for interpolating frames of the first video signal, and scanning information used when the first video signal was generated; a display signal processing unit that generates a second video signal by interpolating frames of the first video signal using the first video signal and the interpolation data received by the input unit; a display element that displays the first video signal and the second video signal generated by the display signal processing unit; and a control unit that controls the display signal processing unit and the display element. The control unit determines the display scanning method of the display element based on the scanning information received by the input unit, and the display signal processing unit generates the second video signal based on the display scanning method determined by the control unit.
  • As another example, a video output device comprises: a video signal processing unit that generates, from a first video signal output by a video source, a second video signal having a frame rate lower than that of the first video signal, together with interpolation data for interpolating frames of the second video signal; an output unit that transmits the second video signal and the interpolation data to the display device; and a control unit that controls the video signal processing unit. The control unit acquires the display scanning information of the display device and information on the second video signal and interpolation data that the display device can process, and determines the format of the second video signal and interpolation data output from the video signal processing unit based on that information; the video signal processing unit then generates the second video signal and the interpolation data based on the format the control unit has determined.
  • According to the present invention, it is possible to realize a system in which the size and variation of the delay time from the imaging timing of the camera to the display timing of the display device are reduced.
  • A diagram illustrating various scanning methods of an image sensor and a display element.
  • A diagram explaining the imaging, transmission, and display timing of a video signal (when the imaging scanning method is single scan).
  • A diagram explaining the imaging, transmission, and display timing of a video signal (when the imaging scanning method is dual scan).
  • A diagram illustrating the overall configuration of an imaging/display system according to Embodiment 2, and a diagram showing the internal structure of the encoder 4.
  • Diagrams explaining the imaging, transmission, and display timing of a 3D video signal according to Embodiment 3 (for single-scan and for dual-scan imaging), and a table summarizing the delay times of each 3D display scanning method.
  • A diagram illustrating a configuration example of a display device according to Embodiment 4, and a diagram showing the display timing when the scanning time of the display element is shortened.
  • A diagram illustrating a configuration example of a display device according to Embodiment 5.
  • FIG. 1 is a block diagram illustrating the configuration of the imaging / display system according to the first embodiment.
  • In FIG. 1, a consumer imaging device (camera) 1 and a display device (display) 2 are connected by HDMI (High-Definition Multimedia Interface, a registered trademark of HDMI, LLC) or the like.
  • The internal configuration of the imaging apparatus 1 includes an imaging device 11, memories 12 and 16, a superimposing unit 13, an 8B/10B encoder 14, an output unit 15, a compression unit 17, an error-correction coding unit (ECC) 18, and a control unit 19.
  • The internal configuration of the display device 2 includes an input unit 21, a 10B/8B decoder 22, a separation unit 23, a display signal processing unit 31, a display element 26, a control unit 27, and an error-correction decoding unit 28.
  • The display signal processing unit 31 includes memories 24 and 30, a scanning processing unit 25, and a decompression unit 29.
  • In this system, the imaging apparatus 1 captures video at a frame rate of 120 Hz; interpolation data generated by compressing each even frame is superimposed on an odd frame and transmitted to the display device 2. The display device 2 restores each even frame from the odd frames and the interpolation data, returns the frame rate to 120 Hz together with the odd frames, and displays the video. The operation of each part is described later.
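As a rough illustration of this flow, the following sketch (not the patent's implementation; frames are modeled as flat lists of pixel values, and a simple per-pixel difference stands in for the compressed interpolation data) splits a 120 Hz sequence on the camera side and restores it on the display side:

```python
# Hedged sketch of the odd/even frame pipeline: odd frames travel
# uncompressed, each even frame is reduced to interpolation data and
# restored on the display side. Frames are toy lists of pixel values.

def camera_side(frames_120hz):
    """Split a 120 Hz sequence into 60 Hz odd frames + interpolation data."""
    odd_frames, interp_data = [], []
    for i in range(0, len(frames_120hz), 2):
        odd, even = frames_120hz[i], frames_120hz[i + 1]
        odd_frames.append(odd)
        # Interpolation data: per-pixel difference from the preceding odd
        # frame (a stand-in for the P-frame-like data mentioned later).
        interp_data.append([e - o for e, o in zip(even, odd)])
    return odd_frames, interp_data

def display_side(odd_frames, interp_data):
    """Restore the 120 Hz sequence from odd frames and interpolation data."""
    restored = []
    for odd, diff in zip(odd_frames, interp_data):
        restored.append(odd)                                 # odd frame as-is
        restored.append([o + d for o, d in zip(odd, diff)])  # even frame
    return restored

frames = [[10, 20], [12, 21], [14, 22], [16, 23]]  # toy 2-pixel frames
odds, data = camera_side(frames)
assert display_side(odds, data) == frames  # lossless round trip here
```

A real system would compress the difference data and protect it with error-correction coding, as the description below explains.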
  • FIG. 2 is a diagram for explaining various scanning methods of the image sensor and the display element.
  • (a) is the single scan method: pixels are scanned horizontally from left to right as indicated by arrow 111, and vertically from top to bottom as indicated by arrow 112. This is the common scanning method adopted by many image sensors and display elements.
  • (b), (c), and (d) are dual scan methods, which divide the screen into an upper screen and a lower screen and scan both simultaneously.
  • As the resolution increases, the number of pixels grows, the selection time for each row and pixel shortens, and high-speed driving of the image sensor and display element becomes difficult. Dividing the screen into upper and lower halves and scanning them simultaneously halves the scanning time per row compared with the single scan method, supporting high-speed driving. That is, as indicated by arrows 121, 131, and 141, horizontal scanning proceeds from left to right while vertical scanning is performed on the upper and lower screens simultaneously.
  • (b) scans vertically from top to bottom as indicated by arrows 122 and 123, and is called same-direction scanning of the upper and lower screens.
  • (c) scans vertically from the center toward the top and bottom as indicated by arrows 132 and 133, and is called reverse scanning of the upper and lower screens (center start).
  • (d) scans vertically from the top and bottom toward the center as indicated by arrows 142 and 143, and is called reverse scanning of the upper and lower screens (top/bottom start).
  • To keep moving-image display smooth, the timing of imaging or displaying the boundary between the upper and lower screens, that is, the pixels immediately above and below the center of the screen, is matched. For this reason, in the same-direction scanning of (b), the scanning of the lower screen may be delayed by one frame relative to the upper screen.
  • In the reverse scanning of (c) and (d), the scanning timings of the upper and lower screens are substantially the same.
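The four scanning orders can be modeled as row schedules. The sketch below is an assumed simplification (even row count, one row per time step per half-screen, and the one-frame lower-screen delay of (b) is not modeled), not the patent's circuitry:

```python
# Row-scan orders for the four scanning methods of FIG. 2 (toy model).

def scan_steps(rows, method):
    """Return per-time-step lists of row indices scanned together."""
    half = rows // 2
    if method == "single":                    # (a): one row at a time
        return [[r] for r in range(rows)]
    if method == "dual_same":                 # (b): both halves, top-to-bottom
        return [[r, half + r] for r in range(half)]
    if method == "dual_center_start":         # (c): outward from the center
        return [[half - 1 - r, half + r] for r in range(half)]
    if method == "dual_edge_start":           # (d): inward from the edges
        return [[r, rows - 1 - r] for r in range(half)]
    raise ValueError(method)

assert scan_steps(4, "single") == [[0], [1], [2], [3]]
assert scan_steps(4, "dual_same") == [[0, 2], [1, 3]]
assert scan_steps(4, "dual_center_start") == [[1, 2], [0, 3]]
assert scan_steps(4, "dual_edge_start") == [[0, 3], [1, 2]]
```

The dual variants finish the whole screen in half the steps of the single scan, which is the high-speed-driving advantage described above.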
  • FIG. 3 is a diagram explaining an example of the imaging, transmission, and display timing of a video signal in the imaging/display system of FIG. 1.
  • Here, the imaging device 11 performs single scanning at a frame rate of 120 Hz (FIG. 2(a)).
  • The horizontal axis indicates time, and the vertical axis indicates the scanning position in the vertical direction.
  • A thick arrow (for example, reference numeral 201) indicates the scanning timing of an odd frame, and a thin arrow (for example, 202) that of an even frame.
  • A solid-line arrow (for example, 211) indicates an uncompressed video signal, and a broken-line arrow (for example, 212) indicates interpolation data.
  • T denotes the period of one frame at a frame rate of 120 Hz.
  • (a) shows the imaging timing: the imaging device 11 performs single scanning at a frame rate of 120 Hz (period T), and the scanning timings of the first to sixth frames are indicated by arrows 201 to 206.
  • (b) shows the transmission timing: the odd frames 201, 203, and 205 output from the imaging device 11 are converted into 60 Hz single-scan signals 211, 213, and 215 (time width 2T) by extending the time axis. In addition, to restore the even frames 202, 204, and 206, 120 Hz interpolation data 212, 214, and 216 are generated, superimposed on the 60 Hz single-scan signal, and output. For example, when transmitting over HDMI, the interpolation data may be superimposed in an InfoFrame packet format in the horizontal blanking period of the 60 Hz single-scan signal.
  • (c) to (f) show the display timings for various scanning methods of the display element 26; each displays the image of each frame with a period T.
  • (c) is the case of 120 Hz single scan, with the display timings of the first to fifth frames indicated by 221 to 225.
  • (d), (e), and (f) are 120 Hz dual scans: (d) is same-direction scanning of the upper and lower screens, (e) is reverse scanning (center start), and (f) is reverse scanning (top/bottom start).
  • The display timings at the top of the screen are indicated by arrows 231 to 236, 251 to 255, and 271 to 274, and those at the bottom of the screen by arrows 241 to 245, 261 to 265, and 281 to 284, respectively.
  • The imaging device 11 captures video with a single scan at a frame rate of 120 Hz and outputs an uncompressed video signal. Odd frames are stored in the memory 12 and even frames in the memory 16.
  • The odd-frame video signals accumulated in the memory 12 (201, 203, and 205 in FIG. 3(a)) are read out over about twice the writing time (about 2T), generating the 60 Hz video signals for transmission (period 2T; 211, 213, and 215 in FIG. 3(b)). At the end of transmission of each frame, a delay 219 corresponding to the 120 Hz vertical effective display period (about 1T) occurs relative to the end of scanning of the corresponding imaging signal (frame 201).
  • The compression unit 17 uses the odd-frame and even-frame video signals stored in the memories 12 and 16 to generate interpolation data (data for 120 Hz conversion) for restoring the even-frame video from the odd frames.
  • The interpolation data may be any of: difference information from the immediately preceding odd frame (equivalent to a P frame), a mixing ratio or difference information between the preceding and succeeding odd frames (equivalent to a B frame), motion-vector-related data, and the like.
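As a toy illustration of the first two options (the motion-vector variant is omitted), with 1-D "frames" and an assumed least-squares fit for the mixing ratio, neither of which comes from the patent:

```python
# Two of the interpolation-data kinds listed above, for one even frame
# between two odd frames (illustrative only).

def p_like(prev_odd, even):
    # Difference from the immediately preceding odd frame (P-frame-like).
    return [e - p for e, p in zip(even, prev_odd)]

def b_like_ratio(prev_odd, next_odd, even):
    # A single mixing ratio a such that even ~= a*prev + (1-a)*next
    # (B-frame-like); least-squares fit over the pixels is an assumption.
    num = sum((e - n) * (p - n) for e, p, n in zip(even, prev_odd, next_odd))
    den = sum((p - n) ** 2 for p, n in zip(prev_odd, next_odd))
    return num / den if den else 0.5

prev_odd, next_odd = [10, 20, 30], [14, 24, 34]
even = [12, 22, 32]          # lies halfway between the two odd frames
assert p_like(prev_odd, even) == [2, 2, 2]
assert abs(b_like_ratio(prev_odd, next_odd, even) - 0.5) < 1e-9
```

The difference data is typically much smaller than a full frame, which is what lets it ride in blanking periods of the 60 Hz signal.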
  • The error-correction encoding unit 18 performs error-correction encoding on the interpolation data generated by the compression unit 17 to guard against errors while the video signal is transmitted to the display device 2. This is because a transmission error in the compressed interpolation data (even frames) has a larger effect than an error in the uncompressed video signal (odd frames).
  • The coding method may be selected by an instruction from the control unit 19, which monitors the error rate of the transmission path. When the error rate is large, the quality of the displayed video can be ensured by increasing the compression rate of the compression unit 17 to reduce the interpolation data, and then selecting an error-correction coding method with high correction strength.
  • The superimposing unit 13 superimposes the error-correction-encoded even-frame interpolation data in the blanking period of the odd-frame uncompressed video signal.
  • The superimposed video signal is encoded from 8-bit to 10-bit data and serialized by the 8B/10B encoder 14, becoming a signal compliant with an uncompressed-video interface such as HDMI, and is then transmitted from the output unit 15 to the display device 2.
  • The start timing of the even-frame interpolation data 212, 214, and 216 is delayed by a time 218 from the start timing of the corresponding even frames 202, 204, and 206 of the imaging signal in FIG. 3(a).
  • The transmission time of the interpolation data 212, 214, and 216 could be made the same as the imaging scanning time (1T) of the even frames 202, 204, and 206, but the interpolation data would then be concentrated in time and the transmission band might be insufficient. Therefore, in this example, like the 60 Hz uncompressed transmission signals 211, 213, and 215, the transmission time of the interpolation data is doubled and the data are transmitted with a period of 2T.
  • In this example, transmission of the interpolation data continues during the vertical blanking period of the transmitted 60 Hz uncompressed video signal, but it may instead be paused there.
  • As a result, the delay time 218 becomes one frame (1T) or more.
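The timing arithmetic above can be sketched as follows, in units of the 120 Hz frame period T. Line-by-line streaming from the start of imaging is an assumption of this simplified model, not a statement from the patent:

```python
# Simplified model of the FIG. 3(b) transmission schedule.

T = 1.0  # one 120 Hz frame period

def odd_frame_tx(k):
    """Transmission interval of odd frame k (k = 1, 3, 5, ...), stretched 2x."""
    start = (k - 1) * T          # streaming can begin as imaging begins
    return start, start + 2 * T  # time width 2T

def interp_data_tx(k):
    """Transmission interval of interpolation data for even frame k (2, 4, ...)."""
    start = k * T                # begins only after the even frame is captured
    return start, start + 2 * T  # also stretched to a 2T cycle

# Odd frame 1 is imaged over [0, T]; its transmission ends at 2T,
# i.e. about 1T after the end of imaging (delay 219 in the text).
tx_start, tx_end = odd_frame_tx(1)
assert tx_end - T == 1 * T

# Even frame 2 is imaged starting at T; its interpolation data starts at 2T,
# so the start delay (218 in the text) is at least one frame (1T).
d_start, _ = interp_data_tx(2)
assert d_start - T >= 1 * T
```

Under this model, stretching both streams to a 2T cycle spreads the interpolation data out in time, matching the bandwidth argument above.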
  • The control unit 19 is responsible for overall control of the imaging apparatus 1; it monitors the operating status of each component and gives the necessary instructions.
  • The control unit 19 reads the EDID (Extended Display Identification Data), which describes information such as the display capability of the display device 2 at the destination of the output unit 15, and determines whether the display device can process the interpolation data. The memory 16, the compression unit 17, and the error-correction coding unit 18 are operated only when it can, which saves power.
  • Based on information such as transmission capability, error rate, power consumption, remaining battery level, the image-quality level requested by the user, and the display delay time, the control unit 19 may determine whether to select the transmission method of this embodiment, "60 Hz uncompressed video signal + interpolation data", or one of the other transmission methods, "120 Hz uncompressed video signal" or "60 Hz uncompressed video signal (without interpolation data)".
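A hedged sketch of this decision logic follows. The capability names, flags, and priority order are invented for illustration; only the three transmission-method names come from the text:

```python
# Hypothetical transmission-mode selection by the control unit 19.

def choose_transmission(sink_caps, link_ok, battery_low, wants_low_delay):
    """Pick one of the three transmission methods named in the text."""
    if "interpolation_data" in sink_caps and link_ok and not battery_low:
        return "60Hz uncompressed + interpolation data"   # this embodiment
    if "120Hz" in sink_caps and link_ok and wants_low_delay:
        return "120Hz uncompressed"
    return "60Hz uncompressed (no interpolation data)"    # fallback

# EDID-style capability check: the compression path is exercised only when
# the display reports it can process interpolation data (power saving).
caps = {"120Hz", "interpolation_data"}
assert choose_transmission(caps, link_ok=True, battery_low=False,
                           wants_low_delay=True) \
    == "60Hz uncompressed + interpolation data"
assert choose_transmission({"120Hz"}, True, False, True) == "120Hz uncompressed"
assert choose_transmission(set(), True, False, False) \
    == "60Hz uncompressed (no interpolation data)"
```

In a real device the thresholds (error rate, battery level) would come from measurement, and the capability set from parsing the sink's EDID.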
  • The input unit 21 of the display device 2 receives the transmission signal of FIG. 3(b) (the 60 Hz uncompressed video signal and the 120 Hz conversion data) sent from the imaging apparatus 1.
  • The 10B/8B decoder 22 restores the received signal to a video signal.
  • The separation unit 23 separates the signal: the 60 Hz uncompressed video signal (odd frames) is sent to the memory 24, and the 120 Hz conversion data (even frames) and other metadata are sent to the error-correction decoding unit (ECC) 28.
  • After error-correction processing, the error-correction decoding unit 28 outputs reliable 120 Hz interpolation data to the decompression unit 29, and sends the parameter information included in the 120 Hz conversion data that is necessary for restoring the 120 Hz video signal to the control unit 27.
  • An InfoFrame extraction circuit as defined by HDMI can be used for the separation unit 23 and the error-correction decoding unit 28.
  • The decompression unit 29 restores the even frames of the 120 Hz video signal from the 60 Hz uncompressed video signal obtained from the memory 24 (that is, the odd frames of the 120 Hz video signal) and the 120 Hz-conversion interpolation data obtained from the error-correction decoding unit 28.
  • The scanning processing unit 25 time-axis-converts the odd-frame video signal (period 2T) in the memory 24 and the even-frame video signal (period 2T) in the memory 30 so that each is scanned in half the time (period T), combines them, and outputs the result to the display element 26.
  • The scanning processing unit 25 generates the video signal in accordance with the scanning method of the display element 26, and the display element 26 displays the image of each frame at the scanning timings shown in FIGS. 3(c) to 3(f).
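A minimal model of this time-axis conversion, interleaving odd and restored even frames back into a 120 Hz schedule (toy code under the stated 2T-to-T assumption, not the patent's implementation):

```python
# Time-axis conversion: frames arrive with a 2T period per odd/even pair
# and are read out at the display rate (period T).

def time_axis_convert(odd_frames, even_frames, T=1.0):
    """Return (display_time, frame) pairs at the 120 Hz display rate."""
    out = []
    t = 0.0
    for odd, even in zip(odd_frames, even_frames):
        out.append((t, odd))         # odd frame displayed over T
        out.append((t + T, even))    # restored even frame over the next T
        t += 2 * T                   # one 2T transmission period consumed
    return out

sched = time_axis_convert(["f1", "f3"], ["f2", "f4"])
assert sched == [(0.0, "f1"), (1.0, "f2"), (2.0, "f3"), (3.0, "f4")]
```

In hardware this corresponds to writing the memories 24 and 30 at the transmission rate and reading them at twice that rate.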
  • FIG. 3C shows a case where the display element 26 is a 120 Hz single scan system (FIG. 2A).
  • The scanning start timing of each displayed frame 221 to 225 of the 120 Hz video signal is delayed by one frame period (1T) 228 from the scanning start timing of the corresponding imaging frames 201 to 205 (FIG. 3(a)). This is because the even frame 222 can be restored only after the interpolation data 212 has been input, so its display must be synchronized with the end of transmission 217 of the interpolation data 212.
  • In addition to restoring the 120 Hz uncompressed video signal, the scanning processing unit 25 may also perform frame rate conversion to 240 Hz using the motion vectors included in the 120 Hz conversion data obtained from the error-correction decoding unit 28.
  • FIG. 3D shows a case where the display element 26 is a 120 Hz dual scan system (same direction scanning, FIG. 2B).
  • Here, the screen is divided into upper and lower halves that are simultaneously scanned vertically in the same direction, from top to bottom. To prevent a time lag in the video at the center of the screen, the lower screen is displayed with a delay of one frame, so the entire screen is vertically scanned from top to bottom over a period of two frames (2T). For example, the transmitted first frame 211 is divided into a portion 231 displayed over time 1T on the upper screen and a portion 241 displayed over the next time 1T on the lower screen.
  • In this dual-scan method there is no time-axis conversion between the transmission signal and the display signal, so the time-axis conversion processing by the scanning processing unit 25 is unnecessary. That is, since the display video does not run short mid-frame because of time-axis conversion, the display start timing of the upper screen can be made almost the same as the reception start timing of each frame. If restoration in the decompression unit 29 takes little time, the display start point of the second frame 232 can substantially coincide with the reception start point of the interpolation data 212 for restoring the second frame.
  • In this way, the display timing at each position in the screen follows the reception or restoration timing of the video signal at that position, that is, essentially the transmission timing.
  • As a result, the joint between the upper and lower screens (the center of the screen) is delayed by about 1/2 frame (0.5T) from the imaging timing, and the display end timing of the lower screen is delayed by about one frame (1T) from the imaging end timing. For example, the display end point of the first frame 241 is delayed by a time 248 from the imaging end point of the first frame 201; this delay 248 is one frame (1T) plus the processing times of the compression unit 17 and the decompression unit 29 and the transmission delay.
  • The second frame 242 of the lower screen is delayed by a 120 Hz horizontal blanking period from the transmitted interpolation data 212, and is consequently delayed by a time 238 from the imaging start timing of the third frame 203; this delay 238 is substantially equal to the delay 218 of the 120 Hz conversion data.
  • FIGS. 3(e) and 3(f) show cases where the display element 26 is a dual-scan system with reverse scanning: in (e) the screen is scanned simultaneously from the center toward the upper and lower ends (FIG. 2(c)), and in (f) from the upper and lower ends toward the center (FIG. 2(d)). Time-axis conversion is unnecessary for these scans as well; they can be realized by changing the reading order of the memories 24 and 30 in the display device 2, or the scanning order in the scanning processing unit 25.
  • In (e), the delay from imaging to display is approximately two frames (2T) at the upper end of the screen (258) and approximately one frame at the lower end (269); at the center of the screen an intermediate delay of about 1.5 frames occurs.
  • In (f), the delay is likewise approximately two frames at the top of the screen and approximately one frame at the bottom (289); at the center of the screen 0.5 frame is added to these, giving a delay of about 2.5 frames.
  • Based on FIG. 3, the delay times of the scanning methods of the display element 26 can be compared as follows.
  • In the 120 Hz single scan of (c), the entire screen is delayed by approximately one frame (1T).
  • The delay time is smallest for the same-direction dual scan of (d): approximately one frame at the bottom of the screen, almost none at the top (only the time required to compress and decompress the even frames), and approximately 0.5 frame at the center, which is advantageous.
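Collecting the per-position delays stated above for FIG. 3 (in frames of period T) makes the comparison explicit. The numbers are as read from the text; the averaging is simply one assumed way to rank the methods:

```python
# Delays (top, center, bottom of screen) in frames of period T, as stated
# in the discussion of FIG. 3 for single-scan imaging.
delays = {
    "single (c)":            (1.0, 1.0, 1.0),
    "dual same-dir (d)":     (0.0, 0.5, 1.0),  # top delayed only by codec time
    "dual center-start (e)": (2.0, 1.5, 1.0),
    "dual edge-start (f)":   (2.0, 2.5, 1.0),
}

# Average delay over the three positions; (d) comes out smallest.
avg = {method: sum(v) / len(v) for method, v in delays.items()}
assert min(avg, key=avg.get) == "dual same-dir (d)"
```

This matches the conclusion above that same-direction dual scanning is the most advantageous for a single-scan camera.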
  • FIG. 4 is a diagram illustrating another example of image signal imaging, transmission, and display timing.
  • Here, the imaging device 11 performs dual scanning at a frame rate of 120 Hz (FIG. 2(b)).
  • (a) shows the imaging timing: the scanning timings of the first to sixth frames are indicated by arrows 301 to 306, and the scanning time for one whole screen is about 2T.
  • (b) shows the transmission timing: the odd frames 301, 303, and 305 output from the imaging device 11 are output as frames 311, 313, and 315 at the same timing.
  • The even frames 302, 304, and 306 are converted into 120 Hz interpolation data 312, 314, and 316, superimposed on the odd frames, and output.
  • (c) to (f) show the display timings for various scanning methods of the display element 26; each displays the image of each frame with a period T.
  • (c) is the case of 120 Hz single scan, with the display timings of the first to fifth frames indicated by 321 to 325.
  • (d), (e), and (f) are 120 Hz dual scans: (d) is same-direction scanning of the upper and lower screens, (e) is reverse scanning (center start), and (f) is reverse scanning (top/bottom start).
  • The display timings at the top of the screen are indicated by arrows 331 to 336, 351 to 355, and 371 to 374, and those at the bottom of the screen by arrows 341 to 345, 361 to 365, and 381 to 384, respectively.
  • The imaging device 11 in the imaging apparatus 1 captures video with a 120 Hz dual scan and outputs an uncompressed video signal; the imaging time for one screen is about 2T. Odd frames are stored in the memory 12 and even frames in the memory 16.
  • The odd-frame video signals stored in the memory 12 (301, 303, and 305 in FIG. 4(a)) are read out at the same timing to produce the 60 Hz video signals for transmission (311, 313, and 315 in FIG. 4(b)). Therefore, the memory 12 may be bypassed and the signals output directly to the superimposing unit 13.
  • The compression unit 17 uses the odd-frame and even-frame video signals stored in the memories 12 and 16 to generate interpolation data for restoring the even-frame video from the odd frames. Since a delay time 318 is needed to generate the interpolation data, the transmission start timing of the interpolation data 312, 314, and 316 is delayed by the time 318 from the imaging start timing of the corresponding even frames 302, 304, and 306. The interpolation-data generation method and the operations of the error-correction encoding unit 18, the superimposing unit 13, the 8B/10B encoder 14, and the control unit 19 are the same as in FIG. 3, so their description is omitted.
  • As for the operation of the display device 2, the image of each frame is displayed at the scanning timings shown in FIGS. 4(c) to 4(f) according to the scanning method of the display element 26. Since these display timings are the same as in FIG. 3, only the delay time from the imaging timing (FIG. 4(a)) is described.
  • The upper end of the upper screen has a delay time 358 of about 2 frames, while the lower screen has a delay 369 of almost 0 frames, slightly earlier than in the case of FIG. 3.
  • FIG. 5 is a table summarizing the delay times in each scanning method described in FIGS. 3 and 4.
  • The delay time from the imaging timing to the display timing is shown as an approximate number of frames (period T) for each combination of the scanning methods of the image sensor and the display element.
  • The values in the upper, middle, and lower rows of each cell indicate the delay times at the top, center, and bottom of the screen, respectively. Where the middle row shows two values, the first is the delay time at the lower end of the upper screen and the second is the delay time at the upper end of the lower screen.
  • The display-element scanning method (g) is, like (d), dual scanning in the same direction, but it scans the upper screen and the lower screen with the video signal of the same frame.
  • The delay time of method (g) equals that of (d) on the lower screen, but on the upper screen it is one frame longer than in (d).
  • The scanning modes of the image sensor are classified by shutter mode. What has been described so far is the “line shutter” mode, in which the shutter timing shifts within the screen in accordance with the scanning.
  • The additional “frame shutter” mode exposes the entire screen at the same timing and then reads it out by sequential scanning.
  • The delay time in the frame shutter mode is longer than in the line shutter mode. Assuming the shutter is released immediately before scanning starts, the delay increases, relative to single scanning with a line shutter, by 0 frames at the top of the screen, 0.5 frames at the center, and 1 frame at the bottom. In the frame shutter mode there is no difference in delay time between single-scan and dual-scan image sensors.
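The delay arithmetic above (an extra 0 frames at the top, 0.5 at the center, 1 at the bottom, relative to line-shutter single scanning) amounts to a linear growth with readout order. A minimal sketch, with the screen position normalized from 0.0 (top) to 1.0 (bottom); the function name is invented for illustration:

```python
def frame_shutter_extra_delay(position):
    """Extra display delay, in frames of period T, of the frame-shutter
    mode relative to line-shutter single scanning.

    With a line shutter each line is exposed just before it is read
    out; with a frame shutter the whole screen is exposed at once
    (just before readout starts), so a line read out later waits
    longer.  `position` runs from 0.0 (top, read first) to 1.0
    (bottom, read last), so the extra delay grows linearly with it.
    """
    return position

assert frame_shutter_extra_delay(0.0) == 0.0   # top of screen
assert frame_shutter_extra_delay(0.5) == 0.5   # screen center
assert frame_shutter_extra_delay(1.0) == 1.0   # bottom of screen
```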
  • In summary, the desirable scanning method of the display element for each scanning method of the image sensor is as follows.
  • (A1) When the image sensor is in the line shutter mode, selecting the dual scan method for the display element ((d): upper and lower screens scanned in the same direction) minimizes the display delay time. To minimize the delay-time variation within the screen instead, the display element may be switched to the single scan method when the image sensor uses the single scan method.
  • (A2) When the image sensor is in the frame shutter mode, selecting the dual scan method ((e): start from the screen center) or the single scan method for the display element reduces the delay-time variation within the screen.
  • For this purpose, the display device may extract the metadata attached to the video signal by the imaging device from the InfoFrame or the like, detect the scanning method of the imaging device, and select the scanning method of the display element accordingly.
  • Similarly, a desirable scanning method of the image sensor for each scanning method of the display element is as follows.
  • (B1) When the display element uses the single scan method, selecting the single scan method (line shutter mode) for the image sensor minimizes the delay-time variation within the display screen.
  • (B2) When the display element uses the dual scan method, selecting the dual scan method (line shutter mode) for the image sensor minimizes the display delay time.
  • (B3) When the display element uses the dual scan method, selecting the dual scan method (line shutter mode) for the image sensor also minimizes the delay-time variation within the display screen.
  • the imaging device may read the display capability information of the display device from EDID or the like, detect the scanning method of the display element, and select the scanning method of the imaging device.
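Selection rules of this kind can be expressed as a small decision function. The sketch below encodes rules (A1) and (A2) above; the function name and the string labels for the scanning methods are hypothetical, chosen only for the example:

```python
def choose_display_scan(sensor_scan, shutter, minimize="delay"):
    """Select a display scanning method from the image-sensor scanning
    information, following rules (A1) and (A2).

    sensor_scan: "single" or "dual"
    shutter:     "line" or "frame"
    minimize:    "delay" for the shortest display delay,
                 "variation" for the smallest in-screen variation
    """
    if shutter == "line":  # rule (A1)
        if minimize == "variation" and sensor_scan == "single":
            return "single"                # match the sensor's scan
        return "dual-same-direction"       # method (d): shortest delay
    # rule (A2): frame shutter -> reduce in-screen delay variation
    return "dual-center-start"             # method (e); single scan also works
```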
  • In addition, the scanning processing unit 25 may create video at intermediate timings from the preceding and following video signals and motion vectors, thereby compensating for the in-screen delay-time differences obtained from FIG. 5.
  • In a display element with a backlight, the in-screen display timing can also be aligned by lighting the backlight at a uniform timing after scanning is completed, for a duration matching the opening time of the frame shutter; this can be combined with the methods above.
  • In this way, the display device transmits its display scanning information (single/dual scan, scanning direction, backlight lighting timing) to the imaging device, or conversely the imaging device adds its imaging scanning information (single/dual scan, scanning direction, line/frame shutter, opening time) to the video data and transmits it to the display device, so that each scanning method can be selected optimally. Accordingly, an imaging/display system with a short delay time from the imaging timing to the display timing, or with little variation in the delay time, can be realized.
  • FIG. 6 is a diagram illustrating the overall configuration of the imaging / display system according to the second embodiment.
  • the system includes an imaging device (camera) 3, an encoder 4, a server 5, a broadcasting station 6, a set top box (STB) 7, and a display device (display) 2.
  • The imaging device 3 corresponds to the imaging device 11 in the first embodiment (FIG. 1), while the encoder 4 and the STB 7 perform compression/decompression of the video signal according to the broadcast standard.
  • Referring to FIG. 6, the configuration and basic processing of each device are described.
  • The video signal output from the imaging device 3 is compression-encoded by the encoder 4 based on the H.265 standard and stored in the server 5.
  • An editing station (not shown) edits the stored video for broadcasting using a video editing device or the like, and stores it again in the server 5.
  • the broadcast station 6 transmits the broadcast video signal stored in the server 5 to the STB 7 in the home through a broadcast radio wave, a cable TV line, an Internet line, or the like.
  • Alternatively, the video signals from the imaging device 3 and the encoder 4 may be transmitted directly to the STB 7 as a live program.
  • the STB 7 extracts a video signal that the user views from a plurality of received video signals and transmits the video signal to the display device 2, and the display device 2 displays the received video.
  • A fast-moving video such as a sports program is transmitted as a high-frame-rate video signal of, for example, 120 Hz, and the display device likewise displays it at the high frame rate of 120 Hz.
  • However, since widely used STBs and display devices do not all support high frame rates, it is desirable that the broadcast signal be compatible with both 60 Hz and 120 Hz display devices. To realize this, a method of transmitting a 60 Hz video signal together with interpolation data for increasing the frame rate is effective.
  • As a specific transmission method, the MVC (Multiview Video Coding) format of H.264 can be used: the 60 Hz video signal is transmitted as the base view and the interpolation data for 120 Hz conversion as the non-base view.
  • STBs 7 and display devices 2 compatible with 60 Hz decode and display only the base view, while those compatible with 120 Hz also decode and display the non-base view. Thus one broadcast signal can support both the 60 Hz and 120 Hz display methods.
  • In the MVC format, the non-base view (the interpolation data for 120 Hz conversion) is preferably composed entirely of B frames (BBBB...). Reducing the B frames in the base view (the 60 Hz video signal) shortens the delay time associated with compression/decompression, and making all non-base-view frames B frames reduces the amount of data.
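The base-view/non-base-view compatibility scheme can be sketched as follows, modeling the MVC stream as a plain dictionary. The data layout and function name are illustrative assumptions, not actual MVC bitstream syntax:

```python
def decode_for_device(stream, supports_120hz):
    """Select the frames a receiving device decodes from an MVC-style
    broadcast: the 60 Hz video is the base view and the interpolation
    data for 120 Hz display is the non-base view.

    A 60 Hz device decodes only the base view; a 120 Hz device also
    decodes the non-base view and interleaves the two into a 120 Hz
    sequence.
    """
    base = stream["base_view"]           # 60 Hz frames
    if not supports_120hz:
        return list(base)
    non_base = stream["non_base_view"]   # 120 Hz interpolation frames
    out = []
    for b, n in zip(base, non_base):     # interleave to 120 Hz
        out += [b, n]
    return out

stream = {"base_view": ["F1", "F3"], "non_base_view": ["F2", "F4"]}
# 60 Hz device:  ["F1", "F3"]
# 120 Hz device: ["F1", "F2", "F3", "F4"]
```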
  • the imaging device 3 outputs a 120 Hz baseband digital video signal to the encoder 4 by SDI (Serial Digital Interface).
  • the encoder 4 generates a base view of a 60 Hz video signal and a non-base view of 120 Hz conversion data in the MVC format, and outputs them to the server 5 through a packet format network, for example, Ethernet (registered trademark).
  • the server 5 outputs a video signal in MVC format to the broadcast station 6 via Ethernet.
  • the broadcast station 6 can easily receive the live program output from the encoder 4 through the server 5.
  • the broadcasting station 6 transmits a video signal in the MVC format to the STB 7 via a broadcast radio wave, a cable television network, the Internet, or the like.
  • The STB 7 can detect the display capability of the display device 2 by a mechanism such as EDID (Extended Display Identification Data) or CEC (Consumer Electronics Control). If the display device 2 supports only 60 Hz uncompressed video, a 60 Hz uncompressed video signal obtained by decoding only the base view is transmitted, for example over HDMI. If the display device 2 supports 120 Hz uncompressed video, a 120 Hz uncompressed video signal obtained by decoding both the base view and the non-base view is transmitted.
  • If the display device 2 supports a 60 Hz uncompressed video signal with 120 Hz interpolation data, the 60 Hz uncompressed video signal obtained by decoding the base view is transmitted with the 120 Hz interpolation data of the non-base view added as metadata to an InfoFrame.
  • In this case, the display device 2 can display the video at the 120 Hz frame rate captured by the imaging device 3, or at an even higher frame rate such as 240 Hz, thereby enhancing moving-image performance.
  • the interpolation data may be encrypted and transmitted by the info frame.
  • The decryption key may be a key obtained from HDCP authentication, or, for simplicity, part of the uncompressed video data restored by HDCP may be used as the key.
  • FIG. 7 is a diagram showing an internal configuration of the encoder 4.
  • the encoder 4 includes an input unit 41, memories 42 and 46, compression units 43 and 47, a packet processing unit 44, an output unit 45, and a control unit 48.
  • The input unit 41 receives the 120 Hz uncompressed video signal output from the imaging device 3 in FIG. 6, for example over SDI. Odd frames are stored in the memory 42 and even frames in the memory 46.
  • the compression unit 43 compresses the odd-numbered frame video signal stored in the memory 42.
  • the compression unit 47 compresses the even frame video signal stored in the memory 46 with reference to the odd frame video and the compressed video signal.
  • Since the even-frame video has a high correlation with the adjacent odd-frame video, it can be compressed more efficiently than the odd-frame video.
  • As the interpolation data for 120 Hz conversion, difference information, the mixing ratio of the preceding and following frames, motion vectors, and the like are generated.
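One simple way to realize such interpolation data — difference information plus a mixing ratio of the neighbouring frames — is sketched below. The 50/50 prediction is an assumption for illustration; a real encoder would also use motion vectors and entropy-code the residual:

```python
def make_interp_data(prev_odd, next_odd, even, mix=0.5):
    """Generate interpolation data for an even frame from the odd
    frames before and after it.

    The even frame is predicted as a weighted mix of the neighbouring
    odd frames; only the residual (the difference information) and
    the mixing ratio need to be transmitted.  Frames are flat lists
    of pixel values here.
    """
    pred = [mix * a + (1 - mix) * b for a, b in zip(prev_odd, next_odd)]
    residual = [e - p for e, p in zip(even, pred)]
    return {"mix": mix, "residual": residual}

def restore_even(prev_odd, next_odd, data):
    """Rebuild the even frame from the odd frames and the interpolation data."""
    mix = data["mix"]
    pred = [mix * a + (1 - mix) * b for a, b in zip(prev_odd, next_odd)]
    return [p + r for p, r in zip(pred, data["residual"])]
```

Round-tripping an even frame through `make_interp_data` and `restore_even` reproduces it exactly, which is what lets the receiver regenerate the 120 Hz sequence from the 60 Hz signal plus the residuals.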
  • The packet processing unit 44 packetizes the odd frames compressed by the compression unit 43 and the even frames compressed more efficiently by the compression unit 47, and the output unit 45 outputs the packetized video data to the server 5 in FIG. 6 through, for example, a LAN (Local Area Network).
  • The control unit 48 controls each unit in the encoder 4 according to the compression method designated by the server 5 or the operator, and adds the imaging scanning information (single/dual scan, scanning direction, line/frame shutter, opening time) and the compression method information as metadata of the video data. Further, the control unit 48 transmits the display scanning information (single/dual scan, scanning direction, backlight lighting timing) of the display device 2 designated by the server 5 or the operator to the imaging device 3.
  • As described above, the encoder 4 generates optimal compressed video data based on the imaging scanning information of the imaging device 3 and the display scanning information of the display device 2, and transmits the imaging scanning information to the display device 2 side.
  • As a result, the delay time from the imaging timing to the display timing is short, or the delay-time variation is small.
  • FIG. 8 is a diagram showing the internal configuration of the STB 7.
  • the STB 7 includes an input unit 71, a packet processing unit 72, a separation unit 73, a decompression unit 74, a superposition unit 75, an 8B / 10B encoder 76, an output unit 77, a memory 78, an error correction coding unit 79, and a control unit 80.
  • The input unit 71 receives the MVC-format compressed video signal (the 60 Hz video signal and the 120 Hz interpolation data) from the broadcasting station 6 in FIG. 6 via broadcast radio waves or a network, and the packet processing unit 72 depacketizes the received video signal.
  • the separation unit 73 separates the 60 Hz video signal (odd number frame) and the 120 Hz conversion data (even number frame), and sends them to the decompression unit 74 and the memory 78, respectively.
  • the decompression unit 74 decodes the odd frame video signal to obtain a 60 Hz uncompressed video signal.
  • the error correction encoding unit 79 performs error correction encoding on the interpolation data for 120 Hz conversion of even frames stored in the memory 78.
  • the superimposing unit 75 superimposes the interpolation data for 120 Hz conversion of the even frame in the blanking period of the 60 Hz uncompressed video signal of the odd frame.
  • The 8B/10B encoder 76 serializes the 60 Hz uncompressed video signal with the 120 Hz interpolation data (converting 8-bit data into 10-bit data), and the output unit 77 outputs it to the display device 2 as, for example, an HDMI-format video signal.
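The superimposition of interpolation data in the blanking period can be sketched as follows, with a frame modeled as a list of tagged lines. This is illustrative only; a real link would carry such data in HDMI data-island packets with TMDS/8B-10B coding, and the function names are invented:

```python
def superimpose(active_lines, interp_bytes, blanking_lines=1):
    """Pack interpolation data into the vertical blanking period of a
    60 Hz uncompressed frame.  A frame is modeled as a list of
    ("blank", data) and ("video", line) tuples.
    """
    frame = [("blank", bytes(interp_bytes))]
    frame += [("blank", b"")] * (blanking_lines - 1)
    frame += [("video", line) for line in active_lines]
    return frame

def extract_interp(frame):
    """Recover the interpolation data from the blanking lines."""
    for kind, data in frame:
        if kind == "blank" and data:
            return data
    return b""
```

Extracting the blanking payload on the receiving side yields the interpolation data unchanged, while the active video lines pass through untouched.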
  • The control unit 80 controls each unit, reads the display scanning information (single/dual scan, scanning direction, backlight lighting timing) of the display device 2, and generates an optimal uncompressed video signal with interpolation data. The imaging scanning information (single/dual scan, scanning direction, line/frame shutter, opening time) attached to the received video signal is added as metadata of the 60 Hz uncompressed video signal. The STB 7 may also have a function of extracting or converting, from the 120 Hz interpolation data, data usable by the display device 2 and transmitting it to the display device, and a function of securing, within the blanking period, the transmission band for audio data that must be transmitted simultaneously.
  • As described above, the STB 7 generates an optimal uncompressed video signal based on the imaging scanning information of the imaging device 3 and the display scanning information of the display device 2, adds the high-frame-rate interpolation data and the imaging scanning information to it, and outputs the result to the display device 2.
  • As a result, the delay time from the imaging timing to the display timing is short, or the delay-time variation is small.
  • The third embodiment describes transmission of 3D video. The frame packing method alternately transmits the 3D left-eye video (L video) and right-eye video (R video); here, a frame packing method in which the L video and the R video are included alternately even in the 60 Hz uncompressed video transmission is described.
  • A display device that can process the interpolation data has the advantage that the image quality of the L video and the R video is uniform, while a 3D video can still be displayed on a display device without an interpolation-data processing function.
  • the imaging / display system has the same configuration as that of the first embodiment (FIG. 1). Time axis conversion in transmission and display is realized by the memories 12, 16, 24, 30 and the scanning processing unit 25.
  • FIG. 9 is a diagram illustrating an example of imaging, transmission, and display timing of a 3D video signal.
  • In this example, the image sensor performs a single scan at a frame rate of 120 Hz.
  • A thick arrow (e.g., reference numeral 401) indicates the scanning timing of the L video (corresponding to odd frames), and a thin arrow (e.g., reference numeral 402) indicates the scanning timing of the R video (corresponding to even frames).
  • A solid-line arrow (e.g., reference numeral 411) indicates an uncompressed video signal, and a broken-line arrow (e.g., reference numeral 413) indicates interpolation data.
  • (a) shows the imaging timing: the image sensor 11 performs a single scan at a frame rate of 120 Hz and alternately outputs the L images 401, 403, 405 and the R images 402, 404, 406.
  • (b) shows the transmission timing: the L images 401, 405 and the R image 402 output from the image sensor 11 are converted into the 60 Hz single scan signals 411, 412, and 415 (time width 2T). The L image 403 and the R image 404 between them are converted into the 120 Hz interpolation data 413 and 414, superimposed on the 60 Hz single scan signals 411, 412, and 415, and output.
  • (c), (g), (e), and (f) show the display timings of various display scanning methods.
  • (C) is a case of 120 Hz single scan, and the display element 26 alternately scans the L images 421 and 423 and the R images 422 and 424.
  • the shutter glasses are opened and closed synchronously, so that the viewer's left eye captures the L video and the right eye captures the R video, thereby reproducing the 3D video.
  • (g), (e), and (f) are cases of 120 Hz dual scan: (g) is same-direction scanning, (e) is reverse-direction scanning (center start), and (f) is reverse-direction scanning (start from the upper and lower ends).
  • the same frame image is displayed on the upper screen and the lower screen.
  • The upper screen of the display element 26 alternately scans the L videos 431, 433 and the R videos 432, 434, and the lower screen alternately scans the corresponding L videos 441, 443 and R videos 442, 444 of the same frames. The same applies to the other scanning methods (e) and (f). If shutter glasses are operated in synchronization with this, 3D video can be reproduced in the same manner as in (c).
  • the transmission and display timing will be described in detail below.
  • the delay time will be described in comparison with the case of the first embodiment (FIG. 3).
  • In transmission (b), the L images 401, 403, 405 and the R images 402, 404, 406 captured by the image sensor 11 with a 120 Hz single scan are converted to 60 Hz, and the L and R images are transmitted alternately. This has the advantage that 3D display by frame packing can be realized even on a 3D display device that does not process interpolation data. For this reason, the captured L image 401 is transmitted immediately as the L image 411, while transmission of the R image 402 starts after a delay of one frame (1T). The next L image 403 and R image 404 are compressed and transmitted as the L interpolation data 413 and the R interpolation data 414, respectively.
  • The transmission start of the L video interpolation data 413 is delayed by the compression (encoding) processing time 418 from the imaging start of the L video 403. If transmission of the R video interpolation data 414 starts immediately after transmission of the L video interpolation data 413 ends, the transmission periods of the two interpolation data do not overlap and the transmission band can be used effectively.
  • Note that the transmission start of the R video interpolation data 414 may be delayed by up to one 60 Hz frame (2T) from the transmission start of the L video interpolation data 413.
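The non-overlapping transmission schedule described above reduces to simple timing arithmetic. A sketch, with times in units of the 120 Hz frame period T; the function name and parameter values are invented for illustration:

```python
def interp_tx_schedule(l_capture_start, encode_delay, tx_duration, T):
    """Transmission start times for the L and R interpolation data,
    in units of the 120 Hz frame period T.

    The L interpolation data can start only after the encoding delay;
    starting the R data right after the L data ends keeps the two
    transmission periods from overlapping, and the R start may be
    deferred at most one 60 Hz frame (2T) after the L start.
    """
    l_start = l_capture_start + encode_delay
    r_start = l_start + tx_duration            # right after L data ends
    r_start = min(r_start, l_start + 2 * T)    # at most one 60 Hz frame later
    return l_start, r_start

# Encoding delay of 0.5T, each interpolation block takes 1T to send:
assert interp_tx_schedule(0.0, 0.5, 1.0, 1.0) == (0.5, 1.5)
```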
  • the display element 26 alternately displays the L video and the R video in a single scan.
  • Delays 428 and 429 of approximately 2 frames (2T) occur relative to the start of imaging of the L video 401. This is because the display end time of the next R video 422 is aligned with the transmission end time of the R video 412, and the display interval between the L video 421 and the R video 422 is set to 1T.
  • As a result, the display is delayed by about 1 frame (1T) overall compared with the first embodiment (FIG. 3(c)).
  • FIG. 9 (g) shows a case where L video and R video are alternately displayed by the dual scan method in the same direction.
  • FIG. 9 (e) shows a dual scan method in which scanning is performed from the center to the upper and lower ends.
  • The scanning timing is the same as in FIG. 9(g), except that the scanning direction of the upper screen is reversed relative to FIG. 9(g).
  • Relative to the imaging timing, there is a delay 458 of about 3 frames at the upper end of the screen, a delay 469 of about 2 frames at the lower end, and a delay of about 1.5 frames at the center. Since the upper and lower screens share the same scanning timing at the screen center, a moving object is not distorted there.
  • FIG. 9(f) shows a dual scan method in which scanning proceeds from the upper and lower ends toward the center. Since scanning starts from the upper and lower ends of the screen, the video data for those areas is required at the start of scanning. In particular, the display start of the R video 482 on the lower screen must come after the transmission end 417 of the R video 412, and the other displays are aligned to this. As a result, the delay 478 at the upper end of the screen is about 3 frames, the delay at the center is about 3.5 frames, and the delay 489 at the lower end is about 2 frames.
  • FIG. 10 is a diagram illustrating another example of imaging, transmission, and display timing of a 3D video signal.
  • the imaging scanning method performs dual scanning at a frame rate of 120 Hz.
  • (a) shows the imaging timing: the image sensor 11 performs a dual scan at a frame rate of 120 Hz and alternately outputs the L images 501, 503, 505 and the R images 502, 504, 506 (time width 2T).
  • Alternatively, two image sensors may be provided for the L video and the R video and operated with a phase difference of one 120 Hz frame.
  • (b) shows the transmission timing: the L images 501, 505 and the R image 502 output from the image sensor 11 are converted into the 60 Hz single scan signals 511, 512, and 515. The L video 503 and the R video 504 between them are converted into the 120 Hz interpolation data 513 and 514, superimposed on the 60 Hz single scan signals 511, 512, and 515, and output. As a result, the transmission timing is the same as in FIG. 9(b).
  • (c), (g), (e), and (f) show the display timings of various display scanning methods.
  • (c) is the case of 120 Hz single scan, and (g), (e), and (f) are cases of 120 Hz dual scan. Since these display timings are the same as in FIGS. 9(c), 9(g), 9(e), and 9(f), their description is omitted.
  • FIG. 11 is a table summarizing the delay times of the 3D display scanning methods described in FIGS. 9 and 10, to be compared with the display of the first embodiment (2D display). It shows that the delay time is larger for 3D display (FIG. 11) than for 2D display (FIG. 5). This is particularly the case for the dual scan method (e), in which scanning proceeds from the screen center to the upper and lower ends. Also, in 3D display using the shutter-glasses method, it is difficult to adopt the same-direction dual scan display method (d), which scans one frame over two frame periods.
  • In 3D display, the backlight may be blinked in accordance with the opening and closing of the shutter glasses.
  • In this way, the 3D display device transmits its display scanning information (single/dual scan, scanning direction, backlight lighting timing) to the 3D imaging device, or conversely the 3D imaging device adds its imaging scanning information (single/dual scan, scanning direction, line/frame shutter, opening time) to the video data and transmits it to the 3D display device, so that each scanning method can be selected optimally. Accordingly, an imaging/display system with a short delay time from the imaging timing to the display timing, or with little delay-time variation, can be realized.
  • Example 4 describes a configuration in which image quality enhancement processing is added to a display device, and the scanning rate is doubled while the frame rate of the display element remains 120 Hz.
  • FIG. 12 is a diagram illustrating a configuration example of the display device according to the fourth embodiment.
  • the display device 8 includes an input unit 81, an image quality improvement processing unit 82, an interpolation data restoration unit 83, a simple image quality improvement processing unit 84, a scanning processing unit 85, and a display element 86. Note that a description of a control unit that transmits the display scanning information of the display element 86 to the imaging device side, detects the imaging scanning information, and controls the entire display device 8 is omitted.
  • the input unit 81 receives a 60 Hz uncompressed video signal and 120 Hz interpolation data.
  • the input 60 Hz uncompressed video signal is sent to the image quality enhancement processing unit 82, and the video is analyzed to improve the brightness, hue, sense of resolution, and the like.
  • the interpolation data restoration unit 83 restores a 120 Hz interpolated video signal from the input 120 Hz conversion data and the 60 Hz uncompressed video signal. Since the restored 120 Hz interpolated video signal does not include the input 60 Hz uncompressed video signal portion, it is a video signal equivalent to 60 Hz. That is, if the 60 Hz video signal input to the input unit 81 and the restored 120 Hz interpolated video signal are combined as they are, a 120 Hz dual scan video signal is obtained.
  • the 120 Hz interpolated video signal output from the interpolation data restoration unit 83 is subjected to image quality improvement processing by the simple image quality improvement processing unit 84. Since the 120 Hz interpolated video signal has a high correlation with the 60 Hz video signal, there is an advantage that the image quality improvement processing can be easily realized by using the result of the image analysis performed by the image quality improvement processing unit 82.
  • the 60 Hz video signal output from the image quality improvement processing unit 82 and the 120 Hz interpolated video signal output from the simple image quality improvement processing unit 84 are input to the scanning processing unit 85 to match the imaging scanning information and the scanning method of the display element. The video signal is generated, and the display device 86 displays the video with high image quality.
  • The moving-image response can be improved by reducing the scanning time to, for example, 1/2 and providing time to light the backlight after scanning is completed.
  • The video signal with the shortened scanning time can be generated by the time-axis compression processing (doubling of the scanning speed) of the scanning processing unit 85.
  • The backlight lighting time corresponds to the opening time of the image sensor, and setting the two to be equivalent is preferable in order to preserve the photographer's intention.
  • If the photographer can shoot while estimating the degree of blur on the display element from the frame rate and backlight lighting time of the display element, the photographer's intention is easily reflected.
  • Otherwise, it is preferable to transmit the imaging scanning information to the display device side and to select the display scanning method and the backlight lighting time on the display device side in view of the photographer's intention.
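The shortened-scan timing with a backlight window matching the sensor opening time can be sketched as follows. This is a hypothetical helper under the stated assumptions (scan confined to the first part of the frame period, backlight lit at its end); all values share one time unit, e.g. one frame period:

```python
def display_schedule(frame_period, scan_ratio=0.5, open_time=None):
    """Timing within one frame period when scanning is sped up:
    scan during the first `scan_ratio` of the period, then light the
    backlight at the end of the period for a window matching the
    image-sensor opening time.
    """
    scan_end = frame_period * scan_ratio
    lit = frame_period - scan_end if open_time is None else open_time
    lit = min(lit, frame_period - scan_end)   # backlight only after scanning
    return {"scan": (0.0, scan_end),
            "backlight": (frame_period - lit, frame_period)}

# Double-speed scan (first half of the period), sensor opening time 0.25:
schedule = display_schedule(1.0, scan_ratio=0.5, open_time=0.25)
# schedule["scan"] == (0.0, 0.5); schedule["backlight"] == (0.75, 1.0)
```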
  • FIG. 13 is a diagram showing the display timing when the scanning time of the display element is shortened. This will be described in comparison with Example 1 (FIG. 3).
  • (a) the imaging timing (120 Hz single scan) and (b) the transmission timing are the same as in FIG. 3.
  • (c), (g), (e), and (f) show the display timings of various display scanning methods.
  • (c) is the case of single scan display: scanning starts with a delay of half the effective display period, i.e., a period 829 of about 0.5 frame (0.5T), from the scanning start of each frame in FIG. 3, and the scanning end timing is aligned with that in FIG. 3. As a result, the scanning time is halved (0.5T).
  • In shutter-glasses 3D display, performing the shutter open/close switching while the backlight is turned off achieves good separation of the L video and the R video and thus a good stereoscopic display.
  • (g) is the case of dual scan display (same-direction scanning), with the upper and lower screens scanned in the latter half of the scanning period.
  • If the upper screen and the lower screen carry video signals of different frames, a time lag of about 0.5 frame occurs above and below the screen center. It is therefore desirable to use the video signal of the same frame (831 and 841) for the upper and lower screens, keeping the delay-time variation within the screen to 0.5 frame or less.
  • (e) is the case of dual scan display (reverse scanning, center start), with the upper and lower screens scanned in the latter half of the scanning period of FIG. 3(e). This is because the display end point of the double-speed second frame 862 must come after the transmission end point 217 of the corresponding interpolation data 212.
  • (f) is the case of dual scan display (reverse scanning, start from the upper and lower ends), with the upper and lower screens scanned in the first half of the scanning period of FIG. 3(f). This is likewise because the scanning start time of the double-speed second frame 882 must come after the transmission end time 217 of the corresponding interpolation data 212. Compared with FIG. 3(f), the delay time at the center of the screen becomes shorter.
  • FIG. 14 is a diagram illustrating a configuration example of the display device according to the fifth embodiment.
  • the display device 9 includes an input unit 91, an image quality improvement processing unit 92, an interpolation data image quality improvement unit 93, a frame rate conversion unit 94, a scanning processing unit 95, and a display element 96.
  • the input unit 91 receives a 60 Hz uncompressed video signal and 120 Hz interpolation data.
  • the input 60 Hz uncompressed video signal is sent to the image quality enhancement processing unit 92, and the video is analyzed to improve brightness, hue, resolution, and the like.
  • the interpolation data image quality improving unit 93 reflects the video analysis result of the image quality improving processing unit 92 in the input 120 Hz conversion data. For example, when the interpolation data is difference information and luminance or color correction is performed in an area corresponding to the difference information, the correction amount is reflected in the difference information.
  • The frame rate conversion unit 94 converts the 60 Hz video signal from the image quality improvement processing unit 92 into a 240 Hz video signal, four times the frame rate, while referring to the 120 Hz interpolation data.
  • the 240 Hz video signal subjected to the high image quality processing output from the frame rate conversion unit 94 is sent to the scanning processing unit 95.
  • the scanning processing unit 95 generates a video signal in accordance with the imaging scanning information and the scanning method of the display element, and performs video display at a high frame rate on the display element 96.
  • the frame rate of the display element 96 When the frame rate of the display element 96 is set to 240 Hz, the moving image characteristics of a hold type display element such as a liquid crystal that continues to be displayed from one scan to the next can be improved. However, if high-quality processing with a large amount of processing is performed after frame rate conversion, the circuit scale increases and power consumption increases. In order to avoid this, it is desirable to perform the image quality improvement processing on the 60 Hz video signal and perform the frame rate conversion after the image quality improvement of the 120 Hz conversion data.
  • The present invention is not limited to the above-described embodiments and includes various modifications.
  • For example, the frame rate of the image sensor may be 240 Hz.
  • In that case, the uncompressed video signal to be transmitted may carry one frame in every four, giving a frame rate of 60 Hz.
  • The interpolation data may then be data for restoring the 240 Hz rate used at the time of imaging.
  • In the embodiments, the transmission target is video generated by imaging, but the transmission target of the present invention is not limited to this.
  • Video other than that generated by an imaging device, such as video generated by computer graphics, may also be a transmission target.
  • "Imaging device" may then be read as "video output device."
  • "Imaging scanning information" may be read as "scanning information when generating the video."
  • Since imaging scanning information is the scanning information used when generating the captured video, "imaging scanning information" can be said to be included in "scanning information when generating the video."
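The Embodiment 5 point above, that heavy image-quality processing should run on the 60 Hz signal before frame rate conversion rather than on the 240 Hz signal after it, can be illustrated with a toy cost model. All cost figures below are invented for illustration; real costs depend on the actual processing:

```python
# Hypothetical cost model: processing cost is proportional to
# (frame rate) x (per-frame cost of the operation).
ENHANCE_COST = 100  # relative per-frame cost of image quality enhancement (assumed)
FRC_COST = 10       # relative per-output-frame cost of frame rate conversion (assumed)

def enhance_then_convert(in_hz=60, out_hz=240):
    # Enhance at the low input rate, then convert up to the display rate.
    return in_hz * ENHANCE_COST + out_hz * FRC_COST

def convert_then_enhance(in_hz=60, out_hz=240):
    # Convert first, then enhance every high-rate frame.
    return out_hz * FRC_COST + out_hz * ENHANCE_COST

# Enhancing before conversion touches 60 frames per second instead of 240,
# so the total relative cost is much lower (8400 vs 26400 in this toy model).
assert enhance_then_convert() < convert_then_enhance()
```

The numbers are arbitrary, but the ordering holds whenever the enhancement cost dominates and the output rate exceeds the input rate.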

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Television Systems (AREA)
  • Studio Devices (AREA)

Abstract

An image pickup apparatus (1) transmits, to a display apparatus (2), a video signal of a first frame rate, interpolation data to be used for frame rate conversion, and scan information (number of simultaneous scans, scan directions, line/frame shutters). The display apparatus (2) uses the interpolation data to convert the received video signal of the first frame rate into a video signal of a second, higher frame rate, and displays the converted video signal. At this time, the display apparatus (2) determines, on the basis of the received scan information, the display scan scheme (number of simultaneous scans, scan directions, backlight lighting timings) of a display element (26). Alternatively, the image pickup apparatus (1) acquires, from the display apparatus (2), display scan information of the display element (26) and determines the image pickup scan scheme of an image pickup element (11) on the basis of the acquired display scan information. In this way, the magnitude and variation of the delay time from the image pickup timing of the image pickup element (11) to the display timing of the display element (26) can be reduced.

Description

Video device and video processing method
 The present invention relates to a video device and a video processing method that transmit a video signal from a camera or the like and perform video processing so that a display device can display it with well-controlled timing.
 In relation to frame rate conversion (FRC) techniques that generate a video signal with a higher frame rate than the input video signal, Patent Document 1 states that "by performing motion determination on a video signal with a frame rate higher than that of the video signal that is encoded and transmitted, and transmitting the motion determination result separately, more accurate FRC processing becomes possible." Patent Document 2 describes superimposing reference control information including motion vectors in the blanking period of the video signal, namely that "when the playback device 1 and the display device 5 input and output video signals in accordance with the HDMI (registered trademark) standard," "the encoding unit 3 superimposes the reference control information as a kind of AUX data in the data island period of FIG. 5, and the decoding unit 6 separates the reference control information from this data island period."
JP 2013-174798 A (see paragraph 0049)
JP 2007-274679 A (see paragraph 0039)
 When video captured by a camera or the like is transmitted as a video signal plus frame-interpolation data, and the display device performs frame interpolation before displaying it, a non-negligible delay arises between the imaging timing and the display timing. Moreover, depending on the combination of the camera's scanning method and the display device's scanning method, the delay time may vary across the screen and cause image distortion. Such delay and its in-screen variation become an obstacle when, for example, precision work is performed while watching fast-moving camera video, and must be reduced as much as possible. Part of the delay depends on the scanning methods of the camera and the display device, the video signal transmission method, and the frame interpolation method, but this was not specifically considered in the prior art, including the above patent documents.
 An object of the present invention is to reduce the magnitude and the in-screen variation of the delay time from the imaging timing of the camera to the display timing of the display device.
 To solve the above problem, the configurations described in the claims are adopted, for example. As one example, a video device that displays output video of a video output device comprises: an input unit that receives, from the video output device, a first video signal, interpolation data for interpolating frames of the first video signal, and scanning information used when the video was generated; a display signal processing unit that generates a second video signal interpolating the frames of the first video signal, using the first video signal and the interpolation data received by the input unit; a display element that displays the first video signal and the second video signal generated by the display signal processing unit; and a control unit that controls the display signal processing unit and the display element. The control unit determines the display scanning method of the display element based on the scanning information received by the input unit, and the display signal processing unit generates the second video signal based on the display scanning method so determined.
 Alternatively, a video device that outputs video to a display device comprises: a video signal processing unit that generates, from a first video signal output by a video source, a second video signal having a frame rate lower than that of the first video signal, together with interpolation data for interpolating frames of the second video signal; an output unit that transmits the second video signal and the interpolation data toward the display device; and a control unit that controls the video signal processing unit. The control unit acquires display scanning information of the display device and information on the second video signal and interpolation data that the display device can process, and determines, based on that information, the format of the second video signal and interpolation data output by the video signal processing unit. The video signal processing unit generates the second video signal and the interpolation data based on the format so determined.
 According to the present invention, a system can be realized in which both the magnitude and the variation of the delay time from the imaging timing of the camera to the display timing of the display device are reduced.
Brief description of the drawings:
  • FIG. 1 is a block diagram showing the configuration of an imaging/display system according to Embodiment 1.
  • FIG. 2 is a diagram explaining various scanning methods of image sensors and display elements.
  • FIG. 3 is a diagram explaining the imaging, transmission, and display timing of a video signal (when the imaging scanning method is single scan).
  • FIG. 4 is a diagram explaining the imaging, transmission, and display timing of a video signal (when the imaging scanning method is dual scan).
  • FIG. 5 is a diagram summarizing the delay time in each scanning method.
  • FIG. 6 is a diagram showing the overall configuration of an imaging/display system according to Embodiment 2.
  • FIG. 7 is a diagram showing the internal configuration of the encoder 4.
  • FIG. 8 is a diagram showing the internal configuration of the STB 7.
  • FIG. 9 is a diagram explaining the imaging, transmission, and display timing of a 3D video signal according to Embodiment 3 (when the imaging scanning method is single scan).
  • FIG. 10 is a diagram explaining the imaging, transmission, and display timing of a 3D video signal (when the imaging scanning method is dual scan).
  • FIG. 11 is a diagram summarizing the delay time in each scanning method for 3D display.
  • FIG. 12 is a diagram showing a configuration example of a display device according to Embodiment 4.
  • FIG. 13 is a diagram showing display timing when the scanning time of the display element is shortened.
  • FIG. 14 is a diagram showing a configuration example of a display device according to Embodiment 5.
 Hereinafter, embodiments of the present invention will be described with reference to the drawings.
 FIG. 1 is a block diagram showing the configuration of the imaging/display system according to Embodiment 1. Embodiment 1 assumes a system in which, for example, a consumer imaging device (camera) 1 and a display device (display) 2 are connected by HDMI (High-Definition Multimedia Interface, a registered trademark of HDMI, LLC) or the like, and a video signal captured by the imaging device 1 is transmitted to the display device 2 and displayed.
 The imaging device 1 internally comprises an image sensor 11, memories 12 and 16, a superimposing unit 13, an 8B/10B encoder 14, an output unit 15, a compression unit 17, an error correction coding unit (ECC) 18, and a control unit 19.
 The display device 2, in turn, internally comprises an input unit 21, a 10B/8B decoder 22, a separation unit 23, a display signal processing unit 31, a display element 26, a control unit 27, and an error correction decoding unit 28. The display signal processing unit 31 includes memories 24 and 30, a scanning processing unit 25, and a decompression unit 29.
 In this system, for example, the imaging device 1 captures at a frame rate of 120 Hz, superimposes on the odd frames the interpolation data generated by compressing the even frames, and transmits the result to the display device 2. The display device 2 restores the even frames from the odd frames and the interpolation data, combines them with the odd frames to return to the 120 Hz frame rate, and displays the video. The operation of each unit is described later.
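As a minimal sketch of this split-and-restore scheme, the toy model below transmits the odd frames as-is and encodes each even frame as a per-pixel difference from the preceding odd frame (one of the interpolation-data options this embodiment allows). Frame contents and sizes are arbitrary; real hardware would of course operate on full video frames:

```python
# Sketch: split a 120 Hz sequence into transmitted odd frames plus
# interpolation data for the even frames, then restore 120 Hz at the display.
# Frames are modeled as flat lists of pixel values; the "interpolation data"
# here is a per-pixel difference from the preceding odd frame (P-frame style).

def encode(frames_120hz):
    odd = frames_120hz[0::2]   # transmitted as a 60 Hz uncompressed video signal
    interp = [[e - o for e, o in zip(even, prev_odd)]
              for prev_odd, even in zip(odd, frames_120hz[1::2])]
    return odd, interp

def decode(odd, interp):
    restored = []
    for o, d in zip(odd, interp):
        restored.append(o)                              # odd frame, as received
        restored.append([p + q for p, q in zip(o, d)])  # even frame, rebuilt
    return restored

frames = [[i, i + 1, i + 2] for i in range(0, 60, 10)]  # six tiny "frames"
odd, interp = encode(frames)
assert decode(odd, interp) == frames  # lossless round trip in this toy model
```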
 FIG. 2 is a diagram explaining various scanning methods of the image sensor and the display element.
 (a) is the single scan method: pixels are scanned horizontally from left to right as indicated by arrow 111, and vertically from top to bottom as indicated by arrow 112. This is the common scanning method, adopted by many image sensors and display elements.
 (b), (c), and (d) are dual scan methods, in which the screen is divided into an upper screen and a lower screen that are scanned simultaneously. As the pixel count grows with higher resolution, the selection time for each row and each pixel becomes shorter, making high-speed driving of image sensors and display elements difficult. By dividing the screen into upper and lower halves and scanning them simultaneously, the scanning time per row can be roughly halved compared with the single scan method, accommodating high-speed driving. That is, as indicated by arrows 121, 131, and 141, horizontal scanning proceeds from left to right while the upper and lower screens are vertically scanned at the same time.
 In (b), both halves are vertically scanned from top to bottom as indicated by arrows 122 and 123; this is called same-direction scanning of the upper and lower screens. In (c), vertical scanning proceeds from the center toward the top and the bottom as indicated by arrows 132 and 133; this is called reverse-direction scanning (center start). In (d), vertical scanning proceeds from the top and the bottom toward the center as indicated by arrows 142 and 143; this is called reverse-direction scanning (top/bottom start).
 In the dual scan methods (b) to (d), moving images are kept smooth by aligning the imaging or display timing at the boundary between the upper and lower screens, that is, at the pixels immediately above and immediately below the screen center. For this reason, in the same-direction scanning of (b), the scanning of the lower screen should be delayed by one frame relative to the upper screen. In the reverse-direction scanning of (c) and (d), the upper and lower screens should be scanned at substantially the same timing.
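The boundary rule described above can be checked with a small timing model. Row counts and time units below are arbitrary, chosen only to make the arithmetic exact:

```python
T = 1.0        # one frame period (arbitrary units)
ROWS = 8       # rows per half-screen (toy value)

def same_direction_times():
    # Upper half: rows 0..ROWS-1 scanned over [0, T); the lower half is
    # delayed by one frame so that the screen-center rows line up in time.
    upper = [r * T / ROWS for r in range(ROWS)]
    lower = [T + r * T / ROWS for r in range(ROWS)]
    return upper, lower

def reverse_direction_times():
    # Center start: both halves start at the center simultaneously and scan
    # outward, so the two center rows are written at the same instant.
    upper = [(ROWS - 1 - r) * T / ROWS for r in range(ROWS)]  # bottom row first
    lower = [r * T / ROWS for r in range(ROWS)]               # top row first
    return upper, lower

u, l = same_direction_times()
# The gap at the boundary (last upper row vs first lower row) is one row time.
assert abs(l[0] - u[-1]) == T / ROWS

u, l = reverse_direction_times()
assert u[-1] == l[0] == 0.0  # center rows written simultaneously
```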
 FIG. 3 is a diagram explaining an example of the imaging, transmission, and display timing of a video signal in the imaging/display system of FIG. 1. Here, the image sensor 11 is assumed to perform a single scan at a frame rate of 120 Hz (FIG. 2(a)). The horizontal axis is time, and the vertical axis indicates the vertical scanning position with arrows. Thick arrows (e.g., 201) indicate the scanning timing of odd frames, and thin arrows (e.g., 202) that of even frames. Solid arrows (e.g., 211) indicate the uncompressed video signal, and broken arrows (e.g., 212) the interpolation data. The period T of one frame at the 120 Hz frame rate is used as the unit of the time axis.
 (a) shows the imaging timing: the image sensor 11 performs a single scan at a frame rate of 120 Hz (period T), and the scanning timings of the first to sixth frames are indicated by arrows 201 to 206.
 (b) shows the transmission timing. The odd frames 201, 203, and 205 output from the image sensor 11 are time-expanded into 60 Hz single-scan signals 211, 213, and 215 (time width 2T). In addition, interpolation data 212, 214, and 216 for conversion to 120 Hz are generated for restoring the even frames 202, 204, and 206, and are output superimposed on the 60 Hz single-scan signal. When transmitting over HDMI, for example, the data may be superimposed in Info Packet format in the horizontal blanking period of the 60 Hz single-scan signal.
 (c) to (f) show the display timing for various scanning methods of the display element 26, each displaying one frame of video per period T. (c) is the 120 Hz single scan case, with the display timings of the first to fifth frames indicated by 221 to 225. (d), (e), and (f) are 120 Hz dual scan cases: (d) same-direction scanning, (e) reverse-direction scanning (center start), and (f) reverse-direction scanning (top/bottom start). The display timings of the upper screen are indicated by arrows 231 to 236, 251 to 255, and 271 to 274, respectively, and those of the lower screen by arrows 241 to 245, 261 to 265, and 281 to 284, respectively.
 The operation of the imaging/display system will now be described with reference to FIGS. 1 and 3.
 First, the operation of the imaging device 1. As shown in FIG. 3(a), the image sensor 11 captures with a single scan at a frame rate of 120 Hz and outputs an uncompressed video signal. The odd frames are stored in the memory 12 and the even frames in the memory 16. The odd-frame video signals stored in the memory 12 (201, 203, and 205 in FIG. 3(a)) are read out over twice the writing time (about 2T), generating the 60 Hz (period 2T) video signals for transmission (211, 213, and 215 in FIG. 3(b)). As a result, the end of transmission of a 60 Hz video signal (frame 211) lags the end of scanning of the corresponding imaging signal (frame 201) by a delay 219 corresponding to the 120 Hz vertical effective display period (about 1T).
 The compression unit 17 uses the odd-frame and even-frame video signals stored in the memories 12 and 16 to generate interpolation data (interpolation data for conversion to 120 Hz) for restoring the even-frame video from the odd frames. The interpolation data may be any of: difference information from the immediately preceding odd frame (equivalent to a P frame), mixing ratios and difference information between the preceding and following odd frames (equivalent to a B frame), motion vector related data, and so on.
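For the B-frame-like option (a mix of the preceding and following odd frames plus difference information), a toy sketch looks like this; the fixed 50/50 mixing ratio is an assumption made purely for illustration, since a real encoder would signal the ratio in the interpolation data:

```python
# Toy B-frame-style interpolation data: the even frame is predicted as the
# mean of the surrounding odd frames, and only the residual is transmitted.
def make_interp_data(prev_odd, even, next_odd):
    predicted = [(a + b) / 2 for a, b in zip(prev_odd, next_odd)]
    return [e - p for e, p in zip(even, predicted)]  # residual to transmit

def restore_even(prev_odd, interp, next_odd):
    predicted = [(a + b) / 2 for a, b in zip(prev_odd, next_odd)]
    return [p + d for p, d in zip(predicted, interp)]

prev_odd, even, next_odd = [0, 10, 20], [6, 16, 26], [10, 20, 30]
data = make_interp_data(prev_odd, even, next_odd)
assert restore_even(prev_odd, data, next_odd) == even
# A good prediction keeps the residual (the transmitted data) small:
assert all(abs(d) <= 1 for d in data)
```

Note that, as the text later explains, using the following odd frame as a reference means restoration must wait for that frame to arrive, adding delay.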
 The error correction coding unit 18 applies error correction coding to the interpolation data generated by the compression unit 17, as a countermeasure against errors in the process of transmitting the video signal to the display device 2. This is because an error in the interpolation data (even frames) affects the displayed video more strongly than an error in the uncompressed video signal (odd frames). The coding scheme may be selected under instruction from the control unit 19, which monitors the error rate of the transmission system. When the transmission error rate is high, the quality of the displayed video can be secured by raising the compression ratio of the compression unit 17 to reduce the amount of interpolation data, and then selecting an error correction coding scheme with high correction strength.
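One way the control unit 19 might trade off compression ratio against error correction strength is a simple lookup policy. The thresholds and scheme names below are invented for illustration and are not specified by this embodiment:

```python
# Hypothetical policy table: (max error rate, compression ratio, ECC scheme).
# All thresholds and scheme names are made up for this sketch.
POLICY = [
    (1e-9, 1.0, "crc-only"),     # near-error-free link: light protection
    (1e-6, 1.5, "rs(255,239)"),  # moderate errors: Reed-Solomon style code
    (1.0,  2.0, "rs(255,223)"),  # noisy link: compress more, protect more
]

def select_protection(error_rate):
    # Pick the first policy row whose threshold covers the measured rate.
    for threshold, compression, ecc in POLICY:
        if error_rate <= threshold:
            return compression, ecc
    raise ValueError("error rate out of range")

assert select_protection(0.0) == (1.0, "crc-only")
assert select_protection(1e-7) == (1.5, "rs(255,239)")
assert select_protection(0.5) == (2.0, "rs(255,223)")
```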
 The superimposing unit 13 superimposes the error-correction-coded interpolation data for the even frames on, for example, the blanking periods of the odd-frame uncompressed video signal. In HDMI, for example, it is preferable to superimpose the interpolation data on Info Frame packets, which are error-correction coded. The superimposed video signal is serialized by the 8B/10B encoder 14 (converting 8-bit data into 10-bit data) into a signal compliant with an uncompressed video signal interface such as HDMI, and is then transmitted from the output unit 15 to the display device 2.
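The 8B/10B serialization step expands every 8 payload bits into 10 line bits, so the serialized link rate is 10/8 of the payload rate. For example (the pixel clock and bit depth below are toy numbers, not a real video mode):

```python
# 8B/10B coding maps 8 data bits to 10 transmitted bits: a fixed 10/8 overhead.
def line_rate_bps(pixel_clock_hz, bits_per_pixel):
    payload = pixel_clock_hz * bits_per_pixel  # payload bit rate
    return payload * 10 // 8                   # serialized line bit rate

# Toy numbers: a 150 MHz pixel clock at 24 bits per pixel gives a 3.6 Gbit/s
# payload, serialized to 4.5 Gbit/s on the line.
assert line_rate_bps(150_000_000, 24) == 4_500_000_000
```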
 Among the transmission signals in FIG. 3(b), the start timing of the even-frame interpolation data 212, 214, and 216 is delayed by a time 218 relative to the start timing of the corresponding even frames 202, 204, and 206 of the imaging signal in FIG. 3(a). This accounts for the processing time of the compression unit 17 and the error correction coding unit 18. The transmission time of the interpolation data 212, 214, and 216 could be set to the same width as the imaging scanning time of the even frames 202, 204, and 206 (1T), but the interpolation data would then be concentrated in time and could exceed the available transmission band. In this example, therefore, as with the transmission signals 211, 213, and 215 of the 60 Hz uncompressed video signal, the transmission time of the interpolation data 212, 214, and 216 is doubled, giving a period of 2T.
 In FIG. 3(b), transmission of the interpolation data continues during the vertical blanking period of the transmitted 60 Hz uncompressed video signal, but it may instead be paused there. Also, when the compression unit 17 generates the interpolation data of an even frame (e.g., 212) by referring to both the preceding and following odd frames (e.g., 201 and 203), the delay time 218 becomes one frame (1T) or more.
 The control unit 19 is responsible for overall control of the imaging device 1, monitoring the operating status of each component and issuing the necessary instructions. In particular, the control unit 19 reads the EDID (Extended Display Identification Data), which describes information such as the display capability of the display device 2 to which the output unit 15 transmits, and determines whether the display device has the function of processing the interpolation data. Power can then be saved by operating the memory 16, the compression unit 17, and the error correction coding unit 18 only when it does.
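The capability gate performed by the control unit 19 can be sketched as below. The EDID is modeled as a plain dictionary and the `supports_interp_data` flag is a hypothetical field; real EDID parsing is considerably more involved:

```python
# Sketch of the sink-capability gate: enable the interpolation-data path
# (memory 16, compression unit 17, ECC unit 18) only when the sink's EDID
# advertises support. The "supports_interp_data" field is invented here.
def configure_source(edid):
    if edid.get("supports_interp_data", False):
        return {"mode": "60Hz+interp", "enable": ["mem16", "comp17", "ecc18"]}
    # Fall back and leave the extra blocks powered down to save energy.
    return {"mode": "60Hz-plain", "enable": []}

assert configure_source({"supports_interp_data": True})["mode"] == "60Hz+interp"
assert configure_source({})["enable"] == []
```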
 The control unit 19 may also decide, based on information such as the transmission capacity and error rate of the transmission system, power consumption, remaining battery charge, and the image quality level and display delay time requested by the user, whether to select the transmission method of this embodiment, "60 Hz uncompressed video signal + interpolation data," or one of the alternative methods, "120 Hz uncompressed video signal" or "60 Hz uncompressed video signal (without interpolation data)."
 Next, the operation of the display device 2 will be described. The input unit 21 of the display device 2 receives the transmission signal of FIG. 3(b) (the 60 Hz uncompressed video signal and the interpolation data for conversion to 120 Hz) transmitted from the imaging device 1. The 10B/8B decoder 22 restores the received signal to a video signal. The separation unit 23 separates the signal, sending the 60 Hz uncompressed video signal (odd frames) to the memory 24, and the interpolation data for conversion to 120 Hz (even frames) and other metadata to the error correction decoding unit (ECC) 28.
 The error correction decoding unit 28 outputs the error-corrected, reliable interpolation data to the decompression unit 29, and notifies the control unit 27 that interpolation data was included, together with the parameter information needed to restore the 120 Hz video signal. An Info Frame extraction processing circuit as defined by HDMI can be used as the separation unit 23 and the error correction decoding unit 28.
 The decompression unit 29 restores the even frames of the 120 Hz video signal from the 60 Hz uncompressed video signal obtained from the memory 24 (that is, the odd frames of the 120 Hz video signal) and the interpolation data obtained from the error correction decoding unit 28, and sends them to the memory 30. When the display element 26 uses the single scan method (FIG. 3(c)), the scanning processing unit 25 time-axis-converts the odd-frame video signal from the memory 24 (period 2T) and the even-frame video signal from the memory 30 (period 2T) so that each is scanned in half the time (period T), combines them, and outputs the result to the display element 26.
 The scanning processing unit 25 generates a video signal matched to the scanning method of the display element 26, and the display element 26 displays each frame at the scanning timings shown in FIGS. 3(c) to 3(f).
 FIG. 3(c) is the case where the display element 26 uses the 120 Hz single scan method (FIG. 2(a)). The scanning start timing of each displayed 120 Hz frame 221 to 225 is delayed by one frame period (1T) 228 from the scanning start timing of the corresponding imaging frames 201 to 205 (FIG. 3(a)). This is because each frame 211 to 215, transmitted with a time width of 2T, must be time-axis-converted to a width of 1T for display, and for a frame to be displayed without any missing video, all of its data must have been received and restored by the time its display scan ends. For example, restoration of the even frame 222 can begin as soon as the interpolation data 212 starts arriving, but the end of its display must be aligned with the transmission end time 217 of the interpolation data 212.
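The one-frame display delay follows from simple timing arithmetic: each frame arrives over 2T but is scanned out over 1T, and its scan-out must not end before its transmission does. A sketch with T normalized to 1:

```python
T = 1.0  # one 120 Hz frame period

def earliest_display_start(arrival_start, transmit_width=2 * T, scan_width=T):
    # Scan-out may not end before transmission ends; otherwise the tail of
    # the frame would be displayed before its data has been received.
    arrival_end = arrival_start + transmit_width
    return arrival_end - scan_width

# A frame starting to arrive at t = 0 can start displaying no earlier than
# t = 1T, i.e. one frame period after the corresponding capture scan began.
assert earliest_display_start(0.0) == T
```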
 Note that when the even frame 222 to be displayed is restored by referring to both the preceding and following odd frames 211 and 213, restoration must wait for the input of the later frame 213. The display timing of each frame is then delayed by a further frame, for a total delay of 2T.
 The above description assumed that the display element 26 uses the 120 Hz single-scan method, but it applies equally to the 240 Hz single-scan method. In that case, the scanning processing unit 25 not only restores the 120 Hz uncompressed video signal but also performs high-frame-rate conversion to 240 Hz using the motion vectors and other information contained in the 120 Hz interpolation data obtained from the error correction decoding unit 28.
 FIG. 3(d) shows the case where the display element 26 uses the 120 Hz dual-scan method (same-direction scanning, FIG. 2(b)). In the dual-scan method (same-direction scanning), the screen is divided into upper and lower halves, which are scanned vertically at the same time in the same direction, from top to bottom. To avoid a temporal discontinuity in the video at the center of the screen, the lower half is displayed one frame later, so the screen as a whole is scanned vertically from top to bottom over two frame periods (2T). For example, the transmitted first frame 211 is divided into a portion 231 displayed on the upper half over time 1T and a portion 241 displayed on the lower half over the following time 1T.
 In the dual-scan method, no time-axis conversion between the transmitted signal and the displayed signal is needed, so the time-axis conversion processing by the scanning processing unit 25 is unnecessary. Since no part of a displayed frame can be lost to time-axis conversion, the display start timing on the upper half can be made almost simultaneous with the reception start timing of each frame. Moreover, if the restoration processing in the decompression unit 29 takes almost no time, the start point of the displayed second frame 232 can be made to coincide almost exactly with the reception start point of the interpolation data 212 used to restore it.
 The display timing at each position on the screen depends on the timing at which the video signal for that position is received or restored, that is, essentially on the transmission timing. For example, at the junction of the upper and lower halves (the center of the screen), the display is delayed by about half a frame (0.5T) relative to the imaging timing. The display end timing of the lower half is delayed by about one frame (1T) from the imaging end timing. For example, the display end point of the first frame 241 is delayed by time 248 from the imaging end point of the first frame 201. This delay time 248 equals one frame (1T) plus the processing times in the compression unit 17 and the decompression unit 29 and the transmission delay.
 To align the scanning timing of the lower half one frame behind the upper half, the second frame 242 of the lower half, for example, is delayed relative to the transmitted interpolation data 212 by one horizontal blanking period at 120 Hz. As a result, it is delayed by time 238 from the start timing of the imaged third frame 203. This delay time 238 is approximately equal to the delay time 218 of the 120 Hz interpolation data.
 FIGS. 3(e) and 3(f) show cases where the display element 26 uses the dual-scan method with opposite-direction scanning: in (e), scanning proceeds simultaneously from the center of the screen toward the top and bottom edges (FIG. 2(c)), and in (f), from the top and bottom edges toward the center (FIG. 2(d)). These scans also require no time-axis conversion, and can be realized by changing the read order of the memories 24 and 30 in the display device 2 or by reordering the scanning in the scanning processing unit 25.
 When scanning starts from the center of the screen as in FIG. 3(e), the delay from imaging to display is about two frames (2T) 258 at the top edge of the screen and about one frame 269 at the bottom edge. At the center of the screen, the delay is 1.5 frames, midway between the two.
 When scanning proceeds from the top and bottom edges toward the center as in FIG. 3(f), the delay from imaging to display is about two frames 278 at the top edge of the screen and about one frame 289 at the bottom edge. At the center of the screen, an extra 0.5 frame is added to these, giving a delay of 2.5 frames.
 From the above, the delay times of the display scanning methods can be compared for the case where the image sensor 11 uses single scanning. With the single-scan display method (FIG. 3(c)), the entire screen is delayed by about one frame (1T). Among the dual-scan display methods, the same-direction scanning method (FIG. 3(d)) gives the smallest delay: about one frame at the bottom edge of the screen, but essentially no delay at the top edge (only the time needed to compress and decompress the even frames) and only about 0.5 frame at the center, which makes it advantageous in terms of delay time.
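Purely as an illustrative aid (not part of the claimed embodiment), the delay figures quoted above for a single-scan, line-shutter image sensor can be tabulated and compared programmatically; the frame period T is normalized to 1 and compression/decompression and transmission times are ignored:

```python
# Approximate imaging-to-display delay, in frame periods (T), for a
# single-scan line-shutter image sensor, per display scanning method.
# Values are those stated for FIGS. 3(c)-(f).
DELAYS = {
    # method: (top edge, screen center, bottom edge)
    "single_scan":         (1.0, 1.0, 1.0),  # FIG. 3(c): whole screen ~1T
    "dual_same_direction": (0.0, 0.5, 1.0),  # FIG. 3(d): minimum delay
    "dual_center_out":     (2.0, 1.5, 1.0),  # FIG. 3(e)
    "dual_edges_in":       (2.0, 2.5, 1.0),  # FIG. 3(f)
}

def worst_case(method: str) -> float:
    """Largest delay anywhere on the screen for the given method."""
    return max(DELAYS[method])

def spread(method: str) -> float:
    """Delay variation (max - min) across the screen."""
    d = DELAYS[method]
    return max(d) - min(d)
```

As the text notes, same-direction dual scanning has the smallest delays, while single scanning has zero variation across the screen.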
 The description so far assumed that the image sensor 11 uses the single-scan method (FIG. 2(a)); the dual-scan case (FIG. 2(b)) is described next.
 FIG. 4 is a diagram explaining another example of the imaging, transmission, and display timing of the video signal. Here the image sensor 11 is assumed to perform dual scanning at a frame rate of 120 Hz (FIG. 2(b)).
 FIG. 4(a) shows the imaging timing: the image sensor 11 performs dual scanning at a frame rate of 120 Hz (period T), and the scanning timings of the first to sixth frames are indicated by arrows 301 to 306. Note, however, that scanning one full screen takes about 2T.
 FIG. 4(b) shows the transmission timing. The odd frames 301, 303, and 305 output from the image sensor 11 are output with unchanged timing as frames 311, 313, and 315. The even frames 302, 304, and 306 are converted into 120 Hz interpolation data 312, 314, and 316, which are superimposed on the odd frames and output.
 FIGS. 4(c) to 4(f) show the display timings for the various scanning methods of the display element 26, each of which displays the video of each frame with period T. FIG. 4(c) is the 120 Hz single-scan case, with the display timings of the first to fifth frames indicated by 321 to 325. FIGS. 4(d), (e), and (f) are 120 Hz dual-scan cases: (d) is same-direction scanning of the upper and lower halves, (e) is opposite-direction scanning starting from the center, and (f) is opposite-direction scanning starting from the top and bottom edges. The display timings of the upper half are indicated by arrows 331 to 336, 351 to 355, and 371 to 374, and those of the lower half by arrows 341 to 345, 361 to 365, and 381 to 384, respectively.
 The operation of the imaging and display system is described below with reference to FIGS. 1 and 4.
  First, in the imaging apparatus 1, the image sensor 11 captures images by 120 Hz dual scanning as shown in FIG. 4(a) and outputs an uncompressed video signal. Imaging one full screen takes about 2T. Odd frames are stored in the memory 12 and even frames in the memory 16. The odd-frame video signals stored in the memory 12 (301, 303, and 305 in FIG. 4(a)) are read out with unchanged timing and used as the 60 Hz video signal for transmission (311, 313, and 315 in FIG. 4(b)). The memory 12 may therefore be bypassed, with the signal passed straight through to the superimposing unit 13.
 Using the odd-frame and even-frame video signals stored in the memories 12 and 16, the compression unit 17 generates interpolation data for restoring the even-frame video from the odd frames. Because generating the interpolation data introduces a delay time 318, the transmission start timing of the interpolation data 312, 314, and 316 is delayed by time 318 from the imaging start timing of the corresponding even frames 302, 304, and 306. The method of generating the interpolation data and the operations of the error correction encoding unit 18, the superimposing unit 13, the 8B/10B encoder 14, and the control unit 19 are the same as in FIG. 3 and are not described again.
 The display device 2 operates according to the scanning method of the display element 26, displaying the video of each frame at the scanning timings shown in FIGS. 4(c) to 4(f). These display timings are all the same as in FIG. 3, so only the delay from the imaging timing (FIG. 4(a)) is described here.
 With the single-scan display method of FIG. 4(c), there is a delay 328 of about one frame at the top edge of the screen, a delay of about 0.5 frame at the center, and a delay of nearly zero frames at the bottom edge (only the time 329 needed to compress and decompress the even frames).
 With the dual-scan display method (same-direction scanning) of FIG. 4(d), the delay is nearly zero frames over the entire screen. The upper half is delayed only by the time 338 needed to compress and decompress the even frames, and the lower half only by the time 349, which adds one 120 Hz blanking-period equivalent to the upper half's delay 338. This method is therefore more advantageous than the single-scan display method (FIG. 4(c)).
 With the dual-scan display method of FIG. 4(e) (opposite direction, starting from the center of the screen), the top edge of the upper half has a delay time 358 of about two frames. The lower half has a delay 369 of nearly zero frames, slightly earlier than in FIG. 4(d).
 With the dual-scan display method of FIG. 4(f) (opposite direction, starting from the top and bottom edges), the delay 378 is about two frames at the top edge of the screen, nearly zero at the bottom edge, and about two frames at the center.
 FIG. 5 is a table summarizing the delay times for each of the scanning methods explained in FIGS. 3 and 4. For each combination of image-sensor and display-element scanning methods, it gives the approximate delay from imaging timing to display timing in frame units (period T). The upper, middle, and lower figures in each cell give the delays at the top edge, center, and bottom edge of the screen, respectively. Where two middle figures are given, the first is the delay at the bottom of the upper half and the second the delay at the top of the lower half.
 For comparison, this table adds scanning methods not described so far. Display scanning method (g) is same-direction dual scanning similar to (d), but scans the upper and lower halves with the same frame of the video signal. Its delay is the same as that of (d) on the lower half, but one frame larger than (d) on the upper half.
 The table further subdivides the image-sensor scanning methods by shutter mode. What has been described so far is the "line shutter" mode, in which the shutter timing shifts across the screen in step with the scanning. The added "frame shutter" mode instead exposes the entire screen at the same instant and then reads it out by sequential scanning. The delay in frame shutter mode is larger than in line shutter mode. Assuming the shutter fires immediately before scanning starts, the delay increases over line-shutter single scanning by, for example, zero frames at the top edge, 0.5 frame at the center, and one frame at the bottom edge of the screen. In frame shutter mode, there is no difference in delay between single-scan and dual-scan image sensors.
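Purely as an illustrative aid (not part of the claimed embodiment), the extra delay introduced by the frame shutter can be modeled as a function of vertical position: with a global exposure fired just before readout, a line at relative position p waits p frame periods (for single scanning) before it is read out.

```python
# Illustrative sketch: extra imaging-to-display delay (in frame periods)
# introduced by switching from line-shutter to frame-shutter mode,
# assuming single scanning and a shutter fired immediately before
# readout begins. With a line shutter, each line is exposed just before
# it is read; with a frame shutter, the whole screen is exposed at one
# instant, so later lines wait longer for readout.
def frame_shutter_extra_delay(position: float) -> float:
    """Extra delay at vertical position (0.0 = top edge, 1.0 = bottom edge)."""
    if not 0.0 <= position <= 1.0:
        raise ValueError("position must be in [0, 1]")
    return position
```

This reproduces the figures quoted above: 0 frames at the top edge, 0.5 frame at the center, and 1 frame at the bottom edge.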
 From the results in FIG. 5, the following combinations are desirable for reducing the delay time.
  First, the desirable display-element scanning method for each image-sensor scanning method is as follows.
(A1) When the image sensor is in line shutter mode, selecting the dual-scan method for the display element ((d): same-direction scanning of the upper and lower halves) minimizes the display delay. If instead the goal is to minimize the variation of the delay across the screen, the display element should be switched to the single-scan method whenever the image sensor uses the single-scan method.
(A2) When the image sensor is in frame shutter mode, selecting the dual-scan method ((e): starting from the center of the screen) or the single-scan method for the display element reduces the variation of the delay across the screen.
 For this control, the display device can extract the metadata attached to the video signal from the imaging apparatus, for example from an InfoFrame, to detect the scanning method of the image sensor, and then select the scanning method of the display element accordingly.
 Next, the desirable image-sensor scanning method for each display-element scanning method is as follows.
(B1) When the display element uses the single-scan method, selecting the single-scan method (line shutter mode) for the image sensor minimizes the variation of the delay across the display screen.
(B2) When the display element uses the dual-scan method, selecting the dual-scan method (line shutter mode) for the image sensor minimizes the display delay.
(B3) When the display element uses the dual-scan method, selecting the dual-scan method (line shutter mode) for the image sensor also minimizes the variation of the delay across the display screen.
 For this control, the imaging apparatus can read the display capability information of the display device, for example from EDID, to detect the scanning method of the display element, and then select the scanning method of the image sensor accordingly.
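Purely as an illustrative aid (not part of the claimed embodiment), the selection rules (A1), (A2), (B1), (B2), and (B3) can be condensed into a small decision function; the dictionary field names and method labels below are assumptions made for this sketch, not an actual InfoFrame or EDID layout:

```python
# Illustrative decision logic for rules (A1), (A2), (B1)-(B3).
# `sensor` stands in for metadata parsed from an InfoFrame; its field
# names ("shutter", "scan") are invented for this sketch.

def choose_display_scan(sensor: dict, minimize: str = "delay") -> str:
    """Pick the display scanning method from the sensor's scanning info."""
    if sensor["shutter"] == "line":
        if minimize == "variation" and sensor["scan"] == "single":
            return "single"            # (A1): minimize in-screen variation
        return "dual_same_direction"   # (A1): minimize absolute delay
    # Frame shutter: (A2) reduce in-screen delay variation;
    # "single" would be an equally valid choice here.
    return "dual_center_out"

def choose_sensor_scan(display_scan: str) -> str:
    """Pick the image-sensor scanning method from the display's method."""
    if display_scan == "single":
        return "single_line_shutter"   # (B1)
    return "dual_line_shutter"         # (B2)/(B3)
```

In a real system, `sensor` would be populated from the metadata the imaging apparatus attaches to the video data, and `display_scan` from the capability information read over EDID.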
 To further suppress variation in the delay, the scanning processing unit 25 may synthesize video at intermediate timings from the preceding and following video signals and their motion vectors, and apply an inverse correction to the within-screen delays obtained from FIG. 5. Also, when the display element uses a backlight and the image sensor is in frame shutter mode, the display timing across the screen can be equalized by lighting the entire backlight at the same instant after scanning completes, matched to the aperture time of the frame shutter.
 As described above, the display device conveys its display scanning information (single/dual scan, scanning direction, backlight lighting timing) to the imaging apparatus, or conversely the imaging apparatus attaches its imaging scanning information (single/dual scan, scanning direction, line/frame shutter, aperture time) to the video data and conveys it to the display device, so that each side can select the optimal scanning method. This realizes an imaging and display system with a short delay from imaging timing to display timing, or with little variation in that delay.
 Embodiment 2 describes an imaging and display system intended for a broadcasting system.
  FIG. 6 is a diagram showing the overall configuration of the imaging and display system according to Embodiment 2. The system consists of an imaging device (camera) 3, an encoder 4, a server 5, a broadcasting station 6, a set-top box (STB) 7, and a display device 2. In this system, the imaging device 3 corresponds to the image sensor 11 of Embodiment 1 (FIG. 1), and the encoder 4 and the STB 7 perform compression and decompression of the video signal according to the broadcast standard. The configuration and basic processing of each device are described below.
 The encoder 4 compresses and encodes the video signal captured by the imaging device 3, for example at a frame rate of 120 Hz, according to a standard such as H.264 or H.265, and stores it on the server 5. An editing station (not shown) edits the stored video for broadcast using video editing equipment and stores it on the server 5 again. The broadcasting station 6 transmits the broadcast video signal stored on the server 5 to the STB 7 in the home over broadcast radio waves, a cable television line, an Internet connection, or the like. Of course, the video signal from the imaging device 3 and the encoder 4 may also be transmitted to the STB 7 directly as a live program. The STB 7 extracts the video signal the user wishes to view from the multiple received video signals and transmits it to the display device 2, which displays the received video.
 Fast-moving video such as sports programs is best transmitted as a high-frame-rate video signal, for example 120 Hz, and displayed at that high frame rate as well. However, because there are also widespread STBs and display devices that do not support high frame rates, the broadcast signal should ideally be usable by both 60 Hz and 120 Hz display devices. A transmission method that carries a 60 Hz video signal together with interpolation data for raising the frame rate is effective for achieving this.
 As a concrete transmission method, the MVC (Multiview Video Coding) format defined in H.264 and H.265 can be used, assigning the 60 Hz video signal to the base view and the 120 Hz interpolation data (equivalent to a 60 Hz video signal) to the non-base view. STBs 7 and display devices 2 that support only 60 Hz display the base view alone; those that support 120 Hz decode and display the non-base view as well. A single broadcast signal can thus serve both 60 Hz and 120 Hz display methods. Using the MVC format for the frame-rate increase restores the 120 Hz video signal more faithfully than interpolation based on motion estimation results (motion vectors) alone.
 When I-frames (intra-coded frames), P-frames (predicted frames), and B-frames (bi-directionally predicted frames) are assigned per GOP (Group of Pictures), the base view (60 Hz video signal) is preferably IBPB and the non-base view (120 Hz interpolation data) BBBB. Reducing the B-frames in the base view shortens the delay caused by compression and decompression, while making the non-base view entirely B-frames has the effect of reducing the amount of data.
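Purely as an illustrative aid (not part of the claimed embodiment), the per-view picture-type assignment above can be sketched as follows; real encoders assign picture types according to their own GOP structure:

```python
# Illustrative picture-type patterns per GOP for the two MVC views:
# the base view (60 Hz video) uses IBPB to limit B-frame latency,
# while the non-base view (120 Hz interpolation data) is all
# B-frames to minimize data volume.
def gop_pattern(view: str, gop_len: int = 4) -> list:
    if view == "base":
        cycle = ["I", "B", "P", "B"]
    elif view == "non_base":
        cycle = ["B", "B", "B", "B"]
    else:
        raise ValueError("view must be 'base' or 'non_base'")
    return [cycle[i % 4] for i in range(gop_len)]
```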
 Next, the formats of the video signals exchanged between the devices shown in FIG. 6 are described.
  The imaging device 3 outputs a 120 Hz baseband digital video signal to the encoder 4 over SDI (Serial Digital Interface). The encoder 4 generates, in the MVC format, a base view carrying the 60 Hz video signal and a non-base view carrying the 120 Hz interpolation data, and outputs them to the server 5 over a packet network, for example Ethernet (registered trademark).
 The server 5 outputs the MVC-format video signal to the broadcasting station 6 over Ethernet. Handling the server 5's input and output over Ethernet makes it easy for the broadcasting station 6 to receive a live program output by the encoder 4 passed straight through the server 5. The broadcasting station 6 transmits the MVC-format video signal to the STB 7 over broadcast radio waves, a cable television network, the Internet, or the like.
 When the STB 7 is connected to the display device 2 via HDMI, it can detect the display capability of the display device 2 through mechanisms such as EDID (Extended Display Identification Data) and CEC (Consumer Electronics Control). If the display device 2 supports only 60 Hz uncompressed video, the STB 7 decodes only the base view and transmits the resulting 60 Hz uncompressed video signal, for example over HDMI. If the display device 2 supports 120 Hz uncompressed video, the STB 7 decodes both the base view and the non-base view and transmits a 120 Hz uncompressed video signal.
 If the display device 2 supports 60 Hz uncompressed video with 120 Hz interpolation data, the STB 7 decodes the base view into a 60 Hz uncompressed video signal and attaches the non-base-view 120 Hz interpolation data to it as metadata in an InfoFrame for transmission. Using this interpolation data, the display device 2 can raise the frame rate to the 120 Hz at which the imaging device 3 captured the video, or even beyond it to 240 Hz or the like, for display with enhanced motion performance.
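Purely as an illustrative aid (not part of the claimed embodiment), the STB's choice of output format from the detected display capability can be sketched as follows; the capability strings and returned descriptions are placeholders, not an actual HDMI/EDID API:

```python
# Illustrative output-format selection in the STB 7, driven by the
# display capability read over EDID/CEC. The capability strings and
# returned descriptions are invented for this sketch.
def select_output(display_caps: set) -> str:
    if "60hz_with_120hz_interp" in display_caps:
        # Decode base view only; attach the non-base-view interpolation
        # data as InfoFrame metadata so the display can up-convert.
        return "60Hz uncompressed + 120Hz interpolation data in InfoFrame"
    if "120hz_uncompressed" in display_caps:
        # Decode both views into a full 120 Hz uncompressed signal.
        return "120Hz uncompressed"
    # Fallback: base view only.
    return "60Hz uncompressed"
```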
 Even when HDCP (High-bandwidth Digital Content Protection) is applied to the HDMI link for content protection, InfoFrames are not encrypted, so the interpolation data may itself be encrypted before being carried in the InfoFrame. The decryption key may be derived from the key obtained in HDCP authentication, or, for simplicity, part of the uncompressed video data recovered from HDCP may be used as the key.
 Next, the encoder 4 and the STB 7 are described in detail.
  FIG. 7 is a diagram showing the internal configuration of the encoder 4. The encoder 4 has an input unit 41, memories 42 and 46, compression units 43 and 47, a packet processing unit 44, an output unit 45, and a control unit 48.
 The input unit 41 receives the uncompressed video signal at a frame rate of 120 Hz output from the imaging device 3 of FIG. 6, for example over SDI. Odd frames are stored in the memory 42 and even frames in the memory 46. The compression unit 43 compresses the odd-frame video signal stored in the memory 42.
 Meanwhile, the compression unit 47 compresses the even-frame video signal stored in the memory 46 by referring to the odd-frame video and its compressed form. Referring to the odd-frame video allows the even frames to be compressed more efficiently than the odd frames. As explained in Embodiment 1, this produces the 120 Hz interpolation data: difference information, mixing ratios of the preceding and following frames, motion vectors, and the like.
 The packet processing unit 44 packetizes the odd frames compressed by the compression unit 43 and the even frames compressed still more efficiently by the compression unit 47, and the output unit 45 sends the packetized video data to the server 5 of FIG. 6, for example over a LAN (Local Area Network). The control unit 48 controls each unit in the encoder 4 according to the compression method specified by the server 5 or an operator, and attaches the imaging device information (single/dual scan, scanning direction, line/frame shutter, aperture time) and compression method information to the video data as metadata. The control unit 48 also conveys to the imaging device 3 the display scanning information of the display device 2 (single/dual scan, scanning direction, backlight lighting timing) specified by the server 5 or the operator.
 In this way, the encoder 4 generates optimal compressed video data based on the imaging scanning information of the imaging device 3 and the display scanning information of the display device 2, attaches the imaging scanning information to it, and transmits the result toward the display device 2. As explained in Embodiment 1, this realizes a broadcasting system with a short delay from imaging timing to display timing, or with little variation in that delay.
FIG. 8 is a diagram showing the internal configuration of the STB 7. The STB 7 includes an input unit 71, a packet processing unit 72, a separation unit 73, a decompression unit 74, a superimposing unit 75, an 8B/10B encoder 76, an output unit 77, a memory 78, an error correction encoding unit 79, and a control unit 80.
The input unit 71 receives an MVC-format compressed video signal (a 60 Hz video signal and interpolation data for 120 Hz conversion) from the broadcasting station 6 in FIG. 6 via broadcast radio waves, a network, or the like, and the packet processing unit 72 depacketizes the received packet-format video signal. The separation unit 73 separates it into the 60 Hz video signal (odd frames) and the interpolation data for 120 Hz conversion (even frames), and sends them to the decompression unit 74 and the memory 78, respectively. The decompression unit 74 decodes the odd-frame video signal to obtain a 60 Hz uncompressed video signal.
Meanwhile, the error correction encoding unit 79 applies error correction coding to the even-frame interpolation data for 120 Hz conversion stored in the memory 78. The superimposing unit 75 superimposes the even-frame interpolation data for 120 Hz conversion onto the blanking period of the odd-frame 60 Hz uncompressed video signal. The 8B/10B encoder 76 serializes the 60 Hz uncompressed video signal with the 120 Hz interpolation data attached (converting 8-bit data into 10-bit data), and the output unit 77 outputs it to the display device 2 as, for example, an HDMI-format video signal.
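The blanking-period superposition performed by the superimposing unit 75 can be pictured with the following sketch, which is illustrative only: the line counts, the list-of-lines frame model, and the function names are assumptions, not the patent's actual signal format.

```python
ACTIVE_LINES = 4   # lines carrying the 60 Hz picture (toy value)
TOTAL_LINES = 6    # active + blanking lines per transmitted frame (toy value)

def superimpose(active_lines, interp_chunks):
    """Pack picture lines, then interpolation-data chunks in the blanking lines."""
    assert len(active_lines) == ACTIVE_LINES
    assert len(interp_chunks) <= TOTAL_LINES - ACTIVE_LINES
    pad = [None] * (TOTAL_LINES - ACTIVE_LINES - len(interp_chunks))
    return active_lines + interp_chunks + pad

def split(frame):
    """Receiver side: separate picture lines from the blanking payload."""
    active = frame[:ACTIVE_LINES]
    interp = [c for c in frame[ACTIVE_LINES:] if c is not None]
    return active, interp
```

The unused blanking capacity (the `None` padding here) corresponds to the band that, as noted below, may also need to carry simultaneously transmitted audio data.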
The control unit 80 controls each unit, reads the display scanning information of the display device 2 (single/dual scan, scanning direction, backlight lighting timing), and generates an optimal uncompressed video signal with interpolation data. It also attaches the imaging device information (single/dual scan, scanning direction, line/frame shutter, aperture time) carried by the received video signal to the 60 Hz uncompressed video signal as metadata. The control unit 80 may further have a function of extracting or converting, from the interpolation data for 120 Hz conversion, data usable by the display device 2 and transmitting it to the display device, and a function of securing a transmission band for audio data that must be transmitted simultaneously during the blanking period.
As described above, the STB 7 generates an optimal uncompressed video signal based on the imaging scanning information of the imaging device 3 and the display scanning information of the display device 2, attaches the high-frame-rate interpolation data and the imaging scanning information to it, and outputs the result to the display device 2. As a result, as described in the first embodiment, a broadcasting system can be realized in which the delay time from imaging to display is short, or in which the variation in delay time is small.
In the third embodiment, a case of 3D video transmission by the frame packing method will be described. The frame packing method transmits the 3D left-eye video (L video) and right-eye video (R video) alternately. When applied to the 120 Hz video transmission method of the first embodiment, it is conceivable, for example, to assign the odd frames to the L video and the even frames to the R video. In this case, if the R video is always assigned to the even frames, the even frames are reconstructed from the odd frames and the interpolation data as described above, so the image quality of the R video may always be lower than that of the L video assigned to the odd frames. Moreover, since the uncompressed video signal in 60 Hz transmission would consist only of L video, a display device without an interpolation-data processing function could perform only 2D display.
In the present embodiment, a frame packing method is described in which the uncompressed video transmission at 60 Hz also contains both L video and R video alternately. With this embodiment, since uncompressed video signal frames are also assigned to the R video, a display device capable of processing the interpolation data has the advantage that the image quality of the L video and R video becomes uniform. In addition, even a display device without an interpolation-data processing function can display 3D video.
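The frame assignment of this embodiment can be sketched as follows (illustrative only; the function name and the grouping implementation are assumptions). Of every four frames of the 120 Hz L,R,L,R,... sequence, the first two go into the 60 Hz uncompressed base stream, which therefore still alternates L and R, and the next two become interpolation data, matching the assignment described below for FIG. 9(b).

```python
def pack_3d(frames_120hz):
    """Split a 120 Hz L,R,L,R,... sequence into a 60 Hz base stream that still
    alternates L and R, plus interpolation-data frames."""
    base, interp = [], []
    for idx, frame in enumerate(frames_120hz):
        (base if idx % 4 in (0, 1) else interp).append(frame)
    return base, interp
```

Because the base stream contains both eyes, a receiver that ignores the interpolation data still obtains a frame-packed 3D signal, which is the advantage claimed by this embodiment.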
The imaging/display system has the same configuration as in the first embodiment (FIG. 1). Time-axis conversion in transmission and display is realized by the memories 12, 16, 24, and 30 and the scanning processing unit 25.
FIG. 9 is a diagram illustrating an example of the imaging, transmission, and display timing of a 3D video signal. Here, the imaging scanning method is assumed to be a single scan at a frame rate of 120 Hz. Thick arrows (e.g., reference numeral 401) indicate the scanning timing of the L video (corresponding to odd frames), and thin arrows (e.g., reference numeral 402) indicate the scanning timing of the R video (corresponding to even frames). Furthermore, solid arrows (e.g., reference numeral 411) indicate uncompressed video signals, and dashed arrows (e.g., reference numeral 413) indicate interpolation data.
(a) shows the imaging timing: the image sensor 11 performs a single scan at a frame rate of 120 Hz, outputting the L videos 401, 403, 405 and the R videos 402, 404, 406 alternately.
(b) shows the transmission timing: the L videos 401, 405 and the R video 402 output from the image sensor 11 are scanned alternately and converted into the 60 Hz single-scan signals 411, 412, 415 (time width 2T). The L video 403 and R video 404 sandwiched between them are converted into the interpolation data 413, 414 for 120 Hz conversion, superimposed on the 60 Hz single-scan signals 411, 412, 415, and output.
(c), (g), (e), and (f) show the display timing for the various display scanning methods. (c) is the case of a 120 Hz single scan, in which the display element 26 scans the L videos 421, 423 and the R videos 422, 424 alternately. By opening and closing shutter glasses in synchronization with the scanning of the L video and R video, the viewer's left eye sees the L video and the right eye sees the R video, so that 3D video is reproduced.
(g), (e), and (f) are cases of a 120 Hz dual scan: (g) is same-direction scanning, (e) is reverse-direction scanning (starting at the center), and (f) is reverse-direction scanning (starting at the top and bottom edges). In (g), the same frame video is displayed on the upper screen and the lower screen. For example, in (g) the upper half of the display element 26 scans the L videos 431, 433 and the R videos 432, 434 alternately, and the lower half scans the corresponding L videos 441, 443 and R videos 442, 444 of the same frames alternately. The same applies to the other scanning methods (e) and (f). If shutter glasses are used in synchronization with this, 3D video can be reproduced as in (c).
The transmission and display timing will be described in detail below, and the delay times will be described in comparison with the first embodiment (FIG. 3).
At the transmission timing in FIG. 9(b), the L videos 401, 403, 405 and the R videos 402, 404, 406 captured by the image sensor 11 in a 120 Hz single scan are converted to 60 Hz and transmitted with L video and R video arranged alternately. This has the advantage that frame-packed 3D video display can be realized even on a 3D display device that does not process interpolation data. For this purpose, the captured L video 401 is transmitted immediately as the L video 411, and transmission of the R video 412 corresponding to the R video 402 is started with a delay of one frame (1T). The next L video 403 and R video 404 are each compressed and transmitted as the L video interpolation data 413 and the R video interpolation data 414. The start of transmission of the L video interpolation data 413 lags the start of imaging of the L video 403 by the compression and encoding processing time 418. If transmission of the R video interpolation data 414 starts immediately after transmission of the L video interpolation data 413 ends, the transmission periods of the two sets of interpolation data do not overlap, and the transmission band can be used effectively. Alternatively, the start of transmission of the R video interpolation data 414 may be delayed by one 60 Hz frame (2T) from the start of transmission of the L video interpolation data 413.
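The transmission schedule just described can be sketched numerically (illustrative only; `t0`, the encoding delay, and the 1T duration assumed for each interpolation-data item are hypothetical parameters, with times in units of T, one 120 Hz frame period).

```python
def schedule(t0=0.0, enc_delay=0.5):
    """Return {name: (start, end)} for the items transmitted in FIG. 9(b)-style
    order: uncompressed frames stretched to 2T, interpolation data back-to-back."""
    l411 = (t0, t0 + 2.0)                # L video 411, stretched to 2T
    r412 = (t0 + 1.0, t0 + 3.0)          # R video 412, start delayed by 1T
    l413_start = t0 + 3.0 + enc_delay    # after L403 capture plus encoding time
    l413 = (l413_start, l413_start + 1.0)
    r414 = (l413[1], l413[1] + 1.0)      # 414 starts right as 413 ends
    return {"411": l411, "412": r412, "413": l413, "414": r414}
```

The back-to-back placement of 413 and 414 is the non-overlap property that lets the interpolation data share the band efficiently.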
At the display timing in FIG. 9(c), the display element 26 displays the L video and R video alternately in a single scan. The start of display of the L video 421 lags the start of imaging of the L video 401 by approximately two frames (2T) (delays 428, 429). This is because the end of display of the next R video 422 is aligned with the end of transmission of the R video 412, so that the display interval between the L video 421 and the R video 422 becomes 1T. As a result, the display is delayed by about one frame (1T) overall compared with the first embodiment (FIG. 3(c)).
FIG. 9(g) shows the case where the L video and R video are displayed alternately by the same-direction dual scan method; in this case, the shutter glasses must be kept in synchronization. That is, if scanning spans two frames as in FIG. 3(d), L video and R video are displayed on the same screen and become difficult to separate with shutter glasses. Therefore, in the same-direction dual scan, the same frame is scanned on both the upper and lower screens. The delay time is almost the same as in FIG. 9(c). However, since the scanning timing at the center of the screen differs between the upper and lower screens, a moving object displayed at the center of the screen may appear distorted, as described with reference to FIG. 2(b).
FIG. 9(e) shows a dual scan method in which scanning proceeds from the center toward the top and bottom edges. The lower screen is scanned in the same way as in FIG. 9(g), but the upper screen is scanned in the direction opposite to FIG. 9(g). As a result, the delay from the imaging timing is about 3 frames at the top edge of the screen (delay 458), about 2 frames at the bottom edge (delay 469), and about 1.5 frames at the center. Since the scanning timing at the center of the screen is the same for both halves, no distortion of moving objects occurs.
FIG. 9(f) shows a dual scan method in which scanning proceeds from the top and bottom edges toward the center. Since scanning starts at the top and bottom edges of the screen, the video data for the top and bottom edges is required at the start of scanning. In particular, display of the R video 482 on the lower screen cannot start until after the end of transmission 417 of the R video 412, and the other displays are aligned accordingly. As a result, the delay 478 at the top edge of the screen is about 3 frames, the delay at the center is about 3.5 frames, and the delay 489 at the bottom edge is about 2 frames.
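A minimal timing model, assuming a normalized per-half scan of duration `dur` and the names shown (none of which appear in the disclosure), illustrates why the center-start method (e) keeps the two screen halves continuous at the center while the same-direction method (g) produces the time jump that distorts moving objects there.

```python
def line_times(method, lines=8, t_upper=0.0, t_lower=0.0, dur=1.0):
    """Display times for `lines` screen lines, top to bottom. Each half scans
    lines//2 lines over `dur`, starting at t_upper / t_lower."""
    half = lines // 2
    step = dur / half
    if method == "g":    # same direction: both halves scan top-to-bottom
        upper = [t_upper + i * step for i in range(half)]
        lower = [t_lower + i * step for i in range(half)]
    elif method == "e":  # center start: upper half scans bottom-to-top
        upper = [t_upper + (half - 1 - i) * step for i in range(half)]
        lower = [t_lower + i * step for i in range(half)]
    elif method == "f":  # edge start: lower half scans bottom-to-top
        upper = [t_upper + i * step for i in range(half)]
        lower = [t_lower + (half - 1 - i) * step for i in range(half)]
    else:
        raise ValueError(method)
    return upper + lower
```

With equal start times for both halves, method (e) gives identical times for the two lines adjacent to the center, which is the "no distortion at the center" property stated above.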
FIG. 10 is a diagram illustrating another example of the imaging, transmission, and display timing of a 3D video signal. Here, the imaging scanning method is assumed to be a dual scan at a frame rate of 120 Hz.
(a) shows the imaging timing: the image sensor 11 performs a dual scan at a frame rate of 120 Hz, outputting the L videos 501, 503, 505 and the R videos 502, 504, 506 alternately (time width 2T). In this case, two image sensors, one for L video and one for R video, may be arranged and their outputs staggered by a phase difference of one 120 Hz frame.
(b) shows the transmission timing: the L videos 501, 505 and the R video 502 output from the image sensor 11 are scanned alternately into the 60 Hz single-scan signals 511, 512, 515. The L video 503 and R video 504 sandwiched between them are converted into the interpolation data 513, 514 for 120 Hz conversion, superimposed on the 60 Hz single-scan signals 511, 512, 515, and output. As a result, the transmission timing is the same as in FIG. 9(b).
(c), (g), (e), and (f) show the display timing for the various display scanning methods: (c) is the case of a 120 Hz single scan, and (g), (e), and (f) are cases of a 120 Hz dual scan. These display timings are the same as in FIG. 9(c), (g), (e), and (f), so their description is omitted.
FIG. 11 is a table summarizing the delay times for each of the 3D display scanning methods described in FIGS. 9 and 10. Comparing this with the display of the first embodiment (2D display), the following can be said: in 3D display (FIG. 11) the delay times are larger than in 2D display (FIG. 5), and among the 3D methods both the magnitude and the variation of the delay time are smallest for the dual scan method (e), in which scanning proceeds from the center of the screen toward the top and bottom edges. Note that in shutter-glasses 3D display it is difficult to adopt the dual scan display method (d), which scans in the same direction over two frames.
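Using only the approximate delay figures quoted in the description of FIG. 9 (the dict layout and function name are assumptions; FIG. 11 itself is not reproduced here), the comparison favoring the center-start method (e) can be checked numerically:

```python
# Approximate delays in frames, as stated for FIG. 9(e) and 9(f).
DELAY_3D = {
    "e": {"top": 3.0, "center": 1.5, "bottom": 2.0},  # center-start dual scan
    "f": {"top": 3.0, "center": 3.5, "bottom": 2.0},  # edge-start dual scan
}

def mean_delay(method):
    """Average of the quoted top/center/bottom delays for one scan method."""
    d = DELAY_3D[method]
    return sum(d.values()) / len(d)
```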
In a shutter-glasses 3D display device, the backlight may be blinked in step with the opening and closing of the shutter glasses in order to secure the switching time of the glasses. When a display element that blinks the entire screen simultaneously is used, combining it with a frame-shutter image sensor eliminates the in-screen distribution of delay times and makes uniform 3D display easy.
As described above, the 3D display device conveys its display scanning information (single/dual scan, scanning direction, backlight lighting timing) to the 3D imaging device, or conversely the 3D imaging device attaches its imaging scanning information (single/dual scan, scanning direction, line/frame shutter, aperture time) to the video data and conveys it to the 3D display device, so that each scanning method is selected optimally. This makes it possible to realize an imaging/display system in which the delay time from imaging to display is short, or in which the variation in delay time is small.
The fourth embodiment describes a configuration in which image quality enhancement processing is added to the display device and the scanning speed is doubled while the frame rate of the display element remains 120 Hz.
FIG. 12 is a diagram showing a configuration example of the display device according to the fourth embodiment. The display device 8 includes an input unit 81, an image quality enhancement processing unit 82, an interpolation data restoration unit 83, a simplified image quality enhancement processing unit 84, a scanning processing unit 85, and a display element 86. The control unit that conveys the display scanning information of the display element 86 to the imaging device side, detects the imaging scanning information, and controls the entire display device 8 is omitted from the figure.
The operation of the display device 8 will now be described. The input unit 81 receives a 60 Hz uncompressed video signal and interpolation data for 120 Hz conversion. The input 60 Hz uncompressed video signal is sent to the image quality enhancement processing unit 82, which analyzes the video and improves brightness, hue, perceived resolution, and the like. Meanwhile, the interpolation data restoration unit 83 restores a 120 Hz interpolated video signal from the input interpolation data for 120 Hz conversion and the 60 Hz uncompressed video signal. Since this restored 120 Hz interpolated video signal does not include the input 60 Hz uncompressed video signal portion, it is a video signal equivalent to 60 Hz. That is, combining the 60 Hz video signal input to the input unit 81 with the restored 120 Hz interpolated video signal as-is yields a 120 Hz dual-scan video signal.
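The combination step at the end of this paragraph reduces to a simple interleave, sketched below (illustrative only; the function name and list-based frame model are assumptions). Because the restored interpolated frames do not contain the base frames, alternating the two streams directly yields the 120 Hz sequence.

```python
def to_120hz(base_60hz, restored_interp):
    """Interleave base frames with restored interpolated frames: b0 i0 b1 i1 ..."""
    assert len(base_60hz) == len(restored_interp)
    out = []
    for b, i in zip(base_60hz, restored_interp):
        out.extend([b, i])
    return out
```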
The 120 Hz interpolated video signal output from the interpolation data restoration unit 83 undergoes image quality enhancement in the simplified image quality enhancement processing unit 84. Since the 120 Hz interpolated video signal is highly correlated with the 60 Hz video signal, there is the advantage that this enhancement can be realized simply by reusing the results of the video analysis performed by the image quality enhancement processing unit 82. The 60 Hz video signal output from the image quality enhancement processing unit 82 and the 120 Hz interpolated video signal output from the simplified image quality enhancement processing unit 84 are input to the scanning processing unit 85, which generates a video signal matched to the imaging scanning information and the scanning method of the display element, and the display element 86 displays the quality-enhanced video.
When the display element 86 is a liquid crystal display element with a blinking backlight, the moving image response can be improved by shortening the scanning time, for example to 1/2, and providing a period after the end of scanning during which the backlight is lit. The video signal with this shortened scanning time can be generated by the time-axis compression processing (doubling of the scanning speed) of the scanning processing unit 85. The backlight lighting time corresponds to the aperture time of the image sensor, and should preferably be set to an equivalent time in order to preserve the photographer's intent. Conversely, a photographer who knows the frame rate and backlight lighting time of the display element can shoot while estimating how blurred the display will look, making it easy to reflect the photographer's intent. However, in a video transmission form in which the display element method cannot be identified, such as video broadcasting, it is better to convey the imaging scanning information to the display device side and let the display device select the display scanning method and backlight lighting time in keeping with the photographer's intent.
FIG. 13 is a diagram showing the display timing when the scanning time of the display element is shortened. This will be described in comparison with the first embodiment (FIG. 3).
The imaging timing in (a) (120 Hz single scan) and the transmission timing in (b) are the same as in FIG. 3. (c), (g), (e), and (f) show the display timing for the various display scanning methods.
(c) is the case of single-scan display. Scanning starts later than the scanning start of each frame in FIG. 3(c) by half of the effective display period, that is, by a period 829 of about 0.5 frame (0.5T), and the scanning end timing is aligned with that of FIG. 3(c). The scanning time is thereby shortened to half (0.5T). In this way, a long period 829 from the end of one scan to the start of the next is secured, during which the backlight should be lit. In shutter-glasses 3D display, performing the open/close switching operation while the backlight is off achieves good separation of the L video and R video and thus good stereoscopic display.
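The shortened-scan timing of FIG. 13(c) can be sketched as follows, with times in units of T, one 120 Hz frame period (the function name, tuple layout, and the assumption that the backlight stays lit right up to the next scan are illustrative choices, not taken from the disclosure).

```python
def frame_timing(n, scan_delay=0.5, scan_dur=0.5):
    """Scan window and backlight window for frame n (frame period 1T):
    scanning starts 0.5T late, lasts 0.5T, and the backlight fills the pause."""
    scan_start = n + scan_delay
    scan_end = scan_start + scan_dur
    backlight = (scan_end, n + 1 + scan_delay)  # lit until the next scan starts
    return (scan_start, scan_end), backlight
```

The backlight window never overlaps a scan, which is the pause that also gives shutter glasses time to switch.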
(g) is the case of dual-scan display (same-direction scanning), in which the upper and lower screens are scanned in the latter half of the scanning period. In this case, if the upper and lower screens carry video signals of different frames as in FIG. 3(d), a time offset of about 0.5 frame arises above and below the center of the screen. It is therefore preferable to use the video signals of the same frames 831, 841 for the upper and lower screens, keeping the delay time variation within the screen to within 0.5 frame.
(e) is the case of dual-scan display (reverse-direction scanning, starting at the center), in which the upper and lower screens are scanned in the latter half of the scanning period of FIG. 3(e). This is because the end of display of the double-speed second frame 862 must come after the end of transmission 217 of the corresponding interpolation data 212.
(f) is the case of dual-scan display (reverse-direction scanning, starting at the top and bottom edges), in which the upper and lower screens are scanned in the first half of the scanning period of FIG. 5(f). This too is because the scanning start of the double-speed second frame 882 must come after the end of transmission 217 of the corresponding interpolation data 212. Compared with FIG. 3(f), the delay time at the center of the screen becomes shorter.
In the present embodiment, even when the L video is fixed to the 60 Hz video signal and the R video to the interpolation data for 120 Hz conversion in 3D video transmission, shutter-glasses 3D display is possible because there is a period during which scanning pauses. Of course, in the 3D video transmission in which L video and R video alternate, described in the third embodiment (FIGS. 9 and 10), the shortened scanning time likewise has the effect of securing the switching period of the shutter glasses.
The fifth embodiment describes a configuration in which the frame rate of the display element is doubled to 240 Hz.
FIG. 14 is a diagram showing a configuration example of the display device according to the fifth embodiment. The display device 9 includes an input unit 91, an image quality enhancement processing unit 92, an interpolation data quality enhancement unit 93, a frame rate conversion unit 94, a scanning processing unit 95, and a display element 96.
The input unit 91 receives a 60 Hz uncompressed video signal and interpolation data for 120 Hz conversion. The input 60 Hz uncompressed video signal is sent to the image quality enhancement processing unit 92, which analyzes the video and improves brightness, hue, perceived resolution, and the like. Meanwhile, the interpolation data quality enhancement unit 93 reflects the video analysis results of the image quality enhancement processing unit 92 in the input interpolation data for 120 Hz conversion. For example, when the interpolation data is difference information and luminance or color correction has been applied to the area to which that difference information corresponds, the correction amount is reflected in the difference information. The frame rate conversion unit 94 converts the 60 Hz video signal from the image quality enhancement processing unit 92 into a quadrupled 240 Hz video signal while referring to the interpolation data for 120 Hz conversion.
The quality-enhanced 240 Hz video signal output from the frame rate conversion unit 94 is sent to the scanning processing unit 95. The scanning processing unit 95 generates a video signal matched to the imaging scanning information and the scanning method of the display element, and the display element 96 performs video display at the high frame rate.
Raising the frame rate of the display element 96 to 240 Hz improves the moving image characteristics of hold-type display elements such as liquid crystals, which keep displaying from one scan to the next. However, applying processing-intensive image quality enhancement after frame rate conversion increases the circuit scale and power consumption. To avoid this, it is desirable to apply the image quality enhancement to the 60 Hz video signal, enhance the interpolation data for 120 Hz conversion, and only then perform the frame rate conversion.
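One plausible realization of the conversion order recommended above can be sketched as follows (illustrative only; the additive difference-information model, the 0.5 blend for the extra 240 Hz frames, and all names are assumptions, not the patent's exact method). The interpolation data first restores the 120 Hz sequence from the already-enhanced 60 Hz frames; neighbouring 120 Hz frames are then blended to reach roughly 240 Hz.

```python
def restore_120hz(base_60hz, interp_diffs):
    """Interpolated frame = base frame + stored difference information.
    Frames are flat lists of pixel values."""
    seq = []
    for b, d in zip(base_60hz, interp_diffs):
        seq.append(b)
        seq.append([x + y for x, y in zip(b, d)])
    return seq

def to_240hz(seq_120hz, mix=0.5):
    """Insert a blended frame between each pair of 120 Hz frames
    (yields 2n-1 frames for n input frames; a real converter would also
    extrapolate the final frame to keep the rate exact)."""
    out = []
    for cur, nxt in zip(seq_120hz, seq_120hz[1:]):
        out.append(cur)
        out.append([mix * a + (1 - mix) * b for a, b in zip(cur, nxt)])
    out.append(seq_120hz[-1])
    return out
```

Doing the heavy enhancement once at 60 Hz and only cheap restoration/blending afterwards is exactly the circuit-scale saving the paragraph above argues for.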
The present invention is not limited to the embodiments described above, and includes various modifications. For example, although the case where the frame rate of the image sensor is 120 Hz has been described, it goes without saying that the invention can be applied in the same way when the image sensor frame rate is different (such as 100 Hz or 96 Hz). The frame rate of the image sensor may also be 240 Hz, with the transmitted uncompressed video signal at a frame rate of 60 Hz, one frame in every four, and the interpolation data used to restore the 240 Hz of the time of imaging.
 In the embodiments described above, the video to be transmitted was video generated by imaging, but the transmission target of the present invention is not limited to this. Video that differs from video generated by an imaging device, such as video generated by computer graphics, may also be transmitted. In that case, "imaging device" in each of the above embodiments may be read as "video output device", and "imaging scanning information" as "scanning information used when generating the video".
 Even in the "imaging device" examples, the "imaging scanning information" is the scanning information used when generating the captured video, so "imaging scanning information" can be regarded as a concept included in "scanning information used when generating the video".
 The embodiments above describe the configuration of each part in detail in order to explain the present invention clearly, and the invention is not necessarily limited to configurations having all of the described elements. Part of the configuration of one embodiment may be replaced with the configuration of another embodiment, and the configuration of one embodiment may be added to that of another.
 1, 3: imaging device,
 2, 8, 9: display device,
 4: encoder,
 5: server,
 6: broadcasting station,
 7: STB,
 11: image sensor,
 13, 75: superimposition unit,
 14, 76: 8B/10B encoder,
 17, 43, 47, 74: compression unit,
 18, 79: error correction encoding unit,
 19, 27, 48, 80: control unit,
 22: 10B/8B decoder,
 23, 73: separation unit,
 25, 85, 95: scanning processing unit,
 26, 86, 96: display element,
 28: error correction decoding unit,
 29, 74: decompression unit,
 31: display signal processing unit,
 44: packetization processing unit,
 72: packet restoration processing unit,
 82: image-quality enhancement processing unit,
 83: interpolation data restoration unit,
 84: simplified image-quality enhancement processing unit,
 93: interpolation data enhancement unit,
 94: frame rate conversion unit.

Claims (10)

  1.  A video device that displays video output from a video output device, the video device comprising:
     an input unit that receives, from the video output device, a first video signal, interpolation data for interpolating frames of the first video signal, and scanning information used when generating the video;
     a display signal processing unit that generates, using the first video signal and the interpolation data received by the input unit, a second video signal that interpolates frames of the first video signal;
     a display element that displays the first video signal and the second video signal generated by the display signal processing unit; and
     a control unit that controls the display signal processing unit and the display element,
     wherein the control unit determines a display scanning method of the display element based on the scanning information received by the input unit, and
     the display signal processing unit generates the second video signal based on the display scanning method of the display element determined by the control unit.
  2.  The video device according to claim 1, wherein
     the scanning information includes at least one of the number of simultaneous scans, the scanning direction, and a line/frame shutter distinction in an image sensor that captures the video, and
     the display scanning method includes at least one of the number of simultaneous scans in the display element, the scanning direction, and the backlight lighting timing.
  3.  A video device that outputs video to a display device, the video device comprising:
     a video signal processing unit that generates, from a first video signal output by a video source, a second video signal having a frame rate lower than that of the first video signal and interpolation data for interpolating frames of the second video signal;
     an output unit that transmits the second video signal and the interpolation data to the display device; and
     a control unit that controls the video signal processing unit,
     wherein the control unit acquires display scanning information of the display device together with information on the second video signal and interpolation data that the display device can process, determines, based on that information, the formats of the second video signal and the interpolation data to be output by the video signal processing unit, and
     the video signal processing unit generates the second video signal and the interpolation data based on the formats determined by the control unit.
  4.  The video device according to claim 3, comprising an image sensor as the video source, wherein
     the control unit determines an imaging scanning method of the image sensor based on the display scanning information and the information on the second video signal and interpolation data that can be processed,
     the image sensor scans using the imaging scanning method determined by the control unit,
     the display scanning information includes at least one of the number of simultaneous scans in the display element, the scanning direction, and the backlight lighting timing, and
     the imaging scanning method includes at least one of the number of simultaneous scans, the scanning direction, and a line/frame shutter distinction in the image sensor.
  5.  The video device according to claim 3, wherein,
     when the first video signal output by the video source is a 3D video signal including left-eye video and right-eye video,
     the second video signal generated by the video signal processing unit uses a frame packing format that alternately includes the left-eye video and the right-eye video.
  6.  The video device according to claim 1, wherein the display signal processing unit comprises:
     an image-quality enhancement processing unit that performs image-quality enhancement on the first video signal received by the input unit;
     an interpolation data restoration unit that generates the second video signal from the first video signal and the interpolation data received by the input unit; and
     a simplified image-quality enhancement processing unit that enhances the image quality of the video signal output by the interpolation data restoration unit, using the image-quality parameters determined by the image-quality enhancement processing unit.
  7.  The video device according to claim 1, wherein the display signal processing unit comprises:
     an image-quality enhancement processing unit that performs image-quality enhancement on the first video signal received by the input unit;
     an interpolation data enhancement unit that corrects the interpolation data received by the input unit, using the processing results of the image-quality enhancement processing unit; and
     a frame rate conversion unit that generates, using the corrected interpolation data, the second video signal that interpolates frames of the first video signal, and combines the first video signal and the second video signal into a third video signal having a frame rate at least twice that of the first video signal.
  8.  The video device according to claim 2, wherein the control unit
     selects, when the scanning information indicates a line shutter mode, a dual-scan method that scans the upper and lower halves of the screen in the same direction as the display scanning method of the display element, and
     selects, when the scanning information indicates a frame shutter mode, a dual-scan method that starts scanning from the center of the screen as the display scanning method of the display element.
  9.  The video device according to claim 4, wherein the control unit
     selects, when the display scanning information indicates a single-scan method, a line-shutter-mode single-scan method as the imaging scanning method of the image sensor, and
     selects, when the display scanning information indicates a dual-scan method, a line-shutter-mode dual-scan method as the imaging scanning method of the image sensor.
  10.  A video processing method for displaying, on a display element, a video signal acquired by an image sensor, the method comprising:
     a video signal generation step of generating, from a first video signal output by the image sensor, a second video signal having a frame rate lower than that of the first video signal and interpolation data for interpolating frames of the second video signal;
     a transmission/reception step of transmitting and receiving the second video signal, the interpolation data, and imaging scanning information of the image sensor;
     a display signal processing step of generating, using the received second video signal and the interpolation data, a third video signal that interpolates frames of the second video signal; and
     a display step of displaying the second video signal and the third video signal on the display element,
     wherein the display scanning method of the display element is determined based on the received imaging scanning information, or alternatively, the imaging scanning method of the image sensor is determined based on display scanning information of the display element.
PCT/JP2014/055980 2014-03-07 2014-03-07 Video device and video processing method WO2015132957A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2014/055980 WO2015132957A1 (en) 2014-03-07 2014-03-07 Video device and video processing method

Publications (1)

Publication Number Publication Date
WO2015132957A1 true WO2015132957A1 (en) 2015-09-11

Family

ID=54054789

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/055980 WO2015132957A1 (en) 2014-03-07 2014-03-07 Video device and video processing method

Country Status (1)

Country Link
WO (1) WO2015132957A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012175613A (en) * 2011-02-24 2012-09-10 Sony Corp Image transmission device, image transmission method, and program
JP2013114112A (en) * 2011-11-30 2013-06-10 Sony Corp Resolution converting device and method

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018173597A1 (en) * 2017-03-22 2018-09-27 富士フイルム株式会社 Imaging device, imaging method, and imaging program
CN110463185A (en) * 2017-03-22 2019-11-15 富士胶片株式会社 Photographic device, image capture method and imaging program
US10812705B2 (en) * 2017-03-22 2020-10-20 Fujifilm Corporation Imaging apparatus, imaging method, and imaging program
CN110463185B (en) * 2017-03-22 2021-09-03 富士胶片株式会社 Image pickup apparatus, image pickup method, and storage medium
US11326891B2 (en) * 2018-08-21 2022-05-10 Samsung Electronics Co., Ltd. Method for providing image to vehicle and electronic device therefor

Similar Documents

Publication Publication Date Title
US20210235065A1 (en) Process and system for encoding and playback of stereoscopic video sequences
US9161023B2 (en) Method and system for response time compensation for 3D video processing
RU2463731C1 (en) Stereoimage data transmitting apparatus and method, stereoimage data receiving apparatus and method, image data transmitting apparatus and image data receiving apparatus
US9185328B2 (en) Device and method for displaying a three-dimensional PIP image
JP5482254B2 (en) Reception device, transmission device, communication system, display control method, program, and data structure
US20050041736A1 (en) Stereoscopic television signal processing method, transmission system and viewer enhancements
JP5448558B2 (en) Transmission apparatus, stereoscopic image data transmission method, reception apparatus, stereoscopic image data reception method, relay apparatus, and stereoscopic image data relay method
JP5262546B2 (en) Video signal processing system, playback device and display device, and video signal processing method
US20110050850A1 (en) Video combining device, video display apparatus, and video combining method
US20110149020A1 (en) Method and system for video post-processing based on 3d data
WO2011064913A1 (en) Video signal processing device and video signal processing method
WO2011045872A1 (en) Video signal processing device and video signal processing method
US20110149040A1 (en) Method and system for interlacing 3d video
US20110063298A1 (en) Method and system for rendering 3d graphics based on 3d display capabilities
EP2312859A2 (en) Method and system for communicating 3D video via a wireless communication link
WO2015132957A1 (en) Video device and video processing method
WO2015053352A1 (en) Display device and super-resolution processing method
JP4966400B2 (en) Image output apparatus and image output method
US20110150355A1 (en) Method and system for dynamic contrast processing for 3d video
JP4747214B2 (en) Video signal processing apparatus and video signal processing method
KR100952309B1 (en) High definition image broadcasting method and system of the same
WO2015145745A1 (en) Image output display system, image output apparatus, and image display apparatus
WO2015155893A1 (en) Video output apparatus, video reception apparatus, and video output method
JP2018067861A (en) Image transmission device, image reception device, image transmission method, and image transmission program
JP2004153421A (en) Camera video image processing system, camera video processing method, and imaging camera

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14884935

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14884935

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP