WO2012107985A1 - Image signal processing device and image signal processing method - Google Patents


Info

Publication number
WO2012107985A1
Authority
WO
WIPO (PCT)
Prior art keywords
frame
output
video
frame rate
synchronization
Prior art date
Application number
PCT/JP2011/007062
Other languages
French (fr)
Japanese (ja)
Inventor
進吾 宮内 (Shingo Miyauchi)
武田 英俊 (Hidetoshi Takeda)
Original Assignee
パナソニック株式会社 (Panasonic Corporation)
Priority date
Filing date
Publication date
Application filed by パナソニック株式会社 (Panasonic Corporation)
Publication of WO2012107985A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/01 Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N 7/0127 Conversion of standards processed at pixel level by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/003 Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
    • G09G 5/006 Details of the interface to the display terminal
    • G09G 2340/00 Aspects of display data processing
    • G09G 2340/04 Changes in size, position or resolution of an image
    • G09G 2340/0442 Handling or displaying different aspect ratios, or changing the aspect ratio
    • G09G 2360/00 Aspects of the architecture of display systems
    • G09G 2360/12 Frame memory handling
    • G09G 2360/18 Use of a frame buffer in a display terminal, inclusive of the display panel

Definitions

  • the present invention relates to a video signal processing apparatus and a video signal processing method for displaying a plurality of videos having different frame rates in synchronization.
  • With DTVs (digital televisions), it has become common to view a plurality of video contents simultaneously, helped by increases in screen size and resolution.
  • a viewer can view videos of different channels at the same time.
  • Video content may come from a plurality of video sources, such as TV broadcast video, video from a recording device, or video obtained from an external network.
  • A PC or the like determines a frame rate with reference to a free-running, internally generated vertical synchronization signal, and displays all videos in synchronization with the determined frame rate.
  • Frame synchronization processing is an adjustment process that compensates for an excess or deficiency in the number of frames.
  • In the frame synchronization process, if the frame rate of the sub-video is higher than that of the main video, sub-video frames are thinned out at a constant period and displayed (skip processing). Conversely, if the frame rate of the sub-video is lower than that of the main video, the same sub-video frame is displayed repeatedly at a constant period (repeat processing).
  • In such a case, one sub-video frame is skipped every time about 1000 frames (about 16.7 seconds) are displayed (see, for example, Patent Document 1 and Patent Document 2).
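As a back-of-the-envelope sketch (not part of the patent text), the skip interval follows directly from the two frame rates; the 60 Hz / 59.94 Hz pair below is an assumed example chosen to reproduce the roughly 1000-frame interval mentioned above:

```python
def skip_interval_frames(rate_fast: float, rate_slow: float) -> float:
    """Frames of the faster stream shown between two consecutive skip
    (or repeat) events when it must track the slower rate: one
    adjustment is needed per frame of surplus."""
    return rate_fast / (rate_fast - rate_slow)

# Assumed example rates: a 60 Hz output against a 59.94 Hz sub-video.
interval = skip_interval_frames(60.0, 59.94)   # roughly 1000 frames
seconds = interval / 60.0                      # roughly 16.7 seconds
```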
  • With these processes, sub-picture frames are thinned out at regular intervals, or the same frame is displayed repeatedly, which may give the viewer a sense of discomfort.
  • In particular, judder in the video is likely to be perceived by the viewer. In other words, the problem is that the frame synchronization processing causes viewer discomfort and impairs the viewing quality.
  • an object of the present invention is to provide a video signal processing apparatus capable of displaying a smooth and high-definition video with little discomfort for a viewer when a plurality of videos with different frame rates are displayed in synchronization.
  • A video signal processing apparatus according to the present invention is a video signal processing apparatus that outputs a plurality of input videos having different input frame rates in synchronization with each other. It includes: a video acquisition unit that sequentially acquires the frames included in each of the plurality of input videos at the corresponding input frame rate; an output frame rate determination unit that determines an output frame rate; a frame synchronization unit that sequentially outputs the frames of each of the plurality of input videos acquired by the video acquisition unit in synchronization with the output frame rate; and a video output unit that sequentially outputs, in synchronization with the output frame rate, a composite frame obtained by synthesizing the frames output from the frame synchronization unit.
  • For an asynchronous video, i.e., an input video among the plurality of input videos whose corresponding input frame rate differs from the output frame rate, the frame synchronization unit executes an adjustment process that eliminates the deviation between the input frame rate and the output frame rate, during a period that can be determined to be hard for the viewer to notice.
  • Preferably, the frame synchronization unit includes a synchronization timing detection unit that detects an adjustment processing period, i.e., a period in which the discomfort produced in the composite frame by executing the adjustment process on the asynchronous video is hard for the viewer of the composite frame to notice, and a control unit that executes the adjustment process on the asynchronous video during the adjustment processing period detected by the synchronization timing detection unit.
  • The synchronization timing detection unit may detect the amount of motion of the asynchronous video and detect, as the adjustment processing period, a period in which the motion amount is smaller than a first threshold or a period in which the motion amount is larger than a second threshold that is larger than the first threshold. Alternatively, the synchronization timing detection unit may detect a scene change in the asynchronous video and detect a period corresponding to the scene change as the adjustment processing period.
  • That is, the frame adjustment process is performed during a period in which there is little motion in the video, a period in which there is intense motion, or a period corresponding to a scene change.
  • As the adjustment process, the frame synchronization unit thins out and outputs some of the frames when the input frame rate of the asynchronous video is higher than the output frame rate, and repeatedly outputs the same frame when the input frame rate of the asynchronous video is lower than the output frame rate.
  • As the adjustment process, the frame synchronization unit may use M (M is an integer of 2 or more) original frames included in a predetermined period of the asynchronous video to generate N (N different from M) interpolation frames corresponding to frames between the M original frames, and output the generated N interpolation frames as the frames of the asynchronous video in the predetermined period.
  • The frame synchronization unit may detect a motion vector between the original frames located before and after an interpolation frame to be generated, scale the detected motion vector in proportion to the temporal distance between the interpolation frame and the original frame to obtain an interpolation motion vector, and generate the interpolation frame using the interpolation motion vector as the motion vector of the interpolation frame with respect to the original frame.
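A minimal sketch of the temporal weighting involved (an illustration, not the patent's method: plain frame blending stands in for motion compensation, whereas the patent shifts pixels by a motion vector scaled with the same temporal ratio):

```python
import numpy as np

def interpolate_frames(f0: np.ndarray, f1: np.ndarray, n: int) -> list:
    """Generate n frames covering the interval from original frame f0
    up to (but not including) f1.  Interpolated frame k sits at
    temporal position k/n and is weighted by its distance to the two
    original frames."""
    out = []
    for k in range(n):
        t = k / n  # temporal position of interpolated frame k
        out.append((1.0 - t) * f0 + t * f1)
    return out
```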
  • The output frame rate determination unit may determine the input frame rate of any one of the plurality of input videos as the output frame rate, or may determine, as the output frame rate, a frame rate different from any of the input frame rates of the plurality of input videos.
  • The present invention is also a video signal processing method for outputting a plurality of input videos having different input frame rates in synchronization with each other, comprising: a video acquisition step of sequentially acquiring the frames included in each of the plurality of input videos at the corresponding input frame rate; an output frame rate determination step of determining an output frame rate; a frame synchronization step of sequentially outputting the frames of each of the plurality of input videos acquired in the video acquisition step at the output frame rate; and a video output step of sequentially outputting, in synchronization with the output frame rate, a composite frame obtained by synthesizing the frames output from the frame synchronization step.
  • In the frame synchronization step, an adjustment process for eliminating the deviation between the input frame rate and the output frame rate is performed, so as not to be noticeable to the viewer, on an asynchronous video among the plurality of input videos whose corresponding input frame rate differs from the output frame rate.
  • The frame synchronization step includes a synchronization timing detection step of detecting an adjustment processing period in which the discomfort produced in the composite frame by applying the adjustment process to the frames of the asynchronous video is hard for the viewer of the composite frame to notice, and the adjustment process is executed during the adjustment processing period detected in the synchronization timing detection step.
  • Alternatively, the video signal processing method of the present invention is a video signal processing method for outputting a plurality of input videos having different input frame rates in synchronization with each other, comprising: a video acquisition step of sequentially acquiring the frames included in each of the plurality of input videos at the corresponding input frame rate; an output frame rate determination step of determining an output frame rate; a frame synchronization step of sequentially outputting the frames of each of the plurality of input videos acquired in the video acquisition step at the output frame rate; and a video output step of sequentially outputting, in synchronization with the output frame rate, a composite frame obtained by synthesizing the frames output from the frame synchronization step.
  • In the frame synchronization step, M (M is an integer of 2 or more) original frames included in a predetermined period of an asynchronous video, whose corresponding input frame rate differs from the output frame rate, may be used to generate N (N different from M) interpolation frames corresponding to frames between the M original frames, and the generated N interpolation frames may be output as the frames of the asynchronous video in the predetermined period.
  • As described above, the video signal processing apparatus can display a smooth, high-quality video that does not cause the viewer discomfort when a plurality of videos with different frame rates are displayed in synchronization.
  • FIG. 1 is a block diagram showing a configuration of a video signal processing apparatus according to an embodiment of the present invention.
  • FIG. 2 is a flowchart of the operation of the video signal processing apparatus according to Embodiment 1 of the present invention.
  • FIG. 3 is a block diagram showing the configuration of the frame synchronization unit in the first embodiment of the present invention.
  • FIG. 4 is a diagram for explaining conventional frame skip processing.
  • FIG. 5 is a diagram for explaining a conventional frame repeat process.
  • FIG. 6 is a diagram for explaining frame skip processing according to Embodiment 1 of the present invention.
  • FIG. 7 is a diagram illustrating frame repeat processing according to Embodiment 1 of the present invention.
  • FIG. 8 is a flowchart of the adjustment process of the frame synchronization unit in the first embodiment of the present invention.
  • FIG. 9 is a block diagram showing the configuration of the frame synchronization unit in the second embodiment of the present invention.
  • FIG. 10 is a diagram for explaining the adjustment process of the frame synchronization unit in the second embodiment of the present invention.
  • FIG. 11 is a diagram for explaining the adjustment process of the frame synchronization unit in the second embodiment of the present invention.
  • FIG. 12 is a flowchart of the adjustment process of the frame synchronization unit according to the second embodiment of the present invention.
  • FIG. 13 is a diagram showing an application example of the video signal processing apparatus according to Embodiments 1 and 2 of the present invention.
  • Embodiment 1 of the present invention will be described with reference to FIGS. 1 and 2.
  • FIG. 1 is a block diagram showing the configuration of the video signal processing apparatus according to Embodiment 1 of the present invention.
  • FIG. 2 is a flowchart of the operation of the video signal processing apparatus according to Embodiment 1 of the present invention.
  • the video signal processing device 10 includes a video acquisition unit 100, an output frame rate determination unit 200, a frame synchronization unit 300, and a video output unit 500.
  • a frame memory 400 and a video display unit 600 are also shown.
  • the video acquisition unit 100 sequentially acquires frames included in each of the plurality of input videos at a corresponding input frame rate (S10). In addition, the video acquisition unit 100 outputs each frame of the input video signal acquired according to the input frame rate to the frame synchronization unit 300.
  • the output frame rate determining unit 200 determines the output frame rate. Specifically, one of the input frame rates corresponding to the plurality of video signals input to the video acquisition unit 100 is determined as the output frame rate (S20).
  • the output frame rate determination unit 200 may determine the output frame rate from the free-run vertical synchronization signal.
  • the free-run vertical synchronization signal is a signal that realizes a predetermined frequency (for example, 60 Hz) using a clock generated by a clock generator having a variable clock frequency.
  • the frame synchronization unit 300 sequentially outputs the frames of each of the plurality of input videos acquired by the video acquisition unit 100 at the output frame rate determined by the output frame rate determination unit 200 (S30). Specifically, the frames of the plurality of input videos acquired by the video acquisition unit 100 are temporarily stored in the frame memory 400, read at the output frame rate, and output to the video output unit 500.
  • the frame synchronization unit 300 adjusts the number of frames and outputs each frame of an asynchronous video having a corresponding input frame rate different from the output frame rate among a plurality of input videos. A more specific configuration of the frame synchronization unit 300 and details of the adjustment process will be described later.
  • the frame memory 400 is a storage unit that temporarily stores frames of a plurality of input videos.
  • The specific configuration of the storage unit is not particularly limited; for example, a DRAM (Dynamic Random Access Memory), an SDRAM (Synchronous Dynamic Random Access Memory), a flash memory, a ferroelectric memory, or an HDD (Hard Disk Drive) may be used.
  • The video output unit 500 sequentially outputs, to the video display unit 600 and in synchronization with the output frame rate, composite frames obtained by combining the frames output from the frame synchronization unit 300 (S40).
  • the video display unit 600 displays a video composed of the composite frame output from the video output unit 500 (S50).
  • The video display unit 600 is, for example, an LCD (Liquid Crystal Display), but is not limited to this; it may be a PDP (Plasma Display Panel) or an organic EL display (OLED: Organic Light-Emitting Diode display).
  • FIG. 3 is a configuration diagram of the frame synchronization unit 300 according to the first embodiment.
  • the frame synchronization unit 300 includes a synchronization timing detection unit 310 and an address control unit 320 (control unit).
  • The synchronization timing detection unit 310 detects an adjustment processing period in which the discomfort produced in the composite frame by performing the adjustment process on the frames of the asynchronous video is less noticeable to the viewer of the composite frame.
  • Specifically, the synchronization timing detection unit 310 detects, as the adjustment processing period, a period in which there is little motion in the video, a period in which the motion is intense, or a period in which the scene changes and a blackout or whiteout occurs at the scene change.
  • SAD (Sum of Absolute Differences) is a parameter obtained by calculating, for each pixel constituting a frame, the absolute value of the luminance difference between corresponding pixels of temporally consecutive frames, and summing the calculated absolute values over all pixels.
  • a method for detecting the amount of motion of an image using SAD will be described.
  • the synchronization timing detection unit 310 divides each frame into several small areas for temporally continuous frames constituting the asynchronous video.
  • the synchronization timing detection unit 310 obtains the SAD of each divided small area of one frame with the corresponding small area of the previous frame or the subsequent frame. That is, the synchronization timing detection unit 310 calculates the sum of absolute values of the luminance differences of the pixels constituting the small area and the corresponding pixels in the previous frame or the subsequent frame. As a result, the same number of SADs as the number of small regions is obtained.
  • the synchronization timing detection unit 310 obtains the sum of SADs corresponding to each small area.
  • the synchronization timing detection unit 310 sets a threshold for the sum of the SADs, and detects a period in which the video motion is small and a period in which the video motion is intense.
  • For a frame whose total SAD is smaller than the first threshold, the synchronization timing detection unit 310 detects the period corresponding to that frame as a period in which the motion of the video is small. For a frame whose total SAD is larger than a second threshold, which is larger than the first threshold, it detects the corresponding period as a period in which the motion of the video is intense.
  • the synchronization timing detection unit 310 may detect a period in which the video motion is small and a period in which the video motion is intense depending on the number of small regions where the SAD exceeds a predetermined value.
  • In this case, the synchronization timing detection unit 310 detects a period corresponding to a frame in which the number of small areas whose SAD exceeds the predetermined value is smaller than a first threshold as a period in which the video motion is small, and detects a period corresponding to a frame in which that number is larger than a second threshold, which is larger than the first threshold, as a period in which the video motion is intense.
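The block-SAD computation and two-threshold test above might be sketched as follows (the block size and threshold values are illustrative assumptions, not values from the patent):

```python
import numpy as np

def block_sads(prev: np.ndarray, cur: np.ndarray, block: int = 8) -> list:
    """SAD per co-located small area between two consecutive frames:
    the sum of absolute luminance differences over the block's pixels."""
    h, w = cur.shape
    sads = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            d = np.abs(cur[y:y + block, x:x + block].astype(np.int64)
                       - prev[y:y + block, x:x + block].astype(np.int64))
            sads.append(int(d.sum()))
    return sads

def classify_motion(sads: list, first_thr: int, second_thr: int):
    """Two-threshold test on the frame-total SAD, as described above:
    'small' motion below the first threshold, 'intense' above the
    second (which is larger than the first), neither in between."""
    total = sum(sads)
    if total < first_thr:
        return "small"
    if total > second_thr:
        return "intense"
    return None
```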
  • the period during which the video scene changes is detected by, for example, APL (Average Picture Level) which is the average luminance of the entire frame.
  • Specifically, the synchronization timing detection unit 310 calculates the APL for each frame of the asynchronous video, and for a frame whose APL differs greatly from the average APL of a plurality of consecutive preceding frames, detects the period corresponding to that frame as a period in which the video scene changes.
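A minimal sketch of the APL-based scene-change test (the deviation threshold `delta` is an illustrative assumption; the patent only says the difference is "large"):

```python
import numpy as np

def apl(frame: np.ndarray) -> float:
    """Average Picture Level: the mean luminance of the whole frame."""
    return float(frame.mean())

def is_scene_change(frame: np.ndarray, preceding_apls: list, delta: float) -> bool:
    """Flag a scene change when this frame's APL deviates from the
    average APL of the preceding frames by more than delta."""
    avg = sum(preceding_apls) / len(preceding_apls)
    return abs(apl(frame) - avg) > delta
```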
  • FIGS. 4 and 5 are diagrams for explaining the conventional adjustment processing.
  • When a frame included in the input video (asynchronous video) is stored in the frame memory 400 according to the input frame rate, the address control unit 320 sets a write address representing the location where the frame is stored in the frame memory 400.
  • the write address is counted up according to the frame output timing of the input frame rate.
  • the write address is reset to “0” when the number of frames that can be stored in the frame memory is exceeded.
  • the number of frames that can be stored in the frame memory 400 is eight frames. Therefore, any of the numbers “0” to “7” is set as the address.
  • the address control unit 320 reads the frame from the frame memory 400 in synchronization with the output frame rate and outputs it. At this time, the address control unit 320 reads a frame from the frame memory 400 by designating the write address set at the time of storing the frame as a read address. The read address is counted up according to the frame output timing of the output frame rate.
  • The phase difference is the difference between the write address and the read address, taken modulo the number of frames, at the timing when the read address is counted up.
  • For example, when the phase difference is “1”, the read address is an address representing the frame one frame before the write address at the same timing.
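On the eight-frame memory of this example, the phase difference can be sketched as a modular distance on the circular address space (a reading of the definition above, not code from the patent):

```python
NUM_FRAMES = 8  # capacity of the frame memory in the example

def phase_difference(write_addr: int, read_addr: int) -> int:
    """How far the read address trails the write address on the
    circular frame memory, sampled when the read address is counted up."""
    return (write_addr - read_addr) % NUM_FRAMES
```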
  • the address control unit 320 performs frame skip processing based on whether the phase difference at the frame output timing of the output frame rate is a predetermined phase difference.
  • the predetermined phase difference is “1”.
  • When the input frame rate of the asynchronous video is larger than the output frame rate, the address control unit 320 thins out some of the frames and outputs the rest. Specifically, the address control unit 320 thins out and reads the frames of the asynchronous video stored in the frame memory 400, and outputs them to the video output unit 500 in synchronization with the output frame rate. This process is called frame skip processing.
  • FIG. 4 is a diagram for explaining a conventional frame skip process.
  • Timing (1) in FIG. 4 is the first timing at which the address control unit 320 starts outputting a frame.
  • At this timing, the address control unit 320 sets the address “0”, at which the phase difference is the predetermined value “1”, as the read address, reads the frame corresponding to address “0” from the frame memory 400, and outputs it.
  • At timing (2), the read address is set to “2” with respect to the write address “3”.
  • At timing (3), the read address counted up from the state at timing (1) is “2” with respect to the write address “4”. That is, at timing (3), the phase difference is “2” against the predetermined phase difference “1”, and the phase difference is larger than the predetermined phase difference.
  • the address control unit 320 performs frame skip processing at timing (3). Specifically, the address control unit 320 skips the address “2”, sets “3” as a read address, reads the frame corresponding to the address “3” from the frame memory 400, and outputs the frame. That is, the frame corresponding to the address “2” is skipped.
  • As a result, the phase difference at timing (4) in FIG. 4 is “1”, which is equal to the predetermined phase difference “1”.
  • the address control unit 320 outputs each frame of the asynchronous video according to the output frame rate by performing a frame skip process at a timing when the phase difference is larger than the predetermined phase difference.
  • Conversely, when the input frame rate of the asynchronous video is smaller than the output frame rate, the address control unit 320 repeatedly outputs the same frame. Specifically, the address control unit 320 repeatedly reads a frame of the asynchronous video stored in the frame memory 400 and outputs it to the video output unit 500 in synchronization with the output frame rate. This process is called frame repeat processing.
  • FIG. 5 is a diagram for explaining a conventional frame repeat process.
  • Timing (1) in FIG. 5 is the first timing at which the address control unit 320 starts outputting a frame.
  • At this timing, the address control unit 320 sets the address “0”, at which the phase difference is the predetermined value “1”, as the read address, reads the frame corresponding to address “0” from the frame memory 400, and outputs it.
  • At timing (2), the read address is set to “2” with respect to the write address “3”.
  • At timing (3), the read address counted up from the state at timing (2) is “3” with respect to the write address “3”. That is, at timing (3), the phase difference is “0” against the predetermined phase difference “1”, and the phase difference is smaller than the predetermined phase difference.
  • the address control unit 320 performs frame repeat processing at timing (3). Specifically, the address control unit 320 sets the address “2” as the read address again at the timing (3), reads the frame corresponding to the address “2” from the frame memory 400, and outputs it. That is, the frame corresponding to the address “2” is repeatedly output.
  • As a result, the phase difference at timing (4) in FIG. 5 is “1”, which is equal to the predetermined phase difference “1”.
  • the address control unit 320 outputs each frame of the asynchronous video according to the output frame rate by performing a frame repeat process at a timing when the phase difference is smaller than the predetermined phase difference.
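The conventional repeat decision described above might be sketched as follows (an interpretation of the figure narration, assuming the eight-frame memory and the predetermined phase difference “1” of this example):

```python
def next_read_address_repeat(write_addr: int, last_read: int,
                             target_phase: int = 1, num_frames: int = 8) -> int:
    """Frame repeat case (input rate below output rate): if counting
    the read address up would shrink the phase difference below the
    predetermined value, re-read the same frame instead."""
    candidate = (last_read + 1) % num_frames
    phase = (write_addr - candidate) % num_frames
    if phase < target_phase:
        return last_read   # repeat the previous frame
    return candidate
```

With write address “3” and previous read address “2”, counting up would give phase “0”, so address “2” is read again, matching timing (3) of FIG. 5.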
  • As described above, the conventional address control unit 320 performs the adjustment processing (frame skip processing and frame repeat processing) at regular intervals based on the phase difference, and outputs the input frames in synchronization with the output frame rate. For this reason, depending on the video displayed at the timing when the adjustment process is performed, the viewer may feel uncomfortable with the video.
  • In contrast, in the present embodiment, the address control unit 320 performs the adjustment process (frame skip processing or frame repeat processing) on the frames of the asynchronous video during the adjustment processing period detected by the synchronization timing detection unit 310, and outputs the result.
  • the address control unit 320 When the asynchronous video input frame rate is higher than the output frame rate, the address control unit 320 performs the frame skip processing as described above.
  • FIG. 6 is a diagram for explaining frame skip processing of the address control unit 320 according to Embodiment 1 of the present invention.
  • In the present embodiment, in addition to the conventional control that checks whether the phase difference equals the predetermined phase difference, the address control unit 320 performs frame skip processing based on whether the frame output timing is included in the adjustment processing period.
  • In this example, the predetermined phase difference is “4”, and the number of frames that can be stored in the frame memory 400 is eight.
  • the timing (1) of the period (a) in FIG. 6 is the first timing at which the address control unit 320 starts outputting a frame.
  • At this timing, since the write address is “4”, the address control unit 320 sets the address “0”, at which the phase difference is the predetermined value “4”, as the read address, reads the frame corresponding to address “0” from the frame memory 400, and outputs it.
  • the write address is counted up according to the input frame rate, and the read address is counted up according to the output frame rate.
  • the phase difference is “4” at any of the timings (1) to (3), and the address control unit 320 does not need to perform the frame skip process.
  • Here, the input frame rate is larger than the output frame rate. Accordingly, as time elapses without the address control unit 320 performing frame skip processing, the phase difference between the write address and the read address increases.
  • the period (b) in FIG. 6 represents a state in which several frames have elapsed from the state of the period (a) in FIG. 6 and the phase difference between the write address and the read address has expanded.
  • the period (b) in FIG. 6 includes the adjustment process period detected by the synchronization timing detection unit 310, and the address control unit 320 performs the frame skip process in the adjustment process period.
  • At timing (1) of period (b) in FIG. 6, the read address is set to “6” with respect to the write address “4”. That is, the phase difference is “6” against the predetermined phase difference “4”, and the phase difference is larger than the predetermined phase difference.
  • At timing (2), the write address is “5”, and the read address counted up from the state at timing (1) of period (b) in FIG. 6 is “7”. That is, the phase difference at timing (2) is “6”, which is larger than the predetermined phase difference “4”, and timing (2) of period (b) in FIG. 6 is included in the period detected as the adjustment processing period by the synchronization timing detection unit 310.
  • the address control unit 320 performs frame skip processing at the timing (2) of the period (b) in FIG. Specifically, the address control unit 320 skips the address “7”, sets “0” as a read address, reads the frame corresponding to the address “0” from the frame memory 400, and outputs the frame. That is, the frame corresponding to the address “7” is skipped.
  • At timing (3) of period (b) in FIG. 6, the write address is “6”, and the read address counted up from the state at timing (2) is “1”.
  • The phase difference is therefore “5”, which is larger than the predetermined phase difference “4”, and timing (3) of period (b) in FIG. 6 is also included in the adjustment processing period. Accordingly, at timing (3) of period (b) in FIG. 6, the address control unit 320 performs frame skip processing as the adjustment process. Specifically, the address control unit 320 skips the address “1”, sets address “2” as the read address, reads the frame corresponding to address “2” from the frame memory 400, and outputs it. That is, the frame corresponding to address “1” is skipped.
  • As a result of these processes, the phase difference at timing (4) of period (b) in FIG. 6 is “4”, which is equal to the predetermined phase difference “4”.
  • In this way, the address control unit 320 performs frame skip processing to output each frame of the asynchronous video according to the output frame rate.
  • In the above description, the address control unit 320 skips one frame at a time in the frame skip processing, but it may skip a plurality of frames at a time. That is, at timing (2) of period (b) in FIG. 6, the address control unit 320 may skip the addresses “7” and “0”, set “1” as the read address, and output the frame corresponding to address “1”. In this case, the two frames at addresses “7” and “0” are skipped at once.
  • the number of frames that can be stored in the frame memory 400 is finite, and in the example of FIG. 6, it is 8 frames as described above. Therefore, when the adjustment process period is not detected for a long time and the frame skip process cannot be performed, the address control unit 320 may inevitably perform the frame skip process during a period other than the adjustment process period.
  • the period (c) in FIG. 6 is a diagram showing such a case.
  • At the timing (2) of the period (c) in FIG. 6, the write address is “7”, and the read address, counted up from the state of the timing (1), would originally also be “7”.
  • However, the frame corresponding to the read address “7” no longer exists in the frame memory 400, because a new frame is stored in that slot at the timing when the write address is counted up to “7”.
  • the address control unit 320 performs the frame skip process at the timing (2) of the period (c) in FIG. 6 that is not included in the adjustment process period. Specifically, the address control unit 320 skips the address “7”, sets the address “0” as the read address, reads out the frame corresponding to the address “0” from the frame memory 400, and outputs it.
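The skip decisions described in the periods (a) to (c) above can be illustrated with a short sketch. This is a hypothetical condensation, not the patent's implementation: the 8-slot memory and the predetermined phase difference of 4 come from the FIG. 6 example, and the forced-skip test is a simplifying assumption about how the about-to-be-overwritten slot is detected.

```python
CAPACITY = 8        # frames the frame memory can hold (FIG. 6 example)
TARGET_PHASE = 4    # predetermined phase difference (FIG. 6 example)

def phase_difference(write_addr, read_addr):
    """Number of buffer slots the write address is ahead of the read address."""
    return (write_addr - read_addr) % CAPACITY

def next_read_address(write_addr, read_addr, in_adjustment_period):
    """Return (address to read at this output tick, whether a frame is skipped)."""
    candidate = (read_addr + 1) % CAPACITY
    # Forced skip: the candidate slot is being overwritten with a new frame,
    # so its old frame no longer exists (period (c) in FIG. 6).
    if candidate == write_addr:
        return (candidate + 1) % CAPACITY, True
    # Ordinary skip: the buffer has run ahead of the predetermined phase
    # difference and the output timing falls in an adjustment process period.
    if phase_difference(write_addr, candidate) > TARGET_PHASE and in_adjustment_period:
        return (candidate + 1) % CAPACITY, True
    return candidate, False
```

For example, with the write address at 6 and the previous read address at 0 inside an adjustment process period, the helper skips address 1 and reads address 2, matching the timing (3) of the period (b).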
  • When the input frame rate of the asynchronous video is smaller than the output frame rate, the address control unit 320 performs frame repeat processing.
  • FIG. 7 is a diagram for explaining frame repeat processing of the address control unit 320 according to Embodiment 1 of the present invention.
  • In addition to the conventional control of detecting whether or not the phase difference is the predetermined phase difference, the address control unit 320 performs the frame repeat process based on whether or not the frame output timing is included in the adjustment processing period.
  • the predetermined phase difference is “4”, and the number of frames that can be stored in the frame memory 400 is eight frames.
  • the timing (1) of the period (a) in FIG. 7 is the first timing at which the address control unit 320 starts to output a frame.
  • Since the write address is “4”, the address control unit 320 sets the address “0”, at which the phase difference is the predetermined phase difference “4”, as the read address, reads the frame corresponding to the address “0” from the frame memory 400, and outputs it.
  • the write address is counted up according to the input frame rate, and the read address is counted up according to the output frame rate.
  • the phase difference is “4” at any timing of (1) to (3), and the address control unit 320 does not need to perform frame repeat processing.
  • Since the input frame rate is smaller than the output frame rate, the phase difference between the write address and the read address decreases as time elapses without the address control unit 320 performing the frame repeat process.
  • The period (b) in FIG. 7 represents the state in which the frame repeat process has not been performed since the state of the period (a) in FIG. 7, a time of several frames has elapsed, and the phase difference between the write address and the read address has decreased.
  • the read address is set to “1” with respect to the write address “4”. That is, the phase difference is “3” with respect to the predetermined phase difference “4”, and the phase difference is smaller than the predetermined phase difference.
  • The write address is “5”, and the read address counted up from the timing (1) in the period (b) in FIG. 7 is “2”. That is, the phase difference at the timing (2) is “3”, which is smaller than the predetermined phase difference “4”, and the timing (2) is included in the period detected by the synchronization timing detection unit 310 as the adjustment process period. Therefore, the address control unit 320 performs the frame repeat process at the timing (2) of the period (b) in FIG. 7. Specifically, the address control unit 320 sets the address “1” as the read address again, reads the frame corresponding to the address “1” from the frame memory 400, and outputs the frame. That is, the frame corresponding to the address “1” is output again.
  • the phase difference at the timing (3) of the period (b) is “4”, which is equal to the predetermined phase difference “4”.
  • In this way, the address control unit 320 performs the frame repeat process to output the frames of the asynchronous video as output frames according to the output frame rate.
  • In the above description, the address control unit 320 repeats a frame once as the frame repeat process, but the address control unit 320 may output the same frame a plurality of times in succession.
  • The number of frames that can be stored in the frame memory 400 is finite; in the example of FIG. 7, it is 8 frames as described above. Therefore, when no adjustment process period is detected for a long time and the frame repeat process therefore cannot be performed within one, the address control unit 320 may inevitably have to perform the frame repeat process during a period other than the adjustment process period.
  • The period (c) in FIG. 7 illustrates such a case.
  • the frame repeat process is not performed, and the phase difference between the write address and the read address is small. Since the write address is “4” while the read address is “4”, the phase difference is “0”.
  • At the timing (2), the write address is “4”, and the read address, counted up from the state of the timing (1), would originally be “5”.
  • However, the write address has not yet been counted up to “5”. That is, the frame corresponding to the address “5” has not yet been stored at the timing (2), and the frame corresponding to the read address “5” does not exist in the frame memory 400.
  • the address control unit 320 performs the frame repeat process at the timing (2) of the period (c) in FIG. 7 that is not included in the adjustment process period. Specifically, the address control unit 320 sets the address “4” as the read address again, reads out the frame corresponding to the address “4” from the frame memory 400, and outputs it.
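The frame repeat decision in FIG. 7 mirrors the frame skip case. The sketch below is again only an illustration under the FIG. 7 assumptions (8-slot memory, predetermined phase difference 4); the forced-repeat test is a simplifying assumption about how the not-yet-stored slot is detected.

```python
CAPACITY = 8        # frames the frame memory can hold (FIG. 7 example)
TARGET_PHASE = 4    # predetermined phase difference (FIG. 7 example)

def next_read_address_repeat(write_addr, read_addr, in_adjustment_period):
    """Return (address to read at this output tick, whether the frame repeats)."""
    candidate = (read_addr + 1) % CAPACITY
    # Forced repeat: the candidate slot has not been written yet
    # (period (c) in FIG. 7), so the previous frame is read again.
    if candidate == (write_addr + 1) % CAPACITY:
        return read_addr, True
    # Ordinary repeat: the phase difference has shrunk below the predetermined
    # value and the output timing falls in an adjustment process period.
    if (write_addr - candidate) % CAPACITY < TARGET_PHASE and in_adjustment_period:
        return read_addr, True
    return candidate, False
```

With the write address at 5 and the previous read address at 1 inside an adjustment process period, the helper re-reads address 1, matching the timing (2) of the period (b) in FIG. 7.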
  • FIG. 8 is a flowchart of the adjustment process of the frame synchronization unit 300 according to the first embodiment.
  • the adjustment process of the frame synchronization unit 300 that eliminates the difference between the input frame rate and the output frame rate in the first embodiment is as follows.
  • the frame synchronization unit 300 acquires the input frame of the input video acquired by the video acquisition unit 100 (S110).
  • the frame synchronization unit 300 acquires the output frame rate determined by the output frame rate determination unit 200 (S120).
  • The synchronization timing detection unit 310 detects the adjustment process period, which is a period during which the adjustment process performed by the address control unit 320 on the asynchronous video is difficult for the viewer to perceive (S130).
  • the address control unit 320 outputs the input frame according to the output frame rate based on the phase difference.
  • When the timing at which the deviation occurs is included in the adjustment processing period (Yes in S140), the address control unit 320 performs the adjustment process and outputs the frame (S150).
  • When the timing at which the deviation occurs is not included in the adjustment processing period (No in S140), the address control unit 320 outputs the frame as it is without performing the adjustment process (S160).
  • As described above, in Embodiment 1, the frame adjustment processing for eliminating the deviation between the input frame rate and the output frame rate is performed in periods when the video motion is small, periods when the video motion is large, and periods corresponding to the switching of video scenes. As a result, a smooth and high-definition video display is realized in which the viewer does not feel a sense of discomfort in the video.
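The decision at S140 of FIG. 8 condenses to a few lines. The sketch below is a hypothetical condensation of the flow, with `adjust` standing in for the frame skip/repeat processing described above; it is not the patent's implementation.

```python
def frame_sync_step(input_frame, deviation_pending, in_adjustment_period, adjust):
    """One output tick of the FIG. 8 flow after S110-S130 have gathered state.

    `adjust` is a hypothetical stand-in for the skip/repeat adjustment
    process; the returned frame is handed to the video output unit 500.
    """
    if deviation_pending and in_adjustment_period:   # S140: Yes
        return adjust(input_frame)                   # adjustment process applied
    return input_frame                               # S140: No -> output as it is
```

The adjustment is applied only when a rate deviation is pending and the output timing falls in the detected adjustment process period; in every other case the frame passes through unchanged.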
  • the configuration and operation of the video signal processing apparatus 10 according to the second embodiment of the present invention are different from the first embodiment only in the configuration and operation of the frame synchronization unit 300. Therefore, the configuration and adjustment processing of the frame synchronization unit 300 according to Embodiment 2 will be described in detail below.
  • FIG. 9 is a block diagram showing the configuration of the frame synchronization unit 300 according to Embodiment 2 of the present invention.
  • the frame synchronization unit 300 includes an address control unit 320 and an interpolation frame generation unit 360.
  • the address control unit 320 performs a frame skip process or a frame repeat process on the input frame, which is each frame of the asynchronous video, for a predetermined period, and outputs it to the interpolation frame generation unit 360 as a read frame.
  • the address control unit 320 also outputs to the interpolation frame generation unit 360 the preceding and following frames that are temporally preceding and following the read frame.
  • the interpolation frame generation unit 360 generates an interpolation frame using the read frame and the previous and next frames output from the address control unit 320. In addition, the interpolation frame generation unit 360 replaces a frame included in a predetermined period of the read frame with the interpolation frame and outputs it as an output frame. Details of the interpolation frame generation method of the interpolation frame generation unit 360 will be described later.
  • FIG. 10 is a diagram for explaining adjustment processing of the address control unit 320 and the interpolation frame generation unit 360 when the input frame rate is smaller than the output frame rate.
  • the basic operation of storing a frame in the frame memory 400 of the address control unit 320 is as described in the first embodiment. As shown in FIG. 10, the write address is counted up according to the input frame rate, and the read address is counted up according to the output frame rate.
  • the address control unit 320 detects the timing for performing the frame repeat process based on whether or not the phase difference at the frame output timing of the output frame rate is a predetermined phase difference.
  • the predetermined phase difference is “1”
  • the number of frames that can be stored in the frame memory 400 is eight frames.
  • the frame corresponding to the input frame address “0” is expressed as frame [0].
  • the frame [1] is a frame corresponding to the address “1” of the input frame
  • the frame [2] is a frame corresponding to the address “2” of the input frame.
  • The interpolation frame generation unit 360 uses interpolation frames to mitigate the change in the temporal distance between frames. Specifically, for example, the interpolation frame generation unit 360 outputs the interpolation frames [1.8], [2.5], and [3.2] at the timings (4), (5), and (6), which are the output timings of the output frames corresponding to the read frames [2], [2], and [3], respectively.
  • Using the interpolation frames, the output frames become [0], [1], [1.8], [2.5], [3.2], [4], [5], .... Thereby, since the change in the temporal distance between frames is mitigated, a video display that does not make the viewer feel uncomfortable is realized.
  • The address control unit 320 detects in advance, from the input frame rate and the output frame rate of the asynchronous video, the replacement frames among the read frames that are to be replaced with interpolation frames.
  • the frame [2] output by the frame repeat process and the immediately preceding and immediately following frames are set as the replacement frame. That is, the replacement frame is the frames [2], [2], and [3] of the read frame.
  • the address control unit 320 determines an interpolation frame generated by the interpolation frame generation unit 360.
  • the interpolated frames are set to frames [1.8], [2.5], and [3.2] in order to mitigate changes in the temporal distance between frames.
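One simple rule that yields such positions is to space the replaced frames evenly between their untouched neighbours; rounded to one decimal place this reproduces the example values [1.8], [2.5], [3.2] (between frames [1] and [4]). The helper below is an illustration of the idea, not a formula claimed by the patent.

```python
def smooth_positions(prev_pos, next_pos, count):
    """Evenly spaced temporal positions for `count` replaced frames lying
    between the neighbouring kept frames at prev_pos and next_pos."""
    step = (next_pos - prev_pos) / (count + 1)
    return [prev_pos + step * (i + 1) for i in range(count)]
```

`smooth_positions(1, 4, 3)` gives 1.75, 2.5, 3.25, i.e. [1.8], [2.5], [3.2] to one decimal; applied to the later FIG. 11 example (two replaced frames between the kept frames [1] and [5]) it gives 2.33 and 3.67, matching the quoted [2.3] and [3.7].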
  • Timing (1) in FIG. 10 is the first timing at which the address control unit 320 starts to output a read frame.
  • the write address is “1” and the predetermined phase difference is “1”. Therefore, the address control unit 320 sets the read address to “0”, reads the frame [0] corresponding to the address “0” from the frame memory 400 as a read frame, and outputs it to the interpolation frame generation unit 360.
  • the read address is counted up to “1”, and the phase difference is equal to the predetermined phase difference. For this reason, the frame repeat process is not performed, and the output read frame is frame [1].
  • Timing (2) is a timing at which the interpolation frame generation unit 360 starts to output an output frame.
  • the interpolation frame generation unit 360 outputs the frame [0] output from the address control unit 320 at timing (1) to the video output unit 500 as an output frame.
  • the read address is counted up to “2”, and the phase difference is “1” equal to a predetermined phase difference. Therefore, the read frame output at timing (3) is frame [2].
  • the frame [2] output from the address control unit 320 at timing (3) is a frame to be replaced.
  • the readout frame at timing (3) is replaced with the interpolation frame [1.8] by the interpolation frame generation unit 360.
  • the interpolation frame [1.8] is generated from the frames [1] and [2] by linear interpolation described later.
  • Therefore, in addition to the address “2”, the address control unit 320 sets the address “1” corresponding to the immediately preceding frame as the preceding/following address.
  • The preceding/following address is an address used by the address control unit 320 to read a frame from the frame memory 400 so that the interpolation frame generation unit 360 can generate an interpolation frame.
  • Frame [1] is output to the interpolation frame generation unit 360 as the preceding/following frame.
  • the interpolation frame generation unit 360 outputs the frame [1] output from the address control unit 320 at timing (2).
  • the phase difference expands to “2” at timing (4). Therefore, at the timing (4), the frame synchronization unit 300 performs the frame repeat process, sets the address “2” as the read address again, and outputs the frame [2] as the read frame.
  • the read frame output at timing (4) is a frame to be replaced and is replaced with frame [2.5]. Therefore, the address control unit 320 sets the next address “3” after the address “2” as the previous and next addresses, and outputs the frame [3] to the interpolated frame generation unit 360 as the previous and next frames.
  • At the timing (4), the interpolation frame generation unit 360 generates the interpolation frame [1.8] by linearly interpolating the frames [1] and [2] output from the address control unit 320 at the timing (3), and outputs it as an output frame. The linear interpolation will be described later.
  • the read address is counted up to “3”, and the phase difference is “1” equal to a predetermined phase difference. Therefore, frame [3] is output as a read frame.
  • This read frame is a frame to be replaced, and is replaced with the interpolation frame [3.2]. Therefore, frame [4] is output as the preceding/following frame.
  • the interpolation frame generation unit 360 generates the interpolation frame [2.5] from the frames [2] and [3] output from the address control unit 320 at timing (4), and outputs the output frame. Output as.
  • The read address is counted up to “4”, and the phase difference is “1”, which is equal to the predetermined phase difference. Therefore, frame [4] is output as a read frame. Further, since this read frame is not a frame to be replaced, no preceding/following address is set.
  • the interpolation frame generation unit 360 generates an interpolation frame [3.2] from the frames [3] and [4] output from the address control unit 320 at timing (5), and outputs an output frame. Output as.
  • In this way, the address control unit 320 outputs only the read frame at the timings when a normal read frame is output, and outputs the read frame (the frame to be replaced) together with the preceding/following frame at the timings when a frame to be replaced is output.
  • At a timing when only the read frame is received, the interpolation frame generation unit 360 outputs the read frame as it is as an output frame one frame later.
  • At a timing when the read frame and the preceding/following frame are received, the interpolation frame generation unit 360 generates an interpolation frame from them and outputs it one frame later.
  • the interpolation frame is generated by performing linear interpolation using two frames.
  • the interpolation frame [1.8] is generated by linear interpolation using the motion vector obtained from the frame [1] and the frame [2].
  • the interpolation frame generation unit 360 divides, for example, the frame [1] and the frame [2] of the read frame into several small regions.
  • The interpolation frame generation unit 360 obtains the SAD (Sum of Absolute Differences) between one small area of the frame [2] (hereinafter referred to as the small area A) and each small area of the frame [1]. Subsequently, the interpolation frame generation unit 360 obtains the small area of the frame [1] (hereinafter referred to as the small area A′) corresponding to the smallest of the SAD values obtained for the respective small areas. Further, the interpolation frame generation unit 360 uses the small area A and the small area A′ to obtain a motion vector A that represents the change in position from the small area A to the small area A′.
  • Next, among the small areas included in the interpolation frame [1.8], the interpolation frame generation unit 360 determines the small area (hereinafter referred to as the small area A″) at the position displaced by the motion vector A × 0.2 from the position of the small area A.
  • the luminance of each pixel in the small area A ′′ is obtained by proportionally distributing the luminance of each pixel in the small area A and the luminance of each pixel in the small area A ′.
  • the luminance of the small area A ′′ of the interpolation frame [1.8] is obtained.
  • The luminance of the other small areas of the interpolation frame [1.8] is obtained in the same manner by obtaining their motion vectors. As a result, the interpolation frame [1.8] is generated.
  • the interpolation frame [2.5] is generated by linearly interpolating from the frame [2] and the frame [3] using a motion vector.
  • the interpolation frame [3.2] is generated by linearly interpolating from the frame [3] and the frame [4] using a motion vector.
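The SAD search and proportional-luminance blend described above can be illustrated with a toy implementation. For brevity a "frame" here is a one-dimensional list of luminance values and a "small area" a fixed-length window; the area size, the search range, and the in-place blending (the patent additionally shifts the blended area along the motion vector, e.g. by A × 0.2 for frame [1.8]) are simplifying assumptions.

```python
def sad(area_a, area_b):
    """Sum of absolute differences between two equal-length small areas."""
    return sum(abs(a - b) for a, b in zip(area_a, area_b))

def motion_vector(cur_frame, prev_frame, pos, size, search):
    """Displacement minimising the SAD between the small area of the later
    frame at `pos` and candidate small areas of the earlier frame."""
    cur = cur_frame[pos:pos + size]
    candidates = [d for d in range(-search, search + 1)
                  if 0 <= pos + d <= len(prev_frame) - size]
    return min(candidates,
               key=lambda d: sad(cur, prev_frame[pos + d:pos + d + size]))

def interpolate(prev_frame, cur_frame, alpha, size=2, search=2):
    """Frame at fractional position `alpha` between prev_frame (0.0) and
    cur_frame (1.0); e.g. alpha = 0.8 corresponds to interpolation frame
    [1.8] generated between frames [1] and [2]."""
    out = []
    for pos in range(0, len(cur_frame), size):
        d = motion_vector(cur_frame, prev_frame, pos, size, search)
        matched = prev_frame[pos + d:pos + d + size]
        cur = cur_frame[pos:pos + size]
        # proportionally distribute the luminance of the matched small areas
        out.extend((1 - alpha) * p + alpha * c for p, c in zip(matched, cur))
    return out
```

In the test data a bright two-sample object moves between the frames; the SAD search finds the matching area in the earlier frame, so the blend keeps the object at full luminance instead of producing two half-luminance ghosts.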
  • FIG. 11 is a diagram for explaining the adjustment processing of the address control unit 320 and the interpolation frame generation unit 360 when the input frame rate is larger than the output frame rate.
  • In this case, among the output timings of the output frames, the interpolation frame generation unit 360 outputs the interpolation frames [2.3] and [3.7] at the respective timings at which the read frames [3] and [4] are output.
  • the output frames are frames [0], [1], [2.3], [3.7], [5], [6],... Using the interpolation frame.
  • the frame control procedure of the address control unit 320 and the interpolation frame generation method are the same as when the input frame rate is smaller than the output frame rate.
  • FIG. 12 is a flowchart of the adjustment process that eliminates the synchronization deviation in Embodiment 2.
  • the adjustment process of the frame synchronization unit 300 in the second embodiment is as follows.
  • the frame synchronization unit 300 acquires the input frame of the input video acquired by the video acquisition unit 100 (S210).
  • the frame synchronization unit 300 acquires the output frame rate determined by the output frame rate determination unit 200 (S220).
  • the address control unit 320 detects a frame to be replaced based on the input frame rate and the output frame rate (S230).
  • the address control unit 320 outputs the input frame as a read frame according to the output frame rate based on the phase difference (S240).
  • When the read frame is a frame to be replaced (Yes in S250), the address control unit 320 also outputs the preceding/following frame (S260).
  • the interpolation frame generation unit 360 generates an interpolation frame and outputs it as an output frame when the read frame and the preceding and following frames are acquired (S270).
  • Otherwise (No in S250), the interpolation frame generation unit 360 outputs the read frame as it is as an output frame (S280).
  • As described above, in Embodiment 2, the frame synchronization unit 300 does not output the frame specified by the read address as it is, but generates interpolation frames, performs the replacement process, and outputs them, thereby improving the temporal continuity between frames. Thereby, smooth and high-quality video display is realized. Also, since not all frames are replaced with interpolation frames but only the necessary portions, the load of the signal processing is small, and the synchronization deviation can be corrected effectively.
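The S250–S280 branch of FIG. 12 reduces to a small decision, sketched below with hypothetical stand-ins for the units in the text; `generate_interpolation` represents the interpolation frame generation unit 360.

```python
def embodiment2_output(read_frame, is_replacement_target, neighbours,
                       generate_interpolation):
    """Output frame for one tick of the FIG. 12 flow.

    neighbours             -- (previous, following) frames output at S260
    generate_interpolation -- hypothetical stand-in for the interpolation
                              frame generation unit 360 (S270)
    """
    if is_replacement_target:                    # S250: Yes
        prev_frame, next_frame = neighbours      # S260
        return generate_interpolation(prev_frame, next_frame)  # S270
    return read_frame                            # S280: output as it is
```

With a simple midpoint blend as the (hypothetical) interpolation, a replacement target between frames 1 and 2 yields 1.5, while a non-target frame passes through unchanged.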
  • Each of the above devices is specifically a computer system including a microprocessor, a ROM, a RAM, a hard disk unit, a display unit, a keyboard, a mouse, and the like.
  • a computer program is stored in the RAM or hard disk unit.
  • Each device achieves its functions by the microprocessor operating according to the computer program.
  • the computer program is configured by combining a plurality of instruction codes indicating instructions for the computer in order to achieve a predetermined function.
  • a part or all of the components constituting each of the above devices may be configured by one system LSI (Large Scale Integration).
  • The system LSI is a super-multifunctional LSI manufactured by integrating a plurality of components on a single chip, and is specifically a computer system including a microprocessor, a ROM, a RAM, and the like.
  • a computer program is stored in the RAM.
  • the system LSI achieves its functions by the microprocessor operating according to the computer program.
  • a part or all of the constituent elements constituting each of the above devices may be constituted by an IC card or a single module that can be attached to and detached from each device.
  • the IC card or the module is a computer system including a microprocessor, a ROM, a RAM, and the like.
  • the IC card or the module may include the super multifunctional LSI described above.
  • the IC card or the module achieves its function by the microprocessor operating according to the computer program. This IC card or this module may have tamper resistance.
  • the present invention may be the method described above. Further, the present invention may be a computer program that realizes these methods by a computer, or may be a digital signal composed of the computer program.
  • The present invention may also be realized by recording the computer program or the digital signal on a computer-readable recording medium such as a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a DVD-ROM, a DVD-RAM, a BD (Blu-ray Disc), or a semiconductor memory. The present invention may also be the digital signal recorded on these recording media.
  • the computer program or the digital signal may be transmitted via an electric communication line, a wireless or wired communication line, a network represented by the Internet, a data broadcast, or the like.
  • the present invention may be a computer system including a microprocessor and a memory, the memory storing the computer program, and the microprocessor operating according to the computer program.
  • The program or the digital signal may be recorded on the recording medium and transferred, or may be transferred via the network or the like, and may thereby be executed by another independent computer system.
  • the video signal processing device has been described based on the embodiment and the modifications thereof.
  • In the above embodiments, the frame adjustment processing for eliminating the deviation between the input frame rate and the output frame rate is performed in periods when the video motion is small, periods when the video motion is large, and periods corresponding to the switching of video scenes.
  • interpolated frames are generated, replaced and output, thereby improving temporal continuity between frames.
  • the video signal processing apparatus 10 is realized as the television 700 shown in FIG.
  • the specific configuration of the video display unit 600 is not particularly limited, and is, for example, a liquid crystal display, a plasma display, an organic EL (Electro Luminescence) display, or the like.
  • the video acquisition unit 100 acquires a video from a television broadcast, a DVD (Digital Versatile Disc) player 710 and a set top box 720 shown in FIG.
  • the video signal processing apparatus 10 may be realized as a DVD player 710.
  • the video acquisition unit 100 acquires video from the inserted DVD.
  • The acquisition source of the video is not limited to a DVD; the video can be acquired from any recording medium, such as a Blu-ray Disc or an HDD (Hard Disk Drive).
  • the video signal processing apparatus 10 may be realized as a set top box 720.
  • the video acquisition unit 100 acquires video from cable television broadcasting or the like.
  • Note that the present invention is not limited to these embodiments or their modifications. Various modifications conceived by those skilled in the art and applied to the present embodiments or their modifications, as well as forms constructed by combining components of different embodiments or modifications, are included within the scope of the present invention without departing from the gist of the present invention.
  • According to the present invention, when a plurality of videos with different frame rates are displayed in synchronization, the viewer can enjoy smooth and high-quality video without feeling a sense of discomfort. Therefore, the present invention is useful as a video signal processing device and a video signal processing method used for display devices having a function of displaying a plurality of videos, such as televisions and personal computers.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Television Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

An image signal processing device comprises an image acquisition unit (100) for sequentially acquiring a plurality of frames included in respective input images at a corresponding input frame rate, an output frame rate determining unit (200) for determining an output frame rate, a frame synchro unit (300) for sequentially outputting the plurality of frames for the respective input images obtained by the image acquisition unit at the output frame rate, and an image outputting unit (500) for sequentially outputting a composite frame obtained by compositing each frame synchronously outputted at the output frame rate from the frame synchro unit (300), wherein the frame synchro unit (300) performs, with respect to asynchronous images whose corresponding input frame rate among the plurality of input images differs from the output frame rate, reconciliation processing that eliminates the deviation between the input frame rate and the output frame rate so as not to be noticeable to a viewer.

Description

Video signal processing apparatus and video signal processing method
The present invention relates to a video signal processing apparatus and a video signal processing method for displaying a plurality of videos having different frame rates in synchronization.
In recent years, with the increasing screen size and resolution of displays in DTVs (digital televisions) and the like, it has become common to view a plurality of video contents simultaneously. For example, a viewer can view videos of different channels at the same time. It is also possible to simultaneously view video content from a plurality of video sources, such as television broadcast video, video from a recording device, or video obtained from an external network.
When displaying content acquired from a plurality of video sources in synchronization, it is necessary to display each content in synchronization with the frame rate of one of the video sources. In such a case, it is common to display the other videos (hereinafter referred to as sub-videos) in synchronization with the frame rate of the main video, which has the largest display size on the display and is the video most easily perceived by the viewer. In some cases, a PC or the like determines a frame rate with reference to a free-running vertical synchronization signal and displays all videos in synchronization with the determined frame rate.
In order to display the sub-video in synchronization with the main video, it is common to perform frame synchronization processing, which is an adjustment process for the excess or shortage of frames. In the frame synchronization process, if the frame rate of the sub-video is larger than that of the main video, sub-video frames are thinned out at a constant period (skip processing) and displayed. If the frame rate of the sub-video is smaller than that of the main video, the same sub-video frame is repeatedly displayed at a constant period (repeat processing). For example, when the frame rate of the main video is 59.94 fps and that of the sub-video is 60.00 fps, one sub-video frame is skipped every time about 1000 main-video frames (about 16.7 seconds) are displayed (see, for example, Patent Document 1 and Patent Document 2).
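The skip interval quoted above follows directly from the rate difference: one sub-video frame must be dropped each time the 60.00 fps sub-video gains a whole frame over the 59.94 fps main video. A short check (the function name is illustrative):

```python
def skip_interval(main_fps, sub_fps):
    """Main-video frames displayed per skipped sub-video frame when the
    sub-video runs faster than the main video."""
    return main_fps / (sub_fps - main_fps)

frames = skip_interval(59.94, 60.00)   # about 1000 main-video frames
seconds = frames / 59.94               # about 16.7 seconds
```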
JP 2005-341132 A; JP 2007-271848 A
However, in such frame synchronization processing, sub-video frames are thinned out at regular intervals, or the same frame is repeatedly displayed, which may give the viewer a sense of discomfort. For example, if frame synchronization processing is performed during a period in which the video pans or tilts, judder in the video is easily perceived by the viewer. That is, the problem is that the frame synchronization processing makes the viewer feel uncomfortable and impairs the viewing quality.
Therefore, an object of the present invention is to provide a video signal processing apparatus capable of displaying smooth, high-definition video with little discomfort for the viewer when a plurality of videos with different frame rates are displayed in synchronization.
In order to solve the above problems, a video signal processing apparatus according to an aspect of the present invention is a video signal processing apparatus that synchronizes and outputs a plurality of input videos having mutually different input frame rates, and includes: a video acquisition unit that sequentially acquires the frames included in each of the plurality of input videos at the corresponding input frame rate; an output frame rate determination unit that determines an output frame rate; a frame synchronization unit that sequentially outputs the frames of each of the plurality of input videos acquired by the video acquisition unit in synchronization with the output frame rate; and a video output unit that sequentially outputs composite frames obtained by combining the frames output from the frame synchronization unit in synchronization with the output frame rate. For an asynchronous video, among the plurality of input videos, whose corresponding input frame rate differs from the output frame rate, the frame synchronization unit executes the adjustment process for eliminating the deviation between the input frame rate and the output frame rate during a period that can be determined to be inconspicuous to the viewer.
 また、前記フレームシンクロ部は、前記非同期映像に前記調整処理を実行することによって前記合成フレームに生じる違和感が、当該合成フレームを視聴する視聴者に目立ちにくい期間である調整処理期間を検出するシンクロタイミング検出部と、前記シンクロタイミング検出部で検出された前記調整処理期間に、前記非同期映像に対して前記調整処理を実行する制御部とを備えることが好ましい。 Preferably, the frame synchronization unit includes: a synchronization timing detection unit that detects an adjustment processing period, which is a period in which the discomfort caused in the composite frame by executing the adjustment process on the asynchronous video is less noticeable to a viewer viewing the composite frame; and a control unit that executes the adjustment process on the asynchronous video during the adjustment processing period detected by the synchronization timing detection unit.
 これにより、視聴者の違和感が生じにくい期間においてフレームの調整処理が行われるため、スムースで高品位な映像表示が実現される。 As a result, since the frame adjustment process is performed in a period in which the viewer does not feel uncomfortable, a smooth and high-quality video display is realized.
 また、前記シンクロタイミング検出部は、前記非同期映像の動き量を検出し、前記動き量が第一の閾値よりも小さい期間、及び前記動き量が前記第一の閾値よりも大きい第二の閾値よりも大きい期間を、前記調整処理期間として検出する、または、前記シンクロタイミング検出部は、前記非同期映像の場面の切り替わりを検出し、前記映像の場面の切り替わりに対応する期間を、前記調整処理期間として検出してもよい。 The synchronization timing detection unit may detect a motion amount of the asynchronous video and detect, as the adjustment processing period, a period in which the motion amount is smaller than a first threshold and a period in which the motion amount is larger than a second threshold that is larger than the first threshold. Alternatively, the synchronization timing detection unit may detect a scene change in the asynchronous video and detect a period corresponding to the scene change as the adjustment processing period.
 これにより、フレームの調整処理が、映像の動きの少ない期間、映像の動きの多い期間および映像の場面の切り替わりに対応する期間に行われるため、視聴者が映像の違和感を感じにくい。 Thus, since the frame adjustment process is performed during a period with little video motion, a period with much video motion, or a period corresponding to a scene change, the viewer is unlikely to perceive any unnaturalness in the video.
 なお、前記フレームシンクロ部は、前記調整処理として、前記非同期映像の前記入力フレームレートが前記出力フレームレートより大きい場合にフレームの一部を間引いて出力し、前記非同期映像の前記入力フレームレートが前記出力フレームレートより小さい場合に同一のフレームを繰り返し出力する。 As the adjustment process, the frame synchronization unit thins out some of the frames when the input frame rate of the asynchronous video is higher than the output frame rate, and repeatedly outputs the same frame when the input frame rate of the asynchronous video is lower than the output frame rate.
 また、前記フレームシンクロ部は、前記調整処理として、所定の期間に前記非同期映像に含まれるM(Mは2以上の整数)枚の元フレームを用いて当該M枚の元フレームの間のフレームに相当するN(M≠N)枚の補間フレームを生成し、前記所定の期間において、生成した前記N枚の補間フレームを前記非同期映像のフレームとして出力してもよい。 As the adjustment process, the frame synchronization unit may generate, using M (M is an integer of 2 or more) original frames included in the asynchronous video in a predetermined period, N (M ≠ N) interpolated frames corresponding to frames between the M original frames, and output the generated N interpolated frames as frames of the asynchronous video in the predetermined period.
 これにより、一定周期でフレームの調整処理が行われる場合であっても、補間フレームを生成し、表示することで視聴者の違和感が小さい映像表示が実現できる。 Thus, even when frame adjustment processing is performed at a fixed period, an interpolated frame is generated and displayed, so that a video display with a small discomfort for the viewer can be realized.
 また、前記フレームシンクロ部は、生成する補間フレームの前後に位置する元フレーム間の動きベクトルを検出し、検出した前記動きベクトルを前記元フレームに対する前記補間フレームの時間的距離で比例配分して得られる補間動きベクトルを、前記補間フレームの前記元フレームに対する動きベクトルであると仮定して、前記補間フレームを生成してもよい。 The frame synchronization unit may also detect a motion vector between the original frames located before and after an interpolated frame to be generated, and generate the interpolated frame by assuming that an interpolated motion vector, obtained by proportionally scaling the detected motion vector by the temporal distance of the interpolated frame from the original frames, is the motion vector of the interpolated frame with respect to the original frames.
 これにより、動きベクトルを用いることでより精度の高い補間フレームが生成され、スムースな映像を表示できる。 Thus, a more accurate interpolation frame is generated by using the motion vector, and a smooth video can be displayed.
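The proportional scaling of the motion vector described above can be sketched as follows. This is an illustrative sketch only, not the claimed implementation: it assumes a single global motion vector and whole-pixel shifts, whereas a real interpolator works block- or pixel-wise, and all names and parameters are hypothetical.

```python
import numpy as np

def interpolate_frame(prev_frame, next_frame, motion_vector, t):
    """Generate an interpolated frame at fractional time t (0 < t < 1)
    between two original frames, given a single global motion vector
    (dx, dy) detected from prev_frame to next_frame. The interpolated
    motion vector is the detected vector scaled by the temporal distance t."""
    dx, dy = motion_vector
    # Shift the previous frame forward along the proportionally scaled vector
    fwd = np.roll(prev_frame, (round(dy * t), round(dx * t)), axis=(0, 1))
    # Shift the next frame backward along the remaining fraction of the vector
    back = np.roll(next_frame,
                   (-round(dy * (1 - t)), -round(dx * (1 - t))), axis=(0, 1))
    # Average the two motion-compensated predictions
    return (fwd + back) / 2.0
```

With t = 0.5, an object that moves two pixels between the original frames appears one pixel along the motion path in the interpolated frame, which is what makes the inserted or dropped frame less noticeable.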
 なお、出力フレームレート決定部は、前記複数の入力映像のうちのいずれかの前記入力フレームレートを、前記出力フレームレートと決定してもよいし、出力フレームレート決定部は、前記複数の入力映像の前記入力フレームレートのいずれとも異なるフレームレートを、前記出力フレームレートと決定してもよい。 The output frame rate determination unit may determine any one of the input frame rates of the plurality of input videos as the output frame rate, or may determine a frame rate different from any of the input frame rates of the plurality of input videos as the output frame rate.
 また、本発明は、入力フレームレートが互いに異なる複数の入力映像を同期させて出力する映像信号処理方法であって、前記複数の入力映像それぞれに含まれるフレームを、対応する前記入力フレームレートで順次取得する映像取得ステップと、出力フレームレートを決定する出力フレームレート決定ステップと、前記映像取得部で取得された前記複数の入力映像それぞれのフレームを、前記出力フレームレートで順次出力するフレームシンクロステップと、前記フレームシンクロステップから前記出力フレームレートに同期して出力された各フレームを合成して得られる合成フレームを順次出力する映像出力ステップとを含み、前記フレームシンクロステップでは、前記複数の入力映像のうち、対応する前記入力フレームレートが前記出力フレームレートと異なる非同期映像に対して、前記入力フレームレートと前記出力フレームレートとのズレを解消する調整処理を、視聴者に目立たないように実行し、前記フレームシンクロステップは、前記非同期映像のフレームに前記調整処理を施したことによって前記合成フレームに生じる違和感が、当該合成フレームを視聴する視聴者に目立ちにくい期間である調整処理期間を検出するシンクロタイミング検出ステップと、前記シンクロタイミング検出ステップで検出された前記調整処理期間に、前記非同期映像の各フレームに前記調整処理を施して出力する制御ステップとを含む映像信号処理方法として実現されてもよい。 The present invention may also be realized as a video signal processing method for outputting a plurality of input videos having mutually different input frame rates in synchronization with each other, including: a video acquisition step of sequentially acquiring the frames included in each of the plurality of input videos at the corresponding input frame rate; an output frame rate determination step of determining an output frame rate; a frame synchronization step of sequentially outputting the frames of each of the plurality of input videos acquired in the video acquisition step at the output frame rate; and a video output step of sequentially outputting a composite frame obtained by combining the frames output from the frame synchronization step in synchronization with the output frame rate. In the frame synchronization step, on an asynchronous video among the plurality of input videos whose corresponding input frame rate differs from the output frame rate, an adjustment process that eliminates the deviation between the input frame rate and the output frame rate is executed so as to be inconspicuous to the viewer. The frame synchronization step includes: a synchronization timing detection step of detecting an adjustment processing period, which is a period in which the discomfort caused in the composite frame by applying the adjustment process to the frames of the asynchronous video is less noticeable to a viewer viewing the composite frame; and a control step of applying the adjustment process to each frame of the asynchronous video and outputting the frames during the adjustment processing period detected in the synchronization timing detection step.
 また、本発明の映像信号処理方法は、入力フレームレートが互いに異なる複数の入力映像を同期させて出力する映像信号処理方法であって、前記複数の入力映像それぞれに含まれるフレームを、対応する前記入力フレームレートで順次取得する映像取得ステップと、出力フレームレートを決定する出力フレームレート決定ステップと、前記映像取得部で取得された前記複数の入力映像それぞれのフレームを、前記出力フレームレートで順次出力するフレームシンクロステップと、前記フレームシンクロステップから前記出力フレームレートに同期して出力された各フレームを合成して得られる合成フレームを順次出力する映像出力ステップとを含み、前記フレームシンクロステップでは、前記複数の入力映像のうち、対応する前記入力フレームレートが前記出力フレームレートと異なる非同期映像の所定の期間に含まれるM(Mは2以上の整数)枚の元フレームを用いて当該M枚の元フレームの間のフレームに相当するN(M≠N)枚の補間フレームを生成し、前記所定の期間において、生成した前記N枚の補間フレームを、前記非同期映像のフレームとして出力してもよい。 The video signal processing method of the present invention may also be a video signal processing method for outputting a plurality of input videos having mutually different input frame rates in synchronization with each other, including: a video acquisition step of sequentially acquiring the frames included in each of the plurality of input videos at the corresponding input frame rate; an output frame rate determination step of determining an output frame rate; a frame synchronization step of sequentially outputting the frames of each of the plurality of input videos acquired in the video acquisition step at the output frame rate; and a video output step of sequentially outputting a composite frame obtained by combining the frames output from the frame synchronization step in synchronization with the output frame rate. In the frame synchronization step, N (M ≠ N) interpolated frames corresponding to frames between M original frames may be generated using the M (M is an integer of 2 or more) original frames included in a predetermined period of an asynchronous video whose corresponding input frame rate differs from the output frame rate, and the generated N interpolated frames may be output as frames of the asynchronous video in the predetermined period.
 本発明の一態様に係る映像信号処理装置によれば、フレームレートの異なる複数の映像を同期させて表示する場合に、視聴者が違和感を感じない、スムースで高品位な映像を表示できる。 The video signal processing apparatus according to one aspect of the present invention can display a smooth and high-quality video in which a viewer does not feel uncomfortable when a plurality of videos with different frame rates are displayed in synchronization.
図1は、本発明の実施の形態における映像信号処理装置の構成を表すブロック図である。FIG. 1 is a block diagram showing a configuration of a video signal processing apparatus according to an embodiment of the present invention. 図2は、本発明の実施の形態1に係る映像信号処理装置の動作のフローチャートである。FIG. 2 is a flowchart of the operation of the video signal processing apparatus according to Embodiment 1 of the present invention. 図3は、本発明の実施の形態1におけるフレームシンクロ部の構成を表すブロック図である。FIG. 3 is a block diagram showing the configuration of the frame synchronization unit in the first embodiment of the present invention. 図4は、従来のフレームスキップ処理を説明するための図である。FIG. 4 is a diagram for explaining conventional frame skip processing. 図5は、従来のフレームリピート処理を説明するための図である。FIG. 5 is a diagram for explaining a conventional frame repeat process. 図6は、本発明の実施の形態1におけるフレームスキップ処理を説明する図である。FIG. 6 is a diagram for explaining frame skip processing according to Embodiment 1 of the present invention. 図7は、本発明の実施の形態1におけるフレームリピート処理を説明する図である。FIG. 7 is a diagram illustrating frame repeat processing according to Embodiment 1 of the present invention. 図8は、本発明の実施の形態1におけるフレームシンクロ部の調整処理のフローチャートである。FIG. 8 is a flowchart of the adjustment process of the frame synchronization unit in the first embodiment of the present invention. 図9は、本発明の実施の形態2におけるフレームシンクロ部の構成を表すブロック図である。FIG. 9 is a block diagram showing the configuration of the frame synchronization unit in the second embodiment of the present invention. 図10は、本発明の実施の形態2におけるフレームシンクロ部の調整処理を説明するための図である。FIG. 10 is a diagram for explaining the adjustment process of the frame synchronization unit in the second embodiment of the present invention. 図11は、本発明の実施の形態2におけるフレームシンクロ部の調整処理を説明するための図である。FIG. 11 is a diagram for explaining the adjustment process of the frame synchronization unit in the second embodiment of the present invention. 図12は、本発明の実施の形態2におけるフレームシンクロ部の調整処理のフローチャートである。FIG. 12 is a flowchart of the adjustment process of the frame synchronization unit according to the second embodiment of the present invention. 図13は、本発明の実施の形態1及び2に係る映像信号処理装置の適用例を示す図である。FIG. 13 is a diagram showing an application example of the video signal processing apparatus according to Embodiments 1 and 2 of the present invention.
 以下、本発明の実施の形態について図面を用いて詳細に説明する。なお、以下で説明する本発明の実施の形態は、本発明の好ましい一具体例を示すものである。本実施の形態で示される数値、形状、構成要素、構成要素の配置および接続形態などは、一例であり、本発明を限定する主旨ではない。本発明は、請求の範囲だけによって限定される。よって、以下の実施の形態における構成要素のうち、独立請求項に記載されていない構成要素は、本発明の課題を達成するのに必ずしも必要ではないが、より好ましい形態を構成するものとして説明される。 Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. The embodiment of the present invention described below shows a preferred specific example of the present invention. The numerical values, shapes, components, arrangement of components, connection forms, and the like shown in this embodiment are merely examples, and are not intended to limit the present invention. The present invention is limited only by the claims. Therefore, among the constituent elements in the following embodiments, constituent elements that are not described in the independent claims are not necessarily required to achieve the object of the present invention, but are described as constituting more preferable embodiments.
 (実施の形態1)
  以下、本発明の実施の形態1について、図1及び図2を用いて説明する。
(Embodiment 1)
Hereinafter, Embodiment 1 of the present invention will be described with reference to FIGS. 1 and 2.
  図1は、本発明の実施の形態1に係る映像信号処理装置の構成を示すブロック図である。 FIG. 1 is a block diagram showing the configuration of the video signal processing apparatus according to Embodiment 1 of the present invention.
  図2は、本発明の実施の形態1に係る映像信号処理装置の動作のフローチャートである。 FIG. 2 is a flowchart of the operation of the video signal processing apparatus according to Embodiment 1 of the present invention.
 図1に示すように、映像信号処理装置10は、映像取得部100と、出力フレームレート決定部200と、フレームシンクロ部300と、映像出力部500とを備える。なお、図1ではフレームメモリ400と映像表示部600も図示されている。 As shown in FIG. 1, the video signal processing device 10 includes a video acquisition unit 100, an output frame rate determination unit 200, a frame synchronization unit 300, and a video output unit 500. In FIG. 1, a frame memory 400 and a video display unit 600 are also shown.
 映像取得部100は、複数の入力映像それぞれに含まれるフレームを、対応する入力フレームレートで順次取得する(S10)。また、映像取得部100は、入力フレームレートに応じて取得した入力映像信号の各フレームをフレームシンクロ部300へ出力する。 The video acquisition unit 100 sequentially acquires frames included in each of the plurality of input videos at a corresponding input frame rate (S10). In addition, the video acquisition unit 100 outputs each frame of the input video signal acquired according to the input frame rate to the frame synchronization unit 300.
 出力フレームレート決定部200は、出力フレームレートを決定する。具体的には、映像取得部100に入力された複数の映像信号にそれぞれ対応する入力フレームレートのうちの一の入力フレームレートを出力フレームレートとして決定する(S20)。 The output frame rate determination unit 200 determines the output frame rate. Specifically, one of the input frame rates respectively corresponding to the plurality of video signals input to the video acquisition unit 100 is determined as the output frame rate (S20).
 なお、出力フレームレート決定部200は、フリーラン垂直同期信号から出力フレームレートを決定してもよい。フリーラン垂直同期信号とは、クロック周波数が可変であるクロック発生器により発生させたクロックを用い、所定の周波数(例えば60Hz)を実現した信号である。 Note that the output frame rate determination unit 200 may determine the output frame rate from the free-run vertical synchronization signal. The free-run vertical synchronization signal is a signal that realizes a predetermined frequency (for example, 60 Hz) using a clock generated by a clock generator having a variable clock frequency.
 フレームシンクロ部300は、映像取得部100で取得された複数の入力映像それぞれのフレームを、出力フレームレート決定部200で決定された出力フレームレートで順次出力する(S30)。具体的には、映像取得部100で取得された複数の入力映像それぞれのフレームをフレームメモリ400に一旦格納し、出力フレームレートで読み出し、映像出力部500へ出力する。フレームシンクロ部300は、複数の入力映像のうち、対応する入力フレームレートが出力フレームレートと異なる非同期映像の各フレームについては、フレーム数の調整処理を施して出力する。フレームシンクロ部300のさらに具体的な構成及び調整処理の詳細については後述する。 The frame synchronization unit 300 sequentially outputs the frames of each of the plurality of input videos acquired by the video acquisition unit 100 at the output frame rate determined by the output frame rate determination unit 200 (S30). Specifically, the frames of the plurality of input videos acquired by the video acquisition unit 100 are temporarily stored in the frame memory 400, read out at the output frame rate, and output to the video output unit 500. Among the plurality of input videos, for each frame of an asynchronous video whose corresponding input frame rate differs from the output frame rate, the frame synchronization unit 300 performs frame-count adjustment processing before output. A more specific configuration of the frame synchronization unit 300 and details of the adjustment process will be described later.
 フレームメモリ400は、複数の入力映像それぞれのフレームを一時的に格納する記憶部である。 記憶部の具体的な構成は特に限定されないが、例えば、DRAM(Dynamic Random Access Memory)、SDRAM(Synchronous Dynamic Random Access Memory)、フラッシュメモリ、強誘電体メモリ、又はHDD(Hard Disk Drive)等のデータを記憶可能な手段であればどのようなものを利用しても構わない。 The frame memory 400 is a storage unit that temporarily stores the frames of each of the plurality of input videos. The specific configuration of the storage unit is not particularly limited; any means capable of storing data may be used, for example a DRAM (Dynamic Random Access Memory), SDRAM (Synchronous Dynamic Random Access Memory), flash memory, ferroelectric memory, or HDD (Hard Disk Drive).
 映像出力部500は、フレームシンクロ部300から出力フレームレートに同期して出力された各フレームを合成して得られる合成フレームを映像表示部600へ順次出力する(S40)。 The video output unit 500 sequentially outputs combined frames obtained by combining the frames output from the frame sync unit 300 in synchronization with the output frame rate to the video display unit 600 (S40).
 映像表示部600は、映像出力部500から出力された合成フレームで構成される映像を表示する(S50)。映像表示部600は、具体的には、例えば、LCD(Liquid Crystal Display)である。これに限らず、PDP(Plasma Display Panel)や有機ELディスプレイ(OLED:Organic Light Emitting Display)であってもよい。 The video display unit 600 displays a video composed of the composite frames output from the video output unit 500 (S50). Specifically, the video display unit 600 is, for example, an LCD (Liquid Crystal Display). The display is not limited to this, and may be a PDP (Plasma Display Panel) or an organic EL display (OLED: Organic Light Emitting Display).
 次に、フレームシンクロ部300についてさらに詳しく説明する。 Next, the frame synchronization unit 300 will be described in more detail.
 なお、以下の説明では、フレームシンクロ部300に入力された1つの非同期映像について説明するが、実際には複数の非同期映像について以下の調整処理が行われる。 In the following description, one asynchronous video input to the frame synchronization unit 300 will be described. However, the following adjustment processing is actually performed for a plurality of asynchronous videos.
 図3は、実施の形態1におけるフレームシンクロ部300の構成図である。 FIG. 3 is a configuration diagram of the frame synchronization unit 300 according to the first embodiment.
 フレームシンクロ部300は、シンクロタイミング検出部310と、アドレス制御部320(制御部)とで構成される。 The frame synchronization unit 300 includes a synchronization timing detection unit 310 and an address control unit 320 (control unit).
 シンクロタイミング検出部310は、非同期映像のフレームに調整処理を施すことによって合成フレームに生じる違和感が、当該合成フレームを視聴する視聴者に目立ちにくい期間である調整処理期間を検出する。 The synchro timing detection unit 310 detects an adjustment processing period in which the uncomfortable feeling generated in the composite frame by performing the adjustment process on the frame of the asynchronous video is less noticeable to the viewer who views the composite frame.
 例えば、シンクロタイミング検出部310は、映像の動きが少ない期間、または映像の動きが激しい期間、または映像の場面が変化し、場面の変化に際してブラックアウトや、ホワイトアウト等の発生する期間を調整処理期間として検出する。 For example, the synchronization timing detection unit 310 detects, as the adjustment processing period, a period in which the video motion is small, a period in which the video motion is intense, or a period in which the video scene changes and a blackout, whiteout, or the like occurs at the scene change.
 このような調整処理期間において調整処理を行うことにより、視聴者の違和感が小さい映像表示が実現できる。 By performing the adjustment process in such an adjustment process period, it is possible to realize a video display with a little discomfort for the viewer.
 以下、調整処理期間の詳細について説明する。 The details of the adjustment processing period will be described below.
 映像の動きが少ない期間では、時間的に連続したフレーム間での輝度変化は小さく、映像の動きが激しい期間では、フレーム間での輝度変化は大きい。このため、映像の動きが少ない期間、または映像の動きが激しい期間は、例えば、SAD(Sum of Absolute Difference)によって検出される。 In the period when the motion of the image is small, the change in luminance between temporally continuous frames is small, and in the period where the motion of the image is intense, the change in luminance between frames is large. For this reason, a period in which the motion of the video is small or a period in which the motion of the video is intense is detected by, for example, SAD (Sum of Absolute Difference).
 SADとは、フレームを構成する各画素について、当該フレームと時間的に連続するフレームの対応する画素との輝度差の絶対値を求め、求めた各画素の輝度差の絶対値を合計したパラメータである。以下、SADを用いて映像の動き量を検出する方法について説明する。 SAD is a parameter obtained by calculating, for each pixel constituting a frame, the absolute value of the luminance difference from the corresponding pixel of the temporally adjacent frame, and summing the calculated absolute values over all pixels. A method for detecting the amount of motion of a video using SAD is described below.
 まず、シンクロタイミング検出部310は、非同期映像を構成する時間的に連続したフレームについて、それぞれのフレームをいくつかの小領域に分割する。 First, the synchronization timing detection unit 310 divides each frame into several small areas for temporally continuous frames constituting the asynchronous video.
 次に、シンクロタイミング検出部310は、1つのフレームの分割した小領域のそれぞれについて、前のフレームまたは後のフレームの対応する小領域とのSADを求める。つまり、シンクロタイミング検出部310は、小領域を構成する各画素について、前のフレームまたは後のフレームの対応する画素とのそれぞれの輝度差の絶対値の総和を求める。この結果、小領域の数と同じ数のSADが求められる。 Next, the synchronization timing detection unit 310 obtains the SAD of each divided small area of one frame with the corresponding small area of the previous frame or the subsequent frame. That is, the synchronization timing detection unit 310 calculates the sum of absolute values of the luminance differences of the pixels constituting the small area and the corresponding pixels in the previous frame or the subsequent frame. As a result, the same number of SADs as the number of small regions is obtained.
 最後に、シンクロタイミング検出部310は、それぞれの小領域に対応するSADの総和を求める。シンクロタイミング検出部310は、このSADの総和に対して閾値を設定し、映像の動きが少ない期間と映像の動きが激しい期間とを検出する。 Finally, the synchronization timing detection unit 310 obtains the sum of SADs corresponding to each small area. The synchronization timing detection unit 310 sets a threshold for the sum of the SADs, and detects a period in which the video motion is small and a period in which the video motion is intense.
 具体的には、シンクロタイミング検出部310は、SADの総和が第一の閾値よりも小さいフレームについて、当該フレームに対応する期間を、映像の動きが少ない期間として検出する。また、シンクロタイミング検出部310は、SADの総和が第一の閾値よりも大きい第二の閾値よりも大きいフレームについて、当該フレームに対応する期間を映像の動きが激しい期間として検出する。 Specifically, the synchronization timing detection unit 310 detects a period corresponding to the frame as a period in which the motion of the video is small for a frame in which the total SAD is smaller than the first threshold. In addition, the synchronization timing detection unit 310 detects a period corresponding to the frame as a period in which the motion of the video is intense for a frame in which the sum of SADs is larger than the second threshold, which is larger than the first threshold.
 なお、シンクロタイミング検出部310は、SADが所定の値を超えた小領域の個数によって映像の動きが少ない期間と映像の動きが激しい期間とを検出してもよい。 Note that the synchronization timing detection unit 310 may detect a period in which the video motion is small and a period in which the video motion is intense depending on the number of small regions where the SAD exceeds a predetermined value.
 この場合、シンクロタイミング検出部310は、所定の値を超えた小領域の個数が第一の閾値よりも少ないフレームに対応する期間を、映像の動きが少ない期間として検出する。また、シンクロタイミング検出部310は、所定の値を超えた小領域の個数が第一の閾値よりも大きい第二の閾値よりも大きいフレームに対応する期間を映像の動きが激しい期間として検出する。 In this case, the synchronization timing detection unit 310 detects a period corresponding to a frame in which the number of small areas exceeding the predetermined value is smaller than a first threshold as a period with little video motion, and detects a period corresponding to a frame in which that number is larger than a second threshold, which is larger than the first threshold, as a period with intense video motion.
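The SAD-based detection above (block partitioning, per-block SAD, and two-threshold classification of the total) can be sketched as follows. The block size and both thresholds are illustrative assumptions, not values from the specification, and the function name is hypothetical.

```python
import numpy as np

def detect_adjustment_period(prev_frame, cur_frame,
                             block=8, th_low=50.0, th_high=5000.0):
    """Sum per-block SADs between two consecutive frames and treat the
    frame as a candidate adjustment-processing period when the total
    motion is either very small (below th_low) or very large (above
    th_high), i.e. when a skipped or repeated frame is hard to notice."""
    h, w = cur_frame.shape
    total_sad = 0.0
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            a = cur_frame[y:y + block, x:x + block].astype(np.float64)
            b = prev_frame[y:y + block, x:x + block].astype(np.float64)
            total_sad += np.abs(a - b).sum()  # SAD of this small region
    # Little motion or intense motion => adjustment is less noticeable
    return total_sad < th_low or total_sad > th_high
```

The same structure also covers the variant in the previous paragraph: instead of summing the per-block SADs, one would count the blocks whose SAD exceeds a predetermined value and apply the two thresholds to that count.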
 また、映像の場面が変化する期間は、例えば、フレーム全体の平均輝度であるAPL(Average Picture Level)によって検出される。具体的には、シンクロタイミング検出部310は、非同期映像のフレームについてAPLを算出し、当該フレームより時間的に前の、連続する複数フレームのAPLの平均との差が大きくなるフレームについて、当該フレームに対応する期間を映像の場面が変化する期間として検出する。 A period in which the video scene changes is detected by, for example, the APL (Average Picture Level), which is the average luminance of the entire frame. Specifically, the synchronization timing detection unit 310 calculates the APL of a frame of the asynchronous video and, for a frame whose APL differs greatly from the average APL of a plurality of consecutive frames temporally preceding it, detects the period corresponding to that frame as a period in which the video scene changes.
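The APL comparison above can be sketched as follows. The window contents and the threshold are illustrative assumptions; the specification does not give concrete values.

```python
import numpy as np

def is_scene_change(frame, recent_apls, th=30.0):
    """Compare the frame's APL (average picture level, i.e. mean luminance)
    with the mean APL of the preceding consecutive frames; a large jump is
    treated as a scene change. Returns (changed, apl) so the caller can
    maintain the sliding window of recent APLs."""
    apl = float(np.mean(frame))
    if not recent_apls:  # no history yet: cannot judge a change
        return False, apl
    prev_mean = sum(recent_apls) / len(recent_apls)
    return abs(apl - prev_mean) > th, apl
```

A blackout or whiteout at a scene cut produces exactly this kind of APL jump, which is why such periods are good candidates for the adjustment processing period.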
 次に、アドレス制御部320の調整処理について詳細に説明する。 Next, the adjustment process of the address control unit 320 will be described in detail.
 まず、アドレス制御部320の従来の調整処理について説明する。 First, a conventional adjustment process of the address control unit 320 will be described.
 図4及び図5は、従来の調整処理を説明するための図である。 4 and 5 are diagrams for explaining conventional adjustment processing.
 従来、アドレス制御部320は、入力映像(非同期映像)に含まれるフレームを入力フレームレートに応じてフレームメモリ400に格納する際、フレームメモリ400に格納されるフレームを表す書込みアドレスを設定する。書込みアドレスは、入力フレームレートのフレーム出力タイミングに応じてカウントアップされる。 Conventionally, the address control unit 320 sets a write address representing a frame stored in the frame memory 400 when a frame included in the input video (asynchronous video) is stored in the frame memory 400 according to the input frame rate. The write address is counted up according to the frame output timing of the input frame rate.
 なお、書込みアドレスは、フレームメモリに格納可能なフレーム数を超えた時点で「0」にリセットされる。図4及び図5に示される例ではフレームメモリ400に格納可能なフレーム数は8フレームである。したがって、アドレスは、「0」~「7」の各数字のいずれかが設定される。 The write address is reset to “0” when the number of frames that can be stored in the frame memory is exceeded. In the example shown in FIGS. 4 and 5, the number of frames that can be stored in the frame memory 400 is eight frames. Therefore, any of the numbers “0” to “7” is set as the address.
 一方、アドレス制御部320は、出力フレームレートに同期させてフレームをフレームメモリ400から読み出し、出力する。このとき、アドレス制御部320は、上記フレーム格納時に設定された書込みアドレスを、読出しアドレスとして指定することで、フレームメモリ400からフレームを読み出す。読出しアドレスは、出力フレームレートのフレーム出力タイミングに応じてカウントアップされる。 On the other hand, the address control unit 320 reads the frame from the frame memory 400 in synchronization with the output frame rate and outputs it. At this time, the address control unit 320 reads a frame from the frame memory 400 by designating the write address set at the time of storing the frame as a read address. The read address is counted up according to the frame output timing of the output frame rate.
 また、読出しアドレスをカウントアップするタイミングにおける、書込みアドレスと読出しアドレスとのフレーム数の差を位相差と呼ぶ。例えば、位相差が「1」である場合、読出しアドレスが同じタイミングの書込みアドレスよりも1フレーム前のフレームを表すアドレスであることを意味する。 The difference in the number of frames between the write address and the read address at the timing when the read address is counted up is called the phase difference. For example, a phase difference of "1" means that the read address represents the frame one frame before the write address at the same timing.
 図4及び図5に示される従来の例では、アドレス制御部320は、出力フレームレートのフレーム出力タイミングでの位相差が、所定の位相差であるかどうかを基準にフレームスキップ処理を行う。図4及び図5に示される例では、所定の位相差を「1」とする。 In the conventional example shown in FIGS. 4 and 5, the address control unit 320 performs frame skip processing based on whether the phase difference at the frame output timing of the output frame rate is a predetermined phase difference. In the example shown in FIGS. 4 and 5, the predetermined phase difference is “1”.
 非同期映像の入力フレームレートが出力フレームレートよりも大きい場合、アドレス制御部320は、フレームの一部を間引いて出力する。具体的には、アドレス制御部320は、フレームメモリ400に格納された非同期映像のフレームを間引いて読み出し、出力フレームレートに同期させて映像出力部500へ出力する。この処理をフレームスキップ処理と呼ぶ。 When the input frame rate of asynchronous video is larger than the output frame rate, the address control unit 320 thins out a part of the frame and outputs it. Specifically, the address control unit 320 thins out and reads out the frames of the asynchronous video stored in the frame memory 400 and outputs them to the video output unit 500 in synchronization with the output frame rate. This process is called a frame skip process.
 図4は、従来のフレームスキップ処理を説明するための図である。 FIG. 4 is a diagram for explaining a conventional frame skip process.
 図4のタイミング(1)は、アドレス制御部320がフレームを出力し始める最初のタイミングである。図4のタイミング(1)では、書込みアドレスが「1」であるため、アドレス制御部320は、位相差が所定の位相差である「1」となるアドレス「0」を読出しアドレスとして設定し、アドレス「0」に対応するフレームをフレームメモリ400から読み出して出力する。 Timing (1) in FIG. 4 is the first timing at which the address control unit 320 starts outputting a frame. At timing (1) in FIG. 4, since the write address is “1”, the address control unit 320 sets the address “0” at which the phase difference is “1” which is a predetermined phase difference as the read address, A frame corresponding to the address “0” is read from the frame memory 400 and output.
 図4のタイミング(2)では、書込みアドレス「3」に対して読出しアドレスが「2」に設定されている。続くタイミング(3)では、書込みアドレス「4」に対して、タイミング(1)の状態からカウントアップされた読出しアドレスは「2」である。つまり、タイミング(3)において、所定の位相差「1」に対して位相差が「2」であり、位相差が所定の位相差よりも大きくなっている。 At timing (2) in FIG. 4, the read address is set to “2” with respect to the write address “3”. At the subsequent timing (3), the read address counted up from the state at the timing (1) is “2” with respect to the write address “4”. That is, at the timing (3), the phase difference is “2” with respect to the predetermined phase difference “1”, and the phase difference is larger than the predetermined phase difference.
 したがって、アドレス制御部320は、タイミング(3)において、フレームスキップ処理を行う。具体的には、アドレス制御部320は、アドレス「2」をスキップして「3」を読出しアドレスとして設定し、アドレス「3」に対応するフレームをフレームメモリ400から読み出して出力する。つまり、アドレス「2」に対応するフレームがスキップされる。 Therefore, the address control unit 320 performs frame skip processing at timing (3). Specifically, the address control unit 320 skips the address “2”, sets “3” as a read address, reads the frame corresponding to the address “3” from the frame memory 400, and outputs the frame. That is, the frame corresponding to the address “2” is skipped.
 このように、フレームスキップ処理が行われた結果、図4のタイミング(4)における位相差は「1」であり所定の位相差「1」と等しくなる。 As described above, as a result of the frame skip processing, the phase difference at the timing (4) in FIG. 4 is "1", which is equal to the predetermined phase difference "1".
 以降、同様にアドレス制御部320は、位相差が所定の位相差よりも大きいタイミングにおいてフレームスキップ処理を行うことで、非同期映像の各フレームを出力フレームレートに応じて出力する。 Thereafter, similarly, the address control unit 320 outputs each frame of the asynchronous video according to the output frame rate by performing a frame skip process at a timing when the phase difference is larger than the predetermined phase difference.
 また、非同期映像の入力フレームレートが出力フレームレートより小さい場合、アドレス制御部320は、同一のフレームを繰り返し出力する。具体的には、アドレス制御部320は、フレームメモリ400に格納された非同期映像のフレームを繰り返して読み出し、出力フレームレートに同期させて映像出力部500に出力する。この処理をフレームリピート処理と呼ぶ。 Also, when the input frame rate of asynchronous video is smaller than the output frame rate, the address control unit 320 repeatedly outputs the same frame. Specifically, the address control unit 320 repeatedly reads out the frame of the asynchronous video stored in the frame memory 400 and outputs it to the video output unit 500 in synchronization with the output frame rate. This process is called a frame repeat process.
 図5は、従来のフレームリピート処理を説明するための図である。 FIG. 5 is a diagram for explaining a conventional frame repeat process.
 図5のタイミング(1)は、アドレス制御部320がフレームを出力し始める最初のタイミングである。図5のタイミング(1)では、書込みアドレスが「1」であるため、アドレス制御部320は、位相差が所定の位相差である「1」となるアドレス「0」を読出しアドレスとして設定し、アドレス「0」に対応するフレームをフレームメモリ400から読み出して出力する。 Timing (1) in FIG. 5 is the first timing at which the address control unit 320 starts outputting a frame. At timing (1) in FIG. 5, since the write address is “1”, the address control unit 320 sets the address “0” at which the phase difference is “1” which is a predetermined phase difference as the read address, A frame corresponding to the address “0” is read from the frame memory 400 and output.
 図5のタイミング(2)では、書込みアドレス「3」に対して読出しアドレスが「2」に設定されている。続くタイミング(3)では、書込みアドレス「3」に対して、タイミング(2)の状態からカウントアップされた読出しアドレスは「3」である。つまり、タイミング(3)において、所定の位相差「1」に対して位相差は「0」であり、位相差が所定の位相差よりも小さくなっている。 At timing (2) in FIG. 5, the read address is set to “2” with respect to the write address “3”. At the subsequent timing (3), the read address counted up from the state at the timing (2) is “3” with respect to the write address “3”. That is, at the timing (3), the phase difference is “0” with respect to the predetermined phase difference “1”, and the phase difference is smaller than the predetermined phase difference.
 したがって、アドレス制御部320は、タイミング(3)において、フレームリピート処理を行う。具体的には、アドレス制御部320は、タイミング(3)においてアドレス「2」を再度読出しアドレスとして設定し、アドレス「2」に対応するフレームをフレームメモリ400から読み出して出力する。つまり、アドレス「2」に対応するフレームが繰り返して出力される。 Therefore, the address control unit 320 performs frame repeat processing at timing (3). Specifically, the address control unit 320 sets the address “2” as the read address again at the timing (3), reads the frame corresponding to the address “2” from the frame memory 400, and outputs it. That is, the frame corresponding to the address “2” is repeatedly output.
 このように、フレームリピート処理が行われた結果、図5のタイミング(4)における位相差は「1」であり所定の位相差「1」と等しくなる。 As described above, as a result of the frame repeat processing, the phase difference at the timing (4) in FIG. 5 is "1", which is equal to the predetermined phase difference "1".
 以降、同様にアドレス制御部320は、位相差が所定の位相差よりも小さいタイミングにおいてフレームリピート処理を行うことで、非同期映像の各フレームを出力フレームレートに応じて出力する。 Thereafter, similarly, the address control unit 320 outputs each frame of the asynchronous video according to the output frame rate by performing a frame repeat process at a timing when the phase difference is smaller than the predetermined phase difference.
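The conventional address control of FIGS. 4 and 5 can be sketched as follows, assuming the 8-slot frame memory and the target phase difference of "1" used in the figures. The function name and the ring-buffer formulation are illustrative, not the circuit of the patent.

```python
def next_read_address(write_addr, read_addr, num_slots=8, target_phase=1):
    """At each output-frame timing the read address normally advances by
    one, but when the phase difference (write address minus read address,
    modulo the ring size) drifts from the target, a frame is skipped
    (advance by two) or repeated (no advance)."""
    candidate = (read_addr + 1) % num_slots
    phase = (write_addr - candidate) % num_slots
    if phase > target_phase:    # input faster than output: skip one frame
        candidate = (candidate + 1) % num_slots
    elif phase < target_phase:  # input slower than output: repeat last frame
        candidate = read_addr
    return candidate
```

For example, with the write address at "4" and the last read address at "1", the normal increment to "2" would leave a phase difference of "2", so address "2" is skipped and "3" is read, matching the frame skip of FIG. 4; with the write address at "3" and the last read address at "2", the phase difference would drop to "0", so address "2" is read again, matching the frame repeat of FIG. 5.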
 以上のように、従来の調整処理では、アドレス制御部320は、位相差を基準に一定の期間ごとに調整処理(フレームスキップ処理及びフレームリピート)処理を行い、入力フレームを出力フレームレートに同期させて出力する。このため、調整処理が行われるタイミングにおいて表示される映像によっては、視聴者は映像に違和感を感じることがある。 As described above, in the conventional adjustment processing, the address control unit 320 performs adjustment processing (frame skip processing and frame repeat) processing at regular intervals based on the phase difference, and synchronizes the input frame with the output frame rate. Output. For this reason, depending on the video displayed at the timing when the adjustment process is performed, the viewer may feel uncomfortable with the video.
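The conventional phase-difference rule described above can be sketched as follows. This is a minimal illustration, not code from the patent; the function names and the single-step skip/repeat correction are assumptions.

```python
MEM_FRAMES = 8     # ring-buffer capacity (eight frames, as in the examples)
TARGET_PHASE = 1   # predetermined phase difference used in FIG. 5

def phase_difference(write_addr: int, read_addr: int) -> int:
    # Distance from the read address to the write address on the ring.
    return (write_addr - read_addr) % MEM_FRAMES

def conventional_next_read(write_addr: int, read_addr: int) -> int:
    # Count the read address up, then correct immediately on any drift.
    candidate = (read_addr + 1) % MEM_FRAMES
    diff = phase_difference(write_addr, candidate)
    if diff > TARGET_PHASE:       # input faster than output -> frame skip
        candidate = (candidate + 1) % MEM_FRAMES
    elif diff < TARGET_PHASE:     # input slower than output -> frame repeat
        candidate = read_addr
    return candidate
```

At timing (3) in FIG. 5, `conventional_next_read(3, 2)` returns 2, reproducing the repeated output of the frame at address "2".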
Next, the adjustment processing of the address control unit 320 according to Embodiment 1 of the present invention will be described.
The address control unit 320 according to Embodiment 1 of the present invention applies the adjustment processing (frame skip processing and frame repeat processing) to each frame of the asynchronous video during the adjustment processing period detected by the synchronization timing detection unit 310, and outputs the result.
First, the frame skip processing according to Embodiment 1 of the present invention will be described.
When the input frame rate of the asynchronous video is higher than the output frame rate, the address control unit 320 performs the frame skip processing as described above.
FIG. 6 is a diagram for explaining the frame skip processing of the address control unit 320 according to Embodiment 1 of the present invention.
As shown in FIG. 6, in addition to the conventional control of detecting whether the phase difference equals the predetermined phase difference, the address control unit 320 according to Embodiment 1 of the present invention performs the frame skip processing based on whether the frame output timing falls within the adjustment processing period.
In the example shown in FIG. 6, the predetermined phase difference is "4", and the frame memory 400 can store eight frames.
Timing (1) of period (a) in FIG. 6 is the first timing at which the address control unit 320 starts outputting frames. At timing (1) of period (a) in FIG. 6, the write address is "4", so the address control unit 320 sets address "0", at which the phase difference equals the predetermined phase difference "4", as the read address, reads the frame corresponding to address "0" from the frame memory 400, and outputs it.
As shown in period (a) of FIG. 6, the write address is counted up according to the input frame rate, and the read address is counted up according to the output frame rate. In period (a) of FIG. 6, the phase difference is "4" at each of timings (1) to (3), so the address control unit 320 does not need to perform frame skip processing.
In the example shown in FIG. 6, the input frame rate is higher than the output frame rate. Therefore, as time passes without the address control unit 320 performing frame skip processing, the phase difference between the write address and the read address grows.
Period (b) in FIG. 6 shows the state in which several frame times have elapsed from the state of period (a) without frame skip processing, so that the phase difference between the write address and the read address has widened. Period (b) in FIG. 6 includes the adjustment processing period detected by the synchronization timing detection unit 310, and the address control unit 320 performs frame skip processing within that adjustment processing period.
At timing (1) of period (b) in FIG. 6, the read address is set to "6" against the write address "4". That is, the phase difference is "6" against the predetermined phase difference "4"; the phase difference has widened beyond the predetermined phase difference.
At timing (2) of period (b) in FIG. 6, the write address is "5", and the read address, counted up from its state at timing (1) of period (b) in FIG. 6, is "7". The phase difference at timing (2) is thus "6", larger than the predetermined phase difference "4", and timing (2) of period (b) in FIG. 6 falls within the period that the synchronization timing detection unit 310 has detected as the adjustment processing period. The address control unit 320 therefore performs frame skip processing at timing (2) of period (b) in FIG. 6. Specifically, the address control unit 320 skips address "7", sets "0" as the read address, reads the frame corresponding to address "0" from the frame memory 400, and outputs it. In other words, the frame corresponding to address "7" is skipped.
Similarly, at the subsequent timing (3), the write address is "6", and the read address, counted up from its state at timing (2) of period (b) in FIG. 6, is "1". At timing (3) of period (b) in FIG. 6, the phase difference is "5", larger than the predetermined phase difference "4", and timing (3) of period (b) in FIG. 6 falls within the adjustment processing period. The address control unit 320 therefore performs frame skip processing as the adjustment processing at timing (3) of period (b) in FIG. 6. Specifically, the address control unit 320 skips address "1", sets address "2" as the read address, reads the frame corresponding to address "2" from the frame memory 400, and outputs it. In other words, the frame corresponding to address "1" is skipped.
As a result of the address control unit 320 performing frame skip processing within the adjustment processing period in this way, the phase difference at timing (4) of period (b) in FIG. 6 is "4", equal to the predetermined phase difference "4".
Thereafter, whenever the phase difference is larger than the predetermined phase difference during an adjustment processing period detected by the synchronization timing detection unit 310, the address control unit 320 likewise performs frame skip processing, thereby outputting each frame of the asynchronous video in accordance with the output frame rate.
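The skip decision of Embodiment 1 can be sketched as follows. The names are hypothetical, and the boolean flag stands in for the synchronization timing detection unit 310's detected period; this is an illustration under those assumptions, not the patent's implementation.

```python
MEM_FRAMES = 8
TARGET_PHASE = 4   # predetermined phase difference used in FIG. 6

def skip_next_read(write_addr: int, read_addr: int, in_adjust_period: bool) -> int:
    # Skip one frame only when the phase difference has widened beyond the
    # target AND the frame output timing lies inside the adjustment period.
    candidate = (read_addr + 1) % MEM_FRAMES
    if (write_addr - candidate) % MEM_FRAMES > TARGET_PHASE and in_adjust_period:
        candidate = (candidate + 1) % MEM_FRAMES   # frame skip
    return candidate
```

At timing (2) of period (b) in FIG. 6, `skip_next_read(5, 6, True)` returns 0, skipping address "7"; outside the adjustment period the same state yields 7 and no skip occurs.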
In period (b) of FIG. 6, the address control unit 320 skips one frame at a time as the frame skip processing, but the address control unit 320 may skip a plurality of frames at once. That is, at timing (2) of period (b) in FIG. 6, the address control unit 320 may skip addresses "7" and "0", set "1" as the read address, and output the frame corresponding to address "1". In this case, the two frames at addresses "7" and "0" are skipped at once.
Note that the number of frames the frame memory 400 can store is finite, being eight frames in the example of FIG. 6 as described above. Therefore, if no adjustment processing period is detected for a long time and frame skip processing cannot be performed, the address control unit 320 may have no choice but to perform frame skip processing outside an adjustment processing period.
Period (c) in FIG. 6 illustrates such a case.
At timing (1) of period (c) in FIG. 6, no adjustment processing period has been detected, so frame skip processing has not been performed for a long time, and as a result the phase difference between the write address and the read address has grown. The write address is "5" while the read address is "6", so the phase difference is "7".
At timing (2) of period (c) in FIG. 6, the write address is "7", and the read address, counted up from its state at timing (1), would normally be "7". However, because a new frame is stored at the moment the write address is counted up to "7", the frame corresponding to read address "7" no longer exists in the frame memory 400.
The address control unit 320 therefore performs frame skip processing at timing (2) of period (c) in FIG. 6, even though that timing is not included in an adjustment processing period. Specifically, the address control unit 320 skips address "7", sets address "0" as the read address, reads the frame corresponding to address "0" from the frame memory 400, and outputs it.
Next, the frame repeat processing according to Embodiment 1 of the present invention will be described.
When the input frame rate of the asynchronous video is lower than the output frame rate, the address control unit 320 performs frame repeat processing.
FIG. 7 is a diagram for explaining the frame repeat processing of the address control unit 320 according to Embodiment 1 of the present invention.
As shown in FIG. 7, in addition to the conventional control of detecting whether the phase difference equals the predetermined phase difference, the address control unit 320 according to Embodiment 1 of the present invention performs frame repeat processing based on whether the frame output timing falls within the adjustment processing period. The predetermined phase difference is "4", and the frame memory 400 can store eight frames.
Timing (1) of period (a) in FIG. 7 is the first timing at which the address control unit 320 starts outputting frames. At timing (1) of period (a) in FIG. 7, the write address is "4", so the address control unit 320 sets address "0", at which the phase difference equals the predetermined phase difference "4", as the read address, reads the frame corresponding to address "0" from the frame memory 400, and outputs it.
As shown in period (a) of FIG. 7, the write address is counted up according to the input frame rate, and the read address is counted up according to the output frame rate. In period (a) of FIG. 7, the phase difference is "4" at each of timings (1) to (3), so the address control unit 320 does not need to perform frame repeat processing.
In the example shown in FIG. 7, the input frame rate is lower than the output frame rate. Therefore, as time passes without the address control unit 320 performing frame repeat processing, the phase difference between the write address and the read address shrinks.
Period (b) in FIG. 7 shows the state in which several frame times have elapsed from the state of period (a) in FIG. 7 without frame repeat processing, so that the phase difference between the write address and the read address has narrowed.
At timing (1) of period (b) in FIG. 7, the read address is set to "1" against the write address "4". That is, the phase difference is "3" against the predetermined phase difference "4"; the phase difference has shrunk below the predetermined phase difference.
At timing (2) of period (b) in FIG. 7, the write address is "5", and the read address, counted up from timing (1) of period (b) in FIG. 7, is "2". The phase difference at timing (2) is thus "3", smaller than the predetermined phase difference "4", and timing (2) of period (b) in FIG. 7 falls within the period that the synchronization timing detection unit 310 has detected as the adjustment processing period. The address control unit 320 therefore performs frame repeat processing at timing (2) of period (b) in FIG. 7. Specifically, the address control unit 320 sets address "1" as the read address again, reads the frame corresponding to address "1" from the frame memory 400, and outputs it. In other words, the frame corresponding to address "1" is output again.
As a result of performing the repeat processing within the adjustment processing period in this way, the phase difference at timing (3) of period (b) is "4", equal to the predetermined phase difference "4".
Thereafter, whenever the phase difference is smaller than the predetermined phase difference during an adjustment processing period detected by the synchronization timing detection unit 310, the address control unit 320 likewise performs frame repeat processing, thereby outputting each frame of the asynchronous video in accordance with the output frame rate.
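The repeat decision mirrors the skip decision; a minimal sketch under the same assumptions (hypothetical names, with the boolean flag standing in for the synchronization timing detection unit 310's detected period):

```python
MEM_FRAMES = 8
TARGET_PHASE = 4   # predetermined phase difference used in FIG. 7

def repeat_next_read(write_addr: int, read_addr: int, in_adjust_period: bool) -> int:
    # Hold (repeat) the read address only when the phase difference has
    # shrunk below the target AND the timing is inside the adjustment period.
    candidate = (read_addr + 1) % MEM_FRAMES
    if (write_addr - candidate) % MEM_FRAMES < TARGET_PHASE and in_adjust_period:
        return read_addr    # frame repeat: the same frame is output again
    return candidate
```

At timing (2) of period (b) in FIG. 7, `repeat_next_read(5, 1, True)` returns 1, so the frame at address "1" is output a second time; outside the adjustment period the read address advances to 2.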
At timing (2) of period (b) in FIG. 7, the address control unit 320 repeats a frame only once as the frame repeat processing, but the address control unit 320 may output the same frame consecutively a plurality of times.
Note that the number of frames the frame memory 400 can store is finite, being eight frames in the example of FIG. 7 as described above. Therefore, if no adjustment processing period is detected for a long time and frame repeat processing cannot be performed, the address control unit 320 may have no choice but to perform frame repeat processing outside an adjustment processing period.
Period (c) in FIG. 7 illustrates such a case.
At timing (1) of period (c) in FIG. 7, frame repeat processing has not been performed, and the phase difference between the write address and the read address has shrunk. The write address is "4" while the read address is also "4", so the phase difference is "0".
At timing (2) of period (c) in FIG. 7, the write address is "4", and the read address, counted up from its state at timing (1), would normally be "5". However, at timing (2) the write address has not yet been counted up to "5". That is, the frame corresponding to address "5" has not yet been stored at timing (2), so the frame corresponding to read address "5" does not exist in the frame memory 400.
The address control unit 320 therefore performs frame repeat processing at timing (2) of period (c) in FIG. 7, even though that timing is not included in an adjustment processing period. Specifically, the address control unit 320 sets address "4" as the read address again, reads the frame corresponding to address "4" from the frame memory 400, and outputs it.
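The two buffer-exhaustion boundaries, the forced skip of FIG. 6 period (c) and the forced repeat of FIG. 7 period (c), can be checked with a sketch like this (a hypothetical helper illustrating the conditions, not the patent's implementation):

```python
MEM_FRAMES = 8

def forced_adjustment(write_addr: int, read_addr: int) -> str:
    # Returns which adjustment is forced by the finite eight-frame memory,
    # regardless of whether an adjustment processing period is detected.
    next_read = (read_addr + 1) % MEM_FRAMES
    if next_read == write_addr:
        # The writer has just overwritten this slot (FIG. 6(c)): skip it.
        return "skip"
    if next_read == (write_addr + 1) % MEM_FRAMES:
        # This slot has not been written yet (FIG. 7(c)): repeat instead.
        return "repeat"
    return "none"
```

With the values from the figures, `forced_adjustment(7, 6)` yields "skip" (FIG. 6(c) timing (2)) and `forced_adjustment(4, 4)` yields "repeat" (FIG. 7(c) timing (2)).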
Finally, the processing sequence of the adjustment processing of the frame synchronization unit 300 will be described.
FIG. 8 is a flowchart of the adjustment processing of the frame synchronization unit 300 in Embodiment 1.
The adjustment processing of the frame synchronization unit 300 in Embodiment 1, which eliminates the misalignment between the input frame rate and the output frame rate, proceeds as follows.
First, the frame synchronization unit 300 acquires an input frame of the input video acquired by the video acquisition unit 100 (S110).
Next, the frame synchronization unit 300 acquires the output frame rate determined by the output frame rate determination unit 200 (S120).
Next, the synchronization timing detection unit 310 detects, for the asynchronous video, an adjustment processing period, that is, a period during which the adjustment processing performed by the address control unit 320 is unlikely to be perceived by the viewer (S130).
The address control unit 320 outputs the input frames in accordance with the output frame rate based on the phase difference. If a misalignment in the phase difference occurs at frame output (Yes in S140) and the timing of the misalignment falls within the adjustment processing period (Yes in S150), the address control unit 320 performs the adjustment processing and outputs the frame (S160).
If no misalignment occurs (No in S140), or if the timing of the misalignment does not fall within the adjustment processing period (No in S150), the address control unit 320 outputs the frame as-is without performing the adjustment processing (S160).
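The branch structure of FIG. 8 can be summarized as follows (the step labels come from the flowchart; the boolean inputs are assumptions standing in for the actual drift and period detectors):

```python
def adjustment_step(drift_detected: bool, in_adjust_period: bool) -> bool:
    # Returns True when the frame output at S160 has been adjusted.
    if drift_detected:            # S140: misalignment in the phase difference?
        if in_adjust_period:      # S150: inside the adjustment processing period?
            return True           # perform the adjustment, then output (S160)
    return False                  # output the frame as-is (S160)
```

Only the Yes/Yes path through S140 and S150 leads to an adjusted frame; every other path outputs the frame unchanged.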
As described above, in the video signal processing apparatus 10 according to Embodiment 1 of the present invention, the frame adjustment processing that eliminates the misalignment between the input frame rate and the output frame rate is performed during periods of little motion in the video, periods of heavy motion in the video, and periods corresponding to scene changes in the video. This realizes a smooth, high-quality video display in which the viewer is unlikely to perceive the video as unnatural.
(Embodiment 2)
Embodiment 2 of the present invention will be described below with reference to the drawings.
The configuration and operation of the video signal processing apparatus 10 according to Embodiment 2 of the present invention differ from those of Embodiment 1 only in the configuration and operation of the frame synchronization unit 300. The configuration of the frame synchronization unit 300 according to Embodiment 2 and its adjustment processing are therefore described in detail below.
In the following description, a single asynchronous video input to the frame synchronization unit 300 is described, but in practice the adjustment processing is performed on a plurality of asynchronous videos.
FIG. 9 is a block diagram showing the configuration of the frame synchronization unit 300 in Embodiment 2 of the present invention.
The frame synchronization unit 300 includes an address control unit 320 and an interpolation frame generation unit 360.
The address control unit 320 applies frame skip processing or frame repeat processing to the input frames, which are the frames of the asynchronous video, during a predetermined period, and outputs the result to the interpolation frame generation unit 360 as read frames. The address control unit 320 also outputs adjacent frames, which are frames temporally preceding or following a read frame, to the interpolation frame generation unit 360.
The interpolation frame generation unit 360 generates interpolation frames using the read frames and adjacent frames output by the address control unit 320. The interpolation frame generation unit 360 also replaces the read frames included in the predetermined period with interpolation frames and outputs them as output frames. The interpolation frame generation method of the interpolation frame generation unit 360 is described in detail later.
Next, the adjustment processing of the address control unit 320 and the interpolation frame generation unit 360 is described in detail with a specific example.
FIG. 10 is a diagram for explaining the adjustment processing of the address control unit 320 and the interpolation frame generation unit 360 when the input frame rate is lower than the output frame rate.
The basic operation of the address control unit 320 for storing frames in the frame memory 400 is as described in Embodiment 1. As shown in FIG. 10, the write address is counted up according to the input frame rate, and the read address is counted up according to the output frame rate.
The address control unit 320 also detects the timings at which to perform frame repeat processing, based on whether the phase difference at each frame output timing of the output frame rate equals the predetermined phase difference. In FIG. 10, the predetermined phase difference is "1", and the frame memory 400 can store eight frames.
In the following description, the frame corresponding to input-frame address "0" is written as frame [0]. For example, frame [1] is the frame corresponding to input-frame address "1", and frame [2] is the frame corresponding to input-frame address "2".
Since the input-frame address is counted up by one as time passes, the number in brackets represents a temporal distance.
In the example shown in FIG. 10, if the address control unit 320 controlled the read address based on the phase difference as in Embodiment 1, the read frames would be output consecutively as frames [0], [1], [2], [2], [3], [4], [5], [6], and so on. Within this sequence, at the timing when frame [2] is output consecutively there is no change in the temporal distance between frames, which is highly likely to make the video appear unnatural to the viewer.
The interpolation frame generation unit 360 uses interpolation frames to smooth the change in the temporal distance between frames. Specifically, for example, at timings (4), (5), and (6), which are the output-frame output timings corresponding to read frames [2], [2], and [3], the interpolation frame generation unit 360 outputs interpolation frames [1.8], [2.5], and [3.2], respectively.
With the interpolation frames, the output frames thus become frames [0], [1], [1.8], [2.5], [3.2], [4], [5], and so on. Since this smooths the change in the temporal distance between frames, a video display in which the viewer is unlikely to perceive unnaturalness is realized.
To perform the above control, the address control unit 320 first detects in advance, from the input frame rate and the output frame rate of the asynchronous video, the frames to be replaced, that is, the read frames that will be replaced with interpolation frames. In the example of FIG. 10, frame [2] output by the frame repeat processing and the frames immediately before and after it are set as the frames to be replaced. That is, the frames to be replaced are read frames [2], [2], and [3].
Next, the address control unit 320 determines the interpolation frames to be generated by the interpolation frame generation unit 360. As described above, the interpolation frames are set to frames [1.8], [2.5], and [3.2] in order to smooth the change in the temporal distance between frames.
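The text states only that the interpolation frames are generated by linear interpolation between the two adjacent frames; the following is a per-pixel sketch under that assumption (grayscale pixel lists and a hypothetical function name):

```python
def interpolate(frame_a, frame_b, t):
    # Linear blend at fractional position t between frame_a (t = 0)
    # and frame_b (t = 1); frame [1.8] is t = 0.8 between [1] and [2].
    return [(1.0 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

# Frame [1.8] from two-pixel stand-ins for frames [1] and [2]:
mid = interpolate([10, 20], [110, 120], 0.8)
```

Frames [2.5] and [3.2] would be obtained the same way, with t = 0.5 between frames [2] and [3] and t = 0.2 between frames [3] and [4].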
 以下、図10を用いて、アドレス制御部320及び補間フレーム生成部360のフレームの入出力について順を追って詳細に説明する。 Hereinafter, the input / output of frames of the address control unit 320 and the interpolation frame generation unit 360 will be described in detail with reference to FIG.
 図10のタイミング(1)は、アドレス制御部320が読出しフレームを出力し始める最初のタイミングである。タイミング(1)では、書込みアドレスは「1」であり、所定の位相差は「1」である。したがって、アドレス制御部320は、読出しアドレスを「0」に設定し、アドレス「0」に対応するフレーム[0]を読出しフレームとしてフレームメモリ400から読出し、補間フレーム生成部360へ出力する。 Timing (1) in FIG. 10 is the first timing at which the address control unit 320 starts to output a read frame. At timing (1), the write address is “1” and the predetermined phase difference is “1”. Therefore, the address control unit 320 sets the read address to “0”, reads the frame [0] corresponding to the address “0” from the frame memory 400 as a read frame, and outputs it to the interpolation frame generation unit 360.
 図10のタイミング(2)では読出しアドレスがカウントアップされ「1」であり、位相差は、所定の位相差と等しい。このため、フレームリピート処理は行われず、出力される読出しフレームはフレーム[1]となる。 At timing (2) in FIG. 10, the read address is counted up to “1”, and the phase difference is equal to the predetermined phase difference. For this reason, the frame repeat process is not performed, and the output read frame is frame [1].
 また、タイミング(2)は、補間フレーム生成部360が出力フレームを出力し始めるタイミングである。タイミング(2)において、補間フレーム生成部360は、タイミング(1)においてアドレス制御部320から出力されたフレーム[0]を出力フレームとして映像出力部500へ出力する。 Timing (2) is a timing at which the interpolation frame generation unit 360 starts to output an output frame. At timing (2), the interpolation frame generation unit 360 outputs the frame [0] output from the address control unit 320 at timing (1) to the video output unit 500 as an output frame.
 At the subsequent timing (3), the read address is counted up to "2", and the phase difference is "1", equal to the predetermined phase difference. The readout frame output at timing (3) is therefore frame [2].
 Here, frame [2], output by the address control unit 320 at timing (3), is a frame to be replaced: the readout frame at timing (3) is replaced by the interpolation frame generation unit 360 with the interpolation frame [1.8], which is generated from frames [1] and [2] by the linear interpolation described later.
 Accordingly, at timing (3) the address control unit 320 sets, in addition to address "2", the address "1" of the immediately preceding frame as an adjacent address. An adjacent address is an address used by the address control unit 320 to read a frame from the frame memory 400 so that the interpolation frame generation unit 360 can generate an interpolation frame. As a result, at timing (3), frame [1] is output to the interpolation frame generation unit 360 as an adjacent frame in addition to the readout frame.
 Also at timing (3), the interpolation frame generation unit 360 outputs the frame [1] that was output from the address control unit 320 at timing (2).
 At the subsequent timing (4), the phase difference widens to "2". The frame synchronization unit 300 therefore performs frame repeat processing at timing (4): it sets address "2" as the read address again and outputs frame [2] as the readout frame. The readout frame output at timing (4) is also a frame to be replaced, and is replaced with frame [2.5]. The address control unit 320 therefore sets address "3", the address following address "2", as the adjacent address, and outputs frame [3] to the interpolation frame generation unit 360 as the adjacent frame.
 Also at timing (4), the interpolation frame generation unit 360 generates the interpolation frame [1.8] by linearly interpolating frames [1] and [2], which were output from the address control unit 320 at timing (3), and outputs it as an output frame. Linear interpolation is described later.
 At the subsequent timing (5), the read address is counted up to "3", and the phase difference is "1", equal to the predetermined phase difference. Frame [3] is therefore output as the readout frame. This readout frame is also a frame to be replaced, and is replaced with frame [3.2]; frame [4] is therefore output as the adjacent frame.
 Also at timing (5), the interpolation frame generation unit 360 generates the interpolation frame [2.5] from frames [2] and [3], which were output from the address control unit 320 at timing (4), and outputs it as an output frame.
 At timing (6), the read address is "3", and the phase difference is "1", equal to the predetermined phase difference. Frame [3] is therefore output as the readout frame. Since this readout frame is not a frame to be replaced, no adjacent address is set.
 Also at timing (6), the interpolation frame generation unit 360 generates the interpolation frame [3.2] from frames [3] and [4], which were output from the address control unit 320 at timing (5), and outputs it as an output frame.
 Thereafter, in the same manner, the address control unit 320 outputs only the readout frame at timings when a normal readout frame is output, and outputs both the readout frame (the frame to be replaced) and the adjacent frame at timings when a frame to be replaced is output. When the interpolation frame generation unit 360 has acquired only a readout frame, it outputs that readout frame unchanged as the output frame at the timing one frame later. When it has acquired a readout frame and an adjacent frame, it generates an interpolation frame from them and outputs it at the timing one frame later.
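The pass-through/replace hand-off just summarized can be sketched as a small model. This is a hypothetical illustration, not the patent's implementation: the event representation, the `interpolate` callback, and the omission of the one-frame output delay are all assumptions made for clarity.

```python
def frame_sync_output(readout_events, interpolate):
    """Model of the address-control / interpolation hand-off.

    Each event is either a single readout frame (a normal readout
    frame, passed through unchanged) or a (readout frame, adjacent
    frame) pair flagged as a frame to be replaced, for which an
    interpolation frame is produced instead.  `interpolate` stands in
    for the motion-compensated linear interpolation described in the
    text.  The one-output-tick delay of the real pipeline is omitted.
    """
    output = []
    for event in readout_events:
        if isinstance(event, tuple):       # frame to be replaced
            readout, adjacent = event
            output.append(interpolate(readout, adjacent))
        else:                              # normal readout frame
            output.append(event)
    return output
```

For example, with a simple midpoint placeholder as the interpolator, `frame_sync_output([0, 1, (2, 1), 3], lambda a, b: (a + b) / 2)` yields `[0, 1, 1.5, 3]`: only the flagged frame is replaced, the others pass through.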
 Next, the method of generating an interpolation frame is described in detail.
 An interpolation frame is generated by linear interpolation using two frames. For example, the interpolation frame [1.8] is generated by linear interpolation using a motion vector obtained from frame [1] and frame [2].
 First, the interpolation frame generation unit 360 divides, for example, frame [1] and frame [2] of the readout frames into a number of small regions.
 Next, the interpolation frame generation unit 360 computes the sum of absolute differences (SAD) between one small region of frame [2] (hereinafter, small region A) and each small region of frame [1]. It then finds, among the SAD values computed for the small regions, the small region of frame [1] corresponding to the smallest SAD value (hereinafter, small region A'). Using small region A and small region A', the interpolation frame generation unit 360 then obtains a motion vector A representing the change in position from small region A to small region A'.
 Next, the interpolation frame generation unit 360 finds, among the small regions contained in the interpolation frame [1.8], the small region at the position obtained by displacing the position of small region A by motion vector A × 0.2 (hereinafter, small region A'').
 The luminance of each pixel in small region A'' is obtained by proportionally blending the luminance of the corresponding pixels of small region A and small region A'. For example, the luminance of an arbitrary pixel a in small region A'' is given by (luminance of pixel a in small region A'') = (luminance of the pixel corresponding to pixel a in small region A) × 0.8 + (luminance of the pixel corresponding to pixel a in small region A') × 0.2.
 By obtaining the luminance of each pixel in this way, the luminance of small region A'' of the interpolation frame [1.8] is obtained. By likewise obtaining a motion vector for each of the other small regions of frame [2], the luminance of the other small regions of the interpolation frame [1.8] is obtained. As a result, the interpolation frame [1.8] is generated.
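A minimal sketch of the two steps just described — SAD block matching to find motion vector A, and the proportional luminance blend for small region A''. This is hypothetical NumPy code: the search range, block size, and function names are illustrative assumptions; the patent does not specify them.

```python
import numpy as np

def best_match_offset(block, other_frame, pos, search=4):
    """Brute-force block matching: return the (dy, dx) displacement
    into other_frame that minimises the sum of absolute differences
    (SAD) against `block` -- i.e. the motion vector A of the text."""
    h, w = block.shape
    y0, x0 = pos
    best, best_sad = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = y0 + dy, x0 + dx
            if y < 0 or x < 0 or y + h > other_frame.shape[0] or x + w > other_frame.shape[1]:
                continue  # candidate block falls outside the frame
            cand = other_frame[y:y + h, x:x + w]
            sad = np.abs(block.astype(int) - cand.astype(int)).sum()
            if sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best

def blend(lum_a, lum_a_dash, w=0.8):
    """Proportional luminance blend for small region A'' of frame
    [1.8]: luminance(A'') = luminance(A) * 0.8 + luminance(A') * 0.2."""
    return lum_a * w + lum_a_dash * (1 - w)
```

As a usage example, a bright 2x2 patch placed at (2, 2) in one frame and at (3, 4) in the other is matched with displacement (-1, -2), and `blend(100, 200)` gives 120.0 in accordance with the 0.8 / 0.2 weighting of the text.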
 Similarly, the interpolation frame [2.5] is generated from frames [2] and [3], and the interpolation frame [3.2] from frames [3] and [4], by linear interpolation using motion vectors.
 In this way, rather than outputting the frame designated by the read address as-is, the adjustment process generates an interpolation frame and outputs it in place of the original. This improves the temporal continuity between frames and reduces judder in the video.
 The adjustment processing of the address control unit 320 and the interpolation frame generation unit 360 has so far been described for the case where the input frame rate is lower than the output frame rate, but the same applies when the input frame rate is higher than the output frame rate.
 FIG. 11 is a diagram for explaining the adjustment processing of the address control unit 320 and the interpolation frame generation unit 360 when the input frame rate is higher than the output frame rate.
 In FIG. 11, if the readout frames were output unchanged as output frames, frame [2], corresponding to read address "2", would be thinned out by the frame skip processing, so the change in address between frames would be large at readout-frame timing (4).
 To moderate this change in address, in the example of FIG. 11 the interpolation frame generation unit 360 outputs the interpolation frames [2.3] and [3.7] at the output timings at which the readout frames [3] and [4], respectively, would be output.
 As a result, using the interpolation frames, the output frames become frames [0], [1], [2.3], [3.7], [5], [6], and so on.
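The positions [2.3] and [3.7] above (and [1.8], [2.5], [3.2] in the earlier up-conversion example) are consistent with spreading a skipped or repeated frame evenly between its neighbouring pass-through frames. The following is a hypothetical sketch of that spacing; the function and the rounding to one decimal place are assumptions made for illustration, not stated in the patent.

```python
def spread_positions(start, end, count):
    """Evenly space `count` interpolation positions strictly between
    the pass-through frames `start` and `end`, smoothing the address
    jump left by a frame skip (or the stall left by a frame repeat).
    Positions are rounded to one decimal, matching the text's notation."""
    step = (end - start) / (count + 1)
    return [round(start + step * (i + 1), 1) for i in range(count)]
```

Under this assumption, two positions between frames 1 and 5 reproduce the skip example of FIG. 11 ([2.3], [3.7]), and three positions between frames 1 and 4 reproduce the repeat example described earlier ([1.8], [2.5], [3.2]).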
 The frame control procedure of the address control unit 320, the method of generating interpolation frames, and so on are the same as when the input frame rate is lower than the output frame rate.
 Finally, the processing sequence of the adjustment processing of the frame synchronization unit 300 according to Embodiment 2 is described.
 FIG. 12 is a flowchart showing the elimination of synchronization deviation in Embodiment 2.
 The adjustment processing of the frame synchronization unit 300 in Embodiment 2 proceeds as follows.
 First, the frame synchronization unit 300 acquires an input frame of the input video acquired by the video acquisition unit 100 (S210).
 Next, the frame synchronization unit 300 acquires the output frame rate determined by the output frame rate determination unit 200 (S220).
 The address control unit 320 then detects the frames to be replaced, based on the input frame rate and the output frame rate (S230).
 Further, based on the phase difference, the address control unit 320 outputs input frames as readout frames according to the output frame rate (S240).
 If the readout frame is a frame to be replaced (YES in S250), the address control unit 320 also outputs the adjacent frame (S260).
 When the interpolation frame generation unit 360 has acquired both the readout frame and the adjacent frame (S270), it generates an interpolation frame and outputs it as the output frame.
 When it has acquired only the readout frame, the interpolation frame generation unit 360 outputs the readout frame unchanged as the output frame (S280).
 As described above, in Embodiment 2 the frame synchronization unit 300 does not output the frame designated by the read address as-is; instead, it performs an adjustment process that generates an interpolation frame and outputs it in place of the original, improving the temporal continuity between frames. This realizes smooth, high-quality video display. Moreover, since only the necessary frames, rather than all frames, are replaced with interpolation frames, the signal processing load is small and the synchronization deviation can be corrected effectively.
 Furthermore, the present invention can also be modified as follows.
 (1) Specifically, each of the above devices is a computer system comprising a microprocessor, a ROM, a RAM, a hard disk unit, a display unit, a keyboard, a mouse, and the like. A computer program is stored in the RAM or the hard disk unit. Each device achieves its functions by the microprocessor operating according to the computer program. Here, the computer program is constituted by combining a plurality of instruction codes indicating commands to the computer so as to achieve a predetermined function.
 (2) Some or all of the components constituting each of the above devices may be configured as a single system LSI (Large Scale Integration). A system LSI is a super-multifunction LSI manufactured by integrating a plurality of components on a single chip; specifically, it is a computer system comprising a microprocessor, a ROM, a RAM, and the like. A computer program is stored in the RAM. The system LSI achieves its functions by the microprocessor operating according to the computer program.
 (3) Some or all of the components constituting each of the above devices may be configured as an IC card or a stand-alone module attachable to and detachable from the device. The IC card or the module is a computer system comprising a microprocessor, a ROM, a RAM, and the like. The IC card or the module may include the super-multifunction LSI described above. The IC card or the module achieves its functions by the microprocessor operating according to a computer program. The IC card or the module may be tamper resistant.
 (4) The present invention may be the methods described above. It may also be a computer program that realizes these methods by a computer, or a digital signal composed of the computer program.
 The present invention may also be the computer program or the digital signal recorded on a computer-readable recording medium such as a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a DVD-ROM, a DVD-RAM, a BD (Blu-ray Disc), or a semiconductor memory. It may also be the digital signal recorded on any of these recording media.
 The present invention may also transmit the computer program or the digital signal via an electric telecommunication line, a wireless or wired communication line, a network typified by the Internet, data broadcasting, or the like.
 The present invention may also be a computer system comprising a microprocessor and a memory, in which the memory stores the computer program and the microprocessor operates according to the computer program.
 The invention may also be implemented by another independent computer system, by recording the program or the digital signal on the recording medium and transferring it, or by transferring the program or the digital signal via the network or the like.
 (5) The above embodiment and the above modifications may be combined.
 The video signal processing device according to one aspect of the present invention has been described above based on the embodiments and their modifications. According to the video signal processing device of the present invention, the frame adjustment process that eliminates the deviation between the input frame rate and the output frame rate is performed in periods of low video motion, periods of high video motion, and periods corresponding to scene changes in the video. Also, according to the video signal processing device of the present invention, interpolation frames are generated and output in place of the original frames, improving the temporal continuity between frames.
 This realizes smooth, high-quality video display in which the viewer is unlikely to perceive visual artifacts such as judder.
 For example, the video signal processing device 10 according to each of the above embodiments is realized as the television 700 shown in FIG. 13. The specific configuration of the video display unit 600 is not particularly limited; it may be, for example, a liquid crystal display, a plasma display, or an organic EL (electroluminescence) display. In this case, the video acquisition unit 100 acquires video from television broadcasts, from the DVD (Digital Versatile Disc) player 710 shown in FIG. 13, and from the set-top box 720.
 The video signal processing device 10 may also be realized, for example, as the DVD player 710. In this case, the video acquisition unit 100 acquires video from an inserted DVD. The video source is not limited to DVDs; video can be acquired from any recording medium, such as a Blu-ray Disc or an HDD (hard disk drive).
 Furthermore, the video signal processing device 10 may be realized as the set-top box 720. In this case, the video acquisition unit 100 acquires video from cable television broadcasts or the like.
 The present invention is not limited to these embodiments or their modifications. Forms obtained by applying various modifications conceived by those skilled in the art to the present embodiments or their modifications, and forms constructed by combining components of different embodiments or modifications, are also included within the scope of the present invention, provided they do not depart from the gist of the present invention.
 According to the present invention, when a plurality of videos with different frame rates are displayed in synchronization, smooth, high-quality video display that does not feel unnatural to the viewer is possible. The invention is therefore useful as a video signal processing device and a video signal processing method for display devices, such as televisions and personal computers, that have a function of displaying a plurality of videos.
 10 video signal processing device
 100 video acquisition unit
 200 output frame rate determination unit
 300 frame synchronization unit
 310 synchronization timing detection unit
 320 address control unit
 360 interpolation frame generation unit
 400 frame memory
 500 video output unit
 600 video display unit
 700 television
 710 DVD player
 720 set-top box

Claims (11)

  1.  A video signal processing device that synchronizes and outputs a plurality of input videos having mutually different input frame rates, the device comprising:
     a video acquisition unit that sequentially acquires the frames contained in each of the plurality of input videos at the corresponding input frame rate;
     an output frame rate determination unit that determines an output frame rate;
     a frame synchronization unit that sequentially outputs the frames of each of the plurality of input videos acquired by the video acquisition unit in synchronization with the output frame rate; and
     a video output unit that sequentially outputs composite frames obtained by combining the frames output from the frame synchronization unit in synchronization with the output frame rate,
     wherein the frame synchronization unit executes, on an asynchronous video among the plurality of input videos whose corresponding input frame rate differs from the output frame rate, an adjustment process that eliminates the deviation between the input frame rate and the output frame rate, during a period that can be judged to be inconspicuous to the viewer.
  2.  The video signal processing device according to claim 1, wherein the frame synchronization unit comprises:
     a synchronization timing detection unit that detects an adjustment processing period, being a period during which the unnaturalness caused in the composite frames by executing the adjustment process on the asynchronous video is unlikely to be noticed by a viewer watching the composite frames; and
     a control unit that executes the adjustment process on the asynchronous video during the adjustment processing period detected by the synchronization timing detection unit.
  3.  The video signal processing device according to claim 2, wherein the synchronization timing detection unit:
     detects a motion amount of the asynchronous video; and
     detects, as the adjustment processing period, a period in which the motion amount is smaller than a first threshold and a period in which the motion amount is larger than a second threshold that is larger than the first threshold.
  4.  The video signal processing device according to claim 2 or 3, wherein the synchronization timing detection unit:
     detects a scene change in the asynchronous video; and
     detects a period corresponding to the scene change of the video as the adjustment processing period.
  5.  The video signal processing device according to any one of claims 1 to 4, wherein, as the adjustment process, the frame synchronization unit:
     thins out and outputs some of the frames when the input frame rate of the asynchronous video is larger than the output frame rate; and
     repeatedly outputs the same frame when the input frame rate of the asynchronous video is smaller than the output frame rate.
  6.  The video signal processing device according to claim 1, wherein, as the adjustment process, the frame synchronization unit:
     generates, using M (M is an integer of 2 or more) original frames contained in the asynchronous video in a predetermined period, N (M ≠ N) interpolation frames corresponding to frames between the M original frames; and
     outputs the generated N interpolation frames as frames of the asynchronous video in the predetermined period.
  7.  The video signal processing device according to claim 6, wherein the frame synchronization unit:
     detects a motion vector between the original frames located before and after an interpolation frame to be generated; and
     generates the interpolation frame on the assumption that an interpolated motion vector, obtained by proportionally distributing the detected motion vector according to the temporal distance of the interpolation frame from the original frames, is the motion vector of the interpolation frame relative to the original frames.
  8.  The video signal processing device according to any one of claims 1 to 7, wherein the output frame rate determination unit determines the input frame rate of one of the plurality of input videos as the output frame rate.
  9.  The video signal processing device according to any one of claims 1 to 7, wherein the output frame rate determination unit determines, as the output frame rate, a frame rate different from any of the input frame rates of the plurality of input videos.
  10.  A video signal processing method for synchronizing and outputting a plurality of input videos having mutually different input frame rates, the method comprising:
     a video acquisition step of sequentially acquiring the frames contained in each of the plurality of input videos at the corresponding input frame rate;
     an output frame rate determination step of determining an output frame rate;
     a frame synchronization step of sequentially outputting, at the output frame rate, the frames of each of the plurality of input videos acquired by the video acquisition unit; and
     a video output step of sequentially outputting composite frames obtained by combining the frames output from the frame synchronization step in synchronization with the output frame rate,
     wherein, in the frame synchronization step, an adjustment process that eliminates the deviation between the input frame rate and the output frame rate is executed, so as not to be noticeable to the viewer, on an asynchronous video among the plurality of input videos whose corresponding input frame rate differs from the output frame rate, and
     the frame synchronization step includes:
     a synchronization timing detection step of detecting an adjustment processing period, being a period during which the unnaturalness caused in the composite frames by applying the adjustment process to the frames of the asynchronous video is unlikely to be noticed by a viewer watching the composite frames; and
     a control step of applying the adjustment process to each frame of the asynchronous video and outputting the result during the adjustment processing period detected in the synchronization timing detection step.
  11.  A video signal processing method for synchronizing and outputting a plurality of input videos having mutually different input frame rates, the method comprising:
     a video acquisition step of sequentially acquiring the frames contained in each of the plurality of input videos at the corresponding input frame rate;
     an output frame rate determination step of determining an output frame rate;
     a frame synchronization step of sequentially outputting, at the output frame rate, the frames of each of the plurality of input videos acquired by the video acquisition unit; and
     a video output step of sequentially outputting composite frames obtained by combining the frames output from the frame synchronization step in synchronization with the output frame rate,
     wherein, in the frame synchronization step, N (M ≠ N) interpolation frames corresponding to frames between M original frames are generated using the M (M is an integer of 2 or more) original frames contained, in a predetermined period, in an asynchronous video among the plurality of input videos whose corresponding input frame rate differs from the output frame rate, and
     the generated N interpolation frames are output, in the predetermined period, as frames of the asynchronous video.
PCT/JP2011/007062 2011-02-10 2011-12-19 Image signal processing device and image signal processing method WO2012107985A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-026831 2011-02-10
JP2011026831A JP2012169727A (en) 2011-02-10 2011-02-10 Image signal processor and image signal processing method

Publications (1)

Publication Number Publication Date
WO2012107985A1 true WO2012107985A1 (en) 2012-08-16

Family

ID=46638228

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/007062 WO2012107985A1 (en) 2011-02-10 2011-12-19 Image signal processing device and image signal processing method

Country Status (2)

Country Link
JP (1) JP2012169727A (en)
WO (1) WO2012107985A1 (en)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016017266A1 (en) * 2014-07-29 2016-02-04 三菱電機株式会社 Video information playback device and playback method
KR101553846B1 (en) 2014-12-16 2015-09-17 연세대학교 산학협력단 Apparatus and Method of frame synchronization for video stitching
KR101687104B1 (en) * 2015-06-03 2016-12-16 어드밴인터내셔널코프 Method and apparatus of V-Sync Delay Calculation
GB2547438B (en) * 2016-02-17 2019-07-03 Insync Tech Limited Method and apparatus for generating a video field/frame
WO2023017577A1 (en) * 2021-08-11 2023-02-16 日本電信電話株式会社 Apparatus, method, and program for combining video signals

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004048530A (en) * 2002-07-15 2004-02-12 Matsushita Electric Ind Co Ltd Frame rate converting device and conversion information multiplexer
JP2005124167A (en) * 2003-09-25 2005-05-12 Canon Inc Frame rate conversion device, overtaking prediction method used in the same, display control device and video image receiving display device
JP2005341132A (en) * 2004-05-26 2005-12-08 Toshiba Corp Video data processor and processing method
JP2006050230A (en) * 2004-08-04 2006-02-16 Hitachi Ltd Frame rate converting method, converter, image signal recorder, and reproducer
JP2007318193A (en) * 2006-05-23 2007-12-06 Hitachi Ltd Image processing apparatus
JP2008236098A (en) * 2007-03-19 2008-10-02 Hitachi Ltd Video image processing device and video image display device


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113067960A (en) * 2021-03-16 2021-07-02 合肥合芯微电子科技有限公司 Image interpolation method, device and storage medium
CN113067960B (en) * 2021-03-16 2022-08-12 合肥合芯微电子科技有限公司 Image interpolation method, device and storage medium

Also Published As

Publication number Publication date
JP2012169727A (en) 2012-09-06

Similar Documents

Publication Publication Date Title
WO2012107985A1 (en) Image signal processing device and image signal processing method
JP2010148084A (en) Method and apparatus for processing video
JP4762343B2 (en) Image quality adjusting apparatus and image quality adjusting method
KR20140111736A (en) Display apparatus and control method thereof
JP4691193B1 (en) Video display device and video processing method
JP2012182673A (en) Image display apparatus and image processing method
JP2009145707A (en) Plasma display apparatus
JP4951487B2 (en) Video processing apparatus and video display apparatus using the same
KR20080011026A (en) Image processing apparatus, display apparatus and image processing method
JP4580347B2 (en) Flicker video conversion device, program and method thereof, and video display device
JP2004194311A (en) Video playback device and video playback method
JP2009175182A (en) Image processing device
JP2008028507A (en) Image correction circuit, image correction method and image display
US7630018B2 (en) On-screen display apparatus and on-screen display generation method
US20090086090A1 (en) Picture signal processing apparatus and picture signal processing method
JP2009135847A (en) Video processor and frame rate conversion method
JP2008294539A (en) Digital broadcast receiver
US10212316B2 (en) Video processing apparatus
JP5237582B2 (en) Method and apparatus for displaying image sequences
JP4747214B2 (en) Video signal processing apparatus and video signal processing method
JP2010010778A (en) Video processing apparatus and method for controlling the video processing apparatus
JP2008058483A (en) Animation image display device and method
JP5259867B2 (en) Video display device and video processing method
CN112544075B (en) Display device, signal processing device, and signal processing method
JP5911210B2 (en) Image display device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11858368

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11858368

Country of ref document: EP

Kind code of ref document: A1