WO2021117233A1 - Moving image separation device, program, and moving image separation method - Google Patents

Moving image separation device, program, and moving image separation method

Info

Publication number
WO2021117233A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
images
foreground
moving image
pixels
Prior art date
Application number
PCT/JP2019/048991
Other languages
French (fr)
Japanese (ja)
Inventor
愛梨 守谷 (Airi Moriya)
秀明 前原 (Hideaki Maehara)
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to JP2021559701A priority Critical patent/JP7003342B2/en
Priority to PCT/JP2019/048991 priority patent/WO2021117233A1/en
Publication of WO2021117233A1 publication Critical patent/WO2021117233A1/en

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G06T 7/10 — Segmentation; Edge detection
    • G06T 7/174 — Segmentation; Edge detection involving the use of two or more images
    • G06T 7/194 — Segmentation; Edge detection involving foreground-background segmentation
    • G06T 7/20 — Analysis of motion
    • G06T 7/254 — Analysis of motion involving subtraction of images

Definitions

  • This disclosure relates to a moving image separation device, a program, and a moving image separation method.
  • The conventional moving image processing device divides each frame into small regions, calculates a motion-based feature amount for each divided region, and separates the foreground from the background by classifying the small regions according to the calculated feature amounts.
  • Because each frame must be divided into small regions and a motion-based feature amount must be calculated for each divided region, the processing cost of separating the foreground and the background in each frame is very high.
  • Accordingly, one or more aspects of the present disclosure aim to reduce the processing cost of separating the foreground and the background in an image included in moving image data.
  • A moving image separation device according to one aspect of the present disclosure includes: an image acquisition unit that acquires two images including a foreground from a moving image captured from a moving viewpoint; a deviation amount specifying unit that specifies the amount by which the position of the foreground deviates between the two images; a correlation calculation unit that corrects the positional deviation between the two images according to the deviation amount and calculates a correlation over a predetermined area for each pair of corresponding pixels, or for each group of corresponding pixels, of the two corrected images; and a threshold processing unit that takes pixels whose calculated correlation is larger than a predetermined threshold as foreground pixels and generates, from one image included in the moving image, an image in which the pixel values of the foreground pixels are extracted, as a foreground image.
  • A program according to one aspect of the present disclosure causes a computer to function as: an image acquisition unit that acquires two images including a foreground from a moving image captured from a moving viewpoint; a deviation amount specifying unit that specifies the amount by which the position of the foreground deviates between the two images; a correlation calculation unit that corrects the positional deviation between the two images according to the deviation amount and calculates a correlation over a predetermined area for each pair of corresponding pixels, or for each group of corresponding pixels, of the two corrected images; and a threshold processing unit that takes pixels whose calculated correlation is larger than a predetermined threshold as foreground pixels and generates, from one image included in the moving image, an image in which the pixel values of the foreground pixels are extracted, as a foreground image.
  • A moving image separation method according to one aspect of the present disclosure acquires two images including a foreground from a moving image captured from a moving viewpoint, specifies the amount by which the position of the foreground deviates between the two images, corrects the positional deviation between the two images according to the deviation amount, calculates a correlation over a predetermined area for each pair of corresponding pixels, or for each group of corresponding pixels, of the two corrected images, takes pixels whose calculated correlation is larger than a predetermined threshold as foreground pixels, and generates, from one image included in the moving image, an image in which the pixel values of the foreground pixels are extracted, as a foreground image.
  • FIG. 1 is a block diagram showing the schematic configuration of the moving image separation device according to Embodiments 1 to 3.
  • FIGS. 2(A) and 2(B) are schematic views showing the two acquired frames. FIG. 3 is a schematic diagram for explaining the specification of the deviation amount by the deviation amount specifying unit in Embodiment 1.
  • FIG. 4 is a schematic diagram showing a correlation image in which the calculated correlation is classified, for each pixel, into values of predetermined ranges.
  • FIGS. 7(A) to 7(H) are schematic diagrams for explaining the process of dividing the first frame and the second frame and calculating the deviation amounts.
  • FIGS. 8(A) to 8(C) are schematic views showing one frame included in the moving image, and the first frame and second frame acquired from that one frame.
  • FIGS. 9(A) and 9(B) are schematic views showing a first frame and a mask image.
  • FIG. 1 is a block diagram schematically showing the configuration of the moving image separation device 100 according to the first embodiment.
  • the moving image separation device 100 includes a moving image input unit 101, an image acquisition unit 102, a deviation amount specifying unit 103, a correlation calculation unit 104, a threshold value processing unit 105, and an image output unit 106.
  • The moving image input unit 101 receives an input of a moving image from an external camera.
  • the moving image input unit 101 gives the input moving image to the image acquisition unit 102.
  • the moving image contains a plurality of images as a plurality of frames.
  • The moving image is one in which an object in the foreground is captured while the camera is moving.
  • In other words, the viewpoint of the camera is also moving.
  • The moving image in Embodiment 1 is footage of an electric wire captured from a helicopter; it is therefore a moving image captured from a moving viewpoint.
  • In such a moving image, the foreground itself does not change significantly over time, while the positional relationship between the foreground and the background changes over time.
  • The image acquisition unit 102 acquires two images including the foreground from the moving image. For example, the image acquisition unit 102 acquires two frames that are close in time from the plurality of frames included in the moving image.
  • Here, the earlier frame in time is referred to as the first frame, and the later frame is referred to as the second frame.
  • The image acquisition unit 102 need only acquire two frames that fall within a predetermined time, or within a predetermined number of frames, of each other among the plurality of frames included in the moving image. Here, it is assumed that the image acquisition unit 102 acquires two consecutive frames.
  • FIGS. 2(A) and 2(B) are schematic views showing the two frames acquired by the image acquisition unit 102.
  • FIG. 2 (A) shows the first frame 110
  • FIG. 2 (B) shows the second frame 111.
  • The foreground electric wire 112A is shown in the first frame 110, and the portion other than the electric wire 112A is the background.
  • Here, the electric wire 112A in the foreground is in focus, and the forest in the background is blurred; because the background is blurred, the forest appears as patterns 113A and 114A.
  • The electric wire 112B as the foreground is likewise shown in the second frame 111, and the portion other than the electric wire 112B is the background.
  • Here too, the forest appears as blurred patterns 113B and 114B.
  • The deviation amount specifying unit 103 specifies the amount by which the positions of the foreground deviate between the two acquired frames. For example, the deviation amount specifying unit 103 specifies the amount by which the positions of the two frames are displaced such that the portions shown in the two acquired frames match; in particular, it may specify the deviation amount such that the foreground portions included in the two frames match. Specifically, the deviation amount specifying unit 103 specifies the deviation amount by obtaining the correlation of the two frames as a whole. Alternatively, the deviation amount specifying unit 103 may specify the deviation amount by matching feature points.
  • FIG. 3 is a schematic diagram for explaining the specification of the deviation amount by the deviation amount specifying unit 103.
  • The first frame 110 and the second frame 111 are imaged by a camera moving in the D direction. Therefore, when the portion included in the first frame 110 is overlaid on the corresponding portion included in the second frame 111, the second frame 111 is displaced in the D direction with respect to the first frame 110, as shown in FIG. 3. The frames also shift in the vertical direction depending on the vibration or the moving direction of the camera.
  • The deviation amount specifying unit 103 specifies the deviation amount X in the horizontal direction and the deviation amount Y in the vertical direction. For example, when the foreground electric wires 112A and 112B are in focus and the background is blurred as described above, the correlation of the whole images of the first frame 110 and the second frame 111 can be calculated; the correlation is high when the corresponding portions of the electric wire 112A and the electric wire 112B match. Therefore, as shown in FIG. 3, the correlation is high when the corresponding portions of the electric wire 112A shown in the first frame 110 and the electric wire 112B shown in the second frame 111 overlap.
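The whole-image deviation search described above can be sketched as a brute-force maximization of the correlation coefficient over candidate shifts. This is an illustrative reconstruction, not code from the patent; the function name, the use of NumPy, the wrap-around behavior of `np.roll`, and the `max_shift` search range are all assumptions:

```python
import numpy as np

def estimate_shift(frame1, frame2, max_shift=5):
    """Brute-force search for the shift (dx, dy) that maximizes the
    whole-image correlation when frame2 is moved by (dx, dy).
    np.roll wraps pixels around the border, a simplification that is
    acceptable for small shifts."""
    best_corr, best_xy = -np.inf, (0, 0)
    a = frame1.astype(float).ravel()
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            b = np.roll(frame2, (dy, dx), axis=(0, 1)).astype(float).ravel()
            corr = np.corrcoef(a, b)[0, 1]
            if corr > best_corr:
                best_corr, best_xy = corr, (dx, dy)
    return best_xy
```

In practice the search range would be chosen from the expected camera motion between frames, and a frequency-domain method such as phase correlation would avoid the quadratic search.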
  • The correlation calculation unit 104 corrects the positional deviation between the two frames according to the deviation amount specified by the deviation amount specifying unit 103, and calculates the correlation over a predetermined area for each pair of corresponding pixels of the two corrected frames. For example, the correlation calculation unit 104 shifts one of the first frame 110 and the second frame 111 by the deviation amount specified by the deviation amount specifying unit 103, and calculates the correlation in a window of predetermined size for each pair of matching pixels.
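The per-pixel window correlation can be sketched as a normalized cross-correlation of a small window centred on each pixel of the two already-aligned frames. This is a hypothetical illustration; the function name, the zero value at the borders, and the NumPy usage are assumptions:

```python
import numpy as np

def window_correlation(f1, f2, win=5):
    """Normalized cross-correlation of a win x win window centred on
    each pixel of two aligned frames. Border pixels whose window would
    leave the frame are left at 0."""
    h, w = f1.shape
    r = win // 2
    out = np.zeros((h, w))
    a = f1.astype(float)
    b = f2.astype(float)
    for y in range(r, h - r):
        for x in range(r, w - r):
            wa = a[y - r:y + r + 1, x - r:x + r + 1].flatten()
            wb = b[y - r:y + r + 1, x - r:x + r + 1].flatten()
            wa -= wa.mean()
            wb -= wb.mean()
            denom = np.sqrt((wa * wa).sum() * (wb * wb).sum())
            out[y, x] = wa @ wb / denom if denom > 0 else 0.0
    return out
```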
  • FIG. 4 is a schematic view showing a correlation image 114 in which the correlation calculated by the correlation calculation unit 104 is classified, for each pixel, into values of predetermined ranges.
  • The portion corresponding to the electric wire in the foreground has a relatively high correlation, while the background, which is the remaining portion, has a relatively low correlation.
  • The threshold processing unit 105 compares the per-pixel correlation with a predetermined threshold, identifies pixels whose correlation is higher than the threshold as foreground pixels, and identifies the other pixels as background pixels.
  • From at least one of the frames included in the moving image, here the first frame 110 or the second frame 111, the threshold processing unit 105 generates a foreground image, which is an image in which the pixel values of the foreground pixels are extracted, and a background image, which is an image in which the pixel values of the background pixels are extracted.
  • In other words, the foreground image is an image in which the pixel values of the foreground pixels are extracted without the pixel values of the background pixels, and the background image is an image in which the pixel values of the background pixels are extracted without the pixel values of the foreground pixels.
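The thresholding step and the extraction of the foreground and background images can be sketched as follows. The zero fill for non-extracted pixels and the threshold value are assumptions, since the patent does not fix them:

```python
import numpy as np

def split_foreground_background(frame, corr, threshold=0.8):
    """Label pixels whose correlation exceeds the threshold as foreground
    and split one frame into a foreground image and a background image."""
    mask = corr > threshold
    fg = np.where(mask, frame, 0)  # background pixels zeroed out
    bg = np.where(mask, 0, frame)  # foreground pixels zeroed out
    return fg, bg
```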
  • the image output unit 106 outputs at least one of the foreground image and the background image.
  • FIG. 5 is a block diagram showing an example of the hardware configuration of the moving image separation device 100.
  • The moving image separation device 100 can be realized by a computer in which an I/O (Input/Output) interface 120, an auxiliary storage device 121 functioning as a non-volatile memory, a CPU (Central Processing Unit) 122 functioning as a processor, and a main memory 123 are connected by a bus 124.
  • The moving image input unit 101 and the image output unit 106 can be realized by the I/O interface 120.
  • The image acquisition unit 102, the deviation amount specifying unit 103, the correlation calculation unit 104, and the threshold processing unit 105 can be realized by the CPU 122 reading a program stored in the auxiliary storage device 121 into the main memory 123 and executing it.
  • Such a program may be provided through a network, or may be recorded and provided on a recording medium. That is, such a program may be provided as, for example, a program product.
  • the camera 125 as an imaging device outputs a moving image to the moving image separating device 100.
  • The moving image separation device 100 outputs at least one of the generated foreground image and the generated background image to a display 126 serving as a display device, thereby causing the display 126 to display at least one of the foreground image and the background image.
  • In FIG. 5, the camera 125 and the display 126 are not included in the moving image separation device 100, but at least one of them may be included in the moving image separation device 100.
  • When the camera 125 is included in the moving image separation device 100, the camera 125 functions as an imaging unit that captures a moving image.
  • When the display 126 is included in the moving image separation device 100, the display 126 functions as a display unit that displays at least one of the foreground image and the background image.
  • FIG. 6 is a flowchart showing the processing of the moving image separation device 100 according to Embodiment 1.
  • The moving image input unit 101 receives an input of a moving image from an external camera (S10).
  • the moving image input unit 101 gives the input moving image to the image acquisition unit 102.
  • the image acquisition unit 102 acquires two frames that are close in time from the plurality of frames included in the moving image as the first frame and the second frame (S11).
  • the image acquisition unit 102 gives the acquired first frame and the second frame to the deviation amount specifying unit 103.
  • The deviation amount specifying unit 103 specifies the amount by which the positions of the two acquired frames are displaced such that the portions shown in the two frames match (S12).
  • the deviation amount specifying unit 103 gives the specified deviation amount and the first frame and the second frame to the correlation calculation unit 104.
  • The correlation calculation unit 104 shifts one of the first frame 110 and the second frame 111 by the given deviation amount, and calculates the correlation in a window of predetermined size for each pair of matching pixels (S13).
  • The correlation calculation unit 104 gives correlation data indicating the calculated per-pixel correlations, the deviation amount, and the first frame and the second frame to the threshold processing unit 105.
  • The threshold processing unit 105 compares the correlation shown in the correlation data with a predetermined threshold for each pixel, specifies pixels whose correlation is higher than the threshold as foreground pixels, and specifies the other pixels as background pixels (S14).
  • Then, from at least one of the first frame 110 and the second frame 111, the threshold processing unit 105 generates a foreground image composed of the pixel values of the foreground pixels and a background image composed of the pixel values of the background pixels (S15).
  • the threshold processing unit 105 gives at least one of the generated foreground image and background image to the image output unit 106.
  • the image output unit 106 outputs at least one of the foreground image and the background image (S16).
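Steps S12 to S15 of the flowchart can be combined into one illustrative end-to-end sketch. Everything here (function name, NumPy usage, brute-force shift search, wrap-around `np.roll`, window size, and threshold) is an assumption made for illustration, not the patent's implementation:

```python
import numpy as np

def separate(frame1, frame2, win=5, threshold=0.8, max_shift=4):
    """Sketch of steps S12-S15: estimate the shift between two frames,
    align them, compute a per-pixel window correlation, and threshold
    it into foreground/background images taken from frame1."""
    f1 = frame1.astype(float)
    f2 = frame2.astype(float)
    # S12: brute-force shift search maximizing whole-image correlation
    best, shift = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            c = np.corrcoef(f1.ravel(),
                            np.roll(f2, (dy, dx), axis=(0, 1)).ravel())[0, 1]
            if c > best:
                best, shift = c, (dy, dx)
    f2 = np.roll(f2, shift, axis=(0, 1))  # correct the positional deviation
    # S13: normalized correlation in a win x win window around each pixel
    h, w = f1.shape
    r = win // 2
    corr = np.zeros((h, w))
    for y in range(r, h - r):
        for x in range(r, w - r):
            a = f1[y - r:y + r + 1, x - r:x + r + 1].flatten()
            b = f2[y - r:y + r + 1, x - r:x + r + 1].flatten()
            a -= a.mean()
            b -= b.mean()
            d = np.sqrt((a * a).sum() * (b * b).sum())
            corr[y, x] = a @ b / d if d > 0 else 0.0
    # S14/S15: high correlation -> foreground; extract the pixel values
    mask = corr > threshold
    return np.where(mask, frame1, 0), np.where(mask, 0, frame1)
```

Note that no per-region motion features are computed: one global shift plus local correlations is the source of the low processing cost claimed for Embodiment 1.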
  • As described above, according to Embodiment 1, the processing cost is low, and the background and the foreground can be separated with high accuracy.
  • the deviation amount specifying unit 103 specifies the deviation amounts X and Y from the entire first frame 110 and the second frame 111.
  • However, Embodiment 1 is not limited to such an example.
  • For example, the deviation amount specifying unit 103 may divide the first frame 110 and the second frame 111 into a plurality of divided frames in the same manner, and calculate a deviation amount for each divided frame.
  • FIGS. 7(A) to 7(H) are schematic views for explaining the process of dividing the first frame 110 and the second frame 111 to calculate the deviation amounts.
  • For example, the deviation amount specifying unit 103 divides the first frame 110 shown in FIG. 7(A) at the middle in the horizontal direction, thereby generating the first divided frame 110A shown in FIG. 7(B) and the second divided frame 110B shown in FIG. 7(C).
  • Similarly, the deviation amount specifying unit 103 divides the second frame 111 shown in FIG. 7(D) at the middle in the horizontal direction, thereby generating the third divided frame 111A shown in FIG. 7(E) and the fourth divided frame 111B shown in FIG. 7(F).
  • Then, as shown in FIG. 7(G), the deviation amount specifying unit 103 specifies the amount by which the positions of the first divided frame 110A and the third divided frame 111A are displaced, in the same manner as in FIG. 3, thereby specifying the division deviation amounts X1 and Y1. Similarly, as shown in FIG. 7(H), the deviation amount specifying unit 103 specifies the amount by which the positions of the second divided frame 110B and the fourth divided frame 111B are displaced, thereby specifying the division deviation amounts X2 and Y2.
  • Then, for the pixels included in the first divided frame 110A or the third divided frame 111A, the correlation calculation unit 104 may identify the corresponding pixels using the division deviation amounts X1 and Y1 and calculate the correlation. Similarly, for the pixels included in the second divided frame 110B or the fourth divided frame 111B, the correlation calculation unit 104 may identify the corresponding pixels using the division deviation amounts X2 and Y2 and calculate the correlation.
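The divided-frame variant can be sketched by splitting each frame at the middle and running the same whole-image shift search on each pair of halves. A left/right split is assumed here (the text could equally imply a top/bottom split), and all names and the search range are illustrative:

```python
import numpy as np

def _shift(a, b, max_shift=3):
    # Brute-force shift (dx, dy) maximizing the whole-image correlation
    # when b is rolled by (dy, dx); wrap-around is a simplification.
    best, out = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            c = np.corrcoef(a.ravel(),
                            np.roll(b, (dy, dx), axis=(0, 1)).ravel())[0, 1]
            if c > best:
                best, out = c, (dx, dy)
    return out

def per_half_shifts(f1, f2):
    """Split each frame at the horizontal middle (left/right halves
    assumed) and estimate a separate division deviation amount
    (X1, Y1), (X2, Y2) for each pair of halves."""
    mid = f1.shape[1] // 2
    return (_shift(f1[:, :mid], f2[:, :mid]),
            _shift(f1[:, mid:], f2[:, mid:]))
```

Per-half shifts can better model a foreground whose apparent motion varies across the image, at roughly double the shift-search cost.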
  • The image acquisition unit 102 or the deviation amount specifying unit 103 may convert the first frame and the second frame into grayscale when they are in color. The deviation amount specifying unit 103 and the correlation calculation unit 104 may then perform the above-described processing using the first frame and the second frame converted into grayscale.
  • In Embodiment 1, the earlier frame in time is the first frame and the later frame is the second frame, but the earlier frame may instead be the second frame and the later frame the first frame.
  • The moving image separation device 200 according to Embodiment 2 includes a moving image input unit 101, an image acquisition unit 202, a deviation amount specifying unit 103, a correlation calculation unit 104, a threshold processing unit 205, and an image output unit 106.
  • The moving image input unit 101, the deviation amount specifying unit 103, the correlation calculation unit 104, and the image output unit 106 of the moving image separation device 200 according to Embodiment 2 are the same as those of the moving image separation device 100 according to Embodiment 1.
  • the moving image input to the moving image input unit 101 is assumed to be a moving image captured by the interlace method.
  • The interlace method is an image transmission method in which every other scanning line is transmitted, so that one frame is transmitted in two passes.
  • The image acquisition unit 202 acquires two frames from one frame included in the moving image. Here, since the moving image is interlaced, the image acquisition unit 202 acquires a first frame in which the pixel values of the pixels of the odd-numbered rows included in the one frame are extracted, and a second frame in which the pixel values of the pixels of the even-numbered rows are extracted.
  • FIGS. 8(A) to 8(C) are schematic diagrams showing, in Embodiment 2, one frame 215 included in the moving image and the first frame 210 and second frame 211 acquired from the one frame 215. In Embodiment 2, from the one frame 215 shown in FIG. 8(A), the first frame 210 in which only the pixels of the odd-numbered rows are extracted, as shown in FIG. 8(B), and the second frame 211 in which only the pixels of the even-numbered rows are extracted, as shown in FIG. 8(C), are acquired.
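The acquisition of the two frames from one interlaced frame reduces to row slicing. A minimal sketch (the function name is an assumption; rows are numbered from 1 as in the text, so array index 0 holds the first odd-numbered row):

```python
import numpy as np

def deinterlace_fields(frame):
    """Split one interlaced frame into a first frame of the odd-numbered
    rows and a second frame of the even-numbered rows (1-based row
    numbering, i.e. array indices 0, 2, 4, ... and 1, 3, 5, ...)."""
    first = frame[0::2]   # rows 1, 3, 5, ...
    second = frame[1::2]  # rows 2, 4, 6, ...
    return first, second
```

Because the two fields of an interlaced frame are exposed at slightly different times, they play the role of the two temporally close frames of Embodiment 1.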
  • The threshold processing unit 205 compares the correlation calculated by the correlation calculation unit 104 with a preset threshold for each pixel, specifies pixels whose correlation is higher than the preset threshold as foreground pixels, and specifies the other pixels as background pixels.
  • Then, from the original frame from which the first frame and the second frame were acquired, the threshold processing unit 205 generates a foreground image, which is an image in which the pixel values of the foreground pixels are extracted, and a background image, which is an image in which the pixel values of the background pixels are extracted.
  • As described above, in Embodiment 2, the first frame consisting only of odd-numbered rows and the second frame consisting only of even-numbered rows are acquired from one frame of an interlaced moving image. Since the deviation amount between the first frame and the second frame acquired in this way is specified by the deviation amount specifying unit 103, the blurring caused by the temporal deviation between the odd-numbered rows and the even-numbered rows in the interlace method is eliminated.
  • The moving image separation device 300 according to Embodiment 3 includes a moving image input unit 101, an image acquisition unit 102, a deviation amount specifying unit 303, a correlation calculation unit 104, a threshold processing unit 105, and an image output unit 106.
  • The moving image input unit 101, the image acquisition unit 102, the correlation calculation unit 104, the threshold processing unit 105, and the image output unit 106 of the moving image separation device 300 according to Embodiment 3 are the same as those of the moving image separation device 100 according to Embodiment 1.
  • The deviation amount specifying unit 303 specifies the amount by which the positions of the two acquired frames are displaced such that the portions shown in the two frames match.
  • Here, the deviation amount from the other frame is specified using a partial image obtained by extracting, from one frame selected from the two frames, only the portion matching the features of the foreground.
  • one frame selected from the two frames is also referred to as a selected image, and the remaining frame is also referred to as a non-selected image.
  • FIGS. 9(A) and 9(B) are schematic views for explaining the processing of the deviation amount specifying unit 303 in Embodiment 3.
  • For example, the deviation amount specifying unit 303 generates a mask image 316 as shown in FIG. 9(B) by using the characteristics of the electric wire 112A, which is the foreground, in the first frame 110 shown in FIG. 9(A). Specifically, the deviation amount specifying unit 303 specifies, from the first frame 110, the range of saturation corresponding to the gray color of the electric wire 112A, and can thereby generate the mask image 316 shown in FIG. 9(B).
  • The first frame 110 shown in FIG. 9(A) shows the electric wire 112A as the foreground; the portion other than the electric wire 112A is the background, and the blurred forest in the background appears as patterns 113A and 114A.
  • the mask image 316 shown in FIG. 9B is divided into an area 317 corresponding to the foreground and an area 318 corresponding to the background.
  • The deviation amount specifying unit 303 uses the mask image 316 shown in FIG. 9(B) to extract, from the first frame 110 shown in FIG. 9(A), a partial image consisting of the region 317 corresponding to the foreground.
  • the deviation amount specifying unit 303 specifies the deviation amount by obtaining the overall correlation between the extracted partial image and the second frame.
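The mask-based deviation search of Embodiment 3 can be sketched in two parts: building a mask from a value range assumed to characterize the foreground (standing in for the range of saturation the text mentions), and correlating only the masked pixels with the corresponding pixels of the second frame. All names and the scalar value-range test are illustrative assumptions:

```python
import numpy as np

def foreground_mask(frame, lo, hi):
    """Mask pixels whose value falls in the assumed range [lo, hi]
    characteristic of the foreground (e.g. the gray of a power line)."""
    return (frame >= lo) & (frame <= hi)

def masked_correlation(f1, f2, mask):
    """Correlate only the masked (foreground) pixels of the first frame
    with the corresponding pixels of the second frame, so that the
    blurred background does not influence the score."""
    a = f1[mask].astype(float)
    b = f2[mask].astype(float)
    return np.corrcoef(a, b)[0, 1]
```

A shift search like the one in Embodiment 1 would evaluate `masked_correlation` for each candidate shift of the second frame and keep the maximizing shift.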
  • In this way, the deviation amount specifying unit 303 can suppress the influence of the background and accurately identify the deviation of the foreground.
  • In Embodiments 1 to 3 described above, the threshold processing units 105 and 205 generate both the foreground image and the background image, but only the foreground image may be generated.
  • In Embodiments 1 to 3 described above, the correlation calculation unit 104 calculates the correlation of a predetermined area for each pair of corresponding pixels of the two corrected frames, but Embodiments 1 to 3 are not limited to such an example.
  • the correlation calculation unit 104 may calculate the correlation of a predetermined area for each group of a plurality of corresponding pixels of the two corrected frames.
  • For example, the correlation calculation unit 104 may compute the correlation in a window of predetermined size, such as 5 to 9 pixels, once for a predetermined group of pixels in the window, and treat that value as the correlation of each pixel included in the group. As a result, the computational load of calculating the correlations can be reduced. It is desirable that the group of pixels include the pixel at the center of the window.
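The grouped-correlation variant can be sketched by evaluating the window correlation once per small block of pixels and sharing the value across the block, which cuts the number of correlation evaluations roughly by the square of the group size. The function name, the square block shape, and the parameter values are assumptions:

```python
import numpy as np

def grouped_window_correlation(f1, f2, win=5, group=2):
    """Compute the win x win window correlation once per group x group
    block of pixels and assign the same value to every pixel in the
    block, reducing the number of correlation evaluations by ~group**2."""
    h, w = f1.shape
    r = win // 2
    out = np.zeros((h, w))
    a, b = f1.astype(float), f2.astype(float)
    for y in range(r, h - r, group):
        for x in range(r, w - r, group):
            wa = a[y - r:y + r + 1, x - r:x + r + 1].flatten()
            wb = b[y - r:y + r + 1, x - r:x + r + 1].flatten()
            wa -= wa.mean()
            wb -= wb.mean()
            d = np.sqrt((wa * wa).sum() * (wb * wb).sum())
            c = wa @ wb / d if d > 0 else 0.0
            out[y:y + group, x:x + group] = c  # shared by the whole block
    return out
```

Since the window is centred on the first pixel of each block, every block's shared value comes from a window containing that pixel, consistent with the recommendation that the group include the window's central pixel.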

Abstract

This moving image separation device is provided with: an image acquisition unit (102) that acquires two images including the foreground from a moving image captured from a moving viewpoint; a deviation amount specifying unit (103) that specifies the amount by which the position of the foreground deviates between the two images; a correlation calculation unit (104) that corrects the positional deviation between the two images in accordance with the deviation amount and calculates a correlation, over a predetermined area, for each pair of corresponding pixels, or for each pair of corresponding pixel groups, in the two corrected images; and a threshold processing unit (105) that takes pixels whose calculated correlation is larger than a predetermined threshold as foreground pixels and generates, from one image included in the moving image, an image in which the pixel values of the foreground pixels are extracted, as the foreground image.

Description

Moving image separation device, program, and moving image separation method
This disclosure relates to a moving image separation device, a program, and a moving image separation method.
As a technique for separating the background and the foreground in frames included in a moving image, there is the moving image processing device described in Patent Document 1.
The conventional moving image processing device divides each frame into small regions, calculates a motion-based feature amount for each divided region, and separates the foreground from the background by classifying the small regions according to the calculated feature amounts.
Japanese Unexamined Patent Publication No. 2005-176339
Because each frame must be divided into small regions and a motion-based feature amount must be calculated for each divided region, the conventional technique incurs a very high processing cost in separating the foreground and the background in each frame.
Accordingly, one or more aspects of the present disclosure aim to reduce the processing cost of separating the foreground and the background in an image included in moving image data.
A moving image separation device according to one aspect of the present disclosure includes: an image acquisition unit that acquires two images including a foreground from a moving image captured from a moving viewpoint; a deviation amount specifying unit that specifies the amount by which the position of the foreground deviates between the two images; a correlation calculation unit that corrects the positional deviation between the two images according to the deviation amount and calculates a correlation over a predetermined area for each pair of corresponding pixels, or for each group of corresponding pixels, of the two corrected images; and a threshold processing unit that takes pixels whose calculated correlation is larger than a predetermined threshold as foreground pixels and generates, from one image included in the moving image, an image in which the pixel values of the foreground pixels are extracted, as a foreground image.
 本開示の一態様に係るプログラムは、コンピュータを、移動する視点で撮像された動画から、前景が含まれる二つの画像を取得する画像取得部、前記二つの画像における前記前景の位置がずれているズレ量を特定するズレ量特定部、前記ズレ量に従って、前記二つの画像の位置のずれを補正して、補正後の前記二つの画像の対応する画素毎に、又は、補正後の前記二つの画像の対応する複数の画素のグループ毎に、予め定められた面積の相関を算出する相関算出部、及び、前記算出された相関が予め定められた閾値よりも大きい画素を前記前景の画素とし、前記動画に含まれる一つの画像から、前記前景の画素の画素値を抽出した画像を前景画像として生成する閾値処理部、として機能させることを特徴とする。 In the program according to one aspect of the present disclosure, the image acquisition unit that acquires two images including the foreground from the moving viewpoint captured by the computer, and the positions of the foreground in the two images are deviated from each other. The deviation amount specifying unit for specifying the deviation amount, the deviation of the positions of the two images is corrected according to the deviation amount, and each of the corresponding pixels of the two images after correction, or the two after correction. A correlation calculation unit that calculates the correlation of a predetermined area for each group of a plurality of corresponding pixels of an image, and a pixel whose calculated correlation is larger than a predetermined threshold are defined as pixels in the foreground. It is characterized in that it functions as a threshold processing unit that generates an image obtained by extracting the pixel values of the pixels of the foreground from one image included in the moving image as a foreground image.
 本開示の一態様に係る動画分離方法は、移動する視点で撮像された動画から、前景が含まれる二つの画像を取得し、前記二つの画像における前記前景の位置がずれているズレ量を特定し、前記ズレ量に従って、前記二つの画像の位置のずれを補正し、補正後の前記二つの画像の対応する画素毎に、又は、補正後の前記二つの画像の対応する複数の画素のグループ毎に、予め定められた面積の相関を算出し、前記算出された相関が予め定められた閾値よりも大きい画素を前記前景の画素とし、前記動画に含まれる一つの画像から、前記前景の画素の画素値を抽出した画像を前景画像として生成することを特徴とする。 In the moving image separation method according to one aspect of the present disclosure, two images including a foreground are acquired from a moving image captured from a moving viewpoint, and the amount of deviation in the position of the foreground in the two images is specified. Then, according to the amount of deviation, the displacement of the positions of the two images is corrected, and each of the corrected pixels of the two images or a group of a plurality of corresponding pixels of the corrected two images. For each, the correlation of the predetermined area is calculated, and the pixel whose calculated correlation is larger than the predetermined threshold is defined as the foreground pixel, and from one image included in the moving image, the foreground pixel is used. It is characterized in that an image obtained by extracting the pixel value of is generated as a foreground image.
 本開示の一又は複数の態様によれば、動像に含まれている画像において前景と背景とを分離する際の処理コストを低くすることができる。 According to one or more aspects of the present disclosure, it is possible to reduce the processing cost when separating the foreground and the background in the image included in the moving image.
FIG. 1 is a block diagram schematically showing the configuration of the moving image separation devices according to Embodiments 1 to 3.
FIGS. 2(A) and 2(B) are schematic diagrams showing the two acquired frames.
FIG. 3 is a schematic diagram for explaining how the deviation amount specifying unit in Embodiment 1 specifies the deviation amount.
FIG. 4 is a schematic diagram showing a correlation image in which the calculated correlations are classified, pixel by pixel, into predetermined value ranges.
FIG. 5 is a block diagram showing an example hardware configuration of the moving image separation device.
FIG. 6 is a flowchart showing the processing performed by the moving image separation device.
FIGS. 7(A) to 7(H) are schematic diagrams for explaining the process of dividing the first frame and the second frame and calculating the deviation amounts.
FIGS. 8(A) to 8(C) are schematic diagrams showing one frame included in the moving image and the first and second frames acquired from that frame.
FIGS. 9(A) and 9(B) are schematic diagrams showing the first frame and the mask image.
Embodiment 1.
FIG. 1 is a block diagram schematically showing the configuration of the moving image separation device 100 according to Embodiment 1.
The moving image separation device 100 includes a moving image input unit 101, an image acquisition unit 102, a deviation amount specifying unit 103, a correlation calculation unit 104, a threshold processing unit 105, and an image output unit 106.
The moving image input unit 101 receives input of a moving image from an external camera and passes the input moving image to the image acquisition unit 102. The moving image contains a plurality of images as a plurality of frames.
The moving image here is one in which a camera captures a foreground object while moving; in other words, by moving the camera, the camera's viewpoint also moves. Specifically, the moving image in Embodiment 1 is assumed to show an electric wire photographed from a helicopter. The moving image in Embodiment 1 is therefore a moving image captured from a moving viewpoint.
Embodiment 1 assumes that, in the moving image, the foreground does not change significantly over time, and that the foreground and the background differ in how much they change over time.
The image acquisition unit 102 acquires two images that include the foreground from the moving image.
For example, the image acquisition unit 102 acquires two temporally close frames from the plurality of frames included in the moving image. Here, of the two acquired frames, the temporally earlier one is referred to as the first frame and the later one as the second frame.
Specifically, the image acquisition unit 102 may acquire, from the plurality of frames included in the moving image, two frames that lie within a predetermined time of each other, or within a predetermined number of frames of each other. Here, the image acquisition unit 102 is assumed to acquire two consecutive frames.
FIGS. 2(A) and 2(B) are schematic diagrams showing the two frames acquired by the image acquisition unit 102; FIG. 2(A) shows the first frame 110 and FIG. 2(B) shows the second frame 111.
The first frame 110 shows the electric wire 112A as the foreground, and everything other than the electric wire 112A is the background. Here, the camera is assumed to be focused on the foreground electric wire 112A, so the forest behind it is out of focus. Because the background is blurred, the forest appears as blurred patterns 113A and 114A.
Likewise, the second frame shows the electric wire 112B as the foreground, and everything other than the electric wire 112B is the background. In the background, the forest appears as blurred patterns 113B and 114B.
Here, the camera is assumed to be moving in direction D.
The deviation amount specifying unit 103 specifies the deviation amount by which the position of the foreground differs between the two acquired frames.
For example, the deviation amount specifying unit 103 specifies the amount by which the positions of the two frames are displaced, such that the content appearing in the two frames coincides.
Specifically, the deviation amount specifying unit 103 may specify the positional deviation amount of the two frames so that the foreground portions contained in them coincide. Here, the deviation amount specifying unit 103 specifies the deviation amount by computing the correlation between the two frames as a whole. Alternatively, the deviation amount specifying unit 103 may specify the deviation amount by matching feature points.
FIG. 3 is a schematic diagram for explaining how the deviation amount specifying unit 103 specifies the deviation amount.
The first frame 110 and the second frame 111 are captured by a camera moving in direction D. Therefore, when the content of the first frame 110 is overlaid on the corresponding content of the second frame 111, the second frame 111 is displaced in direction D relative to the first frame 110, as shown in FIG. 3. The frames are also displaced vertically, depending on the vibration and the moving direction of the camera.
To correct such displacement, the deviation amount specifying unit 103 specifies a horizontal deviation amount X and a vertical deviation amount Y. For example, when the foreground electric wires 112A and 112B are in focus and the background is blurred, as described above, computing the correlation over the entire images of the first frame 110 and the second frame 111 yields a high correlation when the corresponding portions of the electric wire 112A and the electric wire 112B coincide. Consequently, as shown in FIG. 3, the correlation is high when the corresponding portions of the electric wire 112A in the first frame 110 and the electric wire 112B in the second frame 111 overlap.
The same deviation amounts X and Y as in FIG. 3 can also be obtained by feature point matching, because the degree of matching is highest when the in-focus objects coincide.
The correlation calculation unit 104 corrects the positional deviation between the two frames according to the deviation amount specified by the deviation amount specifying unit 103, and calculates a correlation over a predetermined area for each pair of corresponding pixels of the corrected two frames.
For example, the correlation calculation unit 104 shifts either the first frame 110 or the second frame 111 by the deviation amount specified by the deviation amount specifying unit 103 and, for each pair of coinciding pixels, calculates the correlation over a window of a predetermined size.
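The per-pixel windowed correlation can be sketched as follows, assuming the two frames have already been aligned by the deviation amount. The disclosure does not fix the correlation measure; a zero-normalized cross-correlation over a square window is used here, and the names `correlation_map` and `win` are illustrative.

```python
import numpy as np

def correlation_map(frame_a, frame_b, win=3):
    """Per-pixel correlation between two aligned frames, computed over a
    (2*win+1) x (2*win+1) window centred on each pixel."""
    h, w = frame_a.shape
    corr = np.zeros((h, w))
    # Edge-replicate padding so every pixel has a full window
    pa = np.pad(frame_a, win, mode='edge')
    pb = np.pad(frame_b, win, mode='edge')
    for y in range(h):
        for x in range(w):
            a = pa[y:y + 2 * win + 1, x:x + 2 * win + 1]
            b = pb[y:y + 2 * win + 1, x:x + 2 * win + 1]
            a0, b0 = a - a.mean(), b - b.mean()
            denom = np.sqrt((a0 ** 2).sum() * (b0 ** 2).sum())
            corr[y, x] = (a0 * b0).sum() / denom if denom > 0 else 0.0
    return corr
```

An in-focus, unchanged foreground pixel yields a window that is nearly identical in both frames (correlation near 1), while a blurred background pixel whose content has changed between frames yields a low correlation, which is exactly the contrast shown in the correlation image of FIG. 4.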
FIG. 4 is a schematic diagram showing a correlation image 114 in which the correlations calculated by the correlation calculation unit 104 are classified, pixel by pixel, into predetermined value ranges.
As shown in the correlation image 114 of FIG. 4, the portion corresponding to the foreground electric wire has a relatively high correlation, while the remaining portion, the background, has a relatively low correlation.
Accordingly, by comparing the correlation of each pixel with a predetermined threshold, the threshold processing unit 105 identifies pixels whose correlation is higher than the predetermined threshold as foreground pixels and the remaining pixels as background pixels.
The threshold processing unit 105 then generates, from a frame included in the moving image (here, at least one of the first frame 110 and the second frame 111), a foreground image, which is an image obtained by extracting the pixel values of the foreground pixels, and a background image, which is an image obtained by extracting the pixel values of the background pixels.
Note that the foreground image is an image in which the pixel values of the foreground pixels are extracted without extracting those of the background pixels, and the background image is an image in which the pixel values of the background pixels are extracted without extracting those of the foreground pixels.
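The thresholding step itself is simple; a sketch, assuming a per-pixel correlation map has already been computed (the name `separate`, the threshold value, and the choice of zero for excluded pixels are illustrative assumptions):

```python
import numpy as np

def separate(frame, corr, threshold=0.5):
    """Split a frame into a foreground image and a background image by
    thresholding the per-pixel correlation map. Pixels not extracted
    into each image are filled with zero."""
    fg_mask = corr > threshold
    foreground = np.where(fg_mask, frame, 0)  # foreground pixel values only
    background = np.where(fg_mask, 0, frame)  # background pixel values only
    return foreground, background
```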
The image output unit 106 outputs at least one of the foreground image and the background image.
FIG. 5 is a block diagram showing an example hardware configuration of the moving image separation device 100.
The moving image separation device 100 can be realized by a computer in which an I/O (input/output) interface 120, an auxiliary storage device 121 functioning as a non-volatile memory, a CPU (Central Processing Unit) 122 functioning as a processor, and a main memory 123 are connected to a bus 124.
Specifically, the moving image input unit 101 and the image output unit 106 can be realized by the I/O interface 120.
The image acquisition unit 102, the deviation amount specifying unit 103, the correlation calculation unit 104, and the threshold processing unit 105 can be realized by the CPU 122 reading a program stored in the auxiliary storage device 121 into the main memory 123 and executing that program.
Such a program may be provided through a network, or may be provided recorded on a recording medium; that is, such a program may be provided, for example, as a program product.
The camera 125 serving as an imaging device outputs a moving image to the moving image separation device 100.
The moving image separation device 100 outputs at least one of the generated foreground image and the generated background image to the display 126 serving as a display device, thereby causing the display 126 to display at least one of the foreground image and the background image.
In the example shown in FIG. 5, the camera 125 and the display 126 are not included in the moving image separation device 100, but at least one of them may be included in it.
For example, when the camera 125 is included in the moving image separation device 100, the camera 125 functions as an imaging unit that captures the moving image.
When the display 126 is included in the moving image separation device 100, the display 126 functions as a display unit that displays at least one of the foreground image and the background image.
FIG. 6 is a flowchart showing the processing performed by the moving image separation device 100 according to Embodiment 1.
First, the moving image input unit 101 receives input of a moving image from the external camera (S10) and passes the input moving image to the image acquisition unit 102.
Next, the image acquisition unit 102 acquires two temporally close frames from the plurality of frames included in the moving image as the first frame and the second frame (S11); of the two acquired frames, the temporally earlier one is the first frame and the later one is the second frame. The image acquisition unit 102 then passes the acquired first and second frames to the deviation amount specifying unit 103.
Next, the deviation amount specifying unit 103 specifies the deviation amount by which the positions of the two acquired frames are displaced, such that the content appearing in the two frames coincides (S12), and passes the specified deviation amount together with the first and second frames to the correlation calculation unit 104.
Next, the correlation calculation unit 104 shifts either the first frame 110 or the second frame 111 by the given deviation amount and, for each pair of coinciding pixels, calculates the correlation over a window of a predetermined size (S13). The correlation calculation unit 104 then passes correlation data indicating the calculated correlation of each pixel, the deviation amount, and the first and second frames to the threshold processing unit 105.
Next, the threshold processing unit 105 compares the correlation indicated by the correlation data with a predetermined threshold for each pixel, identifying pixels whose correlation is higher than the predetermined threshold as foreground pixels and the remaining pixels as background pixels (S14).
The threshold processing unit 105 then generates, from at least one of the first frame 110 and the second frame 111, a foreground image composed of the pixel values of the foreground pixels and a background image composed of the pixel values of the background pixels (S15), and passes at least one of the generated foreground and background images to the image output unit 106.
Finally, the image output unit 106 outputs at least one of the foreground image and the background image (S16).
As described above, the moving image separation device 100 according to Embodiment 1 can separate the background and the foreground accurately at a low processing cost.
In Embodiment 1, the deviation amount specifying unit 103 specifies the deviation amounts X and Y from the whole of the first frame 110 and the second frame 111, as shown in FIG. 3, but Embodiment 1 is not limited to this example. For example, the deviation amount specifying unit 103 may divide the first frame 110 and the second frame 111 into a plurality of divided frames in the same manner and calculate a deviation amount for each divided frame.
FIGS. 7(A) to 7(H) are schematic diagrams for explaining the process of dividing the first frame 110 and the second frame and calculating the deviation amounts.
For example, the deviation amount specifying unit 103 divides the first frame 110 shown in FIG. 7(A) at its horizontal midpoint, generating the first divided frame 110A shown in FIG. 7(B) and the second divided frame 110B shown in FIG. 7(C).
Similarly, the deviation amount specifying unit 103 divides the second frame 111 shown in FIG. 7(D) at its horizontal midpoint, generating the third divided frame 111A shown in FIG. 7(E) and the fourth divided frame 111B shown in FIG. 7(F).
Then, as shown in FIG. 7(G), the deviation amount specifying unit 103 specifies the positional deviation amount between the first divided frame 110A and the third divided frame 111A in the same manner as in FIG. 3, thereby specifying the divided deviation amounts X1 and Y1.
Similarly, as shown in FIG. 7(H), the deviation amount specifying unit 103 specifies the positional deviation amount between the second divided frame 110B and the fourth divided frame 111B, thereby specifying the divided deviation amounts X2 and Y2.
In this case, for the pixels contained in the first divided frame 110A or the third divided frame 111A, the correlation calculation unit 104 may identify the corresponding pixels using the divided deviation amounts X1 and Y1 and calculate the correlation.
Similarly, for the pixels contained in the second divided frame 110B or the fourth divided frame 111B, the correlation calculation unit 104 may identify the corresponding pixels using the divided deviation amounts X2 and Y2 and calculate the correlation.
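The divided-frame variant above can be sketched by splitting both frames at the horizontal midpoint and running the same whole-frame correlation search on each half. All names below are illustrative, and the brute-force search stands in for whatever deviation-amount estimator is actually used:

```python
import numpy as np

def estimate_shift(a, b, max_shift=4):
    """Brute-force normalized cross-correlation search (as in Embodiment 1)."""
    h, w = a.shape
    best, best_score = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            pa = a[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)]
            pb = b[max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
            a0, b0 = pa - pa.mean(), pb - pb.mean()
            denom = np.sqrt((a0 ** 2).sum() * (b0 ** 2).sum())
            if denom == 0:
                continue
            score = (a0 * b0).sum() / denom
            if score > best_score:
                best_score, best = score, (dx, dy)
    return best

def split_shifts(frame_a, frame_b):
    """Divide both frames at the horizontal midpoint and estimate a separate
    divided deviation amount (X1, Y1), (X2, Y2) for the two halves."""
    mid = frame_a.shape[1] // 2
    x1y1 = estimate_shift(frame_a[:, :mid], frame_b[:, :mid])
    x2y2 = estimate_shift(frame_a[:, mid:], frame_b[:, mid:])
    return x1y1, x2y2
```

Per-region deviation amounts of this kind can help when the apparent motion is not uniform across the frame, at the cost of running the search once per divided frame.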
Furthermore, when the first frame and the second frame are in color, the image acquisition unit 102 or the deviation amount specifying unit 103 may convert them to grayscale, and the deviation amount specifying unit 103 and the correlation calculation unit 104 may then perform the above processing using the grayscale-converted first and second frames.
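The disclosure does not fix a grayscale conversion formula; one conventional choice is the Rec. 601 luma weighting, sketched below (the name `to_gray` and the choice of weights are assumptions):

```python
import numpy as np

def to_gray(rgb):
    """Convert an RGB frame (H x W x 3) to grayscale using the common
    Rec. 601 luma weights: 0.299 R + 0.587 G + 0.114 B."""
    return rgb[..., 0] * 0.299 + rgb[..., 1] * 0.587 + rgb[..., 2] * 0.114
```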
In Embodiment 1, of the two acquired frames, the temporally earlier frame is the first frame and the later frame is the second frame, but the temporally earlier frame may instead be the second frame and the later frame the first frame.
Embodiment 2.
As shown in FIG. 1, the moving image separation device 200 according to Embodiment 2 includes a moving image input unit 101, an image acquisition unit 202, a deviation amount specifying unit 103, a correlation calculation unit 104, a threshold processing unit 205, and an image output unit 106.
The moving image input unit 101, the deviation amount specifying unit 103, the correlation calculation unit 104, and the image output unit 106 of the moving image separation device 200 according to Embodiment 2 are the same as those of the moving image separation device 100 according to Embodiment 1.
In Embodiment 2, the moving image input to the moving image input unit 101 is assumed to be a moving image captured by the interlace method. The interlace method is a type of image transmission method in which the scanning lines are transmitted in two passes, every other line at a time.
The image acquisition unit 202 acquires two frames from one frame included in the moving image. Since the moving image here is interlaced, the image acquisition unit 202 acquires a first frame obtained by extracting the pixel values of the odd-numbered rows of pixels contained in the one frame, and a second frame obtained by extracting the pixel values of the even-numbered rows.
FIGS. 8(A) to 8(C) are schematic diagrams showing, in Embodiment 2, one frame 215 included in the moving image and the first frame 210 and second frame 211 acquired from that frame 215.
In Embodiment 2, from the one frame 215 shown in FIG. 8(A), a first frame 210 in which only the pixels of the odd-numbered rows are extracted, as shown in FIG. 8(B), and a second frame 211 in which only the pixels of the even-numbered rows are extracted, as shown in FIG. 8(C), are acquired.
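Extracting the two fields from one interlaced frame is a simple strided slicing operation; a sketch (the NumPy representation and the name `split_fields` are assumptions):

```python
import numpy as np

def split_fields(frame):
    """Split an interlaced frame into its two fields. Rows are counted
    from 1 in the text, so the 'odd-numbered rows' are array indices
    0, 2, 4, ... and the 'even-numbered rows' are indices 1, 3, 5, ..."""
    first = frame[0::2]   # odd-numbered rows -> first frame
    second = frame[1::2]  # even-numbered rows -> second frame
    return first, second
```

Because the two fields were captured at slightly different times, they behave like the two temporally close frames of Embodiment 1, which is why the same deviation-amount and correlation processing applies.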
The threshold processing unit 205 compares the correlation calculated by the correlation calculation unit 104 with a predetermined threshold for each pixel, identifying pixels whose correlation is higher than the predetermined threshold as foreground pixels and the remaining pixels as background pixels.
The threshold processing unit 205 then generates, from the original frame from which the first and second frames were acquired, a foreground image, which is an image obtained by extracting the pixel values of the foreground pixels, and a background image, which is an image obtained by extracting the pixel values of the background pixels.
As described above, in Embodiment 2, a first frame consisting only of the odd-numbered rows and a second frame consisting only of the even-numbered rows are acquired from one frame of an interlaced moving image. Because the deviation amount between the first and second frames acquired in this way is specified by the deviation amount specifying unit 103, the blur caused by the temporal offset between the odd-numbered and even-numbered rows in the interlace method is eliminated.
Therefore, according to Embodiment 2, the foreground and the background can be separated from a single frame while eliminating the blur of that frame.
Embodiment 3.
As shown in FIG. 1, the moving image separation device 300 according to Embodiment 3 includes a moving image input unit 101, an image acquisition unit 102, a deviation amount specifying unit 303, a correlation calculation unit 104, a threshold processing unit 105, and an image output unit 106.
The moving image input unit 101, the image acquisition unit 102, the correlation calculation unit 104, the threshold processing unit 105, and the image output unit 106 of the moving image separation device 300 according to Embodiment 3 are the same as those of the moving image separation device 100 according to Embodiment 1.
The deviation amount specifying unit 303 specifies the deviation amount by which the positions of the two acquired frames are displaced, such that the content appearing in the two frames coincides.
In Embodiment 3, a partial image obtained by extracting, from one frame selected from these two frames, only the portion matching the features of the foreground is used to specify the deviation amount relative to the other frame. Here, the frame selected from the two frames is also called the selected image, and the remaining frame the non-selected image.
FIGS. 9(A) and 9(B) are schematic diagrams for explaining the processing of the deviation amount specifying unit 303 in Embodiment 3.
Here, a specific example is described using the first of the two acquired frames, but the second frame may be used instead.
The deviation amount specifying unit 303 generates, for example, a mask image 316 as shown in FIG. 9(B) from the first frame 110 shown in FIG. 9(A), using the features of the foreground electric wire 112A.
 Since the electric wire 112A has, for example, a color close to gray, the shift amount specifying unit 303 can generate the mask image 316 shown in FIG. 9(B) by identifying, in the first frame 110, the saturation range corresponding to the gray of the electric wire 112A.
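The low-saturation masking described here can be sketched as follows. This is a hypothetical NumPy illustration, not the patented embodiment; the function name and the threshold values `sat_max`, `val_min`, and `val_max` are illustrative assumptions not taken from the disclosure.

```python
import numpy as np

def foreground_mask(frame_rgb, sat_max=0.15, val_min=0.2, val_max=0.8):
    """Mask pixels whose color is close to gray (low saturation),
    as a proxy for a gray electric wire in the foreground.

    frame_rgb: H x W x 3 float array with values in [0, 1].
    Returns a boolean H x W mask (True = foreground candidate).
    """
    cmax = frame_rgb.max(axis=2)
    cmin = frame_rgb.min(axis=2)
    # HSV-style saturation: (max - min) / max, defined as 0 where max == 0
    sat = np.where(cmax > 0, (cmax - cmin) / np.maximum(cmax, 1e-12), 0.0)
    # Gray pixels: low saturation and mid-range brightness
    return (sat <= sat_max) & (cmax >= val_min) & (cmax <= val_max)
```

A gray pixel such as (0.5, 0.5, 0.5) has zero saturation and is kept, while a saturated green pixel is rejected.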
 As in the first embodiment, the first frame 110 shown in FIG. 9(A) shows the electric wire 112A as the foreground; the portion other than the electric wire 112A is the background, and the forest forming the background appears as blurred patterns 113A and 114A.
 The mask image 316 shown in FIG. 9(B) is divided into a region 317 corresponding to the foreground and a region 318 corresponding to the background.
 Then, using the mask image 316 shown in FIG. 9(B), the shift amount specifying unit 303 can extract from the first frame 110 shown in FIG. 9(A) a partial image, that is, the image of the region 317 corresponding to the foreground.
 The shift amount specifying unit 303 specifies the shift amount by computing the overall correlation between the extracted partial image and the second frame.
 Since the extracted partial image does not include the patterns 113A and 114A shown in FIG. 9(A), the shift amount specifying unit 303 can specify the shift of the foreground accurately.
 As described above, according to the third embodiment, the influence of the background can be suppressed and the shift of the foreground can be specified accurately.
 In the first to third embodiments described above, the threshold processing units 105 and 205 generate both a foreground image and a background image, but they may generate only the foreground image.
 In the first to third embodiments described above, the correlation calculation unit 104 calculates the correlation over a predetermined area for each corresponding pixel of the two corrected frames, but the first to third embodiments are not limited to this example.
 For example, the correlation calculation unit 104 may calculate the correlation over a predetermined area for each group of corresponding pixels of the two corrected frames. Specifically, the correlation calculation unit 104 may take the correlation computed in a window of a predetermined size, such as 5 × 9, as the correlation of every pixel belonging to a predetermined group of pixels within that window. This reduces the computational load of calculating the correlations. It is desirable that each group of pixels include the pixel at the center of its window.
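The per-group evaluation can be sketched as follows: the window correlation is computed only at the center pixel of each group and its value is shared with the whole group. This is a hypothetical illustration; zero-mean normalized cross-correlation (ZNCC) is used here as one concrete choice of correlation measure, and the 3 × 3 group size is an illustrative assumption.

```python
import numpy as np

def zncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def group_correlation(img1, img2, group_size=3, win=(5, 9)):
    """Evaluate the window correlation once per group_size x group_size
    block and share the value with every pixel of the block, reducing the
    number of correlation evaluations by roughly group_size ** 2.
    """
    h, w = img1.shape
    hy, hx = win[0] // 2, win[1] // 2
    out = np.zeros((h, w))
    for gy in range(0, h, group_size):
        for gx in range(0, w, group_size):
            # Center pixel of the group, clamped to the image
            cy = min(gy + group_size // 2, h - 1)
            cx = min(gx + group_size // 2, w - 1)
            # Window around the center, clipped at the image borders
            y0, y1 = max(cy - hy, 0), min(cy + hy + 1, h)
            x0, x1 = max(cx - hx, 0), min(cx + hx + 1, w)
            c = zncc(img1[y0:y1, x0:x1], img2[y0:y1, x0:x1])
            out[gy:gy + group_size, gx:gx + group_size] = c
    return out
```

Identical inputs give a correlation of 1 everywhere, since each patch is perfectly correlated with itself.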
 100, 200, 300 moving image separation device; 101 moving image input unit; 102, 202 image acquisition unit; 103, 303 shift amount specifying unit; 104 correlation calculation unit; 105, 205 threshold processing unit; 106 image output unit.

Claims (13)

  1.  A moving image separation device comprising:
      an image acquisition unit to acquire, from a moving image captured from a moving viewpoint, two images including a foreground;
      a shift amount specifying unit to specify a shift amount by which positions of the foreground in the two images are displaced;
      a correlation calculation unit to correct displacement between the positions of the two images according to the shift amount, and to calculate a correlation over a predetermined area for each corresponding pixel of the two corrected images, or for each group of a plurality of corresponding pixels of the two corrected images; and
      a threshold processing unit to take pixels whose calculated correlation is larger than a predetermined threshold as foreground pixels, and to generate, as a foreground image, an image obtained by extracting pixel values of the foreground pixels from one image included in the moving image.
  2.  The moving image separation device according to claim 1, wherein the threshold processing unit takes pixels other than the foreground pixels in the one image as background pixels, and generates, as a background image, an image obtained by extracting pixel values of the background pixels from the one image.
  3.  The moving image separation device according to claim 1 or 2, wherein the image acquisition unit acquires, as the two images, two frames within a predetermined time in the moving image.
  4.  The moving image separation device according to claim 1 or 2, wherein the image acquisition unit acquires, as the two images, two frames within a predetermined number of frames in the moving image.
  5.  The moving image separation device according to any one of claims 1 to 4, wherein the one image is either one of the two images.
  6.  The moving image separation device according to any one of claims 1 to 5, wherein the shift amount specifying unit extracts, from a selected image that is one image selected from the two images, a partial image that is an image of a portion corresponding to the foreground, and specifies the shift amount using the partial image and a non-selected image that is the image of the two images other than the selected image.
  7.  The moving image separation device according to any one of claims 1 to 5, wherein the shift amount specifying unit divides each of the two images into a plurality of images and specifies the shift amount from each of the divided images.
  8.  The moving image separation device according to claim 1 or 2, wherein
      the moving image is a moving image captured by an interlaced method, and
      the image acquisition unit acquires, as the two images, an image obtained by extracting pixel values of pixels in odd-numbered rows and an image obtained by extracting pixel values of pixels in even-numbered rows from one frame included in the moving image.
  9.  The moving image separation device according to claim 8, wherein the one image is the one frame.
  10.  The moving image separation device according to any one of claims 1 to 9, further comprising an imaging unit to capture the moving image.
  11.  The moving image separation device according to any one of claims 1 to 10, further comprising a display unit to display the foreground image.
  12.  A program that causes a computer to function as:
      an image acquisition unit to acquire, from a moving image captured from a moving viewpoint, two images including a foreground;
      a shift amount specifying unit to specify a shift amount by which positions of the foreground in the two images are displaced;
      a correlation calculation unit to correct displacement between the positions of the two images according to the shift amount, and to calculate a correlation over a predetermined area for each corresponding pixel of the two corrected images, or for each group of a plurality of corresponding pixels of the two corrected images; and
      a threshold processing unit to take pixels whose calculated correlation is larger than a predetermined threshold as foreground pixels, and to generate, as a foreground image, an image obtained by extracting pixel values of the foreground pixels from one image included in the moving image.
  13.  A moving image separation method comprising:
      acquiring, from a moving image captured from a moving viewpoint, two images including a foreground;
      specifying a shift amount by which positions of the foreground in the two images are displaced;
      correcting displacement between the positions of the two images according to the shift amount;
      calculating a correlation over a predetermined area for each corresponding pixel of the two corrected images, or for each group of a plurality of corresponding pixels of the two corrected images;
      taking pixels whose calculated correlation is larger than a predetermined threshold as foreground pixels; and
      generating, as a foreground image, an image obtained by extracting pixel values of the foreground pixels from one image included in the moving image.
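For explanation only, the steps of the method (align, correlate per pixel, threshold, extract) can be sketched end to end. This hypothetical NumPy implementation is not the claimed subject matter: the window size, threshold value, and the choice of ZNCC as the correlation measure are illustrative assumptions.

```python
import numpy as np

def separate_foreground(img1, img2, shift, win=5, thresh=0.9):
    """Correct the shift between two images, compute a windowed correlation
    for each pixel, and keep the pixels whose correlation exceeds the
    threshold as the foreground image.

    img1, img2: H x W grayscale float arrays.
    shift: (dy, dx) displacement of img2 relative to img1.
    Returns (foreground_image, correlation_map).
    """
    dy, dx = shift
    aligned = np.roll(np.roll(img2, -dy, axis=0), -dx, axis=1)  # shift correction
    h, w = img1.shape
    half = win // 2
    corr = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            # Correlation window clipped at the image borders
            y0, y1 = max(y - half, 0), min(y + half + 1, h)
            x0, x1 = max(x - half, 0), min(x + half + 1, w)
            a = img1[y0:y1, x0:x1] - img1[y0:y1, x0:x1].mean()
            b = aligned[y0:y1, x0:x1] - aligned[y0:y1, x0:x1].mean()
            d = np.sqrt((a * a).sum() * (b * b).sum())
            corr[y, x] = (a * b).sum() / d if d > 0 else 0.0
    # Keep pixel values only where the two aligned images agree
    foreground = np.where(corr > thresh, img1, 0.0)
    return foreground, corr
```

A stationary foreground stays highly correlated after alignment and survives the threshold, while a background that moved differently between the two images decorrelates and is zeroed out.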
PCT/JP2019/048991 2019-12-13 2019-12-13 Moving image separation device, program, and moving image separation method WO2021117233A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2021559701A JP7003342B2 (en) 2019-12-13 2019-12-13 Video separator, program and video separation method
PCT/JP2019/048991 WO2021117233A1 (en) 2019-12-13 2019-12-13 Moving image separation device, program, and moving image separation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/048991 WO2021117233A1 (en) 2019-12-13 2019-12-13 Moving image separation device, program, and moving image separation method

Publications (1)

Publication Number Publication Date
WO2021117233A1 true WO2021117233A1 (en) 2021-06-17

Family

ID=76330115

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/048991 WO2021117233A1 (en) 2019-12-13 2019-12-13 Moving image separation device, program, and moving image separation method

Country Status (2)

Country Link
JP (1) JP7003342B2 (en)
WO (1) WO2021117233A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004334818A (en) * 2003-03-07 2004-11-25 Fuji Photo Film Co Ltd Dynamic image cut-out device, method and program
JP2005202706A (en) * 2004-01-16 2005-07-28 Seiko Epson Corp Synthetic processing of image which does not contain body in motion

Also Published As

Publication number Publication date
JPWO2021117233A1 (en) 2021-06-17
JP7003342B2 (en) 2022-02-10

Similar Documents

Publication Publication Date Title
US10277812B2 (en) Image processing to obtain high-quality loop moving image
US8144255B2 (en) Still subtitle detection apparatus and image processing method therefor
US11037310B2 (en) Image processing device, image processing method, and image processing program
EP2330812A1 (en) Apparatus for generating a panoramic image, method for generating a panoramic image, and computer-readable medium
US10026155B1 (en) Image-processing apparatus
JP5216984B2 (en) Motion information acquisition apparatus and image processing apparatus
US5479218A (en) Motion vector detecting apparatus
JP2017050731A (en) Moving picture frame interpolation device, moving picture frame interpolation method, and moving picture frame interpolation program
US9106926B1 (en) Using double confirmation of motion vectors to determine occluded regions in images
JP7003342B2 (en) Video separator, program and video separation method
KR20030049804A (en) Method and apparatus for estimating camera motion
JP2018033080A (en) Image processing apparatus, image processing method and program
JP4483085B2 (en) Learning device, application device, learning method, and application method
CN114885112B (en) High-frame-rate video generation method and device based on data fusion
JP2008113292A (en) Motion estimation method and device, program thereof and recording medium thereof
AU2004200237B2 (en) Image processing apparatus with frame-rate conversion and method thereof
WO2014077024A1 (en) Image processing device, image processing method and image processing program
US20080165277A1 (en) Systems and Methods for Deinterlacing Video Data
JP2006215657A (en) Method, apparatus, program and program storage medium for detecting motion vector
KR101012621B1 (en) Image processing device
JP2008193462A (en) Frame interpolation apparatus and frame interpolation method
JP2019097004A (en) Image generation apparatus, image generation method and image generation program
JP2009124261A5 (en)
JP5824937B2 (en) Motion vector deriving apparatus and method
JP5751117B2 (en) Image generation apparatus, image generation method, and program for image generation apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19956136; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2021559701; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 19956136; Country of ref document: EP; Kind code of ref document: A1)