WO2013150829A1 - 画像処理装置、画像処理装置の制御方法およびプログラム - Google Patents
画像処理装置、画像処理装置の制御方法およびプログラム Download PDFInfo
- Publication number
- WO2013150829A1 (PCT/JP2013/054557)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/215—Motion-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/136—Segmentation; Edge detection involving thresholding
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6811—Motion detection based on the image signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/14—Picture signal circuitry for video frequency region
- H04N5/144—Movement detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20182—Noise reduction or smoothing in the temporal domain; Spatio-temporal filtering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20201—Motion blur correction
Definitions
- the present technology relates to an image processing apparatus, a control method thereof, and a program for causing a computer to execute the method.
- the present invention relates to an image processing apparatus that executes smoothing processing, a control method thereof, and a program for causing a computer to execute the method.
- the above-described conventional technology cannot generate a dynamic image.
- the photographer may shoot an image that expresses the speed of the moving object or the force of movement by adjusting the shutter speed or the like so that a part of the image is blurred.
- it is not possible to generate a dynamic image from an image photographed without using such a photographing technique.
- This technology was created in view of such a situation, and aims to generate a dynamic image.
- the present technology has been made to solve the above-described problems.
- The first aspect of the present technology is an image processing apparatus including: a moving body acquisition unit that acquires the region of a moving body in a target image, the target image being at least one of a plurality of images continuous in time series; a moving direction acquisition unit that acquires the moving direction of the moving body; a rear region detection unit that detects, as a rear region, the rear portion of the moving body region with respect to the moving direction; and a rear region processing unit that executes predetermined image processing on the rear region. The first aspect also covers a control method of the apparatus and a program for causing a computer to execute the method.
- In the first aspect, the moving body acquisition unit may detect the region of the moving body in the target image. This brings about the effect that the region of the moving body is detected in the target image.
- In the first aspect, the target image may include a plurality of blocks having a predetermined shape, and the moving body acquisition unit may calculate the motion amount of each block using a block matching algorithm.
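The block-matching variant above can be sketched as follows. This is a minimal, illustrative implementation assuming 8-bit grayscale frames as NumPy arrays; the function name, block size, search range, and the SAD (sum of absolute differences) matching criterion are assumptions, not details taken from this publication.

```python
import numpy as np

def block_motion(prev, curr, block=16, search=8):
    """For each block of the target frame `curr`, find the offset of its
    best SAD match in the previous frame `prev`. Returns a dict mapping
    each block's top-left corner (bx, by) to the offset (dx, dy)."""
    h, w = curr.shape
    motions = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            ref = curr[by:by + block, bx:bx + block].astype(np.int32)
            best, best_dxy = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue  # candidate window falls outside the frame
                    cand = prev[y:y + block, x:x + block].astype(np.int32)
                    sad = np.abs(ref - cand).sum()
                    if best is None or sad < best:
                        best, best_dxy = sad, (dx, dy)
            motions[(bx, by)] = best_dxy
    return motions
```

The per-block motion amount is the magnitude of the returned offset; the motion from the previous frame to the target frame points opposite to the offset of the match.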
- In the first aspect, the moving body acquisition unit may further acquire the region of the moving body in a reference image, which is the image immediately preceding the target image among the plurality of images, and the moving direction acquisition unit may detect, as the moving direction, the direction from a specific coordinate in the region of the moving body in the reference image to the corresponding specific coordinate in the region of the moving body in the target image. This brings about the effect that the direction between the specific coordinates is detected as the moving direction.
- In the first aspect, the rear region detection unit may detect, as the rear region, the region surrounded by the contour line of the rear portion whose position has been changed in the moving direction in the target image and the contour line before the change.
- In the first aspect, the rear region detection unit may detect, as the rear region, the region generated by masking, within the region of the moving body before the change, the region of the moving body whose position has been changed in the moving direction in the target image as a mask region. This brings about the effect that the region left after the masking is detected as the rear region.
- In the first aspect, the rear region detection unit may include a moving speed detection unit that detects the moving speed of the moving body, and a mask processing unit that performs the masking using, as the mask region, the region of the moving body whose position has been changed by a change amount corresponding to the moving speed. This brings about the effect that masking is performed with the region of the moving body, shifted by an amount according to the moving speed, as the mask region.
- In the first aspect, the rear region detection unit may further include an enlargement unit that enlarges the region of the moving body in the moving direction in the target image, and the mask processing unit may perform the masking within the enlarged region before the change, using as the mask region the enlarged region whose position has been changed in the moving direction. This brings about the effect that the region generated by masking the enlarged region with its shifted copy is detected as the rear region.
- In the first aspect, the predetermined image processing may be a smoothing process along the moving direction. This brings about the effect that the rear region is smoothed along the moving direction.
- In the first aspect, the smoothing process may be performed on the rear region along the moving direction to a degree corresponding to the moving speed. Also, the enlargement unit may enlarge the region of the moving body in the moving direction according to the moving speed.
- In the first aspect, the target image may be any one of the plurality of images, the rear region detection unit may detect a plurality of rear regions in the rear portion of the moving body, and the smoothing processing unit may generate an image obtained by performing the smoothing process on the plurality of rear regions. Thereby, a plurality of rear regions are detected, and an image on which the smoothing process has been performed is generated.
- In the first aspect, the rear region processing unit may perform the smoothing process using a low-pass filter having a pass band whose width corresponds to the amplitude of a periodic function representing the distribution of pixel values in the rear region along the moving direction. This brings about the effect that the smoothing process is performed using such a low-pass filter.
- In the first aspect, the image processing apparatus may further include an alignment processing unit that aligns the positions of the plurality of images, and the moving body detection unit may detect the moving body in the target image whose position has been aligned. This brings about the effect that the moving body is detected in the target image on which the alignment process has been performed.
- In the first aspect, the rear region processing unit may further execute, on the region other than the moving body in the target image, a smoothing process along the moving direction with a degree different from that applied to the rear region. This brings about the effect that the region other than the moving body is smoothed to a different degree.
- the image processing apparatus may further include an enhancement processing unit that emphasizes a line segment along the moving direction among the line segments included in the rear region. This brings about the effect that the line segment along the moving direction is emphasized.
- the predetermined image processing may be processing for filling the rear area with a predetermined color. This brings about the effect that the rear region is filled with a predetermined color.
- The image processing apparatus thus achieves the excellent effect of being able to generate a dynamic image.
- It is a block diagram showing a configuration example of the rear region detection unit in the first embodiment. It is a flowchart illustrating an example of the operation of the image processing apparatus according to the first embodiment. It is a figure showing an example of the input image data in the first embodiment.
- First embodiment image processing: an example in which smoothing processing is executed for a rear region
- Second embodiment image processing: an example in which smoothing processing is executed for a plurality of rear regions in one moving body
- Third embodiment image processing: an example in which smoothing processing is executed for a rear region in an image after alignment
- FIG. 1 is a block diagram illustrating a configuration example of the information processing apparatus 100 according to the first embodiment.
- the information processing apparatus 100 is an apparatus that executes various types of information processing such as imaging of moving image data and image processing on the moving image data.
- the information processing apparatus 100 includes an imaging unit 110, a control unit 120, a display unit 130, an input / output interface 140, a moving image data storage unit 150, a bus 160, and an image processing device 200.
- the imaging unit 110 captures a subject such as a moving body and generates moving image data.
- This moving image data includes a plurality of pieces of image data continuous in time series.
- the imaging unit 110 outputs the generated moving image data to the moving image data storage unit 150.
- the control unit 120 controls the information processing apparatus 100 as a whole. For example, the control unit 120 performs control for causing the imaging unit 110 to generate moving image data, and control for causing the image processing apparatus 200 to perform image processing on the moving image data.
- the display unit 130 displays moving image data.
- the input / output interface 140 is for outputting data to an external device of the information processing apparatus 100 and inputting data from an external device.
- the input / output data includes moving image data and the like.
- the moving image data storage unit 150 stores moving image data.
- the bus 160 is a common path through which the imaging unit 110, the control unit 120, the display unit 130, the input / output interface 140, the moving image data storage unit 150, and the image processing apparatus 200 transmit and receive data.
- the image processing apparatus 200 performs predetermined image processing on the image data in the moving image data.
- the image processing apparatus 200 reads a plurality of image data as input image data from the moving image data storage unit 150 via the signal line 208.
- The image processing apparatus 200 takes at least one of these input image data as a target image and detects a moving body in the target image. Then, the image processing apparatus 200 detects the moving direction of the moving body.
- the image processing apparatus 200 detects a rear portion of the moving body as a rear region with respect to the moving direction.
- the image processing apparatus 200 performs a smoothing process on the detected rear region along the movement direction.
- the image processing apparatus 200 outputs the smoothed image data as output image data to the display unit 130 or the like via the signal line 209.
- FIG. 2 is a block diagram illustrating a configuration example of the image processing apparatus 200 according to the first embodiment.
- the image processing apparatus 200 includes a moving object detection unit 220, a movement vector detection unit 230, a rear region detection unit 240, and a smoothing processing unit 250.
- The moving body detection unit 220 detects a moving body in the input image data. For example, the moving body detection unit 220 detects a moving body in each of n (n is an integer of 2 or more) pieces of input image data I 0 to I n−1 . Details of the moving body detection method will be described later.
- The moving body detection unit 220 supplies data indicating the detected moving body regions, as the moving body region data M 0 to M n−1 , to the movement vector detection unit 230 and the rear region detection unit 240 via the signal line 229.
- the moving body region data is, for example, image data in which the pixel value of the pixel in the moving body region is “1” and the other pixel values are “0”.
- the moving body detection unit 220 is an example of a moving body acquisition unit described in the claims.
- The movement vector detection unit 230 detects the moving direction of the moving body. For example, the movement vector detection unit 230 sets the image immediately preceding the target image as the reference image corresponding to the target image. Then, the movement vector detection unit 230 detects, as the movement vector (that is, the moving direction), the vector whose start point is a specific coordinate (for example, the coordinates of the center of gravity) in the moving body of the reference image and whose end point is the corresponding specific coordinate in the moving body of the target image. However, since there is no corresponding reference image for the first input image data in the time series among the n input image data, the movement vector detection unit 230 detects a movement vector in each of the n−1 input image data from the second one onward. The movement vector detection unit 230 supplies the detected movement vectors V 1 to V n−1 to the rear region detection unit 240 and the smoothing processing unit 250 via the signal line 239.
- the movement vector detection unit 230 is an example of a movement direction acquisition unit described in the claims.
- the movement vector detection unit 230 obtains a movement vector for the second and subsequent input image data, but may obtain a movement vector for the first input image data.
- the movement vector detection unit 230 may interpolate the same vector as the second movement vector V 1 as the movement vector V 0 for the first input image data.
- the rear region detection unit 240 detects, as a rear region, a region in the rear part with respect to the movement vector in each of the input image data in which the movement vector is detected. Details of the rear region detection method will be described later.
- The rear region detection unit 240 supplies rear region data B 1 to B n−1 indicating the detected rear regions to the smoothing processing unit 250 via the signal line 249.
- the rear region data is, for example, image data in which the pixel value of the pixel in the rear region is “1” and the other pixel values are “0”.
- the smoothing processing unit 250 performs a smoothing process on the rear region along the moving direction.
- the smoothing processing unit 250 executes the smoothing process along the direction of the movement vector in each rear region of the second and subsequent input image data in which the rear region is detected.
- The smoothing processing unit 250 performs smoothing using, for example, a moving average filter. In this moving average filter, the pixel value after the smoothing process is calculated from the following Expression 1 and Expression 2, for example.
- Expression 1: P 0 ′ = (P 0 + P 1 + … + P K−1 ) / K
- Expression 2: K = α × |V|
- P 0 to P K−1 are pixel values of pixels in the moving body region before the smoothing process. P 1 to P K−1 are the pixel values of the pixels lined up in the moving direction as viewed from the pixel corresponding to P 0 .
- For example, if P 0 is the pixel value at coordinates (0, 0) and the moving direction is the X-axis direction, the pixel values at coordinates (1, 0) to (K−1, 0) are P 1 to P K−1 .
- K is the degree of smoothing, specifically the filter order of the moving average filter. P 0 ′ is the pixel value after the smoothing process at the pixel corresponding to P 0 .
- |V| is the magnitude of the movement vector between two frames (in other words, the moving speed of the moving body). α is a predetermined coefficient set to a real number.
- The smoothing processing unit 250 outputs the image data after the smoothing process as output image data O 1 to O n−1 .
- Note that the smoothing processing unit 250 executes the smoothing process on all of the second and subsequent input image data, but the smoothing process may be executed on only a part of the input image data (for example, any one of them). In addition, the smoothing processing unit 250 executes the smoothing process using a moving average filter, but may execute it using a filter other than a moving average filter (for example, a Gaussian filter). Further, the smoothing processing unit 250 sets the smoothing degree (K) to a value corresponding to the moving speed (|V|), but may instead use a fixed value regardless of the moving speed.
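Expressions 1 and 2 above can be illustrated with a minimal sketch, assuming the moving direction is the +X axis and the images are 2D NumPy arrays; the function name and the default value of α are illustrative, not taken from this publication.

```python
import numpy as np

def smooth_rear_region(image, rear_mask, v, alpha=0.5):
    """Apply the moving-average filter of Expressions 1 and 2 to the
    pixels flagged in `rear_mask`, averaging along the moving direction."""
    # Expression 2: the degree of smoothing K grows with the speed |V|.
    K = max(1, int(alpha * abs(v)))
    out = image.astype(np.float64)  # astype copies, so `image` is untouched
    h, w = image.shape
    for y, x in zip(*np.nonzero(rear_mask)):
        # Expression 1: average of up to K pixels lined up in the
        # moving direction (+X here), starting at (x, y).
        run = image[y, x:min(x + K, w)]
        out[y, x] = run.mean()
    return out
```

Pixels outside the rear region are passed through unchanged, matching the description that only the rear region is smoothed.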
- FIG. 3 is a block diagram illustrating a configuration example of the moving object detection unit 220 according to the first embodiment.
- the moving body detection unit 220 includes a pixel selection unit 221, a background reference value calculation unit 222, and a moving body region extraction unit 223.
- The pixel selection unit 221 selects the n pixel values at corresponding coordinates in the n input image data I 0 to I n−1 . For example, when each input image data consists of w × h pixels (w and h are integers of 1 or more) at coordinates (0, 0) to (w−1, h−1), the pixel selection unit 221 first selects the pixel value at coordinates (0, 0) from each of the n input image data. As a result, n pixel values are selected. Next, the pixel selection unit 221 selects the pixel value at coordinates (0, 1) in each of the n input image data, so that the next n pixel values are selected. In this way, w × h sets of n pixel values are finally selected from the input image data I 0 to I n−1 . These selected pixel values are supplied to the background reference value calculation unit 222 and the moving body region extraction unit 223.
- the background reference value calculation unit 222 calculates a reference value for each coordinate when determining whether or not a pixel in the input image data is a background pixel.
- The background reference value calculation unit 222 calculates, for example, the mode as the reference value in each of the w × h sets. This is because, in n consecutive images, a pixel value with a high appearance frequency is estimated to be highly likely to belong to the background.
- The background reference value calculation unit 222 supplies the calculated reference values V(0, 0) to V(w−1, h−1) to the moving body region extraction unit 223. Note that the background reference value calculation unit 222 may calculate the average of the n pixel values as the reference value.
- the moving body area extraction unit 223 extracts a moving body area from each of the input image data.
- The moving body region extraction unit 223 calculates, for each pixel in the input image data, the difference between the pixel value of the pixel and the reference value corresponding to that pixel. When the difference is less than a threshold, the moving body region extraction unit 223 determines that the pixel is a background pixel; when the difference is equal to or greater than the threshold, it determines that the pixel is a pixel in the moving body.
- The moving body region extraction unit 223 generates and outputs the moving body region data M 0 to M n−1 based on the determination result.
- a pixel value of “1” is set for a pixel determined as a moving object
- a pixel value of “0” is set for a pixel determined as a background.
- Although the moving body detection unit 220 detects the moving body based on the appearance frequency of pixel values, it can also detect the moving body using other methods.
- For example, the moving body detection unit 220 may use an inter-frame difference method that calculates the pixel value difference between corresponding pixels in a plurality of continuous images and detects a pixel whose difference is equal to or greater than a threshold as a pixel of the moving body.
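As a rough sketch of the mode-based background model described above, assuming integer-valued grayscale frames as NumPy arrays; the helper name and threshold parameter are illustrative, not taken from this publication.

```python
import numpy as np

def detect_moving_body(frames, threshold=10):
    """Per-pixel background model: take the most frequent value (mode)
    of each coordinate over the frame stack as the background reference,
    then mark pixels whose difference from it reaches the threshold as
    moving body ('1') and the rest as background ('0')."""
    stack = np.stack(frames)                     # shape: (n, h, w)
    n, h, w = stack.shape
    ref = np.zeros((h, w), dtype=stack.dtype)
    for y in range(h):
        for x in range(w):
            vals, counts = np.unique(stack[:, y, x], return_counts=True)
            ref[y, x] = vals[np.argmax(counts)]  # mode = background reference
    diff = np.abs(stack.astype(np.int32) - ref.astype(np.int32))
    return (diff >= threshold).astype(np.uint8)  # masks M_0 .. M_{n-1}
```

Swapping the mode for a per-pixel mean reproduces the average-based variant mentioned for the background reference value calculation unit 222.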
- FIG. 4 is a block diagram illustrating a configuration example of the movement vector detection unit 230 according to the first embodiment.
- the movement vector detection unit 230 includes a barycentric coordinate calculation unit 231 and a movement vector calculation unit 232.
- the barycentric coordinate calculation unit 231 calculates barycentric coordinates in the area of the moving object.
- the barycentric coordinate calculation unit 231 receives input image data and moving body region data corresponding to the input image data.
- The center-of-gravity coordinate calculation unit 231 calculates the center of gravity of the moving body using each pixel value in the moving body in the input image data as a density. Specifically, the center-of-gravity coordinate calculation unit 231 calculates the barycentric coordinates using, for example, the following Equation 3 and Equation 4.
- Equation 3: g i = Σ i Σ j ( i × P[i][j] ) / Σ i Σ j P[i][j]
- Equation 4: g j = Σ i Σ j ( j × P[i][j] ) / Σ i Σ j P[i][j]
- In Equations 3 and 4, g i is the x coordinate of the center of gravity and g j is the y coordinate of the center of gravity. i is the x coordinate and j is the y coordinate of a pixel in the region of the moving body, P[i][j] is the pixel value at coordinates (i, j), and the sums are taken over the pixels in the region of the moving body.
- The barycentric coordinate calculation unit 231 calculates the barycentric coordinates G 0 to G n−1 for the n pieces of input data and supplies them to the movement vector calculation unit 232.
- the center-of-gravity coordinate calculation unit 231 may obtain the center-of-gravity coordinates with all the pixel values in the area of the moving object in the input image data as a constant value (for example, “1”). In this case, the center-of-gravity coordinate calculation unit 231 calculates the center-of-gravity coordinates only from the moving body region data, with P [i] [j] in Equations 3 and 4 all set to a constant value (eg, “1”).
- the movement vector calculation unit 232 calculates a movement vector from the barycentric coordinates.
- The movement vector calculation unit 232 calculates, as the movement vector, the vector whose start point is the center of gravity of the moving body in the image (reference image) immediately preceding the target image and whose end point is the center of gravity of the moving body in the target image. For example, the vector having the barycentric coordinate G 0 as the start point and the barycentric coordinate G 1 as the end point is calculated as the movement vector V 1 of the moving body in the second input image data I 1 .
- the movement vector detection unit 230 may use coordinates other than the center of gravity as the start point or end point of the movement vector. For example, the movement vector detection unit 230 may obtain the average value of the x coordinate and the average value of the y coordinate in each moving body, and may use the coordinate formed by these average values as the start point or the end point of the vector.
- The movement vector detection unit 230 calculates a movement vector for each input image data (frame), but may instead average the movement vectors within a certain period (for example, 30 frames) and output the average vector as the movement vector for that period.
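Equations 3 and 4 and the movement-vector computation can be sketched as follows, assuming 2D NumPy arrays for the image and the moving body region mask; the function names are illustrative, not taken from this publication.

```python
import numpy as np

def center_of_gravity(image, mask):
    """Equations 3 and 4: pixel-value-weighted centroid of the moving
    body region. Returns (g_i, g_j) = (x, y) barycentric coordinates."""
    ys, xs = np.nonzero(mask)
    weights = image[ys, xs].astype(np.float64)
    gx = (xs * weights).sum() / weights.sum()   # Equation 3
    gy = (ys * weights).sum() / weights.sum()   # Equation 4
    return gx, gy

def movement_vector(image_prev, mask_prev, image_curr, mask_curr):
    """Movement vector V: from the centroid in the reference (previous)
    image to the centroid in the target image."""
    x0, y0 = center_of_gravity(image_prev, mask_prev)
    x1, y1 = center_of_gravity(image_curr, mask_curr)
    return x1 - x0, y1 - y0
```

Passing a constant image (all ones) reproduces the unweighted variant in which all pixel values in the moving body region are treated as a constant density.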
- FIG. 5 is a block diagram showing a configuration example of the rear region detection unit 240 in the first embodiment.
- the rear region detection unit 240 includes an enlargement unit 241, a moving body position changing unit 242, and a mask processing unit 243.
- the expansion unit 241 expands the area of the moving body in the movement direction.
- the enlargement unit 241 receives the moving body region data M and the moving vector V from the moving body detection unit 220 and the movement vector detection unit 230.
- The enlargement unit 241 calculates the magnitude |V| of the movement vector V. Then, the enlargement unit 241 enlarges the moving body region M corresponding to the movement vector V in the moving direction by an amount corresponding to |V|. The enlargement unit 241 outputs the enlarged region of the moving body as the enlarged region data W to the moving body position changing unit 242 and the mask processing unit 243.
- the moving object position changing unit 242 changes the position of the enlarged moving object region (W) in the moving direction.
- The moving body position changing unit 242 receives the enlarged region data W and the movement vector V from the enlargement unit 241 and the movement vector detection unit 230. Then, the moving body position changing unit 242 changes (shifts) the position of the enlarged region in the moving direction by a distance corresponding to |V|.
- the moving body position changing unit 242 supplies the enlarged region data whose position has been changed to the mask processing unit 243 as the mask region data W ′.
- The enlargement unit 241 enlarges the region of the moving body in accordance with the moving speed (|V|), and the moving body position changing unit 242 shifts the position by a distance according to the moving speed (|V|). However, the enlargement unit 241 may instead enlarge by a fixed amount, and the moving body position changing unit 242 may shift by a fixed distance, regardless of the moving speed.
- the mask processing unit 243 executes mask processing using the mask area data W ′ in the enlarged area data W. By this masking process, the portion of the enlarged area data W that overlaps the mask area data W ′ is deleted, and the remaining area is extracted.
- the mask processing unit 243 outputs the rear region data B indicating the rear region, with the region generated by the mask processing (that is, the region other than the mask region data W ′) as the rear region.
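The enlarge/shift/mask sequence performed by the enlargement unit 241, the moving body position changing unit 242, and the mask processing unit 243 can be sketched as follows, assuming movement along the +X axis, a boolean region mask, and an integer shift; the helper and the β coefficient are illustrative, not taken from this publication.

```python
import numpy as np

def detect_rear_region(mask, v, beta=1.0):
    """Stretch the moving-body region forward along the moving direction,
    shift the enlarged region further forward by a distance tied to |V|,
    and keep the part of the enlarged region NOT covered by the shifted
    copy; the remainder is the rear region B."""
    shift = max(1, int(beta * abs(v)))
    h, w = mask.shape
    enlarged = mask.copy()                 # enlargement unit 241
    for s in range(1, shift + 1):
        enlarged[:, s:] |= mask[:, :w - s]
    shifted = np.zeros_like(enlarged)      # position changing unit 242
    shifted[:, shift:] = enlarged[:, :w - shift]
    return enlarged & ~shifted             # mask processing unit 243
```

Without the enlargement step, the surviving region would include leading parts of the body (such as the front wheel of the automobile in the figures); enlarging first leaves only a strip at the very rear of the moving body.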
- FIG. 6 is a flowchart illustrating an example of the operation of the image processing apparatus 200 according to the first embodiment. This operation starts, for example, when input image data I 0 to I n-1 is input to the image processing apparatus 200.
- First, the image processing apparatus 200 detects the moving body in the input image data I 0 to I n−1 and generates the moving body region data M 0 to M n−1 (step S910).
- the image processing apparatus 200 detects the moving vectors V 1 to V n-1 of the moving body from the input image data I 0 to I n-1 and the moving body region data M 0 to M n-1 (step S920).
- Next, the image processing apparatus 200 detects the rear region of each moving body from the movement vectors V 1 to V n−1 and the moving body region data M 0 to M n−1 , and generates the rear region data B 1 to B n−1 (step S930). Then, using the input image data I 0 to I n−1 , the rear region data B 1 to B n−1 , and the movement vectors V 1 to V n−1 , the image processing apparatus 200 executes the smoothing process on the rear region in the input image data along the moving direction (step S940).
- FIG. 7 is a diagram illustrating an example of input image data according to the first embodiment.
- In FIG. 7, a is the first input image data I 0 in the time series, and b to d are the second to fourth input image data I 1 to I 3 . An automobile is photographed in a to d of FIG. 7, and its position moves in the horizontal direction as time passes.
- FIG. 8 is a diagram illustrating a distribution example of pixel values according to the first embodiment.
- the vertical axis represents pixel values and the horizontal axis represents time.
- P 0 (0,0) to P 10 (0,0) are pixel values of coordinates (0,0) in each of the input image data I 0 to I 10 .
- In FIG. 8, the appearance frequency of the pixel value shared by P 0 (0,0) to P 2 (0,0) and the like is high, and this pixel value is used as the reference value V(0,0).
- Since the difference between the reference value and each of P 0 (0,0) to P 2 (0,0), P 6 (0,0) to P 10 (0,0), and the like is less than the threshold, the corresponding pixels are determined to be background pixels.
- FIG. 9 is a diagram illustrating an example of the moving body region data according to the first embodiment.
- In FIG. 9, a to d are the moving body region data M 0 to M 3 generated from the input image data I 0 to I 3 exemplified in a to d of FIG. 7.
- a white area is a background area
- a black area is a moving body area.
- an automobile area is detected as a moving body area.
- FIG. 10 is a diagram illustrating an example of the center of gravity and the movement vector in the first embodiment.
- In FIG. 10, a to d are diagrams illustrating the center of gravity and the movement vector detected in the input image data I 0 to I 3 exemplified in a to d of FIG. 7. The background is omitted in a to d of FIG. 10.
- the center-of-gravity coordinates G 0 to G 3 are calculated from Equation 3 and Equation 4 in each moving body (automobile).
- the movement vector V 1 , with the barycentric coordinates G 0 as its start point and G 1 as its end point, is detected.
- movement vectors V 2 and V 3 are detected from the barycentric coordinates G 1 to G 3 .
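Assuming Equations 3 and 4 (not reproduced in this excerpt) compute the ordinary area centroid, the centre-of-gravity and movement-vector step can be sketched as:

```python
import numpy as np

def centroid(mask):
    """Centre of gravity G of a boolean moving-body mask, as (x, y)."""
    ys, xs = np.nonzero(mask)
    return np.array([xs.mean(), ys.mean()])

def movement_vectors(masks):
    """V_t runs from the centroid G_{t-1} to the centroid G_t."""
    centers = [centroid(m) for m in masks]
    return [g1 - g0 for g0, g1 in zip(centers, centers[1:])]
```

For a region that shifts two pixels to the right between frames, the detected movement vector is (2, 0).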
- FIG. 11 is a diagram illustrating an example of mobile object region data, extension region data, mask region data, and rear region data in the first embodiment.
- “a” is a diagram illustrating the moving body region data M 1 corresponding to the second input image data I 1 .
- b in FIG. 11 is the enlarged region data W 1 in which the region of the moving body (the black area) in the moving body region data M 1 of a in FIG. 11 has been enlarged in the moving direction.
- the rear region detection unit 240 generates this data by expanding the region of the moving body in the moving direction (V 1 ).
- c in FIG. 11 is the mask region data W 1 ′ in which the position of the enlarged region of b in FIG. 11 has been changed in the moving direction.
- the rear region detection unit 240 generates this data by changing the position of the enlarged region.
- d in FIG. 11 is the rear region data B 1 generated by masking the enlarged region data W 1 of b in FIG. 11 with the mask region data W 1 ′.
- that is, the rear region detection unit 240 deletes the region where b in FIG. 11 and c in FIG. 11 overlap from b in FIG. 11, and detects the remaining region as the rear region.
- if the region of the moving body were shifted without being enlarged to form the mask region, regions other than the rear part of the moving body (such as the front wheel of an automobile) might remain after the mask processing.
- by enlarging the region, regions such as the front wheel of the automobile are prevented from being detected as the rear region.
- the rear region detection unit 240 can be configured to shift the region of the moving body without enlarging it.
- since the rear region detection unit 240 only needs to detect the region surrounded by the contour line of the rear part whose position has been changed in the moving direction and the contour line before the change, it can also detect the rear region by processing other than the mask processing. Specifically, the rear region detection unit 240 divides the region of the moving body in two along a line perpendicular to the moving direction and, of the two divided portions, detects the contour line of the rear portion with respect to the moving direction. Then, the rear region detection unit 240 may change the position of that contour line in the moving direction and detect the region surrounded by the contour lines before and after the position change as the rear region.
- the rear region detection unit 240 divides the region of the moving body into two along a line perpendicular to the moving direction, and of the two divided portions, the region of the rear portion with respect to the moving direction is left as it is. It may be detected as a rear region.
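Restricted to purely horizontal motion for brevity, the enlarge/shift/mask construction (W, W′ and B above) might look like the following sketch; the names and the enlargement amount are illustrative, not from the patent:

```python
import numpy as np

def shift_x(mask, dx):
    """Translate a boolean mask horizontally by dx pixels (positive = right)."""
    out = np.zeros_like(mask)
    if dx > 0:
        out[:, dx:] = mask[:, :-dx]
    elif dx < 0:
        out[:, :dx] = mask[:, -dx:]
    else:
        out[:] = mask
    return out

def rear_region(mask, dx):
    """mask: boolean moving-body region; dx: horizontal motion in pixels."""
    step = int(np.sign(dx))
    # W: the moving-body region enlarged in the moving direction.
    enlarged = mask.copy()
    for k in range(1, abs(dx) + 1):
        enlarged |= shift_x(mask, step * k)
    # W': the enlarged region repositioned along the moving direction.
    mask_region = shift_x(enlarged, dx)
    # B: deleting the overlap of W and W' from W leaves the rear region.
    return enlarged & ~mask_region
```

For a body occupying columns 1–3 moving one pixel to the right, only the trailing column 1 survives the masking, i.e. the strip behind the body.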
- FIG. 12 is a diagram illustrating an example of output image data according to the first embodiment.
- FIG. 12 shows the output image data O 1 generated by executing the smoothing process along the moving direction for the rear region of the vehicle in the input image data I 1 illustrated in FIG. 7B. Thereby, an image in which the rear part of the moving body is blurred in the moving direction is generated, and the speed feeling of the moving body is emphasized.
- the image processing apparatus 200 executes the smoothing process along the moving direction for the rear area of the moving object, so that the rear area is aligned with the moving direction. And a smoothed image can be generated. As a result, a dynamic image in which the sense of speed of the moving object is emphasized can be obtained.
- in the first embodiment, the image processing apparatus 200 treated, for each pixel, the pixel value with the highest appearance frequency as the background and the other pixels as the moving body, and obtained the movement vector from the temporal change of the center of gravity of the moving body. However, the image processing apparatus 200 may instead obtain the movement vector by block matching.
- the image processing apparatus 200 according to the first modified example is different from the first embodiment in that a movement vector is obtained by block matching.
- the moving object detection unit 220 divides the input image data into a plurality of blocks having a predetermined shape and, for each block within the search range of one of two adjacent pieces of input image data, obtains the block having the highest correlation within the search range of the other.
- the search range is a range for searching for a motion vector.
- the degree of correlation is obtained by, for example, SAD (Sum of Absolute Differences) processing, which calculates the sum of the absolute differences of the pixel values.
- MPC (Maximum matching Pixel Count) may also be used.
- the moving object detection unit 220 detects a vector from one of the two blocks having the highest correlation within the search range to the other as a motion vector. Then, the moving object detection unit 220 detects an area composed of blocks in which the amount of motion indicated by the motion vector exceeds a predetermined threshold as an area of the moving object.
- the moving body detection unit 220 may supply the obtained motion vector as it is to the rear region detection unit 240 and the smoothing processing unit 250 as a movement vector.
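The SAD search in this modification can be sketched as follows; the exhaustive window scan and the parameter names are assumptions for illustration:

```python
import numpy as np

def sad_motion(block, next_frame, top, left, radius):
    """Find the motion vector (dx, dy) of `block` (located at (top, left)
    in the previous frame) by minimising the Sum of Absolute Differences
    over a +/- radius search range in `next_frame`."""
    h, w = block.shape
    best, best_vec = None, (0, 0)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > next_frame.shape[0] or x + w > next_frame.shape[1]:
                continue  # candidate block falls outside the frame
            cand = next_frame[y:y + h, x:x + w]
            sad = np.abs(block.astype(int) - cand.astype(int)).sum()
            if best is None or sad < best:
                best, best_vec = sad, (dx, dy)
    return best_vec
```

A bright 2×2 patch that moves two pixels to the right between frames yields the motion vector (2, 0), and blocks whose motion amount exceeds a threshold would then be collected into the moving-body region.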
- the image processing apparatus 200 smoothes the rear region along the movement direction.
- the image processing apparatus 200 may further perform a process of enhancing a line segment along the movement direction.
- the image processing apparatus 200 according to the second modified example is different from the first embodiment in that a line segment along the moving direction is emphasized in the rear region.
- FIG. 13 is a block diagram illustrating a configuration example of the image processing apparatus 200 according to the second modification of the first embodiment.
- the image processing apparatus 200 according to the second modification is different from the first embodiment in that an edge enhancement processing unit 260 is further provided.
- the edge enhancement processing unit 260 enhances a line segment along the moving direction in the rear region.
- the edge enhancement processing unit 260 receives the image data that has been smoothed by the smoothing processing unit 250, the moving body region data, and the movement vector.
- the edge enhancement processing unit 260 performs processing for enhancing an edge in a direction perpendicular to the movement vector using a high-pass filter or the like in a rear region in the image data. By emphasizing the edge in the direction of 90 degrees with respect to the moving direction, the line segment along the moving direction is relatively emphasized.
- the edge enhancement processing unit 260 outputs the image data with the edge enhanced as output image data.
- the edge enhancement processing unit 260 is an example of an enhancement processing unit described in the claims.
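For horizontal motion, the edge enhancement of this modification can be sketched with a 1-D high-pass (Laplacian) taken across the rows, i.e. perpendicular to the movement vector, applied only inside the rear region. The kernel and gain are assumptions, not the patent's filter:

```python
import numpy as np

def enhance_along_motion(img, rear_mask, gain=1.0):
    """Sharpen horizontal line segments (those running along the motion)
    by high-pass filtering across the rows, inside the rear region only."""
    img = img.astype(float)
    lap = np.zeros_like(img)
    # 1-D Laplacian in the vertical direction: responds to horizontal edges.
    lap[1:-1, :] = 2 * img[1:-1, :] - img[:-2, :] - img[2:, :]
    out = img + gain * lap * rear_mask
    return np.clip(out, 0, 255)
```

A horizontal bright line is boosted toward the 8-bit maximum, so line segments parallel to the movement direction become relatively emphasized, as described above.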
- FIG. 14 is a diagram illustrating an example of output image data in the second modification.
- a in FIG. 14 is a diagram illustrating a part of the moving body in the input image data before the smoothing process. It is assumed that a moving body having a black-and-white checker pattern is detected, as indicated by a in FIG. 14. In FIG. 14, the region surrounded by a dotted line is the rear region of the moving body.
- b in FIG. 14 is a diagram illustrating a part of the moving body in the input image data after the smoothing process. As shown by b in FIG. 14, the rear region is smoothed.
- c in FIG. 14 is a diagram illustrating a part of the moving body in the input image data after the edge enhancement. As indicated by c in FIG. 14, edges in the direction perpendicular to the movement vector are emphasized in the rear region. As a result, line segments extending parallel to the movement vector are emphasized. This makes it possible to give a relatively smooth impression with respect to the moving direction.
- the image processing apparatus 200 detects the moving object and the movement vector, but the image processing apparatus 200 does not necessarily need to detect the moving object and the movement vector.
- the image processing apparatus 200 according to the third modification is different from the first embodiment in that the apparatus itself does not detect the moving body and the movement vector.
- FIG. 15 is a block diagram illustrating a configuration example of the image processing apparatus 200 according to the third modification of the first embodiment.
- the image processing apparatus 200 according to the third modified example is different from the first embodiment in that a moving body acquisition unit 225 and a movement vector acquisition unit 235 are provided instead of the moving body detection unit 220 and the movement vector detection unit 230.
- the moving body region data and the movement vector are input to the image processing apparatus 200 of the third modified example.
- the user obtains the area of the moving body and the movement vector manually and inputs them to the information processing apparatus 100.
- the moving body acquisition unit 225 acquires the input area of the moving body and supplies it to the rear area detection unit 240.
- the movement vector acquisition unit 235 acquires the input movement vector and supplies it to the rear region detection unit 240 and the smoothing processing unit 250. Note that the user may input only one of the moving object and the moving vector, and the image processing apparatus 200 may detect the other.
- the image processing apparatus 200 performs smoothing in the rear region, but may perform image processing other than the smoothing processing. For example, a process of filling the rear area with a predetermined color may be executed.
- the image processing apparatus 200 according to the fourth modified example is different from the first embodiment in that the rear region is filled with a predetermined color.
- FIG. 16 is a block diagram illustrating a configuration example of the image processing apparatus 200 according to the fourth modification example of the first embodiment.
- the image processing apparatus 200 according to the fourth modification is different from the first embodiment in that a rear region processing unit 255 is provided instead of the smoothing processing unit 250.
- the rear region processing unit 255 obtains the processed pixel value P0′ according to the following Expression 6 or Expression 7 instead of Expression 1 and Expression 2.
- Expression 6: P0′ = 0
- Expression 7: P0′ = 255
- in Expression 6, 0 is the minimum pixel value that can be represented by 8 bits, and in Expression 7, 255 is the maximum pixel value.
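A direct sketch of this fill operation (Expression 6 sets the rear region to 0, Expression 7 to 255); the helper name is illustrative:

```python
import numpy as np

def fill_rear(img, rear_mask, value=0):
    """Replace rear-region pixels with a fixed value:
    0 (Expression 6, 8-bit minimum) or 255 (Expression 7, maximum)."""
    out = img.copy()
    out[rear_mask] = value
    return out
```

Pixels outside the rear region keep their original values; only the masked region is painted with the chosen constant.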
- FIG. 17 is a block diagram illustrating a configuration example of the image processing apparatus 200 according to the second embodiment.
- the image processing apparatus 200 of the second embodiment differs from the first embodiment in that a plurality of different rear regions are detected in one piece of input image data and each rear region is smoothed.
- the moving body detection unit 220 according to the second embodiment generates only the moving body area data M t and M t-1 instead of the moving body area data M 0 to M n ⁇ 1 . This is different from the first embodiment.
- the moving body detection unit 220 outputs the moving body region data M t and M t ⁇ 1 to the moving vector detection unit 230, and outputs the moving body region data M t to the rear region detection unit 240.
- the movement vector detection unit 230 differs from the first embodiment in that it detects only one movement vector V t instead of the movement vectors V 1 to V n−1 .
- the rear region detection unit 240 differs from the first embodiment in that it detects a plurality of rear regions in the moving body from the moving body region data M t and the movement vector V t .
- the rear region detection unit 240 generates rear region data B 0 to B m−1 (m is an integer of 2 or more) indicating these rear regions, and outputs the data to the smoothing processing unit 250.
- the smoothing processing unit 250 differs from the first embodiment in that it executes the smoothing process on the plurality of rear regions in the input image data I t , using the input image data I t , the rear region data B 0 to B m−1 , and the movement vector V t .
- the smoothing processing unit 250 outputs output image data O 0 to O m ⁇ 1 that are the execution results of the smoothing processing.
- the smoothing processing unit 250 may perform different degrees of smoothing for each of the rear regions.
- FIG. 18 is a block diagram illustrating a configuration example of the rear region detection unit 240 according to the second embodiment.
- the rear region detection unit 240 according to the second embodiment is different from the first embodiment in that it further includes a rear position changing unit 244.
- the enlargement unit 241 of the second embodiment is different from the first embodiment in that one enlargement area data Wt is generated instead of n ⁇ 1 enlargement area data.
- the moving body position changing unit 242 according to the second embodiment is different from the first embodiment in that one mask area data W t ′ is generated instead of n ⁇ 1 mask area data.
- the mask processing unit 243 according to the second embodiment is different from the first embodiment in that one rear region data B t is generated instead of n ⁇ 1 rear region data.
- the rear position changing unit 244 changes the position of the rear region data Bt along the movement direction.
- the rear position changing unit 244 generates a plurality of rear region data whose positions are different from each other in the movement direction, and outputs them as rear region data B 0 to B m ⁇ 1 .
- FIG. 19 is a diagram illustrating an example of the rear region data in the second embodiment.
- a is the rear region data B 0 generated by the mask process, as in the first embodiment.
- “1” is set in the parameter representing the amount of movement of the region.
- FIG. 20 is a diagram illustrating an example of the rear portion of the moving object according to the second embodiment.
- a to c are rear portions of the moving body specified by the rear region data B 0 to B 3 exemplified in a to c in FIG.
- the rearmost region of the rear regions in the rear region data B 0 to B 3 is detected as the rear region.
- in b of FIG. 20, a region located in front of the rear region of a in FIG. 20 is detected as the rear region.
- in c of FIG. 20, a region located further forward than the rear region of b in FIG. 20 is detected as the rear region.
- since the rear position changing unit 244 changes the position of the rear region obtained by the mask process along the moving direction, a plurality of different rear regions are detected in one moving body.
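The generation of B 0 to B m−1 by repositioning the masked rear region can be sketched as follows, restricted to horizontal motion; the function name is illustrative:

```python
import numpy as np

def rear_region_sequence(rear0, dx, m):
    """From B_0 (the mask-process result), derive B_0..B_{m-1} by moving
    the region one step further along the horizontal motion dx each time."""
    step = 1 if dx > 0 else -1
    seq, cur = [], rear0
    for _ in range(m):
        seq.append(cur)
        nxt = np.zeros_like(cur)
        if step > 0:
            nxt[:, 1:] = cur[:, :-1]   # shift one pixel in the motion direction
        else:
            nxt[:, :-1] = cur[:, 1:]
        cur = nxt
    return seq
```

Smoothing each frame with a different member of this sequence is what makes the rear part appear to flow backward when the frames are played in order.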
- FIG. 21 is a diagram illustrating an example of output image data according to the second embodiment.
- a to c are image data obtained by smoothing the rear regions exemplified in a to c in FIG. 19 with respect to one input image data.
- the areas to be smoothed are different from each other. Therefore, when these are continuously reproduced, the rear part appears to flow backward. Thereby, the sense of speed of the moving body is emphasized.
- as described above, according to the second embodiment, the image processing apparatus 200 can generate a moving image consisting of a plurality of images in which different rear regions are smoothed, by executing the smoothing process on each of the different rear regions. As a result, a dynamic moving image in which the rear region appears to flow can be obtained.
- the image processing apparatus 200 performs smoothing using a moving average filter, but may perform smoothing using a low-pass filter.
- the image processing apparatus 200 according to the modified example is different from the second embodiment in that smoothing is performed using a low-pass filter.
- the smoothing processing unit 250 of the modification performs smoothing by a low-pass filter having a pass band proportional to the value of a trigonometric function representing a wave traveling in the reverse direction of the movement vector Vt. That is, more specifically, the following processing is performed.
- S (t1, x1, y1) be a trigonometric function representing a wave traveling in the opposite direction of the movement vector Vt.
- the function S is a function in which the wave value at the time t1 and the position (x1, y1) is S (t1, x1, y1).
- the function S is represented by the following equation, for example.
- K is a proportionality constant
- “·” in the formula represents an inner product.
- Q is the coordinates of (x1, y1) in the output image data.
- the smoothing processing unit 250 may use a function other than the trigonometric function exemplified in Expression 5 as long as it is a periodic function representing the distribution of pixel values of pixels in the rear region in the direction along the moving direction.
- the smoothing processing unit 250 generates the output image data O t (t is an integer from 0 to m−1) by applying, to each pixel position (x2, y2) of the input image data I t , a low-pass filter having a pass band proportional to S(t, x2, y2).
- FIG. 22 is a diagram illustrating an example of the locus of the function S in the modified example of the second embodiment.
- the horizontal axis represents the coordinates of the pixel along the movement vector
- the vertical axis represents the locus of the function S at a certain time.
- the wave represented by the function S travels in the direction opposite to the movement vector when the time t changes from 0 to m ⁇ 1. Furthermore, since the cycle is m, if the repetition is performed by returning from m ⁇ 1 to 0, the wave always travels in the direction opposite to the movement vector without interruption. Then, the image processing apparatus 200 applies a low-pass filter having a pass band having a width corresponding to the amplitude of the function S to the pixels of the input image data It. Specifically, when the amplitude of the function S is large, the image processing apparatus 200 applies a low-pass filter having a wide pass band to the pixels of the input image data It (in other words, it does not smooth much).
- conversely, when the amplitude of the function S is small, a low-pass filter having a narrow pass band is applied to the pixels of the input image data It (in other words, it is smoothed considerably).
- the output image data O 0 to O m ⁇ 1 subjected to such a low-pass filter is continuously reproduced, a pattern smoothed to a degree corresponding to the amplitude of the function S flows in the backward direction. As a result, the sense of speed is emphasized.
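The exact formula for S is not reproduced in this excerpt, so the sketch below assumes one plausible form consistent with the description: a sinusoid along the movement direction whose phase advances with the frame index t, so that the crests travel opposite to the movement vector, with period m in t. The constant k stands in for the proportionality constant K:

```python
import numpy as np

def wave_s(t, x, y, v, m, k=1.0):
    """S(t, x, y): value of a wave travelling opposite to the movement
    vector v = (vx, vy); 'v[0]*x + v[1]*y' is the inner product V·Q."""
    phase = k * (v[0] * x + v[1] * y) + 2.0 * np.pi * t / m
    return np.sin(phase)
```

Holding the phase constant while t increases forces V·Q to decrease, so the wave fronts move against v; and because the period in t is m, cycling t from m−1 back to 0 continues the motion without a seam, as the text notes.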
- FIG. 23 is a block diagram illustrating a configuration example of the image processing apparatus 200 according to the third embodiment.
- the image processing apparatus 200 detects the moving object on the assumption that the photographer has taken the image without moving the imaging device when photographing the input image data.
- a photographer may take a picture while moving the imaging device itself.
- the photographer may move (pan) the imaging device in the movement direction following the movement of the moving body.
- in that case, moving image data may be captured in which stationary objects appear to move on the screen while the moving body itself hardly moves. If the moving object detection method exemplified in the first embodiment is applied to such moving image data as it is, an object that is not a moving object may be erroneously detected as a moving object.
- therefore, the image processing apparatus 200 of the third embodiment differs from the first embodiment in that it corrects the positional deviation between pieces of input image data caused by the movement of the imaging device, and then detects the moving body. Specifically, as shown in FIG. 23, the image processing apparatus 200 of the third embodiment differs from the first embodiment in that it further includes an alignment processing unit 210.
- the alignment processing unit 210 executes an alignment process between the input image data. Specifically, it performs two processes: a process of calculating the “position correction parameters C 1 to C n−1 ” for aligning the input image data, and a process of adding an offset to the coordinate system of the input image data I 1 to I n−1 using the “position correction parameters C 1 to C n−1 ”.
- FIG. 25 is a block diagram illustrating a configuration example of the alignment processing unit 210 according to the third embodiment.
- the alignment processing unit 210 includes a position correction parameter calculation unit 211 and an input image position correction unit 212.
- the position correction parameter calculation unit 211 receives the input image data I 0 and the input image data I 1 to I n ⁇ 1 . Then, “position correction parameters C 1 to C n ⁇ 1 ” for aligning the input image data are calculated.
- the “position correction parameter C t ” is data indicating the positional relationship between the input image data I t and the input image data I 0 . That is, when only the offset amount represented by the position correction parameter C t is added to the two-dimensional coordinate system of the input image data I t , a subject projected at a certain position (coordinate value) in this “offset coordinate system” of the input image data I t is projected at the same position (coordinate value) of the input image data I 0 .
- the “position correction parameter C t ” is data composed of two scalar values, an X-direction offset value (scalar value) and a Y-direction offset value (scalar value).
- such a “position correction parameter C t ” can be obtained by a matching calculation between the two images, the input image data I t and the input image data I 0 .
- the matching calculation is a conventionally known technique and will not be described in detail.
- that is, the position correction parameter calculation unit 211 performs matching processing between the images of the input image data I 0 and the input image data I t to obtain the shift amount between the two images. This shift amount is then output as the “position correction parameter C t ”.
- a phase only correlation method is used.
- the position correction parameter calculation unit 211 performs a Fourier transform on the images to be compared, obtains a cross power spectrum of both images from the result of the Fourier transform, and performs an inverse Fourier transform on the cross power spectrum.
- as a result, a function of x and y that has a steep peak at certain values of x and y is obtained. If the position of the target image is shifted in the x and y directions by the values of x and y at that peak, the target image comes into good agreement with the reference image; this shift amount is the position correction parameter.
- This phase-only correlation method is described in B. Srinivasa Reddy and B. N. Chatterji, An FFT-Based Technique for Translation, Rotation, and Scale-Invariant Image Registration, IEEE TRANSACTIONS ON IMAGE PROCESSING 1996. As described above, this method is often used in the alignment process.
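The translation-only case of phase-only correlation can be sketched as follows (the cited paper additionally handles rotation and scale; the peak-unwrapping convention here is an implementation choice):

```python
import numpy as np

def phase_correlation(ref, target):
    """Return the (dy, dx) circular shift that maps `ref` onto `target`,
    found as the peak of the inverse FFT of the normalised cross-power
    spectrum of the two images."""
    F, G = np.fft.fft2(ref), np.fft.fft2(target)
    cross = np.conj(F) * G
    cross /= np.abs(cross) + 1e-12          # keep phase only
    corr = np.real(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = corr.shape                        # unwrap to signed shifts
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

For two images that differ by a pure shift, the inverse transform is a sharp delta at the shift, which is exactly the steep peak in x and y described above.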
- the input image position correction unit 212 performs processing to add an offset to the coordinate system of the input image data I 1 to I n-1 to obtain an image on a new coordinate system.
- as a result, for each of the input image data I 0 to I n−1 , the same subject is projected at the same position (coordinate value). That is, the data can be regarded as images taken by a photographer who did not move the imaging device. In other words, the image data shown in FIG. 27 can be obtained by adding an offset to the input images shown in FIG. 26.
- FIG. 26 is a diagram illustrating an example of input image data according to the third embodiment.
- a is the first input image data I 0 in time series
- b to d in FIG. 26 are the second to fourth input image data I 1 to I 3 .
- the automobile and the background object are photographed, and the background object is moved in the horizontal direction by the pan operation of the photographer.
- the position of the automobile itself, however, may not change much between frames.
- FIG. 27 is a diagram illustrating an example of image data obtained by correcting the position in the third embodiment by using an offset.
- a is the first input image data I 0 in the time series, and is used as a reference image.
- b represents the second input image data I 1 whose position is offset-corrected using the position correction parameter C 1 so that a common area with the reference image overlaps.
- c and d respectively represent the third and fourth input image data I 2 and I 3 , whose positions are offset-corrected using the position correction parameters C 2 and C 3 so that the common area with the reference image overlaps.
- the image processing apparatus 200 can reliably detect the moving object by correcting the offset of the position.
- by performing the same processing as in the first embodiment of the present invention (from 220 onward in FIG. 23) on the input image data I 0 to I n−1 in the offset coordinate system, an “image with a dynamic feeling in which the sense of speed of the moving body is emphasized” is obtained, as in the first embodiment. Furthermore, in the third embodiment, by adding processing that is not in the first embodiment of the present invention, a further “dynamic image in which the sense of speed is emphasized” can be obtained. This will be described below.
- in the third embodiment, an alignment processing unit 210 is added to the first embodiment (FIG. 2) of the present invention, and the data input to the smoothing processing unit 250 is different. Specifically, the information of the “position correction parameters C 1 to C n−1 ” obtained by the alignment processing unit 210 and the moving body region data M 0 to M n−1 extracted by the moving body detection unit 220 are input to the smoothing processing unit 250.
- FIG. 28 is a block diagram illustrating a configuration example of the smoothing processing unit 250 according to the third embodiment.
- the smoothing processing unit 250 includes a rear region smoothing processing unit 251 and a background region smoothing processing unit 252.
- the rear region smoothing processing unit 251 performs a smoothing process on the rear region according to the moving speed.
- the rear region smoothing processing unit 251 supplies the input image data subjected to the smoothing processing to the background region smoothing processing unit 252.
- the background region smoothing processing unit 252 executes a smoothing process on the background region along the correction direction according to the correction amount indicated by the position correction parameter. Specifically, the background region smoothing processing unit 252 extracts the background region other than the moving body region based on the moving body region data in the input image data in which the rear region has been smoothed. Then, the background region smoothing processing unit 252 calculates the size (correction amount) of the correction vector from the position correction parameters. More specifically, the size of the pan between two consecutively taken pieces of input image data is determined from the difference between the position correction parameters C t and C t−1 of the input image data I t , and that value is used as the correction amount. Note that C 0 is 0.
- the background region smoothing processing unit 252 executes the smoothing process on the background region along the direction of the correction vector, using the correction amount multiplied by a coefficient as the smoothing degree K.
- the value of this coefficient is desirably smaller (for example, “1/10”) than the coefficient (for example, “1”) set when the rear region is smoothed.
- the background region smoothing processing unit 252 outputs the image data that has been subjected to the smoothing process as output image data.
- FIG. 29 is a diagram illustrating an example of output image data according to the third embodiment.
- a in FIG. 29 is an example of the input image data I 1 ′ that is a result of executing the smoothing process along the movement direction for the rear region.
- B in FIG. 29 is an example of the output image data O 1 that is a result of executing the smoothing process on the background of the moving body along the correction direction (pan direction).
- the rear region of the moving object is smoothed along the moving direction, and the background is smoothed along the pan direction.
- as described above, according to the third embodiment, the image processing apparatus 200 aligns the position of the target image with the position of the reference image so that their common area overlaps, and can detect the region of the moving body from the aligned images. Thereby, the region of the moving body is reliably detected even when the imaging device is moved. Furthermore, the pan direction of the camera can be determined from the position correction parameter, and a so-called panning effect can be provided by slightly smoothing the “background region other than the moving body region” along that direction. With this panning effect, a further “dynamic image with a sense of speed” can be obtained.
- the processing procedures described in the above embodiments may be regarded as a method having this series of procedures, as a program for causing a computer to execute this series of procedures, or as a recording medium storing the program.
- a recording medium for example, a CD (Compact Disc), an MD (MiniDisc), a DVD (Digital Versatile Disc), a memory card, a Blu-ray Disc (Blu-ray Disc (registered trademark)), or the like can be used.
- this technique can also take the following structures.
- a moving body acquisition unit that acquires a region of a moving body in a target image that is at least one of a plurality of images that are continuous in time series;
- a moving direction acquisition unit for acquiring a moving direction of the moving body;
- a rear region detection unit that detects a rear region of the region of the moving body as a rear region with respect to the moving direction;
- An image processing apparatus comprising: a rear region processing unit that executes predetermined image processing in the rear region.
- the target image includes a plurality of blocks having a predetermined shape
- the mobile object acquisition unit obtains a motion amount of the block for each block using a block matching algorithm, and detects an area composed of blocks in which the motion amount exceeds a predetermined threshold as the mobile object region ( 2)
- the moving body acquisition unit further acquires a region of the moving body in a reference image that is an image immediately before the target image among the plurality of images.
- the moving direction acquisition unit detects, as the moving direction, a direction from a specific coordinate in the area of the moving body in the reference image to a specific coordinate in the area of the moving body in the target image.
- the image processing apparatus according to any one of (1) to (3).
- the rear region detection unit detects, as the rear region, a region surrounded by the contour line of the rear portion whose position is changed in the movement direction in the target image and the contour line before the change.
- the image processing apparatus according to any one of (1) to (4).
- the rear region detection unit detects, as the rear region, a region generated by masking the region of the moving body before the change, using the region of the moving body whose position has been changed in the moving direction in the target image as a mask region.
- the image processing apparatus according to any one of (1) to (5).
- a moving speed detection unit for detecting a moving speed of the moving body is further provided;
- the rear region detection unit further includes an enlargement unit that enlarges the region of the moving body in the moving direction in the target image,
- the mask processing unit performs the masking on the enlarged region before the shift, using as the mask region a region obtained by shifting, in the moving direction, the position of the enlarged region, which is the region of the moving body enlarged by the enlargement unit.
- the image processing apparatus according to any one of (1) to (8), wherein the predetermined image processing is a smoothing process along the moving direction.
- the image processing apparatus according to (9), wherein the rear region processing unit executes the smoothing processing on the rear region along the moving direction at a degree corresponding to the moving speed.
- the image processing apparatus according to (9) or (10), wherein the enlargement unit enlarges the region of the moving body in the moving direction according to the moving speed.
- the rear region processing unit executes the smoothing processing using a low-pass filter having a passband whose width corresponds to the amplitude of a periodic function representing the distribution of pixel values in the rear region along the moving direction,
- the image processing apparatus according to any one of (9) to (11).
- the target image is any one of the plurality of images
- the rear region detection unit detects a plurality of the rear regions in a rear portion of the moving body
- the image processing apparatus according to any one of (1) to (12), wherein the rear region processing unit generates a plurality of images by performing the predetermined image processing on the plurality of rear regions.
- the image processing apparatus further includes an alignment processing unit that aligns the position of the target image with the position of the reference image, which is the image immediately preceding the target image, so that a common region overlaps between the reference image and the target image,
- the image processing apparatus according to any one of (1) to (13), wherein the moving body detection unit detects the moving body in the aligned target image.
- the smoothing processing unit further executes, on regions of the target image other than the moving body, smoothing processing along the moving direction at a degree different from that for the rear region.
DESCRIPTION OF SYMBOLS
100 Information processing apparatus
110 Imaging unit
120 Control unit
130 Display unit
140 Input/output interface
150 Moving image data storage unit
160 Bus
200 Image processing apparatus
210 Alignment processing unit
211 Position correction parameter calculation unit
212 Input image position correction unit
220 Moving body detection unit
221 Pixel selection unit
222 Background reference value calculation unit
223 Moving body region extraction unit
225 Moving body acquisition unit
230 Movement vector detection unit
231 Centroid coordinate calculation unit
232 Movement vector calculation unit
235 Movement vector acquisition unit
240 Rear region detection unit
241 Enlargement unit
242 Moving body position changing unit
243 Mask processing unit
244 Rear position changing unit
250 Smoothing processing unit
251 Rear region smoothing processing unit
252 Background region smoothing processing unit
255 Rear region processing unit
260 Edge enhancement processing unit
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Analysis (AREA)
- Studio Devices (AREA)
- Image Processing (AREA)
- Picture Signal Circuits (AREA)
Abstract
Description
1. First Embodiment (image processing: example of executing smoothing processing on the rear region)
2. Second Embodiment (image processing: example of executing smoothing processing on a plurality of rear regions within one moving body)
3. Third Embodiment (image processing: example of executing smoothing processing on the rear region in an aligned image)
[Configuration Example of the Information Processing Apparatus]
FIG. 1 is a block diagram showing a configuration example of the information processing apparatus 100 according to the first embodiment. The information processing apparatus 100 executes various kinds of information processing, such as capturing moving image data and applying image processing to that data. The information processing apparatus 100 includes an imaging unit 110, a control unit 120, a display unit 130, an input/output interface 140, a moving image data storage unit 150, a bus 160, and an image processing apparatus 200.
FIG. 2 is a block diagram showing a configuration example of the image processing apparatus 200 according to the first embodiment. The image processing apparatus 200 includes a moving body detection unit 220, a movement vector detection unit 230, a rear region detection unit 240, and a smoothing processing unit 250.
FIG. 6 is a flowchart showing an example of the operation of the image processing apparatus 200 according to the first embodiment. The operation starts, for example, when input image data I0 to In-1 are input to the image processing apparatus 200. The image processing apparatus 200 detects moving bodies in the input image data I0 to In-1 and generates moving body region data M0 to Mn-1 (step S910). It then detects movement vectors V1 to Vn-1 of the moving body from the input image data I0 to In-1 and the moving body region data M0 to Mn-1 (step S920).
In the first embodiment, the image processing apparatus 200 treats, for each pixel, the most frequently appearing pixel value as the background, treats the remaining pixels as the moving body, and obtains the movement vector from the temporal change of the centroid of the moving body. However, the image processing apparatus 200 may instead obtain the movement vector by block matching. The image processing apparatus 200 of the first modification differs from the first embodiment in that it obtains the movement vector by block matching.
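As a rough sketch of this approach (not the patent's actual implementation; all function names are hypothetical), the per-pixel most frequent value across the frames can serve as the background reference value, and the movement vector can be taken between centroids of the resulting moving-body masks:

```python
import numpy as np

def moving_body_mask(frames, idx):
    """Per pixel, treat the most frequent value across all frames as the
    background; pixels of frame `idx` that differ from it are the moving body."""
    stack = np.stack(frames).astype(int)              # (n, h, w)
    mode = np.apply_along_axis(
        lambda v: np.bincount(v, minlength=256).argmax(), 0, stack)
    return frames[idx] != mode                        # True where moving

def movement_vector(prev_mask, cur_mask):
    """Vector from the centroid of the previous moving-body region to the
    centroid of the current one."""
    y0, x0 = np.argwhere(prev_mask).mean(axis=0)
    y1, x1 = np.argwhere(cur_mask).mean(axis=0)
    return x1 - x0, y1 - y0                           # (vx, vy)
```

For a small object that appears at each position in only a minority of frames, the per-pixel mode recovers the background exactly, so the mask and centroid shift fall out directly.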
In the first embodiment, the image processing apparatus 200 smooths the rear region along the moving direction, but it may additionally perform processing that enhances line segments along the moving direction. The image processing apparatus 200 of the second modification differs from the first embodiment in that it enhances line segments along the moving direction within the rear region.
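One toy, axis-aligned way to realize this kind of enhancement (an assumption, not the patent's method) is unsharp masking applied only across the axis perpendicular to the motion: blurring vertically and adding back the difference boosts structures that run horizontally, i.e. line segments along a horizontal moving direction, while leaving vertical lines untouched:

```python
import numpy as np

def enhance_lines_along_x(image, gain=1.0):
    """Unsharp masking across the vertical axis only: structures that vary
    vertically (horizontal lines) are amplified; structures constant along
    the vertical axis (vertical lines) pass through unchanged."""
    img = image.astype(float)
    # 3-tap vertical box blur (wraparound at the border keeps the toy short)
    blurred = (np.roll(img, 1, axis=0) + img + np.roll(img, -1, axis=0)) / 3
    return img + gain * (img - blurred)
```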
In the first embodiment, the image processing apparatus 200 detects the moving body and the movement vector, but it does not necessarily have to detect them itself. The image processing apparatus 200 of the third modification differs from the first embodiment in that the apparatus itself does not detect the moving body and the movement vector.
In the first embodiment, the image processing apparatus 200 performs smoothing in the rear region, but it may execute image processing other than smoothing. For example, it may execute processing that fills the rear region with a predetermined color. The image processing apparatus 200 of the fourth modification differs from the first embodiment in that it fills the rear region with a predetermined color.
P0' = 0 ... Expression 6
P0' = 255 ... Expression 7
In Expression 6, 0 is the minimum pixel value representable in 8 bits, and in Expression 7, 255 is the maximum pixel value.
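A minimal sketch of this fill operation (function name hypothetical), writing either the 8-bit minimum of Expression 6 or the maximum of Expression 7 into the detected rear region:

```python
import numpy as np

def fill_rear_region(image, rear_mask, use_max=False):
    """Fill the rear region with a predetermined color: 0, the 8-bit
    minimum (Expression 6), or 255, the 8-bit maximum (Expression 7)."""
    out = image.copy()
    out[rear_mask] = 255 if use_max else 0
    return out
```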
[Configuration Example of the Image Processing Apparatus]
FIG. 17 is a block diagram showing a configuration example of the image processing apparatus 200 according to the second embodiment. The image processing apparatus 200 of the second embodiment differs from the first embodiment in that it detects a plurality of different rear regions in one piece of input image data and smooths each of those rear regions.
In the second embodiment, the image processing apparatus 200 performs smoothing using a moving average filter, but it may instead perform smoothing using a low-pass filter. The image processing apparatus 200 of this modification differs from the second embodiment in that it performs smoothing using a low-pass filter.
[Configuration Example of the Image Processing Apparatus]
FIG. 23 is a block diagram showing a configuration example of the image processing apparatus 200 according to the third embodiment. In the first embodiment, the image processing apparatus 200 detects the moving body on the assumption that the photographer captured the input image data without moving the imaging device. In practice, however, the photographer may shoot while moving the imaging device itself. For example, as illustrated in FIG. 24, the photographer may move (pan) the imaging device in the moving direction to follow the moving body. Depending on the speed at which the imaging device is moved, moving image data may be captured in which the moving body appears stationary within the frame while stationary objects appear to move. If the moving body detection method of the first embodiment is applied as-is to such moving image data, objects that are not moving bodies may be erroneously detected as moving bodies.
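The alignment idea can be sketched as a toy stand-in for the position correction parameter calculation (names hypothetical): exhaustively search the integer translation that best re-overlaps the common area, then shift the target frame back before moving-body detection. Wraparound via `np.roll` keeps the sketch short; a real implementation would handle borders explicitly.

```python
import numpy as np

def estimate_global_shift(reference, target, max_shift=3):
    """Search integer shifts within +/-max_shift for the one that minimizes
    the mean absolute difference against the reference frame."""
    best, best_err = (0, 0), None
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = np.abs(reference - np.roll(target, (dy, dx), axis=(0, 1))).mean()
            if best_err is None or err < best_err:
                best, best_err = (dy, dx), err
    return best

def align_to_reference(target, shift):
    """Apply the estimated correction so that moving-body detection can
    run on the aligned frame."""
    return np.roll(target, shift, axis=(0, 1))
```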
(1) An image processing apparatus comprising:
a moving body acquisition unit that acquires a region of a moving body in a target image that is at least one image among a plurality of images consecutive in time series;
a moving direction acquisition unit that acquires a moving direction of the moving body;
a rear region detection unit that detects, as a rear region, a region of a rear portion of the region of the moving body with respect to the moving direction; and
a rear region processing unit that executes predetermined image processing on the rear region.
(2) The moving body acquisition unit detects the region of the moving body in the target image.
The image processing apparatus according to (1).
(3) The target image includes a plurality of blocks of a predetermined shape, and
the moving body acquisition unit obtains a motion amount of each block using a block matching algorithm and detects, as the region of the moving body, a region made up of blocks whose motion amount exceeds a predetermined threshold.
The image processing apparatus according to (2).
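This block-matching criterion can be sketched as follows (a hedged toy, not the patent's implementation; sum-of-absolute-differences matching is one common choice): for each block of the current frame, search the previous frame within a small window, record the best displacement magnitude as the block's motion amount, and threshold it.

```python
import numpy as np

def block_motion_amounts(prev, cur, block=4, search=2):
    """Per-block SAD search within +/-`search` pixels; the displacement
    magnitude of the best match is the block's motion amount. Ties prefer
    the smallest displacement so static areas report zero motion."""
    h, w = cur.shape
    cands = sorted(((dy, dx) for dy in range(-search, search + 1)
                    for dx in range(-search, search + 1)),
                   key=lambda d: abs(d[0]) + abs(d[1]))
    mags = np.zeros((h // block, w // block))
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            blk = cur[y:y + block, x:x + block].astype(int)
            best, best_err = (0, 0), None
            for dy, dx in cands:
                yy, xx = y + dy, x + dx
                if yy < 0 or xx < 0 or yy + block > h or xx + block > w:
                    continue
                err = np.abs(blk - prev[yy:yy + block, xx:xx + block]).sum()
                if best_err is None or err < best_err:
                    best, best_err = (dy, dx), err
            mags[by, bx] = np.hypot(*best)
    return mags
```

Thresholding `mags > threshold` then yields the blocks that form the moving-body region.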
(4) The moving body acquisition unit further acquires the region of the moving body in a reference image that is the image immediately preceding the target image among the plurality of images, and
the moving direction acquisition unit detects, as the moving direction, the direction from a specific coordinate within the region of the moving body in the reference image to a specific coordinate within the region of the moving body in the target image.
The image processing apparatus according to any one of (1) to (3).
(5) The rear region detection unit detects, as the rear region, a region enclosed by the contour line of the rear portion whose position has been shifted in the moving direction within the target image and the contour line before the shift.
The image processing apparatus according to any one of (1) to (4).
(6) The rear region detection unit detects, as the rear region, a region generated by masking the region of the moving body before the shift, using as a mask region the region of the moving body whose position has been shifted in the moving direction within the target image.
The image processing apparatus according to any one of (1) to (5).
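The mask-based detection in item (6) above can be sketched compactly (integer shifts only; names hypothetical): shift the moving-body mask along the moving direction, and whatever the shifted copy fails to cover of the original mask is the rear region.

```python
import numpy as np

def rear_region(body_mask, vx, vy):
    """Shift the moving-body mask by (vx, vy) and mask the un-shifted
    region with it; the uncovered remainder is the rear region."""
    h, w = body_mask.shape
    shifted = np.zeros_like(body_mask)
    shifted[max(vy, 0):h + min(vy, 0), max(vx, 0):w + min(vx, 0)] = \
        body_mask[max(-vy, 0):h + min(-vy, 0), max(-vx, 0):w + min(-vx, 0)]
    return body_mask & ~shifted
```

For example, a square moving right leaves a strip along its trailing (left) edge whose width equals the shift amount, consistent with item (7)'s speed-dependent shift.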
(7) The rear region detection unit includes:
a moving speed detection unit that detects a moving speed of the moving body; and
a mask processing unit that performs the masking using, as the mask region, the region of the moving body whose position has been shifted by a shift amount corresponding to the moving speed.
The image processing apparatus according to (6).
(8) The rear region detection unit further includes an enlargement unit that enlarges the region of the moving body in the moving direction in the target image, and
the mask processing unit performs the masking on the enlarged region before the shift, using as the mask region a region obtained by shifting, in the moving direction, the position of the enlarged region, which is the region of the moving body enlarged by the enlargement unit.
The image processing apparatus according to (7).
(9) The predetermined image processing is smoothing processing along the moving direction.
The image processing apparatus according to any one of (1) to (8).
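A hedged, axis-aligned simplification of this directional smoothing (names hypothetical): apply a 1-D box filter along the dominant motion axis, only inside the given region, with a tap count that grows with the moving speed as in item (10), so faster motion produces a stronger, streak-like blur.

```python
import numpy as np

def smooth_along_motion(image, region_mask, vx, vy):
    """Box-filter along the dominant motion axis inside `region_mask`;
    the (odd) tap count grows with speed."""
    speed = int(round(np.hypot(vx, vy)))
    n = 2 * speed + 1                          # odd tap count, speed-dependent
    axis = 1 if abs(vx) >= abs(vy) else 0
    pad = [(0, 0), (0, 0)]
    pad[axis] = (n // 2, n // 2)
    padded = np.pad(image.astype(float), pad, mode='edge')
    kernel = np.ones(n) / n
    blurred = np.apply_along_axis(lambda r: np.convolve(r, kernel, 'valid'),
                                  axis, padded)
    out = image.astype(float)
    out[region_mask] = blurred[region_mask]
    return out
```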
(10) The rear region processing unit executes the smoothing processing on the rear region along the moving direction at a degree corresponding to the moving speed.
The image processing apparatus according to (9).
(11) The enlargement unit enlarges the region of the moving body in the moving direction according to the moving speed.
The image processing apparatus according to (9) or (10).
(12) The rear region processing unit executes the smoothing processing using a low-pass filter having a passband whose width corresponds to the amplitude of a periodic function representing the distribution of pixel values in the rear region along the moving direction.
The image processing apparatus according to any one of (9) to (11).
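A sketch of the frequency-domain side of item (12): an ideal low-pass over a 1-D pixel-value profile taken along the moving direction, plus a toy rule linking the passband width to the profile's oscillation amplitude. The particular amplitude-to-passband mapping below is an assumption for illustration, not the patent's formula.

```python
import numpy as np

def lowpass_profile(profile, keep_bins):
    """Ideal low-pass: keep DC plus the lowest `keep_bins` frequency bins
    of the profile's real FFT, zero the rest."""
    spec = np.fft.rfft(profile)
    spec[keep_bins + 1:] = 0
    return np.fft.irfft(spec, n=len(profile))

def passband_from_amplitude(profile, base_bins=1, scale=4):
    """Toy mapping (an assumption): larger oscillation amplitude keeps
    fewer bins, i.e. yields stronger smoothing."""
    amplitude = (profile.max() - profile.min()) / 2
    return base_bins + int(scale / (1 + amplitude / 64))
```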
(13) The target image is any one of the plurality of images,
the rear region detection unit detects a plurality of rear regions in the rear portion of the moving body, and
the rear region processing unit generates a plurality of images by executing the predetermined image processing on the plurality of rear regions.
The image processing apparatus according to any one of (1) to (12).
(14) The image processing apparatus further includes an alignment processing unit that aligns the position of the target image with the position of a reference image, which is the image immediately preceding the target image, so that a common region overlaps between the reference image and the target image, and
the moving body detection unit detects the moving body in the aligned target image.
The image processing apparatus according to any one of (1) to (13).
(15) The smoothing processing unit further executes, on regions of the target image other than the moving body, smoothing processing along the moving direction at a degree different from that for the rear region.
The image processing apparatus according to any one of (1) to (14).
(16) The image processing apparatus further includes an enhancement processing unit that enhances, among the line segments included in the rear region, line segments along the moving direction.
The image processing apparatus according to any one of (1) to (15).
(17) The predetermined image processing is processing that fills the rear region with a predetermined color.
The image processing apparatus according to any one of (1) to (16).
(18) A control method of an image processing apparatus, including:
a moving body acquisition step of acquiring a region of a moving body in a target image that is at least one image among a plurality of images consecutive in time series;
a moving direction acquisition step of acquiring a moving direction of the moving body;
a rear region detection step of detecting, as a rear region, a region of a rear portion of the region of the moving body with respect to the moving direction; and
a rear region processing step of executing predetermined image processing on the rear region.
(19) A program for causing a computer to execute:
a moving body acquisition step of acquiring a region of a moving body in a target image that is at least one image among a plurality of images consecutive in time series;
a moving direction acquisition step of acquiring a moving direction of the moving body;
a rear region detection step of detecting, as a rear region, a region of a rear portion of the region of the moving body with respect to the moving direction; and
a rear region processing step of executing predetermined image processing on the rear region.
110 Imaging unit
120 Control unit
130 Display unit
140 Input/output interface
150 Moving image data storage unit
160 Bus
200 Image processing apparatus
210 Alignment processing unit
211 Position correction parameter calculation unit
212 Input image position correction unit
220 Moving body detection unit
221 Pixel selection unit
222 Background reference value calculation unit
223 Moving body region extraction unit
225 Moving body acquisition unit
230 Movement vector detection unit
231 Centroid coordinate calculation unit
232 Movement vector calculation unit
235 Movement vector acquisition unit
240 Rear region detection unit
241 Enlargement unit
242 Moving body position changing unit
243 Mask processing unit
244 Rear position changing unit
250 Smoothing processing unit
251 Rear region smoothing processing unit
252 Background region smoothing processing unit
255 Rear region processing unit
260 Edge enhancement processing unit
Claims (19)
- An image processing apparatus including: a moving body acquisition unit that acquires a region of a moving body in a target image that is at least one image among a plurality of images consecutive in time series; a moving direction acquisition unit that acquires a moving direction of the moving body; a rear region detection unit that detects, as a rear region, a region of a rear portion of the region of the moving body with respect to the moving direction; and a rear region processing unit that executes predetermined image processing on the rear region.
- The image processing apparatus according to claim 1, wherein the moving body acquisition unit detects the region of the moving body in the target image.
- The image processing apparatus according to claim 2, wherein the target image includes a plurality of blocks of a predetermined shape, and the moving body acquisition unit obtains a motion amount of each block using a block matching algorithm and detects, as the region of the moving body, a region made up of blocks whose motion amount exceeds a predetermined threshold.
- The image processing apparatus according to claim 1, wherein the moving body acquisition unit further acquires the region of the moving body in a reference image that is the image immediately preceding the target image among the plurality of images, and the moving direction acquisition unit detects, as the moving direction, the direction from a specific coordinate within the region of the moving body in the reference image to a specific coordinate within the region of the moving body in the target image.
- The image processing apparatus according to claim 1, wherein the rear region detection unit detects, as the rear region, a region enclosed by the contour line of the rear portion whose position has been shifted in the moving direction within the target image and the contour line before the shift.
- The image processing apparatus according to claim 1, wherein the rear region detection unit detects, as the rear region, a region generated by masking the region of the moving body before the shift, using as a mask region the region of the moving body whose position has been shifted in the moving direction within the target image.
- The image processing apparatus according to claim 6, wherein the rear region detection unit includes: a moving speed detection unit that detects a moving speed of the moving body; and a mask processing unit that performs the masking using, as the mask region, the region of the moving body whose position has been shifted by a shift amount corresponding to the moving speed.
- The image processing apparatus according to claim 7, wherein the rear region detection unit further includes an enlargement unit that enlarges the region of the moving body in the moving direction in the target image, and the mask processing unit performs the masking on the enlarged region before the shift, using as the mask region a region obtained by shifting, in the moving direction, the position of the enlarged region, which is the region of the moving body enlarged by the enlargement unit.
- The image processing apparatus according to claim 8, wherein the predetermined image processing is smoothing processing along the moving direction.
- The image processing apparatus according to claim 9, wherein the rear region processing unit executes the smoothing processing on the rear region along the moving direction at a degree corresponding to the moving speed.
- The image processing apparatus according to claim 9, wherein the enlargement unit enlarges the region of the moving body in the moving direction according to the moving speed.
- The image processing apparatus according to claim 9, wherein the rear region processing unit executes the smoothing processing using a low-pass filter having a passband whose width corresponds to the amplitude of a periodic function representing the distribution of pixel values in the rear region along the moving direction.
- The image processing apparatus according to claim 1, wherein the target image is any one of the plurality of images, the rear region detection unit detects a plurality of rear regions in the rear portion of the moving body, and the rear region processing unit generates a plurality of images by executing the predetermined image processing on the plurality of rear regions.
- The image processing apparatus according to claim 1, further including an alignment processing unit that aligns the position of the target image with the position of a reference image, which is the image immediately preceding the target image, so that a common region overlaps between the reference image and the target image, wherein the moving body detection unit detects the moving body in the aligned target image.
- The image processing apparatus according to claim 1, wherein the smoothing processing unit further executes, on regions of the target image other than the moving body, smoothing processing along the moving direction at a degree different from that for the rear region.
- The image processing apparatus according to claim 1, further including an enhancement processing unit that enhances, among the line segments included in the rear region, line segments along the moving direction.
- The image processing apparatus according to claim 1, wherein the predetermined image processing is processing that fills the rear region with a predetermined color.
- A control method of an image processing apparatus, including: a moving body acquisition step of acquiring a region of a moving body in a target image that is at least one image among a plurality of images consecutive in time series; a moving direction acquisition step of acquiring a moving direction of the moving body; a rear region detection step of detecting, as a rear region, a region of a rear portion of the region of the moving body with respect to the moving direction; and a rear region processing step of executing predetermined image processing on the rear region.
- A program for causing a computer to execute: a moving body acquisition step of acquiring a region of a moving body in a target image that is at least one image among a plurality of images consecutive in time series; a moving direction acquisition step of acquiring a moving direction of the moving body; a rear region detection step of detecting, as a rear region, a region of a rear portion of the region of the moving body with respect to the moving direction; and a rear region processing step of executing predetermined image processing on the rear region.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014509075A JP5983737B2 (ja) | 2012-04-03 | 2013-02-22 | Image processing device, control method of image processing device, and program |
BR112014024049A BR112014024049A8 (pt) | 2012-04-03 | 2013-02-22 | Image processing device, control method of an image processing device, and program |
US14/381,874 US9361703B2 (en) | 2012-04-03 | 2013-02-22 | Image processing device, control method of image processing device and program |
CN201380016685.2A CN104205803A (zh) | 2012-04-03 | 2013-02-22 | Image processing device, control method of image processing device, and program |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012084964 | 2012-04-03 | ||
JP2012-084964 | 2012-04-03 | ||
JP2013000990 | 2013-01-08 | ||
JP2013-000990 | 2013-01-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013150829A1 true WO2013150829A1 (ja) | 2013-10-10 |
Family
ID=49300329
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/054557 WO2013150829A1 (ja) | 2013-02-22 | Image processing device, control method of image processing device, and program |
Country Status (5)
Country | Link |
---|---|
US (1) | US9361703B2 (ja) |
JP (1) | JP5983737B2 (ja) |
CN (1) | CN104205803A (ja) |
BR (1) | BR112014024049A8 (ja) |
WO (1) | WO2013150829A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015088771A (ja) * | 2013-10-28 | 2015-05-07 | Nikon Corp | Image processing device, imaging device, and image processing program |
JP2016038415A (ja) * | 2014-08-05 | 2016-03-22 | Canon Inc | Imaging apparatus, control method thereof, program, and storage medium |
JP2020149132A (ja) * | 2019-03-11 | 2020-09-17 | Secom Co Ltd | Image processing device and image processing program |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9432575B2 (en) * | 2013-06-28 | 2016-08-30 | Canon Kabushiki Kaisha | Image processing apparatus |
KR102180466B1 (ko) * | 2013-12-19 | 2020-11-18 | Samsung Medison Co., Ltd. | Method and apparatus for displaying additional information related to measured values of an object |
US9390487B2 (en) * | 2014-10-20 | 2016-07-12 | Microsoft Technology Licensing, Llc | Scene exposure auto-compensation for differential image comparisons |
US9704272B2 (en) | 2014-11-21 | 2017-07-11 | Microsoft Technology Licensing, Llc | Motion blur using cached texture space blur |
US10697766B1 (en) * | 2014-11-25 | 2020-06-30 | Hunter Engineering Company | Method and apparatus for compensating vehicle inspection system measurements for effects of vehicle motion |
US9591237B2 (en) * | 2015-04-10 | 2017-03-07 | Qualcomm Incorporated | Automated generation of panning shots |
US9584716B2 (en) * | 2015-07-01 | 2017-02-28 | Sony Corporation | Method and apparatus for autofocus area selection by detection of moving objects |
TWI554108B (zh) * | 2015-08-04 | 2016-10-11 | Wistron Corp | Electronic device and image processing method |
JP2017049783A (ja) * | 2015-09-01 | 2017-03-09 | Canon Inc | Image processing apparatus and image processing method |
KR102474837B1 (ko) * | 2015-09-14 | 2022-12-07 | Hanwha Corp | Foreground extraction method and apparatus |
JP6736916B2 (ja) * | 2016-03-02 | 2020-08-05 | Ricoh Co Ltd | Information processing apparatus, information processing method, and program |
JP6604908B2 (ja) * | 2016-06-10 | 2019-11-13 | Canon Inc | Image processing apparatus, control method thereof, and control program |
JP2018146663A (ja) * | 2017-03-02 | 2018-09-20 | Canon Inc | Image blur correction device, control method thereof, imaging apparatus, and lens apparatus |
JP2019117547A (ja) * | 2017-12-27 | 2019-07-18 | Canon Inc | Image processing apparatus, image processing method, and program |
CN109064427A (zh) * | 2018-08-01 | 2018-12-21 | BOE Technology Group Co Ltd | Method and apparatus for enhancing image contrast, display device, and storage medium |
JP2021164087A (ja) * | 2020-04-01 | 2021-10-11 | Canon Inc | Imaging apparatus and control method thereof |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010273183A (ja) * | 2009-05-22 | 2010-12-02 | Nikon Corp | Imaging device |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3603629B2 (ja) * | 1998-12-24 | 2004-12-22 | Casio Computer Co Ltd | Image processing apparatus and image processing method |
JP3639476B2 (ja) * | 1999-10-06 | 2005-04-20 | Sharp Corp | Image processing apparatus, image processing method, and recording medium storing an image processing program |
JP2003037767A (ja) | 2001-07-24 | 2003-02-07 | Minolta Co Ltd | Digital camera |
JP2006050070A (ja) * | 2004-08-02 | 2006-02-16 | Fuji Photo Film Co Ltd | Image processing method, apparatus, and program |
JP2006086933A (ja) * | 2004-09-17 | 2006-03-30 | Canon Inc | Imaging apparatus and control method |
JP5397481B2 (ja) * | 2009-12-18 | 2014-01-22 | Fujitsu Ltd | Image selection device and image selection method |
JP4924727B2 (ja) | 2010-02-16 | 2012-04-25 | Casio Computer Co Ltd | Image processing apparatus and image processing program |
JP2011182213A (ja) | 2010-03-02 | 2011-09-15 | Casio Computer Co Ltd | Imaging apparatus and control program for imaging apparatus |
-
2013
- 2013-02-22 US US14/381,874 patent/US9361703B2/en not_active Expired - Fee Related
- 2013-02-22 BR BR112014024049A patent/BR112014024049A8/pt not_active IP Right Cessation
- 2013-02-22 CN CN201380016685.2A patent/CN104205803A/zh active Pending
- 2013-02-22 JP JP2014509075A patent/JP5983737B2/ja not_active Expired - Fee Related
- 2013-02-22 WO PCT/JP2013/054557 patent/WO2013150829A1/ja active Application Filing
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010273183A (ja) * | 2009-05-22 | 2010-12-02 | Nikon Corp | Imaging device |
Non-Patent Citations (1)
Title |
---|
YOSHIKAZU TAKEUCHI: "Zumen, Shashin, Pasu ga Kirari to Hikaru Hito Kufu Jitsumu Chokketsu! Sekkeisha no Temeno 'Cho Teiban' Photoshop to 'Musho' GIMP Master Guide", CAD & CG MAGAZINE, vol. 10, no. 5, 1 May 2008 (2008-05-01), pages 62 * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015088771A (ja) * | 2013-10-28 | 2015-05-07 | Nikon Corp | Image processing device, imaging device, and image processing program |
JP2016038415A (ja) * | 2014-08-05 | 2016-03-22 | Canon Inc | Imaging apparatus, control method thereof, program, and storage medium |
JP2020149132A (ja) * | 2019-03-11 | 2020-09-17 | Secom Co Ltd | Image processing device and image processing program |
JP7290961B2 (ja) | 2019-03-11 | 2023-06-14 | Secom Co Ltd | Image processing device and image processing program |
Also Published As
Publication number | Publication date |
---|---|
US9361703B2 (en) | 2016-06-07 |
BR112014024049A2 (ja) | 2017-06-20 |
BR112014024049A8 (pt) | 2017-07-25 |
US20150043786A1 (en) | 2015-02-12 |
CN104205803A (zh) | 2014-12-10 |
JPWO2013150829A1 (ja) | 2015-12-17 |
JP5983737B2 (ja) | 2016-09-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5983737B2 (ja) | Image processing device, control method of image processing device, and program | |
Guilluy et al. | Video stabilization: Overview, challenges and perspectives | |
Cho et al. | Video deblurring for hand-held cameras using patch-based synthesis | |
JP4947060B2 (ja) | Image composition device, image composition method, and program | |
US9591237B2 (en) | Automated generation of panning shots | |
KR102563750B1 (ko) | 영상 블러링 제거 방법 및 장치 | |
JP3935500B2 (ja) | Motion vector calculation method, and camera-shake correction device, imaging device, and moving image generation device using the method | |
US10818018B2 (en) | Image processing apparatus, image processing method, and non-transitory computer-readable storage medium | |
WO2010024479A1 (en) | Apparatus and method for converting 2d image signals into 3d image signals | |
JP5803467B2 (ja) | Image processing device, imaging device, and image processing method | |
TW200937946A (en) | Full-frame video stabilization with a polyline-fitted camcorder path | |
JP4210954B2 (ja) | Image processing method, program for the image processing method, recording medium storing the program, and image processing apparatus | |
JP2009253506A (ja) | Image processing device, image processing method, camera-shake range estimation device, camera-shake range estimation method, and program | |
KR101671391B1 (ko) | Video deblurring method based on a layered blur model, and recording medium and device for performing the same | |
JP5325800B2 (ja) | Image processing device, image capturing device, image display device, and program | |
TWI792990B (zh) | Method and system for generating a slide-zoom effect | |
JP5251410B2 (ja) | Camera work calculation program, imaging device, and camera work calculation method | |
JPWO2009150696A1 (ja) | Image correction device and image correction method | |
JP2012073703A (ja) | Image blur amount calculation device and program therefor | |
JP2011209070A (ja) | Image processing device | |
JP2012169701A (ja) | Image processing device, image processing method, and program | |
JP4578653B2 (ja) | Depth image generation device, depth image generation method, and computer-readable recording medium storing a program for causing a computer to execute the method | |
JP3804836B2 (ja) | Key signal generation device, image composition device, key signal generation method, and image composition method | |
JP2007179211A (ja) | Image processing device, image processing method, and program therefor | |
Lee | Novel video stabilization for real-time optical character recognition applications |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13772240 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014509075 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14381874 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112014024049 Country of ref document: BR |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13772240 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 112014024049 Country of ref document: BR Kind code of ref document: A2 Effective date: 20140926 |