WO2012096163A1 - Image processing apparatus, image processing method, and program therefor - Google Patents
- Publication number
- WO2012096163A1 (PCT/JP2012/000106; JP 2012000106 W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- sub
- feature
- feature point
- feature points
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/46—Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
- G06V10/462—Salient features, e.g. scale invariant feature transforms [SIFT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/50—Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20021—Dividing image into blocks, subimages or windows
Definitions
- the present invention relates to an image processing apparatus and the like.
- the present invention relates to an image processing apparatus and the like for aligning images.
- Techniques for detecting feature points from images and techniques for extracting feature points that satisfy predetermined conditions from the detected feature points are known.
- the detection and extraction of feature points is also simply referred to as extraction of feature points.
- Techniques for extracting feature points are widely used in the fields of, for example, matching of images, recognition of specific objects included in images, alignment of images, and calibration at the time of 3D image generation.
- matching points are pairs of points that represent the same spatial position between different images.
- a stereoscopic image is generated from two images having a predetermined parallax.
- in addition to the horizontal parallax as viewed from the viewer, which is necessary for stereoscopic viewing, the two images may contain vertical parallax caused by lens assembly errors, blurring at the time of shooting, and the like.
- the vertical parallax is a factor that hinders comfortable stereoscopic viewing. Therefore, processing that warps one image toward the other is generally performed to suppress the vertical parallax. Specifically, a warping matrix that maps one image onto the other must be generated, based on certain conditions, so that the difference in vertical position between corresponding feature points becomes smaller.
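As an illustrative sketch (not the patent's actual warping-matrix computation), the vertical offset between matched feature points can be reduced by fitting a simple row mapping with least squares; the linear y-only model below is a simplifying assumption:

```python
import numpy as np

def fit_vertical_warp(ref_pts, tgt_pts):
    """Fit y' = a*y + b mapping target-image rows onto reference-image rows,
    minimizing the vertical offset between matched feature points (least
    squares). A full warping matrix would also model rotation and shear;
    this 1-D fit is a simplified illustration."""
    ty = np.asarray(tgt_pts, dtype=float)[:, 1]
    ry = np.asarray(ref_pts, dtype=float)[:, 1]
    A = np.stack([ty, np.ones_like(ty)], axis=1)
    (a, b), *_ = np.linalg.lstsq(A, ry, rcond=None)
    return a, b

# Matched points (x, y): the target image is shifted 3 px downward.
ref = [(10, 20), (50, 40), (80, 60), (30, 80)]
tgt = [(11, 23), (49, 43), (82, 63), (29, 83)]
a, b = fit_vertical_warp(ref, tgt)
print(round(a, 3), round(b, 3))  # ≈ 1.0 and -3.0
```

Applying `y' = a*y + b` to the target image's rows would then shrink the vertical coordinate differences of the matched pairs.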
- known techniques for extracting feature points include SIFT (Scale-Invariant Feature Transform) and SURF (Speeded-Up Robust Features).
- U.S. Patent Application Publication No. 2009/0052780 describes a technique for dividing an image into a plurality of areas for a multiprocessor system.
- in SIFT, the number of feature points extracted from each region is variable and is determined by the DoG (Difference of Gaussian) threshold.
- Patent Document 2 (US Pat. No. 5,731,851) describes a motion compensation method based on feature points.
- the feature points of the moving object are searched in an area, and the grids associated with these feature points are formed in a hierarchical structure for encoding.
- Patent Document 3 (US Pat. No. 5,617,459) describes a method for extracting a plurality of feature points in the outline of one object.
- in feature point extraction processing, first, feature amounts at a plurality of points included in an image are calculated. Thereafter, a point whose feature amount is larger than a predetermined threshold is extracted as a feature point.
- the higher the contrast at a point, the larger the value of its feature amount. Therefore, in a single image, many feature points are extracted from high-contrast portions, while feature points are hardly extracted from low-contrast portions. As a result, the accuracy of image alignment degrades in the portions where feature points are hardly extracted.
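The bias described above can be sketched numerically; the toy feature amounts and the threshold value below are illustrative assumptions, not values from the patent:

```python
import numpy as np

# Toy feature map: first half simulates a low-contrast image region,
# second half a high-contrast region.
rng = np.random.default_rng(0)
feat = np.concatenate([rng.uniform(0.0, 0.3, 100),   # weak features
                       rng.uniform(0.5, 1.0, 100)])  # strong features

threshold = 0.4  # single global threshold, as in the prior art
extracted = feat >= threshold

# Every extracted point comes from the high-contrast half, so the
# low-contrast half contributes nothing to the alignment.
print(extracted[:100].sum(), extracted[100:].sum())  # 0 and 100
```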
- an object of the present invention is to provide an image processing apparatus that further improves the accuracy of image alignment.
- One aspect of the present invention is an image processing apparatus that extracts feature points for vertically aligning a first image and a second image, which are images obtained by photographing the same object from different viewpoints, the apparatus comprising: a division unit that divides each of the first image and the second image into a plurality of sub-regions; and an extraction unit that performs feature point extraction processing for each of the sub-regions.
- the extraction unit performs the feature point extraction processing so that a value indicating the degree of variation, among the plurality of sub-regions, in the number of feature points extracted by the extraction processing is equal to or less than a predetermined value.
- the image processing apparatus extracts feature points so as to suppress variation in the number of feature points extracted for each sub-region. Therefore, feature points can be extracted so as to be uniformly distributed in the image. As a result, the image processing apparatus can further improve the accuracy of image alignment.
- the present invention can be realized not only as such an image processing apparatus, but also as an image processing method whose steps correspond to the characteristic means included in the image processing apparatus, or as a program that causes a computer to execute such characteristic steps. It goes without saying that such a program can be distributed via a recording medium such as a CD-ROM (compact disc read-only memory) or a transmission medium such as the Internet.
- the present invention can be realized as a semiconductor integrated circuit (LSI) that realizes part or all of the functions of such an image processing apparatus, or as an image processing system including such an image processing apparatus.
- FIG. 1 is a conceptual view showing strong features and weak features distributed in an image.
- FIG. 2 is a diagram showing functional blocks of the image processing apparatus according to the first embodiment of the present invention.
- FIG. 3 is a flowchart showing an entire process performed by the image processing apparatus according to the first embodiment of the present invention.
- FIG. 4A is a diagram showing the positions of a plurality of feature points included in a certain image.
- FIG. 4B is a diagram showing the result of extracting feature points from the image shown in FIG. 4A according to the prior art.
- FIG. 4C is a diagram showing the result of extracting feature points from the image shown in FIG. 4A by the image processing apparatus according to Embodiment 1 of the present invention.
- FIG. 5 is a diagram showing functional blocks of the extraction unit according to Embodiment 1 of the present invention.
- FIG. 6 is a flowchart illustrating in more detail the flow of processing of the image processing apparatus according to the first embodiment of the present invention.
- FIG. 7 is a flowchart illustrating in more detail the process performed by the extraction unit in step S308a of FIG. 6.
- FIG. 8 is a flowchart showing a flow of processing performed by the image processing apparatus when the adjustment unit adaptively extracts a feature point by adjusting the contrast.
- FIG. 9 is a flowchart illustrating in more detail the process performed by the extraction unit in step S308b of FIG. 8.
- FIG. 10 is a flow chart for explaining the flow of processing for correcting the parallax between the reference image and the target image.
- FIG. 11 is a conceptual diagram for explaining a process of generating a virtual feature point performed by the image processing apparatus according to the second embodiment of the present invention.
- FIG. 12 is a diagram showing functional blocks of an extraction unit included in the image processing apparatus according to the second embodiment.
- FIG. 13 is a diagram showing functional blocks of the virtual feature point generator.
- FIG. 14 is a conceptual diagram for explaining in detail the process performed by the virtual feature point generator.
- FIG. 15 is a detailed flowchart illustrating an example of image alignment processing using virtual feature points performed by the image processing apparatus according to the second embodiment.
- FIG. 16A is a diagram showing a result of feature point extraction by the image processing apparatus according to the first embodiment.
- FIG. 16B is a diagram showing the result of matching of feature points with respect to the feature points shown in FIG. 16A.
- FIG. 16C is a diagram showing an image obtained by superimposing two images whose deviations have been corrected by the matched feature point pair.
- FIG. 17 is a block diagram showing the hardware configuration of a computer system that implements the image processing apparatus according to Embodiments 1 and 2 of the present invention.
- One aspect of the present invention is an image processing apparatus that extracts feature points for vertically aligning a first image and a second image, which are images obtained by photographing the same object from different viewpoints, the apparatus comprising: a division unit that divides each of the first image and the second image into a plurality of sub-regions; and an extraction unit that performs feature point extraction processing for each of the sub-regions.
- the extraction unit performs the feature point extraction processing so that a value indicating the degree of variation, among the plurality of sub-regions, in the number of feature points extracted by the extraction processing is equal to or less than a predetermined value.
- the image processing apparatus extracts feature points so as to suppress variation in the number of feature points extracted for each sub-region. Therefore, feature points can be extracted so as to be uniformly distributed in the image. As a result, the image processing apparatus can further improve the accuracy of image alignment.
- the image processing apparatus further includes an alignment unit that aligns the first image and the second image based on the feature points. The alignment unit matches each feature point included in one of the first image and the second image with the corresponding feature point included in the other image, and may perform coordinate conversion on at least one of the images so that the difference in vertical coordinate values between the matched feature points becomes smaller.
- the image processing apparatus can keep the number of feature points extracted from each sub region within a certain range.
- the extraction unit may perform the feature point extraction processing such that, for each sub-region, the difference between the number of feature points extracted from the sub-region and a predetermined number N is equal to or less than a predetermined value.
- the image processing apparatus 100 can adjust the number of feature points extracted from the sub region by adjusting the size of the feature point threshold value for each sub region.
- the extraction unit may include: a feature amount calculation unit that calculates, in each of the plurality of sub-regions, a feature amount corresponding to each of the plurality of pixels included in the sub-region; a feature point extraction unit that extracts, as feature points, pixels whose corresponding feature amounts are equal to or greater than a predetermined feature point threshold; and an adjustment unit configured to adjust the size of the feature point threshold in each sub-region so that the difference between the number of feature points extracted by the feature point extraction unit in the sub-region and N is equal to or less than the predetermined value.
- the image processing apparatus can generate virtual feature points instead of the missing feature points. Therefore, the image alignment can be performed more accurately even for an image for which it is difficult to extract a predetermined number of feature points, such as an image with a small change in contrast.
- the extraction unit may further include a virtual feature point generation unit that generates, based on the feature points extracted by the feature point extraction unit, virtual feature points in a first sub-region of the plurality of sub-regions. The virtual feature point generation unit generates virtual feature points in such a number that, in the first sub-region, the difference between N and the sum of the number of extracted feature points and the number of virtual feature points is equal to or less than the predetermined value.
- the image processing apparatus can generate virtual feature points in the first image and the second image based on the two points already extracted in the first image.
- the virtual feature point generation unit may include: a first virtual point generation unit that generates a first virtual point, which is a virtual feature point in the first image, based on a first feature point and a second feature point, which are feature points included in the first image; a reference information generation unit that generates reference information indicating the distance between the first virtual point and the first feature point and the distance between the first virtual point and the second feature point in the first image; a corresponding point acquisition unit that acquires a third feature point, which is the point in the second image corresponding to the first feature point, and a fourth feature point, which is the point in the second image corresponding to the second feature point; and a second virtual point generation unit that generates, by referring to the reference information, a second virtual point as the virtual feature point in the second image corresponding to the first virtual point.
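Under the simplifying assumption that the first virtual point is placed at the midpoint of the two extracted feature points (the patent only requires that the second virtual point be located by referring to the recorded distances), the generation of a corresponding virtual point pair could be sketched as:

```python
import numpy as np

def generate_virtual_pair(p1, p2, p3, p4):
    """Generate a virtual point v1 between feature points p1, p2 in the
    first image, record its distances to them (the 'reference
    information'), and place the corresponding virtual point v2 between
    their counterparts p3, p4 in the second image so that the distance
    ratio is preserved. The midpoint placement is an illustrative choice."""
    p1, p2, p3, p4 = (np.asarray(p, dtype=float) for p in (p1, p2, p3, p4))
    v1 = (p1 + p2) / 2.0                        # first virtual point
    d1 = np.linalg.norm(v1 - p1)                # reference information:
    d2 = np.linalg.norm(v1 - p2)                # distances to p1 and p2
    t = d1 / (d1 + d2)
    v2 = p3 + t * (p4 - p3)                     # second virtual point
    return v1, v2

v1, v2 = generate_virtual_pair((0, 0), (4, 0), (1, 2), (5, 2))
print(v1, v2)  # [2. 0.] and [3. 2.]
```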
- the image processing apparatus 100 can adjust the number of extracted feature points by adjusting the contrast of the image for each sub region instead of adjusting the feature point threshold.
- the extraction unit may include: a feature amount calculation unit that calculates, in each of the plurality of sub-regions, a feature amount corresponding to each of the plurality of pixels included in the sub-region; a feature point extraction unit that extracts, as feature points, pixels whose corresponding feature amounts are equal to or greater than a predetermined feature point threshold; and an adjustment unit configured to adjust the size of the image contrast in each sub-region so that the difference between the number of feature points extracted by the feature point extraction unit in the sub-region and N is equal to or less than the predetermined value.
- the image processing apparatus can extract feature points such that the difference in the number of feature points extracted between sub-regions falls within a certain range.
- the extraction unit may perform the feature point extraction processing on a first sub-region included in the plurality of sub-regions and a second sub-region different from the first sub-region such that the difference between the number of feature points extracted from the first sub-region and the number of feature points extracted from the second sub-region is equal to or less than a predetermined threshold.
- FIG. 1 shows how strong and weak features are distributed in the image.
- any feature can be used as the feature of the image.
- FIG. 2 shows functional blocks of the image processing apparatus according to the present embodiment, which solves this problem.
- the image processing apparatus 100 shown in FIG. 2 is an image processing apparatus that extracts feature points for aligning a first image and a second image, which are images obtained by photographing the same object from different viewpoints. The first image and the second image are, for example, a left-eye image and a right-eye image for stereoscopic vision.
- the image processing apparatus 100 includes a division unit 102, an extraction unit 104, and an alignment unit 106.
- the dividing unit 102 divides each of the first image and the second image acquired as image data into a plurality of sub-regions.
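A uniform grid is one way such a division could look; the grid shape below is an illustrative assumption, since the patent leaves the number of divisions open:

```python
import numpy as np

def divide_into_subregions(img, rows, cols):
    """Split an H×W image into a rows×cols grid of sub-regions.
    The patent does not fix the number of divisions; a uniform grid
    is one straightforward choice."""
    h, w = img.shape[:2]
    ys = np.linspace(0, h, rows + 1, dtype=int)
    xs = np.linspace(0, w, cols + 1, dtype=int)
    return [img[ys[r]:ys[r + 1], xs[c]:xs[c + 1]]
            for r in range(rows) for c in range(cols)]

img = np.arange(48).reshape(6, 8)   # toy 6×8 "image"
subs = divide_into_subregions(img, 2, 4)
print(len(subs), subs[0].shape)     # 8 sub-regions of 3×2 pixels
```

The first and second images need not use the same grid; each could be passed to this function with different `rows`/`cols` values.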
- the extraction unit 104 performs feature point extraction processing for each sub-region. More specifically, the extraction unit 104 performs the feature point extraction processing so that a value indicating the degree of variation, among the plurality of sub-regions, in the number of extracted feature points is equal to or less than a predetermined value. Specifically, for each sub-region, the extraction unit 104 performs the feature point extraction processing so that the difference between the number of feature points extracted from the sub-region by the extraction processing and a predetermined number N is equal to or less than a predetermined value. The extraction unit 104 also calculates a feature descriptor for each of the extracted feature points. The specific processing performed by the extraction unit 104 is described later.
- the alignment unit 106 performs alignment based on the extracted feature points such that the parallax in the vertical direction between the first image and the second image is smaller.
- the alignment unit 106 first matches each feature point included in one of the first image and the second image with the corresponding feature point included in the other image.
- the corresponding feature points are feature points that represent the same spatial position.
- for each of the feature points included in the first image, the alignment unit 106 searches, among the feature points included in the second image, for the feature point whose feature descriptor is most similar, thereby matching feature points between the two images.
- the alignment unit 106 performs coordinate conversion on at least one of the images so that the difference between the coordinate values in the vertical direction between the matched feature points becomes smaller.
- coordinate conversion is performed on the second image based on the first image.
- the alignment unit 106 outputs data representing the first image and the coordinate-transformed second image.
- the image processing apparatus 100 may not include the alignment unit 106.
- the image processing apparatus 100 outputs the coordinate values of the feature points extracted for each of the first image and the second image. Since the process performed by the alignment unit 106 is a process according to the related art, the same effect of the invention can be obtained when, for example, a device external to the image processing apparatus 100 performs the process corresponding to the alignment unit 106. Further, the alignment unit 106 need not perform coordinate conversion after matching the feature points between the two images; in this case, the alignment unit 106 outputs the pairs of coordinate values of the matched feature points.
- FIG. 3 is a flowchart showing an example of the entire processing performed by the image processing apparatus 100.
- in step S100, a plurality of images (for example, a first image and a second image) are input to the image processing apparatus 100.
- in step S102, the division unit 102 divides each input image into a plurality of sub-regions.
- the number of divisions is not particularly limited. Also, the first image and the second image may be divided similarly, or the first image and the second image may be divided differently.
- in step S104, the extraction unit 104 performs feature point extraction processing on each sub-region, sequentially or in parallel, using a threshold value adaptively changed for each sub-region.
- in step S106, the extraction unit 104 calculates a feature descriptor for each feature point extracted in the previous step.
- in step S108, the alignment unit 106 matches feature points between the different images. Specifically, for a feature point selected from among the feature points included in the first image, the feature point whose feature descriptor is most similar is selected from among the feature points included in the second image. The two feature points selected in this manner are treated as matched feature points.
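The nearest-descriptor search in step S108 can be sketched as follows; the two-dimensional descriptors are toy values, and a production matcher would typically also apply a ratio test to reject ambiguous matches (an assumption beyond the text):

```python
import numpy as np

def match_descriptors(desc1, desc2):
    """For each descriptor from the first image, pick the most similar
    descriptor in the second image (smallest Euclidean distance).
    Simplified sketch of nearest-neighbor descriptor matching."""
    d1 = np.asarray(desc1, dtype=float)
    d2 = np.asarray(desc2, dtype=float)
    # Pairwise distance matrix: rows index desc1, columns index desc2.
    dists = np.linalg.norm(d1[:, None, :] - d2[None, :, :], axis=2)
    return dists.argmin(axis=1)  # index into desc2 for each desc1 entry

desc1 = [[1.0, 0.0], [0.0, 1.0]]
desc2 = [[0.1, 0.9], [0.9, 0.1]]
print(match_descriptors(desc1, desc2))  # [1 0]
```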
- in step S110, the pairs of coordinate values of the matched feature points are output.
- the alignment unit 106 may perform tracking of feature points instead of matching the feature points.
- the image processing apparatus 100 may omit the process of at least one of steps S108 and S110. In this case, the same effect of the invention can be obtained by having an external device perform the process corresponding to the omitted step.
- the feature points extracted by the conventional technique are compared with the feature points extracted by the extracting unit 104 provided in the image processing apparatus 100.
- FIG. 4A shows the positions of feature points included in an image 200.
- a feature point indicated by a filled triangle indicates a feature point whose feature is strong (that is, the feature amount of the feature point is large).
- a feature point indicated by a triangle marked with a dot indicates a feature point whose feature is medium (that is, the feature amount of the feature point is of medium magnitude).
- the feature point indicated by an open triangle indicates a feature point having a weak feature (that is, a feature amount of the feature point is small).
- FIG. 4B shows the result of extracting feature points from the image 200 according to the prior art.
- in the prior art, most of the extracted feature points are feature points with strong features. This is because, in the prior art, points having feature amounts equal to or greater than a predetermined threshold are extracted as feature points across the entire image 200.
- the extracted feature points are unevenly distributed around the upper right of the image 200.
- as a result, the non-uniform distribution of feature points causes the extracted feature points to represent the entire image incorrectly.
- FIG. 4C shows the result of extracting feature points from the image 200 by the image processing apparatus 100 according to the present embodiment.
- feature points are uniformly extracted from the entire image.
- the extracted feature points include strong feature points, relatively strong (i.e., medium) feature points, and weak feature points.
- as shown in FIG. 4C, if the distribution of the extracted feature points is uniform, image alignment based on the feature points is more robust and stable.
- FIG. 5 shows functional blocks of the extraction unit 104 according to the present embodiment.
- the extraction unit 104 includes a feature amount calculation unit 112, a feature point extraction unit 114, and an adjustment unit 116.
- in each of the plurality of sub-regions produced by the division unit 102, the feature amount calculation unit 112 calculates a feature amount corresponding to each of the plurality of pixels included in the sub-region. As described above, any feature amount can be used; the feature amount calculation unit 112 calculates the feature amount using, for example, the contrast value of the pixel. More specifically, gradient-based features such as SIFT features or SURF features are conceivable.
- the feature point extraction unit 114 extracts, as a feature point, a pixel whose corresponding feature amount is equal to or greater than a predetermined feature point threshold among a plurality of pixels. Thereafter, the feature point extraction unit 114 outputs the coordinate values of the extracted feature points.
- the adjustment unit 116 adjusts the size of the feature point threshold in each of the plurality of sub-regions so that the difference between the number of feature points extracted by the feature point extraction unit 114 and the target number N of feature points to be extracted is equal to or less than a predetermined value.
- when the number of feature points extracted by the feature point extraction unit 114 is smaller than N, the adjustment unit 116 reduces the feature point threshold. Conversely, when the number of extracted feature points is larger than N, the adjustment unit 116 increases the feature point threshold.
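The raise/lower rule can be sketched as a simple feedback loop over one sub-region's feature amounts; the step size, tolerance, initial threshold, and iteration cap below are illustrative assumptions:

```python
import numpy as np

def extract_with_adaptive_threshold(features, n_target, tol=2,
                                    threshold=0.5, step=0.05,
                                    max_iters=50):
    """Adjust the feature-point threshold until the number of extracted
    points is within `tol` of the target N, mirroring the lower/raise
    rule of the adjustment unit. The fixed step size and tolerance are
    illustrative assumptions."""
    feats = np.asarray(features, dtype=float)
    for _ in range(max_iters):
        count = int((feats >= threshold).sum())
        if abs(count - n_target) <= tol:
            break
        if count < n_target:
            threshold -= step  # too few points: lower the threshold
        else:
            threshold += step  # too many points: raise the threshold
    return np.flatnonzero(feats >= threshold), threshold

feats = np.linspace(0.0, 1.0, 101)  # toy feature amounts 0.00 … 1.00
idx, th = extract_with_adaptive_threshold(feats, n_target=10)
print(len(idx))  # within 2 of the target 10
```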
- FIG. 6 is a flowchart for explaining the process flow of the image processing apparatus 100 in more detail.
- in step S302, the dividing unit 102 divides the reference image and the target image into a plurality of sub-regions.
- One of the first and second images described above corresponds to a reference image, and the other corresponds to a target image.
- in step S304, the extraction unit 104 calculates a contrast value for each sub-region included in each of the reference image and the target image.
- based on the contrast values, the extraction unit 104 performs feature point extraction processing for each sub-region using a predetermined feature point threshold in step S306. Note that the feature point extraction processing may be performed by the extraction unit 104 sequentially for each sub-region, or in parallel across the sub-regions.
- the extraction unit 104 adaptively adjusts the feature point threshold so that each sub region has a sufficient number of feature points.
- the extraction unit 104 may adjust the feature point threshold so that the difference in the number of feature points between sub-regions is equal to or less than a predetermined value.
- the extraction unit 104 calculates a feature descriptor for each extracted feature point.
- the feature descriptor is, for example, information representing a feature amount for each direction at each extracted feature point.
- the feature amount is, for example, information indicating the gradient of the luminance value calculated for each direction.
- in step S312, the alignment unit 106 matches corresponding feature points representing the same spatial position between the reference image and the target image.
- in step S314, the alignment unit 106 outputs the pairs of coordinate values of the matched feature points.
- FIG. 7 is a flowchart illustrating in more detail the process performed by the extraction unit 104 in step S308a of FIG. 6.
- in step S402, the extraction unit 104 selects an arbitrary one of the plurality of sub-regions. Thereafter, the feature amount calculation unit 112 calculates feature amounts for a plurality of points included in the selected sub-region. For example, the feature amount calculation unit 112 may calculate feature amounts for all the pixels in the sub-region, or for a plurality of pixels sampled without bias from the sub-region.
- in step S404, the feature point extraction unit 114 extracts feature points in the selected sub-region using a predetermined feature point threshold.
- in step S406, the adjustment unit 116 determines whether the number of extracted feature points has reached or is close to N. In other words, the adjustment unit 116 determines whether the difference between the number of extracted feature points and N is equal to or less than a predetermined value.
- if so (Yes in step S406), the adjustment unit 116 outputs the feature points extracted in the sub-region.
- in step S410, the extraction unit 104 determines whether any of the plurality of sub-regions has not yet been selected in step S402 (that is, whether the sub-region selected in step S402 is the last sub-region to be processed by the extraction unit 104).
- if an unselected sub-region remains, the extraction unit 104 selects the next sub-region and performs the same processing (S402).
- otherwise, the extraction unit 104 outputs the coordinates of the feature points extracted for all the sub-regions.
- if the number of extracted feature points is not close to N, the adjustment unit 116 sets a new feature point threshold for the target sub-region in step S408. Specifically, when the number of extracted feature points is larger than N, the feature point threshold is increased; when it is smaller than N, the feature point threshold is decreased.
- the extraction unit 104 causes the feature point extraction unit 114 to extract feature points in the sub-region again (S404).
- the adjustment unit 116 extracts feature points by adjusting the feature point threshold for each sub region.
- the adjustment unit 116 may extract feature points by adjusting the contrast of the image for each sub region.
- FIG. 8 is a flowchart showing a flow of processing performed by the image processing apparatus 100 when the adjustment unit 116 adaptively extracts a feature point by adjusting the contrast.
- the processes shown in FIG. 8 are the same as those in FIG. 6 except for step S308b; thus, only the process of step S308b is described here.
- in step S308b, the adjustment unit 116 sharpens weak features by adaptively adjusting the image contrast for each sub-region.
- the feature point extraction unit 114 can easily extract weak features.
- the extraction unit 104 can perform feature point extraction processing so that each sub region reliably includes a sufficient number of feature points.
- the extraction unit 104 may adjust the image contrast so that the difference in the number of feature points between sub-regions is equal to or less than a predetermined value.
- FIG. 9 is a flowchart illustrating in more detail the process performed by the extraction unit 104 in step S308b of FIG. 8. Note that, among the processes performed by the extraction unit 104, the only difference from FIG. 7 is step S409. Therefore, the process of step S409 will mainly be described.
- the adjustment unit 116 adjusts the contrast in the target sub-region in step S409. Specifically, when the number of extracted feature points is smaller than N, the adjustment unit 116 makes the contrast value larger. Conversely, if the number of extracted feature points is larger than N, the contrast value is made smaller. As described above, the adjustment unit 116 adaptively adjusts the contrast for each sub region.
- the extraction unit 104 causes the feature point extraction unit 114 to extract feature points in the sub-region again (S404).
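A minimal sketch of the per-sub-region contrast adjustment of step S409: scaling the pixel values around the sub-region mean with a gain greater than 1 sharpens weak features, while a gain smaller than 1 flattens them. The linear-gain operator and the name `adjust_contrast` are assumptions for illustration; the patent does not fix a particular contrast operator.

```python
import numpy as np

def adjust_contrast(sub, gain):
    """Scale pixel values around the sub-region mean.
    gain > 1 increases contrast (few feature points found);
    gain < 1 decreases it (too many feature points found)."""
    mean = sub.mean()
    return np.clip((sub - mean) * gain + mean, 0, 255)
```

In the loop of FIG. 9 this would replace the threshold update: re-run the feature extractor on `adjust_contrast(sub, gain)` with a gain adapted to the current feature-point count.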
- That is, the extraction unit 104 includes: a feature amount calculation unit that calculates the feature amount corresponding to each of the plurality of pixels included in the sub-region; a feature point extraction unit 114 that extracts, as a feature point, a pixel among the plurality of pixels whose corresponding feature amount is equal to or more than a predetermined feature point threshold; and an adjustment unit 116 configured to adjust the magnitude of the image contrast in each of the plurality of sub-regions so that the difference between the number of feature points extracted there by the feature point extraction unit 114 and N is equal to or less than the predetermined value.
- FIG. 10 is a flowchart for explaining the flow of processing in which the image processing apparatus 100 corrects the parallax between the reference image and the target image.
- step S502 the division unit 102 divides the reference image and the target image into a plurality of subregions.
- step S504 the extraction unit 104 calculates a contrast value for each image. Based on the contrast value, in step S506, the extraction unit 104 performs feature point extraction processing for each sub-region using a predetermined threshold.
- In step S508, the extraction unit 104 adaptively adjusts the threshold value so as to extract from each sub-region feature points that are sufficient in number for alignment and equal in number across the sub-regions.
- step S510 the extraction unit 104 calculates a feature descriptor for each extracted feature point.
- step S512 the alignment unit 106 matches feature points between the reference image and the target image.
- the alignment unit 106 calculates a warp matrix based on the matched feature points.
- The alignment unit 106 calculates, as the warp matrix, for example, a matrix that suppresses the vertical difference between the coordinate values of the matched feature points.
- the warping matrix is represented, for example, in the form of an affine transformation matrix or a rotation matrix.
- step S514 the alignment unit 106 adjusts the positional relationship between the reference image and the target image to be suitable for stereoscopic vision by applying the warping matrix to the reference image.
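One possible realization of such a warp matrix is a least-squares affine fit that maps the vertical coordinates of the target-image points onto those of their matched reference-image points while leaving the horizontal coordinate (the stereo parallax) untouched. The least-squares formulation and the name `vertical_alignment_matrix` are assumptions for illustration; the patent only states that the warp matrix suppresses the vertical difference and may take, for example, an affine or rotational form.

```python
import numpy as np

def vertical_alignment_matrix(ref_pts, tgt_pts):
    """Least-squares affine warp mapping target points so their vertical
    coordinates match the matched reference points.
    ref_pts, tgt_pts: (N, 2) arrays of matched (x, y) coordinates."""
    ref_pts = np.asarray(ref_pts, float)
    tgt_pts = np.asarray(tgt_pts, float)
    ones = np.ones((len(tgt_pts), 1))
    A = np.hstack([tgt_pts, ones])              # rows: [x, y, 1]
    coeffs, *_ = np.linalg.lstsq(A, ref_pts[:, 1], rcond=None)
    # 3x3 homogeneous warp: x' = x, y' = a*x + b*y + c
    return np.array([[1.0,       0.0,       0.0],
                     [coeffs[0], coeffs[1], coeffs[2]],
                     [0.0,       0.0,       1.0]])
```

Applying this matrix to homogeneous target coordinates removes a constant or linearly varying vertical offset between the image pair.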
- As described above, the image processing apparatus 100 divides one image into sub-regions and adjusts the threshold used for each sub-region so that the number of feature points extracted in each sub-region is not biased.
- the image processing apparatus 100 can perform image alignment at higher speed.
- Second Embodiment: In the image processing apparatus 100 according to the first embodiment, in a region where there is almost no difference in contrast, for example, it is difficult to extract appropriate feature points merely by adjusting the threshold value.
- The image processing apparatus according to the present embodiment generates virtual feature points in order to solve this problem. A more detailed description is given below.
- FIG. 11 is a conceptual diagram for explaining the process of generating virtual feature points.
- FP1 and FP2 are assumed to be two true feature points extracted in the reference image.
- No_T is a sub-region (textureless region) not including a texture.
- FakePt indicates the virtual feature point generated in the sub-region No_T.
- The position of the virtual feature point FakePt is derived from the two extracted true feature points by specifying the distance d1 between FakePt and FP1 and the distance d2 between FakePt and FP2.
- The same method is applied to the target image: virtual feature points are arranged at the distances d1 and d2, respectively, from the two true feature points extracted in the target image that match the corresponding two true feature points in the reference image.
- virtual feature points can be generated in the textureless region included in the target image and the reference image.
- FIG. 12 shows functional blocks of the extraction unit 104A included in the image processing apparatus according to the present embodiment.
- the extraction unit 104A includes a feature amount calculation unit 112, a feature point extraction unit 114A, an adjustment unit 116, and a virtual feature point generation unit 118.
- the feature point extraction unit 114A causes the virtual feature point generation unit 118 to generate virtual feature points when it is determined that the required number of feature points can not be extracted even by adjusting the feature point threshold value by the adjustment unit 116.
- the virtual feature point generation unit 118 generates a virtual feature point in the first sub-region included in the plurality of sub-regions based on the feature point extracted by the feature point extraction unit 114A. More specifically, in the first sub-region, the virtual feature point generation unit 118 sets the difference between the sum of the number of extracted feature points and the number of virtual feature points and the predetermined number N. The virtual feature points are generated by the number that is equal to or less than a predetermined value.
- Here, the first sub-region is a sub-region from which the required number of feature points could not be extracted even with the adjustment by the adjustment unit 116, because the change in luminance is small.
- For example, the sub-region indicated as No_T in FIG. 11 corresponds to the first sub-region.
- FIG. 13 shows functional blocks of the virtual feature point generator 118.
- the virtual feature point generation unit 118 includes a first virtual point generation unit 122, a reference information acquisition unit 124, a corresponding point acquisition unit 126, and a second virtual point generation unit 128.
- the first virtual point generation unit 122 generates a virtual feature in the first image based on the first feature point and the second feature point, which are feature points extracted from the first image by the feature point extraction unit 114A. Generate points. This virtual feature point is also referred to as a first virtual point.
- the reference information acquisition unit 124 acquires reference information including each of the distance between the first virtual point and the first feature point and the distance between the first virtual point and the second feature point in the first image.
- The corresponding point acquisition unit 126 acquires a third feature point, which is a point in the second image corresponding to the first feature point, and a fourth feature point, which is a point in the second image corresponding to the second feature point.
- the second virtual point generation unit 128 generates a second virtual point as a virtual feature point corresponding to the first virtual point in the second image by referring to the reference information.
- The first virtual point generation unit 122 selects, as the first sub-region, a region among the plurality of sub-regions in which the number of extracted feature points does not reach the predetermined number N.
- The first virtual point generation unit 122 selects, for example, a first feature point (also referred to as FP1) and a second feature point (also referred to as FP2) from among the feature points extracted by the extraction unit 104A within a predetermined distance from the first sub-region.
- Alternatively, the first virtual point generation unit 122 may select, from the feature points included in sub-regions other than the first sub-region, the feature point closest to the first sub-region as FP1 and the second closest feature point as FP2.
- the first virtual point generation unit 122 generates a virtual feature point as a point at which the line segment connecting FP1 and FP2 is divided at a predetermined ratio.
- The first virtual point generation unit 122 may, for example, determine the ratio at which the line segment connecting FP1 and FP2 is divided to be proportional to the strength (the magnitude of the feature amount) of the features of FP1 and FP2.
- The virtual feature point may be determined so as to be located in the sub-region from which no feature point has been extracted, or may simply be set to the midpoint of FP1 and FP2.
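The internal-division rule above can be sketched in a few lines. The particular weighting (pulling the point toward the stronger feature) and the name `first_virtual_point` are assumptions; the patent only requires a division ratio related to the feature strengths, and with equal strengths the point reduces to the midpoint of FP1 and FP2.

```python
import numpy as np

def first_virtual_point(fp1, fp2, s1=1.0, s2=1.0):
    """Place the virtual point on the segment FP1-FP2, dividing it at a
    ratio derived from the feature strengths s1 and s2; with equal
    strengths this is the midpoint of FP1 and FP2."""
    fp1, fp2 = np.asarray(fp1, float), np.asarray(fp2, float)
    w = s1 / (s1 + s2)          # weight toward the stronger feature
    return w * fp1 + (1.0 - w) * fp2
```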
- the reference information acquisition unit 124 acquires the distance d1 between FP1 and the virtual feature point, and the distance d2 between FP2 and the virtual feature point.
- Information including d1 and d2 is referred to as reference information.
- The corresponding point acquisition unit 126 acquires a third feature point (also referred to as FP3) and a fourth feature point (also referred to as FP4), which are feature points extracted in the second image and correspond to FP1 and FP2, respectively.
- FP3 and FP1 correspond to the same spatial position.
- FP4 and FP2 correspond to the same spatial position.
- the second virtual point generation unit 128 sets a point located at a distance from FP3 to d1 and at a distance from FP4 to d2 as a virtual feature point.
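Locating a point at distance d1 from FP3 and distance d2 from FP4 is a two-circle intersection (trilateration) problem; a sketch under that interpretation follows. The function name and the choice to return both candidate positions are assumptions; the patent does not specify how the position is computed or which of the two geometric solutions is kept.

```python
import numpy as np

def second_virtual_point(fp3, fp4, d1, d2):
    """Intersect the circle of radius d1 around FP3 with the circle of
    radius d2 around FP4. Returns the two candidate positions, or None
    when the circles do not intersect."""
    p3, p4 = np.asarray(fp3, float), np.asarray(fp4, float)
    d = np.linalg.norm(p4 - p3)
    if d == 0 or d > d1 + d2 or d < abs(d1 - d2):
        return None                          # no valid intersection
    a = (d1**2 - d2**2 + d**2) / (2 * d)     # distance from FP3 along the base line
    h = np.sqrt(max(d1**2 - a**2, 0.0))      # perpendicular offset
    base = p3 + a * (p4 - p3) / d
    perp = np.array([-(p4 - p3)[1], (p4 - p3)[0]]) / d
    return base + h * perp, base - h * perp
```

In practice the candidate on the expected side of the baseline (e.g. inside the textureless sub-region) would be retained.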
- Alternatively, the second virtual point generation unit 128 may set, as the virtual feature point, the point whose feature descriptor is most similar to that of the first virtual point among the points included in a certain area centered on the point at distance d1 from FP3 and distance d2 from FP4.
- FIG. 15 shows an example of image alignment processing using virtual feature points in the present embodiment.
- step S702 an image filter is applied to the input image pair, and the response is calculated.
- As the image filter, for example, a low-pass filter or a band-pass filter can be used.
- step S704 the division unit 102 divides the image into a plurality of subregions. Note that the division unit 102 may divide the image into sub-regions before this.
- step S706 the extraction unit 104A performs feature point extraction processing from each sub region. Furthermore, in step S708, the extraction unit 104A confirms whether the extracted feature points are sufficient and whether they are uniformly distributed over the entire image.
- When a sufficient number of feature points are uniformly extracted, the alignment unit 106 performs the matching process on the extracted feature points in step S716. Subsequently, in step S718, a warp matrix is generated using the pairs of corresponding feature points matched in step S716. Finally, the alignment unit 106 aligns the image pair in step S720.
- When the extraction unit 104A determines that a sufficient number of feature points cannot be uniformly extracted from the entire image (No in step S708), the extraction unit 104A adjusts the feature point threshold in step S710 and extracts feature points with weak features.
- When the extraction unit 104A then determines that a sufficient number of feature points have been uniformly extracted from the entire image (Yes in step S711), the process proceeds to step S716.
- the extraction unit 104A determines in step S712 whether adjustment of the feature point threshold has been performed a predetermined number of times or more.
- If the predetermined number of times has not been reached (No in step S712), the extraction unit 104A adjusts the feature point threshold again in step S710. If the predetermined number of times has been reached (Yes in step S712), the virtual feature point generation unit 118 generates virtual feature points in step S713.
- As described above, even for an image including a sub-region whose contrast is so weak that it is difficult to extract feature points, the image processing apparatus according to the present embodiment can generate a virtual feature point pair consisting of a first virtual point and a second virtual point corresponding to it.
- the image processing apparatus in addition to the pair of true feature points extracted by the extraction unit 104, the image processing apparatus can perform image alignment with higher accuracy by using the pair of virtual feature points in combination.
- FIGS. 16A to 16C show, as an example, a comparison of the result of image registration by the image processing apparatus according to the present invention with the result of image registration by the conventional method.
- An image 802 shown in FIG. 16A, an image 804 shown in FIG. 16B, and an image 806 shown in FIG. 16C show the results of feature point extraction by the image processing apparatus according to the present invention.
- an image 808 shown in FIG. 16A, an image 810 shown in FIG. 16B, and an image 812 shown in FIG. 16C show results of extracting feature points based on pixel values included in the entire frame by a conventional method.
- feature points are extracted from sub-region 822 in image 802, but no feature points are extracted from sub-region 882 in image 808.
- In the image 802, feature points are extracted so as to be more uniformly distributed over the entire image than in the image 808. Specifically, many of the feature points in the image 808 gather on the right side of the image. This non-uniform distribution of feature points affects the results of feature matching and image registration, as described below.
- FIG. 16B shows the result of matching of feature points.
- White lines shown in the image indicate positional deviations between feature points of matched pairs.
- The image 804 shows the result of matching the feature points in two images (for example, the image for the left eye and the image for the right eye) from which the feature points were extracted using the image processing apparatus according to the present invention.
- an image 810 shows the result of matching of feature points in two images in which feature points are extracted using the prior art.
- the matching results in image 804 are more uniformly distributed in the image than in image 810, which shows the results of matching with feature points extracted by the conventional method.
- the feature point pair 842 indicates one pair of feature points matched between the reference image and the target image.
- FIG. 16C shows an image obtained by superimposing two images whose deviations have been corrected by the pair of matched feature points.
- the image 806 is an image obtained by superimposing two images aligned using the image processing apparatus according to the present invention.
- an image 812 is an image obtained by superimposing two images aligned using the conventional technique.
- In the image 806, the result of registration based on feature point extraction according to the present invention is reasonable and consistent throughout the image: distant objects have small parallaxes and near objects have larger parallaxes.
- In the image 812, the result of image registration based on conventional feature point extraction is inaccurate and inconsistent across the image; specifically, distant objects have large disparities. Thus, registration of images according to the present invention yields high-quality results in many cases. This shows that the uniform distribution of feature points extracted by the feature point extraction method of the image processing apparatus described in the present invention is useful.
- In the above description, the image processing apparatus first divides the acquired reference image and target image into sub-regions. However, before extracting the feature points for each sub-region, the image processing apparatus may extract feature points for the entire single image and divide the image into a plurality of sub-regions only when the distribution of the extracted feature points is biased. For example, referring to FIG. 6, before the division unit 102 divides each of the reference image and the target image into a plurality of regions in step S302, the extraction unit 104 extracts feature points from the entire image region of each of the reference image and the target image. After that, the extraction unit 104 may perform the processes from step S302 onward only if the deviation of the distribution positions of the extracted feature points in each image is equal to or more than a predetermined threshold.
- In the above description, the extraction unit compares the number of feature points extracted for each sub-region with the predetermined number N, and adjusts the feature point threshold so that the difference is equal to or less than a predetermined value. However, the extraction unit may adjust the feature point threshold by another method, as long as it can extract feature points such that the degree of variation, among the sub-regions, in the number of feature points extracted for each sub-region is equal to or less than a predetermined value.
- For example, the extraction unit may extract feature points from a first sub-region and a second sub-region among the plurality of sub-regions such that the difference between the number of feature points extracted from the first sub-region and the number of feature points extracted from the second sub-region, which is different from the first sub-region, is equal to or less than a predetermined threshold.
- Alternatively, in each of the plurality of sub-regions, the extraction unit may extract feature points in a number such that the difference between the number of feature points extracted from the sub-region and the predetermined number N is equal to or less than a predetermined value.
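The "degree of variation" criterion above admits several concrete tests; one simple possibility, sketched here as an assumption (the patent does not fix the statistic), is the spread between the best- and worst-populated sub-regions:

```python
def counts_are_uniform(counts, max_spread=5):
    """Degree-of-variation test: the extraction is considered uniform
    when the gap between the largest and smallest per-sub-region
    feature-point counts is at most `max_spread`."""
    return max(counts) - min(counts) <= max_spread
```

A standard deviation or a per-region comparison against N would serve equally well as the variation measure.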
- FIG. 17 is a block diagram showing the hardware configuration of a computer system for realizing the image processing apparatus according to the present invention.
- The image processing apparatus according to the present invention is realized by a computer 34, a keyboard 36 and a mouse 38 for giving instructions to the computer 34, and a display 32 for presenting information such as the calculation results of the computer 34.
- A program implementing the processes performed by the image processing apparatus according to the present invention is stored on a CD-ROM 42, which is a computer-readable medium, and is read by the CD-ROM device 40. Alternatively, it is read by the communication modem 52 through a computer network.
- the computer 34 includes a central processing unit (CPU) 44, a read only memory (ROM) 46, a random access memory (RAM) 48, a hard disk 50, a communication modem 52, and a bus 54.
- the CPU 44 executes the program read via the CD-ROM device 40 or the communication modem 52.
- the ROM 46 stores programs and data necessary for the operation of the computer 34.
- the RAM 48 stores data such as parameters at the time of program execution.
- the hard disk 50 stores programs, data, and the like.
- the communication modem 52 communicates with other computers via a computer network.
- The bus 54 connects the CPU 44, the ROM 46, the RAM 48, the hard disk 50, the communication modem 52, the display 32, the keyboard 36, the mouse 38, and the CD-ROM device 40 to one another.
- The system LSI is a super-multifunctional LSI manufactured by integrating a plurality of components on one chip; more specifically, it is a computer system including a microprocessor, a ROM, a RAM, and the like.
- a computer program is stored in the RAM.
- the system LSI achieves its functions by the microprocessor operating according to the computer program.
- IC card or module is a computer system including a microprocessor, a ROM, a RAM, and the like.
- the IC card or module may include the above-described ultra-multifunctional LSI.
- the IC card or module achieves its functions by the microprocessor operating according to the computer program. This IC card or this module may be tamper resistant.
- the present invention may be the method described above.
- The present invention may also be a computer program that realizes these methods on a computer, or a digital signal composed of the computer program.
- The present invention may also be a computer-readable recording medium on which the computer program or the digital signal is recorded, such as a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a DVD-ROM, a DVD-RAM, a BD (Blu-ray Disc (registered trademark)), a memory card such as a USB memory or an SD card, or a semiconductor memory. Further, the present invention may be the digital signal recorded on these recording media.
- the computer program or the digital signal may be transmitted via a telecommunications line, a wireless or wired communication line, a network represented by the Internet, data broadcasting, and the like.
- the present invention may be a computer system comprising a microprocessor and a memory, wherein the memory stores the computer program, and the microprocessor operates according to the computer program.
- the present invention can be applied to an image processing apparatus, and in particular, to an image processing apparatus that aligns an image.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
Description
FIG. 1 shows how strong features and weak features are distributed in an image. In the present invention, any feature can be used as the image feature; for example, a contrast value for each pixel may be used as the image feature.
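As one hedged sketch of using a per-pixel contrast value as the image feature, the feature amount of a pixel can be taken, for instance, as the gap between the pixel and the mean of its 3x3 neighbourhood; the specific operator and the name `contrast_feature_map` are assumptions for illustration.

```python
import numpy as np

def contrast_feature_map(img):
    """Per-pixel feature amount taken as local contrast: the absolute
    difference between each pixel and the mean of its 3x3 neighbourhood
    (edges replicated)."""
    img = np.asarray(img, float)
    padded = np.pad(img, 1, mode='edge')
    # 3x3 box mean via nine shifted views of the padded image
    local_mean = sum(padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
                     for dy in range(3) for dx in range(3)) / 9.0
    return np.abs(img - local_mean)
```

Pixels whose value in this map meets the feature point threshold would then be the candidate feature points.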
Also in the image processing apparatus 100 according to the first embodiment, in a region where there is almost no difference in contrast, for example, it is difficult to extract appropriate feature points by adjusting the threshold value.
34 computer
36 keyboard
38 mouse
40 CD-ROM device
42 CD-ROM
44 CPU
46 ROM
48 RAM
50 hard disk
52 communication modem
54 bus
100 image processing apparatus
102 division unit
104, 104A extraction unit
106 alignment unit
112 feature amount calculation unit
114, 114A feature point extraction unit
116 adjustment unit
118 virtual feature point generation unit
122 first virtual point generation unit
124 reference information acquisition unit
126 corresponding point acquisition unit
128 second virtual point generation unit
200, 802, 804, 806, 808, 810, 812 image
222, 224, 226 feature point
822, 882 sub-region
842 feature point pair
Claims (11)
- An image processing apparatus for extracting feature points used for alignment that suppresses vertical parallax between a first image and a second image, which are images of the same object captured from different viewpoints, the apparatus comprising:
a division unit that divides each of the first image and the second image into a plurality of sub-regions; and
an extraction unit that performs feature point extraction processing for each of the sub-regions,
wherein the extraction unit performs the feature point extraction processing such that a value indicating the degree of variation, among the plurality of sub-regions, in the number of feature points extracted by the extraction processing is equal to or less than a predetermined value.
- The image processing apparatus according to claim 1, further comprising an alignment unit that aligns the first image and the second image based on the feature points,
wherein the alignment unit matches a feature point included in one of the first image and the second image with a corresponding feature point included in the other image, and
applies a coordinate transformation to at least one of the images so that the difference in vertical coordinate values between the matched feature points becomes smaller.
- The image processing apparatus according to claim 1 or 2, wherein, in each of the plurality of sub-regions, the extraction unit performs the feature point extraction processing for a number of feature points such that the difference between the number of feature points extracted from the sub-region and a predetermined number N is equal to or less than a predetermined value.
- The image processing apparatus according to claim 3, wherein the extraction unit includes:
a feature amount calculation unit that, in each of the plurality of sub-regions, calculates a feature amount corresponding to each of a plurality of pixels included in the sub-region;
a feature point extraction unit that extracts, as a feature point, a pixel among the plurality of pixels whose corresponding feature amount is equal to or more than a predetermined feature point threshold; and
an adjustment unit that, in each of the plurality of sub-regions, adjusts the magnitude of the feature point threshold in the sub-region such that the difference between the number of feature points extracted by the feature point extraction unit and N is equal to or less than the predetermined value.
- The image processing apparatus according to claim 4, wherein the extraction unit further includes a virtual feature point generation unit that generates virtual feature points in a first sub-region among the plurality of sub-regions based on the feature points extracted by the feature point extraction unit, and
the virtual feature point generation unit generates, in the first sub-region, virtual feature points in a number such that the difference between N and the sum of the number of extracted feature points and the number of virtual feature points is equal to or less than the predetermined value.
- The image processing apparatus according to claim 5, wherein the virtual feature point generation unit includes:
a first virtual point generation unit that generates a first virtual point, which is a virtual feature point in the first image, based on a first feature point and a second feature point, which are feature points included in the first image;
a reference information acquisition unit that acquires reference information including each of the distance between the first virtual point and the first feature point and the distance between the first virtual point and the second feature point in the first image;
a corresponding point acquisition unit that acquires a third feature point, which is a point in the second image corresponding to the first feature point, and a fourth feature point, which is a point in the second image corresponding to the second feature point; and
a second virtual point generation unit that generates a second virtual point as a virtual feature point corresponding to the first virtual point in the second image by referring to the reference information.
- The image processing apparatus according to claim 3, wherein the extraction unit includes:
a feature amount calculation unit that, in each of the plurality of sub-regions, calculates a feature amount corresponding to each of a plurality of pixels included in the sub-region;
a feature point extraction unit that extracts, as a feature point, a pixel among the plurality of pixels whose corresponding feature amount is equal to or more than a predetermined feature point threshold; and
an adjustment unit that, in each of the plurality of sub-regions, adjusts the magnitude of the image contrast in the sub-region such that the difference between the number of feature points extracted by the feature point extraction unit and N is equal to or less than the predetermined value.
- The image processing apparatus according to claim 1, wherein the extraction unit performs feature point extraction processing on a first sub-region included in the plurality of sub-regions and a second sub-region different from the first sub-region such that the difference between the number of feature points extracted from the first sub-region and the number of feature points extracted from the second sub-region is equal to or less than a predetermined threshold.
- An image processing method for extracting feature points used for alignment that suppresses vertical parallax between a first image and a second image, which are images of the same object captured from different viewpoints, the method comprising:
a division step of dividing each of the first image and the second image into a plurality of sub-regions; and
an extraction step of performing feature point extraction processing for each of the sub-regions,
wherein, in the extraction step, the feature point extraction processing is performed such that the degree of variation, among the plurality of sub-regions, in the number of feature points extracted by the extraction processing is equal to or less than a predetermined value.
- A program causing a computer to execute the method according to claim 9.
- An integrated circuit for extracting feature points used for alignment that suppresses vertical parallax between a first image and a second image, which are images of the same object captured from different viewpoints, the integrated circuit comprising:
a division unit that divides each of the first image and the second image into a plurality of sub-regions; and
an extraction unit that performs feature point extraction processing for each of the sub-regions,
wherein the extraction unit performs the feature point extraction processing such that the degree of variation, among the plurality of sub-regions, in the number of feature points extracted by the extraction processing is equal to or less than a predetermined value.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/583,646 US9070042B2 (en) | 2011-01-13 | 2012-01-11 | Image processing apparatus, image processing method, and program thereof |
JP2012519262A JP5954712B2 (ja) | 2011-01-13 | 2012-01-11 | 画像処理装置、画像処理方法、及びそのプログラム |
CN201280000903.9A CN102859555B (zh) | 2011-01-13 | 2012-01-11 | 图像处理装置及图像处理方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011-004937 | 2011-01-13 | ||
JP2011004937 | 2011-01-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012096163A1 true WO2012096163A1 (ja) | 2012-07-19 |
Family
ID=46507072
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/000106 WO2012096163A1 (ja) | 2011-01-13 | 2012-01-11 | 画像処理装置、画像処理方法、及びそのプログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US9070042B2 (ja) |
JP (1) | JP5954712B2 (ja) |
CN (1) | CN102859555B (ja) |
WO (1) | WO2012096163A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014010263A1 (ja) * | 2012-07-11 | 2014-01-16 | オリンパス株式会社 | 画像処理装置及び画像処理方法 |
EP2741485A3 (en) * | 2012-12-10 | 2014-09-10 | LG Electronics, Inc. | Input device having a scan function and image processing method thereof |
JP2014229030A (ja) * | 2013-05-21 | 2014-12-08 | キヤノン株式会社 | 画像処理装置、画像処理方法およびプログラム |
JP2015179426A (ja) * | 2014-03-19 | 2015-10-08 | 富士通株式会社 | 情報処理装置、パラメータの決定方法、及びプログラム |
JP2015191622A (ja) * | 2014-03-28 | 2015-11-02 | 富士重工業株式会社 | 車外環境認識装置 |
KR20180136057A (ko) * | 2017-06-14 | 2018-12-24 | 현대모비스 주식회사 | 어라운드 뷰 모니터링 시스템의 카메라 각도 추정 방법 |
JP7341712B2 (ja) | 2018-05-09 | 2023-09-11 | キヤノン株式会社 | 画像処理装置、画像処理方法、撮像装置、およびプログラム |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5746550B2 (ja) * | 2011-04-25 | 2015-07-08 | キヤノン株式会社 | 画像処理装置、画像処理方法 |
US8666169B2 (en) * | 2011-10-24 | 2014-03-04 | Hewlett-Packard Development Company, L.P. | Feature descriptors |
EP2677733A3 (en) * | 2012-06-18 | 2015-12-09 | Sony Mobile Communications AB | Array camera imaging system and method |
EP3331231A1 (en) * | 2013-08-28 | 2018-06-06 | Ricoh Company Ltd. | Image processing apparatus, image processing method, and imaging system |
TW201531960A (zh) * | 2013-11-14 | 2015-08-16 | Sicpa Holding Sa | 用於驗證產品的圖像分析 |
KR102113813B1 (ko) * | 2013-11-19 | 2020-05-22 | 한국전자통신연구원 | 정합 쌍을 이용한 신발 영상 검색 장치 및 방법 |
CN105224582B (zh) * | 2014-07-03 | 2018-11-09 | 联想(北京)有限公司 | 信息处理方法和设备 |
US9659384B2 (en) * | 2014-10-03 | 2017-05-23 | EyeEm Mobile GmbH. | Systems, methods, and computer program products for searching and sorting images by aesthetic quality |
KR102281184B1 (ko) | 2014-11-20 | 2021-07-23 | 삼성전자주식회사 | 영상 보정 방법 및 장치 |
US10931933B2 (en) * | 2014-12-30 | 2021-02-23 | Eys3D Microelectronics, Co. | Calibration guidance system and operation method of a calibration guidance system |
CN105447857B (zh) * | 2015-11-17 | 2018-05-04 | 电子科技大学 | 脉冲涡流红外热图像的特征提取方法 |
CN105574873B (zh) * | 2015-12-16 | 2018-05-22 | 西安空间无线电技术研究所 | 一种数量可控的图像特征点检测方法 |
CN105740802A (zh) * | 2016-01-28 | 2016-07-06 | 北京中科慧眼科技有限公司 | 基于视差图的障碍物检测方法和装置及汽车驾驶辅助系统 |
US10453204B2 (en) * | 2016-12-06 | 2019-10-22 | Adobe Inc. | Image alignment for burst mode images |
CN106910210B (zh) * | 2017-03-03 | 2018-09-11 | 百度在线网络技术(北京)有限公司 | 用于生成图像信息的方法和装置 |
WO2018198025A1 (fr) | 2017-04-24 | 2018-11-01 | Patek Philippe Sa Geneve | Procédé d'identification d'une pièce d'horlogerie |
CN108717069B (zh) * | 2018-05-29 | 2020-08-11 | 电子科技大学 | 一种基于行变步长分割的高压容器热成像缺陷检测方法 |
CN110660090B (zh) * | 2019-09-29 | 2022-10-25 | Oppo广东移动通信有限公司 | 主体检测方法和装置、电子设备、计算机可读存储介质 |
FI20196125A1 (en) * | 2019-12-23 | 2021-06-24 | Truemed Oy | A method for identifying the authenticity of an object |
CN113298187B (zh) * | 2021-06-23 | 2023-05-12 | 展讯通信(上海)有限公司 | 图像处理方法及装置、计算机可读存储介质 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1115951A (ja) * | 1997-06-24 | 1999-01-22 | Sharp Corp | ずれ検出装置および画像合成装置 |
WO2004077356A1 (ja) * | 2003-02-28 | 2004-09-10 | Fujitsu Limited | 画像結合装置、画像結合方法 |
JP2007102458A (ja) * | 2005-10-04 | 2007-04-19 | Yamaguchi Univ | 画像処理による注目部分を自動描出する方法及びそのための装置並びにプログラムを記録した記録媒体 |
Family Cites Families (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2679423B2 (ja) * | 1991-02-05 | 1997-11-19 | 日本ビクター株式会社 | 多次元画像圧縮伸張方式 |
EP0692772A1 (fr) | 1994-07-12 | 1996-01-17 | Laboratoires D'electronique Philips S.A.S. | Procédé et dispositif pour détecter des points clés situés sur le contour d'un objet |
KR0181029B1 (ko) | 1995-03-15 | 1999-05-01 | 배순훈 | 에지를 이용한 특징점 선정장치 |
JPH09282080A (ja) * | 1996-04-16 | 1997-10-31 | Canon Inc | 情報入力方法とその装置 |
JP3534551B2 (ja) * | 1996-09-20 | 2004-06-07 | シャープ株式会社 | 動き検出装置 |
US6215914B1 (en) * | 1997-06-24 | 2001-04-10 | Sharp Kabushiki Kaisha | Picture processing apparatus |
JP2000331160A (ja) * | 1999-05-24 | 2000-11-30 | Nec Corp | 画像マッチング装置、画像マッチング方法及び画像マッチング用プログラムを記録した記録媒体 |
US6873723B1 (en) * | 1999-06-30 | 2005-03-29 | Intel Corporation | Segmenting three-dimensional video images using stereo |
US7103211B1 (en) * | 2001-09-04 | 2006-09-05 | Geometrix, Inc. | Method and apparatus for generating 3D face models from one camera |
US20050196070A1 (en) | 2003-02-28 | 2005-09-08 | Fujitsu Limited | Image combine apparatus and image combining method |
EP1840832A4 (en) * | 2005-01-19 | 2013-02-06 | Nec Corp | PRINTING INFORMATION RECORDING APPARATUS, METHOD AND PROGRAM THEREOF, FILTERING SYSTEM |
JP4961850B2 (ja) * | 2006-06-15 | 2012-06-27 | ソニー株式会社 | 動き検出方法、動き検出方法のプログラム、動き検出方法のプログラムを記録した記録媒体及び動き検出装置 |
JP4321645B2 (ja) * | 2006-12-08 | 2009-08-26 | ソニー株式会社 | 情報処理装置および情報処理方法、認識装置および情報認識方法、並びに、プログラム |
JP4967666B2 (ja) * | 2007-01-10 | 2012-07-04 | オムロン株式会社 | 画像処理装置および方法、並びに、プログラム |
JP4989308B2 (ja) * | 2007-05-16 | 2012-08-01 | キヤノン株式会社 | 画像処理装置及び画像検索方法 |
US8306366B2 (en) | 2007-08-23 | 2012-11-06 | Samsung Electronics Co., Ltd. | Method and apparatus for extracting feature points from digital image |
JP4362528B2 (ja) * | 2007-09-10 | 2009-11-11 | シャープ株式会社 | 画像照合装置、画像照合方法、画像データ出力処理装置、プログラム、及び記録媒体 |
US8260061B2 (en) * | 2007-09-21 | 2012-09-04 | Sharp Kabushiki Kaisha | Image data output processing apparatus and image data output processing method |
EP2211302A1 (en) * | 2007-11-08 | 2010-07-28 | Nec Corporation | Feature point arrangement checking device, image checking device, method therefor, and program |
JP4852591B2 (ja) * | 2008-11-27 | 2012-01-11 | 富士フイルム株式会社 | Stereoscopic image processing device, method, recording medium, and stereoscopic imaging device |
JP5430138B2 (ja) * | 2008-12-17 | 2014-02-26 | 株式会社トプコン | Shape measuring device and program |
JP5166230B2 (ja) * | 2008-12-26 | 2013-03-21 | 富士フイルム株式会社 | Image processing device, method, and program |
JP4752918B2 (ja) * | 2009-01-16 | 2011-08-17 | カシオ計算機株式会社 | Image processing device, image collation method, and program |
JP5363878B2 (ja) * | 2009-06-02 | 2013-12-11 | 株式会社トプコン | Stereo image capturing device and method thereof |
JP5269707B2 (ja) * | 2009-07-01 | 2013-08-21 | 富士フイルム株式会社 | Image composition device and method |
US8385689B2 (en) * | 2009-10-21 | 2013-02-26 | MindTree Limited | Image alignment using translation invariant feature matching |
US8791996B2 (en) * | 2010-03-31 | 2014-07-29 | Aisin Aw Co., Ltd. | Image processing system and position measurement system |
JP5661359B2 (ja) * | 2010-07-16 | 2015-01-28 | キヤノン株式会社 | Image processing device, image processing method, and program |
US8571350B2 (en) * | 2010-08-26 | 2013-10-29 | Sony Corporation | Image processing system with image alignment mechanism and method of operation thereof |
US8494254B2 (en) * | 2010-08-31 | 2013-07-23 | Adobe Systems Incorporated | Methods and apparatus for image rectification for stereo display |
TWI433530B (zh) * | 2010-11-01 | 2014-04-01 | Ind Tech Res Inst | Photography system and method with stereoscopic image photographing guidance, and automatic adjustment method |
US8554016B2 (en) * | 2010-11-10 | 2013-10-08 | Raytheon Company | Image registration system and method for registering images for deformable surfaces |
US8903133B2 (en) * | 2011-02-21 | 2014-12-02 | Nissan Motor Co., Ltd. | Periodic stationary object detection system and periodic stationary object detection method |
JP5768684B2 (ja) * | 2011-11-29 | 2015-08-26 | 富士通株式会社 | Stereo image generation device, stereo image generation method, and computer program for stereo image generation |
KR20130127868A (ko) * | 2012-05-15 | 2013-11-25 | 삼성전자주식회사 | Method for finding point correspondences, device capable of performing the method, and system including the same |
- 2012-01-11 CN CN201280000903.9A patent/CN102859555B/zh active Active
- 2012-01-11 WO PCT/JP2012/000106 patent/WO2012096163A1/ja active Application Filing
- 2012-01-11 US US13/583,646 patent/US9070042B2/en active Active
- 2012-01-11 JP JP2012519262A patent/JP5954712B2/ja active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1115951A (ja) * | 1997-06-24 | 1999-01-22 | Sharp Corp | ずれ検出装置および画像合成装置 |
WO2004077356A1 (ja) * | 2003-02-28 | 2004-09-10 | Fujitsu Limited | 画像結合装置、画像結合方法 |
JP2007102458A (ja) * | 2005-10-04 | 2007-04-19 | Yamaguchi Univ | 画像処理による注目部分を自動描出する方法及びそのための装置並びにプログラムを記録した記録媒体 |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014010263A1 (ja) * | 2012-07-11 | 2014-01-16 | オリンパス株式会社 | Image processing device and image processing method |
US9881227B2 (en) | 2012-07-11 | 2018-01-30 | Olympus Corporation | Image processing apparatus and method |
EP2741485A3 (en) * | 2012-12-10 | 2014-09-10 | LG Electronics, Inc. | Input device having a scan function and image processing method thereof |
US9185261B2 (en) | 2012-12-10 | 2015-11-10 | Lg Electronics Inc. | Input device and image processing method thereof |
JP2014229030A (ja) * | 2013-05-21 | 2014-12-08 | キヤノン株式会社 | Image processing device, image processing method, and program |
JP2015179426A (ja) * | 2014-03-19 | 2015-10-08 | 富士通株式会社 | Information processing device, parameter determination method, and program |
JP2015191622A (ja) * | 2014-03-28 | 2015-11-02 | 富士重工業株式会社 | Vehicle exterior environment recognition device |
KR20180136057A (ko) * | 2017-06-14 | 2018-12-24 | 현대모비스 주식회사 | Method for estimating camera angle of an around view monitoring system |
KR102325690B1 (ko) * | 2017-06-14 | 2021-11-15 | 현대모비스 주식회사 | Method for estimating camera angle of an around view monitoring system |
JP7341712B2 (ja) | 2018-05-09 | 2023-09-11 | キヤノン株式会社 | Image processing device, image processing method, imaging device, and program |
Also Published As
Publication number | Publication date |
---|---|
CN102859555B (zh) | 2016-04-20 |
US9070042B2 (en) | 2015-06-30 |
US20130004079A1 (en) | 2013-01-03 |
JP5954712B2 (ja) | 2016-07-20 |
CN102859555A (zh) | 2013-01-02 |
JPWO2012096163A1 (ja) | 2014-06-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2012096163A1 (ja) | Image processing device, image processing method, and program therefor | |
JP6767743B2 (ja) | Image correction method and apparatus |
RU2612378C1 (ru) | Method for replacing objects in a video stream |
KR101706216B1 (ko) | Apparatus and method for dense three-dimensional image reconstruction |
EP2549762A1 (en) | Stereovision-image position matching apparatus, stereovision-image position matching method, and program therefor | |
JP6570296B2 (ja) | Image processing device, image processing method, and program |
EP2561683B1 (en) | Image scaling | |
US10116917B2 (en) | Image processing apparatus, image processing method, and storage medium | |
US20140340486A1 (en) | Image processing system, image processing method, and image processing program | |
JP2010152521A (ja) | 画像立体化処理装置及び方法 | |
WO2013035457A1 (ja) | 立体画像処理装置、立体画像処理方法、及びプログラム | |
US20080226159A1 (en) | Method and System For Calculating Depth Information of Object in Image | |
KR102362345B1 (ko) | Image processing method and apparatus |
KR101797035B1 (ko) | Method for converting an overlay area into a 3D image and apparatus therefor |
US10096116B2 (en) | Method and apparatus for segmentation of 3D image data | |
CN113313707A (zh) | 原始图像处理方法、装置、设备及可读存储介质 | |
KR102516358B1 (ko) | 영상 처리 방법 및 영상 처리 장치 | |
KR20160101762A (ko) | 색상 정보를 활용한 자동 정합·파노라믹 영상 생성 장치 및 방법 | |
US20130208976A1 (en) | System, method, and computer program product for calculating adjustments for images | |
KR101196573B1 (ko) | Brightness compensation method for 3D stereoscopic images using region segmentation |
JP2014191430A (ja) | Image processing device and image processing method |
JP6131256B2 (ja) | Video processing device and video processing method thereof |
JP6131256B6 (ja) | Video processing device and video processing method thereof |
KR20140118370A (ko) | Stereoscopic image production system and multi-view image alignment method |
CN117689609A (zh) | Disparity acquisition method and apparatus, electronic device, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 201280000903.9; Country of ref document: CN |
| WWE | Wipo information: entry into national phase | Ref document number: 2012519262; Country of ref document: JP |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12734461; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 13583646; Country of ref document: US |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 12734461; Country of ref document: EP; Kind code of ref document: A1 |