WO2012133371A1 - Imaging position and imaging direction estimation apparatus, imaging apparatus, imaging position and imaging direction estimation method, and program - Google Patents
- Publication number
- WO2012133371A1 (PCT/JP2012/057866, JP2012057866W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- imaging
- depth
- estimation
- image
- estimated
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Definitions
- the present invention relates to an imaging position and imaging direction estimation apparatus (imaging position direction estimation apparatus), an imaging apparatus, an imaging position and imaging direction estimation method (imaging position direction estimation method), and a program, and in particular to ones that do not require landmark positioning information.
- as methods of estimating the self-position at the time of imaging based on an image captured by a camera, a method that estimates it from a plurality of landmarks detected in the image (Patent Documents 1 and 2) and a method that collates the image with a group of previously captured images to which positions and directions have been assigned (Non-Patent Document 1) have been proposed.
- the self-position is information indicating the position at the time of imaging and the imaging direction.
- Patent Document 1 discloses a technique for estimating a self-position based on position information obtained by positioning a building or a sign used as a landmark in advance with a high-precision GPS (Global Positioning System) device and camera parameters of the imaging device.
- the estimation method obtains the angle per pixel from the camera parameters of the camera used for imaging, and defines a constraint circle on which the self-position may exist from the maximum and minimum included angles between landmarks. A score is then calculated using a model that defines the likelihood of the included angle, the score calculated for each landmark is accumulated at each coordinate position, and the coordinate position with the maximum score is estimated as the self-position.
- Patent Document 2 discloses a method for estimating a self-position in consideration of the degree of coincidence when a landmark is detected.
- the degree of coincidence between an object in a captured image and a landmark is calculated using a database storing landmark positions and feature quantities on a map. A candidate range for the self-position is then narrowed in descending order of landmark coincidence. Thereby, when a plurality of landmarks are extracted, the position error of landmarks with low coincidence can be suppressed, and the self-position can be estimated with high accuracy.
- Non-Patent Document 1 discloses a method for estimating a self-position by associating an image database with an input image.
- the estimation method generates a panoramic image to which a position and a direction are assigned in advance for each intersection.
- the features extracted from the input image are associated with the features of the panoramic image group, and votes are cast for the associated panoramic images.
- the position of the panoramic image that receives the most votes is determined as the self-position.
- a homography is then calculated between the input image and the selected panoramic image; the input image is projected onto the panoramic image using the calculated homography, and the direction is determined from the projection center.
- the position is the position where the panoramic image is captured.
- in Patent Document 1, since the camera parameters of the imaging apparatus are assumed to be known, the position estimation accuracy deteriorates unless the photographer performs calibration to obtain the camera parameters. Further, in Patent Documents 1 and 2, since the landmark positions must be measured with high-precision GPS or the like, the work of measuring the position, width, and height of each landmark is laborious and costly.
- in Non-Patent Document 1, since the input image is associated with the panoramic images stored in the database, images for generating the panoramic images must be captured densely, which is laborious. Furthermore, since the imaging position of the panoramic image stored in the database is taken as the self-position, the self-position estimation accuracy deteriorates as the imaging position of the input image moves away from the imaging position of the panoramic image.
- the object of the present invention is to provide an imaging position and imaging direction estimation apparatus, an imaging apparatus, an imaging position and imaging direction estimation method, and a program that require no landmark positioning information and that can estimate the imaging position and imaging direction even when the position at which the photographer captured the image (hereinafter referred to as the imaging position; corresponding to the above-mentioned self-position) is far from the images to which positions and directions have been assigned in advance.
- the imaging position and imaging direction estimation apparatus of the present invention includes: a region determination unit that determines a plurality of regions to be associated between an image captured by the imaging apparatus and a predetermined image; a depth estimation unit that estimates depth estimation information corresponding to the depth of each of the plurality of regions; and an imaging position and imaging direction estimation unit that, based on the depth estimation information estimated by the depth estimation unit, estimates the imaging direction of the imaging apparatus according to regions with a large depth and the imaging position of the imaging apparatus according to regions with a small depth.
- the region determination unit determines a plurality of regions to be associated between the image captured by the imaging device and the predetermined image, and the depth estimation unit estimates depth estimation information corresponding to the depth of each of the plurality of regions. The imaging position and imaging direction estimation unit then estimates, based on the depth estimation information estimated by the depth estimation unit, the imaging direction of the imaging device according to regions with a large depth and the imaging position of the imaging device according to regions with a small depth. Therefore, by assigning a position and a direction to the predetermined image in advance, the imaging position and imaging direction estimation unit can estimate the position and direction of the image captured by the imaging device based on the depth estimation information estimated by the depth estimation unit and the position and direction previously assigned to the predetermined image. That is, by preparing images with assigned positions and directions that are not tied to any specific landmark, the position and direction of the image captured by the imaging device can be estimated. Therefore, landmark positioning information and camera parameter information are not necessarily required.
- since the depth estimation information is obtained by the depth estimation unit, using it makes it possible to estimate the imaging position and imaging direction even when the position at which the image was captured is far from the positions assigned in advance.
- FIG. 4 is a diagram for explaining the operation of acquiring the depth value of a certain feature point from the depth information of an "outdoor" scene in the imaging position and imaging direction estimation apparatus of FIG. 3. A further figure is a block diagram showing an imaging device as another embodiment of the present invention.
- FIG. 1 is a block diagram showing a configuration of an imaging position and imaging direction estimation apparatus (imaging position / direction estimation apparatus) 200 according to the first embodiment of the present invention.
- the imaging position and imaging direction estimation apparatus 200 includes an estimation unit 2.
- the estimation unit 2 includes a region determination unit 210 that determines a plurality of regions to be associated from the images captured by the imaging device, a depth estimation unit 220 that estimates whether each determined region is a distant view or a near view, that is, depth information (depth estimation information) representing the depth of each region, and an imaging position and imaging direction estimation unit 230 that estimates the imaging position and imaging direction using the estimated depth information.
- the imaging position and imaging direction estimation apparatus 200 is an apparatus that estimates the position and direction of a query image based on a query image (inquiry image) input from the image input apparatus 10.
- the imaging position and imaging direction estimation apparatus 200 includes, for example, a CPU (Central Processing Unit) and a storage device, and can be configured to operate by the CPU executing a predetermined program stored in the storage device.
- the query image is an image captured by a user to estimate an imaging position using an imaging device such as a digital camera or a video camera.
- the image input device 10 inputs the query image and the image to which the position / direction information is given to the imaging position and imaging direction estimation device 200.
- the image to which the position / direction information is assigned is an image to which the position and the direction are assigned in advance.
- the image to which the position / direction information is added is, for example, one of a plurality of actually captured still images or a frame of a continuously captured moving image, and the assigned position and direction are the position and imaging direction of the imaging device at the time of imaging.
- a plurality of images to which position / direction information has been assigned are input for one or more query images.
- the region determination unit 210 determines a plurality of regions to be associated between the query image input from the image input device 10 and the image to which the position / direction information is given.
- the depth estimation unit 220 estimates whether each region for which association is determined by the region determination unit 210 is a distant view or a near view, that is, information indicating the depth. Note that the estimation result may be expressed as a discrete value of a distant view or a near view, or may be expressed as a continuous value indicating a distant view or a close view. That is, the depth estimation unit 220 estimates depth estimation information corresponding to each depth of the plurality of regions.
- the imaging position and imaging direction estimation unit 230 associates regions with a large depth (distant views) using the depth information of each associated region and estimates the direction from the result, and associates regions with a small depth (near views) and estimates the position from the result. The direction and position information estimated by the imaging position and imaging direction estimation unit 230 is then output to a predetermined apparatus as the imaging position and imaging direction output. That is, the imaging position and imaging direction estimation unit 230 estimates, based on the depth estimation information estimated by the depth estimation unit 220, the imaging direction of the imaging device according to regions with a large depth and the imaging position of the imaging device according to regions with a small depth.
- based on the depth of each region associated between the query image captured by the imaging device and the image to which position / direction information is assigned, the imaging position and imaging direction estimation unit 230 determines for each region whether it is used to estimate the position or the direction. The imaging position and imaging direction estimation unit 230 then estimates the position of the predetermined image corresponding to the regions determined for position estimation as the imaging position, and the direction of the predetermined image corresponding to the regions determined for direction estimation as the imaging direction.
- a region having a large depth means a region whose depth is relatively large among the plurality of regions in one image (that is, larger than the depths of the other regions when compared with them), a region whose depth is absolutely large (that is, larger than a predetermined reference value), or a region whose rank is higher than a predetermined rank when the depth estimation information of the plurality of regions is arranged in descending order. Conversely, a region having a small depth means a region whose depth is relatively small among the plurality of regions in one image, or a region whose depth is absolutely small.
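The large-depth / small-depth split described above can be sketched as a small Python helper. This is an illustrative sketch only: the patent fixes neither a particular reference value nor a ranking rule, so the median-based relative reference and the optional absolute threshold below are assumptions.

```python
def split_regions_by_depth(depths, threshold=None):
    """Split region depth estimates into 'large depth' (distant view) and
    'small depth' (near view) index groups.

    If an absolute threshold is given, each depth is compared against it;
    otherwise the median is used as a relative reference among the regions
    of one image. Both rules are illustrative choices.
    """
    if threshold is None:
        ordered = sorted(depths)
        threshold = ordered[len(ordered) // 2]  # median as relative reference
    far = [i for i, d in enumerate(depths) if d >= threshold]
    near = [i for i, d in enumerate(depths) if d < threshold]
    return far, near
```

The `far` indices would then feed the direction estimation and the `near` indices the position estimation.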
- the region determination unit 210 determines a region to be associated from the query image 20 and the image 30 to which the position / direction information is given (hereinafter referred to as the image 30 with position / direction). For example, an image portion corresponding to a sign on a road or a signboard of a store is set as an association region. In the example shown in FIG. 2, eight regions 21 to 28 in the query image 20 and eight regions 31 to 38 in the image 30 with position / direction are determined as association regions. In this case, the areas 21 to 28 and the areas 31 to 38 are associated with each other by the imaging position and imaging direction estimation unit 230, respectively.
- the depth estimation unit 220 determines whether each of the association areas 21 to 28 (or the association areas 31 to 38) is a distant view or a near view (that is, estimates the depth of each association area). For example, when the query image 20 is acquired as a moving image, a region whose movement distance between frames is small is determined to be a distant view, and a region whose movement distance between frames is large is determined to be a near view. That is, for example, the movement distance between the region 28 in the n-th frame of the query image 20 (n is a natural number) and the region 28 in the (n+1)-th frame is calculated to make this determination, and likewise the movement distance between the region 22 in the n-th frame and the region 22 in the (n+1)-th frame is calculated.
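The frame-to-frame determination above can be sketched as follows. Representing each association region by its centroid and the specific motion threshold are assumptions made for illustration, not values from the patent.

```python
import math

def classify_by_parallax(centers_frame_n, centers_frame_n1, motion_threshold):
    """For each association region, measure the displacement of its centroid
    between frame n and frame n+1. Small motion suggests a distant view,
    large motion a near view (motion parallax). Sketch only;
    'motion_threshold' is an assumed tuning parameter."""
    labels = []
    for (x0, y0), (x1, y1) in zip(centers_frame_n, centers_frame_n1):
        d = math.hypot(x1 - x0, y1 - y0)  # pixel movement distance
        labels.append("distant" if d < motion_threshold else "near")
    return labels
```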
- the depth estimation by the depth estimation unit 220 is not limited to this method.
- the distant view / near view determination may also be performed by comparing the captured scene with a plurality of imaging scenes registered in advance and referring to the depth values set for each imaging scene according to the comparison result.
- the depth may be estimated based on the regions 31 to 38 in the image 30 with position and direction.
- the imaging position and imaging direction estimation unit 230 associates each region of the query image 20 and the plurality of images 30 with position and direction, and estimates the direction from the distant view association region.
- the imaging position and imaging direction estimation unit 230, for example, compares the regions estimated as distant views between one query image 20 and the plurality of images 30 with position / direction, determines the image 30 with position / direction having a region whose similarity to the estimated region is relatively high (or absolutely high), and estimates the direction set in that image 30 with position / direction as the direction of the query image 20.
- the imaging position and imaging direction estimation unit 230 may instead associate the query image with the plurality of images 30 with position / direction, acquire the directions of the associated images, calculate their average value, and use it as the estimated direction.
- the direction may also be calculated by applying a weight according to the association distance: the shorter the association distance, the larger the weight and the more that direction is emphasized; the longer the association distance, the smaller the weight and the less that direction contributes. This is because the shorter the association distance, the more likely the images were captured from substantially the same composition.
- the imaging position and imaging direction estimation unit 230 estimates the position from the foreground associating area.
- the imaging position and imaging direction estimation unit 230, for example, compares the regions estimated as near views between one query image 20 and the plurality of images 30 with position / direction, determines the image 30 with position / direction having a region whose similarity to the estimated region is relatively high (or absolutely high), and estimates the position set in that image 30 with position / direction as the position of the query image 20.
- the imaging position and imaging direction estimation unit 230 may also correct, using the depth estimated by the depth estimation unit 220, the position set in the image 30 with position / direction having the region with a relatively (or absolutely) high similarity, and then use the corrected position as the estimated position of the query image 20.
- the imaging position and imaging direction estimation unit 230 may instead associate the query image with the plurality of images 30 with position / direction, acquire the positions of the associated images, calculate their average value, and use it as the estimated position. The position may also be calculated by applying a weight according to the association distance: the shorter the association distance, the larger the weight and the more that position is emphasized; the longer the association distance, the smaller the weight and the less that position contributes. This is because the shorter the association distance, the more likely the images were captured from substantially the same composition.
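The distance-weighted averaging described above, which applies equally to positions and to directions, might look like the following sketch. The inverse-distance weight 1/(d + eps) is one illustrative choice of a weight that decreases with association distance; the patent does not specify a formula.

```python
def weighted_estimate(values, assoc_distances, eps=1e-6):
    """Average the positions (or directions) of the associated images with
    position/direction information, weighting each by the inverse of its
    association distance: short distances (near-identical composition)
    count more, long distances less. Illustrative weighting only."""
    weights = [1.0 / (d + eps) for d in assoc_distances]
    total = sum(weights)
    return sum(w * v for w, v in zip(weights, values)) / total
```

With equal distances this reduces to a plain average; a much shorter distance pulls the estimate toward that image's value.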
- in other words, based on the depth of each region associated between the query image captured by the imaging device and the images to which position / direction information is assigned, the imaging position and imaging direction estimation unit 230 determines whether each region is used to estimate the position or the direction, and weights each image to which position / direction information is given according to the distance between the associated regions. The imaging position and imaging direction estimation unit 230 then multiplies the positions of the images corresponding to the regions determined for position estimation by these weights to estimate the imaging position, and may likewise multiply the directions of the images corresponding to the regions determined for direction estimation by these weights to estimate the imaging direction.
- as described above, the region determination unit 210 determines a plurality of regions to be associated between the image captured by the imaging device and a predetermined image, and the depth estimation unit 220 estimates depth information corresponding to the depth of each of the plurality of regions. Then, based on the depth information estimated by the depth estimation unit 220, the imaging position and imaging direction estimation unit 230 estimates the imaging direction of the imaging device according to regions with a large depth and the imaging position of the imaging device according to regions with a small depth. Here, the position and direction are assigned to the image 30 with position / direction in advance.
- the imaging position and imaging direction estimation unit 230 can therefore estimate the position and direction of the query image 20 based on the depth of the regions estimated by the depth estimation unit 220 and the position and direction previously assigned to the images 30 with position / direction. That is, the position and direction of the query image 20 can be estimated simply by preparing images 30 with assigned positions and directions. The position and direction information given to the images 30 need not correspond to any specific landmark, so landmark positioning information and camera parameter information are not necessarily required. In addition, since the depth estimation unit 220 estimates the depth of each region, the estimated depth information makes it possible to estimate the imaging position even when the query image was captured at a position away from the positions at which the images 30 with position / direction were captured.
- FIG. 3 is a block diagram showing the configuration of the imaging position and imaging direction estimation apparatus 100 as the second embodiment of the present invention.
- the imaging position and imaging direction estimation apparatus 100 includes an estimation unit 1 and a data storage unit 140.
- the estimation unit 1 includes a feature extraction unit 110 that extracts features from an image and determines regions for association, a depth estimation unit 120 that estimates depth information of the regions from which features were extracted (hereinafter referred to as feature regions), and an imaging position and imaging direction estimation unit 130 that associates the estimated feature regions with each other and estimates the imaging position and imaging direction according to the depth information and the association result.
- the imaging position and imaging direction estimation apparatus 100 includes, for example, a CPU and a storage device, and can be configured to operate by the CPU executing a predetermined program stored in the storage device. Components that are the same as in the first embodiment are denoted by the same reference numerals.
- the feature extraction unit 110 has a configuration corresponding to the region determination unit 210 in the first embodiment.
- the feature extraction unit 110 performs feature extraction on the query image input from the image input device 10 by predetermined image processing, and determines one or a plurality of regions from which features have been extracted as the regions to be associated by the imaging position and imaging direction estimation unit 130. Feature extraction is performed in the same manner on the images to which position / direction information is given, and one or a plurality of regions from which features are extracted are determined.
- the feature extraction method may be a method of extracting features from feature points in an image such as a SIFT (Scale-Invariant Feature Transform) feature, a SURF (Speeded Up Robust Features) feature, or the like. A method of dividing the grid and extracting the features from the grid may be used.
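The grid-based alternative mentioned above can be sketched as follows. Using the cell-mean intensity as the "feature" is a placeholder for a real descriptor such as SIFT or SURF; the function name and the 2-D-list image representation are assumptions for illustration.

```python
def grid_features(image, rows, cols):
    """Divide the image into a rows x cols grid and extract one feature per
    cell. Here the feature is the cell's mean intensity (a stand-in for a
    real descriptor). 'image' is a 2-D list of pixel intensities."""
    h, w = len(image), len(image[0])
    ch, cw = h // rows, w // cols  # cell height and width
    feats = []
    for r in range(rows):
        for c in range(cols):
            cell = [image[y][x]
                    for y in range(r * ch, (r + 1) * ch)
                    for x in range(c * cw, (c + 1) * cw)]
            feats.append(sum(cell) / len(cell))
    return feats
```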
- the depth estimation unit 120 gives depth estimation information stored in a first depth information storage unit 151 to be described later to the feature region extracted by the feature extraction unit 110.
- this depth estimation information, that is, the depth value depending on the position in the image, is used when reading, from a second depth information storage unit 152 described later, the parameters of the contribution distribution used by the imaging position and imaging direction estimation unit 130 to estimate the position (position estimation contribution distribution) and of the contribution distribution used to estimate the direction (direction estimation contribution distribution).
- the imaging position and imaging direction estimation unit 130 associates the feature area of the query image with the feature area of the image to which the position / direction information is given, and calculates a movement amount between the associated feature areas.
- the amount of movement means the distance and direction between feature regions.
- based on the calculated movement amount between feature regions, the imaging position and imaging direction estimation unit 130 selects, for each image to which position / direction information is given, the distributions of the position movement amount and the direction deviation amount from the imaging position and imaging direction estimation amount storage unit 160, in which these distributions are stored.
- the imaging position and imaging direction estimation unit 130 estimates the imaging position and imaging direction based on the position estimation contribution distribution and direction estimation contribution distribution read from the second depth information storage unit 152 and the position movement amount / direction deviation amount distributions of all the feature regions.
- the data storage unit 140 includes a depth information storage unit 150 and an imaging position and imaging direction estimation amount storage unit 160.
- the depth information storage unit 150 includes a first depth information storage unit 151 and a second depth information storage unit 152.
- the first depth information storage unit 151 stores a depth value corresponding to the imaging scene.
- the depth value may be a relative value indicating the magnitude relationship of the depth, or may be an absolute value such as a distance value.
- the second depth information storage unit 152 stores parameters of the position estimation contribution distribution and parameters of the direction estimation contribution distribution, and these parameters are selected according to the depth value. For example, the position estimation contribution distribution has a higher contribution value as the depth becomes smaller, and the direction estimation contribution distribution has a lower contribution degree as the depth becomes smaller.
- the imaging position and imaging direction estimation amount storage unit 160 stores parameters for estimating the position movement amount and the direction deviation amount based on the movement amount between the feature regions extracted from the query image and those extracted from the images to which positions and directions are given, and on the depth information.
- the feature extraction unit 110 inputs a query image captured at a place where the imaging position and the imaging direction are to be estimated from the image input device 10 (step S1). Next, the feature extraction unit 110 extracts features from the image (step S2).
- the depth estimation unit 120 refers to the depth information storage unit 150, and includes the depth value, the position estimation contribution distribution parameter and the direction estimation contribution distribution parameter obtained from the depth value for the extracted feature region. Depth information is given (step S3). That is, the depth estimation unit 120 determines the imaging scene of the query image, and reads the depth information corresponding to the determined imaging scene from the first depth information storage unit 151, thereby estimating the depth estimation information.
- the imaging position and imaging direction estimation unit 130 refers to the depth information storage unit 150 and the imaging position and imaging direction estimation amount storage unit 160, associates the feature regions of the query image with those of the images to which position / direction information was assigned in advance, obtains the movement amount between the associated feature regions from the association result, and estimates the imaging position and imaging direction using the depth information (step S4).
- specifically, the imaging position and imaging direction estimation unit 130 reads the position estimation contribution distribution and the direction estimation contribution distribution from the second depth information storage unit 152 based on the depth estimation information estimated by the depth estimation unit 120.
- the imaging position and imaging direction estimation unit 130 also reads the position movement amount distribution and the direction deviation amount distribution from the imaging position and imaging direction estimation amount storage unit 160 based on the distance and direction between the regions associated by the feature extraction unit 110. The imaging position and imaging direction estimation unit 130 then estimates the imaging position based on the read position estimation contribution distribution and position movement amount distribution, and the imaging direction based on the read direction estimation contribution distribution and direction deviation amount distribution.
- the first depth information storage unit 151 stores a depth value corresponding to the imaging scene in advance.
- in a general cityscape composition, the upper part of the image shows objects far from the imaging position and the lower part shows nearby objects such as the ground, so the depth can be defined accordingly.
- a value in which the depth value increases from the bottom to the top of the image is stored as depth information (see FIG. 5B).
- FIG. 5B is a diagram showing an example of the depth information corresponding to the image in FIG. 5A in gray scale; whiter areas indicate larger depth values.
- alternatively, the center of the image can be defined as far from the imaging position and the remainder as near to it (see FIG. 5D).
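A scene-dependent depth prior such as the cityscape map of FIG. 5B could be generated as below. The linear bottom-to-top ramp and the [0, 1] relative-depth scale are illustrative assumptions; the stored depth values may equally be absolute distances.

```python
def cityscape_depth_prior(height, width):
    """Depth prior for a 'cityscape' scene: depth grows from the bottom of
    the image (ground, near) to the top (sky/buildings, far), matching the
    depth map described above. Returns a height x width 2-D list of
    relative depths in [0, 1]; the linear ramp is an illustrative choice."""
    denom = max(height - 1, 1)
    return [[1.0 - r / denom] * width for r in range(height)]
```

A scene with a far center (FIG. 5D) would use a radial rather than vertical prior, but the lookup mechanism is the same.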
- when depth information is assigned, the imaging scene of the image may be determined and depth information corresponding to that imaging scene acquired, or a fixed value that does not depend on the imaging scene may be used.
- the second depth information storage unit 152 stores parameters of the position estimation contribution distribution and the direction estimation contribution distribution that depend on the depth value.
- the position estimation contribution distribution for a depth value x_a is defined by a Gaussian distribution having the variance σ_a and the coefficient c_a as parameters, and Npc(x_a, σ_a, c_a) is expressed by Equation (1); likewise, the direction estimation contribution distribution Ndc(x_b, σ_b, c_b) is expressed by Equation (2).
- the position estimation contribution distribution Npc has a high value when the depth value is small, and has a low value when the depth value is large.
- the direction estimation contribution distribution Ndc in FIG. 6B shows the opposite tendency. This means that nearby feature regions are used to estimate position and far feature regions are used to estimate direction.
- FIGS. 6A and 6B are diagrams illustrating examples of the position estimation contribution distribution Npc and the direction estimation contribution distribution Ndc, where the horizontal axis represents the depth value and the vertical axis represents the frequency.
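One way to read the two contribution distributions: each is an unnormalized Gaussian over the depth value, with the position contribution centered at a small depth and the direction contribution at a large depth, reproducing the tendency of FIGS. 6A and 6B. The parameter values below are illustrative assumptions, not values from the patent.

```python
import math

def gaussian(x, mean, sigma, c):
    """Unnormalized Gaussian c * exp(-(x - mean)^2 / (2 sigma^2)),
    used here for both contribution distributions."""
    return c * math.exp(-((x - mean) ** 2) / (2.0 * sigma ** 2))

# Illustrative parameters: Npc centered at a small depth x_a,
# Ndc centered at a large depth x_b.
x_a, sigma_a, c_a = 5.0, 3.0, 1.0    # position estimation contribution
x_b, sigma_b, c_b = 40.0, 10.0, 1.0  # direction estimation contribution

def npc(depth):
    return gaussian(depth, x_a, sigma_a, c_a)

def ndc(depth):
    return gaussian(depth, x_b, sigma_b, c_b)

# A near feature contributes mostly to position, a far one to direction.
assert npc(4.0) > ndc(4.0)
assert ndc(45.0) > npc(45.0)
```

This matches the intuition stated above: parallax from nearby features constrains position, while far features move little with translation and mainly constrain direction.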
- the imaging position and imaging direction estimation amount storage unit 160 stores a distribution corresponding to the amount of movement between feature regions.
- the amount of movement means distance and direction.
- the distance between the feature regions is obtained in pixels, and how many meters in the real world one pixel corresponds to depends on the depth.
- moreover, the amount of movement per pixel cannot be determined uniquely because camera parameters vary between camera models. Therefore, a distribution of how many meters one pixel corresponds to at each depth is created in advance, and the parameters of the position movement amount distribution Npm, covering the movement amount and direction between feature regions, are stored (FIG. 7A).
- x a , ⁇ a , c a ) composed of a depth value x a , a variance value ⁇ a , and a coefficient ca is obtained.
- x a , ⁇ a , c a ) is set (FIG. 7B).
- FIGS. 7A and 7B are diagrams schematically illustrating two examples (depth values x_a and x_e) of the position movement amount distribution Npm and the direction deviation amount distribution Ndm, where the horizontal axis represents the amount of movement (or deviation) and the vertical axis represents the frequency.
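The depth dependence of "how many meters one pixel corresponds to" follows from the pinhole camera model: for a focal length f expressed in pixels, one pixel at depth z spans roughly z / f meters. The sketch below builds a per-depth movement amount distribution on that basis; the focal length, the relative spread, and the idea of making the variance grow with depth are illustrative assumptions, not values from the patent.

```python
import math

def meters_per_pixel(depth_m, focal_px=1000.0):
    """Pinhole approximation: the real-world size of one pixel
    grows linearly with depth."""
    return depth_m / focal_px

def npm(movement_m, pixel_shift, depth_m, focal_px=1000.0, rel_sigma=0.2, c=1.0):
    """Position movement amount distribution: an unnormalized Gaussian
    over the real-world movement implied by a pixel shift at a given
    depth. The spread also grows with depth, reflecting the variation
    in camera parameters between models."""
    mean = pixel_shift * meters_per_pixel(depth_m, focal_px)
    sigma = max(rel_sigma * mean, 1e-6)
    return c * math.exp(-((movement_m - mean) ** 2) / (2.0 * sigma ** 2))

# A 10-pixel shift at 20 m depth corresponds to about 0.2 m,
# and the distribution peaks at that movement amount.
assert abs(meters_per_pixel(20.0) * 10 - 0.2) < 1e-9
assert npm(0.2, 10, 20.0) > npm(0.5, 10, 20.0)
```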
- the imaging position and imaging direction estimation device 100 estimates the position from the query image captured by the user.
- the feature extraction unit 110 extracts features from a query image (query image 20 in FIG. 2) and an image to which position / direction information is given (image 30 with position / direction in FIG. 2).
- for example, the depth estimation unit 120 determines the imaging scene of the query image; if the scene is determined to be "outdoor", a depth value x_ai is obtained for each feature region from the depth information of the "outdoor" scene (see FIG. 5A). Note that i represents the index of the feature region in the image. Further, a position estimation contribution distribution Npc and a direction estimation contribution distribution Ndc corresponding to each depth value are read out.
- here, the depth information is obtained for all the feature regions of the query image, but it may instead be obtained for only some of the feature regions.
- the imaging position and imaging direction estimation unit 130 associates corresponding feature points between the query image (query image 20 in FIG. 2) and the image to which the position and direction information is assigned (image 30 with position and direction in FIG. 2), using the features extracted by the feature extraction unit 110.
- for each feature point of the query image, distances are calculated to all the feature points of the image 30 with position and direction, and the feature point with the shortest distance is taken as the associated feature point. Note that the association may be made only when the difference between the shortest and the second-shortest distance is equal to or larger than a certain value.
- FIG. 8 shows an example of the relationship between a plurality of feature points, indicated by circles, and the distances between them, indicated by arrows, in a query image indicated by a rectangular frame, for an outdoor scene such as that shown in FIG. 5A.
- the imaging position and imaging direction estimation unit 130 obtains a position estimation amount and a direction estimation amount.
- for a feature point F_i, since the depth value z_i and the distance value F_i.len are determined, the position estimation amount F_pi for the feature F_i is obtained by the following Equation (3) from the position movement amount distribution N_pmi and the position estimation contribution distribution N_pci.
- similarly, the direction estimation amount F_di for the feature F_i is obtained by the following Equation (4) from the direction deviation amount distribution N_dmi and the direction estimation contribution distribution N_dci.
- the position estimation distribution F_p over all the feature points is obtained by Equation (5) using the position estimation amounts F_pi.
- similarly, the direction estimation distribution F_d is obtained by Equation (6) using the direction estimation amounts F_di. The locations where these distributions are maximized give the imaging position and imaging direction estimation results.
- x and ⁇ represent a direct product.
- ⁇ and c or ⁇ i or c i represent the variance and coefficient of each distribution or estimator.
- the imaging position and imaging direction estimation apparatus 100 can estimate the imaging position and imaging direction in the same manner as the imaging position and imaging direction estimation apparatus 200 described above.
- the imaging position and imaging direction estimation apparatus 100 or 200 according to the present embodiment described above can be applied to uses such as an imaging position and imaging direction estimation apparatus that estimates an imaging position from an image, or a program for implementing an imaging position and imaging direction estimation apparatus on a computer.
- for example, an imaging apparatus 1000 such as a digital camera or a video camera may include the imaging position and imaging direction estimation apparatus 100 or 200 described with reference to FIGS. 1 to 8.
- the imaging apparatus 1000 may further include an imaging unit 300, a recording unit 400, and a captured image storage unit 500.
- the imaging unit 300 includes an optical system such as a lens and an imaging element such as a CCD (Charge Coupled Device) image sensor, and captures an image by converting light received through the optical system into an electrical signal with the imaging element.
- the imaging unit 300 also captures the query image described above.
- the captured image captured by the imaging unit 300 is input to the image input device 10 described with reference to FIG. 1 or FIG.
- the imaging position and imaging direction estimation apparatus 100 or 200 estimates and outputs the imaging position and imaging direction based on the image input to the image input apparatus 10, as described with reference to FIGS. 1 to 8.
- the imaging unit 300 may correspond to the image input device 10 described with reference to FIG. 1 or FIG.
- the imaging position and imaging direction estimation apparatus 100 or 200 estimates the imaging position and the imaging direction based on the captured image captured by the imaging unit 300, as described with reference to FIGS. Output.
- the recording unit 400 registers and stores the captured image captured by the imaging unit 300 in the captured image storage unit 500.
- the recording unit 400 associates the imaging position and imaging direction estimated by the imaging position and imaging direction estimation apparatus 100 or 200 when the captured image captured by the imaging unit 300 is registered in the captured image storage unit 500. Then, the captured image may be registered and stored in the captured image storage unit 500.
- the captured image storage unit 500 is a storage unit that stores captured images, and may be a storage medium such as a flash memory.
- in this way, when the imaging apparatus 1000 includes the imaging position and imaging direction estimation apparatus 100 or 200 according to the present embodiment described above, it can estimate its self-position by estimating the imaging position and the imaging direction based on the captured image.
- the self position is an imaging position and an imaging direction at the time of imaging. Therefore, the imaging apparatus 1000 including the imaging position and imaging direction estimation apparatus 100 or 200 according to the present embodiment can be applied to applications such as positioning in an environment where GPS radio waves are difficult to reach.
- the imaging device 1000 as the self-position detection device may include the imaging position and imaging direction estimation device 100 or 200 and the imaging unit 300. Even in this case, the imaging apparatus 1000 as the self-position detecting apparatus can estimate the self-position as described above.
- alternatively, the image input device 10 included in the imaging position and imaging direction estimation device 100 or 200 described with reference to FIG. 1 or FIG. 3 may itself be the imaging unit 300 described with reference to FIG. 9. Even in this case, the imaging position and imaging direction estimation apparatus 100 or 200 as the self-position detection apparatus can estimate the self-position as described above.
- the configuration of the imaging position and imaging direction estimation apparatus 100 or 200 is not limited to the above. For example, the imaging position and imaging direction estimation apparatus 100 or 200 and the image input apparatus 10 may be configured as a single unit, or the estimation unit 1 (or 2) and the data storage unit 140 may be integrated.
- a part or all of the program executed by the imaging position and imaging direction estimation apparatus 100 or 200 can be distributed via a computer-readable recording medium or a communication line.
- as described above, it is possible to provide an imaging position and imaging direction estimation device that estimates the position and direction at the time of imaging from a captured image without requiring positioning information such as landmarks.
- DESCRIPTION OF SYMBOLS
1, 2 Estimation unit
10 Image input device
20 Query image
30 Image with position and direction
100 Imaging position and imaging direction estimation apparatus
110 Feature extraction unit
120 Depth estimation unit
130 Imaging position and imaging direction estimation unit
140 Data storage unit
150 Depth information storage unit
151 First depth information storage unit
152 Second depth information storage unit
160 Imaging position and imaging direction estimation amount storage unit
200 Imaging position and imaging direction estimation apparatus
210 Region determination unit
220 Depth estimation unit
230 Imaging position and imaging direction estimation unit
1000 Imaging apparatus
Abstract
Description
In that estimation method, the angle per pixel is obtained from the camera parameters of the camera used for imaging, a constraint circle in which the self-position may exist is defined from the maximum and minimum included angles between landmarks, and a score is calculated using a model that defines the likelihood of the included angle. The score calculated for each pair of landmarks is then added at each coordinate position, and the coordinate position with the maximum score is estimated as the self-position.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. FIG. 1 is a block diagram showing the configuration of an imaging position and imaging direction estimation device (imaging position and direction estimation device) 200 as a first embodiment of the present invention.
The query image is, for example, an image captured by a user with an imaging device such as a digital camera or a video camera in order to estimate the imaging position.
Next, a second embodiment of the present invention will be described with reference to FIGS. 3 to 8. FIG. 3 is a block diagram showing the configuration of an imaging position and imaging direction estimation device 100 as the second embodiment of the present invention. The imaging position and imaging direction estimation device 100 according to the present embodiment includes an estimation unit 1 and a data storage unit 140. The estimation unit 1 includes a feature extraction unit 110 that extracts features from an image and determines regions to be associated, a depth estimation unit 120 that estimates depth information of the regions from which features have been extracted (hereinafter referred to as feature regions), and an imaging position and imaging direction estimation unit 130 that associates the estimated feature regions with each other and estimates the imaging position according to the depth information and the association results. The imaging position and imaging direction estimation device 100 includes, for example, a CPU and a storage device, and may be configured to operate by executing on the CPU a predetermined program stored in the storage device. The same reference numerals are used for the same components as those shown in FIG. 1.
This application claims priority based on Japanese Patent Application No. 2011-069251, filed in Japan on March 28, 2011, the contents of which are incorporated herein by reference.
Claims (10)
- A region determination unit that determines a plurality of regions to be associated between an image captured by an imaging device and a predetermined image;
a depth estimation unit that estimates depth estimation information corresponding to a depth of each of the plurality of regions; and
an imaging position and imaging direction estimation unit that, based on the depth estimation information estimated by the depth estimation unit, estimates an imaging direction of the imaging device according to the regions whose depth is large, and estimates an imaging position of the imaging device according to the regions whose depth is small,
an imaging position and direction estimation device comprising the above. - The imaging position and direction estimation device according to claim 1, wherein the predetermined image is an image to which a position and a direction are assigned in advance, and
the imaging position and imaging direction estimation unit estimates, based on the depth estimation information, the imaging direction according to the direction assigned to the predetermined image corresponding to the regions whose depth is large, and estimates the imaging position according to the position assigned to the predetermined image corresponding to the regions whose depth is small. - The imaging position and direction estimation device according to claim 1 or 2, wherein the imaging position and imaging direction estimation unit
determines, based on the depth of a region associated between the image captured by the imaging device and the predetermined image, whether the region is used to estimate the position or to estimate the direction,
estimates, as the imaging position, the position of the predetermined image corresponding to a region determined to be used for position estimation, and
estimates, as the imaging direction, the direction of the predetermined image corresponding to a region determined to be used for direction estimation. - The imaging position and direction estimation device according to any one of claims 1 to 3, wherein the imaging position and imaging direction estimation unit determines, based on the depth of a region associated between the image captured by the imaging device and a plurality of the predetermined images, whether the region is used to estimate the position or to estimate the direction, and weights the predetermined images according to the distance between the associated regions,
multiplies the position of the predetermined image corresponding to a region determined to be used for position estimation by the weight to estimate the imaging position, and
multiplies the direction of the predetermined image corresponding to a region determined to be used for direction estimation by the weight to estimate the imaging direction. - The imaging position and direction estimation device according to any one of claims 1 to 4, further comprising a first depth information storage unit in which depth information according to an imaging scene is stored in advance,
wherein the depth estimation unit
estimates the depth estimation information by determining the imaging scene of the image captured by the imaging device and reading depth information corresponding to the determined imaging scene from the first depth information storage unit. - The imaging position and direction estimation device according to any one of claims 1 to 5, wherein the region determination unit
extracts features from each of the image captured by the imaging device and the predetermined image, and
determines, as the plurality of regions to be associated between the image captured by the imaging device and the predetermined image, the image regions from which the features have been extracted. - The imaging position and direction estimation device according to claim 6, further comprising:
a second depth information storage unit in which a position estimation contribution distribution and a direction estimation contribution distribution according to depth estimation information are stored in advance in association with the depth estimation information; and
an imaging position and direction estimation amount storage unit in which a position movement amount distribution and a direction deviation amount distribution according to a distance and a direction between features are stored in advance in association with the distance and the direction between the features,
wherein the imaging position and direction estimation unit
reads the position estimation contribution distribution and the direction estimation contribution distribution from the second depth information storage unit based on the depth estimation information estimated by the depth estimation unit,
reads the position movement amount distribution and the direction deviation amount distribution from the imaging position and direction estimation amount storage unit based on the distance and direction between the features corresponding to the regions associated by the region determination unit,
estimates the imaging position based on the read position estimation contribution distribution and the read position movement amount distribution, and
estimates the imaging direction based on the read direction estimation contribution distribution and the read direction deviation amount distribution. - An imaging device comprising the imaging position and direction estimation device according to any one of claims 1 to 7. - An imaging position and direction estimation method comprising:
a region determination step of determining a plurality of regions to be associated between an image captured by an imaging device and a predetermined image;
a depth estimation step of obtaining depth estimation information corresponding to a depth of each of the plurality of regions; and
an imaging position and direction estimation step of, based on the depth estimation information estimated in the depth estimation step, estimating an imaging direction of the imaging device according to the regions whose depth is large, and estimating an imaging position of the imaging device according to the regions whose depth is small. - A program for causing a computer to execute:
a region determination step of determining a plurality of regions to be associated between an image captured by an imaging device and a predetermined image;
a depth estimation step of obtaining depth estimation information corresponding to a depth of each of the plurality of regions; and
an imaging position and direction estimation step of, based on the depth estimation information estimated in the depth estimation step, estimating an imaging direction of the imaging device according to the regions whose depth is large, and estimating an imaging position of the imaging device according to the regions whose depth is small.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013507587A JPWO2012133371A1 (ja) | 2011-03-28 | 2012-03-27 | 撮像位置および撮像方向推定装置、撮像装置、撮像位置および撮像方向推定方法ならびにプログラム |
US14/007,411 US9232128B2 (en) | 2011-03-28 | 2012-03-27 | Image capture position and image capture direction estimation device, image capture device, image capture position and image capture direction estimation method and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011069251 | 2011-03-28 | ||
JP2011-069251 | 2011-03-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012133371A1 true WO2012133371A1 (ja) | 2012-10-04 |
Family
ID=46931088
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/057866 WO2012133371A1 (ja) | 2011-03-28 | 2012-03-27 | 撮像位置および撮像方向推定装置、撮像装置、撮像位置および撮像方向推定方法ならびにプログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US9232128B2 (ja) |
JP (1) | JPWO2012133371A1 (ja) |
WO (1) | WO2012133371A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104204726A (zh) * | 2012-03-06 | 2014-12-10 | 日产自动车株式会社 | 移动物体位置姿态估计装置和移动物体位置姿态估计方法 |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9323338B2 (en) * | 2013-04-12 | 2016-04-26 | Usens, Inc. | Interactive input system and method |
WO2015049826A1 (ja) * | 2013-10-01 | 2015-04-09 | 日本電気株式会社 | 物体検出装置、物体検出方法および学習装置 |
US11908212B2 (en) * | 2019-07-22 | 2024-02-20 | Nec Corporation | Matching position output system |
US11412350B2 (en) | 2019-09-19 | 2022-08-09 | Apple Inc. | Mobile device navigation system |
JP7199337B2 (ja) * | 2019-11-15 | 2023-01-05 | 株式会社東芝 | 位置推定装置、位置推定方法およびプログラム |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002048513A (ja) * | 2000-05-26 | 2002-02-15 | Honda Motor Co Ltd | 位置検出装置、位置検出方法、及び位置検出プログラム |
JP2009020014A (ja) * | 2007-07-12 | 2009-01-29 | Toyota Motor Corp | 自己位置推定装置 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4851874B2 (ja) | 2006-07-11 | 2012-01-11 | 富士通株式会社 | 自己位置推定プログラム、自己位置推定方法および自己位置推定装置 |
US8611592B2 (en) * | 2009-08-26 | 2013-12-17 | Apple Inc. | Landmark identification using metadata |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104204726A (zh) * | 2012-03-06 | 2014-12-10 | 日产自动车株式会社 | 移动物体位置姿态估计装置和移动物体位置姿态估计方法 |
EP2824425A4 (en) * | 2012-03-06 | 2015-05-27 | Nissan Motor | POSITIONING / ALIGNMENT ESTIMATION APPARATUS FOR A MOBILE OBJECT AND METHOD FOR ESTIMATING THE POSITION / ORIENTATION OF A MOVING OBJECT |
US9797981B2 (en) | 2012-03-06 | 2017-10-24 | Nissan Motor Co., Ltd. | Moving-object position/attitude estimation apparatus and moving-object position/attitude estimation method |
Also Published As
Publication number | Publication date |
---|---|
US9232128B2 (en) | 2016-01-05 |
JPWO2012133371A1 (ja) | 2014-07-28 |
US20140015998A1 (en) | 2014-01-16 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12763298 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2013507587 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14007411 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12763298 Country of ref document: EP Kind code of ref document: A1 |