WO2017022033A1 - Image processing device, image processing method, and image processing program

Image processing device, image processing method, and image processing program

Info

Publication number
WO2017022033A1
Authority
WO
WIPO (PCT)
Prior art keywords
map
point
key frame
camera
points
Prior art date
Application number
PCT/JP2015/071862
Other languages
English (en)
Japanese (ja)
Inventor
山口 伸康
Original Assignee
富士通株式会社
Priority date
Filing date
Publication date
Application filed by 富士通株式会社 filed Critical 富士通株式会社
Priority to PCT/JP2015/071862 priority Critical patent/WO2017022033A1/fr
Priority to JP2017532266A priority patent/JP6338021B2/ja
Publication of WO2017022033A1 publication Critical patent/WO2017022033A1/fr

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G01C3/06 Use of electric means to obtain final indication
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing

Definitions

  • the present invention relates to an image processing apparatus and the like.
  • FIG. 14 is a diagram illustrating an example of AR technology. As shown in FIG. 14, for example, when a user photographs the marker 11 and the inspection target 12 using a camera built in the mobile terminal 10, object information 13 for the marker 11 is displayed on the display screen 10a of the mobile terminal 10.
  • Prior art 1 detects feature points based on the fact that the variation in shade is large near the point of interest and the position of the point of interest on the image is uniquely defined by the variation in shade.
  • Prior art 1 uses a set of three-dimensional coordinates of feature points generated in advance. In the following description, the three-dimensional coordinates of feature points generated in advance are referred to as map points, and a set of map points is referred to as a three-dimensional map.
  • Prior art 1 calculates the position and orientation of the camera by associating the feature points present in the current captured image with the projected map points on the captured image.
  • FIG. 15 is a diagram for explaining the related art 1 for obtaining the position and orientation of the camera.
  • Map points S1 to S6 exist.
  • A certain map point Si is represented by equation (1) in the world coordinate system.
  • Feature points x1 to x6 exist on the captured image 20.
  • A certain feature point xi is represented by equation (2) in the camera coordinate system.
  • The map points projected onto the captured image 20 are the projection points x1′ to x6′.
  • A certain projection point xi′ is represented by equation (3) in the camera coordinate system.
  • The camera position and orientation are obtained by calculating the camera position / orientation matrix RT that minimizes the sum of squares E calculated by equation (4). Estimating the position and orientation of the camera for each captured image in a series is called "tracking".
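  • The referenced equations (1) to (4) are not reproduced in this text. Assuming the standard reprojection-error formulation implied by the description above (a sketch, not the literal equations of the publication), they can be written in LaTeX as:

      S_i = (X_i,\; Y_i,\; Z_i)^{\top}                         % map point in the world coordinate system ... (1)
      x_i = (u_i,\; v_i)^{\top}                                 % feature point in the camera coordinate system ... (2)
      x'_i = (u'_i,\; v'_i)^{\top}                              % projection of S_i onto the captured image ... (3)
      E = \sum_i \lVert x_i - x'_i \rVert^{2}
        = \sum_i \left[ (u_i - u'_i)^{2} + (v_i - v'_i)^{2} \right]   % sum of squares minimized over R and T ... (4)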
  • FIG. 16 is a diagram for explaining the related art 1 for generating a three-dimensional map.
  • Prior art 1 uses the principle of stereo photography.
  • Prior art 1 associates the same feature points in two shot images with different shooting positions.
  • Prior art 1 generates a three-dimensional map having corresponding points as map points from the positional relationship in each captured image of a plurality of associated corresponding points.
  • Let a map point to be restored be Si. The point where the line segment connecting the shooting position Ca of the initial camera and the map point Si intersects the first captured image 20a is taken as the feature point xai.
  • FIG. 17 is a diagram illustrating an example of the definition of the shooting direction of the camera. As shown in FIG. 17, for example, the origin of the three-dimensional coordinates of the three-dimensional map is defined with reference to the position (Tx, Ty, Tz) and the shooting direction (Rx, Ry, Rz) of the camera 50.
  • As an application of prior art 1, there is prior art 2, in which tracking and three-dimensional map generation are performed in parallel. Prior art 2 is called SLAM (Simultaneous Localization and Mapping).
  • The shooting positions of the two captured images are different and separated by a sufficient distance. ... (Condition 1)
  • The shooting ranges of the two captured images overlap widely. ... (Condition 2)
  • the user performs an operation of moving the camera in parallel before starting tracking.
  • This operation is an operation that is rarely performed during normal shooting with a camera, and reduces the usability of the user.
  • this operation requires a corresponding amount of time and reduces the user's work efficiency.
  • Prior art 3 performs tracking using an infinite map until a three-dimensional map by stereo shooting is generated.
  • Prior art 3 generates a 3D map at the timing when two captured images suitable for stereo shooting are acquired during tracking, appropriately switches between the infinite map and the 3D map according to the shooting range, and continues tracking.
  • the above-described conventional technique has a problem that the estimation accuracy of the position and orientation of the camera is lowered.
  • An object of one aspect of the present invention is to provide an image processing apparatus, an image processing method, and an image processing program that can prevent a decrease in estimation accuracy of the position and orientation of a camera.
  • the image processing apparatus includes a key frame determination unit, an initial three-dimensional map generation unit, a position and orientation estimation unit, an adjustment unit, and a three-dimensional map update unit.
  • the key frame determination unit determines a key frame from a plurality of captured images captured by the camera.
  • the initial three-dimensional map generation unit generates an initial three-dimensional map in which map points are arranged on a plane separated from the camera by a reference distance, based on the positions of the feature points existing in the first key frame determined by the key frame determination unit.
  • the position / orientation estimation unit estimates the position / orientation of the camera on the basis of the relationship between the points where the map points of the three-dimensional map are projected onto the captured image and the feature points of the captured image.
  • the adjustment unit finely adjusts the position of the map point of the three-dimensional map based on the position and orientation of the camera.
  • the 3D map updating unit updates the position of the map point of the 3D map based on the feature point of the first key frame and the feature point of the second key frame after the first key frame.
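  • A minimal control-flow sketch of how these five units could interact is shown below, with the unit implementations supplied as callables; the function and parameter names are illustrative assumptions and do not appear in the publication.

      # Hedged sketch: the processing units (key frame determination, initial map generation,
      # position / orientation estimation, adjustment, map update) are passed in as callables.
      def track(frames, is_key_frame, generate_initial_map, estimate_pose,
                adjust_map, update_map_by_stereo):
          map_points, key_frames, pose = None, [], None
          for frame in frames:
              if map_points is None:
                  # Wait for the first key frame, then build the initial three-dimensional map
                  if is_key_frame(frame, key_frames, pose):
                      key_frames.append(frame)
                      map_points = generate_initial_map(frame)
                  continue
              pose = estimate_pose(frame, map_points)        # project map points, match feature points
              adjust_map(map_points, pose)                   # fine adjustment of provisional map points
              if is_key_frame(frame, key_frames, pose):      # stereo conditions satisfied: second key frame
                  key_frames.append(frame)
                  update_map_by_stereo(map_points, key_frames[-2], frame)
          return pose, map_points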
  • FIG. 1 is a functional block diagram illustrating the configuration of the image processing apparatus according to the first embodiment.
  • FIG. 2 is a diagram illustrating an example of a data structure of key frame information.
  • FIG. 3 is a diagram illustrating an example of a data structure of the three-dimensional map information.
  • FIG. 4 is a diagram for explaining the process of the initial three-dimensional map generation unit according to the first embodiment.
  • FIG. 5 is a flowchart illustrating the processing procedure of the image processing apparatus according to the first embodiment.
  • FIG. 6 is a diagram (1) illustrating an example of a marker.
  • FIG. 7 is a diagram (2) illustrating an example of a marker.
  • FIG. 8 is a diagram for explaining another process (1) of the adjustment unit.
  • FIG. 9 is a diagram for explaining another process (2) of the adjustment unit.
  • FIG. 10 is a diagram (1) for explaining other processes of the three-dimensional map update unit.
  • FIG. 11 is a diagram (2) for explaining another process of the three-dimensional map update unit.
  • FIG. 12 is a flowchart illustrating a processing procedure of other processing of the position / orientation estimation unit.
  • FIG. 13 is a diagram illustrating an example of a computer that executes an image processing program.
  • FIG. 14 is a diagram illustrating an example of the AR technique.
  • FIG. 15 is a diagram for explaining the related art 1 for obtaining the position and orientation of the camera.
  • FIG. 16 is a diagram for explaining the related art 1 for generating a three-dimensional map.
  • FIG. 17 is a diagram illustrating an example of the definition of the shooting direction of the camera.
  • FIG. 1 is a functional block diagram illustrating the configuration of the image processing apparatus according to the first embodiment.
  • the image processing apparatus 100 is connected to a camera 50.
  • the image processing apparatus 100 includes a storage unit 110 and an image acquisition unit 120.
  • the storage unit 110 includes key frame information 110a, three-dimensional map information 110b, and position and orientation information 110c.
  • the storage unit 110 corresponds to a semiconductor memory element such as a RAM (Random Access Memory), a ROM (Read Only Memory), or a flash memory, or to another storage device.
  • the key frame information 110a includes information on the position and orientation of the camera 50 at the time of shooting the key frame, and information on feature points included in the key frame.
  • FIG. 2 is a diagram illustrating an example of a data structure of key frame information. As shown in FIG. 2, the key frame information 110a has a translation matrix, a rotation matrix, and feature point information for each key frame number.
  • the key frame number is information that uniquely identifies a key frame. In the example shown in FIG. 2, the translation matrix, rotation matrix, and feature point information corresponding to the key frame number "1" are shown.
  • the translation matrix is a matrix indicating the position of the camera 50, and is expressed, for example, as (Tx, Ty, Tz).
  • the rotation matrix is a matrix indicating the orientation of the camera 50, and is expressed, for example, as (r0, r1, r2, r3, r4, r5, r6, r7, r8).
  • the feature point information includes a feature point number, a two-dimensional coordinate, a map point index, and a map point three-dimensional coordinate.
  • the feature point number is information for uniquely identifying the feature point included in the key frame.
  • Two-dimensional coordinates are two-dimensional coordinates of feature points on a key frame.
  • the map point index is information for uniquely identifying the map point corresponding to the feature point.
  • the map point three-dimensional coordinates are three-dimensional coordinates of map points in the world coordinate system.
  • the feature point with the feature point number “1” is associated with the map point with the map point index “1”.
  • the two-dimensional coordinates of the feature point with the feature point number “1” are “u1, v1”, and the three-dimensional coordinates of the map point with the map point index “1” are “x1, y1, z1”.
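  • A minimal sketch of the key frame information of FIG. 2 as a data structure is shown below; the field names are illustrative assumptions, not terms from the publication.

      from dataclasses import dataclass, field
      from typing import List, Tuple

      @dataclass
      class FeaturePointRecord:
          feature_point_number: int                     # uniquely identifies the feature point in the key frame
          coords_2d: Tuple[float, float]                # (u, v) on the key frame
          map_point_index: int                          # index of the associated map point
          map_point_3d: Tuple[float, float, float]      # (x, y, z) in the world coordinate system

      @dataclass
      class KeyFrameRecord:
          key_frame_number: int
          translation: Tuple[float, float, float]       # (Tx, Ty, Tz): camera position
          rotation: Tuple[float, ...]                    # (r0, ..., r8): camera orientation, 3x3 row major
          feature_points: List[FeaturePointRecord] = field(default_factory=list)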
  • FIG. 3 is a diagram illustrating an example of a data structure of the three-dimensional map information. As shown in FIG. 3, this 3D map information 110b associates a map point index, 3D coordinates, type, key frame number, feature amount, image pattern template, epipolar constraint direction vector, and distance.
  • the map point index is information that uniquely identifies a map point.
  • the three-dimensional coordinate is a three-dimensional coordinate of a map point in the world coordinate system.
  • the type is either "unthree-dimensionalized" or "three-dimensional coordinates determined".
  • "Unthree-dimensionalized" indicates that the three-dimensional coordinates of the corresponding map point are estimated from the position of a feature point in a single key frame and that the z value of (x, y, z) is 0, i.e., the coordinates are effectively two-dimensional.
  • "Three-dimensional coordinates determined" indicates that the three-dimensional coordinates of the corresponding map point have been obtained from two key frames based on the principle of stereo photography.
  • the key frame number is a key frame number of a key frame having a feature point corresponding to the map point.
  • the feature amount is obtained by arranging pixel values around feature points on a captured image (key frame) corresponding to map points in a predetermined order.
  • the image pattern template is an image in a predetermined range including feature points on a captured image (key frame) corresponding to a map point.
  • the epipolar constraint direction vector corresponds to a vector connecting the camera position and the true value of the map point. Details of the epipolar constraint direction vector will be described later.
  • the distance indicates the distance between the origin of the world coordinate system and the map point.
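  • Similarly, one entry of the three-dimensional map information of FIG. 3 could be sketched as follows; the field names are illustrative assumptions.

      from dataclasses import dataclass, field
      from typing import List, Optional, Tuple

      @dataclass
      class MapPoint:
          index: int
          coords_3d: Tuple[float, float, float]          # (x, y, z) in the world coordinate system
          point_type: str                                # "unthree-dimensionalized" or "three-dimensional coordinates determined"
          key_frame_numbers: List[int] = field(default_factory=list)
          feature_amount: Optional[list] = None          # surrounding pixel values arranged in a predetermined order
          image_pattern_template: Optional[list] = None  # image patch around the corresponding feature point
          epipolar_direction: Optional[Tuple[float, float, float]] = None  # vector toward the true map point
          distance_from_origin: float = 0.0              # distance between the world origin and the map point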
  • the position and orientation information 110c is information on the current position and orientation of the camera 50.
  • the position and orientation information corresponds to the translation matrix T and the rotation matrix R with respect to the origin of the world coordinate system.
  • the position / orientation information 110c is updated by a position / orientation estimation unit 150 described later.
  • the image acquisition unit 120 is a processing unit that acquires captured image information from the camera 50.
  • the image acquisition unit 120 outputs the captured image information to the feature point detection unit 130.
  • the feature point detection unit 130 is a processing unit that detects a feature point from a captured image.
  • the feature point detection unit 130 detects a feature point on the basis of the fact that the variation in shade near the point of interest is large and the position of the point of interest on the image is uniquely defined by the shade variation.
  • the feature points often correspond to the corners of the object.
  • Feature point detection unit 130 outputs feature point information to matching unit 140.
  • the feature point information is, for example, information that associates the feature point number, the two-dimensional coordinates of the feature point, and information about the pixels around the feature point.
  • the matching unit 140 is a processing unit that matches the feature points of the photographed image with the projected points obtained by projecting the map points of the three-dimensional map information on the photographed image.
  • the relationship among the feature points, map points, and projection points of the captured image is as described with reference to FIG.
  • the matching unit 140 compares the feature points and the projection points, and generates feature point-map point pair information based on pairs of a feature point and a projection point that are close to each other in the captured image and have similar image features.
  • For example, the matching unit 140 pairs the feature point x1 with the map point S1 and generates feature point-map point pair information.
  • the matching unit 140 outputs the feature point-map point pair information and the position information of the feature point on the captured image to the position / orientation estimation unit 150.
  • the position and orientation of the camera 50 can be expressed by a translation component T and a rotation component R with respect to the origin of the world coordinate system.
  • the translation component T is defined by equation (5)
  • the rotation component R is defined by equation (6).
  • the relationship between the map point S i (X i , Y i , Z i ) in the world coordinate system and the projection point x i ′ (u i , v i ) projected on the captured image is defined by the equation (7).
  • the matching unit 140 calculates the position of the projection point corresponding to the map point based on Expression (7).
  • the matching unit 140 acquires the translation component T and the rotation component R from the position / orientation information 110c.
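  • Equations (5) to (7) are not reproduced in this text. Assuming a standard pinhole camera model with intrinsic parameters (fx, fy, cx, cy), which the publication does not specify, the relationship of equation (7) could be sketched as follows:

      import numpy as np

      def project_map_point(S_world, R, T, fx, fy, cx, cy):
          # Hedged sketch of equation (7): world point -> camera frame -> image coordinates.
          # The convention camera_point = R @ world_point + T is an assumption.
          S_cam = R @ np.asarray(S_world, dtype=float) + np.asarray(T, dtype=float)
          X, Y, Z = S_cam
          u = fx * X / Z + cx
          v = fy * Y / Z + cy
          return np.array([u, v])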
  • the position / orientation estimation unit 150 is a processing unit that estimates the position and orientation of the camera 50 based on the feature point-map point pair information, the position information of the feature point on the captured image, and the three-dimensional map information 110b.
  • the position / orientation estimation unit 150 updates the position / orientation information 110c with the estimated position / orientation information of the camera 50.
  • the position / orientation estimation unit 150 searches for the translation component T and the rotation component R of the camera so that the sum of the differences between the paired feature points xi and the projection points xi′ of the map points is minimized, and calculates the position and orientation of the camera.
  • the position / orientation estimation unit 150 identifies the position of each feature point xi based on the position information of the feature points on the captured image. In the search process, the position / orientation estimation unit 150 calculates the position of the projection point xi′ by equation (7) above while changing the translation component T and the rotation component R.
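  • A minimal sketch of this search is shown below, using a rotation-vector parameterization and a generic nonlinear least-squares solver; the parameterization, solver, and intrinsic parameters (fx, fy, cx, cy) are assumptions, since the publication only states that the sum of differences between paired feature points and projection points is minimized.

      import numpy as np
      from scipy.optimize import least_squares
      from scipy.spatial.transform import Rotation

      def estimate_pose(map_points_3d, feature_points_2d, fx, fy, cx, cy, p0=None):
          # map_points_3d: (N, 3) paired map points, feature_points_2d: (N, 2) paired feature points.
          S = np.asarray(map_points_3d, dtype=float)
          x = np.asarray(feature_points_2d, dtype=float)
          p0 = np.zeros(6) if p0 is None else p0        # 3 rotation-vector + 3 translation parameters

          def residuals(p):
              R = Rotation.from_rotvec(p[:3]).as_matrix()
              T = p[3:]
              S_cam = S @ R.T + T                        # rotate and translate all map points
              u = fx * S_cam[:, 0] / S_cam[:, 2] + cx
              v = fy * S_cam[:, 1] / S_cam[:, 2] + cy
              return np.concatenate([u - x[:, 0], v - x[:, 1]])

          result = least_squares(residuals, p0)
          return Rotation.from_rotvec(result.x[:3]).as_matrix(), result.x[3:]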
  • the key frame determination unit 160 is a processing unit that determines whether or not a captured image is a key frame that satisfies a condition used for generating a three-dimensional map.
  • the key frame determination unit 160 executes a first key frame suitability determination process or a second key frame suitability determination process described later.
  • the key frame determination unit 160 outputs the determination result to the position / orientation estimation unit 150 and the initial three-dimensional map generation unit 170.
  • the key frame determination unit 160 registers information about the key frame in the key frame information 110a.
  • the key frame information 110a includes the information described with reference to FIG.
  • the key frame determination unit 160 acquires the feature point number, two-dimensional coordinates, map point index, and map point three-dimensional coordinate information included in the key frame from the matching unit 140.
  • the key frame determination unit 160 acquires information about the position and orientation of the camera 50 that captured the key frame from the position and orientation estimation unit 150.
  • the first key frame determination process will be described.
  • the key frame determination unit 160 performs a first key frame determination process before the position / orientation estimation unit 150 estimates the position / orientation of the camera 50.
  • the key frame determination unit 160 acquires a captured image from the image acquisition unit 120, compares the previous captured image with the current captured image, and determines whether the movement of the camera 50 is small.
  • the key frame determination unit 160 determines that the current captured image is a key frame when the movement of the camera 50 is small.
  • the key frame determination unit 160 determines whether or not the movement of the camera 50 is small by executing movement tracking by optical flow, alignment by pattern matching, or the like. For example, the key frame determination unit 160 determines that the movement of the camera 50 is small when the difference between the position of the target object in the previous captured image and the position of the target object in the current captured image is less than a threshold value.
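  • A minimal sketch of such a movement check using optical flow is shown below; the use of OpenCV and the threshold value are assumptions.

      import cv2
      import numpy as np

      def camera_motion_is_small(prev_gray, curr_gray, threshold_px=2.0):
          # Track corner points from the previous image into the current one and treat the
          # camera movement as small when the median displacement is below a threshold.
          corners = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200, qualityLevel=0.01, minDistance=7)
          if corners is None:
              return False
          moved, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, corners, None)
          ok = status.ravel() == 1
          if not ok.any():
              return False
          displacement = np.linalg.norm((moved - corners).reshape(-1, 2)[ok], axis=1)
          return float(np.median(displacement)) < threshold_px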
  • the second key frame determination process will be described.
  • the key frame determination unit 160 performs second key frame determination processing at a stage after the position / orientation estimation unit 150 estimates the position / orientation of the camera 50.
  • the key frame determination unit 160 acquires the position and orientation information (Tt, Rt) of the camera 50 that captured the captured image at a certain time t from the position / orientation estimation unit 150, and acquires the position and orientation information (Tkf, Rkf) of the camera that captured the previous key frame from the key frame information 110a.
  • the key frame determination unit 160 also acquires information on the feature points xt and the map points St of the captured image at time t, and information on the feature points xkf and the map points Skf of the previous key frame.
  • the key frame determination unit 160 determines whether or not the pair of the previous key frame and the current captured image satisfies the conditions for stereo shooting / measurement, based on the position / orientation information (Tt, Rt) and (Tkf, Rkf), the feature points xt, the map points St, the feature points xkf, and the map points Skf. For example, the key frame determination unit 160 determines that the conditions for stereo shooting / measurement are satisfied, and that the current captured image is a key frame, when the combination of the key frame and the captured image satisfies conditions 1 and 2 described above.
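  • A minimal sketch of this check is shown below; the baseline and overlap thresholds, and the approximation of condition 2 by the fraction of key frame feature points matched again in the current image, are assumptions.

      import numpy as np

      def satisfies_stereo_conditions(T_t, T_kf, matched_pairs, keyframe_feature_count,
                                      min_baseline=0.1, min_overlap_ratio=0.5):
          # Condition 1: the two shooting positions are separated by a sufficient distance.
          baseline = np.linalg.norm(np.asarray(T_t, dtype=float) - np.asarray(T_kf, dtype=float))
          if baseline < min_baseline:
              return False
          # Condition 2: the shooting ranges overlap widely (approximated by the match ratio).
          overlap_ratio = len(matched_pairs) / max(keyframe_feature_count, 1)
          return overlap_ratio >= min_overlap_ratio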
  • the initial 3D map generation unit 170 is a processing unit that generates initial 3D map information 110b.
  • the initial three-dimensional map generation unit 170 arranges map points on a plane that is separated from the camera by a reference distance based on the position of the feature points existing in a single key frame, so that the initial three-dimensional map information 110b is generated.
  • FIG. 4 is a diagram for explaining the processing of the initial three-dimensional map generation unit.
  • FIG. 4 is a schematic diagram viewed from the y-axis direction of world coordinates.
  • the initial three-dimensional map generator 170 defines a world coordinate system in which the center of the captured image is the origin, the horizontal direction of the captured image is the x axis, the vertical direction is the y axis, and the shooting direction 50b of the camera from the origin is the z axis.
  • the intersection of the shooting direction 50b of the camera 50 and the x axis is the origin of the world coordinate system.
  • the z-axis distance from the origin to the camera 50 is fixed to a predetermined reference distance L0.
  • the translation component T of the camera 50 in the world coordinate system is (0, 0, L0), and the rotation component R is a unit matrix.
  • the initial three-dimensional map generation unit 170 defines a straight line passing through the camera 50 and a feature point xi, and locates the map point by assuming that it exists at the intersection of the defined straight line and the XY plane 60 perpendicular to the z axis. For example, the initial three-dimensional map generation unit 170 assumes that the map point S1 exists at the intersection of the straight line passing through the camera 50 and the feature point x1 and the XY plane 60.
  • the initial three-dimensional map generation unit 170 performs the above processing to identify the map points S 1 to S 7 corresponding to the feature points x 1 to x 7 .
  • the values of the z coordinate of each map point specified by the initial three-dimensional map generation unit 170 are all zero.
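  • A minimal sketch of this construction is shown below, assuming a pinhole camera at (0, 0, L0) whose rays are intersected with the plane z = 0; the intrinsic parameters (fx, fy, cx, cy) and sign conventions are assumptions.

      import numpy as np

      def generate_initial_map_points(feature_points_2d, fx, fy, cx, cy, L0):
          camera_center = np.array([0.0, 0.0, L0])
          map_points = []
          for (u, v) in feature_points_2d:
              # Ray from the camera through the feature point, looking from z = L0 toward the origin.
              d = np.array([(u - cx) / fx, (v - cy) / fy, -1.0])
              t = -camera_center[2] / d[2]               # parameter value where the ray reaches z = 0
              map_points.append(camera_center + t * d)
          return np.array(map_points)                    # z components of all initial map points are 0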
  • the initial three-dimensional map generation unit 170 registers feature point information and map point information in the three-dimensional map information 110b. That is, the initial three-dimensional map generation unit 170 registers the map point index, three-dimensional coordinates, type, key frame number, feature amount, image pattern template, epipolar constraint vector, and distance information in the three-dimensional map information 110b.
  • the types corresponding to the map points identified by the initial three-dimensional map generation unit 170 are all “unthree-dimensionalized”.
  • the key frame number is a key frame number corresponding to the key frame from which the feature points x 1 to x 7 are extracted.
  • the adjustment unit 180 is a processing unit that finely adjusts the positions of map points whose type is "unthree-dimensionalized", based on the position / orientation information 110c of the camera 50.
  • the adjustment unit 180 individually executes processing for each map point corresponding to the feature point of the captured image.
  • the adjustment unit 180 projects the map point Si (Xi, Yi, Zi) onto the captured image based on equation (7) and the position and orientation (Tt, Rt) of the camera, and obtains the two-dimensional coordinates (ui, vi) of the projection point xi′.
  • the adjustment unit 180 matches the projection point xi′ with a feature point xi in the captured image, and adjusts the three-dimensional coordinates of the map point Si using a method such as the least squares method so that the error Ei between the matched projection point xi′ and the feature point xi decreases.
  • the error E i is defined by equation (8).
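  • Equation (8) is not reproduced in this text; assuming it is the squared reprojection error between the matched feature point and projection point, it can be written as:

      E_i = \lVert x_i - x'_i \rVert^{2} = (u_i - u'_i)^{2} + (v_i - v'_i)^{2}    % ... (8)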
  • the adjustment unit 180 reflects the adjustment result on the three-dimensional coordinates of the three-dimensional map information 110b.
  • the adjustment unit 180 may calculate the error Ei for each map point and adjust the three-dimensional coordinates only for map points whose calculated error Ei is equal to or greater than a threshold.
  • the three-dimensional map update unit 190 is a processing unit that calculates, by stereo measurement based on a plurality of key frames, the three-dimensional coordinates of map points that are not registered in the three-dimensional map information 110b, and registers the calculated result in the three-dimensional map information 110b.
  • the three-dimensional map update unit 190 recalculates the three-dimensional coordinates of the map points whose type is “unthree-dimensionalized” by stereo measurement based on a plurality of key frames.
  • the three-dimensional map update unit 190 updates the three-dimensional map information 110b with the recalculated three-dimensional coordinates, and updates the corresponding type from “unthree-dimensionalized” to “three-dimensional coordinate determination”.
  • when the three-dimensional map updating unit 190 associates the feature points in the captured image with map points and generates map points of type "unthree-dimensionalized", the key frame information 110a stores information on the correspondence between the feature points in the key frame and the map points.
  • the three-dimensional map update unit 190 performs stereo measurement on each pair consisting of a key frame feature point associated with a map point of type "unthree-dimensionalized" and the corresponding feature point of the captured image, and thereby recalculates the three-dimensional coordinates of the "unthree-dimensionalized" map point.
  • FIG. 5 is a flowchart illustrating the processing procedure of the image processing apparatus according to the first embodiment.
  • the image acquisition unit 120 of the image processing apparatus 100 acquires a captured image (step S101).
  • the key frame determination unit 160 of the image processing apparatus 100 performs key frame appropriateness determination as to whether the captured image is appropriate as a key frame (step S102).
  • When the key frame determination unit 160 determines that the captured image is not appropriate as a key frame (step S103, No), the process returns to step S101. On the other hand, when the key frame determination unit 160 determines that the captured image is appropriate as a key frame (step S103, Yes), the process proceeds to step S104.
  • the initial three-dimensional map generation unit 170 of the image processing apparatus 100 generates initial three-dimensional map information 110b using a single key frame (step S104).
  • the image acquisition unit 120 acquires a captured image (step S105).
  • the position / orientation estimation unit 150 of the image processing apparatus 100 estimates the position / orientation of the camera 50 (step S106).
  • If the position / orientation estimation by the position / orientation estimation unit 150 is not successful (step S107, No), the process proceeds to step S105. On the other hand, when the position / orientation estimation by the position / orientation estimation unit 150 is successful (step S107, Yes), the process proceeds to step S108.
  • the adjustment unit 180 of the image processing apparatus 100 finely adjusts the three-dimensional coordinates of the map point of the type “unthree-dimensionalized” included in the three-dimensional map information 110b (step S108).
  • the key frame determination unit 160 performs key frame appropriateness determination as to whether or not the captured image is appropriate as a key frame (step S109).
  • When the key frame determination unit 160 determines that the captured image is not appropriate as a key frame (step S110, No), the process proceeds to step S113. On the other hand, if the key frame determination unit 160 determines that the captured image is appropriate as a key frame (step S110, Yes), the process proceeds to step S111.
  • the 3D map update unit 190 of the image processing apparatus 100 identifies the 3D coordinates of the map points using the two key frames, and adds data to the 3D map information 110b (step S111).
  • the three-dimensional map update unit 190 recalculates the three-dimensional coordinates and updates the three-dimensional map information 110b for the map point whose type is “unthree-dimensionalized” (step S112).
  • the image processing apparatus 100 performs content display based on the position / orientation information 110c of the camera (step S113).
  • the image processing apparatus 100 determines whether or not to continue the process (step S114). If the image processing apparatus 100 continues the process (step S114, Yes), the image processing apparatus 100 proceeds to step S105. When the image processing apparatus 100 does not continue the process (No at Step S114), the process ends.
  • the image processing apparatus 100 finely adjusts the three-dimensional coordinates of the map points of the three-dimensional map information 110b in the course of performing tracking based on the three-dimensional map information 110b in which the map points are assumed to exist on a plane. Then, when two appropriate key frames are acquired, the image processing apparatus 100 recalculates the three-dimensional coordinates of the provisionally determined map points based on the principle of stereo measurement, and updates the three-dimensional map information 110b. For this reason, according to the image processing apparatus 100, it is possible to suppress a decrease in the estimation accuracy of the position and orientation of the camera 50.
  • the image processing apparatus 100 finely adjusts the three-dimensional coordinates of the map points so that the three-dimensional coordinates of the map points approach the true value before acquiring two appropriate key frames.
  • when an appropriate key frame is acquired, the appropriate three-dimensional positions of the map points are determined, and the three-dimensional coordinates of the map points are switched. For this reason, when the three-dimensional coordinates of the map points are switched, positional deviation of the three-dimensional coordinates can be suppressed and a decrease in the estimation accuracy of the position and orientation of the camera 50 can be prevented.
  • the initial three-dimensional map information 110b is generated from one key frame, and tracking is started. For this reason, it is possible to estimate the position and orientation of the camera without the user performing an operation of translating the position of the camera 50 before starting tracking, and it is possible to realize a more convenient AR technology.
  • the three-dimensional coordinates of the map points are provisionally determined, and fine adjustment and recalculation of the three-dimensional coordinates of the map points are performed in the process of executing tracking. For this reason, the precision of the three-dimensional coordinate of the three-dimensional map information 110b can be improved, and tracking can be continued smoothly and appropriately.
  • one type of three-dimensional map is centrally managed by the three-dimensional map information 110b. For this reason, tracking can be executed by the same calculation method for estimating the position and orientation of the camera as in the past, and it is possible to realize an AR technique that is easy to process and manage.
  • the processing of the image processing apparatus 100 shown in the first embodiment is an example. In the second embodiment, other processes executed by the image processing apparatus 100 will be described.
  • In the AR technology, the content display position is often determined with reference to a specific position on the object to be imaged.
  • If the reference position changes for each captured image, the problem that the content display position cannot be fixed can occur.
  • For this reason, the image processing apparatus 100 installs, and uses as a position reference, a marker having a specific pattern whose installation position and physical size in the space are determined in advance and whose feature point positions and feature amounts detected at the time of shooting are known.
  • FIG. 6 is a diagram (1) illustrating an example of a marker.
  • the marker 15 has reference points 15a, 15b, 15c, and 15d at four corners.
  • the origin 16 of the coordinate system and the origin of the three-dimensional coordinates of the camera 50 coincide.
  • the marker 15 has a known shape and physical size, and the image processing apparatus 100 detects the reference points 15a to 15d as feature points and uses the origin 16 as the origin of the world coordinate system.
  • the initial 3D map generation unit 170 may acquire a captured image including the marker 15 and generate initial 3D map information 110b based on the acquired captured image.
  • the initial three-dimensional map generation unit 170 detects the reference points 15a to 15d of the marker 15 included in the captured image as feature points.
  • the initial three-dimensional map generation unit 170 specifies the three-dimensional coordinates of the map points corresponding to the detected feature points, using the positional relationship between the actual origin 16 of the marker 15 and the reference points 15a to 15d as a clue.
  • the initial three-dimensional map generation unit 170 registers feature point information and map point information in the three-dimensional map information 110b.
  • the marker is not limited to FIG. 6, and may be a logo mark as shown in FIG. 7 as long as the mutual arrangement and physical distance of the reference points of the marker are known.
  • FIG. 7 is a diagram (2) illustrating an example of a marker.
  • a logo mark 17 corresponding to a marker has reference points 17a to 17d.
  • the origin of the logo mark 17 is the origin 18, which coincides with the origin of the three-dimensional coordinates of the camera 50.
  • the existence range of the map point to be adjusted can be limited using the epipolar constraint condition of the feature point in the captured image.
  • FIG. 8 is a diagram for explaining another process (1) of the adjustment unit.
  • a position Ca indicates the position of the camera 50 when the key frame 20c is captured
  • a position Cb indicates the position of the camera when the captured image 20d is captured.
  • S i indicates the position of the map point before fine adjustment
  • S iref indicates the position of the map point after fine adjustment
  • St indicates the true value (actual position) of the map point.
  • the feature point xa1 is located at the intersection of the straight line connecting the position Ca and the true value St and the key frame 20c.
  • the feature point xb1 is located at the intersection of the straight line connecting the position Cb and the true value St and the captured image 20d.
  • the feature point xa1 and the feature point xb1 are corresponding points. On the straight line connecting the position Ca and the position Cb, let the intersection with the key frame 20c be xa2 and the intersection with the captured image 20d be xb2.
  • the adjustment unit 180 can estimate that the true value St of the map point Si subject to adjustment exists at some position on the straight line 21a along the direction vector nkf.
  • when the straight line 21a is viewed from the captured image 20d at the time of fine adjustment, it is represented as a straight line 21b on the image. This straight line 21b is defined as the epipolar line 21b. Since the true value of the map point is on the straight line 21a, the matching feature point in the captured image 20d is also on the epipolar line 21b.
  • the adjustment unit 180 limits the fine adjustment range of the three-dimensional coordinates of a map point Si of type "unthree-dimensionalized" to the straight line 21a along the direction vector nkf, which corresponds to the epipolar line. Under this condition, the adjustment unit 180 can express Siref after fine adjustment as a function of ΔP, as shown in equations (9) and (10).
  • ΔP is the moving distance of the three-dimensional coordinates of the map point by the fine adjustment.
  • the adjustment unit 180 adjusts the three-dimensional coordinates of the map point Si using a method such as the least squares method so that Ei expressed using this relationship becomes small, and obtains Siref.
  • the adjustment unit 180 may limit the range of ΔP to a predetermined range ΔPmaxth so that the three-dimensional coordinates of the map point after fine adjustment do not change significantly.
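  • Equations (9) and (10) are not reproduced in this text. Assuming they express the adjusted point as Siref = Si + ΔP · nkf with |ΔP| ≤ ΔPmaxth, a minimal sketch of the one-dimensional constrained adjustment is shown below; the intrinsic parameters and conventions are assumptions.

      import numpy as np
      from scipy.optimize import minimize_scalar

      def adjust_along_epipolar(S_i, n_kf, x_i, R, T, fx, fy, cx, cy, dP_max):
          # Hedged sketch: the adjusted map point is assumed to be S_iref = S_i + dP * n_kf,
          # and dP is limited to [-dP_max, dP_max] so the point cannot move far from its position.
          S_i = np.asarray(S_i, dtype=float)
          n_kf = np.asarray(n_kf, dtype=float) / np.linalg.norm(n_kf)

          def reprojection_error(dP):
              S_cam = R @ (S_i + dP * n_kf) + T
              u = fx * S_cam[0] / S_cam[2] + cx
              v = fy * S_cam[1] / S_cam[2] + cy
              return (u - x_i[0]) ** 2 + (v - x_i[1]) ** 2     # error E_i of equation (8)

          res = minimize_scalar(reprojection_error, bounds=(-dP_max, dP_max), method="bounded")
          return S_i + res.x * n_kf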
  • In this way, the three-dimensional coordinates of map points of type "unthree-dimensionalized" can be finely adjusted to more appropriate positions.
  • Fine adjustment can in principle be applied to all map points of type "unthree-dimensionalized", but map points located at or near the origin of the world coordinate system originally have small errors, so fine adjustment is unnecessary for them. If fine adjustment is performed on a map point existing at or near the origin, the map point position may be finely adjusted in the wrong direction depending on the position and orientation of the camera 50 at the time of capturing the key frame and the captured image. When the origin of the world coordinate system is thereby shifted, the estimation accuracy of the position and orientation of the camera 50 may be lowered.
  • FIG. 9 is a diagram for explaining another process (2) of the adjustment unit.
  • an area that is a predetermined distance DIST th or less from the origin O is referred to as a neighborhood area A.
  • map points S 4 and S 5 are included in the neighborhood region A among the map points S 1 to S 8 .
  • the type of the map points S1 to S8 is "unthree-dimensionalized". The adjustment unit 180 therefore performs fine adjustment of the three-dimensional coordinates for the map points S1 to S3 and S6 to S8, and skips fine adjustment of the three-dimensional coordinates for the map points S4 and S5.
  • the adjustment unit 180 identifies map points that are not included in the neighborhood area A based on the equation (11), and performs fine adjustment on the identified map points.
  • DIST i indicates the distance between the map point S i and the origin.
  • When DISTi is larger than DISTth, the corresponding map point Si is subject to fine adjustment.
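  • Equation (11) is not reproduced in this text; assuming it is the distance test that excludes the neighborhood area A, it can be written as:

      \mathrm{DIST}_i = \lVert S_i \rVert = \sqrt{X_i^{2} + Y_i^{2} + Z_i^{2}} > \mathrm{DIST}_{th}    % ... (11)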
  • In the above description, the three-dimensional map update unit 190 executes a process of recalculating the three-dimensional coordinates of map points of type "unthree-dimensionalized" and updates the type of those map points to "three-dimensional coordinates determined".
  • the three-dimensional map update unit 190 may perform processing for adding a map point whose type is “unthree-dimensionalized” to the three-dimensional map information 110b.
  • After the 3D map update unit 190 updates the 3D coordinates of the map points, feature points in the captured image that could not be matched with map points or with key frame feature points are highly likely to exist in a new spatial range that has not yet been shot. If these feature points are not stored as map points, it becomes necessary to track a spatial range without map points. In this case, the number of map points included in the shooting range of the camera 50 decreases, the estimation accuracy of the position and orientation of the camera is lowered, and tracking may fail in some cases.
  • the 3D map updating unit 190 registers the feature points in the captured image that could not be matched with the map points or the feature points of the key frame in the 3D map information 110b as “unthree-dimensionalized” map points.
  • the three-dimensional map update unit 190 defines three-dimensional coordinates on the assumption that feature points are distributed on a plane.
  • FIG. 10 is a diagram (1) for explaining other processes of the three-dimensional map update unit.
  • In FIG. 10, the map points S1 to S6 are map points whose type is "three-dimensional coordinates determined". The map points S1 to S6 correspond to the feature points x1 to x6, but map points corresponding to the feature points x7 and x8 do not exist.
  • the three-dimensional map update unit 190 calculates the gravity center position G for the map points S 1 to S 6 .
  • the three-dimensional map update unit 190 defines a plane 30a that passes through the gravity center position G and intersects the imaging direction 50b (the optical axis of the camera 50) perpendicularly.
  • the 3D map updating unit 190 specifies the straight line passing through the position of the camera 50 and the feature point x7, and specifies the intersection of this straight line with the plane 30a as the three-dimensional coordinates of the map point S7 of the feature point x7. Similarly, the 3D map updating unit 190 specifies the straight line passing through the position of the camera 50 and the feature point x8, and specifies the intersection of this straight line with the plane 30a as the three-dimensional coordinates of the map point S8 of the feature point x8. Note that the type of the map points S7 and S8 is "unthree-dimensionalized". The three-dimensional map update unit 190 additionally registers the information of the map points S7 and S8 in the three-dimensional map information 110b.
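  • A minimal sketch of this back-projection onto the plane through the centroid G is shown below; the intrinsic parameters (fx, fy, cx, cy) and the camera-to-world conventions are assumptions.

      import numpy as np

      def add_provisional_map_points(unmatched_features_2d, determined_points_3d,
                                     camera_center, R, fx, fy, cx, cy):
          # Plane through the centroid G of the determined map points, perpendicular to the optical axis.
          G = np.mean(np.asarray(determined_points_3d, dtype=float), axis=0)
          axis = R.T @ np.array([0.0, 0.0, 1.0])                 # optical axis in world coordinates
          new_points = []
          for (u, v) in unmatched_features_2d:
              d = R.T @ np.array([(u - cx) / fx, (v - cy) / fy, 1.0])   # ray direction in world coordinates
              t = np.dot(G - camera_center, axis) / np.dot(d, axis)     # intersection with the plane through G
              new_points.append(camera_center + t * d)
          return np.array(new_points)                            # registered with type "unthree-dimensionalized"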
  • the three-dimensional map update unit 190 sets the plane 30a as shown in FIG. 10, but may set the plane as shown in FIG.
  • FIG. 11 is a diagram (2) for explaining another process of the three-dimensional map update unit.
  • In FIG. 11, the map points S1 to S6 are map points whose type is "three-dimensional coordinates determined". The map points S1 to S6 correspond to the feature points x1 to x6, but map points corresponding to the feature points x7 and x8 do not exist.
  • the three-dimensional map update unit 190 sets an approximate plane 30b of the map points S1 to S6.
  • the 3D map updating unit 190 specifies the straight line passing through the position of the camera 50 and the feature point x7, and specifies the intersection of this straight line with the plane 30b as the three-dimensional coordinates of the map point S7 of the feature point x7.
  • the 3D map updating unit 190 similarly specifies the straight line passing through the position of the camera 50 and the feature point x8, and specifies the intersection of this straight line with the plane 30b as the three-dimensional coordinates of the map point S8 of the feature point x8.
  • the type of the map points S 7 and S 8 is “unthree-dimensional”.
  • the three-dimensional map update unit 190 additionally registers the information of the map points S 7 and S 8 in the three-dimensional map information 110b.
  • By executing the above-described processing, the three-dimensional map update unit 190 adds map points of type "unthree-dimensionalized", so that a decrease in the number of map points included in the shooting range of the camera 50 can be suppressed and the estimation accuracy of the position and orientation of the camera can be improved.
  • In the estimation by the position / orientation estimation unit 150, there may be cases where the error in the estimated position and orientation of the camera 50 becomes large.
  • the position and orientation of the camera 50 includes a translation component and a rotation component.
  • the influence of the translation component error is larger than the influence of the rotation component error.
  • the influence becomes more conspicuous.
  • the error is reduced by the effect of determining the origin.
  • the translation component error in the estimation result of the position and orientation of the camera is reduced, and more accurate tracking can be performed.
  • the position / orientation estimation unit 150 preferentially uses the "three-dimensional coordinates determined" map points, and estimates the position and orientation of the camera by supplementing them with "unthree-dimensionalized" map points only when the number of "three-dimensional coordinates determined" map points is insufficient.
  • FIG. 12 is a flowchart showing a processing procedure of other processing of the position / orientation estimation unit.
  • the position / orientation estimation unit 150 selects map points in the captured image (step S201), and calculates a projection point of the map points in the captured image (step S202).
  • the position / orientation estimation unit 150 extracts map points whose type is “3D confirmed” from the map points (step S203).
  • the position / orientation estimation unit 150 identifies the number of map points whose map point type is “3D confirmed” (step S204).
  • If the number of map points is equal to or greater than a predetermined number (step S205, Yes), the position / orientation estimation unit 150 proceeds to step S207.
  • When the number of map points is less than the predetermined number (step S205, No), the position / orientation estimation unit 150 adds map points whose type is "unthree-dimensionalized" (step S206). The position / orientation estimation unit 150 then calculates the position and orientation of the camera 50 (step S207).
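  • A minimal sketch of this prioritized selection, using the MapPoint structure sketched earlier, is shown below; the minimum count is an illustrative assumption.

      def select_map_points_for_pose(map_points, minimum_count=20):
          # Prefer map points whose three-dimensional coordinates have been determined by stereo measurement.
          determined = [p for p in map_points
                        if p.point_type == "three-dimensional coordinates determined"]
          if len(determined) >= minimum_count:
              return determined
          # Supplement with provisional ("unthree-dimensionalized") map points only when needed.
          provisional = [p for p in map_points if p.point_type == "unthree-dimensionalized"]
          return determined + provisional[:minimum_count - len(determined)]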
  • Since the position and orientation of the camera are estimated by preferentially using the "three-dimensional coordinates determined" map points, the estimation accuracy of the position and orientation can be improved.
  • FIG. 13 is a diagram illustrating an example of a computer that executes an image processing program.
  • the computer 200 includes a CPU 201 that executes various arithmetic processes, an input device 202 that receives input of data from a user, and a display 203.
  • the computer 200 also includes a reading device 204 that reads a program and the like from a storage medium, an interface device 205 that exchanges data with other computers via a network, and a camera 206.
  • the computer 200 also includes a RAM 207 that temporarily stores various types of information and a hard disk device 208.
  • the devices 201 to 208 are connected to the bus 209.
  • the hard disk device 208 has a key frame determination program 208a, a position / orientation estimation program 208b, an adjustment program 208c, and a three-dimensional map update program 208d.
  • the CPU 201 reads the key frame determination program 208a, the position / orientation estimation program 208b, the adjustment program 208c, and the three-dimensional map update program 208d, and develops them in the RAM 207.
  • the key frame determination program 208a functions as a key frame determination process 207a.
  • the position / orientation estimation program 208b functions as a position / orientation estimation process 207b.
  • the adjustment program 208c functions as an adjustment process 207c.
  • the three-dimensional map update program 208d functions as a three-dimensional map update process 207d.
  • the processing of the key frame determination process 207a corresponds to the processing of the key frame determination unit 160.
  • the processing of the position / orientation estimation process 207b corresponds to the processing of the position / orientation estimation unit 150.
  • the process of the adjustment process 207 c corresponds to the process of the adjustment unit 180.
  • the three-dimensional map update process 207d corresponds to the processing of the three-dimensional map update unit 190.
  • each program is stored in a "portable physical medium" such as a flexible disk (FD), a CD-ROM, a DVD disc, a magneto-optical disk, or an IC card inserted into the computer 200. Then, the computer 200 may read and execute each of the programs 208a to 208d.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

An image processing device (100) determines key frames from a plurality of captured images captured by a camera and, based on the position of a feature point in a first key frame, generates an initial 3D map in which map points are arranged on a plane separated from the camera by a reference distance. The image processing device (100) estimates the position and orientation of the camera based on the relationship between the points at which the map points of the 3D map are projected onto the captured image and the feature points of the captured image. The image processing device (100) finely adjusts the positions of the map points of the 3D map based on the position and orientation of the camera. The image processing device (100) updates the positions of the map points of the 3D map based on the feature points of the first key frame and the feature points of a second key frame that follows the first key frame.
PCT/JP2015/071862 2015-07-31 2015-07-31 Image processing device, image processing method, and image processing program WO2017022033A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2015/071862 WO2017022033A1 (fr) 2015-07-31 2015-07-31 Dispositif de traitement d'images, procédé de traitement d'images et programme de traitement d'images
JP2017532266A JP6338021B2 (ja) 2015-07-31 2015-07-31 画像処理装置、画像処理方法および画像処理プログラム

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/071862 WO2017022033A1 (fr) 2015-07-31 2015-07-31 Dispositif de traitement d'images, procédé de traitement d'images et programme de traitement d'images

Publications (1)

Publication Number Publication Date
WO2017022033A1 true WO2017022033A1 (fr) 2017-02-09

Family

ID=57942698

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/071862 WO2017022033A1 (fr) 2015-07-31 2015-07-31 Dispositif de traitement d'images, procédé de traitement d'images et programme de traitement d'images

Country Status (2)

Country Link
JP (1) JP6338021B2 (fr)
WO (1) WO2017022033A1 (fr)


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010534878A (ja) * 2007-07-26 2010-11-11 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ 深さ関連情報伝達のための方法及び装置
JP2009237846A (ja) * 2008-03-27 2009-10-15 Sony Corp 情報処理装置、および情報処理方法、並びにコンピュータ・プログラム

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018173882A (ja) * 2017-03-31 2018-11-08 富士通株式会社 情報処理装置、方法、及びプログラム
JP2018194346A (ja) * 2017-05-15 2018-12-06 日本電気株式会社 画像処理装置、画像処理方法及び画像処理プログラム
CN110520694A (zh) * 2017-10-31 2019-11-29 深圳市大疆创新科技有限公司 一种视觉里程计及其实现方法
CN111373442A (zh) * 2017-11-20 2020-07-03 松下电器(美国)知识产权公司 三维点群数据生成方法、位置推断方法、三维点群数据生成装置以及位置推断装置
JP2019133658A (ja) * 2018-01-31 2019-08-08 株式会社リコー 測位方法、測位装置及び読取り可能な記憶媒体
CN109035334A (zh) * 2018-06-27 2018-12-18 腾讯科技(深圳)有限公司 位姿的确定方法和装置、存储介质及电子装置
KR20210095913A (ko) * 2018-11-30 2021-08-03 후아웨이 테크놀러지 컴퍼니 리미티드 맵 작성 방법, 장치, 및 시스템, 및 저장 매체
KR102474160B1 (ko) 2018-11-30 2022-12-02 후아웨이 테크놀러지 컴퍼니 리미티드 맵 작성 방법, 장치, 및 시스템, 및 저장 매체
CN111383282A (zh) * 2018-12-29 2020-07-07 杭州海康威视数字技术股份有限公司 位姿信息确定方法及装置
CN111383282B (zh) * 2018-12-29 2023-12-01 杭州海康威视数字技术股份有限公司 位姿信息确定方法及装置
CN111754388A (zh) * 2019-03-28 2020-10-09 北京初速度科技有限公司 一种建图方法及车载终端
CN111829532A (zh) * 2019-04-18 2020-10-27 顺丰科技有限公司 一种飞行器重定位系统和重定位方法
JP2022501684A (ja) * 2019-08-23 2022-01-06 上海亦我信息技術有限公司 撮影に基づく3dモデリングシステムおよび方法、自動3dモデリング装置および方法
JP7223449B2 (ja) 2019-08-23 2023-02-16 上海亦我信息技術有限公司 撮影に基づく3dモデリングシステム
CN111105467A (zh) * 2019-12-16 2020-05-05 北京超图软件股份有限公司 一种图像标定方法、装置及电子设备
CN111105467B (zh) * 2019-12-16 2023-08-29 北京超图软件股份有限公司 一种图像标定方法、装置及电子设备
WO2022228391A1 (fr) * 2021-04-27 2022-11-03 华为技术有限公司 Procédé de positionnement de dispositif terminal et dispositif associé
CN113239072A (zh) * 2021-04-27 2021-08-10 华为技术有限公司 一种终端设备定位方法及其相关设备
CN113506337A (zh) * 2021-05-17 2021-10-15 南京理工大学 一种基于EPnP的双目相机位姿估计方法
CN113506337B (zh) * 2021-05-17 2024-04-16 南京理工大学 一种基于EPnP的双目相机位姿估计方法
CN113624222A (zh) * 2021-07-30 2021-11-09 深圳市优必选科技股份有限公司 一种地图的更新方法、机器人及可读存储介质
WO2024009377A1 (fr) * 2022-07-05 2024-01-11 日本電気株式会社 Dispositif de traitement d'informations, procédé d'estimation de position propre et support non transitoire lisible par ordinateur

Also Published As

Publication number Publication date
JP6338021B2 (ja) 2018-06-06
JPWO2017022033A1 (ja) 2018-02-01

Similar Documents

Publication Publication Date Title
JP6338021B2 (ja) 画像処理装置、画像処理方法および画像処理プログラム
JP6503906B2 (ja) 画像処理装置、画像処理方法および画像処理プログラム
CN110568447B (zh) 视觉定位的方法、装置及计算机可读介质
JP6464934B2 (ja) カメラ姿勢推定装置、カメラ姿勢推定方法およびカメラ姿勢推定プログラム
US10930008B2 (en) Information processing apparatus, information processing method, and program for deriving a position orientation of an image pickup apparatus using features detected from an image
US11037325B2 (en) Information processing apparatus and method of controlling the same
WO2017041731A1 (fr) Réalité augmentée multi-objet multi-utilisateur sans marqueur sur des dispositifs mobiles
US10438412B2 (en) Techniques to facilitate accurate real and virtual object positioning in displayed scenes
JP6464938B2 (ja) 画像処理装置、画像処理方法および画像処理プログラム
CN111127524A (zh) 一种轨迹跟踪与三维重建方法、系统及装置
JP6985897B2 (ja) 情報処理装置およびその制御方法、プログラム
US11348323B2 (en) Information processing apparatus for correcting three-dimensional map, information processing method for correcting three-dimensional map, and non-transitory computer-readable storage medium for correcting three-dimensional map
JP6894707B2 (ja) 情報処理装置およびその制御方法、プログラム
JP6061770B2 (ja) カメラ姿勢推定装置及びそのプログラム
JP2020067978A (ja) 床面検出プログラム、床面検出方法及び端末装置
JP6922348B2 (ja) 情報処理装置、方法、及びプログラム
JP2006113832A (ja) ステレオ画像処理装置およびプログラム
KR20210051002A (ko) 포즈 추정 방법 및 장치, 컴퓨터 판독 가능한 기록 매체 및 컴퓨터 프로그램
KR20150119770A (ko) 카메라를 사용한 3차원 좌표 측정 장치 및 방법
JPH11194027A (ja) 三次元座標計測装置
JP2018032144A (ja) 画像処理装置、画像処理方法およびプログラム。
JP2022095239A (ja) 画像標定方法、画像標定装置、画像標定システム及び画像標定プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15900347

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017532266

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15900347

Country of ref document: EP

Kind code of ref document: A1