WO2015132981A1 - Position measuring device and position measuring method - Google Patents
Position measuring device and position measuring method
- Publication number
- WO2015132981A1 (PCT/JP2014/070233)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- feature point
- captured image
- image
- reflection surface
- camera
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/002—Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/907—Television signal recording using static stores, e.g. storage tubes or semiconductor memories
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
Definitions
- The present invention relates to a position measuring apparatus and a position measuring method for measuring, with a camera, the position of an object having a reflecting surface in three-dimensional space.
- Patent Documents 1 and 2 describe methods for measuring the position of an object having a reflecting surface in three-dimensional space.
- In the invention of Patent Document 1, an illuminated object having a reflecting surface is photographed by two cameras, and the normal direction of the reflected light from the object is calculated from each captured image.
- The invention of Patent Document 1 then searches for corresponding pixels between the images based on the normal direction, and performs three-dimensional surveying (stereo measurement) based on the parallax between the corresponding pixels.
- In the invention of Patent Document 2, an object having a reflective surface is irradiated with a laser spot light, and the light reflected by the object is observed by two image sensors.
- The invention of Patent Document 2 calculates the incident vector of the observed reflected light, and calculates the position where the spot light strikes the object based on that incident vector.
- Patent Documents 3 and 4 describe methods for measuring the shape of an object having a specular reflection surface.
- JP 2010-071782 A; JP 2011-117832 A; JP 2010-197391 A; JP 2007-322162 A
- In Patent Documents 1 and 2, a special light source is used to irradiate the object having the reflecting surface in order to acquire its image.
- These measures alone, however, cannot solve the problem of the difference in light intensity, and the position of the object having the reflecting surface in three-dimensional space may not be measurable. Moreover, the apparatus cost and the energy for irradiating light are required in the first place.
- Patent Documents 3 and 4 likewise do not disclose a solution to these problems.
- An object of the present invention is to solve the above-mentioned problems and to provide a position measurement apparatus and method that can stably measure the position of an object having a reflecting surface in three-dimensional space without using a special light source or a wide dynamic range camera.
- A position measurement device according to the present invention includes: a camera that acquires, by capturing a target object having a reflective surface, a captured image including at least a part of the reflective surface; a first storage unit that stores shooting conditions including a viewpoint position and a shooting direction of the camera; a second storage unit that stores model data including the shape of the target object; a marker object having a plurality of feature points and fixed at a predetermined position with respect to the camera; a third storage unit that stores feature point data representing the mutual positional relationship between the viewpoint position and the feature points; a feature point extraction unit that extracts a plurality of feature points from a captured image acquired by the camera in a state where the feature points are reflected on the reflective surface, and determines the positions of the feature points in the captured image; and a position measurement unit that calculates the position of the target object based on the shooting conditions, the model data, the feature point data, and the positions of the feature points in the captured image.
- According to the present invention, the position of an object having a reflecting surface in three-dimensional space can be measured without using a special light source or a wide dynamic range camera.
- FIG. 1 is a block diagram showing the configuration of the position measuring apparatus according to Embodiment 1 of the present invention. FIG. 2 shows an example of the pattern of the marker object 2 of the position measuring apparatus according to Embodiment 1. FIG. 3 illustrates the relationship between the camera and the marker object, the reflecting surface, and the mirror bodies of the camera and the marker object in the position measuring apparatus according to Embodiment 1. FIG. 4 is a flowchart explaining the position measurement process executed by the position measuring apparatus according to Embodiment 1. FIG. 5 is a flowchart explaining the mirror body position calculation process within that position measurement process.
- Embodiment 1. In each embodiment of the present invention, in order to measure the position of a target object having a reflecting surface (the object whose position is to be measured), a marker object having a plurality of feature points with known mutual positions is used.
- A virtual mirror body is obtained from the mirror image of the camera and the marker object reflected on the reflecting surface, the positions of a plurality of feature points included in the mirror body are calculated, and the position and orientation of the reflecting surface are calculated based on the positions of the feature points of the mirror body.
- When the reflecting surface is a known curved surface, Embodiment 1 of the present invention uses a reflecting surface model that represents the shape of the reflecting surface.
- An approximate plane that approximates the reflecting surface is determined around the point at which light from the camera viewpoint is reflected perpendicularly on the reflecting surface (this point is called the mirror image center point and is described in detail later), and the reflecting surface model is fitted to the approximate plane.
- The position of the reflecting surface is the position in three-dimensional space of the mirror image center point on the reflecting surface.
- The orientation of the reflecting surface is the normal direction of the plane tangent to the reflecting surface at the mirror image center point.
- FIG. 1 is a block diagram showing a configuration of a position measuring apparatus according to Embodiment 1 of the present invention.
- The position measuring device of FIG. 1 measures the position of a target object having a reflecting surface in three-dimensional space.
- The target object is, for example, a split mirror 21 obtained by dividing the reflecting surface of a large-aperture reflecting telescope.
- The split mirror 21 includes a reflective surface 22 and a non-reflective portion (the edge 23, side surface, back surface, and so on) that is not the reflective surface 22.
- The split mirror 21 is gripped by a gripping mechanism, for example a robot hand suspended from a crane.
- The position measuring device is used to measure, with a predetermined accuracy, the position of the split mirror 21 in three-dimensional space when the split mirror 21 is gripped by the gripping mechanism.
- The relative position of the gripping mechanism and the split mirror 21 is assumed to be known in advance, by another method, to within an error of about several tens of centimeters.
- The position measurement apparatus according to Embodiment 1 of the present invention obtains the position of the split mirror 21 with an accuracy of, for example, several millimeters or less.
- The position measuring apparatus according to Embodiment 1 of the present invention can also be used to measure the position of other objects having a reflecting surface.
- The position measuring device includes at least one camera 1, a marker object 2, a storage device 3, and a processing device 4.
- The camera 1 is an ordinary camera that has been adjusted to the ambient light.
- The camera 1 acquires an image including at least a part of the reflecting surface 22 of the split mirror 21 (hereinafter referred to as the "captured image") by capturing the split mirror 21.
- The marker object 2 is an object including a plurality of markers, each fixed at a predetermined position with the camera 1 as a reference, and has a plurality of feature points.
- The camera 1 itself may also have at least one feature point.
- Each marker is a planar figure or a three-dimensional figure having a feature point.
- The storage device 3 stores in advance the information necessary for position measurement in three-dimensional space.
- The storage device 3 includes a first storage unit that stores shooting conditions including the viewpoint position and shooting direction of the camera 1 when capturing an image, a second storage unit that stores model data including the shape of the split mirror 21, and a third storage unit that stores feature point data representing the mutual positional relationship between the viewpoint position and the feature points.
- The model data includes a reflecting surface model representing the shape of the reflecting surface 22 and the positional relationship between the reflecting surface 22 and the non-reflective portion.
- The shape of the reflecting surface 22 includes the radius of curvature of the reflecting surface 22 and the shape of its contour line (circular, square, hexagonal, etc.).
- The feature point data includes the positions of the plurality of feature points of the marker object 2 with the viewpoint position as a reference.
- The feature point data may further include the position of at least one feature point of the camera 1 with the viewpoint position as a reference.
- As the feature point data, a positional relationship that is a mirror image (an inverted positional relationship) of the actual positional relationship between the feature points is stored in advance and used.
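As an illustration, this inverted relationship can be sketched as follows. This is a hypothetical sketch, not code from the patent; the choice of which axis is flipped is an assumption, since it depends on the marker layout and the coordinate system in use.

```python
def mirror_feature_points(points):
    """Invert the x coordinate of each (x, y, z) feature point to obtain
    the mirror-image (inverted) positional relationship."""
    return [(-x, y, z) for (x, y, z) in points]

# Actual marker-object feature points versus their stored mirror-image form:
actual = [(0.10, 0.00, 0.0), (0.20, 0.05, 0.0)]
stored = mirror_feature_points(actual)  # [(-0.1, 0.0, 0.0), (-0.2, 0.05, 0.0)]
```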
- The feature points may instead be referenced to a position different from the viewpoint position; any feature point data may be used as long as it can express the positional relationship between the viewpoint position and the feature points.
- A curvature may be used instead of the radius of curvature.
- The processing device 4 determines the position of the split mirror 21 in three-dimensional space based on the image of the split mirror 21 captured by the camera 1.
- The processing device 4 includes a feature point extraction unit 11 and a position measurement unit 12.
- The feature point extraction unit 11 extracts a plurality of feature points from the image captured by the camera 1 in a state where the feature points are reflected on the reflecting surface 22, and determines the positions of the feature points in the captured image.
- The feature point extraction unit 11 further associates the positions of the feature points in the captured image with the positions of the feature points stored in advance as feature point data.
- The position measurement unit 12 calculates the position of the split mirror 21 based on the shooting conditions, the model data, the feature point data, and the positions of the feature points in the captured image.
- The position measuring device may further include a display device 5 that shows the image 6 captured by the camera 1 and/or the process and result of the position measurement of the split mirror 21 in three-dimensional space. Providing the display device 5 makes the situation easy for the user to understand.
- The position measurement device need not include the display device 5.
- The image 6 captured by the camera 1 includes the reflecting surface 22 and edge 23 of the split mirror 21, and the mirror image 24 of the camera 1 and the marker object 2 reflected on the reflecting surface 22 of the split mirror 21.
- It is assumed that the field of view of the camera 1 is sufficiently wide with respect to the split mirror 21, and that the split mirror 21 and the marker object 2 are within the field of view of the camera 1.
- The mirror image of an object means the image of the object reflected on the reflecting surface 22.
- The position measurement unit 12 includes a mirror body position calculation unit 13, a reflecting surface position calculation unit 14, and a target object position determination unit 15.
- Based on the positions of the feature points stored in advance as feature point data, the mirror body position calculation unit 13 calculates the positions of the plurality of feature points included in the mirror bodies of the camera 1 and the marker object 2, which virtually exist on the opposite side of the reflecting surface 22 from the camera 1 and the marker object 2.
- The mirror body is a virtual object existing on the opposite side of the reflecting surface 22 from the camera 1 such that, if the reflecting surface 22 were assumed not to exist, the camera 1 could capture the same image as the mirror image 24 of the camera 1 and the marker object 2 reflected on the reflecting surface 22 (see FIGS. 3, 7, and 8).
- The mirror body position calculation unit 13 calculates the positions of the plurality of feature points included in the mirror bodies such that the positions of the feature points included in a virtual captured image, obtained by virtually capturing the mirror bodies under the same shooting conditions as when the actual captured image was acquired by the camera 1, are respectively close to the positions of the corresponding feature points in the actual captured image acquired by the camera 1 (the feature points in the actual captured image that correspond to the feature points in the virtual captured image).
- The reflecting surface position calculation unit 14 calculates the position and orientation of the reflecting surface 22 based on the feature point positions of the marker object 2 determined from the viewpoint position and the feature point data (the feature point positions stored in advance), the feature point positions of the mirror bodies calculated by the mirror body position calculation unit 13, and the radius of curvature of the reflecting surface 22 stored in advance as model data.
- The target object position determination unit 15 determines the position of the split mirror 21 based on the positional relationship between the reflecting surface and the non-reflective portion stored in advance as model data, and the position and orientation of the reflecting surface 22 calculated by the reflecting surface position calculation unit 14. Specifically, the target object position determination unit 15 specifies the position of the split mirror 21 by detecting the position of the edge 23 of the reflecting surface 22.
- The mirror body position calculation unit 13 calculates the positions of the plurality of feature points included in the mirror bodies so as to reduce the differences between the positions of the feature points in the virtual captured image and the positions of the corresponding feature points in the actual captured image. The mirror body position calculation unit 13 includes a projective transformation unit 16 and a size changing unit 17.
- The projective transformation unit 16 determines a projective transformation, applied to the positions of the feature points in the actual captured image or to the positions of the feature points in the virtual captured image, such that the figure formed by the feature points in the virtual captured image and the figure formed by the corresponding feature points in the actual captured image become similar to each other.
- When the projective transformation is applied to the positions of the feature points in the virtual captured image, the position of each feature point after the transformation, enlarged or reduced at an appropriate magnification and translated, becomes the same as or close to the position of the corresponding feature point in the actual captured image.
- The projective transformation can be determined using a homography matrix calculation of the kind generally performed in image processing and computer vision. This projective transformation represents the difference between the assumed posture of the mirror bodies and the posture of the camera 1 and the marker object 2 actually captured by the camera 1 (the actual posture of the mirror bodies).
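As a minimal sketch of what such a projective transformation does to feature point positions: a 2D point is mapped through a 3x3 homography matrix in homogeneous coordinates. This is a hypothetical illustration, not code from the patent; in practice the matrix H would be estimated from the feature point correspondences by a standard homography computation.

```python
def apply_homography(H, point):
    """Map a 2D feature point through a 3x3 homography matrix H
    using homogeneous coordinates."""
    x, y = point
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# A pure translation by (5, -2), written as a homography:
H = [[1.0, 0.0, 5.0],
     [0.0, 1.0, -2.0],
     [0.0, 0.0, 1.0]]
apply_homography(H, (10.0, 10.0))  # (15.0, 8.0)
```

A general homography (with a nonzero bottom row besides the 1) additionally encodes the perspective effect of a posture difference, which is what the projective transformation unit 16 exploits.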
- After applying the determined projective transformation to the positions of the feature points in the actual captured image or to the positions of the feature points in the virtual captured image, the size changing unit 17 determines a size change rate, applied to the figure formed by the feature points in the virtual captured image or to the figure formed by the corresponding feature points in the actual captured image, that reduces the differences between the positions of the feature points in the virtual captured image and the positions of the corresponding feature points in the actual captured image.
- In other words, the size changing unit 17 determines the size change rate such that the figure formed by the feature points in the virtual captured image approaches congruence with the figure formed by the corresponding feature points in the actual captured image.
- The size change rate includes unity (no change in size).
- The mirror body position calculation unit 13 changes the positions of the plurality of feature points included in the mirror bodies based on the determined projective transformation and the determined size change rate.
- The mirror body position calculation unit 13 iteratively calculates the projective transformation and the size change rate until the differences between the positions of the feature points in the virtual captured image and the positions of the corresponding feature points in the actual captured image fall below a predetermined threshold (allowable error).
- The threshold is set, for example, on the maximum absolute position error over the feature points, the sum of the absolute position errors of the feature points, or the sum of the squared position errors of the feature points.
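The three aggregate error measures named above can be sketched as follows. This is a hypothetical illustration; the function and variable names are not from the patent.

```python
import math

def position_errors(virtual_pts, actual_pts):
    """Per-point distances between corresponding feature points, aggregated
    as (maximum error, sum of absolute errors, sum of squared errors)."""
    d = [math.hypot(vx - ax, vy - ay)
         for (vx, vy), (ax, ay) in zip(virtual_pts, actual_pts)]
    return max(d), sum(d), sum(e * e for e in d)

# One corresponding pair off by a 3-4-5 triangle, so the distance is 5:
position_errors([(0.0, 0.0)], [(3.0, 4.0)])  # (5.0, 5.0, 25.0)
```

Any one of the three values can then be compared against the threshold to decide whether the iteration has converged.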
- The position measurement device may send the determined position of the split mirror 21 to an external device (not shown) for further processing (for example, gripping and moving the split mirror 21 with the gripping mechanism).
- The storage device 3 may store a set of shooting conditions including one viewpoint position and one shooting direction of the camera 1.
- The position measurement unit 12 may calculate the position of the split mirror 21 based on the shooting conditions, the model data, the feature point data, and a feature point set including the positions of the plurality of feature points in the captured image.
- FIG. 2 is a diagram illustrating an example of a pattern of the marker object 2 included in the position measurement device according to Embodiment 1 of the present invention.
- The marker object 2 in FIG. 2 has a plurality of squares (markers) forming a checkered rectangular area, and four mutually different symbols (graphic figures) provided at the four corners of the rectangular area.
- Each marker has a feature point such as the vertex or centroid of its figure.
- The storage device 3 stores the positions of the plurality of feature points of the camera 1 and the marker object 2 as a model of the camera 1 and the marker object 2.
- The origin may be the viewpoint position of the camera 1, or it may be some other position.
- Providing the four mutually different symbols makes it possible to distinguish the top, bottom, left, and right of the marker object 2.
- When the image 6 captured by the camera 1 includes the mirror image 24 of the camera 1 and the marker object 2 reflected on the reflecting surface 22 of the split mirror 21, the plurality of feature points of the camera 1 and the marker object 2 (the model stored in the storage device 3) can be associated with the plurality of feature points included in the mirror image 24.
- The checker pattern of the marker object 2 shown in FIG. 2 is only an example; the marker object 2 may include other markers or geometric features arranged two-dimensionally or three-dimensionally, such as a plurality of circles forming a polka dot pattern. For example, when a marker is a circle, the center of the circle may be used as its feature point. The number of symbols provided for distinguishing the top, bottom, left, and right of the marker object 2 is not limited to four. It is also not necessary to provide special symbols for this purpose: for example, by giving at least one marker a color different from the other markers, the orientation of the marker object 2 can be distinguished.
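For example, when a circular marker is used, its feature point can be obtained as the centroid of the pixels detected on the marker. A hypothetical sketch (the names are not from the patent):

```python
def centroid(points):
    """Centroid of a set of 2D points, e.g. pixels detected on a circular
    marker; the centroid serves as the marker's feature point."""
    n = len(points)
    return (sum(x for x, _ in points) / n,
            sum(y for _, y in points) / n)

# Four pixels at the corners of a square give its center:
centroid([(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (0.0, 2.0)])  # (1.0, 1.0)
```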
- In that case, the storage device 3 further stores the color of each marker.
- The marker object is not limited to a two-dimensional array such as a checker pattern; any marker object may be used as long as it has a plurality of feature points respectively arranged at predetermined positions with respect to the position of the camera 1.
- FIG. 3 is a diagram for explaining the relationship between the camera 1 and the marker object 2, the reflection surface, and the mirror body of the camera 1 and the marker object 2 in the position measurement apparatus according to Embodiment 1 of the present invention.
- The relationship between the camera 1 and the marker object 2, the reflecting surface 22, and the mirror bodies of the camera 1 and the marker object 2 will be described with reference to FIG. 3.
- On the opposite side of the reflecting surface 22, a virtual camera 1M and a virtual marker object 2M virtually exist as the mirror bodies.
- The camera 1 has a viewpoint 31, a shooting direction 32, and a field of view 33.
- The virtual camera 1M, which is the mirror body of the camera 1, has a virtual viewpoint 31M, a virtual shooting direction 32M, and a virtual field of view 33M.
- The straight line passing through the viewpoint 31 of the camera 1 and the viewpoint 31M of the virtual camera 1M intersects the reflecting surface 22 perpendicularly.
- This straight line is referred to as the reflecting surface vertical line 34.
- The intersection of the reflecting surface vertical line 34 and the reflecting surface 22 is called the mirror image center point 35.
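Under the approximation that the reflecting surface is locally planar, this mirror geometry can be sketched as follows: the virtual viewpoint 31M is the reflection of the viewpoint 31 across the plane, and the mirror image center point 35 is the foot of the perpendicular. This is a hypothetical illustration, not code from the patent.

```python
import math

def reflect_across_plane(p, q, n):
    """Reflect point p across the plane through point q with normal n,
    e.g. to place the virtual viewpoint behind the reflecting surface."""
    length = math.sqrt(n[0] ** 2 + n[1] ** 2 + n[2] ** 2)
    u = (n[0] / length, n[1] / length, n[2] / length)
    d = sum((p[i] - q[i]) * u[i] for i in range(3))  # signed distance to plane
    return tuple(p[i] - 2.0 * d * u[i] for i in range(3))

# A viewpoint 2 m in front of the plane z = 0 reflects to 2 m behind it;
# the mirror image center point is the foot of the perpendicular, (0, 0, 0).
reflect_across_plane((0.0, 0.0, 2.0), (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))  # (0.0, 0.0, -2.0)
```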
- FIG. 4 is a flowchart for explaining position measurement processing executed by the position measurement apparatus according to Embodiment 1 of the present invention.
- FIG. 5 is a flowchart for explaining the mirror body position calculation process within the position measurement process executed by the position measurement apparatus according to Embodiment 1 of the present invention.
- The position of the split mirror 21 can be decomposed into the position of a predetermined reference point of the split mirror 21 (for example, its center of gravity), the direction in which a predetermined surface of the split mirror 21 faces (referred to as the front direction), and the rotation angle about the front direction.
- The combination of the front direction of the split mirror 21, expressed relative to the straight line from the viewpoint position of the camera 1 to the predetermined point of the split mirror 21, and the rotation angle about the front direction is called the posture of the split mirror 21. If the position and posture of the split mirror 21 are determined, the positions of all parts of the split mirror 21 are also determined.
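The last statement can be illustrated with a small sketch: given a pose (reference point, front direction, rotation angle), every model point of the object is determined. This is a hypothetical illustration; taking the front direction as the z axis is an assumption made only to keep the sketch short.

```python
import math

def place_model(points, ref, theta):
    """Place model points given a pose: rotate by theta about the front
    direction (taken as the z axis in this sketch) and translate so the
    model origin lands on the reference point ref."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y + ref[0],
             s * x + c * y + ref[1],
             z + ref[2])
            for (x, y, z) in points]

# Rotating (1, 0, 0) by 90 degrees about z, then translating by (0, 0, 5),
# yields approximately (0, 1, 5).
place_model([(1.0, 0.0, 0.0)], (0.0, 0.0, 5.0), math.pi / 2)
```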
- Given a rough position and posture of the split mirror 21, the positions of the feature points included in a virtual captured image, obtained by virtually capturing the camera 1 and the marker object 2 as reflected on the reflecting surface of the split mirror 21, are calculated, and the position and posture of the split mirror 21 are corrected so that the positions of the feature points in the virtual captured image approach the positions of the corresponding feature points in the actual captured image.
- In doing so, the positions of the feature points of the mirror bodies of the camera 1 and the marker object 2 are calculated.
- The positions of the mirror bodies of the camera 1 and the marker object 2 determined from the rough position of the split mirror 21 are called the reference positions.
- Whether the split mirror 21 forms part of the reflecting surface of the reflecting telescope, or has been detached from the telescope and placed on a mounting tool, its front direction is also roughly known.
- The posture of the mirror bodies of the camera 1 and the marker object 2 determined from this roughly known front direction of the split mirror 21 is called the reference posture. Since the relative positional relationship between the feature points included in the mirror bodies is determined from the positional relationship between the feature points of the camera 1 and the marker object 2, the reference position and the reference posture can be determined.
- The position measurement process in FIG. 4 is executed by the processing device 4 in FIG. 1 in order to measure the position of the target object.
- The feature point extraction unit 11 extracts the feature points from the actual captured image captured by the camera 1, and sends information on the extracted feature points to the position measurement unit 12.
- The mirror body position calculation unit 13 executes the mirror body position calculation process and determines the positions of the feature points of the mirror bodies of the camera 1 and the marker object 2 (the virtual camera 1M and the virtual marker object 2M).
- The mirror body position calculation unit 13 sets the reference position and reference posture of the mirror bodies of the camera 1 and the marker object 2 as the initial values of the position and posture of the mirror bodies.
- Based on the position and posture of the mirror bodies, the mirror body position calculation unit 13 calculates the positions of the plurality of feature points included in the virtual captured image obtained by virtually capturing the mirror bodies under the same shooting conditions as when the actual captured image was acquired by the camera 1.
- The mirror body position calculation unit 13 then determines whether the differences between the positions of the feature points in the actual captured image and the positions of the corresponding feature points in the virtual captured image are less than the threshold.
- If the differences between the feature point positions are less than the threshold (YES in step S13), the process of the flowchart of FIG. 5 ends and the process proceeds to step S3 of FIG. 4. If the differences are greater than or equal to the threshold (NO in step S13), the process proceeds to step S14. The difference between the feature point positions is calculated, for example, as the sum of the absolute values of the distances between mutually corresponding feature points, or as the sum of the squares of those distances.
- In step S14, the mirror body position calculation unit 13 uses the projective transformation unit 16 to determine the projective transformation, applied to the positions of the feature points in the actual captured image or to the positions of the feature points in the virtual captured image, such that the figure formed by the feature points in the virtual captured image and the figure formed by the corresponding feature points in the actual captured image approach each other.
- When the reflecting surface 22 of the split mirror 21 is a plane, the projective transformation is determined based on the positions of all the feature points in the actual captured image (and the positions of the corresponding feature points in the virtual captured image), and the posture of the mirror bodies can be calculated stably.
- When the reflecting surface 22 of the split mirror 21 is a curved surface, the projective transformation is determined using only the positions of the feature points within the range, around the mirror image center point 35 shown in the actual captured image, in which the distortion of the mirror image is negligible (and the positions of the corresponding feature points in the virtual captured image), so that the posture of the mirror bodies can be calculated stably while suppressing the influence of the distortion of the mirror image.
- The point corresponding to the mirror image center point 35 in an image is referred to as the mirror image center point in the image.
- In step S15 of FIG. 5, the determined projective transformation is applied to the positions of the feature points in the actual captured image or to the positions of the feature points in the virtual captured image.
- In step S16, the mirror body position calculation unit 13 uses the size changing unit 17 to determine the size change rate, applied to the figure formed by the feature points in the virtual captured image or to the figure formed by the feature points in the actual captured image, that minimizes the differences between the positions of the feature points in the virtual captured image and the positions of the corresponding feature points in the actual captured image without changing the position of the mirror image center point in the image.
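Scaling a figure about the mirror image center point without moving that point can be sketched as follows (hypothetical names, not from the patent):

```python
def scale_about(points, center, rate):
    """Scale 2D feature points about a fixed center (here, the mirror
    image center point in the image), leaving the center itself unmoved."""
    cx, cy = center
    return [(cx + rate * (x - cx), cy + rate * (y - cy)) for (x, y) in points]

# Halving the size of a figure about the origin:
scale_about([(2.0, 0.0), (0.0, 4.0)], (0.0, 0.0), 0.5)  # [(1.0, 0.0), (0.0, 2.0)]
```

The size change rate found in step S16 would play the role of `rate`, with the mirror image center point in the image as `center`.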
- FIG. 6 is a diagram for explaining an example of a process of calculating the position of the mirror body by the mirror body position calculation unit 13 of the position measurement apparatus according to the first embodiment of the present invention.
- the diagram denoted by reference numeral 101 shows a case where the actual captured image includes the reflection surface 22 and the mirror image 24 of the marker object 2.
- the mirror image 24 of the marker object 2 has a plurality of feature points 24a.
- the diagram denoted by reference numeral 104 shows an image 41 of the virtual marker object 2M in a virtual photographed image obtained by virtually photographing a mirror image whose position and orientation are initial values (reference position and reference orientation).
- the image 41 of the virtual marker object 2M has a plurality of feature points 41a.
- the diagram denoted by reference numeral 103 shows an image 42 of the virtual marker object 2M in the virtual photographed image after the projective transformation obtained in step S14 of FIG. 5 is applied to the virtual photographed image.
- the image 42 of the virtual marker object 2M has a plurality of feature points 42a.
- the diagram denoted by reference numeral 102 shows the image 43 of the virtual marker object 2M in the virtual photographed image after the size of the virtual photographed image is changed at the size change rate obtained in step S16 in FIG. 5.
- the image 43 of the virtual marker object 2M has a plurality of feature points 43a.
- in the case of the image 43 (reference numeral 102), it can be seen that the difference between the positions of the feature points in the actual captured image and the positions of the corresponding feature points in the virtual photographed image is smaller than in the case of the image 41.
- the mirror body position calculation unit 13 changes the posture of the mirror body based on the determined projective transformation.
- the mirror body position calculation unit 13 changes the distance from the camera 1 and the marker object 2 to the mirror body (the virtual camera 1M and the virtual marker object 2M) based on the determined size change rate.
- the mirror body position calculation unit 13 may instead skip steps S15, S16, and S18 and, after executing step S17, directly change the distance from the camera 1 and the marker object 2 to the mirror image bodies so as to minimize the difference between the positions of the feature points in the actual captured image and the positions of the corresponding feature points in the virtual captured image. This also applies to the other embodiments.
- the mirror body position calculation unit 13 repeats steps S12 to S18 until the difference between the position of the feature point in the actual captured image and the position of the corresponding feature point in the virtual captured image is less than the threshold value.
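The structure of the repeated correction in steps S12 to S18 can be illustrated with a deliberately simplified one-dimensional stand-in: a pinhole model in which the rendered feature spacing is inversely proportional to the assumed camera-to-mirror-image distance, and the size change rate drives the distance update. All names and the model itself are illustrative assumptions, not the disclosed method:

```python
def refine_mirror_distance(observed_spacing, focal_len, marker_width,
                           d0=1.0, tol=1e-9, max_iter=100):
    """Iteratively correct the camera-to-mirror-image distance until the
    feature spacing rendered by the pinhole model matches the observed one."""
    d = d0
    for _ in range(max_iter):
        rendered = focal_len * marker_width / d   # spacing in the virtual image
        if abs(rendered - observed_spacing) < tol:
            break                                  # difference below threshold
        # The size change rate between the virtual and actual figures
        # scales the assumed distance for the next iteration.
        d *= rendered / observed_spacing
    return d
```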
- the reflecting surface position calculation unit 14 determines a plurality of division points, each of which divides a line segment connecting a feature point of the marker object 2 reflected in the vicinity of the mirror image center point 35 on the reflecting surface 22 to the corresponding feature point of the mirror image body, and fits the approximate plane 51 to these division points.
- Each division point is at a position where a line segment connecting the feature point of the marker object 2 and the corresponding feature point of the mirror image body is divided into two at a division ratio determined from the radius of curvature of the reflection surface 22. The division ratio will be described later.
- In step S4, the reflection surface position calculation unit 14 determines the position and orientation of the reflecting surface 22 based on the model of the reflecting surface 22 so that the reflecting surface 22 is in contact with the approximate plane 51 at the mirror image center point 35.
- in other words, the reflecting surface position calculation unit 14 determines the position and orientation of the approximate plane 51 that approximates the reflecting surface 22 in the vicinity of the mirror image center point 35, and then determines the position and orientation of the reflecting surface model so that it is in contact with the approximate plane 51 at the mirror image center point 35, thereby determining the position and orientation of the reflecting surface 22.
- the division ratio for obtaining the division point of the line segment connecting the feature point of the marker object 2 and the corresponding feature point of the mirror image is derived from the radius of curvature of the reflection surface 22 by applying the lens formula.
- the division ratio is 0.5
- the division point is the midpoint of the line segment.
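The division points themselves are simple affine combinations of each feature point and its mirror-image counterpart. A sketch in which the division ratio is an input, with 0.5 (the midpoint) corresponding to a plane mirror; for a curved surface the ratio would instead be derived from the lens formula as described above:

```python
import numpy as np

def division_points(real_pts, mirror_pts, ratio=0.5):
    """Points dividing each segment from a real feature point to the
    corresponding mirror-image feature point at the given ratio."""
    real = np.asarray(real_pts, float)
    mirror = np.asarray(mirror_pts, float)
    return real + ratio * (mirror - real)
```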
- FIG. 7 is a diagram illustrating an example of a process of determining the position and orientation of the reflecting surface of the split mirror by the reflecting surface position calculation unit 14 of the position measuring device according to Embodiment 1 of the present invention.
- the diagram denoted by reference numeral 111 shows the camera 1 and the marker object 2, and the virtual camera 1M and the virtual marker object 2M, which are their mirror images.
- the diagram denoted by reference numeral 112 indicates a line segment connecting the feature points of the camera 1 and the marker object 2 and the corresponding feature points of the virtual camera 1M and the virtual marker object 2M with dotted lines.
- the diagram denoted by reference numeral 113 shows the division points of the line segments in the diagram denoted by reference numeral 112.
- the intersection of the approximate plane 51 with the straight line passing through the viewpoint of the camera 1 and the viewpoint of the virtual camera 1M is determined as the mirror image center point, and the position and orientation of the reflecting surface 22, determined by fitting the reflecting surface model so that it matches the approximate plane 51 in the vicinity of the mirror image center point, are shown.
- the camera 1 acquires a captured image including the mirror image 24 of the camera 1 and the marker object 2 reflected on the reflecting surface 22 (see the diagram denoted by reference numeral 114) of the split mirror 21.
- the feature point extraction unit 11 extracts feature points from the captured image and determines the positions of the feature points in the captured image.
- the feature point extraction unit 11 further associates the position of the feature point in the captured image with the position of the feature point stored in advance as the feature point data (the actual position of the feature point of the camera 1 and the marker object 2).
- the mirror body position calculation unit 13 calculates the positions of the feature points included in the mirror image bodies (the virtual camera 1M and the virtual marker object 2M) of the camera 1 and the marker object 2.
- the mirror body position calculation unit 13 repeatedly corrects the postures of the virtual camera 1M and the virtual marker object 2M and the distance from the camera 1 and the marker object 2 to the mirror image bodies (the virtual camera 1M and the virtual marker object 2M) so as to reduce the difference between the positions of the feature points in the virtual captured image and the positions of the corresponding feature points in the actual captured image.
- the mirror body position calculation unit 13 determines the position of each feature point of the mirror image bodies, so that the relative positional relationship between the camera 1 and the marker object 2 on one hand and the virtual camera 1M and the virtual marker object 2M on the other can be grasped, as shown in the diagram denoted by reference numeral 111.
- the reflecting surface position calculation unit 14 calculates the position of a division point for each pair consisting of a feature point of the camera 1 or the marker object 2 and the corresponding feature point of the virtual camera 1M or the virtual marker object 2M.
- each division point is at a position that divides the line segment (see reference numeral 112) connecting a feature point of the camera 1 or the marker object 2 to the corresponding feature point of the virtual camera 1M or the virtual marker object 2M into two at a division ratio determined from the radius of curvature of the reflecting surface 22.
- the reflection surface position calculation unit 14 determines an approximate plane 51 that is a plane that approximates the vicinity of the mirror image center point of the reflection surface 22 by applying a plane equation to each division point as shown in the diagram of reference numeral 113.
- when the reflecting surface 22 is a plane, the approximate plane 51 itself becomes the reflecting surface 22.
- when the reflecting surface 22 is a curved surface, the shape of the reflecting surface 22 is determined by fitting the reflecting surface model so as to minimize the error with respect to the approximate plane 51 in the vicinity of the mirror image center point.
- when the reflecting surface 22 is an aspherical curved surface, the position of the reflecting surface 22 changes slightly depending on which part of the reflecting surface model is fitted to the approximate plane 51; however, when the position of the edge 23 of the reflecting surface 22 is determined, the part of the reflecting surface model to be fitted is also determined.
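Applying a plane equation to the division points amounts to a least-squares plane fit, which can be sketched with an SVD; this is an illustrative implementation, not text from the disclosure:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through 3-D points: returns (centroid, unit normal).
    The normal is the right singular vector of the smallest singular value."""
    pts = np.asarray(points, float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]
```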
- FIG. 8 is a diagram illustrating another process of determining the position and orientation of the reflecting surface of the split mirror by the reflecting surface position calculation unit of the position measuring apparatus according to Embodiment 1 of the present invention.
- in the process of FIG. 8, the position and orientation of the reflecting surface 22 are determined taking into account not only the positions of the feature points of the camera 1 and the marker object 2 and the positions of the feature points of the virtual camera 1M and the virtual marker object 2M, but also the position of the viewpoint 31 of the camera 1.
- a diagram denoted by reference numeral 121 indicates a line segment connecting the viewpoint 31 of the camera 1 and the feature points of the virtual camera 1M and the virtual marker object 2M with a dotted line.
- the diagram denoted by reference numeral 122 explains the method of determining the position of the division point on the leftmost line segment connecting the viewpoint 31 of the camera 1 to a feature point of the virtual camera 1M or the virtual marker object 2M (see the diagram denoted by reference numeral 121).
- a line segment connecting a feature point of the camera 1 or the marker object 2 to the corresponding feature point of the virtual camera 1M or the virtual marker object 2M (in the diagram denoted by reference numeral 122, the line segment connecting the leftmost feature point of the marker object 2 to the leftmost feature point of the virtual marker object 2M) is divided into two at the determined division ratio, and a plane 52 orthogonal to that line segment at the division point is determined.
- the intersection of this plane 52 and the line segment connecting the viewpoint 31 shown in the diagram with reference numeral 121 and the feature points of the virtual camera 1M and the virtual marker object 2M is obtained as a dividing point of the line segment.
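The construction shown in the diagrams of reference numerals 121 and 122 can be sketched as follows, assuming a known division ratio; the function name and interfaces are hypothetical:

```python
import numpy as np

def divide_via_orthogonal_plane(feat, mirror_feat, viewpoint, ratio=0.5):
    """Division point of FIG. 8: take the plane orthogonal to the segment
    feat -> mirror_feat at its division point, then intersect that plane with
    the line from the camera viewpoint to the mirror-image feature point."""
    feat = np.asarray(feat, float)
    mirror_feat = np.asarray(mirror_feat, float)
    viewpoint = np.asarray(viewpoint, float)
    plane_point = feat + ratio * (mirror_feat - feat)  # plane 52 passes here
    normal = mirror_feat - feat                        # plane 52 normal
    direction = mirror_feat - viewpoint                # line from the viewpoint
    t = np.dot(normal, plane_point - viewpoint) / np.dot(normal, direction)
    return viewpoint + t * direction
```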
- the diagram of reference numeral 123 is a diagram in which an approximate plane 51 passing through the dividing points of the plurality of line segments of the diagram of reference numeral 121 is obtained in the vicinity of a straight line passing through the viewpoint of the camera 1 and the viewpoint of the virtual camera 1M.
- the intersection of the approximate plane 51 with the straight line passing through the viewpoint of the camera 1 and the viewpoint of the virtual camera 1M is determined as the mirror image center point, and the position and orientation of the reflecting surface 22 are determined in the same manner by fitting the reflecting surface model so that it matches the approximate plane 51 in the vicinity of the mirror image center point.
- as the model of the reflecting surface 22 of the split mirror 21, a plane or a curved surface whose boundary is not determined is applied.
- the target object position determining unit 15 determines the position of the edge 23 of the reflecting surface 22 and determines the position of the split mirror 21 that is the target object.
- the target object position determination unit 15 determines the position of the split mirror 21 that is the target object based on the non-reflective portion of the split mirror 21 in the captured image.
- the target object position determination unit 15 extracts a non-reflective portion of the split mirror 21 from the captured image in order to determine the position of the target object.
- the edge 23 of the reflecting surface 22 is used as a non-reflecting part used for determining the position of the target object.
- the target object position determination unit 15 determines each straight line from the camera 1 toward a plurality of points on the edge 23 of the reflection surface 22 based on the position of the edge pixel in the captured image.
- the target object position determination unit 15 calculates the position of each intersection point between the surface including the reflection surface 22 calculated by the reflection surface position calculation unit 14 and each straight line, and sets each intersection point on the edge 23 of the reflection surface 22. Determine as each point.
- the target object position determination unit 15 calculates the distance between each point on the edge 23 and the mirror image center point 35 based on the captured image, calculates the position of the point on the reflecting surface 22 corresponding to the mirror image center point 35 based on these distances, and estimates the position and orientation of the reflecting surface 22 based on the reflecting surface model. The target object position determination unit 15 then determines the intersection of the straight line from the camera 1 toward each point on the edge 23 with the reflecting surface 22 as the position of each point on the edge 23.
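When the surface containing the reflecting surface 22 is (locally) a plane, each such intersection reduces to a ray-plane intersection; a minimal sketch under that assumption:

```python
import numpy as np

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Intersection of the ray origin + t * direction (t >= 0) with the plane
    through plane_point with the given normal; None for a parallel ray."""
    origin = np.asarray(origin, float)
    direction = np.asarray(direction, float)
    plane_normal = np.asarray(plane_normal, float)
    denom = float(np.dot(plane_normal, direction))
    if abs(denom) < 1e-12:
        return None  # ray parallel to the plane
    t = float(np.dot(plane_normal, np.asarray(plane_point, float) - origin)) / denom
    return None if t < 0 else origin + t * direction
```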
- the processing device 4 obtains the mirror image bodies corresponding to the mirror image of the camera 1 and the marker object 2 reflected on the reflecting surface 22 of the split mirror 21, and determines the position of the split mirror 21 in the three-dimensional space based on the obtained mirror image bodies.
- the position of the target object having the reflecting surface in the three-dimensional space can be determined without using a special light source and a wide dynamic range camera.
- the position of the mirror image bodies corresponding to the mirror image of the camera 1 and the marker object 2 reflected on the reflecting surface 22 of the split mirror 21 is calculated in this way, and the position and orientation of the reflecting surface 22 of the split mirror 21 can be estimated from the result by using the relationship of mirror image transformation. Accordingly, the position of the split mirror 21 in the three-dimensional space can be measured stably and at low cost, with simple adjustment, without using a special light source or a wide-dynamic-range camera.
- In Embodiment 2 of the present invention, in order to measure the position of the target object, the positions of the feature points included in the virtual photographed image are calculated without using the mirror image bodies of Embodiment 1, and the position and orientation of the reflecting surface are calculated so that the difference between the positions of the feature points in the actual captured image and the positions of the corresponding feature points in the virtual photographed image is less than a predetermined threshold value.
- FIG. 9 is a block diagram showing the configuration of the position measuring apparatus according to Embodiment 2 of the present invention.
- the position measuring device in FIG. 9 includes a processing device 4A instead of the processing device 4 in FIG.
- the processing device 4A includes a feature point extraction unit 11 and a position measurement unit 12A.
- the feature point extraction unit 11 of the processing device 4A operates in the same manner as the feature point extraction unit 11 of the processing device 4 of FIG.
- the position measuring unit 12A includes a reflecting surface position calculating unit 14A and a target object position determining unit 15.
- the reflection surface position calculation unit 14A virtually determines the position and orientation of the reflection surface based on the shape of the reflection surface 22 stored in advance as model data and the position of the feature point stored in advance as feature point data.
- the reflection surface position calculation unit 14A calculates the positions of a plurality of feature points included in a virtual photographed image obtained by virtually photographing, under the same photographing conditions as the actual photographed image, the camera 1 and the marker object 2 that are virtually reflected on a reflecting surface having the virtually determined position and orientation, and calculates the position and orientation of the reflecting surface 22 so that the positions of the plurality of feature points included in the virtual photographed image come close to the positions of the corresponding plurality of feature points in the photographed image.
- the target object position determination unit 15 determines the position of the split mirror 21 based on the positional relationship between the reflecting surface and the non-reflecting portion stored in advance as model data and on the position and orientation of the reflecting surface 22 calculated by the reflecting surface position calculation unit 14A.
- the reflecting surface position calculation unit 14A virtually determines the position and orientation of the reflecting surface 22, and uses a reflecting surface (referred to as a virtual reflecting surface) having the virtually determined position and orientation to calculate the actual position and orientation of the reflecting surface 22.
- the reflection surface position calculation unit 14A calculates the positions of the feature points included in a virtual captured image, that is, an image obtained by virtually photographing the camera 1 and the marker object 2 in a state where they are virtually reflected on the virtual reflecting surface.
- the reflection surface position calculation unit 14A repeatedly corrects the virtually determined position and orientation of the reflecting surface 22 so that the difference between the positions of the feature points in the virtual captured image and the positions of the corresponding feature points in the actually captured image becomes smaller than the threshold value.
- that is, the reflecting surface position calculation unit 14A corrects the position and orientation of the virtual reflecting surface by repeated calculation so that the positions of the plurality of feature points included in the virtual captured image, obtained by virtually photographing the mirror image of the camera 1 and the marker object 2 virtually reflected on the virtual reflecting surface under the same shooting conditions as when the actual captured image is acquired by the camera 1, come close to the positions of the corresponding plurality of feature points in the actual captured image.
- the reflection surface position calculation unit 14A performs this calculation by changing the position and orientation of the virtual reflecting surface so as to reduce the difference between the positions of the feature points in the virtual captured image and the positions of the corresponding feature points in the actual captured image.
- the reflecting surface position calculation unit 14A includes a projective conversion unit 16 and a size changing unit 17.
- the projection conversion unit 16 and the size change unit 17 of the reflection surface position calculation unit 14A operate in the same manner as the projection conversion unit 16 and the size change unit 17 of the mirror body position calculation unit 13 of FIG.
- the reflection surface position calculation unit 14A changes the position and orientation of the virtual reflection surface based on the projection conversion determined by the projection conversion unit 16 and the size change rate determined by the size change unit 17.
- FIG. 10 is a flowchart for explaining position measurement processing executed by the position measurement apparatus according to Embodiment 2 of the present invention.
- FIG. 11 is a flowchart for explaining the reflection surface position calculation process of the position measurement process executed by the position measurement apparatus according to the second embodiment of the present invention.
- In step S21, the feature point extraction unit 11 extracts feature points from the actual captured image captured by the camera 1, and sends information about the extracted feature points to the position measurement unit 12A.
- In step S22, the reflection surface position calculation unit 14A executes the reflection surface position calculation process, and calculates the position and orientation of the reflecting surface 22 by calculating the position and orientation of the virtual reflecting surface.
- In step S31 of FIG. 11, the reflecting surface position calculation unit 14A determines the initial values of the position and orientation of the virtual reflecting surface.
- In step S32, the reflection surface position calculation unit 14A calculates the positions of the feature points in the virtual captured image in which the marker object 2 is reflected on the virtual reflecting surface, based on the position and orientation of the virtual reflecting surface and the model of the reflecting surface 22.
- In step S33, the reflecting surface position calculation unit 14A determines whether the difference between the positions of the feature points in the captured image and the positions of the corresponding feature points in the virtual captured image is less than the threshold value.
- If the difference between the positions of the feature points is less than the threshold value, that is, if step S33 is YES, the process of the flowchart of FIG. 11 is terminated and the process proceeds to step S23 of FIG. 10. If the difference in the positions of the feature points is greater than or equal to the threshold value, that is, if step S33 is NO, the process proceeds to step S34.
- In step S34, the reflection surface position calculation unit 14A uses the projective conversion unit 16 to determine the projective transformation to be applied to the positions of the feature points in the actual captured image or the positions of the feature points in the virtual captured image, such that the figure formed by the feature points in the virtual captured image and the figure formed by the corresponding feature points in the actual captured image become similar to each other.
- the determined projective transformation is applied to the position of the feature point in the actual captured image or the position of the feature point in the virtual captured image.
- the reflection surface position calculation unit 14A uses the size changing unit 17 to determine the size change rate to be applied to the figure formed by the feature points in the virtual captured image or by the feature points in the actual captured image, so that the difference between the positions of the feature points in the virtual captured image and the positions of the corresponding feature points in the actual captured image is minimized without changing the position of the mirror image center point 35 in the image.
- In step S37, the reflection surface position calculation unit 14A changes the orientation of the virtual reflecting surface based on the determined projective transformation.
- the reflecting surface position calculation unit 14A changes the distance from the camera 1 to the virtual reflecting surface based on the determined size change rate.
- the reflecting surface position calculation unit 14A repeats steps S32 to S38 until the difference between the position of the feature point in the actual captured image and the position of the corresponding feature point in the virtual captured image is less than the threshold value.
- the reflective surface position calculation unit 14A acquires the position and orientation of the reflective surface 22 by iterative calculation.
- In step S23 of FIG. 10, the target object position determination unit 15 determines the position of the edge 23 of the reflecting surface 22 based on the non-reflective portion of the split mirror 21 in the captured image, thereby determining the position of the split mirror 21 that is the target object.
- in Embodiment 1, the position of the mirror image bodies is calculated repeatedly, whereas in Embodiment 2, the position and orientation of the virtual reflecting surface are calculated repeatedly.
- the processing device 4A determines the position of the split mirror 21 in the three-dimensional space based on the virtual photographed image obtained by virtually photographing the camera 1 and the marker object 2 that are virtually reflected on the virtual reflecting surface.
- the position of the target object having the reflecting surface in the three-dimensional space can be determined without using a special light source and a wide dynamic range camera.
- In Embodiment 3 of the present invention, even when the radius of curvature of the reflecting surface is unknown, the position, orientation, and radius of curvature of the reflecting surface are calculated so that the difference between the positions of the feature points in the actual captured image and the positions of the corresponding feature points in the virtual captured image becomes less than a predetermined threshold value.
- in other respects, Embodiment 3 is the same as Embodiment 2.
- FIG. 12 is a block diagram showing the configuration of the position measuring apparatus according to Embodiment 3 of the present invention.
- the position measuring device in FIG. 12 includes a processing device 4B instead of the processing device 4A in FIG.
- the model data stored in the storage device 3 includes the shape of the reflective surface 22 that does not include the radius of curvature and the positional relationship between the reflective surface and the non-reflective portion.
- the processing device 4B includes a feature point extraction unit 11 and a position measurement unit 12B.
- the feature point extraction unit 11 of the processing device 4B operates in the same manner as the feature point extraction unit 11 of the processing device 4 of FIG.
- the position measuring unit 12B includes a reflecting surface position calculating unit 14B and a target object position determining unit 15.
- the reflecting surface position calculation unit 14B calculates the position, orientation, and radius of curvature of the reflecting surface 22 by calculating the position, orientation, and radius of curvature of a virtual reflecting surface, based on the shape of the reflecting surface 22 stored in advance as model data and the positions of the feature points stored in advance as feature point data.
- the reflection surface position calculation unit 14B calculates the positions of a plurality of feature points included in a virtual captured image obtained by virtually photographing, under the same shooting conditions as when the camera 1 acquires the actual captured image, the camera 1 and the marker object 2 that are virtually reflected on the virtual reflecting surface, and calculates the position, orientation, and radius of curvature of the virtual reflecting surface so that the positions of the plurality of feature points included in the virtual captured image come close to the positions of the corresponding plurality of feature points in the actual captured image.
- the target object position determination unit 15 determines the position of the split mirror 21 based on the positional relationship between the reflecting surface and the non-reflecting portion stored in advance as model data and on the position and orientation of the reflecting surface 22 calculated by the reflecting surface position calculation unit 14B.
- the reflection surface position calculation unit 14B performs this calculation by changing the position, orientation, and radius of curvature of the virtual reflecting surface so as to reduce the difference between the positions of the feature points in the virtual captured image and the positions of the corresponding feature points in the actual captured image.
- the reflection surface position calculation unit 14B includes a projection conversion unit 16, a size change unit 17, and a curvature calculation unit 18.
- the projection conversion unit 16 and the size change unit 17 of the reflection surface position calculation unit 14B operate in the same manner as the projection conversion unit 16 and the size change unit 17 of the mirror body position calculation unit 13 of FIG.
- the curvature calculation unit 18 estimates the radius of curvature of the reflecting surface 22 based on the differences between the positions of three or more feature points in the captured image and the positions of the corresponding feature points in the virtual captured image, and on the distances from the mirror image center point 35 in the captured image or the virtual captured image to those feature points. The difference between the position of a feature point in the actually captured image and the position of the corresponding feature point in the virtual captured image varies according to the difference between the assumed radius of curvature of the reflecting surface 22 and the actual radius of curvature.
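One way to expose such a mismatch, offered here only as an illustrative model and not as the patent's formula, is to fit a radial scaling of the feature-point radii about the in-image mirror image center point; a nonzero quadratic coefficient signals residual curvature between the assumed and actual radius:

```python
import numpy as np

def radial_distortion_coefficient(center, virtual_pts, actual_pts):
    """Fit actual_r = (a + b * virtual_r**2) * virtual_r to feature-point radii
    about the in-image mirror image center; b ~ 0 means the assumed radius of
    curvature already matches, b != 0 signals a mismatch to correct."""
    vr = np.linalg.norm(np.asarray(virtual_pts, float) - center, axis=1)
    ar = np.linalg.norm(np.asarray(actual_pts, float) - center, axis=1)
    # Linear least squares in (a, b): ar / vr = a + b * vr**2.
    A = np.column_stack([np.ones_like(vr), vr ** 2])
    (a, b), *_ = np.linalg.lstsq(A, ar / vr, rcond=None)
    return float(a), float(b)
```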
- the reflection surface position calculation unit 14B changes the position, orientation, and radius of curvature of the virtual reflecting surface based on the projective transformation determined by the projection conversion unit 16, the size change rate determined by the size change unit 17, and the radius of curvature calculated by the curvature calculation unit 18.
- FIG. 13 is a flowchart for explaining position measurement processing executed by the position measurement apparatus according to Embodiment 3 of the present invention.
- FIG. 14 is a flowchart for explaining the reflection surface position calculation process of the position measurement process executed by the position measurement apparatus according to the third embodiment of the present invention.
- In step S41, the feature point extraction unit 11 extracts feature points from the actual captured image captured by the camera 1, and sends information about the extracted feature points to the position measurement unit 12B.
- In step S42, the reflecting surface position calculation unit 14B executes the reflecting surface position calculation process, and calculates the position, orientation, and radius of curvature of the reflecting surface 22 by calculating the position, orientation, and radius of curvature of the virtual reflecting surface.
- In step S51 shown in FIG. 14, the reflection surface position calculation unit 14B determines the initial values of the position, orientation, and radius of curvature of the virtual reflecting surface.
- In step S52, the reflection surface position calculation unit 14B calculates the positions of the feature points in the virtual photographed image in which the marker object 2 is reflected on the virtual reflecting surface, based on the position, orientation, and radius of curvature of the virtual reflecting surface.
- In step S53, the reflection surface position calculation unit 14B determines whether the difference between the positions of the feature points in the captured image and the positions of the corresponding feature points in the virtual captured image is less than the threshold value. If the difference between the positions of the feature points is less than the threshold value, that is, if step S53 is YES, the processing of the flowchart in FIG. 14 is terminated. If the difference between the positions of the feature points is greater than or equal to the threshold value, that is, if step S53 is NO, the process proceeds to step S54.
- In step S54, the reflecting surface position calculation unit 14B uses the projective conversion unit 16 to determine the projective transformation to be applied to the positions of the feature points in the actual captured image or in the virtual captured image. Specifically, the projective transformation is determined so that the figure formed by the feature points in the actual captured image included in a predetermined range near the mirror image center point 35 and the figure formed by the corresponding feature points in the virtual captured image become similar to each other.
- The determined projective transformation is then applied to the positions of the feature points in the actual captured image or in the virtual captured image.
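- The patent text does not fix a particular algorithm for computing the projective transformation. As an illustrative sketch only, a projective transformation (homography) mapping one set of feature-point positions onto another can be estimated from four or more correspondences by the direct linear transform (DLT); all function names below are ours, not the patent's.

```python
import numpy as np

def estimate_homography(src, dst):
    """Projective transformation (3x3 homography) mapping src feature-point
    positions to dst positions, estimated by the direct linear transform.
    src, dst: (N, 2) arrays of corresponding points, N >= 4."""
    A = []
    for (x, y), (u, v) in zip(np.asarray(src, float), np.asarray(dst, float)):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography vector is the null vector of A (smallest singular value).
    _, _, vt = np.linalg.svd(np.asarray(A))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_homography(H, pts):
    """Apply the homography to (N, 2) feature-point positions."""
    pts_h = np.hstack([np.asarray(pts, float), np.ones((len(pts), 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]
```

For a pure translation of the feature points, for example, `estimate_homography` recovers a matrix close to the identity with the translation in the last column.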
- Next, the reflecting surface position calculation unit 14B uses the size changing unit 17 to determine the size change rate to be applied to the figure formed by the feature points in the virtual captured image or by the feature points in the actual captured image. Specifically, the size change rate is determined so as to minimize the difference between the positions of the feature points in the actual captured image included in the predetermined range near the mirror image center point 35 and the positions of the corresponding feature points in the virtual captured image.
- The predetermined range near the mirror image center point 35 used for obtaining the size change rate is determined so that a sufficient number of feature points are included in it, and so that the difference, caused by the curvature of the reflecting surface 22, between the positions of the feature points in the actual captured image and the positions of the corresponding feature points in the virtual captured image remains small enough to be allowable.
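- If, as one possible formulation not spelled out in the patent, the size change is modeled as a uniform scaling of the virtual feature points about the mirror image center point 35, the size change rate minimizing the sum of squared position differences has a closed form:

```python
import numpy as np

def best_scale(real_pts, virt_pts, center):
    """Size change rate s minimizing sum ||(real - c) - s * (virt - c)||^2,
    i.e. the least-squares scaling of the virtual feature points about the
    mirror image center point onto the real feature points."""
    c = np.asarray(center, float)
    a = np.asarray(real_pts, float) - c   # real points relative to the center
    b = np.asarray(virt_pts, float) - c   # virtual points relative to the center
    return float(np.sum(a * b) / np.sum(b * b))
```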
- In step S57, the reflecting surface position calculation unit 14B uses the curvature calculation unit 18 to calculate, based on the difference between the positions of the feature points in the actual captured image and the positions of the corresponding feature points in the virtual captured image, the difference between the assumed curvature radius and the actual curvature radius, and determines the curvature radius of the virtual reflecting surface to be used in the next iteration.
- FIG. 15 is a diagram illustrating an example of the principle of estimating the curvature radius of the reflecting surface by the curvature calculation unit 18 of the position measurement device according to the third embodiment of the present invention.
- The diagram of reference numeral 131 shows the reflecting surface 22 and the mirror image 44 of the marker object 2 as they appear in the actual captured image when the reflecting surface 22 is a curved surface.
- the mirror image 44 of the marker object 2 has a plurality of feature points 44a.
- The diagram of reference numeral 132 shows the image 45 of the marker object 2 in the virtual captured image after applying the projective transformation determined in step S54 of FIG. 14 and the size change rate determined in step S55 of FIG. 14.
- the image 45 of the marker object 2 includes a plurality of feature points 45a.
- In other words, the diagram of reference numeral 132 shows the positions of the feature points 45a included in the virtual captured image obtained by virtually photographing, under the predetermined photographing conditions, the marker object 2 virtually reflected on a flat reflecting mirror.
- the diagram of reference numeral 133 shows the difference between the position of the feature point 44a of reference numeral 131 indicated by a black circle and the position of the feature point 45a of reference numeral 132 indicated by a white circle.
- When an actual straight line is reflected on the curved reflecting surface 22, in the mirror image 24 on the reflecting surface 22 the straight line appears as a curve in the portion far from the mirror image center point 35.
- The angle difference between the paths from the viewpoint position toward the two ends of a straight line reflected on the reflecting surface 22 changes, according to the radius of curvature of the reflecting surface 22, from the case where the straight line is reflected by a plane mirror. For example, when the curved surface is concave, the smaller the radius of curvature, the larger this angle difference becomes compared with reflection by a plane mirror. Furthermore, the farther a point on the reflected straight line is from the mirror image center point, the more this angle difference increases, so that the straight line appears curved.
- Therefore, the radius of curvature of the reflecting surface 22 can be estimated based on the degree of bending of the curve into which an actual straight line is reflected by the curved surface.
- The positions of the feature points to which the projective transformation determined in step S54 of FIG. 14 and the size change rate determined in step S55 of FIG. 14 have been applied contain a distortion due to the curved surface with the curvature radius assumed for the reflecting surface 22.
- The difference between the positions of all the feature points 44a in the actual captured image and the positions of the corresponding feature points 45a in the virtual captured image after applying the determined projective transformation and size change rate depends on the difference between the assumed radius of curvature and the actual radius of curvature when the reflecting surface 22 is a curved surface, and also on the distance from the mirror image center point 35 in the actual or virtual captured image to the position of the feature point. Therefore, based on the difference between the position of each feature point 44a in the actual captured image and the position of the corresponding feature point 45a in the virtual captured image, and on the distance from the mirror image center point 35 to the feature point 44a or 45a in the actual or virtual captured image, the radius of curvature of the reflecting surface 22 can be estimated.
- Specifically, the radius of curvature is calculated based on the ratio between the distance on the image between the mirror image center point 35 and a feature point in the actual or virtual captured image and the difference (displacement) in the position of the feature point at that distance.
- The distance from the mirror image center point 35 and the positional difference of a feature point on the actual or virtual captured image are each expressed either by the number of pixels corresponding to the distance or positional difference, or by the angle difference between the straight lines from the viewpoint position of the camera.
- A correction amount for obtaining the radius of curvature to be used in the next iterative calculation may be calculated, for example, by weighted averaging of a plurality of curvature radii obtained from a plurality of feature points.
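- Such a weighted average might, for example, weight each feature point by its distance from the mirror image center point 35, since the displacement there is larger; this particular weighting is an assumption of the sketch below, not prescribed by the patent.

```python
import numpy as np

def combined_radius(radii, center_distances):
    """Weighted average of curvature-radius estimates obtained from several
    feature points, weighted here by distance from the mirror image center."""
    r = np.asarray(radii, float)
    w = np.asarray(center_distances, float)
    return float(np.sum(w * r) / np.sum(w))
```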
- Next, the reflecting surface position calculation unit 14B changes the orientation of the virtual reflecting surface based on the determined projective transformation.
- the reflection surface position calculation unit 14B changes the distance from the camera 1 to the virtual reflection surface based on the determined size change rate.
- the reflection surface position calculation unit 14B changes the curvature radius of the virtual reflection surface based on the determined curvature radius.
- the reflecting surface position calculation unit 14B repeats steps S52 to S60 until the difference between the position of the feature point in the actual captured image and the position of the corresponding feature point in the virtual captured image is less than the threshold value.
- the reflective surface position calculation unit 14B obtains the position, orientation, and radius of curvature of the reflective surface 22 by iterative calculation.
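- The structure of this iteration can be illustrated with a deliberately simplified one-parameter analogue: a plane mirror at unknown distance d, where the virtual captured image of an object of size obj_size has feature spacing focal * obj_size / (2 d), and only the size change rate is used to update the virtual mirror distance. This toy model ignores orientation and curvature and is not the patent's algorithm; all names are illustrative.

```python
def measure_distance(actual_spacing, focal, obj_size, d0=1.0, tol=1e-9, max_iter=50):
    """Toy one-parameter analogue of the iterative loop of FIG. 14:
    repeat until the virtual and actual feature spacings agree."""
    d = d0
    for _ in range(max_iter):
        virtual_spacing = focal * obj_size / (2.0 * d)   # cf. step S52
        if abs(virtual_spacing - actual_spacing) < tol:  # cf. step S53
            break
        s = actual_spacing / virtual_spacing             # size change rate
        d = d / s                                        # move the virtual mirror
    return d
```

Starting from d0 = 1, the loop converges to the distance that generated the observed spacing.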
- In step S43 in FIG. 13, the target object position determination unit 15 determines the position of the edge 23 of the reflecting surface 22 based on the non-reflective portion of the split mirror 21 in the captured image, and thereby determines the position of the split mirror 21 that is the target object.
- the curvature radius of the reflecting surface 22 may be determined so as to have different values in different directions.
- the reflecting surface 22 may be modeled such that the radius of curvature changes according to a formula determined in advance according to the position in the reflecting surface 22.
- the reflecting surface 22 may be expressed by mathematical expressions such as a paraboloid and a hyperboloid, and parameters used in the mathematical expressions may be changed.
- As described above, the processing device 4B determines the position of the split mirror 21 in the three-dimensional space based on the virtual captured image obtained by virtually photographing the camera 1 and the marker object 2 that are virtually reflected on the virtual reflecting surface.
- the position of the target object having the reflecting surface in the three-dimensional space can be determined without using a special light source and a wide dynamic range camera.
- In Embodiment 4, the position and posture of a reflecting surface that has a known shape with radii of curvature differing depending on direction are determined.
- FIG. 16 is a block diagram showing a configuration of a position measuring apparatus according to Embodiment 4 of the present invention. It is assumed that the reflecting surface 22C of the split mirror 21C can be approximated by a combination of a concave spherical surface and a concave cylindrical surface when viewed from the camera 1.
- The direction along which the amount of depression of the cylindrical surface is maximal and constant is the direction in which the radius of curvature is maximum, that is, in which the curvature is minimum. Therefore, the direction in which the amount of depression of the cylindrical surface is constant is called the minimum curvature direction.
- For the reflecting surface 22C shown in the display device 5 of FIG. 16, the direction of the diagonal line from the upper right corner to the lower left corner is the minimum curvature direction of the reflecting surface 22C.
- the posture of the reflecting surface 22C includes information on which direction the minimum curvature direction is directed.
- The position measuring device of FIG. 16 includes a processing device 4C instead of the processing device 4 of FIG. 1.
- the processing device 4C includes a feature point extraction unit 11 and a position measurement unit 12C.
- the feature point extraction unit 11 of the processing device 4C operates in the same manner as the feature point extraction unit 11 of the processing device 4 of FIG.
- the position measurement unit 12C includes a reflection surface position calculation unit 14C and a target object position determination unit 15C.
- Based on the shape of the reflecting surface 22C stored in advance as model data and the positions of the feature points stored in advance as feature point data, the reflecting surface position calculation unit 14C calculates the position and orientation of the reflecting surface 22C by calculating the position and orientation of the virtual reflecting surface on which the camera 1 and the marker object 2 are virtually reflected.
- Specifically, the reflecting surface position calculation unit 14C calculates the positions of a plurality of feature points included in a virtual captured image obtained by virtually photographing, under the same photographing conditions as when the actual captured image was acquired by the camera 1, the camera 1 and the marker object 2 virtually reflected on the virtual reflecting surface, and brings the positions of the plurality of feature points included in the virtual captured image close to the positions of the corresponding plurality of feature points in the actual captured image.
- The target object position determination unit 15C determines the minimum curvature direction of the reflecting surface 22C based on the positional relationship between the reflecting surface and the non-reflective part stored in advance as model data and on the position and orientation of the reflecting surface 22C calculated by the reflecting surface position calculation unit 14C, and further determines the position of the split mirror 21.
- Specifically, the reflecting surface position calculation unit 14C calculates the position and orientation of the reflecting surface 22C by changing the position and orientation of the virtual reflecting surface so as to reduce the difference between the positions of the feature points in the virtual captured image and the positions of the corresponding feature points in the actual captured image. The reflecting surface position calculation unit 14C includes a projective conversion unit 16 and a size changing unit 17.
- The projective conversion unit 16 and the size changing unit 17 of the reflecting surface position calculation unit 14C operate in the same manner as the projective conversion unit 16 and the size changing unit 17 of the mirror body position calculation unit 13 of FIG. 1.
- The reflecting surface position calculation unit 14C changes the position and orientation of the virtual reflecting surface based on the projective transformation determined by the projective conversion unit 16 and the size change rate determined by the size changing unit 17.
- FIG. 17 is a flowchart for explaining a position measurement process executed by the position measurement apparatus according to the fourth embodiment of the present invention.
- the position measurement process in FIG. 17 is executed by the processing device 4C in FIG. 16 in order to measure the position of the target object.
- In step S61, the feature point extraction unit 11 extracts feature points from the actual captured image captured by the camera 1, and sends information about the extracted feature points to the position measurement unit 12C.
- In step S62, the reflecting surface position calculation unit 14C determines the initial values of the position and orientation of the virtual reflecting surface and of the minimum curvature direction.
- In step S63, the reflecting surface position calculation unit 14C calculates the positions of the feature points in the virtual captured image in which the marker object 2 is reflected on the virtual reflecting surface, based on the position, posture, and minimum curvature direction of the virtual reflecting surface.
- In step S64, the reflecting surface position calculation unit 14C determines whether or not the difference between the position of each feature point in the actual captured image and the position of the corresponding feature point in the virtual captured image is less than the threshold value. If the difference is less than the threshold value (YES in step S64), the process ends.
- In step S65, the reflecting surface position calculation unit 14C uses the projective conversion unit 16 to determine the projective transformation to be applied to the positions of the feature points in the actual captured image or in the virtual captured image, so that the figure formed by the feature points in the virtual captured image and the figure formed by the corresponding feature points in the actual captured image become close to each other.
- In step S66, the determined projective transformation is applied to the positions of the feature points in the actual captured image or in the virtual captured image.
- In step S67, the reflecting surface position calculation unit 14C uses the size changing unit 17 to determine the size change rate to be applied to the figure formed by the feature points in the virtual captured image or by the feature points in the actual captured image, so as to minimize the difference between the positions of the feature points in the virtual captured image and the positions of the corresponding feature points in the actual captured image.
- In step S68, the reflecting surface position calculation unit 14C changes the posture of the virtual reflecting surface based on the determined projective transformation.
- In step S69, the reflecting surface position calculation unit 14C changes the distance from the camera 1 to the virtual reflecting surface based on the determined size change rate.
- In step S70, the target object position determination unit 15C determines the position of the split mirror 21 and the minimum curvature direction based on the non-reflective portion of the split mirror 21 in the captured image.
- The reflecting surface position calculation unit 14C and the target object position determination unit 15C repeatedly perform steps S63 to S70.
- the reflection surface position calculation unit 14C and the target object position determination unit 15C acquire the position, posture, and minimum curvature direction of the reflection surface 22C by iterative calculation.
- As described above, the processing device 4C determines the position of the split mirror 21C in the three-dimensional space based on the virtual captured image obtained by virtually photographing the camera 1 and the marker object 2 that are virtually reflected on the virtual reflecting surface.
- the position of the target object having the reflecting surface in the three-dimensional space can be determined without using a special light source and a wide dynamic range camera.
- Embodiment 5. In the position measurement apparatus according to the first embodiment, the positional relationship between the feature points stored in advance as the feature point data is a mirror image relationship (an inverted positional relationship) with respect to the actual positional relationship between the feature points. In contrast, in the fifth embodiment, the actual, non-inverted positional relationship between the feature points is used as the feature point data.
- FIG. 18 is a block diagram showing a configuration of a position measuring apparatus according to Embodiment 5 of the present invention.
- The position measuring device of FIG. 18 includes a processing device 4D instead of the processing device 4 of FIG. 1.
- The processing device 4D includes an image reversing unit 19 in addition to the feature point extraction unit 11 and the position measurement unit 12 of FIG. 1.
- The image reversing unit 19 sends to the feature point extraction unit 11, as the captured image, a reversed image 6D obtained by horizontally inverting the image photographed by the camera 1.
- The reversed image 6D includes an image in which the reflecting surface 22 and the edge 23 of the split mirror 21 are horizontally inverted, and a mirror image 24D of the camera 1 and the marker object 2 reflected on the reflecting surface 22 of the split mirror 21 (that is, an image of the camera 1 and the marker object 2 having the same positional relationship as the actual one).
- The storage device 3 (third storage unit) stores in advance, as the feature point data, the actual positional relationship between the feature points (that is, feature points having a non-inverted positional relationship).
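- The left-right inversion performed by the image reversing unit 19 is an ordinary horizontal flip; with the image held as an array, it is a single reversal of the column axis (a sketch, assuming a row-major image array):

```python
import numpy as np

def invert_image(image):
    """Reversed image 6D: flip the captured image left to right, so the
    mirror image in it has the same, non-inverted positional relationship
    as the actual feature points."""
    return np.asarray(image)[:, ::-1]
```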
- In other respects, the fifth embodiment is the same as the first embodiment.
- the processing device 4D executes the position measurement process of FIGS.
- The position measurement unit 12 in FIG. 18 calculates the positions of a plurality of feature points included in the mirror image, similarly to the position measurement unit 12 in FIG. 1, and can calculate the position and orientation of the reflecting surface 22 based on the positions of the feature points of the mirror body.
- Specifically, the mirror body position calculation unit 13 calculates the positions of the feature points included in the mirror body based on the captured image, which is a reversed image, and on the feature point data having a non-inverted positional relationship.
- the reflection surface position calculation unit 14 and the target object position determination unit 15 calculate the position and orientation of the reflection surface 22 and determine the position of the split mirror 21.
- the image inverting unit 19 and feature point data (the same positional relationship between actual feature points) of the fifth embodiment can be applied to the second to fourth embodiments.
- FIG. 19 is a block diagram showing a configuration of a position measuring apparatus according to Embodiment 6 of the present invention.
- The position measuring device of FIG. 19 includes, in addition to the configuration of the position measuring device of FIG. 1, a drive device 61 that moves the camera 1 (and the marker object 2). The position measuring device of FIG. 19 also includes a processing device 4E instead of the processing device 4 of FIG. 1.
- The processing device 4E further includes a camera control unit 71 that controls the driving device 61 and the shooting direction of the camera 1.
- the drive device 61 and the camera control unit 71 constitute a camera drive device that moves the camera 1 so that at least one of the viewpoint position and the shooting direction of the camera 1 is different.
- The field of view may be widened by moving the camera 1 with the driving device 61 as shown in FIG. 19.
- The coordinate origin is set not at the position of the moving camera 1 but at the position of the driving device 61 or at another fixed position, so that a consistent coordinate system can be used even when the camera 1 moves.
- the position of the camera 1 is calibrated with respect to the drive device 61 in advance.
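- With the camera pose calibrated relative to the drive device 61, a point measured in the moving camera's frame can be expressed in the fixed coordinate system anchored at the drive device by one rigid transform. A sketch (R_drive and t_drive denote the calibrated rotation and translation; the names are ours):

```python
import numpy as np

def camera_to_world(p_cam, R_drive, t_drive):
    """Express a point given in the camera frame in the fixed frame of the
    driving device 61, using the pre-calibrated camera pose."""
    return np.asarray(R_drive, float) @ np.asarray(p_cam, float) + np.asarray(t_drive, float)
```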
- the processing device 4E captures the split mirror 21 under at least two sets of shooting conditions among a plurality of sets of shooting conditions in which at least one of the viewpoint position and the shooting direction is different, and acquires a plurality of shot images. For example, the processing device 4E moves the camera 1 with the driving device 61, and images the split mirror 21 at a plurality of positions (that is, under a plurality of sets of imaging conditions) to acquire a plurality of captured images.
- the processing device 4E stores a combination of a plurality of captured images and imaging conditions when each captured image is captured in the storage device 3 (first storage unit).
- the feature point extraction unit of the processing device 4E extracts a plurality of feature points from the plurality of photographed images, respectively, and determines the positions of the feature points in the plurality of photographed images.
- a set including the positions of a plurality of feature points extracted from one captured image is referred to as a feature point set.
- the processing device 4E stores the position of the feature point in the extracted captured image in the storage device 3 (first storage unit) in combination with the corresponding captured image and shooting conditions.
- the storage device 3 (first storage unit) stores a plurality of captured data sets each including a captured image, an imaging condition when the captured image is captured, and a feature point set of the captured image.
- the imaging data set may include other information.
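- One possible in-memory layout for such an imaging data set could look as follows (the field names are illustrative, not taken from the patent):

```python
from dataclasses import dataclass, field
from typing import Any, List, Tuple

@dataclass
class CaptureDataSet:
    """One record of the first storage unit: a captured image, the imaging
    conditions (viewpoint position and shooting direction) under which it
    was captured, and the feature point set extracted from it."""
    image: Any
    viewpoint: Tuple[float, float, float]
    direction: Tuple[float, float, float]
    feature_points: List[Tuple[float, float]] = field(default_factory=list)
```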
- The position measurement unit of the processing device 4E may perform the position measurement processing on one imaging data set in the same manner as any of the processing devices of the first to fifth embodiments, and may thereby determine the position of the split mirror 21 in the three-dimensional space.
- the processing device 4E may determine the position of the split mirror 21 in the three-dimensional space by performing a position measurement process by combining a plurality of imaging data sets. In this way, the position of the split mirror 21 having a size larger than the field of view of the camera 1 in the three-dimensional space can be measured.
- The position measurement unit of the processing device 4E may calculate the position of the split mirror 21 based on the model data, the feature point data, the plurality of feature point sets respectively corresponding to the plurality of captured images, and the plurality of imaging conditions respectively corresponding to the plurality of captured images.
- For each imaging data set, the mirror body position calculation unit may calculate the positions of the plurality of feature points included in the mirror body so that the positions of the plurality of feature points included in the virtual captured image, obtained by virtually photographing the mirror body under the imaging conditions of that data set, are close to the positions of the corresponding plurality of feature points in the actual captured image captured under those imaging conditions.
- In this way, the mirror body position calculation unit obtains a plurality of mirror bodies respectively corresponding to the plurality of imaging data sets.
- The reflecting surface position calculation unit of the processing device 4E may calculate the position and orientation of the reflecting surface 22 based on the positions of the feature points of the camera 1 and the marker object 2 determined from the viewpoint positions and the feature point data, the positions of the plurality of feature points included in the plurality of mirror bodies respectively corresponding to the plurality of imaging data sets, and the radius of curvature of the reflecting surface 22.
- the reflection surface position calculation unit may process the positions of the feature points of the plurality of mirror images respectively corresponding to the plurality of imaging data sets separately or in combination.
- the combination method is arbitrary.
- the positions of the feature points of each mirror image corresponding to all the imaging data sets may be processed simultaneously or sequentially.
- the positions of the feature points of each mirror image corresponding to each shooting data set may be processed simultaneously.
- The reflecting surface position calculation unit may further process, in combination, the position and orientation of the reflecting surface calculated separately for each imaging data set and the position and orientation of the reflecting surface calculated by processing a combination of a plurality of imaging data sets.
- The target object position determination unit of the processing device 4E may determine the position of the split mirror 21, which is the target object, based on the model data, by processing each part of the split mirror 21 captured in the plurality of images separately, or by processing a plurality of parts of the split mirror 21 captured in the plurality of images simultaneously.
- The reflecting surface position calculation unit may process the virtual reflecting surfaces corresponding to the plurality of imaging data sets separately, or may process them in arbitrary combinations.
- That is, the reflecting surface position calculation unit calculates the position and orientation of the reflecting surface 22 from the input data corresponding to the plurality of imaging data sets, namely the positions of the feature points of the plurality of mirror bodies (Embodiment 1) or the plurality of feature point sets (Embodiments 2 to 4).
- the reflection surface position calculation unit of the processing device 4E may calculate the positions and orientations of the plurality of reflection surfaces 22 respectively corresponding to the plurality of imaging data sets.
- The target object position determination unit of the processing device 4E may determine the position of the split mirror 21 based on the combination of the positional relationship between the reflecting surface and the non-reflective part stored in advance as model data and the positions and orientations of the plurality of reflecting surfaces 22 calculated by the reflecting surface position calculation unit.
- the description of the sixth embodiment can be applied to the case where the position of the target object is obtained using captured images captured under different imaging conditions in other embodiments.
- FIG. 20 is a block diagram showing a configuration of the position measuring apparatus according to the seventh embodiment of the present invention.
- The position measuring device of FIG. 20 includes, in addition to the configuration of the position measuring device of FIG. 19, for example, three cameras 1a, 1b, and 1c, marker objects 2a, 2b, and 2c, and a fixed jig 62 that supports the cameras 1a, 1b, and 1c.
- Each of the cameras 1a, 1b, and 1c captures different areas where different points on the edge 23 of the reflecting surface 22 enter.
- The position measuring device of FIG. 20 includes a processing device 4F instead of the processing device 4E of the sixth embodiment.
- the processing device 4F is substantially the same as the processing device 4E, but includes a camera control unit 71F that controls the shooting directions of the cameras 1a, 1b, and 1c.
- the position measuring device may include a driving device that moves at least one of the plurality of cameras.
- the camera control unit 71F also controls the driving device.
- the viewpoint position and shooting direction of at least one camera may be fixed.
- the positional relationship between the cameras 1a, 1b, 1c and the marker objects 2a, 2b, 2c may be different for each camera or may be the same.
- Thereby, the position of the entire split mirror 21 in the three-dimensional space can be measured at one time without moving the cameras.
- The positions of the cameras 1a, 1b, and 1c are calibrated in advance in a world coordinate system that is set in advance as a reference.
- The processing device 4F extracts different sections of the edge 23 of the reflecting surface 22 from the captured images of the split mirror 21 captured by the cameras 1a, 1b, and 1c, and, based on the positions of the pixels on the edge in each captured image, determines, for each of the cameras 1a, 1b, and 1c, straight lines from that camera toward at least one point on the corresponding section of the edge 23 of the reflecting surface 22. In this way, the position of one or more points on the edge 23 of the reflecting surface 22 is measured from each of the cameras 1a, 1b, and 1c. For example, when the position measuring device includes two cameras, straight lines toward at least three points in total on the corresponding sections of the edge 23 of the reflecting surface 22 are determined for the two cameras combined.
- The position of each intersection between the surface including the reflecting surface of the split mirror 21 and each straight line is calculated, and each intersection is determined as a point on the edge 23 of the reflecting surface 22.
- The captured images captured by the cameras 1a, 1b, and 1c may be processed separately. Instead, an approximate plane for each camera may be determined based on the reflecting surface model so that the error over the entire reflecting surface 22 is minimized.
- The at least three positions on the split mirror 21 obtained in this way are converted into a predetermined reference position of the split mirror 21, the direction in which the reflecting surface 22 of the split mirror 21 faces (the reflecting surface direction), the position of the holding part of the split mirror 21 around the reflecting surface direction, and the like, so that the gripping mechanism for the split mirror 21 can handle them easily.
- the position of the split mirror in the three-dimensional space can be determined without using a special light source and a wide dynamic range camera. Further, the position of the split mirror 21 in the three-dimensional space can be determined based on an image obtained by one shooting with the cameras 1a, 1b, and 1c, and the calculation time can be shortened.
- The processing device 4F extracts a plurality of feature points included in the mirror image 24 of the camera and the marker object reflected on the reflecting surface 22 of the split mirror 21 from the captured image of the split mirror 21 captured by only one of the cameras 1a, 1b, and 1c.
- The processing device 4F calculates the position of the point that divides each line segment connecting an extracted feature point position and the corresponding feature point position stored in the storage device 3, using a division ratio determined by the radius of curvature of the reflecting surface, and determines each dividing point to be a point on the reflecting surface of the split mirror 21.
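A minimal sketch of this segment-division step (illustrative, not the patent's implementation; the division ratio is assumed to have been derived already from the radius of curvature, and is 0.5 for a flat mirror):

```python
import numpy as np

def surface_points(real_pts, mirror_pts, ratio=0.5):
    """Points dividing each segment between a feature point and its
    mirror-image counterpart; each dividing point is taken to lie on
    the reflecting surface.  ratio=0.5 corresponds to a flat mirror.
    """
    real_pts = np.asarray(real_pts, float)
    mirror_pts = np.asarray(mirror_pts, float)
    return real_pts + ratio * (mirror_pts - real_pts)
```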
- The processing device 4F fits a plane or curved-surface equation to the points dividing the line segments that connect each extracted feature point to the corresponding feature point stored in the storage device 3, and thereby determines a surface containing the reflecting surface 22 of the split mirror 21.
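The surface-fitting step can be sketched, for the planar case, as a least-squares plane fit via SVD (illustrative only; fitting the patent's curved reflecting surface model would replace the plane equation):

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through 3-D points: returns (centroid,
    unit normal).  The normal is the direction of least variance of
    the centred point cloud (smallest singular vector)."""
    pts = np.asarray(points, float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]
```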
- The processing device 4F extracts the edge 23 of the reflecting surface 22 from the captured images taken by the cameras 1a, 1b, and 1c, determines the straight lines directed from the cameras 1a, 1b, and 1c to a plurality of points on the edge 23 of the reflecting surface 22, calculates the position of each intersection between the surface containing the reflecting surface of the split mirror 21 and each straight line, and determines each intersection to be a point on the edge 23 of the reflecting surface 22.
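For a planar containing surface, the intersection of a viewing ray with the surface reduces to a standard ray-plane intersection (an illustrative sketch under that planar assumption):

```python
import numpy as np

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Intersection of the viewing ray origin + t*direction with the
    plane through plane_point having normal plane_normal.  Assumes
    the ray is not parallel to the plane."""
    origin = np.asarray(origin, float)
    direction = np.asarray(direction, float)
    plane_point = np.asarray(plane_point, float)
    plane_normal = np.asarray(plane_normal, float)
    t = ((plane_point - origin) @ plane_normal) / (direction @ plane_normal)
    return origin + t * direction
```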
- The processing device 4F uses the images taken by the cameras 1a, 1b, and 1c to execute the process of determining the position of the split mirror 21 in three-dimensional space, as in the processing device 4 of the first embodiment.
- With this configuration, the position in three-dimensional space of a split mirror 21 larger than the field of view of each camera can be measured at high speed.
- When two or more cameras are used, the position of one or more points on the edge 23 of the reflecting surface 22 is determined within the range that each camera can photograph, so that the position of the target object can be determined from three or more points in total.
- The shape model of the target object can be arranged so that the sum of the errors between the shape model and the measured positions of the target object is minimized.
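One standard way to realise this error-sum minimisation, assuming known correspondences between model points and measured points, is a least-squares rigid alignment (the Kabsch method); this is an illustrative sketch, not the algorithm prescribed by the patent:

```python
import numpy as np

def align_model(model_pts, measured_pts):
    """Rigid transform (R, t) minimising the sum of squared errors
    between corresponding model and measured points (Kabsch)."""
    P = np.asarray(model_pts, float)
    Q = np.asarray(measured_pts, float)
    pc, qc = P.mean(axis=0), Q.mean(axis=0)
    H = (P - pc).T @ (Q - qc)               # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = qc - R @ pc
    return R, t
```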
- The processing by which the processing devices 4 to 4F determine the position of the split mirror in three-dimensional space may be executed by a dedicated hardware device or by software running on a general-purpose processor.
- The processor of the processing device executes, for example: a step of storing shooting conditions including the viewpoint position and shooting direction of the camera 1 in a first storage unit; a step of storing model data including the shape of the split mirror 21 having a reflecting surface in a second storage unit; a step of arranging a marker object having a plurality of feature points at a predetermined position with respect to the camera 1; a step of storing feature point data representing the mutual positional relationship between the viewpoint position and the feature points in a third storage unit; a step of acquiring, with the camera 1, a captured image containing at least a part of the reflecting surface 22 in which a plurality of the feature points are reflected; a step of extracting a plurality of feature points from the captured image and determining the positions of the feature points in the captured image; and a step of calculating the position of the split mirror 21 based on the shooting conditions, the model data, the feature point data, and the positions of the feature points in the captured image.
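The steps above can be illustrated end to end for the simplest case of a flat mirror (a self-contained simulation, not the patent's implementation: the mirror-image positions are given directly in 3-D, skipping image capture and feature extraction):

```python
import numpy as np

# Marker feature points, known relative to the camera viewpoint.
markers = np.array([[0.1, 0.0, 0.0],
                    [0.0, 0.1, 0.0],
                    [-0.1, 0.0, 0.0],
                    [0.0, -0.1, 0.1]])

# Ground-truth mirror plane (to be recovered): through p0, normal n.
n = np.array([0.0, 0.0, 1.0])
p0 = np.array([0.0, 0.0, 1.0])

def reflect(p):
    """Mirror image of point p across the plane (p0, n)."""
    return p - 2.0 * ((p - p0) @ n) * n

# Simulated measurement: mirror-image positions of the markers.
images = np.array([reflect(p) for p in markers])

# Midpoints of marker/mirror-image segments lie on the mirror plane.
mids = (markers + images) / 2.0

# Least-squares plane through the midpoints recovers the mirror.
c = mids.mean(axis=0)
_, _, vt = np.linalg.svd(mids - c)
n_est = vt[-1]   # recovered normal (up to sign); c lies on the plane
```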
Abstract
Description
A camera that acquires a captured image containing at least a part of a reflecting surface by photographing a target object having the reflecting surface;
a first storage unit that stores shooting conditions including a viewpoint position and a shooting direction of the camera;
a second storage unit that stores model data including a shape of the target object;
a marker object that has a plurality of feature points and is fixed at a predetermined position with respect to the camera;
a third storage unit that stores feature point data representing a mutual positional relationship between the viewpoint position and the feature points;
a feature point extraction unit that extracts a plurality of the feature points from a captured image acquired by the camera in a state where a plurality of the feature points are reflected in the reflecting surface, and determines positions of the feature points in the captured image; and
a position measurement unit that calculates a position of the target object based on the shooting conditions, the model data, the feature point data, and the positions of the feature points in the captured image.
In each embodiment of the present invention, a marker object having a plurality of feature points whose mutual positional relationship is known is used to measure the position of a target object having a reflecting surface (the object whose position is to be measured). In Embodiment 1 of the present invention, a virtually existing mirror-image body is obtained from the mirror images of the camera and the marker object reflected on the reflecting surface, the positions of a plurality of feature points contained in the mirror-image body are calculated, and the position and orientation of the reflecting surface are calculated based on the positions of the feature points of the mirror-image body. When the reflecting surface is a known curved surface, Embodiment 1 uses a reflecting surface model representing the shape of the reflecting surface, determines an approximate plane that approximates the reflecting surface around the point on the reflecting surface at which light from the camera viewpoint is reflected perpendicularly (described in more detail later; this point is called the mirror image centre point), and fits the reflecting surface model to the approximate plane.
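For a spherical reflecting surface model with known centre and radius (an assumption made for this sketch; the patent covers more general known curved surfaces), the mirror image centre point and the tangent-plane normal there can be computed as:

```python
import numpy as np

def mirror_centre_point(viewpoint, centre, radius):
    """Point on a sphere of given centre/radius at which a ray from
    the camera viewpoint meets the surface perpendicularly, and the
    surface normal there (which points toward the viewpoint)."""
    v = np.asarray(viewpoint, float) - np.asarray(centre, float)
    normal = v / np.linalg.norm(v)
    return np.asarray(centre, float) + radius * normal, normal
```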
In Embodiment 2 of the present invention, in order to measure the position of the target object, the positions of the feature points contained in a virtual captured image are calculated without using the mirror-image body of Embodiment 1, and the position and orientation of the reflecting surface are calculated so that the difference between the position of each feature point in the actual captured image and the position of the corresponding feature point in the virtual captured image falls below a predetermined threshold.
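The idea can be illustrated with a one-parameter toy problem (invented for this sketch): a flat mirror at unknown distance d in front of a pinhole camera, where d is found by making the virtual captured image match the observed one:

```python
import numpy as np

f = 1.0  # pinhole focal length
markers = np.array([[0.2, 0.1, 0.0],
                    [-0.1, 0.2, 0.0],
                    [0.1, -0.2, 0.0]])

def project(points):
    """Pinhole projection onto the image plane z = f."""
    p = np.asarray(points, float)
    return f * p[:, :2] / p[:, 2:3]

def virtual_image(d):
    """Feature point positions of the virtual captured image for a
    mirror plane z = d facing the camera."""
    mirrored = markers.copy()
    mirrored[:, 2] = 2.0 * d - markers[:, 2]  # reflect across z = d
    return project(mirrored)

observed = virtual_image(1.5)  # simulated actual captured image

# Grid search for the distance whose virtual image matches best.
candidates = np.linspace(0.5, 3.0, 2501)
errors = [np.sum((virtual_image(d) - observed) ** 2) for d in candidates]
d_hat = candidates[int(np.argmin(errors))]  # recovered mirror distance
```

A real implementation would optimise over position and orientation jointly and iterate until the feature-point differences fall below the threshold; the grid search stands in for that optimisation.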
In Embodiment 3 of the present invention, even when the radius of curvature of the reflecting surface is unknown, the position, orientation, and radius of curvature of the reflecting surface are calculated so that the difference between the position of each feature point in the actual captured image and the position of the corresponding feature point in the virtual captured image falls below a predetermined threshold. In other respects, Embodiment 3 is the same as Embodiment 2.
In Embodiment 4 of the present invention, the position and attitude of a reflecting surface of known shape whose radius of curvature differs with direction are determined.
In the position measurement apparatus according to Embodiment 1, the positional relationship between the feature points stored in advance as feature point data was the mirror image (inverted positional relationship) of the actual positional relationship between the feature points. In Embodiment 5, by contrast, the same positional relationship as the actual one between the feature points is used as the feature point data.
FIG. 19 is a block diagram showing the configuration of a position measurement apparatus according to Embodiment 6 of the present invention. In addition to the configuration of the position measurement apparatus of FIG. 1, the position measurement apparatus of FIG. 19 further includes a drive device 61 that moves the camera 1 (and the marker object 2). The position measurement apparatus of FIG. 19 also includes a processing device 4E in place of the processing device 4 of FIG. 1. In addition to the configuration of one of the processing devices of Embodiments 1 to 5, the processing device 4E further includes a camera control unit 71 that controls the drive device 61 and the shooting direction of the camera 1. The drive device 61 and the camera control unit 71 constitute a camera drive device that moves the camera 1 so that at least one of the viewpoint position and the shooting direction of the camera 1 differs.
FIG. 20 is a block diagram showing the configuration of a position measurement apparatus according to Embodiment 7 of the present invention. In addition to the configuration of the position measurement apparatus of FIG. 1, the position measurement apparatus of FIG. 20 includes, for example, three cameras 1a, 1b, and 1c, marker objects 2a, 2b, and 2c, and a fixing jig 62 that supports the cameras 1a, 1b, and 1c. The cameras 1a, 1b, and 1c each photograph a different region containing a different point on the edge 23 of the reflecting surface 22. The position measurement apparatus of FIG. 20 also includes a processing device 4F in place of the processing device 4E of Embodiment 6. The processing device 4F is substantially the same as the processing device 4E, but includes a camera control unit 71F that controls the shooting directions of the cameras 1a, 1b, and 1c. The position measurement apparatus may include a drive device that moves at least one of the plurality of cameras; for a movable camera, the camera control unit 71F also controls the drive device. The viewpoint position and shooting direction of at least one camera may be fixed. The positional relationship between the cameras 1a, 1b, 1c and the marker objects 2a, 2b, 2c may differ from camera to camera, or may be the same for some cameras.
The processing by which the processing devices 4 to 4F determine the position of the split mirror in three-dimensional space may be executed by a dedicated hardware device or by software running on a general-purpose processor. The processor of the processing device executes, for example: a step of storing shooting conditions including the viewpoint position and shooting direction of the camera 1 in a first storage unit; a step of storing model data including the shape of the split mirror 21 having a reflecting surface in a second storage unit; a step of arranging a marker object having a plurality of feature points at a predetermined position with respect to the camera 1; a step of storing feature point data representing the mutual positional relationship between the viewpoint position and the feature points in a third storage unit; a step of acquiring, with the camera 1, a captured image containing at least a part of the reflecting surface 22 in which a plurality of the feature points are reflected; a step of extracting a plurality of feature points from the captured image and determining the positions of the feature points in the captured image; and a step of calculating the position of the split mirror 21 based on the shooting conditions, the model data, the feature point data, and the positions of the feature points in the captured image.
Claims (16)
- A position measurement apparatus comprising: a camera that acquires a captured image containing at least a part of a reflecting surface by photographing a target object having the reflecting surface;
a first storage unit that stores shooting conditions including a viewpoint position and a shooting direction of the camera;
a second storage unit that stores model data including a shape of the target object;
a marker object that has a plurality of feature points and is fixed at a predetermined position with respect to the camera;
a third storage unit that stores feature point data representing a mutual positional relationship between the viewpoint position and the feature points;
a feature point extraction unit that extracts a plurality of the feature points from a captured image acquired by the camera in a state where a plurality of the feature points are reflected in the reflecting surface, and determines positions of the feature points in the captured image; and
a position measurement unit that calculates a position of the target object based on the shooting conditions, the model data, the feature point data, and the positions of the feature points in the captured image. - The target object has the reflecting surface and a non-reflecting portion,
the model data includes a shape of the reflecting surface, including a radius of curvature of the reflecting surface, and a positional relationship between the reflecting surface and the non-reflecting portion, and
the position measurement unit includes:
a mirror-image body position calculation unit that, based on the feature point data, calculates positions of a plurality of the feature points contained in a mirror-image body of the marker object that virtually exists on the opposite side of the reflecting surface from the marker object, the mirror-image body position calculation unit calculating the positions of the plurality of feature points contained in the mirror-image body such that the positions of the plurality of feature points contained in a virtual captured image, obtained by virtually photographing the mirror-image body under the shooting conditions, approach the positions of the corresponding plurality of feature points in the captured image;
a reflecting surface position calculation unit that calculates a position and orientation of the reflecting surface based on the positions of the feature points of the marker object determined from the viewpoint position and the feature point data, the calculated positions of the feature points of the mirror-image body, and the radius of curvature of the reflecting surface; and
a target object position determination unit that determines the position of the target object based on the positional relationship between the reflecting surface and the non-reflecting portion and the calculated position and orientation of the reflecting surface; the position measurement apparatus according to claim 1. - The position measurement apparatus according to claim 2, wherein the mirror-image body position calculation unit calculates the positions of the plurality of feature points contained in the mirror-image body so as to reduce the difference between the positions of the feature points in the virtual captured image and the positions of the corresponding feature points in the captured image.
- The position measurement apparatus according to claim 3, wherein the mirror-image body position calculation unit includes:
a projective transformation unit that determines a projective transformation, applied to the positions of the feature points in the captured image or to the positions of the feature points in the virtual captured image, such that the figure formed by the feature points in the virtual captured image and the figure formed by the corresponding feature points in the captured image approach similarity to each other; and
a resizing unit that determines a resizing ratio, applied after the determined projective transformation has been applied to the positions of the feature points in the captured image or to the positions of the feature points in the virtual captured image, to the figure formed by the feature points in the virtual captured image or to the figure formed by the corresponding feature points in the captured image, such that the difference between the positions of the feature points in the virtual captured image and the positions of the corresponding feature points in the captured image is reduced,
and wherein the mirror-image body position calculation unit calculates the positions of the plurality of feature points contained in the mirror-image body based on the determined projective transformation and the determined resizing ratio. - The model data includes a reflecting surface model representing the shape of the reflecting surface, and
the reflecting surface position calculation unit determines the position and orientation of the reflecting surface by determining a position and orientation of an approximate plane that approximates the reflecting surface in the vicinity of the intersection between the reflecting surface and a straight line passing through the viewpoint position and orthogonal to the reflecting surface, and by determining the position and orientation of the reflecting surface model so that it touches the approximate plane at the intersection; the position measurement apparatus according to any one of claims 2 to 4. - The target object has the reflecting surface and a non-reflecting portion,
the model data includes the shape of the reflecting surface and a positional relationship between the reflecting surface and the non-reflecting portion, and
the position measurement unit includes:
a reflecting surface position calculation unit that calculates a position and orientation of the reflecting surface based on the shape of the reflecting surface and the feature point data, the reflecting surface position calculation unit calculating positions of a plurality of the feature points contained in a virtual captured image obtained by virtually photographing, under the shooting conditions, the feature points virtually reflected in a reflecting surface having a virtually determined position and orientation, and calculating the position and orientation of the reflecting surface such that the positions of the plurality of feature points contained in the virtual captured image approach the positions of the corresponding plurality of feature points in the captured image; and
a target object position determination unit that determines the position of the target object based on the positional relationship between the reflecting surface and the non-reflecting portion and the calculated position and orientation of the reflecting surface; the position measurement apparatus according to claim 1. - The position measurement apparatus according to claim 6, wherein the reflecting surface position calculation unit calculates the position and orientation of the reflecting surface so as to reduce the difference between the positions of the feature points in the virtual captured image and the positions of the corresponding feature points in the captured image.
- The position measurement apparatus according to claim 7, wherein the reflecting surface position calculation unit includes:
a projective transformation unit that determines a projective transformation, applied to the positions of the feature points in the captured image or to the positions of the feature points in the virtual captured image, such that the figure formed by the feature points in the virtual captured image and the figure formed by the corresponding feature points in the captured image approach similarity to each other; and
a resizing unit that determines a resizing ratio, applied after the determined projective transformation has been applied to the positions of the feature points in the captured image or to the positions of the feature points in the virtual captured image, to the figure formed by the feature points in the virtual captured image or to the figure formed by the corresponding feature points in the captured image, such that the difference between the positions of the feature points in the virtual captured image and the positions of the corresponding feature points in the captured image is reduced,
and wherein the reflecting surface position calculation unit calculates the position and orientation of the reflecting surface based on the determined projective transformation and the determined resizing ratio. - The position measurement apparatus according to claim 7 or claim 8, wherein the reflecting surface position calculation unit includes a curvature calculation unit that estimates the radius of curvature of the reflecting surface based on the difference between the positions of the feature points in the captured image and the positions of the corresponding feature points in the virtual captured image, and on the distance from the point in the captured image or in the virtual captured image corresponding to the intersection between the reflecting surface and a straight line passing through the viewpoint position and orthogonal to the reflecting surface, to the feature points in the captured image or in the virtual captured image.
- The feature point extraction unit extracts a plurality of the feature points from each of a plurality of captured images respectively taken under a plurality of sets of the shooting conditions that differ in at least one of the viewpoint position and the shooting direction, and determines the positions of the feature points in each captured image, and
the position measurement unit calculates the position of the target object based on the plurality of sets of the shooting conditions, the model data, the feature point data, and a plurality of feature point sets respectively corresponding to the plurality of captured images, each feature point set containing the positions of a plurality of the feature points in one of the plurality of captured images; the position measurement apparatus according to claim 1. - The position measurement apparatus according to claim 10, comprising a plurality of the cameras respectively installed at different positions.
- The position measurement apparatus according to claim 10, comprising a camera drive device that moves the camera so that at least one of the viewpoint position and the shooting direction differs.
- The feature point extraction unit extracts a plurality of the feature points from each of a plurality of captured images respectively taken under a plurality of sets of the shooting conditions that differ in at least one of the viewpoint position and the shooting direction, and determines the positions of the feature points in each captured image,
the mirror-image body position calculation unit calculates, for each combination of a set of the shooting conditions and the plurality of feature points in the corresponding captured image, the positions of the feature points contained in a plurality of the mirror-image bodies, and
the reflecting surface position calculation unit calculates the position and orientation of the reflecting surface by processing the positions of the feature points contained in the plurality of mirror-image bodies in combination; the position measurement apparatus according to claim 2. - The feature point extraction unit extracts a plurality of the feature points from each of a plurality of captured images respectively taken under a plurality of sets of the shooting conditions that differ in at least one of the viewpoint position and the shooting direction, and determines the positions of the feature points in each captured image, and
the reflecting surface position calculation unit calculates the position and orientation of the reflecting surface by processing the combinations of the plurality of sets of the shooting conditions and the plurality of feature points in the captured images; the position measurement apparatus according to claim 6. - The feature point extraction unit extracts a plurality of the feature points from each of a plurality of captured images respectively taken under a plurality of sets of the shooting conditions that differ in at least one of the viewpoint position and the shooting direction, and determines the positions of the feature points in each captured image,
the reflecting surface position calculation unit calculates positions and orientations of a plurality of the reflecting surfaces respectively corresponding to the plurality of captured images, and
the target object position determination unit determines the position of the target object based on the positional relationship between the reflecting surface and the non-reflecting portion and a combination of the calculated positions and orientations of the plurality of reflecting surfaces; the position measurement apparatus according to claim 2 or claim 6. - A position measurement method executed by a processing device that determines a position of a target object, the position measurement method comprising:
storing shooting conditions including a viewpoint position and a shooting direction of a camera in a first storage unit;
storing model data including a shape of a target object having a reflecting surface in a second storage unit;
arranging a marker object having a plurality of feature points at a predetermined position with respect to the camera;
storing feature point data representing a mutual positional relationship between the viewpoint position and the feature points in a third storage unit;
acquiring, by the camera, a captured image containing at least a part of the reflecting surface in which a plurality of the feature points are reflected;
extracting a plurality of the feature points from the captured image and determining positions of the feature points in the captured image; and
calculating a position of the target object based on the shooting conditions, the model data, the feature point data, and the positions of the feature points in the captured image.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP14884656.1A EP3115741B1 (en) | 2014-03-03 | 2014-07-31 | Position measurement device and position measurement method |
JP2016506069A JP6067175B2 (ja) | 2014-03-03 | 2014-07-31 | Position measurement apparatus and position measurement method |
US15/120,682 US10210628B2 (en) | 2014-03-03 | 2014-07-31 | Position measurement apparatus for measuring position of object having reflective surface in the three-dimensional space |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-040636 | 2014-03-03 | ||
JP2014040636 | 2014-03-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015132981A1 true WO2015132981A1 (ja) | 2015-09-11 |
Family
ID=54054808
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/070233 WO2015132981A1 (ja) | Position measurement apparatus and position measurement method | 2014-03-03 | 2014-07-31 |
Country Status (5)
Country | Link |
---|---|
US (1) | US10210628B2 (ja) |
EP (1) | EP3115741B1 (ja) |
JP (1) | JP6067175B2 (ja) |
CL (1) | CL2016002201A1 (ja) |
WO (1) | WO2015132981A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017073090A1 (ja) * | 2015-10-27 | 2017-05-04 | Mitsubishi Electric Corporation | Mirror replacement device for a segmented-mirror telescope and mirror replacement method therefor |
WO2017056089A3 (en) * | 2015-10-01 | 2017-07-27 | Infinity Augmented Reality Israel Ltd. | Method and a system for identifying reflective surfaces in a scene |
JP2021524599A (ja) * | 2018-05-25 | 2021-09-13 | Imetrum Limited | Motion encoder |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018100830A (ja) * | 2016-12-19 | 2018-06-28 | Yokogawa Electric Corporation | Optical spectrum measurement device |
JP7143857B2 (ja) * | 2017-10-24 | 2022-09-29 | Sony Group Corporation | Information processing apparatus, information processing method, program, and mobile object |
US10670390B2 (en) * | 2017-11-13 | 2020-06-02 | Faro Technologies, Inc. | System and method for verifying projection accuracy |
US10942045B1 (en) * | 2018-04-03 | 2021-03-09 | Waymo Llc | Portable sensor calibration target for autonomous vehicle |
US10960297B2 (en) * | 2018-09-17 | 2021-03-30 | Disney Enterprises, Inc. | Systems and methods for tracking a physical object using a passive object having a reflective surface |
US11080892B2 (en) * | 2019-10-07 | 2021-08-03 | The Boeing Company | Computer-implemented methods and system for localizing an object |
EP3886046A1 (en) | 2020-03-26 | 2021-09-29 | Sony Group Corporation | Multi-view positioning using reflections |
US11417020B2 (en) * | 2020-04-07 | 2022-08-16 | Zebra Technologies Corporation | Detection of calibration errors |
CN112767418B (zh) * | 2021-01-21 | 2022-10-14 | Dalian University of Technology | Mirror image segmentation method based on depth perception |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04148814A (ja) * | 1990-10-12 | 1992-05-21 | Toyota Central Res & Dev Lab Inc | Non-contact measuring device |
JP2010231780A (ja) * | 2009-03-27 | 2010-10-14 | Mitsubishi Electric Research Laboratories Inc | Method for estimating 3D pose of a specular object |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6978167B2 (en) * | 2002-07-01 | 2005-12-20 | Claron Technology Inc. | Video pose tracking system and method |
ES2569411T3 (es) * | 2006-05-19 | 2016-05-10 | The Queen's Medical Center | Motion tracking system for real-time adaptive imaging and spectroscopy |
JP2007322162A (ja) | 2006-05-30 | 2007-12-13 | 3D Media Co Ltd | Three-dimensional shape measurement apparatus and three-dimensional shape measurement method |
US8265376B2 (en) * | 2008-07-21 | 2012-09-11 | Cognitens Ltd. | Method and system for providing a digital model of an object |
JP2010071782A (ja) | 2008-09-18 | 2010-04-02 | Omron Corp | Three-dimensional measurement apparatus and method |
US8441532B2 (en) | 2009-02-24 | 2013-05-14 | Corning Incorporated | Shape measurement of specular reflective surface |
US8437537B2 (en) * | 2009-03-27 | 2013-05-07 | Mitsubishi Electric Research Laboratories, Inc. | Method and system for estimating 3D pose of specular objects |
JP5487920B2 (ja) | 2009-12-03 | 2014-05-14 | Ibaraki University | Optical three-dimensional shape measurement apparatus and optical three-dimensional shape measurement method |
GB201303712D0 (en) * | 2013-03-01 | 2013-04-17 | Geissler Michael P A | Optical navigation & positioning system |
2014
- 2014-07-31 WO PCT/JP2014/070233 patent/WO2015132981A1/ja active Application Filing
- 2014-07-31 US US15/120,682 patent/US10210628B2/en active Active
- 2014-07-31 JP JP2016506069A patent/JP6067175B2/ja active Active
- 2014-07-31 EP EP14884656.1A patent/EP3115741B1/en active Active
2016
- 2016-09-01 CL CL2016002201A patent/CL2016002201A1/es unknown
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04148814A (ja) * | 1990-10-12 | 1992-05-21 | Toyota Central Res & Dev Lab Inc | Non-contact measuring device |
JP2010231780A (ja) * | 2009-03-27 | 2010-10-14 | Mitsubishi Electric Research Laboratories Inc | Method for estimating 3D pose of a specular object |
Non-Patent Citations (1)
Title |
---|
See also references of EP3115741A4 * |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017056089A3 (en) * | 2015-10-01 | 2017-07-27 | Infinity Augmented Reality Israel Ltd. | Method and a system for identifying reflective surfaces in a scene |
US10049303B2 (en) | 2015-10-01 | 2018-08-14 | Infinity Augmented Reality Israel Ltd. | Method and a system for identifying reflective surfaces in a scene |
US10395142B2 (en) | 2015-10-01 | 2019-08-27 | Infinity Augmented Reality Israel Ltd. | Method and a system for identifying reflective surfaces in a scene |
US10719740B2 (en) | 2015-10-01 | 2020-07-21 | Alibaba Technology (Israel) Ltd. | Method and a system for identifying reflective surfaces in a scene |
WO2017073090A1 (ja) * | 2015-10-27 | 2017-05-04 | Mitsubishi Electric Corporation | Mirror replacement device for a segmented-mirror telescope and mirror replacement method therefor |
JPWO2017073090A1 (ja) * | 2015-10-27 | 2018-04-12 | Mitsubishi Electric Corporation | Mirror replacement device for a segmented-mirror telescope and mirror replacement method therefor |
JP2021524599A (ja) * | 2018-05-25 | 2021-09-13 | Imetrum Limited | Motion encoder |
JP7344283B2 (ja) | 2018-05-25 | 2023-09-13 | Imetrum Limited | Motion encoder |
US11885650B2 (en) | 2018-05-25 | 2024-01-30 | Imetrum Ltd. | Motion encoder |
Also Published As
Publication number | Publication date |
---|---|
US20170221224A1 (en) | 2017-08-03 |
US10210628B2 (en) | 2019-02-19 |
JP6067175B2 (ja) | 2017-01-25 |
JPWO2015132981A1 (ja) | 2017-04-06 |
EP3115741B1 (en) | 2023-05-10 |
EP3115741A4 (en) | 2017-08-16 |
CL2016002201A1 (es) | 2016-12-23 |
EP3115741A1 (en) | 2017-01-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6067175B2 (ja) | Position measurement apparatus and position measurement method | |
CN110809786B (zh) | Calibration device, calibration chart, chart pattern generation device, and calibration method | |
JP6271953B2 (ja) | Image processing apparatus and image processing method | |
JP5132832B1 (ja) | Measurement device and information processing device | |
JP6079333B2 (ja) | Calibration device, method, and program | |
JP6324025B2 (ja) | Information processing apparatus and information processing method | |
JP6370038B2 (ja) | Position and orientation measurement apparatus and method | |
JP6222898B2 (ja) | Three-dimensional measurement device and robot device | |
JP6573419B1 (ja) | Positioning method, robot, and computer storage medium | |
CN107155341B (zh) | Three-dimensional scanning system and frame | |
JP6092530B2 (ja) | Image processing apparatus and image processing method | |
JP2014013146A5 (ja) | | |
JP2011198349A (ja) | Information processing method and apparatus | |
JP2009042162A (ja) | Calibration device and method | |
CN113841384B (zh) | Calibration device, chart for calibration, and calibration method | |
CN107808398B (zh) | Camera parameter calculation device, calculation method, program, and recording medium | |
JP2016100698A (ja) | Calibration device, calibration method, and program | |
JP2017144498A (ja) | Information processing apparatus, control method for information processing apparatus, and program | |
JP2015031601A (ja) | Three-dimensional measurement device, method, and program | |
Legarda et al. | A new method for Scheimpflug camera calibration | |
US10252417B2 (en) | Information processing apparatus, method of controlling information processing apparatus, and storage medium | |
JP2011155412A (ja) | Projection system and distortion correction method in projection system | |
JP6486083B2 (ja) | Information processing apparatus, information processing method, and program | |
Huiyang et al. | Calibration of camera with small FOV and DOF telecentric lens | |
JP6890422B2 (ja) | Information processing apparatus, control method for information processing apparatus, and program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14884656 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016506069 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15120682 Country of ref document: US |
|
REEP | Request for entry into the european phase |
Ref document number: 2014884656 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014884656 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |