WO2013005244A1 - Three-dimensional relative coordinate measuring device and method - Google Patents


Info

Publication number
WO2013005244A1
Authority
WO
WIPO (PCT)
Prior art keywords
viewpoint
points
point
pixel
dimensional
Prior art date
Application number
PCT/JP2011/003774
Other languages
French (fr)
Japanese (ja)
Inventor
覚 大浦
Original Assignee
株式会社ベイビッグ
Priority date
Filing date
Publication date
Application filed by 株式会社ベイビッグ
Priority to JP2011541976A (JP5070435B1)
Priority to PCT/JP2011/003774
Priority to JP2013522375A (JP5629874B2)
Priority to PCT/JP2011/005654 (WO2013005265A1)
Publication of WO2013005244A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images

Definitions

  • The present invention relates to a three-dimensional relative coordinate measuring apparatus that measures the relative coordinates between any one of three reference points whose relative coordinates in a three-dimensional coordinate system are known and one target point whose coordinates are unknown.
  • A stereo method is known as a method for measuring relative coordinates from images.
  • In the stereo method, two images are obtained by photographing with two cameras placed at different positions, or by photographing from two different positions with one camera.
  • Relative coordinate information is then obtained from the subtle differences between the images due to parallax, as in human binocular vision.
  • Patent Document 1 discloses a photographic measurement system capable of performing highly accurate surveying by the stereo method.
  • In Patent Document 1, however, it is assumed that the position and orientation of the camera at each photographing point are grasped in advance using a detection device such as an inclination angle sensor.
  • The present invention has been made in view of this situation, and its purpose is to provide a three-dimensional relative coordinate measuring apparatus capable of measuring relative coordinates without grasping in advance the position and orientation of the camera at the shooting point.
  • A relative coordinate measuring apparatus according to one aspect of the present invention is a three-dimensional relative coordinate measuring apparatus that measures the relative coordinates between any one of three first reference points whose relative coordinates in a three-dimensional coordinate system are known and a single target point whose coordinates in the three-dimensional coordinate system are unknown.
  • The apparatus includes: an image acquisition unit that acquires a first acquired image obtained by imaging the three first reference points with a first imaging device from a first viewpoint; a viewpoint projection angle extraction unit that holds information on a first pixel viewpoint projection angle for each pixel and uses this information to acquire three first reference viewpoint projection angles, which are the first pixel viewpoint projection angles corresponding to the three first reference points projected on the first acquired image; an inclination angle calculation unit that calculates, using the three first reference viewpoint projection angles and the relative coordinates of the three first reference points, a first inclination angle formed between a first imaging surface, which is the imaging surface of the first acquired image, and a first reference plane, which includes the three first reference points; and a measurement unit that measures the relative coordinates between one of the three first reference points and the target point using the first inclination angle.
  • Here, the first pixel viewpoint projection angle is the angle formed by the optical axis of the first imaging device and a line segment connecting the first viewpoint with each point of the three-dimensional coordinate system that is projected to the first pixel two-dimensional coordinates, i.e. the coordinates of each pixel in the first acquired image.
  • According to this configuration, by using the viewpoint projection angles, relative coordinates can be measured without knowing in advance the position and orientation of the camera at the shooting point.
  • The three first reference points are any three of three or more reference points whose relative coordinates in the three-dimensional coordinate system are known. The image acquisition unit acquires, from the first viewpoint, the first acquired image obtained by imaging the three first reference points and the target point with the first imaging device, and acquires, from a second viewpoint, a second acquired image obtained by imaging three second reference points, which are any three of the three or more reference points, and the target point with a second imaging device that is the same as or different from the first imaging device.
  • The viewpoint projection angle extraction unit further holds information on a second pixel viewpoint projection angle for each pixel, and uses this information to acquire three second reference viewpoint projection angles, which are the second pixel viewpoint projection angles corresponding to the three second reference points projected on the second acquired image.
  • Here, the second pixel viewpoint projection angle is the angle formed by the optical axis of the second imaging device and a line segment connecting the second viewpoint with each point of the three-dimensional coordinate system that is projected to the second pixel two-dimensional coordinates, i.e. the coordinates of each pixel in the second acquired image.
  • The inclination angle calculation unit further calculates, using the three second reference viewpoint projection angles and the relative coordinates of the three second reference points, a second inclination angle formed between a second imaging surface, which is the imaging surface of the second acquired image, and a second reference plane, which includes the three second reference points.
  • The measurement unit calculates the relative positional relationship between the three first reference points and the three second reference points using the first inclination angle and the second inclination angle, and may measure the relative coordinates between the target point and any one of the three first reference points and the three second reference points.
  • According to this, the relative coordinates between any one of the three reference points and the target point can be measured without grasping in advance the relative positional relationship or the relative posture relationship of the camera at each shooting point.
  • The tilt angle calculation unit may determine the first reference plane, which includes the three first reference points, so that each of the three first reference points lies on the straight line connecting the first viewpoint and that reference point as projected on the first acquired image, using the three first reference viewpoint projection angles and the relative coordinates of the three first reference points; it may likewise determine the second reference plane, which includes the three second reference points, so that each of the three second reference points lies on the straight line connecting the second viewpoint and that reference point as projected on the second acquired image.
  • The viewpoint projection angle extraction unit may further acquire a first target viewpoint projection angle, which is the first pixel viewpoint projection angle corresponding to the target point projected on the first acquired image, and a second target viewpoint projection angle, which is the second pixel viewpoint projection angle corresponding to the target point projected on the second acquired image.
  • The measurement unit then calculates, in the three-dimensional coordinate system, a first vector passing through the first viewpoint and the target point projected on the first acquired image, using the first target viewpoint projection angle and the first inclination angle; calculates a second vector passing through the second viewpoint and the target point projected on the second acquired image, using the second target viewpoint projection angle and the second inclination angle; and converts the second vector into the three-dimensional coordinate system using the relative positional relationship.
  • The coordinates of the closest point between the first vector and the second vector may then be calculated as the relative coordinates between the target point and any one of the three first reference points and the three second reference points.
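  • Because the two viewing rays generally do not intersect exactly, the closest point between them can be taken as the midpoint of their common perpendicular. The following is an illustrative sketch of that computation, not the patent's own formulas; the function name, the parameterization by a point and a direction per ray, and the midpoint convention are assumptions.

```python
import numpy as np

def closest_point(p1, d1, p2, d2):
    """Midpoint of the common perpendicular of two (possibly skew) lines.

    p1, p2: points on each line (e.g. the viewpoints M1 and M2)
    d1, d2: direction vectors of each line (the first and second vectors)
    """
    p1, d1 = np.asarray(p1, float), np.asarray(d1, float)
    p2, d2 = np.asarray(p2, float), np.asarray(d2, float)
    r = p2 - p1
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    e, f = d1 @ r, d2 @ r
    denom = a * c - b * b            # zero only if the lines are parallel
    t1 = (e * c - b * f) / denom     # parameter of the foot on line 1
    t2 = (e * b - a * f) / denom     # parameter of the foot on line 2
    q1 = p1 + t1 * d1                # foot of the perpendicular on line 1
    q2 = p2 + t2 * d2                # foot of the perpendicular on line 2
    return (q1 + q2) / 2
```

  • For intersecting rays the two feet coincide, so the midpoint is the intersection itself; for slightly skew rays it is a natural compromise estimate of the target point W.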
  • At least one of the three first reference points may be different from the three second reference points.
  • the target point may be the first viewpoint.
  • According to this, the relative coordinates between any of the three reference points and the shooting point can be measured without knowing in advance the posture of the camera at the shooting point.
  • The tilt angle calculation unit may determine the first reference plane, which includes the three first reference points, so that each of the three first reference points lies on the straight line connecting the first viewpoint and that reference point as projected on the first acquired image, using the three first reference viewpoint projection angles and the relative coordinates of the three first reference points.
  • In this way, the tilt angle calculation unit can calculate the posture of the camera at the shooting point.
  • The measurement unit may calculate, using two of the three first reference viewpoint projection angles and the first inclination angle, two vectors in the first reference plane that correspond to those two angles and pass through two of the three first reference points, and may calculate the coordinates of the closest point of the two vectors as the relative coordinates between one of the three first reference points and the target point.
  • The present invention can be realized not only as such a three-dimensional relative coordinate measuring apparatus, but also as a measurement method having, as steps, the operations of the characteristic components included in the apparatus, or as a program for causing a computer to execute these steps.
  • the present invention can provide a three-dimensional relative coordinate measuring apparatus capable of measuring relative coordinates without grasping in advance the position and orientation of the camera at the photographing point.
  • FIG. 1A is a diagram showing a configuration of a system including a three-dimensional relative coordinate measurement device at the time of three-dimensional relative coordinate measurement according to the present invention.
  • FIG. 1B is a diagram showing a configuration of a system including a three-dimensional relative coordinate measuring apparatus at the time of pixel calibration of an image according to the present invention.
  • FIG. 2 is a block diagram showing a characteristic functional configuration of the three-dimensional relative coordinate measuring apparatus according to the present invention.
  • FIG. 3 is a flowchart showing a procedure for pixel calibration of an image according to the present invention.
  • FIG. 4 is a diagram for explaining a viewpoint projection angle according to the present invention.
  • FIG. 5A is a diagram showing an image when a calibration plate is photographed before distortion correction according to the present invention.
  • FIG. 5B is a diagram showing an image when the calibration plate is photographed after distortion correction according to the present invention.
  • FIG. 6 is a diagram for explaining processing for calculating the distance between the calibration board and the viewpoint according to the present invention.
  • FIG. 7 is a flowchart showing a processing procedure of relative coordinate measurement by the three-dimensional relative coordinate measurement apparatus according to Embodiment 1 of the present invention.
  • FIG. 8 is a diagram three-dimensionally showing the relationship among the viewpoint, imaging surface, reference point, and target point according to Embodiment 1 of the present invention.
  • FIG. 9A is a diagram showing a reference point and a target point projected on the first imaging surface according to Embodiment 1 of the present invention.
  • FIG. 9B is a diagram showing a reference point and a target point projected on the second imaging surface according to Embodiment 1 of the present invention.
  • FIG. 10 is a flowchart showing a processing procedure performed by the tilt angle calculation unit according to the present invention.
  • FIG. 11 is a diagram for explaining the processing shown in the flowchart of FIG. 10 according to the present invention.
  • FIG. 12 is a diagram in which FIG. 11 is projected onto the Z1-X1 plane according to the present invention.
  • FIG. 13 is a flowchart showing a processing procedure of relative coordinate measurement by the three-dimensional relative coordinate measurement apparatus according to Embodiment 2 of the present invention.
  • FIG. 14 is a diagram three-dimensionally illustrating the relationship between the viewpoint, the imaging surface, and the reference point according to Embodiment 2 of the present invention.
  • FIG. 15 is a diagram illustrating a reference point projected on the first imaging surface according to Embodiment 2 of the present invention.
  • FIG. 1A is a diagram showing a configuration of a system including a three-dimensional relative coordinate measurement device at the time of three-dimensional relative coordinate measurement according to the present invention.
  • FIG. 1A shows a system configuration when acquiring two images from two viewpoints.
  • a three-dimensional relative coordinate measuring apparatus 90 according to the present invention is connected to the camera 10 via a cable 30.
  • the cable 30 is, for example, a USB (Universal Serial Bus) cable.
  • the three-dimensional relative coordinate measuring apparatus 90 can acquire the image data 60 captured by the camera 10 via the cable 30.
  • the cable 30 may be a cable other than USB.
  • the image data 60 may be acquired wirelessly or via a recording medium.
  • M 1 and M 2 correspond to the positions of the principal points of the photographing lens of the camera 10.
  • O1 and O2, drawn with two-dot chain lines, indicate the optical axis of the photographing lens (the optical axis of the camera) when the viewpoint is at M1 and M2, respectively.
  • the camera 10 may capture an image from the first viewpoint so that the reference structure 40 is captured.
  • the reference structure 40 has three reference points A, B, and C whose relative coordinates are known in a three-dimensional coordinate system. In other words, the triangle defined by the points A, B, and C is referred to as the reference structure 40.
  • the target structure 50 has a target point W whose coordinates are unknown in a three-dimensional coordinate system. In addition, when acquiring one image from one viewpoint, the target structure 50 may not be used.
  • FIG. 1B is a diagram showing a configuration of a system including a three-dimensional relative coordinate measuring apparatus at the time of pixel calibration of an image according to the present invention.
  • a three-dimensional relative coordinate measuring apparatus 90 according to the present invention is connected to the camera 10 via a cable 30.
  • the cable 30 is, for example, a USB cable.
  • the three-dimensional relative coordinate measuring apparatus 90 can acquire the image data 61 captured by the camera 10 via the cable 30.
  • the cable 30 may be a cable other than USB.
  • the image data 61 may be acquired wirelessly or via a recording medium.
  • the camera 10 captures an image at the viewpoint M so that the calibration plate 70 can be seen.
  • M corresponds to the position of the principal point of the taking lens of the camera 10.
  • O, drawn with a two-dot chain line, indicates the optical axis of the photographing lens (the optical axis of the camera) when the viewpoint is at M.
  • the calibration plate 70 is a transparent plate having high rigidity and has a grid pattern on the surface.
  • a weight 72 is attached to the calibration plate 70 with a thread 71.
  • the three-dimensional relative coordinate measuring device 90 is, for example, a computer.
  • a common computer is used for three-dimensional relative coordinate measurement and image pixel calibration, but separate computers may be used. Two or more computers may be used.
  • FIG. 2 is a block diagram showing a characteristic functional configuration of the three-dimensional relative coordinate measuring apparatus according to the present invention.
  • The three-dimensional relative coordinate measuring apparatus 90 includes a three-dimensional relative coordinate measurement unit 20 and an image pixel calibration unit 80 according to the present invention.
  • the three-dimensional relative coordinate measurement unit 20 includes an image acquisition unit 100, a two-dimensional coordinate extraction unit 110, a viewpoint projection angle extraction unit 120, a calculation unit 130, and a display unit 140. Further, the image pixel calibration unit 80 includes an adjustment unit 200.
  • The adjustment unit 200 performs distortion correction of image pixels on the image data 61 obtained by imaging the calibration plate 70, the thread 71, and the weight 72 from the viewpoint M, and calculates the viewpoint projection angle of an arbitrary image pixel (corresponding to the first and second pixel viewpoint projection angles of the present invention).
  • The image acquisition unit 100 acquires, as the first acquired image, the image data 60 obtained by imaging the three reference points, or the three reference points and the target point, with the camera (corresponding to the first and second imaging devices of the present invention) from the first viewpoint M1.
  • The image acquisition unit 100 also acquires, as the second acquired image, the image data 60 obtained by imaging the three reference points and the target point with the camera from the second viewpoint M2.
  • The two-dimensional coordinate extraction unit 110 extracts the two-dimensional coordinates of the three reference points, or of the three reference points and the target point, projected on the first acquired image acquired by the image acquisition unit 100. The two-dimensional coordinate extraction unit 110 likewise extracts the two-dimensional coordinates of the three reference points and the target point projected on the second acquired image acquired by the image acquisition unit 100.
  • The viewpoint projection angle extraction unit 120 holds the viewpoint projection angles of arbitrary image pixels calculated in advance by the adjustment unit 200, and extracts from them the viewpoint projection angles corresponding to the two-dimensional coordinates extracted by the two-dimensional coordinate extraction unit 110. The definition of the viewpoint projection angle will be described later with reference to FIG. 4.
  • the calculation unit 130 further includes an inclination angle calculation unit 130a, a vector calculation unit 130b, a conversion vector calculation unit 130c, and a relative coordinate measurement unit 130d.
  • the tilt angle calculation unit 130a calculates the first tilt angle and the second tilt angle using the relative coordinates of the three reference points and the viewpoint projection angle extracted by the viewpoint projection angle extraction unit 120, respectively.
  • The vector calculation unit 130b calculates the first vector and vector 2 using the viewpoint projection angles extracted by the viewpoint projection angle extraction unit 120. The vector calculation unit 130b also calculates two vectors using the viewpoint projection angles extracted by the viewpoint projection angle extraction unit 120 and the first tilt angle calculated by the tilt angle calculation unit 130a.
  • The conversion vector calculation unit 130c calculates the second vector using the first inclination angle and the second inclination angle calculated by the inclination angle calculation unit 130a and vector 2 calculated by the vector calculation unit 130b.
  • The relative coordinate measurement unit 130d measures the relative coordinates between any one of the three reference points and the target point using the first vector calculated by the vector calculation unit 130b and the second vector calculated by the conversion vector calculation unit 130c. The relative coordinate measurement unit 130d also measures the relative coordinates between any one of the three reference points and the shooting point using the two vectors calculated by the vector calculation unit 130b.
  • The display unit 140 displays, for example, the acquired images acquired by the image acquisition unit 100, the two-dimensional coordinates extracted by the two-dimensional coordinate extraction unit 110, the viewpoint projection angles extracted by the viewpoint projection angle extraction unit 120, and the various calculation results produced by the calculation unit 130.
  • FIG. 3 is a flowchart showing a procedure for pixel calibration of an image according to the present invention.
  • In the pixel calibration of an image, the optical center position Cp, which is the center point of the imaging surface, is set (S101), distortion correction is performed (S102), and the viewpoint projection angle θ of an arbitrary image pixel (corresponding to the first and second pixel viewpoint projection angles of the present invention) is calculated (S103).
  • As shown in FIG. 4, the viewpoint projection angle θ of an arbitrary image pixel is the angle formed by the optical axis O of the camera (corresponding to the optical axes of the first and second imaging devices of the present invention) and the line segment connecting the viewpoint M with the point of the three-dimensional coordinate system that is projected to the two-dimensional coordinates of the arbitrary image pixel P on the image (corresponding to the first and second pixel two-dimensional coordinates of the present invention).
  • Since the optical axis O and the imaging surface of the camera are orthogonal, the viewpoint projection angle θ of the arbitrary image pixel P can be calculated from the distance x between the viewpoint M and the optical center position Cp, which is the intersection of the optical axis O and the imaging surface, together with the two-dimensional coordinates of the pixel P with Cp as the origin.
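  • With x and the pixel coordinates expressed in the same length unit, this relationship reduces to an arctangent. The following is a minimal sketch of that computation; the function name and argument convention are illustrative assumptions, not part of the patent.

```python
import math

def viewpoint_projection_angle(u, v, x):
    """Viewpoint projection angle theta of an arbitrary pixel P.

    (u, v): two-dimensional coordinates of P on the imaging surface,
            with the optical center Cp as origin, in the same unit as x
    x:      distance between the viewpoint M and the optical center Cp
    """
    d = math.hypot(u, v)      # distance Cp-P on the imaging surface
    return math.atan2(d, x)   # angle between segment M-P and optical axis O
```

  • A pixel at the optical center gives θ = 0, and the angle grows with the pixel's distance from Cp, which is exactly the per-pixel table the viewpoint projection angle extraction unit holds.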
  • the optical center position Cp is set (S101).
  • a weight 72 is attached by a thread 71 to a transparent plate having high rigidity (hereinafter referred to as a calibration plate 70).
  • the calibration plate 70 is assumed to be kept horizontal by a level or the like. Further, it is assumed that grids are marked on the surface of the calibration plate 70.
  • The calibration plate 70 is imaged by the camera 10 from above, such that the attachment position of the thread 71 to the calibration plate 70 and the attachment position of the thread 71 to the weight 72 overlap on the imaging surface.
  • The attachment position of the thread 71 on the calibration plate 70 is then taken as the optical center position Cp.
  • Next, the adjustment unit 200 performs image distortion correction (S102). Before distortion correction, the image of the calibration plate 70 shows a grid distorted by the characteristics of the photographing lens of the camera 10, as in FIG. 5A. The adjustment unit 200 performs normalization processing on the captured image to obtain an image such as that shown in FIG. 5B.
  • the adjustment unit 200 calculates the viewpoint projection angle ⁇ of an arbitrary image pixel P (S103).
  • FIG. 6 is a diagram for explaining processing for calculating the distance between the calibration board and the viewpoint according to the present invention.
  • the adjustment unit 200 acquires an image of a plate before movement, which is an image after distortion correction obtained by photographing the calibration plate 70 as shown in FIG. 5B.
  • Next, the adjustment unit 200 acquires an image of the plate after movement, which is a distortion-corrected image obtained by imaging the calibration plate 70 after it has been moved toward the viewpoint M by the distance y while kept horizontal.
  • The adjustment unit 200 then extracts, from the image of the plate before movement and the image of the plate after movement, the two-dimensional coordinates of the points P1′ and P2′, for example with the optical center position Cp as the origin.
  • Here, P1′ is the intersection of the straight line connecting the viewpoint M and P1 with the imaging surface. P2 is the point on the surface of the calibration plate 70 after the movement that corresponds to P1 on the surface before the movement.
  • P2′ is the intersection of the straight line connecting the viewpoint M and P2 with the imaging surface.
  • Q is the intersection of the straight line passing through the points P2 and P2′ with the surface of the calibration plate 70 before the movement.
  • d2 denotes the length of the line segment C1P1.
  • d3 denotes the length of the line segment P1′P2′.
  • d4 denotes the length of the line segment CpP1′.
  • d5 denotes the length of the line segment C1Q.
  • The length x of the line segment connecting the viewpoint M and the optical center position Cp can then be expressed by Equation 5.
  • Furthermore, the viewpoint projection angle θ of an arbitrary pixel P can be calculated by Equation 6, where d6 denotes the length of the line segment CpP.
  • In this way, the viewpoint projection angle θ of the arbitrary image pixel P can be calculated.
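  • Equations 5 and 6 themselves are not reproduced in this text. The sketch below is therefore a reconstruction from the similar-triangles geometry of FIG. 6, under the assumption that C1 is the intersection of the optical axis with the plate surface before the movement and that d2 is known from the grid interval; the function names and the particular algebraic form are assumptions, not the patent's published formulas.

```python
import math

def distance_viewpoint_to_center(y, d2, d3, d4):
    """Distance x between the viewpoint M and the optical center Cp
    (a reconstruction of the role of Equation 5).

    y:  distance the calibration plate was moved toward the viewpoint M
    d2: plate distance C1-P1, known from the grid interval
    d3: measured image distance P1'-P2'
    d4: measured image distance Cp-P1'
    """
    z = y * (d4 + d3) / d3   # plate-to-viewpoint distance before the move
    return d4 * z / d2       # similar triangles M-Cp-P1' and M-C1-P1

def projection_angle(d6, x):
    """Viewpoint projection angle theta of pixel P (role of Equation 6),
    where d6 is the image distance Cp-P."""
    return math.atan2(d6, x)
```

  • As a plausibility check: with the plate initially 10 units from the viewpoint, the imaging surface 2 units away, and a grid point 3 units off-axis, moving the plate 5 units closer doubles the pixel offset, and the formulas recover x = 2.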
  • the optical center position Cp is set, distortion correction is performed, and the viewpoint projection angle ⁇ of an arbitrary image pixel is calculated, whereby the pixel calibration of the image is completed.
  • the image pixel calibration method has been described above based on the embodiment, but the image pixel calibration method is not limited to this embodiment.
  • In the above, a grid is drawn on the calibration plate 70, but the calibration plate may instead carry any markings with known intervals.
  • Alternatively, light from a lamp light source passed through a cylindrical structure machined perpendicular to the calibration plate, or a laser beam, may be used.
  • FIG. 7 is a flowchart showing a processing procedure of relative coordinate measurement by the three-dimensional relative coordinate measurement apparatus according to Embodiment 1 of the present invention.
  • the image acquisition unit 100 acquires a first acquired image and a second acquired image (S110).
  • The first acquired image and the second acquired image are obtained by imaging, from the first viewpoint M1 and the second viewpoint M2 respectively, the three reference points A, B, and C whose relative coordinates are known, and the target point W whose coordinates are unknown.
  • Here, the case where the three reference points included in the first acquired image and the second acquired image are the same is described as an example.
  • The three reference points included in the second acquired image may, however, be different.
  • Next, the two-dimensional coordinate extraction unit 110 extracts, from the first acquired image, the coordinates of the points A01, B01, and C01, which are the reference points A, B, and C projected on the first acquired image (the three first reference two-dimensional coordinates), with the optical center position Cp1 in the first acquired image as the origin.
  • The two-dimensional coordinate extraction unit 110 also extracts, from the first acquired image, the coordinates of the point W01, which is the target point W projected on the first acquired image (the first target two-dimensional coordinates) (S120).
  • Similarly, the two-dimensional coordinate extraction unit 110 extracts, from the second acquired image, the coordinates of the points A02, B02, and C02, which are the reference points A, B, and C projected on the second acquired image (the three second reference two-dimensional coordinates), with the optical center position Cp2 in the second acquired image as the origin.
  • The two-dimensional coordinate extraction unit 110 also extracts, from the second acquired image, the coordinates of the point W02, which is the target point W projected on the second acquired image (the second target two-dimensional coordinates) (S120).
  • Next, the viewpoint projection angle extraction unit 120, which holds the viewpoint projection angles of arbitrary image pixels calculated in advance by the adjustment unit 200, extracts from them, as the three first and second reference viewpoint projection angles and the first and second target viewpoint projection angles, the viewpoint projection angles corresponding to the three first and second reference two-dimensional coordinates and the first and second target two-dimensional coordinates extracted by the two-dimensional coordinate extraction unit 110 (S130).
  • Next, the inclination angle calculation unit 130a calculates the angles (the first and second inclination angles) formed between the first and second imaging surfaces, which are the imaging surfaces of the first and second acquired images, and the reference planes (the first and second reference planes) that include the reference points (S140).
  • FIG. 10 is a flowchart showing a processing procedure performed by the inclination angle calculation unit according to the present invention.
  • FIG. 11 is a diagram explaining the process shown in the flowchart of FIG. 10 according to the present invention.
  • the process of calculating the first inclination angle will be described mainly based on FIGS. 10 and 11.
  • First, the tilt angle calculation unit 130a sets an X1-Y1-Z1 coordinate system (the first three-dimensional coordinate system) whose origin is one of the three reference points A, B, and C whose relative coordinates are known, for example the point A (S141). At this time, the tilt angle calculation unit 130a sets the X1 axis and the Y1 axis to be parallel to the first imaging surface captured from the first viewpoint M1, and sets the Z1 axis to be perpendicular to the first imaging surface. The X1-Y1 plane at this time is also referred to as the first conversion plane.
  • Next, the tilt angle calculation unit 130a sets first virtual points B1 and C1 whose coordinates are equivalent to those of the three reference points A, B, and C whose relative coordinates are known (S142). The point B1 corresponds to the point B, and the point C1 corresponds to the point C. For example, the coordinates of the point B1 are set as (B1x, 0, 0), and the coordinates of the point C1 are set as (0, C1y, 0).
  • Next, the tilt angle calculation unit 130a rotates the points B1 and C1 in fine steps by an angle α1n around the X1 axis, an angle β1n around the Y1 axis, and an angle γ1n around the Z1 axis, thereby calculating a plurality of first conversion reference points B1n′ (B1nx′, B1ny′, B1nz′) and C1n′ (C1nx′, C1ny′, C1nz′) (S143), where n = 1, 2, 3, ...
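The rotation search in S143 can be sketched as follows. This is a minimal illustration, not from the patent: it assumes standard rotation matrices about the three axes, assumes both virtual points are rotated by the same candidate angles, and the point coordinates, angle range, and pitch are hypothetical values chosen for the example.

```python
import numpy as np

def rotate_xyz(p, alpha, beta, gamma):
    """Rotate point p about the X, Y, and Z axes by alpha, beta, gamma (radians)."""
    rx = np.array([[1, 0, 0],
                   [0, np.cos(alpha), -np.sin(alpha)],
                   [0, np.sin(alpha),  np.cos(alpha)]])
    ry = np.array([[ np.cos(beta), 0, np.sin(beta)],
                   [0, 1, 0],
                   [-np.sin(beta), 0, np.cos(beta)]])
    rz = np.array([[np.cos(gamma), -np.sin(gamma), 0],
                   [np.sin(gamma),  np.cos(gamma), 0],
                   [0, 0, 1]])
    return rz @ ry @ rx @ np.asarray(p, dtype=float)

# Virtual points B1 = (B1x, 0, 0) and C1 = (0, C1y, 0), rotated over a small
# grid of candidate angles (the range and pitch here are hypothetical).
b1, c1 = np.array([2.0, 0.0, 0.0]), np.array([0.0, 3.0, 0.0])
candidates = [(rotate_xyz(b1, a, b, g), rotate_xyz(c1, a, b, g))
              for a in np.linspace(-0.2, 0.2, 5)
              for b in np.linspace(-0.2, 0.2, 5)
              for g in np.linspace(-0.2, 0.2, 5)]
```

Each entry of `candidates` is one pair (B1n′, C1n′) to be projected and compared in the subsequent steps.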
  • Next, using the viewpoint projection angles extracted by the viewpoint projection angle extraction unit 120, the tilt angle calculation unit 130a projects the plurality of points B1n′ and C1n′ onto the X1-Y1 plane, thereby calculating the first conversion projection reference points B1n″ (B1nx″, B1ny″, 0) and C1n″ (C1nx″, C1ny″, 0) (S144), where n = 1, 2, 3, ...
  • Here, let ε1x be the viewpoint projection angle of the point B01 on the Z1-X1 plane, ε1y the viewpoint projection angle of the point B01 on the Z1-Y1 plane, φ1x the viewpoint projection angle of the point C01 on the Z1-X1 plane, and φ1y the viewpoint projection angle of the point C01 on the Z1-Y1 plane.
  • Here, the triangle A01B01C01 connecting the points A01, B01, and C01, which are the reference points A, B, and C on the first imaging plane, is similar to the triangle AB1n″C1n″ connecting the points A, B1n″, and C1n″. As shown in FIG. 12, ∠B1n′M1Cp1 and ∠B1n″M1Cp1 are both ε1x, so B1nx″ can be calculated using Equation 7, and B1ny″, C1nx″, and C1ny″ can be calculated using Equations 8 to 10.
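Because the projection in S144 keeps each rotated point on its line of sight from the viewpoint (the angle equality above), B1n″ can be viewed as the intersection of the ray from M1 through B1n′ with the X1-Y1 plane. A minimal sketch under that assumption follows; the viewpoint position on the Z1 axis and the sample coordinates are hypothetical, and the patent's own Equations 7 to 10 are not reproduced here.

```python
import numpy as np

def project_to_plane(p, m1):
    """Intersect the ray from viewpoint m1 through point p with the plane z = 0."""
    p, m1 = np.asarray(p, float), np.asarray(m1, float)
    t = m1[2] / (m1[2] - p[2])  # ray parameter where z reaches 0
    return m1 + t * (p - m1)

m1 = np.array([0.0, 0.0, 10.0])       # assumed viewpoint on the Z1 axis
b_rot = np.array([2.0, 0.5, 1.0])     # a rotated virtual point B1n'
b_proj = project_to_plane(b_rot, m1)  # first conversion projection point B1n''
```

The projected point lies on the X1-Y1 plane and, by construction, on the same viewing ray as the rotated point, which is what the angle equality expresses.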
  • Then, the inclination angle calculation unit 130a can calculate the first inclination angle (α1, β1, γ1) by obtaining the first conversion projection reference points B1″ and C1″ for which the triangle AB1n″C1n″ connecting the points A, B1n″, and C1n″ is most similar to the triangle A01B01C01 connecting the points A01, B01, and C01, which are the reference points A, B, and C on the first imaging surface (S145).
  • As a method of comparing the similarity, for example, the line segment A01B01 is scaled to the same length as the line segment AB1n″, the points B01 and C01 are moved so that the point A01 coincides with the point A, and the first inclination angle (α1, β1, γ1) is determined so that the sum of the length of the line segment B01B1n″ and the length of the line segment C01C1n″ is minimized.
  • The similarity comparison method is not limited to the above. For example, the line segment AB1n″ may be scaled to the same length as the line segment A01B01, the points may be moved so that the point A coincides with the point A01, and the first inclination angle (α1, β1, γ1) may be calculated so that the deviation between the point C01 and the point C1n″ is minimized.
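The first comparison method above (scale A01B01 to the length of AB1n″, make A01 coincide with A, then sum the residuals at B and C) can be sketched as an error function to minimize over the candidate rotation angles. This is a hypothetical illustration, not the patent's exact formulation:

```python
import numpy as np

def similarity_error(a01, b01, c01, b2, c2):
    """Error of a candidate pair (B1n'', C1n'') against triangle A01-B01-C01.

    a01 is taken as the shared point A; the segment A01-B01 is scaled to the
    length of A-B1n'', then the residual distances at B and C are summed.
    """
    a01, b01, c01, b2, c2 = (np.asarray(v, float) for v in (a01, b01, c01, b2, c2))
    s = np.linalg.norm(b2 - a01) / np.linalg.norm(b01 - a01)  # scale A01B01 to AB''
    b_scaled = a01 + s * (b01 - a01)
    c_scaled = a01 + s * (c01 - a01)
    return np.linalg.norm(b_scaled - b2) + np.linalg.norm(c_scaled - c2)
```

The candidate angles whose projected points give the smallest error would then be taken as the first inclination angle.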
  • In the same manner, the inclination angle calculation unit 130a can calculate the second inclination angle (α2, β2, γ2), which is the angle formed between the second imaging surface and the plane including the reference points.
  • The calculation of the tilt angle has been described above based on this embodiment, but it is not limited to this embodiment.
  • For example, the tilt angle calculation unit 130a sets the X1-Y1-Z1 coordinate system with one of the three reference points A, B, and C, whose coordinates are known, as the origin, but the origin may also be selected from points other than the three reference points whose coordinates are known.
  • Alternatively, the point A may be placed at a certain position on the viewpoint projection angle line of the point A, and the point B may be moved at a certain pitch within a certain range on the viewpoint projection angle line of the point B. The position of the point C is then obtained as a similar shape based on the line segment AB for the point B at each pitch, and the point C lies on a circumference around the line segment AB as an axis. The condition of closest approach to the viewpoint projection angle line of the point C may be stored while the point B is slid sequentially, the closest point of the point C being calculated each time, and the inclination angle obtained at the closest approach may be used.
  • Furthermore, the point A may be placed at a certain position on the viewpoint projection angle line of the point A; since the point C at that time lies on a circumference around the line segment AB, the condition in which the point C most closely approaches its projection angle line is stored. The above calculation is performed for the point A within a certain range, and the inclination angle obtained at the closest approach to the viewpoint projection angle line may be used.
  • Next, the vector calculation unit 130b calculates the first vector and the second vector (S150).
  • First, the first target two-dimensional coordinates of the point W01, which is the target point W on the first imaging surface, are converted into a similar shape on the first conversion plane where the triangle AB1″C1″ exists. This converted point is defined as W1 (W1x, W1y, 0).
  • The first vector is calculated using the viewpoint projection angle of the point W01 on the Z1-X1 plane and the viewpoint projection angle of the point W01 on the Z1-Y1 plane, both extracted by the viewpoint projection angle extraction unit 120, together with W1z′.
  • Similarly, the vector calculation unit 130b can calculate the second vector.
  • Next, the conversion vector calculation unit 130c extracts a line segment connecting any two of the points A, B1″, and C1″, for example the line segment AB1″, and the corresponding line segment AB2″ among the line segments connecting the points A, B2″, and C2″. Then, using the first inclination angle (α1, β1, γ1) and the second inclination angle (α2, β2, γ2) obtained by the inclination angle calculation unit 130a, the conversion vector calculation unit 130c calculates a conversion vector by moving the second vector so that the line segment AB2″ overlaps the line segment AB1″ (S160).
  • Finally, the relative coordinate measuring unit 130d calculates the coordinates of the closest point between the first vector and the second vector, thereby calculating the relative coordinates between the origin A, which is one of the reference points, and the target point W (S170).
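In practice two measured 3D lines rarely intersect exactly, so the closest approach between them is computed. A minimal sketch of that standard computation follows, returning the midpoint of the shortest connecting segment; taking the midpoint is a common convention, and the patent does not specify this detail.

```python
import numpy as np

def closest_point(p1, d1, p2, d2):
    """Midpoint of the shortest segment between lines p1 + t*d1 and p2 + s*d2.

    Assumes the two lines are not parallel (denom != 0).
    """
    p1, d1, p2, d2 = (np.asarray(v, float) for v in (p1, d1, p2, d2))
    r = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ r, d2 @ r
    denom = a * c - b * b
    t = (b * e - c * d) / denom  # parameter on line 1
    s = (a * e - b * d) / denom  # parameter on line 2
    return (p1 + t * d1 + p2 + s * d2) / 2.0
```

When the two lines actually intersect, the returned midpoint is the intersection itself; otherwise it is the natural estimate of the target point between the two skew rays.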
  • The vector calculation unit 130b corresponds to the measurement unit of the present invention.
  • Next, the operation in which the three-dimensional relative coordinate measuring unit 20 measures the relative coordinates between any one of the three reference points and the shooting point (corresponding to the first viewpoint of the present invention) will be described.
  • FIG. 13 is a flowchart showing a processing procedure of relative coordinate measurement by the three-dimensional relative coordinate measurement apparatus according to Embodiment 2 of the present invention.
  • the image acquisition unit 100 acquires a first acquired image as shown in FIG. 14 (S210).
  • The first acquired image is the image data 60 in which the reference points A, B, and C, the three points whose coordinates are known, appear together as objects, captured from the first viewpoint M1 by the camera (the first imaging device of the present invention).
  • Next, the two-dimensional coordinate extraction unit 110 extracts, from the first acquired image, the coordinates of the points A01, B01, and C01 (the three first reference two-dimensional coordinates), which are the reference points A, B, and C projected on the first acquired image, using the optical center position Cp1 in the first acquired image as the origin (S220).
  • Next, the viewpoint projection angle extraction unit 120 holds, in advance, the viewpoint projection angle of each image pixel calculated by the adjustment unit 200, and extracts, from the held viewpoint projection angles, those corresponding to the three first reference two-dimensional coordinates extracted by the two-dimensional coordinate extraction unit 110 as the three first reference viewpoint projection angles (S230).
  • Next, the tilt angle calculation unit 130a calculates the angle (the first tilt angle) formed between the first imaging plane, which is the imaging plane of the first acquired image, and the first reference plane including the reference points (S240).
  • the vector calculation unit 130b calculates two vectors (S250).
  • The two vectors are, for example, the straight line B1′B1″ connecting B1″ determined in (S145) and the point B1′ corresponding to the point B1n′ at that time, and the straight line C1′C1″ connecting C1″ determined in (S145) and the point C1′ corresponding to the point C1n′ at that time.
  • Finally, the relative coordinate measuring unit 130d calculates the coordinates of the closest point of the two vectors, thereby calculating the relative coordinates between the point A, which is one of the reference points, and the shooting point (S260).
  • Alternatively, the point A may be placed at a certain position on the viewpoint projection angle line of the point A; since the point C at that time lies on a circumference around the line segment AB, the condition in which the point C most closely approaches its projection angle line is stored. The above calculation is performed for the point A within a certain range, and the inclination angle obtained at the closest approach to the viewpoint projection angle line may be used.
  • In the triangle ABM1 connecting the points A and B and the point M1, the length of the line segment AB and the angles ∠M1AB and ∠M1BA are known, so the relative coordinates between any one of the reference points and the shooting point can be calculated using the triangle ABM1. Further, since ∠AM1B is known from the viewpoint projection angles when the line segment AB is viewed from the viewpoint M1, the relative coordinates between any one of the reference points and the shooting point can similarly be calculated using the triangle ABM1.
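The triangulation in the triangle ABM1 can be sketched with the law of sines. This is an illustrative in-plane computation under assumed inputs, not the patent's formulation: A is placed at the origin and B on the positive x-axis of the triangle's plane.

```python
import math

def locate_viewpoint(ab_length, angle_a, angle_b):
    """In-plane coordinates of M1 given |AB| and the angles at A and B.

    angle_a = ∠M1AB and angle_b = ∠M1BA (radians); the third angle ∠AM1B
    follows from the angle sum, and the law of sines gives |AM1|.
    """
    angle_m = math.pi - angle_a - angle_b          # ∠AM1B
    am1 = ab_length * math.sin(angle_b) / math.sin(angle_m)  # law of sines
    return (am1 * math.cos(angle_a), am1 * math.sin(angle_a))
```

For an equilateral configuration (|AB| = 1, both base angles 60°), this places M1 at (0.5, √3/2), as expected.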
  • In the above embodiments, the viewpoint projection angle extraction unit 120 holds the viewpoint projection angles of image pixels calculated in advance by the adjustment unit 200 and extracts, from the held viewpoint projection angles, the reference and target viewpoint projection angles corresponding to the two-dimensional coordinates extracted by the two-dimensional coordinate extraction unit 110. Alternatively, the viewpoint projection angle extraction unit 120 may hold Equation 6 and calculate the reference and target viewpoint projection angles from the two-dimensional coordinates extracted by the two-dimensional coordinate extraction unit 110 and the distortion correction equation, using the held Equation 6.
  • In this case, the adjustment unit 200 needs to obtain, by Equation 5, the distance x between the viewpoint M and the optical center position Cp.
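Equation 6 itself is not reproduced in this text. Under a simple pinhole assumption, the relation between a pixel's offset from the optical center Cp and its viewpoint projection angle can be sketched as follows; this is a hypothetical illustration that ignores the distortion correction mentioned above.

```python
import math

def viewpoint_projection_angle(pixel_offset, x):
    """Angle between the optical axis and the ray through a pixel at distance
    pixel_offset from the optical center Cp, with viewpoint-to-Cp distance x."""
    return math.atan2(pixel_offset, x)
```

A pixel at the optical center gives an angle of 0, and the angle grows with the pixel's distance from Cp, which is why the distance x must be known.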
  • In the above embodiments, the two-dimensional coordinate extraction unit 110 extracts the two-dimensional coordinates of the three reference points and of the target point in the first and second acquired images, but the image acquisition unit 100 may extract the two-dimensional coordinates instead. In this case, the two-dimensional coordinate extraction unit 110 is unnecessary.
  • In the above embodiments, the tilt angle calculation unit 130a calculates the first tilt angle and the second tilt angle by setting two different three-dimensional coordinate systems, but the first and second tilt angles may be calculated by setting only one three-dimensional coordinate system. In this case, the vector calculation unit 130b calculates the first vector passing through the point M1 and the point W01 in the one three-dimensional coordinate system using the first target viewpoint projection angle and the first inclination angle, and calculates the second vector passing through the point M2 and the point W02 in the same three-dimensional coordinate system using the second target viewpoint projection angle and the second inclination angle. Therefore, in this case, the conversion vector calculation unit 130c is unnecessary.
  • Alternatively, the calculation may be performed by the vector calculation unit 130b instead of the conversion vector calculation unit 130c.
  • In the above embodiments, shooting is performed from two viewpoints, but shooting may be performed from three or more viewpoints, or the number of reference points whose relative coordinates are known may be increased to four or more.
  • In this case, each inclination angle is an angle with respect to a different reference plane (the first and second reference planes).
  • In the above embodiments, the relative coordinate measurement unit 130d calculates the relative coordinates of the target point using the relative positional relationship between the three first reference points and the three second reference points; specifically, it calculates the closest point of the two vectors using this relative positional relationship. This relative positional relationship may instead be reflected when calculating the tilt angles: the calculated tilt angles may be adjusted to one of the reference planes, or may be calculated as angles unified in a single three-dimensional coordinate system.
  • The first tilt angle and the second tilt angle may also be calculated by performing a nonlinear approximation calculation using a large number of reference points whose relative coordinates are known.
  • the correspondence can be identified by using a special marker or line.
  • In addition, a verification reference point whose relative coordinates are known may be used to prevent the reference points from being reversed (turned over) by rotation and an incorrect solution (tilt angle) from being calculated, or restrictions, for example rotation conditions, may be added so as not to calculate an incorrect solution.
  • a part or all of the functions of the three-dimensional relative coordinate measuring apparatus according to the embodiment of the present invention may be realized by a processor such as a CPU executing a program.
  • The connection relationship between the constituent elements is exemplified in order to specifically explain the present invention, and the connection relationship that realizes the functions of the present invention is not limited to this.
  • The division of functional blocks in the block diagram is an example; a plurality of functional blocks may be realized as one functional block, a single functional block may be divided into a plurality of blocks, or some functions may be transferred to other functional blocks. Further, the functions of a plurality of functional blocks having similar functions may be processed by the three-dimensional measuring apparatus in parallel or in a time-division manner.
  • the present invention may be the above program or a non-transitory computer-readable recording medium on which the above program is recorded.
  • the program can be distributed via a transmission medium such as the Internet.
  • the configuration of the three-dimensional measurement apparatus is for illustrating the present invention in detail, and the three-dimensional measurement apparatus according to the present invention does not necessarily have all of the above-described configurations. In other words, the three-dimensional measuring apparatus according to the present invention only needs to have a minimum configuration that can realize the effects of the present invention.
  • The three-dimensional measurement method using the three-dimensional measurement apparatus is illustrated in order to specifically describe the present invention, and the three-dimensional measurement method according to the present invention need not include all of the above steps. In other words, the three-dimensional measurement method according to the present invention needs to include only the minimum steps that can realize the effects of the present invention.
  • The present invention can be used for three-dimensional relative coordinate measurement, for example automatic measurement of the interval between two points, in environments where relative coordinates cannot be measured mechanically or electrically.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A three-dimensional relative coordinate measuring device (90) is provided with: an image acquisition unit (100) that acquires a first acquired image obtained by capturing three reference points from a first viewpoint with an image capture device; a viewpoint projection angle extraction unit (120) that holds information on a pixel viewpoint projection angle for each pixel, and uses said information to acquire three first reference viewpoint projection angles, which are the pixel viewpoint projection angles corresponding to the three reference points projected onto the first acquired image; an inclination angle calculation unit (130a) that uses the three first reference viewpoint projection angles and the relative coordinates of the three reference points to calculate a first inclination angle formed between a first imaging plane, which is the imaging plane of the first acquired image, and a reference plane containing the three reference points; and a relative coordinate measuring unit (130d) that uses the first inclination angle to measure the relative coordinates between any one of the three reference points and a target point.

Description

Three-dimensional relative coordinate measuring apparatus and method
 The present invention relates to a three-dimensional relative coordinate measuring apparatus that measures relative coordinates between any one of three reference points whose relative coordinates are known and one target point whose coordinates are unknown in a three-dimensional coordinate system.
 A stereo method is known as a method for measuring relative coordinates from an image.
 In the stereo method, two images are obtained by photographing with two cameras placed at different positions or by photographing from two different positions with one camera. Relative coordinate information is then obtained from subtle differences between the images based on parallax, as in human binocular vision.
 Patent Document 1 discloses a photogrammetry system capable of performing highly accurate surveying by the stereo method.
Japanese Patent Laid-Open No. 10-221072
 However, Patent Document 1 presupposes that the position and orientation of the camera at the shooting point are grasped in advance by using a detection device such as a tilt angle sensor.
 The present invention has been made in view of such a situation, and an object thereof is to provide a three-dimensional relative coordinate measuring apparatus capable of measuring relative coordinates without grasping in advance the position and orientation of the camera at the shooting point.
 In order to achieve the above object, a relative coordinate measuring apparatus according to an aspect of the present invention is a three-dimensional relative coordinate measuring apparatus that measures relative coordinates between any one of three first reference points whose relative coordinates in a three-dimensional coordinate system are known and one target point whose coordinates in the three-dimensional coordinate system are unknown, the apparatus including: an image acquisition unit that acquires a first acquired image in which the three first reference points are imaged by a first imaging device from a first viewpoint; and a viewpoint projection angle extraction unit that holds information on a first pixel viewpoint projection angle for each pixel and uses the information to acquire three first reference viewpoint projection angles, which are the first pixel viewpoint projection angles corresponding to the three first reference points projected on the first acquired image, wherein the first pixel viewpoint projection angle is the angle formed between the optical axis of the first imaging device and a line segment connecting the first viewpoint and each point of the three-dimensional coordinate system projected onto the first pixel two-dimensional coordinates, which are the coordinates of each pixel in the first acquired image. The apparatus further includes: an inclination angle calculation unit that calculates, using the three first reference viewpoint projection angles and the relative coordinates of the three first reference points, a first inclination angle formed between a first imaging surface, which is the imaging surface of the first acquired image, and a first reference plane including the three first reference points; and a measurement unit that measures, using the first inclination angle, the relative coordinates between any one of the three first reference points and the target point.
 Thus, by using the viewpoint projection angle, relative coordinates can be measured without grasping in advance the position and orientation of the camera at the shooting point.
 Further, the three first reference points may be any three of three or more reference points whose relative coordinates in the three-dimensional coordinate system are known. The image acquisition unit may acquire the first acquired image in which the three first reference points and the target point are imaged by the first imaging device from the first viewpoint, and a second acquired image in which three second reference points, which are any three of the three or more reference points, and the target point are imaged from a second viewpoint by a second imaging device that is the same as or different from the first imaging device. The viewpoint projection angle extraction unit may further hold information on a second pixel viewpoint projection angle for each pixel and use the information to acquire three second reference viewpoint projection angles, which are the second pixel viewpoint projection angles corresponding to the three second reference points projected on the second acquired image, wherein the second pixel viewpoint projection angle is the angle formed between the optical axis of the second imaging device and a line segment connecting the second viewpoint and each point of the three-dimensional coordinate system projected onto the second pixel two-dimensional coordinates, which are the coordinates of each pixel in the second acquired image. The inclination angle calculation unit may further calculate, using the three second reference viewpoint projection angles and the relative coordinates of the three second reference points, a second inclination angle formed between a second imaging surface, which is the imaging surface of the second acquired image, and a second reference plane including the three second reference points. The measurement unit may then measure the relative coordinates between the target point and any one of the three first reference points and the three second reference points, using the relative positional relationship between the three first reference points and the three second reference points, the first inclination angle, and the second inclination angle.
 As described above, by using two images and the viewpoint projection angles, the relative coordinates between any one of the three reference points and the target point can be measured without grasping in advance the relative posture relationship of the cameras at the shooting points and without fixing the relative positional relationship of the cameras at the shooting points.
 Further, the tilt angle calculation unit may determine, using the three first and second reference viewpoint projection angles and the three first and second reference points, the first reference plane including the three first reference points lying on straight lines connecting the first viewpoint and each of the three first reference points projected on the first acquired image, and the second reference plane including the three second reference points lying on straight lines connecting the second viewpoint and each of the three second reference points projected on the second acquired image.
 Further, the viewpoint projection angle extraction unit may further acquire a first target viewpoint projection angle, which is the first pixel viewpoint projection angle corresponding to the target point projected on the first acquired image, and a second target viewpoint projection angle, which is the second pixel viewpoint projection angle corresponding to the target point projected on the second acquired image. The measurement unit may then calculate, using the first target viewpoint projection angle and the first inclination angle, a first vector passing through the first viewpoint and the target point projected on the first acquired image in the three-dimensional coordinate system, calculate, using the second target viewpoint projection angle and the second inclination angle, a second vector passing through the second viewpoint and the target point projected on the second acquired image in the three-dimensional coordinate system, and calculate, using the relative positional relationship, the coordinates of the closest point between the first vector and the second vector in the three-dimensional coordinate system as the relative coordinates between the target point and any one of the three first reference points and the three second reference points.
 In addition, at least one of the three first reference points may be different from the three second reference points.
 Further, the target point may be the first viewpoint.
 As described above, by using a single image and the viewpoint projection angles, the relative coordinates between any one of the three reference points and the shooting point can be measured without grasping in advance the posture of the camera at the shooting point.
 Further, the tilt angle calculation unit may determine, using the three first reference viewpoint projection angles and the three first reference points, the first reference plane including the three first reference points lying on straight lines connecting the first viewpoint and each of the three first reference points projected on the first acquired image.
 As described above, the tilt angle calculation unit can calculate the posture of the camera at the shooting point.
 Further, the measurement unit may calculate, using two of the three first reference viewpoint projection angles and the first inclination angle, two vectors passing through the two of the three first reference points corresponding to the two angles in the first reference plane, and calculate the coordinates of the closest point of the two vectors as the relative coordinates between any one of the three first reference points and the target point.
 Note that the present invention can be realized not only as such a three-dimensional relative coordinate measuring apparatus, but also as a three-dimensional relative coordinate measuring method whose steps are the operations of the characteristic components included in such an apparatus, or as a program that causes a computer to execute those steps.
 The present invention can provide a three-dimensional relative coordinate measuring apparatus capable of measuring relative coordinates without grasping in advance the position and orientation of the camera at the shooting point.
FIG. 1A is a diagram showing the configuration of a system including the three-dimensional relative coordinate measuring apparatus at the time of three-dimensional relative coordinate measurement according to the present invention.
FIG. 1B is a diagram showing the configuration of a system including the three-dimensional relative coordinate measuring apparatus at the time of pixel calibration of an image according to the present invention.
FIG. 2 is a block diagram showing the characteristic functional configuration of the three-dimensional relative coordinate measuring apparatus according to the present invention.
FIG. 3 is a flowchart showing the processing procedure of pixel calibration of an image according to the present invention.
FIG. 4 is a diagram explaining the viewpoint projection angle according to the present invention.
FIG. 5A is a diagram showing an image of the calibration plate photographed before distortion correction according to the present invention.
FIG. 5B is a diagram showing an image of the calibration plate photographed after distortion correction according to the present invention.
FIG. 6 is a diagram explaining the process for calculating the distance between the calibration plate and the viewpoint according to the present invention.
FIG. 7 is a flowchart showing the processing procedure of relative coordinate measurement by the three-dimensional relative coordinate measuring apparatus according to Embodiment 1 of the present invention.
FIG. 8 is a diagram three-dimensionally showing the relationship among the viewpoint, imaging surface, reference points, and target point according to Embodiment 1 of the present invention.
FIG. 9A is a diagram showing the reference points and target point projected on the first imaging surface according to Embodiment 1 of the present invention.
FIG. 9B is a diagram showing the reference points and target point projected on the second imaging surface according to Embodiment 1 of the present invention.
FIG. 10 is a flowchart showing the processing procedure performed by the tilt angle calculation unit according to the present invention.
FIG. 11 is a diagram explaining the processing shown in the flowchart of FIG. 10 according to the present invention.
FIG. 12 is a diagram of FIG. 11 projected onto the Z1-X1 plane according to the present invention.
FIG. 13 is a flowchart showing the processing procedure of relative coordinate measurement by the three-dimensional relative coordinate measuring apparatus according to Embodiment 2 of the present invention.
FIG. 14 is a diagram three-dimensionally showing the relationship among the viewpoint, imaging surface, and reference points according to Embodiment 2 of the present invention.
FIG. 15 is a diagram showing the reference points projected on the first imaging surface according to Embodiment 2 of the present invention.
 Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
 First, the configuration of a system including the three-dimensional relative coordinate measuring apparatus according to the present invention will be described separately for the time of three-dimensional relative coordinate measurement and the time of pixel calibration of an image according to the embodiments of the present invention.
 FIG. 1A is a diagram showing the configuration of a system including the three-dimensional relative coordinate measuring apparatus at the time of three-dimensional relative coordinate measurement according to the present invention. Note that FIG. 1A shows the system configuration when two images are acquired from two viewpoints. The three-dimensional relative coordinate measuring apparatus 90 according to the present invention is connected to the camera 10 via a cable 30. The cable 30 is, for example, a USB (Universal Serial Bus) cable. This allows the three-dimensional relative coordinate measuring apparatus 90 to acquire the image data 60 captured by the camera 10 via the cable 30. Note that a cable other than a USB cable may be used as the cable 30, and the image data 60 may also be acquired wirelessly or via a recording medium.
 The camera 10 captures images at two different viewpoints, a first viewpoint M1 and a second viewpoint M2, such that the reference structure 40 and the target structure 50 both appear in each image. Here, M1 and M2 correspond to the positions of the principal point of the photographing lens of the camera 10. In FIG. 1A, O1 and O2, drawn with two-dot chain lines, indicate the optical axis of the photographing lens (the optical axis of the camera) when the viewpoint is at M1 and M2, respectively. Note that when one image is acquired from one viewpoint, the camera 10 need only capture an image at the first viewpoint such that the reference structure 40 appears. The reference structure 40 has three reference points A, B, and C whose relative coordinates are known in a three-dimensional coordinate system. In other words, the triangle defined by the points A, B, and C is called the reference structure 40.
 The target structure 50 has a target point W whose coordinates are unknown in the three-dimensional coordinate system. Note that when one image is acquired from one viewpoint, the target structure 50 need not be used.
 FIG. 1B is a diagram showing the configuration of a system including the three-dimensional relative coordinate measuring apparatus at the time of pixel calibration of an image according to the present invention. The three-dimensional relative coordinate measuring apparatus 90 according to the present invention is connected to the camera 10 via the cable 30. The cable 30 is, for example, a USB cable. This allows the three-dimensional relative coordinate measuring apparatus 90 to acquire the image data 61 captured by the camera 10 via the cable 30. Note that a cable other than a USB cable may be used as the cable 30, and the image data 61 may also be acquired wirelessly or via a recording medium.
 The camera 10 captures an image at the viewpoint M such that the calibration plate 70 appears. Here, M corresponds to the position of the principal point of the photographing lens of the camera 10. In FIG. 1B, O, drawn with a two-dot chain line, indicates the optical axis of the photographing lens (the optical axis of the camera) when the viewpoint is at M.
 The calibration plate 70 is a transparent plate of high rigidity with a grid marked on its surface. A weight 72 is attached to the calibration plate 70 by a thread 71.
 The three-dimensional relative coordinate measuring apparatus 90 is, for example, a computer. In the present embodiment, a common computer is used for three-dimensional relative coordinate measurement and for pixel calibration of images, but separate computers may be used, and two or more computers may be used.
 FIG. 2 is a block diagram showing the characteristic functional configuration of the three-dimensional relative coordinate measuring apparatus according to the present invention.
 The three-dimensional relative coordinate measuring apparatus 90 includes the three-dimensional relative coordinate measurement unit 20 and the image pixel calibration unit 80 according to the present invention.
 The three-dimensional relative coordinate measurement unit 20 according to the present invention includes an image acquisition unit 100, a two-dimensional coordinate extraction unit 110, a viewpoint projection angle extraction unit 120, a calculation unit 130, and a display unit 140. The image pixel calibration unit 80 includes an adjustment unit 200.
 From the image data 61, in which the calibration plate 70, the thread 71, and the weight 72 are captured from the viewpoint M, the adjustment unit 200 corrects the distortion of the image pixels and calculates the viewpoint projection angle of an arbitrary image pixel (corresponding to the first and second pixel viewpoint projection angles of the present invention).
 The image acquisition unit 100 acquires, as a first acquired image, the image data 60 in which the three reference points, or the three reference points and the target point, are captured by the camera (corresponding to the first and second imaging devices of the present invention) from the first viewpoint M1. The image acquisition unit 100 also acquires, as a second acquired image, the image data 60 in which the three reference points and the target point are captured by the camera from the second viewpoint M2.
 The two-dimensional coordinate extraction unit 110 extracts the two-dimensional coordinates of the three reference points, or of the three reference points and the target point, projected on the first acquired image acquired by the image acquisition unit 100. The two-dimensional coordinate extraction unit 110 also extracts the two-dimensional coordinates of the three reference points and the target point projected on the second acquired image acquired by the image acquisition unit 100.
 The viewpoint projection angle extraction unit 120 holds the viewpoint projection angles of arbitrary image pixels calculated in advance by the adjustment unit 200, and extracts, from among the held viewpoint projection angles, the viewpoint projection angles corresponding to the two-dimensional coordinates extracted by the two-dimensional coordinate extraction unit 110. The definition of the viewpoint projection angle is described later with reference to FIG. 4.
 Here, the calculation unit 130 further includes a tilt angle calculation unit 130a, a vector calculation unit 130b, a conversion vector calculation unit 130c, and a relative coordinate measurement unit 130d.
 The tilt angle calculation unit 130a calculates the first tilt angle and the second tilt angle using the relative coordinates of the three reference points and the viewpoint projection angles extracted by the viewpoint projection angle extraction unit 120.
 The vector calculation unit 130b calculates the first vector and vector 2 using the viewpoint projection angles extracted by the viewpoint projection angle extraction unit 120. The vector calculation unit 130b also calculates two vectors using the viewpoint projection angles extracted by the viewpoint projection angle extraction unit 120 and the first tilt angle calculated by the tilt angle calculation unit 130a.
 The conversion vector calculation unit 130c calculates the second vector using the first tilt angle and the second tilt angle calculated by the tilt angle calculation unit 130a and vector 2 calculated by the vector calculation unit 130b.
 The relative coordinate measurement unit 130d measures the relative coordinates between any one of the three reference points and the target point using the first vector calculated by the vector calculation unit 130b and the second vector calculated by the conversion vector calculation unit 130c. The relative coordinate measurement unit 130d also measures the relative coordinates between any one of the three reference points and the photographing point using the two vectors calculated by the vector calculation unit 130b.
 The display unit 140, for example, displays the acquired images acquired by the image acquisition unit 100, the two-dimensional coordinates extracted by the two-dimensional coordinate extraction unit 110, the viewpoint projection angles extracted by the viewpoint projection angle extraction unit 120, and the various calculation results produced by the calculation unit 130.
 Next, the operation of the system including the three-dimensional relative coordinate measuring apparatus of the present embodiment configured as described above will be described.
 As a premise, pixel calibration of the image must be performed before the three-dimensional relative coordinate measurement unit 20 performs relative coordinate measurement.
 FIG. 3 is a flowchart showing the procedure of pixel calibration of an image according to the present invention. Pixel calibration of an image consists of setting the optical center position Cp, which is the center point of the imaging surface (S101), correcting distortion (S102), and calculating the viewpoint projection angle θ of an arbitrary image pixel (corresponding to the first and second pixel viewpoint projection angles of the present invention) (S103). Here, as shown in FIG. 4, the viewpoint projection angle θ of an arbitrary image pixel is the angle between the optical axis O of the camera (corresponding to the optical axes of the first and second imaging devices of the present invention) and the line segment connecting the viewpoint M with an arbitrary point P3 in three-dimensional space that is projected onto the two-dimensional coordinates (corresponding to the first and second pixel two-dimensional coordinates of the present invention) of an arbitrary image pixel P on the image. In FIG. 4, since the optical axis O of the camera is orthogonal to the imaging surface, the viewpoint projection angle θ of the arbitrary image pixel P can be calculated from the distance x between the viewpoint M and the optical center position Cp, which is the intersection of the optical axis O and the imaging surface, and, for example, the two-dimensional coordinates of the arbitrary image pixel P with the optical center position Cp as the origin.
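 The relationship in FIG. 4 can be sketched in code as follows. This is a minimal illustration, not part of the patent: the function name is hypothetical, and it assumes the pixel coordinates and the distance x are already expressed in the same length unit.

```python
import math

def viewpoint_projection_angle(px, py, x):
    """Viewpoint projection angle (radians) of an image pixel P at
    (px, py), where the coordinates are taken with the optical center
    position Cp as the origin and x is the distance between the
    viewpoint M and Cp (all in the same length unit)."""
    d = math.hypot(px, py)   # distance Cp-P on the imaging surface
    return math.atan2(d, x)  # angle between the ray M-P and the optical axis O

# A pixel on the optical axis has angle 0; a pixel at distance x from
# Cp sees its ray at 45 degrees:
print(viewpoint_projection_angle(0.0, 0.0, 35.0))   # 0.0
print(viewpoint_projection_angle(35.0, 0.0, 35.0))  # pi/4 (about 0.7853981634)
```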
 First, the optical center position Cp is set (S101). A weight 72 is attached by a thread 71 to a transparent plate of high rigidity (hereinafter referred to as the calibration plate 70). The calibration plate 70 is kept horizontal using a level or the like, and a grid is marked on its surface. Next, the calibration plate 70 is imaged from above with the camera 10. The imaging surface is then adjusted so that (1) the attachment position of the thread 71 to the calibration plate 70 and the attachment position of the thread 71 to the weight 72 overlap in the image, and (2) the grid appears symmetric about the attachment position of the thread 71; the attachment position of the thread 71 on the calibration plate 70 is then the optical center position Cp.
 Next, the adjustment unit 200 corrects the distortion of the image (S102). As shown in FIG. 5A, before distortion correction, an image of the calibration plate 70 has a distorted grid due to the characteristics of the photographing lens of the camera 10. The adjustment unit 200 applies normalization processing to the captured image to adjust it into an image such as that shown in FIG. 5B.
 Finally, the adjustment unit 200 calculates the viewpoint projection angle θ of an arbitrary image pixel P, as shown in FIG. 4 (S103).
 FIG. 6 is a diagram explaining the process for calculating the distance between the calibration plate and the viewpoint according to the present invention.
 First, the adjustment unit 200 acquires an image of the plate before movement, which is a distortion-corrected image of the calibration plate 70 such as that shown in FIG. 5B. Next, the adjustment unit 200 acquires an image of the plate after movement, which is a distortion-corrected image of the calibration plate 70 captured after the calibration plate 70, kept horizontal, has been moved a distance y closer to the viewpoint M. The adjustment unit 200 then extracts from the before-movement and after-movement images, for example, the two-dimensional coordinates of the points P1' and P2' with the optical center position Cp as the origin.
 Here, let C1 be the intersection of the line connecting the viewpoint M with the optical center position Cp and the surface of the calibration plate 70; let P1 be an arbitrary point on the surface of the calibration plate 70; let P1' be the intersection of the line connecting the viewpoint M with P1 and the imaging surface; let P2 be the point on the surface of the calibration plate 70 after movement that corresponds to P1 on the surface before movement; let P2' be the intersection of the line connecting the viewpoint M with P2 and the imaging surface; and let Q be the intersection of the line passing through the point P2 and the point P2' with the surface of the calibration plate 70 before movement.
 At this time, the length d1 of the line segment P1Q is expressed by Equation 1.
$$ d_1 = \frac{d_2\, d_3}{d_4} \qquad \text{(Equation 1)} $$
 Here, d2 is the length of the line segment C1P1, d3 is the length of the line segment P1'P2', and d4 is the length of the line segment CpP1'.
 The viewpoint projection angle θ1 of the point Q is expressed by Equation 2.
$$ \theta_1 = \tan^{-1}\frac{d_1}{y} \qquad \text{(Equation 2)} $$
 Therefore, the distance z between the calibration plate 70 and the viewpoint M is expressed by Equation 3.
$$ z = \frac{d_5}{\tan\theta_1} \qquad \text{(Equation 3)} $$
 Here, d5 is the length of the line segment C1Q.
 Then, using the distance z between the calibration plate 70 and the viewpoint M, for example, the viewpoint projection angle θ2 of P1' in FIG. 6 is expressed by Equation 4.
$$ \theta_2 = \tan^{-1}\frac{d_2}{z} \qquad \text{(Equation 4)} $$
 The length x of the line segment connecting the viewpoint M with the optical center position Cp is expressed by Equation 5.
$$ x = \frac{d_4}{\tan\theta_2} \qquad \text{(Equation 5)} $$
 Therefore, the viewpoint projection angle θ of an arbitrary pixel point P can be calculated by Equation 6.
$$ \theta = \tan^{-1}\frac{d_6}{x} \qquad \text{(Equation 6)} $$
 Here, d6 is the length of the line segment CpP.
 That is, since the two-dimensional coordinates of an arbitrary image pixel P with the optical center position Cp as the origin are known, the viewpoint projection angle θ of the arbitrary image pixel P can be calculated.
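 The chain from Equation 1 to Equation 6 can be sketched as a single computation. This sketch is not part of the patent: the function name is illustrative, and it assumes the measured segment lengths d2 to d6 and the plate displacement y are given in one consistent length unit.

```python
import math

def pixel_viewpoint_angle(d2, d3, d4, y, d6):
    """Viewpoint projection angle theta of an arbitrary pixel P at
    distance d6 from the optical center Cp, computed from the
    calibration measurements (Equations 1-6).
    d2: segment C1-P1 on the plate, d3: segment P1'-P2' on the image,
    d4: segment Cp-P1' on the image, y: distance the plate was moved
    toward the viewpoint M."""
    d1 = d2 * d3 / d4            # Equation 1: segment P1-Q
    theta1 = math.atan(d1 / y)   # Equation 2: projection angle of Q
    d5 = d1 + d2                 # segment C1-Q
    z = d5 / math.tan(theta1)    # Equation 3: plate-to-viewpoint distance
    theta2 = math.atan(d2 / z)   # Equation 4: projection angle of P1'
    x = d4 / math.tan(theta2)    # Equation 5: viewpoint-to-image distance
    return math.atan(d6 / x)     # Equation 6: angle of pixel P
```

For a consistent synthetic setup (z = 1000, x = 50, y = 200, d2 = 300, hence d4 = 15 and d3 = 3.75), a pixel at d6 = 50 = x yields θ = π/4, as expected for a ray at 45 degrees from the optical axis.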
 In this way, pixel calibration of the image is completed by setting the optical center position Cp, correcting distortion, and calculating the viewpoint projection angle θ of an arbitrary image pixel.
 The image pixel calibration method has been described above based on an embodiment, but the image pixel calibration method is not limited to this embodiment.
 For example, in the present embodiment a grid is marked on the calibration plate 70, but the calibration plate may instead be marked with marks whose spacing is known.
 Also, in the present embodiment, an example using a thread and a weight at the time of calibration has been described, but instead of the thread and the weight, a light beam from a lamp light source or a laser beam passed through a cylindrical structure machined perpendicular to the calibration plate may be used.
 Next, the operation in which the three-dimensional relative coordinate measurement unit 20 performs relative coordinate measurement will be described.
 (Embodiment 1)
 In Embodiment 1, the operation in which the three-dimensional relative coordinate measurement unit 20 measures the relative coordinates between any one of the three reference points and the target point using two images will be described.
 FIG. 7 is a flowchart showing the procedure of relative coordinate measurement by the three-dimensional relative coordinate measuring apparatus according to Embodiment 1 of the present invention.
 First, as shown in FIG. 8, the image acquisition unit 100 acquires a first acquired image and a second acquired image (S110). Here, the first and second acquired images are the image data 60 captured by the camera (corresponding to the first and second imaging devices of the present invention) from the first viewpoint M1 and the second viewpoint M2, respectively, such that the three reference points A, B, and C, whose coordinates are known, and the target point W, whose coordinates are unknown, all appear together.
 In the present embodiment, for simplicity of explanation, the case in which the three reference points included in the first acquired image and in the second acquired image are the same is described as an example, but the three reference points included in the first acquired image and in the second acquired image may differ.
 Next, as shown in FIG. 9A, the two-dimensional coordinate extraction unit 110 extracts from the first acquired image, for example with the optical center position Cp1 of the first acquired image as the origin, the coordinates of the points A01, B01, and C01 (the three first reference two-dimensional coordinates), which are the reference points A, B, and C projected on the first acquired image. The two-dimensional coordinate extraction unit 110 also extracts from the first acquired image the coordinates of the point W01 (the first target two-dimensional coordinates), which is the target point W projected on the first acquired image (S120). Similarly, as shown in FIG. 9B, the two-dimensional coordinate extraction unit 110 extracts from the second acquired image, for example with the optical center position Cp2 of the second acquired image as the origin, the coordinates of the points A02, B02, and C02 (the three second reference two-dimensional coordinates), which are the reference points A, B, and C projected on the second acquired image, and the coordinates of the point W02 (the second target two-dimensional coordinates), which is the target point W projected on the second acquired image (S120).
 The viewpoint projection angle extraction unit 120 holds the viewpoint projection angles of arbitrary image pixels calculated in advance by the adjustment unit 200 and extracts, from among the held viewpoint projection angles, those corresponding to the three first and second reference two-dimensional coordinates and to the first and second target two-dimensional coordinates extracted by the two-dimensional coordinate extraction unit 110, as the three first and second reference viewpoint projection angles and the first and second target viewpoint projection angles (S130).
 Then, the tilt angle calculation unit 130a calculates the angles (the first and second tilt angles) between the first and second imaging surfaces, which are the imaging surfaces of the first and second acquired images, and the reference planes containing the reference points (the first and second reference planes) (S140).
 FIG. 10 is a flowchart showing the processing procedure performed by the tilt angle calculation unit according to the present invention. FIG. 11 is a diagram explaining the processing shown in the flowchart of FIG. 10 according to the present invention. The process of calculating the first tilt angle is described below, mainly with reference to FIGS. 10 and 11.
 First, the tilt angle calculation unit 130a sets an X1-Y1-Z1 coordinate system (the first three-dimensional coordinate system) with one of the three reference points A, B, and C whose relative coordinates are known, for example the point A, as the origin (S141). At this time, the tilt angle calculation unit 130a sets the X1 axis and the Y1 axis to be parallel to the first imaging surface captured from the first viewpoint M1, and sets the Z1 axis to be perpendicular to the first imaging surface captured from the first viewpoint M1. The X1-Y1 plane at this time is also called the first conversion plane.
 Next, the tilt angle calculation unit 130a sets the coordinates of first virtual points B1 and C1 equivalent to the three reference points A, B, and C whose relative coordinates are known (S142). Here, the point B1 corresponds to the point B, and the point C1 corresponds to the point C. For example, in the present embodiment, the coordinates of the point B1 are set to (B1x, 0, 0) and the coordinates of the point C1 are set to (0, C1y, 0).
 Then, the tilt angle calculation unit 130a rotates the points B1 and C1 separately, in fine increments, by an angle α1n about the X1 axis, an angle β1n about the Y1 axis, and an angle γ1n about the Z1 axis, and calculates a plurality of first conversion reference points B1n'(B1nx', B1ny', B1nz') and C1n'(C1nx', C1ny', C1nz') (S143). Here, n = 1, 2, 3, ....
 Next, the tilt angle calculation unit 130a uses the viewpoint projection angles extracted by the viewpoint projection angle extraction unit 120 to calculate a plurality of first conversion projection reference points B1n''(B1nx'', B1ny'', 0) and C1n''(C1nx'', C1ny'', 0), which are the points B1n' and C1n' projected onto the X1-Y1 plane (S144). Here, n = 1, 2, 3, ....
 Here, let ε1x be the viewpoint projection angle of the point B01 in the Z1-X1 plane, ε1y the viewpoint projection angle of the point B01 in the Z1-Y1 plane, φ1x the viewpoint projection angle of the point C01 in the Z1-X1 plane, and φ1y the viewpoint projection angle of the point C01 in the Z1-Y1 plane, as extracted by the viewpoint projection angle extraction unit 120. When the triangle A01B01C01 connecting the points A01, B01, and C01, which are the reference points A, B, and C on the first imaging surface, is similar to the triangle AB1n''C1n'' connecting the point A, the point B1n'', and the point C1n'', the angles ∠B1n'M1Cp1 and ∠B1n''M1Cp1 equal ε1x, as shown in FIG. 12.
 したがって、B1nx”は、式7にて算出することができる。 Therefore, B 1nx ″ can be calculated using Equation 7.
Figure JPOXMLDOC01-appb-M000007
 Similarly, B1ny″, C1nx″, and C1ny″ can be calculated using Equations 8 to 10.
Figure JPOXMLDOC01-appb-M000008
Figure JPOXMLDOC01-appb-M000009
Figure JPOXMLDOC01-appb-M000010
 Finally, the tilt angle calculation unit 130a determines the first transformed projection reference points B1″ and C1″ for which the triangle A01B01C01 connecting the points A01, B01, C01, which are the reference points A, B, C on the first imaging surface, and the triangle AB1n″C1n″ connecting the points A, B1n″, C1n″ are most nearly similar, thereby calculating the first tilt angle (α1, β1, γ1) (S145).
 As a specific similarity comparison method, there is a method of calculating the first tilt angle (α1, β1, γ1) such that, when the line segment A01B01 is scaled to the same length as the line segment AB1n″ and the points B01 and C01 are moved so that the point A01 coincides with the point A, the sum of the length of the line segment B01B1n″ and the length of the line segment C01C1n″ is minimized.
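The similarity cost just described can be sketched as follows (Python; a hypothetical 2D illustration on the X1-Y1 plane using only the scaling and translation the method specifies — the full procedure would evaluate this cost over all candidate angles (α1n, β1n, γ1n) and keep the minimum):

```python
import math

def _sub(p, q): return (p[0] - q[0], p[1] - q[1])
def _add(p, q): return (p[0] + q[0], p[1] + q[1])
def _scale(p, s): return (p[0] * s, p[1] * s)
def _norm(p): return math.hypot(p[0], p[1])

def similarity_cost(A01, B01, C01, A, Bpp, Cpp):
    """Scale the image triangle so |A01 B01| == |A Bpp|, translate A01 onto A,
    then return the sum of the residual distances of B and C
    (smaller means the two triangles are more nearly similar)."""
    s = _norm(_sub(Bpp, A)) / _norm(_sub(B01, A01))
    b = _add(A, _scale(_sub(B01, A01), s))   # image of B01 after scale + shift
    c = _add(A, _scale(_sub(C01, A01), s))   # image of C01 after scale + shift
    return _norm(_sub(b, Bpp)) + _norm(_sub(c, Cpp))
```

A cost of zero means the candidate projected triangle reproduces the imaged triangle exactly.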
 However, the similarity comparison method is not limited to the above. For example, the first tilt angle (α1, β1, γ1) may be calculated such that, when the line segment A01B01 is scaled to the same length as the line segment AB1n″ and the point C01 is moved so that the line segment A01B01 coincides with the line segment AB1n″, the deviation between the point C01 and the point C1n″ is minimized.
 Similarly, the tilt angle calculation unit 130a can calculate the second tilt angle (α2, β2, γ2), which is the angle formed between the second imaging surface and the plane including the reference points.
 In this way, the first tilt angle and the second tilt angle are calculated.
 The calculation of the tilt angle has been described above based on the embodiment; however, the calculation of the tilt angle is not limited to this embodiment.
 For example, in the present embodiment, the tilt angle calculation unit 130a sets the X1-Y1-Z1 coordinate system with one of the three reference points A, B, C whose coordinates are known as the origin; however, the origin may be selected from points other than the three reference points whose coordinates are known.
 Alternatively, for example, given the reference points A, B, and C, the point A may be placed at a certain position on the viewpoint projection angle line of the point A, and the point B may be slid along the viewpoint projection angle line of the point B at a specified pitch over a certain range. For each position of the point B, the point C is determined by similarity from the line segment AB, and the point C is known to lie on a circle about the line segment AB as an axis. The configuration in which the point C comes closest to its viewpoint projection angle line is stored while the point B is slid in sequence, and the configuration that comes closest may be taken as the tilt angle sought.
 Alternatively, for example, when there are reference points A, B, and C and the lengths between the respective points are known, the point A may be placed at a certain position on the viewpoint projection angle line of the point A, and the point B at that time may be placed on or near the viewpoint projection angle line of the point B. In that case, the point C is known to lie on a circle about the line segment AB as an axis, so the configuration in which the point C comes closest to its viewpoint projection angle line is stored, and the configuration in which each point comes closest to its viewpoint projection angle line when the above calculation is performed over a certain range of positions of the point A may be taken as the tilt angle sought.
 The processing procedure of relative coordinate measurement by the three-dimensional relative coordinate measuring apparatus according to Embodiment 1 of the present invention will now be described again with reference to FIG. 7.
 In FIG. 7, after the calculation of the tilt angles (S140) is completed, the vector calculation unit 130b calculates the first vector and vector 2 (S150).
 First, let W1(W1x, W1y, 0) be the point obtained by converting, by similarity, the first target two-dimensional coordinates of the point W01, which is the target point W on the first imaging surface, onto the first conversion plane in which the triangle with vertices A, B1″, C1″ lies.
 Next, when a point on the line segment connecting the point W1 and the first viewpoint M1 is denoted W1′(W1x′, W1y′, W1z′), the first vector can be calculated as (W1x − W1z′ tan η1x, W1y − W1z′ tan η1y, W1z′), using the viewpoint projection angle η1x of the point W01 in the Z1-X1 plane and the viewpoint projection angle η1y of the point W01 in the Z1-Y1 plane, both extracted by the viewpoint projection angle extraction unit 120, together with W1z′.
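The expression for a point on the first vector can be sketched directly from this description (Python; the function names are illustrative):

```python
import math

def first_vector_point(W1, eta_x, eta_y, z):
    """A point on the line through W1 = (W1x, W1y, 0) and the first viewpoint M1,
    parameterized by its Z1 coordinate z, per the expression in the text:
    (W1x - z*tan(eta_x), W1y - z*tan(eta_y), z)."""
    return (W1[0] - z * math.tan(eta_x),
            W1[1] - z * math.tan(eta_y),
            z)

def first_vector_direction(W1, eta_x, eta_y):
    """Direction of the first vector, obtained from two points on the line
    (W1 itself at z = 0, and the point at z = 1)."""
    p = first_vector_point(W1, eta_x, eta_y, 1.0)
    return (p[0] - W1[0], p[1] - W1[1], p[2])
```

Note that the direction works out to (−tan η1x, −tan η1y, 1) regardless of W1, which is consistent with the angles being measured against the optical axis.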
 Similarly, the vector calculation unit 130b can calculate vector 2.
 Then, the conversion vector calculation unit 130c extracts a line segment connecting any two of the points A, B1″, C1″, for example the line segment AB1″, and the line segment AB2″ corresponding to the line segment AB1″ from among the points A, B2″, C2″. The conversion vector calculation unit 130c then calculates the second vector by moving vector 2, using the first tilt angle (α1, β1, γ1) and the second tilt angle (α2, β2, γ2) obtained by the tilt angle calculation unit 130a, so that the line segment AB2″ and the line segment AB1″ coincide (S160).
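The patent performs this alignment using the first and second tilt angles; as a generic illustration only (an assumption, not the patent's exact procedure), the rotation carrying the direction of AB2″ onto the direction of AB1″ can be built with Rodrigues' formula:

```python
import math

def _dot(a, b): return sum(x * y for x, y in zip(a, b))
def _cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def rotation_between(u, v):
    """3x3 rotation matrix taking unit vector u onto unit vector v,
    via R = I + [k]x + [k]x^2 / (1 + c), where k = u x v and c = u . v."""
    k = _cross(u, v)
    c = _dot(u, v)
    if abs(1.0 + c) < 1e-12:   # antiparallel: 180-degree turn, axis is ambiguous
        raise ValueError("u and v are opposite; choose a rotation axis explicitly")
    kx = [[0.0, -k[2], k[1]],
          [k[2], 0.0, -k[0]],
          [-k[1], k[0], 0.0]]
    return [[(1.0 if i == j else 0.0) + kx[i][j] +
             sum(kx[i][m] * kx[m][j] for m in range(3)) / (1.0 + c)
             for j in range(3)] for i in range(3)]

def apply(R, p):
    """Apply a 3x3 matrix R to a point/vector p."""
    return tuple(sum(R[i][j] * p[j] for j in range(3)) for i in range(3))
```

Rotating vector 2 by this matrix and translating so that the two images of A coincide superimposes segment AB2″ onto AB1″ (up to the roll about the segment, which the tilt angles pin down in the patent's procedure).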
 Finally, the relative coordinate measurement unit 130d calculates the coordinates of the closest-approach point between the first vector and the second vector, thereby calculating the relative coordinates between the origin A, which is one of the reference points, and the target point W (S170).
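The closest-approach computation in S170 can be sketched with the standard skew-line formula (Python; taking the midpoint of the two nearest points as the reported coordinate is an assumption — the text does not specify which point on the common perpendicular is used):

```python
def closest_points(p1, d1, p2, d2):
    """Closest points on the two lines p1 + t*d1 and p2 + s*d2, plus their
    midpoint, used here as the 'closest-approach point'."""
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    w0 = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    den = a * c - b * b
    if abs(den) < 1e-12:          # parallel lines: fix t = 0 on line 1
        t, s = 0.0, e / c
    else:
        t = (b * e - c * d) / den
        s = (a * e - b * d) / den
    q1 = tuple(p1[i] + t * d1[i] for i in range(3))
    q2 = tuple(p2[i] + s * d2[i] for i in range(3))
    mid = tuple((q1[i] + q2[i]) / 2.0 for i in range(3))
    return q1, q2, mid
```

Due to measurement noise the two rays rarely intersect exactly, which is why the closest pair (rather than an intersection) is computed.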
 The vector calculation unit 130b, the conversion vector calculation unit 130c, and the relative coordinate measurement unit 130d described above correspond to the measurement unit of the present invention.
 In this way, the relative coordinates between any one of the reference points and the target point are calculated.
 (Embodiment 2)
 In Embodiment 2, an operation will be described in which the three-dimensional relative coordinate measurement unit 20 uses a single image to measure the relative coordinates between any one of the three reference points and the shooting point (corresponding to the first viewpoint of the present invention).
 FIG. 13 is a flowchart showing the processing procedure of relative coordinate measurement by the three-dimensional relative coordinate measuring apparatus according to Embodiment 2 of the present invention.
 First, the image acquisition unit 100 acquires a first acquired image, as shown in FIG. 14 (S210). Here, the first acquired image is image data 60 captured from the first viewpoint M1 by a camera (corresponding to the first imaging device of the present invention) so that the three reference points A, B, C, whose coordinates are known, all appear together.
 Next, as shown in FIG. 15, the two-dimensional coordinate extraction unit 110 extracts from the first acquired image the coordinates of the points A01, B01, C01 (the three first reference two-dimensional coordinates), which are the reference points A, B, C projected onto the first acquired image, using, for example, the optical center position Cp1 in the first acquired image as the origin (S220).
 The viewpoint projection angle extraction unit 120 holds the viewpoint projection angles of arbitrary image pixels calculated in advance by the adjustment unit 200, and extracts, from among the held viewpoint projection angles, the viewpoint projection angles of the image pixels corresponding to the three first reference two-dimensional coordinates extracted by the two-dimensional coordinate extraction unit 110, as three first reference viewpoint projection angles (S230).
 Then, the tilt angle calculation unit 130a calculates the angle (first tilt angle) formed between the first imaging surface, which is the imaging surface of the first acquired image, and the first reference plane including the reference points (S240).
 The processing procedure performed by the tilt angle calculation unit 130a when calculating the first tilt angle is the same as S141 to S145 in FIG. 10, and a description thereof is therefore omitted.
 The vector calculation unit 130b then calculates two vectors (S250).
 Here, the two vectors are, for example, the straight line B1′B1″ connecting the point B1″ determined in (S145) and the point B1′ corresponding to the point B1n′ at that time, and the straight line C1′C1″ connecting the point C1″ determined in (S145) and the point C1′ corresponding to the point C1n′ at that time.
 Finally, the relative coordinate measurement unit 130d calculates the coordinates of the closest-approach point of the two vectors, thereby calculating the relative coordinates between the point A, which is one of the reference points, and the shooting point (S260).
 Alternatively, for example, when there are reference points A, B, and C and the lengths between the respective points are known, the point A may be placed at a certain position on the viewpoint projection angle line of the point A, and the point B at that time may be placed on or near the viewpoint projection angle line of the point B. In that case, the point C is known to lie on a circle about the line segment AB as an axis, so the configuration in which the point C comes closest to its viewpoint projection angle line is stored, and the configuration in which each point comes closest to its viewpoint projection angle line when the above calculation is performed over a certain range of positions of the point A may be taken as the tilt angle sought. In this case, for the triangle ABM1 connecting the points A, B, and M1, the length of the line segment AB and the angles ∠M1AB and ∠M1BA are known, so the triangle ABM1 can be used to calculate the relative coordinates between any one of the reference points and the shooting point. Furthermore, since ∠AM1B, the angle under which the line segment AB is seen from the viewpoint M1, is known from the viewpoint projection angles, the triangle ABM1 can likewise be used to calculate the relative coordinates between any one of the reference points and the shooting point.
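Solving the triangle ABM1 from the known side |AB| and the known angles at A and B is a direct application of the law of sines; a minimal sketch (Python; the function name is illustrative):

```python
import math

def distance_to_viewpoint(ab, angle_A, angle_B):
    """Solve triangle A-B-M1: given the side length |AB| and the interior
    angles at A (= angle M1AB) and B (= angle M1BA), in radians, return
    (|A M1|, |B M1|) by the law of sines. The angle at M1 is pi - A - B."""
    angle_M = math.pi - angle_A - angle_B
    am = ab * math.sin(angle_B) / math.sin(angle_M)
    bm = ab * math.sin(angle_A) / math.sin(angle_M)
    return am, bm
```

With |AM1| and |BM1| in hand, the position of the shooting point M1 relative to A follows from the known directions of the viewpoint projection angle lines.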
 In this way, the relative coordinates between any one of the reference points and the shooting point are calculated.
 The three-dimensional relative coordinate measuring apparatus according to the embodiments of the present invention has been described above, but the present invention is not limited to these embodiments.
 In the present embodiment, the viewpoint projection angle extraction unit 120 holds the viewpoint projection angles of arbitrary image pixels calculated in advance by the adjustment unit 200, and extracts, from among the held viewpoint projection angles, the reference and target viewpoint projection angles corresponding to the two-dimensional coordinates extracted by the two-dimensional coordinate extraction unit 110. However, for example, the viewpoint projection angle extraction unit 120 may instead hold Equation 6 and use it, together with the two-dimensional coordinates extracted by the two-dimensional coordinate extraction unit 110 and a distortion correction expression, to calculate the reference and target viewpoint projection angles. In this case, the adjustment unit 200 must obtain in advance, using Equation 5, the distance x between the viewpoint M and the optical center position Cp.
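Equations 5 and 6 are not reproduced in this excerpt. As an illustration only, under a simple pinhole model consistent with the description (x is the viewpoint-to-optical-center distance obtained by the adjustment unit 200, distortion is assumed already corrected, and the linear pixel-pitch model is an assumption), a per-pixel viewpoint projection angle could take the following form:

```python
import math

def pixel_viewpoint_angle(px, py, cx, cy, pitch, x):
    """Per-axis viewpoint projection angles for pixel (px, py).
    (cx, cy): optical center in pixel coordinates; pitch: physical size of one
    pixel; x: distance from the viewpoint M to the optical center Cp.
    Assumes a pinhole camera with distortion already corrected."""
    theta_x = math.atan((px - cx) * pitch / x)   # angle in the Z-X plane
    theta_y = math.atan((py - cy) * pitch / x)   # angle in the Z-Y plane
    return theta_x, theta_y
```

A pixel at the optical center maps to a zero angle on both axes, and the angle grows sublinearly (through the arctangent) toward the image edges.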
 In the present embodiment, the two-dimensional coordinate extraction unit 110 extracts the two-dimensional coordinates of the three reference points and the target point in the first and second acquired images; however, the image acquisition unit 100 may extract these two-dimensional coordinates instead. In that case, the two-dimensional coordinate extraction unit 110 is unnecessary.
 In the present embodiment, the tilt angle calculation unit 130a calculates the first tilt angle and the second tilt angle by setting two different three-dimensional coordinate systems; however, the first tilt angle and the second tilt angle may be calculated by setting only one three-dimensional coordinate system. In this case, the vector calculation unit 130b uses the first target viewpoint projection angle and the first tilt angle to calculate the first vector passing through the point M1 and the point W01 in the single three-dimensional coordinate system, and uses the second target viewpoint projection angle and the second tilt angle to calculate the second vector passing through the point M2 and the point W02 in the same coordinate system. The conversion vector calculation unit 130c is therefore unnecessary in this case.
 Even when the tilt angle calculation unit 130a calculates the first tilt angle and the second tilt angle by setting two different three-dimensional coordinate systems, the computation may be performed by the vector calculation unit 130b instead of the conversion vector calculation unit 130c.
 In the present embodiment, images are captured from two viewpoints; however, images may be captured from three or more viewpoints, and the number of reference points whose relative coordinates are known may be increased to four or more.
 When four or more reference points whose relative coordinates are known are used, each of the two images need only capture at least three of these reference points, and the combination of at least three reference points captured in each image may differ. It is also preferable that the four or more reference points be arranged three-dimensionally rather than in a single plane; for example, four or more of the vertices of a triangular pyramid or a quadrangular pyramid can be used as reference points. This enables stable angle detection over a wide range. For example, when only the three reference points described above are used, errors tend to occur if there is no angle between the plane including the three reference points and the viewpoint. In contrast, by photographing different faces of the triangular pyramid from the two viewpoints, for example, angle detection can be performed stably even in such a case.
 In this case, the tilt angles are angles with respect to different reference planes (the first and second reference planes). The relative coordinate measurement unit 130d calculates the relative coordinates of the target point using the relative positional relationship between the three first reference points and the three second reference points; specifically, it uses this relative positional relationship to calculate the closest-approach point of the two vectors. This relative positional relationship may also be computed during tilt angle calculation. In other words, the calculated tilt angles may be referred to one of the reference planes, or may be expressed as angles unified in a single three-dimensional coordinate system.
 The first tilt angle and the second tilt angle may also be calculated by nonlinear approximation using a larger number of reference points whose relative coordinates are known.
 Also, for example, when the correspondences between the three reference points A, B, C whose relative coordinates are known and the points A01, B01, C01, which are the reference points A, B, C on the first imaging surface (that is, between A and A01, B and B01, and C and C01), cannot be established, the correspondences can be identified by using special markers or lines.
 In addition, verification reference points whose relative coordinates are known may be used to prevent inversion (flipping) of the reference points due to rotation and the calculation of an incorrect solution (tilt angle).
 Constraints (for example, rotation conditions) may also be imposed to avoid calculating an incorrect solution.
 Part or all of the functions of the three-dimensional relative coordinate measuring apparatus according to the embodiments of the present invention may be realized by a processor such as a CPU executing a program.
 Furthermore, the connection relationships between the constituent elements are exemplified in order to describe the present invention concretely, and the connection relationships that realize the functions of the present invention are not limited to these.
 The division of functional blocks in the block diagrams is an example; a plurality of functional blocks may be realized as one functional block, one functional block may be divided into a plurality, and some functions may be transferred to other functional blocks. The functions of a plurality of functional blocks having similar functions may also be processed by the three-dimensional measuring apparatus in parallel or by time division.
 Furthermore, the present invention may be the above program, or a non-transitory computer-readable recording medium on which the above program is recorded. Needless to say, the program can be distributed via a transmission medium such as the Internet.
 The configuration of the three-dimensional measuring apparatus described above is exemplified in order to describe the present invention concretely, and the three-dimensional measuring apparatus according to the present invention need not necessarily include all of the above components. In other words, the three-dimensional measuring apparatus according to the present invention need only include the minimum configuration that can realize the effects of the present invention.
 Similarly, the three-dimensional measurement method using the above three-dimensional measuring apparatus is exemplified in order to describe the present invention concretely, and the three-dimensional measurement method according to the present invention need not necessarily include all of the above steps. In other words, it need only include the minimum steps that can realize the effects of the present invention.
 The order in which the above steps are executed is exemplified in order to describe the present invention concretely, and an order other than the above may be used.
 Some of the above steps may also be executed simultaneously (in parallel) with other steps.
 Furthermore, various modifications obtained by applying changes conceivable by those skilled in the art to the present embodiments are also included in the present invention, as long as they do not depart from the gist of the present invention.
 The present invention can be used for three-dimensional relative coordinate measurement, for example for automatically measuring the distance between two points in environments where relative coordinates cannot be measured mechanically or electrically.
    10  Camera
    20  Three-dimensional relative coordinate measurement unit
    30  Cable
    40  Reference structure
    50  Target structure
    60, 61  Image data
    70  Calibration board
    71  Thread
    72  Weight
    80  Image pixel calibration unit
    90  Three-dimensional relative coordinate measuring apparatus
    100  Image acquisition unit
    110  Two-dimensional coordinate extraction unit
    120  Viewpoint projection angle extraction unit
    130  Calculation unit
    130a  Tilt angle calculation unit
    130b  Vector calculation unit
    130c  Conversion vector calculation unit
    130d  Relative coordinate measurement unit
    140  Display unit
    200  Adjustment unit

Claims (10)

  1.  A three-dimensional relative coordinate measuring apparatus that measures relative coordinates between any one of three first reference points whose relative coordinates in a three-dimensional coordinate system are known and one target point whose coordinates in the three-dimensional coordinate system are unknown, the apparatus comprising:
     an image acquisition unit that acquires a first acquired image in which the three first reference points are captured by a first imaging device from a first viewpoint; and
     a viewpoint projection angle extraction unit that holds information on a first pixel viewpoint projection angle for each pixel and uses the information to acquire three first reference viewpoint projection angles, which are the first pixel viewpoint projection angles corresponding to the three first reference points projected onto the first acquired image,
     wherein the first pixel viewpoint projection angle is an angle formed between a line segment connecting the first viewpoint and each point of the three-dimensional coordinate system projected onto first pixel two-dimensional coordinates, which are the coordinates of each pixel in the first acquired image, and the optical axis of the first imaging device,
     the apparatus further comprising:
     a tilt angle calculation unit that calculates, using the three first reference viewpoint projection angles and the relative coordinates of the three first reference points, a first tilt angle formed between a first imaging surface, which is the imaging surface of the first acquired image, and a first reference plane including the three first reference points; and
     a measurement unit that measures, using the first tilt angle, relative coordinates between any one of the three first reference points and the target point.
  2.  The three-dimensional relative coordinate measuring apparatus according to claim 1,
     wherein the three first reference points are any three of three or more reference points whose relative coordinates in the three-dimensional coordinate system are known,
     the image acquisition unit acquires the first acquired image in which the three first reference points and the target point are captured by the first imaging device from the first viewpoint, and acquires a second acquired image in which three second reference points, which are any three of the three or more reference points, and the target point are captured from a second viewpoint by a second imaging device that is the same as or different from the first imaging device,
     the viewpoint projection angle extraction unit further holds information on a second pixel viewpoint projection angle for each pixel and uses the information on the second pixel viewpoint projection angle to acquire three second reference viewpoint projection angles, which are the second pixel viewpoint projection angles corresponding to the three second reference points projected onto the second acquired image,
     the second pixel viewpoint projection angle is an angle formed between a line segment connecting the second viewpoint and each point of the three-dimensional coordinate system projected onto second pixel two-dimensional coordinates, which are the coordinates of each pixel in the second acquired image, and the optical axis of the second imaging device,
     the tilt angle calculation unit further calculates, using the three second reference viewpoint projection angles and the relative coordinates of the three second reference points, a second tilt angle formed between a second imaging surface, which is the imaging surface of the second acquired image, and a second reference plane including the three second reference points, and
     the measurement unit measures, using the relative positional relationship between the three first reference points and the three second reference points, the first tilt angle, and the second tilt angle, relative coordinates between any one of the three first reference points and the three second reference points and the target point.
  3.  The three-dimensional relative coordinate measuring apparatus according to claim 2, wherein the inclination angle calculation unit determines, using the three first and second reference viewpoint projection angles and the three first and second reference points, the first reference plane containing the three first reference points, each of which lies on a straight line connecting the first viewpoint with the corresponding first reference point projected onto the first acquired image, and the second reference plane containing the three second reference points, each of which lies on a straight line connecting the second viewpoint with the corresponding second reference point projected onto the second acquired image.
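Determining a reference plane from three known points, and the inclination angle it forms with an imaging surface, comes down to a cross product and the angle between unit normals. A sketch under the assumption that both planes are characterized by their normals:

```python
import math

def plane_normal(p1, p2, p3):
    """Unit normal of the plane through three non-collinear 3-D points."""
    ux, uy, uz = (p2[i] - p1[i] for i in range(3))
    vx, vy, vz = (p3[i] - p1[i] for i in range(3))
    # Cross product of the two in-plane edge vectors.
    nx, ny, nz = uy*vz - uz*vy, uz*vx - ux*vz, ux*vy - uy*vx
    length = math.sqrt(nx*nx + ny*ny + nz*nz)
    return (nx/length, ny/length, nz/length)

def inclination_angle(reference_normal, imaging_normal):
    """Dihedral angle between the reference plane and the imaging surface,
    taken as the acute angle between their unit normals."""
    dot = abs(sum(a*b for a, b in zip(reference_normal, imaging_normal)))
    return math.acos(min(1.0, dot))

# Example: reference plane z = x (tilted 45 deg) vs. imaging surface z = 0.
n_ref = plane_normal((0, 0, 0), (0, 1, 0), (1, 0, 1))
angle = inclination_angle(n_ref, (0.0, 0.0, 1.0))
```

The claims obtain the plane indirectly, from viewpoint projection angles and known relative coordinates; this sketch only shows the final plane/angle geometry once three reference points are fixed.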
  4.  The three-dimensional relative coordinate measuring apparatus according to claim 2 or 3, wherein:
    the viewpoint projection angle extraction unit further obtains a first target viewpoint projection angle, which is the first pixel viewpoint projection angle corresponding to the target point projected onto the first acquired image, and a second target viewpoint projection angle, which is the second pixel viewpoint projection angle corresponding to the target point projected onto the second acquired image; and
    the measurement unit calculates, using the first target viewpoint projection angle and the first inclination angle, a first vector in the three-dimensional coordinate system passing through the first viewpoint and the target point projected onto the first acquired image; calculates, using the second target viewpoint projection angle and the second inclination angle, a second vector in the three-dimensional coordinate system passing through the second viewpoint and the target point projected onto the second acquired image; and, using the relative positional relationship, calculates the coordinates of the point of closest approach of the first vector and the second vector in the three-dimensional coordinate system as the relative coordinates between the target point and any one of the three first reference points and the three second reference points.
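The point of closest approach of the two back-projected vectors is the standard skew-line midpoint used in stereo triangulation. A self-contained sketch, assuming each vector is represented as a ray with an origin (the viewpoint) and a direction:

```python
def closest_approach(o1, d1, o2, d2):
    """Midpoint of the shortest segment between lines o1 + s*d1 and o2 + t*d2.

    With ideal data the two rays intersect exactly at the target point; with
    noisy projection angles they are skew, and the midpoint of their common
    perpendicular is the natural estimate of the target's coordinates.
    """
    def dot(a, b): return sum(x*y for x, y in zip(a, b))
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    w = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a*c - b*b                  # zero only for parallel rays
    s = (b*e - c*d) / denom            # parameter of closest point on ray 1
    t = (a*e - b*d) / denom            # parameter of closest point on ray 2
    p1 = tuple(o + s*u for o, u in zip(o1, d1))
    p2 = tuple(o + t*u for o, u in zip(o2, d2))
    return tuple((x + y) / 2 for x, y in zip(p1, p2))
```

For example, rays along the x-axis from the origin and along the y-axis from (0, 0, 1) never meet; their closest-approach midpoint is (0, 0, 0.5).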
  5.  The three-dimensional relative coordinate measuring apparatus according to any one of claims 2 to 4, wherein at least one of the three first reference points differs from the three second reference points.
  6.  The three-dimensional relative coordinate measuring apparatus according to claim 1, wherein the target point is the first viewpoint.
  7.  The three-dimensional relative coordinate measuring apparatus according to claim 6, wherein the inclination angle calculation unit determines, using the three first reference viewpoint projection angles and the three first reference points, the first reference plane containing the three first reference points, each of which lies on a straight line connecting the first viewpoint with the corresponding first reference point projected onto the first acquired image.
  8.  The three-dimensional relative coordinate measuring apparatus according to claim 6 or 7, wherein the measurement unit calculates, using two of the three first reference viewpoint projection angles and the first inclination angle, two vectors in the first reference plane, each passing through one of the two first reference points corresponding to the two angles, and calculates the coordinates of the point of closest approach of the two vectors as the relative coordinates between the target point and any one of the three first reference points.
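Because both vectors of this claim lie in the first reference plane, their point of closest approach reduces to a 2-D line intersection within that plane. A sketch in plane coordinates; how the bearing angles are derived from the reference viewpoint projection angles and the first inclination angle is taken as given here, so `theta1`/`theta2` are illustrative inputs:

```python
import math

def locate_viewpoint_2d(p1, theta1, p2, theta2):
    """Intersect two bearings drawn from known reference points p1 and p2
    (2-D coordinates within the reference plane).

    theta1/theta2 are the directions, measured from the plane's x-axis, of
    the rays from each reference point toward the viewpoint being located.
    Non-parallel lines in a plane meet in exactly one point, so the
    'closest approach' is an exact intersection.
    """
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    denom = d1[0]*d2[1] - d1[1]*d2[0]      # 2-D cross product; 0 if parallel
    wx, wy = p2[0] - p1[0], p2[1] - p1[1]
    s = (wx*d2[1] - wy*d2[0]) / denom      # distance along the first bearing
    return (p1[0] + s*d1[0], p1[1] + s*d1[1])
```

For instance, bearings of 45° from (0, 0) and 135° from (2, 0) meet at (1, 1).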
  9.  A three-dimensional relative coordinate measurement method for measuring the relative coordinates between any one of three first reference points whose relative coordinates in a three-dimensional coordinate system are known and one target point whose coordinates in the three-dimensional coordinate system are unknown, the method comprising:
    an image acquisition step of acquiring a first acquired image in which the three first reference points are captured by a first imaging device from a first viewpoint;
    a viewpoint projection angle extraction step of holding information on a first pixel viewpoint projection angle for each pixel and obtaining, using the information, three first reference viewpoint projection angles, which are the first pixel viewpoint projection angles corresponding to the three first reference points projected onto the first acquired image, the first pixel viewpoint projection angle being the angle formed between the optical axis of the first imaging device and a line segment connecting the first viewpoint to each point of the three-dimensional coordinate system that is projected onto first pixel two-dimensional coordinates, which are the coordinates of each pixel in the first acquired image;
    an inclination angle calculation step of calculating, using the three first reference viewpoint projection angles and the relative coordinates of the three first reference points, a first inclination angle formed between a first imaging surface, which is the imaging surface of the first acquired image, and a first reference plane containing the three first reference points; and
    a measurement step of measuring, using the first inclination angle, the relative coordinates between the target point and any one of the three first reference points.
  10.  A program for causing a computer to execute the steps included in the three-dimensional relative coordinate measurement method according to claim 9.
PCT/JP2011/003774 2011-07-01 2011-07-01 Three-dimensional relative coordinate measuring device and method WO2013005244A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2011541976A JP5070435B1 (en) 2011-07-01 2011-07-01 Three-dimensional relative coordinate measuring apparatus and method
PCT/JP2011/003774 WO2013005244A1 (en) 2011-07-01 2011-07-01 Three-dimensional relative coordinate measuring device and method
JP2013522375A JP5629874B2 (en) 2011-07-01 2011-10-07 3D coordinate measuring apparatus and 3D coordinate measuring method
PCT/JP2011/005654 WO2013005265A1 (en) 2011-07-01 2011-10-07 Three-dimensional coordinate measuring device and three-dimensional coordinate measuring method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2011/003774 WO2013005244A1 (en) 2011-07-01 2011-07-01 Three-dimensional relative coordinate measuring device and method

Publications (1)

Publication Number Publication Date
WO2013005244A1 true WO2013005244A1 (en) 2013-01-10

Family

ID=47277831

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/JP2011/003774 WO2013005244A1 (en) 2011-07-01 2011-07-01 Three-dimensional relative coordinate measuring device and method
PCT/JP2011/005654 WO2013005265A1 (en) 2011-07-01 2011-10-07 Three-dimensional coordinate measuring device and three-dimensional coordinate measuring method

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/005654 WO2013005265A1 (en) 2011-07-01 2011-10-07 Three-dimensional coordinate measuring device and three-dimensional coordinate measuring method

Country Status (2)

Country Link
JP (1) JP5070435B1 (en)
WO (2) WO2013005244A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104634246A (en) * 2015-02-03 2015-05-20 李安澜 Floating type stereo visual measuring system and measuring method for coordinates of object space
CN108195345A (en) * 2017-12-20 2018-06-22 合肥英睿系统技术有限公司 A kind of distance measuring method and system based on electronic imager
CN110108203A (en) * 2019-04-11 2019-08-09 东莞中子科学中心 A kind of silk thread location measurement method and system based on photogrammetric technology
CN112325767A (en) * 2020-10-16 2021-02-05 华中科技大学鄂州工业技术研究院 Spatial plane size measurement method integrating machine vision and flight time measurement
CN112991742A (en) * 2021-04-21 2021-06-18 四川见山科技有限责任公司 Visual simulation method and system for real-time traffic data
CN113884081A (en) * 2016-11-01 2022-01-04 北京墨土科技有限公司 Method and equipment for measuring three-dimensional coordinates of positioning point

Families Citing this family (6)

Publication number Priority date Publication date Assignee Title
CN103424104B (en) * 2013-09-04 2015-11-18 中测新图(北京)遥感技术有限责任公司 A kind of close shot large format digital Photogrammetric System and method
CN104748680B (en) * 2015-03-19 2018-09-14 酷派软件技术(深圳)有限公司 A kind of dimension measurement method and device based on camera
CN106441243A (en) * 2016-09-22 2017-02-22 云南电网有限责任公司电力科学研究院 Method and device for measuring ground clearance
JP6950273B2 (en) * 2017-05-17 2021-10-13 日本電気株式会社 Flying object position detection device, flying object position detection system, flying object position detection method and program
CN110595433A (en) * 2019-08-16 2019-12-20 太原理工大学 Binocular vision-based transmission tower inclination measurement method
CN115297780A (en) * 2020-03-20 2022-11-04 皇家飞利浦有限公司 Three-dimensional measurement grid tool for X-ray images

Citations (5)

Publication number Priority date Publication date Assignee Title
JPH1139506A (en) * 1997-07-16 1999-02-12 Atr Chino Eizo Tsushin Kenkyusho:Kk Optional view point image generator
JP2000121362A (en) * 1998-10-20 2000-04-28 Asahi Optical Co Ltd Target measuring instrument for photographic survey
JP2002090118A (en) * 2000-09-19 2002-03-27 Olympus Optical Co Ltd Three-dimensional position and attitude sensing device
JP2009248214A (en) * 2008-04-03 2009-10-29 Kanto Auto Works Ltd Image processing device and robot control system
JP2010025759A (en) * 2008-07-18 2010-02-04 Fuji Xerox Co Ltd Position measuring system

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
JPH0680402B2 (en) * 1985-05-28 1994-10-12 富士通株式会社 Position measuring device
JPH0493705A (en) * 1990-08-09 1992-03-26 Topcon Corp Apparatus and method for measuring three-dimensional position
JPH07218251A (en) * 1994-02-04 1995-08-18 Matsushita Electric Ind Co Ltd Method and device for measuring stereo image
JP3777067B2 (en) * 1999-07-07 2006-05-24 ペンタックス株式会社 Photogrammetry image processing apparatus, photogrammetry image processing method, and storage medium storing photogrammetry image processing program
JP2004271292A (en) * 2003-03-07 2004-09-30 Meidensha Corp Calibrator and stereo camera position/attitude calibration device
JP4095491B2 (en) * 2003-05-19 2008-06-04 本田技研工業株式会社 Distance measuring device, distance measuring method, and distance measuring program


Cited By (8)

Publication number Priority date Publication date Assignee Title
CN104634246A (en) * 2015-02-03 2015-05-20 李安澜 Floating type stereo visual measuring system and measuring method for coordinates of object space
CN113884081A (en) * 2016-11-01 2022-01-04 北京墨土科技有限公司 Method and equipment for measuring three-dimensional coordinates of positioning point
CN113884081B (en) * 2016-11-01 2024-02-27 北京墨土科技有限公司 Method and equipment for measuring three-dimensional coordinates of positioning point
CN108195345A (en) * 2017-12-20 2018-06-22 合肥英睿系统技术有限公司 A kind of distance measuring method and system based on electronic imager
CN110108203A (en) * 2019-04-11 2019-08-09 东莞中子科学中心 A kind of silk thread location measurement method and system based on photogrammetric technology
CN112325767A (en) * 2020-10-16 2021-02-05 华中科技大学鄂州工业技术研究院 Spatial plane size measurement method integrating machine vision and flight time measurement
CN112325767B (en) * 2020-10-16 2022-07-26 华中科技大学鄂州工业技术研究院 Spatial plane dimension measurement method integrating machine vision and flight time measurement
CN112991742A (en) * 2021-04-21 2021-06-18 四川见山科技有限责任公司 Visual simulation method and system for real-time traffic data

Also Published As

Publication number Publication date
JP5070435B1 (en) 2012-11-14
JPWO2013005244A1 (en) 2015-02-23
WO2013005265A1 (en) 2013-01-10

Similar Documents

Publication Publication Date Title
JP5070435B1 (en) Three-dimensional relative coordinate measuring apparatus and method
JP6585006B2 (en) Imaging device and vehicle
TWI555379B (en) An image calibrating, composing and depth rebuilding method of a panoramic fish-eye camera and a system thereof
CN110809786B (en) Calibration device, calibration chart, chart pattern generation device, and calibration method
CN106548489B (en) A kind of method for registering, the three-dimensional image acquisition apparatus of depth image and color image
US10621753B2 (en) Extrinsic calibration of camera systems
JP3624353B2 (en) Three-dimensional shape measuring method and apparatus
WO2018068719A1 (en) Image stitching method and apparatus
JP2013539147A5 (en)
CN107808398B (en) Camera parameter calculation device, calculation method, program, and recording medium
JP2015203652A (en) Information processing unit and information processing method
JPWO2008053649A1 (en) Wide-angle image acquisition method and wide-angle stereo camera device
JP2008096162A (en) Three-dimensional distance measuring sensor and three-dimensional distance measuring method
JP2016100698A (en) Calibration device, calibration method, and program
JP5648159B2 (en) Three-dimensional relative coordinate measuring apparatus and method
KR20240089161A (en) Filming measurement methods, devices, instruments and storage media
JP2022024688A (en) Depth map generation device and program thereof, and depth map generation system
JP4837538B2 (en) End position measuring method and dimension measuring method
JP2017098859A (en) Calibration device of image and calibration method
CN108062790B (en) Three-dimensional coordinate system establishing method applied to object three-dimensional reconstruction
US20220124253A1 (en) Compensation of three-dimensional measuring instrument having an autofocus camera
JP2009253715A (en) Camera calibrating device, camera calibrating method, camera calibrating program, and recording medium for recording the program
JP5727969B2 (en) Position estimation apparatus, method, and program
TWM594322U (en) Camera configuration system with omnidirectional stereo vision
JP2005275789A (en) Three-dimensional structure extraction method

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2011541976

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11869130

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11869130

Country of ref document: EP

Kind code of ref document: A1