WO2013005265A1 - Three-dimensional coordinate measurement device and three-dimensional coordinate measurement method


Info

Publication number
WO2013005265A1
Authority
WO
WIPO (PCT)
Prior art keywords
viewpoint
angle
target
point
optical axis
Prior art date
Application number
PCT/JP2011/005654
Other languages
English (en)
Japanese (ja)
Inventor
覚 大浦
Original Assignee
株式会社ベイビッグ
Priority date
Filing date
Publication date
Application filed by 株式会社ベイビッグ
Priority to JP2013522375A (granted as JP5629874B2)
Publication of WO2013005265A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images

Definitions

  • the present invention relates to a three-dimensional coordinate measuring apparatus and a three-dimensional coordinate measuring method for measuring three-dimensional coordinates of a target point using two images obtained by photographing the target point from different viewpoints.
  • A stereo method is known as a method of measuring the relative coordinates of a point using images.
  • a target point is imaged by two cameras (imaging devices) placed at different positions, or a target point is imaged from two different positions by one camera.
  • the coordinates of the target point are calculated using these two images.
  • The stereo method calculates the three-dimensional coordinates of a target point based on subtle differences between the images caused by parallax, as in human binocular vision (see, for example, Patent Document 1 and Patent Document 2).
  • The method of Patent Document 2 can calculate the three-dimensional coordinates of the target point even when the two cameras deviate from the parallel stereo arrangement, by correcting the deviation of the cameras' shooting directions.
  • However, the method of Patent Document 2 cannot perform this correction when the arrangement of the two cameras deviates greatly from the parallel stereo arrangement.
  • an object of the present invention is to provide a three-dimensional coordinate measuring apparatus that can improve the degree of freedom of setting of two cameras.
  • A three-dimensional coordinate measuring apparatus according to one aspect of the present invention measures the three-dimensional coordinates of a target point and includes an image acquisition unit that acquires a first target image in which the target point is captured by a first imaging device from a first viewpoint, and a second target image in which the target point is captured by a second imaging device from a second viewpoint.
  • The apparatus also includes a storage unit that holds: for each pixel of an image captured by the first imaging device, information on a first pixel viewpoint projection angle, which is the angle formed between the optical axis of the first imaging device and a line segment connecting the first viewpoint with the point of the three-dimensional coordinate system projected onto that pixel; for each pixel of an image captured by the second imaging device, information on a second pixel viewpoint projection angle, which is the angle formed between the optical axis of the second imaging device and a line segment connecting the second viewpoint with the point of the three-dimensional coordinate system projected onto that pixel; first optical axis information indicating the optical axis direction of the first imaging device; second optical axis information indicating the optical axis direction of the second imaging device; and inter-viewpoint information indicating the relative positions of the first viewpoint and the second viewpoint.
  • The apparatus further includes a first viewpoint projection angle extraction unit that uses the information on the first pixel viewpoint projection angle to acquire a first target viewpoint projection angle, which is the first pixel viewpoint projection angle corresponding to the target point projected on the first target image; a second viewpoint projection angle extraction unit that uses the information on the second pixel viewpoint projection angle to acquire a second target viewpoint projection angle, which is the second pixel viewpoint projection angle corresponding to the target point projected on the second target image; and a coordinate calculation unit that calculates the three-dimensional coordinates of the target point.
  • the three-dimensional coordinate measuring apparatus can easily calculate the three-dimensional coordinates of the target point using the viewpoint projection angle even when the two cameras are not arranged in parallel stereo.
  • the three-dimensional coordinate measuring apparatus can improve the degree of freedom of setting two cameras.
  • Preferably, the storage unit further holds information indicating a first rotation angle, which is the angle by which the first imaging device is rotated about its own optical axis, and information indicating a second rotation angle, which is the angle by which the second imaging device is rotated about its own optical axis; the coordinate calculation unit may then calculate the three-dimensional coordinates of the target point using the first and second target viewpoint projection angles, the optical axis directions of the first and second imaging devices, the inter-viewpoint information, and the first and second rotation angles.
  • the three-dimensional coordinate measuring apparatus can easily calculate the three-dimensional coordinates of the target point even when the camera rotates about the optical axis of the camera.
  • Preferably, the coordinate calculation unit calculates, using the first target viewpoint projection angle, the optical axis direction of the first imaging device, and the first rotation angle, a first target angle, which is the angle formed between the line segment connecting the first viewpoint and the second viewpoint and the line segment connecting the first viewpoint and the target point; calculates, using the second target viewpoint projection angle, the optical axis direction of the second imaging device, and the second rotation angle, a second target angle, which is the angle formed between the line segment connecting the first viewpoint and the second viewpoint and the line segment connecting the second viewpoint and the target point; and may calculate the three-dimensional coordinates of the target point using the inter-viewpoint information, the first target angle, and the second target angle.
  • Preferably, the first optical axis information is information indicating a first shooting angle, which is the angle formed between the optical axis of the first imaging device and a first reference line that passes through the first viewpoint and is perpendicular to the line segment connecting the first viewpoint and the second viewpoint; the second optical axis information is information indicating a second shooting angle, which is the angle formed between the optical axis of the second imaging device and a second reference line that passes through the second viewpoint, is perpendicular to that line segment, and is parallel to the first reference line.
  • The coordinate calculation unit may calculate the first target angle using the angle formed between the line segment connecting the first viewpoint and the second viewpoint and the first reference line, the first target viewpoint projection angle, the first shooting angle, and the first rotation angle, and may calculate the second target angle using the angle formed between that line segment and the second reference line, the second target viewpoint projection angle, the second shooting angle, and the second rotation angle.
  • Preferably, the image acquisition unit further acquires a first reference image in which first reference points, which are any three of three or more reference points whose relative coordinates are known, are captured by the first imaging device from the first viewpoint, and a second reference image in which second reference points, which are any three of the three or more reference points, are captured by the second imaging device from the second viewpoint.
  • The three-dimensional coordinate measuring apparatus then further includes: a third viewpoint projection angle extraction unit that uses the information on the first pixel viewpoint projection angle to acquire three first reference viewpoint projection angles, which are the first pixel viewpoint projection angles corresponding to the three first reference points projected on the first reference image; a fourth viewpoint projection angle extraction unit that uses the information on the second pixel viewpoint projection angle to acquire three second reference viewpoint projection angles, which are the second pixel viewpoint projection angles corresponding to the three second reference points projected on the second reference image; a first inclination angle calculation unit that uses the three first reference viewpoint projection angles and the relative coordinates of the three first reference points to calculate a first inclination angle formed between the first imaging surface, which is the imaging surface of the first reference image, and a first reference plane including the three first reference points; a second inclination angle calculation unit that uses the three second reference viewpoint projection angles and the relative coordinates of the three second reference points to calculate a second inclination angle formed between the second imaging surface, which is the imaging surface of the second reference image, and a second reference plane including the three second reference points; and a setting information calculation unit that uses the three first reference points, the three second reference points, the first inclination angle, and the second inclination angle to calculate the optical axis direction of the first imaging device, the optical axis direction of the second imaging device, the first rotation angle, and the second rotation angle.
  • the three-dimensional coordinate measuring apparatus can easily calculate the optical axis directions of the first and second imaging apparatuses using the viewpoint projection angle.
  • Preferably, the setting information calculation unit may calculate the three-dimensional coordinates of the first viewpoint and the second viewpoint using the relative positional relationship between the three first reference points and the three second reference points, the first inclination angle, and the second inclination angle, and may calculate the inter-viewpoint information from those three-dimensional coordinates.
  • the three-dimensional coordinate measuring apparatus can easily calculate the inter-viewpoint distance using the viewpoint projection angle.
  • The present invention can be realized not only as such a three-dimensional coordinate measuring apparatus, but also as a three-dimensional coordinate measuring method having the characteristic units of the apparatus as steps, or as a program that causes a computer to execute those steps. Needless to say, such a program can be distributed via a non-transitory computer-readable recording medium such as a CD-ROM, or via a transmission medium such as the Internet.
  • the present invention can provide a three-dimensional coordinate measuring apparatus that can improve the degree of freedom of setting of two cameras.
  • FIG. 8 is a diagram for explaining the first adjustment operation according to the embodiment of the present invention.
  • FIG. 9A is a diagram showing a reference point projected on the first imaging surface according to the embodiment of the present invention.
  • FIG. 9B is a diagram showing a reference point projected on the second imaging surface according to the embodiment of the present invention.
  • FIG. 10 is a flowchart of the tilt angle calculation process according to the embodiment of the present invention.
  • FIG. 11 is a diagram for explaining the tilt angle calculation processing according to the embodiment of the present invention.
  • FIG. 12 is a diagram for explaining the tilt angle calculation processing according to the embodiment of the present invention.
  • FIG. 13 is a diagram for explaining a first adjustment operation according to the embodiment of the present invention.
  • FIG. 14 is a diagram showing a configuration of a system including a three-dimensional coordinate measuring apparatus at the time of the second adjustment operation according to the embodiment of the present invention.
  • FIG. 15 is a flowchart of the second adjustment operation according to the embodiment of the present invention.
  • FIG. 16 is a diagram for explaining a viewpoint projection angle according to the embodiment of the present invention.
  • FIG. 17A is a diagram showing an example of an image before distortion correction according to the embodiment of the present invention.
  • FIG. 17B is a diagram showing an example of an image after distortion correction according to the embodiment of the present invention.
  • FIG. 18 is a diagram for explaining a second adjustment operation according to the embodiment of the present invention.
  • the 3D coordinate measuring apparatus extracts the viewpoint projection angle of the target point in each image from two images obtained by capturing the target point from different viewpoints. Then, the three-dimensional coordinate measuring apparatus calculates the coordinates of the target point using the viewpoint projection angle.
  • the three-dimensional coordinate measuring apparatus can calculate the three-dimensional coordinates of the target point even when the two cameras are not arranged in parallel stereo.
  • the three-dimensional coordinate measuring apparatus can improve the degree of freedom in setting two cameras.
  • FIG. 1 is a diagram showing a configuration of a three-dimensional coordinate measurement system according to an embodiment of the present invention.
  • The three-dimensional coordinate measurement system shown in FIG. 1 includes a three-dimensional coordinate measuring device 90, cameras 10A and 10B serving as imaging devices, and cables 30A and 30B.
  • The camera 10A is a first imaging device that generates image data 60A by capturing the target structure 50 from a first viewpoint M1.
  • The camera 10B is a second imaging device that generates image data 60B by capturing the target structure 50 from a second viewpoint M2.
  • The first viewpoint M1 and the second viewpoint M2 correspond to the positions of the principal points of the photographing lenses of the cameras 10A and 10B, respectively.
  • O1 and O2, drawn as two-dot chain lines, indicate the optical axes of the cameras 10A and 10B (the optical axes of the photographing lenses), respectively.
  • the relative positional relationship between the camera 10A and the camera 10B and the optical axes O 1 and O 2 are set in advance.
  • the camera 10A and the camera 10B are connected via a predetermined member. Thereby, the relative positional relationship between the camera 10A and the camera 10B and the orientation (optical axis) of the camera 10A and the camera 10B are fixed.
  • the image data 60A generated by the camera 10A is sent to the three-dimensional coordinate measuring device 90 via the cable 30A.
  • the image data 60B generated by the camera 10B is sent to the three-dimensional coordinate measuring apparatus 90 via the cable 30B.
  • the three-dimensional coordinate measuring device 90 is, for example, a personal computer.
  • the three-dimensional coordinate measuring apparatus 90 measures the three-dimensional coordinates of the target point W using the image data 60A and 60B.
  • FIG. 2 is a block diagram showing a characteristic functional configuration of the three-dimensional coordinate measuring apparatus 90 according to the embodiment of the present invention.
  • the function of the processing unit shown in FIG. 2 is realized by a processor included in a personal computer executing a program.
  • the three-dimensional coordinate measuring apparatus 90 calculates the three-dimensional coordinates 112 of the target point W using two images obtained by capturing the target point W from different viewpoints.
  • The three-dimensional coordinates 112 are, for example, coordinates in a three-dimensional coordinate system whose origin is the first viewpoint M1.
  • the three-dimensional coordinate system of the three-dimensional coordinates 112 is not limited to this, and an arbitrary three-dimensional coordinate system may be used.
  • The three-dimensional coordinates 112 may be relative coordinates between a predetermined point and the target point W, or may be information indicating the distance between the predetermined point and the target point W.
  • The predetermined point is, for example, the first viewpoint M1 or the second viewpoint M2, but may be any point between the first viewpoint M1 and the second viewpoint M2.
  • The three-dimensional coordinate measuring apparatus 90 includes an image acquisition unit 101, a storage unit 102, a first viewpoint projection angle extraction unit 103A, a second viewpoint projection angle extraction unit 103B, a coordinate calculation unit 105, and a camera setting detection unit 106.
  • The image acquisition unit 101 acquires a first target image (image data 60A) in which the target point W is captured by the camera 10A from the first viewpoint M1, and a second target image (image data 60B) in which the target point W is captured by the camera 10B from the second viewpoint M2.
  • The storage unit 102 holds first viewpoint projection angle information 111A, second viewpoint projection angle information 111B, a first shooting angle θA, a second shooting angle θB, and an inter-viewpoint distance d.
  • FIG. 3 is a diagram showing the relationship between the cameras 10A and 10B and the target point W.
  • The reference line L is a straight line that passes through the first viewpoint M1 and is perpendicular to the line segment M1M2; the reference line R is a straight line that passes through the second viewpoint M2 and is perpendicular to the line segment M1M2.
  • the reference line L and the reference line R are parallel lines.
  • In FIG. 3, two-dimensional coordinates and angles are illustrated, but in practice three-dimensional coordinates and angles are used. Since the same method as described below can also be used in that case, the description here refers to FIG. 3.
  • The first shooting angle θA is the angle formed between the first reference line L and the optical axis O1 of the camera 10A.
  • The second shooting angle θB is the angle formed between the second reference line R and the optical axis O2 of the camera 10B.
  • The first shooting angle θA is first optical axis information indicating the optical axis direction of the camera 10A, and the second shooting angle θB is second optical axis information indicating the optical axis direction of the camera 10B.
  • The inter-viewpoint distance d indicates the distance between the first viewpoint M1 and the second viewpoint M2 and the arrangement relationship between them.
  • The arrangement relationship between the first viewpoint M1 and the second viewpoint M2 is, for example, information indicating which of the camera 10A (first viewpoint M1) and the camera 10B (second viewpoint M2) is arranged on the right and which on the left. That is, the inter-viewpoint distance d is relative position information indicating the relative coordinates (relative positions) of the first viewpoint M1 and the second viewpoint M2.
  • the first viewpoint projection angle information 111A is information indicating the viewpoint projection angle of each pixel (corresponding to the first pixel viewpoint projection angle of the present invention) in an image taken by the camera 10A.
  • the viewpoint projection angle is an angle formed by a line segment connecting each point of the three-dimensional coordinate system projected onto the pixel and the viewpoint and the optical axis of the imaging device (camera).
  • the viewpoint projection angle is a value determined according to the camera setting, and is a value uniquely determined for each setting state of the camera.
  • the first viewpoint projection angle information 111A is a table indicating viewpoint projection angles corresponding to each of a plurality of pixels.
  • the second viewpoint projection angle information 111B is information indicating the viewpoint projection angle of each pixel (corresponding to the second pixel viewpoint projection angle of the present invention) in the image captured by the camera 10B.
  • the second viewpoint projection angle information 111B is a table indicating viewpoint projection angles corresponding to each of a plurality of pixels.
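The per-pixel viewpoint projection angle table described above can be illustrated with a minimal sketch. The sketch assumes a distortion-free pinhole camera with focal length f expressed in pixels and principal point (cx, cy); none of these parameters are given in the source, and a real table would be derived from calibration of the actual lens, as the patent's adjustment operations describe.

```python
import math

def viewpoint_projection_angle_table(width, height, f, cx, cy):
    """Build a per-pixel table of viewpoint projection angles: the angle
    between the optical axis and the ray from the viewpoint through the
    pixel, under an ideal pinhole model (illustrative assumption only)."""
    table = [[0.0] * width for _ in range(height)]
    for v in range(height):
        for u in range(width):
            r = math.hypot(u - cx, v - cy)   # radial distance from principal point
            table[v][u] = math.atan2(r, f)   # angle grows away from the optical axis
    return table

# Illustrative 640x480 sensor with f = 500 pixels.
table = viewpoint_projection_angle_table(640, 480, f=500.0, cx=319.5, cy=239.5)
```

Pixels near the principal point map to angles close to zero, and the angle increases monotonically toward the image corners, which is the property the extraction units 103A and 103B rely on when looking up the angle for a target point's pixel coordinates.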
  • The first viewpoint projection angle information 111A, the second viewpoint projection angle information 111B, the first shooting angle θA, the second shooting angle θB, and the inter-viewpoint distance d may be set in advance and stored in the storage unit 102, or may be calculated by the camera setting detection unit 106 as described later.
  • These items of information may also be divided among and stored in a plurality of storage units.
  • The first viewpoint projection angle extraction unit 103A uses the first viewpoint projection angle information 111A to acquire the viewpoint projection angle θA1 corresponding to the target point W projected on the first target image (corresponding to the first target viewpoint projection angle of the present invention).
  • Similarly, the second viewpoint projection angle extraction unit 103B uses the second viewpoint projection angle information 111B to acquire the viewpoint projection angle θB1 corresponding to the target point W projected on the second target image (corresponding to the second target viewpoint projection angle of the present invention).
  • The coordinate calculation unit 105 calculates the three-dimensional coordinates 112 of the target point W using the target viewpoint projection angles θA1 and θB1 acquired by the first viewpoint projection angle extraction unit 103A and the second viewpoint projection angle extraction unit 103B, the first shooting angle θA, the second shooting angle θB, and the inter-viewpoint distance d.
  • the three-dimensional coordinates 112 of the target point W calculated by the coordinate calculation unit 105 are displayed on, for example, a display unit (not shown) included in the three-dimensional coordinate measurement device 90.
  • the three-dimensional coordinates 112 may be output to the outside of the three-dimensional coordinate measuring device 90 or may be stored in a storage unit provided in the three-dimensional coordinate measuring device 90.
  • The camera setting detection unit 106 calculates the first viewpoint projection angle information 111A, the second viewpoint projection angle information 111B, the first shooting angle θA, the second shooting angle θB, and the inter-viewpoint distance d, and stores them in the storage unit 102.
  • FIG. 4 is a flowchart of the three-dimensional coordinate measurement process performed by the three-dimensional coordinate measurement apparatus 90.
  • the image acquisition unit 101 acquires a first target image and a second target image (S101).
  • The first target image is an image in which the target point W is captured by the camera 10A from the first viewpoint M1.
  • The second target image is an image in which the target point W is captured by the camera 10B from the second viewpoint M2.
  • Next, the first viewpoint projection angle extraction unit 103A extracts the coordinates of the target point W from the first target image. Then, the first viewpoint projection angle extraction unit 103A refers to the first viewpoint projection angle information 111A and acquires the viewpoint projection angle θA1 corresponding to the extracted coordinates of the target point W. Similarly, the second viewpoint projection angle extraction unit 103B extracts the coordinates of the target point W from the second target image, and refers to the second viewpoint projection angle information 111B to acquire the viewpoint projection angle θB1 corresponding to the extracted coordinates (S102).
  • Next, the coordinate calculation unit 105 calculates the first target angle θA2 and the second target angle θB2 shown in FIG. 3, using the first shooting angle θA, the second shooting angle θB, and the viewpoint projection angles θA1 and θB1 (S103).
  • The first target angle θA2 is the angle formed by the line segment M1M2 and the line segment M1W.
  • The second target angle θB2 is the angle formed by the line segment M1M2 and the line segment M2W.
  • Specifically, the coordinate calculation unit 105 calculates the first target angle θA2 using the following (Formula 1), and calculates the second target angle θB2 using the following (Formula 2).
  • θA2 = θL − θA − θA1 (Formula 1)
  • θB2 = θR − θB − θB1 (Formula 2)
  • θL is the angle formed by the first reference line L and the line segment M1M2, and θR is the angle formed by the second reference line R and the line segment M1M2. Here, θL and θR are both 90 degrees.
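Formulas 1 and 2 can be checked numerically. The sketch below uses the reconstructed form θA2 = θL − θA − θA1 and θB2 = θR − θB − θB1, with θL = θR = 90 degrees as in the text; the input angle values are purely illustrative.

```python
def target_angles(theta_L, theta_A, theta_A1, theta_R, theta_B, theta_B1):
    """Formulas 1 and 2: each target angle is the reference-line angle
    minus the shooting angle minus the viewpoint projection angle."""
    theta_A2 = theta_L - theta_A - theta_A1  # Formula 1
    theta_B2 = theta_R - theta_B - theta_B1  # Formula 2
    return theta_A2, theta_B2

# Illustrative values in degrees, with theta_L = theta_R = 90.
a2, b2 = target_angles(90.0, 10.0, 15.0, 90.0, 5.0, 20.0)
# → (65.0, 65.0)
```

When both the shooting angle and the viewpoint projection angle are zero, the target ray is parallel to the reference line, so the target angle is simply 90 degrees, matching the perpendicular reference-line construction of FIG. 3.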
  • Next, the coordinate calculation unit 105 calculates the three-dimensional coordinates 112 of the target point W using the first target angle θA2, the second target angle θB2, and the inter-viewpoint distance d (S104).
  • The inter-viewpoint distance d is information indicating the relative positional relationship between the first viewpoint M1 and the second viewpoint M2.
  • the coordinate calculation unit 105 uses a three-dimensional coordinate system in which the coordinates of the first viewpoint M 1 are the origin (0, 0, 0) and the coordinates of the second viewpoint M 2 are (d, 0, 0). Set. Then, the coordinate calculation unit 105 calculates the coordinates of the target point W in the set three-dimensional coordinate system.
  • Specifically, the coordinate calculation unit 105 can calculate the three-dimensional coordinates 112 of the target point W by the following method.
  • First, the coordinate calculation unit 105 calculates a first vector that passes through the first viewpoint M1 and the target point W, using the first target angle θA2. Specifically, the coordinate calculation unit 105 calculates, as the first vector, the vector that passes through the first viewpoint M1 and whose direction is given by the first target angle θA2. Note that the coordinate calculation unit 105 may use the first shooting angle θA and the viewpoint projection angle θA1 instead of the first target angle θA2.
  • Similarly, the coordinate calculation unit 105 calculates a second vector that passes through the second viewpoint M2 and the target point W, using the second target angle θB2. The coordinate calculation unit 105 may likewise use the second shooting angle θB and the viewpoint projection angle θB1 instead of the second target angle θB2.
  • the coordinate calculation unit 105 calculates the coordinates of the closest point between the first vector and the second vector as the three-dimensional coordinates 112 of the target point W.
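The closest-point step can be sketched with the standard midpoint-of-the-common-perpendicular construction for two 3D lines. The coordinate frame follows the text (M1 at the origin, M2 at (d, 0, 0)); the direction vectors in the example are illustrative rather than derived from actual shooting angles.

```python
def closest_point(p1, d1, p2, d2):
    """Return the midpoint of the shortest segment between the lines
    p1 + s*d1 and p2 + t*d2 (least-squares closest point of two rays)."""
    r = [a - b for a, b in zip(p1, p2)]
    a = sum(x * x for x in d1)
    b = sum(x * y for x, y in zip(d1, d2))
    c = sum(x * x for x in d2)
    d = sum(x * y for x, y in zip(d1, r))
    e = sum(x * y for x, y in zip(d2, r))
    denom = a * c - b * b          # zero only if the lines are parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    q1 = [p + s * x for p, x in zip(p1, d1)]   # closest point on line 1
    q2 = [p + t * x for p, x in zip(p2, d2)]   # closest point on line 2
    return [(u + v) / 2 for u, v in zip(q1, q2)]

# M1 at the origin, M2 at (2, 0, 0); both rays aimed at (1, 1, 0).
w = closest_point([0.0, 0.0, 0.0], [1.0, 1.0, 0.0], [2.0, 0.0, 0.0], [-1.0, 1.0, 0.0])
# → [1.0, 1.0, 0.0]
```

Taking the midpoint rather than an exact intersection is what makes the method robust: with measurement noise the two rays are almost never exactly concurrent, and the midpoint of their common perpendicular is the natural estimate of the target point.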
  • the three-dimensional coordinate measuring apparatus 90 can calculate the three-dimensional coordinates 112 of the target point W.
  • As described above, the three-dimensional coordinate measuring apparatus 90 holds in advance a viewpoint projection angle for each pixel of the images captured by the cameras 10A and 10B, and calculates the three-dimensional coordinates 112 of the target point W using the held viewpoint projection angles. Specifically, the three-dimensional coordinate measuring apparatus 90 extracts the viewpoint projection angle of the target point in each image from two images obtained by capturing the target point from different viewpoints, and then calculates the coordinates of the target point using those viewpoint projection angles.
  • Thus, the three-dimensional coordinate measuring apparatus 90 can easily calculate the three-dimensional coordinates of the target point W. Further, by using the first shooting angle θA and the second shooting angle θB, the three-dimensional coordinate measuring apparatus 90 can calculate the three-dimensional coordinates of the target point W even when the two cameras are not arranged in parallel stereo. As described above, the three-dimensional coordinate measuring apparatus 90 according to the embodiment of the present invention can improve the degree of freedom in setting two cameras.
  • In Patent Document 1 and Patent Document 2, specific information is required to appear in the captured images: Patent Document 1 requires a license plate, and Patent Document 2 requires at least two particular straight lines in real space. In contrast, the three-dimensional coordinate measuring apparatus 90 according to the present embodiment does not need such information.
  • Although a straight line that passes through the first viewpoint M1 and is perpendicular to the line segment M1M2 is used here as the first reference line L, any straight line passing through the first viewpoint M1 may be used.
  • Similarly, the second reference line R may be any straight line passing through the second viewpoint M2.
  • In that case, the first shooting angle θA and the second shooting angle θB may be any information indicating the optical axis directions of the cameras 10A and 10B.
  • For example, the first reference line L and the second reference line R may be straight lines that are parallel to each other but not perpendicular to the line segment M1M2.
  • Alternatively, the angle formed by the line segment M1M2 and the optical axis of each camera may be used as the first shooting angle θA and the second shooting angle θB.
  • The inter-viewpoint distance d may be any information indicating the relative positional relationship between the first viewpoint M1 and the second viewpoint M2.
  • For example, the coordinates of the first viewpoint M1 and the second viewpoint M2 in a three-dimensional coordinate system may be used instead of the inter-viewpoint distance d.
  • Next, the first adjustment operation, in which the camera setting detection unit 106 detects the first shooting angle θA, the second shooting angle θB, and the inter-viewpoint distance d, will be described.
  • It is assumed that the first viewpoint projection angle information 111A and the second viewpoint projection angle information 111B are already held in the storage unit 102.
  • FIG. 5 is a diagram showing a configuration of the three-dimensional coordinate measurement system according to the present embodiment during the first adjustment operation.
  • during the first adjustment operation, the camera 10A generates image data 61A (the first reference image) by imaging the reference structure 40 from the first viewpoint M1.
  • similarly, the camera 10B generates image data 61B (the second reference image) by imaging the reference structure 40 from the second viewpoint M2.
  • the reference structure 40 has three reference points A, B, and C whose relative coordinates are known in a three-dimensional coordinate system.
  • the triangle defined by the points A, B, and C is referred to as the reference structure 40.
  • FIG. 6 is a block diagram illustrating a configuration of the camera setting detection unit 106.
  • the camera setting detection unit 106 includes a third viewpoint projection angle extraction unit 121A, a fourth viewpoint projection angle extraction unit 121B, a first tilt angle calculation unit 122A, a second tilt angle calculation unit 122B, and a setting information calculation unit 123.
  • using the first viewpoint projection angle information 111A, the third viewpoint projection angle extraction unit 121A extracts three viewpoint projection angles (corresponding to the first reference viewpoint projection angles of the present invention), one for each of the reference points A, B, and C projected on the first reference image.
  • using the second viewpoint projection angle information 111B, the fourth viewpoint projection angle extraction unit 121B extracts three viewpoint projection angles (corresponding to the second reference viewpoint projection angles of the present invention), one for each of the reference points A, B, and C projected on the second reference image.
  • the first tilt angle calculation unit 122A calculates the first tilt angle using the three first reference viewpoint projection angles and the relative coordinates of the three reference points A, B, and C.
  • the first tilt angle is the angle formed by the first imaging surface, which is the imaging surface of the first reference image, and the reference plane including the three reference points A, B, and C.
  • the second tilt angle calculation unit 122B calculates the second tilt angle using the three second reference viewpoint projection angles and the relative coordinates of the three reference points A, B, and C.
  • the second tilt angle is the angle formed by the second imaging surface, which is the imaging surface of the second reference image, and the reference plane including the three reference points A, B, and C.
  • the setting information calculation unit 123 calculates the first shooting angle θA and the second shooting angle θB using the first tilt angle and the second tilt angle. The setting information calculation unit 123 also calculates the three-dimensional coordinates of the first viewpoint M1 and the second viewpoint M2 using the first tilt angle and the second tilt angle. Then, the setting information calculation unit 123 calculates the inter-viewpoint distance d from the three-dimensional coordinates of the first viewpoint M1 and the second viewpoint M2.
  • FIG. 7 is a flowchart of the first adjustment operation by the three-dimensional coordinate measuring apparatus 90.
  • the image acquisition unit 101 acquires the first reference image and the second reference image as shown in FIG. 8 (S111).
  • the case where the three reference points included in the first reference image and the second reference image are the same will be described as an example.
  • the three reference points included in the first reference image and the second reference image may, however, be different.
  • that is, the first reference image may be any image in which the first reference points, which are any three of three or more reference points whose relative coordinates are known, are captured by the camera 10A from the first viewpoint M1.
  • similarly, the second reference image may be any image in which the second reference points, which are any three of the three or more reference points, are captured by the camera 10B from the second viewpoint M2.
  • in this case, by taking the relative positional relationship between the first reference points and the second reference points into account in each calculation process, the same processing as in the case where the first reference points and the second reference points are the same can be performed.
  • the third viewpoint projection angle extraction unit 121A extracts, from the first reference image, the coordinates of the points A01, B01, and C01, which are the reference points A, B, and C projected on the first reference image, with the optical center position Cp1 in the first reference image as the origin.
  • the third viewpoint projection angle extraction unit 121A then uses the first viewpoint projection angle information 111A to extract the first reference viewpoint projection angles corresponding to the coordinates of the extracted points A01, B01, and C01.
  • similarly, the fourth viewpoint projection angle extraction unit 121B extracts, from the second reference image, the coordinates of the points A02, B02, and C02, which are the reference points A, B, and C projected on the second reference image, with the optical center position Cp2 in the second reference image as the origin.
  • the fourth viewpoint projection angle extraction unit 121B then uses the second viewpoint projection angle information 111B to extract the second reference viewpoint projection angles corresponding to the coordinates of the extracted points A02, B02, and C02 (S112).
  • next, the first tilt angle calculation unit 122A calculates the first tilt angle, which is the angle formed between the first imaging surface, that is, the imaging surface of the first reference image, and the reference plane including the reference points A, B, and C.
  • similarly, the second tilt angle calculation unit 122B calculates the second tilt angle, which is the angle formed between the second imaging surface, that is, the imaging surface of the second reference image, and the reference plane including the reference points A, B, and C (S113).
  • FIG. 10 is a flowchart of the process (S113) for calculating the first tilt angle and the second tilt angle.
  • FIG. 11 is a diagram for explaining the processing shown in FIG.
  • the process of calculating the first tilt angle will be described with reference to FIGS. 10 and 11.
  • first, the first tilt angle calculation unit 122A sets an X1-Y1-Z1 coordinate system whose origin is one of the three reference points A, B, and C, for example the point A (S141). At this time, the first tilt angle calculation unit 122A sets the X1 axis and the Y1 axis parallel to the first imaging surface imaged from the first viewpoint M1, and sets the Z1 axis perpendicular to that surface.
  • next, the first tilt angle calculation unit 122A sets virtual points B1 and C1 whose coordinates are equivalent to those of the reference points B and C (S142).
  • the point B 1 corresponds to the point B
  • the point C 1 corresponds to the point C.
  • the coordinates of the point B 1 are set as (B 1x , 0, 0)
  • the coordinates of the point C 1 are set as (0, C 1y , 0).
  • next, the first tilt angle calculation unit 122A rotates the points B1 and C1 finely and separately by an angle α1n around the X1 axis, an angle β1n around the Y1 axis, and an angle γ1n around the Z1 axis, thereby calculating a plurality of converted reference points, points B1n′ (B1nx′, B1ny′, B1nz′) and C1n′ (C1nx′, C1ny′, C1nz′) (S143). Here, n = 1, 2, 3, ...
  • next, using the viewpoint projection angles extracted by the third viewpoint projection angle extraction unit 121A, the first tilt angle calculation unit 122A calculates a plurality of converted projection reference points, points B1n″ (B1nx″, B1ny″, 0) and C1n″ (C1nx″, C1ny″, 0), by projecting the points B1n′ and C1n′ onto the X1-Y1 plane (S144). Here, n = 1, 2, 3, ...
  • here, let ε1x be the viewpoint projection angle in the Z1-X1 plane at the point B01 extracted by the third viewpoint projection angle extraction unit 121A, ε1y the viewpoint projection angle in the Z1-Y1 plane at the point B01, φ1x the viewpoint projection angle in the Z1-X1 plane at the point C01, and φ1y the viewpoint projection angle in the Z1-Y1 plane at the point C01. Then B1ny″, C1nx″, and C1ny″ are each expressed by (Formula 4) to (Formula 6).
  • the first tilt angle calculation unit 122A determines the converted projection reference points B1″ and C1″ for which the triangle A01B01C01 and the triangle AB1n″C1n″ are most similar, and calculates the corresponding first tilt angle (α1, β1, γ1) (S145).
  • specifically, the point B01 and the point C01 are moved so that the line segment A01B01 has the same length as the line segment AB1n″ and the point A01 coincides with the point A, and the first tilt angle (α1, β1, γ1) is determined so that the sum of the length of the line segment B01B1n″ and the length of the line segment C01C1n″ is minimized.
  • the similarity comparison calculation method is not limited to the above method.
  • for example, the line segment A01B01 may be scaled to the same length as the line segment AB1n″ and the point C01 moved so that the line segment A01B01 coincides with the line segment AB1n″, and the first tilt angle (α1, β1, γ1) may then be calculated so that the shift amount between the point C01 and the point C1n″ is minimized.
  • similarly, the second tilt angle calculation unit 122B calculates the second tilt angle (α2, β2, γ2), which is the angle formed by the second imaging surface and the plane including the reference points.
  • the calculation of the tilt angles has been described above based on the embodiment; however, the methods of calculating the first and second tilt angles are not limited to this embodiment.
  • for example, in the above description, the first tilt angle calculation unit 122A sets an X1-Y1-Z1 coordinate system whose origin is one of the three reference points A, B, and C whose coordinates are known, but the origin may be selected from points other than the three reference points whose coordinates are known.
  • specifically, the first tilt angle calculation unit 122A places the point A at a certain position on the viewpoint projection angle line of the point A. Then, the first tilt angle calculation unit 122A slides the point B sequentially along the viewpoint projection angle line of the point B, at a specified pitch and within a certain range. At this time, for each position of the point B, the first tilt angle calculation unit 122A arranges the point C on the circumference around the line segment AB as an axis. Then, the first tilt angle calculation unit 122A stores the condition under which the point C comes closest to the viewpoint projection angle line of the point C among the positions on the circumference.
  • in this way, the first tilt angle calculation unit 122A slides the point B sequentially and calculates the closest approach of the point C at each position. Then, the first tilt angle calculation unit 122A may calculate the tilt angle from the condition of the overall closest approach among the closest approaches of the point C corresponding to all slide positions of the point B.
  • alternatively, the first tilt angle calculation unit 122A places the point A at a certain position on the viewpoint projection angle line of the point A and places the point B on or near the viewpoint projection angle line of the point B. At this time, the point C lies on the circumference around the line segment AB as an axis. The first tilt angle calculation unit 122A therefore stores the condition under which the point C comes closest to the viewpoint projection angle line of the point C among the positions on the circumference. The first tilt angle calculation unit 122A moves the point A within a certain range and performs the above calculation for each moved position. Then, the first tilt angle calculation unit 122A may calculate the tilt angle from the condition of the closest approach among the closest approaches of the point C corresponding to all positions of the point A.
  • the setting information calculation unit 123 calculates two vectors.
  • the two vectors are, for example, the straight line B1′B1″ connecting the point B1″ (the point B1n″) determined in step S145 and the point B1′ corresponding to the point B1n′ at that time, and the straight line C1′C1″ connecting the point C1″ (the point C1n″) determined in step S145 and the point C1′ corresponding to the point C1n′ at that time.
  • then, the setting information calculation unit 123 calculates the three-dimensional coordinates of the first viewpoint M1 by calculating the coordinates of the closest point of the two vectors.
  • similarly, the setting information calculation unit 123 calculates the three-dimensional coordinates of the second viewpoint M2.
  • alternatively, the following method may be used to calculate the three-dimensional coordinates of the first viewpoint M1.
  • in the triangle ABM1 connecting the points A and B and the point M1, the length of the line segment AB and the angles ∠M1AB and ∠M1BA are known. Therefore, the three-dimensional coordinates of the first viewpoint M1 can be calculated using the triangle ABM1. Further, since ∠AM1B is known from the viewpoint projection angles, the three-dimensional coordinates of the first viewpoint M1 can similarly be calculated using the triangle ABM1.
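The triangle relation just described can be written down directly with the law of sines. This planar sketch (hypothetical function name) assumes the reference points and the viewpoint lie in one plane and that the side of the line AB on which M1 lies is already known:

```python
import math

def locate_viewpoint(a, b, ang_a, ang_b):
    """Locate M1 from the triangle A-B-M1 via the law of sines.

    a, b:   known 2-D coordinates of the reference points A and B
    ang_a:  angle M1-A-B at the vertex A (radians)
    ang_b:  angle M1-B-A at the vertex B (radians)
    """
    ab = math.hypot(b[0] - a[0], b[1] - a[1])       # |AB| is known
    ang_m1 = math.pi - ang_a - ang_b                # angle A-M1-B
    am1 = ab * math.sin(ang_b) / math.sin(ang_m1)   # |A M1| by law of sines
    base = math.atan2(b[1] - a[1], b[0] - a[0])     # direction A -> B
    # rotating the A->B direction by ang_a aims at M1; the mirrored
    # solution on the other side of AB is resolved from context
    return (a[0] + am1 * math.cos(base + ang_a),
            a[1] + am1 * math.sin(base + ang_a))
```

For A = (0, 0), B = (2, 0) and both base angles equal to 45 degrees, the viewpoint comes out at (1, 1), matching the symmetric case.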
  • next, the setting information calculation unit 123 calculates the inter-viewpoint distance d using the three-dimensional coordinates of the first viewpoint M1 and the three-dimensional coordinates of the second viewpoint M2 (S115).
  • next, the setting information calculation unit 123 calculates the first shooting angle θA and the second shooting angle θB (S116). Specifically, through the above processing, the three-dimensional coordinates of the first viewpoint M1, the second viewpoint M2, and the reference points A, B, and C have been calculated. Therefore, the setting information calculation unit 123 calculates, for example, the angles θA4 and θB4 illustrated in FIG. 13 from the coordinates of the reference point A, the first viewpoint M1, and the second viewpoint M2. The setting information calculation unit 123 then calculates the first shooting angle θA using the following (Formula 7) and the second shooting angle θB using the following (Formula 8).
  • θA = θL - θA3 - θA4 (Formula 7)
  • θB = θR - θB3 - θB4 (Formula 8)
  • ⁇ L is an angle formed by the first reference line L and the line segment M 1 M 2 .
  • ⁇ R is an angle formed by the second reference line R and the line segment M 1 M 2 .
  • here, θL and θR are both 90 degrees.
  • θA3 is the viewpoint projection angle of the reference point A in the first reference image, and θB3 is the viewpoint projection angle of the reference point A in the second reference image.
  • the setting information calculation unit 123 can calculate the first shooting angle ⁇ A and the second shooting angle ⁇ B.
  • the calculation method of the first shooting angle ⁇ A and the second shooting angle ⁇ B by the setting information calculation unit 123 is not limited to the above method.
  • for example, the direction of the optical axis O1 may be calculated from the first tilt angle, and the first shooting angle θA may then be calculated from the direction of the optical axis O1.
  • the first shooting angle ⁇ A , the second shooting angle ⁇ B and the inter-viewpoint distance d calculated by the setting information calculation unit 123 are stored in the storage unit 102.
  • as described above, the first shooting angle θA, the second shooting angle θB, and the inter-viewpoint distance d can be easily calculated using the reference structure 40.
  • readjustment can be easily performed even when the arrangement of the cameras 10A and 10B is adjusted according to the object or the measurement environment, and even when the arrangement of the cameras 10A and 10B is shifted due to vibration or the like.
  • the relative positions of the two cameras 10A and 10B need not be fixed, and the second adjustment operation can be performed for each camera individually.
  • since the process of detecting the first viewpoint projection angle information 111A and the process of detecting the second viewpoint projection angle information 111B are the same, only the former is described below.
  • FIG. 14 is a diagram illustrating a configuration of a system including the three-dimensional coordinate measuring apparatus 90 during the second adjustment operation.
  • the camera 10A images the calibration plate 70 from the viewpoint M.
  • the viewpoint M corresponds to the position of the principal point of the photographing lens of the camera 10A.
  • the two-dot chain line O indicates the optical axis of the photographing lens (the optical axis of the camera).
  • the calibration plate 70 is a transparent plate having high rigidity and has a grid pattern on the surface.
  • a weight 72 is attached to the calibration plate 70 with a thread 71.
  • the camera setting detection unit 106 performs image pixel distortion correction using the image data 62A obtained by capturing the calibration plate 70, the thread 71, and the weight 72 from the viewpoint M. Further, the camera setting detection unit 106 calculates the viewpoint projection angle (corresponding to the first and second pixel viewpoint projection angles of the present invention) of an arbitrary image pixel.
  • FIG. 15 is a flowchart of the second adjustment operation.
  • the camera setting detection unit 106 sets the optical center position Cp, which is the center point of the imaging surface (S201), performs distortion correction (S202), and calculates the viewpoint projection angle θ of an arbitrary image pixel (S203).
  • here, the viewpoint projection angle θ of an arbitrary image pixel is, as shown in FIG. 16, the angle between the optical axis O of the camera and the line segment connecting the viewpoint M and an arbitrary point P3 in three-dimensional space that is projected onto the two-dimensional coordinates of the arbitrary image pixel P on the image.
  • the optical axis O of the camera and the imaging surface are orthogonal. Therefore, the viewpoint projection angle θ of an arbitrary image pixel P can be calculated using, for example, the distance x between the viewpoint M and the optical center position Cp, which is the intersection of the optical axis O and the imaging surface, and the two-dimensional coordinates of the arbitrary image pixel P with the optical center position Cp as the origin.
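In pixel terms this relationship reduces to one arctangent. The sketch below (hypothetical function name) assumes square pixels and that x is expressed in the same pixel units as the image offset:

```python
import math

def viewpoint_projection_angle(px, py, cx, cy, x):
    """Viewpoint projection angle theta of an image pixel P.

    (px, py): pixel coordinates of P
    (cx, cy): optical center position Cp
    x:        distance between the viewpoint M and Cp, in pixel units
    """
    r = math.hypot(px - cx, py - cy)   # offset of P from the optical center
    return math.atan2(r, x)            # angle between the ray M-P and the optical axis O
```

A pixel offset equal to x gives an angle of 45 degrees, and the pixel at the optical center gives 0, as expected from the right triangle M-Cp-P.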
  • the optical center position Cp is set (S201).
  • the calibration plate 70 is assumed to be kept horizontal by a level or the like.
  • the calibration plate 70 is imaged by the camera 10A from above.
  • the imaging surface is adjusted so that (1) the attachment position of the thread 71 on the calibration plate 70 and the attachment position of the thread 71 on the weight 72 overlap, and (2) the grid display around the attachment position of the thread 71 is symmetrical.
  • in this state, the attachment position of the thread 71 on the calibration plate 70 is the optical center position Cp.
  • the camera setting detection unit 106 calculates parameters for performing image distortion correction (S202).
  • here, the image obtained when the calibration plate 70 is photographed is an image in which the grid lines are distorted due to the characteristics of the photographing lens of the camera 10A.
  • the camera setting detection unit 106 performs normalization processing on the captured image to calculate parameters for correcting the captured image into an image as illustrated in FIG. 17B.
  • in the subsequent processing, including the above-described measurement processing of the three-dimensional coordinates of the target point W and the first adjustment operation, distortion correction using these parameters is performed on the images captured by the camera 10A, for example in the image acquisition unit 101.
  • the camera setting detection unit 106 calculates the viewpoint projection angle ⁇ of an arbitrary image pixel P (S203).
  • FIG. 18 is a diagram for explaining processing for calculating the distance between the calibration plate and the viewpoint.
  • the camera setting detection unit 106 acquires an image of a plate before movement, which is an image after distortion correction obtained by photographing the calibration plate 70 as shown in FIG. 17B.
  • next, the calibration plate 70 is moved closer to the viewpoint M by the distance y while being kept horizontal.
  • the camera setting detection unit 106 acquires an image of the plate after movement, which is an image after distortion correction obtained by imaging the calibration plate 70.
  • then, using the image of the plate before movement and the image of the plate after movement, the camera setting detection unit 106 extracts, for example, the two-dimensional coordinates of two points P1′ and P2′ with the optical center position Cp as the origin.
  • here, the intersection of the straight line connecting the viewpoint M and the point P1 and the imaging surface is defined as P1′. Further, the point on the surface of the calibration plate 70 after the movement that corresponds to the point P1 on the surface of the calibration plate 70 before the movement is defined as P2.
  • the intersection of the straight line connecting the viewpoint M and the point P2 and the imaging surface is defined as P2′.
  • the intersection of the straight line passing through the point P2 and the point P2′ and the surface of the calibration plate 70 before the movement is defined as Q.
  • d 2 represents the length of the line segment C 1 P 1
  • d 3 represents the length of the line segment P 1 ′ P 2 ′
  • d 4 represents the length of the line segment C p P 1 ′.
  • then, the viewpoint projection angle θ1 of the point Q is expressed by the following (Formula 10).
  • further, the distance z between the calibration plate 70 and the viewpoint M is expressed by the following (Formula 11).
  • d 5 indicates the length of the line segment C 1 Q.
  • d 6 indicates the length of the line segment C p P.
  • the viewpoint projection angle ⁇ of the arbitrary image pixel P can be calculated.
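Since (Formula 9) to (Formula 11) are not reproduced in this text, the following is a hedged reconstruction from the similar triangles of FIG. 18 under assumed sign conventions (hypothetical function name): moving the plate toward the viewpoint by y shifts the projection of the grid point P1 from d4 to d4 + d3, which fixes both the plate distance z and the viewpoint-to-image distance x.

```python
def calibrate_distances(d2, d3, d4, y):
    """Recover z (viewpoint to plate before the move) and x (viewpoint M
    to optical center Cp, in image units).

    d2: known lateral offset of the grid point P1 from the optical axis
        (from the plate's known grid pitch)
    d3: measured image shift |P1' P2'|
    d4: measured image offset |Cp P1'|
    y:  distance the plate was moved toward the viewpoint
    """
    # Similar triangles: d4 / x = d2 / z  and  (d4 + d3) / x = d2 / (z - y)
    z = y * (d4 + d3) / d3
    x = d4 * z / d2
    return z, x
```

As a consistency check: with z = 1000, x = 100 and d2 = 50, the pre-move offset is d4 = 5; moving the plate by y = 500 doubles the projected offset, so d3 = 5, and the formulas recover z and x exactly.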
  • in the above description, the calibration plate 70 is marked with a grid; however, the calibration plate may instead be marked with any pattern whose intervals are known.
  • in this way, the camera setting detection unit 106 calculates the viewpoint projection angle for each pixel of the image captured by the camera 10A and stores the first viewpoint projection angle information 111A, which indicates the viewpoint projection angle of each pixel, in the storage unit 102. Similarly, the camera setting detection unit 106 calculates the viewpoint projection angle for each pixel of the image captured by the camera 10B and stores the second viewpoint projection angle information 111B, which indicates the viewpoint projection angle of each pixel, in the storage unit 102.
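A per-pixel table of this kind could be built as below. This is a sketch with a hypothetical function name; it assumes the distortion-corrected pinhole geometry described above, with x the calibrated viewpoint-to-optical-center distance in pixel units:

```python
import math

def build_angle_table(width, height, cx, cy, x):
    """One viewpoint projection angle per pixel, indexed as table[row][col];
    (cx, cy) is the optical center Cp in pixel coordinates."""
    return [[math.atan2(math.hypot(col - cx, row - cy), x)
             for col in range(width)]
            for row in range(height)]
```

The pixel at the optical center maps to angle 0, and the angle grows with the pixel's distance from Cp, which is what the viewpoint projection angle information 111A/111B encodes.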
  • the three-dimensional coordinate measuring apparatus according to the embodiment of the present invention has been described above, but the present invention is not limited to this embodiment.
  • in the above description, the first viewpoint projection angle information 111A is a table indicating the viewpoint projection angle corresponding to each of a plurality of pixels; however, it may instead be an expression indicating the correspondence relationship between pixels and viewpoint projection angles.
  • an example of such an expression is the above (Formula 14).
  • in this case, the first viewpoint projection angle extraction unit 103A may calculate the viewpoint projection angle of the target point W in the first target image using (Formula 14).
  • in this case, the camera setting detection unit 106 needs to obtain the distance x between the viewpoint M and the optical center position Cp using the above (Formula 13).
  • the second viewpoint projection angle information 111B may be an expression indicating a correspondence relationship between a pixel and a viewpoint projection angle.
  • shooting may be performed from three or more viewpoints.
  • further, the number of reference points whose relative coordinates are known may be four or more. In this case, at least three of the four or more reference points need only be photographed in each of the first and second reference images. The combination of at least three reference points captured in the first reference image may also differ from that captured in the second reference image. Furthermore, it is preferable that the four or more reference points be arranged three-dimensionally rather than in the same plane; for example, four or more of the vertices of a triangular pyramid or a quadrangular pyramid can be used as reference points. Thereby, angle detection can be performed stably and over a wide range.
  • when the first reference points and the second reference points differ, the first and second tilt angles are angles with respect to different reference planes (first and second reference planes).
  • in this case, the first and second tilt angles may be converted into angles with respect to one of the reference planes.
  • the first and second tilt angles may be calculated as angles unified in the three-dimensional coordinate system.
  • alternatively, the first tilt angle and the second tilt angle may be calculated by performing a nonlinear approximation calculation using a large number of reference points whose relative coordinates are known.
  • the correspondence can be identified by using a special marker or line.
  • in addition, a verification reference point whose relative coordinates are known may be used in order to prevent reversal (turning over) caused by rotation of the reference points and the resulting calculation of an incorrect solution (tilt angle).
  • alternatively, restrictions (for example, rotation conditions) may be added so that an incorrect solution is not calculated.
  • the storage unit 102 holds in advance a first rotation angle around the optical axis of the camera 10A and a second rotation angle around the optical axis of the camera 10B.
  • the image acquisition unit 101 performs correction to rotate the first target image acquired from the camera 10A according to the first rotation angle.
  • the image acquisition unit 101 performs correction to rotate the second target image acquired from the camera 10B according to the second rotation angle.
  • the three-dimensional coordinate measuring apparatus 90 calculates the three-dimensional coordinates of the target point W by performing the above-described processing on the corrected first target image and second target image.
  • the target to be corrected according to the first rotation angle and the second rotation angle is not limited to the first and second target images.
  • the coordinate calculation unit 105 may calculate the three-dimensional coordinates of the target point in consideration of the first and second rotation angles.
  • the first and second rotation angles can be detected by the following method.
  • specifically, the camera setting detection unit 106 calculates in advance, by the method described above, the first and second reference viewpoint projection angles of the reference points A, B, and C for the case where the line segment M1M2 and the baseline are parallel. The camera setting detection unit 106 also calculates the first and second reference viewpoint projection angles of the reference points A, B, and C in the actual setting state (with the camera 10A and the camera 10B rotated around their optical axes). Then, the camera setting detection unit 106 calculates the first rotation angle based on the angle difference between the two calculated first reference viewpoint projection angles, and similarly calculates the second rotation angle based on the angle difference between the two calculated second reference viewpoint projection angles.
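One way to realise this comparison is to measure the in-plane orientation of the segment joining two projected reference points in each state; the difference is the roll about the optical axis. This is a hedged sketch with a hypothetical function name, using image-plane vectors rather than the apparatus's reference viewpoint projection angles directly:

```python
import math

def roll_about_optical_axis(a_ref, b_ref, a_act, b_act):
    """Rotation angle of a camera about its optical axis.

    a_ref, b_ref: image positions of reference points A and B when the
                  camera is aligned with the baseline (no roll)
    a_act, b_act: positions of the same points in the actual setting state
    """
    ref = math.atan2(b_ref[1] - a_ref[1], b_ref[0] - a_ref[0])
    act = math.atan2(b_act[1] - a_act[1], b_act[0] - a_act[0])
    # normalize the difference into (-pi, pi]
    diff = act - ref
    return math.atan2(math.sin(diff), math.cos(diff))
```

For example, if the segment A-B appears horizontal in the aligned state and vertical in the actual state, the detected roll is 90 degrees, and the target images would then be counter-rotated by that angle before measurement.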
  • the angle information such as the viewpoint projection angles used above is not limited to values indicating the angles themselves; for example, a vector represented by the coordinates of two points may be used.
  • part or all of the functions of the three-dimensional coordinate measuring apparatus may be realized by a processor such as a CPU executing a program.
  • connection relationship between the constituent elements is exemplified for specifically explaining the present invention, and the connection relationship for realizing the function of the present invention is not limited to this.
  • the division of functional blocks in the block diagram is an example: a plurality of functional blocks may be realized as one functional block, one functional block may be divided into a plurality of blocks, or some functions may be transferred to other functional blocks. Further, the functions of a plurality of functional blocks having similar functions may be processed by the three-dimensional coordinate measuring apparatus in parallel or in time division.
  • the present invention may be the above program or a non-transitory computer-readable recording medium on which the above program is recorded.
  • the program can be distributed via a transmission medium such as the Internet.
  • the configuration of the three-dimensional coordinate measuring apparatus described above is illustrative, provided to explain the present invention specifically, and the three-dimensional coordinate measuring apparatus according to the present invention need not include all of the above components. In other words, the three-dimensional coordinate measuring apparatus according to the present invention need only have the minimum configuration capable of realizing the effects of the present invention.
  • the three-dimensional coordinate measuring apparatus 90 may not include the camera setting detection unit 106.
  • the camera setting detection unit 106 may have only a function of performing one of the first adjustment operation and the second adjustment operation.
  • the present invention can be applied to a three-dimensional coordinate measuring apparatus.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a three-dimensional coordinate measuring device (90) comprising: a first viewpoint projection angle extraction unit (103A) that acquires a first target viewpoint projection angle (θA1) corresponding to a target point (W) projected on a first target image (60A) captured from a first viewpoint (M1); a second viewpoint projection angle extraction unit (103B) that acquires a second target viewpoint projection angle (θB1) corresponding to the target point (W) projected on a second target image (60B) captured from a second viewpoint (M2); and a coordinate calculation unit (105) that calculates three-dimensional coordinates (112) of the target point (W) using the first target viewpoint projection angle (θA1), the second target viewpoint projection angle (θB1), the optical axis direction (θA) of a first image capture device (10A), the optical axis direction (θB) of a second image capture device (10B), and the inter-viewpoint distance (d).
PCT/JP2011/005654 2011-07-01 2011-10-07 Dispositif de mesure de coordonnées tridimensionnelles et procédé de mesure de coordonnées tridimensionnelles WO2013005265A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2013522375A JP5629874B2 (ja) 2011-07-01 2011-10-07 三次元座標計測装置及び三次元座標計測方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
PCT/JP2011/003774 WO2013005244A1 (fr) 2011-07-01 2011-07-01 Three-dimensional relative coordinate measuring device and method
JPPCT/JP2011/003774 2011-07-01

Publications (1)

Publication Number Publication Date
WO2013005265A1 true WO2013005265A1 (fr) 2013-01-10

Family

ID=47277831

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/JP2011/003774 WO2013005244A1 (fr) 2011-07-01 2011-07-01 Three-dimensional relative coordinate measuring device and method
PCT/JP2011/005654 WO2013005265A1 (fr) 2011-07-01 2011-10-07 Three-dimensional coordinate measuring device and three-dimensional coordinate measuring method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/003774 WO2013005244A1 (fr) 2011-07-01 2011-07-01 Three-dimensional relative coordinate measuring device and method

Country Status (2)

Country Link
JP (1) JP5070435B1 (fr)
WO (2) WO2013005244A1 (fr)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103424104B * (zh) 2013-09-04 2015-11-18 中测新图(北京)遥感技术有限责任公司 Close-range large-format digital photogrammetric system and method
CN104634246B * (zh) 2015-02-03 2017-04-12 李安澜 Floating stereo-vision measuring system and measuring method for target spatial coordinates
CN104748680B * (zh) 2015-03-19 2018-09-14 酷派软件技术(深圳)有限公司 Camera-based size measuring method and device
CN106441243A * (zh) 2016-09-22 2017-02-22 云南电网有限责任公司电力科学研究院 Method and device for measuring the clearance distance of ground objects
CN106546230B * (zh) 2016-11-01 2021-06-22 北京墨土科技有限公司 Positioning-point arrangement method and device, and method and apparatus for measuring the three-dimensional coordinates of positioning points
CN108195345A * (zh) 2017-12-20 2018-06-22 合肥英睿系统技术有限公司 Distance measuring method and system based on an electronic imager
CN110108203B * (zh) 2019-04-11 2020-12-15 东莞中子科学中心 Wire position measuring method and system based on photogrammetry
CN110595433A * (zh) 2019-08-16 2019-12-20 太原理工大学 Binocular-vision-based method for measuring the inclination of a power transmission tower
CN112325767B * (zh) 2020-10-16 2022-07-26 华中科技大学鄂州工业技术研究院 Spatial plane dimension measuring method fusing machine vision and time-of-flight measurement
CN112991742B * (zh) 2021-04-21 2021-08-20 四川见山科技有限责任公司 Visual simulation method and system for real-time traffic data

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0493705A * (ja) 1990-08-09 1992-03-26 Topcon Corp Three-dimensional position measuring device and measuring method
JPH07218251A * (ja) 1994-02-04 1995-08-18 Matsushita Electric Ind Co Ltd Stereo image measuring method and stereo image measuring device
JP2001021351A * (ja) 1999-07-07 2001-01-26 Asahi Optical Co Ltd Photogrammetric image processing device, photogrammetric image processing method, and storage medium storing a photogrammetric image processing program
JP2004340840A * (ja) 2003-05-19 2004-12-02 Honda Motor Co Ltd Distance measuring device, distance measuring method, and distance measuring program

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0680402B2 * (ja) 1985-05-28 1994-10-12 富士通株式会社 Position measuring device
JP3122629B2 * (ja) 1997-07-16 2001-01-09 株式会社エイ・ティ・アール知能映像通信研究所 Arbitrary-viewpoint image generation device
JP2000121362A * (ja) 1998-10-20 2000-04-28 Asahi Optical Co Ltd Target measuring device for photogrammetry
JP4282216B2 * (ja) 2000-09-19 2009-06-17 オリンパス株式会社 Three-dimensional position and orientation sensing device
JP2004271292A * (ja) 2003-03-07 2004-09-30 Meidensha Corp Calibrator and stereo camera position/orientation calibration device
JP4794011B2 * (ja) 2008-04-03 2011-10-12 関東自動車工業株式会社 Image processing device and robot control system
JP2010025759A * (ja) 2008-07-18 2010-02-04 Fuji Xerox Co Ltd Position measuring system

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018195965A (ja) * 2017-05-17 2018-12-06 日本電気株式会社 Flying object position detection device, flying object position detection system, flying object position detection method, and program
JP2023506602A (ja) * 2020-03-20 2023-02-16 コーニンクレッカ フィリップス エヌ ヴェ 3D measurement grid tool for X-ray images
JP7366286B2 (ja) 2020-03-20 2023-10-20 コーニンクレッカ フィリップス エヌ ヴェ 3D measurement grid tool for X-ray images

Also Published As

Publication number Publication date
WO2013005244A1 (fr) 2013-01-10
JP5070435B1 (ja) 2012-11-14
JPWO2013005244A1 (ja) 2015-02-23

Similar Documents

Publication Publication Date Title
WO2013005265A1 (fr) Three-dimensional coordinate measuring device and three-dimensional coordinate measuring method
TWI555379B (zh) Panoramic fisheye camera image correction, synthesis and depth-of-field reconstruction method and system
JP5392415B2 (ja) Stereo image generation device, stereo image generation method, and computer program for stereo image generation
US11455746B2 (en) System and methods for extrinsic calibration of cameras and diffractive optical elements
EP2751521B1 (fr) Method and system for aligning a model on a spatially coded image
CN110809786B (zh) Calibration device, calibration chart, chart pattern generation device, and calibration method
WO2018076154A1 (fr) Spatial positioning calibration of a panoramic video sequence generation method based on an ultra-wide-angle camera
JP6079333B2 (ja) Calibration device, method, and program
JP5011528B2 (ja) Three-dimensional distance measuring sensor and three-dimensional distance measuring method
JP2009139246A (ja) Image processing device, image processing method, image processing program, position detection device, and moving body equipped therewith
CN106331527A (zh) Image stitching method and device
CN105190234A (zh) Apparatus and method for three-dimensional surface measurement
US10810762B2 (en) Image processing apparatus
CN107808398B (zh) Camera parameter calculation device, calculation method, program, and recording medium
JPWO2008053649A1 (ja) Wide-angle image acquisition method and wide-angle stereo camera device
JP2011086111A (ja) Imaging device calibration method and image synthesis device
JP2016100698A (ja) Calibration device, calibration method, and program
WO2020208686A1 (fr) Camera calibration device and method, and non-transitory computer-readable medium storing a related program
KR20190130407A (ko) Apparatus and method for calibrating an omnidirectional camera
JP5648159B2 (ja) Three-dimensional relative coordinate measuring device and method
JP2010176325A (ja) Arbitrary-viewpoint image generation device and arbitrary-viewpoint image generation method
JP4764896B2 (ja) Camera calibration device, camera calibration method, camera calibration program, and recording medium storing the program
JP2007033087A (ja) Calibration device and method
JP6065670B2 (ja) Three-dimensional measurement system, program, and method
JP4778569B2 (ja) Stereo image processing device, stereo image processing method, and stereo image processing program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11868876

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2013522375

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11868876

Country of ref document: EP

Kind code of ref document: A1