JP4491687B2 - Coordinate transformation function correction method - Google Patents
Publication number: JP4491687B2 (application JP2005123197A)
Authority: JP (Japan)
Prior art keywords: coordinate, coordinate system, three-dimensional, three-dimensional shape, three-dimensional image
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Description
The present invention relates to a method for correcting a coordinate conversion function that converts three-dimensional image data of an object, measured by a three-dimensional shape measuring apparatus at a plurality of different positions and expressed in the coordinate system of each position, into a three-dimensional shape data group expressed in one reference coordinate system.
Conventionally, a three-dimensional shape measuring method is well known in which a three-dimensional shape measuring apparatus whose measurement position can be moved relative to the object measures the object's three-dimensional shape from a plurality of different positions, synthesizes the three-dimensional image data obtained at each measurement position, and displays the three-dimensional shape of the object as viewed from an arbitrary direction. In this method, before the object is measured, a coordinate conversion function is calculated that converts the coordinate values obtained by the three-dimensional shape measuring apparatus at each measurement position into coordinate values in the reference coordinate system. Specifically, a reference object of predetermined shape is placed in the measurement target space and its three-dimensional shape is measured; using the three-dimensional image data representing the measured shape, the coordinate values of a fixed point set in advance on the reference object are calculated for each measurement position, and the coordinate conversion function is calculated from these coordinate values. The coordinate conversion function converts the three-dimensional image data representing the shape of the object at each measurement position into three-dimensional image data in the reference coordinate system (see Patent Document 1).
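The conversion described above is a rigid coordinate transformation: a coordinate rotation followed by a coordinate movement. A minimal Python sketch of how scans taken at different positions are brought into one reference coordinate system; the function name, rotation, translation, and point values are purely illustrative:

```python
import numpy as np

def to_reference(points, R, t):
    """Apply a rigid coordinate transformation: rotate, then translate.

    points : (N, 3) array of XYZ coordinates in the sensor's coordinate system
    R      : (3, 3) rotation matrix (the coordinate rotation function)
    t      : (3,)   translation vector (the coordinate movement function)
    """
    return points @ R.T + t

# Two sample points of one scan, mapped into the reference coordinate
# system with that measurement position's calibrated transform:
scan_a = np.array([[0.0, 0.0, 1.0], [0.1, 0.0, 1.0]])
R_a = np.eye(3)                   # identity rotation, for illustration only
t_a = np.array([1.0, 2.0, 0.0])
merged = to_reference(scan_a, R_a, t_a)
```

Scans from every measurement position, each converted with its own (R, t), then occupy a single coordinate system and can be displayed together from any viewing direction.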
However, the inventors of the present invention found that when the same object is measured at a plurality of different positions using the coordinate conversion function calculated as described above, the three-dimensional image data representing the object in the reference coordinate system takes slightly different values at each measurement position. This is considered to be because measurement errors arise in the three-dimensional image data representing the reference object when its three-dimensional shape is measured, so that the calculated coordinate conversion function contains an error. As a result, the position of the object's three-dimensional shape displayed on the display device shifts slightly depending on the viewing direction, and accurate three-dimensional shape measurement cannot be performed.
The present invention has been made to address the above problem, and its object is to provide a coordinate conversion function correction method capable of calculating the coordinate conversion function with high accuracy, so that accurate three-dimensional shape measurement of the object can be performed.
To achieve the above object, a first invention is a correction method for a coordinate transformation function applied to a three-dimensional image generation system in which a three-dimensional shape measuring apparatus (30) supported by a deformable support mechanism (20) measures the three-dimensional shape of a measurement object; a three-dimensional image processing device (42) constituted by a computer device receives the measurement information of the measurement object from the three-dimensional shape measuring apparatus (S72); generates, from the input measurement information, three-dimensional image data representing the measured three-dimensional shape of the measurement object in the coordinate system of the three-dimensional shape measuring apparatus (S74); converts the generated three-dimensional image data into three-dimensional image data in the coordinate system of the deformable support mechanism using a first coordinate transformation function comprising a coordinate rotation function and a coordinate movement function; and converts the result into three-dimensional image data in a predetermined coordinate system using a second coordinate transformation function that varies with the deformation state of the support mechanism (S76), so that the measurement object can be displayed as viewed from an arbitrary direction. In this correction method for the first coordinate transformation function, with the directions of the coordinate axes of the coordinate system of the three-dimensional shape measuring apparatus matched to the directions of the coordinate axes of the predetermined coordinate system, the support mechanism is deformed so as to move the three-dimensional shape measuring apparatus parallel to a coordinate axis of the predetermined coordinate system, and at at least two positions the three-dimensional shape measuring apparatus measures the three-dimensional shape of a reference object placed in its measurement target space. The three-dimensional image processing device receives the measurement information of the reference object measured at the at least two positions and generates, for each measurement position, three-dimensional image data representing the three-dimensional shape of the reference object in the coordinate system of the three-dimensional shape measuring apparatus (S42). Using the first coordinate transformation function, the second coordinate transformation function, and the generated three-dimensional image data for each measurement position, the device calculates three-dimensional data representing the fixed point of the reference object, measured at the at least two positions, in the predetermined coordinate system (S44, S46, S48, S50). The coordinate rotation function in the first coordinate transformation function is then corrected according to the inclination, relative to the coordinate axes of the predetermined coordinate system, of the straight line defined by the three-dimensional data of the fixed point (S52, S54).
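The slope-based rotation correction can be sketched as follows (a simplified illustration under stated assumptions, not the patent's actual implementation; the function name, axis choices, and numbers are hypothetical). If the measuring apparatus is translated purely along one axis of the predetermined coordinate system, the computed fixed-point coordinates should stay constant; any drift along an orthogonal axis traces a straight line whose slope gives the angular error of the coordinate rotation function:

```python
import numpy as np

def rotation_correction_angle(center_pos1, center_pos2, move_axis=0, err_axis=2):
    """Estimate the angular error in the coordinate rotation function.

    The measuring apparatus is translated parallel to one reference axis
    (move_axis) while the reference object stays fixed. If the first
    coordinate transformation were exact, the computed fixed point would
    not move; a residual drift along an orthogonal axis (err_axis)
    defines a line whose slope is the rotation error.
    """
    d_move = center_pos2[move_axis] - center_pos1[move_axis]
    d_err = center_pos2[err_axis] - center_pos1[err_axis]
    return np.arctan2(d_err, d_move)  # radians; sign gives correction direction

# Hypothetical computed fixed-point coordinates at two measurement positions,
# with a 0.1-degree rotational error injected for illustration:
c1 = np.array([0.0, 0.5, 0.30])
c2 = np.array([100.0, 0.5, 0.30 + 100.0 * np.tan(np.radians(0.1))])
angle = rotation_correction_angle(c1, c2)
```

The coordinate rotation function would then be composed with a rotation of `-angle` about the axis perpendicular to both directions, which is the sense of steps S52 and S54.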
A second invention is likewise applied to a three-dimensional image generation system in which a three-dimensional shape measuring apparatus (30) supported by a deformable support mechanism (20) measures the three-dimensional shape of a measurement object, and in which a three-dimensional image processing device (42) constituted by a computer device receives the measurement information of the measurement object from the three-dimensional shape measuring apparatus (S72), generates, from the input measurement information, three-dimensional image data representing the three-dimensional shape of the measurement object in the coordinate system of the three-dimensional shape measuring apparatus (S74), converts the generated three-dimensional image data into three-dimensional image data in the coordinate system of the deformable support mechanism using a first coordinate transformation function comprising a coordinate rotation function and a coordinate movement function, and converts the result into three-dimensional image data in a predetermined coordinate system using a second coordinate transformation function that varies with the deformation state of the support mechanism (S76), so that the measurement object can be displayed as viewed from an arbitrary direction. In this correction method for the first coordinate transformation function, with the directions of the coordinate axes of the coordinate system of the three-dimensional shape measuring apparatus matched to the directions of the coordinate axes of the predetermined coordinate system, the support mechanism is deformed so as to move the three-dimensional shape measuring apparatus parallel to a coordinate axis of the predetermined coordinate system, and at at least two positions the three-dimensional shape measuring apparatus measures the three-dimensional shape of a reference object placed in its measurement target space. The three-dimensional image processing device receives the measurement information of the reference object measured at the at least two positions and generates, for each measurement position, three-dimensional image data representing the three-dimensional shape of the reference object in the coordinate system of the three-dimensional shape measuring apparatus (S42). Using the first coordinate transformation function, the second coordinate transformation function, and the generated three-dimensional image data for each measurement position, the device calculates three-dimensional data representing the fixed point of the reference object, measured at the at least two positions, in the predetermined coordinate system (S44, S46, S48, S50). The device then calculates the movement amount of the three-dimensional shape measuring apparatus between the measurement positions from the deformation state of the support mechanism, and corrects the coordinate rotation function in the first coordinate transformation function according to the slope of a straight line defined by the calculated movement amount along the coordinate axis direction of the predetermined coordinate system and the amount of change in the coordinate value of the fixed point of the reference object along a coordinate axis direction different from the movement direction of the three-dimensional shape measuring apparatus.
A third invention is likewise applied to a three-dimensional image generation system in which a three-dimensional shape measuring apparatus (30) supported by a deformable support mechanism (20) measures the three-dimensional shape of a measurement object, and in which a three-dimensional image processing device (42) constituted by a computer device receives the measurement information of the measurement object (S72), generates three-dimensional image data representing the three-dimensional shape of the measurement object in the coordinate system of the three-dimensional shape measuring apparatus (S74), converts the generated three-dimensional image data into three-dimensional image data in the coordinate system of the deformable support mechanism using a first coordinate transformation function comprising a coordinate rotation function and a coordinate movement function, and converts the result into three-dimensional image data in a predetermined coordinate system using a second coordinate transformation function that varies with the deformation state of the support mechanism (S76), so that the measurement object can be displayed as viewed from an arbitrary direction. In this correction method for the first coordinate transformation function, with the directions of the coordinate axes of the coordinate system of the three-dimensional shape measuring apparatus matched to the directions of the coordinate axes of the predetermined coordinate system, the support mechanism is deformed so as to move the three-dimensional shape measuring apparatus parallel to the direction of a coordinate axis of the coordinate system of the three-dimensional shape measuring apparatus, and at at least two positions the three-dimensional shape measuring apparatus measures the three-dimensional shape of a reference object placed in its measurement target space. The three-dimensional image processing device receives the measurement information of the reference object measured at the at least two positions and generates, for each measurement position, three-dimensional image data representing the three-dimensional shape of the reference object in the coordinate system of the three-dimensional shape measuring apparatus (S42). The device calculates, from the deformation state of the support mechanism, a coordinate transformation correction function for converting three-dimensional data in the predetermined coordinate system into three-dimensional data in the coordinate system of the moved three-dimensional shape measuring apparatus (S42, S45). Using the first coordinate transformation function, the second coordinate transformation function, and the generated three-dimensional image data for each measurement position, the device calculates three-dimensional data representing the fixed point of the reference object, measured at the at least two positions, in the predetermined coordinate system, and converts the calculated three-dimensional data into three-dimensional data in the coordinate system of the moved three-dimensional shape measuring apparatus using the coordinate transformation correction function (S44, S46, S48, S50). The coordinate rotation function in the first coordinate transformation function is then corrected according to the slope, relative to the coordinate axes of the coordinate system of the moved three-dimensional shape measuring apparatus, of the straight line defined by the converted three-dimensional data of the fixed point (S52, S54).
In this case, for example, the reference object may have the shape of a sphere, prism, pyramid, cylinder, or cone.
According to the first to third inventions, the coordinate rotation function in the first coordinate conversion function is corrected. Therefore, even when the coordinate rotation function of the first coordinate conversion function contains an error, the first coordinate conversion function can be corrected, and an accurate first coordinate conversion function can be calculated. As a result, accurate three-dimensional shape measurement of the object can be performed.
According to a fourth invention, in addition to the configurations of the first to third inventions, a stylus fixed to the support mechanism is brought into contact with a plurality of locations on the reference object, and the three-dimensional image processing apparatus calculates, as three-dimensional data representing a specific fixed point, the three-dimensional data representing a fixed point of the reference object in the predetermined coordinate system, using the deformation state of the support mechanism at each of the locations contacted by the stylus (S12 to S16). The three-dimensional image processing apparatus also calculates three-dimensional data representing the fixed point of the reference object in the coordinate system of the three-dimensional shape measuring apparatus, based on the three-dimensional image data measured by the three-dimensional shape measuring apparatus, and converts this calculated three-dimensional data into three-dimensional data expressed in the predetermined coordinate system, using the first coordinate conversion function whose coordinate rotation function has been corrected and the second coordinate conversion function corresponding to the deformation state of the support mechanism at the time the three-dimensional image data was measured (S56 to S60). The coordinate movement function of the first coordinate conversion function is then corrected according to the difference between the converted three-dimensional data of the fixed point expressed in the predetermined coordinate system and the calculated three-dimensional data of the specific fixed point (S62, S64).
According to this, the coordinate movement function of the first coordinate conversion function can also be corrected with high accuracy, and an accurate first coordinate conversion function can be calculated. As a result, even more accurate three-dimensional shape measurement of the object can be performed.
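The fourth invention's correction amounts to folding the residual between the stylus-derived fixed point (treated as ground truth) and the scan-derived fixed point into the translation term. A minimal sketch, with the function name and all coordinate values hypothetical:

```python
import numpy as np

def corrected_translation(t_old, fixed_point_stylus, fixed_point_scan):
    """Correct the coordinate movement (translation) function.

    fixed_point_stylus : sphere center in the predetermined coordinate system,
                         obtained mechanically via the stylus
    fixed_point_scan   : the same center computed from scan data through the
                         (rotation-corrected) coordinate transformation chain
    The residual between the two is folded into the translation term.
    """
    residual = np.asarray(fixed_point_stylus, float) - np.asarray(fixed_point_scan, float)
    return np.asarray(t_old, float) + residual

t_new = corrected_translation([10.0, 0.0, 5.0],      # old translation term
                              [200.0, 150.0, 80.0],  # stylus measurement
                              [200.4, 149.7, 80.1])  # scan-derived value
```

Here `t_new` absorbs the (-0.4, +0.3, -0.1) offset, so subsequent conversions place the fixed point where the stylus measured it.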
Hereinafter, an embodiment of the present invention will be described with reference to the drawings. FIG. 1 is a schematic diagram showing the basic configuration of a three-dimensional image generation system used in the coordinate transformation function correction method of the present invention.
This three-dimensional image generation system includes a support mechanism 20 that is fixed on a base 10 and whose tip portion can be displaced freely within the measurement target space, and a three-dimensional shape measuring apparatus 30 attached to the tip portion of the support mechanism 20. The support mechanism 20 includes a fixed pole 21, a rotating rod 22, a first arm 23, a second arm 24, and a third arm 25.
The fixed pole 21 is cylindrical and is erected vertically on the base 10, fixed at its lower end. The rotating rod 22 is columnar, is supported at its lower end so as to be rotatable about its axis, and protrudes upward from the fixed pole 21. The first arm 23 is joined, at a connecting portion 23a provided at its base end, to a connecting portion 22a provided at the distal end of the rotating rod 22 so as to be rotatable about an axis perpendicular to the axial direction of the rotating rod 22. The second arm 24 is joined, at a connecting portion 24a provided at its base end, to a connecting portion 23b provided at the distal end of the first arm 23 so as to be rotatable about an axis perpendicular to the axial direction of the first arm 23. The third arm 25 is joined, at a connecting portion 25a provided at its base end, to a connecting portion 24b provided at the distal end of the second arm 24 so as to be rotatable about an axis perpendicular to the axial direction of the second arm 24. The three-dimensional shape measuring apparatus 30 is attached to the distal end portion of the third arm 25 via a fixing member 26 so as to be rotatable about the axis of the third arm 25 and detachable. The fixing member 26 is fixed to the bottom or side surface of the rectangular housing of the three-dimensional shape measuring apparatus 30.
Rotation angle sensors 27a, 27b, 27c, 27d, and 27e are also provided in the support mechanism 20. The rotation angle sensor 27a is incorporated in the fixed pole 21 and detects the rotation angle of the rotating rod 22 about its axis relative to the fixed pole 21. The rotation angle sensor 27b is incorporated in the connecting portion 23a of the first arm 23 and detects the rotation angle of the first arm 23 about one axis at the connecting portion 23a relative to the connecting portion 22a of the rotating rod 22. The rotation angle sensor 27c is incorporated in the connecting portion 24a of the second arm 24 and detects the rotation angle of the second arm 24 about one axis at the connecting portion 24a relative to the connecting portion 23b of the first arm 23. The rotation angle sensor 27d is incorporated in the connecting portion 25a of the third arm 25 and detects the rotation angle of the third arm 25 about one axis at the connecting portion 25a relative to the connecting portion 24b of the second arm 24. The rotation angle sensor 27e is incorporated in the distal end portion of the third arm 25 and detects the rotation angle of the three-dimensional shape measuring apparatus 30 about one axis relative to the tip of the third arm 25.
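The rotation angles from sensors 27a to 27e, together with the stored link lengths, determine the pose of the measuring apparatus tip by forward kinematics. The sketch below composes the chain under an assumed pattern of joint axes and link directions; it illustrates the idea only, not this mechanism's exact geometry, and all names and values are hypothetical:

```python
import numpy as np

def rot_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_x(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def tip_position(angles, lengths):
    """Compose joint rotations and link lengths into the tip position
    in the base (reference) coordinate system.

    angles  : five joint angles from sensors 27a-27e (radians)
    lengths : lengths of the rotating rod and the arms
    Assumed axis pattern: one vertical joint, three perpendicular
    joints, one joint about the final arm's own axis.
    """
    R = np.eye(3)
    p = np.zeros(3)
    axes = [rot_z, rot_x, rot_x, rot_x, rot_z]  # assumed joint axes
    for theta, L, rot in zip(angles, lengths, axes):
        R = R @ rot(theta)                   # accumulate orientation
        p = p + R @ np.array([0.0, 0.0, L])  # each link extends along local Z
    return p

tip = tip_position(np.radians([30.0, 20.0, -15.0, 10.0, 0.0]),
                   [0.5, 0.4, 0.35, 0.3, 0.1])
```

With all joint angles at zero the chain simply stacks the link lengths, which is a handy sanity check on any such model.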
The three-dimensional shape measuring apparatus 30 measures the three-dimensional shape of an object located in front of it and outputs information representing the measured shape. In this embodiment, the three-dimensional shape of the object is measured with laser light according to the triangulation method. Although laser light is used in the present embodiment, other light may be used as long as the surface shape of the three-dimensional object can be measured and the reflectance and color can be identified.
In the three-dimensional shape measuring apparatus 30, a virtual plane substantially perpendicular to the traveling direction of the laser light emitted from the laser light source toward the object is assumed, and this plane is divided into a large number of minute areas along the mutually orthogonal X-axis and Y-axis directions. The three-dimensional shape measuring apparatus 30 sequentially irradiates these minute areas with laser light and, from the light reflected by the object, sequentially detects the distance in the Z-axis direction to the portion of the object surface corresponding to each minute area. In this way, information on the X, Y, and Z coordinates representing each divided area position on the object surface is obtained, and the shape of the object surface facing the three-dimensional shape measuring apparatus 30 is measured.
To this end, the three-dimensional shape measuring apparatus 30 includes an X-axis direction scanner that deflects the emitted laser light in the X-axis direction, a Y-axis direction scanner that deflects the emitted laser light in the Y-axis direction, and a distance detector that receives the laser light reflected from the object surface and detects the distance to the object surface. The X-axis and Y-axis direction scanners may be any mechanism that can change the optical path of the laser beam emitted from the laser light source independently in the X-axis and Y-axis directions; for example, a mechanism that rotates the laser light source itself about axes in the X-axis and Y-axis directions by electric motors, or one that rotates a galvanometer mirror placed in the optical path of the emitted laser light about axes in the X-axis and Y-axis directions by electric motors, can be used. As the distance detector, a mechanism can be used that comprises an imaging lens that condenses the laser light reflected from the object surface while rotating to follow the optical path of the emitted laser light, and a line sensor such as a CCD that receives the condensed laser light, and that detects the distance to the object surface from the position at which the reflected laser light strikes the line sensor.
Accordingly, as the information on the X, Y, and Z coordinates representing the divided area positions on the object surface, such a three-dimensional shape measuring apparatus 30 outputs, for each of the many minute areas divided along the virtual X-axis and Y-axis directions, the inclination θx of the emitted laser light in the X-axis direction relative to the reference direction of the X-axis direction scanner, the inclination θy in the Y-axis direction relative to the reference direction of the Y-axis direction scanner, and the distance Lz to the object surface detected by the distance detector. More specifically, the inclinations θx and θy are the rotation angles of the electric motors from their reference positions, and the distance Lz corresponds to the light receiving position of the reflected laser light on the line sensor. The three-dimensional shape measuring apparatus 30 described above is only an example, and any three-dimensional shape measuring apparatus can be used, including one that uses millimeter waves or ultrasonic waves instead of laser light.
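Each per-area sample (θx, θy, Lz) must be converted to X, Y, Z coordinates in the apparatus's own coordinate system. The patent does not spell out the geometric model, so the following sketch assumes a simple tangent model (X = Lz tan θx, Y = Lz tan θy, Z = Lz); this is only one plausible interpretation, and the function name and values are illustrative:

```python
import numpy as np

def scan_to_xyz(theta_x, theta_y, L_z):
    """Convert one scanner sample (beam tilts and measured distance)
    to XYZ in the camera coordinate system.

    Assumes X = Lz*tan(theta_x), Y = Lz*tan(theta_y), Z = Lz, i.e. the
    tilts deflect the beam from the Z axis; the actual apparatus may
    use a different geometric model.
    """
    return np.array([L_z * np.tan(theta_x), L_z * np.tan(theta_y), L_z])

# One sample; a full scan is a grid of such samples, one per minute area:
point = scan_to_xyz(np.radians(5.0), np.radians(-2.0), 500.0)
```

Applying this conversion to every minute area yields the three-dimensional image data in the camera coordinate system that the later transformation functions operate on.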
A controller 41 and a three-dimensional image processing device 42 are connected to the three-dimensional shape measuring apparatus 30. The controller 41 controls the operation of the three-dimensional shape measuring apparatus 30 in accordance with instructions from an input device 43, which includes a keyboard with a plurality of operators. The controller 41 also controls the operation of the three-dimensional image processing device 42 in accordance with instructions from the input device 43 and supplies data entered through the input device 43 to the three-dimensional image processing device 42.
The three-dimensional image processing device 42 is constituted by a computer device. By executing the programs of the figures described later, it receives the rotation angles detected by the rotation angle sensors 27a to 27e and the information representing the three-dimensional shape from the three-dimensional shape measuring apparatus 30 (specifically, the inclination θx in the X-axis direction, the inclination θy in the Y-axis direction, and the distance Lz to the object surface), and generates three-dimensional image data with which the three-dimensional shape of the object located in the measurement target space can be viewed from an arbitrary direction. A display device 44 is connected to the three-dimensional image processing device 42. The display device 44 includes a liquid crystal display, plasma display, CRT display, or the like, and displays the three-dimensional shape of the object located in the measurement target space based on the three-dimensional image data output from the three-dimensional image processing device 42.
The operation of the three-dimensional image generation system configured as described above will now be described. First, the operator calculates a first coordinate conversion function for converting three-dimensional image data expressed in the coordinate system of the three-dimensional shape measuring apparatus 30 (hereinafter referred to as the camera coordinate system) into three-dimensional image data expressed in the coordinate system of the support mechanism 20 (hereinafter referred to as the arm coordinate system). Here, the camera coordinate system is a three-dimensional coordinate system related to the three-dimensional shape measuring apparatus 30, consisting of three mutually orthogonal coordinate axes (X-axis, Y-axis, Z-axis) and having a specific point of the three-dimensional shape measuring apparatus 30 as its origin. The arm coordinate system is a three-dimensional coordinate system related to the mounting portion of the three-dimensional shape measuring apparatus 30, consisting of three coordinate axes (X-axis, Y-axis, Z-axis) corresponding to those of the camera coordinate system and having as its origin a predetermined position on the tip portion of the third arm 25, which serves as the attachment portion to the support mechanism 20. More precisely, this predetermined position on the tip portion of the third arm 25 is a point that rotates about the axis of the third arm 25.
First, as shown in FIG. 2, the operator fixes the base 51 of a stylus 50, formed with a sharpened tip, to the tip of the third arm 25. Next, the operator places a reference sphere 60 at an appropriate position on the base 10. In this case, the reference sphere 60 is installed via a columnar support portion 61 and is positioned at an appropriate height above the upper surface of the base 10. The reference sphere 60 is formed as a true sphere and functions as a reference object for determining a fixed point in the measurement target space. For the stylus 50, the position of its tip relative to a predetermined position on the base 51 is known to the operator and stored in advance in the memory device of the three-dimensional image processing device 42. The lengths of the fixed pole 21, the rotating rod 22, and the first to third arms 23 to 25 are likewise known to the operator and stored in advance in the memory device of the three-dimensional image processing device 42. Although the reference sphere 60 is placed on the base 10 via the support portion 61 in the present embodiment, it may instead be placed directly on the base 10. Furthermore, the tip of the stylus 50 may be formed as a sphere, in which case the center position of that sphere relative to the predetermined position on the base 51 is recorded.
After these preparations are completed, the operator operates the input device 43 to cause the three-dimensional image processing device 42 to execute the stylus measurement program shown in FIG. The operator then displaces the stylus 50 so that its tip touches four different points on the outer surface of the reference sphere 60, operating the input device 43 at each contact to input that the stylus 50 has touched the reference sphere 60. The three-dimensional image processing device 42 starts the stylus measurement program in step S10 and, each time contact is input in step S12 while the stylus 50 is touching the reference sphere 60, reads in the detected rotation angles from the rotation angle sensors 27a to 27e.
From each set of detected rotation angles, together with the stored lengths of the fixed pole 21, the rotating rod 22, the first to third arms 23 to 25, and the stylus 50, the device calculates the coordinates of the contact point of the stylus 50 on the reference sphere 60 in the reference coordinate system, which is the coordinate system related to the base 10. The reference coordinate system is a three-dimensional coordinate system consisting of three coordinate axes (X-axis, Y-axis, Z-axis) corresponding to those of the camera coordinate system and having as its origin a predetermined point at the portion where the fixed pole 21 is fixed to the base 10. This calculation of reference-coordinate-system coordinates is performed for each contact point, and the four sets of coordinate data corresponding to the four contact points are temporarily stored. After the processing in step S12, the three-dimensional image processing device 42 calculates, in step S14, the coordinates representing the center position of the reference sphere 60 in the reference coordinate system (hereinafter referred to as the center coordinates (x″, y″, z″)). In this calculation, the four sets of coordinate data are substituted for X, Y, and Z in the following Equation 1, which represents the outer surface of a sphere:

(X − a)² + (Y − b)² + (Z − c)² = d²   (Equation 1)

By solving for the values of a, b, and c, the center coordinates (x″, y″, z″) of the reference sphere 60 are obtained. Here, a, b, and c are the X, Y, and Z coordinates of the center of the sphere, and d is the radius of the sphere.
Next, in step S16, the three-dimensional image processing device 42 stores the calculated center coordinates (x″, y″, z″) as the coordinates representing the fixed point of the reference sphere 60, and in step S18 it ends execution of the stylus measurement program. In this embodiment the stylus 50 is brought into contact with the reference sphere 60 at four locations, but accuracy can be improved by making contact at five or more locations. Conversely, when the radius d of the reference sphere 60 is known, the center coordinates of the reference sphere 60 can be obtained even with only three contact points.
Next, as shown in FIG. 1, the operator attaches the three-dimensional shape measuring device 30 to the tip of the third arm 25 so that it can rotate about the axis of the third arm 25, and causes the three-dimensional image processing device 42 to execute the coordinate conversion function calculation program shown in FIG. 4. The operator then operates the input device 43 to instruct the three-dimensional shape measuring device 30, via the controller 41, to start measuring the three-dimensional shape of the reference sphere 60. In response, the three-dimensional shape measuring device 30 begins the measurement. During this measurement, the operator holds the three-dimensional shape measuring device 30 by hand and moves it relative to the reference sphere 60, with rotational displacement occurring at each connecting portion of the rotating rod 22, the first to third arms 23 to 25, and the three-dimensional shape measuring device 30, so that the three-dimensional shape of the reference sphere 60 is measured. The reference sphere 60 is measured from three different measurement positions.
Meanwhile, during the measurement of the three-dimensional shape of the reference sphere 60 by the three-dimensional shape measuring device 30, the three-dimensional image processing device 42 starts execution of the coordinate conversion function calculation program in step S20 and, in step S22, inputs the detected rotation angles from the rotation angle sensors 27a to 27e in the support mechanism 20. After step S22, it waits in step S24 for the measurement by the three-dimensional shape measuring device 30 and inputs information representing the three-dimensional shape of the reference sphere 60, that is, information relating to the XYZ coordinates (specifically, the inclinations θx, θy and the distance Lz) representing the positions of the minute areas into which the surface of the reference sphere 60 is divided. In step S24, the device also calculates, from this input information, three-dimensional image data consisting of a three-dimensional shape data group representing the three-dimensional shape of the reference sphere 60 for each measurement position. This three-dimensional image data is expressed in the coordinate system of the three-dimensional shape measuring device 30, that is, the camera coordinate system. When the three-dimensional image data includes unnecessary data relating to objects other than the reference sphere 60 (for example, the base 10), the three-dimensional image processing device 42 also inputs, together with the XYZ coordinate information from the three-dimensional shape measuring device 30, data on the received light quantity of the line sensor or on the ratio of the received light quantity to the emitted light quantity, and in step S24 executes a process that removes the unnecessary three-dimensional image data based on this received light quantity or ratio data.
Next, in step S26, the three-dimensional image processing device 42 applies the calculated three-dimensional image data for each measurement position to X, Y, and Z of Equation 1 above and, using the least square method, calculates the center coordinates (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) of the reference sphere 60 for each measurement position. The three-dimensional image data of the reference sphere 60 for each measurement position are distinguished using the rotation angles detected by the rotation angle sensors 27a to 27e.
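With many surface points per measurement position, as in step S26, the sphere of Equation 1 is fitted by least squares. Expanding Equation 1 gives X^2 + Y^2 + Z^2 = 2aX + 2bY + 2cZ + k with k = d^2 - a^2 - b^2 - c^2, which is linear in (a, b, c, k). A hedged sketch (function name and sample data are invented):

```python
import numpy as np

def fit_sphere_least_squares(points):
    """Least-squares estimate of the sphere center (a, b, c) and radius d.

    Equation 1 expands to  X^2 + Y^2 + Z^2 = 2aX + 2bY + 2cZ + k,
    with k = d^2 - a^2 - b^2 - c^2, which is linear in (a, b, c, k).
    """
    p = np.asarray(points, dtype=float)
    A = np.column_stack([2.0 * p, np.ones(len(p))])
    rhs = np.sum(p ** 2, axis=1)
    (a, b, c, k), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    d = np.sqrt(k + a * a + b * b + c * c)
    return np.array([a, b, c]), d

# Noise-free points on a unit sphere centred at the origin:
rng = np.random.default_rng(0)
v = rng.normal(size=(100, 3))
pts = v / np.linalg.norm(v, axis=1, keepdims=True)
center, radius = fit_sphere_least_squares(pts)
print(np.round(center, 6), round(radius, 6))  # center close to [0 0 0], radius close to 1
```

Linearizing this way avoids iterative geometric fitting; for the small residuals expected from a machined reference sphere the difference is negligible.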
Next, in step S28, the three-dimensional image processing device 42 uses the input rotation angles detected by the rotation angle sensors 27a to 27e together with the lengths of the fixed pole 21, the rotation rod 22, and the first to third arms 23 to 25 to calculate, for each measurement position, the coordinates of a predetermined position at the tip of the third arm 25, that is, the coordinates of the origin of the arm coordinate system in the reference coordinate system: (xd1, yd1, zd1), (xd2, yd2, zd2), and (xd3, yd3, zd3). Also in step S28, using the same detected rotation angles, the rotation angles (α1, β1, γ1), (α2, β2, γ2), and (α3, β3, γ3) of the arm coordinate system about the X, Y, and Z axes are calculated for each measurement position.
Next, in step S30, the three-dimensional image processing device 42 substitutes into Equations 2 and 3 the center coordinates (x″, y″, z″) of the reference sphere 60 in the reference coordinate system, stored in the memory device by the execution of the stylus measurement program of FIG., the origin coordinates (xd1, yd1, zd1), (xd2, yd2, zd2), (xd3, yd3, zd3) of the arm coordinate system in the reference coordinate system calculated in step S28, and the rotation angles (α1, β1, γ1), (α2, β2, γ2), (α3, β3, γ3) of the coordinate axes X, Y, and Z of the arm coordinate system with respect to the reference coordinate system calculated in step S28, and thereby calculates the center coordinates of the reference sphere 60 in the arm coordinate system: (x′1, y′1, z′1), (x′2, y′2, z′2), and (x′3, y′3, z′3).
Equations 2 and 3 above express the coordinates (x′, y′, z′), in a second coordinate system, of a point whose coordinates in a first coordinate system composed of XYZ coordinates are (x, y, z), where the second coordinate system is obtained by rotating the first coordinate system about its X, Y, and Z axes by θx, θy, and θz, respectively, and moving its origin by a, b, and c in the X-axis, Y-axis, and Z-axis directions, respectively. In the calculation of step S30, the coordinate values x, y, and z in Equation 2 correspond to the X, Y, and Z values of the center coordinates (x″, y″, z″) of the reference sphere 60 in the reference coordinate system, and the coordinate values x′, y′, and z′ correspond to the X, Y, and Z values of its center coordinates (x′, y′, z′) in the arm coordinate system. The values a, b, and c in Equation 2 are the origin coordinates (xd1, yd1, zd1), (xd2, yd2, zd2), (xd3, yd3, zd3) of the arm coordinate system calculated in step S28, and α, β, and γ in Equation 2 are the rotation angles (α1, β1, γ1), (α2, β2, γ2), (α3, β3, γ3) calculated in step S28.
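The transform of Equations 2 and 3 can be sketched in code. Since the patent's equation images are not reproduced here, the axis order of the elementary rotations and the shift-then-rotate convention below are assumptions:

```python
import numpy as np

def rotation_matrix(alpha, beta, gamma):
    """Rotation about X, Y, then Z (one plausible reading of Equation 3;
    the patent's exact axis order is an assumption here)."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    rz = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])
    return rz @ ry @ rx

def reference_to_arm(p, origin, angles):
    """Equation 2 read as: shift by the second frame's origin, then rotate."""
    return rotation_matrix(*angles) @ (np.asarray(p, float) - np.asarray(origin, float))

# Sanity check: with zero rotation the transform is a pure translation.
p_arm = reference_to_arm([1.0, 2.0, 3.0], origin=[1.0, 1.0, 1.0], angles=[0, 0, 0])
print(p_arm)  # -> [0. 1. 2.]
```

In step S30 this would be applied three times, once per measurement position, with the per-position origin coordinates and angles from step S28.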
After the process of step S30, in step S32, the first coordinate conversion function from the camera coordinate system to the arm coordinate system is calculated using the center coordinates (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) of the reference sphere 60 in the camera coordinate system for each measurement position, calculated in step S26, and the center coordinates (x′1, y′1, z′1), (x′2, y′2, z′2), (x′3, y′3, z′3) of the reference sphere 60 in the arm coordinate system, calculated in step S30.
Before describing the calculation of the first coordinate conversion function, this type of coordinate conversion will be briefly explained. Consider a first coordinate system composed of XYZ coordinates and a second coordinate system obtained by rotating the first coordinate system about its X, Y, and Z axes by θx, θy, and θz, respectively, and moving its origin by a, b, and c in the X-axis, Y-axis, and Z-axis directions, respectively. If the coordinates of a point in the first coordinate system are (x, y, z) and the coordinates of the same point in the second coordinate system are (x′, y′, z′), then, as before, the following Equation 4 holds, with the matrix M in Equation 4 given by the following Equation 5.
The calculation of the first coordinate transformation function in step S32 amounts to calculating the matrix values g11, g12, g13, g21, g22, g23, g31, g32, g33 in Equations 4 and 5 and the translation values a, b, and c. In this case, the camera coordinate system of the present embodiment corresponds to the first coordinate system, and the arm coordinate system to the second coordinate system. Accordingly, when the center coordinates (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) of the reference sphere 60 in the camera coordinate system for each measurement position and its center coordinates (x′1, y′1, z′1), (x′2, y′2, z′2), (x′3, y′3, z′3) in the arm coordinate system are applied, the following relationships, Equations 6 to 8, hold.
When Equation 6 is transformed, the following Equation 9 is established.
Here, let (α, β, γ) be the normal vector of the plane containing the fixed point coordinates (x1, y1, z1), (x2, y2, z2), (x3, y3, z3), and let (α′, β′, γ′) be the normal vector of the plane containing the fixed point coordinates (x′1, y′1, z′1), (x′2, y′2, z′2), (x′3, y′3, z′3). If the two normal vectors have the same magnitude, the following Equation 10 holds, where the matrix M in Equation 10 is that of Equation 5.
The normal vector of the plane containing the fixed point coordinates (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) is formed by the outer product of the vector from (x2, y2, z2) toward (x1, y1, z1) and the vector from (x3, y3, z3) toward (x2, y2, z2); the normal vector (α, β, γ) is thus expressed by the following Equation 11.
Similarly, the vector (α′, β′, γ′) is represented by the following Equation 12.
Substituting Equation 11 and Equation 12 into Equation 10 yields Equation 13 below.
If the first equation of Equation 13 is added to Equation 9, the following simultaneous equations of Equation 14 are obtained.
The matrix values g11, g12, and g13 can be calculated by solving the simultaneous equations of Equation 14. Equations 7 and 8 are likewise transformed into simultaneous equations of the form of Equation 9, and by solving the simultaneous equations obtained by adding the second and third equations of Equation 13, respectively, the matrix values g21, g22, g23 and g31, g32, g33 can be calculated. Then, by substituting these calculated matrix values into Equations 6 to 8, the values a, b, and c can be calculated. The result is the first coordinate conversion function, which converts coordinate values (x, y, z) in the camera coordinate system into coordinate values (x′, y′, z′) in the arm coordinate system.
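Steps S26 through S32 can be illustrated with a small numerical sketch. The code below is one reading of Equations 6 to 14 (NumPy, with invented names and data): it builds the normal vectors by the outer products of Equations 11 and 12, solves the three-unknown system of Equation 14 for each row of M, and recovers a, b, c from Equation 6.

```python
import numpy as np

def first_transform_from_three_centers(cam, arm):
    """Recover the rotation part M and translation (a, b, c) of the first
    coordinate conversion function from three sphere-center pairs.

    The two center differences give the rank-2 system of Equation 9; the
    normal vectors of Equations 11 and 12 supply the third equation
    (Equation 13), closing the system of Equation 14 row by row.
    """
    cam = np.asarray(cam, dtype=float)   # centers in the camera coordinate system
    arm = np.asarray(arm, dtype=float)   # the same centers in the arm coordinate system
    n_cam = np.cross(cam[0] - cam[1], cam[1] - cam[2])   # Equation 11
    n_arm = np.cross(arm[0] - arm[1], arm[1] - arm[2])   # Equation 12
    A = np.vstack([cam[0] - cam[1], cam[1] - cam[2], n_cam])
    M = np.empty((3, 3))
    for row in range(3):                 # Equations 9 and 13 per matrix row
        rhs = np.array([arm[0, row] - arm[1, row],
                        arm[1, row] - arm[2, row],
                        n_arm[row]])
        M[row] = np.linalg.solve(A, rhs)
    t = arm[0] - M @ cam[0]              # a, b, c via Equation 6
    return M, t

# Check on synthetic data: a 90-degree rotation about Z plus a shift.
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([1.0, 2.0, 3.0])
cam_pts = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
arm_pts = cam_pts @ Rz.T + t_true
M, t = first_transform_from_three_centers(cam_pts, arm_pts)
print(np.allclose(M, Rz), np.allclose(t, t_true))  # -> True True
```

The system is solvable only when the three measurement positions are not collinear, which is why the reference sphere 60 must be measured from three genuinely different positions.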
Next, in step S34, the calculated matrix values g11, g12, g13, g21, g22, g23, g31, g32, g33, a, b, and c are stored in the memory device of the three-dimensional image processing device 42. Then, in step S36, the three-dimensional image processing device 42 ends execution of the coordinate conversion function calculation program.
Next, the operator corrects the first coordinate conversion function calculated by the coordinate conversion function calculation program. Specifically, the operator operates the input device 43 to instruct the three-dimensional image processing device 42 to correct the first coordinate conversion function. In response to this instruction, the three-dimensional image processing device 42 starts execution of the coordinate transformation function correction program shown in FIG. 5 in step S40 and, in step S42, waits for the input of measurement information representing the three-dimensional shape of the reference sphere 60.
Next, the operator adjusts the orientation of the three-dimensional shape measuring device 30 so that the direction of each coordinate axis of its camera coordinate system substantially matches the direction of the corresponding coordinate axis of the reference coordinate system. Then, holding the three-dimensional shape measuring device 30 by hand and allowing rotational displacement at each connecting portion of the rotating rod 22, the first to third arms 23 to 25, and the three-dimensional shape measuring device 30, the operator moves the three-dimensional shape measuring device 30 relative to the reference sphere 60 and instructs the start of measurement from the input device 43, whereby the three-dimensional shape of the reference sphere 60 is measured. In this case, the operator translates the three-dimensional shape measuring device 30 parallel to two mutually corresponding coordinate axes among the three coordinate axes of the camera coordinate system and the reference coordinate system, measuring the reference sphere 60 at three positions in each movement direction. In the present embodiment, as shown in FIG. 6, the three-dimensional shape measuring device 30 is moved in the Y-axis direction and measured at three positions (positions a, b, and c in the figure), and then moved in the Z-axis direction and measured at three positions (positions d, b, and e in the figure). In this case, if the three-dimensional shape measuring device 30 is moved in a cross shape and the position of the cross-shaped intersection (position b in the figure) is measured only once, with that measured value used for both movement directions, the efficiency of the measurement work can be improved.
The three-dimensional shape measuring device 30 outputs, for each measurement position, information representing the three-dimensional shape of the reference sphere 60 located in the measurement target space to the three-dimensional image processing device 42, that is, information relating to the XYZ coordinates (specifically, the inclinations θx, θy and the distance Lz) representing the positions of the minute areas into which the surface of the reference sphere 60 is divided. In step S42, the three-dimensional image processing device 42, as in step S24, calculates for each measurement position three-dimensional image data composed of a three-dimensional shape data group representing the three-dimensional shape of the reference sphere 60, based on the XYZ coordinate information output from the three-dimensional shape measuring device 30. This three-dimensional image data is expressed in the coordinate system of the three-dimensional shape measuring device 30, that is, the camera coordinate system. When the three-dimensional image data includes unnecessary data relating to objects other than the reference sphere 60 (for example, the base 10), the three-dimensional image processing device 42 also executes, in step S42 as in step S24, a process for removing the unnecessary three-dimensional image data. As a result, three sets of three-dimensional image data representing the three-dimensional shape of the reference sphere 60 are obtained for each of the Y-axis and Z-axis movement directions of the camera coordinate system and the reference coordinate system.
Next, in step S44, the three-dimensional image processing device 42 coordinate-converts the three sets of three-dimensional image data into three-dimensional image data represented in the arm coordinate system. Specifically, using the first coordinate transformation function calculated by the coordinate transformation function calculation program shown in FIG. 4, the three-dimensional image data represented in the camera coordinate system is coordinate-converted into three-dimensional image data represented in the arm coordinate system.
Next, in step S46, the three-dimensional image processing device 42 calculates a second coordinate conversion function for converting coordinate values in the arm coordinate system into coordinate values in the reference coordinate system. The device inputs the rotation angles detected by the rotation angle sensors 27a to 27e provided in the support mechanism 20, and calculates the coordinate transformation function corresponding to the movement of the fixed pole 21, the rotation rod 22, and the first to third arms 23 to 25. Specifically, using the input detected rotation angles and the arm lengths of the fixed pole 21, the rotation rod 22, and the first to third arms 23 to 25, it calculates the coordinates of the origin of the reference coordinate system as seen from the arm coordinate system, that is, the coordinates (xoa, yob, zoc) of the origin of the reference coordinate system in the arm coordinate system, as the matrix values a, b, c of the second coordinate transformation function in Equation 2, and at the same time calculates the rotation angle of each coordinate axis of the reference coordinate system with respect to each coordinate axis of the arm coordinate system as the angles α, β, γ of the second coordinate transformation function in Equation 3 above. This second coordinate conversion function is calculated for each measurement position of the three-dimensional shape measuring device 30. The second coordinate conversion function and the first coordinate conversion function together constitute the coordinate conversion function according to the present invention.
Next, in step S48, the three-dimensional image processing device 42 converts the three sets of three-dimensional image data into three-dimensional image data represented in the reference coordinate system. Specifically, the three-dimensional image data coordinate-converted to the arm coordinate system in step S44 is converted into three-dimensional image data represented in the reference coordinate system using the second coordinate conversion function values α, β, γ, a, b, c calculated in step S46. As a result, the information representing the three-dimensional shape of the reference sphere 60 for each measurement position is coordinate-converted into three-dimensional image data represented in the reference coordinate system. The three-dimensional image processing device 42 then stores the coordinate-converted three-dimensional image data for each measurement position in the memory device, storing with it, in association with the three-dimensional image data for each measurement position, the second coordinate conversion function values α, β, γ, a, b, c for that measurement position.
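The two-stage chain of steps S44 and S48 (camera to arm, then arm to reference) can be sketched as below, assuming each conversion has the Equation 4 form p′ = Mp + t (function names and the numeric example are illustrative, not from the patent):

```python
import numpy as np

def apply_transform(p, M, t):
    """One coordinate conversion in the form of Equation 4: p' = M p + t."""
    return M @ np.asarray(p, float) + np.asarray(t, float)

def camera_to_reference(p_cam, first, second):
    """Chain the first (camera -> arm) and second (arm -> reference)
    coordinate conversion functions, as in steps S44 and S48."""
    p_arm = apply_transform(p_cam, *first)
    return apply_transform(p_arm, *second)

# Identity rotations with pure translations simply add up:
first = (np.eye(3), np.array([1.0, 0.0, 0.0]))
second = (np.eye(3), np.array([0.0, 2.0, 0.0]))
print(camera_to_reference([0.0, 0.0, 0.0], first, second))  # -> [1. 2. 0.]
```

Note that the second function changes with every measurement position, which is why it is stored per position alongside the converted data.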
Next, in step S50, the three-dimensional image processing device 42 calculates, as a fixed point, the center coordinates of the reference sphere 60 for each measurement position. Specifically, the three-dimensional image data (X, Y, Z coordinate values) represented in the reference coordinate system in step S48 are substituted for X, Y, and Z on the left side of Equation 1, the expression representing a sphere, and the unknowns a, b, and c are calculated using the least square method. Here a, b, and c represent the x, y, and z coordinate values of the sphere center represented by the three-dimensional image data, and d represents the radius of the sphere. The three-dimensional image processing device 42 stores the calculated x, y, z coordinate values of the sphere center in the memory device as a fixed point. As a result, the center coordinates (xa″, ya″, za″), (xb″, yb″, zb″), (xc″, yc″, zc″), (xd″, yd″, zd″), and (xe″, ye″, ze″) of the reference sphere 60 in the reference coordinate system, measured from three positions for each movement direction (Y axis, Z axis), are each calculated as three-dimensional data representing a fixed point.
Next, the three-dimensional image processing device 42 calculates a correction function for the first coordinate transformation function. The correction function consists of a coordinate rotation function, which represents a rotation of the coordinate axes to correct their angular inclination, and a coordinate movement function, which represents a linear movement of the coordinate axes to correct the deviation of the coordinate system's origin. In step S52, the three-dimensional image processing device 42 calculates the coordinate rotation function M′ for each coordinate axis. The coordinate rotation function M′ is a function relating to the angle of each coordinate axis in the first coordinate transformation function, that is, a function for correcting the matrix values g11, g12, g13, g21, g22, g23, g31, g32, g33 of the function M shown in Equations 4 and 5, and is represented by the following Equation 15. The process of step S52 thus amounts to calculating θx, θy, and θz in Equation 15. Here θx, θy, and θz represent the rotation angles of the coordinate axes (X axis, Y axis, Z axis); they express the angular deviation between the coordinate axes of the camera coordinate system as reconstructed from the arm coordinate system by the inverse of the first coordinate conversion function (hereinafter, the temporary camera coordinate system) and the coordinate axes of the true camera coordinate system.
The calculation of the rotation angles θx, θy, θz will now be described in detail. The rotation angles θx, θy, θz represent the angular deviations between the coordinate axes x, y, z of the temporary camera coordinate system and the coordinate axes x‴, y‴, z‴ of the true camera coordinate system. Since these angular deviations are small, the deviation of each coordinate axis can be considered on the plane formed by the two coordinate axes orthogonal to it. Specifically, the X-axis angle deviation θx is considered on the Y-Z plane, the Y-axis angle deviation θy on the X-Z plane, and the Z-axis angle deviation θz on the X-Y plane.
If the direction of each coordinate axis of the camera coordinate system is the same as the direction of the corresponding coordinate axis of the reference coordinate system, the rotational parts of the coordinate conversions by the first and second coordinate conversion functions can be ignored, and the orientation of each coordinate axis of the arm coordinate system can also be considered the same as those of the camera coordinate system and the reference coordinate system. In this case, for example, to calculate the Z-axis angle deviation θz, let the coordinate values on the X-Y plane be (x, y) in the camera coordinate system, (x′, y′) in the arm coordinate system, (x″, y″) in the reference coordinate system, and (x‴, y‴) in the temporary camera coordinate system; the coordinate conversions between these coordinate systems are then as shown in the following Equation 16. In Equation 16, (α1, β1) is the deviation between the origin of the arm coordinate system and the origin of the camera coordinate system, (α2, β2) the deviation between the origin of the reference coordinate system and the origin of the arm coordinate system, and (α3, β3) the deviation between the origin of the arm coordinate system and the origin of the temporary camera coordinate system.
In this case, if the coordinates representing the fixed point in the reference coordinate system are (x″0, y″0), the fixed point is expressed in each coordinate system as shown in the following Equation 17.
When the coordinates of the camera coordinate system are converted into those of the arm coordinate system and then into those of the reference coordinate system, the coordinates of the true camera coordinate system are treated as if they were coordinates of the temporary camera coordinate system: they are converted into arm coordinate system coordinates, which are then converted into reference coordinate system coordinates. That is, the second expression of Equation 17 is converted by the third equation of Equation 16 and then by the second equation. Consequently, when the coordinates representing the fixed point are measured in the camera coordinate system, converted into arm coordinate system coordinates, and further converted into reference coordinate system coordinate values, they are expressed as shown in the following Equation 18.
When the camera coordinate system is moved by a in the X-axis direction and by b in the Y-axis direction, the position of the fixed point (x″0, y″0) is expressed in each coordinate system as shown in the following Equation 19.
In this case, when the coordinates (x″0, y″0) representing the fixed point in the reference coordinate system are measured in the camera coordinate system, converted into arm coordinate system coordinates, and then converted into reference coordinate system coordinates, the second expression of Equation 19 is transformed by the third equation of Equation 16 and then by the equation obtained by adding (a, b) to the left side of the second equation of Equation 16; the resulting coordinate value is expressed as shown in the following Equation 20.
The slope Kz of the straight line passing through the fixed point of Equation 18 and the fixed point of Equation 20 is expressed by the following Equation 21, in which the denominator represents the amount of change of the fixed point along the X axis after conversion to the reference coordinate system, and the numerator the amount of change along the Y axis.
In this case, when the movement amount a in the X-axis direction is much larger than the movement amount b in the Y-axis direction among the movement amounts of the camera coordinate system, Equation 21 reduces to the following Equation 22.
When the trigonometric identity 1 − cos θ = 2 sin²(θ/2) is applied to Equation 22, Equation 22 is transformed into the following Equation 23.
Here, since the angle deviation θz of the Zaxis is extremely small, the equation 23 is expressed as the following equation 24.
In Equation 23, if the moving direction is the Yaxis direction and the direction orthogonal to the moving direction is the Xaxis direction, the Zaxis angle deviation θz is expressed as shown in Equation 25 below.
According to Equation 25, the Z-axis angle deviation θz can be calculated by doubling the tilt angle, in the reference coordinate system, of the straight line that passes through the fixed point before the movement and the fixed point after the movement when the camera coordinate system is moved in the X-axis direction orthogonal to the Z axis. Likewise, when the camera coordinate system is moved in the Y-axis direction orthogonal to the Z axis, θz can be calculated by doubling the tilt angle of the straight line passing through the two fixed points in the reference coordinate system. In other words, each coordinate-axis angle deviation θx, θy, θz can be calculated from the inclination, relative to a coordinate axis of the reference coordinate system, of the straight line obtained from the fixed-point coordinates converted into the reference coordinate system when the camera coordinate system is translated parallel to one of the two coordinate axes orthogonal to the axis whose angle deviation is to be calculated. Each of these converted fixed-point coordinates is obtained by converting the fixed-point coordinates measured in the camera coordinate system at each measurement position by means of the first coordinate conversion function and the second coordinate conversion function.
Equation 25 will be described visually with reference to FIG. 7. With the orientation of each coordinate axis of the camera coordinate system (X-Y) matched to that of the reference coordinate system (X″-Y″), the camera coordinate system is translated parallel to its X-axis and Y-axis directions, and the fixed point (x″0, y″0) is measured at each of the measurement positions a, b, c, d, and e. When the measured fixed-point positions are converted into coordinates of the reference coordinate system (X″-Y″), the fixed points fall at the positions a′, b′, c′, d′, and e′. That is, owing to the angular deviation of the coordinate axes of the temporary camera coordinate system from those of the camera coordinate system, the positions of the fixed-point coordinates measured in the camera coordinate system and converted into fixed-point coordinates in the reference coordinate system do not coincide at a single point.
Equation 25 above means that twice the angle of the slope Kz, measured with respect to the Y″ coordinate axis of the reference coordinate system, of the straight line passing through the fixed points a′, b′, and c′ is calculated as the angle deviation θz of the Z axis of the temporary camera coordinate system. Alternatively, twice the angle of the slope of the straight line passing through the fixed points d′, b′, and e′ with respect to the X″ coordinate axis can be calculated as the angle deviation θz. Note that in FIG. 7 the angular deviation of the coordinate axes of the temporary camera coordinate system with respect to those of the reference coordinate system is exaggerated, so the figure does not exactly match Equation 25, which assumes that the deviation is extremely small. In addition, since the direction of each coordinate axis of the camera coordinate system is matched to that of the corresponding coordinate axis of the reference coordinate system, the direction of each coordinate axis of the arm coordinate system (X′-Y′) can also be considered the same as those of the camera coordinate system and the reference coordinate system.
As with the angular deviation θz of the Z axis, the angular deviations θx and θy of the X and Y coordinate axes are expressed by the following Equation 26. The three-dimensional image processing device 42 therefore calculates the angular deviations θx, θy and θz of the coordinate axes of the temporary camera coordinate system using Equation 26 in addition to Equation 25. In Equation 26, Kx and Ky are the slopes of the straight lines used to calculate the angular deviations θx and θy of the X and Y axes, respectively; each is the slope, with respect to a coordinate axis of the reference coordinate system, of the straight line defined by the fixed-point coordinates converted from the camera coordinate system into the reference coordinate system when the camera coordinate system is moved parallel to one of the two coordinate axes orthogonal to the axis in question.
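As a concrete sketch of this relation, assuming (per the description above) that the angular deviation of an axis is twice the inclination angle of the fitted straight line, the calculation of Equations 25 and 26 can be written as follows; the slope values are hypothetical, and `axis_angle_deviation` is an illustrative helper, not a name from the embodiment.

```python
import math

def axis_angle_deviation(slope):
    """Angular deviation of a temporary camera coordinate axis, taken as
    twice the inclination angle of the fitted straight line (the relation
    described for Equations 25 and 26; the exact equation forms are assumed)."""
    return 2.0 * math.atan(slope)

# Hypothetical fitted slopes Kx, Ky, Kz from the converted fixed points.
Kx, Ky, Kz = 0.004, -0.002, 0.003
theta_x, theta_y, theta_z = (axis_angle_deviation(k) for k in (Kx, Ky, Kz))
```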
As can be seen from the above description, the correction function for the first coordinate transformation function is a function for making the coordinate axes of the temporary camera coordinate system coincide with the coordinate axes of the true camera coordinate system.
In step S52, the three-dimensional image processing device 42 calculates the slopes Kx, Ky and Kz of the straight lines using the fixed points (xa″, ya″, za″), (xb″, yb″, zb″), (xc″, yc″, zc″), (xd″, yd″, zd″) and (xe″, ye″, ze″), which are the center coordinates of the reference sphere 60 in the reference coordinate system. For the slope Kx, for example, a straight line is fitted by the least squares method using the three-dimensional data of the fixed points (ya″, za″), (yb″, zb″), (yc″, zc″) or (yb″, zb″), (yd″, zd″), (ye″, ze″), and the slope Kx of that straight line is calculated. In this embodiment, clockwise rotation of each coordinate axis as viewed from the origin is defined as "positive".
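A minimal sketch of the least-squares slope fit used in step S52; the point values are hypothetical, and `least_squares_slope` is an illustrative helper, not a name from the embodiment.

```python
def least_squares_slope(points):
    """Slope of the best-fit straight line through (u, v) points by the
    ordinary least squares method."""
    n = len(points)
    su = sum(u for u, _ in points)
    sv = sum(v for _, v in points)
    suu = sum(u * u for u, _ in points)
    suv = sum(u * v for u, v in points)
    return (n * suv - su * sv) / (n * suu - su * su)

# Hypothetical converted fixed-point coordinates (y", z") at positions a, b, c.
pts = [(0.0, 0.10), (5.0, 0.12), (10.0, 0.14)]
Kx = least_squares_slope(pts)  # slope of the line through a', b', c'
```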
The three-dimensional image processing device 42 substitutes the angular deviations θx, θy and θz of the coordinate axes of the temporary camera coordinate system, calculated using Equation 26, into Equation 15, and calculates the matrix values g′11, g′12, g′13, g′21, g′22, g′23, g′31, g′32 and g′33 of the coordinate rotation function M′. Next, in step S54, the three-dimensional image processing device 42 corrects the first coordinate conversion function using the coordinate rotation function M′. The first coordinate transformation function corrected using this coordinate rotation function M′ is shown in the following Equation 27.
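Since Equation 15 is not reproduced in this excerpt, the following sketch assumes the common composition of rotations about the X, Y and Z axes (Rz·Ry·Rx) to obtain the nine matrix values g′11 to g′33; the actual form in the embodiment may differ.

```python
import math

def rotation_matrix(theta_x, theta_y, theta_z):
    """Coordinate rotation matrix built from the angular deviations of the
    three coordinate axes. Assumes the composition Rz @ Ry @ Rx; Equation 15
    of the embodiment is not reproduced here."""
    cx, sx = math.cos(theta_x), math.sin(theta_x)
    cy, sy = math.cos(theta_y), math.sin(theta_y)
    cz, sz = math.cos(theta_z), math.sin(theta_z)
    rx = [[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]]
    ry = [[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]]
    rz = [[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]]

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3))
                 for j in range(3)] for i in range(3)]

    return matmul(rz, matmul(ry, rx))  # matrix values g'11 .. g'33

M_prime = rotation_matrix(0.0, 0.0, 0.0)  # zero deviations give the identity
```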
When the coordinate transformation by the matrix of values g′11 to g′33 in Equation 27 is applied to the coordinate axes of the camera coordinate system, the directions of the coordinate axes of the temporary camera coordinate system coincide with those of the camera coordinate system, as illustrated. In this case, the positions of the fixed point (x″0, y″0) measured in the camera coordinate system at the respective measurement positions a, b, c, d and e are all converted to the same coordinates (xc0, yc0) in the reference coordinate system. That is, the coordinates obtained by converting the fixed-point coordinates for each measurement position from the camera coordinate system into the reference coordinate system settle on one point regardless of the measurement position.
Next, the three-dimensional image processing device 42 calculates the coordinate movement function, the other of the correction functions for the first coordinate conversion function. First, in step S56, the three-dimensional image processing device 42 calculates the three-dimensional data of the fixed points represented in the camera coordinate system. Specifically, the three-dimensional image processing device 42 converts the fixed points (xa″, ya″, za″), (xb″, yb″, zb″), (xc″, yc″, zc″), (xd″, yd″, zd″) and (xe″, ye″, ze″), which are the center coordinates of the reference sphere 60 in the reference coordinate system stored in the memory device, into coordinate values in the camera coordinate system, using the inverse of the stored second coordinate conversion function and the inverse of the first coordinate conversion function as it was before being corrected by the coordinate rotation function M′.
Next, in step S58, the three-dimensional image processing device 42 calculates the coordinate values of the fixed points represented in the reference coordinate system when coordinate transformation is performed using the corrected first coordinate transformation function and the second coordinate transformation function. Specifically, the three-dimensional image processing device 42 converts the three-dimensional data representing each fixed point in the camera coordinate system obtained in step S56 back into each fixed point in the reference coordinate system, using the first coordinate conversion function corrected by the coordinate rotation function M′ and the second coordinate conversion function. In this case, since the angular deviations θx, θy and θz of the coordinate axes of the temporary camera coordinate system have been corrected by the coordinate rotation function M′, the coordinate values of the fixed points come out almost the same.
Next, in step S60, the three-dimensional image processing device 42 calculates the coordinate value of a single fixed point by averaging the coordinate values of the fixed points represented in the reference coordinate system. Specifically, the coordinate values of the fixed points in the reference coordinate system converted in step S58 are summed for each coordinate axis, and each sum is divided by the number of fixed points to obtain the coordinate value of the single fixed point.
Next, in step S62, the three-dimensional image processing device 42 calculates the coordinate movement function. Specifically, the matrix value (a′, b′, c′) representing the per-axis deviation between the coordinate value of the single fixed point in the reference coordinate system calculated in step S60 and the coordinate value of the single fixed point in the reference coordinate system measured by the stylus measurement program is calculated as the coordinate movement function. Next, in step S64, the three-dimensional image processing device 42 corrects the first coordinate conversion function using the coordinate movement function (a′, b′, c′). The first coordinate transformation function corrected using this coordinate movement function (a′, b′, c′) is shown in the following Equation 28.
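Steps S60 and S62 can be sketched together as follows; the sign convention (stylus-measured value minus the averaged calculated value) is an assumption, and the coordinate values are hypothetical.

```python
def coordinate_movement_function(converted_points, stylus_point):
    """Average the reference-coordinate fixed points per axis (step S60),
    then take the per-axis deviation from the fixed point measured by the
    stylus measurement program (step S62). The sign convention is assumed."""
    n = len(converted_points)
    mean = [sum(p[i] for p in converted_points) / n for i in range(3)]
    return tuple(s - m for m, s in zip(mean, stylus_point))

# Hypothetical nearly coincident converted fixed points and stylus value.
pts = [(10.01, 20.0, 5.0), (9.99, 20.02, 5.02)]
a_b_c = coordinate_movement_function(pts, (10.0, 20.0, 5.0))
```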
In steps S56 to S62, the coordinate movement function (a′, b′, c′) is calculated using the three-dimensional data of all of the fixed points (xa″, ya″, za″), (xb″, yb″, zb″), (xc″, yc″, zc″), (xd″, yd″, zd″) and (xe″, ye″, ze″), which are the center coordinates of the reference sphere 60 in the reference coordinate system stored in the memory device. However, the same coordinate movement function (a′, b′, c′) may instead be calculated using the three-dimensional data of any one to four of these fixed points, in which case the coordinate movement function (a′, b′, c′) can be calculated in a shorter time.
Note that steps S56 to S62 may be omitted, so that the correction of the first coordinate conversion function consists only of the correction by the coordinate rotation function M′. In this case, the coordinate values converted from the camera coordinate system into the reference coordinate system are offset from the true reference-coordinate values by the amount of the coordinate movement function, but since the coordinate movement function is a constant value, this offset is always constant regardless of the position of the camera coordinate system. Therefore, there is no problem in displaying the three-dimensional shape of the measurement object viewed from an arbitrary direction.
Next, the three-dimensional image processing device 42 ends the execution of the coordinate transformation function correction program in step S66. The angular deviation of each coordinate axis and the deviation of the origin of the coordinate system in the first coordinate conversion function are thereby corrected. That is, with the corrected first coordinate transformation function, the coordinate axes of the temporary camera coordinate system coincide with the coordinate axes of the true camera coordinate system.
After execution of the coordinate conversion function correction program, the operator removes the reference sphere 60 from the base 10 and places a measurement object (not shown) on the base 10. The operator then operates the input device 43 to instruct display of the three-dimensional shape of the measurement object. In response, the three-dimensional image processing device 42 starts execution of the three-dimensional shape display program shown in FIG. 9 in step S70 and, in step S72, waits for input of measurement information representing the three-dimensional shape of the measurement object. Meanwhile, the three-dimensional shape measuring apparatus 30, controlled by the controller 41, starts measuring the three-dimensional shape of the measurement object.
The operator holds the three-dimensional shape measuring device 30 by hand and, using the rotational displacement available at each connecting portion of the rotating rod 22, the first to third arms 25 and the three-dimensional shape measuring device 30, moves the three-dimensional shape measuring apparatus 30 relative to the measurement object to measure its three-dimensional shape. In this case, the operator performs the measurement from three different positions with respect to the measurement object. When the measurement of the measurement object is completed, the three-dimensional shape measuring device 30 outputs information representing the three-dimensional shape for each measurement position to the three-dimensional image processing device 42. The measurement of the measurement object is not limited to the three positions described above and may be performed from four or more positions, in which case a more accurate three-dimensional shape can be measured.
In step S72, the three-dimensional image processing device 42 inputs the information representing the three-dimensional shape of the measurement object, that is, the information related to the X-Y-Z coordinates (specifically, the inclinations θx, θy and the distance Lz) representing the positions of the divided areas obtained by dividing the surface of the measurement object into minute areas. Next, in step S74, the three-dimensional image processing device 42, in the same manner as in steps S24 and S42 described above, calculates from the input X-Y-Z coordinate information of the three-dimensional shape measuring apparatus 30 three-dimensional image data composed of a three-dimensional shape data group representing the three-dimensional shape of the measurement object at each measurement position. The three-dimensional image data in this case is represented in the coordinate system of the three-dimensional shape measuring apparatus 30, that is, the camera coordinate system.
Next, in step S76, the three-dimensional image processing device 42 converts the three-dimensional image data represented in the camera coordinate system for each measurement position into three-dimensional image data represented in the reference coordinate system, using the first coordinate conversion function (Equation 28 above) corrected by the execution of the coordinate conversion function correction program shown in FIG. 5 and the second coordinate conversion function. In this case, the second coordinate conversion function is calculated for each measurement position in the same manner as in step S46. In step S78, the three-dimensional image processing device 42 combines the three-dimensional image data represented in the reference coordinate system for the respective measurement positions into one three-dimensional shape data group. In this synthesis, since the three-dimensional image data of all measurement positions are represented by coordinate values in the same reference coordinate system, the portions of the measurement object not measured at each measurement position (the three-dimensional image data representing the outer surface of the measurement object located on the far side relative to the three-dimensional shape measuring apparatus 30 at that position) are complemented by the data of the other positions to form a single data set.
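Per point, the conversion in step S76 amounts to applying the corrected rotation followed by the coordinate movement. A simplified sketch, with hypothetical data; the actual Equation 28 also chains the per-position second coordinate conversion function, which is omitted here.

```python
def to_reference_coords(points, g, abc):
    """Apply the corrected first coordinate conversion function to
    camera-coordinate points: rotation by matrix g (values g'11..g'33)
    followed by translation by the coordinate movement (a', b', c')."""
    return [tuple(sum(g[i][j] * p[j] for j in range(3)) + abc[i]
                  for i in range(3)) for p in points]

# Hypothetical data: identity rotation, small coordinate movement.
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
ref = to_reference_coords([(1.0, 2.0, 3.0)], identity, (0.1, -0.2, 0.0))
```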
Next, in step S80, the three-dimensional image processing device 42 causes the display device 44 to display the three-dimensional shape of the measurement object using the synthesized three-dimensional shape data group. The three-dimensional image processing device 42 then ends the execution of the three-dimensional shape display program in step S82. While the three-dimensional shape of the measurement object is displayed, the operator can instruct the display direction of the measurement object by operating the input device 43, whereupon the controller 41 and the three-dimensional image processing device 42 change the display direction of the measurement object displayed on the display device 44. The three-dimensional shape of the measurement object viewed from an arbitrary direction can thereby be displayed.
Further, when a new measurement object is placed on the base 10 and display of the measurement object is instructed as described above, a three-dimensional shape of the new measurement object viewed from an arbitrary direction can be displayed on the display device 44 using the same first and second coordinate conversion functions as described above. Accordingly, once the first coordinate transformation function and the second coordinate transformation function for the measurement target space on the base 10 have been calculated using the reference sphere 60 and the first coordinate transformation function has been corrected by the coordinate transformation function correction program shown in FIG. 5, the three-dimensional shapes of one measurement object after another can be displayed on the display device 44.
As can be understood from the above description of operation, according to the above embodiment, the first coordinate conversion function calculated based on the three-dimensional image data of the reference sphere 60 is corrected using the coordinate rotation function M′ relating to the angular deviation of each coordinate axis and the coordinate movement function (a′, b′, c′) relating to the deviation of the origin of the coordinate system. The coordinate rotation function M′ is calculated based on the slope, with respect to a coordinate axis of the reference coordinate system, of the straight line passing through the coordinate values, on the two coordinate axes other than the axis whose angular deviation is being calculated, of three fixed points measured by moving the three-dimensional shape measuring apparatus 30 parallel to one of the two coordinate axes orthogonal to that axis (the corresponding axes of the camera coordinate system and the reference coordinate system) and converted into the reference coordinate system. The differences among the coordinate values of the three fixed points in the reference coordinate system are due to the angular deviation of each coordinate axis of the first coordinate conversion function. Therefore, the slope of the straight line in the reference coordinate system, that is, the slope of the straight line calculated based on the fixed-point coordinate values converted by the first coordinate conversion function, represents the angular deviation of each coordinate axis of the first coordinate conversion function.
Further, the coordinate movement function (a′, b′, c′) is calculated based on the difference between the coordinate values of the fixed points converted by the second coordinate transformation function and the first coordinate transformation function corrected by the coordinate rotation function M′, and the coordinate values of those same fixed points in the reference coordinate system. In this case, the fixed points for the respective measurement positions converted by the corrected first coordinate conversion function and the second coordinate conversion function settle on the same point in the reference coordinate system. Therefore, according to the above embodiment, even if measurement errors are included in the three-dimensional image data used to calculate the first coordinate transformation function, the first coordinate conversion function is corrected based on the angle, with respect to the coordinate axes of the reference coordinate system, of the straight line defined by the coordinate values converted by the first and second coordinate transformation functions, so that a highly accurate coordinate conversion function can be calculated. As a result, accurate three-dimensional shape measurement of the measurement object can be performed.
Although one embodiment of the present invention has been described above, the present invention is not limited to this embodiment, and various modifications can be made without departing from the object of the present invention. Modified examples are described below.
a. First Modification In the above embodiment, the coordinate rotation function M′ for correcting the first coordinate conversion function is calculated using the fixed-point coordinates in the reference coordinate system for each measurement position, but the present invention is not limited to this. For example, the angular deviations θx, θy and θz of the coordinate axes can also be calculated using, in addition to the fixed-point coordinates in the reference coordinate system for each measurement position, the distances between the measurement positions, that is, the amounts of movement of the three-dimensional shape measuring apparatus 30 (camera coordinate system) between the measurement positions.
Specifically, the slope Kz of the straight line defined by the amount of change in the Y-axis coordinate value of the fixed-point coordinates in the reference coordinate system (the numerator of Equation 21) when the three-dimensional shape measuring device 30 (camera coordinate system) is moved in each of the two coordinate axis directions, the X-axis direction and the Y-axis direction, and by the movement amount a of the three-dimensional shape measuring device 30 (camera coordinate system) in the X-axis direction in the reference coordinate system, is expressed as shown in the following Equation 29. In Equation 29, a and b represent the movement amounts of the three-dimensional shape measuring apparatus 30 (camera coordinate system) in the X-axis and Y-axis directions in the reference coordinate system.
In this case, of the movement amounts a and b of the three-dimensional shape measuring apparatus 30 (camera coordinate system), the movement amount a in the X-axis direction is much larger than the movement amount b in the Y-axis direction, and the angular deviation θz of the Z axis is extremely small, so Equation 29 can be simplified as shown in the following Equation 30.
In this case, the angular deviation θz of the Z axis shown in Equation 30 is the inclination angle of the straight line defined by the movement amount a of the three-dimensional shape measuring apparatus 30 (camera coordinate system) in the X-axis direction and the amount of change in the Y-axis coordinate value of the fixed-point coordinates converted into the reference coordinate system. The angular deviation θz of the Z axis can likewise be calculated from the inclination angle of the straight line defined by the movement amount b of the camera coordinate system in the Y-axis direction, when the camera coordinate system is moved in the Y-axis direction orthogonal to the Z axis, and the amount of change in the X-axis coordinate value of the fixed-point coordinates converted into the reference coordinate system. That is, the angular deviation of each coordinate axis can be calculated from the inclination angle of the straight line defined by the movement amount of the three-dimensional shape measuring device 30 (camera coordinate system) in the direction of one of the two coordinate axes orthogonal to the axis whose angular deviation is being calculated (the corresponding axes of the camera coordinate system and the reference coordinate system) and the amount of change in the coordinate value of the fixed-point coordinates converted into the reference coordinate system along the other coordinate axis.
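A minimal sketch of this first-modification calculation, reading Equation 30 as the inclination angle of the line defined by the movement amount and the coordinate change; the numeric values are hypothetical.

```python
import math

def angle_deviation_from_motion(move_amount, coord_change):
    """Axis angular deviation as the inclination angle of the straight line
    defined by the movement amount of the camera coordinate system along one
    axis and the change of the converted fixed-point coordinate along the
    other axis (the relation described for Equation 30)."""
    return math.atan2(coord_change, move_amount)

# Hypothetical values: moved a = 100 mm along X; the converted fixed point
# drifted 0.3 mm along Y.
theta_z = angle_deviation_from_motion(100.0, 0.3)
```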
As with the angular deviation θz of the Z axis, the angular deviations θx and θy of the X and Y coordinate axes are expressed by the following Equation 31. The three-dimensional image processing device 42 can therefore calculate the angular deviations θx, θy and θz of the coordinate axes of the temporary camera coordinate system using Equation 31 in addition to Equation 30. In Equation 31, Kx and Ky are the slopes of the straight lines, corresponding to Kz, for calculating the angular deviations θx and θy of the X and Y axes, respectively; each is the slope of the straight line defined by the movement amount of the three-dimensional shape measuring device 30 (camera coordinate system) in the direction of one of the two coordinate axes orthogonal to the axis in question and the amount of change in the coordinate value of the fixed-point coordinates converted into the reference coordinate system along the other coordinate axis.
These slopes Kx, Ky and Kz are calculated from the fixed points (xa″, ya″, za″), (xb″, yb″, zb″), (xc″, yc″, zc″), (xd″, yd″, zd″) and (xe″, ye″, ze″), which are the center coordinates of the reference sphere 60 in the reference coordinate system, and from the movement amounts of the three-dimensional shape measuring apparatus 30 between the measurement positions. In this case, in the measurement process for the reference sphere in step S42 of the coordinate transformation function correction program shown in FIG. 5, the origin position of the arm coordinate system represented in the reference coordinate system is stored in the memory device for each measurement position of the reference sphere 60 by the three-dimensional shape measuring apparatus 30, and the movement amounts of the three-dimensional shape measuring apparatus 30 can be calculated based on the stored origin positions of the arm coordinate system for the respective measurement positions.
For example, the slope Kx is calculated using the coordinate values yd″ and ye″ and the movement amount bz in the Z-axis direction, or using the coordinate values za″ and zc″ and the movement amount ay in the Y-axis direction. Here, the movement amount bz in the Z-axis direction is the distance between position d and position e shown in FIG. 6, and the movement amount ay in the Y-axis direction is the distance between position a and position c shown in FIG. 6. The slope Kx of the straight line can thereby be calculated, and the angular deviation θx of the coordinate axis can be calculated using the slope Kx. The slopes Ky and Kz can be calculated in the same manner as the slope Kx, and the angular deviations θy and θz of the coordinate axes can be calculated using the slopes Ky and Kz. As a result, the coordinate rotation function M′ can be calculated as in the above embodiment, and the same effect as in the above embodiment can be expected.
b. Second Modification In the above embodiment, after the direction of each coordinate axis of the camera coordinate system is matched with that of the reference coordinate system, the reference sphere 60 is measured while the three-dimensional shape measuring apparatus 30 is moved parallel to two coordinate axes of both coordinate systems. However, the measurement may instead be performed by moving the apparatus in two mutually orthogonal directions within the plane of those two coordinate axes. That is, the reference sphere 60 may also be measured by moving the three-dimensional shape measuring device 30 in two directions that are shifted by the same angle with respect to two coordinate axes of both the camera coordinate system and the reference coordinate system.
In this case, the slopes Kx, Ky and Kz of the straight lines in the reference coordinate system expressed by Equations 25 and 26 are corrected by the slopes Mx, My and Mz of the straight lines in the reference coordinate system corresponding to the movement directions of the three-dimensional shape measuring apparatus 30 (camera coordinate system). Taking the slope Kz of Equation 25 as a concrete example, the slope Mz of the straight line corresponding to the movement direction of the camera coordinate system is calculated as the ratio of the movement amount a of the camera coordinate system in the X-axis direction to the movement amount b in the Y-axis direction (Mz = a/b). Correcting the amount of change on the Y axis and the amount of change on the X axis shown in Equation 21 using this slope Mz gives the following Equation 32.
Correcting the slopes Kx and Ky in the same manner as the slope Kz gives the following Equation 33.
These slopes Kx, Ky and Kz are calculated from the fixed points (xa″, ya″, za″), (xb″, yb″, zb″), (xc″, yc″, zc″), (xd″, yd″, zd″) and (xe″, ye″, ze″), which are the center coordinates of the reference sphere 60 in the reference coordinate system, and from the slopes Mx, My and Mz of the straight lines in the reference coordinate system corresponding to the movement directions of the camera coordinate system. In this case, in the measurement process for the reference sphere in step S42 of the coordinate transformation function correction program shown in FIG. 5, the origin position of the arm coordinate system represented in the reference coordinate system is stored in the memory device for each measurement position of the reference sphere 60 by the three-dimensional shape measuring apparatus 30, and the slopes Mx, My and Mz can be calculated based on the stored origin positions of the arm coordinate system for the respective measurement positions.
For example, the slope Kx is calculated by the least squares method using the fixed points (ya″, za″), (yb″, zb″) and (yc″, zc″), and the slope Mx is calculated as the ratio of the movement amount ay of the camera coordinate system in the Y-axis direction to the movement amount bz in the Z-axis direction (Mx = ay/bz). Here, the movement amount ay in the Y-axis direction is the distance between position a and position c shown in FIG. 6, and the movement amount bz in the Z-axis direction is the distance between position d and position e shown in FIG. 6. The slope Kx of the straight line can thereby be calculated, and the angular deviation θx of the coordinate axis can be calculated using the slope Kx. The slopes Ky and Kz can be calculated in the same manner as the slope Kx, and the angular deviations θy and θz of the coordinate axes can be calculated using the slopes Ky and Kz. As a result, the coordinate rotation function M′ can be calculated as in the above embodiment, and the same effect as in the above embodiment can be expected.
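The ratio calculation for the movement-direction slope described above can be sketched as follows; the displacement values are hypothetical, and only the ratio (Mx = ay/bz) stated in the text is shown, since Equations 32 and 33 themselves are not reproduced in this excerpt.

```python
def motion_direction_slope(move_numerator, move_denominator):
    """Slope of the straight line in the reference coordinate system
    corresponding to the movement direction of the camera coordinate system,
    e.g. Mx = ay / bz with ay the Y-direction movement (positions a to c)
    and bz the Z-direction movement (positions d to e)."""
    return move_numerator / move_denominator

# Hypothetical movement amounts read from the stored arm-coordinate origins.
ay, bz = 120.0, 80.0
Mx = motion_direction_slope(ay, bz)
```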
Further, using the origin positions of the arm coordinate system represented in the reference coordinate system, stored in the memory device for each measurement position of the reference sphere 60 by the three-dimensional shape measuring apparatus 30, the inclination angles θx, θy and θz of the coordinate axes can also be calculated by a calculation formula other than Equations 32 and 33. Specifically, with the measurement position b shown in FIG. 6 as the origin (0, 0), the other measurement positions a, c, d and e are represented using the origin positions of the arm coordinate system: for example, the measurement position of a is (a1, b1), that of c is (a3, b3), that of d is (a4, b4), and that of e is (a5, b5). Substituting these values into Equation 29 and expressing the result for each of the slopes Kx, Ky and Kz gives the following Equation 34. In Equation 34, the slope Kx is calculated from (a1, z1), (0, z2) and (a3, z3) by the least squares method, the slope Ky from (x4, b4), (x2, 0) and (x5, b5) by the least squares method, and the slope Kz from (a1, x1), (0, x2) and (a3, x3) by the least squares method. Using Equation 34 as well, the inclination angles θx, θy and θz of the coordinate axes can be calculated in the same manner as described above, and the coordinate rotation function M′ can be calculated.
c. Third Modification In the above embodiment, after the direction of each coordinate axis of the camera coordinate system is matched with that of the reference coordinate system, the reference sphere 60 is measured while the three-dimensional shape measuring apparatus 30 is moved parallel to two coordinate axes of both coordinate systems. However, if a fixed point can be calculated for each measurement position when the camera coordinate system is moved parallel to each of its own coordinate axes, the first coordinate conversion function can be corrected without matching the direction of each coordinate axis of the camera coordinate system with the direction of the coordinate axes of the reference coordinate system.
Specifically, after calculating the first coordinate conversion function by executing the stylus measurement program shown in FIG. 3 and the coordinate conversion function calculation program shown in FIG. 4, the operator measures the three-dimensional shape of the reference sphere 60 in a state in which the directions of the coordinate axes of the camera coordinate system do not coincide with those of the reference coordinate system. When measuring the three-dimensional shape of the reference sphere 60, the operator causes the three-dimensional image processing device 42 to execute the coordinate conversion function correction program shown in FIG. 5. The measurement of the three-dimensional shape of the reference sphere 60 is performed by moving the three-dimensional shape measuring apparatus 30 parallel to each coordinate axis of the camera coordinate system while the directions of the coordinate axes of the camera coordinate system do not coincide with those of the reference coordinate system. The other measurement operations are the same as in the above embodiment. Each process in the coordinate transformation function correction program is also the same as in the above embodiment except for steps S42, S45 and S50, so only the processing that differs from the above embodiment is described below.
In step S42, the three-dimensional image processing device 42 stores the three-dimensional image data representing the three-dimensional shape of the reference sphere 60 for each measurement position in the memory device, as in the above embodiment. In this case, the origin position of the arm coordinate system, expressed in the reference coordinate system, is also stored in the memory device for each measurement position of the reference sphere 60. The three-dimensional image processing device 42 then calculates a coordinate transformation correction function in step S45, indicated by a broken line in FIG. This coordinate transformation correction function corrects the coordinate values obtained by applying the first and second coordinate transformation functions while the camera coordinate axes are not aligned with the reference coordinate axes, according to the angular deviation between each coordinate axis of the camera coordinate system and each coordinate axis of the reference coordinate system. The calculation consists of the following substeps 1 to 3.
Substep 1: The three-dimensional image processing device 42 calculates two vectors corresponding to the two movement directions of the three-dimensional shape measuring apparatus 30. Specifically, using the origins of the arm coordinate system stored for each measurement position, it calculates, for each movement direction, the straight line passing through the origins as a vector. These two vectors are parallel to the two coordinate axes of the camera coordinate system that correspond to the two movement directions.
Substep 2: The three-dimensional image processing device 42 normalizes the two vectors calculated in substep 1 and computes the outer product of the resulting unit vectors, yielding one vector orthogonal to both. The two unit vectors together with this outer-product vector form three mutually orthogonal unit vectors, each parallel to one coordinate axis of the camera coordinate system.
Substep 3: The three-dimensional image processing device 42 calculates a conversion function that maps the three unit vectors obtained in substep 2 onto the basis vectors (1, 0, 0), (0, 1, 0), and (0, 0, 1), and stores it in the memory device as the coordinate transformation correction function. The conversion function is derived from the rotation angles needed to rotate each of the three unit vectors onto the corresponding basis vector; it therefore corresponds to a coordinate conversion function that aligns the coordinate axes of the camera coordinate system with those of the reference coordinate system.
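As a minimal sketch (not the patented implementation), substeps 1 to 3 can be expressed in pure Python. The helper names, the assumption that the two movement directions are roughly perpendicular, and the row-matrix representation of the rotation are all illustrative:

```python
import math

def unit(v):
    # normalize a 3-vector (substep 2 works with unit vectors)
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def cross(a, b):
    # outer (cross) product of two 3-vectors
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def correction_rotation(dir_a, dir_b):
    """Substeps 1-3: from two movement-direction vectors (parallel to two
    camera coordinate axes), build three mutually orthogonal unit vectors
    and return the rotation mapping them onto the basis vectors
    (1, 0, 0), (0, 1, 0), (0, 0, 1)."""
    u1 = unit(dir_a)                    # first camera axis
    u3 = unit(cross(u1, unit(dir_b)))   # axis orthogonal to both (substep 2)
    u2 = cross(u3, u1)                  # re-orthogonalized second axis
    # With the three unit vectors as matrix rows, applying the matrix
    # sends u1 to (1, 0, 0), u2 to (0, 1, 0), u3 to (0, 0, 1).
    return [u1, u2, u3]

def apply_rotation(r, p):
    # apply the correction rotation to a coordinate value (cf. step S50)
    return [sum(r[i][j] * p[j] for j in range(3)) for i in range(3)]
```

For example, if the measuring head was moved along the directions (0, 2, 0) and (0, 0, 3), `correction_rotation` returns the rotation that carries those two camera axes onto the first and second basis vectors.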
Next, the processing of step S50 in the coordinate conversion function correction program is changed as follows. In step S50, the three-dimensional image processing device 42 corrects each coordinate value calculated as the sphere center of the reference sphere 60, using the coordinate transformation correction function calculated in substeps 1 to 3 of step S45. As a result, the coordinate values transformed by the first coordinate transformation function are corrected according to the angular deviation between each coordinate axis of the camera coordinate system and each coordinate axis of the reference coordinate system; the corrected coordinate values are therefore equivalent to the coordinate values in the reference coordinate system calculated in step S50 of the embodiment. The three-dimensional image processing device 42 then stores the corrected coordinate values in the memory device as fixed points. The subsequent processing is the same as in the above embodiment. According to the third modification configured as described above, it is unnecessary to align the coordinate axes of the camera coordinate system with those of the reference coordinate system, so the measurement work on the reference sphere 60 can be performed efficiently.
d. Other Modifications In the embodiment and the first to third modifications, the three-dimensional shape of the reference sphere 60 is measured at three different positions in order to calculate the first coordinate conversion function, but the present invention is not limited to this as long as fixed points of the reference sphere 60 at three different positions can be calculated. For example, the measurement work on the reference sphere 60 in the coordinate transformation function calculation program shown in FIG. 4 and the measurement work on the reference sphere 60 in the coordinate transformation function correction program shown in FIG. may be combined into a single measurement operation.
Specifically, after executing the stylus measurement program, the operator assembles the three-dimensional shape measuring apparatus 30 at the tip of the third arm 25 of the support mechanism 20 so that it can rotate about the axis of the third arm 25, and then measures the three-dimensional shape of the reference sphere 60. For this measurement, the operator causes the three-dimensional image processing device 42 to execute the coordinate conversion function calculation program shown in FIG. 4. The measurement is performed in the same manner as the measurement operation in the coordinate conversion function correction program: after the directions of the coordinate axes of the camera coordinate system are matched with those of the reference coordinate system, the three-dimensional shape measuring apparatus 30 is moved parallel to two of the coordinate axes, and measurements are taken at five different positions. In the calculation of the first coordinate conversion function in step S32, the three-dimensional image processing device 42 uses three of the five fixed points obtained for the measurement positions, choosing three fixed points that do not all lie on the same straight line. The first coordinate conversion function is thereby calculated.
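The requirement that the three chosen fixed points not lie on one straight line can be checked mechanically. The following sketch (helper names and the tolerance are illustrative assumptions, not part of the patent) selects the first admissible triple from the five fixed points:

```python
import math
from itertools import combinations

def collinear(p1, p2, p3, tol=1e-9):
    # three points are collinear iff the cross product of the two
    # vectors spanning them is (near) zero
    a = [p2[i] - p1[i] for i in range(3)]
    b = [p3[i] - p1[i] for i in range(3)]
    cx = [a[1] * b[2] - a[2] * b[1],
          a[2] * b[0] - a[0] * b[2],
          a[0] * b[1] - a[1] * b[0]]
    return math.sqrt(sum(c * c for c in cx)) < tol

def pick_fixed_points(points):
    """Return the first triple of fixed points not on one line."""
    for tri in combinations(points, 3):
        if not collinear(*tri):
            return tri
    raise ValueError("all fixed points are collinear")
```

With the five sphere centers measured along two movement directions, at least one triple spans a plane, so the search always succeeds for valid measurement data.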
Next, the operator causes the three-dimensional image processing device 42 to execute the coordinate conversion function correction program shown in FIG. In this case, the program corrects the first coordinate transformation function by executing each step using the three-dimensional image data of the reference sphere 60 already measured in the coordinate transformation function calculation program; that is, the measurement of the reference sphere in step S42 of the coordinate conversion function program is skipped. According to this modification, the center coordinates of the reference sphere 60 for each measurement position are calculated as fixed points from the three-dimensional image data obtained in a single measurement operation, and these fixed points are used both to calculate the first coordinate transformation function and to correct it. The number of measurement operations on the reference sphere 60 is therefore reduced, and the calculation and correction of the first coordinate transformation function can be performed efficiently.
In the embodiment and the first to third modifications, to calculate the plurality of fixed points expressed in the reference coordinate system, the three-dimensional image data of the reference sphere 60 measured at each position in the camera coordinate system is first converted into three-dimensional image data in the reference coordinate system, and the fixed point is then calculated from the converted data. However, the present invention is not limited to this. For example, the fixed point may first be calculated from the three-dimensional image data representing the reference sphere 60 measured at each measurement position, and the calculated fixed point for each measurement position may then be coordinate-converted into the reference coordinate system. This also yields the same effect as the above embodiment.
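The equivalence claimed here (convert the image data first, or compute the fixed point first) holds because the fixed point, the sphere center, is an affine combination of the data points, and a rigid coordinate transformation commutes with such combinations. A small illustrative check, using an arbitrary rotation and translation and a centroid as a stand-in for the sphere-center computation:

```python
def rigid(p, r, t):
    # rigid transform: rotation (rows of r) followed by translation t
    return [sum(r[i][j] * p[j] for j in range(3)) + t[i] for i in range(3)]

def centroid(pts):
    # stand-in for the fixed-point (sphere center) calculation
    n = len(pts)
    return [sum(p[i] for p in pts) / n for i in range(3)]

# 90-degree rotation about Z plus a translation (arbitrary example)
R = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]
T = [5.0, -2.0, 1.0]

pts = [[1.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 3.0]]
a = centroid([rigid(p, R, T) for p in pts])  # convert data, then fixed point
b = rigid(centroid(pts), R, T)               # fixed point, then convert
```

Both orders produce the same reference-coordinate fixed point, up to floating-point rounding.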
In the embodiment and the first to third modifications, the reference sphere 60 is measured at three positions for each movement direction of the three-dimensional shape measuring apparatus 30, and the straight line used to calculate the angular deviation of each coordinate axis is defined using the three fixed points corresponding to the measurement positions. However, the present invention is not limited to this as long as the straight line can be defined; at least two fixed points suffice. Conversely, four or more fixed points may be obtained in each movement direction, in which case the straight line can be defined more accurately and a more accurate coordinate rotation function M′ can be calculated.
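With more than two fixed points per movement direction, the line's inclination can be estimated by least squares rather than from two points alone. A sketch under the assumption (made here for illustration only) that the head moved along the Y axis and the angular deviation appears as a linear drift of the fixed points in Z:

```python
import math

def axis_deviation(ys, zs):
    """Least-squares slope of z against y for the fixed points of one
    movement direction; the arctangent of the slope is the angular
    deviation of the defined straight line from the Y coordinate axis."""
    n = len(ys)
    my, mz = sum(ys) / n, sum(zs) / n
    num = sum((y - my) * (z - mz) for y, z in zip(ys, zs))
    den = sum((y - my) ** 2 for y in ys)
    return math.atan2(num, den)  # atan of slope, robust for den -> 0
```

Adding more measurement positions increases `den` and averages out noise in `num`, which is why four or more fixed points define the line, and hence the deviation angle, more accurately.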
In the embodiment and the first to third modifications, the reference sphere 60 is measured by moving the three-dimensional shape measuring apparatus 30 in the Y-axis and Z-axis directions, but the present invention is not limited to this as long as the three angular deviations θx, θy, and θz corresponding to the three coordinate axes can be calculated. That is, either the Y-axis direction or the Z-axis direction may be replaced with the X-axis direction. This also yields the same effect as the above embodiment.
In the embodiment and the first to third modifications, the reference sphere 60 is measured by moving the three-dimensional shape measuring apparatus 30 in the two coordinate axis directions Y and Z, but the present invention is not limited to this as long as the angular deviations θx, θy, and θz can be calculated for each of the three coordinate axes. For example, the reference sphere 60 may also be measured by moving the apparatus 30 in the X-axis direction in addition to the Y-axis and Z-axis directions. In that case, two estimates are obtained for each of the angular deviations θx, θy, and θz, and the two estimates may be averaged to obtain the final values. A more accurate coordinate rotation function M′, and hence a more accurate first coordinate conversion function, can thereby be calculated.
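The averaging of the duplicate estimates is a simple per-axis mean; a minimal sketch, assuming two estimate triples (θx, θy, θz) obtained from different pairs of movement directions:

```python
def average_deviations(est_a, est_b):
    """Combine two independent (theta_x, theta_y, theta_z) estimates,
    e.g. from the Y/Z and X/Z movement pairs, by per-axis averaging."""
    return tuple((a + b) / 2.0 for a, b in zip(est_a, est_b))
```

More movement directions could be combined the same way, with each additional estimate further reducing the noise in the corrected coordinate rotation function.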
In the embodiment and the first to third modifications, a spherical body is used as the reference sphere 60, but any reference object may be used as long as information such as a fixed point can be specified by contact of the stylus 50, the outer shape of the object can be measured by the three-dimensional shape measuring apparatus 30, and the coordinate value of the fixed point defined by the object can be specified; the diameter and the number of spheres are likewise not limited. An object other than a sphere, such as a cube, a rectangular parallelepiped, a pyramid, a cylinder, or a cone, can also be used as the reference object. This also yields the same effect as the above embodiment.
In the embodiment and the first to third modifications, one reference sphere 60 provided on the base 10 is used to calculate the first and second coordinate transformation functions, but the invention is not limited to this. For example, three reference spheres may be arranged on the base 10 and measured by the stylus 50 and the three-dimensional shape measuring apparatus 30, and the first and second coordinate conversion functions may be calculated using their center coordinates as fixed points. This also yields the same effect as the above embodiment.
In the embodiment and the first to third modifications, the reference sphere 60 is arranged on the base 10 in order to specify the reference point in the reference coordinate system, and the measurement is performed using the stylus 50 attached to the tip of the support mechanism 20, but the invention is not limited to this. For example, if the reference sphere 60 is arranged at a position in the reference coordinate system that can be specified in advance, the operation of measuring the reference sphere 60 to specify a reference point, that is, the measurement of the reference sphere 60 by the stylus measurement program shown in FIG. 3, is unnecessary. This also yields the same effect as the above embodiment.
In the embodiment and the first to third modifications, the coordinate conversion function that converts camera coordinate system coordinates into reference coordinate system coordinates is corrected using the three-dimensional shape measuring apparatus 30 supported by the deformable support mechanism, but the present invention is not limited to this. For example, the coordinate transformation function may be corrected in a three-dimensional image generation system in which the three-dimensional shape measuring apparatus 30 can move independently while its position and orientation are measured by other measuring means. In this case, an arbitrarily determined point around the three-dimensional shape measuring apparatus 30 corresponds to the origin of the arm coordinate system, and the second coordinate conversion function, which converts coordinates of that arm coordinate system into reference coordinate system coordinates, is calculated from the position and orientation data of the apparatus 30. In other respects, the coordinate transformation function can be corrected by the same method as in the above embodiment.
DESCRIPTION OF SYMBOLS 10 ... Base, 20 ... Support mechanism, 21 ... Fixed pole, 22 ... Rotating rod, 23-25 ... Arm, 27a-27e ... Rotation angle sensor, 30 ... Three-dimensional shape measuring apparatus, 41 ... Controller, 42 ... Image processing device, 43 ... Input device, 44 ... Display device, 50 ... Stylus, 60 ... Reference sphere.
Claims (4)
1. In a three-dimensional image generation system in which a three-dimensional shape measuring device (30) supported by a deformable support mechanism (20) measures the three-dimensional shape of a measurement object; measurement information of the measurement object obtained by the three-dimensional shape measuring device is input to a three-dimensional image processing device (42) configured by a computer device (S72), and the input measurement information is used to generate three-dimensional image data representing the three-dimensional shape of the measurement object in a coordinate system relating to the three-dimensional shape measuring device (S74); and the three-dimensional image processing device converts the generated three-dimensional image data into three-dimensional image data in a coordinate system relating to the deformable support mechanism using a first coordinate transformation function comprising a coordinate rotation function and a coordinate movement function, and converts the converted three-dimensional image data into three-dimensional image data in a predetermined coordinate system using a second coordinate transformation function that changes according to the deformation state of the support mechanism (S76), so that the measurement object can be displayed as viewed from an arbitrary direction,
a method of correcting the first coordinate transformation function, comprising:
after matching the direction of each coordinate axis of the coordinate system relating to the three-dimensional shape measuring device with the direction of each coordinate axis of the predetermined coordinate system, deforming the support mechanism to move the three-dimensional shape measuring device parallel to a predetermined coordinate axis of the predetermined coordinate system, and causing the three-dimensional shape measuring device to measure, at at least two positions, the three-dimensional shape of a reference object placed in the measurement target space of the three-dimensional shape measuring device;
inputting the measurement information of the reference object measured at the at least two positions to the three-dimensional image processing device, and generating, for each measurement position, three-dimensional image data representing the three-dimensional shape of the reference object in the coordinate system relating to the three-dimensional shape measuring device (S42);
causing the three-dimensional image processing device to calculate three-dimensional data representing a fixed point, in the predetermined coordinate system, of the reference object measured at the at least two positions, using the first coordinate transformation function, the second coordinate transformation function, and the generated three-dimensional image data for each measurement position (S44, S46, S48, S50); and
causing the three-dimensional image processing device to correct the coordinate rotation function in the first coordinate transformation function according to the inclination, with respect to the coordinate axes of the predetermined coordinate system, of a straight line defined using the three-dimensional data of the fixed points (S52, S54).
2. In a three-dimensional image generation system in which a three-dimensional shape measuring device (30) supported by a deformable support mechanism (20) measures the three-dimensional shape of a measurement object; measurement information of the measurement object obtained by the three-dimensional shape measuring device is input to a three-dimensional image processing device (42) configured by a computer device (S72), and the input measurement information is used to generate three-dimensional image data representing the three-dimensional shape of the measurement object in a coordinate system relating to the three-dimensional shape measuring device (S74); and the three-dimensional image processing device converts the generated three-dimensional image data into three-dimensional image data in a coordinate system relating to the deformable support mechanism using a first coordinate transformation function comprising a coordinate rotation function and a coordinate movement function, and converts the converted three-dimensional image data into three-dimensional image data in a predetermined coordinate system using a second coordinate transformation function that changes according to the deformation state of the support mechanism (S76), so that the measurement object can be displayed as viewed from an arbitrary direction,
a method of correcting the first coordinate transformation function, comprising:
after matching the direction of each coordinate axis of the coordinate system relating to the three-dimensional shape measuring device with the direction of each coordinate axis of the predetermined coordinate system, deforming the support mechanism to move the three-dimensional shape measuring device parallel to a predetermined coordinate axis of the predetermined coordinate system, and causing the three-dimensional shape measuring device to measure, at at least two positions, the three-dimensional shape of a reference object placed in the measurement target space of the three-dimensional shape measuring device;
inputting the measurement information of the reference object measured at the at least two positions to the three-dimensional image processing device, and generating, for each measurement position, three-dimensional image data representing the three-dimensional shape of the reference object in the coordinate system relating to the three-dimensional shape measuring device (S42);
causing the three-dimensional image processing device to calculate three-dimensional data representing a fixed point, in the predetermined coordinate system, of the reference object measured at the at least two positions, using the first coordinate transformation function, the second coordinate transformation function, and the generated three-dimensional image data for each measurement position (S44, S46, S48, S50); and
causing the three-dimensional image processing device to calculate, according to the deformation state of the support mechanism, the amount of movement of the three-dimensional shape measuring device between the positions at which the three-dimensional shape of the reference object was measured, and to correct the coordinate rotation function in the first coordinate transformation function according to the inclination of a straight line defined by the calculated amount of movement and the amount of change of the coordinate value of the fixed point of the reference object in a coordinate axis direction, among the coordinate axis directions of the predetermined coordinate system, different from the movement direction of the three-dimensional shape measuring device.
3. In a three-dimensional image generation system in which a three-dimensional shape measuring device (30) supported by a deformable support mechanism (20) measures the three-dimensional shape of a measurement object; measurement information of the measurement object obtained by the three-dimensional shape measuring device is input to a three-dimensional image processing device (42) configured by a computer device (S72), and the input measurement information is used to generate three-dimensional image data representing the three-dimensional shape of the measurement object in a coordinate system relating to the three-dimensional shape measuring device (S74); and the three-dimensional image processing device converts the generated three-dimensional image data into three-dimensional image data in a coordinate system relating to the deformable support mechanism using a first coordinate transformation function comprising a coordinate rotation function and a coordinate movement function, and converts the converted three-dimensional image data into three-dimensional image data in a predetermined coordinate system using a second coordinate transformation function that changes according to the deformation state of the support mechanism (S76), so that the measurement object can be displayed as viewed from an arbitrary direction,
a method of correcting the first coordinate transformation function, comprising:
in a state in which the directions of the coordinate axes of the coordinate system relating to the three-dimensional shape measuring device do not coincide with the directions of the coordinate axes of the predetermined coordinate system, deforming the support mechanism to move the three-dimensional shape measuring device parallel to the direction of a coordinate axis of the coordinate system relating to the three-dimensional shape measuring device, and causing the three-dimensional shape measuring device to measure, at at least two positions, the three-dimensional shape of a reference object placed in the measurement target space of the three-dimensional shape measuring device;
inputting the measurement information of the reference object measured at the at least two positions to the three-dimensional image processing device, and generating, for each measurement position, three-dimensional image data representing the three-dimensional shape of the reference object in the coordinate system relating to the three-dimensional shape measuring device (S42);
causing the three-dimensional image processing device to calculate, according to the deformation state of the support mechanism, a coordinate transformation correction function for converting three-dimensional data in the predetermined coordinate system into three-dimensional data in the coordinate system of the moved three-dimensional shape measuring device (S42, S45);
causing the three-dimensional image processing device to calculate three-dimensional data representing a fixed point, in the predetermined coordinate system, of the reference object measured at the at least two positions, using the first coordinate transformation function, the second coordinate transformation function, and the generated three-dimensional image data for each measurement position, and to convert the calculated three-dimensional data expressed in the predetermined coordinate system into three-dimensional data in the coordinate system of the moved three-dimensional shape measuring device using the coordinate transformation correction function (S44, S46, S48, S50); and
causing the three-dimensional image processing device to correct the coordinate rotation function in the first coordinate transformation function according to the inclination, with respect to the coordinate axes of the coordinate system of the moved three-dimensional shape measuring device, of a straight line defined using the converted three-dimensional data of the fixed points (S52, S54).
4. The method of correcting a coordinate transformation function according to any one of claims 1 to 3, further comprising:
bringing a stylus fixed to the support mechanism into contact with a plurality of locations on the reference object;
causing the three-dimensional image processing device to calculate, from the deformation state of the support mechanism at each of the plurality of contacted locations, three-dimensional data representing a fixed point of the reference object in the predetermined coordinate system, as specific three-dimensional data representing the fixed point (S12 to S16);
causing the three-dimensional image processing device to calculate three-dimensional data representing a fixed point of the reference object in the coordinate system relating to the three-dimensional shape measuring device, based on the three-dimensional image data measured by the three-dimensional shape measuring device, and to convert the calculated three-dimensional data expressed in that coordinate system into three-dimensional data expressed in the predetermined coordinate system, using the first coordinate transformation function whose coordinate rotation function has been corrected and the second coordinate transformation function according to the deformation state of the support mechanism at the time the three-dimensional image data was measured (S56 to S60); and
causing the three-dimensional image processing device to correct the coordinate movement function of the first coordinate transformation function according to the difference between the specific three-dimensional data of the fixed point expressed in the predetermined coordinate system and the three-dimensional data of the fixed point obtained by the conversion (S62, S64).
Priority Applications (1)
JP2005123197A — priority date 2005-04-21, filing date 2005-04-21 — Coordinate transformation function correction method
Publications (2)
JP2006301991A — published 2006-11-02
JP4491687B2 (granted) — published 2010-06-30
Cited By (2)
Publication number  Priority date  Publication date  Assignee  Title 

CN102692202A (en) *  20120111  20120926  河南科技大学  Reverse measurement method 
CN103411574A (en) *  20130814  20131127  西北工业大学  Aviation engine blade profile threecoordinate measuring method 
Families Citing this family (42)
Publication number  Priority date  Publication date  Assignee  Title 

DE102006031580A1 (en)  2006-07-03  2008-01-17  Faro Technologies, Inc., Lake Mary  Method and device for the three-dimensional detection of a spatial area 
JP4834524B2 (en) *  2006-11-20  2011-12-14  ローランドディー．ジー．株式会社  Three-dimensional shape measuring method and apparatus 
JP4863006B2 (en) *  2006-12-28  2012-01-25  パルステック工業株式会社  3D shape measurement method 
EP2048599B1 (en)  2007-10-11  2009-12-16  MVTec Software GmbH  System and method for 3D object recognition 
JP5319271B2 (en) *  2008-12-26  2013-10-16  川崎重工業株式会社  Robot tool position detecting method, robot and object relative position detecting method and device 
DE102009015920B4 (en)  2009-03-25  2014-11-20  Faro Technologies, Inc.  Device for optically scanning and measuring an environment 
US9551575B2 (en)  2009-03-25  2017-01-24  Faro Technologies, Inc.  Laser scanner having a multi-color light source and real-time color receiver 
KR101622659B1 (en) *  2009-10-29  2016-05-20  대우조선해양 주식회사  Calibration method of robot and laser vision system 
DE102009057101A1 (en)  2009-11-20  2011-05-26  Faro Technologies, Inc., Lake Mary  Device for optically scanning and measuring an environment 
US9113023B2 (en)  2009-11-20  2015-08-18  Faro Technologies, Inc.  Three-dimensional scanner with spectroscopic energy detector 
US9529083B2 (en)  2009-11-20  2016-12-27  Faro Technologies, Inc.  Three-dimensional scanner with enhanced spectroscopic energy detector 
US8630314B2 (en)  2010-01-11  2014-01-14  Faro Technologies, Inc.  Method and apparatus for synchronizing measurements taken by multiple metrology devices 
US9879976B2 (en)  2010-01-20  2018-01-30  Faro Technologies, Inc.  Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features 
US8615893B2 (en)  2010-01-20  2013-12-31  Faro Technologies, Inc.  Portable articulated arm coordinate measuring machine having integrated software controls 
WO2012033892A1 (en)  2010-09-08  2012-03-15  Faro Technologies, Inc.  A laser scanner or laser tracker having a projector 
DE112011100293T5 (en)  2010-01-20  2013-01-10  Faro Technologies, Inc.  Portable articulated arm coordinate measuring machine and integrated environmental recorder 
US8533967B2 (en)  2010-01-20  2013-09-17  Faro Technologies, Inc.  Coordinate measurement machines with removable accessories 
US9628775B2 (en)  2010-01-20  2017-04-18  Faro Technologies, Inc.  Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations 
US8677643B2 (en)  2010-01-20  2014-03-25  Faro Technologies, Inc.  Coordinate measurement machines with removable accessories 
US8875409B2 (en)  2010-01-20  2014-11-04  Faro Technologies, Inc.  Coordinate measurement machines with removable accessories 
US8898919B2 (en)  2010-01-20  2014-12-02  Faro Technologies, Inc.  Coordinate measurement machine with distance meter used to establish frame of reference 
JP5306545B2 (en)  2010-01-20  2013-10-02  ファロ テクノロジーズ インコーポレーテッド  Coordinate measuring machine with illuminated probe end and method of operation 
US9607239B2 (en)  2010-01-20  2017-03-28  Faro Technologies, Inc.  Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations 
US8832954B2 (en)  2010-01-20  2014-09-16  Faro Technologies, Inc.  Coordinate measurement machines with removable accessories 
CN101852593A (en) *  2010-05-06  2010-10-06  深南电路有限公司  Automatic optical inspection equipment, light measuring tool plate and light adjusting method thereof 
EP2385483B1 (en)  2010-05-07  2012-11-21  MVTec Software GmbH  Recognition and pose determination of 3D objects in 3D scenes using geometric point pair descriptors and the generalized Hough Transform 
DE102010020925B4 (en)  2010-05-10  2014-02-27  Faro Technologies, Inc.  Method for optically scanning and measuring an environment 
US9168654B2 (en)  2010-11-16  2015-10-27  Faro Technologies, Inc.  Coordinate measuring machines with dual layer arm 
JP5693978B2 (en) *  2011-01-11  2015-04-01  株式会社ミツトヨ  Image probe calibration method 
DE102012100609A1 (en)  2012-01-25  2013-07-25  Faro Technologies, Inc.  Device for optically scanning and measuring an environment 
CN102645202B (en) *  2012-05-11  2014-09-10  厦门大学  Method for measuring contour of large-caliber aspheric-surface workpiece 
US8997362B2 (en)  2012-07-17  2015-04-07  Faro Technologies, Inc.  Portable articulated arm coordinate measuring machine with optical communications bus 
CN102749060A (en) *  2012-07-23  2012-10-24  湘电集团有限公司  Test control method and system for large-aperture disc type parabolic surface mirror 
CN102853779A (en) *  2012-08-24  2013-01-02  大连宏海新能源发展有限公司  Shape surface error detection method for lens unit of solar-powered disc type light-gathering system 
US9513107B2 (en)  2012-10-05  2016-12-06  Faro Technologies, Inc.  Registration calculation between three-dimensional (3D) scans based on two-dimensional (2D) scan data from a 3D scanner 
DE102012109481A1 (en)  2012-10-05  2014-04-10  Faro Technologies, Inc.  Device for optically scanning and measuring an environment 
US10067231B2 (en)  2012-10-05  2018-09-04  Faro Technologies, Inc.  Registration calculation of three-dimensional scanner data performed between scans based on measurements by two-dimensional scanner 
CN103486996B (en) *  2013-08-14  2016-01-20  西北工业大学  Aero-engine blade profile measurement method for unknown CAD models 
CN104504197B (en) *  2014-12-21  2017-09-08  浙江省计量科学研究院  Eccentric parameter correction method for Archimedean spiral flat threads 
DE102015122844A1 (en)  2015-12-27  2017-06-29  Faro Technologies, Inc.  3D measuring device with battery pack 
CN106017319B (en) *  2016-05-24  2019-02-15  北京建筑大学  Laser scanning data coordinate conversion tool and method based on high-precision point measurement 
CN106441147B (en) *  2016-08-30  2019-02-12  中航动力股份有限公司  Method for establishing a three-dimensional optical measurement datum for precision-cast turbine rotor blades 

2005
 2005-04-21 JP JP2005123197A patent/JP4491687B2/en active Active
Also Published As
Publication number  Publication date 

JP2006301991A (en)  2006-11-02 
Similar Documents
Publication  Publication Date  Title 

JP3946716B2 (en)  Method and apparatus for recalibrating a three-dimensional visual sensor in a robot system  
JP4015051B2 (en)  Camera correction device  
JP3070953B2 (en)  Point-by-point type method and system for measuring spatial coordinates  
US8138938B2 (en)  Handheld positioning interface for spatial query  
JP5444209B2 (en)  Frame mapping and force feedback method, apparatus and system  
US8244030B2 (en)  Method and measurement system for contactless coordinate measurement of an object surface  
JP4607095B2 (en)  Method and apparatus for image processing in surveying instrument  
CN101652628B (en)  Optical instrument and method for obtaining distance and image information  
US5870490A (en)  Apparatus for extracting pattern features  
CN100415460C (en)  Robot system  
EP1931503B1 (en)  Method for determining a virtual tool center point  
JP3859574B2 (en)  3D visual sensor  
US20110046917A1 (en)  Measuring method for an articulated-arm coordinate measuring machine  
JP2012171088A (en)  Master operation input device, and masterslave manipulator  
CN101523154B (en)  Apparatus and method for determining orientation parameters of an elongate object  
EP2056066B1 (en)  Surveying Instrument  
US6055056A (en)  Device for non-contact measurement of the surface of a three-dimensional object  
US20060227211A1 (en)  Method and apparatus for measuring position and orientation  
JP2013068625A (en)  Rearrangement method of articulated coordinate measuring machine  
EP0782100B1 (en)  Three-dimensional shape extraction apparatus and method  
US9228816B2 (en)  Method of determining a common coordinate system for an articulated arm coordinate measurement machine and a scanner  
EP1043126B1 (en)  Teaching model generating method  
CN1727845B (en)  Surveying instrument  
US6067165A (en)  Position calibrating method for optical measuring apparatus  
US6044308A (en)  Method and device for robot tool frame calibration 
Legal Events
Date  Code  Title  Description 

A621  Written request for application examination 
Free format text: JAPANESE INTERMEDIATE CODE: A621 Effective date: 2006-12-01 

A977  Report on retrieval 
Free format text: JAPANESE INTERMEDIATE CODE: A971007 Effective date: 2009-10-29 

A131  Notification of reasons for refusal 
Free format text: JAPANESE INTERMEDIATE CODE: A131 Effective date: 2009-11-10 

A521  Written amendment 
Free format text: JAPANESE INTERMEDIATE CODE: A523 Effective date: 2010-01-07 

TRDD  Decision of grant or rejection written  
A01  Written decision to grant a patent or to grant a registration (utility model) 
Free format text: JAPANESE INTERMEDIATE CODE: A01 Effective date: 2010-03-10 

A61  First payment of annual fees (during grant procedure) 
Free format text: JAPANESE INTERMEDIATE CODE: A61 Effective date: 2010-03-23 

R150  Certificate of patent or registration of utility model 
Free format text: JAPANESE INTERMEDIATE CODE: R150 

FPAY  Renewal fee payment (event date is renewal date of database) 
Free format text: PAYMENT UNTIL: 2013-04-16 Year of fee payment: 3 

FPAY  Renewal fee payment (event date is renewal date of database) 
Free format text: PAYMENT UNTIL: 2014-04-16 Year of fee payment: 4 

R250  Receipt of annual fees 
Free format text: JAPANESE INTERMEDIATE CODE: R250 
