WO2016135856A1 - Three-dimensional shape measurement system and measurement method for same - Google Patents

Three-dimensional shape measurement system and measurement method for same

Info

Publication number
WO2016135856A1
Authority
WO
WIPO (PCT)
Prior art keywords
shape
distance
measurement
unit
image data
Prior art date
Application number
PCT/JP2015/055239
Other languages
French (fr)
Japanese (ja)
Inventor
敬介 藤本
渡邊 高志
Original Assignee
株式会社日立製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立製作所 filed Critical 株式会社日立製作所
Priority to PCT/JP2015/055239 priority Critical patent/WO2016135856A1/en
Priority to JP2017501603A priority patent/JP6282377B2/en
Publication of WO2016135856A1 publication Critical patent/WO2016135856A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G01C3/06 Use of electric means to obtain final indication

Definitions

  • the present invention relates to a system and a measurement method for measuring a three-dimensional shape of a space.
  • One means of measuring the shape of a space is a 3D sensor that measures the distances to the objects composing the space to obtain its three-dimensional shape.
  • With this 3D sensor, the shape of the entire space can be acquired more quickly and with higher accuracy than by hand measurement.
  • With this 3D sensor, shapes in high or dangerous places that are out of reach of human hands can be measured non-destructively from a distant point.
  • This 3D sensor can be broadly classified into an active method, which applies light to the measurement object, and a passive method, which measures without applying light to the measurement object.
  • the active 3D sensor irradiates the measurement target with laser light or LED light, and calculates the distance to the measurement target from the time until the irradiation light returns.
  • the three-dimensional coordinates of the surface of the object are obtained by adding the position coordinates of the 3D sensor to the calculated distance information of the object.
  • Patent Document 1 discloses, for a TOF (Time-Of-Flight) range image sensor that irradiates a measurement object with irradiation light and obtains a range image from the phase difference between the irradiated and reflected light, a technique for preventing the decrease in distance accuracy caused by saturation of the light-receiving element under the influence of shot noise, ambient light, and the like when realizing higher resolution and higher frame rates. Specifically, the technique receives the light reflected from the object while the light source is not emitting, determines from a frequency analysis of that reflected light an illumination frequency that is not easily affected by the ambient light, selects the irradiation light with the optimum emission frequency from a plurality of prepared light sources, and generates a range image of the object by receiving the reflected light of the selected source's illumination, thereby suppressing the influence of ambient light during reception.
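For reference, the standard relation behind such phase-difference TOF ranging, which is not stated in the patent text itself, links the modulation frequency f, the measured phase difference, and the speed of light c:

```latex
% Standard TOF phase-shift ranging: distance from the phase lag \Delta\varphi
% at modulation frequency f; the factor 4\pi accounts for the round trip.
d = \frac{c \, \Delta\varphi}{4 \pi f}
```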
  • As another active 3D sensor, Patent Document 2 discloses, for a laser scanner that rotationally scans a measurement object with a pulsed laser, receives the reflected light, and measures distance from the time between irradiation and return, a technique for calibrating the scanner using a calibration target that has a high-reflectance reflective part of known shape.
  • The passive-method 3D sensor includes, for example, two imaging devices as disclosed in Patent Document 3, and restores the three-dimensional shape from the relative orientation between the imaging devices, obtained by calibration in advance, and the images captured by the two imaging devices.
  • The laser scanner described in Patent Document 2 can perform highly accurate three-dimensional distance measurement. However, because it scans one point at a time, measurement takes a long time, making it unsuitable for rapid measurement.
  • A distance image sensor as described in Patent Document 1 detects a two-dimensional distance image and can therefore measure distance at high speed. However, because the irradiation light diffuses, when the distance to the object is long the sensor may fail to obtain reflected light and thus fail to measure the distance. Since the stereo-camera-type 3D sensor described in Patent Document 3 can capture the shape of a distant object, it can also detect shapes at distant places in the space. However, obtaining high measurement accuracy requires reducing the quantization error of the image sensor, which demands a high-definition image sensor. In addition, since a stereo-camera-type 3D sensor requires two image sensors, cost becomes a problem.
  • To solve this, the three-dimensional shape measurement system of the present invention includes a distance measurement unit that measures the distances to a plurality of points on the measurement object, an imaging unit that captures image data of the measurement space including the measurement object, and a shape restoration device that obtains a plurality of corresponding feature points across the image data captured from different directions by the imaging unit, obtains a provisional relative posture and provisional shape of unknown scale from the positions of those feature points in the image data, and restores the shape of the measurement object by computing the three-dimensional coordinate values of the scale-unknown provisional relative posture and provisional shape based on the distances to the feature points measured by the distance measurement unit.
  • Similarly, the measurement method of the three-dimensional shape measurement system for measuring the shape of a measurement space including a measurement object comprises: capturing a plurality of image data by imaging the measurement space from different directions; obtaining a plurality of corresponding feature points across the plurality of image data; obtaining a provisional relative posture and a provisional shape of unknown scale from the positions of the feature points in the image data; and restoring the shape of the measurement object by computing the three-dimensional coordinate values of the scale-unknown provisional relative posture and provisional shape based on the distances to the measurement object corresponding to the feature points.
  • According to the present invention, the shape of the entire measurement target can be measured even when distance measurement succeeds for only part of it.
  • The three-dimensional shape measurement system of the embodiment includes: an imaging unit, such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) image sensor, that captures two-dimensional image information of the measurement object viewed from different directions; a distance measuring unit that measures the distance to a plurality of points on the measurement object by irradiating laser light or infrared light; and a three-dimensional coordinate calculation unit that calculates the three-dimensional coordinates of the measurement object based on the image information of the measurement object viewed from different positions photographed by the imaging unit and the distance information of the plurality of points measured by the distance measuring unit.
  • The imaging unit shoots the measurement object from different positions, either by moving the shooting point or by providing at least two image sensors at different positions as in a stereo camera.
  • In the latter case, the distance between the two image sensors need not be fixed as in a stereo camera.
  • the imaging unit may capture an RGB color image, or may capture a monochrome image or an infrared image.
  • The distance measuring unit can be a laser scanner that measures distance with laser light, or a distance image sensor that diffusely irradiates infrared light, detects the reflected light from the measurement object with a two-dimensional light-receiving element, and measures the distance from the phase lag of the reflected light by the TOF (Time-Of-Flight) method or by an irradiation pattern method. Further, as will be described later, distance information for every point of the measurement space is not required: the laser scanner may scan two-dimensionally in a plane at the same height as the scanner to measure the two-dimensional shape of only part of the measurement object, or it may measure a plurality of points without performing continuous scanning. A more specific configuration example of the three-dimensional shape measurement system of the present embodiment will be described later.
  • The imaging unit can capture the entire shape of the measurement object in the measurement space, whereas the distance measurement unit may fail to measure the distance information of the entire shape of the measurement object when the irradiation light does not reach it.
  • Therefore, the three-dimensional shape measurement system of the embodiment combines the scale obtained from the three-dimensional coordinates of those feature points whose distances could be measured, among the feature points of the images of the entire shape of the measurement object captured by the imaging unit, with the scale-unknown shape obtained from the feature points, and thereby calculates the three-dimensional coordinates of the feature points whose distances could not be measured. In this way, the three-dimensional shape measurement system of the embodiment restores the shape of the measurement object.
  • For measurement points other than the feature points of the images of the entire shape of the measurement object, corresponding points are found by searching only along the epipolar line, based on the principle of epipolar geometry, and the shape is restored.
  • FIG. 1 is a diagram illustrating functional blocks of the three-dimensional shape measurement system according to the embodiment.
  • The three-dimensional shape measurement system of the present embodiment includes a distance measurement unit 100 that measures the distance to the measurement object in the measurement space, an imaging unit 101 that measures surrounding color information, and a measurement data storage unit 102 that stores the measured data. In the measurement data storage unit 102, the distance data 103 measured by the distance measurement unit 100 and the image data 104 captured by the imaging unit 101 are stored as pairs over a plurality of measurements. The distance measurement unit 100 and the imaging unit 101 only need to measure at the same timing; the measurement interval need not be constant.
  • The system also includes a feature point recognition unit 105 that extracts the feature points of the image data 104, and a correspondence calculation unit 106 that associates the feature points extracted by the feature point recognition unit 105 across the plurality of pieces of image data 104.
  • The feature points of the image data 104 are determined by detecting corners based on changes in hue and luminance values; more specifically, they can be calculated by the Harris corner detection method or the like (a code sketch follows this item). Alternatively, a contour line can be extracted and vertices or inflection points obtained from its discrete curvature.
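As a concrete illustration, here is a minimal sketch of Harris-based feature point extraction, assuming OpenCV; the image file name, detector parameters, and the 1% threshold are illustrative assumptions, not values from the patent:

```python
import cv2
import numpy as np

img = cv2.imread("view0.png")                    # one piece of image data 104
gray = np.float32(cv2.cvtColor(img, cv2.COLOR_BGR2GRAY))

# Harris corner response: blockSize = neighborhood size, ksize = Sobel
# aperture, k = Harris detector free parameter
response = cv2.cornerHarris(gray, blockSize=2, ksize=3, k=0.04)

# Keep pixels whose corner response exceeds 1% of the maximum response
ys, xs = np.where(response > 0.01 * response.max())
feature_points = list(zip(xs.tolist(), ys.tolist()))   # (u, v) pixel coordinates
print(f"{len(feature_points)} candidate feature points")
```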
  • Based on the feature point correspondence 114 calculated by the correspondence calculation unit 106, the three-dimensional shape measurement system further includes a provisional relative posture / shape estimation unit 107 that estimates, between the plurality of image data 104, a provisional relative posture 108 of unknown scale between the imaging points of the imaging unit 101 and a provisional shape 109 composed of scale-unknown three-dimensional coordinates of the feature points.
  • Here, "scale unknown" means that the absolute lengths and sizes of the posture and shape are not determined, but the relative distance and size relationships between the postures and within the shape are correct.
  • Using the scale-unknown provisional relative posture / shape calculation unit 120, which comprises the feature point recognition unit 105, the correspondence calculation unit 106, and the provisional relative posture / shape estimation unit 107, the three-dimensional shape measurement system obtains, from the feature points of the plurality of image data 104, the scale-unknown provisional relative posture 108 and provisional shape 109 corresponding to each piece of image data 104.
  • The system also includes an actual shape calculation unit 121 that obtains the overall scale from the scale-unknown provisional relative posture 108 and provisional shape 109 of the feature points obtained by the scale-unknown provisional relative posture / shape calculation unit 120 and from the distance data 103 measured by the distance measurement unit 100, and restores the entire shape data 113.
  • the actual shape calculation unit 121 includes a scale adjustment unit 110 and a shape restoration unit 112.
  • The scale adjustment unit 110 obtains the overall scale from the scale-unknown provisional shape 109 of the feature points computed by the provisional relative posture / shape calculation unit 120 and the distance data 103 measured by the distance measurement unit 100, and adjusts the scale of the provisional relative posture 108 accordingly. The shape restoration unit 112 then restores the entire shape data 113.
  • The shape restoration unit 112 may also perform shape restoration on portions other than the feature points using multi-view epipolar constraints.
  • The entire shape data 113 is thus restored from the measurement data pairs. Details of the processing of the shape restoration unit 112 will be described later.
  • FIG. 2 is a diagram illustrating an example of a hardware configuration of the three-dimensional shape measurement system according to the present embodiment.
  • The three-dimensional shape measurement system according to the present embodiment includes a measurement unit 20 comprising a distance measurement unit 100 configured as a laser scanner, a distance image sensor, or the like, and an imaging unit 101 configured as a CCD or CMOS image sensor or the like.
  • the shape restoration device 21 performs shape restoration processing based on the measurement content of the measurement unit 20 to obtain a three-dimensional shape of the measurement space.
  • The shape restoration device 21 includes: a central processing unit (CPU) 202 that controls the entire device; a reception device 203 that receives measurement data from the measurement unit 20 and transfers it to the distance data 103 and the image data 104 in the memory 23; an input / display device 204 that includes a distance data use region setting unit 115 and a feature point use region setting unit 116; a processing unit 22 that performs the three-dimensional measurement processing of the embodiment; and the memory 23, which stores the measurement data of the distance data 103 and the image data 104 together with the processing data of the provisional relative posture 108, the provisional shape 109, the relative posture 111, the shape data 113, and the feature point correspondence 114.
  • The processing unit 22 includes a feature point recognition unit 105, a correspondence calculation unit 106, a provisional relative posture / shape estimation unit 107, a scale adjustment unit 110, and a shape restoration unit 112, which perform processing with reference to the image data 104 in the memory 23.
  • Each processing unit may be implemented as dedicated hardware, as a dedicated processor running software, or as program processing on the CPU 202.
  • FIG. 3 is a diagram illustrating a processing flow of the three-dimensional shape measurement system according to the present embodiment.
  • First, the three-dimensional shape measurement system performs the measurement data acquisition process S300, in which the measurement unit 20 (see FIG. 2) measures the distance data 103 and the image data 104 and stores them in the memory 23.
  • the distance data 103 and the image data 104 are configured in pairs.
  • the distance and image measurement data are measured at different positions or orientations, and the measurement target is measured from different directions.
  • As few as two sets of measurement data suffice. However, as the number of measurement data increases, the amount of information about the position of each feature point increases, so the measurement range and measurement accuracy improve.
  • In step S301, it is determined whether the number of measurement data necessary for the three-dimensional shape measurement processing has been obtained. If the number is insufficient (No in S301), the process returns to step S300; if sufficient (Yes in S301), the process proceeds to step S302.
  • In step S302, the feature point recognition unit 105 performs feature point recognition processing, recognizing feature points on the image in each piece of image data 104 obtained by imaging the measurement object from a plurality of directions (S302). Then, the correspondence calculation unit 106 performs correspondence calculation processing, associating the same feature points across the plurality of image data (S303).
  • Next, the provisional relative posture / shape estimation unit 107 obtains the scale-unknown provisional relative posture 108 and provisional shape 109 from the position information of the same feature points across the plurality of image data 104 (S304). The scale adjustment unit 110 then adjusts the scale of the provisional relative posture 108 and provisional shape 109 using the ratio between the distances in the scale-unknown provisional shape 109 and the correct-scale distances given by the distance data 103 measured in the measurement data acquisition process S300 (S305).
  • Finally, the shape restoration unit 112 restores the shape of the measurement object from the provisional relative posture 108, the provisional shape 109, and the distance data 103 measured in the measurement data acquisition process S300 (S306). At this time, if the number of feature points obtained in S302 is small, the shape restoration unit 112 may also restore portions other than the feature points using multi-view epipolar constraints.
  • FIG. 3 shows a flow in which the shape restoration processing is performed after the measurement and feature point extraction of the measurement object. However, when more image data / distance data are captured and measured than are needed for the restoration processing, pipeline processing may be used, performing the next measurement data acquisition, feature point recognition, and correspondence calculation processes while the restoration processing is running.
  • In this way, the feature points of the image data are obtained, the relative posture and shape of the feature points are estimated, and the shape of the measurement object is restored based on the distance data of the feature points, which reduces the calculation cost and shortens the processing time.
  • FIG. 4 shows a measurement example of the three-dimensional shape measurement system of the embodiment.
  • FIG. 4A shows distance data 103, the measurement result of the distance measurement unit 100, and FIG. 4B shows image data 104, the measurement result of the imaging unit 101.
  • In the distance data 103 in FIG. 4(a), distance values are represented on the screen as luminance values; the longer the distance, the closer the color is to black. Parts where the distance could not be measured are indicated by hatching (reference numeral 401).
  • a region 403 is a portion where the distance can be measured.
  • FIG. 4B shows image data at the same place as in FIG. 4A, and the distance of a specific part of the image data can be found from the same coordinates in FIG. 4A.
  • the ground on the lower side of FIG. 4B corresponds to a portion where the distance can be measured by reference numeral 403 in FIG.
  • the center of the distance data corresponds to the building at the center of the image data.
  • the center of the distance data is a hatched area. That is, the distance to the building cannot be measured.
  • the same coordinates may not correspond to the same part.
  • In the present embodiment, a scale-unknown provisional shape is obtained from the position information of image feature points such as the white line portion 402 in FIG. 4, and shape restoration is performed by combining it with the distance information of the parts whose distances are known.
  • Hereinafter, details of the feature point recognition unit 105 in FIG. 1 are described with reference to FIG. 5, details of the correspondence calculation unit 106 with reference to FIG. 6, details of the provisional relative posture / shape estimation unit 107 with reference to FIG. 7, details of the scale adjustment unit 110 with reference to FIG. 8, and details of the shape restoration unit 112 with reference to FIGS. 9 and 10.
  • FIG. 5 is a diagram illustrating an example of feature points extracted by the feature point recognition unit 105, in which a plurality of feature points 501 (black circles) are extracted from one image data 500 of the image data 104 measured by the imaging unit 101.
  • the feature point is a point where the hue or luminance pattern around a predetermined position in the image is peculiar, such as a corner or a color boundary.
  • the feature point 501 is indicated by a black circle.
  • Features usable for selecting feature points include SIFT (Scale-Invariant Feature Transform), SURF (Speeded Up Robust Features), and the corner features of FAST (Features from Accelerated Segment Test).
  • FIG. 6 is a diagram illustrating an example of feature point correspondence processing by the correspondence calculation unit 106.
  • Image data 600 in FIG. 6A and image data 601 in FIG. 6B indicate image data obtained by imaging the measurement object from different points in the image data 104. For this reason, the same measurement object is imaged at different positions in the image data 600 and the image data 601.
  • The feature point 602 is one of the feature points extracted from the image data 600 by the feature point recognition unit 105, and the feature point 603 is one of the feature points extracted from the image data 601 by the feature point recognition unit 105.
  • The feature point 602 and the feature point 603 are feature points of the same part of the measurement object, but they appear at different positions because the image data 600 and the image data 601 were captured from different points.
  • The correspondence calculation unit 106 therefore associates, across the image data, the feature points that correspond to the same position on the photographed object.
  • When the feature pattern of the feature point 602 in the image data 600 recognized by the feature point recognition unit 105 matches the feature pattern of the feature point 603 in the image data 601, the correspondence calculation unit 106 associates the two feature points with each other. In FIGS. 6A and 6B, black points connected by dotted lines represent the associated feature points.
  • In principle, the correspondence calculation unit 106 performs this feature point correspondence processing between all pairs of image data. However, performing the correspondence calculation between all image data increases the calculation time, so the images used for association may be selected at predetermined intervals, or the correspondence calculation may be performed only between images whose shooting locations are known to be close. (A sketch of the matching step for one image pair follows this item.)
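A minimal sketch of the correspondence processing for one image pair, assuming OpenCV's SIFT detector and a brute-force matcher with Lowe's ratio test; the file names and the 0.75 ratio are illustrative assumptions:

```python
import cv2

img0 = cv2.imread("view0.png", cv2.IMREAD_GRAYSCALE)
img1 = cv2.imread("view1.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp0, des0 = sift.detectAndCompute(img0, None)
kp1, des1 = sift.detectAndCompute(img1, None)

# Match descriptors and keep only unambiguous pairs (Lowe's ratio test),
# i.e. feature patterns that "match" in the sense of the text above
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(des0, des1, k=2)
good = []
for pair in matches:
    if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
        good.append(pair[0])

correspondences = [(kp0[m.queryIdx].pt, kp1[m.trainIdx].pt) for m in good]
print(f"{len(correspondences)} associated feature points")
```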
  • FIG. 7 is a diagram illustrating a positional relationship between corresponding feature points and the imaging unit 101 in two pieces of image data captured by the imaging unit 101.
  • the temporary relative posture / shape estimation unit 107 estimates the temporary relative posture 108 and the temporary shape 109 between the imaging points whose scales are unknown from the correspondence relationship between the coordinates of a plurality of feature points between the images.
  • the relative posture is a difference regarding the position and orientation between the shooting points
  • the temporary relative posture 108 indicates a case where the scale is unknown.
  • the relative posture 111 indicates a relative posture with a correct scale.
  • the shape data 113 indicates a set of points from which three-dimensional coordinates are obtained, and the temporary shape 109 indicates shape data whose scale is unknown.
  • The scale-unknown three-dimensional coordinates of the feature points and the relative orientation between the shooting points can be uniquely determined if a sufficient number of sets of feature point coordinates between the images and their correspondences are obtained.
  • As for the number of necessary correspondences, it suffices to obtain five or more feature point correspondences per image pair. This calculation can be performed by the 5-point algorithm, the 8-point algorithm, or the calculation method described below; a code sketch follows this item.
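A minimal sketch of recovering the scale-unknown relative posture and provisional shape from such correspondences with OpenCV's 5-point algorithm; the intrinsics and the point arrays are placeholder assumptions, with `correspondences` carried over from the matching sketch above:

```python
import cv2
import numpy as np

# Placeholder intrinsics (fx, fy, cx, cy); in practice from calibration
fx = fy = 1000.0
cx, cy = 640.0, 360.0
K = np.array([[fx, 0, cx], [0, fy, cy], [0, 0, 1]])
pts0 = np.float32([p0 for p0, p1 in correspondences])
pts1 = np.float32([p1 for p0, p1 in correspondences])

# 5-point algorithm with RANSAC; the translation t is recovered only up to scale
E, inliers = cv2.findEssentialMat(pts0, pts1, K, method=cv2.RANSAC,
                                  prob=0.999, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts0, pts1, K, mask=inliers)

# Triangulation yields the provisional shape in the same unknown scale as t
P0 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P1 = K @ np.hstack([R, t])
Xh = cv2.triangulatePoints(P0, P1, pts0.T, pts1.T)   # 4xN homogeneous points
provisional_shape = (Xh[:3] / Xh[3]).T               # scale-unknown 3D points
```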
  • FIG. 7A shows image data 700 captured by the imaging unit 101.
  • FIG. 7B shows image data 701 obtained by imaging the same measurement object from different positions.
  • a reference numeral 702 in FIG. 7A represents one of characteristic points of the measurement object, and a reference numeral 703 in FIG. 7B represents a characteristic point of the same measurement object corresponding to the reference numeral 702.
  • FIG. 7C is a diagram showing the positional relationship between the measurement object 707 and the image data 700 and 701.
  • Reference numeral 704 represents an imaging point when the image data 700 is captured
  • reference numeral 705 represents an imaging point when the image data 701 is captured.
  • Reference numeral 706 represents the relative posture between the imaging points 704 and 705.
  • a corner portion 708 of the measurement object 707 corresponds to a reference numeral 702 that is a feature point of the image data 700 and corresponds to a reference numeral 703 that is a feature point of the image data 701.
  • the positional relationship between the corner 708 of the measurement object in FIG. 7C and the reference numerals 702 and 703 of the feature points can be expressed by the following equation.
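The equation itself did not survive extraction. Based on the symbol definitions in the following bullets, Equation (1) is presumably the standard pinhole-camera projection; a reconstruction under that assumption:

```latex
% Presumed form of Equation (1): projection of a feature point (X, Y, Z)
% into image coordinates (u, v); s normalizes the third element to 1.
s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
  = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}
    \begin{bmatrix} R & t \end{bmatrix}
    \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \tag{1}
```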
  • (fx, fy) is the focal length of the camera
  • (cx, cy) is the principal point of the screen
  • s is a normalization factor that makes the third element of the vector on the left side equal to 1.
  • the matrix (R, t) is the relative posture between the shooting points
  • (X, Y, Z) is the three-dimensional coordinates of the feature points.
  • According to Equation (1), if the three-dimensional coordinates of a feature point are known, its coordinates (u, v) on the image data are determined. Therefore, by obtaining the three-dimensional coordinates (X, Y, Z) of the feature points and the relative posture matrix (R, t) such that (u, v) matches the coordinates (mx, my) of the feature points actually obtained by photographing, the provisional relative posture and provisional shape can be determined.
  • By minimizing this evaluation formula over all feature points, the provisional relative posture and the shape of the measurement object at the feature point positions are determined. Any optimization method capable of minimizing the evaluation formula may be used, such as the steepest descent method or the Levenberg-Marquardt method.
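The evaluation formula is not reproduced in this text. Since the later discussion of Equation (5) states that its first and second terms equal Expression (2) and form a backprojection error, a plausible reconstruction is:

```latex
% Presumed form of Equation (2): summed backprojection error over images i
% and feature points j; (u_ij, v_ij) come from Equation (1) and
% (mx_ij, my_ij) are the observed image coordinates.
E_{\mathrm{reproj}} = \sum_i \sum_j
  \left( (u_{ij} - mx_{ij})^2 + (v_{ij} - my_{ij})^2 \right) \tag{2}
```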
  • FIG. 8 is a diagram illustrating a method of adjusting the scale of the temporary shape 109 using the distance data 103 measured by the distance measuring unit 100.
  • The provisional relative posture 108 and provisional shape 109 calculated by the provisional relative posture / shape estimation unit 107 from the image data 104 captured by the imaging unit 101 are correct in terms of the relative magnitudes of the distances within the shape, but the absolute scale of the measurement target is unknown.
  • In the distance data 103, by contrast, the absolute scale is known; in many cases, however, the distances of all points cannot be measured, and only the distances within a partial region of the screen are obtained. Therefore, the scale adjustment unit 110 applies the scale information of the partial region obtained by the distance measurement unit 100 to the regions where distance measurement was not performed but a scale-unknown shape was calculated, and finds the ratio by which the scale should be adjusted so that the shape in those regions can be restored.
  • In FIG. 8A, the coordinate 802 at the correct-scale distance 804 is obtained by the distance measuring unit 800.
  • Meanwhile, as the calculation result of the provisional relative posture / shape estimation unit 107, the feature point 801 is estimated at the position of the scale-unknown distance 805.
  • the scale adjustment unit 110 determines the overall scale based on the ratio r between the distance 804 and the distance 805. That is, as shown in FIG. 8B, using this ratio r, the feature point 801 in the temporary shape 109 is moved to the position of the coordinate 808 (same as the coordinate 802). Further, the position of the feature point 806 for which distance measurement could not be performed is moved to the position of the coordinate 810 after the scale adjustment using the ratio r. Thereby, the feature point 801 and the feature point 806 can obtain a correct scale.
  • When, in addition to the feature point 801, the distance measuring unit 800 has also obtained a coordinate 807 with a correct-scale distance for the feature point 809, the ratio r for the feature point 809 and the ratio r for the feature point 801 are averaged to determine the overall scale. That is, as shown in FIG. 8B, the feature point 801 in the provisional shape 109 is moved to the position of the coordinate 808 (different from the coordinate 802) by adjusting the scale according to the averaged ratio.
  • the feature points 806, 809, and 811 are also scaled according to the average ratio and moved to the positions of the coordinates 810, 812, and 813 to obtain a correct scale.
  • In this way, when a plurality of correct-scale distances are available, the optimum scale is calculated by averaging the individual scale ratios.
  • Equation (3) shows how the scale ratio r obtained above corrects the provisional relative position t to the correct-scale relative position tr, and Equation (4) shows how the scale-unknown feature point coordinates (X, Y, Z) are corrected to the correct-scale coordinates (Xr, Yr, Zr).
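Equations (3) and (4) are likewise missing from this text; given the descriptions above, they presumably scale the provisional quantities by the ratio r:

```latex
% Presumed forms: scale correction of the provisional relative position (3)
% and of the scale-unknown feature point coordinates (4) by the ratio r.
t_r = r \, t \tag{3}
(X_r,\, Y_r,\, Z_r) = r \, (X,\, Y,\, Z) \tag{4}
```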
  • FIG. 9 is a diagram illustrating a method for restoring the shape of the measurement object.
  • Using the overall scale adjusted by the scale adjustment unit 110 as an initial value, the shape restoration unit 112 further adds the scale adjustment as a constraint condition on the backprojection error used by the provisional relative posture / shape estimation unit 107, and thereby restores the correct relative posture and shape.
  • By combining the scale-unknown shape obtained from images taken by the imaging unit from a plurality of different positions with the scale obtained from the part of the shape near the sensor measured by the distance measurement unit 100, it is possible to measure a three-dimensional shape even for distant shapes that the distance measurement unit 100 cannot measure directly.
  • In FIG. 9, the distance from the tentatively set three-dimensional coordinate 903 to the sensor should equal the distance 901 obtained by measurement; by imposing this constraint, the shape with the correct scale can be obtained.
  • the evaluation formula is shown in the following formula (5).
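Formula (5) itself is not reproduced here; from the term-by-term description in the following bullets, a plausible reconstruction is:

```latex
% Presumed form of Equation (5): backprojection error (terms 1-2, as in
% Equation (2)) plus the distance error weighted by w; d_k is the measured
% distance and s_k the sensor-to-feature-point distance via Equation (1).
E = \sum_i \sum_j \left( (u_{ij} - mx_{ij})^2 + (v_{ij} - my_{ij})^2 \right)
    + w \sum_k (d_k - s_k)^2 \tag{5}
```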
  • The weight w in Equation (5) represents the weight of the constraint on the difference between the distance to the feature point and the distance obtained by measurement. This weight controls the balance between the backprojection error and the constraint on the distance difference (hereinafter called the distance error), and is set according to the ratio between the feature point recognition accuracy and the distance measurement accuracy.
  • the first and second terms are the same as in Expression (2), and indicate a backprojection error.
  • the third term indicates the distance error
  • d indicates the distance measured by the distance measuring unit
  • s indicates the distance from the sensor to the tentatively set feature point, obtained via Equation (1).
  • In this way, the shape restoration unit 112 obtains the three-dimensional coordinates of the feature points and their posture relationship with the distance measurement unit 100. If the feature points are not obtained densely enough, the shape of the parts where feature points could not be obtained is restored from the relative postures using the Multi View Stereo method. This restoration is performed according to the following procedure, based on the principle of epipolar geometry shown in FIG. 10.
  • FIG. 10 is a diagram showing the correspondence between the point 1001 and the imaging unit 1003 when one point 1001 of the measurement object in the measurement space is imaged from different directions, and shows the principle of epipolar geometry.
  • the point 1001 in the measurement space corresponds to the point 1000 on the image data 1004 and is located somewhere on the extended line connecting the imaging unit 1003 and the point 1000.
  • The straight line 1002 on the image data 1005, obtained when a line containing the point 1001 is imaged from another viewpoint, is called an epipolar line.
  • a point on the image data 1005 when the point 1001 in the measurement space is imaged from another viewpoint is limited to the epipolar line 1002. This principle is called epipolar geometry.
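A minimal sketch of constraining the correspondence search to the epipolar line, assuming OpenCV together with the essential matrix E and camera matrix K from the pose sketch above; the sample pixel (u, v) is a hypothetical value:

```python
import cv2
import numpy as np

# Fundamental matrix from the essential matrix and the intrinsics K
F = np.linalg.inv(K).T @ E @ np.linalg.inv(K)

# Epipolar line a*x + b*y + c = 0 in the second image (image data 1005)
# for a point observed in the first image (image data 1004)
u, v = 320.0, 240.0                                   # hypothetical pixel
pts = np.array([[[u, v]]], dtype=np.float32)
a, b, c = cv2.computeCorrespondEpilines(pts, 1, F)[0, 0]
print(f"search correspondences only along {a:.4f}x + {b:.4f}y + {c:.4f} = 0")
```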
  • the dense shape restoration method is not limited to Multi View Stereo as long as the dense shape can be restored from the relative posture and the image data.
  • Next, the distance data utilization area setting unit 115 in FIG. 1 will be described.
  • the value of the weight w is set according to the balance between the feature point recognition accuracy and the distance recognition accuracy.
  • the distance recognition accuracy varies depending on the material, color, distance, etc. of the measurement object.
  • the distance data utilization area setting unit 115 sets the weight w according to the distance accuracy according to the distance accuracy, and sets a plurality of types of image data areas satisfying the distance accuracy and the weight w.
  • the shape restoration unit 112 optimizes the equation (5) while determining the scale by the scale adjustment unit 110 according to the weight w in accordance with the region in the image data.
  • FIG. 11 is a diagram for explaining the processing of the distance data use area setting unit 115.
  • FIG. 11 illustrates an example in which the distance data usage area setting unit 115 sets three types of weights based on the image data 1101.
  • Suppose the distance accuracy is highest at the part of the feature point 1102, next highest at the part of the feature point 1104, and lowest at the part of the feature point 1106.
  • the distance data utilization region setting unit 115 first sets the region 1103 including the feature point 1102 and sets the largest value for the weight w. Subsequently, an area 1105 including the feature point 1104 is set, and a weight w having a smaller value than the weight w set in the area 1103 is set. Finally, an area 1107 including the feature point 1106 is set, and the smallest weight w is set. It should be noted that a predetermined weight w is set for an area where the weight w is not set.
  • With these settings, the scale adjustment unit 110 determines the scale by a weighted average (one natural form is sketched below), and the shape restoration unit 112 restores the shape according to the value of the weight w in Equation (5). Note that the number of weight levels is not limited to three. For areas where nothing is set, a predetermined fixed value is adopted as the weight w. It is also possible to use only the distance data in a given area by setting the weight of that area to a predetermined value w and setting the other weights to zero.
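The text does not give the weighted average explicitly; one natural form, assuming per-region weights w_k and per-feature scale ratios d_k / s_k, is:

```latex
% Presumed weighted-average scale: d_k is the measured distance, s_k the
% corresponding scale-unknown distance, and w_k the weight of the region
% containing feature point k.
r = \frac{\sum_k w_k \, (d_k / s_k)}{\sum_k w_k}
```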
  • Next, the feature point use area setting unit 116 in FIG. 1 will be described.
  • the value of the weight w is set according to the balance between the feature point recognition accuracy and the distance recognition accuracy.
  • However, the feature point recognition accuracy depends on the resolution of the captured image and the sharpness of each feature point, and may differ from point to point. Therefore, the feature point use region setting unit 116 sets a plurality of values of the weight w for regions of the image data according to the recognition accuracy of the feature points, and the scale adjustment unit 110 adjusts the scale according to the weight w.
  • Then, the shape restoration unit 112 optimizes Equation (5). Note that, since the set value of the weight w applies to the distance-error term of Equation (5), the weight w is made smaller as the feature point recognition accuracy becomes higher.
  • FIG. 12 is a diagram for explaining the processing of the feature point use area setting unit 116.
  • FIG. 12 shows an example in which the feature point use region setting unit 116 sets three types of weights based on the image data 1201.
  • Suppose the feature point 1202 has the highest feature point recognition accuracy, the feature point 1204 the next highest, and the feature point 1206 the lowest.
  • the feature point use region setting unit 116 first sets the region 1203 including the feature point 1202 in the image data 1201 and sets the smallest value for the weight w. Subsequently, an area 1205 including the feature point 1204 is set, and a value larger than the weight w set in the area 1203 is set. Finally, an area 1207 including the feature point 1206 is set, and the largest weight w is set. It should be noted that a predetermined weight w is set in an area where the weight w is not set.
  • With these settings, the scale adjustment unit 110 determines the scale by the weighted average, and the shape restoration unit 112 restores the shape according to the value of the weight w in Equation (5).
  • the types of settings are not limited to three. When nothing is set, a predetermined fixed value is adopted as the weight w. It is also possible to use only the feature points in the region by setting the weight of the predetermined region to the predetermined value w and setting the other weights to zero.
  • Example 2: In the above-described embodiment, an example was described in which the distance measurement unit 100 is configured as a laser scanner, a distance image sensor, or the like. However, the overall shape data is restored based on the distance data to the feature points that the distance measurement unit 100 can measure. In other words, the distance measurement unit 100 need not measure the distances to all points of the measurement object; it only needs to measure the distances to some feature points of the captured images.
  • Therefore, the present embodiment uses as the distance measurement unit 100 a 2D distance sensor that measures the two-dimensional shape of part of the measurement object by scanning in a plane, and restores the shape of the space from the distances measured by the 2D distance sensor.
  • The functional blocks of the three-dimensional shape measurement system of this embodiment are the same as those of FIG. 1, except that the distance measurement unit 100 is configured as a 2D distance sensor and with respect to the distance data use region setting unit 115 and the feature point use region setting unit 116.
  • As before, the three-dimensional shape measurement system applies the feature point recognition unit 105, the correspondence calculation unit 106, and the provisional relative posture / shape estimation unit 107 to the image data 104 captured by the imaging unit 101 to obtain the provisional relative posture 108 and the provisional shape 109.
  • The scale adjustment unit 110 uses the scale information obtained from the distance data 103 and the provisional shape 109 to correct the provisional relative posture 108 into the relative posture 111 with the correct scale.
  • the shape restoration unit 112 restores the three-dimensional shape by mapping the distance data obtained by the distance measurement unit 100 on the three-dimensional space based on the relative posture.
  • This processing is the same as in the above-described embodiment, except that the weight w in Equation (5) is set to a fixed predetermined value.
  • FIG. 13 is a diagram illustrating an external appearance of the three-dimensional shape measurement system according to the present embodiment.
  • the measurement apparatus of the present embodiment is configured to place the imaging apparatus 1300 on a distance measurement unit 1301 that is a 2D distance sensor. Then, the measurement apparatus of the present embodiment is moved to perform imaging of the measurement object and distance measurement from different directions.
  • the distance measuring unit 1301 is a sensor capable of measuring a two-dimensional shape such as surrounding irregularities at a time.
  • For example, it is a laser scanner that scans a laser beam along a line, receives the reflected light, and measures the distance from the time between laser irradiation and light reception.
  • The position scanned by the distance measurement unit 1301 and the imaging position of the imaging device 1300 are associated with each other, so that distance data is obtained for the points of the measurement object corresponding to the coordinate positions of the feature points in the image data of the imaging device 1300. More specifically, when a feature point of the image data captured by the imaging device 1300 lies on the horizontal line passing through the center of the image, the laser scan of the 2D distance sensor 1301 is made to coincide with that horizontal line. To this end, the direction and inclination of the distance measurement unit 1301 can be changed, allowing it to shift the scanning position of the laser light and measure the distance at an arbitrary position of the image data.
  • The measurement apparatus of the three-dimensional shape measurement system according to the present embodiment restores the measurement object based on the measurement results of the imaging device 1300 and the distance measurement unit 1301, using the same configuration as the shape restoration device 21 shown in FIG. 2. The description of the apparatus configuration that performs the restoration processing in this embodiment is therefore omitted.
  • The distance measurement unit 1301, a 2D distance sensor, scans the measurement object 1401 with the laser beam 1402, receives the reflected light of the irradiated laser beam, obtains the time from irradiation to reception from the phase difference between the irradiated and reflected light, and calculates the distance to the measurement object 1401. By scanning the laser beam in one direction, the two-dimensional shape along the scanning line is measured. In the irradiation direction of the laser beam 1403, however, there is no measurement object, so no reflected light is received and distance measurement is impossible.
  • FIG. 15 is a view showing the scanning surface of the laser beam of FIG. 14.
  • The distance measurement unit 1301 sequentially measures n data points while changing the measurement direction θ by the angular resolution Δθ.
  • the measurement direction of the i-th measurement data is ⁇ i
  • the measurement result is the distance ri.
  • The combination of distance and direction (ri, θi) at this time represents the position of the measurement object in a relative polar coordinate system centered on the distance measurement unit 1301.
  • a dotted line arrow represents the measurement result regarding the irradiation direction of each laser beam 1402, and the end 1500 of the arrow is the position of the point to be measured.
  • a closed solid line indicates a measurement target object in space
  • a dotted line arrow 1501 that collides with the object is data that has been successfully measured.
  • a dotted arrow that did not hit the closed solid line indicates that measurement was not possible because there was no reflected light.
  • a set of data that has been successfully measured becomes a two-dimensional shape of the measurement object 1401 expressed in polar coordinates.
  • The shape is restored by converting these polar coordinates into three-dimensional coordinate values (Xr, Yr, Zr) by the conversion formula (7).
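Conversion formula (7) is not reproduced in this text; a plausible form, assuming the scan lies in the sensor's local x-y plane and (R, t) is the scale-corrected posture of the sensor, is:

```latex
% Presumed form of conversion (7): polar measurement (r_i, \theta_i) mapped
% to a point in the sensor plane, then into world coordinates via (R, t).
\begin{bmatrix} X_r \\ Y_r \\ Z_r \end{bmatrix}
  = R \begin{bmatrix} r_i \cos\theta_i \\ r_i \sin\theta_i \\ 0 \end{bmatrix} + t \tag{7}
```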
  • As described above, in the present embodiment, by combining the scale-unknown shape obtained from images taken by the imaging unit from a plurality of different positions with the scale obtained from the partial shape measured by the distance measurement unit 100, the three-dimensional shape of the entire measurement target, which the distance measurement unit 100 cannot measure directly, can be measured. Further, since the system can be configured with an inexpensive 2D distance sensor as the distance measurement unit 1301, it can be built at low cost. In addition, since the laser scanning period is short, the distance measurement time per measurement is short, the measurement object can be measured from many directions, and the accuracy of the three-dimensional measurement improves.
  • As described above, the scale-unknown provisional relative posture and provisional shape in the three-dimensional space are obtained from the correspondence between the feature points of image data captured from a plurality of directions, and the shape of the three-dimensional space is restored by adjusting the scale of the provisional relative posture and provisional shape using the distance data of the feature points. That is, it suffices for the distance measuring unit to obtain distance data for the feature points of the image data.
  • FIG. 16 is a diagram illustrating an external appearance of the three-dimensional shape measurement system according to the present embodiment.
  • the measurement apparatus according to the present embodiment includes a distance measurement unit 1601 and an imaging unit 1600, and moves to perform imaging and distance measurement of a measurement target from different directions to calculate the shape of the three-dimensional space.
  • the imaging unit 1600 captures the entire shape of the measurement object in the measurement space and acquires a plurality of image data with different viewpoints.
  • The distance measurement unit 1601 comprises a laser distance meter that irradiates a laser beam 1602 in a given direction and measures the distance to one point on the measurement object, and a pan head 1603 that changes the irradiation direction and inclination of the laser beam 1602 by controlling the orientation of the laser distance meter. The pan head 1603 is controlled so that the laser light can be directed at the point of the measurement object corresponding to an arbitrary point of the image data.
  • the three-dimensional shape measurement system of the present embodiment restores the measurement object based on the measurement results of the imaging unit 1600 and the distance measurement unit 1601 with the same configuration as the shape restoration device 21 shown in FIG. For this reason, description of the apparatus configuration that performs the restoration process in the present embodiment is omitted.
  • FIG. 17 is a diagram showing a processing flow of the three-dimensional shape measurement system of the present embodiment.
  • the imaging unit 1600 images the measurement object in the measurement space, and acquires image data obtained by imaging the measurement object from a plurality of directions (S171).
  • the image data acquired in step S171 is analyzed, and feature point recognition processing for recognizing feature points on the image is performed (S172).
  • a correspondence calculation process for associating the same feature points among a plurality of image data is performed (S173).
  • In step S174, the three-dimensional shape measurement system of the present embodiment obtains the scale-unknown provisional relative posture and provisional shape from the position information of the same feature points across the plurality of image data. Thereafter, the pan head 1603 is controlled so that the laser beam is directed at the point of the measurement object corresponding to a feature point recognized in step S172, and the distance measurement unit 1601 measures the distance to that point (S175). This distance measurement is repeated for the number of feature points in the image data.
  • The three-dimensional shape measurement system of the present embodiment then adjusts the scale of the provisional relative posture and the provisional shape using the ratio between the correct-scale distances given by the distance data measured in step S175 and the corresponding distances in the scale-unknown provisional shape obtained in step S174 (S176).
  • the shape of the measurement object is restored from the temporary relative posture and the temporary shape and the measured distance data of the feature points (S177).
  • At this time, shape restoration may also be performed for portions other than the feature points using multi-view epipolar constraints.
  • The processing of steps S173 and S174 requires considerable processing resources, such as processor time. Also, in the distance measurement of step S175, if the number of feature points is large, adjusting the direction takes time and the acquisition of distance data becomes long. As a result, the processing time of the entire flow of FIG. 17 becomes long, and the measurement may take a long time or measurement from many viewpoint directions may become impossible.
  • However, since the process of step S175 can be performed at any time after the feature points are obtained, the distance data acquisition (S175) may be started immediately after step S172 and executed in parallel with the processes of steps S173 and S174; a sketch of this parallelization follows this item.
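A minimal sketch of overlapping the distance acquisition (S175) with the correspondence calculation (S173) and posture/shape estimation (S174) using a thread pool; every function name here is a hypothetical stand-in for the corresponding processing step, not an API from the patent:

```python
from concurrent.futures import ThreadPoolExecutor

feature_points = recognize_feature_points(image_data)             # S172

with ThreadPoolExecutor(max_workers=2) as pool:
    # S175 depends only on S172, so start it immediately ...
    dist_future = pool.submit(acquire_feature_distances, feature_points)
    # ... while S173 and S174 run concurrently on the main path
    correspondence = calculate_correspondence(feature_points)     # S173
    posture, shape = estimate_provisional_posture_shape(correspondence)  # S174

distance_data = dist_future.result()                              # join before S176
posture, shape = adjust_scale(posture, shape, distance_data)      # S176
```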
  • As described above, in the present embodiment as well, by combining the scale-unknown shape obtained from images taken by the imaging unit from a plurality of different positions with the scale obtained from the partial shape measured by the distance measurement unit 100, the three-dimensional shape of the entire measurement target, which the distance measurement unit 100 cannot measure directly, can be measured. Further, according to the present embodiment, since the distance to the point of the measurement object corresponding to each image feature point can be measured with high accuracy, the measurement accuracy of the three-dimensional measurement system is easily improved.
  • the present invention is not limited to the above-described embodiments, and includes various modifications.
  • The above-described embodiments have been described in detail for ease of understanding of the present invention, and the invention is not necessarily limited to configurations having all of the described elements. Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of an embodiment.
  • In other words, the three-dimensional measurement system of the present invention comprises: an imaging unit that captures an image of a measurement space including a measurement object and acquires image data; a provisional relative posture / shape calculation unit that obtains a plurality of corresponding feature points across the plurality of image data captured from different directions by the imaging unit and calculates a scale-unknown provisional relative posture and provisional shape; a distance measurement unit that measures the distance to the measurement object corresponding to the feature points; and an actual shape calculation unit that calculates the three-dimensional coordinate values of the feature points whose distances could not be measured by the distance measurement unit from the distances of the feature points that could be measured and from the scale-unknown provisional relative posture and provisional shape, and restores the overall shape.
  • Further, the measurement method of the three-dimensional measurement system of the present invention is a measurement method of a three-dimensional shape measurement system that includes a distance measurement unit measuring distances to a plurality of points of a measurement object and an imaging unit imaging the measurement space including the measurement object to acquire image data, and that measures the shape of the measurement space, the method comprising: obtaining a plurality of corresponding feature points across the plurality of image data captured from different directions by the imaging unit and calculating a scale-unknown provisional relative posture and provisional shape composed of the feature points; and calculating the three-dimensional coordinate values of the feature points whose distances could not be measured by the distance measurement unit from the distances of the feature points that could be measured and from the scale-unknown provisional relative posture and provisional shape, and restoring the overall shape.

Abstract

To provide a three-dimensional shape measurement system and measurement method capable of shape measurement of the entirety of an object of measurement when only a portion of the shape of the object of measurement can be obtained using distance measurement, a three-dimensional shape measurement system according to the present invention is provided with a distance measurement unit (100) for measuring the distance to a plurality of points on an object of measurement, an imaging unit (101) for acquiring image data by imaging a measurement space including the object of measurement, and a shape reconstruction device (21) for determining a plurality of image data feature points corresponding to each of a plurality of image data items imaged from different directions by the imaging unit, determining a provisional relative posture and provisional shape of unknown scale from position information for the image data of the plurality of feature points, and reconstructing the shape of the object of measurement by calculating the three-dimensional coordinates of the provisional relative posture and provisional shape of unknown scale on the basis of the distances to the feature points measured by the distance measurement unit.

Description

Three-dimensional shape measurement system and measurement method thereof
The present invention relates to a system and a measurement method for measuring a three-dimensional shape of a space.
As one means for measuring the shape of a space, there is a 3D sensor that can measure the three-dimensional shape of the space by measuring the distance to the components of the space. By using such a 3D sensor, the shape of the entire space can be acquired quickly and with high accuracy compared to measurement by hand. In addition, such a 3D sensor makes it possible to measure, non-destructively and from a distance, the shape of high or dangerous places that cannot be reached by human hands.
These 3D sensors can be broadly classified into active methods, which irradiate the measurement object with light and measure, and passive methods, which measure without irradiating the measurement object with light.
An active 3D sensor irradiates the measurement target with laser light or LED light and calculates the distance to the measurement target from the time until the irradiated light returns. The three-dimensional coordinates of the object surface are obtained by adding the position coordinates of the 3D sensor to the calculated distance information of the object.
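As an illustration of this time-of-flight principle, the following is a minimal sketch; Python with NumPy, and the function name and vector convention, are assumptions made here for illustration and are not part of this description.

    import numpy as np

    C = 299_792_458.0  # speed of light in m/s

    def tof_surface_point(sensor_pos, ray_dir, round_trip_time):
        # The light travels out and back, so the one-way distance is half
        # of (speed of light) x (measured round-trip time).
        d = C * round_trip_time / 2.0
        ray = np.asarray(ray_dir, dtype=float)
        ray = ray / np.linalg.norm(ray)
        # Adding the sensor's own position coordinates yields the
        # three-dimensional coordinates of the object surface.
        return np.asarray(sensor_pos, dtype=float) + d * ray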
Patent Document 1 discloses, for a TOF (Time-Of-Flight) range image sensor that irradiates a measurement object with irradiation light and obtains a range image from the phase difference between the irradiation light and the reflected light, a technique for preventing a decrease in distance accuracy caused by saturation of the light-receiving element under the influence of shot noise, ambient light, and the like when realizing higher resolution and higher frame rates. Specifically, the technique disclosed in Patent Document 1 receives reflected light from the object at a time when the light source is not emitting, determines from frequency analysis of that reflected light an emission frequency that is not easily affected by ambient light, selects a light source having this optimum emission frequency from a plurality of prepared light sources, and generates a range image of the object by receiving the reflection of the selected light source's illumination light, thereby suppressing the influence of ambient light during light reception.
As another active 3D sensor, Patent Document 2 discloses a technique for calibrating a laser scanner, which rotationally scans a pulsed laser over a measurement object and measures distance from the time between irradiating the object with the laser light and receiving its reflection, using a calibration target of known shape that has a highly reflective portion.
As a passive 3D sensor there is, for example, the stereo camera system disclosed in Patent Document 3, which includes two imaging devices and obtains the three-dimensional coordinates of a predetermined part of the measurement object by the principle of triangulation, from the relative orientation between the imaging devices determined in advance by calibration and the positions at which the part appears in the images of both imaging devices.
Patent Document 1: International Publication No. 2010/021090
Patent Document 2: JP 2010-151682 A
Patent Document 3: JP 2013-253799 A
A laser scanner such as that described in Patent Document 2 can perform highly accurate three-dimensional distance measurement, but because it scans and measures one point at a time, measurement takes time, and it is not suited to ranging while moving at high speed.
A range image sensor such as that described in Patent Document 1 detects a two-dimensional range image and can therefore measure distance at high speed, but because the irradiation light diffuses, when the distance to the object is long no reflected light can be obtained and the distance sometimes cannot be measured.
A stereo camera type 3D sensor such as that described in Patent Document 3 can image object shapes at long range, so it can also detect the shape of distant parts of the space. However, to obtain high measurement accuracy the quantization error of the image sensor must be reduced, which requires a high-definition image sensor. In addition, a stereo camera type 3D sensor requires two image sensors, which raises a cost problem.
To solve the above problems, the three-dimensional shape measurement system of the present invention includes: a distance measurement unit that measures the distances to a plurality of points on a measurement object; an imaging unit that images a measurement space including the measurement object and acquires image data; and a shape restoration device that obtains a plurality of corresponding feature points across a plurality of image data captured from different directions by the imaging unit, obtains from the position information of those feature points in the image data a provisional relative posture and a provisional shape whose scale is unknown, and restores the shape of the measurement object by calculating the three-dimensional coordinate values of the scale-unknown provisional relative posture and provisional shape based on the distances to the feature points measured by the distance measurement unit.
Also to solve the above problems, the measurement method of the present invention for a three-dimensional shape measurement system that measures the shape of a measurement space including a measurement object proceeds as follows: the measurement space is imaged from different directions to acquire a plurality of image data; a plurality of corresponding feature points are obtained across the plurality of image data; a provisional relative posture and a provisional shape whose scale is unknown are obtained from the position information of the feature points in the image data; and the shape of the measurement object is restored by calculating the three-dimensional coordinate values of the scale-unknown provisional relative posture and provisional shape based on the distances from the imaging points to the measurement object corresponding to the feature points.
According to the three-dimensional shape measurement system and measurement method of the present invention, even when the distance measurement unit can obtain only a part of the shape of the measurement target, the shape of the entire measurement target can be measured.
FIG. 1 is a diagram showing the functional blocks of the three-dimensional shape measurement system of the embodiment.
FIG. 2 is a diagram showing an example of the hardware configuration of the three-dimensional shape measurement system of the embodiment.
FIG. 3 is a diagram showing the processing flow of the three-dimensional shape measurement system of the embodiment.
FIG. 4 is a diagram showing a measurement example of the three-dimensional shape measurement system of the embodiment.
FIG. 5 is a diagram showing an example of feature points extracted by the feature point recognition unit.
FIG. 6 is a diagram showing an example of feature point correspondence processing by the correspondence calculation unit.
FIG. 7 is a diagram explaining the details of the processing of the provisional relative posture/shape estimation unit.
FIG. 8 is a diagram explaining the detailed processing of the scale adjustment unit.
FIG. 9 is a diagram explaining the method for restoring the shape of the measurement object.
FIG. 10 is a diagram showing the correspondence between a point in three-dimensional space imaged from different directions and the epipolar geometry of the imaging units.
FIG. 11 is a diagram explaining the processing of the distance data utilization region setting unit.
FIG. 12 is a diagram explaining the processing of the feature point utilization region setting unit.
FIG. 13 is a diagram showing the external appearance of the three-dimensional shape measurement system of another embodiment.
FIG. 14 is a diagram explaining a measurement example of the distance measurement unit.
FIG. 15 is a diagram showing the scanning plane of the laser light of FIG. 14.
FIG. 16 is a diagram showing the external appearance of the three-dimensional shape measurement system of another embodiment.
FIG. 17 is a diagram showing the processing flow of the three-dimensional shape measurement system of another embodiment.
(Example 1)
First, an outline of the measurement method of the three-dimensional shape measurement system of this embodiment will be described.
The three-dimensional shape measurement system of the embodiment includes: an imaging unit, such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor, that captures two-dimensional image information of the measurement object viewed from different directions; a distance measurement unit that irradiates laser light or infrared light and measures the distances to a plurality of points on the measurement object; and a three-dimensional coordinate calculation unit that calculates the three-dimensional coordinates of the measurement object based on the image information of the measurement object viewed from different positions captured by the imaging unit and the distance information of the plurality of points measured by the distance measurement unit.
More specifically, the imaging unit photographs the measurement object from different positions, either by moving the shooting point or by providing at least two image sensors at different positions, as in a stereo camera. Since the distance between the two image sensors is not used in the distance calculation, it need not be fixed as in a stereo camera.
The imaging unit may capture RGB color images, or it may capture monochrome or infrared images.
The distance measurement unit can be a laser scanner that performs ranging with laser light, or a range image sensor that diffusely irradiates infrared light, detects the reflected light with a two-dimensional light-receiving element, and detects distance by the TOF (Time-Of-Flight) method, which converts the phase lag of the light reflected from the measurement object into distance, or by an irradiation pattern method.
Further, as will be described in detail later, since distance information to every measurement object in the measurement space is not required, the laser scanner need not be one that measures a partial two-dimensional shape of the measurement object by two-dimensionally scanning a plane at the same height as the scanner; it may instead measure a plurality of points without performing a continuous scan.
A more specific configuration example of the three-dimensional shape measurement system of this embodiment will be described later.
In the three-dimensional shape measurement system of the embodiment, the imaging unit can capture the entire shape of the measurement object in the measurement space, whereas the distance measurement unit may be unable to measure distance information for the entire shape, for example because the irradiation light does not reach the measurement object.
However, among the feature points of the image of the entire shape of the measurement object captured by the imaging unit, the three-dimensional shape measurement system of the embodiment uses the scale obtained from the three-dimensional coordinates of the feature points whose distance could be measured, together with the scale-unknown shape obtained from the feature points, to calculate the three-dimensional coordinates of the feature points whose distance could not be measured. In this way, the three-dimensional shape measurement system of the embodiment restores the measurement object.
Furthermore, for measurement points other than the feature points of the captured image of the measurement object, the three-dimensional shape measurement system of the embodiment finds corresponding points by searching only along the epipolar line, based on the principle of epipolar geometry, and restores the shape.
The details of the three-dimensional coordinate calculation unit, which calculates the three-dimensional coordinates of the measurement object based on image information obtained by photographing the measurement object from different positions with the imaging unit and distance information of a plurality of points on the measurement object measured by the distance measurement unit, are described below.
FIG. 1 is a diagram showing the functional blocks of the three-dimensional shape measurement system of the embodiment.
The three-dimensional shape measurement system of this embodiment includes a distance measurement unit 100 that measures the distance to the measurement object in the measurement space, an imaging unit 101 that measures surrounding color information, and a measurement data storage unit 102 that stores the measured data. The measurement data storage unit 102 stores, for a plurality of measurements, the distance data 103 measured by the distance measurement unit 100 paired with the image data 104 captured by the imaging unit 101.
The distance measurement unit 100 and the imaging unit 101 only need to measure at the same timing; the measurement interval need not be constant.
The three-dimensional shape measurement system of this embodiment further includes a feature point recognition unit 105 that extracts feature points from the image data 104, and a correspondence calculation unit 106 that associates the feature points extracted by the feature point recognition unit 105 across a plurality of pieces of image data 104.
Here, the feature points of the image data 104 are determined by corner detection based on variations in hue and luminance values. More specifically, they can be calculated by the Harris corner detection method or the like. Alternatively, a contour line can be extracted and vertices and inflection points obtained from its discrete curvature.
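For illustration, the following is a minimal sketch of this kind of corner-based feature extraction using the Harris detector from OpenCV; the function name and the threshold ratio are assumptions of this sketch, not part of this description.

    import cv2
    import numpy as np

    def detect_feature_points(image_bgr, threshold_ratio=0.01):
        # Corner response computed from local variations in luminance
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        response = cv2.cornerHarris(np.float32(gray), blockSize=2, ksize=3, k=0.04)
        # Keep pixels whose response exceeds a fraction of the maximum response
        ys, xs = np.where(response > threshold_ratio * response.max())
        return np.stack([xs, ys], axis=1)  # (N, 2) array of (u, v) coordinates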
The three-dimensional shape measurement system of this embodiment also includes a provisional relative posture/shape estimation unit 107 that estimates, from the feature point correspondences 114 calculated by the correspondence calculation unit 106, a scale-unknown provisional relative posture 108 between the measurement points at which the imaging unit 101 captured the plurality of image data 104, and a provisional shape 109 consisting of the scale-unknown three-dimensional coordinates of the feature points.
Note that "scale unknown" means that the absolute lengths and sizes of the postures and the shape are not determined, but that the distance relationships between postures and the relative size relationships within the shape are correct.
As described above, the three-dimensional shape measurement system of this embodiment can obtain, for each piece of the image data 104, the scale-unknown provisional relative posture 108 and provisional shape 109 based on the feature points of the plurality of image data 104, by means of the scale-unknown provisional relative posture/shape calculation unit 120 consisting of the feature point recognition unit 105, the correspondence calculation unit 106, and the provisional relative posture/shape estimation unit 107.
Furthermore, the three-dimensional shape measurement system of this embodiment has an actual shape calculation unit 121 that obtains the overall scale from the scale-unknown provisional relative posture 108 and provisional shape 109 of the feature points obtained by the scale-unknown provisional relative posture/shape calculation unit 120 and the distance data 103 measured by the distance measurement unit 100, and restores the overall shape data 113.
Specifically, the actual shape calculation unit 121 consists of a scale adjustment unit 110 and a shape restoration unit 112.
The scale adjustment unit 110 obtains the overall scale from the scale-unknown provisional shape 109 of the feature points obtained by the scale-unknown provisional relative posture/shape calculation unit 120 and the distance data 103 measured by the distance measurement unit 100, and adjusts the scale of the provisional relative posture 108 according to that scale.
The shape restoration unit 112 then restores the overall shape data 113.
When the feature points are not dense, the shape restoration unit 112 may also perform shape restoration for portions other than the feature points by multi-view epipolar constraints.
When the shape restoration unit 112 restores the overall shape, it restores the overall shape data 113 from: the scale-correct relative posture 111 obtained by the scale adjustment unit 110; the provisional shape 109; the pairs of feature points and weights, among the feature point correspondences 114, that lie within the range set by the feature point utilization region setting unit 116; and the pairs of distance data and weights, among the distance data 103 measured by the distance measurement unit 100, that lie within the range set by the distance data utilization region setting unit 115. The details of the processing of the shape restoration unit 112 will be described later.
FIG. 2 is a diagram showing an example of the hardware configuration of the three-dimensional shape measurement system of this embodiment. The three-dimensional shape measurement system of this embodiment is provided with a measurement unit 20 consisting of the distance measurement unit 100, configured with a laser scanner, range image sensor, or the like, and the imaging unit 101, configured with a CCD, CMOS image sensor, or the like; a shape restoration device 21 performs shape restoration processing based on the measurements of the measurement unit 20 to obtain the three-dimensional shape of the measurement space.
The shape restoration device 21 has: a CPU (Central Processing Unit) 202 that controls the entire device; a receiving device 203 that receives measurement data from the measurement unit 20 and transfers it to the distance data 103 and image data 104 in the memory 23; an input/display device 204 provided with the distance data utilization region setting unit 115 and the feature point utilization region setting unit 116; a processing unit 22 that performs the three-dimensional measurement processing of the embodiment; and a memory 23 that stores the measurement data of the distance data 103 and the image data 104, and the processing data of the provisional relative posture 108, the provisional shape 109, the relative posture 111, the shape data 113, and the feature point correspondences 114.
The processing unit 22 consists of the feature point recognition unit 105, the correspondence calculation unit 106, the provisional relative posture/shape estimation unit 107, the scale adjustment unit 110, and the shape restoration unit 112, which perform processing with reference to the image data 104 in the memory 23. Each processing unit may be implemented as dedicated hardware, or as software running on a dedicated processor. The CPU 202 may also execute them as a program.
FIG. 3 is a diagram showing the processing flow of the three-dimensional shape measurement system of this embodiment.
First, the three-dimensional shape measurement system of this embodiment performs measurement data acquisition processing S300, in which the measurement unit 20 (see FIG. 2) measures the distance data 103 and the image data 104 (see FIG. 1) and stores them in the memory 23. Here, the distance data 103 and the image data 104 form a pair. The distance and image measurement data are measured at different positions or orientations, so that the measurement target is measured from different directions. At least two such measurements suffice, but since more measurement data increases the amount of information about the position of each feature point, a larger number of measurements improves the measurement range and measurement accuracy.
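A minimal sketch of how such paired measurements might be held, assuming Python with NumPy; the class and field names are illustrative, not taken from this description.

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class MeasurementFrame:
        # One acquisition in step S300: a range map and a color image
        # captured at the same timing form a pair.
        distance_data: np.ndarray  # e.g. (H, W) range map, NaN where no return
        image_data: np.ndarray     # e.g. (H, W, 3) image from the imaging unit

    frames = []  # at least two frames, taken from different directions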
In step S301, it is determined whether the number of measurement data required for the three-dimensional shape measurement processing has been obtained. If the number of measurement data is insufficient (No in S301), the process returns to step S300.
If the number of data is sufficient (Yes in S301), the process proceeds to step S302.
In step S302, the feature point recognition unit 105 performs feature point recognition processing that recognizes feature points in each piece of the image data 104 obtained by imaging the measurement object from a plurality of directions (S302).
Then, the correspondence calculation unit 106 performs correspondence calculation processing that associates the same feature points across the plurality of image data (S303).
Next, the provisional relative posture/shape estimation unit 107 obtains the scale-unknown provisional relative posture 108 and provisional shape 109 from the position information of the same feature points across the plurality of image data 104 (S304).
Next, the scale adjustment unit 110 adjusts the scale of the provisional relative posture 108 and the provisional shape 109 using the ratio between the distances in the scale-unknown provisional shape 109 and the scale-correct distances obtained from the distance data 103 measured in the measurement data acquisition processing S300 (S305).
Finally, in the three-dimensional shape measurement system of this embodiment, the shape restoration unit 112 restores the shape of the measurement object from the provisional relative posture 108, the provisional shape 109, and the distance data 103 measured in the measurement data acquisition processing S300 (S306).
At this time, if the number of feature points obtained in S302 is small, the shape restoration unit 112 may also perform shape restoration for portions other than the feature points by multi-view epipolar constraints.
The flow of FIG. 3 shows shape restoration performed after the measurement object has been measured and its feature points extracted. When more image and distance data are captured and measured than the restoration processing requires, the measurement data acquisition, feature point recognition, and correspondence calculation for the next capture may be pipelined during the restoration processing.
As described above, by obtaining the feature points of the image data, obtaining the relative posture and shape of those feature points, and restoring the shape of the measurement object based on the distance data of the feature points, the calculation cost can be reduced and the processing time shortened.
FIG. 4 shows a measurement example of the three-dimensional shape measurement system of the embodiment. FIG. 4(a) shows the distance data 103, the measurement result of the distance measurement unit 100, and FIG. 4(b) shows the image data 104, the measurement result of the imaging unit 101.
The distance data 103 in FIG. 4(a) represents distance values as luminance values on the screen; the farther away a part is, the closer its color is to black. Parts where the distance could not be measured are shown with diagonal hatching (reference numeral 401).
In FIG. 4(a), the region of reference numeral 403 is a part where the distance could be measured.
FIG. 4(b) shows image data of the same place as FIG. 4(a), so the distance of a specific part of the image data can be found at the same coordinates in FIG. 4(a). For example, the ground at the bottom of FIG. 4(b) corresponds to the part at reference numeral 403 in FIG. 4(a) where the distance could be measured. The center of the distance data corresponds to the building at the center of the image data. However, the center of the distance data is a hatched region; that is, the distance to the building could not be measured.
Note that if the correspondence between the distance data and the image data is known in advance, the same coordinates need not correspond to the same part.
As will be described in detail later, in this embodiment, shape restoration is performed from a scale-unknown provisional shape obtained from the position information of image feature points, such as the white line portion 402 in FIG. 4(a), and from the distance information of the feature points whose distance is known.
Next, the details of the feature point recognition unit 105 in FIG. 1 are described with FIG. 5, the correspondence calculation unit 106 with FIG. 6, the provisional relative posture/shape estimation unit 107 with FIG. 7, the scale adjustment unit 110 with FIG. 8, and the shape restoration unit 112 with FIGS. 9 and 10.
FIG. 5 is a diagram showing an example of feature points extracted by the feature point recognition unit 105, in which a plurality of feature points 501 (black circles) are extracted from one image 500 of the image data 104 measured by the imaging unit 101.
Here, a feature point is a point where the pattern of hue or luminance around a given position in the image is distinctive, such as a corner or a color boundary. In the example shown in FIG. 5, the feature points 501 are indicated by black circles. Besides the Harris corner detection method described above, methods for selecting feature points include SIFT (Scale-Invariant Feature Transform), SURF (Speeded Up Robust Features), and the corner features of FAST (Features from Accelerated Segment Test). However, any method that records the image pattern around given coordinates and allows association between different images may be used; the method is not limited to those above.
FIG. 6 is a diagram showing an example of feature point correspondence processing by the correspondence calculation unit 106. The image data 600 in FIG. 6(a) and the image data 601 in FIG. 6(b) are image data, within the image data 104, that captured the measurement object from different points. For this reason, the same measurement object is imaged at different positions in the image data 600 and the image data 601.
The feature point 602 is one of the feature points extracted from the image data 600 by the feature point recognition unit 105, and the feature point 603 is one of the feature points extracted from the image data 601 by the feature point recognition unit 105. The feature points 602 and 603 are feature points of the same measured object, but are imaged at different positions in the image data 600 and the image data 601.
The correspondence calculation unit 106 associates each feature point of the photographed target with the position at which it appears in each piece of image data. In FIG. 6, when the feature pattern of the feature point 602 in the image data 600 recognized by the feature point recognition unit 105 matches the feature pattern of the feature point 603 in the image data 601, the correspondence calculation unit 106 associates those feature points with each other.
In FIGS. 6(a) and 6(b), black points connected by dotted lines represent the associated feature points.
The correspondence calculation unit 106 performs this association processing on the feature points between all pieces of image data. However, since performing the correspondence calculation between all image data increases the calculation time, the images used for association may be selected at predetermined intervals, or the correspondence calculation may be performed only between images whose shooting locations are known to be close.
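A minimal sketch of such pairwise association follows, using ORB descriptors and brute-force matching from OpenCV; this description does not prescribe a particular descriptor, so that choice is an assumption of the sketch.

    import cv2

    def match_feature_points(img_a, img_b, n_features=500):
        # Describe the image pattern around each feature point (ORB here),
        # then associate points whose patterns match between the two images.
        orb = cv2.ORB_create(nfeatures=n_features)
        kp_a, des_a = orb.detectAndCompute(img_a, None)
        kp_b, des_b = orb.detectAndCompute(img_b, None)
        # crossCheck keeps only mutually-best matches as a consistency filter
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)
        pts_a = [kp_a[m.queryIdx].pt for m in matches]
        pts_b = [kp_b[m.trainIdx].pt for m in matches]
        return pts_a, pts_b  # corresponding (u, v) coordinates in each image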
The details of the processing of the provisional relative posture/shape estimation unit 107 are described with FIG. 7.
FIG. 7 is a diagram showing the positional relationship between corresponding feature points and the imaging unit 101 for two pieces of image data captured by the imaging unit 101. The provisional relative posture/shape estimation unit 107 estimates the scale-unknown provisional relative posture 108 between the shooting points and the provisional shape 109 from the correspondence between the coordinates of a plurality of feature points across the images.
Here, the relative posture is the difference in position and orientation between shooting points, and the provisional relative posture 108 refers to the case where its scale is unknown. The relative posture 111 refers to a relative posture whose scale is correct. The shape data 113 refers to the set of points whose three-dimensional coordinates have been obtained, and the provisional shape 109 refers to shape data whose scale is unknown.
The scale-unknown three-dimensional coordinates of the feature points and the relative posture between the shooting points can be uniquely determined if a sufficient number of pairs of feature point coordinates and their correspondences between the images are obtained. As for the number of correspondences required, it suffices that five or more feature point correspondences are obtained per pair of images. This calculation can be performed by the five-point algorithm, the eight-point algorithm, or the calculation method below.
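For illustration, the following sketch recovers the scale-unknown relative posture with OpenCV's five-point solver; the helper name is an assumption, and K denotes the camera intrinsic matrix.

    import cv2
    import numpy as np

    def estimate_provisional_pose(pts_a, pts_b, K):
        # Five-point algorithm (with RANSAC) on >= 5 correspondences
        pts_a = np.asarray(pts_a, dtype=np.float64)
        pts_b = np.asarray(pts_b, dtype=np.float64)
        E, mask = cv2.findEssentialMat(pts_a, pts_b, K,
                                       method=cv2.RANSAC, prob=0.999, threshold=1.0)
        # R is the relative rotation; t is a unit-norm translation direction,
        # i.e. the relative posture is recovered with its scale still unknown.
        _, R, t, _ = cv2.recoverPose(E, pts_a, pts_b, K, mask=mask)
        return R, t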
A more specific calculation method is described below.
FIG. 7(a) shows image data 700 captured by the imaging unit 101, and FIG. 7(b) shows image data 701 obtained by imaging the same measurement object from a different position. Reference numeral 702 in FIG. 7(a) represents one of the feature points of the measurement object, and reference numeral 703 in FIG. 7(b) represents the feature point of the same measurement object corresponding to reference numeral 702.
FIG. 7(c) is a diagram showing the positional relationship between the measurement object 707 and the image data 700 and 701. Reference numeral 704 represents the imaging point when the image data 700 was captured, and reference numeral 705 represents the imaging point when the image data 701 was captured. Reference numeral 706 is the value of the relative posture between the imaging points 704 and 705.
A corner portion 708 of the measurement object 707 corresponds to the feature point 702 of the image data 700 and to the feature point 703 of the image data 701.
The positional relationship between the corner 708 of the measurement object in FIG. 7(c) and the feature points 702 and 703 can be expressed by the following equation.
s\,(u, v, 1)^T = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix} (R \mid t)\,(X, Y, Z, 1)^T \qquad (1)
In Equation (1), (fx, fy) is the focal length of the camera, (cx, cy) is the principal point of the image, and s is a normalization term that makes the third element of the left-hand vector equal to 1, corresponding to the distance to the object in the camera coordinate system. The matrix (R, t) is the relative posture between the shooting points, and (X, Y, Z) are the three-dimensional coordinates of the feature point.
According to Equation (1), if the three-dimensional coordinates of a feature point are known, its coordinates (u, v) on the image data are determined. Therefore, the provisional relative posture and provisional shape can be determined by finding the three-dimensional coordinates (X, Y, Z) of the feature points and the matrix (R, t) representing the relative posture such that (u, v) matches the coordinates (mx, my) at which the feature point was actually observed in the image.
However, since (mx, my) and (u, v) never match exactly, due to feature point recognition errors and the like, the optimal relative posture and the three-dimensional coordinates of the plurality of feature points are obtained by minimizing the backprojection error E of Equation (2), the difference between (mx, my) and (u, v):

E = \sum_{i} \left\{ (u_i - mx_i)^2 + (v_i - my_i)^2 \right\} \qquad (2)
By obtaining the three-dimensional coordinates of the feature points in this way, the shape of the measurement object at the feature point positions is determined. Any optimization method that can minimize the evaluation expression may be used, such as the steepest descent method or the Levenberg-Marquardt method.
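As a concrete illustration, the following sketch implements the projection of Equation (1) and minimizes the backprojection error of Equation (2) with SciPy's Levenberg-Marquardt solver; the pose parameterization (a Rodrigues rotation vector plus translation) and the single-view residual layout are simplifying assumptions of this sketch.

    import cv2
    import numpy as np
    from scipy.optimize import least_squares

    def project(pts3d, rvec, t, K):
        # Equation (1): s * (u, v, 1)^T = K (R | t) (X, Y, Z, 1)^T
        R, _ = cv2.Rodrigues(rvec)
        cam = pts3d @ R.T + t            # camera-frame coordinates
        uvs = cam @ K.T
        return uvs[:, :2] / uvs[:, 2:3]  # divide by the normalization term s

    def minimize_backprojection_error(rvec0, t0, pts3d0, observed_uv, K):
        # Equation (2): least squares over the pose and the feature points
        n = len(pts3d0)

        def residuals(x):
            rvec, t, pts = x[:3], x[3:6], x[6:].reshape(n, 3)
            return (project(pts, rvec, t, K) - observed_uv).ravel()

        x0 = np.concatenate([rvec0, t0, np.asarray(pts3d0).ravel()])
        sol = least_squares(residuals, x0, method="lm")  # Levenberg-Marquardt
        return sol.x[:3], sol.x[3:6], sol.x[6:].reshape(n, 3)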
Next, the detailed processing of the scale adjustment unit 110 is described with FIG. 8.
FIG. 8 is a diagram showing a method of adjusting the scale of the provisional shape 109 using the distance data 103 measured by the distance measurement unit 100.
The provisional relative posture 108 and provisional shape 109 calculated by the provisional relative posture/shape estimation unit 107 from the image data 104 captured by the imaging unit 101 are correct as to the relative magnitudes of distances within the shape, but the absolute scale of the measurement target is unknown. On the other hand, for the parts obtained by the distance measurement unit 100 the absolute scale is known, but in many cases the distances of all points cannot be measured, so distances are obtained only for a partial region of the screen.
Therefore, the scale adjustment unit 110 gives the scale information of the partial region obtained by the distance measurement unit 100 to the region where the distance could not be measured but a scale-unknown shape could be calculated, and obtains the adjustment ratio with which the scale within that region can be restored.
Specifically, suppose that for the feature point 801 of the image data 104 in FIG. 8(a), the coordinate 802 at the scale-correct distance 804 has been obtained by the distance measurement unit 800.
On the other hand, suppose the shape of the feature point 801 is estimated, as the calculation result of the provisional relative posture/shape estimation unit 107, at the position of the scale-unknown distance 805.
The scale adjustment unit 110 determines the overall scale from the ratio r of the distance 804 to the distance 805. That is, as shown in FIG. 8(b), this ratio r is used to move the feature point 801 in the provisional shape 109 to the position of the coordinate 808 (the same as the coordinate 802). Furthermore, the position of the feature point 806 whose distance could not be measured is also moved, using the ratio r, to the position of the scale-adjusted coordinate 810. In this way, the feature points 801 and 806 obtain the correct scale.
In FIG. 8(a), when, in addition to the feature point 801, the feature point 809 also has a coordinate 807 with a scale-correct distance obtained by the distance measurement unit 800, the overall scale is determined by averaging the ratio r for the feature point 809 and the ratio r for the feature point 801. That is, as shown in FIG. 8(b), the feature point 801 in the provisional shape 109 is scale-adjusted by the average ratio and moved to the position of the coordinate 808 (now different from the coordinate 802). Similarly, the feature points 806, 809, and 811 are also scale-adjusted by the average ratio and moved to the positions of the coordinates 810, 812, and 813, obtaining the correct scale.
Thus, when there are a plurality of feature points whose distances could be measured, the optimum scale is calculated by taking the average of the individual scales.
Equation (3) shows how the provisional relative posture t is corrected to the scale-correct relative posture t_r using the scale ratio r obtained above:

t_r = r\,t \qquad (3)
Equation (4) shows how the scale-unknown coordinates (X, Y, Z) of a feature point are corrected to the scale-correct coordinates (X_r, Y_r, Z_r):

(X_r, Y_r, Z_r) = r\,(X, Y, Z) \qquad (4)
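A minimal sketch of this scale adjustment, combining the averaged ratio with Equations (3) and (4); the helper and argument names are illustrative assumptions.

    import numpy as np

    def adjust_scale(t, pts3d, est_dist, meas_dist):
        # Ratio of the scale-correct measured distance to the scale-unknown
        # estimated distance, averaged over every point the sensor could reach
        r = float(np.mean(np.asarray(meas_dist) / np.asarray(est_dist)))
        t_r = r * np.asarray(t)        # Equation (3)
        pts_r = r * np.asarray(pts3d)  # Equation (4), applied to all feature
        return r, t_r, pts_r           # points, measured or not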
Next, the details of the processing of the shape restoration unit 112 are described with FIG. 9.
FIG. 9 is a diagram illustrating the method for restoring the shape of the measurement object.
The shape restoration unit 112 takes the overall scale adjusted by the scale adjustment unit 110 as an initial value, adds the scale adjustment as a further constraint condition to the backprojection error obtained by the provisional relative posture/shape estimation unit 107, and restores the scale-correct relative posture and shape.
In the present invention, by combining the scale-unknown shape obtained from images captured by the imaging unit at a plurality of different positions with the scale obtained from the partial shape near the sensor obtained by the distance measurement unit 100, it becomes possible to measure the three-dimensional shape even of distant parts that the distance measurement unit 100 cannot measure directly.
Specifically, in FIG. 9, assume that the distance 901 from the distance measurement unit 100 to the measurement object could be measured for the feature point 900 of the image data, but that the distance could not be measured for the feature point 902. If the three-dimensional coordinate 903 of the feature point 900 and the three-dimensional coordinate 904 of the feature point 902 are tentatively set, the positions of the feature points in the image data are obtained by Equation (1).
While satisfying Equation (1), for the feature point 900 whose distance could be measured, the scale-correct shape can be obtained by additionally requiring that the distance between the tentatively set three-dimensional coordinate 903 and the sensor equal the distance 901 obtained by measurement.
The evaluation formula is shown in Equation (5):

E = \sum_{i} \left\{ (u_i - mx_i)^2 + (v_i - my_i)^2 \right\} + \sum_{j} w\,(s_j - d_j)^2 \qquad (5)
Here, the weight w in Equation (5) represents the weight of the constraint on the difference between the distance to the feature point and the distance obtained by measurement. This weight controls the balance between the backprojection error and the distance difference (referred to below as the distance error), and is set according to the ratio between the feature point recognition accuracy and the distance measurement accuracy.
In Equation (5), the first and second terms are the same as in Equation (2) and represent the backprojection error.
The third term represents the distance error, where d is the distance measured by the distance measurement unit and s is the distance, obtained from Equation (1), from the sensor to the tentatively set feature point.
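For illustration, the following sketches a residual function whose sum of squares is Equation (5). Two assumptions of this sketch: s is interpreted as the Euclidean sensor-to-point distance in the camera frame, and w is taken as a scalar for brevity; the per-region weights described below would make it a per-point vector.

    import cv2
    import numpy as np
    from scipy.optimize import least_squares

    def residuals_eq5(x, observed_uv, meas_dist, measured, w, K, n):
        # Residual vector whose sum of squares is Equation (5);
        # `measured` is a boolean mask of points the range sensor reached,
        # and meas_dist holds the measured distances d for those points.
        rvec, t, pts = x[:3], x[3:6], x[6:].reshape(n, 3)
        R, _ = cv2.Rodrigues(rvec)
        cam = pts @ R.T + t
        uvs = cam @ K.T
        uv = uvs[:, :2] / uvs[:, 2:3]              # Equation (1) projection
        reproj = (uv - observed_uv).ravel()        # first and second terms
        s = np.linalg.norm(cam[measured], axis=1)  # sensor-to-point distance
        dist_err = np.sqrt(w) * (s - meas_dist)    # third term, weight w
        return np.concatenate([reproj, dist_err])

    # sol = least_squares(residuals_eq5, x0, method="lm",
    #                     args=(observed_uv, meas_dist, measured, w, K, n))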
The shape restoration unit 112 has thus obtained the posture relationship between the three-dimensional coordinates of the feature points and the distance measurement unit 100; in addition, when the feature points are not obtained densely enough, it uses the relative postures to restore, by the Multi View Stereo technique, the shape of the parts where no feature points were obtained. This restoration is performed by the following procedure, based on the principle of epipolar geometry shown in FIG. 10.
FIG. 10 is a diagram showing the correspondence between a point 1001 of the measurement object in the measurement space and the imaging unit 1003 when the point is imaged from different directions, illustrating the principle of epipolar geometry.
The point 1001 in the measurement space corresponds to the point 1000 on the image data 1004, and lies somewhere on the extension of the line connecting the imaging unit 1003 and the point 1000. The straight line 1002 in the image data 1005 obtained when the line containing the point 1001 is imaged from another viewpoint is called the epipolar line. The point on the image data 1005 at which the point 1001 appears when imaged from the other viewpoint is confined to this epipolar line 1002. This principle is called epipolar geometry.
With Multi View Stereo based on this principle, even for points that were not recognized as feature points because the image pattern is not distinctive, corresponding points can be found by searching only along the epipolar line, so the shape can be restored densely. Note that the dense shape restoration method is not limited to Multi View Stereo, as long as it can restore a dense shape from the relative postures and the image data.
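A minimal sketch of restricting the search to the epipolar line follows, using the fundamental matrix derived from the calibrated relative posture; the helper name and the assumption that t is a length-3 array are illustrative.

    import cv2
    import numpy as np

    def epipolar_lines_in_b(pts_a, R, t, K):
        # Fundamental matrix from the calibrated relative pose:
        # F = K^-T [t]_x R K^-1, with [t]_x the cross-product matrix of t
        tx = np.array([[0.0, -t[2], t[1]],
                       [t[2], 0.0, -t[0]],
                       [-t[1], t[0], 0.0]])
        K_inv = np.linalg.inv(K)
        F = K_inv.T @ tx @ R @ K_inv
        pts = np.asarray(pts_a, dtype=np.float32).reshape(-1, 1, 2)
        # Each returned row (a, b, c) is a line a*u + b*v + c = 0 in image B;
        # the correspondence search can be restricted to that line.
        return cv2.computeCorrespondEpilines(pts, 1, F).reshape(-1, 3)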
Next, the distance data utilization region setting unit 115 in FIG. 1 is described.
In the shape restoration unit 112, the value of the weight w is set by the balance between the feature point recognition accuracy and the distance recognition accuracy. Of these, the distance recognition accuracy can differ depending on the material, color, distance, and so on of the measurement object.
Therefore, the distance data utilization region setting unit 115 sets the weight w according to the level of distance accuracy, setting a plurality of pairs of a region of the image data satisfying a given distance accuracy and its weight w; according to the region within the image data, the scale adjustment unit 110 determines the scale following this weight w while the shape restoration unit 112 optimizes Equation (5).
FIG. 11 is a diagram explaining the processing of the distance data utilization region setting unit 115.
FIG. 11 shows an example in which the distance data utilization region setting unit 115 sets three kinds of weights based on the image data 1101. In the example of FIG. 11, when measured by the distance measurement unit 100, the part of the feature point 1102 has the highest accuracy, the part of the feature point 1104 the next highest, and the part of the feature point 1106 the lowest.
In this case, the setting procedure is as follows: in the image data 1101, the distance data utilization region setting unit 115 first sets the region 1103 containing the feature point 1102 and gives its weight w the largest value. It then sets the region 1105 containing the feature point 1104 and gives it a weight w smaller than the weight set for the region 1103. Finally, it sets the region 1107 containing the feature point 1106 and gives it the smallest weight w.
A predetermined weight w is set for regions where no weight w was set.
According to the weight w determined for each region by the distance data utilization area setting unit 115, the scale adjustment unit 110 determines the scale by a weighted average, and the shape restoration unit 112 restores the shape according to the value of w in equation (5).
The number of weight types is not limited to three. Where nothing is set, a predetermined fixed value is adopted as the weight w. It is also possible to use only the distance data inside a region, by setting the weight of that region to a predetermined value w and setting the weights elsewhere to zero.
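A minimal sketch of this per-region weighting, under the assumption that regions are stored as rectangles paired with weights (the patent does not prescribe a data structure; all names here are hypothetical):

```python
import numpy as np

DEFAULT_W = 1.0  # predetermined weight for areas where none was set

def weight_at(regions, uv):
    """regions: list of ((u_min, v_min, u_max, v_max), w) pairs as set by the
    distance data utilization area setting unit 115; first match wins."""
    u, v = uv
    for (u0, v0, u1, v1), w in regions:
        if u0 <= u <= u1 and v0 <= v <= v1:
            return w
    return DEFAULT_W

def weighted_scale(regions, feats_uv, d_measured, d_provisional):
    """Scale as the weighted average of measured/provisional distance ratios,
    each feature weighted by the image region it falls in."""
    ws = np.array([weight_at(regions, uv) for uv in feats_uv])
    ratios = np.asarray(d_measured, float) / np.asarray(d_provisional, float)
    return float((ws * ratios).sum() / ws.sum())
```

The same lookup can serve both the weighted-average scale of the scale adjustment unit 110 and the weighting of equation (5) in the shape restoration unit 112.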
Next, the feature point utilization area setting unit 116 in FIG. 1 will be described.
The shape restoration unit 112 sets the value of the weight w according to the balance between feature point recognition accuracy and distance recognition accuracy. Of these, the feature point recognition accuracy may vary with the resolution of the captured image and the sharpness of each feature point.
The feature point utilization area setting unit 116 therefore sets a plurality of values of the weight w, one per region of the image data, according to the recognition accuracy of the feature points; following these weights, the scale adjustment unit 110 determines the scale and the shape restoration unit 112 optimizes equation (5). Since the set value of the weight w applies to the term concerning distance recognition accuracy in equation (5), the weight w is made smaller as the feature point recognition accuracy becomes higher.
FIG. 12 is a diagram explaining the processing of the feature point utilization area setting unit 116.
FIG. 12 shows an example in which the feature point utilization area setting unit 116 sets three types of weight based on the image data 1201. In the example of FIG. 12, the feature point 1202 has the highest feature point recognition accuracy, the feature point 1204 the next highest, and the feature point 1206 the lowest.
The setting procedure in this case is as follows. In the image data 1201, the feature point utilization area setting unit 116 first sets the region 1203 containing the feature point 1202 and assigns the smallest value of the weight w. It then sets the region 1205 containing the feature point 1204 and assigns a value larger than the weight w of the region 1203. Finally, it sets the region 1207 containing the feature point 1206 and assigns the largest weight w.
A predetermined weight w is assigned to any area for which no weight has been set.
According to the weight w determined for each region by the feature point utilization area setting unit 116, the scale adjustment unit 110 determines the scale by a weighted average, and the shape restoration unit 112 restores the shape according to the value of w in equation (5).
The number of weight types is not limited to three. Where nothing is set, a predetermined fixed value is adopted as the weight w. It is also possible to use only the feature points inside a region, by setting the weight of that region to a predetermined value w and setting the weights elsewhere to zero.
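Equation (5) itself appears earlier in the document and is not reproduced in this excerpt; the sketch below assumes the form suggested by claims 7 and 16 — a backprojection error term plus a weight w times a distance error term — purely for illustration:

```python
import numpy as np

def eq5_cost(points3d, feats2d, dists, weights, project, sensor_center):
    """Assumed form of equation (5): backprojection error plus w times the
    squared distance error, summed over all feature points."""
    total = 0.0
    for X, x, d, w in zip(points3d, feats2d, dists, weights):
        reproj = project(X) - np.asarray(x, float)        # backprojection error
        dist_err = np.linalg.norm(X - sensor_center) - d  # distance error
        total += float(reproj @ reproj) + w * dist_err ** 2
    return total
```

Under this form, a large w pulls the optimization toward the measured distances, while a small w favors the image-based feature point positions, which matches how the two area setting units assign weights.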
(Example 2)
In the embodiment described above, the distance measurement unit 100 is configured with a laser scanner, a distance image sensor, or the like, and the overall shape data is restored from the distance data to the feature points that the distance measurement unit 100 was able to measure. In other words, the distance measurement unit 100 need not measure the distances to all points of the measurement object; it suffices to measure the distances to some of the feature points in the captured images.
In this example, a 2D distance sensor that measures the two-dimensional shape of part of the measurement object by scanning over a plane is used as the distance measurement unit 100, and the shape of the space is restored from the distances measured by this 2D distance sensor.
The functional blocks of the three-dimensional shape measurement system of this example differ from those of FIG. 1 in that the distance measurement unit 100 is configured with a 2D distance sensor and that the distance data utilization area setting unit 115 and the feature point utilization area setting unit 116 are not provided.
In the three-dimensional shape measurement system of this example, the feature point recognition unit 105, the correspondence calculation unit 106, and the provisional relative posture/shape estimation unit 107 obtain the provisional relative posture 108 and the provisional shape 109 from the image data 104 captured by the imaging unit 101. The scale adjustment unit 110 then corrects the provisional relative posture 108 using the scale information obtained from the distance data 103 and the provisional shape 109, yielding the relative posture 111 with the correct scale. Subsequently, the shape restoration unit 112 restores the three-dimensional shape by mapping the distance data obtained by the distance measurement unit 100 into three-dimensional space on the basis of the relative posture. This processing is the same as in the embodiment described above.
In this example, the weight w in equation (5) is set to a fixed predetermined value.
FIG. 13 shows the external appearance of the apparatus of the three-dimensional shape measurement system of this example.
The measurement apparatus of this example is configured with the imaging device 1300 mounted on the distance measurement unit 1301, which is a 2D distance sensor. The measurement apparatus is moved so that imaging and distance measurement of the measurement object are performed from different directions.
The distance measurement unit 1301 is a sensor capable of measuring a two-dimensional shape, such as surrounding unevenness, in a single sweep. For example, it is a laser scanner that scans a laser beam along a line, receives the reflected light, and measures the distance from the time between emission of the laser light and reception of its reflection.
In the apparatus of this example shown in FIG. 13, the position scanned by the distance measurement unit 1301 is associated with the imaging position of the imaging device 1300, and distance data for the points on the measurement object corresponding to the coordinate positions of feature points in the image data of the imaging device 1300 is acquired.
More specifically, when a feature point of the image data captured by the imaging device 1300 lies on the horizontal line through the image center, the laser scan of the distance measurement unit 1301, the 2D distance sensor, is made to follow that horizontal line. To this end, the orientation and tilt of the distance measurement unit 1301 can be changed, so that the distance measurement unit 1301 can shift the scanning position of the laser light and measure the distance at an arbitrary position in the image data.
The measurement apparatus of the three-dimensional shape measurement system of this example restores the measurement object with the same configuration as the shape restoration device 21 shown in FIG. 2, based on the measurement results of the imaging device 1300 and the distance measurement unit 1301. A description of the apparatus configuration that performs the restoration processing in this example is therefore omitted.
A measurement example of the distance measurement unit 1301 will be described with reference to FIG. 14.
The distance measurement unit 1301 is a 2D distance sensor: it scans the laser beam 1402 across the measurement object 1401, receives the reflection of the emitted beam, obtains the time from emission to reception from the phase difference between the emitted and reflected light, and calculates the distance to the measurement object 1401. By scanning the laser beam in one direction, it measures the two-dimensional shape along the scan line. In this example, there is no measurement object in the direction of the laser beam 1403, so no reflected light is received and that direction is treated as unmeasurable.
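For reference, the phase-difference time-of-flight conversion described above can be sketched generically as follows (a textbook formulation, not the patent's specific circuitry; the 10 MHz modulation frequency in the example is an arbitrary choice):

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def tof_distance(phase_diff_rad, mod_freq_hz):
    """Distance from the phase difference between emitted and received
    modulated light: round-trip delay t = phase / (2*pi*f), distance = c*t/2."""
    t_round_trip = phase_diff_rad / (2.0 * math.pi * mod_freq_hz)
    return C * t_round_trip / 2.0

# A phase lag of pi/2 at a 10 MHz modulation gives about 3.75 m:
print(round(tof_distance(math.pi / 2, 10e6), 2))  # 3.75
```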
Next, the method of restoring the shape from the measurement results of the distance measurement unit 1301 will be described. FIG. 15 shows the scanning plane of the laser beam of FIG. 14.
The distance measurement unit 1301 sequentially measures n data points while changing the measurement direction φ in steps of the angular resolution δφ. Here, the measurement direction of the i-th measurement is φi and the measured distance is ri. The pair (ri, φi) of distance and direction is the position of the measurement object expressed in a polar coordinate system centered on the distance measurement unit 1301.
The dotted arrows represent the measurement results for the respective emission directions of the laser beam 1402, and the end point 1500 of each arrow is the position of the measured point. The closed solid line is the measurement target object in space, and the dotted arrows 1501 that meet this object are successful measurements. Dotted arrows that do not meet the closed solid line indicate that no reflected light returned and measurement failed. The set of successfully measured data forms the two-dimensional shape of the measurement object 1401 expressed in polar coordinates.
The position (ri, φi) measured by the distance measurement unit 1301 and expressed in the polar coordinate system is converted into the orthogonal coordinate system (X, Y, Z) by the following equation (6). Since the scan lies in a single plane, this is the standard polar-to-Cartesian conversion (the original equation image is not reproduced in this text; its standard form is given here):

$$\begin{pmatrix} X \\ Y \\ Z \end{pmatrix} = \begin{pmatrix} r_i \cos \varphi_i \\ r_i \sin \varphi_i \\ 0 \end{pmatrix} \qquad (6)$$
Using the coordinates (X, Y, Z) obtained in this way and the relative posture 111, the shape is restored by computing the three-dimensional coordinate values (Xr, Yr, Zr) with the conversion of equation (7). Writing R and t for the rotation and translation of the relative posture 111, this is the standard rigid-body transform (the original equation image is not reproduced in this text):

$$\begin{pmatrix} X_r \\ Y_r \\ Z_r \end{pmatrix} = R \begin{pmatrix} X \\ Y \\ Z \end{pmatrix} + t \qquad (7)$$
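A compact sketch of equations (6) and (7) combined, where R and t stand for the rotation and translation of the relative posture 111 (their exact parameterization is not given in this excerpt) and failed beams are skipped as in FIG. 14:

```python
import numpy as np

def polar_to_cartesian(r, phi):
    """Equation (6): planar scan point (r_i, phi_i) -> (X, Y, Z) with Z = 0."""
    return np.array([r * np.cos(phi), r * np.sin(phi), 0.0])

def to_common_frame(xyz, R, t):
    """Equation (7): map sensor-frame coordinates into the common frame
    using the relative posture (rotation R, translation t)."""
    return R @ xyz + t

def restore_scan(rs, phis, R, t):
    """Restore one scan line: convert each successful measurement and skip
    directions where no reflection was received (r stored as NaN/inf)."""
    return [to_common_frame(polar_to_cartesian(r, p), R, t)
            for r, p in zip(rs, phis) if np.isfinite(r)]
```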
According to this example, by combining the scale-unknown shape obtained from images captured at a plurality of different positions by the imaging unit with the scale obtained from the partial shape measured by the distance measurement unit 100, the entire shape of the measurement target, including parts that the distance measurement unit 100 cannot measure directly, can be measured three-dimensionally.
Furthermore, according to this example, the three-dimensional shape measurement system can be configured with an inexpensive 2D distance sensor as the distance measurement unit 1301, so the system can be built at low cost.
In addition, since the scanning period of the laser light is short, the distance measurement time per measurement is short, the measurement object can be measured from many directions, and the accuracy of the three-dimensional measurement improves.
(Example 3)
As described above, in the three-dimensional shape measurement system of these examples, the scale-unknown provisional relative posture and provisional shape in three-dimensional space are obtained from the correspondences between feature points of image data captured from a plurality of directions, and the scale of the provisional relative posture and provisional shape is adjusted using the distance data of the feature points, thereby restoring the shape of the three-dimensional space.
In other words, the distance measurement unit only needs to obtain distance data for the feature points of the image data.
This example describes the case in which the distance measurement unit is a distance sensor that can measure the distance to a single designated point in the measurement space by arbitrarily changing the direction and tilt of its measurement axis, and the shape of the space is restored from its measurements.
FIG. 16 shows the external appearance of the apparatus of the three-dimensional shape measurement system of this example.
The measurement apparatus of this example includes a distance measurement unit 1601 and an imaging unit 1600; it is moved so as to perform imaging and distance measurement of the measurement target from different directions, and the shape of the three-dimensional space is calculated.
As in the examples described above, the imaging unit 1600 captures the entire shape of the measurement object in the measurement space and acquires a plurality of image data with different viewpoints.
The distance measurement unit 1601 is provided with a laser rangefinder that emits a laser beam 1602 in a given direction and measures the distance to a single point on the measurement object, and with a pan head 1603 that controls the orientation of the laser rangefinder so that the emission direction and tilt of the laser beam 1602 can be changed. The pan head 1603 is controlled so that the laser beam can be directed at the point on the measurement object corresponding to an arbitrary point in the image data.
The three-dimensional shape measurement system of this example restores the measurement object with the same configuration as the shape restoration device 21 shown in FIG. 2, based on the measurement results of the imaging unit 1600 and the distance measurement unit 1601. A description of the apparatus configuration that performs the restoration processing in this example is therefore omitted.
FIG. 17 shows the processing flow of the three-dimensional shape measurement system of this example.
First, the imaging unit 1600 images the measurement object in the measurement space and acquires image data of the measurement object captured from a plurality of directions (S171).
Next, the image data acquired in step S171 is analyzed, and feature point recognition processing that recognizes feature points in the images is performed (S172).
Then, correspondence calculation processing that associates the same feature points across the plurality of image data is performed (S173).
In step S174, the three-dimensional shape measurement system of this example obtains the scale-unknown provisional relative posture and provisional shape from the position information of the same feature points across the plurality of image data.
Thereafter, the pan head 1603 is controlled so that the laser beam is directed at the point on the measurement object corresponding to a feature point of the image data determined in step S172, and the distance measurement unit 1601 measures the distance to that point (S175). This distance measurement is repeated for the number of feature points in the image data.
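One way to derive the pan head command for a given feature point is to back-project its pixel coordinates through the camera intrinsics and convert the resulting ray into pan and tilt angles. The sketch below assumes a pinhole camera model with intrinsics K and approximately coincident camera and rangefinder origins — neither of which the patent states:

```python
import numpy as np

def pixel_to_pan_tilt(u, v, K):
    """Back-project pixel (u, v) through intrinsics K into a viewing ray and
    convert it to pan (azimuth) and tilt (elevation) angles in radians."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    x, y, z = ray / np.linalg.norm(ray)
    pan = np.arctan2(x, z)                 # rotation left/right of the optical axis
    tilt = np.arctan2(-y, np.hypot(x, z))  # image v grows downward
    return pan, tilt
```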
Next, the three-dimensional shape measurement system of this example adjusts the scale of the provisional relative posture and provisional shape using the ratio between the distances in the scale-unknown provisional shape obtained in step S174 and the correctly scaled distances obtained from the distance data measured in step S175 (S176).
Finally, the shape of the measurement object is restored from the provisional relative posture, the provisional shape, and the measured distance data of the feature points (S177).
At this time, if the number of feature points obtained in step S172 is small, shape restoration may also be performed for portions other than the feature points using the multi-view epipolar constraint.
The above flow is executed each time the measurement apparatus of this example performs a measurement after moving.
The processing of steps S173 and S174 requires substantial processing resources, such as a processor, and takes time. The distance measurement of step S175 also takes time when the number of feature points is large, since adjusting the orientation for each point is slow, which lengthens the acquisition time of the distance data. As a result, the processing time of the entire flow of FIG. 17 grows, so the measurement time may become long, or measurement from many viewpoint directions may become impossible.
In such a case, since the processing of step S175 can be performed at any time after the feature points have been obtained, the acquisition of the feature point distance data (S175) may be started right after the processing of step S172, so that the distance data acquisition of step S175 runs in parallel with the processing executed in the order of step S173 and step S174.
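This parallelization can be sketched with a thread pool (all function names are hypothetical placeholders for the processing units of FIG. 1): the distance acquisition of S175 is submitted as soon as S172 completes and runs concurrently with S173 and S174:

```python
from concurrent.futures import ThreadPoolExecutor

def measure_and_estimate(images, recognize, correspond, estimate, measure):
    """Run S175 (distance acquisition) in parallel with S173/S174."""
    feats = recognize(images)                    # S172: feature recognition
    with ThreadPoolExecutor(max_workers=1) as pool:
        fut_dist = pool.submit(measure, feats)   # S175: starts immediately
        matches = correspond(feats)              # S173: correspondence calc
        pose, shape = estimate(matches)          # S174: provisional pose/shape
        dists = fut_dist.result()                # join before scale adjustment
    return pose, shape, dists
```

Since the distance acquisition is dominated by waiting on the pan head and rangefinder hardware, a thread suffices; a CPU-bound S173/S174 could instead be moved to a separate process.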
According to this example, by combining the scale-unknown shape obtained from images captured at a plurality of different positions by the imaging unit with the scale obtained from the partial measurements of the distance measurement unit, the entire shape of the measurement target, including parts that the distance measurement unit cannot measure directly, can be measured three-dimensionally.
Furthermore, according to this example, the distance to the point on the measurement object corresponding to each feature point of the image data can be measured with high accuracy, so the measurement accuracy of the three-dimensional measurement system is easily improved.
The present invention is not limited to the examples described above and includes various modifications. The above examples are described in detail for ease of understanding of the present invention, and the invention is not necessarily limited to configurations having all of the described elements. Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
Viewed from another perspective, the three-dimensional measurement system of the present invention comprises:
an imaging unit that images a measurement space including a measurement object and acquires image data;
a provisional relative posture/shape calculation unit that obtains a plurality of feature points of the image data corresponding to each of a plurality of image data captured from different directions by the imaging unit, and calculates a scale-unknown provisional relative posture and provisional shape from the feature points;
a distance measurement unit that measures the distances to the points on the measurement object corresponding to the feature points; and
an actual shape calculation unit that calculates the three-dimensional coordinate values of feature points whose distances could not be measured by the distance measurement unit from the distances of feature points whose distances could be measured, together with the scale-unknown provisional relative posture and provisional shape, and thereby restores the overall shape.
The measurement method of the three-dimensional measurement system of the present invention is a measurement method of a three-dimensional shape measurement system that includes a distance measurement unit that measures the distances to a plurality of points on a measurement object and an imaging unit that images the measurement space including the measurement object and acquires image data, and that measures the shape of the measurement space, the method comprising:
a step of obtaining a plurality of feature points of the image data corresponding to each of a plurality of image data captured from different directions by the imaging unit, and calculating a scale-unknown provisional relative posture and provisional shape from the feature points; and
a step of calculating the three-dimensional coordinate values of feature points whose distances could not be measured by the distance measurement unit from the distances of feature points whose distances could be measured, together with the scale-unknown provisional relative posture and provisional shape, and thereby restoring the overall shape.
100 distance measurement unit
101 imaging unit
102 measurement data storage unit
103 distance data
104 image data
105 feature point recognition unit
106 correspondence calculation unit
114 feature point correspondence
107 provisional relative posture/shape estimation unit
108 provisional relative posture
109 provisional shape
110 scale adjustment unit
111 relative posture
112 shape restoration unit
113 shape data
115 distance data utilization area setting unit
116 feature point utilization area setting unit
120 provisional relative posture/shape calculation unit
121 actual shape calculation unit

Claims (16)

1. A three-dimensional shape measurement system comprising:
a distance measurement unit that measures the distances to a plurality of points on a measurement object;
an imaging unit that images a measurement space including the measurement object and acquires image data; and
a shape restoration device that obtains a plurality of feature points of the image data corresponding to each of a plurality of image data captured from different directions by the imaging unit, obtains a scale-unknown provisional relative posture and provisional shape from the position information of the plurality of feature points in the image data, calculates three-dimensional coordinate values of the scale-unknown provisional relative posture and provisional shape based on the distances to the feature points measured by the distance measurement unit, and restores the shape of the measurement object.
2. The three-dimensional shape measurement system according to claim 1, wherein the distance measurement unit is a 2D distance sensor that measures the distances in one direction of the measurement space, including the positions of the feature points of the image data.
3. The three-dimensional shape measurement system according to claim 1, wherein the distance measurement unit is a laser rangefinder that measures the distance to the point on the measurement object in the measurement space corresponding to the position of a feature point of the image data.
4. The three-dimensional shape measurement system according to claim 1, wherein the shape restoration device comprises:
a measurement data storage unit that stores a plurality of pairs of distance data measured by the distance measurement unit and image data captured by the imaging unit;
a feature point recognition unit that recognizes feature points of the image data;
a correspondence calculation unit that associates the feature points obtained from the plurality of image data with one another;
a provisional relative posture/shape estimation unit that obtains a scale-unknown provisional relative posture and provisional shape from the correspondences calculated by the correspondence calculation unit;
a scale adjustment unit that obtains a correctly scaled relative posture and shape by adjusting the scale of the provisional relative posture and provisional shape obtained by the provisional relative posture/shape estimation unit, using the ratio between the size of the provisional shape and the distances obtained by measurement with the distance measurement unit; and
a shape restoration unit that restores the shape from the shape calculated by the scale adjustment unit.
5. The three-dimensional shape measurement system according to claim 1, wherein the shape restoration device comprises:
a measurement data storage unit that stores a plurality of pairs of distance data measured by the distance measurement unit and image data captured by the imaging unit;
a feature point recognition unit that recognizes feature points of the image data;
a correspondence calculation unit that associates the feature points obtained from the plurality of image data with one another;
a provisional relative posture/shape estimation unit that obtains a scale-unknown provisional relative posture and provisional shape from the correspondences calculated by the correspondence calculation unit;
a scale adjustment unit that obtains a correctly scaled relative posture and shape by adjusting the scale of the provisional relative posture and provisional shape obtained by the provisional relative posture/shape estimation unit, using the ratio between the size of the provisional shape and the distances obtained by measurement with the distance measurement unit; and
a shape restoration unit that restores the shape using as constraint conditions a backprojection error, which indicates the difference between the screen coordinate values of the feature points obtained from the shape and relative posture calculated by the scale adjustment unit and the screen coordinate values of the feature points obtained from the image data, and a distance error, which indicates the difference between the distances to the feature points constituting the shape calculated by the scale adjustment unit and the distances obtained by measurement with the distance measurement unit.
6. The three-dimensional shape measurement system according to claim 4 or 5, wherein the shape restoration unit restores the shape of portions other than the feature points using the principle of epipolar geometry, based on the image data captured by the imaging unit and the relative posture.
7. The three-dimensional shape measurement system according to claim 5 or 6, wherein the shape restoration unit has a weighting coefficient that sets the balance between the backprojection error and the distance error.
8. The three-dimensional shape measurement system according to any one of claims 4 to 7, wherein the distance measurement unit measures the distances to the measurement object in a single sweep, in correspondence with the imaging of the measurement object in the measurement space.
9. The three-dimensional shape measurement system according to any one of claims 4 to 8, wherein the distance measurement unit measures, for each feature point recognized by the feature point recognition unit, the distance to that feature point.
10. The three-dimensional shape measurement system according to any one of claims 4 to 9, wherein the distance measurement unit measures in a single sweep a two-dimensional shape of a part of the measurement object including points corresponding to the feature points, and the shape restoration unit restores the shape by converting the distances of the feature points measured by the distance measurement unit into three-dimensional coordinate values based on the relative posture.
11. The three-dimensional shape measurement system according to any one of claims 4 to 10, further comprising a distance data utilization area setting unit that sets weighting coefficients for the image data according to the distance accuracy and sets a plurality of pairs, each consisting of a region of the image data satisfying a given distance accuracy and its weighting coefficient, wherein the scale adjustment unit and the shape restoration unit perform scale calculation and shape restoration using the weighting coefficients.
12. The three-dimensional shape measurement system according to any one of claims 4 to 11, further comprising a distance data utilization area setting unit that sets weighting coefficients for the image data according to the distance accuracy and sets a plurality of pairs, each consisting of a region of the corresponding image data satisfying a given distance accuracy and its weighting coefficient, wherein the scale adjustment unit and the shape restoration unit perform scale calculation and shape restoration using, for each weighting coefficient, the distance data contained in the corresponding region.
13. The three-dimensional shape measurement system according to any one of claims 4 to 12, further comprising a feature point utilization area setting unit that sets weighting coefficients for the image data according to the feature point recognition accuracy and sets a plurality of pairs, each consisting of a region of the image data satisfying a given feature point recognition accuracy and its weighting coefficient, wherein the scale adjustment unit and the shape restoration unit perform scale calculation and shape restoration using the weighting coefficients.
14. The three-dimensional shape measurement system according to any one of claims 4 to 13, further comprising a feature point utilization area setting unit that sets weighting coefficients for the image data according to the feature point recognition accuracy and sets a plurality of pairs, each consisting of a region of the image data satisfying a given feature point recognition accuracy and its weighting coefficient, wherein the scale adjustment unit and the shape restoration unit use, for each weighting coefficient, the feature points contained in the corresponding region.
15. A measurement method of a three-dimensional shape measurement system for measuring the shape of a measurement space including a measurement object, the method comprising:
imaging the measurement space from different directions to acquire a plurality of image data;
obtaining a plurality of feature points of the image data corresponding to each of the plurality of image data;
obtaining a scale-unknown provisional relative posture and provisional shape from the position information of the feature points in the image data; and
calculating three-dimensional coordinate values of the scale-unknown provisional relative posture and provisional shape based on the distances from the imaging points to the points on the measurement object corresponding to the feature points, thereby restoring the shape of the measurement object.
16. A measurement method of a three-dimensional shape measurement system for measuring the shape of a measurement space, the method comprising:
storing a plurality of pairs of distance data measured by a distance measurement unit and image data captured by an imaging unit;
recognizing feature points of the image data;
associating the feature points obtained from the plurality of image data with one another;
obtaining a scale-unknown provisional relative posture and provisional shape from the associated feature points;
obtaining a correctly scaled relative posture and shape by adjusting the scale of the provisional relative posture and provisional shape according to the distances measured by the distance measurement unit; and
restoring the shape using as constraint conditions a backprojection error, which indicates the difference between the screen coordinate values of the feature points obtained from the correctly scaled relative posture and shape and the screen coordinate values of the feature points obtained from the image data, and a distance error, which indicates the difference between the distances to the feature points constituting the correctly scaled shape and the distances obtained by measurement with the distance measurement unit.
PCT/JP2015/055239 2015-02-24 2015-02-24 Three-dimensional shape measurement system and measurement method for same WO2016135856A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2015/055239 WO2016135856A1 (en) 2015-02-24 2015-02-24 Three-dimensional shape measurement system and measurement method for same
JP2017501603A JP6282377B2 (en) 2015-02-24 2015-02-24 Three-dimensional shape measurement system and measurement method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/055239 WO2016135856A1 (en) 2015-02-24 2015-02-24 Three-dimensional shape measurement system and measurement method for same

Publications (1)

Publication Number Publication Date
WO2016135856A1 true WO2016135856A1 (en) 2016-09-01

Family

ID=56788039

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/055239 WO2016135856A1 (en) 2015-02-24 2015-02-24 Three-dimensional shape measurement system and measurement method for same

Country Status (2)

Country Link
JP (1) JP6282377B2 (en)
WO (1) WO2016135856A1 (en)


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5369036B2 (en) * 2010-03-26 2013-12-18 パナソニック株式会社 Passer detection device, passer detection method
JP2013101464A (en) * 2011-11-08 2013-05-23 Canon Inc Image processing device and image processing method
JP2014185996A (en) * 2013-03-25 2014-10-02 Toshiba Corp Measurement device
DE102013009288B4 (en) * 2013-06-04 2016-02-04 Testo Ag 3D recording device, method for creating a 3D image and method for setting up a 3D recording device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007263669A (en) * 2006-03-28 2007-10-11 Denso It Laboratory Inc Three-dimensional coordinates acquisition system
JP2009186353A (en) * 2008-02-07 2009-08-20 Fujitsu Ten Ltd Object detecting device and object detecting method
WO2012172870A1 (en) * 2011-06-14 2012-12-20 日産自動車株式会社 Distance measurement device and environment map generation apparatus

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110716210A (en) * 2018-07-12 2020-01-21 发那科株式会社 Distance measuring device with distance correction function
CN110716210B (en) * 2018-07-12 2024-01-30 发那科株式会社 Distance measuring device with distance correction function
US20230087702A1 (en) * 2019-01-10 2023-03-23 State Farm Mutual Automobile Insurance Company Systems and methods for enhanced base map generation
US11954797B2 (en) * 2019-01-10 2024-04-09 State Farm Mutual Automobile Insurance Company Systems and methods for enhanced base map generation
CN111521127A (en) * 2019-02-01 2020-08-11 奥林巴斯株式会社 Measuring method, measuring apparatus, and recording medium
CN111521127B (en) * 2019-02-01 2023-04-07 仪景通株式会社 Measuring method, measuring apparatus, and recording medium
US11879732B2 (en) 2019-04-05 2024-01-23 Ikegps Group Limited Methods of measuring structures
CN112485807A (en) * 2019-08-22 2021-03-12 丰田自动车株式会社 Object recognition device
CN112485807B (en) * 2019-08-22 2024-03-15 丰田自动车株式会社 Object recognition device
JP2021076532A (en) * 2019-11-12 2021-05-20 株式会社豊田中央研究所 Measuring device
JP7312089B2 (en) 2019-11-12 2023-07-20 株式会社豊田中央研究所 Measuring device

Also Published As

Publication number Publication date
JP6282377B2 (en) 2018-02-21
JPWO2016135856A1 (en) 2017-05-25

Similar Documents

Publication Publication Date Title
JP6282377B2 (en) Three-dimensional shape measurement system and measurement method thereof
TWI489082B (en) Method and system for calibrating laser measuring apparatus
GB2564794B (en) Image-stitching for dimensioning
JP5961945B2 (en) Image processing apparatus, projector and projector system having the image processing apparatus, image processing method, program thereof, and recording medium recording the program
US7711182B2 (en) Method and system for sensing 3D shapes of objects with specular and hybrid specular-diffuse surfaces
JP5393318B2 (en) Position and orientation measurement method and apparatus
JP5671281B2 (en) Position / orientation measuring apparatus, control method and program for position / orientation measuring apparatus
JP6363863B2 (en) Information processing apparatus and information processing method
JP6168577B2 (en) System and method for adjusting a reference line of an imaging system having a microlens array
JP2015528109A (en) 3D scanning and positioning system
JP2008249432A (en) Three-dimensional image measuring device, method, and program of non-static object
CN107808398B (en) Camera parameter calculation device, calculation method, program, and recording medium
JP6735615B2 (en) Information processing apparatus, information processing apparatus control method, and program
JP2016170610A (en) Three-dimensional model processing device and camera calibration system
JP4379626B2 (en) Three-dimensional shape measuring method and apparatus
GB2544263A (en) Systems and methods for imaging three-dimensional objects
JP6573196B2 (en) Distance information correction apparatus, distance information correction method, and distance information correction program
US20190313082A1 (en) Apparatus and method for measuring position of stereo camera
JP5976089B2 (en) Position / orientation measuring apparatus, position / orientation measuring method, and program
JP2018009927A (en) Image processing device, image processing method and program
CN113155053A (en) Three-dimensional geometry measuring device and three-dimensional geometry measuring method
US11748908B1 (en) Systems and methods for generating point-accurate three-dimensional models with point-accurate color information from a non-cosited capture
CN111742352A (en) 3D object modeling method and related device and computer program product
JP6867766B2 (en) Information processing device and its control method, program
JP2016170031A (en) Three-dimensional model processing device and camera calibration system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15883153

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017501603

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15883153

Country of ref document: EP

Kind code of ref document: A1