WO2018066698A1 - Shape measuring apparatus and method

Shape measuring apparatus and method

Info

Publication number
WO2018066698A1
Authority
WO
WIPO (PCT)
Prior art keywords
wavelength
distance
shape
measuring apparatus
medium
Application number
PCT/JP2017/036510
Other languages
English (en)
Japanese (ja)
Inventor
Imari Sato (佐藤 いまり)
Yuta Asano (浅野 祐太)
Yinqiang Zheng (鄭 銀強)
Ko Nishino (西野 恒)
Original Assignee
Research Organization of Information and Systems (Inter-University Research Institute Corporation)
Drexel University
Application filed by Research Organization of Information and Systems (Inter-University Research Institute Corporation) and Drexel University
Priority to JP2018543991A (granted as JP6979701B2)
Publication of WO2018066698A1

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01B — MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 — Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures

Definitions

  • The present invention relates to an optical shape measuring apparatus and method for measuring the shape of an object to be measured.
  • Patent Document 1 discloses a three-dimensional imaging system using absorption.
  • In that system, a fluorescent medium between the object and the sensor is excited by excitation light from an excitation light source, and light beams of two wavelengths are emitted.
  • The sensor receives the light beams reflected by the object.
  • A computer connected to the sensor determines the thickness of the medium based on the ratio between the intensity at the first wavelength and the intensity at the second wavelength, determines a plurality of additional thicknesses in the region of the object, and then reconstructs a three-dimensional image of the object region.
  • Patent Document 2 discloses a shape measuring apparatus that accurately measures the surface shape of an object in a short time.
  • The shape measuring apparatus includes an illumination unit that illuminates the object through a medium, an optical sensor that receives light of two wavelengths reflected by the object, and a measuring unit that measures the distance to the surface of the object to be measured based on the intensities of the light at the two wavelengths and the transmittance of the medium.
  • An object of the present invention is to provide a shape measuring apparatus and method that measure the shape of an object with higher accuracy than the prior art by using the brightness of the object observed at two wavelengths through a medium.
  • Another object of the present invention is to provide a shape measuring apparatus and method that measure the shape of an object with higher accuracy, avoiding the prior art's requirement of known reflection characteristics, by using the luminance of the object at three wavelengths.
  • The shape measuring apparatus includes: a light source that irradiates, through a medium having respective absorption coefficients at a first wavelength and at a second wavelength longer than the first wavelength, the surface of an object having respective reflectance coefficients at the first and second wavelengths with light of the first and second wavelengths;
  • a sensor that receives the light arriving from the surface of the object after propagating through the medium and measures the respective intensities at the first and second wavelengths; and
  • a measuring unit that calculates the distance from the surface of the medium to the surface of the object based on the measured intensities at the first and second wavelengths and on the absorption coefficients of the medium at those wavelengths, and measures the shape of the object based on the calculated distance.
  • The first and second wavelengths are selected such that the difference between the intensity at the shortest distance at the first wavelength and the intensity at the longest distance at the second wavelength is larger than a predetermined first value, and such that the difference between the reflectance coefficient at the first wavelength and the reflectance coefficient at the second wavelength is smaller than a predetermined second value.
  • Here, the shortest distance refers to the shortest of the distances corresponding to a plurality of positions on the surface of the object,
  • and the longest distance refers to the longest of the distances corresponding to those positions.
  • In another aspect, the shape measuring apparatus uses a third wavelength lying between the first wavelength and a second wavelength longer than the first wavelength, and a medium having respective absorption coefficients at the first, second, and third wavelengths.
  • The reflection coefficient characteristic of the surface of the object has a substantially linear relationship over the wavelength range from the first wavelength through the third wavelength to the second wavelength.
  • Under this condition, a measuring unit measures the shape of the object.
  • the shape can be measured using light of two wavelengths coming from an object with higher accuracy than the prior art.
  • the shape of the object can be measured using light of three wavelengths coming from the object with higher accuracy than in the prior art.
  • FIG. 2 is a schematic block diagram illustrating a method.
  • FIG. 5 is a graph showing the light absorption ratio of water from 400 nm to 1400 nm. FIG. 6A is a photographic image showing the difference in the appearance of water between visible light and near infrared, taken with an infrared camera without using a filter.
  • FIG. 7 is a schematic side view of an apparatus illustrating the amount of light according to the Lambert-Beer law. FIG. 8 is a schematic side view of the shape measuring apparatus in the case where, for example, the optical axes of the sensors 21 and 22, which are cameras, and of the light source 80 are parallel and perpendicular to the water surface. FIG. 9 is a graph showing the relationship between the relative reflectance error and the estimated distance error. FIG. 10 is a graph showing the reflection spectrum of the "24-color checkerboard".
  • FIG. 16 is a spectrum diagram showing a response function from a wavelength of 300 nm to 1100 nm of the camera of FIG. 15 (manufactured by POINT-GREY, GS3-U3-41C6NIR type).
  • FIG. 16 is a spectrum diagram showing a transmission function of two bandpass filters of FIG. 15 and a response function of a camera.
  • A photographic image of a "stone" taken by the shape measuring apparatus according to the first embodiment using a wavelength of 950 nm.
  • A photographic image showing the distance image (after black-and-white conversion) of the "stone" estimated by the shape measuring apparatus according to the first embodiment.
  • A photographic image showing the estimated shape image of the "stone" estimated by the shape measuring apparatus according to the first embodiment.
  • A photographic image showing the RGB image (after black-and-white conversion) of the "stone" estimated by the shape measuring apparatus according to the first embodiment.
  • A photographic image of a "squirrel object made of wood" taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • A photographic image of "another cherry blossom object" taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • A photographic image of "another cherry blossom object" taken by the shape measuring apparatus according to the first embodiment using a wavelength of 950 nm.
  • A photographic image showing the distance image (after black-and-white conversion) of "another cherry blossom object" estimated by the shape measuring apparatus according to the first embodiment.
  • A photographic image showing the estimated shape image of "another cherry blossom object" estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 34A is a photographic image of a "moving hand (first image)" taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 34B is a photographic image of the shape estimated by the shape measuring apparatus according to the first embodiment for the "moving hand (first image)" of FIG. 34A.
  • FIG. 34C is a photographic image of a "moving hand (second image)" taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 34D is a photographic image of the shape estimated by the shape measuring apparatus according to the first embodiment for the "moving hand (second image)" of FIG. 34C.
  • FIG. 34E is a photographic image of a "moving hand (third image)" taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 34F is a photographic image of the shape estimated by the shape measuring apparatus according to the first embodiment for the "moving hand (third image)" of FIG. 34E.
  • FIG. 34G is a photographic image of a "moving hand (fourth image)" taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 34H is a photographic image of the shape estimated by the shape measuring apparatus according to the first embodiment for the "moving hand (fourth image)" of FIG. 34G.
  • FIG. 35A is a photographic image of a "goldfish (first image)" taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 35B is a photographic image of the shape estimated by the shape measuring apparatus according to the first embodiment for the "goldfish (first image)" of FIG. 35A.
  • FIG. 35C is a photographic image of a "goldfish (second image)" taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 35D is a photographic image of the shape estimated by the shape measuring apparatus according to the first embodiment for the "goldfish (second image)" of FIG. 35C.
  • FIG. 35E is a photographic image of a "goldfish (third image)" taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 35F is a photographic image of the shape estimated by the shape measuring apparatus according to the first embodiment for the "goldfish (third image)" of FIG. 35E.
  • FIG. 35G is a photographic image of a "goldfish (fourth image)" taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 35H is a photographic image of the shape estimated by the shape measuring apparatus according to the first embodiment for the "goldfish (fourth image)" of FIG. 35G.
  • FIG. 36A is a photographic image of the RGB image (after black-and-white conversion) of an "object in a state where the surface of an egg-shaped object whose upper part is transparent is not painted" ("object without paint").
  • FIG. 36B is a photographic image of the "object without paint" of FIG. 36A, whose shape is estimated by a shape measuring apparatus according to the prior art.
  • FIG. 36C is a photographic image of the RGB image (after black-and-white conversion) of an "object in a state where the surface of an egg-shaped object whose upper part is transparent is painted" ("object with paint").
  • FIG. 36D is a photographic image of the "object with paint" of FIG. 36C, whose shape is estimated by the shape measuring apparatus according to the first embodiment.
  • A photographic image of the "object without paint" of FIG. 36A, whose shape is estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 42B is a photographic image showing the shape estimated by the shape measuring apparatus according to the second embodiment for the "food sample of a roll cake" of FIG. 42A. FIG. 43A is a photographic image showing the input image at a wavelength of 905 nm of a "turnip object" for shape estimation by the shape measuring apparatus according to the second embodiment.
  • FIG. 43B is a photographic image showing the shape estimated by the shape measuring apparatus according to the second embodiment for the "turnip object" of FIG. 43A.
  • FIG. 1A is a schematic block diagram showing a configuration of an optical shape measuring apparatus according to a first basic embodiment of the present invention.
  • The shape measuring apparatus measures the shape of an object 60 and includes a tank 50 containing a medium 10, a light source 80, a half mirror 40, bandpass filters 31 and 32, sensors 21 and 22 such as imaging cameras, a measuring device 70 including an internal memory 70m and a digital computer, an operation unit 71, and a display 72.
  • the operation of the light source 80 is controlled by a control signal S80 from the measuring device 70.
  • The inside of the tank 50 is filled with the medium 10, which is, for example, at least one of water, a liquid, a gas, a solid, and a gel. An object to be measured (hereinafter, the object 60) having an object surface 60s is placed in the tank, and during shape measurement and three-dimensional imaging the object 60 is moved by a moving support unit 61, based on a control signal S61 from the measuring device 70, while being supported on a two-dimensional plane parallel to, for example, the bottom surface of the tank 50.
  • The operation of the moving support unit 61 is the same in the other basic embodiments and the embodiments described later.
  • The medium 10 has an absorption coefficient characteristic with respect to wavelength: for example, an absorption coefficient α(λ1) at a first wavelength λ1 and an absorption coefficient α(λ2) at a second wavelength λ2 different from the first wavelength λ1.
  • The object 60 is disposed on the bottom surface of the tank 50 as described above, and the object surface 60s, a target surface such as the upper surface of the object 60, has a reflectance characteristic with respect to wavelength:
  • it has a reflectance coefficient s(λ1) at the first wavelength λ1 and a reflectance coefficient s(λ2) at the second wavelength λ2.
  • The light source 80 emits broadband parallel light including the first and second wavelengths λ1 and λ2 and irradiates the object surface 60s with it as incident light L80 through the medium 10.
  • the arrow of the incident light L80 indicates the direction in which the incident light L80 propagates.
  • the incident light L80 is reflected by the object surface 60s of the object 60 substantially along the optical axis 40a as reflected light R60 substantially parallel to the incident light L80.
  • the arrow of the reflected light R60 indicates the direction in which the reflected light R60 propagates.
  • The reflected light R60 propagates through the medium 10 and the half mirror 40, and light of the first wavelength λ1 enters the sensor 21 via the bandpass filter 31, whose pass wavelength is the first wavelength λ1.
  • The half mirror 40 also reflects part of the reflected light R60 and propagates it toward the sensor 22 via the bandpass filter 32.
  • Since the bandpass filter 32 is inserted, only light of the second wavelength λ2 out of the reflected light R60 enters the sensor 22.
  • the sensors 21 and 22 can be configured by an imaging camera, for example.
  • The sensor 21 detects the intensity of the component of the reflected light R60 at the first wavelength λ1 and outputs the intensity data to the measuring device 70.
  • The sensor 22 detects the intensity of the component of the reflected light R60 at the second wavelength λ2 and outputs the intensity data to the measuring device 70.
  • The operation unit 71 is connected to the measuring device 70 and is provided for the user to input data and instruction commands necessary for the shape measurement processing.
  • a display 72 is operably connected to the measurement device 70 and is provided for displaying the results of the shape measurement process.
  • The measuring device 70 includes an internal memory 70m that stores the following data, input in advance using, for example, the operation unit 71: the absorption coefficient characteristic of the medium 10 with respect to wavelength, for example, data on the absorption coefficient α(λ1) at the first wavelength λ1 and the absorption coefficient α(λ2) at the second wavelength λ2 different from the first wavelength λ1.
  • The measuring apparatus 70 estimates and measures the three-dimensional shape of the object surface 60s of the object 60 by executing the shape measurement process of FIG. 1B.
  • Conceptually similar to the invention disclosed in Patent Document 2, the measuring device 70 calculates the distance l (the distance from the medium surface 10s to the object surface 60s) based on the intensities at the first and second wavelengths λ1 and λ2, and thereby measures the one-dimensional, two-dimensional, or three-dimensional shape of the object.
  • the shape measuring apparatus can further perform three-dimensional imaging of the object 60 by reconstructing a three-dimensional image of the object based on the measured shape of the object.
  • However, using the following method, which differs from the shape measuring apparatus described in Patent Document 2, the shape of the object 60 can be measured by calculating the distance l with higher accuracy than the apparatus of Patent Document 2. In particular, the method of the first basic embodiment differs significantly from the shape measuring apparatus of Patent Document 2 in that a three-dimensional image of the object 60 can be reconstructed even when the object 60 is translucent.
  • Here, s(λ) is the reflected radiation spectrum of the object surface 60s of the object 60. (In the case of a transparent object, light also enters the object, so the light arriving from the object surface includes at least one of a reflection spectrum and a radiation spectrum; it is hereinafter referred to as the "reflected radiation spectrum.")
  • Although called a reflection spectrum, it is not limited to a reflective surface.
  • By defining s(λ) as the ratio of the radiance at a surface point to the incident irradiance at a particular wavelength λ, the same theory is applicable to objects with more complex material behavior such as translucency, subsurface scattering, volume scattering, and interreflection.
  • the first basic embodiment can perform three-dimensional imaging on the translucent object 60.
  • The intensity of the pixel at the point at distance l1 on the object surface 60s in the λ1 image (the image at the first wavelength λ1) should be as high as possible,
  • while the intensity of the pixel at the point at distance l2 on the object surface 60s in the λ2 image (the image at the second wavelength λ2) should be as low as possible.
  • Specifically, the difference between the intensity at the shortest distance lmin at the first wavelength λ1 and the intensity at the longest distance lmax at the second wavelength λ2 is made larger than a threshold value Dth1,
  • and ideally this difference is maximized.
  • The shortest distance lmin corresponds to the position of minimum depth on the object surface 60s of the object 60,
  • and the longest distance lmax corresponds to the position of maximum depth on the object surface 60s of the object 60.
  • It is preferable that the ratio of the intensity at the shortest distance lmin at the first wavelength λ1 to the intensity at the longest distance lmax at the second wavelength λ2 be greater than about 4 or 5.
  • The two wavelengths λ1 and λ2 are selected so that the difference between the reflectance coefficients s(λ1) and s(λ2) is as small as possible, preferably smaller than a predetermined threshold value Dth2.
  • It is also preferable that the spectral sensitivities of the sensors 21 and 22 and the intensities at the first and second wavelengths λ1 and λ2 be sufficiently high, larger than a predetermined threshold value Dth3.
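  • The selection criteria above lend themselves to a simple screening procedure. The following Python sketch is an illustrative assumption of how candidate wavelength pairs could be filtered against the three thresholds Dth1, Dth2, and Dth3; the function and its interface are hypothetical and not part of the patented apparatus, and the contrast criterion models intensities by the round-trip attenuation alone.

```python
import numpy as np

def screen_wavelength_pairs(lams, alpha, s, sens, l_min, l_max,
                            d_th1, d_th2, d_th3):
    """Filter candidate pairs (lam1, lam2) against the three criteria:
    1) contrast between the shortest path at lam1 and the longest path
       at lam2 exceeds d_th1 (intensities modeled as exp(-2*alpha*l));
    2) the reflectance difference |s(lam1) - s(lam2)| is below d_th2;
    3) sensor sensitivity at both wavelengths exceeds d_th3.
    All arrays are sampled on the common wavelength grid `lams`."""
    pairs = []
    for i in range(len(lams)):
        for j in range(i + 1, len(lams)):  # enforce lam1 < lam2
            contrast = (np.exp(-2 * alpha[i] * l_min)
                        - np.exp(-2 * alpha[j] * l_max))
            if (contrast > d_th1 and abs(s[i] - s[j]) < d_th2
                    and min(sens[i], sens[j]) > d_th3):
                pairs.append((lams[i], lams[j], contrast))
    # Prefer the pair with the largest contrast, mirroring "ideally this
    # difference is maximized" in the text.
    return sorted(pairs, key=lambda p: -p[2])
```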
  • FIG. 1B is a flowchart showing a shape measurement process executed by the measurement apparatus 70 of FIG. 1A.
  • In step S1, the intensities at the first wavelength λ1 and the second wavelength λ2 measured by the sensors 21 and 22 are received from the sensors 21 and 22.
  • In step S2, the distance l from the medium surface 10s to the object surface 60s is calculated based on the received intensities at the first wavelength λ1 and the second wavelength λ2 and on the absorption coefficients α(λ1) and α(λ2) of the medium 10 at the first and second wavelengths, which are stored in advance in the internal memory 70m.
  • In step S3, the shape of the object surface 60s of the object 60 is measured based on the calculated distance l.
  • In step S4, an image having the three-dimensional shape of the object 60 is reconstructed based on the measured shape of the object surface 60s of the object 60. Further, in step S5, the image having the three-dimensional shape of the reconstructed object 60 is displayed on the display 72, and the shape measurement process ends.
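  • As a compact illustration of steps S1 to S3, the per-pixel computation can be sketched in Python as follows; the function name and array interface are assumptions for illustration only, and the formula anticipates equation (5) derived later in this description.

```python
import numpy as np

def measure_depth_map(img_l1, img_l2, alpha1, alpha2):
    """Sketch of steps S1-S3: from the two received intensity images
    (step S1), compute the distance l from the medium surface 10s to the
    object surface 60s for each pixel (step S2); the resulting depth map
    is the measured shape of the object surface 60s (step S3)."""
    ratio = img_l1.astype(float) / np.maximum(img_l2.astype(float), 1e-12)
    return np.log(np.maximum(ratio, 1e-12)) / (2.0 * (alpha2 - alpha1))
```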
  • FIG. 4 is a schematic block diagram showing a correction method for correcting the distance l.
  • When the incident light L80 and the reflected light R60 are inclined at angles θ and φ with respect to the normal of the surface of the medium 10 (the medium surface being perpendicular to the optical axis 40a), as shown in FIG. 4, the distance should be corrected to l = (lm1/cosθ + lm2/cosφ)/2,
  • where lm1 and lm2 denote the actually measured lengths (distances) of the incident light L80 and the reflected light R60, respectively.
  • This correction method can also be applied to the second and third basic embodiments described later.
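  • A direct transcription of the stated correction into code might look as follows; this is a minimal sketch of the formula from the text, with hypothetical names.

```python
import numpy as np

def correct_distance(l_m1, l_m2, theta, phi):
    """Corrected distance for tilted rays per the text:
    l = (l_m1 / cos(theta) + l_m2 / cos(phi)) / 2, where l_m1 and l_m2
    are the measured lengths of the incident and reflected light paths
    and theta, phi are their tilt angles in radians."""
    return (l_m1 / np.cos(theta) + l_m2 / np.cos(phi)) / 2.0
```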
  • Since the two wavelengths λ1 and λ2 are selected as described above, the shape measuring apparatus can measure the shape of the object 60 with higher accuracy than the prior art.
  • FIG. 2A is a schematic block diagram showing a configuration of an optical shape measuring apparatus according to a second basic embodiment of the present invention.
  • The shape measuring apparatus according to the second basic embodiment uses three different wavelengths λ1, λ2, and λ3 such that λ1 < λ3 < λ2, and is characterized in that it measures the shape of the object 60 with higher accuracy than the first basic embodiment.
  • The shape measuring apparatus includes the medium 10, the light source 80, half mirrors 41 and 42, bandpass filters 31, 32, and 33, sensors 21, 22, and 23 such as imaging cameras, a measuring device 70A including an internal memory 70m, an operation unit 71, and a display 72.
  • differences between the first basic embodiment and the second basic embodiment will be described.
  • The reflected light R60 propagates substantially along the optical axis 40a through the medium 10, the half mirror 41, and the bandpass filter 31, whose pass wavelength is the first wavelength λ1, to the sensor 21.
  • The half mirror 41 reflects part of the reflected light R60, which propagates toward the sensor 23 through the half mirror 42 and the bandpass filter 33, whose pass wavelength is the third wavelength λ3.
  • The half mirror 42 reflects part of the reflected light R60, which propagates toward the sensor 22 through the bandpass filter 32, whose pass wavelength is the second wavelength λ2.
  • the sensors 21, 22, and 23 can be configured by, for example, an imaging camera.
  • the sensor 21 receives the reflected light R60, detects the intensity of the reflected light R60 having the first wavelength ⁇ 1, and outputs the intensity data to the measuring device 70A.
  • the sensor 22 receives the reflected light R60, detects the intensity of the reflected light R60 having the second wavelength ⁇ 2, and outputs the intensity data to the measuring device 70A.
  • the sensor 23 receives the reflected light R60, detects the intensity of the reflected light R60 having the third wavelength ⁇ 3, and outputs data of the intensity to the measuring device 70A.
  • The measuring apparatus 70A calculates the distance l based on the intensities at the first, second, and third wavelengths λ1, λ2, and λ3 having the relationship λ1 < λ3 < λ2, and, by calculating the distance l at a plurality of locations on the object surface 60s of the object 60, can measure the one-dimensional, two-dimensional, or three-dimensional shape of the object 60. In addition, the measuring apparatus 70A reconstructs a three-dimensional image of the object 60 based on the measured shape of the object 60 and images the three-dimensional image of the object 60.
  • the following coefficients are defined for measurement.
  • Among them is, for example, the absorption coefficient α(λ1) of the medium 10 at the first wavelength λ1.
  • The difference between the reflection coefficients s(λ1) and s(λ2) is smaller than a predetermined small value, such as 1/100 of the reflection coefficient s(λ1) or of the reflection coefficient s(λ2).
  • The wavelength characteristic of the reflection coefficient of the object surface 60s of the object 60 preferably has a substantially linear relationship (linear approximation) with respect to the wavelength λ over the wavelength range from the first wavelength λ1 through the third wavelength λ3 to the second wavelength λ2, as in the following equation: s(λ) ≈ aλ + b,
  • where a and b are constants.
  • FIG. 2B is a flowchart showing the shape measurement process executed by the measurement apparatus 70A of FIG. 2A.
  • In step S11, the intensities at the first wavelength λ1, the second wavelength λ2, and the third wavelength λ3 measured by the sensors 21, 22, and 23 are received from the sensors 21, 22, and 23, respectively.
  • In step S12, the distance l from the medium surface 10s to the object surface 60s is calculated based on the received intensities at the first wavelength λ1, the second wavelength λ2, and the third wavelength λ3, and on the absorption coefficients α(λ1), α(λ2), and α(λ3) of the medium 10 at those wavelengths, which are stored in advance in the internal memory 70m.
  • In step S13, the shape of the object surface 60s of the object 60 is measured under the condition that the reflection coefficient characteristic of the object surface 60s has a substantially linear relationship over the wavelength range from the first wavelength λ1 through the third wavelength λ3 to the second wavelength λ2.
  • In step S14, an image having the three-dimensional shape of the object 60 is reconstructed based on the measured shape of the object surface 60s of the object 60, and in step S15
  • the image having the three-dimensional shape of the reconstructed object 60 is displayed on the display 72, and the shape measurement process ends.
  • By accounting for the error between the reflection coefficients s(λ1) and s(λ2) under the premise of a substantially linear wavelength characteristic of the reflection coefficient, the measuring apparatus 70A
  • can calculate the distance l with higher accuracy than the first basic embodiment.
  • The measuring apparatus 70A can calculate the distance l in a shorter time than the above method by using the nonlinear least-squares method and parallel processing.
  • the measuring apparatus 70A may calculate the distance l in a shorter time than the above method by using an approximate condition of the following equation.
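  • A minimal sketch of the least-squares approach mentioned above is shown below, assuming the linear reflectance model s(λ) ≈ aλ + b introduced earlier. The parameterization (the unknowns A and B absorbing the illumination and geometry factors) and the solver choice are illustrative assumptions, not the patented solver; per-pixel calls of this kind are independent and can be parallelized, as the text notes.

```python
import numpy as np
from scipy.optimize import least_squares

def depth_from_three_wavelengths(I, lam, alpha, l0=10.0):
    """Solve I_k = (A*lam_k + B) * exp(-2*alpha_k*l) for (A, B, l) at one
    pixel, where A = I0*r*a and B = I0*r*b absorb the unknown light
    amount, geometry, and the linear reflectance s(lam) = a*lam + b."""
    I, lam, alpha = map(np.asarray, (I, lam, alpha))

    def residuals(p):
        A, B, l = p
        return (A * lam + B) * np.exp(-2.0 * alpha * l) - I

    p0 = [0.0, float(I.max()), l0]  # start from flat reflectance at depth l0
    return least_squares(residuals, p0).x[2]  # the estimated distance l
```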
  • When the wavelength characteristic of the reflection coefficient of the object surface 60s of the object 60 has a substantially linear relationship (linear approximation) with respect to wavelength,
  • the measuring apparatus 70A can thus perform shape measurement with higher accuracy than the prior art,
  • and in either variant
  • the distance calculation can be executed in a shorter time than the above method.
  • FIG. 3 is a schematic block diagram showing a configuration of an optical shape measuring apparatus according to a third basic embodiment of the present invention.
  • the shape measuring apparatus of the second basic embodiment shown in FIG. 2A includes three sensors 21, 22, and 23.
  • In the third basic embodiment, instead of the three sensors 21, 22, and 23, a single spectroscopic sensor 24 such as a spectroscope camera may be used.
  • Instead of the three bandpass filters 31, 32, and 33, only one bandpass filter 34 that passes light of the three wavelengths λ1, λ2, and λ3 is provided, and instead of the measuring apparatus 70A of FIG. 2A, a measuring device 70B is provided.
  • The reflected light R60 propagates substantially along the optical axis 40a and is received by the spectroscope sensor 24 via the bandpass filter 34.
  • the spectroscope sensor 24 outputs not only the intensities of the first, second, and third wavelengths ⁇ 1 , ⁇ 2 , and ⁇ 3 but also photographic RGB image data to the measuring device 70B.
  • the measuring device 70B measures the shape of the object 60 to form a three-dimensional image, and outputs the photographic RGB image data as the imaged data to the display 72 for display.
  • The calculation methods for speed-up and higher accuracy described in the second basic embodiment can also be used here.
  • Shape Estimation Using the Light Absorption Characteristics of a Medium: Hereinafter, distance estimation using the light absorption characteristics of the medium 10, which is for example water, will be described. First, after the light absorption characteristics are briefly explained, the principle of estimating distance from the light absorption at two wavelengths λ1 and λ2 is described. Next, the accuracy of the estimated distance is considered, and the reflected radiation spectra of objects between the two wavelengths λ1 and λ2 in the near-infrared region used in the end are verified.
  • When light passes through the medium 10, the light is absorbed by the medium 10. However, light is not absorbed at a constant rate over the entire wavelength range; rather, it is absorbed according to absorption characteristics that depend on the medium.
  • the medium 10 has a wavelength range in which light is strongly absorbed, and a wavelength range in which light is hardly absorbed.
  • FIG. 5 is a graph showing the light absorption ratio of water from 400 nm to 1400 nm. That is, FIG. 5 shows, as an example, the ratio of light absorbed over the 400 nm to 1400 nm wavelength range when light passes through 12 mm of water. As FIG. 5 shows, light is hardly absorbed in the visible range of 400 nm to 750 nm, which reflects the fact that water appears transparent to the human eye, sensitive to this wavelength range. On the other hand, absorption begins to increase at wavelengths longer than the visible region, rises rapidly from around 900 nm, and light in the near-infrared region of 1200 nm to 1400 nm is absorbed almost completely.
  • FIG. 6A is a photographic image showing the difference in the appearance of water between visible light and near infrared, and is a photographic image showing the appearance of water taken with an infrared camera without using a filter.
  • FIG. 6B is a photographic image showing the difference in the appearance of water between visible light and near infrared, and is a photographic image showing the appearance of water taken with an infrared camera through a 950 nm filter.
  • FIGS. 6A and 6B are comparative images confirming that light is hardly absorbed by water in the visible region but is strongly absorbed in the near-infrared wavelength region where absorption is strong.
  • These images were obtained by photographing a container of water with an imaging camera sensitive, in addition to the visible region, to near-infrared light up to 1100 nm.
  • FIG. 6A was taken without using a filter
  • FIG. 6B was taken with a 950 nm bandpass filter, a wavelength at which light absorption by water is strong, placed in front of the camera lens. Since the bandpass filter transmits only light in its band, only 950 nm light reaches the imaging camera, and as a result a spectral image at 950 nm is obtained.
  • FIG. 7 is a schematic side view of an apparatus illustrating the amount of light according to the Lambert-Beer law. Assuming the situation of the apparatus of FIG. 7, the relationship between the intensity I0 of light before entering the medium 10 at a certain wavelength λ and the intensity I of light after passing through the medium 10 can, according to this law, be expressed as:
  • I = I0 e^(−α(λ) l), (1)
  • where l is the distance (mm), α(λ) is the absorption coefficient (mm⁻¹) depending on the wavelength λ,
  • and e^(−α(λ) l) is Napier's number e raised to the power −α(λ) l.
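  • As a concrete illustration of this law, the following minimal Python sketch computes the transmitted intensity; the function name and the sample values are assumptions for illustration only, and the absorption coefficient must be measured for the actual medium.

```python
import numpy as np

def beer_lambert_transmitted(i0, alpha, l):
    """Transmitted intensity after a path of length l (mm) through a medium
    with absorption coefficient alpha (mm^-1): I = I0 * exp(-alpha * l)."""
    return i0 * np.exp(-alpha * l)

# Example: 12 mm of a medium with alpha = 0.1 mm^-1 transmits ~30% of the light.
print(beer_lambert_transmitted(1.0, 0.1, 12.0))  # ~0.301
```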
  • The reflectance function f(θ, λ) = r(θ) s(λ) is decomposed into the geometric characteristic r(θ) of the object 60
  • (where θ represents the angles of incident light, emitted light, and so on) and the wavelength characteristic s(λ) of reflection.
  • The sensors 21 and 22 are used because of the equipment currently available.
  • The light source 80 is not placed in the water of the medium 10. Since the directional light source 80 and the sensors 21 and 22, which are for example orthographic cameras, are assumed to use a wavelength range in which the two wavelengths are close, the influence of specular reflection of light at the water surface can be ignored, as described below. If the specular reflection component at the water surface is strong, an image containing only the specular reflection component can be photographed and subtracted from the image that contains it, or a polarizing plate can be used, so that the specular reflection component is not captured.
  • FIG. 8 is a schematic side view of the shape measuring apparatus when, for example, the optical axes of the sensors 21 and 22 and the light source 80 are parallel and perpendicular to the water surface.
  • the positional configuration of the light source 80 and the sensors 21 and 22 that are ideally arranged on a straight line will be considered.
  • the optical axes of the sensors 21 and 22 and the optical axis of the directional light source 80 are perpendicular to the flat water surface.
  • Monochromatic light having a wavelength λ1 and a light amount I0 is incident at the water surface, passes through the water of the medium 10, and reaches a point 60p on the object at a distance l.
  • The amount of light I(λ1) reflected from the point 60p and sensed by the sensors 21 and 22 is expressed by the following equation: I(λ1) = I0 r(θ) s(λ1) e^(−2α(λ1) l). (2)
  • The factor 2l appears because the light travels from the water surface (medium surface 10s) to the point 60p on the object surface 60s and back, and therefore passes through twice the distance l.
  • The geometric characteristic r(θ) of the object surface 60s and the wavelength characteristic s(λ) of reflection are naturally unknown, since they depend on the geometry and material of the object surface 60s.
  • Next, another monochromatic light is used.
  • Its wavelength is λ2, and the light source 80 emits the same light amount I0 as for the first light (wavelength λ1).
  • The second light amount I(λ2) sensed by the sensors 21 and 22 is expressed by the following equation: I(λ2) = I0 r(θ) s(λ2) e^(−2α(λ2) l). (3)
  • Taking the ratio of equations (2) and (3), the light amount I0 and the geometric characteristic r(θ) cancel, and the distance l can be calculated as the following equation: l = [ln(I(λ1)/I(λ2)) − ln(s(λ1)/s(λ2))] / (2(α(λ2) − α(λ1))). (4)
  • In this way, the geometric shape characteristic r(θ) is cancelled.
  • Furthermore, if two wavelength regions can be selected in which the reflectances at the wavelengths λ1 and λ2 are the same, that is, s(λ1) ≈ s(λ2), the distance l can be approximated by the following equation: l ≈ ln(I(λ1)/I(λ2)) / (2(α(λ2) − α(λ1))). (5)
  • Equation (5) expresses the core algorithm of the present embodiment. As a result, even if the material, surface reflection characteristics, and geometric characteristics of the object 60 are unknown, the distance l can easily be estimated by measuring the difference in luminance value of each pixel between two camera images taken at appropriately selected wavelengths.
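  • The following numeric check illustrates equation (5); the absorption coefficients and the distance are illustrative assumptions, not measured values from the embodiment.

```python
import numpy as np

alpha1, alpha2 = 0.05, 0.25   # assumed absorption coefficients (mm^-1)
l_true = 12.0                 # mm; unknown in a real measurement
# r(theta), s(lambda), and I0 cancel in the ratio, so set them to 1 here.
I1 = np.exp(-2 * alpha1 * l_true)
I2 = np.exp(-2 * alpha2 * l_true)
l_est = np.log(I1 / I2) / (2 * (alpha2 - alpha1))
print(l_est)  # 12.0 -- the distance is recovered without knowing r or s
```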
  • When the reflectances at the two wavelengths differ, that is, s(λ1) ≠ s(λ2), the term neglected in equation (5) gives the estimated distance error: Δl = ln(s(λ1)/s(λ2)) / (2(α(λ2) − α(λ1))), which relates the relative reflectance error to the estimated distance error.
  • FIG. 10 is a graph showing the reflection spectrum of the "24-color checkerboard". That is, FIG. 10 shows the reflection spectra measured for a color checkerboard in which 24 patch tiles (24 colors) of different colors are arranged in a lattice, and the 24 graphs in FIG. 10 correspond to the reflection spectra of the 24 colors. As FIG. 10 shows, the amount of light absorption in water changes rapidly in the wavelength range of 900 nm to 1000 nm. Furthermore, it will be demonstrated that the reflection spectra of various materials tend to be flat in this wavelength range. First, when the reflection spectrum of a standard color checkerboard was measured, it was found that, as shown in FIG. 10, the change in the reflection spectrum was greatly reduced in the long-wavelength region beyond 900 nm for all patch tiles.
  • FIG. 11 is a graph showing the reflection spectrum error of the “color checkerboard” in one wavelength pair.
  • a pair of 900 nm and 950 nm and a pair of 900 nm and 920 nm were examined as wavelength pairs.
  • The relative average of the difference in the reflection spectra for the 900 nm and 950 nm wavelength pair was 5.7%,
  • and for the 900 nm and 920 nm wavelength pair it further decreased to 2.1%.
  • FIG. 12A is a spectrum diagram showing the reflection spectrum of “wood”.
  • FIG. 12B is a spectrum diagram showing a reflection spectrum of “cloth”.
  • FIG. 12C is a spectrum diagram showing the reflection spectrum of “leather”.
  • FIG. 12D is a spectrum diagram showing the reflection spectrum of “metal”.
  • FIG. 13A is a graph showing the reflection spectrum error of the “wood color checkerboard” in one wavelength pair.
  • FIG. 13B is a graph showing the reflection spectrum error of the “cloth color checkerboard” in one wavelength pair.
  • FIG. 13C is a graph showing the reflection spectrum error of the “leather color checkerboard” in one wavelength pair.
  • FIG. 13D is a graph showing the reflection spectrum error of the “metal color checkerboard” in one wavelength pair.
  • Each material category includes data for about 20 different samples;
  • for example, the 20 graphs in FIG. 12A correspond to about 20 different samples of wood.
  • The average errors for the 900 nm and 950 nm wavelength pair were 3.8%, 2.1%, 6.0%, and 11.1% for the wood, cloth, leather, and metal material groups, respectively, while for the 900 nm and 920 nm wavelength pair they decreased to 1.4%, 1.1%, 1.9%, and 5.0%.
  • Although this database may be small in terms of the number of material samples, the average-error evaluation shows that the error in the reflection spectrum between two adjacent wavelengths in the near-infrared region is quite small.
  • the “Shape from Water” algorithm (hereinafter referred to as the SFW algorithm) for practical setup based on distance estimation using two wavelength images will be described below.
  • the SFW algorithm is intended to correct a camera position shift or an error of a distance estimation result due to an actual band-pass filter transmittance, which is a problem in constructing an imaging system.
  • So far, it has been assumed that the directional light source 80 and the sensors 21 and 22, which are for example orthographic projection cameras, are installed on the same optical axis and that this axis is perpendicular to the water surface. In an actual installation of the imaging system, however, it is difficult to design and adjust the optical axis to be perpendicular to the water surface, and the light source 80 or the sensors 21 and 22, or both, are expected to be slightly tilted with respect to the vertical direction. Therefore, a method is introduced to correct the error in the estimation result caused by this deviation of the device positions when the distance of a certain point is estimated.
  • FIG. 14 shows that the optical axes 21a of the sensors 21 and 22 (the optical axes of the two sensors 21 and 22 are the same, so that one reference numeral 21a is attached) and the optical axis 80a of the light source 80 are not parallel and perpendicular to the water surface.
  • the inclination of the axis 21a is ⁇
  • and the inclination of the optical axis 80a of the light source 80 is φ. Since the refractive index of water is substantially constant in the near-infrared region, it can be assumed that the inclinations θ and φ of the optical axes 21a and 80a of the sensors 21 and 22 and the light source 80 do not change between the two close near-infrared wavelengths. However, whereas the round-trip path length is 2l in equation (5), when the optical axes 21a and 80a deviate slightly from the direction perpendicular to the water surface (the reference optical axis 40a), the one-way paths lengthen to l/cosθ and l/cosφ.
  • In this case, the relationship between the luminance value measured by the sensors 21 and 22 and the water absorption coefficient can be expressed by replacing the factor 2l in the exponent with l(1/cosθ + 1/cosφ).
  • Equation (3) is rewritten in the same way.
  • From the ratio of the rewritten equations, the distance l corresponding to the optical path length can be estimated.
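  • Under this reading of the correction, a tilt-aware variant of equation (5) can be sketched as follows; since the exact forms of equations (7) and (10) are not reproduced in this text, the code is a hedged approximation of the described idea, not the patented formulas.

```python
import numpy as np

def depth_with_tilt(I1, I2, alpha1, alpha2, theta, phi):
    """Distance estimate when the sensor axis is tilted by theta and the
    light-source axis by phi from the water-surface normal: the factor 2l
    of equation (5) is replaced by l * (1/cos(theta) + 1/cos(phi))."""
    k = 1.0 / np.cos(theta) + 1.0 / np.cos(phi)
    return np.log(I1 / I2) / (k * (alpha2 - alpha1))
```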
  • the spectrum of the light source 80 and the response function of the camera have not been considered, but these can also be considered in the same way as the filter function, and are included in the transmission function of the filter.
  • Coaxial Camera System and Experiment: The present inventors constructed a coaxial system as an imaging system for estimating the shape of the object 60 using the SFW algorithm. In this system, two sensors 21 and 22 are arranged on the same axis, and images at the two wavelengths can be taken in real time at video rate. From the continuously photographed images, the shape of a geometrically complex object 60 in water or of a dynamic object 60 is estimated. First, the outline of the imaging system is described, and then the accuracy of the distance actually estimated by the system is examined.
  • FIG. 15 is a photographic image showing the appearance of the coaxial spectral imaging system (shape measuring apparatus) according to the first embodiment of the present invention.
  • A coaxial two-wavelength imaging system was constructed using a beam splitter and two grayscale cameras (POINT GREY GS3-U3-41C6NIR).
  • FIG. 16 is a spectrum diagram showing a response function from a wavelength of 300 nm to 1100 nm of the camera of FIG. 15 (manufactured by POINT-GREY, GS3-U3-41C6NIR type).
  • Since the sensitivity in the near infrared is lower than in the visible range, in actual use the light amount of the light source 80 is increased or the exposure time of the sensors 21 and 22 is slightly lengthened,
  • so that a near-infrared wavelength image can be captured clearly
  • with the camera.
  • FIG. 17 is a spectrum diagram showing the transmission function of the two bandpass filters of FIG. 15 and the response function of the camera. That is, in FIG. 17, in order to compare the response functions of the sensors 21 and 22 in the near infrared region actually used and the transmission function curves of the bandpass filter, both are plotted.
  • an incandescent lamp having a sufficient amount of light in the near infrared region is used as the light source 80.
  • FIG. 18 is a spectrum diagram showing the spectrum of the light source 80 of FIG.
  • The sensors 21 and 22, which are actually two infrared cameras, were synchronized, and the beam splitter and the sensors 21 and 22 were adjusted so that the two wavelength images could be photographed on the same optical axis and at the same angle of view with respect to the object 60; the camera system was thus constructed.
  • water absorption coefficient data is required, but it can be easily measured.
  • Specifically, the spectrum of a white target is measured with the target fixed at a known depth in water. Since light in the spectrum is absorbed according to the distance traveled, the absorption coefficient of water was calculated by fitting the data to the Lambert-Beer law.
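  • The calibration just described amounts to inverting the Lambert-Beer law for a known round-trip depth; the sketch below assumes, for illustration, a reference spectrum taken with effectively no water above the target.

```python
import numpy as np

def water_absorption_coefficient(spec_ref, spec_depth, depth_mm):
    """Estimate alpha(lambda) from a white target at a known depth:
    spec_ref   -- spectrum with (effectively) no water above the target,
    spec_depth -- spectrum with depth_mm of water above the target.
    Light crosses the water twice, hence the factor 2 * depth_mm."""
    return (np.log(np.asarray(spec_ref) / np.asarray(spec_depth))
            / (2.0 * depth_mm))
```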
  • FIG. 19 is a spectrum diagram showing the absorption coefficient of water. That is, in FIG. 19, the absorption coefficient of water measured and corrected at a plurality of depths to improve accuracy is shown.
  • The distance estimation according to the present embodiment was evaluated using a plurality of flat plates made of different materials. The distance measured with a ruler was taken as the true value, and the accuracy was evaluated while changing the distance from 10 mm to 40 mm. For each distance, the distance was estimated from equation (5) using, as input data, images at the two wavelengths taken with the coaxial imaging system. To evaluate the effectiveness of the algorithm of the method according to the present embodiment, the estimation result was corrected using equations (7) and (10).
  • FIG. 20A is a graph showing the distance accuracy in the “cyan tile”, and is a graph showing a comparison between the estimated distance and the true value.
  • FIG. 20B is a graph showing the distance accuracy in the “cyan tile” and showing the relative error of the estimated distance.
  • FIG. 21A is a graph showing the distance accuracy in the “red plastic board”, and shows a comparison between the estimated distance and the true value.
  • FIG. 21B is a graph showing the distance accuracy in the “red plastic board” and showing the relative error of the estimated distance.
  • FIG. 22A is a graph showing the distance accuracy in “white marble”, and is a graph showing a comparison between the estimated distance and the true value.
  • FIG. 22B is a graph showing the distance accuracy in “white marble”, and is a graph showing the relative error of the estimated distance.
  • FIG. 23A is a graph showing the distance accuracy in “black marble”, and is a graph showing a comparison between the estimated distance and the true value.
  • FIG. 23B is a graph showing the distance accuracy in “black marble” and showing the relative error of the estimated distance.
  • the graphs from FIG. 20A to FIG. 23B are the results of evaluating the accuracy of the estimated distance using four types of materials: cyan tile, red plastic board, white marble, and black marble.
  • For each of the four materials (cyan tile, red plastic board, white marble, and black marble), 121 points were extracted at random, and the averages of the distance estimation results at those points, the corrected results, and the true values are shown in the graphs of FIGS. 20A to 23A.
  • The distributions of the relative error of the corrected distance estimates at the 121 points are shown in the graphs of FIGS. 20B to 23B.
  • the 25th to 75th percentile values are indicated by boxes, and the average value is indicated by a central horizontal line in the rectangular box.
  • the correction algorithm plays an important role in improving the estimation accuracy.
  • the average value of the distance estimation is considerably close to the true value, and has a relative error of about 3%.
  • the corrected estimation result measured and calculated at 121 points is consistent without depending on the texture of the material.
  • Shape Estimation Experiment: Using the SFW algorithm, the inventors attempted to estimate the shapes of objects with complex reflection characteristics and of a moving object whose shape changes. Since it is difficult to obtain the true values of the shape data of the objects themselves, the estimated shapes were evaluated qualitatively.
  • FIG. 24A is a photographic image of “shellfish” taken with a wavelength of 905 nm by the shape measuring apparatus according to the first embodiment.
  • FIG. 24B is a photographic image of “shellfish” taken using the shape measuring apparatus according to the first embodiment using a wavelength of 950 nm.
  • FIG. 24C is a photographic image showing a “shellfish” distance image (after black-and-white conversion) estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 24D is a photographic image showing a “shellfish” estimated shape image estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 24E is a photographic image showing a “shellfish” RGB image (after black-and-white conversion) estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 25A is a photographic image of “stone” photographed with a wavelength of 905 nm by the shape measuring apparatus according to the first embodiment.
  • FIG. 25B is a photographic image of “stone” photographed by using the shape measuring apparatus according to the first embodiment using a wavelength of 950 nm.
  • FIG. 25C is a photographic image showing a distance image (after black-and-white conversion) of “stone” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 25D is a photographic image showing an estimated shape image of “stone” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 25E is a photographic image showing an RGB image (after black-and-white conversion) of “stone” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 26A is a photographic image of “a squirrel object made of wood”, which was photographed by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 26B is a photographic image of “a squirrel object made of wood”, which was photographed by the shape measuring apparatus according to the first embodiment using a wavelength of 950 nm.
  • FIG. 26C is a photographic image showing a distance image (after black-and-white conversion) of the “squirrel object made of wood” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 26D is a photographic image showing an estimated shape image of the “squirrel object made of wood” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 26E is a photographic image showing an RGB image (after black-and-white conversion) of a “squirrel object made of wood” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 27A is a photographic image of “monkey object made of earthenware” taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 27B is a photographic image of “monkey object made of earthenware” taken using the shape measuring apparatus according to the first embodiment at a wavelength of 950 nm.
  • FIG. 27C is a photographic image showing a distance image (after black-and-white conversion) of “monkey object made of earthenware” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 27D is a photographic image showing an estimated shape image of “monkey object made of earthenware” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 27E is a photographic image showing an RGB image (after black-and-white conversion) of “monkey object made of earthenware” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 28A is a photographic image of “a cup object having a colorful color” photographed with a wavelength of 905 nm by the shape measuring apparatus according to the first embodiment.
  • FIG. 28B is a photographic image of “a cup object having a colorful color” photographed by the shape measuring apparatus according to the first embodiment using a wavelength of 950 nm.
  • FIG. 28C is a photographic image showing a distance image (after black-and-white conversion) of “a cup object having colorful colors” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 28D is a photographic image showing an estimated shape image of “an object of a cup having colorful colors” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 28E is a photographic image that shows an RGB image (after black-and-white conversion) of a “colored cup object” estimated by the shape measurement apparatus according to the first embodiment.
  • FIG. 29A is a photographic image of “an object of a paper cup having a texture”, which was photographed by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 29B is a photographic image of “an object of a paper cup with a texture” taken using the shape measuring apparatus according to the first embodiment at a wavelength of 950 nm.
  • FIG. 29C is a photographic image showing a distance image (after black-and-white conversion) of “paper cup object with texture” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 29D is a photographic image showing an estimated shape image of “paper cup object having texture” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 29E is a photographic image showing an RGB image (after black-and-white conversion) of a “paper cup object with texture” estimated by the shape measuring apparatus according to the first embodiment.
  • The photographic images of FIG. 24A to FIG. 29E show the results of shape estimation for opaque objects having texture or complicated reflection characteristics. These objects are difficult to handle with conventional shape estimation methods, but as the estimation results show, the imaging system constructed in this embodiment and the method according to this embodiment are useful even for an object 60 with texture or with strong specular reflection characteristics.
  • The colorful cups of FIGS. 28A to 28E show an advantage of the SFW algorithm: the colorful pattern forms a complex texture that is difficult to estimate with conventional methods, but as the 905 nm and 950 nm images show, the reflectances of all the colors are almost the same in this wavelength range, so the cup can be treated as an object without texture.
  • The monkey object of FIGS. 27A to 27E is a craft product with a strong specular reflection component; if the sensors 21 and 22 saturate at a pixel where specular reflection occurs, the distance estimation result becomes incorrect, but this can be avoided by setting the sensors 21 and 22 so that saturation does not occur.
  • The shells of FIGS. 24A to 24E and the stones of FIGS. 25A to 25E have considerably uneven surfaces, but the estimation results show that the uneven portions can also be estimated.
  • FIG. 30A is a photographic image of a “cherry blossom flower object” photographed using the shape measurement apparatus according to the first embodiment at a wavelength of 905 nm.
  • FIG. 30B is a photographic image of a “cherry blossom flower object” taken using the shape measurement apparatus according to the first embodiment at a wavelength of 950 nm.
  • FIG. 30C is a photographic image showing a distance image (after black-and-white conversion) of the “cherry blossom flower object” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 30D is a photographic image that shows an estimated shape image of the “cherry blossom flower object” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 30E is a photographic image showing an RGB image (after black-and-white conversion) of the “cherry blossom object” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 31A is a photographic image of “another cherry blossom object” taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 31B is a photographic image of “another cherry blossom object” taken by the shape measurement apparatus according to the first embodiment using a wavelength of 950 nm.
  • FIG. 31C is a photographic image showing a distance image (after black-and-white conversion) of “another cherry blossom object” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 31D is a photographic image showing an estimated shape image of “another cherry blossom object” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 31E is a photographic image showing an RGB image (after black-and-white conversion) of “another cherry blossom object” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 32A is a photographic image of “semi-transparent object with color gradation” taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 32B is a photographic image of “an object that is translucent and has a color gradation” that was photographed by the shape measuring apparatus according to the first embodiment using a wavelength of 950 nm.
  • FIG. 32C is a photographic image showing the distance image (after black-and-white conversion) of “semi-transparent object with color gradation” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 32D is a photographic image showing an estimated shape image of “semi-transparent object having color gradation” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 32E is a photographic image showing an RGB image (after black-and-white conversion) of a “translucent object having a color gradation” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 33A is a photographic image of a “mirror-reflective, translucent heart-shaped object” taken using the shape measuring apparatus according to the first embodiment at a wavelength of 905 nm.
  • FIG. 33B is a photographic image of a “specular reflection and semi-transparent heart-shaped object” taken using the shape measurement apparatus according to the first embodiment at a wavelength of 950 nm.
  • FIG. 33C is a photographic image showing a distance image (after black-and-white conversion) of “a mirror-reflection semi-transparent heart-shaped object” estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 33D is a photographic image that shows an estimated shape image of a “mirror-reflection, translucent heart-shaped object” estimated by the shape measurement apparatus according to the first embodiment.
  • FIG. 33E is a photographic image that shows an RGB image (after black-and-white conversion) of a “specular-reflective, translucent heart-shaped object” estimated by the shape measurement apparatus according to the first embodiment.
  • Each object in FIGS. 30A to 33E is a semi-transparent object with strong specular reflection, and a strong gloss is visible in the RGB images.
  • the objects shown in FIGS. 32A to 32E have gradation in color and have a fairly complicated shape.
  • the coaxial imaging system constructed in this embodiment is suitable for estimating the shape of a dynamic object in real time.
  • FIG. 34A is a photographic image of a “moving hand (first image)” taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 34B is a photographic image of the “moving hand (first image)” of FIG. 34A whose shape has been estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 34C is a photographic image of a “moving hand (second image)” taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 34D is a photographic image of the “moving hand (second image)” of FIG. 34C whose shape has been estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 34E is a photographic image of a “moving hand (third image)” taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 34F is a photographic image of the “moving hand (third image)” of FIG. 34E whose shape has been estimated by the shape measuring apparatus according to the first embodiment.
  • FIG. 34G is a photographic image of “moving hand (fourth image)” taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 34H is a photographic image of the “moving hand (fourth image)” of FIG. 34G whose shape has been estimated by the shape measuring apparatus according to the first embodiment.
  • FIGS. 34A to 34H show the results of photographing a hand moving in water and estimating its shape. These results show that the shape of a dynamic object can be estimated correctly. Shooting was performed at about 30 FPS and the hand shape was estimated in every frame; the frames shown here were selected because the movement is easy to see.
  • FIGS. 35A to 35H show the shape estimation results for a goldfish swimming in an aquarium.
  • FIG. 35A is a photographic image of “goldfish (first image)” taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 35B is a photographic image of “goldfish (first image)” whose shape is estimated by the shape measuring apparatus according to the first embodiment with respect to “goldfish (first image)” of FIG. 35A.
  • FIG. 35C is a photographic image of “goldfish (second image)” taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 35D is a photographic image of “goldfish (second image)” whose shape is estimated by the shape measuring apparatus according to the first embodiment with respect to “goldfish (second image)” of FIG. 35C.
  • FIG. 35E is a photographic image of “goldfish (third image)” taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 35F is a photographic image of “goldfish (third image)” whose shape is estimated by the shape measuring apparatus according to the first embodiment with respect to “goldfish (third image)” of FIG. 35E.
  • FIG. 35G is a photographic image of “goldfish (fourth image)” taken by the shape measuring apparatus according to the first embodiment using a wavelength of 905 nm.
  • FIG. 35H is a photographic image of “goldfish (fourth image)” whose shape is estimated by the shape measuring apparatus according to the first embodiment with respect to “goldfish (fourth image)” of FIG. 35G.
  • A scene with a swimming fish could not be estimated at all by conventional methods, whereas the method using the SFW algorithm estimates it to better than the prescribed accuracy and is therefore effective. As with the hand sequence, frames in which the movement is easy to see were selected. The 905 nm images are considerably darker because the goldfish scales have strong specular reflection characteristics, so the sensors 21 and 22 were adjusted to avoid saturation.
  • Discussion: Two-wavelength distance estimation with the SFW algorithm according to the present embodiment cannot at present deal with ambient light directly. In practice, however, this can be handled by turning off the light source 80, photographing the scene illuminated only by the ambient light, and subtracting that image from the image illuminated by both the ambient light and the light source, as sketched below.
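  • A minimal sketch of this ambient-light correction, assuming linear (non-gamma-encoded) sensor output and identical exposure settings for the two shots; the function and variable names are illustrative and are not taken from the patent.

    import numpy as np

    def remove_ambient(lit, ambient):
        # 'lit': frame captured with both the ambient light and the light source 80 on.
        # 'ambient': frame captured with the light source 80 turned off.
        # The difference isolates the light-source contribution used by the SFW
        # distance estimation; clipping at zero suppresses negative noise.
        return np.clip(lit.astype(np.float64) - ambient.astype(np.float64), 0.0, None)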
  • In section 1.2.3 of the present embodiment it was shown that the reflection spectra of various materials hardly change in the 900 nm to 1000 nm wavelength range, where the absorption of the water used as the medium increases sharply. As an exception, however, it was also confirmed that some materials show a modest change in reflectance.
  • A difference in the reflectance of the object 60 between the two wavelengths leads directly to an error in the shape estimation result. To deal with this problem, the spectrum between the two wavelengths is approximated linearly, and an image at a third wavelength is used in addition to the two images already in use to correct the reflectance error of the object. The shape can then be estimated with higher accuracy.
  • Three-dimensional shape estimation of a transparent object 60 has been difficult for non-contact methods, but the SFW algorithm can estimate the shape of an object made of a transparent material such as glass or plastic, provided that the material does not absorb near-infrared light. However, when the bottom surface of the object 60 is transparent, the shape cannot be estimated correctly, because the computed distance corresponds to the path after the light has passed through the bottom surface.
  • FIG. 36A is a photographic image of an RGB image (after black-and-white conversion) of “an object in a state where the surface of an egg-shaped object whose top is transparent is not painted”.
  • FIG. 36B is a photographic image of “object without paint” whose shape is estimated by the shape measuring apparatus according to the prior art with respect to “object without paint” in FIG. 36A.
  • FIG. 36C is a photographic image of an RGB image (after black-and-white conversion) of “an object in a state where the surface of an egg-shaped object with a transparent upper part is painted”.
  • FIG. 36D is a photographic image of “object with paint” whose shape is estimated by the shape measuring apparatus according to the first embodiment with respect to “object with paint” in FIG. 36C.
  • FIG. 36E is a photographic image of “object without paint” whose shape is estimated by the shape measuring apparatus according to the first embodiment with respect to “object without paint” of FIG. 36A.
  • FIG. 36A to FIG. 36E show the results of an attempt to estimate the shape of an egg-shaped object that is not transparent on the bottom but made of transparent plastic on the top.
  • Using the SFW algorithm, the shape can be estimated correctly. For reference, the object was painted so that its surface was no longer transparent and was then measured with a laser three-dimensional measuring apparatus. Comparison with this reference shows that the SFW shape estimation result for the transparent object is correct.
  • The wavelengths are selected so that the difference between the absorption coefficients at the two near-infrared wavelengths is large, which improves the accuracy of the shape estimation.
  • On the other hand, light at the wavelength with the larger absorption coefficient (the 950 nm image in the present embodiment) is absorbed strongly, so that image can become too dark to maintain an appropriate SNR.
  • For the shape estimation results presented in this embodiment, the light amount of the light source was not adjusted owing to equipment limitations; instead, the exposure times of the sensors 21 and 22 were adjusted.
  • the present embodiment has proposed an imaging system using the SFW algorithm, which is a completely new distance estimation method using light absorption.
  • The SFW algorithm estimates the shape using the difference in the degree of light absorption between the two near-infrared wavelengths, without being affected by the surface reflection characteristics of the object.
  • A coaxial imaging system was constructed so that the two wavelength images are taken simultaneously, enabling real-time shape estimation. The results of actual shape estimation showed that the SFW algorithm can accurately estimate the shape of an object with complex reflection characteristics or of a dynamically deforming object.
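  • As a rough illustration of the two-wavelength principle, the sketch below assumes the simplified image formation model I(λ) = s(λ)·L(λ)·exp(−α(λ)·l), with equal reflectances at the two wavelengths and all path-length factors folded into the absorption coefficients; the symbols and names are assumptions made for illustration, not the patent's exact formulation.

    import numpy as np

    def distance_two_wavelengths(i1, i2, alpha1, alpha2, src1=1.0, src2=1.0, eps=1e-12):
        # Model: I(lambda) = s * L(lambda) * exp(-alpha(lambda) * l), with
        # s(lambda1) ~= s(lambda2). The per-pixel ratio cancels s:
        #   (I1 * L2) / (I2 * L1) = exp((alpha2 - alpha1) * l)
        # so l = ln((I1 * L2) / (I2 * L1)) / (alpha2 - alpha1).
        # src1, src2 are the light-source intensities at the two wavelengths.
        ratio = (i1 * src2 + eps) / (i2 * src1 + eps)  # eps guards against zero pixels
        return np.log(ratio) / (alpha2 - alpha1)

  Because the reflectance cancels in the ratio, the estimate is independent of the surface reflection characteristics, which is the property described above.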
  • FIG. 37 is a graph showing a linear approximation method of the reflected radiation spectrum in the wavelength range [ ⁇ 1 , ⁇ 2 ] used in the shape measuring apparatus according to the second embodiment of the present invention. As shown in FIG. 37, it can be assumed that the reflection spectrum between the two wavelengths ⁇ 1 and ⁇ 2 in the near-infrared region is almost linear, although the slope is unknown.
  • the reflectance s ( ⁇ 3 ) at the third wavelength ⁇ 3 can be calculated as follows.
  • Equations (2) and (3) in the first embodiment formulate the amount of light detected by the sensors 21 and 22; the amount of light at the third wavelength λ3 can be represented in the same form, as in the following equation.
  • Reflectance error evaluation for estimation using three wavelengths: when the reflectance of the target object is not constant over the wavelength region between the two wavelengths λ1 and λ2, the reflectance database is used to evaluate whether estimation that takes the difference in reflectance into account is possible.
  • the reflectance error is calculated by setting ⁇ 1 , ⁇ 2 , and ⁇ 3 to 900 nm, 950 nm, and 925 nm, respectively.
  • the reflectance s ( ⁇ 3 ) at the wavelength ⁇ 3 is represented by two points [ ⁇ 1 , s ( ⁇ 1 )] and [ ⁇ on the graph as shown in FIG.
  • FIG. 38A is a spectrum diagram showing the result of linear approximation of the reflection spectrum in the wavelength range of [ ⁇ 1 , ⁇ 2 ] for “wood”.
  • FIG. 38B is a spectrum diagram showing the result of linear approximation of the reflection spectrum in the [ ⁇ 1 , ⁇ 2 ] wavelength range for “cloth”.
  • FIG. 38C is a spectrum diagram showing the result of linear approximation of the reflection spectrum in the wavelength range of [ ⁇ 1 , ⁇ 2 ] for “leather”.
  • FIG. 38D is a spectrum diagram showing a result of performing linear approximation of the reflection spectrum in the wavelength range of [ ⁇ 1 , ⁇ 2 ] for “metal”.
  • By solving the resulting system of equations for the three wavelengths, the distance l can be obtained.
  • Estimating the distance l in this way for every pixel of the image is computationally expensive, and real-time estimation is very difficult even with a nonlinear least-squares method. We therefore devised an algorithm that enables real-time estimation by restricting the choice of the three wavelengths used.
  • Equation (18) can be expressed as a quadratic equation for A as in the following equation.
  • The distance l can then be calculated by solving this equation (19) for A. Since this only requires solving a quadratic equation, the computational cost is small and real-time distance estimation is possible; an illustrative sketch follows.
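  • In the sketch below, the per-pixel quadratic is solved vectorized with the closed form; the coefficient arrays a, b, c are assumed to have been derived from equation (19) (not reproduced here), and the root selection and the mapping from A back to the distance are likewise assumptions based on the exponential absorption model.

    import numpy as np

    def solve_quadratic(a, b, c):
        # Root of a*A**2 + b*A + c = 0, evaluated per pixel. A is taken to be
        # an exponential of the distance, so the positive root is the physically
        # meaningful one; the sign choice may need to follow equation (19).
        disc = np.maximum(b * b - 4.0 * a * c, 0.0)  # clamp small negative noise
        return (-b + np.sqrt(disc)) / (2.0 * a)

    # If, for example, A = exp(-(alpha2 - alpha1) * l), then
    # l = -np.log(A) / (alpha2 - alpha1).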
  • Equation (19a) can be expressed as a cubic equation for B as in the following equation.
  • The distance l can likewise be calculated by solving this equation (20) for B. Again, since only a simple cubic equation must be solved, real-time distance estimation is possible; a sketch follows.
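  • A corresponding sketch for the cubic case, using numpy's polynomial root finder for one pixel; the coefficients are again assumed to come from equation (20), which is not reproduced here.

    import numpy as np

    def solve_cubic(c3, c2, c1, c0):
        # Roots of c3*B**3 + c2*B**2 + c1*B + c0 = 0 for a single pixel; keep
        # the real, positive root, since B, like A, is taken to be an
        # exponential of the distance.
        roots = np.roots([c3, c2, c1, c0])
        real = roots[np.abs(roots.imag) < 1e-9].real
        positive = real[real > 0.0]
        return positive[0] if positive.size else float("nan")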
  • Measuring device in which the optical axes of the light source (parallel light) and the camera are coaxial.
  • In practice, it is expected that the light source 80, the sensors 21 and 22, or both are slightly inclined from the vertical direction (see FIG. 14).
  • It is therefore desirable that the optical axes of the light source 80 and the sensors 21 and 22 be coaxial.
  • A measuring apparatus was therefore developed that can capture images in which the optical axis of the light source 80, which emits parallel light, and the optical axes of the sensors 21, 22, and 23 are coaxial.
  • FIG. 39 is a plan view showing a configuration example of a shape measuring device (corresponding to the device of FIG. 3 of the third basic embodiment) in which the optical axes of the light source 80 (parallel light) and the spectroscope sensor 24 are coaxial.
  • The spectroscope sensor 24 integrates the sensors 21, 22, and 23 into a single device.
  • the spectroscope sensor 24 can photograph the object 60 arranged on the upper side of FIG. 39 in a coaxial state using the light of the light source 80.
  • the filter FR is a red filter that passes only the red wavelength
  • the filter FG is a green filter that passes only the green wavelength
  • the filter FB is a blue filter that passes only the blue wavelength.
  • the automatic movement stage MS selects one of the three filters and moves it on the optical axis.
  • Light incident from the light source 80 is first collimated by passing through the cylindrical lens CL1, and only the wavelength selected by one of the filters FR, FG, and FB is transmitted. The light is then reflected at a right angle by the half mirror 40 and travels toward the object 60 at the top of FIG. 39. The light reflected by the object 60 returns to the half mirror 40 and this time passes through it, entering the spectroscope sensor 24. With this configuration, images can be taken with the optical axes of the spectroscope sensor 24 and the light source 80 coaxial.
  • In FIG. 39, the light is condensed by the cylindrical lenses CL2 to CL4 arranged before and after the half mirror 40; if the half mirror 40 is larger than the required beam width, the cylindrical lenses CL1 to CL4 may be omitted.
  • Three-wavelength coaxial imaging system: to estimate the distance using the absorption of light at three wavelengths for a moving object 60, images at the three wavelengths must be acquired in real time. A coaxial camera system with the same angle of view was therefore constructed using the three cameras 21C, 22C, and 23C.
  • FIG. 40 shows a configuration example of the camera system in which the three cameras 21C, 22C, and 23C used in the shape measuring apparatus share the same angle of view (corresponding to the apparatus of FIG. 2A of the second basic embodiment).
  • the light is separated in three directions using two half mirrors 41 and 42.
  • The amount of light reaching the camera 22C, which photographs the light split off by the half mirror 42, is less than half the amount reaching the camera 21C, which photographs the light split off at the half mirror 41.
  • This brightness difference is corrected by adjusting the light amount of the light source 80 and the sensitivities of the three cameras 21C, 22C, and 23C, as sketched below.
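  • A simple gain-normalization sketch for this correction, assuming per-camera correction factors measured once against a uniform reference target; the calibration procedure and the names are illustrative, not specified here.

    import numpy as np

    def normalize_brightness(frames, gains):
        # 'frames': the three raw images from the cameras 21C, 22C, and 23C.
        # 'gains': per-camera factors compensating the unequal split at the
        # half mirrors 41 and 42, so the images become photometrically comparable.
        return [f.astype(np.float64) * g for f, g in zip(frames, gains)]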
  • RL is a relay lens
  • band pass filters 31, 32, and 33 are arranged on the front surfaces of the cameras 21C, 22C, and 23C, respectively.
  • FIG. 41A is a photographic image showing an input image with a wavelength of 905 nm for an “egg-shaped object” for shape estimation by the shape measuring apparatus according to the second embodiment.
  • FIG. 41B is a photographic image showing an estimation result image in which the shape is estimated by the shape measuring apparatus according to the second embodiment with respect to the “egg-shaped object” in FIG. 41A.
  • FIG. 42A is a photographic image showing an input image with a wavelength of 905 nm for “a food sample of roll cake” for shape estimation by the shape measuring apparatus according to the second embodiment.
  • FIG. 42B is a photographic image showing an estimation result image obtained by estimating the shape of the “roll cake food sample” in FIG. 42A by the shape measuring apparatus according to the second embodiment.
  • FIG. 43A is a photographic image showing an input image with a wavelength of 905 nm for a “turnip object” for shape estimation by the shape measuring apparatus according to the second embodiment.
  • FIG. 43B is a photographic image showing an estimation result image in which the shape is estimated by the shape measuring apparatus according to the second embodiment with respect to “the turnip object” in FIG. 43A.
  • These results show that the measurement can be performed with higher accuracy than the shape estimation results of the shape measuring apparatus according to the first embodiment.
  • As described above, the shape of an object can be measured using light of two wavelengths coming from the object with higher accuracy than in the prior art.
  • the shape of the object can be measured using light of three wavelengths coming from the object with higher accuracy than in the prior art.

Abstract

The present invention relates to a shape measuring device comprising: a light source that radiates light of first and second wavelengths onto the surface of an object having reflectance coefficients at the first and second wavelengths, through a medium having absorption coefficients at the first and second wavelengths; a sensor that receives light which comes from the surface of the object and has propagated through the medium, and measures the intensities at the first and second wavelengths; and a measuring unit that calculates the distance from the surface of the medium to the surface of the object on the basis of the measured intensities at the first and second wavelengths and of the absorption coefficients at the first and second wavelengths, and measures the shape of the object on the basis of the calculated distance. The first and second wavelengths are selected such that the difference between the intensity for a minimum distance at the first wavelength and the intensity for a maximum distance at the second wavelength is greater than a first prescribed value, and such that the difference between the reflectance coefficient at the first wavelength and the reflectance coefficient at the second wavelength is smaller than a second prescribed value.
PCT/JP2017/036510 2016-10-07 2017-10-06 Dispositif et procédé de mesure de forme WO2018066698A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2018543991A JP6979701B2 (ja) 2016-10-07 2017-10-06 形状測定装置及び方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662405512P 2016-10-07 2016-10-07
US62/405,512 2016-10-07

Publications (1)

Publication Number Publication Date
WO2018066698A1 true WO2018066698A1 (fr) 2018-04-12

Family

ID=61831061

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/036510 WO2018066698A1 (fr) 2016-10-07 2017-10-06 Dispositif et procédé de mesure de forme

Country Status (2)

Country Link
JP (2) JP6979701B2 (fr)
WO (1) WO2018066698A1 (fr)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5517575A (en) * 1991-10-04 1996-05-14 Ladewski; Theodore B. Methods of correcting optically generated errors in an electro-optical gauging system
WO1994010532A1 (fr) * 1992-10-13 1994-05-11 Kms Fusion, Inc. Systeme electro-optique de calibrage des profils de surface
JP2008256504A * 2007-04-04 2008-10-23 Nikon Corp Shape measuring apparatus
US8384916B2 (en) * 2008-07-24 2013-02-26 Massachusetts Institute Of Technology Dynamic three-dimensional imaging of ear canals

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015057604A (ja) * 2008-07-24 2015-03-26 マサチューセッツ インスティテュート オブ テクノロジー 吸収を利用して画像形成を行うためのシステム及び方法
JP2016017919A (ja) * 2014-07-10 2016-02-01 株式会社東京精密 距離測定装置、および距離測定方法
JP2016170164A (ja) * 2015-03-10 2016-09-23 富士フイルム株式会社 測定方法及び測定装置及びプログラム
CN105136062A (zh) * 2015-08-25 2015-12-09 上海集成电路研发中心有限公司 基于衰减光的三维扫描装置及三维扫描方法

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021113756A (ja) * 2020-01-20 2021-08-05 株式会社東芝 Estimation device, object conveyance system, estimation method, and program
JP7317732B2 (ja) 2020-01-20 2023-07-31 株式会社東芝 Estimation device, object conveyance system, estimation method, and program

Also Published As

Publication number Publication date
JP2021185372A (ja) 2021-12-09
JP6979701B2 (ja) 2021-12-15
JPWO2018066698A1 (ja) 2019-07-18
JP7117800B2 (ja) 2022-08-15

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17858534

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2018543991

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17858534

Country of ref document: EP

Kind code of ref document: A1