WO2022185753A1 - Measuring device - Google Patents

Measuring device Download PDF

Info

Publication number
WO2022185753A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
event
measuring device
data
based camera
Prior art date
Application number
PCT/JP2022/001524
Other languages
French (fr)
Japanese (ja)
Inventor
潤 髙嶋
正行 荒川
豊 加藤
雅之 早川
Original Assignee
OMRON Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OMRON Corporation
Publication of WO2022185753A1 publication Critical patent/WO2022185753A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B9/00 Measuring instruments characterised by the use of optical techniques
    • G01B9/02 Interferometers

Definitions

  • the present invention relates to a measuring device.
  • a measuring device using an optical system is known as a device for measuring and inspecting measurement objects including these parts.
  • Patent Document 1 discloses a technique related to an interferometric shape measuring device.
  • the shape measuring apparatus specifies an interference pattern of the amount of light received that varies depending on the difference between the optical path length of the measurement light and the optical path length of the reference light, and obtains the surface shape of the object to be measured based on the interference pattern.
  • the shape measuring device of Patent Document 1 cannot measure at high speed, because it must scan and measure multiple points and must acquire and process a huge amount of data.
  • an object of the present invention is to provide a high-speed and high-precision measuring device.
  • a measuring device according to one aspect measures the distance to an object and includes: a light source that projects light; a mirror; a beam splitter that splits the light projected from the light source into measurement light guided to the object and reference light guided to the mirror; an event-based camera that acquires data by detecting changes in the received luminance value of the interference waveform formed by the reference light reflected by the mirror and the measurement light reflected by the object; a processing unit that measures the distance to the object based on the interference waveform data; and a control means that changes the amount of light received by the event-based camera.
  • at each of its arrayed pixels, the event-based camera detects changes in the luminance value of the interference waveform formed by the reference light reflected by the mirror and the measurement light reflected by the object, and acquires data for the interference waveform portion only at the timing of such a change. The device therefore acquires the data needed to measure the distance to the object while reducing the total amount of data; that is, the measuring device can measure the object at high speed and with high accuracy.
  • control means may include a driving section that drives the mirror so as to change the optical path length difference between the measurement light and the reference light.
  • in that case, the measuring device can measure the object at high speed and with high accuracy by the white-light interference method.
  • control means may use a wavelength swept light source as the light source so that the wavelength of the light projected from the light source changes continuously.
  • the measurement device can measure the object at high speed and with high accuracy by the wavelength sweeping measurement method.
  • the above aspect may include a log amplifier that treats increases and decreases in the amount of light received by the event-based camera as polarity data and calculates the received light amount from the accumulation of that polarity data.
  • the log amplifier can then capture minute changes in the received light amount at high speed as polarity data, which shortens the imaging time.
  • the above aspect may further include an adjustment unit capable of adjusting the amount of data processed by the event-based camera.
  • because the adjustment unit adjusts the amount of data processed by the event-based camera, the camera can process events appropriately even when events occur simultaneously in many pixels.
  • the adjustment unit may have an ROI function for reading out data in a predetermined partial area.
  • the ROI function is used to set the area necessary for measuring the object as the area from which the data is read, so the object can be measured more appropriately.
  • the adjustment unit may include a light-reducing section configured to guide light irradiating a predetermined partial region to the event-based camera while not guiding light irradiating the other regions to the camera.
  • because such a light-reducing section guides only the light irradiating the predetermined partial region to the event-based camera, the areas of the object needed for measurement can be measured more appropriately.
  • the light-reducing section may be composed of a liquid crystal panel.
  • with a liquid crystal panel, the light-reducing section can be configured with a transmissive region that passes the light irradiating the predetermined partial region and a light-shielding region that blocks the light elsewhere.
  • the light-reducing section may be arranged on an optical path common to the measurement light and the reference light.
  • the predetermined partial area may be set in advance according to the shape of the object.
  • according to the shape of the object, characteristic portions of the object, areas to be measured, and the like are set as the predetermined partial areas from which data is read or through which light is transmitted, so the object can be measured more appropriately.
  • FIG. 2 is a schematic diagram showing how data is acquired by the event-based camera 130 used as an imaging device.
  • FIG. 3 is a diagram showing how the surface shape of an object (distance to the surface) is measured by the measuring device 100 using the event-based camera 130 as an imaging device.
  • FIG. 4 is a diagram showing changes (increases and decreases) in the amount of received light acquired by the event-based camera 130 as polarity data, together with an envelope based on the accumulation of that polarity data.
  • FIG. 5 is a diagram showing an example of using a ROI (Region Of Interest) function as an adjustment unit capable of adjusting the amount of data processed by the event-based camera 130.
  • FIG. 6 is a diagram showing an example of teaching the data readout area using the event-based camera 130.
  • FIG. 7 is a diagram showing an example of teaching the data readout area by referring to drawing information.
  • FIG. 8 is a diagram showing an example of using a light-blocking section as an adjustment unit capable of adjusting the amount of data processed by the event-based camera 130.
  • FIG. 9 is a schematic diagram showing a configuration in which the light-blocking section is arranged in the measuring device 100 shown in FIG. 1.
  • FIG. 1 is a schematic diagram showing the configuration of a measuring device 100 according to the first embodiment of the invention.
  • the measurement device 100 includes a light source 110, an optical system 120, an event-based camera 130, and a processing section 140.
  • the optical system 120 includes a beam splitter 121, a plurality of lenses 122-126 and a plurality of mirrors 127-129. Further, a driving section 150 (control means) for driving the mirror 129 is provided.
  • the light source 110 projects light L0.
  • Light source 110 is typically a white light source, and may be, for example, an SLD (Super Luminescent Diode) light source.
  • An SLD light source is a broadband light source that has the dual characteristics of light emitting diodes (LEDs) and semiconductor lasers (LDs), and has higher coherence than LED light sources and lower coherence than laser light. Note that the light source 110 emits light with a constant intensity, and the emission intensity does not change.
  • alternatively, a halogen lamp, a white LED, or a white laser light source, for example, may be used as the light source 110.
  • the light L 0 projected from the light source 110 is transmitted through the lens 122 (for example, a collimator lens), collimated, and reflected by the mirror 127 .
  • the light L0 reflected by the mirror 127 then enters the beam splitter 121 while being condensed by passing through the lens 123 .
  • the beam splitter 121 splits the light L0 into measurement light L1 guided to the object T and reference light guided to the mirror 129 .
  • part of the light L0 is reflected by the beam splitter 121 (measurement light L1), and the remaining part of the light L0 is transmitted through the beam splitter 121 (reference light L2).
  • the measurement light L1 is collimated by passing through the lens 124 (for example, an objective lens). It then irradiates the area of the object T, and part of the measurement light L1 reflected by the object T is condensed by passing back through the lens 124 and re-enters the beam splitter 121.
  • the reference light L2 transmitted through the beam splitter 121 is transmitted through the lens 125 and collimated, reflected by the mirror 128, and irradiated onto the mirror 129.
  • Mirror 129 is a so-called reference mirror.
  • the reference light L2 reflected by the mirror 129 is reflected by the mirror 128, passes through the lens 125, is condensed, and enters the beam splitter 121 again.
  • the measurement light L1 reflected by the object T and incident on the beam splitter 121 and the reference light L2 reflected by the mirror 129 and incident on the beam splitter 121 interfere with each other; the resulting interference light L3 passes through the lens 126, is collimated, and is guided to the event-based camera 130.
  • the event-based camera 130 acquires data by detecting changes in the received luminance value for the interference waveform L3. Data acquired by the event-based camera 130 will be described later.
  • the processing unit 140 measures the distance to the object T based on the interference waveform L3 data acquired by the event-based camera 130, the position of the mirror 129 (the optical path length of the reference light L2) controlled by the driving section 150 described later, and so on.
  • the drive unit 150 (control means) is an actuator composed of, for example, a voice coil motor (VCM) or the like, and moves the mirror 129 along the optical axis direction so as to change the optical path length of the reference light L2.
  • by changing the optical path length of the reference light L2 in this way, the optical path length difference between the measurement light L1 and the reference light L2 is varied and an interference waveform is generated. Specifically, interference fringes are observed around the position of zero phase difference, and the interference waveform appears when the optical path lengths of the measurement light L1 and the reference light L2 coincide.
  • a method of measuring an object using the interference waveform is, for example, a measurement method called a white interference method.
  • the event-based camera 130 used as an imaging device will be described.
  • the event-based camera 130 detects changes in the luminance value of the light received at each of its arrayed pixels, captures the interference waveform at the pixels where a change is detected at the timing of that detection, and acquires the polarity of the luminance change, the time, and the pixel coordinates as data.
  • FIG. 2 is a schematic diagram showing how data is acquired by the event-based camera 130 used as an imaging device.
  • the event-based camera 130 detects changes in the received luminance value at each of its arrayed pixels and acquires the interference waveform data needed for measurement at the timing of those changes; in the other portions it acquires no data.
  • CMOS used as an image sensor acquires all data for each pixel, and the amount of data is enormous.
  • the event-based camera 130 efficiently acquires the data of the interference waveform portion necessary for measuring the distance to the object T compared to the CMOS imaging device.
  • as described above, because the event-based camera 130 used as the imaging element detects changes in the received luminance value at each of its arrayed pixels and acquires the interference waveform data only at those timings, the amount of acquired data can be reduced while the data needed to measure the distance to the object T is still obtained.
  • FIG. 3 is a diagram showing how the surface shape of an object (distance to the surface) is measured by the measuring device 100 using the event-based camera 130 as an imaging device.
  • for die bonding and camera module measurement, the event-based camera 130 acquires the data needed for measurement at many points (a whole surface at once) and processes (reads, etc.) that data efficiently and at high speed.
  • the measuring device 100 can measure the object at high speed and with high accuracy.
  • furthermore, since the event-based camera 130 detects changes in the received luminance value at each pixel and acquires the interference waveform data at the timing of those changes, it may include a log amplifier that captures minute changes in the received light amount.
  • FIG. 4 is a diagram showing a change (increase or decrease) in the amount of received light acquired by the event-based camera 130 as bipolar, and an envelope curve based on the accumulation of the bipolar data.
  • the log amp captures minute changes in the amount of light received by the event-based camera 130, and uses the increases and decreases of the changes as bipolar data. Then, based on the accumulation of the bipolar data, the envelope of the received light amount distribution is identified and calculated as the received light amount acquired by the event-based camera 130 .
  • the event-based camera 130 detects the occurrence of events such as changes in the amount of received light (luminance value) in each of the arranged pixels. Then, the event-based camera 130 acquires the data of the interference waveform portion at the pixels where it is detected that the event has occurred.
  • an adjustment unit that can adjust the amount of data to be processed, according to the capacity and performance of the event-based camera 130, may be provided so that events can be processed appropriately even when they occur in many pixels simultaneously.
  • FIG. 5 is a diagram showing an example of using a ROI (Region Of Interest) function as an adjustment unit capable of adjusting the amount of data processed by the event-based camera 130.
  • as shown in FIG. 5, within the maximum effective imaging area of the event-based camera 130, five rectangular areas 0 to 4 are set as data readout areas, and the camera is set not to read out data in the other areas.
  • with this setting, the event-based camera 130 detects changes in the received luminance value (event occurrences) and acquires data at the pixels arranged inside the rectangular areas 0 to 4, but in the other areas it acquires no data regardless of whether an event has occurred.
  • the event-based camera 130 does not have to monitor whether an event has occurred in the other area.
  • here, five rectangular areas 0 to 4 are set as the data readout areas, but the present invention is not limited to this; for example, four or fewer, or six or more, areas may be set.
  • the size and shape of each readout area may also be the same or different, and shapes other than rectangles may be used.
  • the area (for example, number, shape, position, etc.) from which data is read may be appropriately set according to the type, capacity, and performance of the event-based camera 130, and the type, shape, etc. of the object to be measured.
  • the setting may be set in advance by the user or may be set by teaching.
  • FIG. 6 is a diagram showing an example of how the event-based camera 130 is used to teach the area from which data is to be read.
  • for example, the measuring device 100 measures the object T with the event-based camera 130, and the data readout area is taught from the result.
  • the event-based camera 130 acquires data for all pixels in the maximum effective imaging area using the above-described white interference measurement method.
  • the event-based camera 130 may not be able to acquire data from all pixels at the same time depending on the type, capacity, performance, and the like, so it sequentially acquires data for each number of pixels that can be acquired at the same time.
  • since the purpose here is only to teach the readout area, not all pixels need to be targeted if detailed data is not required.
  • for example, the event-based camera 130 may acquire data in a thinned-out manner, targeting only every few pixels.
  • 3D data of the target object T is generated, and the 3D data is referenced to set the area from which data is read.
  • the readout area may be set by the user, for example to a characteristic portion of the object T or an area to be measured, or an area having a predetermined height may be set automatically.
  • in the example above, the event-based camera 130 acquires data for all pixels in the maximum effective imaging area, but the teaching is not limited to this.
  • a half mirror or the like may be arranged so that the interference waveform L3 described using FIG. 1 can be received by a CMOS imaging device arranged separately from the event-based camera 130 . Accordingly, in teaching, the CMOS imaging device receives the interference waveform L3 reflected by the half mirror.
  • then, 3D data of the object T is generated in the same manner as described with reference to FIG. 6, and the data readout area is set by referring to that 3D data.
  • drawing information such as CAD data of the object T may be imported, and the area from which data is to be read may be set by referring to the drawing information.
  • FIG. 7 is a diagram showing an example of how drawing information is referred to and an area from which data is to be read is taught.
  • the CAD data of the object T to be measured by the measuring apparatus 100 is captured as drawing information, and the captured drawing information is referenced to set the data reading area.
  • the readout area may be set by the user, for example to a characteristic portion of the object T or an area to be measured, or an area having a predetermined height may be set automatically.
  • a distance image (image data whose pixel value is the height) may be generated, and the area for reading data may be set while referring to the distance image.
  • a grayscale image may be generated by performing rendering processing (generating a pseudo image by optical simulation) based on the 3D data, and area setting may be performed by referring to the grayscale image. This has the effect of reducing the calculation cost of display processing.
  • FIG. 8 is a diagram showing an example of using a light blocking section as an adjustment section capable of adjusting the amount of data processed by the event-based camera 130.
  • as shown in FIG. 8, a light-blocking section including regions that transmit the light and regions that block the light is arranged in the optical path of the light (L0, L1, L2) projected from the light source 110.
  • in this example, four transmissive regions are set as the regions through which the light passes, and the other regions are set as light-blocking regions in which the light is blocked.
  • since the event-based camera 130 receives the light transmitted through the transmissive regions, the pixels arranged so as to correspond to those regions detect changes in the received luminance value (event occurrences) and acquire data. On the other hand, since the camera does not receive the light blocked by the light-shielding regions, the pixels corresponding to those regions need not even be monitored for events.
  • here, four circular transmissive regions are set as an example, but the present invention is not limited to this; for example, three or fewer, or five or more, regions may be set. The size and shape of each transmissive region may also be the same or different, and shapes other than circles (for example, ellipses, ovals, or rectangles) may be used, set as appropriate according to the shape of the object.
  • the transmissive regions may be set as appropriate according to the type, capacity, and performance of the event-based camera 130 and the type and shape of the object to be measured; the setting may be made in advance by the user or by teaching.
  • for this teaching, the methods described above for teaching the data readout area can be applied.
  • for example, the regions corresponding to the data readout areas may be set as the transmissive regions, and the other regions may be set as the light-shielding regions.
  • a light shielding section having such a transmissive area and a light shielding area may be realized using, for example, a liquid crystal panel.
  • the liquid crystal panel is composed of a polarizing filter (vertical), a glass substrate (individual electrode), a liquid crystal sandwiched between alignment layers, a glass substrate (common electrode), and a polarizing filter (horizontal).
  • light that has passed through the polarizing filter (vertical) is linearly polarized; if it keeps that polarization direction, it cannot pass through the polarizing filter (horizontal) (light-blocking state).
  • if, instead, the polarization direction of the light is rotated by 90 degrees in the liquid crystal sandwiched between the alignment layers, the light can pass through the polarizing filter (horizontal) (non-light-blocking state).
  • Electrodes on the glass substrate (individual electrode) and the glass substrate (common electrode) are provided for switching the liquid crystal between a light-shielding state and a non-light-shielding state.
  • FIG. 9 is a schematic diagram showing a configuration in which the light-blocking section is arranged in the measuring device 100 shown in FIG. 1. As shown in FIG. 9, a light-blocking section 160 is arranged between the lens 122 and the mirror 127.
  • the position where the light blocking portion 160 is arranged is not limited to between the lens 122 and the mirror 127.
  • the light-blocking section 160 may instead be placed between the light source 110 and the lens 122, between the mirror 127 and the lens 123, between the lens 123 and the beam splitter 121, between the beam splitter 121 and the lens 126, between the lens 126 and the event-based camera 130, or the like, as long as it is on an optical path common to the measurement light and the reference light (that is, on the path of the light L0 or L3).
  • the light passing through the liquid crystal may be polarized or diffracted and its state may change; by placing the light-blocking section where the measurement light L1 and the reference light L2 travel under the same conditions, the possibility of affecting the interference accuracy can be reduced.
  • a liquid crystal panel has been described as an example of the light-blocking section having transmissive and light-shielding regions, but any configuration that guides the light irradiating the predetermined partial region to the event-based camera 130 while not guiding the light irradiating the other regions may be used; for example, a DMD (digital micromirror device) may be used. In that case, a DMD is used in place of the mirror 127 of the measuring device 100 shown in FIG. 1.
  • the DMD reflects the light directed at the predetermined partial region and absorbs the light directed at the other regions; that is, the light for the predetermined partial region can be reflected so that it enters the lens 123 and is finally guided to the event-based camera 130.
  • in this way, the liquid crystal panel, the DMD, and the like guide the light irradiating the predetermined partial region to the event-based camera 130 and block or absorb the light irradiating the other regions so that it does not reach the camera; in other words, they can be regarded as light-reducing sections that cut out part of the projected light.
  • as described above, the event-based camera 130 is used as the imaging element, minute changes in the received light amount are captured appropriately by the log amplifier, and the ROI function and the light-blocking section 160 appropriately adjust the amount of data the camera processes. As a result, the measuring device 100 can measure the object T at high speed and with high accuracy even more appropriately.
  • in the above, the optical system 120 is composed of the beam splitter 121, the lenses 122 to 126, the mirrors 127 to 129, and so on, as shown in FIGS. 1 and 9, but the optical system is not limited to this configuration. Any optical system may be used as long as the light L0 projected from the light source 110 is split into the measurement light L1 and the reference light L2, the resulting interference light can be received by the event-based camera 130, and measurement by the white-light interference method can be performed.
  • the ROI function and the light-blocking section 160, together with the setting of data readout areas, transmissive regions, light-shielding regions, and so on, have been described as examples of an adjustment unit capable of adjusting the amount of data processed by the event-based camera 130; alternatively, the number of pixels of the event-based camera 130 may be reduced (lower resolution) so that data can be acquired even if events occur simultaneously at all pixels.
  • FIG. 10 is a schematic diagram showing the configuration and functions of a measuring device 100A to which the measuring device 100 according to the first embodiment of the present invention is applied.
  • the measuring device 100A includes a light source 110, an optical system 120A, an event-based camera 130, and a CMOS imaging device 170.
  • the measuring apparatus 100A differs from the measuring apparatus 100 shown in FIG. 1 in that a CMOS imaging device 170 and a half mirror 171A are added.
  • the measuring device 100A basically measures the target object T by the white interference measurement method, as described in the measuring device 100 .
  • the event-based camera 130 functions as an imaging element that acquires data about the object T in the Z-axis direction, as described for the measurement apparatus 100 .
  • the CMOS imaging device 170 receives the light reflected by the half mirror 171A and acquires image data of the target object T. More specifically, the CMOS imaging device 170 functions as an imaging device that acquires an image of the target object T by measurement on the XY axis plane.
  • in other words, with the measuring device 100A, measurement of the object T on the XYZ coordinate axes can be realized with a single sensor.
  • the event-based camera 130 may be used to fulfill the same role as the CMOS imaging device 170 .
  • the event-based camera 130 detects changes in brightness, and there are types that output not only the polarity of the change in brightness but also the amount of change in brightness. In this case, it is possible to restore luminance information by performing integration processing.
  • the ROI function and the light-reducing section described above may also be used in combination.
  • a plurality of ROI regions obtained by dividing the screen can be sequentially switched, captured, and synthesized to form a single image.
  • if the event-based camera 130 is used to perform the same role as the CMOS imaging device 170, there is no need to provide a separate CMOS imaging device 170, and the structure can be simplified.
  • the log amplifier, the ROI function, the light shielding section 160, and the like, which are applied in the measuring apparatus 100 can also be applied to the measuring apparatus 100A.
  • FIG. 11 is a schematic diagram showing the configuration of a measuring device 200 according to the second embodiment of the invention.
  • the measurement device 200 includes a light source 210, an optical system 120, an event-based camera 130, and a processing section 140.
  • the optical system 120 includes a beam splitter 121, a plurality of lenses 122-126 and a plurality of mirrors 127-129.
  • in the measuring device 100 according to the first embodiment, an interference waveform is generated by changing the optical path length difference between the measurement light L1 and the reference light L2, and the object T is measured by the white-light interference method.
  • in the present embodiment, by contrast, a wavelength-swept light source is used as the light source 210 and serves as the control means that changes the amount of light received by the event-based camera 130: the wavelength of the projected light is changed continuously to generate an interference waveform between the measurement light L1 and the reference light L2, and the object T is measured by the measurement method called the wavelength-sweep method.
  • the light source 210 is, for example, a MEMS-VCSEL (Micro Electro Mechanical Systems-Vertical Cavity Surface Emitting Laser), DFB (Distributed feedback) laser, SSG-DBR (Super Structure Grating-Distributed Bragg reflector) laser, etc., as a wavelength swept light source. Used. It should be noted that the light source 210 emits light with a constant intensity, and the emission intensity does not change, like the light source 110 of the measuring device 100 according to the first embodiment.
  • the MEMS-VCSEL sweeps the lasing wavelength by driving its resonator mirror with a MEMS actuator; the DFB laser sweeps the wavelength by changing the drive current of the laser diode; and the SSG-DBR laser has diffraction gratings, called distributed reflectors, formed on its two resonator mirrors, and sweeps the wavelength by current injection.
  • for the interference waveform L3 between the measurement light L1 and the reference light L2, generated by varying the wavelength of the light projected from the light source 210, the event-based camera 130 acquires data by detecting changes in the received luminance value, just as in the first embodiment of the present invention.
  • the processing unit 140 measures the distance to the object T based on the data of the interference waveform L3 acquired by the event-based camera 130, the wavelength (frequency) of the light projected from the light source 210, and the like.
  • in the measuring device 200 as well, when the object T is measured by the wavelength-sweep method using the wavelength-swept light source as the light source 210, the event-based camera 130 detects changes in the received luminance value at each of its arrayed pixels and acquires the interference waveform data at the timing of those changes, so the amount of acquired data can be reduced while the data necessary for measurement is still obtained.
  • log amplifier, the ROI function, the light shielding section 160, and the like which are applied to the measuring apparatus 100 according to the first embodiment of the present invention, can also be applied to the measuring apparatus 200 according to the present embodiment.
  • furthermore, measurement of the object T on the XYZ coordinate axes can likewise be realized with a single sensor.
  • A measuring device (100, 200) for measuring a distance to an object, comprising: a light source (110, 210) for projecting light; a mirror (129); a beam splitter (121) for splitting the light projected from the light source into measurement light directed to the object and reference light directed to the mirror; an event-based camera (130) that acquires data by detecting a change in the received luminance value of an interference waveform of the reference light reflected by the mirror and the measurement light reflected by the object; a processing unit (140) for measuring the distance to the object based on the data of the interference waveform; and a control means (150) for controlling to change the amount of light received by the event-based camera.

Abstract

Provided is a high-speed, high-precision measuring device. A measuring device 100 measures a distance to a target object, and is provided with: a light source 110 for projecting light; a mirror 129; a beam splitter 121 for splitting the light projected from the light source into measuring light to be guided to the target object and reference light to be guided to the mirror; an event based camera 130 for acquiring data by detecting a change in a brightness value of received light, for an interference wave between the reference light reflected by the mirror and the measuring light reflected by the target object; a processing unit 140 for measuring the distance to the target object on the basis of the interference wave data; and a control means for performing control to change a received light amount acquired by the event based camera.

Description

Measuring device
 The present invention relates to a measuring device.
 In recent years, higher accuracy has been demanded in the alignment of parts in the multi-stage stacking of semiconductor chips, the assembly of rotary actuators for hard disk drives, the assembly of camera modules built into smartphones, the light distribution control of LEDs, and the like.
 A measuring device using an optical system is known as a device for measuring and inspecting objects that include such parts.
 Patent Document 1 discloses a technique related to an interferometric shape measuring device. That shape measuring device identifies an interference pattern in the amount of received light, which varies with the difference between the optical path lengths of the measurement light and the reference light, and obtains the surface shape of the object from that pattern.
 JP 2018-63153 A
 However, the shape measuring device disclosed in Patent Document 1 cannot measure at high speed, because it must scan and measure multiple points and must acquire and process a huge amount of data.
 Accordingly, an object of the present invention is to provide a high-speed and high-precision measuring device.
 A measuring device according to one aspect of the present invention measures the distance to an object and includes: a light source that projects light; a mirror; a beam splitter that splits the light projected from the light source into measurement light guided to the object and reference light guided to the mirror; an event-based camera that acquires data by detecting changes in the received luminance value of the interference waveform formed by the reference light reflected by the mirror and the measurement light reflected by the object; a processing unit that measures the distance to the object based on the interference waveform data; and a control means that changes the amount of light received by the event-based camera.
 According to this aspect, the event-based camera detects, at each of its arrayed pixels, changes in the luminance value of the interference waveform formed by the reference light reflected by the mirror and the measurement light reflected by the object, and acquires data for the interference waveform portion only at the timing of such changes. The device therefore acquires the data needed to measure the distance to the object while reducing the total amount of data; that is, it can measure the object at high speed and with high accuracy.
 In the above aspect, the control means may include a driving section that drives the mirror so as to change the optical path length difference between the measurement light and the reference light.
 In that case, the measuring device can measure the object at high speed and with high accuracy by the white-light interference method.
 In the above aspect, the control means may use a wavelength-swept light source as the light source so that the wavelength of the projected light changes continuously.
 In that case, the measuring device can measure the object at high speed and with high accuracy by the wavelength-sweep method.
 The above aspect may include a log amplifier that treats increases and decreases in the amount of light received by the event-based camera as polarity data and calculates the received light amount from the accumulation of that polarity data.
 In that case, the log amplifier captures minute changes in the received light amount at high speed as polarity data, which shortens the imaging time.
 The above aspect may further include an adjustment unit capable of adjusting the amount of data processed by the event-based camera.
 In that case, because the adjustment unit adjusts the amount of data processed by the event-based camera, the camera can process events appropriately even when events occur simultaneously in many pixels.
 In the above aspect, the adjustment unit may have an ROI function that reads out data only from a predetermined partial area.
 In that case, the ROI function sets the area needed to measure the object as the data readout area, so the object can be measured more appropriately.
 In the above aspect, the adjustment unit may include a light-reducing section configured to guide light irradiating the predetermined partial region to the event-based camera while not guiding light irradiating the other regions to the camera.
 In that case, because the light-reducing section guides only the light irradiating the predetermined partial region to the event-based camera, the areas of the object needed for measurement can be measured more appropriately.
 In the above aspect, the light-reducing section may be composed of a liquid crystal panel.
 In that case, the liquid crystal panel can form a light-reducing section having a transmissive region that passes the light irradiating the predetermined partial region and a light-shielding region that blocks the light elsewhere.
 In the above aspect, the light-reducing section may be arranged on an optical path common to the measurement light and the reference light.
 In that case, the optical paths of the measurement light and the reference light are subject to the same conditions, which reduces the possibility of affecting the interference accuracy.
 In the above aspect, the predetermined partial area may be set in advance according to the shape of the object.
 In that case, characteristic portions of the object, areas to be measured, and the like can be set, according to the object's shape, as the predetermined partial areas from which data is read or through which light is transmitted, so the object can be measured more appropriately.
 According to the present invention, a high-speed and high-precision measuring device can be provided.
 FIG. 1 is a schematic diagram showing the configuration of a measuring device 100 according to the first embodiment of the present invention.
 FIG. 2 is a schematic diagram showing how data is acquired by the event-based camera 130 used as the imaging element.
 FIG. 3 is a diagram showing how the surface shape of an object (the distance to its surface) is measured by the measuring device 100 using the event-based camera 130 as the imaging element.
 FIG. 4 is a diagram showing changes (increases and decreases) in the amount of light received by the event-based camera 130 as polarity data, together with an envelope based on the accumulation of that polarity data.
 FIG. 5 is a diagram showing an example of using a ROI (Region Of Interest) function as an adjustment unit capable of adjusting the amount of data processed by the event-based camera 130.
 FIG. 6 is a diagram showing an example of teaching the data readout area using the event-based camera 130.
 FIG. 7 is a diagram showing an example of teaching the data readout area by referring to drawing information.
 FIG. 8 is a diagram showing an example of using a light-blocking section as an adjustment unit capable of adjusting the amount of data processed by the event-based camera 130.
 FIG. 9 is a schematic diagram showing a configuration in which the light-blocking section is arranged in the measuring device 100 shown in FIG. 1.
 FIG. 10 is a schematic diagram showing the configuration and functions of a measuring device 100A that applies the measuring device 100 according to the first embodiment of the present invention.
 FIG. 11 is a schematic diagram showing the configuration of a measuring device 200 according to the second embodiment of the present invention.
 Preferred embodiments of the present invention will now be described in detail with reference to the accompanying drawings. The embodiments described below are merely specific examples for carrying out the present invention and are not intended to limit its interpretation. To make the description easier to follow, the same reference numerals are assigned to the same constituent elements in each drawing wherever possible, and redundant descriptions may be omitted.
 <First embodiment>
[Configuration of the measuring device]
 FIG. 1 is a schematic diagram showing the configuration of a measuring device 100 according to the first embodiment of the present invention. As shown in FIG. 1, the measuring device 100 includes a light source 110, an optical system 120, an event-based camera 130, and a processing section 140. The optical system 120 includes a beam splitter 121, a plurality of lenses 122 to 126, and a plurality of mirrors 127 to 129. The device further includes a driving section 150 (control means) that drives the mirror 129.
 The light source 110 projects light L0. The light source 110 is typically a white light source and may be, for example, an SLD (super-luminescent diode) light source. An SLD light source is a broadband light source that combines characteristics of a light-emitting diode (LED) and a semiconductor laser (LD); its coherence is higher than that of an LED light source and lower than that of laser light. The light source 110 emits light at a constant intensity, and the emission intensity does not change.
 Alternatively, a halogen lamp, a white LED, or a white laser light source, for example, may be used as the light source 110.
 The light L0 projected from the light source 110 passes through the lens 122 (for example, a collimator lens), is collimated, and is reflected by the mirror 127. The light L0 reflected by the mirror 127 then passes through the lens 123, is condensed, and enters the beam splitter 121.
 The beam splitter 121 splits the light L0 into measurement light L1 guided to the object T and reference light L2 guided to the mirror 129. In other words, part of the light L0 is reflected by the beam splitter 121 (measurement light L1), and the remaining part passes through the beam splitter 121 (reference light L2).
 The measurement light L1 is collimated by passing through the lens 124 (for example, an objective lens). It then irradiates the area of the object T, and part of the measurement light L1 reflected by the object T is condensed by passing back through the lens 124 and re-enters the beam splitter 121.
 Meanwhile, the reference light L2 that has passed through the beam splitter 121 passes through the lens 125, is collimated, is reflected by the mirror 128, and irradiates the mirror 129. The mirror 129 is a so-called reference mirror.
 The reference light L2 reflected by the mirror 129 is reflected by the mirror 128, condensed by passing through the lens 125, and enters the beam splitter 121 again.
 The measurement light L1 reflected by the object T and the reference light L2 reflected by the mirror 129 interfere at the beam splitter 121; the resulting interference light L3 passes through the lens 126, is collimated, and is guided to the event-based camera 130. The event-based camera 130 acquires data by detecting changes in the received luminance value of the interference waveform L3. The data acquired by the event-based camera 130 is described later.
 The processing unit 140 measures the distance to the object T based on the interference waveform L3 data acquired by the event-based camera 130, the position of the mirror 129 (the optical path length of the reference light L2) controlled by the driving section 150 described below, and so on.
 The driving section 150 (control means) is an actuator composed of, for example, a voice coil motor (VCM), and moves the mirror 129 along the optical axis so as to change the optical path length of the reference light L2.
 By changing the optical path length of the reference light L2 in this way, the optical path length difference between the measurement light L1 and the reference light L2 is varied and an interference waveform is generated. Specifically, interference fringes are observed around the position of zero phase difference, and the interference waveform appears when the optical path lengths of the measurement light L1 and the reference light L2 coincide. A method of measuring an object using this interference waveform is, for example, the measurement method known as white-light interferometry.
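 For reference, the intensity recorded during such a scan is commonly modeled with the standard white-light interferometry expression below. This textbook formulation is not taken from the patent; the symbols (I_0 for the mean intensity, V for the fringe visibility, l_c for the coherence length, lambda_0 for the center wavelength) are introduced here only for illustration.

```latex
% Detected intensity as a function of the optical path difference \Delta
% between the reference arm (mirror 129) and the measurement arm (object T).
I(\Delta) \;\approx\; I_0\!\left[\, 1 + V\, e^{-(\Delta/l_c)^2}\cos\!\left(\tfrac{2\pi\Delta}{\lambda_0}\right) \right],
\qquad \Delta = 2\,(z_{\mathrm{ref}} - z_{\mathrm{obj}})
```

 Under this model the fringe envelope peaks at zero optical path difference, so the surface height at a pixel follows from the position of the mirror 129 at which the envelope of the intensity observed by that pixel is maximal.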
[About the event-based camera]
 Here, the event-based camera 130 used as the imaging element is described. At each of its arrayed pixels, the event-based camera 130 detects changes in the luminance value of the received light, captures the interference waveform at the pixels where a change is detected at the timing of that detection, and acquires the polarity of the luminance change, the time, and the pixel coordinates as data.
 FIG. 2 is a schematic diagram showing how data is acquired by the event-based camera 130 used as the imaging element. As shown in FIG. 2, the event-based camera 130 detects changes in the received luminance value at each of its arrayed pixels and acquires the interference waveform data needed for measurement at the timing of those changes; in the other portions it acquires no data.
 A CMOS sensor conventionally used as an imaging element acquires all the data for every pixel, so the data volume is enormous. In other words, compared with a CMOS imaging element, the event-based camera 130 efficiently acquires only the interference waveform data needed to measure the distance to the object T.
 As described above, because the event-based camera 130 used as the imaging element detects changes in the received luminance value at each pixel and acquires the interference waveform data only at those timings, the amount of acquired data can be reduced while the data needed to measure the distance to the object T is still obtained.
 FIG. 3 is a diagram showing how the surface shape of an object (the distance to its surface) is measured by the measuring device 100 using the event-based camera 130 as the imaging element. As shown in FIG. 3, for die bonding and camera module measurement, the event-based camera 130 acquires the data needed for measurement at many points (a whole surface at once) and processes (reads, etc.) that data efficiently and at high speed. As a result, the measuring device 100 can measure the object at high speed and with high accuracy.
 Furthermore, since the event-based camera 130 detects changes in the received luminance value at each pixel and acquires the interference waveform data at the timing of those changes, it may include a log amplifier that captures minute changes in the received light amount.
 FIG. 4 is a diagram showing changes (increases and decreases) in the amount of light received by the event-based camera 130 as polarity data, together with an envelope based on the accumulation of that polarity data. As shown in FIG. 4, the log amplifier captures minute changes in the received light amount and treats their increases and decreases as polarity data. Based on the accumulation of that polarity data, the envelope of the received light amount distribution is identified and used as the received light amount acquired by the event-based camera 130.
 By including a log amplifier in the event-based camera 130 in this way, minute changes in the received light amount can be captured at high speed, so the imaging time can be shortened.
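 The sketch below shows one plausible way, under the model given earlier, to turn the accumulated polarity events of a single pixel into a fringe envelope and a height estimate. The baseline removal, the smoothing window, and the mirror_position_um callback are assumptions made for illustration, not processing prescribed by the patent.

```python
import numpy as np

def height_from_polarity_events(event_times_us: np.ndarray,
                                polarities: np.ndarray,
                                mirror_position_um,
                                smooth_window: int = 31) -> float:
    """Estimate the surface height at one pixel from its polarity events.

    event_times_us     -- event timestamps in microseconds for this pixel
    polarities         -- +1 / -1 per event (luminance increased / decreased)
    mirror_position_um -- callable mapping a timestamp to the position of the
                          reference mirror 129 in micrometres (from drive 150)
    smooth_window      -- odd number of samples used for smoothing
    """
    # The running sum of +1/-1 events approximates the brightness signal.
    fringe = np.cumsum(polarities).astype(float)

    # Remove the slowly varying baseline so only the oscillating fringe remains.
    pad = smooth_window // 2
    kernel = np.ones(smooth_window) / smooth_window
    baseline = np.convolve(np.pad(fringe, pad, mode="edge"), kernel, mode="valid")
    fringe_ac = fringe - baseline

    # A moving RMS of the baseline-free signal approximates the fringe envelope.
    envelope = np.sqrt(np.convolve(np.pad(fringe_ac ** 2, pad, mode="edge"),
                                   kernel, mode="valid"))

    # The envelope peak marks zero optical path difference; report the
    # corresponding reference-mirror position as the height estimate.
    peak_idx = int(np.argmax(envelope))
    return float(mirror_position_um(event_times_us[peak_idx]))
```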
 As described above, the event-based camera 130 detects the occurrence of an event, namely a change in the received light amount (luminance value), at each of its arrayed pixels, and acquires the interference waveform data at the pixels where an event was detected. An adjustment unit capable of adjusting the amount of data to be processed, according to the capacity and performance of the event-based camera 130, may therefore be provided so that events can be processed appropriately even when they occur simultaneously in many pixels.
[About the ROI function]
 FIG. 5 shows an example in which an ROI (Region Of Interest) function is used as the adjustment unit capable of adjusting the amount of data processed by the event-based camera 130. As shown in FIG. 5, within the maximum effective imaging area that the event-based camera 130 can capture, five rectangular regions 0 to 4 are set as regions from which data is read out, and the remaining area is set so that no data is read out from it.
As a result, at the pixels arrayed within rectangular regions 0 to 4, the event-based camera 130 detects changes in the luminance value of the received light (the occurrence of events) and acquires data, whereas in the other areas no data is acquired regardless of whether an event occurs. In those other areas, the event-based camera 130 need not even monitor whether events occur.
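Purely as an illustration of such region-limited read-out (the region list, the event tuple layout, and the function names are assumptions), an ROI filter of this kind could be written in Python as follows, keeping an event only if its pixel coordinates fall inside one of the configured rectangles:

    # Each ROI is (x, y, width, height); each event is a (t, x, y, polarity) tuple.
    ROIS = [(0, 0, 64, 64), (200, 40, 32, 32), (80, 120, 48, 48),
            (160, 160, 64, 32), (10, 200, 32, 64)]

    def in_any_roi(x, y, rois=ROIS):
        return any(rx <= x < rx + rw and ry <= y < ry + rh
                   for rx, ry, rw, rh in rois)

    def filter_events(events, rois=ROIS):
        # Keep only events whose pixel lies inside one of the read-out regions.
        return [ev for ev in events if in_any_roi(ev[1], ev[2], rois)]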
Here, five rectangular regions 0 to 4 are given as an example of the regions from which data is read out, but the invention is not limited to this; for example, four or fewer, or six or more, rectangular regions may be set.
In addition, the regions from which data is read out may all have the same size and shape or different sizes and shapes, and may have shapes other than rectangles.
The regions from which data is read out (for example, their number, shape, and position) may be set as appropriate according to the type, capacity, and performance of the event-based camera 130 and the type and shape of the object to be measured; they may be set in advance by the user or set by teaching.
FIG. 6 shows an example of teaching the data read-out regions using the event-based camera 130. For example, the measuring device 100 uses the event-based camera 130 to teach the data read-out regions so that the object T can be measured.
As shown in FIG. 6, the event-based camera 130 acquires data for all pixels in the maximum effective imaging area using the white-light interference measurement method described above. Depending on its type, capacity, performance, and so on, the event-based camera 130 may not be able to acquire data from all pixels at the same time, in which case it acquires data sequentially, in batches of as many pixels as can be read out simultaneously. Since the purpose here is to teach the data read-out regions, the acquisition need not cover every pixel when detailed data is not required; for example, the event-based camera 130 may acquire data in a thinned-out manner, sampling only every few pixels.
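One possible way to express this thinning and batching (a sketch under assumed parameter names, not the implementation described here) is to subsample the pixel grid and then split it into batches no larger than the number of pixels the sensor can read out at once:

    import numpy as np

    def teaching_pixel_batches(width, height, stride=4, max_simultaneous=4096):
        # stride           : sample every `stride`-th pixel in x and y (thinning)
        # max_simultaneous : upper bound on pixels read out in one pass
        xs, ys = np.meshgrid(np.arange(0, width, stride),
                             np.arange(0, height, stride))
        coords = np.stack([xs.ravel(), ys.ravel()], axis=1)
        for start in range(0, len(coords), max_simultaneous):
            yield coords[start:start + max_simultaneous]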
Then, based on the data acquired by the event-based camera 130, 3D data of the object T is generated, and the data read-out regions are set with reference to that 3D data. The read-out regions may be set by the user, for example as regions corresponding to characteristic parts of the object T or regions to be measured, or they may be set automatically, for example as regions having a predetermined height.
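One way such automatic selection could work, sketched here under assumptions about the data layout (a height map in micrometres and a single height threshold), is to threshold the height map and take the bounding boxes of the connected regions that exceed it; scipy.ndimage is used here for the connected-component step:

    import numpy as np
    from scipy import ndimage

    def rois_from_height_map(height_map_um, min_height_um=50.0):
        # Return bounding boxes (x, y, w, h) of regions taller than the threshold.
        mask = height_map_um >= min_height_um
        labels, _count = ndimage.label(mask)
        rois = []
        for sl in ndimage.find_objects(labels):
            rows, cols = sl
            rois.append((cols.start, rows.start,
                         cols.stop - cols.start, rows.stop - rows.start))
        return rois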
Here, data was acquired for all pixels in the maximum effective imaging area by the event-based camera 130 while using the white-light interference measurement method described above, but a CMOS image sensor may instead be used as the imaging element for this teaching.
For example, a half mirror or the like may be arranged so that the interference waveform L3 described with reference to FIG. 1 can be received by a CMOS image sensor arranged separately from the event-based camera 130. In this way, during teaching, the CMOS image sensor receives the interference waveform L3 reflected by the half mirror.
Then, if data is acquired for all pixels in the maximum effective imaging area by the CMOS image sensor while using the white-light interference measurement method described above, 3D data of the object T is generated in the same way as described with reference to FIG. 6, and the data read-out regions are set with reference to that 3D data.
Furthermore, as another method of setting the data read-out regions by teaching, drawing information such as CAD data of the object T may be imported, and the read-out regions may be set with reference to that drawing information.
FIG. 7 shows an example of teaching the data read-out regions with reference to drawing information. As shown in FIG. 7, CAD data of the object T to be measured by the measuring device 100 is imported as drawing information, and the data read-out regions are set with reference to the imported drawing information. The read-out regions may be set by the user, for example as regions corresponding to characteristic parts of the object T or regions to be measured, or they may be set automatically, for example as regions having a predetermined height.
Alternatively, a distance image (image data in which the pixel value represents height) may be generated, and the data read-out regions may be set with reference to the distance image. Compared with handling the data as coordinate data in 3D space, this has the effect of reducing the computational cost of display processing.
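As a minimal sketch of building such a distance image (the grid resolution, the units, and the choice of keeping the maximum height per cell are assumptions), 3D points can be rasterised onto an XY grid with the height stored as the pixel value:

    import numpy as np

    def distance_image(points_xyz, pixel_pitch_um, width_px, height_px):
        # Rasterise (x, y, z) points into an image whose pixel value is height.
        img = np.full((height_px, width_px), np.nan)
        for x, y, z in points_xyz:
            col, row = int(x / pixel_pitch_um), int(y / pixel_pitch_um)
            if 0 <= row < height_px and 0 <= col < width_px:
                img[row, col] = z if np.isnan(img[row, col]) else max(img[row, col], z)
        return img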
Alternatively, rendering processing (generating a pseudo image by optical simulation) may be performed on the basis of the 3D data to produce a grayscale image, and the regions may be set with reference to that grayscale image. This also has the effect of reducing the computational cost of display processing.
[About the light-blocking section]
 FIG. 8 shows an example in which a light-blocking section is used as the adjustment unit capable of adjusting the amount of data processed by the event-based camera 130. As shown in FIG. 8, a light-blocking section that includes regions transmitting the light and regions blocking it is arranged in the optical system 120, in the optical path of the light (L0, L1, L2) projected from the light source 110. As shown in FIG. 8, when light strikes the light-irradiation area, four transmissive regions are set as regions through which the light passes, and the remaining area is set as a light-blocking region in which the light is blocked.
As a result, the event-based camera 130 receives the light that has passed through the transmissive regions, so among the pixels arrayed in the event-based camera 130, those corresponding to the transmissive regions detect changes in the luminance value of the received light (the occurrence of events) and acquire data. Conversely, since the event-based camera 130 receives no light from the light-blocking regions, the pixels corresponding to those regions need not be monitored for the occurrence of events.
Here, four circular transmissive regions are given as an example of the regions through which light passes, but the invention is not limited to this; for example, three or fewer, or five or more, transmissive regions may be set.
In addition, the transmissive regions may all have the same size and shape or different sizes and shapes, may have shapes other than circles (for example, ellipses, ovals, or rectangles), and may be set as appropriate according to the shape of the object to be measured.
The transmissive regions (for example, their number, shape, and position) may be set as appropriate according to the type, capacity, and performance of the event-based camera 130 and the type and shape of the object to be measured; they may be set in advance by the user or set by teaching.
The teaching methods for the data read-out regions described above can also be applied to the setting of the transmissive regions. For example, for the data read-out regions shown in FIGS. 6 and 7, the regions of the light-blocking section corresponding to the read-out regions may be set as transmissive regions, and the other regions may be set as light-blocking regions.
Furthermore, a light-blocking section having such transmissive and light-blocking regions may be realized using, for example, a liquid crystal panel.
Here, the liquid crystal panel is composed of a polarizing filter (vertical), a glass substrate (individual electrodes), liquid crystal sandwiched between alignment layers, a glass substrate (common electrode), and a polarizing filter (horizontal). Light that has been linearly polarized by the first polarizing filter cannot pass through the horizontal polarizing filter as long as its polarization direction is maintained (light-blocking state). On the other hand, when the polarization direction of the light is rotated by 90 degrees in the liquid crystal sandwiched between the alignment layers, the light can pass through the horizontal polarizing filter (non-light-blocking state). The electrodes on the glass substrates (the individual electrodes and the common electrode) are provided to switch the liquid crystal between the light-blocking state and the non-light-blocking state.
FIG. 9 is a schematic diagram showing a configuration of the measuring device 100 shown in FIG. 1 in which the light-blocking section is arranged. As shown in FIG. 9, a light-blocking section 160 is arranged between the lens 122 and the mirror 127.
The position at which the light-blocking section 160 is arranged is not limited to between the lens 122 and the mirror 127; it may be, for example, between the light source 110 and the lens 122, between the mirror 127 and the lens 123, between the lens 123 and the beam splitter 121, between the beam splitter 121 and the lens 126, or between the lens 126 and the event-based camera 130, as long as it lies on the part of the optical system 120 where the measurement light L1 and the reference light L2 share a common optical path (the optical paths of the light L0 and L3).
Arranging the light-blocking section 160 may change the state of the irradiated light through polarization and diffraction in the liquid crystal, but as long as the section lies on the common optical path of the measurement light L1 and the reference light L2, both beams pass through it under identical conditions, so the possibility of affecting the interference accuracy can be reduced.
A liquid crystal panel has been described here as an example of a light-blocking section having transmissive and light-blocking regions, but a DMD (digital micromirror device) may instead be used to guide the light striking a predetermined partial region to the event-based camera 130 while not guiding the light striking the other regions to the event-based camera 130.
For example, a DMD is used as the mirror 127 of the measuring device 100 shown in FIG. 1. The DMD reflects the light striking a predetermined partial region and absorbs the light striking the other regions. That is, the light striking the predetermined partial region can be reflected toward the lens 123 and ultimately guided to the event-based camera 130.
In this way, the liquid crystal panel, the DMD, and the like guide the light striking a predetermined partial region to the event-based camera 130 while reducing the light striking the other regions, by blocking or absorbing it so that it is not guided to the event-based camera 130. In other words, they can be described as a light-reducing section that removes part of the irradiated light.
As described above, according to the measuring device 100 of the first embodiment of the present invention, the event-based camera 130 is used as the imaging element, minute changes in the amount of received light are captured appropriately by the logarithmic amplifier, and the amount of data processed by the event-based camera 130 is adjusted appropriately by the ROI function and the light-blocking section 160. As a result, the measuring device 100 can measure the object T even more appropriately, at high speed and with high accuracy.
In this embodiment, the optical system 120 is composed of the beam splitter 121, the lenses 122 to 126, the mirrors 127 to 129, and so on, as shown in FIGS. 1 and 9, but it is not limited to this. For example, any optical system may be used as long as it separates the light L0 projected by the light source 110 into the measurement light L1 and the reference light L2, ultimately allows the event-based camera 130 to receive the interference light of the measurement light L1 reflected by the object and the reference light L2 reflected by the mirror, and enables measurement by the white-light interference method.
In this embodiment, the ROI function and the light-blocking section 160 have been described as examples of the adjustment unit capable of adjusting the amount of data processed by the event-based camera 130, together with the teaching of the data read-out regions, transmissive regions, light-blocking regions, and so on, but the invention is not limited to these. For example, the number of pixels of the event-based camera 130 may be reduced (the resolution lowered) so that data can be acquired even if events occur simultaneously at all pixels.
[Application example]
 FIG. 10 is a schematic diagram showing the configuration and functions of a measuring device 100A to which the measuring device 100 according to the first embodiment of the present invention is applied. As shown in FIG. 10, the measuring device 100A includes the light source 110, an optical system 120A, the event-based camera 130, and a CMOS image sensor 170. Although FIG. 10 is schematic, the measuring device 100A differs from the measuring device 100 shown in FIG. 1 in that the CMOS image sensor 170 and a half mirror 171A are added.
The measuring device 100A basically measures the object T by the white-light interference measurement method, as described for the measuring device 100. The event-based camera 130 functions as an imaging element that acquires data on the object T in the Z-axis direction, as described for the measuring device 100.
Furthermore, in the measuring device 100A, the CMOS image sensor 170 receives the light reflected by the half mirror 171A and acquires image data of the object T. More specifically, the CMOS image sensor 170 functions as an imaging element that acquires an image of the object T by measurement in the XY plane.
As described above, according to the measuring device 100A, by fusing the image acquired by the CMOS image sensor 170 with the white-light interference measurement method using the event-based camera 130, measurement of the object T along the XYZ coordinate axes can be realized with a single sensor.
The same role as the CMOS image sensor 170 may also be realized using the event-based camera 130. The event-based camera 130 detects changes in luminance, and some types output not only the polarity of the luminance change but also the amount of change. In that case, the luminance information can be restored by performing integration processing.
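A minimal sketch of that integration step, assuming each event carries its pixel coordinates and a signed luminance increment (the event format and the zero initial level are assumptions), is:

    import numpy as np

    def restore_luminance(events, width, height, initial=0.0):
        # events: iterable of (x, y, delta_luminance); returns a luminance image.
        img = np.full((height, width), initial, dtype=float)
        for x, y, delta in events:
            img[y, x] += delta    # integrate the reported luminance changes
        return img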
Furthermore, the ROI function and the light-reducing section described above may be used in combination. A plurality of ROI regions obtained by dividing the screen can be imaged while being switched sequentially and then combined into a single image.
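Combining the sequentially captured ROI sub-images into one frame can be sketched as follows (the capture format, with each sub-image tagged by its top-left corner, is an assumption):

    import numpy as np

    def stitch_roi_captures(captures, width, height):
        # captures: list of (x, y, sub_image) where sub_image is a 2-D array.
        frame = np.zeros((height, width), dtype=float)
        for x, y, sub in captures:
            h, w = sub.shape
            frame[y:y + h, x:x + w] = sub   # paste each ROI capture into place
        return frame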
If the same role as the CMOS image sensor 170 is realized using the event-based camera 130 in this way, there is no need to provide the CMOS image sensor 170 separately, and the structure can be simplified.
In the measuring device 100A as well, the logarithmic amplifier, the ROI function, the light-blocking section 160, and the like applied in the measuring device 100 can also be applied.
<Second embodiment>
 Next, in the second embodiment of the present invention, a measuring device is described that uses a wavelength-swept light source as the control means for changing the amount of light received by the event-based camera. In this embodiment, the description focuses on the configurations that differ from the first embodiment of the present invention, and matters common to the first embodiment are omitted or simplified.
FIG. 11 is a schematic diagram showing the configuration of a measuring device 200 according to the second embodiment of the present invention. As shown in FIG. 11, the measuring device 200 includes a light source 210, the optical system 120, the event-based camera 130, and the processing unit 140. The optical system 120 includes the beam splitter 121, the lenses 122 to 126, and the mirrors 127 to 129.
In the measuring device 100 according to the first embodiment of the present invention shown in FIG. 1, the drive unit 150 that drives the mirror 129 serves as the control means for changing the amount of light received by the event-based camera 130; an interference waveform is generated by changing the optical path length difference between the measurement light L1 and the reference light L2, and the object T is measured by the white-light interference measurement method.
In the measuring device 200 according to this embodiment, a wavelength-swept light source is used as the light source 210 to serve as the control means for changing the amount of light received by the event-based camera 130; an interference waveform between the measurement light L1 and the reference light L2 is generated by continuously varying the wavelength of the light projected from the light source 210, and the object T is measured by a measurement method called the wavelength-sweep method.
As the wavelength-swept light source, the light source 210 may be, for example, a MEMS-VCSEL (Micro Electro Mechanical Systems - Vertical Cavity Surface Emitting Laser), a DFB (Distributed Feedback) laser, or an SSG-DBR (Super Structure Grating - Distributed Bragg Reflector) laser. Like the light source 110 of the measuring device 100 according to the first embodiment, the light source 210 emits light at a constant intensity, and its emission intensity does not change.
A MEMS-VCSEL changes its lasing wavelength (sweeps the wavelength) by driving a resonator mirror with a MEMS; a DFB laser sweeps the wavelength by changing the drive current of the laser diode; and in an SSG-DBR laser, diffraction gratings called distributed Bragg reflectors are formed on the two resonator mirrors, and the wavelength is swept by injecting current.
For the interference waveform L3 between the measurement light L1 and the reference light L2 generated by varying the wavelength of the light projected from the light source 210, the event-based camera 130 acquires data by detecting changes in the luminance value of the received light, in the same way as described in the first embodiment of the present invention.
The processing unit 140 measures the distance to the object T based on the data of the interference waveform L3 acquired by the event-based camera 130, the wavelength (frequency) of the light projected from the light source 210, and so on.
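In swept-source interferometry generally, the optical path difference is proportional to the beat frequency of the interference signal over the sweep. The following Python fragment sketches that relationship for a single pixel under the assumption of a linear sweep sampled uniformly in time; the parameter names and the FFT-based peak search are illustrative and do not represent the specific processing of the processing unit 140:

    import numpy as np

    C = 299_792_458.0  # speed of light in m/s

    def distance_from_sweep(intensity, sweep_bandwidth_hz, sweep_time_s):
        # intensity          : interference signal sampled uniformly during a linear sweep
        # sweep_bandwidth_hz : total optical-frequency excursion of the sweep
        # sweep_time_s       : duration of the sweep
        spectrum = np.abs(np.fft.rfft(intensity - np.mean(intensity)))
        beat_idx = int(np.argmax(spectrum[1:])) + 1      # skip the DC bin
        beat_freq_hz = beat_idx / sweep_time_s           # cycles per second
        # Beat frequency f_b = (dnu/dt) * (2 * d / c)  =>  d = f_b * c / (2 * dnu/dt)
        sweep_rate = sweep_bandwidth_hz / sweep_time_s
        return beat_freq_hz * C / (2.0 * sweep_rate)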
As described above, according to the measuring device 200 of the second embodiment of the present invention, when the object T is measured by the wavelength-sweep measurement method using a wavelength-swept light source as the light source 210, the event-based camera 130 is used as the imaging element so that each arrayed pixel detects changes in the luminance value of the received light and acquires data on the interference-waveform portion at the timing of that detection; the amount of data acquired can thus be reduced while still obtaining the data needed to measure the distance to the object T.
In the measuring device 200 according to this embodiment as well, the logarithmic amplifier, the ROI function, the light-blocking section 160, and the like applied in the measuring device 100 according to the first embodiment of the present invention can also be applied.
Furthermore, like the measuring device 100A described with reference to FIG. 10, by additionally providing the measuring device 200 according to this embodiment with the CMOS image sensor 170 and the like, the image acquired by the CMOS image sensor 170 can be fused with the wavelength-sweep measurement method using the event-based camera 130, so that measurement of the object T along the XYZ coordinate axes can also be realized with a single sensor.
The embodiments described above are intended to facilitate understanding of the present invention and are not intended to limit its interpretation. The elements of the embodiments and their arrangement, materials, conditions, shapes, sizes, and so on are not limited to those illustrated and can be changed as appropriate. Configurations shown in different embodiments can also be partially replaced or combined with one another.
[Appendix]
 A measuring device (100, 200) for measuring a distance to an object, comprising:
 a light source (110, 210) that projects light;
 a mirror (129);
 a beam splitter (121) that splits the light projected from the light source into measurement light guided to the object and reference light guided to the mirror;
 an event-based camera (130) that acquires data on an interference waveform between the reference light reflected by the mirror and the measurement light reflected by the object by detecting changes in the received luminance value;
 a processing unit (140) that measures the distance to the object based on the data of the interference waveform; and
 a control means (150) that performs control so as to change the amount of light received by the event-based camera.
100, 100A, 200…measuring device; 110, 210…light source; 120, 120A…optical system; 121…beam splitter; 122 to 126…lenses; 127 to 129…mirrors; 130…event-based camera; 140…processing unit; 150…drive unit; 160…light-blocking section; 170…CMOS image sensor; 171A…half mirror; L0 to L3…light; T…object

Claims (10)

  1.  A measuring device for measuring a distance to an object, comprising:
     a light source that projects light;
     a mirror;
     a beam splitter that splits the light projected from the light source into measurement light guided to the object and reference light guided to the mirror;
     an event-based camera that acquires data on an interference waveform between the reference light reflected by the mirror and the measurement light reflected by the object by detecting changes in the received luminance value;
     a processing unit that measures the distance to the object based on the data of the interference waveform; and
     a control means that performs control so as to change the amount of light received by the event-based camera.
  2.  The measuring device according to claim 1, wherein the control means includes a drive unit that drives the mirror so as to change an optical path length difference between the measurement light and the reference light.
  3.  The measuring device according to claim 1 or 2, wherein the control means uses a wavelength-swept light source as the light source so that the wavelength of the light projected from the light source changes continuously.
  4.  The measuring device according to any one of claims 1 to 3, further comprising a logarithmic amplifier that treats increases and decreases in the amount of light received by the event-based camera as polarity data and calculates the amount of light received by the event-based camera based on an accumulation of the polarity data.
  5.  The measuring device according to any one of claims 1 to 4, further comprising an adjustment unit capable of adjusting the amount of data processed by the event-based camera.
  6.  The measuring device according to claim 5, wherein the adjustment unit has an ROI function for reading out data of a predetermined partial region.
  7.  The measuring device according to claim 5 or 6, wherein the adjustment unit includes a light-reducing section configured to guide light striking a predetermined partial region to the event-based camera and not to guide light striking the other regions to the event-based camera.
  8.  The measuring device according to claim 7, wherein the light-reducing section is composed of a liquid crystal panel.
  9.  The measuring device according to claim 7 or 8, wherein the light-reducing section is arranged on a common optical path of the measurement light and the reference light.
  10.  The measuring device according to any one of claims 6 to 9, wherein the predetermined partial region is set in advance according to a shape of the object.
PCT/JP2022/001524 2021-03-01 2022-01-18 Measuring device WO2022185753A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021031410A JP2022132771A (en) 2021-03-01 2021-03-01 Measurement device
JP2021-031410 2021-03-01

Publications (1)

Publication Number Publication Date
WO2022185753A1 true WO2022185753A1 (en) 2022-09-09

Family

ID=83154953

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/001524 WO2022185753A1 (en) 2021-03-01 2022-01-18 Measuring device

Country Status (3)

Country Link
JP (1) JP2022132771A (en)
TW (1) TWI801104B (en)
WO (1) WO2022185753A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007033217A (en) * 2005-07-26 2007-02-08 Keyence Corp Interference measuring instrument, and interference measuring method
JP2014228486A (en) * 2013-05-24 2014-12-08 インスペック株式会社 Three-dimensional profile acquisition device, pattern inspection device, and three-dimensional profile acquisition method
US20210348966A1 (en) * 2020-05-08 2021-11-11 Raytheon BBN Technologies, Corp. Systems, Devices, and Methods for Hyperspectral Imaging

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW200842345A (en) * 2007-04-24 2008-11-01 Univ Minghsin Sci & Tech Measurement system and method by using white light interferometer
JPWO2009147908A1 (en) * 2008-06-06 2011-10-27 コニカミノルタセンシング株式会社 Light measurement apparatus, light measurement method, and program
CN104755908B (en) * 2012-07-27 2017-12-12 统雷有限公司 Quick imaging system
US20150002852A1 (en) * 2013-06-26 2015-01-01 Zygo Corporation Coherence scanning interferometry using phase shifted interferometrty signals
CN103983206A (en) * 2014-05-12 2014-08-13 上海理工大学 Interference microscope system based on programmable illumination


Also Published As

Publication number Publication date
TWI801104B (en) 2023-05-01
JP2022132771A (en) 2022-09-13
TW202235815A (en) 2022-09-16

Similar Documents

Publication Publication Date Title
JP5281923B2 (en) Projection display
TWI564539B (en) Optical system, method for illumination control in the same and non-transitory computer-readable medium
US9910255B2 (en) Optical system for generating a pattern which changes over time for a confocal microscope
KR102090857B1 (en) Image synchronization of scanning wafer inspection system
KR20070115972A (en) Exposure apparatus and exposure method
KR102119289B1 (en) Systems and methods for sample inspection and review
KR20090116731A (en) Method and device for monitoring multiple mirror arrays in an illumination system of a microlithographic projection exposure apparatus
CN104755996A (en) Polarization splitting multiplexing device, optical system, and display unit
KR101743810B1 (en) Light irradiation apparatus and drawing apparatus
CN112469961B (en) Color confocal area sensor
JP2006208432A (en) Exposure method and apparatus
KR20180104594A (en) Laser light irradiation device
JP4090860B2 (en) 3D shape measuring device
JP4447970B2 (en) Object information generation apparatus and imaging apparatus
WO2022185753A1 (en) Measuring device
US11156566B2 (en) High sensitivity image-based reflectometry
US9041907B2 (en) Drawing device and drawing method
US20120242995A1 (en) Pattern inspection apparatus and pattern inspection method
KR100925783B1 (en) Apparatus and method for measuring shape
KR102632409B1 (en) Projection system and method utilizing adjustable angle illumination using lens decentering
JP3973979B2 (en) 3D shape measuring device
KR102103919B1 (en) Multi-image projector and electronic device having multi-image projector
WO2023282213A1 (en) Pattern exposure apparatus, exposure method, and device manufacturing method
TWI835976B (en) Imaging reflectometer
TW202407380A (en) Lidar system and resolusion improvement method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22762812

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22762812

Country of ref document: EP

Kind code of ref document: A1