WO2022102302A1 - Shape Measurement System - Google Patents
- Publication number: WO2022102302A1 (PCT application PCT/JP2021/037286)
- Authority: WO (WIPO, PCT)
- Prior art keywords: light, unit, light source, measurement, reflected
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2518—Projection by scanning of the object
- G01B11/2545—Measuring contours or curvatures with one projection direction and several detection directions, e.g. stereo
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/47—Image sensors with pixel address output; Event-driven image sensors; Selection of pixels to be read out based on image data
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/701—Line sensors
Definitions
- the present disclosure relates to a shape measurement system that three-dimensionally measures the shape of a measurement target.
- A non-scanning image pickup device called an EVS (Event-based Vision Sensor) is known.
- In contrast, a scanning (synchronous) image pickup device performs imaging in synchronization with a synchronization signal such as a vertical synchronization signal.
- The non-scanning image pickup device can detect, as an event, that the amount of change in the luminance of a pixel that photoelectrically converts incident light has exceeded a predetermined threshold.
- The light cutting method is used to detect shape abnormalities of a measurement target (hereinafter, "measured object"), especially in inspection processes at factories and the like.
- In shape measurement using the light cutting method, line-shaped light is irradiated from the light projecting unit onto the measured object, and the reflected light that the measured object returns from this irradiation light is received by the non-scanning image pickup device. The width and height of the measured object are then determined from the shape of the imaged cross section.
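The height derivation in the light cutting method can be illustrated with a minimal triangulation sketch. The function name, the fixed triangulation angle, and the assumption that line displacements have already been converted from pixels to millimetres are all illustrative assumptions; the disclosure itself gives no formulas.

```python
import math

def profile_heights(offsets_mm, triangulation_angle_deg):
    """Light-section triangulation sketch: the imaged laser line shifts
    sideways by an amount proportional to the surface height, so with an
    angle theta between the projection and viewing axes,
    height = offset / tan(theta).  Names and geometry are assumptions."""
    tan_theta = math.tan(math.radians(triangulation_angle_deg))
    return [d / tan_theta for d in offsets_mm]

# With a 45-degree geometry, height roughly equals the observed offset.
print([round(h, 3) for h in profile_heights([0.0, 0.5, 1.2, 0.5, 0.0], 45.0)])
# -> [0.0, 0.5, 1.2, 0.5, 0.0]
```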
- A shape measuring device using the light cutting method, one application of a non-scanning image pickup device, performs shape measurement by receiving the reflected light from the measured object produced by the irradiation light from the light projecting unit.
- However, the imaged shape may differ from the original shape due to disturbance light, stray reflection of the irradiation light from the light projecting unit, and the like, so that correct shape measurement may not be possible.
- The object of the present disclosure is to acquire accurate three-dimensional information even for a measured object whose measurement surface has large differences in unevenness.
- The scan mechanism is provided in the light source unit, and the light source unit includes a light source and a waveform control lens that controls the waveform of the light from the light source.
- The light receiving unit includes a light receiving lens and an EVS (Event-based Vision Sensor), an asynchronous image sensor that captures the temporal change of the reflected light transmitted through the light receiving lens and outputs an event signal.
- An event issuing unit that detects an event based on the output data from the EVS and outputs event data, and a transmission unit that outputs the event data to the signal processing unit, may also be provided.
- The light source unit may be a single unit, while the light receiving unit includes a first light receiving unit and a second light receiving unit installed at mutually different positions.
- Alternatively, the light source unit may include a first light source unit and a second light source unit installed at mutually different positions, while the light receiving unit is a single unit. In that case, the signal processing unit may include a control signal transmission unit that outputs a control signal to each scan mechanism at time-division intervals for each pixel column.
- Alternatively, the light source unit may include a first light source unit with a first light source that emits light of a first wavelength and a second light source unit with a second light source that emits light of a second wavelength, while the light receiving unit is a single unit. The EVS is then divided, pixel column by pixel column, into a first division area and a second division area. Of the reflected light of the two different wavelengths returned by the measurement target, the first division area includes a first filter with a first wavelength transmission characteristic that selectively transmits the reflected light of the first wavelength, and the second division area includes a second filter with a second wavelength transmission characteristic that selectively transmits the reflected light of the second wavelength.
- Alternatively, the light source unit may include a first light source unit provided with the first light source and a second light source unit provided with the second light source. The first light source unit includes a first optical element that generates, from the emitted light of the first light source, a first measurement light having a first polarization plane, and the second light source unit includes a second optical element that generates a second measurement light having a second polarization plane. The light receiving unit is a single unit, and the EVS is divided into a first division area and a second division area for each pixel column. Of the first reflected light having the first polarization plane and the second reflected light having the second polarization plane that are reflected by the measurement target and enter the light receiving unit, the first division area includes a first film that selectively transmits the first reflected light, and the second division area includes a second film that selectively transmits the second reflected light.
- Alternatively, the light source unit may be a single unit whose light source is combined with a dividing means that splits the light from the light source into at least three wavelength bands, while the light receiving unit is a single unit. The EVS is then divided, in pixel column units, into three division areas, a first through a third. Of the reflected light in the three wavelength bands returned by the measurement target, the first division area includes a first filter with a wavelength transmission characteristic that selectively transmits the reflected light of the first wavelength band, and likewise for the second and third areas. The dividing means may be a prism that splits white light from the light source into three wavelength bands, for example light in the red, green, and blue wavelength bands.
- The EVS may output an event signal by comparing a signal voltage based on the temporal change of the reflected light with a threshold voltage and determining whether it is smaller or larger than the threshold voltage.
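The threshold comparison described here can be sketched in software. The frame-based formulation, the symmetric positive/negative thresholds, and all names are simplifying assumptions: a real EVS performs the comparison per pixel in analog circuitry, not over frames.

```python
def detect_events(prev, curr, threshold):
    """Compare each pixel's change in signal level against a threshold and
    emit ON (+1) events where the change is larger than the threshold and
    OFF (-1) events where it is smaller than -threshold, mirroring the
    comparison of the signal voltage with a threshold voltage."""
    events = []
    for y, (row_prev, row_curr) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_prev, row_curr)):
            delta = c - p
            if delta > threshold:
                events.append((x, y, +1))   # ON event: luminance rose
            elif delta < -threshold:
                events.append((x, y, -1))   # OFF event: luminance fell
    return events

# Only pixels whose luminance changed beyond the threshold produce events.
print(detect_events([[10, 10], [10, 10]], [[18, 10], [10, 3]], 5))
# -> [(0, 0, 1), (1, 1, -1)]
```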
- Via the plurality of optical paths, a plurality of independent optical data are captured, one for each shortest optical path. Data derived from normal reflected light (light that travels a shortest optical path) therefore has the same value for a given reflection point regardless of which optical path it followed: for the physical shape (for example, the height) of the measured object at a given point, the obtained value is uniquely determined, since there is only one shape at that position.
- Because the height value at a given point is the same along any shortest optical path (route) as long as the light is normal reflected light, finding the pair of identical data values that must always exist identifies the data obtained from normal reflected light, so the true data for the measurement surface can be detected reliably and accurately. Consequently, even if unnecessary secondary reflected light, reflected or scattered on the measurement surface of the measured object, enters the light receiving unit, erroneous distance measurement caused by data from that secondary reflected light can be suppressed.
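The pairing argument above can be sketched as a simple fusion rule. This is a minimal illustration assuming the two light receiving units already yield per-point height estimates; the agreement tolerance `tol` and all names are hypothetical.

```python
def reject_multipath(heights_a, heights_b, tol=0.05):
    """Keep a height only where the two independently measured values agree
    within tol: normal reflections from the same point must have the same
    value, while secondary (multipath) reflections generally do not."""
    fused = []
    for ha, hb in zip(heights_a, heights_b):
        if ha is not None and hb is not None and abs(ha - hb) <= tol:
            fused.append((ha + hb) / 2.0)   # agreeing pair: normal reflection
        else:
            fused.append(None)              # disagreement: treat as multipath
    return fused

# The third point disagrees between the two units and is rejected.
print(reject_multipath([1.0, 2.0, 3.0], [1.0, 2.0, 4.5]))
# -> [1.0, 2.0, None]
```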
- Because an EVS (Event-based Vision Sensor) is used as the light receiving element, fast and accurate shape measurement can be realized.
- Third configuration example: multiple light sources + single light receiving unit (wavelength division control)
- 2-4. Fourth configuration example: multiple light sources + single light receiving unit (polarization switching control)
- 2-5. Fifth configuration example: single light source + single light receiving unit (spectral splitting with a prism)
- 3. Configurations that the present disclosure can take
- FIG. 1 shows a schematic diagram showing a shape measurement system according to the first embodiment of the present disclosure
- FIG. 2 shows a block diagram showing a schematic configuration of the shape measurement system of the present disclosure.
- As shown in FIG. 1, the shape measuring system 1 of the present disclosure roughly comprises a stage 2; a shape measuring device 10 that three-dimensionally measures the measurement surface 100A of a measured object 100 resting on the stage 2; a moving mechanism 3, constituting a scanning mechanism, that intermittently moves the shape measuring device 10 in the Y direction; a control unit 4 that controls the moving mechanism 3; and a signal processing unit 13 (hereinafter, "signal processing unit") that performs signal processing in the shape measuring device 10.
- the shape measuring device 10 of the present embodiment includes a light source unit 11 and a light receiving unit 12, and is housed inside the head H.
- A non-scanning (hereinafter, "asynchronous") imaging method called EVS (Event-based Vision Sensor) is used; details will be described later.
- For the plurality of measured objects 100 shown in the figure, the shape measurement operation is performed sequentially and continuously from the leftmost to the rightmost object. The moving mechanism 3 is provided for this purpose; it is of course unnecessary when only a single measured object 100 is measured.
- The stage 2 of the present embodiment is a flat plate, rectangular in plan view, aligned so that its two pairs of sides coincide with the X direction and the Y direction.
- the stage 2 of the present embodiment is placed on a base or the like (not shown).
- Under the control of the movement control unit 42 described later, the moving mechanism 3 intermittently moves the head H housing the shape measuring device 10 in the Y direction at high speed in step with the measurement of the measured objects 100. As a result, each measured object 100 is measured in turn.
- a robot may be used.
- a rack and pinion, a ball screw, an endless belt, a wire, or any other appropriate means may be used.
- the moving mechanism 3 of the present embodiment is configured to intermittently move the head H in the Y direction, but the present disclosure is not particularly limited to this. That is, instead of the head H, either the stage 2 or the base may move. Further, the moving mechanism of the present disclosure may be such that the light source portion side inside the head H is moved or only the light source inside the light source portion is moved, as in the second embodiment, for example.
- the signal processing unit 13 acquires the three-dimensional shape of the measured object 100 by performing predetermined signal processing at high speed based on the "event data" obtained by the light receiving unit 12 of the shape measuring device 10.
- The signal processing unit 13 of the present embodiment includes a control signal transmission unit 131, an unnecessary signal removal unit 132, a three-dimensional shape calculation unit (hereinafter, "3D calculation unit") 133, and a memory 134. The details of each of these elements, including the "event data", will be described later.
- An asynchronous image pickup device called an EVS is used; with this EVS, shape measurement can be performed at higher speed than with conventional sensors.
- FIG. 3 shows a first aspect of the second embodiment according to the present disclosure (hereinafter, "first configuration example"), namely the configuration of the shape measuring device 10A, in particular the light source unit 11, the light receiving unit 12, and the signal processing unit 13 provided in it.
- the same parts as those in FIGS. 1 and 2 are designated by the same reference numerals to avoid duplicate explanations.
- the shape measuring device 10A of the present disclosure may have the same configuration as that used in the shape measuring system 1 of the first embodiment described above, or may have a different configuration.
- The shape measuring device 10A of the first configuration example is of a type comprising a single light source unit 11 and a plurality of light receiving units 12. Specifically, a single light source unit 11, two light receiving units 12 (a first light receiving unit 12A and a second light receiving unit 12B), and the signal processing unit 13 are housed in the head H.
- The head H of the present disclosure may house only the light source unit 11 and the light receiving units 12, with the signal processing unit 13 installed separately and exchanging data by wireless communication, infrared communication, or the like.
- The two light receiving units, the first light receiving unit 12A and the second light receiving unit 12B, are preferably separated from each other as far as possible. In the shape measuring device 10A of the first configuration example they can be installed at mutually opposite positions in the lateral (Y) direction with the light source unit 11 at the center, but they need not be arranged symmetrically about the light source unit 11.
- The light source unit 11 of this configuration example (hereinafter, "light source unit 11A") is a single unit and, as shown in FIG. 3, specifically comprises a light source 111A, a waveform control lens 112, and a scan mechanism 113. In the following description of the elements of the light source unit 11A, the suffix "A" of the light source 111A is omitted.
- A semiconductor laser (hereinafter, "semiconductor laser 111") can be used as the light source 111. The semiconductor laser 111 of the present embodiment can continuously oscillate, for example, a laser beam of a blue wavelength (λb); this blue light serves as the measurement light in the shape measurement of the measured object 100 in the first configuration example.
- the laser light used in the light source unit of the present disclosure is not particularly limited to this blue light.
- The light source of this configuration example preferably emits high-luminance light with a certain degree of directivity. The illumination light used need not be coherent like laser light, but preferably has little chromatic or other aberration.
- An edge-emitting laser (Edge Emitting Laser: EEL) or a vertical-cavity surface-emitting laser (Vertical Cavity Surface Emitting Laser: VCSEL) may be used.
- A waveform forming lens is used as the waveform control lens 112. This is an optical element for shaping the emitted laser beam into a slit; for example, a cylindrical lens can be used (hereinafter, "cylindrical lens 112"). The beam of blue laser light oscillated from the semiconductor laser 111 is shaped into light that is thin and wide (belt-shaped) perpendicular to the traveling direction, or light that fans out thinly toward the traveling direction (hereinafter, "line slit light" or simply "slit light").
- The line slit light of this configuration example covers, at a single instant, an entire pixel column (COLUMN), narrow in width, along the depth (X) direction of the measured object 100, as shown in FIG. 1.
- the scanning mechanism 113 sends out a laser beam whose waveform is formed in a slit shape by the cylindrical lens 112 in the Y (row: ROW) direction with the passage of time, and performs a sweep scan. In this way, the scan mechanism 113 can three-dimensionally measure the shape of the entire surface of the measurement surface 100A of the object to be measured 100 by sequentially transmitting the laser beam in the Y direction.
- This configuration example of the present disclosure is of a type in which the shape measuring device 10A side sweeps and scans the stationary measured object 100 to measure its three-dimensional shape. Alternatively, a type may be used in which the shape measuring device 10A stands still while the measured object 100 is moved and scanned by an appropriate transport means.
- The shape is measured in a dark environment with the illumination turned off; otherwise, mounting an optical filter of the corresponding wavelength band in front of the image pickup element of the image pickup device 122 should be considered.
- Each light receiving unit 12 of this configuration example includes a light receiving lens 121, an image pickup device 122, an event issuing unit 123, and a transmission unit 124. For convenience, the front stage of the light receiving unit 12 is described as the "light receiving function unit" and the rear stage as the "detection function unit".
- Specifically, the first light receiving unit 12A includes a first light receiving lens 121A, a first image pickup device 122A, a first event issuing unit 123A, and a first transmission unit 124A, and the second light receiving unit 12B includes a second light receiving lens 121B, a second image pickup device 122B, a second event issuing unit 123B, and a second transmission unit 124B.
- the light receiving function unit which is the front stage portion, receives the reflected light (scattered light) on the measurement surface 100A of the line slit light projected from the light source unit 11A.
- the light receiving function unit of the present embodiment may include a light receiving lens 121 and an image pickup unit 21 described later, which includes an optical element including an image pickup device 122 and various circuits.
- the light receiving function unit acquires the luminance (change) data related to the shape of the measurement object 100, which is the measurement of the measurement object 100 in pixel row units.
- an information signal such as the outer shape and the position information associated with the outer shape, that is, a signal for detecting event data described later is output to the detection function unit of the subsequent stage portion.
- the imaging unit 21 will be described in detail later.
- The light receiving lens 121 forms an image of the incident line slit light on the image pickup device 122; in the present disclosure the light receiving lens simply images the reflected light onto the image pickup device 122.
- The image pickup device 122 uses the above-mentioned EVS (Event-based Vision Sensor), a sensor of a different type from conventional ones. This EVS is an asynchronous image pickup device in which pixels that photoelectrically convert the light reflected from the measured object 100 are arranged two-dimensionally in a matrix.
- a plurality of pixel circuits are arranged in a two-dimensional grid pattern.
- A set of pixel circuits arranged along the depth (X) direction is referred to as a "column" (COLUMN), and a set of pixel circuits arranged along the horizontal (Y) direction orthogonal to it as a "row" (ROW).
- A photoelectric conversion signal is captured in each pixel from the reflected light received according to the image information for each pixel column, and the captured data is output to the event issuing unit 123. The detection function unit then performs signal processing, whereby the signal from the incident light is captured as an event. Since the EVS outputs, in pixel units, only information for pixels whose luminance has changed, it can output signals stably without being affected by the color or reflectance of the measured object or by ambient light.
- the detection function unit may include an event issuing unit 123 and a transmitting unit 124.
- The event issuing unit 123 of the present embodiment detects the position of an event from the synchronization signal input at that time t; this position detection is described later with reference to FIGS. 5 to 8. The event issuing unit 123 also detects the luminance change at each received pixel address in real time as an "address event". When it detects an address event, it compares its luminance with that of the immediately preceding address event, and when the amount of change exceeds a predetermined threshold, data including position information (coordinate data) indicating the coordinates of the event is captured as an "event" and output as "event data" to the transmission unit 124. In addition to the position information, the event data can include time information representing the relative time at which the event occurred, and may also include gradation information representing the signal level.
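The "event data" record described here can be sketched as a small data structure: required coordinates, plus optional relative-time and gradation fields. The field names and types are illustrative assumptions, not the patent's definition.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EventData:
    """One event record: coordinate position of the event (required),
    plus optional relative time of occurrence and gradation (signal
    level) information, as described for the event issuing unit."""
    x: int                        # pixel column coordinate
    y: int                        # pixel row coordinate
    t_rel: Optional[int] = None   # relative time of the event (e.g. microseconds)
    level: Optional[int] = None   # gradation information (signal level)

ev = EventData(x=120, y=45, t_rel=1500)
print(ev)  # -> EventData(x=120, y=45, t_rel=1500, level=None)
```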
- The transmission unit 124 outputs the event data for three-dimensional reconstruction output from the event issuing unit 123 to the unnecessary signal removal unit 132 (described later) on the signal processing unit 13 side.
- the signal processing unit 13 of the present embodiment detects and stores the three-dimensional data of the measurement object 100 at high speed based on the data obtained from the light receiving unit 12.
- the signal processing unit 13 includes a control signal transmission unit 131, an unnecessary signal removal unit 132, a 3D calculation unit 133, a memory 134, and the like.
- The control signal transmission unit 131 outputs a control signal that starts the light source 111 and the scan mechanism 113 at the start of the shape measurement operation of the measured object 100 (time t0); that is, the sweep scanning in the lateral (Y) direction of the measured object 100 starts in synchronization with the laser oscillation. Further, to drive the event issuing unit 123 in synchronization with the laser oscillation of the light source 111 and the sweep of the scan mechanism 113, the control signal transmission unit 131 outputs a synchronization signal to the event issuing units 123 of the light receiving units 12A and 12B.
- The event issuing unit 123 can thus input the luminance signal via the EVS 122 while associating the synchronization signal with the event data; as a result, shape measurement data for all pixels over the entire surface of the measured object 100 is generated.
- The shapes of a plurality of measured objects can be measured one after another by the intermittent operation of the moving mechanism 3 that moves the shape measuring device 10A in the Y direction (this is called "multiple object measurement").
- Alternatively, by repeating the same control with respect to the sweep scanning operation of a single course, the shape of one measurement object can be measured repeatedly and continuously in time (this is referred to as "single object measurement"). As a result, in single object measurement, the temporal change of the measured object 100 can be detected at minute time intervals.
- The "single object measurement" type is adopted here, which is convenient when the shape changes from moment to moment.
- "Multiple object measurement" may be adopted in the configuration examples after the first configuration example.
- The unnecessary signal removing unit 132 removes multipath reflected light (this is referred to as "secondary reflected light") that leads to a measurement error occurring when the laser light from the light source 111 is incident on the measurement surface 100A.
- Multipath reflected light is light that is incident from a path that is not intended for measurement, and is mainly generated by multiple reflections of various objects in the outside world.
- The unnecessary signal removing unit 132 in this configuration example removes the unnecessary signal generated by secondary reflected light that is incident at an incident angle different from the incident angle (reference angle) of the required scattered light (the "normally reflected light" described later) among the slit light from the light source unit 11A incident on the image pickup apparatus 122.
- The coordinates (X, Y, Z) of each pixel corresponding to a specified arbitrary point on the surface of the object to be measured 100 are uniquely determined. That is, in the light receiving unit 12, the incident angle of the reflected light from the light source 111 toward the measurement object 100 is a constant specific value (the normal incident angle).
- When light incident at this normal incident angle (this is sometimes called "normally incident light") is reflected by the measuring object 100 and enters the image pickup apparatus 122, the resulting reflected light is called "normally reflected light". If a point is at the same height as the immediately preceding point, the incident angle of the normally reflected light at that point is also the same.
- the secondary reflected light may erroneously enter the light receiving unit 12 at an incident angle different from the normal incident angle.
- The unnecessary signal removing unit 132 retains the electric signal that should be generated from the light receiving intensity or light receiving amount when light is incident at the original normal incident angle (hereinafter referred to as the "normal electric signal"), that is, the signal with the unique photocurrent value or photovoltaic value, and removes any electric signal other than this normal electric signal (hereinafter referred to as an "unnecessary signal").
- The memory 134 stores shape information related to the three-dimensional shape, which is the event data for each pixel of the measurement object 100, obtained by processing with the required circuits the data on the three-dimensional shape, in particular the height information (or thickness information), of the measurement surface 100A of the measurement object 100 sequentially captured by the shape measuring device 10A.
- FIG. 5 is a block diagram showing an example of the configuration of the imaging unit 21.
- The image pickup unit 21 uses an asynchronous image pickup device called an EVS, and includes a pixel array unit 211, a drive unit 212, an arbiter unit 213, a column processing unit 214, and a signal processing unit 215.
- Pixel array unit: In the image pickup unit 21 having the above configuration, a plurality of pixels 210 are two-dimensionally arranged in a matrix (array shape) in the pixel array unit 211.
- a vertical signal line VSL which will be described later, is wired for each pixel row (column) with respect to this matrix-shaped pixel array.
- Drive unit 212 drives each of the plurality of pixels 210 to output the pixel signal generated by each pixel 210 to the column processing unit 214.
- Arbiter unit 213 arbitrates the requests from each of the plurality of pixels 210, and transmits a response based on the arbitration result to the pixels 210.
- the pixel 210 supplies event data (address event detection signal) indicating the detection result to the drive unit 212 and the signal processing unit 215.
- The column processing unit 214 is composed of, for example, an analog-to-digital converter, and performs processing such as converting the analog pixel signals output from the pixels 210 of the pixel array unit 211 into digital signals for each pixel row (column) of the pixel array unit 211. The column processing unit 214 then supplies the digital signals after analog-to-digital conversion to the signal processing unit 215.
- Signal processing unit 215 executes predetermined signal processing such as CDS (Correlated Double Sampling) processing and image recognition processing on the digital signal supplied from the column processing unit 214. Then, the signal processing unit 215 outputs the data indicating the processing result and the event data supplied from the arbiter unit 213 via the signal line 216.
- FIG. 6 is a block diagram showing an example of the configuration of the pixel array unit 211.
- Each of the plurality of pixels 210 has a photoelectric conversion unit 51, a pixel signal generation unit 52, and an address event detection unit 53.
- The photoelectric conversion unit 51 photoelectrically converts the incident light to generate a photocurrent. The photoelectric conversion unit 51 then supplies the photocurrent generated by photoelectric conversion to either the pixel signal generation unit 52 or the address event detection unit 53 under the control of the drive unit 212 (see FIG. 5).
- FIG. 7 is a circuit diagram showing an example of the circuit configuration of the pixel 210. As described above, each of the plurality of pixels 210 has a photoelectric conversion unit 51, a pixel signal generation unit 52, and an address event detection unit 53.
- the photoelectric conversion unit 51 has a photoelectric conversion element (light receiving element) 511, a transfer transistor 512, and an OFG (Over Flow Gate) transistor 513.
- For the transfer transistor 512, for example, an N-type MOS (Metal Oxide Semiconductor) transistor can be used.
- the transfer transistor 512 and the OFG transistor 513 are connected in series with each other.
- The photoelectric conversion element 511 is connected between the common connection node N1 of the transfer transistor 512 and the OFG transistor 513 and the ground, and photoelectrically converts the incident light to generate an amount of charge corresponding to the amount of incident light.
- A transfer signal TRG is supplied to the gate electrode of the transfer transistor 512 from the drive unit 212.
- In response to the transfer signal TRG, the transfer transistor 512 supplies the charge photoelectrically converted by the photoelectric conversion element 511 to the pixel signal generation unit 52.
- a control signal OFG is supplied from the drive unit 212 to the gate electrode of the OFG transistor 513.
- the OFG transistor 513 supplies the electric signal generated by the photoelectric conversion element 511 to the address event detection unit 53 in response to the control signal OFG.
- the electric signal supplied to the address event detection unit 53 is a photocurrent composed of electric charges.
- the pixel signal generation unit 52 is supplied with the charge photoelectrically converted by the photoelectric conversion element 511 from the photoelectric conversion unit 51 by the transfer transistor 512.
- the electric charge supplied from the photoelectric conversion unit 51 is accumulated in the floating diffusion layer 524.
- the floating diffusion layer 524 generates a voltage signal having a voltage value according to the amount of accumulated charge. That is, the floating diffusion layer 524 converts the charge into a voltage.
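The charge-to-voltage conversion in the floating diffusion layer follows V = Q / C. A minimal numeric sketch (the 1 fF capacitance and the electron count are illustrative assumptions, not values from the disclosure):

```python
ELEMENTARY_CHARGE = 1.602e-19   # charge of one electron, in coulombs
C_FD = 1.0e-15                  # assumed floating-diffusion capacitance (1 fF)

def fd_voltage(n_electrons):
    """Voltage on the floating diffusion for an accumulated electron count."""
    return n_electrons * ELEMENTARY_CHARGE / C_FD

v = fd_voltage(10_000)   # 10 000 accumulated electrons -> about 1.6 V
```

The amplification transistor described next buffers this voltage onto the vertical signal line.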
- Reset transistor 521 is connected between the power supply line of the power supply voltage VDD and the floating diffusion layer 524.
- a reset signal RST is supplied from the drive unit 212 to the gate electrode of the reset transistor 521.
- The reset transistor 521 initializes (resets) the charge amount of the floating diffusion layer 524 in response to the reset signal RST.
- the amplification transistor 522 is connected in series with the selection transistor 523 between the power supply line of the power supply voltage VDD and the vertical signal line VSL.
- The amplification transistor 522 amplifies the voltage signal produced by charge-voltage conversion in the floating diffusion layer 524.
- a selection signal SEL is supplied from the drive unit 212 to the gate electrode of the selection transistor 523.
- the selection transistor 523 outputs the voltage signal amplified by the amplification transistor 522 as a pixel signal SIG to the column processing unit 214 (see FIG. 5) via the vertical signal line VSL.
- When an instruction to start detection of an address event is given by a light receiving control unit (not shown), the drive unit 212 drives the OFG transistor 513 of the photoelectric conversion unit 51 by supplying the control signal OFG, so that the photocurrent is supplied to the address event detection unit 53.
- When an address event is detected in a certain pixel 210, the drive unit 212 turns off the OFG transistor 513 of that pixel 210 and stops the supply of the photocurrent to the address event detection unit 53.
- the drive unit 212 drives the transfer transistor 512 by supplying the transfer signal TRG to the transfer transistor 512, and transfers the charge photoelectrically converted by the photoelectric conversion element 511 to the floating diffusion layer 524.
- The imaging unit 21, which has the pixel array unit 211 in which the pixels 210 having the above configuration are two-dimensionally arranged, outputs only the pixel signals of the pixels 210 in which the occurrence of an event has been detected to the column processing unit 214.
- As a result, the power consumption of the image pickup unit 21, and thus of the image pickup device 122, and the amount of image processing can be reduced as compared with the case where the pixel signals of all pixels are output regardless of the occurrence of an event.
- The current-voltage conversion unit 531 converts the photocurrent from the photoelectric conversion unit 51 of the pixel 210 into a voltage signal that is the logarithm thereof.
- the current-voltage conversion unit 531 supplies the converted voltage signal to the buffer 532.
- the buffer 532 buffers the voltage signal supplied from the current-voltage conversion unit 531 and supplies it to the subtractor 533.
- Transfer unit 535 transfers the detection signal (event data) of the address event supplied from the quantizer 534 to the arbiter unit 213 and the like.
- the transfer unit 535 supplies the arbiter unit 213 with a request for transmitting an address event detection signal. Then, when the transfer unit 535 receives the response to the request from the arbiter unit 213, the transfer unit 535 supplies the detection signal of the address event to the drive unit 212 and the signal processing unit 215.
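The detection chain of units 531 to 534 (logarithmic current-voltage conversion, subtraction against a stored reference, threshold quantization) can be sketched as below. The class and method names, the threshold value, and the reset-on-event behavior are illustrative assumptions rather than the disclosure's exact circuit behavior.

```python
import math

class AddressEventDetector:
    """Toy model of units 531 (log I-to-V), 533 (subtractor), 534 (quantizer)."""

    def __init__(self, threshold=0.2):
        self.threshold = threshold
        self.reference = None     # voltage stored at the previous event

    def process(self, photocurrent):
        v = math.log(photocurrent)         # current-voltage conversion 531
        if self.reference is None:
            self.reference = v             # first sample just sets the reference
            return 0
        diff = v - self.reference          # subtractor 533
        if abs(diff) > self.threshold:     # quantizer 534: threshold comparison
            self.reference = v             # re-arm on each emitted event
            return 1 if diff > 0 else -1
        return 0

det = AddressEventDetector()
out = [det.process(i) for i in (1.0, 1.05, 2.0, 2.0, 1.0)]
```

Only the doubling and halving of the photocurrent cross the threshold, yielding one positive and one negative detection signal.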
- the shape measurement principle of the shape measurement device 10A of the first configuration example according to the shape measurement system of the second embodiment of the present disclosure will be described with reference to FIGS. 9 and 10.
- The three-dimensional shape of the measurement object 100 is detected by using a single light source unit 11A and two light receiving units 12 (that is, the light receiving units 12A and 12B).
- A secondary wave (secondary reflected light) incident on EVS1 and EVS2 (hereinafter, these may be referred to as "EVS1 and 2"), which are the image pickup apparatuses 122, may lead to a measurement (distance measurement) error. Therefore, from the data measured at each of EVS1 and 2 at each time t, the distance to the measurement surface 100A is calculated for each of EVS1 and 2 using a predetermined calculation formula described later. The data giving the same shape in both EVS1 and 2 are taken to be data from the normally reflected light, and the cross-sectional shape of the measurement object 100 is calculated from this normally reflected light by the following calculation formula and captured as a three-dimensional shape. This will be described in detail with reference to FIGS. 10 and 11.
- FIG. 11 shows the optical path of the laser beam oscillated from the LD, which is the light source 111. Since the laser beam is coherent light, for the sake of clarity, specific in-phase (wavefront) portions on the optical path of the laser beam are sequentially indicated by thick black circles and inclined broken lines. Further, in order to clarify the height from the stage 2, contour lines are shown by broken lines.
- the measurement light used in the present configuration example of the present disclosure may be, of course, incoherent light because it is not necessary to use and detect, for example, a phase change.
- The desired height H ≈ Δy / tan θ, that is, equation (3), is derived. Therefore, the three-dimensional coordinates (X0, Ya, Za) of the point C, including the data in the thickness direction of the measurement surface 100A, can be obtained by equations (1) to (3).
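As a worked illustration of equation (3), H ≈ Δy / tan θ: the height follows directly from the apparent shift Δy of the slit line and the known incidence angle θ. The numeric values below are illustrative, not from the disclosure.

```python
import math

def height_from_shift(delta_y_mm, theta_deg):
    """Equation (3): surface height from the apparent lateral shift of the slit light."""
    return delta_y_mm / math.tan(math.radians(theta_deg))

# with theta = 45 degrees, tan(theta) = 1, so a 1 mm shift means a 1 mm height
h = height_from_shift(delta_y_mm=1.0, theta_deg=45.0)
```

A shallower incidence angle (smaller θ) magnifies Δy for the same height, improving height resolution at the cost of shadowing.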
- Alternatively, the configuration may be such that normally reflected light is discriminated and extracted from secondary reflected light by differences in phase difference and polarization plane to generate the three-dimensional shape. A configuration example using the polarization plane will be described in detail later.
- The plane wave (which may be a spherical wave) reaching the measurement surface 100A of the measurement object 100 causes a deviation in the transmission time t of the plane wave to the light receiving unit depending on the height position of the measurement surface 100A (and at the same time causes a deviation due to a phase difference or the like).
- The ranging error Δy included in the above-mentioned arithmetic expressions (2) and (3) is generated.
- The height H of the measured object can be calculated from this time t and the distance measuring error Δy.
- a time difference may be caused in the strength E of the electric field reaching the image pickup apparatus 122, the amount of accumulated charge, and the like.
- the height of the object to be measured 100 may be measured by using the difference in brightness of each pixel, the difference in the amount of accumulated charges, and the like.
- Since EVS1 and 2, which are the two image pickup devices 122A and 122B, receive the normally reflected light (scattered light), a detection signal giving the same shape (height H) is output from both EVS1 and 2, not only from points where the shape (unevenness) change is small but also from points where the change is large.
- FIG. 12 is a schematic diagram showing the output distribution of event data of each pixel at a certain time t1. In this figure, it can be judged that the output state of the event data is completely different between the left half area and the right half area.
- In this configuration example, two image pickup devices 122 are used.
- The height of the convex portion (or the depth of the concave portion) at a given point obtained from both EVS1 and 2 should be uniquely determined; that is, it is physically impossible for the height H at the same point to have a plurality of values. Therefore, the event data at the same point obtained from both EVS1 and 2 should be the same, and the reflected light giving a unique input signal value in both EVS1 and 2 is the normally reflected light.
- Then, the above-mentioned calculation formula is used to perform the calculation based on the data from the normally reflected light. As a result, it is possible to measure the shape with high accuracy even at points where the difference in unevenness is large. Moreover, since EVSs are used for the image pickup devices 122A and 122B, three-dimensional images can be captured at high speed.
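The discrimination step above, keeping only points where the heights computed independently from EVS1 and EVS2 agree, can be sketched as follows. The data structures, tolerance, and averaging are illustrative assumptions.

```python
def keep_consistent(heights_evs1, heights_evs2, tol=0.05):
    """Both inputs: dict mapping (x, y) -> height H computed by one EVS.

    Points where the two sensors disagree are treated as secondary
    (multipath) reflection and discarded; agreeing points are kept.
    """
    result = {}
    for point, h1 in heights_evs1.items():
        h2 = heights_evs2.get(point)
        if h2 is not None and abs(h1 - h2) <= tol:
            result[point] = (h1 + h2) / 2.0   # unique, physically valid H
    return result

# (0, 1) disagrees between the sensors, so it is rejected as secondary light
good = keep_consistent({(0, 0): 1.00, (0, 1): 2.00},
                       {(0, 0): 1.02, (0, 1): 3.50})
```

This is exactly the "unique value" argument made above: a single physical surface cannot have two heights at one point.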
- As the measurement light from the light source 111 in the light source unit 11A toward the measurement object 100, measurement light that follows the shortest optical path is used, and the configuration is such that, for the measurement light directed to the light receiving units 12 after being scattered and reflected at a specific position of the measurement surface 100A, which is the measurement point at each time, two measurement lights (normally reflected lights) reflected and scattered at the same point (same position) and each following a different shortest optical path can be used.
- This reflected light can be used to accurately measure the shape of the measurement surface of the object to be measured 100. In particular, even if deep irregularities are formed in the shape, the shape can be calculated with high accuracy by the above-mentioned calculation formulas (1) to (3).
- Second configuration example: the shape measuring device 10B of the second aspect (hereinafter referred to as the "second configuration example") according to the shape measuring system of the second embodiment of the present disclosure will be described with reference to FIGS. 13 and 14.
- the same parts as those in the first configuration example are designated by the same reference numerals to avoid duplicate explanations.
- This configuration example differs from the first configuration example above in that two light source units, a first light source unit 11B and a second light source unit 11C, are used, and a single light receiving unit (hereinafter referred to as the light receiving unit 12C) is used. It is preferable that the two light source units, the first light source unit 11B and the second light source unit 11C, are installed apart from each other. For example, they can be installed at positions opposite to each other in the lateral (Y) direction with the light receiving unit 12 as the center. A symmetrical arrangement centered on the light receiving unit 12 is not particularly necessary. Further, in this configuration example, the control by the control signal transmission unit 131 of the signal processing unit 13 is also different.
- For the first light source unit 11B, a first semiconductor laser (LD1) 111B that continuously oscillates blue laser light having a blue wavelength (λb) can be used as the light source.
- For the second light source unit 11C, a second semiconductor laser (LD2) 111C having the same wavelength (λb) as the first semiconductor laser (LD1) 111B can be used as the light source.
- The first semiconductor laser (LD1) 111B can be driven and oscillated in a time-division manner with the second semiconductor laser (LD2) 111C by a control signal from the control signal transmission unit 131.
- the cylindrical lens 112 and the scanning mechanism 113 can each have the same configuration as that of the first configuration example.
- With each of the light source units 11B and 11C of this configuration example, a sweep scan is sequentially performed on the measured object 100 in the lateral (Y) direction.
- slit light is sequentially projected to the entire width in the depth (X) direction for each pixel row (column) at time intervals divided into two.
- In each pixel row (column), within each time interval Δt, slit light of blue laser light of a specific wavelength (λb) is projected first from the first light source unit 11B for Δt/2 seconds, and the second light source unit 11C then irradiates blue laser light of the same wavelength (λb) for the remaining Δt/2 seconds.
- These blue laser beams are projected onto the measurement object 100 from the light sources 111B and 111C, which are arranged at different positions.
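The Δt/2 alternation of the two light sources can be modeled as a simple time-division schedule. The function name and the Δt value are illustrative, not from the disclosure.

```python
def active_light_source(t, dt):
    """Return which light source (1 or 2) is on at time t.

    Each interval dt is split in half: source 1 for the first dt/2,
    source 2 for the remaining dt/2, repeating every dt.
    """
    phase = t % dt
    return 1 if phase < dt / 2 else 2

# sample one full cycle with dt = 1.0: 1, 1, 2, 2, then back to 1
schedule = [active_light_source(t, dt=1.0) for t in (0.0, 0.25, 0.5, 0.75, 1.0)]
```

Because the light receiving unit knows this schedule via the synchronization signal, each event can be attributed to the light source that was on when it occurred.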
- the unnecessary signal removing unit 132 removes the electric signal for the secondary reflected light from the electric signal output to the signal processing unit 13.
- Each reflected light that is reflected and scattered on the measurement surface 100A and then follows the shortest path toward the light receiving unit 12 is incident on the light receiving unit 12 as one of two normally reflected lights. Further, there may be reflected light (secondary reflected light) that, after being reflected and scattered on the measurement surface 100A, is, for example, reflected at another portion, travels around another optical path, and is then incident on the light receiving unit 12.
- The data obtained from the two normally reflected lights is unique, as in the first configuration example, so the unnecessary signal related to the secondary reflected light, which generates data different from these, can subsequently be removed by the unnecessary signal removing unit 132.
- The 3D calculation unit 133 uses the above-mentioned calculation formula on these normal electric signals to obtain, from either laser beam, values that should give the same height (Z; or the same depth H) together with the data in the depth (X) direction of the measurement object 100 on the measurement surface 100A in the same pixel row (column).
- Since an EVS is used for the image pickup apparatus 122 as in the first configuration example, even a measured object 100 with large unevenness differences can have its highly accurate three-dimensional shape detected at high speed.
- This configuration example of the present disclosure also uses two light source units, a first light source unit 11D and a second light source unit 11E, but it differs from the second configuration example in that light of different wavelengths is emitted from the two light source units 11D and 11E. Further, in this configuration example as well, as in the second configuration example, the control signal transmission unit 131 of the signal processing unit 13 controls the oscillation operation of the laser of each light source in a time-division manner.
- the first light source unit 11D uses a first semiconductor laser (LD1) 111D that continuously oscillates a blue laser beam having a blue wavelength ( ⁇ b) as a light source.
- For the second light source unit 11E, a second semiconductor laser (LD2) 111E that continuously oscillates red laser light having a red wavelength (λr; λr > λb) is used as the light source.
- Each laser beam may also be distinguished by converting the wavelength (λ) using a semiconductor laser or the like capable of wavelength (λ) conversion. Further, each laser beam may be distinguished by nonlinear frequency conversion means such as second harmonic generation (SHG) or sum frequency generation (SFG).
- As in the previous configuration example, for each pixel row (column), the blue laser light from the first light source unit 11D is irradiated first for Δt/2 seconds, and the second light source unit 11E subsequently irradiates the red laser light for the remaining Δt/2 seconds.
- Thereafter, the same scanning operation is sequentially repeated for each pixel row (column).
- The light receiving unit 12D has a single configuration, and an EVS (Event-based Vision Sensor) similar to that of the second configuration example is used for the image pickup device 122.
- the area corresponding to each pixel row is divided into two according to the operation time interval of each semiconductor laser.
- In the first division region (which may be referred to as the "blue area") in the first half of the area portion corresponding to each pixel of each pixel row, blue filters BF1, BF2, BF3, ... are provided as first filters that transmit only blue light and absorb light of other wavelengths.
- In the second division region (which may be referred to as the "red area") in the latter half of each pixel row, red filters RF1, RF2, RF3, ... are provided as second filters that transmit only red light and absorb the remaining light.
- As a result, only the blue laser light from the first semiconductor laser 111D and only the red laser light from the second semiconductor laser 111E can be admitted to the first-half and second-half areas (blue area and red area), respectively, in each pixel of each pixel row. Therefore, in the light receiving unit 12, even if the blue reflected light and the red reflected light are mixed and superimposed (into purple or the like) through the lens 121 and incident on the single EVS 122 for each pixel row (column), light of unnecessary wavelengths is removed by each color filter, that is, by the blue filters BF1, BF2, BF3, ..., which are the first filters, and the red filters RF1, RF2, RF3, ..., which are the second filters.
- the reflected light (scattered light) emitted from the light source units 11D and 11E and reflected by the measurement surface 100A of the measurement object 100 is incident on the light receiving unit 12D as two types of normal reflected light.
- Laser light of a different wavelength is incident on each designated area, divided in two in units of each pixel row, corresponding to the light of each wavelength. Therefore, even if, for example, red light mixed with the blue light as ambient light attempts to enter a predetermined area covered by the blue filter BF of the EVS pixel array in a certain pixel-row portion, it is blocked by the blue filter BF. Therefore, only the reflected light from the blue laser light can be incident on the predetermined area of the pixel row. The same applies to the pixel-row areas to which the red filter RF is applied. This makes it possible to accurately measure the three-dimensional shape.
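The per-area wavelength selection described above can be sketched as follows. The exact filter band edges are not specified for this configuration example, so the values below (borrowed from typical blue and red bands) are illustrative assumptions, as are the area names.

```python
# assumed passbands, in nanometres; the disclosure does not give exact edges
BLUE_BAND = (380, 500)   # first filter (BF series)
RED_BAND = (580, 760)    # second filter (RF series)

def passes_filter(area, wavelength_nm):
    """True if light of this wavelength passes the filter covering the area."""
    lo, hi = BLUE_BAND if area == "blue" else RED_BAND
    return lo <= wavelength_nm < hi

# 650 nm red ambient light trying to enter the blue area is blocked ...
blocked = not passes_filter("blue", 650)
# ... while each laser reaches only its own designated area
accepted = passes_filter("blue", 450) and passes_filter("red", 650)
```

The filter thus acts as a spatial demultiplexer: each half of a pixel row sees only the light source assigned to it.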
- Next, the shape measuring device 10D of the fourth aspect (hereinafter referred to as the "fourth configuration example") according to the shape measuring system of the second embodiment of the present disclosure will be described with reference to FIGS. 17 to 19. Also in this configuration example, the same parts as those in the previous configuration examples are designated by the same reference numerals to avoid duplicate explanations.
- This configuration example differs from the third configuration example in that semiconductor lasers LD1 and LD2 that emit laser light of the same wavelength (λ) are used as the light sources 111F and 111G of the first light source unit 11F and the second light source unit 11G, respectively, and in that the first light source unit 11F and the second light source unit 11G are provided with polarizing elements 114 and 115 having different predetermined functions. For these LD1 and LD2, completely different types of LDs that emit laser light of different wavelengths, instead of the same wavelength, can also be used.
- the semiconductor laser LD1 oscillates and emits a laser beam having a specific wavelength ( ⁇ ) and having substantially circular polarization (or elliptically polarized light).
- the LD2 also oscillates and emits a substantially circularly polarized (or elliptically polarized) laser light having the same wavelength as the semiconductor laser LD1.
- the wavelength of the laser light emitted from these lasers is not particularly limited, and any wavelength can be selected.
- The first light source unit 11F is provided with the above-mentioned first polarizing element 114 on the optical path (optical axis A1) immediately after LD1, shown in FIG. 18A, in order to emit vertically polarized laser light, and the plane of polarization is shaped from circular polarization (or elliptical polarization) into vertical polarization (hereinafter sometimes referred to as the "P wave").
- In the second light source unit 11G, unlike the third configuration example, in order to emit laterally polarized laser light, the above-mentioned second polarizing element 115 is provided on the optical path (optical axis A2) immediately after LD2, shown in FIG. 18B, and the plane of polarization is shaped from circular polarization (or elliptical polarization) into lateral polarization (hereinafter sometimes referred to as the "S wave").
- In the same pattern as in the third configuration example, the area corresponding to each pixel row is divided into two, that is, into a first division region (hereinafter referred to as the "P wave area"), on which only the P wave is incident, and a second division region (hereinafter referred to as the "S wave area"), on which only the S wave is incident.
- As a result, only the vertically polarized laser light from the first semiconductor laser and only the laterally polarized laser light from the second semiconductor laser can be separated and made incident on the corresponding first-half and second-half areas of each pixel row, respectively. Therefore, in the light receiving unit 12, even if the laser light transmitted through the lens 121 and incident on the single EVS 122 for each pixel row (column) has mixed vertical and horizontal polarization planes, the vertical polarizing film FP and the horizontal polarizing film FS, which function as the respective polarizing elements described above, remove laser light with an unnecessary polarization plane, for example, unnecessary secondary reflected light. As a result, only laser light having the required polarization plane can be selected and made incident on each area of each pixel row of the EVS.
- In this way, only laser light with the designated polarization plane, vertically polarized or laterally polarized, can be incident on the designated areas in each pixel. This makes it possible to detect a highly accurate three-dimensional shape.
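The selection performed by the polarizing films FP and FS can be illustrated with Malus's law, I = I0 · cos²Δφ, where Δφ is the angle between the light's polarization plane and the film's transmission axis. The angles and intensities below are illustrative; the disclosure gives no numeric values.

```python
import math

def transmitted_intensity(i0, light_angle_deg, film_angle_deg):
    """Malus's law for an ideal linear polarizing film."""
    delta = math.radians(light_angle_deg - film_angle_deg)
    return i0 * math.cos(delta) ** 2

# vertical film FP (axis at 0 degrees): the P wave passes, the S wave is blocked
p_through_fp = transmitted_intensity(1.0, 0.0, 0.0)    # P wave, aligned
s_through_fp = transmitted_intensity(1.0, 90.0, 0.0)   # S wave, crossed
```

Secondary reflected light whose polarization plane has been rotated by multipath reflection is attenuated in the same way, which is the removal mechanism described above.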
- EVS is used as the image pickup device, which enables high-speed three-dimensional shape measurement.
- the shape measuring device 10E of the fifth aspect (hereinafter referred to as “fifth configuration example”) according to the shape measuring system of the second embodiment of the present disclosure will be described in detail with reference to FIGS. 20 to 22.
- the same parts as those in the previous configuration example are designated by the same reference numerals to avoid duplicate explanations.
- This configuration example differs from the previous configuration examples in that the light source unit 11H is single and contains an LD 111H that emits white light, the cylindrical lens 112, the scanning mechanism 113, and a prism 116 for spectroscopy, and in that the light receiving unit 12F is also single and is provided, immediately before the lens 121 on the reflected light path toward the light receiving unit 12F, with a color filter 125 that selectively passes light in the wavelength bands of three colors: red (hereinafter sometimes abbreviated as "R"), green (hereinafter sometimes abbreviated as "G"), and blue (hereinafter sometimes abbreviated as "B").
- The color filter 125 is composed of a first filter having a wavelength transmission characteristic that passes only R (hereinafter referred to as "filter R"), a second filter having a wavelength transmission characteristic that passes only G (hereinafter referred to as "filter G"), and a third filter having a wavelength transmission characteristic that passes only B (hereinafter referred to as "filter B").
- The light source 111H preferably has a broad emission wavelength range.
- In this configuration example, a semiconductor laser (hereinafter sometimes referred to as the "white semiconductor laser 111H") that emits white laser light covering the visible region, with an oscillation wavelength range of 380 nm to 760 nm, is used.
- As long as it emits broadband light, the light source 111H may instead be, for example, a white LED in which a blue LED is coated with a phosphor that converts part of the blue light into green, yellow, or red light; a high-brightness, directional source is preferable.
- The prism 116 of this configuration example splits the incident light roughly into three wavelength bands (R, G, and B).
- These three types of incident light α are blue incident light of about 380 to 500 nm (hereinafter, B light: αB), green incident light of 500 to 580 nm (hereinafter, G light: αG),
- and red incident light of 580 to 760 nm (hereinafter, R light: αR).
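As a check on the band boundaries above, the split can be expressed as a small classifier. This is only a sketch: the text describes the split as rough, so assigning the exact boundary wavelengths (500 nm and 580 nm) to the shorter-wavelength band is an arbitrary assumption made here.

```python
def rgb_band(wavelength_nm: float) -> str:
    """Classify a wavelength into the three bands produced by the prism 116.

    Ranges follow the text (380-500 nm: B light, 500-580 nm: G light,
    580-760 nm: R light). Boundary values go to the shorter-wavelength
    band by assumption, since the text calls the split rough.
    """
    if not 380.0 <= wavelength_nm <= 760.0:
        raise ValueError("outside the 380-760 nm range of the white source")
    if wavelength_nm <= 500.0:
        return "B"
    if wavelength_nm <= 580.0:
        return "G"
    return "R"
```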
- The light receiving unit 12F basically has the same configuration as the first and second configuration examples, but additionally includes the color filter 125 shown in FIGS. 20 and 22, as described above. Normally, light of a given wavelength (the light incident on the light receiving unit 12F) follows Fermat's principle (the principle of least action): it travels toward the measurement surface of the measurement object along a unique path of shortest optical distance (a uniquely determined minimum optical path). The basic principle is that the reflected light then returns along that same shortest optical path as the incident light, in the reverse direction (reversibility of light).
- Therefore, the normally incident light becomes normal reflected light and is incident on the corresponding pixel column of the EVS 122. That is, the normal reflected light of each RGB color is incident on the area of the same wavelength in the three-way divided region provided with the corresponding part of the RGB color filter 125 (that is, filter R, filter G, or filter B).
- The color filter 125 can be unitized or modularized as a light-receiving function unit by fixing it in a housing 14 (see FIG. 20) that houses the light receiving unit 12. If the color filter 125 is a thin film, it may instead be attached directly to the lens 121.
- The lens 121 immediately behind the color filter 125 must forward the light transmitted through each color filter portion, unchanged, to the corresponding pixel of the pixel array, so a lens with a strong collimating function is preferable; for example, a SELFOC (registered trademark) lens or a collimator can be used.
- For example, the B-component return light that is incident on and transmitted through filter B of the color filter 125 passes through the lens 121 as collimated light and can then be incident on the corresponding pixel of the EVS 122 pixel array without shifting in the depth (X) direction.
- The color filter 125 provides the same effect for the return light of the other color components (the R component and the G component).
- Therefore, according to this configuration example, as shown in FIGS. 20 and 22, the light of each RGB wavelength that is emitted from the light source unit 11H, split by the prism, and projected onto the measurement object 100 is reflected by the measurement surface 100A and then enters the color filter 125.
- As shown in FIG. 21, the slit light of 380 to 760 nm laser light emitted from the light source unit 11H and split into RGB is reflected by the measurement surface 100A. Of the reflected light, the normal reflected light returns toward the color filter 125 along the shortest path, that is, the path of shortest optical distance compared with the other reflected light.
- Each of the RGB normal reflected beams incident on the color filter 125 may be incident not only on the area of the same color component but also on the areas of the other color components.
- For example, when the B light αB is incident on an area with a color component different from filter B (that is, filter R or filter G),
- it is absorbed by the filter R or filter G in that area and therefore does not reach the pixels corresponding to R light or G light.
- In other words, even if return light of a different RGB color component becomes stray light and strays into another RGB area, it is absorbed without passing through the color filter 125.
- According to the fifth configuration example, only the normal reflected light from the specific area of the measurement surface corresponding to each pixel can be incident on each pixel in each pixel column. There is therefore no possibility that data on the shape of other areas of the measurement surface is mixed in, and accurate shape measurement can be performed. Moreover, image information is generated from the normal reflected light incident from each reflecting surface, so an accurate shape of the measurement surface can be generated even when its unevenness varies greatly.
- The RGB light split from the light source unit 11H by the prism 116 travels along different optical paths and is reflected and scattered by the measurement surface 100A of the measurement object 100.
- Of the reflected (scattered) light, each beam that follows the shortest path toward the light receiving unit 12F is incident on the light receiving unit 12F as the normal reflected light of R, G, or B.
- The unnecessary signal relating to secondary reflected light, which would generate data different from this data, can subsequently be removed by the unnecessary signal removing unit 132.
- Secondary reflected light of the same color component may be admitted to the area of that color component by the same action that the color filter 125 exerts on the normal reflected light. However, even if an unnecessary signal is generated from such secondary reflected light, that signal can be removed by the unnecessary signal removing unit 132 of the signal processing unit 13, as described above. As a result, even if secondary reflected light originating from the other RGB beams, which could cause erroneous data, enters one of the RGB pixel areas of the EVS 122, it can be effectively removed.
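The document does not give an algorithm for the unnecessary signal removing unit 132, so the following is only a hypothetical sketch of one plausible rule: rejecting events whose pixel column does not match the column the scan is currently illuminating. The event fields (`"col"`) and the `tolerance` parameter are illustrative assumptions, not part of the disclosure.

```python
def remove_unnecessary(events, active_column, tolerance=1):
    """Keep only events near the currently illuminated pixel column.

    `events` is a list of dicts with an (assumed) 'col' field; anything
    farther than `tolerance` columns from the active column is treated
    as stray secondary reflection and discarded.
    """
    return [ev for ev in events if abs(ev["col"] - active_column) <= tolerance]
```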
- The shape measurement system of the present disclosure is not limited to measuring the shapes of a large number of products sent out one after another in a factory or the like. The present disclosure may also be used to capture changes over time when observing, monitoring, or measuring a single static or dynamic object or a specific target area. For example, it can be applied as a fixed-point observation/monitoring system, such as a surveillance camera in a station yard, a parking-lot camera, or an in-vehicle camera.
- The present disclosure may also have the following configurations.
- (1) A shape measurement system comprising: a light source unit that emits measurement light toward a measurement target;
- a light receiving unit that receives the reflected light of the measurement light reflected from the measurement target; a signal processing unit having a three-dimensional shape calculation unit that calculates the three-dimensional shape of the measurement target from event data generated from the reflected light reflected by the measurement target and incident on the light receiving unit, and an unnecessary signal removal unit that removes unnecessary signals generated when unnecessary light other than the reflected light enters the light receiving unit from outside;
- and a scanning mechanism that scans the measurement target by moving the light source unit or the measurement target so as to sweep the measurement light projected from the light source unit over the measurement target.
- the scanning mechanism is provided in the light source unit.
- the light source unit includes a light source and a waveform control lens that controls the waveform of light from the light source.
- the light receiving unit is an EVS (Event-based Vision Sensor), which is an asynchronous image pickup element that captures a temporal change of the light receiving lens and the reflected light transmitted through the light receiving lens among the reflected light and outputs an event signal.
- the present invention comprises an event issuing unit that detects an event based on the output data from the EVS and outputs the event data, and a transmission unit that outputs the event data to the signal processing unit. Shape measurement system.
- (3) The shape measurement system according to (2) above, wherein the light source unit is single, with a single light source,
- and a first light receiving unit and a second light receiving unit installed at positions offset from each other are provided as the light receiving unit.
- (4) The shape measurement system according to (2) above, wherein a first light source unit and a second light source unit installed at positions offset from each other are provided as the light source unit,
- the light receiving unit is single,
- and the signal processing unit includes a control signal transmission unit that outputs a control signal to each scanning mechanism at time-division intervals for each pixel column.
- (5) The shape measurement system according to (2) above, wherein the light source unit includes a first light source unit with a first light source that emits light of a first wavelength and a second light source unit with a second light source that emits light of a second wavelength,
- the light receiving unit is single,
- the EVS consists of a region divided per pixel column into a first divided area and a second divided area,
- the first divided area includes a first filter having a first wavelength transmission characteristic that, of the measurement light of the two mutually different wavelengths emitted from the light sources, selectively transmits the reflected light of the first wavelength out of the first-wavelength light and the second-wavelength reflected light reflected by the measurement target, and the second divided area includes a second filter having a second wavelength transmission characteristic that selectively transmits the light of the second wavelength out of the reflected light.
- (6) The shape measurement system according to (5) above, wherein the light of the first wavelength is blue light,
- and the light of the second wavelength is red light.
- (7) The shape measurement system according to (2) above, wherein the light source unit includes a first light source unit provided with the first light source and a second light source unit provided with a second light source,
- the first light source unit includes a first optical element that generates, from the light emitted by the first light source, a first measurement light having a first polarization plane,
- the second light source unit includes a second optical element that generates, from the light emitted by the second light source, a second measurement light having a second polarization plane,
- the light receiving unit is single,
- the EVS is divided per pixel column into a first divided area and a second divided area,
- the first divided area includes a first film that, of the first reflected light having the first polarization plane and the second reflected light having the second polarization plane, which are the two measurement lights reflected by the measurement target and respectively incident on the light receiving unit, selectively transmits the first reflected light, and the second divided area includes a second film that selectively transmits the second reflected light.
- (8) The shape measurement system according to (7) above, wherein the first polarization plane is linearly polarized light oscillating vertically, and the second polarization plane is linearly polarized light oscillating horizontally.
- (9) The shape measurement system according to (2) above, wherein the light source unit is single, with a single light source,
- the light source unit includes a splitting means that splits the light from the light source into at least three wavelength bands,
- the light receiving unit is single,
- the EVS consists of a region divided per pixel column into first through third divided areas; the first divided area includes a first color filter having a wavelength transmission characteristic that, of the reflected light of the three wavelength bands reflected by the measurement target out of the measurement light of the three mutually different wavelength bands emitted from the light source, selectively transmits the reflected light of the first wavelength band,
- the second divided area includes a second color filter having a wavelength transmission characteristic that selectively transmits the reflected light of the second wavelength band, and the third divided area includes a third color filter having a wavelength transmission characteristic that selectively transmits the reflected light of the third wavelength band.
- (10) The shape measurement system according to (9) above, wherein the splitting means is a prism that splits the white light from the light source into light of three wavelength bands,
- and the light of the three wavelength bands is light of the red wavelength band, light of the green wavelength band, and light of the blue wavelength band.
- (11) The shape measurement system according to (2) above, wherein the EVS outputs an event signal by comparing a signal voltage based on the temporal change of the reflected light with a threshold voltage and determining whether it is smaller or larger than the threshold voltage.
Abstract
Description
The light receiving unit may include a light receiving lens; an EVS (Event-based Vision Sensor), an asynchronous image sensor that captures the temporal change of the portion of the reflected light transmitted through the light receiving lens and outputs an event signal; an event issuing unit that detects an event based on the output data from the EVS and outputs event data; and a transmission unit that outputs the event data to the signal processing unit.
As the light receiving unit, a first light receiving unit and a second light receiving unit installed at positions offset from each other may be provided.
The light receiving unit may be single, and the signal processing unit may include a control signal transmission unit that outputs a control signal to each scanning mechanism at time-division intervals for each pixel column.
The light receiving unit may be single; the EVS may consist of a region divided per pixel column into a first divided area and a second divided area; the first divided area may include a first filter having a first wavelength transmission characteristic that, of the measurement light of two mutually different wavelengths emitted from the light sources, selectively transmits the reflected light of the first wavelength out of the first-wavelength light and the second-wavelength reflected light reflected by the measurement target; and the second divided area may include a second filter having a second wavelength transmission characteristic that selectively transmits the light of the second wavelength out of the reflected light.
The first light source unit may include a first optical element that generates, from the light emitted by the first light source, a first measurement light having a first polarization plane; the second light source unit may include a second optical element that generates, from the light emitted by the second light source, a second measurement light having a second polarization plane; the light receiving unit may be single; the EVS may be divided per pixel column into a first divided area and a second divided area; the first divided area may include a first film that, of the first reflected light having the first polarization plane and the second reflected light having the second polarization plane, which are the two measurement lights reflected by the measurement target and respectively incident on the light receiving unit, selectively transmits the first reflected light; and the second divided area may include a second film that selectively transmits the second reflected light.
The second polarization plane may be linearly polarized light oscillating horizontally.
The light receiving unit may be single; the EVS may consist of a region divided per pixel column into first through third divided areas; the first divided area may include a first filter having a wavelength transmission characteristic that, of the reflected light of the three wavelength bands reflected by the measurement target out of the measurement light of three mutually different wavelength bands emitted from the light source, selectively transmits the reflected light of the first wavelength band; the second divided area may include a second filter having a wavelength transmission characteristic that selectively transmits the reflected light of the second wavelength band; and the third divided area may include a third filter having a wavelength transmission characteristic that selectively transmits the reflected light of the third wavelength band.
The light of the three wavelength bands may be light of the red wavelength band, light of the green wavelength band, and light of the blue wavelength band.
1. Shape measurement system according to the first embodiment of the present disclosure
2. Shape measurement system according to the second embodiment of the present disclosure
2-1. First configuration example: single light source + multiple light receiving units
2-2. Second configuration example: multiple light sources + single light receiving unit (time-division light source control)
2-3. Third configuration example: multiple light sources + single light receiving unit (wavelength-division control)
2-4. Fourth configuration example: multiple light sources + single light receiving unit (polarization switching control)
2-5. Fifth configuration example: single light source + single light receiving unit (spectral splitting by a prism)
3. Configurations the present disclosure can take
FIG. 1 is a schematic diagram of the shape measurement system according to the first embodiment of the present disclosure, and FIG. 2 is a block diagram showing the schematic configuration of the shape measurement system of the present disclosure. In FIG. 1 and the subsequent drawings, to make clear the depth direction of the measurement object, the scan and movement direction, and the height direction of the measurement object, a three-dimensional Cartesian coordinate system of mutually orthogonal X, Y, and Z axes is set. The present disclosure adopts a left-handed system, and the origin is placed at one of the four corners of the measurement object.
The shape measurement system 1 of the present disclosure broadly comprises a stage 2; a shape measuring device 10 that three-dimensionally measures the measurement surface 100A of a measurement object 100 placed statically on the stage 2; a movement mechanism 3 constituting a scanning mechanism that moves the shape measuring device 10 intermittently in the Y direction; a control unit 4 that controls the movement mechanism 3; and a signal processing unit 13 (hereinafter, the "signal processing unit") that performs signal processing for the shape measuring device 10. As shown in FIG. 2, the shape measuring device 10 of this embodiment includes a light source unit 11 and a light receiving unit 12 and is housed inside a head H.
The shape measuring device 10 of the shape measurement system 1 of the present disclosure three-dimensionally measures the shape of the measurement object 100 in length (Y), depth (X), and height (Z). As the measurement means of the present disclosure, slit light generated by the light source unit 11 in the head H, in the form of a wide band or a fan spreading in a taper (hereinafter sometimes called "measurement light"), is used. The shape of the slit light is not limited to the above; it may be any shape suited to the measurement object or measurement surface and effective for shape measurement.
(2-1. First configuration example)
Next, the shape measuring devices provided in the shape measurement system of the second embodiment of the present disclosure will be described in detail with reference to the drawings.
FIG. 3 shows the first aspect of the second embodiment according to the present disclosure (hereinafter, the "first configuration example"); that is, FIG. 3 shows the shape measuring device 10A of the first configuration example, and in particular the configuration of its light source unit 11, light receiving unit 12, and signal processing unit 13. The same parts as in FIGS. 1 and 2 are given the same reference numerals to avoid duplicate explanation.
As described above, the light source unit 11 of this configuration example (hereinafter, "light source unit 11A") is single; specifically, as shown in FIG. 3, it includes a light source 111A, a waveform control lens 112, and a scanning mechanism 113. In the following description of this configuration example, the suffix "A" is omitted for the light source 111A for simplicity.
In this configuration example of the present disclosure, the first light receiving unit 12A specifically includes a first light receiving lens 121A, a first imaging device 122A, a first event issuing unit 123A, and a first transmission unit 124A, while the second light receiving unit 12B includes a second light receiving lens 121B, a second imaging device 122B, a second event issuing unit 123B, and a second transmission unit 124B. In the following, for simplicity, only the three-digit numerals of the reference signs are written and the suffixes "A" and "B" are omitted.
The signal processing unit 13 of this embodiment detects and stores three-dimensional data of the measurement object 100 at high speed based on the data obtained from the light receiving unit 12. As described above, the signal processing unit 13 includes a control signal transmission unit 131, an unnecessary signal removal unit 132, a 3D calculation unit 133, a memory 134, and so on.
Next, the imaging unit 21 provided with the imaging device 122 of the present disclosure will be described with reference to FIGS. 5 to 8.
A configuration example of the imaging unit 21, including the imaging device 122 of the light receiving unit 12 of the first configuration example, is specifically described below. FIG. 5 is a block diagram showing an example of the configuration of the imaging unit 21.
In the imaging unit 21 configured as above, a plurality of pixels 210 are two-dimensionally arranged in a matrix (array) in the pixel array unit 211. A vertical signal line VSL, described later, is wired to each pixel column of this matrix.
The drive unit 212 drives each of the pixels 210 and causes the pixel signal generated by each pixel 210 to be output to the column processing unit 214.
The arbiter unit 213 arbitrates the requests from the pixels 210 and sends a response based on the arbitration result to each pixel 210. A pixel 210 that receives a response from the arbiter unit 213 supplies event data (an address-event detection signal) indicating the detection result to the drive unit 212 and the signal processing unit 215. Event data can also be read out from the pixels 210 in multiple rows at a time.
The column processing unit 214 consists of, for example, analog-to-digital converters, and for each pixel column of the pixel array unit 211 it converts the analog pixel signals output from the pixels 210 of that column into digital signals, which it supplies to the signal processing unit 215.
The signal processing unit 215 performs predetermined signal processing, such as CDS (Correlated Double Sampling) and image recognition, on the digital signals supplied from the column processing unit 214, and outputs the processing results and the event data supplied from the arbiter unit 213 via the signal line 216.
FIG. 6 is a block diagram showing an example of the configuration of the pixel array unit 211.
In a pixel 210 of this configuration, the photoelectric conversion unit 51 photoelectrically converts incident light to generate a photocurrent and, under the control of the drive unit 212 (see FIG. 5), supplies the generated photocurrent to either the pixel signal generation unit 52 or the address event detection unit 53.
The pixel signal generation unit 52 generates a voltage signal corresponding to the photocurrent supplied from the photoelectric conversion unit 51 as a pixel signal SIG and supplies the generated pixel signal SIG to the column processing unit 214 (see FIG. 5) via the vertical signal line VSL.
The address event detection unit 53 detects whether an address event (hereinafter sometimes simply "event") has occurred, based on whether the amount of change in the photocurrent from the photoelectric conversion unit 51 exceeds a predetermined threshold. Address events consist of, for example, an on event indicating that the amount of change has exceeded the upper threshold and an off event indicating that the amount of change has fallen below the lower threshold. The address-event detection signal consists of, for example, one bit indicating the on-event detection result and one bit indicating the off-event detection result. The address event detection unit 53 can also be configured to detect only on events.
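The on/off event rule described above can be sketched as follows. This is a simplified functional model, not the actual circuit behavior; passing both thresholds as positive magnitudes is an assumption made here for clarity.

```python
def detect_address_event(delta_photocurrent, on_threshold, off_threshold):
    """Return the two-bit (on, off) detection pair described above.

    An on event fires when the change exceeds the upper threshold; an off
    event fires when the change falls below the lower (negative) threshold.
    Both thresholds are given as positive magnitudes by assumption.
    """
    on_bit = 1 if delta_photocurrent > on_threshold else 0
    off_bit = 1 if delta_photocurrent < -off_threshold else 0
    return (on_bit, off_bit)
```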
FIG. 7 is a circuit diagram showing an example of the circuit configuration of a pixel 210. As described above, each of the pixels 210 has a photoelectric conversion unit 51, a pixel signal generation unit 52, and an address event detection unit 53.
In the pixel 210 configured as above, the photoelectric conversion unit 51 has a photoelectric conversion element (light receiving element) 511, a transfer transistor 512, and an OFG (Over Flow Gate) transistor 513. For the transfer transistor 512 and the OFG transistor 513, for example, N-type MOS (Metal Oxide Semiconductor) transistors can be used; the two are connected in series with each other.
The photoelectric conversion element 511 is connected between the common connection node N1 of the transfer transistor 512 and the OFG transistor 513 and ground, and photoelectrically converts incident light to generate a charge whose amount corresponds to the amount of incident light.
A transfer signal TRG is supplied to the gate electrode of the transfer transistor 512 from the drive unit 212 shown in FIG. 5. In response to the transfer signal TRG, the transfer transistor 512 supplies the charge photoelectrically converted by the photoelectric conversion element 511 to the pixel signal generation unit 52.
A control signal OFG is supplied to the gate electrode of the OFG transistor 513 from the drive unit 212. In response to the control signal OFG, the OFG transistor 513 supplies the electrical signal generated by the photoelectric conversion element 511 to the address event detection unit 53. The electrical signal supplied to the address event detection unit 53 is a photocurrent of charge.
The pixel signal generation unit 52 has a reset transistor 521, an amplification transistor 522, a selection transistor 523, and a floating diffusion layer 524. For the reset transistor 521, the amplification transistor 522, and the selection transistor 523, for example, N-type MOS transistors can be used.
The amplification transistor 522 is connected in series with the selection transistor 523 between the power supply line of the supply voltage VDD and the vertical signal line VSL, and amplifies the voltage signal obtained by charge-to-voltage conversion in the floating diffusion layer 524.
A selection signal SEL is supplied to the gate electrode of the selection transistor 523 from the drive unit 212. In response to the selection signal SEL, the selection transistor 523 outputs the voltage signal amplified by the amplification transistor 522 as the pixel signal SIG to the column processing unit 214 (see FIG. 5) via the vertical signal line VSL.
When the occurrence of an event is detected in a certain pixel 210, the drive unit 212 turns off the OFG transistor 513 of that pixel 210 to stop the supply of photocurrent to the address event detection unit 53. The drive unit 212 then supplies the transfer signal TRG to the transfer transistor 512 to drive it, transferring the charge photoelectrically converted by the photoelectric conversion element 511 to the floating diffusion layer 524.
FIG. 8 is a block diagram showing an example of the configuration of the address event detection unit 53. As shown in the figure, the address event detection unit 53 of this configuration example has a current-voltage conversion unit 531, a buffer 532, a subtractor 533, a quantizer 534, and a transfer unit 535.
The current-voltage conversion unit 531 converts the photocurrent from the photoelectric conversion unit 51 of the pixel 210 into a voltage signal proportional to its logarithm and supplies the converted voltage signal to the buffer 532. The buffer 532 buffers the voltage signal supplied from the current-voltage conversion unit 531 and supplies it to the subtractor 533.
A row drive signal is supplied to the subtractor 533 from the drive unit 212. In accordance with the row drive signal, the subtractor 533 lowers the level of the voltage signal supplied from the buffer 532 and supplies the lowered voltage signal to the quantizer 534. The quantizer 534 quantizes the voltage signal supplied from the subtractor 533 into a digital signal and outputs it to the transfer unit 535 as the address-event detection signal (event data).
The transfer unit 535 transfers the address-event detection signal (event data) supplied from the quantizer 534 to the arbiter unit 213 and the like. When the occurrence of an event is detected, the transfer unit 535 supplies the arbiter unit 213 with a request to transmit the address-event detection signal; on receiving the response to the request from the arbiter unit 213, the transfer unit 535 supplies the address-event detection signal to the drive unit 212 and the signal processing unit 215.
Next, the shape measurement principle of the shape measuring device 10A of the first configuration example, according to the shape measurement system of the second embodiment of the present disclosure, will be described with reference to FIGS. 9 and 10.
As described above, the first configuration example of the present disclosure detects the three-dimensional shape of the measurement object 100 (measurement surface 100A) using the single light source unit 11A and the two light receiving units 12 (that is, light receiving units 12A and 12B).
X = the X coordinate at an arbitrary point in the depth direction ... (1)
Y = (scan speed of the scanning mechanism) · (operating time of the scanning mechanism) - (ranging error Δy in the Y direction) ... (2)
Z = (ranging error Δy in the Y direction) / (tan θ) ... (3)
where θ is the inclination angle of the light source with respect to the vertical.
FIG. 11 shows the optical path of the laser light oscillated by the LD, that is, the light source 111 in the light source unit. Since laser light is coherent, for clarity of explanation, portions of identical phase (wavefronts) along the optical path are shown successively by thick black dots and inclined dashed lines, and contour lines of height above the stage 2 are shown by dashed lines. Since this configuration example of the present disclosure does not need to use or detect, for example, phase changes in the measurement light, incoherent light may of course also be used.
tan θ = Δy / H ... (4)
(where H is the thickness of the measurement object 100)
holds. From equation (4), the desired height H = Δy / tan θ, that is, equation (3), is derived. Equations (1) to (3) therefore give the three-dimensional coordinates (X0, Ya, Za) of point C, including data in the thickness direction of the measurement surface 100A. Although the X coordinate of point C is taken here as X = 0, the method is not limited to this coordinate position; any position coordinate in the X direction of the measurement object 100 on the stage 2 can be applied.
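Equations (1) to (3) can be collected into a small helper. This is a sketch only; units and the sign convention for Δy follow the equations above, and θ is taken in radians.

```python
import math

def point_coordinates(x, scan_speed, scan_time, delta_y, theta):
    """Apply equations (1)-(3): X is the chosen depth coordinate, Y is the
    scan position corrected by the ranging error delta_y, and the height is
    Z = delta_y / tan(theta), where theta is the inclination angle of the
    light source with respect to the vertical (in radians)."""
    y = scan_speed * scan_time - delta_y   # equation (2)
    z = delta_y / math.tan(theta)          # equation (3)
    return (x, y, z)                       # equation (1): X passed through
```

For example, with θ = 45 degrees the height simply equals Δy.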
Therefore, according to this configuration example, by receiving the normal reflected light (scattered light), the two imaging devices 122A and 122B (EVS1 and EVS2) both output detection signals giving the same shape (height H), not only where the shape change is small but also from points where the shape (unevenness) change is large. FIG. 12 is a schematic diagram showing the output distribution of the event data of each pixel at a certain time t1; in this figure it can at least be seen that the event-data output states of the left-half and right-half areas are completely different.
Next, the shape measuring device 10B of the second aspect (hereinafter, the "second configuration example") according to the shape measurement system of the second embodiment of the present disclosure will be described with reference to FIGS. 13 and 14. The same parts as in the first configuration example are given the same reference numerals to avoid duplicate explanation.
In this configuration example, the drive time is thus divided equally in two between the first and second semiconductor lasers 111B and 111C. That is, the slit light projected onto each pixel column is projected twice in total, once each, in the order of the laser light from the first semiconductor laser and then the laser light from the second semiconductor laser. These laser beams are repeated cyclically in sequence from the start point of the measurement object 100 in the lateral (Y) direction (Y = 0; origin O) to the final lateral point (Y = L, where L is the full lateral length of the measurement object 100).
Next, the shape measuring device 10C of the third aspect (hereinafter, the "third configuration example") according to the shape measurement system of the second embodiment of the present disclosure will be described with reference to FIGS. 15 and 16. In this configuration example too, the same parts as in the previous configuration examples are given the same reference numerals to avoid duplicate explanation.
In this configuration example, the drive time is thus divided equally in two between the first and second semiconductor lasers 111D and 111E. That is, the slit light for each column is projected color-coded, in the order blue then red. These two colors of laser light are repeated cyclically in sequence from the start point of the measurement object 100 in the lateral (Y) direction (Y = 0; origin O) to the final lateral point (Y = L, where L is the full lateral length of the measurement object 100).
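The equal two-way time split per column can be sketched as a simple drive schedule. This is illustrative only; the laser labels and the single `column_period` parameter are assumptions, not values from the disclosure.

```python
def column_drive_schedule(column_period):
    """Return the per-column drive schedule: each of the two semiconductor
    lasers is driven once, in fixed order (blue then red), for exactly half
    the column period, i.e. the equal two-way division of the drive time
    described above."""
    half = column_period / 2.0
    return [("LD_blue", half), ("LD_red", half)]
```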
Next, the shape measuring device 10D of the fourth aspect (hereinafter, the "fourth configuration example") according to the shape measurement system of the second embodiment of the present disclosure will be described with reference to FIGS. 17 to 19. In this configuration example too, the same parts as in the previous configuration examples are given the same reference numerals to avoid duplicate explanation.
The semiconductor laser LD1 oscillates and emits almost circularly polarized (or possibly elliptically polarized) laser light of a specific wavelength (λ). LD2 likewise oscillates and emits almost circularly polarized (or elliptically polarized) laser light of the same wavelength as LD1. The wavelengths of the light emitted from these lasers are not particularly limited; any wavelength can be selected.
On the light receiving side, in the EVS of the light receiving unit 12E of this configuration example, as shown in FIG. 19, the area corresponding to each pixel column is divided, in the same pattern as in the third configuration example, into two regions, a first divided area and a second divided area, so that only the P wave and only the S wave are incident, respectively. Specifically, in the first divided area (hereinafter, "P-wave area") and the second divided area (hereinafter, "S-wave area") corresponding to each pixel of each pixel column of the EVS, films having the same functions as the first polarizer 114 and the second polarizer 115 are installed to match each area: vertical polarizing films FP1, FP2, FP3, ... as the first film and horizontal polarizing films FS1, FS2, FS3, ... as the second film.
Next, the shape measuring device 10E of the fifth aspect (hereinafter, the "fifth configuration example") according to the shape measurement system of the second embodiment of the present disclosure will be described in detail with reference to FIGS. 20 to 22. In this configuration example too, the same parts as in the previous configuration examples are given the same reference numerals to avoid duplicate explanation.
The light source 111H preferably has a broad emission wavelength range. In this configuration example, a semiconductor laser (hereinafter sometimes the "white semiconductor laser 111H") that emits white laser light covering the visible region, with an oscillation wavelength range of 380 nm to 760 nm, is used. As long as it emits broadband light, the light source 111H may instead be, for example, a white LED in which a blue LED is coated with a phosphor that converts part of the blue light into green, yellow, or red light; a high-brightness, directional source is preferable.
The light receiving unit 12F basically has the same configuration as the first and second configuration examples, but additionally includes the color filter 125 shown in FIGS. 20 and 22, as described above. Normally, as is well known, light of a given wavelength (the light incident on the light receiving unit 12F) follows Fermat's principle (the principle of least action): it travels toward the measurement surface of the measurement object along a unique path of shortest optical distance (a uniquely determined minimum optical path). The basic principle is that the reflected light then returns along that same shortest optical path as the incident light, in the reverse direction (reversibility of light).
Therefore, according to this configuration example, as shown in FIGS. 20 and 22, the light of each RGB wavelength that is emitted from the light source unit 11H, split by the prism, and projected onto the measurement object 100 is reflected by the measurement surface 100A and then enters the color filter 125. As shown in FIG. 21, the slit light of 380 to 760 nm laser light emitted from the light source unit 11H and split into RGB is reflected by the measurement surface 100A, and of the reflected light, the normal reflected light returns toward the color filter 125 along the shortest path, that is, the path of shortest optical distance compared with the other reflected light.
1 Shape measurement system
2 Stage (measurement table)
3 Movement mechanism (scanning mechanism)
4 Control unit
10, 10A to 10E Shape measuring device
11 Light source unit
11A First light source unit
11B Second light source unit
11D First light source unit
11E Second light source unit
11F Light source unit
111 Light source: semiconductor laser (LD)
111A First semiconductor laser (LD1)
111B Second semiconductor laser (LD2)
111C Third semiconductor laser (LD3)
111F LD
112 Cylindrical lens (waveform forming lens, waveform control lens)
113 Scanning mechanism
114 First polarizer (optical element)
115 Second polarizer (optical element)
116 Prism
12, 12C, 12D, 12E, 12F Light receiving unit
12A First light receiving unit
12B Second light receiving unit
121 Light receiving lens
122, 122A, 122B Asynchronous (non-scanning) imaging device (image sensor) (EVS, EVS1, EVS2)
123 Event issuing unit
124 Transmission unit
13 Signal processing unit
131 Control signal transmission unit
132 Unnecessary signal removal unit
133 3D calculation unit
134 Memory
21 Imaging unit
211 Pixel array unit
212 Drive unit
213 Arbiter unit
214 Column processing unit
215 Signal processing unit
210 Pixel
41 System control unit
42 Movement control unit
51 Photoelectric conversion unit
52 Pixel signal generation unit
53 Address event detection unit
513 OFG transistor
531 Current-voltage conversion unit
532 Buffer
533 Subtractor
534 Quantizer
535 Transfer unit
100 Measurement object
100A Measurement surface
A1, A2 Optical axes
BF Blue filter (first filter)
RF Red filter (second filter)
FP Vertical polarizing film (first film)
FS Horizontal polarizing film (second film)
FR Filter R (first filter)
FG Filter G (second filter)
FB Filter B (third filter)
H Head
X Depth direction
Y Lateral direction
Z Height direction
α Incident light (measurement light)
αR Red incident light (R light)
αG Green incident light (G light)
αB Blue incident light (B light)
β Reflected light
β1 Reflected light
β2 Reflected light
Claims (11)
- 1. A shape measurement system comprising: a light source unit that emits measurement light toward a measurement target; a light receiving unit that receives the reflected light of the measurement light reflected from the measurement target; a signal processing unit having a three-dimensional shape calculation unit that calculates the three-dimensional shape of the measurement target from event data generated from the reflected light reflected by the measurement target and incident on the light receiving unit, and an unnecessary signal removal unit that removes unnecessary signals generated when unnecessary light other than the reflected light enters the light receiving unit from outside; and a scanning mechanism that scans the measurement target by moving the light source unit or the measurement target so as to sweep the measurement light projected from the light source unit over the measurement target.
- 2. The shape measurement system according to claim 1, wherein the scanning mechanism is provided in the light source unit; the light source unit includes a light source and a waveform control lens that controls the waveform of the light from the light source; and the light receiving unit includes a light receiving lens, an EVS (Event-based Vision Sensor), an asynchronous image sensor that captures the temporal change of the portion of the reflected light transmitted through the light receiving lens and outputs an event signal, an event issuing unit that detects an event based on the output data from the EVS and outputs event data, and a transmission unit that outputs the event data to the signal processing unit.
- 3. The shape measurement system according to claim 2, wherein the light source unit is single, and a first light receiving unit and a second light receiving unit installed at positions offset from each other are provided as the light receiving unit.
- 4. The shape measurement system according to claim 2, wherein a first light source unit and a second light source unit installed at positions offset from each other are provided as the light source unit; the light receiving unit is single; and the signal processing unit includes a control signal transmission unit that outputs a control signal to each scanning mechanism at time-division intervals for each pixel column.
- 5. The shape measurement system according to claim 2, wherein the light source unit includes a first light source unit with a first light source that emits light of a first wavelength and a second light source unit with a second light source that emits light of a second wavelength; the light receiving unit is single; the EVS consists of a region divided per pixel column into a first divided area and a second divided area; the first divided area includes a first filter having a first wavelength transmission characteristic that, of the measurement light of the two mutually different wavelengths emitted from the light sources, selectively transmits the reflected light of the first wavelength out of the first-wavelength light and the second-wavelength reflected light reflected by the measurement target; and the second divided area includes a second filter having a second wavelength transmission characteristic that selectively transmits the light of the second wavelength out of the reflected light.
- 6. The shape measurement system according to claim 5, wherein the light of the first wavelength is blue light and the light of the second wavelength is red light.
- 7. The shape measurement system according to claim 2, wherein the light source unit includes a first light source unit provided with the first light source and a second light source unit provided with a second light source; the first light source unit includes a first optical element that generates, from the light emitted by the first light source, a first measurement light having a first polarization plane; the second light source unit includes a second optical element that generates, from the light emitted by the second light source, a second measurement light having a second polarization plane; the light receiving unit is single; the EVS is divided per pixel column into a first divided area and a second divided area; the first divided area includes a first film that, of the first reflected light having the first polarization plane and the second reflected light having the second polarization plane, which are the two measurement lights reflected by the measurement target and respectively incident on the light receiving unit, selectively transmits the first reflected light; and the second divided area includes a second film that selectively transmits the second reflected light.
- 8. The shape measurement system according to claim 7, wherein the first polarization plane is linearly polarized light oscillating vertically, and the second polarization plane is linearly polarized light oscillating horizontally.
- 9. The shape measurement system according to claim 2, wherein the light source unit is single; the light source unit includes a splitting means that splits the light from the light source into at least three wavelength bands; the light receiving unit is single; the EVS consists of a region divided per pixel column into first through third divided areas; the first divided area includes a first filter having a wavelength transmission characteristic that, of the reflected light of the three wavelength bands reflected by the measurement target out of the measurement light of the three mutually different wavelength bands emitted from the light source, selectively transmits the reflected light of the first wavelength band; the second divided area includes a second filter having a wavelength transmission characteristic that selectively transmits the reflected light of the second wavelength band; and the third divided area includes a third filter having a wavelength transmission characteristic that selectively transmits the reflected light of the third wavelength band.
- 10. The shape measurement system according to claim 9, wherein the splitting means is a prism that splits the white light from the light source into light of three wavelength bands, and the light of the three wavelength bands is light of the red wavelength band, light of the green wavelength band, and light of the blue wavelength band.
- 11. The shape measurement system according to claim 2, wherein the EVS outputs an event signal by comparing a signal voltage based on the temporal change of the reflected light with a threshold voltage and determining whether it is smaller or larger than the threshold voltage.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202180074359.1A CN116391106A (zh) | 2020-11-10 | 2021-10-08 | 形状测量系统 |
JP2022561334A JPWO2022102302A1 (ja) | 2020-11-10 | 2021-10-08 | |
EP21891549.4A EP4246086A4 (en) | 2020-11-10 | 2021-10-08 | FORM MEASURING SYSTEM |
KR1020237012574A KR20230101799A (ko) | 2020-11-10 | 2021-10-08 | 형상 측정 시스템 |
US18/247,563 US20240019244A1 (en) | 2020-11-10 | 2021-10-08 | Shape measuring system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020-187476 | 2020-11-10 | ||
JP2020187476 | 2020-11-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022102302A1 true WO2022102302A1 (ja) | 2022-05-19 |
Family
ID=81601122
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/037286 WO2022102302A1 (ja) | 2020-11-10 | 2021-10-08 | 形状測定システム |
Country Status (7)
Country | Link |
---|---|
US (1) | US20240019244A1 (ja) |
EP (1) | EP4246086A4 (ja) |
JP (1) | JPWO2022102302A1 (ja) |
KR (1) | KR20230101799A (ja) |
CN (1) | CN116391106A (ja) |
TW (1) | TW202226819A (ja) |
WO (1) | WO2022102302A1 (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0634323A (ja) * | 1992-05-07 | 1994-02-08 | Sony Corp | 距離測定装置 |
JP2005241570A (ja) * | 2004-02-27 | 2005-09-08 | Sunx Ltd | 表面形状検出器 |
JP2015508584A (ja) * | 2011-12-08 | 2015-03-19 | ユニベルシテ ピエール エ マリーキュリー(パリ シズエム) | 非同期センサに依存するシーンの3d再構成の方法 |
JP2015072197A (ja) * | 2013-10-03 | 2015-04-16 | 株式会社ニコン | 形状測定装置、構造物製造システム、形状測定方法、構造物製造方法、及び形状測定プログラム |
JP2016517513A (ja) * | 2013-03-15 | 2016-06-16 | ファロ テクノロジーズ インコーポレーテッド | 有向性のプローブ処理による、三次元スキャナにおける多経路干渉の診断および排除 |
JP2019134271A (ja) | 2018-01-31 | 2019-08-08 | ソニーセミコンダクタソリューションズ株式会社 | 固体撮像素子、撮像装置、および、固体撮像素子の制御方法 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10659764B2 (en) * | 2016-06-20 | 2020-05-19 | Intel Corporation | Depth image provision apparatus and method |
US20190018119A1 (en) * | 2017-07-13 | 2019-01-17 | Apple Inc. | Early-late pulse counting for light emitting depth sensors |
US20200341144A1 (en) * | 2019-04-26 | 2020-10-29 | Ouster, Inc. | Independent per-pixel integration registers for lidar measurements |
2021
- 2021-10-08: KR application KR1020237012574A published as KR20230101799A (status unknown)
- 2021-10-08: US application US18/247,563 published as US20240019244A1 (active, pending)
- 2021-10-08: EP application EP21891549.4A published as EP4246086A4 (active, pending)
- 2021-10-08: JP application JP2022561334A published as JPWO2022102302A1 (active, pending)
- 2021-10-08: CN application CN202180074359.1A published as CN116391106A (active, pending)
- 2021-10-08: WO application PCT/JP2021/037286 published as WO2022102302A1 (active, application filing)
- 2021-10-29: TW application TW110140228A published as TW202226819A (status unknown)
Non-Patent Citations (1)
Title |
---|
See also references of EP4246086A4 |
Also Published As
Publication number | Publication date |
---|---|
KR20230101799A (ko) | 2023-07-06 |
CN116391106A (zh) | 2023-07-04 |
EP4246086A1 (en) | 2023-09-20 |
TW202226819A (zh) | 2022-07-01 |
JPWO2022102302A1 (ja) | 2022-05-19 |
EP4246086A4 (en) | 2024-04-17 |
US20240019244A1 (en) | 2024-01-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10473768B2 (en) | Lidar system | |
EP0997748B1 (en) | Chromatic optical ranging sensor | |
US6181472B1 (en) | Method and system for imaging an object with a plurality of optical beams | |
WO2021212915A1 (zh) | Laser ranging device and method | |
EP0885374B1 (en) | Three-dimensional color imaging | |
US11902494B2 (en) | System and method for glint reduction | |
EP1191306A2 (en) | Distance information obtaining apparatus and distance information obtaining method | |
US20230204724A1 (en) | Reducing interference in an active illumination environment | |
TW201923305A (zh) | 藉由干涉距離測量手段來偵測物體表面輪廓之組件 | |
CN110726382B (zh) | 用于借助电磁射束检测物体表面的设备和方法 | |
WO2022102302A1 (ja) | 形状測定システム | |
EP0882211B1 (en) | A method and apparatus for reducing the unwanted effects of noise present in a three-dimensional color imaging system | |
CN114651194A (zh) | Projector for a solid-state LIDAR system | |
WO2022196779A1 (ja) | Three-dimensional measuring device | |
CN216211121U (zh) | Depth information measuring device and electronic apparatus | |
JPH04110706A (ja) | Three-dimensional shape data capturing device | |
US20230288343A1 (en) | Device and method for transmission inspection of containers having at least one light-emitting diode light source | |
CN211785085U (zh) | 4D imaging device and electronic apparatus | |
CN210052398U (zh) | Laser beam detection device and system | |
WO2021200016A1 (ja) | Distance measuring device and light emitting device | |
JP4266286B2 (ja) | Distance information acquisition device and distance information acquisition method | |
CN113822875A (zh) | Depth information measuring device, full-scene obstacle avoidance method, and electronic apparatus | |
JPH06186324A (ja) | Light source means | |
CN112834434A (zh) | 4D imaging device and electronic apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | EP: The EPO has been informed by WIPO that EP was designated in this application |
Ref document number: 21891549 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18247563 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2022561334 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2021891549 Country of ref document: EP Effective date: 20230612 |