WO2023153451A1 - Measurement device

Measurement device

Info

Publication number
WO2023153451A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
light receiving
measurement area
measuring device
lens
Prior art date
Application number
PCT/JP2023/004229
Other languages
French (fr)
Japanese (ja)
Inventor
和也 本橋
幸雄 林
義朗 伊藤
秀倫 曽根
祐太 春瀬
Original Assignee
株式会社小糸製作所 (Koito Manufacturing Co., Ltd.)
Priority date
Filing date
Publication date
Priority claimed from JP2022018942A external-priority patent/JP2023116245A/en
Priority claimed from JP2022018941A external-priority patent/JP2023116244A/en
Application filed by 株式会社小糸製作所 (Koito Manufacturing Co., Ltd.)
Publication of WO2023153451A1 publication Critical patent/WO2023153451A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G01C3/06 Use of electric means to obtain final indication
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements

Definitions

  • The present disclosure relates to measuring devices.
  • Patent Document 1 describes a distance measuring device that measures the distance to a reflecting object based on the time of flight of light, from the time when pulsed light is emitted to the time when the reflected light is received.
  • In such a device, the light is scanned by rotating a mirror, and the reflected light returning to the mirror is received by the light receiving unit.
  • However, a movable part that rotates a mirror is prone to failure.
  • On the other hand, if the light is not scanned, the measurable range (measurement area) becomes narrow.
  • An object of the present disclosure is therefore to expand the measurement area that can be irradiated with light without using a movable part.
  • A further object of the present disclosure is to expand the measurement area from which light can be received without using a movable part.
  • A measurement apparatus includes a light source, a light projecting optical system that irradiates a measurement area with light from the light source, and a light receiving unit that receives reflected light from the measurement area.
  • The light projecting optical system includes a coupling lens in which a first lens element, whose optical axis is shifted in a first direction, and a second lens element, whose optical axis is shifted in the direction opposite to that of the first lens element, are joined. The light from the light source is irradiated onto a first measurement area through the first lens element, and onto a second measurement area, which includes an area overlapping the first measurement area, through the second lens element.
  • A measuring device includes an irradiation unit that irradiates a measurement area with light, a light receiving sensor, and a light receiving optical system that causes the light receiving sensor to receive reflected light from the measurement area.
  • The light receiving optical system includes a coupling lens in which a first lens element, whose optical axis is shifted in a first direction, and a second lens element, whose optical axis is shifted in the direction opposite to that of the first lens element, are joined. The reflected light from a first measurement area is received by the light receiving elements of the light receiving sensor via the first lens element, and the reflected light from a second measurement area, which includes an area overlapping the first measurement area, is received by the light receiving elements via the second lens element.
  • According to the present disclosure, the measurement area that can be irradiated with light can be expanded without using a movable part. Furthermore, according to the present disclosure, the measurement area from which light can be received can be expanded without using a movable part.
  • FIG. 1 is an explanatory diagram of the overall configuration of the measuring device 1.
  • FIG. 2 is an explanatory diagram of the configuration of the irradiation unit 10 of the first embodiment.
  • FIG. 3A is an explanatory diagram of the light source 12.
  • FIG. 3B is an explanatory diagram of the measurement area 50.
  • FIG. 3C is an explanatory diagram of an example in which the measuring device 1 is mounted on a vehicle.
  • FIG. 4 is a perspective view of the coupling lens 15.
  • FIG. 5A is an explanatory diagram of the measurement area 50.
  • FIG. 5B is an explanatory diagram of the measurement area 50.
  • FIG. 6A is an explanatory diagram of the optical conditions of the irradiation section.
  • FIG. 6B is an explanatory diagram of the optical conditions of the irradiation section.
  • FIG. 7 is an explanatory diagram of the light receiving sensor 22.
  • FIG. 8 is a timing chart for explaining an example of the measuring method.
  • FIG. 9A is an explanatory diagram of another example of the light receiving sensor 22.
  • FIG. 9B is an explanatory diagram of the signal processing unit 362.
  • FIG. 9C is an explanatory diagram of a histogram.
  • FIG. 10A is an explanatory diagram showing the relationship between the light emitting area of the light source 12 and the measurement area 50.
  • FIG. 10B is an explanatory diagram showing the relationship between the light emitting area of the light source 12 and the measurement area 50.
  • FIGS. 11A and 11B are explanatory diagrams of the state during measurement of the region H, which is the overlapping area 53.
  • FIG. 12 is an explanatory diagram of another measuring method.
  • FIG. 13 is an explanatory diagram of the light receiving section 20 of the first embodiment.
  • FIG. 14 is an explanatory diagram of the light receiving sensor 22.
  • FIG. 15 is a perspective view of the coupling lens 25.
  • FIG. 16A is an explanatory diagram of the measurement area 50.
  • FIG. 16B is an explanatory diagram of the measurement area 50.
  • FIG. 17A is an explanatory diagram showing the relationship between the pixels 221 (light receiving areas) of the light receiving sensor 22 and the measurement area 50.
  • FIG. 17B is an explanatory diagram showing the relationship between the pixels 221 (light receiving areas) of the light receiving sensor 22 and the measurement area 50.
  • FIG. 18A is an explanatory diagram of the optical conditions of the light receiving section.
  • FIG. 18B is an explanatory diagram of the optical conditions of the light receiving section.
  • FIG. 19A is an explanatory diagram of an example of a measurement method.
  • FIG. 19B is an explanatory diagram of an example of the measurement method.
  • FIG. 19C is an explanatory diagram of an example of the measurement method.
  • FIG. 20 is an explanatory diagram of the signal processing section 362 when measuring the overlapping area 53.
  • FIG. 21 is an explanatory diagram of another signal processing section 362 when measuring the overlapping area 53.
  • FIG. 1 is an explanatory diagram of the overall configuration of the measuring device 1.
  • As shown in FIG. 1, the measuring device 1 is a device that measures the distance to the object 90.
  • The measuring device 1 functions as a so-called LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging).
  • The measurement apparatus 1 emits measurement light, detects the reflected light reflected by the surface of the object 90, and measures the time from the emission of the measurement light to the reception of the reflected light.
  • The distance to the object 90 is measured by the TOF method (Time of Flight).
  • The measuring device 1 has an irradiation section 10, a light receiving section 20, and a control section 30.
  • The irradiation unit 10 irradiates the measurement light toward the object 90.
  • The irradiation unit 10 irradiates a measurement area 50 (described later) with measurement light at a predetermined angle of view.
  • The irradiation section 10 has a light source 12 and a light projecting optical system 14.
  • The light source 12 emits light.
  • The light source 12 is composed of, for example, a surface emitting laser (VCSEL).
  • The light projecting optical system 14 is an optical system that irradiates the measurement area 50 with the light emitted from the light source 12. A detailed configuration of the irradiation unit 10 will be described later.
  • The light receiving unit 20 receives reflected light from the object 90.
  • The light receiving section 20 receives reflected light from the measurement area 50.
  • The light receiving section 20 has a light receiving sensor 22 and a light receiving optical system 24. A detailed configuration of the light receiving unit 20 will be described later.
  • The control unit 30 controls the measurement device 1.
  • The control unit 30 controls the irradiation of light from the irradiation unit 10.
  • The control unit 30 measures the distance to the object 90 by the TOF method based on the output of the light receiving unit 20.
  • The control unit 30 has an arithmetic device and a storage device (not shown).
  • The arithmetic device is, for example, an arithmetic processing device such as a CPU or GPU. A part of the arithmetic device may be composed of an analog arithmetic circuit.
  • The storage device is composed of a main storage device and an auxiliary storage device, and stores programs and data. The various processes for measuring the distance to the object 90 are executed by the arithmetic device executing the programs stored in the storage device.
  • The figure shows the functional blocks for these processes.
  • The control unit 30 has a setting unit 32, a timing control unit 34, and a distance measurement unit 36.
  • The setting unit 32 performs various settings.
  • The timing control section 34 controls the processing timing of each section.
  • The distance measuring unit 36 measures the distance to the object 90.
  • The distance measurement section 36 has a signal processing section 362, a time detection section 364, and a distance calculation section 366.
  • The signal processing section 362 processes the output signal of the light receiving sensor 22.
  • The time detection unit 364 detects the time of flight of light (the time from the irradiation of light to the arrival of the reflected light).
  • The distance calculator 366 calculates the distance to the object 90. Note that the processing of the control unit 30 will be described later.
  • FIG. 2 is an explanatory diagram of the configuration of the irradiation unit 10 of the first embodiment.
  • The irradiation unit 10 has the light source 12 and the light projecting optical system 14.
  • The direction along the optical axis of the light projecting optical system 14 is the Z direction. Note that the object 90 to be measured by the measuring device 1 is separated from the measuring device 1 in the Z direction. The direction perpendicular to the Z direction in which the first lens element 151 and the second lens element 152 constituting the coupling lens 15 (described later) are arranged is defined as the Y direction. The direction perpendicular to both the Z direction and the Y direction is defined as the X direction.
  • FIG. 3A is an explanatory diagram of the light source 12.
  • The light source 12 has a light emitting surface parallel to the XY plane (the plane parallel to the X direction and the Y direction). The light emitting surface is rectangular. The light emitted from the light source 12 is applied to the measurement area 50 via the light projecting optical system 14.
  • The light projecting optical system 14 has a coupling lens 15.
  • FIG. 4 is a perspective view of the coupling lens 15.
  • The coupling lens 15 is an optical component in which the first lens element 151 and the second lens element 152 are joined.
  • The first lens element 151 and the second lens element 152 are convex lens-shaped parts (optical elements), and are arranged side by side in the Y direction.
  • The focal length of the first lens element 151 and the focal length of the second lens element 152 are the same.
  • The first lens element 151 and the second lens element 152 each have an optical axis along the Z direction.
  • The optical axis of the first lens element 151 is shifted in the +Y direction with respect to the optical axis of the light projecting optical system 14 (the projection lens 16 described later).
  • The optical axis of the second lens element 152 is shifted in the -Y direction with respect to the optical axis of the light projecting optical system 14. That is, the optical axis of the second lens element 152 is shifted in the direction opposite to that of the first lens element 151.
  • The light projecting optical system 14 also has a projection lens 16.
  • The projection lens 16 is a lens arranged between the light source 12 and the coupling lens 15.
  • By providing the projection lens 16 in the light projecting optical system 14, the light from the light source 12 can be projected onto a wide measurement area 50. Further, since the light projecting optical system 14 has the projection lens 16, the light from the light source 12 can be formed into a rectangular light distribution pattern and projected onto the measurement area 50.
  • The distance between the projection lens 16 and the light source 12 is shorter than the focal length of the projection lens 16.
  • Although the light rays traveling from the projection lens 16 to the coupling lens 15 diverge, the spread of the light emitted from the coupling lens 15 toward the measurement area 50 is suppressed by the convex first lens element 151 and second lens element 152 (light close to collimated light is emitted from the light projecting optical system 14 to the measurement area 50).
  • FIG. 3B is an explanatory diagram of the measurement area 50.
  • FIGS. 5A and 5B are also explanatory diagrams of the measurement area 50.
  • FIG. 5A is an explanatory diagram of how the light from the light source 12 irradiates the first measurement area 51 through the first lens element 151.
  • FIG. 5B is an explanatory diagram of how the light from the light source 12 irradiates the second measurement area 52 through the second lens element 152.
  • The measurement area 50 is composed of a first measurement area 51 and a second measurement area 52.
  • The first measurement area 51 is the area irradiated with the light of the light source 12 via the first lens element 151 (in other words, the first lens element 151 is the optical element that irradiates the first measurement area 51 with the light of the light source 12).
  • The second measurement area 52 is the area irradiated with the light of the light source 12 via the second lens element 152 (in other words, the second lens element 152 is the optical element that irradiates the second measurement area 52 with the light of the light source 12).
  • The first measurement area 51 and the second measurement area 52 are shifted from each other in the Y direction.
  • As a result, the measurement area 50 can be made long in the Y direction (in other words, a wide range in the Y direction can be irradiated with light).
  • The first measurement area 51 and the second measurement area 52 overlap.
  • The region where the first measurement area 51 and the second measurement area 52 overlap is called the "overlapping area 53".
  • When the measurement area 50 is viewed from the Z direction, the measurement area 50 has predetermined angles of view in the X and Y directions.
  • The ratio of the Y-direction length to the X-direction length (the so-called aspect ratio) of the measurement area 50 is larger than the corresponding ratio of the light source 12. That is, in the first embodiment, the measurement area 50 can be made longer in the Y direction than the shape of the light source 12 would suggest (in other words, light can be emitted over a wider range in the Y direction).
  • The region of the light emitting surface of the light source 12 that emits the light irradiating the overlapping area 53 via the first lens element 151 is called the "first region 121".
  • The region of the light emitting surface of the light source 12 that emits the light irradiating the overlapping area 53 via the second lens element 152 is called the "second region 122".
  • The overlapping area 53 can be irradiated with both the light passing through the first lens element 151 and the light passing through the second lens element 152.
  • When the first region 121 and the second region 122 of the light source 12 are caused to emit light at the same time, the light emitted through the first lens element 151 and the light emitted through the second lens element 152 are superimposed in the overlapping area 53. Therefore, in the overlapping area 53, the irradiation intensity of the light can be made higher than in the rest of the measurement area 50.
  • The overlapping area 53 (hatched area), where the irradiation intensity is relatively high, is located in the center of the measurement area 50 in the Y direction.
  • By causing the entire light emitting surface of the light source 12 to emit light collectively (that is, by simultaneously causing the first region 121 and the second region 122 of the light source 12 to emit light), the intensity of the light in the overlapping area 53 can be increased as shown in the figure.
  • FIG. 3C is an explanatory diagram of an example in which the measuring device 1 is mounted on a vehicle.
  • When the measuring device 1 is mounted on a vehicle, it is desirable to be able to measure distances over a wide field of view at relatively close range.
  • In the first embodiment, the aspect ratio of the measurement area 50 can be increased, which is advantageous for distance measurement over a wide field of view.
  • On the other hand, the area in which long distances must be measured may be relatively narrow, but it is desirable to be able to irradiate it with intense light.
  • In the first embodiment, the intensity of the light in the overlapping area 53 of the measurement area 50 can be increased, which is advantageous for measuring long distances.
  • FIG. 6A is an explanatory diagram of the relationship between the light source 12 and the projection lens 16.
  • FIG. 6B is an explanatory diagram of the relationship between the virtual image 12' of the light source 12 and the coupling lens 15.
  • Let fA1 be the focal length of the projection lens 16.
  • Let LA1 be the distance from the principal point of the projection lens 16 to the light source 12.
  • Let yA be half the length of the light source 12 in the Y direction (the length from the optical axis of the light projecting optical system 14 to the edge of the light source 12).
  • The distance LA1 from the principal point of the projection lens 16 to the light source 12 is smaller than the focal length fA1 of the projection lens 16 (LA1 < fA1). Since the light source 12 is arranged closer to the projection lens 16 than its focal point, the virtual image 12' of the light source 12 (the image of the light source 12 formed by the projection lens 16) is located on the far side of the light source 12 as viewed from the projection lens 16 (to the left of the light source 12 in the figure).
  • The distance LA' from the principal point of the projection lens 16 to the virtual image 12' is given by LA' = (LA1 × fA1)/(fA1 − LA1) ... (A2)
  • Let fA2 be the focal length of the first lens element 151 and the second lens element 152.
  • Let LA2 be the distance from the principal point of the coupling lens 15 (the principal point of the first lens element 151 or the second lens element 152) to the virtual image 12'.
  • Since the coupling lens 15 projects the light of the virtual image 12' onto the distant irradiation area as shown in FIG. 6B, the relationship between the focal length fA2 and the distance LA2 is expressed by the following equation (A4):
  • LA2 = fA2 ... (A4)
  • Let dA be the distance between the principal point of the projection lens 16 and the principal point of the coupling lens 15.
  • The distance LA2 corresponds to the distance LA' plus the distance dA.
  • The distance between the optical axis of the light projecting optical system 14 (projection lens 16) and the optical axis of the first lens element 151 (or the second lens element 152) is called the "shift amount tA".
  • The first measurement area 51 and the second measurement area 52 need to overlap. Therefore, the shift amount tA needs to be smaller than the length yA' from the optical axis of the light projecting optical system 14 to the edge of the virtual image 12' of the light source 12 (tA < yA').
  • That is, the shift amount tA of each of the first lens element 151 and the second lens element 152 of the coupling lens 15 must satisfy equation (A6).
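The relationships above can be checked numerically. The following sketch (all numeric values are invented for illustration, the helper names are ours, and the virtual-image magnification LA'/LA1 is a standard thin-lens assumption rather than a formula stated in the text) computes LA' from equation (A2), places the coupling lens per equation (A4), and verifies the overlap condition tA < yA':

```python
# Sketch of the projection-optics relationships (equations (A2) and (A4)).

def virtual_image_distance(f_a1: float, l_a1: float) -> float:
    """Distance LA' from the projection lens to the virtual image 12'
    (equation (A2)); requires LA1 < fA1 so that the image is virtual."""
    assert l_a1 < f_a1, "light source must be inside the focal length"
    return (l_a1 * f_a1) / (f_a1 - l_a1)

def virtual_image_half_height(y_a: float, f_a1: float, l_a1: float) -> float:
    """Half-height yA' of the virtual image, scaled by the lateral
    magnification LA'/LA1 (thin-lens assumption)."""
    return y_a * virtual_image_distance(f_a1, l_a1) / l_a1

# Illustrative values (mm): focal length, source distance, source half-height.
fA1, LA1, yA = 10.0, 8.0, 2.0
LA_prime = virtual_image_distance(fA1, LA1)         # LA' = 40 mm
yA_prime = virtual_image_half_height(yA, fA1, LA1)  # yA' = 10 mm

# Equation (A4): the coupling lens is placed so that LA2 = LA' + dA = fA2,
# i.e. the virtual image sits at the coupling lens's focal point.
dA = 5.0
fA2 = LA_prime + dA

# Overlap condition from the text: the shift amount tA must satisfy tA < yA'.
tA = 4.0
assert tA < yA_prime
```

With these sample numbers the virtual image lies 40 mm behind the projection lens and is magnified five times, so any lens-element shift under 10 mm keeps the two measurement areas overlapping.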
  • FIG. 7 is an explanatory diagram of the light receiving sensor 22.
  • The light receiving sensor 22 has a plurality of pixels 221 arranged two-dimensionally. For example, in the case of a VGA light receiving sensor 22, 480 × 640 pixels 221 are arranged two-dimensionally. Each pixel 221 has a light receiving element, and the light receiving element outputs a signal according to the amount of light received.
  • The control unit 30 acquires an output signal for each pixel 221.
  • FIG. 8 is a timing chart for explaining an example of the measurement method.
  • The control unit 30 causes the light source 12 of the irradiation unit 10 to emit pulsed light at a predetermined cycle.
  • The upper part of FIG. 8 shows the timing (emission timing) at which the light source 12 emits the pulsed light.
  • The light emitted from the light source 12 is applied to the measurement area 50 via the light projecting optical system 14.
  • The light reflected by the surface of the object 90 within the measurement area 50 is received by the light receiving sensor 22 via the light receiving optical system 24.
  • Each pixel 221 of the light receiving sensor 22 receives the pulsed reflected light.
  • The center of FIG. 8 shows the timing (arrival timing) at which the pulsed reflected light arrives.
  • Each pixel 221 outputs a signal corresponding to the amount of light received.
  • The output signal of each pixel 221 is shown in the lower part of FIG. 8.
  • The distance measurement unit 36 (signal processing unit 362) of the control unit 30 detects the arrival timing of the reflected light based on the output signal of each pixel 221.
  • The signal processing unit 362 detects the arrival timing of the reflected light based on the peak timing of the output signal of each pixel 221.
  • The signal processing unit 362 may obtain the arrival timing of the reflected light based on the peak of the signal obtained by cutting the DC component of the output signal of the pixel 221, in order to remove the influence of ambient light (for example, sunlight).
  • The distance measurement unit 36 detects the time Tf from when the light is emitted to when the reflected light arrives, based on the light emission timing and the light arrival timing.
  • The time Tf corresponds to the time it takes the light to make a round trip between the measurement device 1 and the object 90.
  • The distance measurement unit 36 calculates the distance L to the object 90 based on the time Tf.
  • The control unit 30 generates a distance image by calculating the distance to the object 90 for each pixel 221 based on the time Tf detected for each pixel 221.
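The peak detection and TOF distance calculation described above can be sketched as follows (a minimal illustration, not the device's actual signal processing; the sample rate and signal values are assumptions):

```python
# Sketch of TOF ranging: the peak timing of a pixel's output signal gives
# the round-trip time Tf, and the distance is L = c * Tf / 2.
import numpy as np

C = 299_792_458.0  # speed of light (m/s)

def time_of_flight(signal: np.ndarray, t_emit: float, dt: float) -> float:
    """Round-trip time Tf from the peak of one pixel's output signal,
    sampled every dt seconds starting at the emission time t_emit."""
    t_arrive = t_emit + np.argmax(signal) * dt
    return t_arrive - t_emit

def distance(tf: float) -> float:
    """Distance L to the object: the light travels 2L during Tf."""
    return C * tf / 2.0

# Example: a pixel whose signal peaks 100 ns after the pulse is emitted.
sig = np.zeros(200)
sig[100] = 1.0
tf = time_of_flight(sig, t_emit=0.0, dt=1e-9)  # Tf = 100 ns
print(distance(tf))  # ≈ 14.99 m
```

Repeating this per pixel, as the text describes, yields the distance image.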
  • FIG. 9A is an explanatory diagram of another example of the light receiving sensor 22.
  • The light receiving sensor 22 has a plurality of pixels 221 arranged two-dimensionally.
  • Each pixel 221 has a plurality of light receiving elements 222.
  • Here, each pixel 221 has nine SPADs (Single Photon Avalanche Diodes) as the light receiving elements 222, arranged three in the X direction and three in the Y direction.
  • FIG. 9B is an explanatory diagram of the signal processing unit 362.
  • The signal processor 362 has an adder 362A, a comparator 362B, and a histogram generator 362C.
  • The signal processing unit 362 generates a histogram used in TCSPC (Time Correlated Single Photon Counting) based on the output signal of each pixel 221 of the light receiving sensor 22.
  • The adder 362A adds the output signals of the plurality of light receiving elements 222 (SPADs) forming the pixel 221.
  • The adder 362A may adjust (shape) the pulse widths output from the light receiving elements 222 before adding the output signals of the plurality of light receiving elements 222.
  • The comparator 362B compares the output signal of the adder 362A with a threshold, and outputs a signal when the output signal of the adder 362A is greater than or equal to the threshold.
  • The timing at which the comparator 362B outputs a signal can be regarded as the timing at which the light receiving elements 222 (SPADs) of the light receiving sensor 22 detect light.
  • Photons of disturbance light are incident on each light receiving element 222 at temporally random times.
  • In contrast, photons of the reflected light are incident on each light receiving element 222 after a fixed delay time (the flight time corresponding to the distance to the object 90) following light irradiation. Since photons of disturbance light arrive at the light receiving elements 222 at temporally random times, the probability that they raise the output signal of the adder 362A to the threshold is low.
  • When the reflected light arrives, the multiple light receiving elements 222 forming the pixel 221 detect photons at the same time, so the output signal of the adder 362A is highly likely to reach the threshold. Therefore, by adding the output signals of the plurality of light receiving elements 222 in the adder 362A and comparing the output signal of the adder 362A with the threshold in the comparator 362B, the time at which the light receiving elements 222 (SPADs) can be considered to have detected the reflected light is measured.
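The adder/comparator coincidence scheme just described can be sketched in a few lines (the 3 × 3 SPAD count follows the text, but the time-bin layout, threshold value, and sample data are our assumptions):

```python
# Sketch of coincidence detection: sum the SPAD outputs of one pixel per
# time bin (adder 362A) and report a detection only when the sum reaches
# the threshold (comparator 362B).
from typing import List

def detect_times(spad_outputs: List[List[int]], threshold: int) -> List[int]:
    """spad_outputs[i][t] is 1 if SPAD i fired in time bin t.
    Returns the time bins where the summed output reaches the threshold."""
    n_bins = len(spad_outputs[0])
    hits = []
    for t in range(n_bins):
        total = sum(ch[t] for ch in spad_outputs)  # adder 362A
        if total >= threshold:                     # comparator 362B
            hits.append(t)
    return hits

# 9 SPADs (3 x 3): isolated ambient photons rarely coincide, but the
# reflected pulse makes several SPADs fire in the same bin.
spads = [[0] * 8 for _ in range(9)]
for i in range(6):
    spads[i][5] = 1        # six SPADs fire together in bin 5 (reflected light)
spads[7][2] = 1            # a lone ambient photon in bin 2
print(detect_times(spads, threshold=4))  # [5]
```

The lone ambient photon never reaches the threshold, so only the coincident arrival in bin 5 is reported, which is exactly why the summation suppresses disturbance light.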
  • FIG. 9C is an explanatory diagram of a histogram.
  • The horizontal axis in the figure is time, and the vertical axis is frequency (count).
  • Based on the output of the comparator 362B, the histogram generator 362C repeatedly measures the times at which the light receiving elements 222 (SPADs) of the light receiving sensor 22 detect light, and generates a histogram by incrementing the frequency (count) associated with each measured time.
  • Instead of incrementing the count by one, the histogram generator 362C may increase the count in accordance with the output signal (added value) of the adder 362A.
  • The setting unit 32 presets the number of integrations for generating the histogram.
  • The timing control unit 34 causes the light source 12 of the irradiation unit 10 to emit pulsed light a plurality of times according to the set number of integrations.
  • A signal is output once or a plurality of times from the adder 362A for each emission of pulsed light from the light source 12.
  • The histogram generator 362C generates a histogram by incrementing the frequency (count) according to the output signal of the comparator 362B until the set number of integrations is reached.
  • After the histogram is generated, the distance measurement unit 36 (time detection unit 364) detects the time Tf from when the light is emitted until the reflected light arrives, based on the histogram. As shown in FIG. 9C, the distance measurement unit 36 (time detection unit 364) detects the time corresponding to the frequency peak of the histogram, and sets that time as the time Tf. Then, the distance measuring section 36 (distance calculating section 366) calculates the distance to the object 90 based on the time Tf.
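The TCSPC accumulation and peak search described above can be sketched as follows (a simplified illustration; the bin width, pulse count, and sample detections are assumptions, and real implementations convert the peak bin to seconds before computing the distance):

```python
# Sketch of TCSPC: detection time bins from repeated pulse emissions are
# accumulated into a histogram, and Tf is taken at the frequency peak.

def build_histogram(detections_per_pulse, n_bins: int) -> list:
    """Accumulate comparator-output time bins over the set number of pulses."""
    hist = [0] * n_bins
    for detections in detections_per_pulse:   # one pulse emission each
        for t in detections:                  # zero or more detections/pulse
            hist[t] += 1                      # increment the frequency
    return hist

def time_of_flight_bin(hist: list) -> int:
    """Tf corresponds to the bin with the highest count (frequency peak)."""
    return max(range(len(hist)), key=hist.__getitem__)

# Five pulses: the true echo repeatedly lands in bin 7; noise is scattered.
pulses = [[7], [7, 2], [7], [3, 7], [7]]
hist = build_histogram(pulses, n_bins=10)
print(time_of_flight_bin(hist))  # 7
```

Because the echo arrives at the same delay on every pulse while disturbance detections are spread over random bins, the peak stands out after only a few accumulations.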
  • In this manner, the intensity of the light in the overlapping area 53 of the measurement area 50 can be increased.
  • In the above description, the light source 12 collectively emits light from its entire light emitting surface.
  • However, the light source 12 may be controlled so as to emit light from only a part of the light emitting surface.
  • FIGS. 10A and 10B are explanatory diagrams showing the relationship between the light emitting regions of the light source 12 and the measurement area 50.
  • FIG. 10A is an explanatory diagram showing the relationship between the light emitting regions of the light source 12 and the measurement area 50 when the first measurement area 51 is irradiated with light through the first lens element 151.
  • FIG. 10B is an explanatory diagram showing the relationship between the light emitting regions of the light source 12 and the measurement area 50 when the second measurement area 52 is irradiated with light through the second lens element 152.
  • In the figures, the light emitting region of the light source 12 corresponding to each region of the measurement area 50 is shown.
  • The light source 12 is divided into a plurality of light emitting regions in the Y direction, here twelve (light emitting regions #1 to #12).
  • Note that the number of light emitting regions into which the light source 12 is divided in the Y direction is not limited to twelve.
  • The measurement area 50 is divided into a plurality of regions in the Y direction (regions A to P), here sixteen. Note that the number of regions into which the measurement area 50 is divided in the Y direction is not limited to sixteen.
  • The first measurement area 51 corresponds to regions A to L.
  • The light emitting regions #1 to #12 of the light source 12 correspond to the regions L to A of the first measurement area 51, respectively.
  • For example, the light emitted from the light emitting region #5 of the light source 12 passes through the first lens element 151 and irradiates the region H of the measurement area 50.
  • The second measurement area 52 corresponds to regions E to P.
  • The light emitting regions #1 to #12 of the light source 12 correspond to the regions P to E of the second measurement area 52, respectively.
  • For example, the light emitted from the light emitting region #5 of the light source 12 passes through the second lens element 152 and irradiates the region L of the measurement area 50.
  • That is, the light emitted from a given light emitting region (for example, light emitting region #5) of the light source 12 passes through the first lens element 151 and the second lens element 152 of the coupling lens 15 and irradiates two regions of the measurement area 50 (e.g., region H and region L).
  • the overlapping area 53 corresponds to the areas E to L.
  • the light emitted from the light emitting regions #8 to #1 of the light source 12 passes through the first lens element 151 and irradiates the regions E to L, which are the overlapping area 53.
  • the light emitting regions #1 to #8 of the light source 12 correspond to the first region 121 described above.
  • light emitted from the light emitting regions #12 to #5 of the light source 12 passes through the second lens element 152 and irradiates the regions E to L, which are the overlapping area 53.
  • the light emitting regions #5 to #12 of the light source 12 correspond to the second region 122 described above.
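The correspondences above (emitting regions #1 to #12 versus regions A to P, with the overlap at regions E to L) can be sketched as follows. This is an illustrative model only; the function names are not from the embodiment:

```python
REGIONS = [chr(ord('A') + i) for i in range(16)]  # regions A..P of measurement area 50

def region_via_first_lens(emit: int) -> str:
    """Emitting regions #1..#12 correspond to regions L..A (first measurement area 51)."""
    assert 1 <= emit <= 12
    return REGIONS[12 - emit]  # #1 -> L, #5 -> H, #12 -> A

def region_via_second_lens(emit: int) -> str:
    """Emitting regions #1..#12 correspond to regions P..E (second measurement area 52)."""
    assert 1 <= emit <= 12
    return REGIONS[16 - emit]  # #1 -> P, #5 -> L, #12 -> E

# Region H (overlapping area 53) is reached from emitting region #5 via the
# first lens element and from emitting region #9 via the second lens element.
print(region_via_first_lens(5), region_via_second_lens(9))  # H H
```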
  • FIG. 11 is an explanatory diagram of how the region H, which is the overlapping area 53, is measured.
  • the corresponding light emitting region of the light source 12 is shown on the right side of each region of the measurement area 50 in the figure.
  • a region of the corresponding measurement area 50 is shown on the left side of each pixel 221 of the light receiving sensor 22 in the drawing.
  • Each pixel 221 of the light-receiving sensor 22 receives the reflected light from the corresponding region of the measurement area 50 (the image of the measurement area 50 is formed on the light-receiving surface of the light-receiving sensor 22 by the light-receiving optical system 24).
  • the area H of the overlapping area 53 is associated with the light emitting area #5 and the light emitting area #9 of the light source 12. As shown in the figure, when measuring region H, the control unit 30 causes the light emitting region #5 and the light emitting region #9 of the light source 12 to emit light simultaneously. Thus, when measuring a specific region of the overlapping area 53, the control unit 30 causes the two light emitting regions corresponding to that region to emit light simultaneously.
  • the light emitted from the light emitting region #5 of the light source 12 irradiates the region H (first measurement area 51) and the region L (second measurement area 52). Moreover, the light emitted from the light emitting region #9 of the light source 12 irradiates the region D (first measurement area 51) and the region H (second measurement area 52). That is, since the light emitted from the light emitting regions #5 and #9 of the light source 12 is superimposed on the region H (overlapping area 53), the intensity of the irradiated light increases.
  • the pixel #9 of the light receiving sensor 22 receives the reflected light from the area H of the measurement area 50 . Since the light emitted from the light emitting regions #5 and #9 of the light source 12 is superimposed on the region H, the intensity of the reflected light received by the pixel #9 of the light receiving sensor 22 can be increased.
  • In the above example, the coupling lens 15 is used for the purpose of increasing the intensity of the light in the overlapping area 53.
  • However, the coupling lens 15 need not be applied for the purpose of increasing the intensity of light in the overlapping area 53.
  • For example, when measuring the region H, the control unit 30 may separately perform a measurement in which the light emitting region #5 of the light source 12 emits light and the distance is calculated based on the light receiving result of the pixel #9 of the light receiving sensor 22, and a measurement in which the light emitting region #9 emits light and the distance is calculated based on the light receiving result of the pixel #9 of the light receiving sensor 22. Then, the control unit 30 may calculate the distance by averaging the respective distance calculation results. Thus, the coupling lens 15 need not be used for the purpose of increasing light intensity.
  • FIG. 12 is an explanatory diagram of another measuring method.
  • Light emitted from a specific light-emitting region of the light source 12 is irradiated onto the corresponding region of the first measurement area 51 and the corresponding region of the second measurement area 52 .
  • the light emitted from the specific light emitting region of the light source 12 irradiates two regions of the measurement area 50 .
  • the light emitted from the light emitting region #5 of the light source 12 irradiates the region H (first measurement area 51) and the region L (second measurement area 52).
  • the pixel #9 of the light receiving sensor 22 corresponds to the area H of the measurement area 50 and receives reflected light from the area H. Further, the pixel #5 of the light receiving sensor 22 corresponds to the area L of the measurement area 50 and receives the reflected light from the area L. As shown in FIG.
  • FIG. 13 is an explanatory diagram of the light receiving section 20 of the first embodiment.
  • the light receiving section 20 has the light receiving sensor 22 and the light receiving optical system 24 .
  • the direction along the optical axis of the light receiving optical system 24 is the Z direction.
  • the object 90 to be measured by the measuring device 1 is separated from the measuring device 1 in the Z direction.
  • the direction perpendicular to the Z direction and along which the first lens element 251 and the second lens element 252 constituting the coupling lens 25 (described later) are arranged is defined as the Y direction.
  • a direction perpendicular to the Z direction and the Y direction is defined as the X direction.
  • FIG. 14 is an explanatory diagram of the light receiving sensor 22.
  • the light receiving sensor 22 has a light receiving surface parallel to the XY plane (a surface parallel to the X direction and the Y direction).
  • the light receiving surface is configured in a rectangular shape. Reflected light arriving from the measurement area is irradiated onto the light receiving surface of the light receiving sensor 22 via the light receiving optical system 24 (an image of the measurement area 50 is formed on the light receiving surface of the light receiving sensor 22 by the light receiving optical system 24).
  • the light receiving sensor 22 has a plurality of pixels 221 arranged two-dimensionally.
  • the light receiving optical system has a coupling lens 25 .
  • FIG. 15 is a perspective view of the coupling lens 25.
  • the coupling lens 25 is an optical component in which the first lens element 251 and the second lens element 252 are connected.
  • the first lens element 251 and the second lens element 252 are convex lens-shaped parts (optical elements), and are arranged side by side in the Y direction.
  • the focal length of the first lens element 251 and the focal length of the second lens element 252 are the same.
  • the first lens element 251 and the second lens element 252 each have an optical axis along the Z direction.
  • the optical axis of the first lens element 251 is shifted in the +Y direction with respect to the optical axis of the light-receiving optical system 24 (collecting lens 26 described later).
  • the optical axis of the second lens element 252 is shifted in the -Y direction with respect to the optical axis of the light receiving optical system 24 (condenser lens 26 described later). That is, the optical axis of the second lens element 252 is shifted in the direction opposite to that of the first lens element 251.
  • the light receiving optical system 24 also has a condenser lens 26 .
  • the condenser lens 26 is a lens arranged between the light receiving sensor 22 and the coupling lens 25 . Since the light receiving optical system 24 has the condenser lens 26 , the light receiving sensor 22 can receive the reflected light from the wide measurement area 50 . Further, since the light receiving optical system 24 has the condenser lens 26 , an image of the rectangular measurement area 50 can be formed on the rectangular light receiving surface of the light receiving sensor 22 . The distance between the condenser lens 26 and the light receiving sensor 22 is shorter than the focal length of the condenser lens 26 .
  • FIGS. 16A and 16B are explanatory diagrams of the measurement area 50.
  • FIG. 16A is an explanatory diagram of how the light receiving sensor 22 receives reflected light from the first measurement area 51 via the first lens element 251.
  • FIG. 16B is an explanatory diagram of how the light receiving sensor 22 receives reflected light from the second measurement area 52 via the second lens element 252 .
  • the measurement area 50 is composed of a first measurement area 51 and a second measurement area 52 .
  • the first measurement area 51 is an area from which the light receiving sensor 22 receives reflected light via the first lens element 251 (in other words, the first lens element 251 directs the reflected light from the first measurement area 51 to the light receiving sensor 22).
  • the second measurement area 52 is an area from which the light receiving sensor 22 receives reflected light via the second lens element 252 (in other words, the second lens element 252 directs the reflected light from the second measurement area 52 to the light receiving sensor 22). Since the first lens element 251 and the second lens element 252 are arranged side by side in the Y direction, the first measurement area 51 and the second measurement area 52 are shifted from each other in the Y direction. This makes it possible to set the measurement area 50 long in the Y direction (in other words, to receive reflected light over a wide range in the Y direction).
  • By providing the overlapping area 53, it is possible to prevent the formation of an area between the first measurement area 51 and the second measurement area 52 from which reflected light cannot be received.
  • FIGS. 17A and 17B are explanatory diagrams showing the relationship between the pixels 221 (light receiving areas) of the light receiving sensor 22 and the measurement area 50.
  • FIG. 17A is an explanatory diagram showing the relationship between the pixels 221 of the light receiving sensor 22 and the measurement area 50 when receiving reflected light from the first measurement area 51 via the first lens element 251.
  • FIG. 17B is an explanatory diagram showing the relationship between the pixels 221 of the light receiving sensor 22 and the measurement area 50 when receiving the reflected light from the second measurement area 52 via the second lens element 252 .
  • a region of the corresponding measurement area 50 is shown on the left side of each pixel 221 of the light receiving sensor 22 in the drawing.
  • Each pixel 221 of the light receiving sensor 22 receives reflected light from the corresponding region of the measurement area 50 .
  • the light-receiving sensor 22 has a plurality of pixels 221 (light-receiving regions) arranged in the Y direction, and here, 12 pixels (pixels #1 to #12) are arranged in the Y direction.
  • the number of pixels 221 arranged in the Y direction is not limited to 12 (for example, in the case of the VGA light receiving sensor 22, 640 pixels are arranged in the Y direction).
  • the measurement area 50 is divided into a plurality of regions (regions A to P) in the Y direction, and is divided into 16 regions here. Note that the number of regions of the measurement area 50 divided in the Y direction is not limited to 16.
  • the first measurement area 51 corresponds to regions A to L.
  • Pixels #1 to #12 of the light receiving sensor 22 correspond to the regions L to A of the first measurement area 51, respectively.
  • the pixel #5 of the light receiving sensor 22 receives reflected light from the region H of the measurement area 50 via the first lens element 251 .
  • the second measurement area 52 corresponds to regions E to P.
  • Pixels #1 to #12 of the light receiving sensor 22 correspond to the regions P to E of the second measurement area 52, respectively.
  • pixel #5 of the light receiving sensor 22 receives reflected light from the area L of the measurement area 50 via the second lens element 252 .
  • a certain pixel (for example, pixel #5) of the light receiving sensor 22 can receive reflected light from two regions of the measurement area 50 (for example, region H and region L).
  • Since a certain pixel (for example, pixel #5) of the light receiving sensor 22 can receive reflected light from two positions (for example, region H and region L) of the measurement area 50, the position from which reflected light is received via the first lens element 251 (for example, region H) may be called a "first position", and the position from which reflected light is received via the second lens element 252 (for example, region L) may be called a "second position".
  • the control unit 30 is configured to be able to control the position on the measurement area 50 irradiated with light by controlling the irradiation unit 10 . Then, the controller 30 controls the irradiator 10 so as not to irradiate the first position and the second position with light at the same time.
  • Since the first position and the second position are not irradiated with light at the same time, when the region H is irradiated, the control unit 30 can treat the light received by the pixel #5 as reflected light from the region H rather than from the region L, and can calculate the distance to the object 90 in the region H based on the signal of the light receiving element of the pixel #5.
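The disambiguation described above can be sketched as follows (illustrative Python; the names are hypothetical): because the first and second positions are never lit simultaneously, a detection on a pixel can be attributed to whichever of its two candidate regions is currently illuminated.

```python
def first_position(pixel: int) -> str:
    # pixels #1..#12 correspond to regions L..A via the first lens element 251
    return chr(ord('A') + 12 - pixel)

def second_position(pixel: int) -> str:
    # pixels #1..#12 correspond to regions P..E via the second lens element 252
    return chr(ord('A') + 16 - pixel)

def attribute_detection(pixel, lit_region):
    """Attribute a detection on `pixel` to its first or second position.
    Valid only because the control keeps the two positions from being lit
    at the same time."""
    if lit_region in (first_position(pixel), second_position(pixel)):
        return lit_region
    return None  # the lit region is not visible to this pixel

# Pixel #5 sees region H (first position) and region L (second position).
print(attribute_detection(5, 'H'), attribute_detection(5, 'L'))  # H L
```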
  • the overlapping area 53 corresponds to areas E to L.
  • the pixels #8 to #1 of the light receiving sensor 22 receive reflected light from the areas E to L, which are the overlapping area 53, via the first lens element 251.
  • pixels #12 to #5 of the light receiving sensor 22 receive reflected light from areas E to L, which are the overlapping area 53, via the second lens element 252.
  • Reflected light from a certain region (for example, region H) of the overlapping area 53 passes through the first lens element 251 and the second lens element 252 of the coupling lens 25 and can be received by two pixels (for example, pixel #5 and pixel #9) of the light receiving sensor 22.
  • the measurement area 50 when the measurement area 50 is viewed from the Z direction, the measurement area 50 has predetermined angles of view in the X and Y directions.
  • the ratio of the length in the Y direction to the length in the X direction (so-called aspect ratio) of the measurement area 50 is greater than the ratio of the length in the Y direction to the length in the X direction (so-called aspect ratio) of the light receiving sensor 22. That is, in the first embodiment, the measurement area 50 can be set longer in the Y direction than the shape of the light receiving sensor 22 (in other words, reflected light over a wide range in the Y direction can be received).
  • the area of the light receiving surface of the light receiving sensor 22 that receives the reflected light from the overlapping area 53 via the first lens element 251 is called a "first area 22A”.
  • a region of the light receiving surface of the light receiving sensor 22 that receives reflected light from the overlapping area 53 via the second lens element 252 is called a "second region 22B". Reflected light from a certain point on the overlapping area 53 can be received by the pixels 221 on the first region 22A and the pixels 221 on the second region 22B of the light receiving sensor 22, respectively.
  • the overlapping area 53 (hatched area) is located in the center of the measurement area 50 in the Y direction. That is, as shown in FIG. 3B, reflected light from the central portion (overlapping area 53) of the measurement area 50 can be received by the pixels 221 on the first region 22A of the light receiving sensor 22 via the first lens element 251, and by the pixels 221 on the second region 22B of the light receiving sensor 22 via the second lens element 252.
  • When the measuring device 1 is mounted on a vehicle, it is desirable to measure the distance with a wide field of view at relatively close range.
  • the aspect ratio of the measurement area 50 can be increased, which is advantageous for distance measurement in a wide field of view.
  • On the other hand, the area over which long-distance measurement is required may be a relatively narrow range. At long distances, however, the intensity of the reflected light is assumed to become weak.
  • Since the reflected light from the overlapping area 53 of the measurement area 50 is received by two pixels 221 of the light receiving sensor 22 (one pixel in each of the first region 22A and the second region 22B), this arrangement is advantageous for measuring long distances.
  • FIGS. 18A and 18B are explanatory diagrams of the optical conditions of the light receiving section 20.
  • FIG. 18A is an explanatory diagram of the relationship between the light receiving sensor 22 and the condenser lens 26.
  • FIG. 18B is an explanatory diagram of the relationship between the virtual image 22′ of the light receiving sensor 22 and the coupling lens 25.
  • Let the focal length of the condenser lens 26 be fB1.
  • Let LB1 be the distance from the principal point of the condenser lens 26 to the light receiving sensor 22.
  • Let the half length of the light receiving sensor 22 in the Y direction be yB (the length from the optical axis of the light receiving optical system 24 to the end of the light receiving sensor 22 is yB).
  • the distance LB1 from the principal point of the condenser lens 26 to the light receiving sensor 22 is smaller than the focal length fB1 of the condenser lens 26 (LB1 < fB1). Since the light receiving sensor 22 is arranged closer to the condenser lens 26 than its focal point, the virtual image 22′ of the light receiving sensor 22 (the image of the light receiving sensor 22 formed by the condenser lens 26) is located, as viewed from the condenser lens 26, beyond the light receiving sensor 22 (on the left side of the light receiving sensor 22 in the drawing).
  • Let the focal length of the first lens element 251 and the second lens element 252 be fB2.
  • Let LB2 be the distance from the principal point of the coupling lens 25 (the principal point of the first lens element 251 or the second lens element 252) to the virtual image 22′.
  • the relationship between the focal length fB2 and the distance LB2 is expressed by the following equation (B4):
  • LB2 ≒ fB2 … (B4)
  • the distance between the principal point of the condensing lens 26 and the principal point of the coupling lens 25 is assumed to be dB.
  • the distance LB2 corresponds to a value obtained by adding the distance dB to the distance LB′ (the distance from the principal point of the condenser lens 26 to the virtual image 22′).
  • Let tB be the distance between the optical axis of the light receiving optical system 24 (condenser lens 26) and the optical axis of the first lens element 251 (or the second lens element 252); this distance is referred to as the "shift amount tB".
  • In order to provide the overlapping area 53, the first measurement area 51 and the second measurement area 52 need to overlap. Therefore, the shift amount tB must be smaller than the length yB′ from the optical axis of the light receiving optical system 24 to the edge of the virtual image 22′ of the light receiving sensor 22 (tB < yB′).
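The optical conditions above can be checked numerically with the thin-lens formula; the sketch below assumes an ideal thin condenser lens, and the numbers are illustrative rather than taken from the embodiment.

```python
def virtual_image(fB1, LB1, yB):
    """Thin-lens sketch: with the sensor inside the focal length (LB1 < fB1),
    the condenser lens 26 forms an enlarged virtual image 22' of the sensor."""
    assert LB1 < fB1
    LB_prime = fB1 * LB1 / (fB1 - LB1)   # distance from condenser lens to virtual image
    magnification = LB_prime / LB1       # = fB1 / (fB1 - LB1) > 1
    return LB_prime, magnification * yB  # LB', yB'

# Illustrative numbers (not from the embodiment):
fB1, LB1, yB, dB, tB = 20.0, 15.0, 3.0, 5.0, 8.0
LB_prime, yB_prime = virtual_image(fB1, LB1, yB)
LB2 = LB_prime + dB   # distance from the coupling-lens principal point to 22'
assert tB < yB_prime  # the shift amount must stay within the virtual image (tB < yB')
print(LB_prime, yB_prime, LB2)  # 60.0 12.0 65.0
```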
  • FIGS. 19A to 19C are explanatory diagrams of an example of the measurement method.
  • the corresponding light emitting area of the light source 12 is shown on the right side of each area of the measurement area 50 in the drawing.
  • a region of the corresponding measurement area 50 is shown on the left side of each pixel 221 of the light receiving sensor 22 in the drawing.
  • the light source 12 is divided into a plurality of light-emitting regions in the Y direction, here 16 light-emitting regions (light-emitting regions #1 to #16).
  • the number of light emitting regions of the light source 12 divided in the Y direction is not limited to 16.
  • Each light emitting region (light emitting regions #1 to #16) of the light source 12 corresponds to each region of the measurement area 50 (region P to region A).
  • the control unit 30 controls the position on the measurement area 50 irradiated with light by causing a specific light emitting region out of the plurality of light emitting regions of the light source 12 to emit light.
  • the control unit 30 controls the light source 12 so as to emit light sequentially from the lowermost light emitting region #16 in the figure.
  • As a result, the regions A to P of the measurement area 50 are scanned with light in order.
  • Since the light irradiated to the measurement area 50 is scanned in the Y direction, the two positions (first position and second position) from which a certain pixel of the light receiving sensor 22 can receive reflected light are prevented from being irradiated with light at the same time.
  • the control unit 30 does not have to cause the light emitting regions arranged in the Y direction to emit light in order.
  • the control unit 30 may control the position on the measurement area 50 to which light is irradiated by causing the light emitting regions to emit light in random order. As long as light is not simultaneously applied to the two positions (first position and second position) from which a certain pixel of the light receiving sensor 22 can receive light, light may be applied to a plurality of regions of the measurement area 50 at the same time. Further, the regions A to P of the measurement area 50 may be scanned with light by rotating a mirror.
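The constraint described above (no two simultaneously lit regions may map to the same pixel) can be sketched as a simple check; indices 0 to 15 stand for regions A to P, and the function names are illustrative:

```python
def pixel_sees(pixel):
    """Indices (0..15 for regions A..P) of the two positions a pixel can see:
    one via the first lens element, one via the second lens element."""
    return {12 - pixel, 16 - pixel}

def simultaneous_ok(lit_regions):
    """True if no pixel #1..#12 would see two of the simultaneously lit regions."""
    lit = set(lit_regions)
    return all(len(pixel_sees(p) & lit) <= 1 for p in range(1, 13))

print(simultaneous_ok([0, 1]))   # regions A and B may be lit together: True
print(simultaneous_ok([7, 11]))  # regions H and L both reach pixel #5: False
```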
  • the timing chart of measurement example 3 is the same as the timing chart of measurement example 1 in FIG. 8. However, in measurement example 1, light is emitted from the entire light emitting surface of the light source 12, while in measurement example 3, a specific light emitting region out of the plurality of light emitting regions of the light source 12 is caused to emit light.
  • the timing (emission timing) at which the light emitting region of the light source 12 emits pulsed light is shown in the upper part of FIG. 8.
  • the center of FIG. 8 shows the timing (arrival timing) at which the pulsed reflected light arrives.
  • Output signals of the pixels 221 are shown in the lower part of FIG.
  • the control unit 30 causes the light source 12 to emit pulsed light at a predetermined cycle. After the pulsed light is emitted, the distance measurement section 36 (signal processing section 362) of the control section 30 detects the arrival timing of the reflected light based on the output signal of the pixel 221 (light receiving element). For example, the signal processing unit 362 detects the arrival timing of reflected light based on the peak timing of the output signal of each pixel 221 .
  • the distance measurement unit 36 (time detection unit 364) of the control unit 30 detects the time Tf from when the light is emitted to when the reflected light arrives, based on the light emission timing and the light arrival timing.
  • the time Tf corresponds to the time it takes the light to make a round trip between the measurement device 1 and the object 90 .
  • the distance measurement unit 36 calculates the distance L to the object 90 based on the time Tf.
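The relation above (the distance L follows from the round-trip time Tf) reduces to a one-line direct time-of-flight calculation; a minimal sketch:

```python
C = 299_792_458.0  # speed of light [m/s]

def distance_from_round_trip(Tf: float) -> float:
    """Distance L to the object 90 from the round-trip time Tf: L = c * Tf / 2."""
    return C * Tf / 2.0

# An echo arriving 200 ns after emission corresponds to roughly 30 m.
print(distance_from_round_trip(200e-9))
```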
  • the light emitted from the light emitting region #16 of the light source 12 is irradiated to the region A of the measurement area 50 (first measurement area 51) via the light projecting optical system 14.
  • the pixel #12 of the light receiving sensor 22 receives the reflected light from the area A via the light receiving optical system 24 (first lens element 251).
  • the control unit 30 calculates the distance to the object 90 in the area A based on the output signal of the pixel #12 of the light receiving sensor 22.
  • After the light emission of the light emitting region #16 (in other words, after the measurement of the region A), the control unit 30 causes the light emitting regions #15 to #13 to emit light in order, irradiates the regions B to D of the first measurement area 51 with light in order, and calculates the distances to the object 90 in the regions B to D in order based on the output signals of the pixels #11 to #9 of the light receiving sensor 22.
  • the light emitted from the light emitting region #12 of the light source 12 is irradiated onto the region E of the measurement area 50 via the light projecting optical system 14. Since the region E belongs to the overlapping area 53, the reflected light from the region E passes through the first lens element 251 and the second lens element 252 of the coupling lens 25 and can be received by the pixels #8 and #12 of the light receiving sensor 22.
  • the control unit 30 calculates a distance based on the output signal of the pixel #8 of the light receiving sensor 22, calculates a distance based on the output signal of the pixel #12 of the light receiving sensor 22, and calculates the distance to the object 90 in the region E by averaging the two calculated distances. This makes it possible to improve the accuracy of the distance calculation and to improve robustness.
  • Alternatively, the control unit 30 may calculate a distance based on the output signal of the pixel #8 of the light receiving sensor 22, calculate a distance based on the output signal of the pixel #12 of the light receiving sensor 22, and detect an abnormality when the difference between the two calculated distances is greater than a predetermined value. Note that the control unit 30 may calculate the distance to the object 90 in the region E based on the output signal of one of the pixels #8 and #12 of the light receiving sensor 22.
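The averaging and abnormality detection described above might be sketched as follows. This is illustrative only: the `max_diff` threshold is an assumption, since the embodiment only speaks of "a predetermined value".

```python
def fuse_overlap_distances(d_a, d_b, max_diff):
    """Average the two distances obtained from the two pixels (e.g. #8 and #12)
    observing the same overlapping-area region; flag an abnormality when the
    two results differ by more than `max_diff` (hypothetical threshold)."""
    abnormal = abs(d_a - d_b) > max_diff
    return (d_a + d_b) / 2.0, abnormal

# Two consistent measurements are averaged; divergent ones raise a flag.
print(fuse_overlap_distances(10.0, 10.2, max_diff=0.5))
print(fuse_overlap_distances(10.0, 11.0, max_diff=0.5))
```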
  • After the light emission of the light emitting region #12 (in other words, after the measurement of the region E), the control unit 30 causes the light emitting regions #11 to #5 to emit light in order, irradiates the regions F to L of the overlapping area 53 with light in order, and calculates the distances to the object 90 in the regions F to L in order based on the output signals of the two pixels that received the reflected light.
  • the control unit 30 calculates the distance to the object 90 within the area M based on the output signal of the pixel #4 of the light receiving sensor 22 .
  • the control unit 30 sequentially causes the light emitting regions #3 to #1 to emit light, and sequentially irradiates the regions N to P of the second measurement area 52 with light. Then, based on the output signals of the pixels #3 to #1 of the light receiving sensor 22, the distances to the object 90 in the regions N to P are calculated in order.
  • In this manner, it is possible to calculate the distance to the object 90 within each of the 16 divided regions (regions A to P) of the measurement area 50.
  • the light receiving sensor 22 can receive reflected light over a wide range in the Y direction.
  • the light emitted from the light emitting region #16 of the light source 12 is irradiated onto the region A of the measurement area 50 (first measurement area 51) via the light projecting optical system 14.
  • When the pixel #12 of the light receiving sensor 22 receives the reflected light from the area A via the light receiving optical system 24 (the first lens element 251), the distance to the object 90 is calculated by the method described above.
  • Next, a case will be described where the light emitted from the light emitting region #12 of the light source 12 is irradiated onto the region E of the overlapping area 53 via the light projecting optical system 14, and the reflected light from the region E is received by the pixel #8 and the pixel #12 of the light receiving sensor 22 via the first lens element 251 and the second lens element 252 of the light receiving optical system 24.
  • FIG. 20 is an explanatory diagram of the signal processing unit 362 when measuring the overlapping area 53.
  • the signal processor 362 has an adder 362A, a comparator 362B, a histogram generator 362C, and a histogram synthesizer 362D. Since the adding section 362A, the comparing section 362B, and the histogram generating section 362C have already been described, the description thereof will be omitted.
  • the histogram synthesizing unit 362D synthesizes the two histograms (single histograms) generated by the two histogram generating units 362C into one histogram (composite histogram).
  • the histogram synthesizing unit 362D generates a composite histogram by synthesizing the respective histograms (single histograms) based on the two pixels associated with a region of the overlapping area 53.
  • For example, for the region E, the histogram synthesizing unit 362D generates a composite histogram by synthesizing the two single histograms based on the pixel #8 and the pixel #12 of the light receiving sensor 22.
  • Since a composite histogram is generated by combining two single histograms, a histogram with the predetermined number of accumulations can be generated even if the number of accumulations of each single histogram is half of the predetermined number. Therefore, in the measurement of the overlapping area 53, a histogram of the set number of integrations can be generated at high speed, so distance measurement can be performed at high speed. Also, as shown in FIG. 3C, the intensity of the reflected light from the overlapping area 53 is assumed to be weak; even in this case, by generating a composite histogram (combining two single histograms) and measuring the distance to the object 90 based on the composite histogram, deterioration of the distance measurement accuracy can be suppressed.
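The synthesis of two single histograms into one composite histogram amounts to a bin-wise sum; a minimal sketch with illustrative data:

```python
def synthesize(hist_a, hist_b):
    """Bin-wise sum of two single histograms into one composite histogram,
    as done by the histogram synthesizing unit 362D."""
    return [a + b for a, b in zip(hist_a, hist_b)]

# Two single histograms (e.g. pixel #8 and pixel #12 for region E) whose common
# peak at bin 3 stands out more clearly after synthesis.
h8 = [1, 0, 2, 5, 1, 0]
h12 = [0, 1, 1, 6, 0, 1]
combined = synthesize(h8, h12)
peak_bin = max(range(len(combined)), key=combined.__getitem__)
print(combined, peak_bin)  # [1, 1, 3, 11, 1, 1] 3
```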
  • FIG. 21 is an explanatory diagram of another signal processing section 362 when measuring the overlapping area 53.
  • the signal processor 362 has an adder 362A, a comparator 362B, and a histogram generator 362C.
  • the adder 362A adds the output signals of the plurality of light receiving elements 222 (SPAD) forming the two pixels 221 (here, pixel #8 and pixel #12) associated with the overlapping area 53.
  • Compared to the adder 362A shown in FIG. 9B, the adder 362A shown in FIG. 21 therefore adds twice as many output signals of the light receiving elements 222 (SPAD).
  • the comparator 362B compares the output signal of the adder 362A with a threshold, and outputs a signal when the output signal of the adder 362A is greater than or equal to the threshold.
  • the setting unit 32 sets the threshold of the comparing unit 362B used when measuring the overlapping area 53 (see FIG. 21) to a value larger than the threshold of the comparing unit 362B used when measuring the measurement area 50 other than the overlapping area 53 (see FIG. 9B).
  • This increases the probability that, when the output signal of the adder 362A is greater than or equal to the threshold, the light receiving element 222 (SPAD) of the light receiving sensor 22 has actually detected reflected light. Since the number of output signals of the light receiving elements 222 (SPAD) added by the adder 362A is large, even if the threshold is set to a large value, the frequency with which the comparator 362B outputs a signal to the histogram generator 362C is not reduced (generation of the histogram with the predetermined number of accumulations is not delayed). Therefore, it is allowed to set the threshold value of the comparison section 362B shown in FIG. 21 to a large value.
  • Based on the output of the comparison unit 362B, the histogram generation unit 362C repeatedly measures the time at which the light receiving element 222 (SPAD) of the light receiving sensor 22 detects light, and increments the frequency (count) associated with that time to generate a histogram. Note that, compared to the signal processing unit 362 shown in FIG. 20, the signal processing unit 362 shown in FIG. 21 requires fewer histogram generation units 362C, so the circuit scale can be reduced. In addition, as shown in FIG. 3C, the intensity of the reflected light from the overlapping area 53 is assumed to be weak; since the reflected light is received by two pixels and their output signals are added, deterioration of distance measurement accuracy can be suppressed even if the intensity of the reflected light is weak.
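One time bin of the FIG. 21-style pipeline (adder over the SPADs of both pixels, comparator with the raised threshold, histogram increment) might be sketched as follows; the function shape is illustrative, not the embodiment's circuit:

```python
def process_time_bin(spads_px8, spads_px12, threshold, histogram, t_bin):
    """Sum the SPAD outputs of both pixels mapped to the overlapping area
    (adder 362A), compare the sum with the raised threshold (comparator 362B),
    and increment the histogram bin for that time (histogram generator 362C)."""
    if sum(spads_px8) + sum(spads_px12) >= threshold:
        histogram[t_bin] += 1
    return histogram

hist = [0] * 4
# 3 + 3 = 6 SPAD firings across the two pixels clears a threshold of 5.
process_time_bin([1, 0, 1, 1], [1, 1, 0, 1], threshold=5, histogram=hist, t_bin=2)
print(hist)  # [0, 0, 1, 0]
```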
  • the measuring apparatus 1 described above includes a light source 12, a light projecting optical system 14 that irradiates a measurement area 50 with light from the light source 12, and a light receiving section 20 that receives reflected light from the measurement area 50.
  • the light projecting optical system 14 has a coupling lens 15 in which the first lens element 151 and the second lens element 152 are connected.
  • the optical axis of the first lens element 151 is shifted in the +Y direction (corresponding to the first direction), and the optical axis of the second lens element 152 is shifted in the -Y direction (the direction opposite to that in which the optical axis of the first lens element 151 is shifted).
  • Light from the light source 12 irradiates the first measurement area 51 through the first lens element 151 and irradiates the second measurement area 52, which includes the overlapping area 53, through the second lens element 152.
  • a wide range in the Y direction (corresponding to the first direction) can be irradiated with light.
  • since the overlapping area 53 is provided, it is possible to prevent the formation of an area between the first measurement area 51 and the second measurement area 52 that cannot be irradiated with light.
  • the ratio of the length in the Y direction (corresponding to the first direction) to the length in the X direction (corresponding to the second direction) of the measurement area 50 is greater than the ratio of the length in the Y direction to the length in the X direction of the light source 12 (see FIGS. 3A and 3B).
  • the angle of view of the measurement area 50 in the Y direction can be widened.
  • the first region 121 and the second region 122 of the light source 12 are caused to emit light at the same time, so that the light projecting optical system 14 irradiates the overlapping area 53 with the light of the first region 121 of the light source 12 through the first lens element 151 and with the light of the second region 122 of the light source 12 through the second lens element 152. This makes it possible to increase the intensity of the light that irradiates the overlapping area 53 (the hatched area in FIG. 3B) of the measurement area 50.
  • the light projecting optical system 14 has a projection lens 16 between the light source 12 and the coupling lens 15. Having the projection lens 16 makes it possible to project the light from the light source 12 over a wide range. However, the light projecting optical system 14 may omit the projection lens 16 (in this case, a light source 12 as large as the virtual image 12' in FIG. 6B is required, so the light source 12 and the measuring device 1 become larger).
  • when the focal length of the projection lens 16 is fA1, the distance from the principal point of the projection lens 16 to the light source 12 is LA1, the width of the light source 12 in the Y direction (corresponding to the first direction) is yA, and the distance (shift amount) in the Y direction between the optical axis of the light projecting optical system 14 and the optical axis of the first lens element 151 is tA, it is desirable that tA ≤ yA × fA1/(fA1 − LA1).
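The condition above can be checked numerically. A minimal sketch, under the assumption that the quantity yA × fA1/(fA1 − LA1) corresponds to the width of the virtual image 12' of the light source formed by the projection lens; all numeric values below are hypothetical, not taken from the application.

```python
# Illustrative numeric check of the condition tA <= yA * fA1 / (fA1 - LA1).
# All values below are hypothetical examples, not from the application.

fA1 = 10.0   # focal length of the projection lens 16 (mm)
LA1 = 8.0    # distance from the principal point of lens 16 to the light source 12 (mm)
yA  = 2.0    # width of the light source 12 in the Y direction (mm)

# Width of the (magnified) virtual image 12' of the light source, which
# appears because the light source sits inside the focal length (LA1 < fA1):
virtual_image_width = yA * fA1 / (fA1 - LA1)

tA = 6.0     # Y-direction shift of the first lens element's optical axis (mm)
print(virtual_image_width)        # 10.0
print(tA <= virtual_image_width)  # True: the overlapping area 53 can be formed
```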
  • the control unit 30 of the measuring device 1 described above calculates the distance to the object 90 from which the reflected light is reflected, based on the arrival time from when the light source 12 emits light until when the reflected light is received.
  • the measuring device 1 is not limited to measuring the distance to the object 90. For example, the measuring device 1 may measure the arrival time from when the light source 12 emits light until the reflected light is received, and the measured arrival time may be used for measurements other than distance.
  • the control unit 30 of the measuring device 1 described above generates a histogram by repeatedly measuring the time that the light receiving element 222 of the light receiving unit 20 detects light, and detects the light arrival time Tf based on the peak of the histogram.
  • when detecting the arrival time Tf of light using the histogram, it is particularly effective to increase the intensity of light in the overlapping area 53 of the measurement area 50 by using the coupling lens 15 of the first embodiment.
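The histogram-peak detection and the subsequent TOF distance calculation can be sketched as follows. The bin width, histogram contents, and function name are hypothetical, and a real implementation would also have to handle ambient-light noise.

```python
# Hypothetical sketch: detect the arrival time Tf from the histogram peak and
# convert it to a distance (TOF). Bin width and counts are illustrative.

C = 299_792_458.0    # speed of light (m/s)
BIN_WIDTH = 1e-9     # histogram time-bin width: 1 ns (assumed)

def arrival_time_and_distance(histogram):
    """histogram: mapping {time-bin index: frequency}. The peak bin gives Tf."""
    peak_bin = max(histogram, key=histogram.get)
    tf = peak_bin * BIN_WIDTH     # light arrival time Tf
    distance = C * tf / 2.0       # out-and-back path -> divide by 2
    return tf, distance

hist = {10: 2, 11: 3, 12: 9, 13: 4, 14: 1}   # peak at bin 12 (12 ns)
tf, d = arrival_time_and_distance(hist)
print(tf, round(d, 3))  # 1.2e-08 s -> about 1.799 m
```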
  • the measurement apparatus 1 described above includes an irradiation section 10 that irradiates a measurement area 50 with light, a light receiving sensor 22, and a light receiving optical system 24 that causes the light receiving sensor 22 to receive the reflected light from the measurement area 50.
  • the light receiving optical system 24 has a coupling lens 25 in which the first lens element 251 and the second lens element 252 are connected.
  • the optical axis of the first lens element 251 is shifted in the +Y direction (corresponding to the first direction), and the optical axis of the second lens element 252 is shifted in the -Y direction (the direction opposite to that in which the optical axis of the first lens element 251 is shifted).
  • Reflected light from the first measurement area 51 of the measurement area 50 is received by the light receiving element of the light receiving sensor 22 via the first lens element 251.
  • Reflected light from the second measurement area 52, which includes the overlapping area 53, is received by the light receiving element of the light receiving sensor 22 via the second lens element 252.
  • since the overlapping area 53 is provided, it is possible to prevent the formation of an area between the first measurement area 51 and the second measurement area 52 from which reflected light cannot be received.
  • the control unit 30 described above controls the position on the measurement area 50 that is irradiated with light. As a result, it is possible either to irradiate or not to irradiate a predetermined position on the measurement area 50 with light.
  • a certain pixel of the light receiving sensor 22 can receive reflected light from two positions (a first position and a second position; for example, region H and region L) of the measurement area 50.
  • the control unit 30 controls the irradiation unit 10 so as not to simultaneously irradiate with light the first position (for example, region H) of the first measurement area 51 and the second position (for example, region L) of the second measurement area 52 from which that pixel (for example, pixel #5) can receive light. This makes it possible to determine from which of the two positions on the measurement area 50 the reflected light originated when the light receiving element of that pixel receives reflected light.
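The time-division control described above (never illuminating, in the same emission cycle, two positions that share a pixel) can be sketched as follows; the pixel-to-region mapping and the scheduling strategy are hypothetical.

```python
# Hypothetical sketch of the time-division control described above: the two
# positions that share a pixel (e.g. region H and region L for pixel #5) are
# never illuminated in the same emission cycle, so a detection in that pixel
# can be attributed unambiguously. The region/pixel mapping is made up.

shared = {"#5": ("H", "L")}   # pixel -> (first position, second position)

def schedule(regions_to_measure):
    """Split the requested regions into emission cycles so that no two
    regions sharing a pixel are illuminated at the same time."""
    cycles = []
    for region in regions_to_measure:
        placed = False
        for cycle in cycles:
            conflict = any(
                {region, other} == set(pair)
                for other in cycle
                for pair in shared.values()
            )
            if not conflict:
                cycle.append(region)
                placed = True
                break
        if not placed:
            cycles.append([region])
    return cycles

print(schedule(["H", "L", "A"]))  # H and L end up in different cycles
```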
  • the reflected light from region E (a predetermined position) of the overlapping area 53 can be received by the light receiving element (first light receiving element) of pixel #8 of the light receiving sensor 22 via the first lens element 251, and can also be received by the light receiving element (second light receiving element) of pixel #12 of the light receiving sensor 22 via the second lens element 252.
  • Robustness can be improved by enabling two light receiving elements to receive reflected light from the same position in this way.
  • the control unit 30 acquires the output signal (first light receiving result) of the light receiving element of pixel #8 of the light receiving sensor 22 and the output signal (second light receiving result) of the light receiving element of pixel #12. Robustness can be improved by acquiring the light receiving results of the two light receiving elements that receive the reflected light from the same position in this way.
  • the control unit 30 calculates the distance to the object 90 in region E based on the output signal (first light receiving result) of the light receiving element of pixel #8 of the light receiving sensor 22, and also calculates the distance to the object 90 in the same region E based on the output signal (second light receiving result) of the light receiving element of pixel #12. Robustness can be improved by calculating the distance based on the light receiving results of the two light receiving elements that received the reflected light from the same position.
  • the control unit 30 may detect an anomaly based on the distance calculated from the output signal (first light receiving result) of pixel #8 of the light receiving sensor 22 and the distance calculated from the output signal (second light receiving result) of pixel #12. This makes it possible to detect a malfunction of the measuring device 1.
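A minimal sketch of such an anomaly check, assuming a simple tolerance on the difference between the two redundant distances (the tolerance value and function name are hypothetical):

```python
# Hypothetical sketch of the anomaly check described above: the distances
# computed from pixel #8 and pixel #12 for the same region E should agree;
# a large discrepancy suggests a malfunction. The tolerance is made up.

def check_anomaly(distance_px8, distance_px12, tolerance=0.05):
    """Return True if the two redundant distance measurements (in meters)
    disagree by more than the tolerance."""
    return abs(distance_px8 - distance_px12) > tolerance

print(check_anomaly(10.00, 10.02))  # False: measurements agree
print(check_anomaly(10.00, 11.50))  # True: possible malfunction
```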
  • the control unit 30 may generate a single histogram (first histogram) by repeatedly measuring the time at which the light receiving element (first light receiving element) of pixel #8 of the light receiving sensor 22 detects light, generate another single histogram (second histogram) by repeatedly measuring the time at which the light receiving element (second light receiving element) of pixel #12 detects light, and generate a composite histogram (third histogram) by combining the two single histograms.
  • the control unit 30 can detect the light arrival time Tf (see FIG. 9C) based on the peak of the composite histogram (third histogram), and can calculate the distance to the object based on the detected arrival time Tf.
  • alternatively, the control unit 30 may generate a single histogram by repeatedly measuring both the time at which the light receiving element (first light receiving element) of pixel #8 of the light receiving sensor 22 detects light and the time at which the light receiving element (second light receiving element) of pixel #12 detects light. As a result, the capacity of the memory for storing the histogram data and the amount of calculation for generating the histogram can be reduced. In this case as well, the control unit 30 can detect the light arrival time Tf (see FIG. 9C) based on the peak of the histogram and calculate the distance to the object based on the detected arrival time Tf.
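The two histogram strategies described above can be contrasted in a short sketch (the event data are hypothetical): (a) building two single histograms and combining them into a composite histogram, and (b) accumulating both light receiving elements into one shared histogram, which halves the histogram memory.

```python
# Hypothetical sketch contrasting the two approaches described above.
# Event lists are illustrative time-bin indices of SPAD detections.

px8_events  = [12, 12, 13, 12]   # detection times of pixel #8  (first element)
px12_events = [12, 11, 12, 12]   # detection times of pixel #12 (second element)

def build(events):
    """Accumulate detection times into a {time bin: frequency} histogram."""
    h = {}
    for t in events:
        h[t] = h.get(t, 0) + 1
    return h

# (a) two single histograms combined into a composite (third) histogram
h1, h2 = build(px8_events), build(px12_events)
composite = {t: h1.get(t, 0) + h2.get(t, 0) for t in set(h1) | set(h2)}

# (b) both elements accumulated into a single shared histogram
shared = build(px8_events + px12_events)

print(composite == shared)  # True: same peak, but (b) stores one histogram
```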
  • the light receiving optical system 24 described above has a condenser lens 26 between the light receiving sensor 22 and the coupling lens 25. Since the light receiving optical system 24 has the condenser lens 26, the light receiving sensor 22 can receive reflected light from a wide range. However, the light receiving optical system 24 may omit the condenser lens 26 (in this case, a light receiving sensor 22 with a light receiving surface as large as the virtual image 22' in FIG. 18B is required, so the light receiving sensor 22 and the measuring device 1 become larger).
  • when the focal length of the condenser lens 26 is fB1, the distance from the principal point of the condenser lens 26 to the light receiving sensor 22 is LB1, the width of the light receiving sensor 22 in the Y direction (corresponding to the first direction) is yB, and the distance (shift amount) in the Y direction between the optical axis of the light receiving optical system 24 and the optical axis of the first lens element 251 is tB, it is desirable that tB ≤ yB × fB1/(fB1 − LB1). Thereby, the overlapping area 53 can be provided in the measurement area 50.
  • the present disclosure is not limited to the above embodiments, and includes various modifications.
  • the above-described embodiments are described in detail in order to explain the present disclosure in an easy-to-understand manner, and the present disclosure is not necessarily limited to embodiments having all of the described configurations.


Abstract

A measurement device (1) comprises: a light source (12); a light-projecting optical system (14) that irradiates a measurement area (50) with light from the light source (12); and a light-receiving unit (20) that receives light reflected from the measurement area (50). The light-projecting optical system (14): has a connected lens (15) formed by connecting a first lens element (151) having an optical axis shifted in a first direction and a second lens element (152) having an optical axis shifted in the opposite direction to the first lens element (151); irradiates a first measurement area (51) with the light from the light source (12) through the first lens element (151); and irradiates a second measurement area (52), which includes an overlapping area (53) overlapping the first measurement area (51), with the light from the light source (12) through the second lens element (152).

Description

Measuring device
The present disclosure relates to a measuring device.
Patent Document 1 describes a distance measuring device that measures the distance to a reflecting object based on the time of flight of light from when pulsed light is emitted until when the reflected light is received.
Japanese Patent Application Laid-Open No. 2021-152536
In the device described in Patent Document 1, light is scanned by rotating a mirror, and the reflected light returning to the mirror is received by a light receiving unit. However, a movable part such as a rotating mirror is prone to failure. On the other hand, when light is emitted or received without using a movable part, there is a problem that the measurable range (measurement area) becomes narrow.
An object of the present disclosure is to widen the measurement area that can be irradiated with light without using a movable part. A further object of the present disclosure is to widen the measurement area from which light can be received without using a movable part.
A measuring device according to one aspect of the present disclosure includes a light source, a light projecting optical system that irradiates a measurement area with light from the light source, and a light receiving unit that receives reflected light from the measurement area. The light projecting optical system has a coupling lens in which a first lens element whose optical axis is shifted in a first direction and a second lens element whose optical axis is shifted in the direction opposite to that of the first lens element are connected, and irradiates a first measurement area with the light from the light source through the first lens element while irradiating a second measurement area, which includes an overlapping area overlapping the first measurement area, with the light from the light source through the second lens element.
A measuring device according to another aspect of the present disclosure includes an irradiation unit that irradiates a measurement area with light, a light receiving sensor, and a light receiving optical system that causes the light receiving sensor to receive reflected light from the measurement area. The light receiving optical system has a coupling lens in which a first lens element whose optical axis is shifted in a first direction and a second lens element whose optical axis is shifted in the direction opposite to that of the first lens element are connected, and causes a light receiving element of the light receiving sensor to receive the reflected light from a first measurement area through the first lens element while causing the light receiving element to receive, through the second lens element, the reflected light from a second measurement area that includes an overlapping area overlapping the first measurement area.
Other problems disclosed by the present application and their solutions are clarified in the description of the embodiments and the drawings.
According to the present disclosure, the measurement area that can be irradiated with light can be widened without using a movable part. Furthermore, according to the present disclosure, the measurement area from which light can be received can be widened without using a movable part.
FIG. 1 is an explanatory diagram of the overall configuration of the measuring device 1.
FIG. 2 is an explanatory diagram of the configuration of the irradiation unit 10 of the first embodiment.
FIG. 3A is an explanatory diagram of the light source 12.
FIG. 3B is an explanatory diagram of the measurement area 50.
FIG. 3C is an explanatory diagram of an example in which the measuring device 1 is mounted on a vehicle.
FIG. 4 is a perspective view of the coupling lens 15.
FIG. 5A is an explanatory diagram of the measurement area 50.
FIG. 5B is an explanatory diagram of the measurement area 50.
FIG. 6A is an explanatory diagram of the optical conditions of the irradiation section.
FIG. 6B is an explanatory diagram of the optical conditions of the irradiation section.
FIG. 7 is an explanatory diagram of the light receiving sensor 22.
FIG. 8 is a timing chart for explaining an example of the measuring method.
FIG. 9A is an explanatory diagram of another example of the light receiving sensor 22.
FIG. 9B is an explanatory diagram of the signal processing unit 362.
FIG. 9C is an explanatory diagram of a histogram.
FIG. 10A is an explanatory diagram showing the relationship between the light emitting area of the light source 12 and the measurement area 50.
FIG. 10B is an explanatory diagram showing the relationship between the light emitting area of the light source 12 and the measurement area 50.
FIG. 11 is an explanatory diagram of the state during measurement of region H, which is in the overlapping area 53.
FIG. 12 is an explanatory diagram of another measuring method.
FIG. 13 is an explanatory diagram of the light receiving section 20 of the first embodiment.
FIG. 14 is an explanatory diagram of the light receiving sensor 22.
FIG. 15 is a perspective view of the coupling lens 25.
FIG. 16A is an explanatory diagram of the measurement area 50.
FIG. 16B is an explanatory diagram of the measurement area 50.
FIG. 17A is an explanatory diagram showing the relationship between the pixels 221 (light receiving areas) of the light receiving sensor 22 and the measurement area 50.
FIG. 17B is an explanatory diagram showing the relationship between the pixels 221 (light receiving areas) of the light receiving sensor 22 and the measurement area 50.
FIG. 18A is an explanatory diagram of the optical conditions of the light receiving section.
FIG. 18B is an explanatory diagram of the optical conditions of the light receiving section.
FIG. 19A is an explanatory diagram of an example of the measurement method.
FIG. 19B is an explanatory diagram of an example of the measurement method.
FIG. 19C is an explanatory diagram of an example of the measurement method.
FIG. 20 is an explanatory diagram of the signal processing section 362 when measuring the overlapping area 53.
FIG. 21 is an explanatory diagram of another signal processing section 362 when measuring the overlapping area 53.
Embodiments for carrying out the present disclosure will be described below with reference to the drawings. In the following description, the same or similar configurations may be denoted by common reference numerals, and redundant description may be omitted.
===First embodiment===
<Overall configuration>
FIG. 1 is an explanatory diagram of the overall configuration of the measuring device 1.
The measuring device 1 is a device that measures the distance to an object 90. The measuring device 1 functions as a so-called LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging). The measuring device 1 emits measurement light, detects the reflected light reflected by the surface of the object 90, and measures the time from when the measurement light is emitted until the reflected light is received, thereby measuring the distance to the object 90 by the TOF (time-of-flight) method. The measuring device 1 has an irradiation section 10, a light receiving section 20, and a control section 30.
The irradiation unit 10 irradiates measurement light toward the object 90. Specifically, the irradiation unit 10 irradiates a measurement area 50 (described later) with measurement light at a predetermined angle of view. The irradiation unit 10 has a light source 12 and a light projecting optical system 14. The light source 12 emits light and is composed of, for example, a vertical-cavity surface-emitting laser (VCSEL). The light projecting optical system 14 is an optical system that irradiates the measurement area 50 with the light emitted from the light source 12. A detailed configuration of the irradiation unit 10 will be described later.
The light receiving unit 20 receives the reflected light from the object 90, that is, the reflected light from the measurement area 50. The light receiving unit 20 has a light receiving sensor 22 and a light receiving optical system 24. A detailed configuration of the light receiving unit 20 will be described later.
The control unit 30 governs control of the measuring device 1. The control unit 30 controls the irradiation of light from the irradiation unit 10 and, based on the output of the light receiving unit 20, measures the distance to the object 90 by the TOF (time-of-flight) method. The control unit 30 has an arithmetic device and a storage device (not shown). The arithmetic device is an arithmetic processing device such as a CPU or GPU; a part of the arithmetic device may be composed of analog arithmetic circuits. The storage device is composed of a main storage device and an auxiliary storage device and stores programs and data. Various processes for measuring the distance to the object 90 are executed by the arithmetic device executing the programs stored in the storage device. The figure shows the functional blocks of these processes.
The control unit 30 has a setting unit 32, a timing control unit 34, and a distance measurement unit 36. The setting unit 32 performs various settings. The timing control unit 34 controls the processing timing of each unit; for example, it controls the timing at which the light source 12 emits light. The distance measurement unit 36 measures the distance to the object 90 and has a signal processing unit 362, a time detection unit 364, and a distance calculation unit 366. The signal processing unit 362 processes the output signal of the light receiving sensor 22. The time detection unit 364 detects the time of flight of the light (the time from when the light is emitted until the reflected light arrives). The distance calculation unit 366 calculates the distance to the object 90. The processing of the control unit 30 will be described later.
<Regarding the irradiation unit 10 of the first embodiment>
FIG. 2 is an explanatory diagram of the configuration of the irradiation unit 10 of the first embodiment. As already explained, the irradiation unit 10 has the light source 12 and the light projecting optical system 14.
In the following description, the direction along the optical axis of the light projecting optical system 14 is the Z direction. Note that the object 90 to be measured by the measuring device 1 is separated from the measuring device 1 in the Z direction. The direction perpendicular to the Z direction in which the first lens element 151 and the second lens element 152 constituting the coupling lens 15 (described later) are arranged is the Y direction, and the direction perpendicular to both the Z direction and the Y direction is the X direction.
FIG. 3A is an explanatory diagram of the light source 12.
The light source 12 has a light emitting surface parallel to the XY plane (a plane parallel to the X and Y directions). The light emitting surface is rectangular. The light emitted from the light source 12 is applied to the measurement area 50 via the light projecting optical system 14.
The light projecting optical system 14 has a coupling lens 15. FIG. 4 is a perspective view of the coupling lens 15.
The coupling lens 15 is an optical component in which a first lens element 151 and a second lens element 152 are connected. The first lens element 151 and the second lens element 152 are convex-lens-shaped portions (optical elements) arranged side by side in the Y direction. The focal length of the first lens element 151 is the same as that of the second lens element 152. The first lens element 151 and the second lens element 152 each have an optical axis along the Z direction. The optical axis of the first lens element 151 is shifted in the +Y direction with respect to the optical axis of the light projecting optical system 14 (the projection lens 16 described later), while the optical axis of the second lens element 152 is shifted in the -Y direction with respect to the optical axis of the light projecting optical system 14. That is, the optical axis of the second lens element 152 is shifted in the direction opposite to that of the first lens element 151.
The light projecting optical system 14 also has a projection lens 16, a lens arranged between the light source 12 and the coupling lens 15. Because the light projecting optical system 14 has the projection lens 16, the light from the light source 12 can be projected onto a wide measurement area 50, and the light from the light source 12 can be projected onto the measurement area 50 as a rectangular light distribution pattern. The distance between the projection lens 16 and the light source 12 is shorter than the focal length of the projection lens 16. The light rays traveling from the projection lens 16 to the coupling lens 15 therefore diverge, but the convex-lens-shaped first optical element and second optical element suppress the spread of the light emitted from the coupling lens 15 toward the measurement area 50 (light close to collimated light is emitted from the light projecting optical system 14 to the measurement area 50).
FIG. 3B is an explanatory diagram of the measurement area 50. FIGS. 5A and 5B are also explanatory diagrams of the measurement area 50: FIG. 5A shows how the light from the light source 12 irradiates the first measurement area 51 through the first lens element 151, and FIG. 5B shows how the light from the light source 12 irradiates the second measurement area 52 through the second lens element 152.
The measurement area 50 is composed of a first measurement area 51 and a second measurement area 52. The first measurement area 51 is the area irradiated with the light of the light source 12 via the first lens element 151 (in other words, the first lens element 151 is the optical element that irradiates the first measurement area 51 with the light of the light source 12). The second measurement area 52 is the area irradiated with the light of the light source 12 via the second lens element 152 (in other words, the second lens element 152 is the optical element that irradiates the second measurement area 52 with the light of the light source 12). Since the first lens element 151 and the second lens element 152 are arranged side by side in the Y direction, the first measurement area 51 and the second measurement area 52 are shifted from each other in the Y direction. This allows the measurement area 50 to be set long in the Y direction (in other words, a wide range in the Y direction can be irradiated with light).
In the first embodiment, the first measurement area 51 and the second measurement area 52 overlap. In the following description, the region where the first measurement area 51 and the second measurement area 52 overlap is called the "overlapping area 53". Providing the overlapping area 53 prevents the formation of a region between the first measurement area 51 and the second measurement area 52 that cannot be irradiated with light.
As shown in FIG. 3B, when the measurement area 50 is viewed from the Z direction, the measurement area 50 has predetermined angles of view in the X and Y directions. In the first embodiment, the ratio of the Y-direction length to the X-direction length of the measurement area 50 (its so-called aspect ratio) is larger than the ratio of the Y-direction length to the X-direction length of the light source 12 (its so-called aspect ratio). In other words, in the first embodiment, the measurement area 50 can be made long in the Y direction compared with the shape of the light source 12 (a wide range in the Y direction can be irradiated with light).
As shown in FIG. 5A, the region of the light emitting surface of the light source 12 that emits the light irradiating the overlapping area 53 through the first lens element 151 is called the "first region 121". Similarly, as shown in FIG. 5B, the region of the light emitting surface of the light source 12 that emits the light irradiating the overlapping area 53 through the second lens element 152 is called the "second region 122".
The overlapping area 53 can be irradiated both with light passing through the first lens element 151 and with light passing through the second lens element 152. When the first region 121 and the second region 122 of the light source 12 emit light at the same time, the light irradiated through the first lens element 151 and the light irradiated through the second lens element 152 are superimposed in the overlapping area 53. Therefore, the irradiation intensity of the light in the overlapping area 53 can be made higher than in the rest of the measurement area 50.
As shown in FIG. 3B, the overlapping area 53 (hatched region), where the irradiation intensity is relatively high, is located at the center of the measurement area 50 in the Y direction. By causing the entire light emitting surface of the light source 12 to emit light at once (that is, by causing the first region 121 and the second region 122 of the light source 12 to emit light simultaneously), the intensity of the light irradiating the central portion of the measurement area 50 (the overlapping area 53) can be increased, as shown in FIG. 3B.
FIG. 3C is an explanatory diagram of an example in which the measuring device 1 is mounted on a vehicle. As shown in the figure, when the measuring device 1 is mounted on a vehicle, it is desirable to measure distances over a wide field of view at relatively close range. In the first embodiment, as shown in FIG. 3B, the aspect ratio of the measurement area 50 can be increased, which is advantageous for distance measurement over a wide field of view.
On the other hand, as shown in FIG. 3C, when the measuring device 1 is mounted on a vehicle, the area in which long distances must be measured may be relatively narrow, but it is desirable that light of sufficient intensity can be projected far away. In the first embodiment, as shown in FIG. 3B, the intensity of the light in the overlapping area 53 of the measurement area 50 can be increased, which is advantageous for measuring long distances.
<Optical conditions of the irradiation unit 10>
FIGS. 6A and 6B are explanatory diagrams of the optical conditions of the irradiation unit 10. FIG. 6A illustrates the relationship between the light source 12 and the projection lens 16. FIG. 6B illustrates the relationship between the virtual image 12' of the light source 12 and the coupling lens 15.
As shown in FIG. 6A, let fA1 be the focal length of the projection lens 16, and let LA1 be the distance from the principal point of the projection lens 16 to the light source 12. Also, let yA be half the length of the light source 12 in the Y direction (that is, the length from the optical axis of the light projecting optical system 14 to the edge of the light source 12).
In the first embodiment, the distance LA1 from the principal point of the projection lens 16 to the light source 12 is smaller than the focal length fA1 of the projection lens 16 (LA1 < fA1). Since the light source 12 is arranged closer to the projection lens 16 than its focal point, the virtual image 12' of the light source 12 (the image of the light source 12 formed by the projection lens 16) is located on the far side of the light source 12 as viewed from the projection lens 16 (on the left side of the light source 12 in the figure).
Letting LA' be the distance from the principal point of the projection lens 16 to the virtual image 12', the relationship among LA1, LA', and fA1 is given by the following equation (A1).
(1/LA1)-(1/LA') = 1/fA1    ・・・・(A1)
Therefore, the distance LA' from the principal point of the projection lens 16 to the virtual image 12' is given by the following equation (A2).
LA' = (LA1×fA1)/(fA1-LA1)    ・・・・(A2)
Also, letting yA' be the length from the optical axis of the light projecting optical system 14 to the edge of the virtual image 12' of the light source 12, yA' is given by the following equation (A3).
yA' = yA×(LA'/LA1) = yA×fA1/(fA1-LA1)    ・・・・(A3)
Next, let fA2 be the focal length of the first lens element 151 and the second lens element 152. As shown in FIG. 6B, let LA2 be the distance from the principal point of the coupling lens 15 (the principal point of the first lens element 151 or the second lens element 152) to the virtual image 12'. Considering that the coupling lens 15 projects the light of the virtual image 12' onto a distant irradiation area, as shown in FIG. 6B, the relationship between the focal length fA2 and the distance LA2 is given by the following equation (A4).
LA2 = fA2    ・・・・(A4)
Let dA be the distance between the principal point of the projection lens 16 and the principal point of the coupling lens 15. Since the distance LA2 corresponds to the distance LA' plus the distance dA, the relationship between the focal lengths fA1 and fA2 and these distances is given by the following equation (A5).
(LA1×fA1)/(fA1-LA1)+dA = fA2    ・・・・(A5)
Also, as shown in FIG. 6B, the distance between the optical axis of the light projecting optical system 14 (the projection lens 16) and the optical axis of the first lens element 151 (or the second lens element 152) is denoted tA and called the "shift amount tA". For the overlapping area 53 to be formed, the first measurement area 51 and the second measurement area 52 must overlap. Therefore, the shift amount tA must be smaller than the length yA' from the optical axis of the light projecting optical system 14 to the edge of the virtual image 12' of the light source 12 (tA < yA'). In other words, for the overlapping area 53 to be formed, the shift amount tA of each of the first lens element 151 and the second lens element 152 of the coupling lens 15 must satisfy the condition of the following equation (A6).
tA < yA×fA1/(fA1-LA1)    ・・・・(A6)
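The chain of conditions (A2), (A3), (A5), and (A6) can be checked numerically. The following sketch uses arbitrary illustrative values for fA1, LA1, yA, dA, and tA; none of these numbers come from the embodiment itself.

```python
fA1 = 10.0  # focal length of projection lens 16 (illustrative, arbitrary units)
LA1 = 6.0   # principal point of lens 16 to light source 12; note LA1 < fA1
yA = 1.0    # half-length of light source 12 in the Y direction
dA = 5.0    # distance between the principal points of lens 16 and coupling lens 15

# (A2): distance from the principal point of lens 16 to the virtual image 12'
LA_prime = (LA1 * fA1) / (fA1 - LA1)   # 15.0

# (A3): half-length of the virtual image 12'
yA_prime = yA * fA1 / (fA1 - LA1)      # 2.5

# (A4)/(A5): focal length the lens elements need so that LA2 = fA2
fA2 = LA_prime + dA                    # 20.0

# (A6): the overlapping area 53 is formed only if the shift amount tA < yA'
tA = 2.0
print(LA_prime, yA_prime, fA2, tA < yA_prime)  # 15.0 2.5 20.0 True
```

With these numbers the virtual image sits 15 units behind the projection lens, so lens elements with a 20-unit focal length placed 5 units away collimate it, and any shift amount below 2.5 units still yields an overlapping area.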
<Measurement example 1>
FIG. 7 is an explanatory diagram of the light receiving sensor 22.
The light receiving sensor 22 has a plurality of pixels 221 arranged two-dimensionally. For example, in the case of a VGA light receiving sensor 22, 480×640 pixels 221 are arranged two-dimensionally. Each pixel 221 has a light receiving element, and the light receiving element outputs a signal corresponding to the amount of light received. The control unit 30 thus acquires an output signal for each pixel 221.
FIG. 8 is a timing chart for explaining an example of the measurement method.
The control unit 30 (timing control unit 34) causes the light source 12 of the irradiation unit 10 to emit pulsed light at a predetermined cycle. The upper part of FIG. 8 shows the timing at which the light source 12 emits the pulsed light (emission timing). Here, it is assumed that light is emitted from the entire light emitting surface of the light source 12. The light emitted from the light source 12 irradiates the measurement area 50 through the light projecting optical system 14. The light reflected by the surface of an object 90 within the measurement area 50 is received by the light receiving sensor 22 through the light receiving optical system 24, so each pixel 221 of the light receiving sensor 22 receives pulsed reflected light. The middle part of FIG. 8 shows the timing at which the pulsed reflected light arrives (arrival timing). Each pixel 221 outputs a signal corresponding to the amount of light received; the output signal of each pixel 221 is shown in the lower part of FIG. 8.
The distance measurement unit 36 (signal processing unit 362) of the control unit 30 detects the arrival timing of the reflected light based on the output signal of each pixel 221. For example, the signal processing unit 362 detects the arrival timing of the reflected light based on the timing of the peak of the output signal of each pixel 221. To remove the influence of ambient light (for example, sunlight), the signal processing unit 362 may instead determine the arrival timing of the reflected light based on the peak of a signal obtained by cutting the DC component of the output signal of the pixel 221.
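The DC-cutting idea can be illustrated with a toy example. The sample values below are invented, not taken from the embodiment: subtracting the mean of a pixel's output removes a constant ambient-light offset without moving the peak.

```python
def cut_dc(samples):
    """Subtract the mean so a constant ambient-light offset is removed."""
    mean = sum(samples) / len(samples)
    return [s - mean for s in samples]

# Constant ambient level of 3 plus one reflected-light pulse at index 3.
output = [3.0, 3.0, 3.0, 9.0, 3.0, 3.0]
ac = cut_dc(output)
peak_index = max(range(len(ac)), key=ac.__getitem__)
print(peak_index)  # 3
```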
Next, the distance measurement unit 36 (time detection unit 364) detects the time Tf from the emission of the light to the arrival of the reflected light, based on the emission timing and the arrival timing. The time Tf corresponds to the time it takes the light to make a round trip between the measuring device 1 and the object 90. The distance measurement unit 36 (distance calculation unit 366) then calculates the distance L to the object 90 based on the time Tf. Letting Tf be the time from the emission of the light to the arrival of the reflected light and C be the speed of light, the distance L is L = C×Tf/2. The control unit 30 generates a distance image by calculating the distance to the object 90 for each pixel 221 based on the time Tf detected for each pixel 221.
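The relation L = C×Tf/2 is direct to compute; the sketch below, with an assumed flight time, illustrates the per-pixel distance calculation:

```python
C = 299_792_458.0  # speed of light in m/s

def distance(tf):
    """Distance L to the object from the round-trip time Tf: L = C * Tf / 2."""
    return C * tf / 2.0

# An object at 50 m produces a round-trip time of about 333.6 ns.
tf = 2 * 50.0 / C
print(distance(tf))  # ≈ 50.0
```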
<Measurement example 2>
FIG. 9A is an explanatory diagram of another example of the light receiving sensor 22.
The light receiving sensor 22 has a plurality of pixels 221 arranged two-dimensionally, and each pixel 221 has a plurality of light receiving elements 222. Here, each pixel 221 has, as its light receiving elements 222, nine SPADs (Single Photon Avalanche Diodes) arranged three in the X direction and three in the Y direction. A light receiving element 222 composed of a SPAD outputs a pulse signal when it detects a photon.
FIG. 9B is an explanatory diagram of the signal processing unit 362. The signal processing unit 362 has an adder 362A, a comparator 362B, and a histogram generator 362C. Here, the signal processing unit 362 generates, based on the output signal of each pixel 221 of the light receiving sensor 22, a histogram used for Time Correlated Single Photon Counting (TCSPC).
The adder 362A adds the output signals of the plurality of light receiving elements 222 (SPADs) constituting a pixel 221. The adder 362A may adjust (shape) the pulse width output by each light receiving element 222 before adding the output signals of the plurality of light receiving elements 222. The comparator 362B compares the output signal of the adder 362A with a threshold, and outputs a signal when the output signal of the adder 362A is equal to or greater than the threshold. The timing at which the comparator 362B outputs a signal can be regarded as the timing at which the light receiving elements 222 (SPADs) of the light receiving sensor 22 detected light.
Photons of ambient light are incident on the individual light receiving elements 222 at random times. In contrast, photons of the reflected light are incident on the individual light receiving elements 222 after a fixed delay from the light emission (the flight time corresponding to the distance to the object 90). Therefore, when ambient-light photons arrive at the light receiving elements 222 at random times, the probability that the output signal of the adder 362A reaches the threshold is low. On the other hand, when reflected-light photons arrive at the light receiving elements 222, the plurality of light receiving elements 222 constituting the pixel 221 detect photons simultaneously, so the probability that the output signal of the adder 362A reaches the threshold is high. For this reason, by adding the output signals of the plurality of light receiving elements 222 in the adder 362A and having the comparator 362B compare the adder output with the threshold, the times at which the light receiving elements 222 (SPADs) are considered to have detected reflected light are measured.
FIG. 9C is an explanatory diagram of the histogram. The horizontal axis of the figure is time, and the vertical axis is frequency (count). Based on the output of the comparator 362B, the histogram generator 362C repeatedly measures the times at which the light receiving elements 222 (SPADs) of the light receiving sensor 22 detected light, and increments the frequency (count) associated with each such time, thereby generating a histogram. When incrementing the frequency (count), the histogram generator 362C may, instead of increasing the count by one, increase it by a number corresponding to the output signal (sum) of the adder 362A.
The setting unit 32 (see FIG. 1) presets the number of accumulations used to generate the histogram. The timing control unit 34 causes the light source 12 of the irradiation unit 10 to emit pulsed light a number of times corresponding to the set number of accumulations. For each emission of pulsed light from the light source 12, the adder 362A outputs a signal one or more times. The histogram generator 362C generates the histogram by incrementing the frequency (count) according to the output signal of the comparator 362B until the set number of accumulations is reached.
After generating the histogram, the distance measurement unit 36 (time detection unit 364) detects, based on the histogram, the time Tf from the emission of the light to the arrival of the reflected light. As shown in FIG. 9C, the distance measurement unit 36 (time detection unit 364) detects the time corresponding to the peak frequency of the histogram and takes that time as Tf. The distance measurement unit 36 (distance calculation unit 366) then calculates the distance to the object 90 based on the time Tf.
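The adder/comparator/histogram chain and the peak detection can be sketched with a small simulation. All numbers below (firing probabilities, threshold, bin count, signal bin) are invented for illustration and are not taken from the embodiment:

```python
import random

random.seed(0)
NUM_SPADS = 9      # 3 × 3 SPADs per pixel (light receiving elements 222)
THRESHOLD = 5      # comparator 362B threshold on the adder output
NUM_BINS = 100     # time bins of the histogram
NUM_SHOTS = 1000   # number of accumulations (set by the setting unit 32)
TRUE_BIN = 42      # bin corresponding to the actual flight time Tf (assumed)

histogram = [0] * NUM_BINS
for _ in range(NUM_SHOTS):            # one laser pulse per shot
    for t in range(NUM_BINS):
        # Ambient photons fire each SPAD rarely and at random times;
        # reflected photons fire most SPADs simultaneously at TRUE_BIN.
        p = 0.9 if t == TRUE_BIN else 0.02
        fired = sum(random.random() < p for _ in range(NUM_SPADS))  # adder 362A
        if fired >= THRESHOLD:                                      # comparator 362B
            histogram[t] += 1                                       # generator 362C

# Time detection unit 364: the bin with the peak frequency gives Tf.
tf_bin = max(range(NUM_BINS), key=histogram.__getitem__)
print(tf_bin)  # 42
```

Because ambient photons rarely make five of nine SPADs fire in the same bin while reflected photons almost always do, the histogram peak lands on the true flight-time bin even with modest accumulation counts.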
Normally, when measuring the distance to a distant object 90, the light reaching the object 90 is weak, so measuring the distance with high accuracy requires increasing the number of accumulations used to generate the histogram. However, increasing the number of accumulations makes the histogram take longer to complete, so the distance measurement takes longer (in other words, the frame rate (FPS) of the distance image deteriorates). In contrast, in the first embodiment, the intensity of the light in the overlapping area 53 of the measurement area 50 can be increased, which has the advantage that the number of accumulations does not need to be increased when measuring the distance to a distant object 90 in the overlapping area 53.
<Modified example of the light emission control of the light source 12>
In the first embodiment described above, the entire light emitting surface of the light source 12 was made to emit light at once. However, the light source 12 may instead be controlled so that only a partial region of the light emitting surface emits light.
FIGS. 10A and 10B are explanatory diagrams showing the relationship between the light emitting regions of the light source 12 and the measurement area 50. FIG. 10A shows this relationship when light irradiates the first measurement area 51 through the first lens element 151. FIG. 10B shows this relationship when light irradiates the second measurement area 52 through the second lens element 152. In FIGS. 10A and 10B, the corresponding light emitting region of the light source 12 is shown to the right of each region of the measurement area 50.
The light source 12 is divided in the Y direction into a plurality of light emitting regions, here twelve (light emitting regions #1 to #12). The number of light emitting regions into which the light source 12 is divided in the Y direction is not limited to twelve. Likewise, the measurement area 50 is divided in the Y direction into a plurality of regions (regions A to P), here sixteen. The number of regions into which the measurement area 50 is divided in the Y direction is not limited to sixteen.
As shown in FIG. 10A, the first measurement area 51 corresponds to regions A to L. The light emitting regions #1 to #12 of the light source 12 correspond to regions L to A of the first measurement area 51, respectively. For example, the light emitted from light emitting region #5 of the light source 12 irradiates region H of the measurement area 50 through the first lens element 151.
As shown in FIG. 10B, the second measurement area 52 corresponds to regions E to P. The light emitting regions #1 to #12 of the light source 12 correspond to regions P to E of the second measurement area 52, respectively. For example, the light emitted from light emitting region #5 of the light source 12 irradiates region L of the measurement area 50 through the second lens element 152. In this way, the light emitted from a given light emitting region of the light source 12 (for example, light emitting region #5) irradiates two regions of the measurement area 50 (for example, regions H and L) through the first lens element 151 and the second lens element 152 of the coupling lens 15.
As shown in FIGS. 10A and 10B, the overlapping area 53 corresponds to regions E to L. As shown in FIG. 10A, the light emitted from light emitting regions #8 to #1 of the light source 12 irradiates regions E to L, which form the overlapping area 53, through the first lens element 151; light emitting regions #1 to #8 of the light source 12 therefore correspond to the first region 121 described above. Also, as shown in FIG. 10B, the light emitted from light emitting regions #12 to #5 of the light source 12 irradiates regions E to L through the second lens element 152; light emitting regions #5 to #12 of the light source 12 therefore correspond to the second region 122 described above.
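The correspondence in FIGS. 10A and 10B can be written down as two lookup tables. The sketch below encodes the twelve light emitting regions and sixteen measurement regions described above; the labels follow the figures, but the code itself is only an illustration:

```python
regions = [chr(ord("A") + i) for i in range(16)]  # regions A ... P

# FIG. 10A: through the first lens element 151, emitters #1..#12 map to L..A.
first = {n: regions[12 - n] for n in range(1, 13)}
# FIG. 10B: through the second lens element 152, emitters #1..#12 map to P..E.
second = {n: regions[16 - n] for n in range(1, 13)}

print(first[5], second[5])  # H L  (emitter #5 lights regions H and L)

# Regions reachable through both lens elements form the overlapping area 53.
overlap = sorted(set(first.values()) & set(second.values()))
print("".join(overlap))     # EFGHIJKL

# Region H of the overlapping area is served by emitters #5 and #9 (cf. FIG. 11).
emitters_for_H = [n for n in range(1, 13) if first[n] == "H" or second[n] == "H"]
print(emitters_for_H)       # [5, 9]
```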
FIG. 11 is an explanatory diagram of the measurement of region H, which lies in the overlapping area 53. In the figure, the corresponding region of the light source 12 is shown to the right of each region of the measurement area 50, and the corresponding region of the measurement area 50 is shown to the left of each pixel 221 of the light receiving sensor 22. Each pixel 221 of the light receiving sensor 22 receives the reflected light from the corresponding region of the measurement area 50 (the image of the measurement area 50 is formed on the light receiving surface of the light receiving sensor 22 by the light receiving optical system 24).
Region H of the overlapping area 53 is associated with light emitting regions #5 and #9 of the light source 12. As shown in the figure, when measuring region H, the control unit 30 causes light emitting regions #5 and #9 of the light source 12 to emit light simultaneously. In this way, when measuring a particular region of the overlapping area 53, the control unit 30 causes the two light emitting regions corresponding to that region to emit light simultaneously.
The light emitted from light emitting region #5 of the light source 12 irradiates region H (first measurement area 51) and region L (second measurement area 52). The light emitted from light emitting region #9 of the light source 12 irradiates region D (first measurement area 51) and region H (second measurement area 52). In other words, the light emitted from regions #5 and #9 of the light source 12 is superimposed in region H (the overlapping area 53), so the intensity of the light irradiating that region is high.
Pixel #9 of the light receiving sensor 22 receives the reflected light from region H of the measurement area 50. Since the light emitted from light emitting regions #5 and #9 of the light source 12 is superimposed in region H, the intensity of the reflected light received by pixel #9 of the light receiving sensor 22 can be increased.
In the description so far, the coupling lens 15 has been used for the purpose of increasing the intensity of the light in the overlapping area 53. However, the coupling lens 15 need not be used to increase the intensity of the light in the overlapping area 53.
For example, when measuring region H, the control unit 30 may separately (i) emit light from light emitting region #5 of the light source 12 and calculate a distance based on the light reception result of pixel #9 of the light receiving sensor 22, and (ii) emit light from light emitting region #9 of the light source 12 and calculate a distance based on the light reception result of pixel #9 of the light receiving sensor 22. The control unit 30 may then calculate the distance by averaging the two calculated distances. In this way, the coupling lens 15 need not be used for the purpose of increasing the light intensity.
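A minimal sketch of this averaging step, with hypothetical function and variable names and invented distance values:

```python
def fused_distance(d_via_first: float, d_via_second: float) -> float:
    """Average the two independent estimates obtained for the same region
    (one measured via lens element 151, one via lens element 152)."""
    return (d_via_first + d_via_second) / 2.0

# Two separate measurements of region H (illustrative values in metres).
print(fused_distance(49.5, 50.5))  # 50.0
```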
FIG. 12 is an explanatory diagram of another measurement method.
The light emitted from a particular light emitting region of the light source 12 irradiates the corresponding region of the first measurement area 51 and the corresponding region of the second measurement area 52. In other words, the light emitted from a particular light emitting region of the light source 12 irradiates two regions of the measurement area 50. For example, as shown in the figure, the light emitted from light emitting region #5 of the light source 12 irradiates region H (first measurement area 51) and region L (second measurement area 52).
As shown in the figure, pixel #9 of the light receiving sensor 22 corresponds to region H of the measurement area 50 and receives the reflected light from region H. Likewise, pixel #5 of the light receiving sensor 22 corresponds to region L of the measurement area 50 and receives the reflected light from region L.
With this measurement method, by emitting light from a single light emitting region of the light source 12, the distances to the object 90 in two regions of the measurement area 50 can be measured simultaneously. In this way too, the coupling lens 15 need not be used for the purpose of increasing the intensity of the light in the overlapping area 53.
<The light receiving unit 20 of the first embodiment>
FIG. 13 is an explanatory diagram of the light receiving unit 20 of the first embodiment. As already explained, the light receiving unit 20 has the light receiving sensor 22 and the light receiving optical system 24.
In the following description, the direction along the optical axis of the light receiving optical system 24 is the Z direction. The object 90 to be measured by the measuring device 1 is thus separated from the measuring device 1 in the Z direction. The direction perpendicular to the Z direction in which the first lens element 251 and the second lens element 252 constituting the coupling lens 25 (described later) are arranged side by side is the Y direction. The direction perpendicular to both the Z direction and the Y direction is the X direction.
 図14は、受光センサ22の説明図である。
 受光センサ22は、XY平面(X方向及びY方向に平行な面)に平行な受光面を有する。受光面は、矩形状に構成されている。測定エリアから届く反射光は、受光用光学系24を介して、受光センサ22の受光面に照射される(測定エリア50の像が受光用光学系24によって受光センサ22の受光面で結像する)。
FIG. 14 is an explanatory diagram of the light receiving sensor 22.
The light receiving sensor 22 has a light receiving surface parallel to the XY plane (a plane parallel to the X direction and the Y direction). The light receiving surface is rectangular. Reflected light arriving from the measurement area is irradiated onto the light receiving surface of the light receiving sensor 22 via the light receiving optical system 24 (an image of the measurement area 50 is formed on the light receiving surface of the light receiving sensor 22 by the light receiving optical system 24).
 図7に示したように、受光センサ22は、2次元配置された複数の画素221を有する。 As shown in FIG. 7, the light receiving sensor 22 has a plurality of pixels 221 arranged two-dimensionally.
 受光用光学系は、連結レンズ25を有する。図15は、連結レンズ25の斜視図である。
 連結レンズ25は、第1レンズエレメント251と第2レンズエレメント252とを連結させた光学部品である。第1レンズエレメント251及び第2レンズエレメント252は、凸レンズ状の部位(光学エレメント)であり、Y方向に並んで配置されている。第1レンズエレメント251の焦点距離と第2レンズエレメント252の焦点距離は同じである。また、第1レンズエレメント251及び第2レンズエレメント252は、それぞれZ方向に沿った光軸を有する。第1レンズエレメント251の光軸は、受光用光学系24(後述する集光レンズ26)の光軸に対して+Y方向にシフトされている。一方、第2レンズエレメント252の光軸は、受光用光学系24の光軸に対して-Y方向にシフトされている。つまり、第2レンズエレメント252の光軸は、第1レンズエレメント251とは逆方向にシフトされている。
The light receiving optical system 24 has the coupling lens 25. FIG. 15 is a perspective view of the coupling lens 25.
The coupling lens 25 is an optical component in which the first lens element 251 and the second lens element 252 are joined together. The first lens element 251 and the second lens element 252 are convex-lens-shaped portions (optical elements) arranged side by side in the Y direction. The focal length of the first lens element 251 and the focal length of the second lens element 252 are the same. The first lens element 251 and the second lens element 252 each have an optical axis along the Z direction. The optical axis of the first lens element 251 is shifted in the +Y direction with respect to the optical axis of the light receiving optical system 24 (the condenser lens 26 described later). On the other hand, the optical axis of the second lens element 252 is shifted in the -Y direction with respect to the optical axis of the light receiving optical system 24. That is, the optical axis of the second lens element 252 is shifted in the direction opposite to that of the first lens element 251.
 また、受光用光学系24は、集光レンズ26を有する。集光レンズ26は、受光センサ22と連結レンズ25との間に配置されたレンズである。受光用光学系24が集光レンズ26を有することによって、広い範囲の測定エリア50からの反射光を受光センサ22に受光させることができる。また、受光用光学系24が集光レンズ26を有することによって、受光センサ22の矩形状の受光面に、矩形状の測定エリア50の像を結像できる。集光レンズ26と受光センサ22との距離は、集光レンズ26の焦点距離よりも近い。 The light receiving optical system 24 also has the condenser lens 26. The condenser lens 26 is a lens arranged between the light receiving sensor 22 and the coupling lens 25. Since the light receiving optical system 24 has the condenser lens 26, the light receiving sensor 22 can receive reflected light from the wide measurement area 50. Further, since the light receiving optical system 24 has the condenser lens 26, an image of the rectangular measurement area 50 can be formed on the rectangular light receiving surface of the light receiving sensor 22. The distance between the condenser lens 26 and the light receiving sensor 22 is shorter than the focal length of the condenser lens 26.
 図16A及び図16Bは、測定エリア50の説明図である。図16Aは、受光センサ22が第1レンズエレメント251を介して第1測定エリア51からの反射光を受光する様子の説明図である。図16Bは、受光センサ22が第2レンズエレメント252を介して第2測定エリア52からの反射光を受光する様子の説明図である。 FIGS. 16A and 16B are explanatory diagrams of the measurement area 50. FIG. 16A is an explanatory diagram of how the light receiving sensor 22 receives reflected light from the first measurement area 51 via the first lens element 251. FIG. 16B is an explanatory diagram of how the light receiving sensor 22 receives reflected light from the second measurement area 52 via the second lens element 252.
 図3Bに示したように、測定エリア50は、第1測定エリア51と第2測定エリア52とにより構成される。第1測定エリア51は、第1レンズエレメント251を介して受光センサ22が反射光を受光するエリアである(言い換えると、第1レンズエレメント251は、第1測定エリア51からの反射光を受光センサ22に受光させる光学エレメントである)。第2測定エリア52は、第2レンズエレメント252を介して受光センサ22が反射光を受光するエリアである(言い換えると、第2レンズエレメント252は、第2測定エリア52からの反射光を受光センサ22に受光させる光学エレメントである)。第1レンズエレメント251と第2レンズエレメント252がY方向に並んで配置されているため、第1測定エリア51及び第2測定エリア52はY方向にずれて配置されている。これにより、測定エリア50をY方向に長く設定できる(言い換えると、Y方向の広い範囲の反射光を受光できる)。 As shown in FIG. 3B, the measurement area 50 is composed of the first measurement area 51 and the second measurement area 52. The first measurement area 51 is the area from which the light receiving sensor 22 receives reflected light via the first lens element 251 (in other words, the first lens element 251 is an optical element that causes the light receiving sensor 22 to receive the reflected light from the first measurement area 51). The second measurement area 52 is the area from which the light receiving sensor 22 receives reflected light via the second lens element 252 (in other words, the second lens element 252 is an optical element that causes the light receiving sensor 22 to receive the reflected light from the second measurement area 52). Since the first lens element 251 and the second lens element 252 are arranged side by side in the Y direction, the first measurement area 51 and the second measurement area 52 are shifted from each other in the Y direction. This makes it possible to set the measurement area 50 long in the Y direction (in other words, reflected light can be received over a wide range in the Y direction).
 第1実施形態では、重複エリア53が設けられることによって、第1測定エリア51と第2測定エリア52との間に反射光を受光できない領域が形成されることを防止できる。 In the first embodiment, by providing the overlapping area 53, it is possible to prevent the formation of an area that cannot receive the reflected light between the first measurement area 51 and the second measurement area 52.
 図17A及び図17Bは、受光センサ22の画素221(受光領域)と、測定エリア50との関係を示す説明図である。図17Aは、第1レンズエレメント251を介して第1測定エリア51からの反射光を受光する場合の受光センサ22の画素221と、測定エリア50との関係を示す説明図である。図17Bは、第2レンズエレメント252を介して第2測定エリア52からの反射光を受光する場合の受光センサ22の画素221と、測定エリア50との関係を示す説明図である。図中の受光センサ22の各画素221の左側には、対応する測定エリア50の領域が示されている。受光センサ22の各画素221は、測定エリア50の対応する領域からの反射光を受光する。 FIGS. 17A and 17B are explanatory diagrams showing the relationship between the pixels 221 (light receiving regions) of the light receiving sensor 22 and the measurement area 50. FIG. 17A shows the relationship between the pixels 221 of the light receiving sensor 22 and the measurement area 50 when reflected light from the first measurement area 51 is received via the first lens element 251. FIG. 17B shows the relationship between the pixels 221 of the light receiving sensor 22 and the measurement area 50 when reflected light from the second measurement area 52 is received via the second lens element 252. The region of the measurement area 50 corresponding to each pixel 221 of the light receiving sensor 22 is shown on the left side of that pixel in the figures. Each pixel 221 of the light receiving sensor 22 receives reflected light from the corresponding region of the measurement area 50.
 受光センサ22は、Y方向に並ぶ複数の画素221(受光領域)を有しており、ここでは12個の画素(画素#1~#12)がY方向に並んでいる。なお、Y方向に並ぶ画素221の数は、12に限られるものではない(例えばVGAの受光センサ22の場合、640個の画素がY方向に並ぶ)。また、測定エリア50は、Y方向に複数の領域(領域A~領域P)に分割されており、ここでは16個の領域に分割されている。なお、Y方向に分割された測定エリア50の領域の数は、16に限られるものではない。 The light-receiving sensor 22 has a plurality of pixels 221 (light-receiving regions) arranged in the Y direction, and here, 12 pixels (pixels #1 to #12) are arranged in the Y direction. The number of pixels 221 arranged in the Y direction is not limited to 12 (for example, in the case of the VGA light receiving sensor 22, 640 pixels are arranged in the Y direction). Also, the measurement area 50 is divided into a plurality of regions (regions A to P) in the Y direction, and is divided into 16 regions here. Note that the number of regions of the measurement area 50 divided in the Y direction is not limited to 16.
 図17Aに示すように、第1測定エリア51は、領域A~領域Lに相当する。受光センサ22の画素#1~#12は、第1測定エリア51の領域L~領域Aにそれぞれ対応する。例えば、受光センサ22の画素#5は、第1レンズエレメント251を介して、測定エリア50の領域Hからの反射光を受光することになる。 As shown in FIG. 17A, the first measurement area 51 corresponds to regions A to L. Pixels #1 to #12 of the light receiving sensor 22 correspond to the regions L to A of the first measurement area 51, respectively. For example, the pixel #5 of the light receiving sensor 22 receives reflected light from the region H of the measurement area 50 via the first lens element 251 .
 図17Bに示すように、第2測定エリア52は、領域E~領域Pに相当する。受光センサ22の画素#1~#12は、第2測定エリア52の領域P~領域Eにそれぞれ対応する。例えば、受光センサ22の画素#5は、第2レンズエレメント252を介して、測定エリア50の領域Lからの反射光を受光することになる。このように、受光センサ22の或る画素(例えば画素#5)は、連結レンズ25の第1レンズエレメント251及び第2レンズエレメント252を介して、測定エリア50の2つの領域(例えば領域Hと領域L)からの反射光を受光可能である。以下の説明では、受光センサ22の或る画素(例えば画素#5)が測定エリア50の2つの位置(例えば領域Hと領域L)の反射光を受光可能である場合において、その2つの位置(例えば領域Hと領域L)のうち、第1レンズエレメント251を介して反射光を受光する位置(例えば領域H)を「第1位置」と呼び、第2レンズエレメント252を介して反射光を受光する領域(例えば領域L)を「第2位置」と呼ぶことがある。 As shown in FIG. 17B, the second measurement area 52 corresponds to the regions E to P. The pixels #1 to #12 of the light receiving sensor 22 correspond to the regions P to E of the second measurement area 52, respectively. For example, pixel #5 of the light receiving sensor 22 receives reflected light from the region L of the measurement area 50 via the second lens element 252. In this way, a certain pixel (for example, pixel #5) of the light receiving sensor 22 can receive reflected light from two regions (for example, the region H and the region L) of the measurement area 50 via the first lens element 251 and the second lens element 252 of the coupling lens 25. In the following description, when a certain pixel (for example, pixel #5) of the light receiving sensor 22 can receive reflected light from two positions (for example, the region H and the region L) of the measurement area 50, the position from which it receives reflected light via the first lens element 251 (for example, the region H) is called the "first position", and the position from which it receives reflected light via the second lens element 252 (for example, the region L) is called the "second position".
 ところで、受光センサ22の或る画素が測定エリア50の2つの領域(第1位置、第2位置)からの反射光を受光可能である場合、その画素の受光素子が反射光を受光した時に、第1位置及び第2位置のうちのどちらからの反射光を受光したのかを判別できるようにする必要がある。そこで、第1実施形態では、制御部30は、照射部10を制御することによって光を照射する測定エリア50上の位置を制御可能に構成されている。そして、制御部30は、第1位置及び第2位置に同時に光を照射しないように照射部10を制御する。これにより、例えば測定エリア50の領域Hに光が照射された時に受光センサ22の画素#5の受光素子が反射光を受光した場合には、制御部30は、受光した光が領域Lからの反射光ではなく、領域Hからの反射光であるものとし、画素#5の受光素子の信号に基づいて領域H内の対象物90の距離を算出することが可能である。 When a certain pixel of the light receiving sensor 22 can receive reflected light from two regions (the first position and the second position) of the measurement area 50, it is necessary to be able to determine, when the light receiving element of that pixel receives reflected light, from which of the first position and the second position the reflected light came. Therefore, in the first embodiment, the control unit 30 is configured to be able to control the position on the measurement area 50 irradiated with light by controlling the irradiation unit 10. The control unit 30 then controls the irradiation unit 10 so as not to irradiate the first position and the second position with light at the same time. As a result, for example, when the light receiving element of pixel #5 of the light receiving sensor 22 receives reflected light while the region H of the measurement area 50 is irradiated with light, the control unit 30 can determine that the received light is reflected light from the region H, not from the region L, and can calculate the distance to the object 90 in the region H based on the signal of the light receiving element of pixel #5.
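The pixel-to-region correspondence of FIGS. 17A and 17B, and the rule that a pixel's two candidate positions are never illuminated at the same time, can be sketched as follows. This is an illustrative model only, assuming the 12-pixel, 16-region example; the function names are not from the specification.

```python
# Illustrative model of the pixel-to-region correspondence in FIGS. 17A/17B.
# Pixels #1..#12 correspond to regions L..A via the first lens element 251
# and to regions P..E via the second lens element 252.

def region_via_first_lens(pixel):
    # pixel #1 -> region L, ..., pixel #12 -> region A
    return chr(ord('L') - (pixel - 1))

def region_via_second_lens(pixel):
    # pixel #1 -> region P, ..., pixel #12 -> region E
    return chr(ord('P') - (pixel - 1))

def attribute_echo(pixel, illuminated_region):
    """Resolve which of a pixel's two candidate positions an echo came from.
    Because the first and second positions are never lit simultaneously,
    the echo is attributed to whichever candidate was illuminated."""
    candidates = (region_via_first_lens(pixel), region_via_second_lens(pixel))
    return illuminated_region if illuminated_region in candidates else None
```

For pixel #5 the two candidates are region H (first position) and region L (second position); an echo detected while region H is lit is attributed to region H.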
 図17A及び図17Bに示すように、重複エリア53は、領域E~領域Lに相当する。図17Aに示すように、受光センサ22の画素#8~#1は、第1レンズエレメント251を介して、重複エリア53である領域E~領域Lからの反射光を受光する。また、図17Bに示すように、受光センサ22の画素#12~#5は、第2レンズエレメント252を介して、重複エリア53である領域E~領域Lからの反射光を受光する。重複エリア53の或る領域(例えば領域H)からの反射光は、連結レンズ25の第1レンズエレメント251及び第2レンズエレメント252を介して、受光センサ22の2つの画素(例えば画素#5と画素#9)で受光可能である。 As shown in FIGS. 17A and 17B, the overlapping area 53 corresponds to the regions E to L. As shown in FIG. 17A, the pixels #8 to #1 of the light receiving sensor 22 receive reflected light from the regions E to L, which form the overlapping area 53, via the first lens element 251. Further, as shown in FIG. 17B, the pixels #12 to #5 of the light receiving sensor 22 receive reflected light from the regions E to L, which form the overlapping area 53, via the second lens element 252. Reflected light from a certain region of the overlapping area 53 (for example, the region H) can therefore be received by two pixels of the light receiving sensor 22 (for example, pixel #5 and pixel #9) via the first lens element 251 and the second lens element 252 of the coupling lens 25.
 図3Bに示すように、Z方向から測定エリア50を見ると、測定エリア50はX方向及びY方向に所定の画角を有する。第1実施形態では、測定エリア50の、X方向の長さに対するY方向の長さの割合(いわゆるアスペクト比)は、受光センサ22の、X方向の長さに対するY方向の長さの割合(いわゆるアスペクト比)よりも大きい。つまり、第1実施形態では、受光センサ22の形状と比べて、測定エリア50をY方向に長く設定できる(言い換えると、Y方向の広い範囲の反射光を受光できる)。 As shown in FIG. 3B, when the measurement area 50 is viewed from the Z direction, the measurement area 50 has predetermined angles of view in the X direction and the Y direction. In the first embodiment, the ratio of the length in the Y direction to the length in the X direction (the so-called aspect ratio) of the measurement area 50 is larger than the ratio of the length in the Y direction to the length in the X direction (the so-called aspect ratio) of the light receiving sensor 22. That is, in the first embodiment, the measurement area 50 can be set longer in the Y direction than the shape of the light receiving sensor 22 (in other words, reflected light can be received over a wide range in the Y direction).
 図16Aに示すように、受光センサ22の受光面のうち、第1レンズエレメント251を介して重複エリア53からの反射光を受光する領域を「第1領域22A」と呼ぶ。また、図16Bに示すように、受光センサ22の受光面のうち、第2レンズエレメント252を介して重複エリア53からの反射光を受光する領域を「第2領域22B」と呼ぶ。重複エリア53上の或る点からの反射光は、受光センサ22の第1領域22A上の画素221と、第2領域22B上の画素221とでそれぞれ受光することができる。 As shown in FIG. 16A, the region of the light receiving surface of the light receiving sensor 22 that receives reflected light from the overlapping area 53 via the first lens element 251 is called the "first region 22A". Further, as shown in FIG. 16B, the region of the light receiving surface of the light receiving sensor 22 that receives reflected light from the overlapping area 53 via the second lens element 252 is called the "second region 22B". Reflected light from a certain point on the overlapping area 53 can be received both by a pixel 221 in the first region 22A and by a pixel 221 in the second region 22B of the light receiving sensor 22.
 図3Bに示すように、重複エリア53(ハッチングが施された領域)は、測定エリア50のY方向の中央部に位置する。つまり、図3Bに示すように、測定エリア50の中央部(重複エリア53)からの反射光は、第1レンズエレメント251を介して受光センサ22の第1領域22A上の画素221で受光することができるとともに、第2レンズエレメント252を介して受光センサ22の第2領域22B上の画素221で受光することができる。 As shown in FIG. 3B, the overlapping area 53 (hatched area) is located in the center of the measurement area 50 in the Y direction. That is, as shown in FIG. 3B, reflected light from the central portion (overlapping area 53) of the measurement area 50 passes through the first lens element 251 and is received by the pixels 221 on the first region 22A of the light receiving sensor 22. and can be received by the pixels 221 on the second region 22B of the light receiving sensor 22 via the second lens element 252 .
 図3Cに示すように、測定装置1を車両に搭載する場合、比較的近傍では、広視野で距離を測定することが望ましい。これに対し、第1実施形態では、図3Bに示すように、測定エリア50のアスペクト比を広げることができるので、広視野での距離の測定に有利である。
 一方、図3Cに示すように、測定装置1を車両に搭載する場合、遠方までの距離の測定が必要とされるエリアは、比較的狭い範囲であることが許容されるが、遠方から届く反射光の強度は弱くなることが想定される。これに対し、第1実施形態では、図3Bに示すように、測定エリア50の重複エリア53からの反射光を受光センサ22の2つの画素(第1領域22A及び第2領域22Bのそれぞれの画素221)で受光できるので、遠方の距離の測定に有利である。
As shown in FIG. 3C, when the measuring device 1 is mounted on a vehicle, it is desirable to measure distances over a wide field of view at relatively close range. In this respect, in the first embodiment, as shown in FIG. 3B, the aspect ratio of the measurement area 50 can be increased, which is advantageous for distance measurement over a wide field of view.
On the other hand, as shown in FIG. 3C, when the measuring device 1 is mounted on a vehicle, the area in which measurement of long distances is required may be relatively narrow, but the intensity of the reflected light arriving from far away is expected to be weak. In this respect, in the first embodiment, as shown in FIG. 3B, the reflected light from the overlapping area 53 of the measurement area 50 can be received by two pixels of the light receiving sensor 22 (the pixels 221 in each of the first region 22A and the second region 22B), which is advantageous for measuring long distances.
 <受光部20の光学条件について>
 図18A及び図18Bは、受光部20の光学条件の説明図である。図18Aは、受光センサ22と集光レンズ26との関係の説明図である。図18Bは、受光センサ22の虚像22’と連結レンズ25との関係の説明図である。
<Regarding the optical conditions of the light receiving unit 20>
FIGS. 18A and 18B are explanatory diagrams of the optical conditions of the light receiving section 20. FIG. 18A is an explanatory diagram of the relationship between the light receiving sensor 22 and the condenser lens 26. FIG. 18B is an explanatory diagram of the relationship between the virtual image 22' of the light receiving sensor 22 and the coupling lens 25.
 図18Aに示すように、集光レンズ26の焦点距離をfB1とする。また、集光レンズ26の主点から受光センサ22までの距離をLB1とする。また、受光センサ22のY方向の半分の長さをyBとする(受光用光学系24の光軸から受光センサ22の端部までの長さをyBとする)。 As shown in FIG. 18A, let the focal length of the condenser lens 26 be fB1. Also, let LB1 be the distance from the principal point of the condenser lens 26 to the light receiving sensor 22 . The half length of the light receiving sensor 22 in the Y direction is yB (the length from the optical axis of the light receiving optical system 24 to the end of the light receiving sensor 22 is yB).
 第1実施形態では、集光レンズ26の主点から受光センサ22までの距離LB1は、集光レンズ26の焦点距離fB1よりも小さい(LB1<fB1)。受光センサ22が集光レンズ26の焦点よりも近くに配置されるため、受光センサ22の虚像22’(集光レンズ26による受光センサ22の像)は、集光レンズ26から見て受光センサ22の反対側(図中の受光センサ22の左側)に配置される。 In the first embodiment, the distance LB1 from the principal point of the condenser lens 26 to the light receiving sensor 22 is smaller than the focal length fB1 of the condenser lens 26 (LB1<fB1). Since the light receiving sensor 22 is arranged closer to the condenser lens 26 than its focal point, the virtual image 22' of the light receiving sensor 22 (the image of the light receiving sensor 22 formed by the condenser lens 26) is located on the opposite side of the light receiving sensor 22 as seen from the condenser lens 26 (on the left side of the light receiving sensor 22 in the figure).
 集光レンズ26の主点から虚像22’までの距離をLB’とすると、LB1、LB’及びfB1の関係は次式(B1)の通りとなる。
 (1/LB1)-(1/LB’)=1/fB1    ・・・・(B1)
Assuming that the distance from the principal point of the condensing lens 26 to the virtual image 22' is LB', the relationship between LB1, LB' and fB1 is given by the following equation (B1).
(1/LB1)-(1/LB')=1/fB1 (B1)
 このため、集光レンズ26の主点から虚像22’までの距離LB’は、次式(B2)の通りとなる。
 LB’=(LB1×fB1)/(fB1-LB1)    ・・・・(B2)
Therefore, the distance LB' from the principal point of the condenser lens 26 to the virtual image 22' is given by the following equation (B2).
LB'=(LB1×fB1)/(fB1-LB1) (B2)
 また、受光用光学系24の光軸から受光センサ22の虚像22’の端部までの長さをyB’とすると、yB’は次式(B3)の通りとなる。
 yB’=yB×(LB’/LB1)
    =yB×fB1/(fB1-LB1)    ・・・・(B3)
Further, when the length from the optical axis of the light receiving optical system 24 to the edge of the virtual image 22' of the light receiving sensor 22 is yB', yB' is given by the following equation (B3).
yB'=yB*(LB'/LB1)
=yB×fB1/(fB1-LB1) (B3)
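Equations (B1) to (B3) can be checked numerically. The following is a minimal sketch, assuming arbitrary sample values; the numbers are illustrative and not design values from the specification.

```python
def virtual_image_distance(fB1, LB1):
    """LB' from eq. (B2): LB' = (LB1*fB1)/(fB1-LB1), derived from the
    thin-lens relation (1/LB1) - (1/LB') = 1/fB1 of eq. (B1)."""
    assert LB1 < fB1, "the sensor must sit inside the focal length (LB1 < fB1)"
    return (LB1 * fB1) / (fB1 - LB1)

def virtual_image_half_height(yB, fB1, LB1):
    """yB' from eq. (B3): yB' = yB * fB1 / (fB1 - LB1)."""
    return yB * fB1 / (fB1 - LB1)
```

For example, with fB1 = 10 mm and LB1 = 5 mm, LB' = 10 mm and the virtual image is magnified by the factor fB1/(fB1-LB1) = 2.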
 次に、第1レンズエレメント251及び第2レンズエレメント252の焦点距離をfB2とする。図18Bに示すように、連結レンズ25の主点(第1レンズエレメント251又は第2レンズエレメント252の主点)から虚像22’までの距離をLB2とする。ここで、図18Bに示すように、連結レンズ25が遠方の照射エリアからの反射光を虚像22’の位置に結像させると考えると、焦点距離fB2と距離LB2との関係は次式(B4)の通りとなる。
 LB2=fB2    ・・・・(B4)
Next, let fB2 be the focal length of the first lens element 251 and the second lens element 252. As shown in FIG. 18B, let LB2 be the distance from the principal point of the coupling lens 25 (the principal point of the first lens element 251 or the second lens element 252) to the virtual image 22'. Here, as shown in FIG. 18B, considering that the coupling lens 25 forms an image of the reflected light from a distant irradiation area at the position of the virtual image 22', the relationship between the focal length fB2 and the distance LB2 is expressed by the following equation (B4).
LB2=fB2 (B4)
 なお、集光レンズ26の主点と連結レンズ25の主点との距離をdBとする。ここで、距離LB2は、距離LB’に距離dBを加算した値に相当するため、焦点距離fB1,fB2と各距離との関係は、次式(B5)の通りとなる。 
 (LB1×fB1)/(fB1-LB1)+dB=fB2    ・・・・(B5)
The distance between the principal point of the condensing lens 26 and the principal point of the coupling lens 25 is assumed to be dB. Here, since the distance LB2 corresponds to a value obtained by adding the distance dB to the distance LB', the relationship between the focal lengths fB1 and fB2 and each distance is expressed by the following equation (B5).
(LB1×fB1)/(fB1-LB1)+dB=fB2 (B5)
 また、図18Bに示すように、受光用光学系24(集光レンズ26)の光軸と第1レンズエレメント251(又は第2レンズエレメント252)の光軸との間隔をtBとし、「シフト量tB」と呼ぶ。ここで、重複エリア53が形成されるためには、第1測定エリア51と第2測定エリア52とが重なる必要がある。このため、シフト量tBは、受光用光学系24の光軸から受光センサ22の虚像22’の端部までの長さyB’よりも小さい必要がある(tB<yB’)。つまり、重複エリア53が形成されるためには、連結レンズ25の第1レンズエレメント251及び第2レンズエレメント252のそれぞれのシフト量tBは、次式(B6)の条件を満たす必要がある。 
 tB<yB×fB1/(fB1-LB1)    ・・・・(B6)
Further, as shown in FIG. 18B, the distance between the optical axis of the light receiving optical system 24 (the condenser lens 26) and the optical axis of the first lens element 251 (or the second lens element 252) is defined as tB and called the "shift amount tB". Here, in order for the overlapping area 53 to be formed, the first measurement area 51 and the second measurement area 52 need to overlap. Therefore, the shift amount tB must be smaller than the length yB' from the optical axis of the light receiving optical system 24 to the edge of the virtual image 22' of the light receiving sensor 22 (tB<yB'). In other words, in order for the overlapping area 53 to be formed, the shift amount tB of each of the first lens element 251 and the second lens element 252 of the coupling lens 25 must satisfy the condition of the following equation (B6).
tB<yB×fB1/(fB1-LB1) (B6)
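Conditions (B5) and (B6) can likewise be expressed as a small numeric check. The sample values below are illustrative assumptions, not values from the specification.

```python
def required_fB2(fB1, LB1, dB):
    """Eq. (B5): fB2 = (LB1*fB1)/(fB1-LB1) + dB, the lens element focal
    length that images distant reflected light onto the virtual image plane."""
    return (LB1 * fB1) / (fB1 - LB1) + dB

def overlap_area_formed(tB, yB, fB1, LB1):
    """Eq. (B6): the overlapping area 53 is formed only if the shift tB of
    each lens element is smaller than the virtual-image half height yB'."""
    return tB < yB * fB1 / (fB1 - LB1)
```

With fB1 = 10, LB1 = 5 and dB = 2, the required fB2 is 12, and with yB = 3 the shift amount must stay below yB' = 6.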
 <測定例3>
 図19A~図19Cは、測定方法の一例の説明図である。図中の測定エリア50の各領域の右側には、対応する光源12の発光領域が示されている。また、図中の受光センサ22の各画素221の左側には、対応する測定エリア50の領域が示されている。
<Measurement example 3>
FIGS. 19A to 19C are explanatory diagrams of an example of the measurement method. On the right side of each region of the measurement area 50 in the figures, the corresponding light emitting region of the light source 12 is shown. On the left side of each pixel 221 of the light receiving sensor 22 in the figures, the corresponding region of the measurement area 50 is shown.
 光源12は、Y方向に複数の発光領域に分割されており、ここでは16個の発光領域(発光領域#1~#16)に分割されている。なお、Y方向に分割された光源12の発光領域の数は、16に限られるものではない。光源12の各発光領域(発光領域#1~#16)は、測定エリア50の各領域(領域P~領域A)に対応している。 The light source 12 is divided into a plurality of light-emitting regions in the Y direction, here 16 light-emitting regions (light-emitting regions #1 to #16). The number of light emitting regions of the light source 12 divided in the Y direction is not limited to 16. Each light emitting region (light emitting regions #1 to #16) of the light source 12 corresponds to each region of the measurement area 50 (region P to region A).
 制御部30は、光源12の複数の発光領域の中から特定の発光領域を発光させることによって、光を照射する測定エリア50上の位置を制御する。ここでは、制御部30は、図中の一番下の発光領域#16から順に発光するように、光源12を制御する。これにより、測定エリア50の領域A~領域Pに光が走査するように照射される。測定エリア50に照射する光をY方向に走査させることによって、受光センサ22の或る画素が測定エリア50の2つの位置(第1位置、第2位置)からの反射光を受光可能である場合に、その2つの位置(第1位置及び第2位置)に同時に光を照射しないようにすることができる。なお、制御部30は、Y方向に並ぶ発光領域を順に発光させなくても良い。例えば、制御部30は、ランダムな順に発光領域を発光させることによって、光を照射する測定エリア50上の位置を制御しても良い。また、受光センサ22の或る画素が受光可能な2つの位置(第1位置、第2位置)に同時に光を照射しなければ、測定エリア50の複数の領域に同時に光を照射しても良い。また、ミラーを回転させることによって、測定エリア50の領域A~領域Pに光を走査させても良い。 The control unit 30 controls the position on the measurement area 50 irradiated with light by causing a specific light emitting region among the plurality of light emitting regions of the light source 12 to emit light. Here, the control unit 30 controls the light source 12 so that the light emitting regions emit light in order from the lowermost light emitting region #16 in the figure. As a result, the regions A to P of the measurement area 50 are irradiated with light in a scanning manner. By scanning the light irradiated onto the measurement area 50 in the Y direction, when a certain pixel of the light receiving sensor 22 can receive reflected light from two positions (the first position and the second position) of the measurement area 50, those two positions are not irradiated with light at the same time. Note that the control unit 30 does not have to cause the light emitting regions arranged in the Y direction to emit light in order. For example, the control unit 30 may control the position on the measurement area 50 irradiated with light by causing the light emitting regions to emit light in random order. Further, as long as the two positions (the first position and the second position) from which a certain pixel of the light receiving sensor 22 can receive light are not irradiated at the same time, a plurality of regions of the measurement area 50 may be irradiated with light simultaneously. Alternatively, the light may be scanned over the regions A to P of the measurement area 50 by rotating a mirror.
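The emission schedule described above (light emitting regions #1 to #16 corresponding to the regions P to A, fired from #16 downward) can be sketched as follows. The helper name is hypothetical and assumes the 16-region example.

```python
def region_of_emitter(emitter):
    # light emitting region #1 -> region P, ..., region #16 -> region A
    return chr(ord('P') - (emitter - 1))

# Firing the emitters from #16 down to #1 scans the measurement area
# from region A to region P, one region at a time.
scan_order = [region_of_emitter(e) for e in range(16, 0, -1)]
```

Because only one region is lit at a time, a pixel's first and second positions are never illuminated simultaneously.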
 測定例3のタイミングチャートは、測定例1の図8のタイミングチャートと同様である。ただし測定例1は光源12の発光面の全体から光を出射させていたのに対して、測定例3は光源12の複数の発光領域の中から特定の発光領域を発光させている。測定例3においては、図8の上側には、光源12の発光領域がパルス光を出射するタイミング(出射タイミング)が示されている。図8の中央には、パルス状の反射光が到達するタイミング(到達タイミング)が示されている。図8の下側には、画素221(受光素子)の出力信号が示されている。 The timing chart of measurement example 3 is the same as the timing chart of measurement example 1 in FIG. 8. However, whereas measurement example 1 emits light from the entire light emitting surface of the light source 12, measurement example 3 causes a specific light emitting region among the plurality of light emitting regions of the light source 12 to emit light. In measurement example 3, the upper part of FIG. 8 shows the timing at which the light emitting region of the light source 12 emits pulsed light (emission timing). The center of FIG. 8 shows the timing at which the pulsed reflected light arrives (arrival timing). The lower part of FIG. 8 shows the output signal of the pixel 221 (light receiving element).
 制御部30(タイミング制御部34)は、光源12に所定の周期でパルス光を出射させる。パルス光の出射後、制御部30の測距部36(信号処理部362)は、画素221(受光素子)の出力信号に基づいて、反射光の到達タイミングを検出する。例えば、信号処理部362は、各画素221の出力信号のピークのタイミングに基づいて、反射光の到達タイミングを検出する。制御部30の測距部36(時間検出部364)は、光の出射タイミングと、光の到達タイミングとに基づいて、光を照射してから反射光が到達するまでの時間Tfを検出する。時間Tfは、測定装置1と対象物90との間を光が往復する時間に相当する。そして、測距部36(距離算出部366)は、時間Tfに基づいて、対象物90までの距離Lを算出する。なお、光を照射してから反射光が到達するまでの時間をTfとし、光の速度をCとしたとき、距離Lは、L=C×Tf/2となる。 The control unit 30 (timing control unit 34) causes the light source 12 to emit pulsed light at a predetermined cycle. After the pulsed light is emitted, the distance measurement section 36 (signal processing section 362) of the control section 30 detects the arrival timing of the reflected light based on the output signal of the pixel 221 (light receiving element). For example, the signal processing unit 362 detects the arrival timing of reflected light based on the peak timing of the output signal of each pixel 221 . The distance measurement unit 36 (time detection unit 364) of the control unit 30 detects the time Tf from when the light is emitted to when the reflected light arrives, based on the light emission timing and the light arrival timing. The time Tf corresponds to the time it takes the light to make a round trip between the measurement device 1 and the object 90 . Then, the distance measurement unit 36 (distance calculation unit 366) calculates the distance L to the object 90 based on the time Tf. When the time from irradiation of light to arrival of reflected light is Tf, and the speed of light is C, the distance L is L=C×Tf/2.
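The relation L = C × Tf / 2 used by the distance calculation unit 366 can be written directly as code. This is a minimal sketch of the formula only, not the actual implementation.

```python
C = 299_792_458.0  # speed of light in vacuum [m/s]

def distance_from_round_trip_time(tf):
    """Distance L to the object: the light travels out and back in time Tf,
    so L = C * Tf / 2."""
    return C * tf / 2.0
```

For example, a round-trip time Tf of 1 microsecond corresponds to a distance of roughly 150 m.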
 図19Aに示すように、光源12の発光領域#16から出射した光は、投光用光学系14を介して、測定エリア50(第1測定エリア51)の領域Aに照射される。受光センサ22の画素#12は、受光用光学系24(第1レンズエレメント251)を介して、領域Aからの反射光を受光する。制御部30は、受光センサ22の画素#12の出力信号に基づいて、領域A内の対象物90までの距離を算出する。 As shown in FIG. 19A, the light emitted from the light emitting region #16 of the light source 12 is irradiated onto the region A of the measurement area 50 (first measurement area 51) via the light projecting optical system 14. Pixel #12 of the light receiving sensor 22 receives the reflected light from the region A via the light receiving optical system 24 (the first lens element 251). The control unit 30 calculates the distance to the object 90 in the region A based on the output signal of pixel #12 of the light receiving sensor 22.
 なお、制御部30は、発光領域#16の発光後(言い換えると、領域Aの測定後)、発光領域#15~#13を順に発光させ、第1測定エリア51の領域B~Dに光を順に照射させ、受光センサ22の画素#11~#9の出力信号に基づいて、領域B~領域D内の対象物90までの距離を順に算出する。 Note that after the light emission of the light emitting region #16 (in other words, after the measurement of the region A), the control unit 30 causes the light emitting regions #15 to #13 to emit light in order, irradiates the regions B to D of the first measurement area 51 with light in order, and calculates the distances to the object 90 in the regions B to D in order based on the output signals of the pixels #11 to #9 of the light receiving sensor 22.
 図19Bに示すように、光源12の発光領域#12から出射した光は、投光用光学系14を介して、測定エリア50の領域Eに照射される。領域Eは重複エリア53に属するため、領域Eからの反射光は、連結レンズ25の第1レンズエレメント251及び第2レンズエレメント252を介して、受光センサ22の画素#8と画素#12で受光可能である。 As shown in FIG. 19B, the light emitted from the light emitting region #12 of the light source 12 is irradiated onto the region E of the measurement area 50 via the light projecting optical system 14. Since the region E belongs to the overlapping area 53, the reflected light from the region E can be received by both pixel #8 and pixel #12 of the light receiving sensor 22, via the first lens element 251 and the second lens element 252 of the coupling lens 25, respectively.
 制御部30は、受光センサ22の画素#8の出力信号に基づいて距離を算出するとともに、受光センサ22の画素#12の出力信号に基づいて距離を算出し、算出した2つの距離の平均を求めることによって、領域E内の対象物90までの距離を算出する。これにより、距離の算出精度を高めたり、ロバスト性を高めたりすることができる。また、制御部30は、受光センサ22の画素#8の出力信号に基づいて距離を算出するとともに、受光センサ22の画素#12の出力信号に基づいて距離を算出し、算出した2つの距離の差が所定値よりも大きいときに、異常を検出しても良い。なお、制御部30は、受光センサ22の画素#8と画素#12の一方の出力信号に基づいて、領域E内の対象物90までの距離を算出しても良い。 The control unit 30 calculates a distance based on the output signal of pixel #8 of the light receiving sensor 22, calculates a distance based on the output signal of pixel #12 of the light receiving sensor 22, and calculates the distance to the object 90 in the region E by averaging the two calculated distances. This can improve the accuracy of the distance calculation and improve robustness. Alternatively, the control unit 30 may calculate a distance based on the output signal of pixel #8 of the light receiving sensor 22 and a distance based on the output signal of pixel #12 of the light receiving sensor 22, and detect an abnormality when the difference between the two calculated distances is larger than a predetermined value. Note that the control unit 30 may calculate the distance to the object 90 in the region E based on the output signal of only one of pixel #8 and pixel #12 of the light receiving sensor 22.
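The redundant measurement of an overlap-area region through two pixels, with averaging and the optional anomaly check, can be sketched as follows. The function name and the threshold value are assumptions for illustration, not values from the specification.

```python
def fuse_overlap_distances(d_first, d_second, max_diff):
    """Fuse the two distances obtained for the same overlap-area region via
    the first and second lens elements. Returns (distance, anomaly): the
    average of the two values, and a flag set when they disagree by more
    than max_diff, which may indicate a fault."""
    anomaly = abs(d_first - d_second) > max_diff
    return (d_first + d_second) / 2.0, anomaly
```

Averaging the two readings improves accuracy, while the disagreement flag gives the self-check described above.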
 制御部30は、発光領域#12の発光後(言い換えると、領域Eの測定後)、発光領域#11~#5を順に発光させ、重複エリア53の領域F~Lに光を順に照射させ、反射光を受光した2つの画素の出力信号に基づいて領域F~領域L内の対象物90までの距離を順に算出する。 After the light emission of the light emitting region #12 (in other words, after the measurement of the region E), the control unit 30 causes the light emitting regions #11 to #5 to emit light in order, irradiates the regions F to L of the overlapping area 53 with light in order, and calculates the distances to the object 90 in the regions F to L in order based on the output signals of the two pixels that receive the reflected light.
 図19Cに示すように、光源12の発光領域#4から出射した光は、投光用光学系14を介して、測定エリア50(第2測定エリア52)の領域Mに照射される。受光センサ22の画素#4は、受光用光学系24(第2レンズエレメント252)を介して、領域Mからの反射光を受光する。制御部30は、受光センサ22の画素#4の出力信号に基づいて、領域M内の対象物90までの距離を算出する。制御部30は、発光領域#4の発光後(言い換えると、領域Mの測定後)、発光領域#3~#1を順に発光させ、第2測定エリア52の領域N~Pに光を順に照射させ、受光センサ22の画素#3~#1の出力信号に基づいて、領域N~領域P内の対象物90までの距離を順に算出する。 As shown in FIG. 19C, the light emitted from the light emitting region #4 of the light source 12 is irradiated onto the region M of the measurement area 50 (second measurement area 52) via the light projecting optical system 14. As shown in FIG. The pixel #4 of the light receiving sensor 22 receives the reflected light from the area M via the light receiving optical system 24 (second lens element 252). The control unit 30 calculates the distance to the object 90 within the area M based on the output signal of the pixel #4 of the light receiving sensor 22 . After the light emission of the light emitting region #4 (in other words, after the measurement of the region M), the control unit 30 sequentially causes the light emitting regions #3 to #1 to emit light, and sequentially irradiates the regions N to P of the second measurement area 52 with light. Then, based on the output signals of the pixels #3 to #1 of the light receiving sensor 22, the distances to the object 90 in the regions N to P are calculated in order.
 According to the above measurement method, the distance to the object 90 in each of the 16 divided regions (regions A to P) of the measurement area 50 can be calculated from the output of the light receiving sensor 22, in which 12 pixels 221 (pixels #1 to #12) are arranged in the Y direction. The light receiving sensor 22 can thus receive reflected light over a wide range in the Y direction. For example, using a VGA (640 × 480 pixels) light receiving sensor 22 with 640 pixels arranged in the Y direction, an image (distance image) of the measurement area 50 can be acquired with a resolution of 840 pixels in the Y direction.
 <Measurement example 4>
 Another example of the light receiving sensor 22, shown in FIG. 9A, may be used. Even when each pixel 221 of the light receiving sensor 22 has a plurality of light receiving elements 222 as shown in FIG. 9A, the control unit 30 controls the light source 12 so that the light emitting regions emit light in order, starting from light emitting region #16 at the bottom of FIGS. 19A to 19C. As a result, regions A to P of the measurement area 50 are irradiated with light in a scanning manner.
 First, as shown in FIG. 19A, when the light emitted from light emitting region #16 of the light source 12 is irradiated onto region A of the measurement area 50 (first measurement area 51) via the light projecting optical system 14, and pixel #12 of the light receiving sensor 22 receives the reflected light from region A via the light receiving optical system 24 (first lens element 251), the distance to the object 90 is calculated by the method shown in FIGS. 9A to 9C.
 Next, as shown in FIG. 19B, a case will be described in which the light emitted from light emitting region #12 of the light source 12 is irradiated onto region E of the overlapping area 53 via the light projecting optical system 14, and pixels #8 and #12 of the light receiving sensor 22 receive the reflected light from region E via the first lens element 251 and the second lens element 252 of the light receiving optical system 24.
 FIG. 20 is an explanatory diagram of the signal processing unit 362 when the overlapping area 53 is measured. The signal processing unit 362 has an addition unit 362A, a comparison unit 362B, a histogram generation unit 362C, and a histogram synthesis unit 362D. The addition unit 362A, the comparison unit 362B, and the histogram generation unit 362C have already been described, so their description is omitted.
 The histogram synthesis unit 362D synthesizes the two histograms (single histograms) generated by the two histogram generation units 362C into one histogram (composite histogram). The histogram synthesis unit 362D generates the composite histogram by combining the single histograms based on the two pixels associated with a region of the overlapping area 53; here, it combines the two single histograms based on pixels #8 and #12 of the light receiving sensor 22, which correspond to region E. Because the composite histogram is generated by combining two single histograms, a histogram with a predetermined number of accumulations can be obtained even if each single histogram is accumulated only half that number of times. Consequently, when the overlapping area 53 is measured, a histogram with the set number of accumulations can be generated quickly, so the distance can be measured at high speed. Furthermore, although the intensity of the reflected light from the overlapping area 53 is expected to be weak, as shown in FIG. 3C, combining the single histograms based on two pixels of the light receiving sensor 22 and measuring the distance to the object 90 from the composite histogram suppresses degradation of the distance measurement accuracy even when the reflected light is weak.
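As a rough sketch of the synthesis performed by the histogram synthesis unit 362D, the two single histograms can be summed bin by bin (Python; the bin-wise addition is an assumption about the synthesis method, which the disclosure does not specify further):

```python
def synthesize_histograms(hist_a, hist_b):
    """Combine two single histograms (e.g. from pixels #8 and #12)
    into one composite histogram by adding counts bin by bin."""
    assert len(hist_a) == len(hist_b), "histograms must share time bins"
    return [a + b for a, b in zip(hist_a, hist_b)]

# Each single histogram was accumulated half the predetermined number of
# times; their bin-wise sum has the full number of accumulations.
h8 = [0, 1, 4, 2, 0]    # counts per time bin from pixel #8
h12 = [1, 0, 3, 3, 1]   # counts per time bin from pixel #12
composite = synthesize_histograms(h8, h12)  # [1, 1, 7, 5, 1]
```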
 FIG. 21 is an explanatory diagram of another signal processing unit 362 for measuring the overlapping area 53.
 This signal processing unit 362 has an addition unit 362A, a comparison unit 362B, and a histogram generation unit 362C. The addition unit 362A adds the output signals of the plurality of light receiving elements 222 (SPADs) that constitute the two pixels 221 associated with a region of the overlapping area 53 (here, pixels #8 and #12). In other words, the addition unit 362A treats these two pixels 221 as substantially one pixel and adds the output signals of all the light receiving elements 222 (SPADs) constituting that pixel. Compared with the addition unit 362A shown in FIG. 11B, the addition unit 362A shown in FIG. 21 therefore adds the output signals of twice as many light receiving elements 222 (SPADs). The comparison unit 362B compares the output signal of the addition unit 362A with a threshold and outputs a signal when the output signal of the addition unit 362A is equal to or greater than the threshold.
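A minimal sketch of the addition unit 362A and comparison unit 362B of FIG. 21, treating the two pixels as one (Python; representing the instantaneous SPAD outputs as 0/1 values is an assumption made for illustration):

```python
def merged_pixel_trigger(spads_pixel8, spads_pixel12, threshold):
    """Sum the instantaneous 0/1 outputs of all SPADs of pixels #8 and
    #12 (addition unit 362A) and fire when the sum reaches the
    threshold (comparison unit 362B)."""
    total = sum(spads_pixel8) + sum(spads_pixel12)
    return total >= threshold

# Twice as many SPAD outputs are summed, so a larger threshold can be
# used without reducing how often the comparator fires.
fired = merged_pixel_trigger([1, 0, 1, 1], [0, 1, 1, 0], threshold=5)
```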
 The setting unit 32 (see FIG. 1) sets the threshold of the comparison unit 362B used when measuring the overlapping area 53 (see FIG. 21) to a larger value than the threshold of the comparison unit 362B used when measuring the part of the measurement area 50 other than the overlapping area 53 (see FIG. 9B). As a result, when the output signal of the addition unit 362A is equal to or greater than the threshold, the probability that the light receiving elements 222 (SPADs) of the light receiving sensor 22 have actually detected reflected light is higher. Because the addition unit 362A adds the output signals of a large number of light receiving elements 222 (SPADs), the frequency with which the comparison unit 362B outputs a signal to the histogram generation unit 362C does not decrease even when the threshold is set to a large value (generating a histogram with the predetermined number of accumulations is not slowed down). For this reason, setting the threshold of the comparison unit 362B shown in FIG. 21 to a large value is acceptable.
 Based on the output of the comparison unit 362B, the histogram generation unit 362C generates a histogram by repeatedly measuring the time at which the light receiving elements 222 (SPADs) of the light receiving sensor 22 detect light and incrementing the frequency (count) associated with that time. Compared with the signal processing unit 362 shown in FIG. 20, the signal processing unit 362 shown in FIG. 21 can reduce the memory capacity needed to store the data for generating the histogram, as well as the amount of computation for generating it. Moreover, although the intensity of the reflected light from the overlapping area 53 is expected to be weak, as shown in FIG. 3C, the reflected light from the overlapping area 53 is received by the light receiving elements 222 (SPADs) of two pixels of the light receiving sensor 22, which suppresses degradation of the distance measurement accuracy even when the reflected light is weak.
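The repeated time measurement and frequency increment performed by the histogram generation unit 362C can be sketched as follows (Python; the bin width and number of bins are illustrative assumptions):

```python
def build_histogram(detection_times, bin_width, n_bins):
    """Accumulate a detection-time histogram: for every detection
    event, increment the count of the time bin it falls in."""
    hist = [0] * n_bins
    for t in detection_times:
        b = int(t // bin_width)  # index of the time bin for this event
        if b < n_bins:
            hist[b] += 1
    return hist

# Detection times (e.g. in nanoseconds) over repeated measurements:
# repeated reflections pile up in one bin, stray counts stay isolated.
hist = build_histogram([3.2, 3.4, 3.3, 7.9, 3.5], bin_width=1.0, n_bins=10)
```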
 === Summary 1 ===
 The measuring device 1 described above includes the light source 12, the light projecting optical system 14 that irradiates the measurement area 50 with the light of the light source 12, and the light receiving unit 20 that receives the reflected light from the measurement area 50. The light projecting optical system 14 has the connecting lens 15, in which the first lens element 151 and the second lens element 152 are joined. The optical axis of the first lens element 151 is shifted in the +Y direction (corresponding to a first direction), and the optical axis of the second lens element 152 is shifted in the -Y direction (the direction opposite to the shift of the optical axis of the first lens element 151). The light of the light source 12 is irradiated onto the first measurement area 51 via the first lens element 151, and onto the second measurement area 52, which includes the overlapping area 53, via the second lens element 152. Such a measuring device 1 can irradiate a wide range in the Y direction (corresponding to the first direction) with light. In addition, because the overlapping area 53 is provided, no region that cannot be irradiated with light is formed between the first measurement area 51 and the second measurement area 52.
 In the measuring device 1 described above, the ratio of the length of the measurement area 50 in the Y direction (corresponding to the first direction) to its length in the X direction (corresponding to a second direction) is larger than the ratio of the length of the light source 12 in the Y direction to its length in the X direction (see FIGS. 3A and 3B). This widens the angle of view of the measurement area 50 in the Y direction.
 In the measuring device 1 described above, by causing the first region 121 and the second region 122 of the light source 12 to emit light simultaneously, the light projecting optical system 14 irradiates the overlapping area 53 with the light of the first region 121 of the light source 12 via the first lens element 151, and irradiates the overlapping area 53 with the light of the second region 122 of the light source 12 via the second lens element 152. This increases the intensity of the light irradiating the overlapping area 53 of the measurement area 50 (the hatched region in FIG. 3B).
 The light projecting optical system 14 described above has the projection lens 16 between the light source 12 and the connecting lens 15. Because the light projecting optical system 14 has the projection lens 16, the light of the light source 12 can be projected over a wide range. However, the light projecting optical system 14 need not have the projection lens 16 (in this case, the virtual image 12' in FIG. 6B is replaced by the light source 12 itself, so the light source 12 and the measuring device 1 become larger).
 In the light projecting optical system 14 described above, when fA1 is the focal length of the projection lens 16, LA1 is the distance from the principal point of the projection lens 16 to the light source 12, yA is half the width of the light source 12 in the Y direction (corresponding to the first direction), and tA is the distance (shift amount) between the optical axis of the light projecting optical system 14 and the optical axis of the first lens element 151 in the Y direction, it is desirable that tA < yA × fA1/(fA1 - LA1). This makes it possible to provide the overlapping area 53 in the measurement area 50.
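The condition on the shift amount can be checked numerically, for example as follows (Python; the numeric values are purely illustrative and are not taken from the disclosure):

```python
def shift_within_limit(tA, yA, fA1, LA1):
    """Return True if tA < yA * fA1 / (fA1 - LA1), the stated
    condition for the overlapping area 53 to exist."""
    # The formula presumes the light source lies inside the focal
    # length of the projection lens 16 (fA1 > LA1), as assumed here.
    assert fA1 > LA1
    return tA < yA * fA1 / (fA1 - LA1)

# Illustrative values: focal length 10 mm, source 8 mm from the
# principal point, half-width 1 mm, shift 3 mm (limit: 5 mm).
ok = shift_within_limit(tA=3.0, yA=1.0, fA1=10.0, LA1=8.0)
```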
 The control unit 30 of the measuring device 1 described above calculates the distance to the object 90 that reflected the light, based on the arrival time from when the light source 12 is caused to emit light until the reflected light is received. However, the measuring device 1 is not limited to one that measures the distance to the object 90. For example, the measuring device 1 may, without calculating a distance, measure only the arrival time from when the light source 12 emits light until the reflected light is received, or it may measure an image (luminance image) of the measurement area 50.
 The control unit 30 of the measuring device 1 described above generates a histogram by repeatedly measuring the time at which the light receiving elements 222 of the light receiving unit 20 detect light, and detects the arrival time Tf of the light based on the peak of the histogram. When the arrival time Tf is detected using a histogram in this way, increasing the intensity of the light in the overlapping area 53 of the measurement area 50 with the connecting lens 15 of the first embodiment is particularly effective.
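Detecting the arrival time Tf from the histogram peak and converting it to a distance can be sketched as follows (Python; the bin width and taking the bin center as Tf are illustrative assumptions):

```python
C = 299_792_458.0  # speed of light [m/s]

def distance_from_histogram(hist, bin_width_s):
    """Take the peak bin of the detection-time histogram as the
    arrival time Tf and convert it to a distance."""
    peak_bin = max(range(len(hist)), key=hist.__getitem__)
    tf = (peak_bin + 0.5) * bin_width_s  # bin center taken as Tf
    return C * tf / 2.0                  # halve: light travels out and back

# Peak in bin 66 of 1 ns bins -> Tf ~ 66.5 ns -> roughly 10 m.
hist = [0] * 100
hist[66] = 25
d = distance_from_histogram(hist, bin_width_s=1e-9)
```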
 === Summary 2 ===
 The measuring device 1 described above includes the irradiation unit 10 that irradiates the measurement area 50 with light, the light receiving sensor 22, and the light receiving optical system 24 that causes the light receiving sensor 22 to receive the reflected light from the measurement area 50. The light receiving optical system 24 has the connecting lens 25, in which the first lens element 251 and the second lens element 252 are joined. The optical axis of the first lens element 251 is shifted in the +Y direction (corresponding to the first direction), and the optical axis of the second lens element 252 is shifted in the -Y direction (the direction opposite to the shift of the optical axis of the first lens element 251). The reflected light from the first measurement area 51 of the measurement area 50 is received by the light receiving elements of the light receiving sensor 22 via the first lens element 251, and the reflected light from the second measurement area 52, which includes the overlapping area 53, is received by the light receiving elements of the light receiving sensor 22 via the second lens element 252. Such a measuring device 1 can receive reflected light from a wide range in the Y direction (corresponding to the first direction). In addition, because the overlapping area 53 is provided, no region from which reflected light cannot be received is formed between the first measurement area 51 and the second measurement area 52.
 The control unit 30 described above controls the position on the measurement area 50 that is irradiated with light. This makes it possible to irradiate a predetermined position on the measurement area 50 with light, or to avoid irradiating a predetermined position on the measurement area 50 with light.
 A certain pixel of the light receiving sensor 22 (for example, pixel #5 shown in FIGS. 17A and 17B) can receive reflected light from two positions in the measurement area 50 (a first position and a second position; for example, region H and region L). In such a case, the control unit 30 controls the irradiation unit 10 so that the first position in the first measurement area 51 (for example, region H) and the second position in the second measurement area 52 (for example, region L), both of which that pixel (for example, pixel #5) can receive light from, are not irradiated with light at the same time. As a result, when the light receiving element of that pixel receives reflected light, it can be determined from which of the two positions on the measurement area 50 the reflected light came.
 As shown in FIG. 19B, the reflected light from region E (a predetermined position) of the overlapping area 53 can be received by the light receiving element of pixel #8 of the light receiving sensor 22 (a first light receiving element) via the first lens element 251, and can also be received by the light receiving element of pixel #12 of the light receiving sensor 22 (a second light receiving element) via the second lens element 252. Making the reflected light from the same position receivable by two light receiving elements in this way enhances robustness.
 When light is irradiated onto region E (the predetermined position) of the overlapping area 53, the control unit 30 acquires the output signal of the light receiving element of pixel #8 of the light receiving sensor 22 (a first light reception result) and the output signal of the light receiving element of pixel #12 (a second light reception result). Acquiring the output signals (light reception results) of the two light receiving elements that receive the reflected light from the same position in this way enhances robustness.
 The control unit 30 also calculates the distance to the object 90 in region E based on the output signal of the light receiving element of pixel #8 of the light receiving sensor 22 (the first light reception result), and calculates the distance to the object 90 in the same region E based on the output signal of the light receiving element of pixel #12 (the second light reception result). Calculating a distance from each of the output signals (light reception results) of the two light receiving elements that received the reflected light from the same position in this way enhances robustness.
 The control unit 30 may also detect an anomaly based on the distance calculated from the output signal of pixel #8 of the light receiving sensor 22 (the first light reception result) and the distance calculated from the output signal of pixel #12 (the second light reception result). This makes it possible to determine that some malfunction has occurred in the measuring device 1.
 As shown in FIG. 20, the control unit 30 may generate a single histogram (a first histogram) by repeatedly measuring the time at which the light receiving element of pixel #8 of the light receiving sensor 22 (the first light receiving element) detects light, generate a single histogram (a second histogram) by repeatedly measuring the time at which the light receiving element of pixel #12 (the second light receiving element) detects light, and generate a composite histogram (a third histogram) by combining the two single histograms. In this case, the control unit 30 can detect the arrival time Tf of the light (see FIG. 9C) based on the peak of the composite histogram (the third histogram) and calculate the distance to the object based on the detected arrival time Tf. As a result, a composite histogram (third histogram) with a predetermined number of accumulations can be generated even if each single histogram (first and second histograms) is accumulated only half that number of times, so the distance can be measured at high speed.
 As shown in FIG. 21, the control unit 30 may instead generate a histogram by repeatedly measuring both the time at which the light receiving element of pixel #8 of the light receiving sensor 22 (the first light receiving element) detects light and the time at which the light receiving element of pixel #12 (the second light receiving element) detects light. This reduces the memory capacity needed to store the data for generating the histogram, as well as the amount of computation for generating it. In this case as well, the control unit 30 can detect the arrival time Tf of the light (see FIG. 9C) based on the peak of the histogram and calculate the distance to the object based on the detected arrival time Tf.
 The light receiving optical system 24 described above has the condenser lens 26 between the light receiving sensor 22 and the connecting lens 25. Because the light receiving optical system 24 has the condenser lens 26, the light receiving sensor 22 can receive reflected light from a wide range. However, the light receiving optical system 24 need not have the condenser lens 26 (in this case, the light receiving sensor 22 is replaced by one having a light receiving surface the size of the virtual image 22' in FIG. 18B, so the light receiving sensor 22 and the measuring device 1 become larger).
 In the light receiving optical system 24 described above, when fB1 is the focal length of the condenser lens 26, LB1 is the distance from the principal point of the condenser lens 26 to the light receiving sensor 22, yB is half the width of the light receiving sensor 22 in the Y direction (corresponding to the first direction), and tB is the distance (shift amount) between the optical axis of the light receiving optical system 24 and the optical axis of the first lens element 251 in the Y direction, it is desirable that tB < yB × fB1/(fB1 - LB1). This makes it possible to provide the overlapping area 53 in the measurement area 50.
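The receiving-side condition has the same form as the projecting-side one and can be checked in the same way (Python; the numeric values are purely illustrative and not taken from the disclosure):

```python
def overlap_possible(tB, yB, fB1, LB1):
    """Return True if tB < yB * fB1 / (fB1 - LB1), the stated
    condition for the shift of the first lens element 251 to still
    leave an overlapping area 53 in the measurement area 50."""
    return tB < yB * fB1 / (fB1 - LB1)

# Illustrative values: focal length 12 mm, sensor 9 mm from the
# principal point, half-width 1.5 mm, shift 2 mm (limit: 6 mm).
ok = overlap_possible(tB=2.0, yB=1.5, fB1=12.0, LB1=9.0)
```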
 The embodiments of the present disclosure have been described in detail above, but the present disclosure is not limited to the above embodiments and includes various modifications. The above embodiments describe the configurations in detail in order to explain the present disclosure in an easy-to-understand manner, and the disclosure is not necessarily limited to embodiments having all of the described configurations. It is also possible to add to, delete from, or replace part of the configuration of the above embodiments with other configurations.
 This application claims priority based on Japanese Patent Application No. 2022-018941 filed on February 9, 2022 and Japanese Patent Application No. 2022-018942 filed on February 9, 2022, the entire contents of which are incorporated herein by reference.

Claims (20)

  1.  A measuring device comprising:
     a light source;
     a light projecting optical system that irradiates a measurement area with light of the light source; and
     a light receiving unit that receives reflected light from the measurement area, wherein
     the light projecting optical system
        has a connecting lens in which a first lens element whose optical axis is shifted in a first direction and a second lens element whose optical axis is shifted in the direction opposite to that of the first lens element are joined,
        irradiates a first measurement area with the light of the light source via the first lens element, and
        irradiates a second measurement area, including an overlapping area that overlaps the first measurement area, with the light of the light source via the second lens element.
  2.  The measuring device according to claim 1, wherein
     a ratio of a length of the measurement area in the first direction to its length in a second direction perpendicular to the optical axis and the first direction is larger than a ratio of a length of the light source in the first direction to its length in the second direction.
  3.  The measuring device according to claim 1 or 2, wherein,
     by causing a first region and a second region of the light source to emit light simultaneously, the light projecting optical system
     irradiates the overlapping area with light of the first region via the first lens element, and
     irradiates the overlapping area with light of the second region via the second lens element.
  4.  The measuring device according to any one of claims 1 to 3, wherein
     the light projecting optical system has a projection lens between the light source and the connecting lens.
  5.  The measuring device according to claim 4, wherein,
     when fA1 is a focal length of the projection lens,
     LA1 is a distance from a principal point of the projection lens to the light source,
     yA is half a width of the light source in the first direction, and
     tA is a distance between an optical axis of the light projecting optical system and the optical axis of the first lens element in the first direction,
     tA < yA × fA1/(fA1 - LA1).
  6.  The measuring device according to any one of claims 1 to 5, comprising
     a control unit that calculates a distance to an object that reflected the reflected light, based on an arrival time from when the light source is caused to emit light until the reflected light is received.
  7.  The measuring device according to claim 6, wherein
     the control unit
        generates a histogram by repeatedly measuring a time at which a light receiving element of the light receiving unit detects light, and
        detects the arrival time based on a peak of the histogram.
  8.  A measuring device comprising:
     an irradiation unit that irradiates a measurement area with light;
     a light receiving sensor; and
     a light receiving optical system that causes the light receiving sensor to receive reflected light from the measurement area, wherein
     the light receiving optical system
        has a connecting lens in which a first lens element whose optical axis is shifted in a first direction and a second lens element whose optical axis is shifted in the direction opposite to that of the first lens element are joined,
        causes a light receiving element of the light receiving sensor to receive the reflected light from a first measurement area via the first lens element, and
        causes the light receiving element to receive, via the second lens element, the reflected light from a second measurement area including an overlapping area that overlaps the first measurement area.
9.  The measuring device according to claim 8, comprising
    a control unit that controls the position on the measurement area at which the irradiation unit irradiates the light.
10.  The measuring device according to claim 9, wherein,
     when the light receiving element of the light receiving sensor can receive reflected light from a first position in the first measurement area and reflected light from a second position in the second measurement area, the control unit controls the irradiation unit so that the first position and the second position are not irradiated with the light at the same time.
11.  The measuring device according to claim 8, wherein
     the light receiving sensor has a first light receiving element and a second light receiving element,
     the first light receiving element can receive, via the first lens element, reflected light from a predetermined position on the measurement area, and
     the second light receiving element can receive, via the second lens element, reflected light from the predetermined position on the measurement area.
12.  The measuring device according to claim 9 or 10, wherein
     the light receiving sensor has a first light receiving element and a second light receiving element,
     the first light receiving element can receive, via the first lens element, reflected light from a predetermined position on the measurement area, and
     the second light receiving element can receive, via the second lens element, reflected light from the predetermined position on the measurement area.
13.  The measuring device according to claim 11, comprising
     a control unit that acquires a first light reception result of the first light receiving element and a second light reception result of the second light receiving element.
14.  The measuring device according to claim 12, wherein
     the control unit acquires a first light reception result of the first light receiving element and a second light reception result of the second light receiving element.
15.  The measuring device according to claim 13 or 14, wherein the control unit
     calculates the distance to an object at the predetermined position based on the first light reception result, and
     calculates the distance to the object at the predetermined position based on the second light reception result.
16.  The measuring device according to claim 15, wherein
     the control unit detects an abnormality based on the distance calculated from the first light reception result and the distance calculated from the second light reception result.
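The self-check of claim 16 compares two independently obtained distances to the same spot, one per lens element. A minimal sketch of that comparison follows; the tolerance value and function name are illustrative assumptions, not from the publication:

```python
# Sketch of claim 16's abnormality detection: the same predetermined
# position is ranged twice, once through each lens element. If the two
# results disagree by more than a tolerance (the 0.1 m default here is an
# illustrative assumption), the optical or sensing path is flagged.

def detect_abnormality(distance1_m, distance2_m, tolerance_m=0.1):
    """Return True when the two independent range results disagree."""
    return abs(distance1_m - distance2_m) > tolerance_m

print(detect_abnormality(10.00, 10.02))  # consistent results -> False
print(detect_abnormality(10.00, 12.50))  # mismatch          -> True
```

Because the two paths share the target but not the optics, a persistent mismatch points at a fault in one receiving path rather than at the scene.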
17.  The measuring device according to claim 13 or 14, wherein the control unit
     generates a first histogram by repeatedly measuring the time at which the first light receiving element detects light,
     generates a second histogram by repeatedly measuring the time at which the second light receiving element detects light,
     generates a third histogram by combining the first histogram and the second histogram, and
     calculates the distance to an object at the predetermined position by detecting, based on a peak of the third histogram, the arrival time from when light is emitted until the reflected light is received.
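Claim 17 sums the two per-element histograms into a third before reading off the peak. A small sketch, assuming both histograms share the same bin layout (an assumption, since the claim does not fix the binning):

```python
# Sketch of claim 17's histogram combination: two light receiving elements
# that see the same spot through different lens elements each accumulate a
# TOF histogram; summing them bin-by-bin raises the signal peak above the
# noise floor before the arrival-time bin is selected.

def combine_histograms(hist1, hist2):
    """Bin-wise sum of two equally binned TOF histograms."""
    if len(hist1) != len(hist2):
        raise ValueError("histograms must share the same binning")
    return [a + b for a, b in zip(hist1, hist2)]

def peak_bin(hist):
    """Index of the most populated bin (the detected arrival-time bin)."""
    return max(range(len(hist)), key=hist.__getitem__)

# Each element alone has an ambiguous peak; the combined histogram does not.
h1 = [0, 2, 1, 2, 0]
h2 = [1, 0, 1, 2, 0]
h3 = combine_histograms(h1, h2)
print(h3, peak_bin(h3))  # prints [1, 2, 2, 4, 0] 3
```

Combining counts before peak detection is what distinguishes claim 17 from claim 18, which accumulates both elements' timestamps into a single histogram from the start.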
18.  The measuring device according to claim 13 or 14, wherein the control unit
     generates a histogram by repeatedly measuring the times at which the first light receiving element and the second light receiving element detect light, and
     calculates the distance to an object at the predetermined position by detecting, based on a peak of the histogram, the arrival time from when light is emitted until the reflected light is received.
19.  The measuring device according to any one of claims 8 to 18, wherein
     the light receiving optical system has a condenser lens between the light receiving sensor and the coupled lens.
20.  The measuring device according to claim 19, wherein,
     when the focal length of the condenser lens is fB1,
     the distance from the principal point of the condenser lens to the light receiving sensor is LB1,
     half the width of the light receiving sensor in the first direction is yB, and
     the distance in the first direction between the optical axis of the light receiving optical system and the optical axis of the first lens element is tB,
     tB < yB × fB1 / (fB1 − LB1).
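The inequality in claim 20 bounds how far the first lens element's optical axis may be shifted so that light relayed by the condenser lens still lands within the sensor's half-width. A numeric sketch of the bound, with all values below chosen for illustration only (none appear in the publication):

```python
# Numeric check of claim 20's constraint  tB < yB * fB1 / (fB1 - LB1).
# Roughly: the lateral axis shift tB of the first lens element must stay
# small enough that rays refocused by the condenser lens remain within the
# half-width yB of the light receiving sensor. Assumes fB1 > LB1.

def max_axis_shift(yB_mm, fB1_mm, LB1_mm):
    """Upper bound on tB allowed by the claim-20 inequality, in mm."""
    return yB_mm * fB1_mm / (fB1_mm - LB1_mm)

# sensor half-width, condenser focal length, lens-to-sensor distance [mm]
yB, fB1, LB1 = 2.0, 20.0, 15.0
limit = max_axis_shift(yB, fB1, LB1)
print(limit)  # prints 8.0 -> any tB below 8.0 mm satisfies the inequality
```

Note that the bound grows as LB1 approaches fB1, i.e. the closer the sensor sits to the condenser lens's focal plane, the more axis shift the geometry tolerates.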
PCT/JP2023/004229 2022-02-09 2023-02-08 Measurement device WO2023153451A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2022018942A JP2023116245A (en) 2022-02-09 2022-02-09 Measuring apparatus
JP2022-018942 2022-02-09
JP2022-018941 2022-02-09
JP2022018941A JP2023116244A (en) 2022-02-09 2022-02-09 Measuring apparatus

Publications (1)

Publication Number Publication Date
WO2023153451A1 true WO2023153451A1 (en) 2023-08-17

Family

ID=87564477

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/004229 WO2023153451A1 (en) 2022-02-09 2023-02-08 Measurement device

Country Status (1)

Country Link
WO (1) WO2023153451A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07270602A (en) * 1994-03-31 1995-10-20 Omron Corp Lens for receiving light, light receiving device, photoelectric sensor and laser radar using them and vehicle loading laser radar
JP2010133828A (en) * 2008-12-04 2010-06-17 Denso Corp Radar device
JP2011185664A (en) * 2010-03-05 2011-09-22 Panasonic Electric Works Co Ltd Object detector
CN109001747A (en) * 2018-06-20 2018-12-14 合肥菲涅尔光电科技有限公司 A kind of non-blind area laser radar system
JP2019521355A (en) * 2016-07-21 2019-07-25 ロベルト・ボッシュ・ゲゼルシャフト・ミト・ベシュレンクテル・ハフツングRobert Bosch Gmbh Optical device for lidar system, lidar system and working device
WO2021136527A1 (en) * 2020-01-03 2021-07-08 华为技术有限公司 Tof depth sensor module and image generation method
JP2021516756A (en) * 2018-02-13 2021-07-08 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Distance measurement system, automation equipment, and distance measurement method
CN113156743A (en) * 2021-04-29 2021-07-23 浙江水晶光电科技股份有限公司 Infrared 3D surveys transmitting terminal module and degree of depth camera


Similar Documents

Publication Publication Date Title
US11977156B2 (en) Optical distance measuring device
US11662433B2 (en) Distance measuring apparatus, recognizing apparatus, and distance measuring method
US11686843B2 (en) Method and apparatus for optically measuring distance
US11619484B2 (en) Distance measurement system, distance measurement method, and program recording medium
US9568358B2 (en) Optical measurement device and vehicle
US20150241564A1 (en) Three-dimensional measuring device and three-dimensional measuring method
WO2014207983A1 (en) Distance measuring device
JP2015179078A (en) Parallax calculation system and distance measurement device
US20020040971A1 (en) Distance information obtaining apparatus and distance information obtaining method
WO2021075404A1 (en) Vehicle-mounted abnormality detecting device
US9267790B2 (en) Measuring device of measurement object, calculating device, measurement method, and method for producing item
US20240012114A1 (en) Electromagnetic wave detection apparatus, program, and information acquisition system
WO2022050279A1 (en) Three-dimensional measurement device
JP2019144184A (en) Optical distance measuring device and method therefor
JP6186863B2 (en) Ranging device and program
JP2014070936A (en) Error pixel detecting apparatus, error pixel detecting method, and error pixel detecting program
WO2023153451A1 (en) Measurement device
WO2015145599A1 (en) Video projection device
JP2023116245A (en) Measuring apparatus
JP2023116244A (en) Measuring apparatus
JPWO2018147454A1 (en) Scanning optical system and laser radar device
JP3973979B2 (en) 3D shape measuring device
JP6379646B2 (en) Information processing apparatus, measurement method, and program
CN113518894A (en) Optical distance measuring device
US20230078063A1 (en) Distance measurement device and distance measurement system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23752915

Country of ref document: EP

Kind code of ref document: A1