WO2023058666A1 - Distance measuring device - Google Patents

Distance measuring device

Info

Publication number
WO2023058666A1
WO2023058666A1 PCT/JP2022/037197
Authority
WO
WIPO (PCT)
Prior art keywords
light
subject
distance
correction signal
sensor
Prior art date
Application number
PCT/JP2022/037197
Other languages
French (fr)
Japanese (ja)
Inventor
Takami Hasegawa (長谷川 孝美)
Original Assignee
BlueVision Co., Ltd. (株式会社ブルービジョン)
Priority date
Filing date
Publication date
Application filed by BlueVision Co., Ltd.
Priority to JP2023552909A (granted as JP7441380B2)
Publication of WO2023058666A1


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02 Details
    • G01C 3/06 Use of electric means to obtain final indication
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/89 Lidar systems specially adapted for specific applications for mapping or imaging

Definitions

  • the present invention relates to a distance measuring device.
  • a distance measurement camera system is disclosed that includes a camera having an image sensor sensitive to the near-infrared wavelength region in addition to visible light, near-infrared irradiation means for irradiating the camera's imaging area, and a monitor for displaying the camera's output image (see Patent Document 1).
  • the image sensor acquires, for each pixel, time-difference information for the light emitted from the near-infrared irradiation means, reflected by the object, and arriving at the image sensor.
  • the image signal processing unit calculates the distance for each pixel based on the light time difference information acquired by the image sensor, and outputs color or brightness information according to the distance. As a result, the monitor displays an image whose color or brightness is changed according to the distance.
  • even if the distance to the subject is the same, differences in the subject's material, color scheme, or orientation change its light reflectance, so the distance cannot be measured accurately.
  • even for the same object, if the inclination of the object changes, its light reflectance also changes.
  • furthermore, if the amount of unexpected scattered light or ambient light is large, that light may enter the image sensor, making accurate measurement impossible.
  • the present invention has been proposed in view of such circumstances.
  • a distance measuring device according to the present invention includes: a light-emitting element that emits near-infrared light to irradiate a subject with the near-infrared light; a light separation unit that separates reflected light traveling from the subject along the same optical axis into visible light and near-infrared light and outputs each; a distance image sensor that receives the near-infrared light output from the light separation unit and generates a distance image; an imaging sensor that receives the visible light output from the light separation unit and generates a subject image; a correction signal calculation unit that calculates a correction signal for each predetermined block based on the subject image generated by the imaging sensor; and a distance measuring unit that measures the distance to the subject for each predetermined block using the correction signal calculated by the correction signal calculation unit and the distance image generated by the distance image sensor.
  • the present invention can measure the distance to a subject with high accuracy without being affected by the subject or the external environment.
  • FIG. 1 is a block diagram showing the configuration of the distance measuring device.
  • FIG. 2 is a diagram showing a reflectance table.
  • FIG. 3 is a schematic diagram showing a yellow cubic corrugated cardboard box facing the distance measuring device.
  • FIG. 4 is a schematic diagram showing a cardboard box when tilted.
  • FIG. 5 is a schematic diagram showing the inclination ⁇ of the cardboard box.
  • FIG. 6 is a diagram showing the signal output level for each frame of the TOF sensor.
  • FIG. 7 is a diagram showing spectral characteristics of sunlight.
  • FIG. 1 is a block diagram showing the configuration of the distance measuring device 1.
  • the distance measuring device 1 includes a lens 11 that collects light reflected by an object, a beam splitter 12 that separates the light from the lens 11 into visible light and near-infrared light, and the visible light from the beam splitter 12.
  • a Bayer sensor 13 that generates an image signal (object image) corresponding to the image, and a first signal processing unit 14 that performs predetermined signal processing on the object image generated by the Bayer sensor 13 .
  • the distance measuring device 1 includes a TOF (Time Of Flight) sensor 15 that generates an image signal (distance image) corresponding to the near-infrared light from the beam splitter 12, and a predetermined distance image generated by the TOF sensor 15. a second signal processing unit 16 that performs signal processing, a measurement unit 17 that calculates distance information from an object, an LED light source 18 that outputs pulsed near-infrared light of a predetermined wavelength, and a light source control unit 19 for controlling.
  • the subject is irradiated not only with natural light or illumination light but also with near-infrared light (for example, wavelength 850 nm) from the LED light source 18 .
  • the light reflected by the subject passes through the lens 11 and enters the beam splitter 12 .
  • the beam splitter 12 is composed of two right-angle prisms. The oblique faces of the right-angle prisms are bonded together so as to face each other. Visible light (for example, wavelength 450 to 700 nm) out of the light incident on the beam splitter 12 is reflected by the bonding surface (reflecting surface) and enters the Bayer sensor 13 . On the other hand, near-infrared light (for example, wavelength 750 to 1000 nm) of the light incident on the beam splitter 12 is transmitted through the reflecting surface and is incident on the TOF sensor 15 .
  • the beam splitter 12 separates the light incident from the subject through the same optical axis into visible light and near-infrared light.
  • the near-infrared light is mainly output from the LED light source 18 .
  • each band of visible light and near-infrared light is an example, and is not limited to the numerical values of the present embodiment.
  • a laser light source may be used instead of the LED light source 18.
  • the Bayer sensor 13 has, for example, a plurality of light receiving elements arranged in a matrix. That is, the Bayer sensor 13 is an image sensor in which color filters are arranged in a Bayer array on each light receiving element.
  • the Bayer sensor 13 generates a subject image according to visible light from the beam splitter 12 .
  • the first signal processing unit 14 performs predetermined signal processing on the subject image generated by the Bayer sensor 13 .
  • the TOF sensor 15 is, for example, a range image sensor having a plurality of light receiving elements (pixels) arranged in a matrix. Note that the pixels of the Bayer sensor 13 and the TOF sensor 15 are not the same in size and number. However, each pixel of the Bayer sensor 13 and the TOF sensor 15 correspond to each other by scaling processing. Therefore, the signal value of each pixel of the TOF sensor 15 is corrected by the signal value of each corresponding pixel of the Bayer sensor 13, as will be described later. Also, the distance from the beam splitter 12 to the Bayer sensor 13 and the distance from the beam splitter 12 to the TOF sensor 15 are substantially equal.
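  • The scaling correspondence described above can be sketched as follows. The sensor resolutions are illustrative assumptions; the patent only states that the two sensors differ in pixel size and count and are related by a scaling process.

```python
def tof_to_bayer_coords(tof_x, tof_y, tof_size=(320, 240), bayer_size=(1920, 1080)):
    """Map a TOF-sensor pixel to the corresponding Bayer-sensor pixel.

    The default resolutions are placeholder assumptions. The TOF pixel
    is mapped to the center of the Bayer-pixel region it covers.
    """
    sx = bayer_size[0] / tof_size[0]
    sy = bayer_size[1] / tof_size[1]
    return int((tof_x + 0.5) * sx), int((tof_y + 0.5) * sy)
```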
  • the TOF sensor 15 generates a range image according to the near-infrared light from the beam splitter 12.
  • the second signal processing unit 16 performs predetermined signal processing on the distance image generated by the TOF sensor 15 .
  • the measurement unit 17 is a computer having a data storage unit, an arithmetic processing unit, and the like.
  • the measurement unit 17 corrects the distance image output from the second signal processing unit 16 using the subject image output from the first signal processing unit 14, thereby generating a distance image that shows the distance to the subject with high accuracy.
  • the measurement unit 17 first detects, for each pixel, the phase difference between the near-infrared light output from the LED light source 18 (reference signal) and the near-infrared light received by the TOF sensor 15 (reflection signal). This phase difference corresponds to distance information. Based on the phase difference, the measurement unit 17 calculates a distance image representing distance information to the subject for each pixel.
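  • The phase-to-distance relation used in continuous-wave TOF sensing can be sketched as follows. This is an illustration, not the patent's formula: the patent does not state the conversion or the modulation frequency, so the 20 MHz default below is an assumption.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_to_distance(phase_rad, mod_freq_hz=20e6):
    """Convert a measured phase difference (radians) to distance (meters).

    The light travels to the subject and back, which is why the angular
    phase maps to distance through 4*pi rather than 2*pi. The modulation
    frequency is an assumed example value.
    """
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)
```

    For example, a phase shift of pi radians at 20 MHz corresponds to roughly 3.75 m under this model.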
  • the measurement unit 17 uses the subject image based on the visible light received by the Bayer sensor 13 to calculate correction information for each pixel. Finally, the measurement unit 17 corrects the distance image for each pixel using the correction information, and calculates a distance image that shows the distance to the subject with high accuracy.
  • the measurement unit 17 stores a reflectance table in advance.
  • FIG. 2 is a diagram showing a reflectance table.
  • the reflectance table shows, for each object, the corresponding reflectance for different colors of that object. For example, when the color of wood is yellow, ocher, or dark brown, the reflectances are r1, r2, and r3, respectively. Note that these reflectances are values when the object directly faces the distance measuring device 1 .
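  • A minimal sketch of such a reflectance table lookup is shown below. The numeric values are placeholders standing in for the symbolic reflectances r1, r2, r3, which the patent does not quantify.

```python
# Hypothetical reflectance table keyed by (material, color).
# The numbers are illustrative placeholders, not values from the patent.
REFLECTANCE_TABLE = {
    ("wood", "yellow"): 0.60,      # r1 (placeholder)
    ("wood", "ocher"): 0.45,       # r2 (placeholder)
    ("wood", "dark brown"): 0.20,  # r3 (placeholder)
}

def lookup_reflectance(material, color):
    """Return the front-facing reflectance for a material/color pair."""
    return REFLECTANCE_TABLE[(material, color)]
```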
  • the measurement unit 17 creates the above reflectance table by performing learning processing in advance using an image of an object assumed as a subject.
  • the reflectance table is not limited to one created by learning processing, and may be set in advance as known data.
  • the measurement unit 17 determines the subject's material and color scheme for each pixel, refers to the reflectance table, and selects the reflectance corresponding to that material and color scheme. In this way, the reflectance of the subject can be obtained accurately.
  • the reflectance R differs depending on whether the object faces the distance measuring device 1 directly or at an angle. Normally, the reflectance Ra when the object faces straight is higher than the reflectance Rb when the object is tilted (Ra>Rb).
  • when the front-facing reflectance R1 and the tilted reflectance R2 differ, the accuracy of the distance image obtained by the TOF sensor 15 is reduced.
  • R, G, and B indicate the red, green, and blue signals detected by the Bayer sensor 13, and a, b, and c are coefficients between 0 and 1.
  • Vty = Vsy × (Rsy / Rty) … (2) Note that Vty is a signal detected by the Bayer sensor 13 .
  • Rty is the reflectance R1 when the yellow cardboard box faces straight.
  • Rsy is the reflectance when the yellow cardboard box is tilted.
  • Vsy = Vty × (Rty / Rsy) … (3)
  • Vty is a signal actually detected by the sensors (Bayer sensor 13 and TOF sensor 15).
  • Rty is the reflectance R1 determined by the material and color scheme of the subject as described above.
  • Rsy is the reflectance determined by the tilt of the subject.
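  • Equation (3) can be sketched directly; the function and argument names are illustrative, not from the patent.

```python
def front_facing_signal(v_ty, r_ty, r_sy):
    """Equation (3): Vsy = Vty * (Rty / Rsy).

    v_ty : signal actually detected for the tilted subject
    r_ty : reflectance when the subject directly faces the device
    r_sy : reflectance at the subject's current tilt

    Since Rty > Rsy for a tilted subject, the measured signal is
    boosted to what it would have been when facing the device directly.
    """
    return v_ty * (r_ty / r_sy)
```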
  • FIG. 3 shows a yellow cubic cardboard box facing the distance measuring device 1.
  • FIG. 4 is a cardboard box when tilted.
  • FIG. 5 is a schematic diagram showing the inclination ⁇ of the cardboard box.
  • when the subject is tilted, the shape of the detected edge of the subject is an isosceles trapezoid even though the front shape of the subject (the shape of the edge when directly facing the subject) is a square.
  • let a be the length of one side of the square seen when the object directly faces the device. The bottom base of the isosceles trapezoid seen when the object is tilted also has length a, and let b be its height (the length between the upper base and the lower base).
  • from a and b, the inclination of the object can also be determined.
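  • One plausible way to recover the tilt from a and b is to assume the apparent height is foreshortened by cos θ. The patent does not spell out the formula, so this is a sketch under that stated assumption.

```python
import math

def tilt_from_trapezoid(a, b):
    """Estimate the tilt angle theta (radians) of a square face of side
    `a` whose detected edge shape is an isosceles trapezoid of height `b`.

    Assumption: the apparent height shrinks as b = a * cos(theta).
    Requires 0 < b <= a.
    """
    return math.acos(b / a)
```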
  • the method of calculating the inclination of the subject is not particularly limited, and other methods may be used.
  • the measuring unit 17 may detect the shape of the subject's edge using the subject image from the Bayer sensor 13, compare it with the edge shape when the subject directly faces the device, and detect the inclination of the subject by determining how much the detected edge shape is distorted.
  • the measurement unit 17 may track frame images continuously generated at 60 frames per second by the Bayer sensor 13 to determine whether the target subject faces the device directly or is tilted. When the subject is tilted, the measurement unit 17 detects the tilt θ. If the measurement unit 17 has learned the shape of the subject in advance, it can easily detect the tilt θ. Using the detected tilt θ, the measurement unit 17 obtains the reflectance Rsy for the tilted subject.
  • the measurement unit 17 can use the distance image Vty generated from the TOF sensor 15 and the reflectances Rsy and Rty obtained by the above-described processing, according to equation (3), to obtain the front-facing distance image Vsy for each pixel.
  • This distance image shows accurate distance information, taking into consideration the material, coloration, and inclination of the subject.
  • unexpected disturbance light or scattered light (hereinafter referred to as "disturbance light or the like") may exist in the imaging environment. If part of the disturbance light or the like enters the TOF sensor 15, the distance image generated by the TOF sensor 15 will contain abnormal values.
  • FIG. 6 is a diagram showing signal output levels of the TOF sensor 15 for each frame.
  • in FIG. 6, the upper row shows the frame period, the middle row shows the signal output level when there is no disturbance light, and the lower row shows the signal output level when there is disturbance light. When disturbance light is present, the signal output level changes accordingly, and correct distance information cannot be obtained.
  • however, if the signal output level has already reached the saturation level even when the disturbance light or the like is zero, the signal output level remains at the saturation level and does not change under the influence of the disturbance light or the like.
  • in such cases, the signal contains many components of ambient light and the like.
  • the measurement unit 17 can use the spectral characteristics of sunlight to calculate correction information for removing the influence of disturbance light and the like.
  • the measurement unit 17 performs white balance adjustment. Specifically, the white balance adjustment of the Bayer sensor 13 is performed in the absence of ambient light including near-infrared light (NIR: 850 nm). At this time, a white LED is used as the LED light source 18 . Also, the color of the subject is white. That is, a subject defined as white is used.
  • Expression (4) is established by the white balance adjustment.
  • R, G, and B are respective signal values of red light, green light, and blue light generated from the Bayer sensor 13 that has captured a white subject.
  • the coefficients a, b, and c of the above equation (1) are all 1.
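  • A sketch of such a white-balance adjustment is shown below: coefficients are chosen so that the three color signals of a subject defined as white are equalized, normalizing green to 1. The patent only states that after adjustment the coefficients of equation (1) are all 1; the normalization scheme here is an assumption.

```python
def white_balance_coeffs(r, g, bl):
    """Compute coefficients (a, b, c) so that a*r == b*g == c*bl for the
    R, G, B signal values of a white subject, with green normalized to 1.

    This is a standard-style white-balance sketch, not the patent's
    exact procedure.
    """
    return (g / r, 1.0, g / bl)
```

    For example, a white subject measured as (R, G, B) = (200, 100, 50) yields coefficients (0.5, 1.0, 2.0), equalizing all three channels at 100.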
  • FIG. 7 is a diagram showing spectral characteristics of sunlight.
  • the spectral characteristics (curves) of sunlight show the normalized radiant intensity spectrum (maximum intensity is 1) for each wavelength of sunlight.
  • the measurement unit 17 stores in advance the spectral characteristics of sunlight.
  • the measurement unit 17 refers to the spectral characteristics of sunlight and estimates and calculates the signal value of near-infrared light from the ratio of the signal values of red light, green light, and blue light for each pixel. As a result, the ambient light included in the signal value of each pixel of the distance image generated by the TOF sensor 15 can be obtained.
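  • A sketch of this estimation under stated assumptions: the normalized solar-spectrum values below are illustrative numbers for a typical solar curve, not data from the patent, and the ambient light is assumed to follow that spectrum.

```python
# Illustrative normalized solar spectral intensities at representative
# wavelengths (placeholder values; the patent stores a full spectrum).
SOLAR = {"blue_450": 0.95, "green_550": 1.00, "red_650": 0.93, "nir_850": 0.65}

def estimate_nir_disturbance(r, g, b):
    """Estimate the near-infrared ambient-light component of a pixel from
    its R, G, B signal values, assuming the ambient light follows the
    solar spectrum. Each channel gives an independent estimate of the
    spectral scale factor; their mean is used.
    """
    scale = (r / SOLAR["red_650"]
             + g / SOLAR["green_550"]
             + b / SOLAR["blue_450"]) / 3.0
    return scale * SOLAR["nir_850"]
```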
  • the measurement unit 17 turns on the LED light source 18 and calculates the distance to the subject using the signal value of the disturbance light obtained as described above.
  • the method of calculating the distance to the subject is not particularly limited.
  • the measurement unit 17 may subtract the signal value of disturbance light from the signal value of the distance image when the LED light source 18 is on, and obtain a corrected distance image free from the influence of disturbance light. Then, the distance to the subject may be calculated based on the corrected distance image.
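  • The subtraction variant described in this bullet can be sketched as follows; the function name and the clamp at zero are assumptions added for the illustration.

```python
def corrected_tof_signal(tof_signal, nir_disturbance):
    """Subtract the estimated ambient NIR component from a raw TOF
    signal value, clamping at zero so the corrected signal never goes
    negative (one of the correction options the text describes).
    """
    return max(tof_signal - nir_disturbance, 0.0)
```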
  • alternatively, the measurement unit 17 may convert the signal value of the disturbance light into a corrected distance, calculate the distance to the subject from the distance image taken while the LED light source 18 is on, and then use that distance together with the corrected distance to calculate the exact distance to the subject.
  • as described above, the measurement unit 17 refers to the spectral characteristics of sunlight and, using the red, green, and blue signal values of the subject generated by the Bayer sensor 13 while the LED light source 18 is off, estimates the signal value of the disturbance light in the near-infrared wavelength band. Using this disturbance-light signal value, the measurement unit 17 can remove the influence of the disturbance light and the like and obtain a highly accurate distance to the subject.
  • the distance measuring device 1 of the present embodiment generates a distance image and a subject image from light that has passed through the same optical axis via the beam splitter 12, and can correct the distance image for each pixel using the subject image.
  • the distance measuring device 1 uses the subject image to determine the material and color scheme of the subject for each pixel, and further detects the inclination of the subject with respect to the front, so that the reflectance corresponding to the material, color scheme, and inclination can be obtained. By correcting the distance image using the reflectance obtained in this way, the distance measuring device 1 can obtain accurate distance information without being affected by the material, color scheme, or inclination of the subject.
  • furthermore, for each pixel, the distance measuring device 1 refers to the spectral characteristics of sunlight and estimates the ambient-light signal value in the near-infrared wavelength band from the ratio of the color signal values of the subject image taken with the LED light source 18 off.
  • by using the estimated ambient-light signal value together with the near-infrared signal value detected by the TOF sensor 15 for each pixel, the distance measuring device 1 can remove the influence of ambient light and the like and obtain accurate distance information to the subject.
  • the distance measurement device 1 may correct the distance image for each block composed of a plurality of pixels instead of correcting the distance image for each pixel.
  • each wavelength of visible light and near-infrared light is an example, and other wavelengths may be used.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Measurement Of Optical Distance (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

This distance measuring device (1) comprises: a light emitting element (LED light source 18) that emits near infrared light and irradiates a photographic subject with said near infrared light; a light splitting unit (beam splitter 12) that splits reflected light traveling along the same optical axis from the photographic subject into visible light and near infrared light, and outputs the visible light and the near infrared light; a distance image sensor (TOF sensor 15) that receives the near infrared light outputted from the light splitting unit, and generates a distance image; an imaging sensor (Bayer sensor 13) that receives the visible light outputted from the light splitting unit, and generates a photographic subject image; a correction signal calculation unit (measuring unit 17) that calculates a correction signal for each prescribed block on the basis of the photographic subject image generated by the imaging sensor; and a distance measuring unit (measuring unit 17) that measures the distance to the photographic subject for each prescribed block, using the correction signals calculated by the correction signal calculation unit and the distance image generated by the distance image sensor. The present invention can measure the distance to a photographic subject with high accuracy, without being affected by the photographic subject or the external environment.

Description

Distance measuring device
 The present invention relates to a distance measuring device.
 Conventionally, there has been disclosed a distance measurement camera system including a camera equipped with an image sensor sensitive to the near-infrared wavelength region in addition to visible light and an image signal processing unit that processes image information obtained by the image sensor, near-infrared irradiation means for irradiating the imaging area of the camera with near-infrared light, and a monitor for displaying the output image of the camera (see Patent Document 1).
 In Patent Document 1, the image sensor acquires, for each pixel, time-difference information for the light emitted from the near-infrared irradiation means, reflected by the object, and arriving at the image sensor. The image signal processing unit calculates the distance for each pixel based on the light time-difference information acquired by the image sensor, and outputs color or brightness information according to the distance. As a result, the monitor displays an image whose color or brightness is changed according to the distance.
Japanese Patent Application Laid-Open No. 2021-092420
 However, even if the distance to the subject is the same, differences in the subject's material, color scheme, or orientation change its light reflectance, so the distance cannot be measured accurately. Moreover, even for the same object, if the inclination of the object changes, its light reflectance also changes. Furthermore, if the amount of unexpected scattered light or ambient light is large, that light may enter the image sensor, making accurate measurement impossible.
 The present invention has been proposed in view of such circumstances.
 A distance measuring device according to the present invention includes: a light-emitting element that emits near-infrared light to irradiate a subject with the near-infrared light; a light separation unit that separates reflected light traveling from the subject along the same optical axis into visible light and near-infrared light and outputs each; a distance image sensor that receives the near-infrared light output from the light separation unit and generates a distance image; an imaging sensor that receives the visible light output from the light separation unit and generates a subject image; a correction signal calculation unit that calculates a correction signal for each predetermined block based on the subject image generated by the imaging sensor; and a distance measuring unit that measures the distance to the subject for each predetermined block using the correction signal calculated by the correction signal calculation unit and the distance image generated by the distance image sensor.
 The present invention can measure the distance to a subject with high accuracy without being affected by the subject or the external environment.
FIG. 1 is a block diagram showing the configuration of the distance measuring device. FIG. 2 is a diagram showing a reflectance table. FIG. 3 is a schematic diagram showing a yellow cubic cardboard box directly facing the distance measuring device. FIG. 4 is a schematic diagram showing the cardboard box when tilted. FIG. 5 is a schematic diagram showing the inclination θ of the cardboard box. FIG. 6 is a diagram showing the signal output level for each frame of the TOF sensor. FIG. 7 is a diagram showing spectral characteristics of sunlight.
BEST MODE FOR CARRYING OUT THE INVENTION
 Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
 FIG. 1 is a block diagram showing the configuration of the distance measuring device 1. The distance measuring device 1 includes a lens 11 that collects light reflected by the subject, a beam splitter 12 that separates the light from the lens 11 into visible light and near-infrared light, a Bayer sensor 13 that generates an image signal (subject image) corresponding to the visible light from the beam splitter 12, and a first signal processing unit 14 that performs predetermined signal processing on the subject image generated by the Bayer sensor 13.
 Furthermore, the distance measuring device 1 includes a TOF (Time Of Flight) sensor 15 that generates an image signal (distance image) corresponding to the near-infrared light from the beam splitter 12, a second signal processing unit 16 that performs predetermined signal processing on the distance image generated by the TOF sensor 15, a measurement unit 17 that calculates distance information to the subject, an LED light source 18 that outputs pulsed near-infrared light of a predetermined wavelength, and a light source control unit 19 that controls light emission of the LED light source 18.
 The subject is irradiated not only with natural light or illumination light but also with near-infrared light (for example, wavelength 850 nm) from the LED light source 18. The light reflected by the subject passes through the lens 11 and enters the beam splitter 12.
 The beam splitter 12 is composed of two right-angle prisms. The oblique faces of the right-angle prisms are bonded together so as to face each other. Visible light (for example, wavelength 450 to 700 nm) out of the light incident on the beam splitter 12 is reflected by the bonding surface (reflecting surface) and enters the Bayer sensor 13. On the other hand, near-infrared light (for example, wavelength 750 to 1000 nm) out of the light incident on the beam splitter 12 is transmitted through the reflecting surface and is incident on the TOF sensor 15.
 That is, the beam splitter 12 separates the light incident from the subject along the same optical axis into visible light and near-infrared light. In this embodiment, the near-infrared light is mainly that output from the LED light source 18. Note that the visible and near-infrared bands given here are examples and are not limited to the values of this embodiment. A laser light source may also be used instead of the LED light source 18.
 The Bayer sensor 13 has, for example, a plurality of light receiving elements arranged in a matrix. That is, the Bayer sensor 13 is an image sensor in which color filters are arranged in a Bayer array over the light receiving elements. The Bayer sensor 13 generates a subject image according to the visible light from the beam splitter 12. The first signal processing unit 14 performs predetermined signal processing on the subject image generated by the Bayer sensor 13.
 The TOF sensor 15 is, for example, a range image sensor having a plurality of light receiving elements (pixels) arranged in a matrix. Note that the pixels of the Bayer sensor 13 and the TOF sensor 15 are not the same in size or number. However, the pixels of the Bayer sensor 13 and the TOF sensor 15 are made to correspond to each other by a scaling process. Therefore, the signal value of each pixel of the TOF sensor 15 is corrected by the signal value of the corresponding pixel of the Bayer sensor 13, as will be described later. Also, the distance from the beam splitter 12 to the Bayer sensor 13 and the distance from the beam splitter 12 to the TOF sensor 15 are substantially equal.
 TOFセンサ15は、ビームスプリッタ12からの近赤外光に応じて距離画像を生成する。第2信号処理部16は、TOFセンサ15で生成された距離画像に所定の信号処理を行う。 The TOF sensor 15 generates a range image according to the near-infrared light from the beam splitter 12. The second signal processing unit 16 performs predetermined signal processing on the distance image generated by the TOF sensor 15 .
 測定部17は、データ記憶部、演算処理部等を有するコンピュータである。測定部17は、第1信号処理部14から出力される被写体画像を用いて、第2信号処理部16から出力される距離画像を補正することによって、被写体までの距離を高精度に示す距離画像を生成する。 The measurement unit 17 is a computer having a data storage unit, an arithmetic processing unit, and the like. The measurement unit 17 corrects the distance image output from the second signal processing unit 16 using the subject image output from the first signal processing unit 14, thereby producing a distance image that shows the distance to the subject with high accuracy. to generate
 測定部17は、最初に、LED光源18から出力された近赤外光(基準信号)と、TOFセンサ15で受光された近赤外光(反射信号)と、の位相差を画素毎に検出する。この位相差が距離情報に相当する。測定部17は、その位相差に基づいて、画素毎に、被写体までの距離情報を示す距離画像を演算する。 The measurement unit 17 first detects the phase difference between the near-infrared light (reference signal) output from the LED light source 18 and the near-infrared light (reflection signal) received by the TOF sensor 15 for each pixel. do. This phase difference corresponds to distance information. Based on the phase difference, the measurement unit 17 calculates a distance image representing distance information to the subject for each pixel.
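The phase-to-distance computation described above can be sketched as follows. This is a minimal illustration of continuous-wave TOF ranging; the modulation frequency is an assumed value, not one taken from the embodiment:

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def phase_to_distance(phase_rad: float, f_mod: float) -> float:
    """Convert the phase difference between emitted and received
    near-infrared light into a distance, as in continuous-wave TOF.
    The light travels to the subject and back, hence the factor 2
    folded into the denominator (4*pi instead of 2*pi)."""
    return C * phase_rad / (4 * math.pi * f_mod)

# Example: 20 MHz modulation (assumed), quarter-cycle phase shift
d = phase_to_distance(math.pi / 2, 20e6)  # about 1.87 m
```

Running this per pixel over the phase map yields the distance image that the measurement unit then corrects.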
 Next, the measurement unit 17 computes correction information for each pixel from the subject image based on the visible light received by the Bayer sensor 13. Finally, for each pixel, the measurement unit 17 corrects the distance image using the correction information and computes a distance image that indicates the distance to the subject with high accuracy.
 The correction information is described in detail below.
(Correction for the material and color of the subject)
 In general, the reflectance of light differs from one object (subject) to another. Even for the same object, the reflectance differs depending on the object's color. Consequently, even when the actual distance is the same, the measured distance image differs if the object or its color differs. For example, when the subjects are a sheet of white paper and a sheet of black paper, the measured distance images may differ even though the distances from the distance measuring device 1 to the two sheets are the same. For this reason, in this embodiment, the measurement unit 17 stores a reflectance table in advance.
 FIG. 2 shows the reflectance table. For each object, the table lists the reflectance corresponding to each of the object's possible colors. For example, when the color of wood is yellow, ocher, or dark brown, the reflectance is r1, r2, or r3, respectively. These reflectances are values measured with the object directly facing the distance measuring device 1.
 The measurement unit 17 creates the reflectance table by performing a learning process in advance using images of objects expected to appear as subjects. The reflectance table is not limited to one created by a learning process; it may instead be set in advance from known data.
 Based on the subject image obtained from the Bayer sensor 13, the measurement unit 17 determines, for each pixel, the material and color of the subject, and then refers to the reflectance table to select the reflectance corresponding to that material and color. The reflectance of the subject can thus be obtained accurately.
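The table lookup can be sketched as a simple mapping. All materials, colors, and reflectance values below are illustrative placeholders, not figures from FIG. 2:

```python
# Reflectance table: (material, color) -> reflectance at normal incidence.
# All entries are illustrative placeholders.
REFLECTANCE_TABLE = {
    ("wood", "yellow"): 0.60,
    ("wood", "ocher"): 0.45,
    ("wood", "dark_brown"): 0.25,
    ("cardboard", "yellow"): 0.55,
}

def lookup_reflectance(material: str, color: str) -> float:
    """Return the front-facing reflectance for a material/color pair,
    as the measurement unit would after classifying a pixel."""
    try:
        return REFLECTANCE_TABLE[(material, color)]
    except KeyError:
        raise ValueError(f"no reflectance entry for {material}/{color}")

r = lookup_reflectance("cardboard", "yellow")
```

In the device itself the (material, color) pair would come from a per-pixel classification of the Bayer subject image.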
 Note that this reflectance applies when the subject directly faces the distance measuring device 1. Hereinafter, the reflectance when the subject is directly facing the device is denoted R1. Since the subject does not always face the device directly, however, the following must be taken into account.
(Correction for the tilt of the subject)
 Even for the same object and color, the reflectance R differs depending on whether the object directly faces the distance measuring device 1 or is tilted with respect to it. Normally, the reflectance Ra when the object faces the device directly is larger than the reflectance Rb when the object is tilted (Ra > Rb).
 Here, letting Lout be the radiant energy directed at the subject, Lin the energy incident on the TOF sensor 15, and R the reflectance, the following holds:
   Lin = Lout × R
 Therefore, when the subject is tilted rather than directly facing the device, the reflectance differs from its front-facing value, and the accuracy of the distance image obtained by the TOF sensor 15 drops.
 Now consider the case where the subject is a yellow cardboard box. Letting Vsy be the signal of one pixel of the subject image when the subject directly faces the distance measuring device 1, the following holds:
   Vsy = aR + bG + cB … (1)
 Here, R, G, and B denote the red, green, and blue signals detected by the Bayer sensor 13, and a, b, and c are coefficients between 0 and 1. When the yellow cardboard box is tilted with respect to the distance measuring device 1, the color ratio (a:b:c) of this pixel signal does not change compared with the directly facing case, but the overall brightness (luminance signal) does. Therefore, the following equation (2) holds:
   Vty = Vsy × (Rsy / Rty) … (2)
 Here, Vty is the signal actually detected by the Bayer sensor 13, Rty is the reflectance R1 when the yellow cardboard box directly faces the device, and Rsy is the reflectance when the yellow cardboard box is tilted.
 The signal for the directly facing subject is therefore obtained from equation (3):
   Vsy = Vty × (Rty / Rsy) … (3)
 Here, Vty is the signal actually detected by the sensors (Bayer sensor 13 and TOF sensor 15), Rty is the reflectance R1 determined from the material and color of the subject as described above, and Rsy is the reflectance determined by the tilt of the subject.
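Equation (3) amounts to a single rescaling per signal value; a minimal sketch, with assumed reflectance values:

```python
def correct_for_tilt(v_ty: float, r_ty: float, r_sy: float) -> float:
    """Recover the front-facing signal Vsy from the measured signal Vty
    using equation (3): Vsy = Vty * (Rty / Rsy)."""
    if r_sy <= 0:
        raise ValueError("tilted reflectance must be positive")
    return v_ty * (r_ty / r_sy)

# A tilted measurement of 0.8 with Rty = 0.55 and Rsy = 0.44 (assumed values)
v_sy = correct_for_tilt(0.8, 0.55, 0.44)  # -> 1.0
```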
 Accordingly, it is necessary not only to determine the material and color of the subject as described above, but also to obtain the tilt θ of the subject, so that a reflectance reflecting the subject's material, color, and tilt can be derived.
 FIG. 3 shows a yellow cubic cardboard box directly facing the distance measuring device 1. FIG. 4 shows the cardboard box when tilted. FIG. 5 is a schematic diagram showing the tilt θ of the cardboard box.
 As shown in FIGS. 3 and 4, consider the case where the detected edge shape of the subject is an isosceles trapezoid even though the frontal shape of the subject (the edge shape when directly facing the device) is a square. Let a be the length of one side of the frontal (square) shape. The length of the lower base of the tilted shape (the isosceles trapezoid) is also a, the same as the side of the square, but its height (the distance between the upper and lower bases) is b.
 If the aspect ratio of the subject is known, its tilt is also known. When the subject directly faces the device, the aspect ratio is a:a (= 1:1) and the tilt is zero. When the subject is tilted, the aspect ratio is b:a. Using the tilt θ relative to the directly facing orientation, the following holds:
   cos θ = b / a
 Thus, when the subject is a cube, the measurement unit 17 can obtain the tilt θ by computing:
   θ = cos⁻¹(b / a)
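As a sketch, the tilt follows directly from the measured trapezoid height and base length; the pixel dimensions used in the example are assumptions:

```python
import math

def tilt_from_aspect(b: float, a: float) -> float:
    """Tilt angle theta = arccos(b/a) in degrees, from the trapezoid
    height b and the base length a measured in the subject image."""
    ratio = b / a
    if not 0.0 < ratio <= 1.0:
        raise ValueError("expected 0 < b/a <= 1")
    return math.degrees(math.acos(ratio))

theta = tilt_from_aspect(100.0, 200.0)  # b/a = 0.5 -> 60 degrees
```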
 The method of computing the tilt of the subject is not particularly limited, and other techniques may be used. For example, the measurement unit 17 may detect the edge shape of the subject from the subject image provided by the Bayer sensor 13 and detect the tilt by judging how distorted the detected edge shape is compared with the edge shape when the subject directly faces the device.
 As another example, the measurement unit 17 may track the frame images continuously generated by the Bayer sensor 13 at 60 frames per second and judge whether the target subject is directly facing the device or is tilted. When the subject is tilted, the measurement unit 17 detects the tilt θ. In this case, if the measurement unit 17 has learned the shape of the subject in advance, it can detect the tilt θ easily. Using the detected tilt θ, the measurement unit 17 obtains the reflectance Rsy for the tilted subject.
 Here, the signal of each pixel of the distance image generated by the TOF sensor 15 corresponds to Vty in equation (3). Following equation (3), the measurement unit 17 can therefore compute, for each pixel, the directly facing distance image Vsy from the distance image Vty generated by the TOF sensor 15 and the reflectances Rsy and Rty obtained by the processing described above. This distance image takes the subject's material, color, and tilt into account and represents accurate distance information.
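Applied per pixel over whole images, the correction of equation (3) is one elementwise operation. A sketch using NumPy; the array shapes and reflectance values are assumptions:

```python
import numpy as np

def correct_distance_image(v_ty: np.ndarray,
                           r_ty: np.ndarray,
                           r_sy: np.ndarray) -> np.ndarray:
    """Apply equation (3) pixelwise: Vsy = Vty * (Rty / Rsy).
    Each array holds one value per TOF pixel."""
    return v_ty * (r_ty / r_sy)

v_ty = np.full((4, 4), 0.8)   # measured signals (tilted subject)
r_ty = np.full((4, 4), 0.55)  # front-facing reflectance per pixel
r_sy = np.full((4, 4), 0.44)  # tilted reflectance per pixel
v_sy = correct_distance_image(v_ty, r_ty, r_sy)
```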
(Correction taking ambient and scattered light into account)
 In the imaging environment, unexpected ambient or scattered light (hereinafter, "ambient light") is present. If part of this ambient light enters the TOF sensor 15, the distance image generated by the TOF sensor 15 takes on abnormal values.
 FIG. 6 shows the signal output level of the TOF sensor 15 for each frame. The upper row shows the frame period, the middle row the signal output level without ambient light, and the lower row the signal output level with ambient light.
 As FIG. 6 shows, ambient light shifts the signal output level by a corresponding amount, so correct distance information cannot be obtained. On the other hand, if the signal output level has already reached saturation with zero ambient light, the level remains at saturation and does not change under ambient light; the signal nevertheless contains a large ambient-light component.
 The measurement unit 17 can therefore use the spectral characteristics of sunlight to compute correction information for removing the influence of ambient light.
(Principle)
 First, the measurement unit 17 performs white balance adjustment. Specifically, the white balance of the Bayer sensor 13 is adjusted in the absence of ambient light, including near-infrared light (NIR: 850 nm). A white LED is used as the LED light source 18, and the subject is white; that is, a subject defined as white is used.
 The white balance adjustment establishes equation (4):
   R = G = B = 1 … (4)
 Here, R, G, and B are the signal values of red, green, and blue light generated by the Bayer sensor 13 imaging the white subject. After white balance adjustment, the coefficients a, b, and c in equation (1) are all 1. Furthermore, letting Ts be the near-infrared signal value generated by the TOF sensor 15 imaging the white subject, an adjustment is made so that Ts = 1.
 Next, consider imaging a subject in an environment where sunlight enters, that is, an environment with ambient light.
 FIG. 7 shows the spectral characteristics of sunlight. The spectral curve gives the normalized radiant intensity spectrum of sunlight at each wavelength (maximum intensity 1). The measurement unit 17 stores these spectral characteristics in advance.
 When the white subject is imaged under the influence of ambient light, equation (4) no longer holds; the ambient light changes the signal values of the red, green, blue, and near-infrared light.
 The measurement unit 17 therefore refers to the spectral characteristics of sunlight and, for each pixel, estimates the near-infrared signal value from the ratio of the red, green, and blue signal values. In this way, the ambient light contained in the signal value of each pixel of the distance image generated by the TOF sensor 15 can be obtained.
(Correction example)
 Consider how much ambient light is present when the LED light source 18 is off. In the case of FIG. 7, the ratio of the signal values Bs, Gs, and Rs of the blue, green, and red light detected by the Bayer sensor 13 under ambient light was, for example:
   Bs : Gs : Rs = 0.96 : 0.98 : 0.98
 The ambient light signal detected by the Bayer sensor 13 is therefore given by equation (5):
   (ambient light signal detected by the Bayer sensor 13)
     = 0.96Bs + 0.98Gs + 0.98Rs … (5)
 Referring to the spectral characteristics of the sun, the measurement unit 17 estimates the near-infrared signal value from the ratio of the signal values Bs, Gs, and Rs of the blue, green, and red light detected by the Bayer sensor 13. As a result, the near-infrared ratio was, for example, 0.95.
 The ambient light signal detected by the TOF sensor 15 is therefore given by equation (6):
   (ambient light signal detected by the TOF sensor 15) = 0.95Ts … (6)
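Equations (5) and (6) can be evaluated directly; the sensor readings used in the example are assumed values:

```python
def bayer_ambient_signal(bs: float, gs: float, rs: float) -> float:
    """Equation (5): ambient light signal seen by the Bayer sensor."""
    return 0.96 * bs + 0.98 * gs + 0.98 * rs

def tof_ambient_signal(ts: float) -> float:
    """Equation (6): ambient light signal seen by the TOF sensor."""
    return 0.95 * ts

# With the white-balanced calibration Bs = Gs = Rs = Ts = 1 (assumed):
bayer = bayer_ambient_signal(1.0, 1.0, 1.0)  # -> 2.92
tof = tof_ambient_signal(1.0)                # -> 0.95
```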
 Then, with the LED light source 18 turned on, the measurement unit 17 computes the distance to the subject using the ambient light signal value obtained as described above. The method of computing the distance to the subject is not particularly limited.
 For example, the measurement unit 17 may subtract the ambient light signal value from the signal value of the distance image taken with the LED light source 18 on, obtain a corrected distance image free of the influence of ambient light, and then compute the distance to the subject from the corrected distance image.
 Alternatively, the measurement unit 17 may convert the ambient light signal value into a correction distance, compute the distance to the subject from the distance image taken with the LED light source 18 on, and use that distance together with the correction distance to compute the exact distance to the subject.
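The first method, subtracting the estimated ambient signal pixelwise before the distance computation, can be sketched as follows; the array shapes and signal values are assumptions:

```python
import numpy as np

def remove_ambient(raw: np.ndarray, ambient: np.ndarray) -> np.ndarray:
    """Subtract the estimated per-pixel ambient NIR signal from the raw
    TOF signal, clamping at zero so noise cannot drive values negative."""
    return np.clip(raw - ambient, 0.0, None)

raw = np.array([[1.20, 1.10], [1.05, 0.90]])  # LED on (assumed readings)
ambient = np.full((2, 2), 0.95)               # estimated ambient signal
corrected = remove_ambient(raw, ambient)
```

The distance computation would then run on `corrected` instead of `raw`.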
 In this way, the measurement unit 17 refers to the spectral characteristics of sunlight and, using the blue, green, and red signal values of the subject generated by the Bayer sensor 13 while the LED light source 18 is off, estimates the ambient light signal value in the near-infrared wavelength band. Using the ambient light signal value obtained in this way, the measurement unit 17 can remove the influence of the ambient light and obtain a highly accurate distance to the subject.
 As described above, the distance measuring device 1 of this embodiment generates, via the beam splitter 12, a distance image and a subject image from light that has passed along the same optical axis, and can correct the distance image pixel by pixel using the subject image.
 Specifically, the distance measuring device 1 uses the subject image to determine, for each pixel, the material and color of the subject, and further detects the tilt of the subject relative to the directly facing orientation, thereby obtaining a reflectance that reflects the subject's material, color, and tilt. By correcting the distance image with this reflectance, the distance measuring device 1 can obtain accurate distance information unaffected by the material, color, or tilt of the subject.
 When ambient light is incident, the distance measuring device 1 refers, for each pixel, to the spectral characteristics of the sun and estimates the ambient light signal value in the near-infrared wavelength band from the ratio of the color signal values of the subject image taken with the LED light source 18 off. Then, by using for each pixel the estimated ambient light signal value together with the near-infrared signal value detected by the TOF sensor 15, the distance measuring device 1 removes the influence of the ambient light and obtains accurate distance information for the subject.
 The present invention is not limited to the embodiment described above and is also applicable to designs modified within the scope of the matters recited in the claims. For example, the distance measuring device 1 may correct the distance image not pixel by pixel but for each block composed of a plurality of pixels. The wavelengths given for visible light and near-infrared light are examples, and other wavelengths may be used.
1 Distance measuring device
11 Lens
12 Beam splitter
13 Bayer sensor
14 First signal processing unit
15 TOF sensor
16 Second signal processing unit
17 Measurement unit
18 LED light source
19 Light source control unit

Claims (5)

  1.  A distance measuring device comprising:
     a light-emitting element that emits near-infrared light and irradiates a subject with the near-infrared light;
     a light separation unit that separates reflected light arriving from the subject along the same optical axis into visible light and the near-infrared light, and outputs the visible light and the near-infrared light, respectively;
     a distance image sensor that receives the near-infrared light output from the light separation unit and generates a distance image;
     an imaging sensor that receives the visible light output from the light separation unit and generates a subject image;
     a correction signal calculation unit that calculates a correction signal for each predetermined block based on the subject image generated by the imaging sensor; and
     a distance measurement unit that measures, for each predetermined block, the distance to the subject using the correction signal calculated by the correction signal calculation unit and the distance image generated by the distance image sensor.
  2.  The distance measuring device according to claim 1, wherein the predetermined block is a region composed of one pixel or a plurality of pixels.
  3.  The distance measuring device according to claim 1, wherein the correction signal calculation unit detects, for each predetermined block, the material and color of the subject based on the subject image generated by the imaging sensor, and calculates the correction signal based on the light reflectance of the subject corresponding to the detected material and color.
  4.  The distance measuring device according to claim 1, wherein the correction signal calculation unit detects, for each predetermined block, the tilt of the subject based on the subject image generated by the imaging sensor, and calculates the correction signal based on the detected tilt.
  5.  The distance measuring device according to claim 1, wherein the correction signal calculation unit generates, for each predetermined block, the correction signal indicating the signal value of ambient light in the near-infrared wavelength band, based on the subject image generated by the imaging sensor and spectral characteristics of sunlight stored in advance.


PCT/JP2022/037197 2021-10-04 2022-10-04 Distance measuring device WO2023058666A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023552909A JP7441380B2 (en) 2021-10-04 2022-10-04 distance measuring device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-163775 2021-10-04
JP2021163775 2021-10-04

Publications (1)

Publication Number Publication Date
WO2023058666A1 true WO2023058666A1 (en) 2023-04-13

Family

ID=85803455

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/037197 WO2023058666A1 (en) 2021-10-04 2022-10-04 Distance measuring device

Country Status (2)

Country Link
JP (1) JP7441380B2 (en)
WO (1) WO2023058666A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000221007A (en) * 1999-02-01 2000-08-11 Toshiba Corp Image acquisition device and method therefor
JP2010071976A (en) * 2008-08-19 2010-04-02 Panasonic Corp Distance estimation device, distance estimation method, program, integrated circuit, and camera
US20130107005A1 (en) * 2011-11-02 2013-05-02 Samsung Electronics Co., Ltd. Image processing apparatus and method
JP5992116B2 (en) * 2014-01-08 2016-09-14 三菱電機株式会社 Image generation device


Also Published As

Publication number Publication date
JP7441380B2 (en) 2024-02-29
JPWO2023058666A1 (en) 2023-04-13


Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 22878534; Country of ref document: EP; Kind code of ref document: A1)
WWE WIPO information: entry into national phase (Ref document number: 2023552909; Country of ref document: JP)
NENP Non-entry into the national phase (Ref country code: DE)