WO2017217745A1 - Night vision display device - Google Patents

Night vision display device

Info

Publication number
WO2017217745A1
WO2017217745A1 (PCT/KR2017/006157)
Authority
WO
WIPO (PCT)
Prior art keywords
light
image sensor
delay time
time
image
Prior art date
Application number
PCT/KR2017/006157
Other languages
English (en)
Korean (ko)
Inventor
김관형
김현준
오상걸
이동렬
지석만
홍삼열
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to EP17813569.5A (published as EP3471399A4)
Priority to US16/309,371 (published as US11560091B2)
Priority claimed from KR1020170074250A (published as KR101946941B1)
Publication of WO2017217745A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/671 Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50 Control of the SSIS exposure
    • H04N25/57 Control of the dynamic range

Definitions

  • The present invention relates to a night vision image output device for a vehicle with improved visibility of distant objects.
  • A vehicle night vision device assists the driver at night or in bad weather such as snow or rain by delivering the situation within a certain distance of the vehicle to the driver as visual data.
  • An object of the present invention is to provide a night image output device and a night image processing method with an extended measurable distance and minimized noise.
  • A night image output device according to the present invention includes: a light pulse output unit that outputs light pulses at a specific period; a photographing unit comprising an image sensor that forms a plurality of images using the light pulses reflected by an external object; a display unit that outputs a final image obtained by synthesizing the plurality of images; and a controller that calculates distance information of the displayed object using the light quantity ratio for each pixel of the final image and reference data of light quantity ratio versus distance. The controller controls the image sensor so that, within one frame, it is activated with different delay times based on the output end time of the light pulse.
  • The controller calculates distance data using the ratio of light quantities received with the first and third delay times, and separately using the ratio of light quantities received with the second and fourth delay times. Distance data for farther objects can therefore be obtained without increasing the output time width of the pulse.
  • The photographing unit may include first and second image sensors, each implemented as a TOF image sensor. The controller may control the photographing unit so that the two sensors receive light simultaneously, with the first and third delay times respectively, during a first frame, and simultaneously, with the second and fourth delay times, during a second frame.
  • Since the image sensor, whose activation time is controllable, is activated with different delay times relative to the output period of the pulsed light, distance data can be calculated from the ratio of the received light quantities.
  • When the image sensor is additionally controlled to have the second and fourth delay times, longer distances can be detected with the same sensor. The measurable distance can therefore be extended without increasing the output time width of the pulsed light, and the inflow of external light can be minimized to form a clearer image.
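As a rough illustration of the gating scheme above, the sketch below (Python) models each light receiving pattern as a gate that opens a delay time after the pulse end te. The delays t1, t2, and t3 follow the values quoted later in the description (100 ns, 600 ns, about 200 ns); the pulse width, gate width, and t4 are assumptions for illustration only.

```python
PULSE_WIDTH_NS = 300.0   # assumed output time width Tp (not stated in the text)
GATE_WIDTH_NS = 300.0    # assumed activation time width Ts (set >= Tp per the text)

# t1, t2, t3 are quoted in the description; t4 is a placeholder assumption.
DELAYS_NS = {"t1": 100.0, "t2": 600.0, "t3": 200.0, "t4": 1100.0}

def gate_window(pulse_end_ns, delay_ns, gate_width_ns=GATE_WIDTH_NS):
    """Interval during which the image sensor is active for one pulse:
    the gate opens delay_ns after the pulse ends and stays open for
    gate_width_ns. Returns (start, end) in nanoseconds."""
    start = pulse_end_ns + delay_ns
    return (start, start + gate_width_ns)
```

Within one frame the sensor cycles through these four delays (N1 to N4 pulses each), so each accumulated sub-image sees a different distance band.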
  • FIG. 1A is a conceptual diagram illustrating an example of a nighttime image display device mounted on a vehicle.
  • FIG. 1B is a block diagram illustrating components of a night image display device according to an exemplary embodiment.
  • FIG. 2 is a conceptual diagram illustrating a control method of an image sensor and an optical pulse output unit according to an exemplary embodiment of the present invention.
  • FIG. 3A is a graph for describing light amounts according to distances of the first and third light receiving patterns.
  • FIGS. 3B and 3C are graphs for describing the amount of light according to the delay time and the distance.
  • FIG. 3C is a graph showing the amount of light received by the first and third light receiving patterns together.
  • FIG. 4A is a graph showing the amount of light according to distance, comparing the light amounts of the first and third light receiving patterns with those of the second and fourth light receiving patterns.
  • FIG. 4B shows reference data regarding light quantity ratio according to distance.
  • FIG. 5 is a conceptual diagram illustrating a control method when two image sensors are included, according to another exemplary embodiment.
  • FIG. 1A is a conceptual diagram illustrating an example of a nighttime image display device mounted on a vehicle
  • FIG. 1B is a block diagram illustrating components of the nighttime image display device according to an embodiment of the present invention.
  • The display unit 200 of the night image display apparatus is preferably disposed so that the driver of the vehicle can view it together with the view ahead.
  • the night image display apparatus includes a photographing unit 100 for capturing an image, a display unit 200 for outputting an image, and a controller 300.
  • the photographing unit 100 includes an optical pulse output unit 110 and an image sensor 120.
  • An optical pulse generated by the optical pulse output unit 110 is reflected by an object in front of the vehicle and reaches the image sensor 120.
  • The image sensor 120 forms an image from which the distance of the object can be detected using the incident light pulse.
  • FIG. 2 is a conceptual diagram illustrating a control method of an image sensor and an optical pulse output unit according to an exemplary embodiment of the present invention.
  • the light pulse output unit 110 outputs light for a specific time at a specific period.
  • the light output from the optical pulse output unit 110 is reflected by the objects at different distances and received by the image sensor 120.
  • The specific period and the specific time at which light is output from the optical pulse output unit 110 may be set differently according to the distance range for which distance data is to be secured. As the pulse width of the output light increases, the distance that can be measured with one pulsed light increases, but the inflow of external light also increases, so noise increases.
  • The image sensor 120 is activated and receives light after a specific delay from the time te at which the output of light from the light pulse output unit 110 ends.
  • The image sensor 120 is controlled to be activated with the first to fourth delay times t1, t2, t3, and t4 in sequence over time.
  • The optical pulse output unit 110 outputs light from the start time ts to the end time te, and the image sensor 120 collects the arriving light after the first delay time t1 has passed from the end time te. Light output from the light pulse output unit 110 is reflected by the object and reaches the image sensor 120 after the first delay time t1.
  • The output time width of the light of the optical pulse output unit 110 is formed to be substantially the same as the activation time width of the image sensor 120, or the activation time width of the image sensor 120 may be set longer than the output time width of the optical pulse output unit 110.
  • While the optical pulse output unit 110 continuously outputs optical pulses at the specific period, the image sensor 120 is activated by the first light receiving pattern, delayed by the first delay time t1 from the end time te, for a predetermined number of pulses (N1 times) to receive light.
  • After the image sensor 120 has been activated N1 times with the first light receiving pattern, it is activated with the second light receiving pattern.
  • In the second light receiving pattern, the image sensor 120 is activated after a second delay time t2 from the end time te.
  • the time width at which the image sensor 120 is active is substantially the same as the time width of the first light receiving pattern.
  • the second delay time t2 may be set larger than the first delay time t1.
  • the first delay time t1 may be 100 ns and the second delay time t2 may correspond to 600 ns.
  • After the image sensor 120 has been repeatedly activated N2 times with the second light receiving pattern, it is activated N3 times with the third light receiving pattern.
  • the image sensor 120 is activated when the third delay time t3 is delayed from the end time te.
  • the third delay time t3 may be longer than the first delay time t1 and shorter than the second delay time t2, and the third delay time t3 may correspond to about 200 ns.
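Per the description that follows, the distance at which light first reaches a delayed gate is the delay time multiplied by half the speed of light. A one-line helper (Python) makes the quoted delays concrete; 0.3 m/ns is just the speed of light in convenient units.

```python
C_M_PER_NS = 0.3  # speed of light, approximately 0.3 metres per nanosecond

def nearest_distance_m(delay_ns):
    """Nearest object a gate delayed by delay_ns can see: the last photon
    of the pulse must travel out and back within delay_ns, so the round
    trip gives d = c * delay / 2."""
    return C_M_PER_NS * delay_ns / 2.0
```

With the quoted delays, t1 = 100 ns corresponds to about 15 m, t3 = 200 ns to about 30 m, and t2 = 600 ns to about 90 m as the nearest visible distance per pattern.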
  • the controller 300 may acquire the distance data of the object by using the light received by the image sensor 120 according to the first and third light receiving patterns.
  • After being activated N3 times with the third light receiving pattern, the image sensor 120 is activated N4 times with the fourth light receiving pattern to receive the light reflected from the object.
  • In the fourth light receiving pattern, the sensor is activated for a specific time width after a fourth delay time t4 from the end time te.
  • the controller 300 forms distance data of an object by using light received by the image sensor 120 according to the second and fourth light receiving patterns.
  • When the light received during activation with the second and fourth light receiving patterns is used, distance data for an object located farther away can be obtained without increasing the time width over which the pulsed light is output.
  • The controller 300 calculates the distance using the ratio of the light quantities received by the first and third light receiving patterns.
  • the distance to the object to be obtained through the light quantity ratio is set by the difference between the first and third delay times t1 and t3.
  • The distance range obtained from the light quantity ratio of the second and fourth light receiving patterns may overlap with that obtained from the first and third light receiving patterns, and is set by the difference between the second and fourth delay times t2 and t4. Images consisting of a plurality of frames, formed by the image sensor 120 activated with the first to fourth light receiving patterns, are stored in one memory.
  • the image sensor according to the embodiments is a gated image sensor.
  • The gated image sensor can operate at up to about 120 fps. In addition, it can be controlled to have different delay times within one frame.
  • The gated image sensor may be controlled to receive light while distinguishing the first and second light receiving patterns. The distance range that can be calculated from the ratio of the received light amounts can therefore be extended.
  • When a gated image sensor is used in a night vision display, three of the four sensor outputs are used to capture an image and one is used for distance measurement.
  • FIGS. 3B and 3C are graphs illustrating the amount of light according to the delay time and the distance.
  • FIG. 3A illustrates a comparison between the first light receiving pattern and the output pattern of the pulsed light
  • FIG. 3B is a conceptual diagram illustrating a change in light amount according to the first light receiving pattern.
  • FIG. 3A shows the time width Tp of the pulsed light, the activation time width Ts of the first light receiving pattern, the end time te of the pulsed light, and the delay time Td between the end time te and the start time of the first light receiving pattern.
  • The brightness of light received by the image sensor 120 activated according to the first light receiving pattern gradually increases from a specific starting point S_min, is temporarily maintained, and then gradually decreases to a specific ending point S_max.
  • The distance along the horizontal axis can be calculated by formulas involving the delay time Td, the time width Tp of the pulsed light, and the activation time width Ts of the first light receiving pattern.
  • The distance at which light first arrives is the delay time multiplied by half the speed of light, d = c·Td/2. Beyond this point, more and more of the reflected light from one pulsed light falls inside the activation window, so the light amount gradually increases.
  • The distance range (between S1 and S2) over which the light amount is kept constant is not formed if the time width Tp at which the pulsed light is output and the time width Ts at which the image sensor is activated are identical.
  • The light amount then gradually decreases to zero as the round-trip time approaches the sum of the time width Tp, the time width Ts, and the delay time Td.
  • The distance beyond which no light is received equals the sum of the time width Tp at which the pulsed light is output, the time width Ts at which the sensor is activated, and the delay time, multiplied by half the speed of light: d = c·(Td + Tp + Ts)/2.
  • The controller 300 may obtain the light quantity value corresponding to each distance by the first to fourth equations, and the image sensor 120 can form an image corresponding to each distance using them.
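The light-amount-versus-distance profile described above (zero before c·Td/2, a rise, a plateau only when Tp and Ts differ, then a fall to zero at c·(Td + Tp + Ts)/2) is simply the overlap between the reflected pulse and the sensor gate. A minimal sketch, assuming nanosecond units and c ≈ 0.3 m/ns; the 300 ns widths in the checks are assumptions, not values from the text:

```python
def received_amount_ns(d_m, delay_ns, tp_ns, ts_ns, c_m_per_ns=0.3):
    """Overlap (in ns) between the reflected pulse and the sensor gate for
    an object at d_m metres. The pulse, emitted over [te - Tp, te], returns
    over [rt - Tp, rt] (rt = round-trip time after te); the gate is open
    over [delay, delay + Ts]. The overlap is the trapezoidal profile:
    zero until rt = delay, zero again from rt = delay + Tp + Ts."""
    rt = 2.0 * d_m / c_m_per_ns  # round-trip time of flight [ns]
    return max(0.0, min(rt, delay_ns + ts_ns) - max(rt - tp_ns, delay_ns))
```

When Tp equals Ts the profile is a triangle (no plateau), matching the remark above that the constant-light-amount range is not formed for identical widths.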
  • FIG. 3C is a graph showing the amount of light received by the first and third light receiving patterns together.
  • the difference between the first and third delay times is set to satisfy the graph of FIG. 3C.
  • the change in the amount of light of the first and third light receiving patterns is shown based on the time point at which one pulsed light is output.
  • The start point and the end point mean the distances corresponding to the time points at which light quantity begins to be measured and ceases to be measured, respectively.
  • the start point G3s of the third light receiving pattern is farther than the start point G1s of the first light receiving pattern and is closer than the end point G1e of the first light receiving pattern.
  • An end point G3e of the third light receiving pattern is farther than or equal to the end point G1e of the first light receiving pattern.
  • The time widths for which the image sensor is activated in the first and third light receiving patterns may differ from each other. However, they are set such that the last point S1 at which the greatest amount of light is received in the first light receiving pattern is closer than or equal to the start point S3 at which the greatest amount of light is received in the third light receiving pattern. That is, some of the light received by the first and third light receiving patterns corresponds to light reflected by objects located at overlapping distances.
  • the controller 300 calculates the distance using the light quantity ratio based on the previously stored reference data.
  • FIG. 4A is a graph showing the amount of light according to a distance for comparing the amounts of light of the first and third light receiving patterns with the second and fourth light receiving patterns
  • FIG. 4B shows reference data regarding light quantity ratios according to distances.
  • The light quantity ratio obtained by dividing the amount of light received by the third light receiving pattern by the amount received by the first light receiving pattern, and the ratio obtained by dividing the amount received by the fourth light receiving pattern by the amount received by the second light receiving pattern, are shown as graphs against the measurement distance.
  • the distance data of the object located at a distance where the light quantity ratios of the first and third light receiving patterns and the light quantity ratios of the second and fourth light receiving patterns coexist can be obtained more accurately.
  • The distance to an object located between about 30 m and about 105 m can be calculated from the light quantity ratio of the first and third light receiving patterns, and between about 165 m and about 240 m from the light quantity ratio of the second and fourth light receiving patterns.
  • The image sensor may thus be driven with the second and fourth light receiving patterns, whose delay times differ from those of the first and third patterns, without extending the output time width of the pulsed light, so that the distance data of a more distant object can be obtained more accurately.
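The band limits quoted above can be reproduced with a small helper: the light quantity ratio of a pattern pair is defined only where both gates receive light, i.e. from c·t_far/2 (where the later gate starts seeing light) to c·(t_near + Tp + Ts)/2 (where the earlier gate stops). The 300 ns pulse and gate widths below are assumptions chosen to match the 30 m to 105 m band for t1 = 100 ns and t3 = 200 ns; they are not stated in the text.

```python
def ratio_band_m(t_near_ns, t_far_ns, tp_ns, ts_ns, c_m_per_ns=0.3):
    """Distance band over which the light quantity ratio of a pattern pair
    is defined (both gates receive some reflected light):
    lower bound  c * t_far / 2           -> later gate begins to see light
    upper bound  c * (t_near + Tp + Ts) / 2 -> earlier gate stops seeing light
    Returns (low_m, high_m)."""
    return (c_m_per_ns * t_far_ns / 2.0,
            c_m_per_ns * (t_near_ns + tp_ns + ts_ns) / 2.0)
```

Under those assumed widths, the (t1, t3) pair yields a band of about 30 m to 105 m; the second band depends on t4 and on the widths used for the far pair, which the text does not give.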
  • FIG. 5 is a conceptual diagram illustrating a control method when two image sensors are included, according to another exemplary embodiment.
  • The night image display device may include two image sensors, each of which may be a TOF sensor. While light pulses are output from the light pulse output unit 110 at a specific period and a specific time width, the two image sensors receive the reflected light with different delay times.
  • the first image sensor receives light to have a first delay time t1, and at the same time the second image sensor receives light to have a second delay time t2.
  • a first frame is formed of images formed from the first and second image sensors.
  • After the first frame has passed, that is, after a certain number of light pulses have been output, the gate start times are changed: the first image sensor receives the reflected light with a third delay time t3, and the second image sensor receives the reflected light with a fourth delay time t4.
  • The first and second image sensors thus form a second frame following the first frame.
  • the distance data according to the light quantity ratio is formed based on the light quantity change included in the first and second frames.
  • The controller 300 forms distance data from the light quantity ratio collected with the first and third delay times t1 and t3, and forms further distance data from the light quantity ratio collected with the second and fourth delay times t2 and t4.
  • the light amount change collected by the first image sensor is stored in a first memory, and the light amount change collected by the second image sensor is stored in a second memory. That is, they are stored in different memories.
  • The controller 300 controls the first image sensor to receive light with the first delay time t1 or the third delay time t3, and the second image sensor to receive light with the second delay time t2 or the fourth delay time t4.
  • One image is formed from the images acquired by the first and second frames.
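The two-sensor frame interleaving described here can be sketched as a simple schedule: frame 1 gates the two sensors simultaneously with (t1, t2), frame 2 with (t3, t4), and so on alternately. The delays t1, t2, and t3 follow the values quoted earlier; t4 is a placeholder assumption.

```python
def frame_schedule(n_frames, delays_ns=(100.0, 600.0, 200.0, 1100.0)):
    """Per frame, the (sensor 1, sensor 2) gate delays in nanoseconds:
    even-indexed frames use (t1, t2), odd-indexed frames use (t3, t4),
    so each sensor's two measurements land in different memories/frames."""
    t1, t2, t3, t4 = delays_ns
    return [(t1, t2) if f % 2 == 0 else (t3, t4) for f in range(n_frames)]
```

Each sensor then contributes one near-pair delay and one far-pair delay across two consecutive frames, from which the controller computes the two light quantity ratios.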

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The present invention relates to a night vision output device comprising: a light pulse output unit for emitting pulsed light at specific periods; a photographing unit having an image sensor for forming a plurality of images using pulsed light reflected by an external object; a display unit for providing a final image formed by synthesizing the plurality of images; and a control unit for calculating distance information of an object displayed in each pixel using data of a light quantity ratio for each pixel of the final image and a light quantity ratio per distance, wherein, within one frame, the control unit controls the image sensor so that it is activated with different delay times based on an output end time of the pulsed light.
PCT/KR2017/006157 2016-06-13 2017-06-13 Night vision display device WO2017217745A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP17813569.5A EP3471399A4 (fr) 2016-06-13 2017-06-13 Night vision display device
US16/309,371 US11560091B2 (en) 2016-06-13 2017-06-13 Night vision display device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201662349134P 2016-06-13 2016-06-13
US62/349,134 2016-06-13
KR1020170074250A KR101946941B1 (ko) 2016-06-13 2017-06-13 Night image display device
KR10-2017-0074250 2017-06-13

Publications (1)

Publication Number Publication Date
WO2017217745A1 true WO2017217745A1 (fr) 2017-12-21

Family

ID=60663670

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/006157 WO2017217745A1 (fr) 2016-06-13 2017-06-13 Night vision display device

Country Status (1)

Country Link
WO (1) WO2017217745A1 (fr)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1376154A1 * 2002-06-12 2004-01-02 Ford Global Technologies, LLC Active night vision system for a motor vehicle
KR100778904B1 * 2004-09-17 2007-11-22 마츠시다 덴코 가부시키가이샤 Range image sensor
US20150144790A1 * 2013-11-27 2015-05-28 Aptina Imaging Corporation Video and 3d time-of-flight image sensors
KR20150086901A * 2014-01-21 2015-07-29 (주)팜비젼 System and method for supporting pedestrian recognition during night-time vehicle driving
KR20160061132A * 2014-11-21 2016-05-31 아이티에스엔지니어링 주식회사 Vehicle license plate recognition apparatus and method, and computer-readable recording medium storing a program for performing the method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3471399A4 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108120990A (zh) * 2017-12-29 2018-06-05 山东神戎电子股份有限公司 Method for improving the ranging accuracy of a range-gated night vision device
CN108120990B (zh) * 2017-12-29 2021-03-23 山东神戎电子股份有限公司 Method for improving the ranging accuracy of a range-gated night vision device
US11591199B2 (en) 2018-11-05 2023-02-28 Oshkosh Corporation Leveling system for lift device

Similar Documents

Publication Publication Date Title
WO2017111201A1 (fr) Night image display apparatus and image processing method therefor
KR101946941B1 (ko) Night image display device
US20050220450A1 Image-pickup apparatus and method having distance measuring function
WO2014081107A1 (fr) Method and device for obtaining a 3D image
EP1895766A1 (fr) Camera with two or more fields of view
WO2017217745A1 (fr) Night vision display device
WO2013018962A1 (fr) Traffic lane recognition apparatus and method thereof
WO2017195965A1 (fr) Apparatus and method for processing an image according to vehicle speed
WO2016068353A1 (fr) Image-processing-based distance measurement equipment and method therefor
WO2015102280A1 (fr) Stereo camera device and rectification method therefor
WO2019172500A1 (fr) Visibility measurement device based on video analysis using artificial intelligence
WO2020054975A1 (fr) Method for controlling steel-reinforcement straightening equipment and apparatus therefor
US11418707B2 Electronic device and notification method
WO2022080586A1 (fr) Lidar system capable of reducing power consumption and operating method thereof
WO2018074707A1 (fr) Rain sensor for a vehicle and vehicle wiper drive device equipped with same
JP2007502588A (ja) Exposure control method and device for a camera
US20220375064A1 Vehicle imaging station
WO2015147457A1 (fr) Vehicle image photographing apparatus using exposure control and method therefor
JP4756938B2 (ja) Position detection device and position detection method for an imaging device
WO2017026566A1 (fr) Three-dimensional scanning device and method for generating a scanned three-dimensional image of a pipe
US20100220210A1 Interactive system capable of improving image processing
JPS6341000A (ja) Vehicle speed measuring device
JPH0996528A (ja) Inter-vehicle distance detection apparatus and method
WO2019098422A1 (fr) Object tracking device and object tracking method
JP3224875B2 (ja) Image-based method for detecting vehicles ignoring a traffic signal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17813569

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2017813569

Country of ref document: EP

Effective date: 20190114