WO2019026513A1 - Image monitoring device, image monitoring method, image monitoring program, and recording medium - Google Patents


Info

Publication number
WO2019026513A1
WO2019026513A1 (PCT/JP2018/025299)
Authority
WO
WIPO (PCT)
Prior art keywords
image monitoring
irradiated
area
luminance value
average luminance
Prior art date
Application number
PCT/JP2018/025299
Other languages
English (en)
Japanese (ja)
Inventor
釣部 智行
信幸 廣瀬
浩平 山口
Original Assignee
パナソニックIpマネジメント株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パナソニックIpマネジメント株式会社
Publication of WO2019026513A1
Priority to US16/683,311 (published as US10791252B2)

Classifications

    • H04N 23/52: Elements optimising image sensor operation, e.g. for electromagnetic interference [EMI] protection or temperature control by heat transfer or cooling elements
    • H04N 23/811: Camera processing pipelines; components thereof for suppressing or minimising disturbance in the image signal generation by dust removal, e.g. from surfaces of the image sensor or processing of the image signal output by the electronic image sensor
    • H04N 23/56: Cameras or camera modules comprising electronic image sensors provided with illuminating means
    • H04N 23/71: Circuitry for evaluating the brightness variation
    • H04N 23/74: Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • B60R 2300/8066: Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the intended use of the viewing arrangement for monitoring rearward traffic
    • G06T 2207/30252: Vehicle exterior; vicinity of vehicle

Definitions

  • the present disclosure relates to an image monitoring device, an image monitoring method, an image monitoring program, and a recording medium.
  • Monitoring devices have been proposed that analyze an image obtained from a camera mounted on a vehicle to detect another vehicle or a person. When dirt such as mud adheres to the camera, accurate detection cannot be performed, so methods for detecting dirt on the camera have also been proposed (for example, Patent Documents 1 and 2).
  • The present disclosure provides an image monitoring apparatus, an image monitoring method, and an image monitoring program that remain suitable even under conditions in which dirt on the camera may be erroneously detected.
  • An image monitoring device includes a luminance calculation unit and a darkness determination unit.
  • The luminance calculation unit calculates, in a video signal generated by shooting using a lens, the average luminance value of an irradiated area illuminated by a lamp and the average luminance value of a non-irradiated area not illuminated by the lamp.
  • The darkness determination unit performs darkness determination by comparing the difference between the average luminance value of the irradiated area and the average luminance value of the non-irradiated area with a first threshold.
  • In an image monitoring method, the average luminance value of an irradiated area illuminated by the lamp and the average luminance value of a non-irradiated area not illuminated by the lamp are calculated in the video signal. Darkness determination is then performed by comparing the difference between the average luminance value of the irradiated area and the average luminance value of the non-irradiated area with the first threshold.
  • An image monitoring program causes a computer to function as the luminance calculation unit and the darkness determination unit.
  • A non-transitory recording medium stores the image monitoring program in a form readable by the computer.
  • According to the present disclosure, an image monitoring device, an image monitoring method, an image monitoring program, and a recording medium are provided that remain suitable even when dirt on the camera may be erroneously detected.
  • FIG. 1 is a block diagram showing a schematic configuration of an image monitoring device.
  • FIG. 2 is a flowchart showing an example of the processing operation of the image monitoring apparatus shown in FIG. 1.
  • FIG. 3 is a diagram schematically showing an irradiated area and a non-irradiated area for calculating a luminance average value.
  • FIG. 4 is a diagram schematically showing an image generated by shooting with a rear camera when moving forward with the road surface bright and the license lamp lit.
  • FIG. 5 is a diagram schematically showing an image generated by shooting with a rear camera when moving forward with the license lamp lit in the dark.
  • FIG. 6 is a diagram schematically showing a region with many blurred outlines in the image shown in FIG. 5.
  • In one prior-art method, the number M of pixels whose edge amount is medium or more (at or above a threshold TH1) and the number N of pixels whose edge amount is large (at or above a threshold TH2, where TH2 > TH1) are determined in the video signal. When a state in which M is larger than a threshold TH3 and N/M is smaller than a threshold TH4 (that is, a state with many blurred contours) continues for a predetermined time T, the lens is determined to be contaminated.
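  • The blurred-contour criterion above can be sketched in code. This is only a minimal illustration under stated assumptions, not the patented implementation: the edge operator (a simple gradient magnitude), the function name, and the threshold values are assumptions, and the "continues for time T" persistence check is omitted.

```python
import numpy as np

def blur_contour_suspect(gray, th1, th2, th3, th4):
    """Sketch of the blurred-contour criterion: M counts pixels with a
    medium-or-larger edge amount, N counts pixels with a large edge amount;
    many edges (M > TH3) that are mostly soft (N/M < TH4) suggest lens dirt.
    The edge amount here is a plain gradient magnitude (an assumption)."""
    gy, gx = np.gradient(gray.astype(float))
    edge = np.hypot(gx, gy)          # per-pixel edge amount
    m = int((edge >= th1).sum())     # medium or more
    n = int((edge >= th2).sum())     # large
    if m <= th3:
        return False                 # too few edges to judge
    return (n / m) < th4             # mostly blurred contours
```

A soft luminance ramp (many weak edges, no strong ones) trips this criterion, while a scene with only a few sharp edges does not.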
  • the area irradiated with the license lamp is referred to as an irradiated area 31, and the area not irradiated with the license lamp is referred to as a non-irradiated area 33.
  • FIG. 4 is a view schematically showing an image generated by shooting of a rear camera when moving forward with the road surface bright and the license lamp lit.
  • When the road surface is bright, there is no large difference between the luminance of the irradiated area 31 and that of the non-irradiated area 33, so a state with many blurred contours either does not occur or, if it does, does not continue for long. Moreover, the illumination of the license lamp is hard to see, so blurring of the boundary between the irradiated area 31 and the non-irradiated area 33 is difficult to observe. It is therefore correctly determined that no dirt is attached to the lens.
  • FIG. 5 is a view schematically showing an image generated by shooting of the rear camera when moving forward with the license lamp lit in the dark.
  • FIG. 6 is a view schematically showing an area (hereinafter, blurred area) 35 having many blurred contours in the image shown in FIG.
  • In the dark, the irradiated area 31 is bright and the non-irradiated area 33 is dark. The blurred area 35 therefore always occurs at the boundary between the irradiated area 31 and the non-irradiated area 33, and the state with many blurred contours continues for a long time. As a result, even if the lens is not dirty, it is erroneously determined to be dirty.
  • When the scene is bright, both the irradiated area 31 and the non-irradiated area 33 are sufficiently bright. The image changes from moment to moment in both areas, so the inter-frame difference is large. The area where the integrated value of the inter-frame difference is at or below a predetermined value is therefore small, and it is correctly determined that no dirt is attached to the lens.
  • FIG. 7 is a view schematically showing a region 37 in which the inter-frame difference is small in the video shown in FIG.
  • In the dark, the image of the non-irradiated area 33 changes from moment to moment, and the inter-frame difference there is large.
  • The image of the irradiated area 31, however, is so bright that it hardly changes, so the inter-frame difference there is small. In a large area 37, such as the entire irradiated area 31, the integrated value of the inter-frame difference therefore falls at or below the predetermined value, and the lens is erroneously determined to be contaminated even when it is not.
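  • The inter-frame-difference criterion discussed above can be sketched as follows. The function name and the threshold value are illustrative assumptions, not values from the patent.

```python
import numpy as np

def static_area_fraction(frames, diff_thresh=5.0):
    """Accumulate absolute inter-frame differences over a sequence of
    grayscale frames and return the fraction of pixels whose accumulated
    difference stays at or below a threshold. A large static fraction is
    what the prior-art criterion takes as suspected lens dirt."""
    acc = np.zeros(frames[0].shape, dtype=float)
    for prev, cur in zip(frames, frames[1:]):
        acc += np.abs(cur.astype(float) - prev.astype(float))
    return float((acc <= diff_thresh).mean())
```

In a dark scene, the over-bright irradiated area barely changes between frames, so its pixels fall into this static fraction and can trigger a false dirt detection.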
  • Thus, even when the camera is not dirty, it may be erroneously detected as dirty in the dark. As described above, the prior art cannot always detect dirt accurately.
  • FIG. 1 is a block diagram showing a schematic configuration of an image monitoring apparatus according to the embodiment.
  • FIG. 2 is a flow chart showing an example of the processing operation of the image monitoring apparatus shown in FIG.
  • FIG. 3 is a view schematically showing the irradiated area 31 and the non-irradiated area 33 for calculating the luminance average value.
  • The image monitoring apparatus includes a dirt adhesion detection unit 1, a luminance calculation unit 2, and a darkness determination unit 3. Some or all of these functional units may be implemented in hardware or in software; in the latter case, each functional unit is realized by a processor executing a predetermined image monitoring program.
  • the image monitoring program may be stored in a non-transitory recording medium such as a disk memory readable by a computer.
  • The image monitoring apparatus receives a video signal generated by shooting through the lens of a rear camera (not shown; for example, a fisheye lens), and the video signal is input to the dirt adhesion detection unit 1 and the luminance calculation unit 2.
  • the dirt adhesion detection unit 1 detects whether dirt is attached to the lens of the rear camera based on the input video signal, and outputs the result.
  • As a specific dirt detection method for the dirt adhesion detection unit 1, the method described in Patent Document 1 or Patent Document 2 above can be applied.
  • the luminance calculation unit 2 calculates the average luminance value YaveA of the irradiated area 31 and the average luminance value YaveB of the non-irradiated area 33 in the video signal.
  • The darkness determination unit 3 determines whether the vehicle is traveling in darkness based on the average luminance value YaveA of the irradiated area 31 and the average luminance value YaveB of the non-irradiated area 33.
  • the dirt adhesion detection unit 1 analyzes the video signal to detect dirt attached to the lens (step S1). Darkness determination is performed in parallel with this.
  • the luminance calculation unit 2 calculates the average luminance value YaveA of the irradiated area 31 and the average luminance value YaveB of the non-irradiated area 33 in the video signal (steps S2 and S3).
  • The average luminance value YaveA of the irradiated area 31 may be the average luminance value of the entire irradiated area 31, or the average luminance value of a part 21 of the irradiated area 31.
  • Likewise, the average luminance value YaveB of the non-irradiated area 33 may be the average luminance value of the entire non-irradiated area 33, or the average luminance value of a part 23 of the non-irradiated area 33. In either case, the region of the video signal over which each luminance average value is calculated is set in advance.
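  • The per-region averaging performed by the luminance calculation unit 2 can be sketched with a preset boolean mask over the luma plane. The function and variable names are illustrative assumptions, not from the patent.

```python
import numpy as np

def region_mean_luminance(y_plane, region_mask):
    """Average luminance over a preset region of the luma (Y) plane.
    region_mask is a boolean array marking either the irradiated area
    (or a part 21 of it) or the non-irradiated area (or a part 23 of it),
    fixed in advance as the description requires."""
    return float(y_plane[region_mask].mean())
```

YaveA and YaveB are then simply two calls with the two preset masks.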
  • The darkness determination unit 3 determines whether it is dark based on the average luminance values YaveA and YaveB; specifically, it compares the difference between YaveA and YaveB with a predetermined threshold TH5. More specifically, the darkness determination unit 3 determines that it is dark (step S5a) if the following equation (1), reconstructed here from the description, is satisfied (YES in step S4), and that it is not dark (step S5b) otherwise (NO in step S4):
    YaveA − YaveB ≥ TH5 … (1)
  • The darkness determination unit 3 then outputs the determination result (step S6).
  • the image monitoring apparatus outputs not only the dirt adhesion detection result but also the darkness determination result.
  • This is useful because the accuracy of the dirt adhesion detection by the dirt adhesion detection unit 1 is lower when it is dark than when it is not dark.
  • The darkness determination unit 3 may perform the darkness determination taking into account, in addition, a comparison between the average luminance value YaveA of the irradiated area 31 and a predetermined threshold TH6. Specifically, the darkness determination unit 3 may determine that it is dark when equation (1) above and the following equation (2), reconstructed here from the description, are both satisfied:
    YaveA ≥ TH6 … (2)
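  • The two comparisons described above (the difference test against TH5 and the optional brightness test against TH6) can be sketched as a single predicate. The function name and the threshold values used in the example are assumptions for illustration only.

```python
def is_dark(yave_a, yave_b, th5, th6=None):
    """Darkness determination as described: the difference between the
    irradiated-area average YaveA and the non-irradiated-area average YaveB
    is compared with TH5; if TH6 is given, the irradiated area itself must
    also be bright enough, confirming the lamp is visibly lit."""
    if (yave_a - yave_b) < th5:           # difference test against TH5
        return False
    if th6 is not None and yave_a < th6:  # optional brightness test against TH6
        return False
    return True
```

With a lit license lamp in the dark, YaveA is high and YaveB is low, so the predicate holds; on a bright road surface the difference stays small and it does not.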
  • The dirt adhesion detection unit 1 may also receive the determination result of the darkness determination unit 3 and reflect it in the dirt adhesion detection.
  • the present disclosure is useful, for example, as an image monitoring device that analyzes an image obtained from a camera mounted on a vehicle to detect another vehicle or a person.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Studio Devices (AREA)

Abstract

This image monitoring device is provided with a luminance calculation unit and a darkness determination unit. The luminance calculation unit calculates, in a video signal generated by imaging using a lens, the average luminance value of an irradiated area illuminated by a lamp and the average luminance value of a non-irradiated area not illuminated by the lamp. The darkness determination unit performs darkness determination by comparing, with a first threshold value, the difference between the average luminance value of the irradiated area and the average luminance value of the non-irradiated area.
PCT/JP2018/025299 2017-08-01 2018-07-04 Image monitoring device, image monitoring method, image monitoring program, and recording medium WO2019026513A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/683,311 US10791252B2 (en) 2017-08-01 2019-11-14 Image monitoring device, image monitoring method, and recording medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017149224A JP2019029897A (ja) 2017-08-01 2017-08-01 画像監視装置、画像監視方法および画像監視プログラム
JP2017-149224 2017-08-01

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/683,311 Continuation US10791252B2 (en) 2017-08-01 2019-11-14 Image monitoring device, image monitoring method, and recording medium

Publications (1)

Publication Number Publication Date
WO2019026513A1 (fr) 2019-02-07

Family

ID=65233608

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/025299 WO2019026513A1 (fr) Image monitoring device, image monitoring method, image monitoring program, and recording medium

Country Status (3)

Country Link
US (1) US10791252B2 (fr)
JP (1) JP2019029897A (fr)
WO (1) WO2019026513A1 (fr)


Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6772113B2 (ja) * 2017-08-02 2020-10-21 クラリオン株式会社 付着物検出装置、および、それを備えた車両システム
JP7156225B2 (ja) * 2019-09-20 2022-10-19 株式会社デンソーテン 付着物検出装置および付着物検出方法
JP7172931B2 (ja) * 2019-09-20 2022-11-16 株式会社デンソーテン 付着物検出装置および付着物検出方法
JP7188336B2 (ja) * 2019-09-20 2022-12-13 株式会社デンソーテン 付着物検出装置、および付着物検出方法
JP7200894B2 (ja) * 2019-09-20 2023-01-10 株式会社デンソーテン 付着物検出装置および付着物検出方法
JP7455284B2 (ja) * 2021-11-02 2024-03-25 三菱電機ビルソリューションズ株式会社 汚れ判定装置

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003189294A (ja) * 2001-12-13 2003-07-04 Secom Co Ltd 画像監視装置
JP2007293672A (ja) * 2006-04-26 2007-11-08 Toyota Motor Corp 車両用撮影装置、車両用撮影装置の汚れ検出方法
JP2014056382A (ja) * 2012-09-12 2014-03-27 Mitsubishi Electric Corp 2次元コード読取装置および2次元コード読取方法

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS52149552A (en) 1976-06-07 1977-12-12 Shibaura Eng Works Ltd Nut runner
JP3807331B2 (ja) 2002-03-06 2006-08-09 日産自動車株式会社 カメラの汚れ検出装置およびカメラの汚れ検出方法
JP2007208865A (ja) 2006-02-06 2007-08-16 Clarion Co Ltd カメラ状態検知システム
US9319637B2 (en) * 2012-03-27 2016-04-19 Magna Electronics Inc. Vehicle vision system with lens pollution detection
JP6117634B2 (ja) 2012-07-03 2017-04-19 クラリオン株式会社 レンズ付着物検知装置、レンズ付着物検知方法、および、車両システム
MX343203B (es) 2012-07-27 2016-10-28 Nissan Motor Dispositivo de deteccion de objeto tridimensional y metodo de deteccion de objeto tridimensional.
JP6245875B2 (ja) 2013-07-26 2017-12-13 クラリオン株式会社 レンズ汚れ検出装置およびレンズ汚れ検出方法


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023146714A (ja) * 2022-03-29 2023-10-12 パナソニックIpマネジメント株式会社 画像監視装置
JP7398644B2 (ja) 2022-03-29 2023-12-15 パナソニックIpマネジメント株式会社 画像監視装置

Also Published As

Publication number Publication date
US20200084356A1 (en) 2020-03-12
US10791252B2 (en) 2020-09-29
JP2019029897A (ja) 2019-02-21

Similar Documents

Publication Publication Date Title
WO2019026513A1 (fr) Image monitoring device, image monitoring method, image monitoring program, and recording medium
US10755417B2 (en) Detection system
CN102985957B (zh) 车辆周围监测装置
US8391612B2 (en) Edge detection with adaptive threshold
JP6257792B2 (ja) カメラの被覆状態の認識方法、カメラシステム、及び自動車
WO2020059565A1 (fr) Dispositif d'acquisition de profondeur, procédé d'acquisition de profondeur et programme
WO2019026785A1 (fr) Dispositif de détection d'objet fixé, et système de véhicule pourvu de celui-ci
JP5759950B2 (ja) 車載カメラ装置
US11100616B2 (en) Optical surface degradation detection and remediation
US10354413B2 (en) Detection system and picture filtering method thereof
CN107710279B (zh) 静态脏污检测与校正
WO2019026457A1 (fr) Dispositif de surveillance d'image, procédé de surveillance d'image, programme de surveillance d'image, et support d'enregistrement
EP3855215A1 (fr) Dispositif d'acquisition de profondeur, procédé d'acquisition de profondeur et programme
JP2006261761A (ja) 画像信号処理装置
US10803625B2 (en) Detection system and picturing filtering method thereof
JP4491360B2 (ja) 画像信号処理装置
US9589337B2 (en) Apparatus and method for recovering images damaged by weather phenomena
JP4087600B2 (ja) 画像監視装置
JP6970911B2 (ja) 汚れ検出装置の制御方法、および汚れ検出装置
KR101611273B1 (ko) 순차적 적외선 영상을 이용한 피사체 검출 시스템 및 방법
JPH11211845A (ja) 降雨雪検出方法およびその装置
US11393128B2 (en) Adhered substance detection apparatus
JP6362945B2 (ja) 車載画像処理装置
KR101300279B1 (ko) 다양한 조명 환경에서의 차량 검출
JP5722266B2 (ja) 光点検出装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18840739

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18840739

Country of ref document: EP

Kind code of ref document: A1