JPWO2019142586A1 - Image processing system and light distribution control system - Google Patents

Image processing system and light distribution control system

Info

Publication number
JPWO2019142586A1
Authority
JP
Japan
Prior art keywords
luminance
calculation formula
image
luminance calculation
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2019565778A
Other languages
Japanese (ja)
Other versions
JP6894536B2 (en)
Inventor
宏治 土井
広道 大塚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Astemo Ltd
Original Assignee
Hitachi Automotive Systems Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Automotive Systems Ltd filed Critical Hitachi Automotive Systems Ltd
Publication of JPWO2019142586A1 publication Critical patent/JPWO2019142586A1/en
Application granted granted Critical
Publication of JP6894536B2 publication Critical patent/JP6894536B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/02 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
    • B60Q1/04 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights
    • B60Q1/14 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights having dimming means
    • B60Q1/1415 Dimming circuits
    • B60Q1/1423 Automatic dimming circuits, i.e. switching between high beam and low beam due to change of ambient light or light level in road traffic
    • B60Q1/143 Automatic dimming circuits, i.e. switching between high beam and low beam due to change of ambient light or light level in road traffic combined with another condition, e.g. using vehicle recognition from camera images or activation of wipers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/02 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
    • B60Q1/04 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights
    • B60Q1/06 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle
    • B60Q1/08 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle automatically
    • B60Q1/085 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle automatically due to special conditions, e.g. adverse weather, type of road, badly illuminated road signs or potential dangers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q2300/00 Indexing codes for automatically adjustable headlamps or automatically dimmable headlamps
    • B60Q2300/30 Indexing codes relating to the vehicle environment
    • B60Q2300/33 Driving situation
    • B60Q2300/337 Tunnels or bridges
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q2300/00 Indexing codes for automatically adjustable headlamps or automatically dimmable headlamps
    • B60Q2300/40 Indexing codes relating to other road users or special conditions
    • B60Q2300/41 Indexing codes relating to other road users or special conditions preceding vehicle
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q2300/00 Indexing codes for automatically adjustable headlamps or automatically dimmable headlamps
    • B60Q2300/40 Indexing codes relating to other road users or special conditions
    • B60Q2300/42 Indexing codes relating to other road users or special conditions oncoming vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The present invention provides an image processing system capable of identifying a detection target with high accuracy without increasing cost. The system includes an imaging device 11 and an image processing device 1 having a luminance calculation unit 14 that generates a luminance image from the image captured by the imaging device using a luminance calculation formula, and a detection unit 15 that detects a predetermined object based on the luminance image. The image processing device 1 further includes a luminance calculation formula changing unit 13 that changes the luminance calculation formula used by the luminance calculation unit according to the luminance index value of a predetermined region in the luminance image or the type of the predetermined object detected by the detection unit.

Description

The present invention relates to a vehicle-mounted image processing system and a light distribution control system that controls the light distribution of vehicle headlights.

Inventions relating to vehicle-mounted image processing systems are conventionally known (see Patent Document 1 below). The image processing system described in Patent Document 1 includes imaging means mounted on a vehicle and image analysis means that acquires an image taken by the imaging means and analyzes it. The image analysis means analyzes the captured image to detect the position of the red-glowing tail light of a preceding vehicle.

In this image processing system, red pixels are first detected in the captured image, and adjacent red pixels are connected to form an "element group" (i.e., a light spot). The system then distinguishes vehicles (tail lights) from other objects (reflectors, etc.) based on features of the element group, for example the size of a rectangle inscribed in or circumscribing the group, or the maximum and minimum luminance of the pixels it contains.

Japanese Unexamined Patent Publication No. 2014-232431

The image processing system of Patent Document 1 has the problem of erroneously detecting objects whose color is close to red, such as an orange reflector on the road surface, as the tail light of a preceding vehicle.

To improve the accuracy of tail light detection, it is preferable to capture images under exposure conditions suited to determining the color and brightness of the target. However, when a plurality of recognition functions run simultaneously on one system, capturing images under the optimal exposure conditions for each function requires an expensive camera and circuitry. Conversely, to reduce cost, the recognition functions may have to share exposure conditions, so that tail light detection must be performed on an image whose brightness is not necessarily appropriate for that purpose. The same problem occurs not only when the detection target is the tail light of another vehicle, but also when detecting self-luminous emitters such as the headlights of an oncoming vehicle or traffic lights. Furthermore, it is not limited to light emitters: a similar problem can arise when, for example, detecting another vehicle in the daytime and the captured image suffers blocked-up shadows or blown-out highlights at a tunnel entrance or exit.

The present invention has been made in view of the above problems, and its object is to provide an image processing system capable of identifying a detection target with high accuracy without increasing cost.

The present application includes a plurality of means for solving the above problems. To give one example, an image processing system comprises an imaging device and an image processing device having a luminance calculation unit that generates a luminance image from an image captured by the imaging device using a luminance calculation formula, and a detection unit that detects a predetermined object based on the luminance image, the image processing device being characterized by further comprising a luminance calculation formula changing unit that changes the luminance calculation formula used by the luminance calculation unit according to the luminance index value of a predetermined region in the luminance image or the type of the predetermined object detected by the detection unit.

According to the present invention, the detection rate of a detection target can be improved without increasing cost, even in situations where a captured image with exposure conditions suited to detecting that target cannot be obtained.

FIG. 1: Schematic configuration diagram of the vehicle control system 100.
FIG. 2: Explanatory view of the predetermined area (front vehicle area) A set on the luminance image.
FIG. 3: Flowchart of the light distribution control process by the vehicle control system 100.
FIG. 4: Flowchart of the luminance calculation formula determination process (change process) by the luminance calculation formula changing unit 13.
FIG. 5: Example image of a preceding vehicle's tail light and a reflector.
FIG. 6: Relationship between distance and luminance value for the pixels of images 51 and 52 of a tail light and a reflector under the same exposure conditions.
FIG. 7: Example of the luminance image and luminance distribution of an orange reflector on the road.
FIG. 8: Example flowchart of the luminance calculation formula determination process (change process) by the luminance calculation formula changing unit 13.
FIG. 9: Example flowchart of the luminance calculation formula determination process (change process) by the luminance calculation formula changing unit 13.
FIG. 10: Example flowchart of the luminance calculation formula determination process (change process) by the luminance calculation formula changing unit 13.
FIG. 11: Table summarizing the contents of each luminance calculation formula and the conditions under which each is selected.

Hereinafter, embodiments of the present invention will be described with reference to the drawings.
FIG. 1 is a schematic configuration diagram of the vehicle control system according to the present embodiment. The vehicle control system 100 in this figure is provided in a vehicle (sometimes referred to as the own vehicle) and includes a camera 11, an imaging device installed at the front of the vehicle that periodically captures time-series images of a predetermined area ahead of the vehicle; an image processing device 1 that processes the images (captured images) taken by the camera 11; and a vehicle control device 2 that controls the vehicle based on the processing results of the image processing device 1.

The image processing device 1 and the vehicle control device 2 are each computers (for example, microcomputers), each having, for example, an input unit, a central processing unit (CPU or MPU) as a processor, a read-only memory (ROM) and a random access memory (RAM) as storage devices, and an output unit. The input unit converts the various information input to the image processing device 1 and the vehicle control device 2 into a form the CPU can process. The ROM is a recording medium storing a control program that executes the arithmetic processing described later, together with the various information needed to execute it; the CPU performs predetermined arithmetic processing on signals taken from the input unit, ROM, and RAM in accordance with the control program stored in the ROM. The output unit outputs commands for controlling the controlled objects and information used by those objects. The storage devices are not limited to semiconductor memories such as the above ROM and RAM, and can be replaced by, for example, a magnetic storage device such as a hard disk drive.

The image processing device 1 is a computer that executes external-environment recognition processing, including object detection, based on the images (color images) captured by the camera 11, and includes an image acquisition unit 12, a luminance calculation formula changing unit 13, a luminance calculation unit 14, and a detection unit 15. The vehicle control device 2 is a computer that executes vehicle control processing based on the detection results of the detection unit 15 output by the image processing device 1, and includes a light distribution control unit 16 and a constant-speed traveling / inter-vehicle distance control unit 17. Although the image processing device 1 and the vehicle control device 2 in FIG. 1 are configured as two separate computers, they may be configured as a single computer.

The image acquisition unit 12 controls the camera 11, periodically setting the exposure time (shutter speed) and performing imaging to acquire a time series of color images, which it stores in the image processing device 1. The color images are defined in the RGB color model: the color of each pixel is defined by a combination of the brightnesses of red (R), green (G), and blue (B). In this embodiment, the brightness of each of R, G, and B is expressed as an integer value from 0 to 255 (that is, 256 gradations), and the color of each pixel is defined by the combination of three values (R value, G value, B value). Color images are captured at a predetermined cycle, for example 30 frames per second.

The luminance calculation unit (luminance image generation unit) 14 generates a luminance image (Y) from the image (color image) captured by the camera 11, using the luminance calculation formula determined by the luminance calculation formula changing unit 13. The generated luminance image is stored in the image processing device 1 and output to the detection unit 15 and the luminance calculation formula changing unit 13.

The luminance calculation formula changing unit 13 changes and determines the luminance calculation formula used by the luminance calculation unit 14 according to: the luminance index value of a predetermined region A located in the center of a luminance image generated in the past (for example, a predetermined number of frames earlier) by the luminance calculation unit 14; the output of the detection unit 15, which includes information on the predetermined object the detection unit 15 is to detect; and the output of the vehicle control device 2, which includes information on the application the vehicle control device 2 is executing. In this embodiment, four luminance calculation formulas (the first to fourth luminance calculation formulas, described later) are used, and the luminance calculation formula changing unit 13 selects one of them. Each luminance calculation formula is defined as the sum of the R value, G value, and B value each multiplied by a predetermined coefficient; the luminance calculation unit 14 calculates the luminance of a pixel by substituting the pixel's R, G, and B values into this formula. Calculating the luminance of every pixel of the image captured by the camera 11 and assembling the results yields the luminance image.
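The weighted-sum computation described above can be sketched as follows. This is a minimal illustration, not code from the patent: the function name and data layout are assumptions, and the default coefficients shown are those of equation (1) introduced later.

```python
# Hypothetical sketch of the luminance calculation described in the text:
# each pixel's luminance Y is the sum of its R, G, B values multiplied by
# predetermined coefficients. The default coefficients are those of the
# first luminance calculation formula (equation (1)).

def luminance_image(rgb_image, coeffs=(0.299, 0.587, 0.114)):
    """rgb_image: rows of (R, G, B) tuples with values 0-255.

    Returns a luminance image of the same shape, one Y value per pixel.
    """
    cr, cg, cb = coeffs
    return [[cr * r + cg * g + cb * b for (r, g, b) in row]
            for row in rgb_image]
```

Isolating the formula as a coefficient tuple is what makes it swappable: passing, say, `coeffs=(0.0, 0.0, 1.0)` reproduces the second formula (Y = B) without changing the rest of the pipeline.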

The predetermined area A is a predetermined region set in each luminance image. It can be set according to various rules; an area A defined relative to the position of a vehicle traveling ahead of the own vehicle is specifically referred to as the front vehicle area A. FIG. 2 illustrates the predetermined area A. In the luminance image 71 shown in the upper row, the circumscribing rectangle 75 marks a preceding vehicle that had been detected by the detection unit 15 until just before, but became undetectable because the image was blown out at a tunnel exit; the predetermined area (front vehicle area) A1 is determined by defining a rectangle of predetermined size relative to the position of that vehicle (circumscribing rectangle 75). Similarly, in the luminance image 72 shown in the middle row, the circumscribing rectangle 76 marks a preceding vehicle that became undetectable because the image was blacked out at a tunnel entrance, and the predetermined area (front vehicle area) A1 is determined by defining a rectangle of predetermined size relative to its position.
The luminance image shown in the lower row likewise shows the circumscribing rectangle 76 of the preceding vehicle, but here the predetermined area A2 is set independently of the vehicle's position, as a closed region (for example, a rectangle) of predetermined size at a predetermined location in the center of the luminance image (that is, the center of the captured image when the camera 11 faces the own vehicle's direction of travel). Although only a preceding vehicle traveling in the same direction as the own vehicle has been described here, the predetermined area A may also be set based on the position of an oncoming vehicle traveling in the opposite direction.

The luminance calculation formula changing unit 13 of this embodiment can change the luminance calculation formula based on the luminance index value of the predetermined area A. The luminance index value is an index of the collective luminance of the pixels contained in area A; for example, the average luminance of all pixels in area A (or of an appropriately selected subset), or the maximum or minimum luminance of those pixels, can be used. In this embodiment, the average luminance of all pixels in the predetermined area A is used as the luminance index value.
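For concreteness, the average-luminance index over a rectangular area A can be sketched as below. This is a minimal illustration; the function name and the row/column parameterization are assumptions, not the patent's interface.

```python
# Hypothetical sketch: the luminance index value of a predetermined area A,
# here the average luminance of all pixels inside a rectangle given by its
# top-left corner and size, as in the embodiment described above.

def luminance_index(y_image, top, left, height, width):
    """y_image: rows of per-pixel luminance values; returns the mean over A."""
    values = [y_image[r][c]
              for r in range(top, top + height)
              for c in range(left, left + width)]
    return sum(values) / len(values)
```

The maximum or minimum variants mentioned in the text would simply replace the mean with `max(values)` or `min(values)`.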

The detection unit 15 detects a predetermined object based on at least one of the color image acquired by the image acquisition unit 12 and the luminance image generated by the luminance calculation unit 14, and outputs the detection result to the vehicle control device 2. The "predetermined object" to be detected by the detection unit 15 may be determined by the type of application (program) executed by the vehicle control device 2. When the light distribution control application of the light distribution control unit 16 is executed, the tail lights (red-type light emitters) of other vehicles (preceding vehicles, stopped vehicles, etc.) and the headlights (white-type light emitters) of other vehicles (oncoming vehicles) can be the predetermined objects; when such an object is detected, a vehicle can be regarded as detected. When the constant-speed traveling / inter-vehicle distance control (ACC: Adaptive Cruise Control) application of the control unit 17 is executed, the tail lights (red-type light emitters) of other vehicles (preceding vehicles, stopped vehicles, etc.), including tail lights captured under dark exposure conditions (for example, a relatively short exposure time), can be the predetermined objects.

The light distribution control unit 16 in the vehicle control device 2 executes light distribution control (the light distribution control application) of the own vehicle's headlights based on the detection results (vehicle detection results) of the detection unit 15 of the image processing device 1. One form of light distribution control sets the own vehicle's headlights to high beam when no preceding or oncoming vehicle is present ahead, and to low beam when one is present. Another, when another vehicle is detected ahead, determines a light-shielding region of the own vehicle's headlights such that the other vehicle is excluded from the illuminated area, and controls the headlight light distribution based on that region. In the latter case, the vehicle is equipped with actuators (for example, motors) that change the optical axes of the headlights and a shade mechanism that blocks part of each headlight's light, and the control unit 16 outputs control signals to them.

Based on the detection results (vehicle detection results) of the detection unit 15 of the image processing device 1, the constant-speed traveling / inter-vehicle distance control unit 17 executes ACC (the ACC application), appropriately controlling the mechanisms corresponding to the own vehicle's steering control and speed control (engine control, brake control), namely the steering control unit, engine control unit, and brake control unit, so as to keep the distance between the detected vehicle and the own vehicle constant while maintaining a constant traveling speed.

-Light distribution control processing-
Next, the flow of the light distribution control process performed by the vehicle control system 100 will be described with reference to FIG. 3. The vehicle control system 100 (the image processing device 1 and the vehicle control device 2) repeatedly executes the flowchart of FIG. 3 at a predetermined control cycle.

When the process starts, in step S1 the image acquisition unit 12 sets the exposure time of the next image on the camera 11 and performs imaging at a predetermined timing to acquire a color image (captured image).

In step S2, the luminance calculation formula changing unit 13 executes the luminance calculation formula determination process.
The luminance calculation formula changing unit 13 of this embodiment changes the luminance calculation formula according to the type of object the detection unit 15 is to detect. FIG. 4 is a flowchart of the luminance calculation formula determination process (change process) performed by the luminance calculation formula changing unit 13 according to this embodiment.

ここでは輝度計算部14が輝度画像の生成に通常利用する基準の輝度計算式を第1輝度計算式と称する。本実施形態における第1輝度計算式は次の式(1)で表される。式(1)におけるYは輝度値を示し,R,G,Bは各画素のR値,G値,B値を示す。すなわち第1輝度計算式は,撮像画像中の任意の画素のR値,G値,B値のそれぞれに所定の係数を乗じた値の合計で定義されている。R,G,Bの係数の合計は1となる。

Y=0.299R+0.587G+0.114B …式(1)

図4の処理が開始されると輝度計算式変更部13は,ステップS21において,検知部15から入力される検知結果を基に検知部15の検知対象物の種類を入力する。
Here, the reference luminance calculation formula normally used by the luminance calculation unit 14 for generating a luminance image is referred to as a first luminance calculation formula. The first luminance calculation formula in this embodiment is represented by the following formula (1). In the equation (1), Y indicates the luminance value, and R, G, and B indicate the R value, G value, and B value of each pixel. That is, the first luminance calculation formula is defined as the sum of the values obtained by multiplying each of the R value, G value, and B value of any pixel in the captured image by a predetermined coefficient. The sum of the coefficients of R, G, and B is 1.

Y = 0.299R + 0.587G + 0.114B ... Equation (1)
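As an illustration only (the patent specifies no implementation), a weighted-sum formula such as equation (1) could be applied per pixel as in the following sketch; the function name, nested-list image layout, and 8-bit rounding/clipping are assumptions, not taken from the document.

```python
# Illustrative sketch, not from the patent: apply a weighted-sum luminance
# formula such as equation (1) to an RGB image given as rows of (R, G, B)
# tuples. Coefficients default to those of equation (1).
def luminance_image(rgb, coeffs=(0.299, 0.587, 0.114)):
    """Y = cr*R + cg*G + cb*B per pixel, rounded and clipped to 0..255."""
    cr, cg, cb = coeffs
    return [[min(255, round(cr * r + cg * g + cb * b)) for (r, g, b) in row]
            for row in rgb]
```

With equation (1), a white pixel keeps full luminance while a pure-blue pixel drops to about 29, reflecting the small B coefficient; passing different coefficients reproduces the other formulas discussed below.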

When the process of FIG. 4 is started, the luminance calculation formula changing unit 13 inputs the type of the detection object of the detection unit 15 based on the detection result input from the detection unit 15 in step S21.

ステップS22では,輝度計算式変更部13は,ステップS21の検知対象物が対向車のヘッドライト(白色系発光体)か否かを判定する。ここで検知対象物が対向車のヘッドライトであればステップS23に進み,それ以外であればステップS24に進む。 In step S22, the luminance calculation formula changing unit 13 determines whether or not the detection target in step S21 is the headlight (white light emitter) of the oncoming vehicle. Here, if the object to be detected is the headlight of an oncoming vehicle, the process proceeds to step S23, otherwise the process proceeds to step S24.

ステップS23では,輝度計算式変更部13は,輝度計算式として第2輝度計算式を設定する。第2輝度計算式は,第1輝度計算式よりもB値の係数が大きく,第1輝度計算式よりもR値とG値の係数が小さい。本実施形態における第2輝度計算式は次の式(2)で表される。ヘッドライトをはじめとする白色光に対する感度はR,G,B画素で異なり,「G>R>B」の関係にある。そのため,輝度画像の生成に第2輝度計算式を用いると第1輝度計算式を用いた場合よりも白とびが低減される。

Y=B …式(2)

ステップS24では,輝度計算式変更部13は,ステップS21の検知対象物が先行車のテールライト(赤色系発光体)か否かを判定する。ここで検知対象物が先行車のテールライトであればステップS25に進み,それ以外であればステップS26に進む。
In step S23, the luminance calculation formula changing unit 13 sets the second luminance calculation formula as the luminance calculation formula. The second luminance calculation formula has a larger B value coefficient than the first luminance calculation formula, and a smaller R value and G value coefficient than the first luminance calculation formula. The second luminance calculation formula in this embodiment is represented by the following formula (2). Sensitivity to white light such as headlights differs between R, G, and B pixels, and there is a relationship of "G>R>B". Therefore, when the second luminance calculation formula is used to generate the luminance image, overexposure is reduced as compared with the case where the first luminance calculation formula is used.

Y = B ... Equation (2)

In step S24, the luminance calculation formula changing unit 13 determines whether or not the detection target in step S21 is the tail light (red light emitter) of the preceding vehicle. Here, if the object to be detected is the tail light of the preceding vehicle, the process proceeds to step S25, otherwise the process proceeds to step S26.

ステップS25では,輝度計算式変更部13は,輝度計算式として第4輝度計算式を設定する。第4輝度計算式は,第1輝度計算式よりもR値の係数が大きく,第1輝度計算式よりもG値とB値の係数が小さい。本実施形態における第4輝度計算式は次の式(4)で表される。第1輝度計算式よりもR値の係数を大きくすることで赤色光に対する感度が向上する。このため,輝度画像の生成に第4輝度計算式を用いると第1輝度計算式を用いた場合よりも赤色系発光体の黒つぶれが低減される。 In step S25, the luminance calculation formula changing unit 13 sets the fourth luminance calculation formula as the luminance calculation formula. The fourth luminance calculation formula has a larger R-value coefficient than the first luminance calculation formula, and smaller G-value and B-value coefficients. The fourth luminance calculation formula in this embodiment is represented by the following formula (4). Making the R-value coefficient larger than in the first luminance calculation formula improves the sensitivity to red light. Therefore, when the fourth luminance calculation formula is used to generate the luminance image, crushed blacks (loss of dark detail) in the red light emitter are reduced compared with the case where the first luminance calculation formula is used.


Y=R …式(4)

ステップS26では,輝度計算式変更部13は,輝度計算式として第1輝度計算式を設定する。

Y = R ... Equation (4)

In step S26, the luminance calculation formula changing unit 13 sets the first luminance calculation formula as the luminance calculation formula.
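The decision flow of FIG. 4 (steps S21 to S26) amounts to a small dispatch on the detection-target type. The following is a sketch under the assumption that targets are passed as string labels; the labels, constant names, and function name are illustrative, not from the patent.

```python
# Illustrative sketch of the Fig. 4 flow: choose RGB coefficients for the
# luminance formula from the detection-target type. Labels are assumptions.
FORMULA_1 = (0.299, 0.587, 0.114)  # equation (1), default
FORMULA_2 = (0.0, 0.0, 1.0)        # equation (2), Y = B
FORMULA_4 = (1.0, 0.0, 0.0)        # equation (4), Y = R

def select_formula(target):
    if target == "oncoming_headlight":    # S22 yes -> S23
        return FORMULA_2
    if target == "preceding_taillight":   # S24 yes -> S25
        return FORMULA_4
    return FORMULA_1                      # otherwise -> S26
```

The returned coefficient triple can then be fed directly to whatever per-pixel luminance computation the luminance calculation unit 14 performs in step S3.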

上記のようにステップS23,S25,S26のいずれかで輝度計算式が決定されると,輝度計算式変更部13は図4の処理を終了し,図3のステップS3の処理が開始される。 When the luminance calculation formula is determined in any of steps S23, S25, and S26 as described above, the luminance calculation formula changing unit 13 ends the process of FIG. 4 and starts the process of step S3 of FIG.

ステップS3において,輝度計算部14は,輝度計算式変更部13がステップS2(すなわちステップS23,S25,S26のいずれか)で決定した輝度計算式を利用して画像取得部12がステップS1で取得したカラー画像中の全ての画素について輝度値Yを算出し,その算出した輝度値Yから輝度画像を生成する。生成された輝度画像は検知部15や輝度計算式変更部13に出力される。 In step S3, the luminance calculation unit 14 calculates the luminance value Y for every pixel of the color image acquired by the image acquisition unit 12 in step S1, using the luminance calculation formula determined by the luminance calculation formula changing unit 13 in step S2 (that is, in step S23, S25, or S26), and generates a luminance image from the calculated luminance values Y. The generated luminance image is output to the detection unit 15 and the luminance calculation formula changing unit 13.

ステップS4において,検知部15は,画像取得部12が取得したカラー画像の中から車両候補光点を検出する。具体的には,まず,カラー画像中で探索する色の条件を検知対象物の種類に応じて設定し,カラー画像を走査してその条件に該当する色の画素を検出する。次に,隣り合う検出画素を同一グループとして連結し,それらの画素群(グループ)が占める領域を光点とみなす。本ステップにより検知対象物の種類に適した複数の光点が検出され得る。 In step S4, the detection unit 15 detects the vehicle candidate light spot from the color image acquired by the image acquisition unit 12. Specifically, first, the color conditions to be searched for in the color image are set according to the type of the detection object, and the color image is scanned to detect the pixels of the color corresponding to the conditions. Next, adjacent detection pixels are connected as the same group, and the area occupied by those pixel groups (groups) is regarded as a light spot. By this step, a plurality of light spots suitable for the type of the object to be detected can be detected.
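The two operations of step S4 (scan for pixels matching a colour condition, then connect adjacent matches into light-spot regions) can be sketched as a simple connected-component grouping. The predicate, 4-connectivity, and nested-list image layout are assumptions for the example; the patent does not prescribe them.

```python
# Illustrative sketch of step S4: mark pixels that satisfy a colour condition,
# then merge 4-connected marked pixels into light-spot regions (groups).
def find_light_spots(img, condition):
    h, w = len(img), len(img[0])
    mask = [[condition(img[y][x]) for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    spots = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                seen[y][x] = True
                stack, spot = [(y, x)], []
                while stack:               # flood-fill one region
                    cy, cx = stack.pop()
                    spot.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                spots.append(spot)
    return spots
```

For a red target such as a tail light, the condition might be a predicate like `lambda p: p[0] > 200 and p[1] < 80 and p[2] < 80` (thresholds hypothetical); each returned group is one candidate light spot.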

ところで,ステップS4の車両候補光点の検出には,次のような課題がある。すなわち,先行車テールライト(図5の画像51の矩形内参照)が検知部15の検知対象物の場合,カラー画像中で赤色を探索することで光点を検出する。しかし,道路上の本来は橙色の反射板の画像(図5の画像52の矩形内参照)のエッジ部にテールライトと同じ赤色が出る場合があり,光点の赤色度合に基づく判定だけではテールライトと反射板の区別が難しい。そこで,本実施形態では続くステップS5の輝度による判定を行っている。 Incidentally, the detection of vehicle candidate light spots in step S4 has the following problem. When a preceding vehicle's tail light (see the rectangle in image 51 of FIG. 5) is the detection target of the detection unit 15, the light spot is detected by searching for red in the color image. However, the same red as a tail light may appear at the edge of the image of an originally orange reflector on the road (see the rectangle in image 52 of FIG. 5), making it difficult to distinguish a tail light from a reflector based only on the redness of the light spot. Therefore, in the present embodiment, the luminance-based judgment of the subsequent step S5 is performed.

ステップS5において,検知部15は,ステップS4で検出した各光点が車両ライト(対向車ヘッドライトまたは先行車テールライト)であるか否かを判定する。具体的には,光点の領域に含まれる画素のうち代表的なもの(例えばもっとも明るい画素)を選ぶ。そして,その画素の輝度値をステップS3で生成した輝度画像で確認し,その画素の輝度値が検知対象物ごとに定められた所定の閾値以上であるか否かを判定する。その画素の輝度値が所定の閾値以上である場合にはステップS6に進み,輝度値が所定の閾値未満の場合にはステップS7へ進む。 In step S5, the detection unit 15 determines whether or not each light spot detected in step S4 is a vehicle light (oncoming vehicle headlight or preceding vehicle taillight). Specifically, a representative pixel (for example, the brightest pixel) included in the area of the light spot is selected. Then, the luminance value of the pixel is confirmed by the luminance image generated in step S3, and it is determined whether or not the luminance value of the pixel is equal to or more than a predetermined threshold value determined for each detection object. If the luminance value of the pixel is equal to or greater than a predetermined threshold value, the process proceeds to step S6, and if the luminance value is less than the predetermined threshold value, the process proceeds to step S7.
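Step S5 can be sketched as the check below: pick the brightest pixel of a spot's region on the luminance image and compare it with a per-target threshold. The function name and the threshold value used in the usage note are placeholders, not values from the patent.

```python
# Illustrative sketch of step S5: take a representative (brightest) pixel of
# a light spot on the luminance image and compare it with a threshold
# determined per detection target.
def is_vehicle_light(spot_pixels, lum_img, threshold):
    peak = max(lum_img[y][x] for (y, x) in spot_pixels)
    return peak >= threshold  # True -> step S6 (shield), False -> step S7
```

For example, a spot whose peak luminance on the (hypothetical) luminance image reaches 200 would pass a threshold of 150 and be treated as a vehicle light in step S6.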

ステップS6において,検知部15は,ステップS5で輝度値の判定対象とした光点に対し,当該光点は車両ライト(対向車ヘッドライトまたは先行車テールライト)であり,その車両ライトに係る車両を自車ヘッドライトの遮光対象とする旨の情報を付加し,配光制御部16へ出力する。 In step S6, for each light spot whose luminance value was judged in step S5, the detection unit 15 attaches information indicating that the light spot is a vehicle light (oncoming vehicle headlight or preceding vehicle tail light) and that the vehicle associated with that light is to be shielded from the own vehicle's headlights, and outputs the result to the light distribution control unit 16.

ステップS7において,検知部15は,ステップS5で輝度値の判定対象とした光点に対し,当該光点は車両ライトではなく,その車両ライトに係る車両を自車ヘッドライトの遮光対象としない旨の情報を付加し,配光制御部16へ出力する。 In step S7, for each light spot whose luminance value was judged in step S5, the detection unit 15 attaches information indicating that the light spot is not a vehicle light and that the vehicle associated with it is not to be shielded from the own vehicle's headlights, and outputs the result to the light distribution control unit 16.

ステップS8において,配光制御部16は,ステップS6またはS7で得た光点情報に基づいて,自車ヘッドライトの遮光領域を決定し,その遮光領域が自車ヘッドライトの照射領域から除外されるように自車ヘッドライトの配光制御を行う。 In step S8, the light distribution control unit 16 determines the light-shielding area of the own vehicle's headlights based on the light spot information obtained in step S6 or S7, and controls the light distribution of the own vehicle's headlights so that the light-shielding area is excluded from their irradiation area.

以上により,車両制御システム100による配光制御処理の1サイクルが完了する。車両制御システム100は所定の制御周期だけ待機した後,ステップS1から一連の処理を再び実行する。 As described above, one cycle of the light distribution control process by the vehicle control system 100 is completed. The vehicle control system 100 waits for a predetermined control cycle, and then re-executes a series of processes from step S1.

<作用・効果>
次に上記のように構成される車両制御システム100の作用・効果について説明する。
<Action / effect>
Next, the operation / effect of the vehicle control system 100 configured as described above will be described.

まず,本発明が解決しようとする課題について説明する。図3で説明したステップS5の輝度値に基づく判定では,自発光式の発光体(例えば,先行車テールライトや対向車ヘッドライト)と反射光式の発光体(例えば反射板)を判別することが難しい場合がある。図6は同じ露光条件におけるテールライトと反射板に係る画像51,52の画素についての距離と輝度値の関係を示した図である。この図に示すようにテールライトと反射板の光点の輝度値は近距離(図中の例では150m以下)ではともに最大値に達してしまい(輝度値が飽和してしまい),輝度値の大小では判別ができない。輝度値が飽和しない露光条件で撮像すれば改善されるが,車両制御システムの他の機能と画像を共有しているため露光条件を変えることは現実的に難しい。 First, the problem to be solved by the present invention will be described. In the judgment based on the luminance value in step S5 described with reference to FIG. 3, it can be difficult to distinguish a self-luminous light emitter (for example, a preceding vehicle's tail light or an oncoming vehicle's headlight) from a reflective light emitter (for example, a reflector). FIG. 6 shows the relationship between distance and luminance value for the pixels of images 51 and 52 of the tail light and the reflector under the same exposure conditions. As shown in this figure, the luminance values of the light spots of both the tail light and the reflector reach the maximum value (that is, the luminance value saturates) at short range (150 m or less in the example in the figure), so they cannot be distinguished by the magnitude of the luminance value. Imaging under an exposure condition in which the luminance value does not saturate would improve this, but since the image is shared with other functions of the vehicle control system, changing the exposure condition is practically difficult.

そこで本実施形態では,図4に示したフローチャートのように,検知部15の検知対象物の種類に応じて,輝度画像を作成する際に利用する輝度計算式を変更することとした。 Therefore, in the present embodiment, as shown in the flowchart shown in FIG. 4, the luminance calculation formula used when creating the luminance image is changed according to the type of the detection object of the detection unit 15.

具体的には,対向車ヘッドライトが検知対象物のときは,ステップS23で第2輝度計算式(Y=B)を使うことにした。図7は道路上の橙色の反射板の輝度画像及び輝度分布の一例を示す図であり,図中の左側に第1輝度計算式を使った場合を示し,図中の右側に第2輝度計算式を使った場合を示す。図7の左側に示すように,第1輝度計算式を利用した場合には輝度値の分布のピークの1つが最大値(255)に存在しており,輝度に基づくS5の処理では対向車ヘッドライトとの区別が難しいことが分かる。既述のとおりR,G,BのうちBは白色光に対する感度が一番低い。第1輝度計算式よりもB値の係数が大きく,R値とG値の係数が小さい第2輝度計算式を利用して輝度値を算出すると,図7の右側に示すように第1輝度計算式を利用したときよりも光点の輝度値が低い値で分布するようになり,図7の例では輝度値の分布を255未満の値(図中左側の値)に分散させることができる。これにより対向車ヘッドライトと反射板の輝度値に基づく区別が容易になり,白色系の自発光式の発光体である対向車ヘッドライトの検出に適した露光条件の撮像画像を得られない状況においてもコストを上げることなく当該発光体の検知率を向上できる。 Specifically, when an oncoming vehicle's headlight is the detection target, the second luminance calculation formula (Y = B) is used in step S23. FIG. 7 shows an example of the luminance image and luminance distribution of an orange reflector on the road; the left side of the figure shows the case where the first luminance calculation formula is used, and the right side shows the case where the second luminance calculation formula is used. As shown on the left side of FIG. 7, when the first luminance calculation formula is used, one of the peaks of the luminance-value distribution lies at the maximum value (255), so the reflector is hard to distinguish from an oncoming vehicle's headlight in the luminance-based processing of S5. As described above, among R, G, and B, B has the lowest sensitivity to white light. When the luminance value is calculated with the second luminance calculation formula, whose B-value coefficient is larger and whose R-value and G-value coefficients are smaller than those of the first luminance calculation formula, the luminance values of the light spots are distributed at lower values than with the first luminance calculation formula, as shown on the right side of FIG. 7; in the example of FIG. 7, the luminance-value distribution can be spread over values below 255 (the values on the left side in the figure). This makes it easier to distinguish an oncoming vehicle's headlight from a reflector based on the luminance value, and the detection rate of such white self-luminous light emitters can be improved without increasing cost, even in situations where a captured image with exposure conditions suited to detecting oncoming headlights cannot be obtained.

また同様に,先行車テールライトが検知対象物のときは,ステップS25で第4輝度計算式(Y=R)を使うこととした。第1輝度計算式よりもR値の係数が大きく,G値とB値の係数が小さい第4輝度計算式を利用すると,テールライトに係るR値の高い画素が高輝度となって強調されるとともに第1輝度計算式を利用したときよりも光点の輝度値が低い値で分布するようになる。これにより先行車テールライトと反射板の輝度値に基づく区別が容易になり,赤色系の自発光式の発光体である先行車テールライトの検知率を向上できる。 Similarly, when a preceding vehicle's tail light is the detection target, the fourth luminance calculation formula (Y = R) is used in step S25. With the fourth luminance calculation formula, whose R-value coefficient is larger and whose G-value and B-value coefficients are smaller than those of the first luminance calculation formula, pixels with high R values belonging to the tail light are emphasized with high luminance, while the luminance values of the light spots are distributed at lower values than with the first luminance calculation formula. This makes it easy to distinguish a preceding vehicle's tail light from a reflector based on the luminance value, and improves the detection rate of preceding vehicle tail lights, which are red self-luminous light emitters.

−定速走行・車間距離制御(ACC)−
上記で説明した輝度計算式変更部13による輝度計算式の変更は,例えばACCの実行中に,トンネルの入口や出口で撮像画像に黒つぶれや白とびが発生し遠方の先行車両を見失った場合(図2の画像71,72参照)にも利用可能である。図8はこの場合の輝度計算式変更部13による輝度計算式の決定処理(変更処理)のフローチャートの一例である。輝度計算式変更部13はこのフローチャートを所定の制御周期で繰り返し実行する。
-Constant speed driving / inter-vehicle distance control (ACC)-
The change of the luminance calculation formula by the luminance calculation formula changing unit 13 described above can also be used when, for example, during execution of ACC, crushed blacks or overexposure occur in the captured image at a tunnel entrance or exit and a distant preceding vehicle is lost (see images 71 and 72 in FIG. 2). FIG. 8 is an example of a flowchart of the luminance calculation formula determination process (change process) by the luminance calculation formula changing unit 13 in this case. The luminance calculation formula changing unit 13 repeatedly executes this flowchart at a predetermined control cycle.

図8の処理が開始されると輝度計算式変更部13は,ステップS31において,検知部15から入力される検知結果を基に,他車両が検知されている状態から検知不能の状態に変化したか否かを判定する。そして,検知不能に変化した場合にはステップS32に進み,変化しない場合(すなわち他車の検知が継続している場合)にはステップS40に進む。 When the process of FIG. 8 is started, in step S31 the luminance calculation formula changing unit 13 judges, based on the detection result input from the detection unit 15, whether the state has changed from one in which another vehicle is detected to one in which it cannot be detected. If the state has changed to undetectable, the process proceeds to step S32; if not (that is, if the other vehicle is still being detected), the process proceeds to step S40.

ステップS32では,輝度計算式変更部13は,他車両が検知されている状態でカメラ11が撮影したカラー画像を基に生成された輝度画像のうち可能な限り最新のものを読み込む。なお,輝度画像を読み込む代わりにカラー画像を読み込んでも良い。 In step S32, the luminance calculation formula changing unit 13 reads the most recent available luminance image generated from a color image taken by the camera 11 while the other vehicle was still being detected. A color image may be read instead of the luminance image.

ステップS33では,輝度計算式変更部13は,ステップS32で読み込んだ輝度画像において他車両の位置を基準にして所定領域(前方車両領域)Aを設定する。所定領域Aの設定方法の一例としては図2で説明したものがある。なお,ステップS32でカラー画像を読み込んだ場合にはその画像上の他車両の位置を基に所定領域Aを設定しても良い。 In step S33, the luminance calculation formula changing unit 13 sets a predetermined region (front vehicle region) A with reference to the position of another vehicle in the luminance image read in step S32. As an example of the setting method of the predetermined area A, there is the one described with reference to FIG. When the color image is read in step S32, the predetermined area A may be set based on the position of another vehicle on the image.

ステップS34では,輝度計算式変更部13は,ステップS32で読み込んだ輝度画像中の所定領域Aの輝度指標値を演算する。なお,ステップS32でカラー画像を読み込んだ場合には,そのカラー画像に対応する輝度画像上の所定領域Aの輝度指標値を演算する。 In step S34, the luminance calculation formula changing unit 13 calculates the luminance index value of the predetermined region A in the luminance image read in step S32. When the color image is read in step S32, the luminance index value of the predetermined region A on the luminance image corresponding to the color image is calculated.
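The luminance index of step S34 can be sketched as a simple region statistic. The patent does not fix the statistic; the mean over the rectangle used below is one plausible choice, and the inclusive-bounds convention and function name are assumptions.

```python
# Illustrative sketch of step S34: a luminance index for region A, here the
# mean luminance over a rectangle (top, left, bottom, right), bounds inclusive.
def region_luminance_index(lum_img, region):
    top, left, bottom, right = region
    vals = [lum_img[y][x]
            for y in range(top, bottom + 1)
            for x in range(left, right + 1)]
    return sum(vals) / len(vals)
```

The resulting index is what steps S35 and S37 compare against the thresholds Y1 and Y2.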

ステップS35では,輝度計算式変更部13は,所定領域Aの輝度指標値が第1閾値(Y1)以下かつ第2閾値(Y2)以上か否かを判定する。第1閾値(Y1)は,カラー画像(撮像画像)に白とびが発生したとみなすために規定された輝度の閾値であり,第2閾値(Y2)は,同様に黒つぶれが発生したとみなすために規定された輝度の閾値である。
ただし第1閾値(Y1)は第2閾値(Y2)より大きい値とする(すなわち,Y1>Y2)。所定領域Aの輝度指標値が第2閾値以上かつ第1閾値以下であれば適正な輝度であるとみなすことができる。所定領域Aの輝度指標値が第1閾値(Y1)以下かつ第2閾値(Y2)以上である場合にはステップS36に進み,そうでない場合にはステップS37に進む。
In step S35, the luminance calculation formula changing unit 13 judges whether the luminance index value of the predetermined region A is equal to or less than the first threshold value (Y1) and equal to or greater than the second threshold value (Y2). The first threshold value (Y1) is the luminance threshold defined for deeming that overexposure has occurred in the color image (captured image), and the second threshold value (Y2) is likewise the luminance threshold defined for deeming that crushed blacks have occurred.
However, the first threshold value (Y1) is set to a value larger than the second threshold value (Y2) (that is, Y1 > Y2). If the luminance index value of the predetermined region A is equal to or greater than the second threshold value and equal to or less than the first threshold value, the luminance can be regarded as appropriate. If the luminance index value of the predetermined region A is equal to or less than the first threshold value (Y1) and equal to or greater than the second threshold value (Y2), the process proceeds to step S36; otherwise, it proceeds to step S37.

ステップS36では,輝度計算式変更部13は,輝度計算式として第1輝度計算式を設定する。 In step S36, the luminance calculation formula changing unit 13 sets the first luminance calculation formula as the luminance calculation formula.

ステップS37では,輝度計算式変更部13は,所定領域Aの輝度指標値が第1閾値(Y1)より大きいか否かを判定する。所定領域Aの輝度指標値が第1閾値(Y1)より大きい場合にはステップS38に進み,そうでない場合(すなわち,所定領域Aの輝度指標値が第2閾値(Y2)より小さい場合)にはステップS39に進む。 In step S37, the luminance calculation formula changing unit 13 judges whether the luminance index value of the predetermined region A is larger than the first threshold value (Y1). If it is larger than the first threshold value (Y1), the process proceeds to step S38; otherwise (that is, if the luminance index value of the predetermined region A is smaller than the second threshold value (Y2)), the process proceeds to step S39.

ステップS38では,輝度計算式変更部13は,輝度計算式として第2輝度計算式を設定する。 In step S38, the luminance calculation formula changing unit 13 sets the second luminance calculation formula as the luminance calculation formula.

ステップS39では,輝度計算式変更部13は,輝度計算式として第3輝度計算式を設定する。第3輝度計算式は,第1輝度計算式よりもG値の係数が大きく,第1輝度計算式よりもR値とB値の係数が小さい。本実施形態における第3輝度計算式は次の式(3)で表される。先述のように白色光に対する感度はR,G,B画素で異なり,「G>R>B」の関係にある。そのため,輝度画像の生成に第3輝度計算式を用いると第1輝度計算式を用いた場合よりも黒つぶれが低減される。
Y=G …式(3)
ステップS40では,輝度計算式変更部13は,輝度計算式として第1輝度計算式を設定する。
In step S39, the luminance calculation formula changing unit 13 sets the third luminance calculation formula as the luminance calculation formula. The third luminance calculation formula has a larger G value coefficient than the first luminance calculation formula, and a smaller R value and B value coefficient than the first luminance calculation formula. The third luminance calculation formula in this embodiment is represented by the following formula (3). As described above, the sensitivity to white light differs between R, G, and B pixels, and the relationship is "G>R>B". Therefore, when the third luminance calculation formula is used to generate the luminance image, the blackout is reduced as compared with the case where the first luminance calculation formula is used.
Y = G ... Equation (3)
In step S40, the luminance calculation formula changing unit 13 sets the first luminance calculation formula as the luminance calculation formula.
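The branch structure of FIG. 8 (steps S31 and S35 to S40) can be summarized as the sketch below. The concrete threshold values and the string labels are placeholders; the patent only requires Y1 > Y2.

```python
# Illustrative sketch of the Fig. 8 branch: choose the luminance formula from
# the region-A luminance index and the thresholds Y1 > Y2 (values assumed).
def select_formula_by_exposure(index, vehicle_lost, y1=200, y2=50):
    if not vehicle_lost:       # S31 no -> S40: detection continuing
        return "formula_1"
    if y2 <= index <= y1:      # S35 yes -> S36: exposure adequate
        return "formula_1"
    if index > y1:             # S37 yes -> S38: overexposure (tunnel exit)
        return "formula_2"     # Y = B
    return "formula_3"         # S39: crushed blacks (tunnel entrance), Y = G
```

Note that the default first formula is returned both when no vehicle has been lost and when the region-A index lies in the adequate range [Y2, Y1].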

<作用・効果>
上記のように構成された輝度計算式変更部13が決定した輝度計算式を利用して輝度画像を生成した場合の作用・効果について説明する。
<Action / effect>
The action / effect when a luminance image is generated by using the luminance calculation formula determined by the luminance calculation formula changing unit 13 configured as described above will be described.

まず,トンネルの出口でカラー画像(撮像画像)に白とびが発生し遠方の他車両を見失ったときには,輝度計算式変更部13はステップS38で第2輝度計算式を輝度計算式に設定し,輝度計算部14はこの第2輝度計算式を利用して輝度画像を生成する。この輝度画像では第2輝度計算式によって白とびが低減されているので,カラー画像中で見失った車両を検知できる可能性が向上する。これにより白とび発生時の他車両の検出に適した露光条件の撮像画像を得られない状況においてもコストを上げることなく他車両の検知率を向上できる。 First, when the color image (captured image) is overexposed at the exit of the tunnel and the other vehicle in the distance is lost, the luminance calculation formula changing unit 13 sets the second luminance calculation formula to the luminance calculation formula in step S38. The luminance calculation unit 14 uses this second luminance calculation formula to generate a luminance image. In this luminance image, overexposure is reduced by the second luminance calculation formula, so that the possibility of detecting the lost vehicle in the color image is improved. As a result, the detection rate of other vehicles can be improved without increasing the cost even in a situation where an image captured under exposure conditions suitable for detecting other vehicles when overexposure occurs cannot be obtained.

また,トンネルの入口でカラー画像(撮像画像)に黒つぶれが発生し遠方の他車両を見失ったときには,輝度計算式変更部13はステップS39で第3輝度計算式を輝度計算式に設定し,輝度計算部14はこの第3輝度計算式を利用して輝度画像を生成する。この輝度画像では第3輝度計算式によって黒つぶれが低減されているので,カラー画像中で見失った車両を検知できる可能性が向上する。これにより黒つぶれ発生時の他車両の検出に適した露光条件の撮像画像を得られない状況においてもコストを上げることなく他車両の検知率を向上できる。 Further, when the color image (captured image) is blacked out at the entrance of the tunnel and the other vehicle in the distance is lost, the luminance calculation formula changing unit 13 sets the third luminance calculation formula to the luminance calculation formula in step S39. The luminance calculation unit 14 uses this third luminance calculation formula to generate a luminance image. In this luminance image, the blackout is reduced by the third luminance calculation formula, so that the possibility of detecting the lost vehicle in the color image is improved. As a result, the detection rate of other vehicles can be improved without increasing the cost even in a situation where an image captured under exposure conditions suitable for detecting another vehicle when blackout occurs cannot be obtained.

なお,白とびや黒つぶれが発生しない場合や,他車両を見失っていない場合には,通常と同じ第1輝度計算式が設定されるので,輝度計算式の変更に伴い不都合は生じない。 If overexposure or underexposure does not occur, or if other vehicles are not lost, the same first luminance calculation formula as usual is set, so that no inconvenience occurs due to the change in the luminance calculation formula.

−変形例1−
ところで,図8の例では,見失った車両の位置を基準に所定領域Aを設定したが,その車両の位置とは直接的に無関係に輝度画像の中央部の予め定められた位置に予め定められた大きさの矩形を所定領域として設定し,輝度計算式を決定しても良い。図9にこの場合の輝度計算式変更部13による輝度計算式の決定処理(変更処理)のフローチャートの一例を示す。輝度計算式変更部13はこのフローチャートを所定の制御周期で繰り返し実行する。図8と同じ処理には同じ符号を付して説明は省略する。
-Modification example 1-
By the way, in the example of FIG. 8, the predetermined region A was set based on the position of the lost vehicle, but the luminance calculation formula may instead be determined by setting, as the predetermined region, a rectangle of a predetermined size at a predetermined position in the central part of the luminance image, independently of the position of the vehicle. FIG. 9 shows an example of a flowchart of the luminance calculation formula determination process (change process) by the luminance calculation formula changing unit 13 in this case. The luminance calculation formula changing unit 13 repeatedly executes this flowchart at a predetermined control cycle. The same processing as in FIG. 8 is designated by the same reference numerals, and its description is omitted.

図9の処理が開始されると輝度計算式変更部13は,ステップS41において,カメラ11が撮影したカラー画像を基に生成された輝度画像のうち可能な限り最新のものを読み込む。なお,輝度画像を読み込む代わりにカラー画像を読み込んでも良い。 When the process of FIG. 9 is started, in step S41 the luminance calculation formula changing unit 13 reads the most recent available luminance image generated from a color image taken by the camera 11. A color image may be read instead of the luminance image.

ステップS42では,輝度計算式変更部13は,ステップS41で読み込んだ輝度画像中の予め定められた所定領域Aの輝度指標値を演算する。なお,ステップS41でカラー画像を読み込んだ場合には,そのカラー画像に対応する輝度画像上の所定領域Aの輝度指標値を演算する。 In step S42, the luminance calculation formula changing unit 13 calculates the luminance index value of the predetermined predetermined region A in the luminance image read in step S41. When the color image is read in step S41, the luminance index value of the predetermined region A on the luminance image corresponding to the color image is calculated.

ステップS35以降の処理については図8と同じなので説明は省略する。 Since the processing from step S35 onward is the same as in FIG. 8, its description is omitted.

このように輝度計算式変更部13を構成しても,白とびや黒つぶれ発生時の他車両の検出に適した露光条件の撮像画像を得られない状況においてもコストを上げることなく他車両の検知率を向上できる。 Even when the luminance calculation formula changing unit 13 is configured in this way, the detection rate of other vehicles can be improved without increasing cost, even in situations where a captured image with exposure conditions suitable for detecting other vehicles cannot be obtained when overexposure or crushed blacks occur.

−変形例2−
なお,図8,9のフローチャートは図4のフローチャートと組合せても良い。図10にこの場合の輝度計算式変更部13による輝度計算式の決定処理(変更処理)のフローチャートの一例を示す。図10ではステップS31でNOと判定された場合に図4のステップS21に進むように構成されている。このように輝度計算式変更部13を構成しても,所定の検知対象物の検出に適した露光条件の撮像画像を得られない状況においてもコストを上げることなく当該所定の検知対象物の検知率を向上できる。なお,図10では図8と図4のフローチャートを組み合わせた例を示したが,図9と図4のフローチャートを組合せても良いことは言うまでもない。
-Modification example 2-
The flowcharts of FIGS. 8 and 9 may also be combined with the flowchart of FIG. 4. FIG. 10 shows an example of a flowchart of the luminance calculation formula determination process (change process) by the luminance calculation formula changing unit 13 in this case. In FIG. 10, when the judgment in step S31 is NO, the process proceeds to step S21 of FIG. 4. Even when the luminance calculation formula changing unit 13 is configured in this way, the detection rate of a given detection target can be improved without increasing cost, even in situations where a captured image with exposure conditions suitable for detecting that target cannot be obtained. Although FIG. 10 shows an example combining the flowcharts of FIGS. 8 and 4, it goes without saying that the flowcharts of FIGS. 9 and 4 may be combined instead.

最後に各実施形態で利用した4つの輝度計算式についてまとめる。図11は各輝度計算式の内容と各輝度計算式が選択される条件についてまとめた表である。 Finally, the four luminance calculation formulas used in each embodiment are summarized. FIG. 11 is a table summarizing the contents of each luminance calculation formula and the conditions under which each luminance calculation formula is selected.

この図に示すように,白とび低減のために利用される第2輝度計算式(Y=B)が選択される条件としては,例えば,所定領域(前方車両領域)Aの輝度指標値が第1閾値(Y1)より高い場合,先行車両が撮像画像の白とびにより検出不能となった場合(例えば,トンネル出口),検知部15の検知対象物が対向車のヘッドライトである場合(すなわち,ヘッドライトと反射板の判別が行われる場合)がある。 As shown in this figure, the conditions under which the second luminance calculation formula (Y = B), used to reduce overexposure, is selected include, for example: when the luminance index value of the predetermined region (front vehicle region) A is higher than the first threshold value (Y1); when the preceding vehicle has become undetectable due to overexposure of the captured image (for example, at a tunnel exit); and when the detection target of the detection unit 15 is an oncoming vehicle's headlight (that is, when headlights are being distinguished from reflectors).

また,黒つぶれ低減のために利用される第3輝度計算式(Y=G)が選択される条件としては,例えば,所定領域(前方車両領域)Aの輝度指標値が第2閾値(Y2)より低い場合,先行車両が撮像画像の黒つぶれにより検出不能となった場合(例えば,トンネル入口や陸橋の下)がある。 Likewise, the conditions under which the third luminance calculation formula (Y = G), used to reduce crushed blacks, is selected include, for example: when the luminance index value of the predetermined region (front vehicle region) A is lower than the second threshold value (Y2); and when the preceding vehicle has become undetectable due to crushed blacks in the captured image (for example, at a tunnel entrance or under an overpass).

また,暗い赤色灯検知のために利用される第4輝度計算式(Y=R)が選択される条件としては,例えば,検知部15の検知対象が遠方(例えば自車から500m以上離れた距離)の走行車両のテールライトの場合,検知部15の検知対象が近距離(例えば自車から50m以内の距離)にある走行車両のテールライトの場合(前者の場合よりも露光条件が暗い場合)がある。 The conditions under which the fourth luminance calculation formula (Y = R), used to detect dim red lights, is selected include, for example: when the detection target of the detection unit 15 is the tail light of a vehicle traveling far away (for example, 500 m or more from the own vehicle); and when the detection target is the tail light of a vehicle at short range (for example, within 50 m of the own vehicle) under an exposure condition darker than in the former case.

なお,第1輝度計算式における係数は一例に過ぎず,例えば小数点第2位を四捨五入して「Y=0.3R+0.6G+0.1B」としても良い。 The coefficients in the first luminance calculation formula are merely an example; for example, they may be rounded to one decimal place to obtain "Y = 0.3R + 0.6G + 0.1B".

−その他−
本発明は,上記の実施の形態に限定されるものではなく,その要旨を逸脱しない範囲内の様々な変形例が含まれる。例えば,本発明は,上記の実施の形態で説明した全ての構成を備えるものに限定されず,その構成の一部を削除したものも含まれる。また,ある実施の形態に係る構成の一部を,他の実施の形態に係る構成に追加又は置換することが可能である。
− Other −
The present invention is not limited to the above-described embodiment, and includes various modifications within a range that does not deviate from the gist thereof. For example, the present invention is not limited to the one including all the configurations described in the above-described embodiment, and includes the one in which a part of the configurations is deleted. Further, it is possible to add or replace a part of the configuration according to one embodiment with the configuration according to another embodiment.

The configurations of the processing device 1 and the control device 2, and the functions and processing of those configurations, may be realized partly or entirely in hardware (for example, by designing integrated-circuit logic that performs each function). The configurations of devices 1 and 2 may also be realized as a program (software) that, when read and executed by an arithmetic processing unit (for example, a CPU), implements each function of the device. Information for such a program can be stored in, for example, semiconductor memory (flash memory, SSD, etc.), a magnetic storage device (hard disk drive, etc.), or a recording medium (magnetic disk, optical disc, etc.).

In the description of each embodiment above, only the control lines and information lines considered necessary for the explanation are shown; not all control lines and information lines of an actual product are necessarily shown. In practice, almost all components may be considered to be interconnected.

1 ... image processing device, 11 ... camera, 12 ... image acquisition unit, 13 ... luminance calculation formula changing unit, 14 ... luminance calculation unit, 15 ... vehicle detection unit, 16 ... light distribution control unit, 17 ... constant-speed travel / inter-vehicle distance control unit, 100 ... vehicle control system

Claims (11)

1. An image processing system comprising:
an imaging device; and
an image processing device having a luminance calculation unit that generates a luminance image from an image captured by the imaging device using a luminance calculation formula, and a detection unit that detects a predetermined object on the basis of the luminance image,
wherein the image processing device further comprises a luminance calculation formula changing unit that changes the luminance calculation formula used by the luminance calculation unit according to a luminance index value of a predetermined region in the luminance image or a type of the predetermined object detected by the detection unit.
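The claimed switching behavior can be sketched as follows. The concrete thresholds and the per-purpose formulas (Y = B against overexposure, Y = G against blackout, Y = R for dark red lights) follow the examples given in the description; the function and constant names are illustrative assumptions, not the patent's implementation:

```python
# Illustrative sketch of the luminance calculation formula changing unit.
# Threshold values Y1 (first) and Y2 (second) and the per-purpose formulas
# follow the examples in the description; all names are hypothetical.
FIRST_THRESHOLD = 200   # Y1: above this, the region is overexposed
SECOND_THRESHOLD = 50   # Y2: below this, the region is blacked out

def first_formula(r, g, b):    # normal case: Y = 0.3R + 0.6G + 0.1B
    return 0.3 * r + 0.6 * g + 0.1 * b

def second_formula(r, g, b):   # overexposure reduction, B-weighted (Y = B)
    return float(b)

def third_formula(r, g, b):    # blackout reduction, G-weighted (Y = G)
    return float(g)

def fourth_formula(r, g, b):   # dark red lights, R-weighted (Y = R)
    return float(r)

def select_formula(region_luminance_index, target_type=None):
    """Pick the luminance formula from the luminance index value of the
    predetermined region, or from the type of the detection target."""
    if target_type == "white_emitter":   # e.g. oncoming headlights
        return second_formula
    if target_type == "red_emitter":     # e.g. distant tail lights
        return fourth_formula
    if region_luminance_index > FIRST_THRESHOLD:
        return second_formula            # reduce overexposure
    if region_luminance_index < SECOND_THRESHOLD:
        return third_formula             # reduce blackout
    return first_formula                 # Y2 <= index <= Y1: normal case

assert select_formula(128) is first_formula
assert select_formula(230) is second_formula
assert select_formula(30) is third_formula
assert select_formula(128, "red_emitter") is fourth_formula
```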
2. The image processing system according to claim 1, wherein, when the luminance index value of the predetermined region in the luminance image exceeds a first threshold value, the luminance calculation formula changing unit changes the luminance calculation formula to a second luminance calculation formula that reduces overexposure of the luminance image more than a first luminance calculation formula, the first luminance calculation formula being the formula the luminance calculation unit uses when the luminance index value of the predetermined region is equal to or greater than a second threshold value smaller than the first threshold value and equal to or less than the first threshold value.
3. The image processing system according to claim 1, wherein, when the state changes from one in which the predetermined object is detected by the detection unit to one in which it cannot be detected, and the luminance index value of the predetermined region in the luminance image exceeds a first threshold value, the luminance calculation formula changing unit changes the luminance calculation formula to a second luminance calculation formula that reduces overexposure of the luminance image more than a reference luminance calculation formula, the reference luminance calculation formula being the formula the luminance calculation unit uses when the luminance index value of the predetermined region is equal to or greater than a second threshold value smaller than the first threshold value and equal to or less than the first threshold value.
4. The image processing system according to claim 1, wherein, when the predetermined object of the detection unit is a white-type light emitter, the luminance calculation formula changing unit changes the luminance calculation formula to a second luminance calculation formula that reduces overexposure of the luminance image more than a first luminance calculation formula that the luminance calculation unit normally uses as the luminance calculation formula.
5. The image processing system according to claim 2, wherein
the captured image is defined in an RGB color model,
the first luminance calculation formula is defined as the sum of the R value, G value, and B value of an arbitrary pixel in the captured image each multiplied by a predetermined coefficient, and
the second luminance calculation formula has a larger coefficient for the B value than the first luminance calculation formula and smaller coefficients for the R value and the G value than the first luminance calculation formula.
6. The image processing system according to claim 1, wherein, when the luminance index value of the predetermined region in the luminance image is less than a second threshold value, the luminance calculation formula changing unit changes the luminance calculation formula to a third luminance calculation formula that reduces blackout of the luminance image more than a first luminance calculation formula, the first luminance calculation formula being the formula the luminance calculation unit uses when the luminance index value of the predetermined region is equal to or greater than the second threshold value and equal to or less than a first threshold value larger than the second threshold value.
7. The image processing system according to claim 1, wherein, when the state changes from one in which the predetermined object is detected by the detection unit to one in which it cannot be detected, and the luminance index value of the predetermined region in the luminance image is less than a second threshold value, the luminance calculation formula changing unit changes the luminance calculation formula to a third luminance calculation formula that reduces blackout of the luminance image more than a first luminance calculation formula, the first luminance calculation formula being the formula the luminance calculation unit uses when the luminance index value of the predetermined region is equal to or greater than the second threshold value and equal to or less than a first threshold value larger than the second threshold value.
8. The image processing system according to claim 6, wherein
the captured image is defined in an RGB color model,
the first luminance calculation formula is defined as the sum of the R value, G value, and B value of an arbitrary pixel in the captured image each multiplied by a predetermined coefficient, and
the third luminance calculation formula has a larger coefficient for the G value than the first luminance calculation formula and smaller coefficients for the R value and the B value than the first luminance calculation formula.
9. The image processing system according to claim 1, wherein, when the predetermined object of the detection unit is a red-type light emitter, the luminance calculation formula changing unit changes the luminance calculation formula to a fourth luminance calculation formula that reduces blackout of the luminance image more than a first luminance calculation formula that the luminance calculation unit normally uses as the luminance calculation formula.
10. The image processing system according to claim 9, wherein
the captured image is defined in an RGB color model,
the first luminance calculation formula is defined as the sum of the R value, G value, and B value of an arbitrary pixel in the captured image each multiplied by a predetermined coefficient, and
the fourth luminance calculation formula has a larger coefficient for the R value than the first luminance calculation formula and smaller coefficients for the G value and the B value than the first luminance calculation formula.
11. A light distribution control system comprising:
an imaging device;
an image processing device having a luminance calculation unit that generates a luminance image from an image captured by the imaging device using a luminance calculation formula, and a detection unit that detects a predetermined object on the basis of the luminance image; and
a light distribution control unit that performs light distribution control based on a light-shielding region of the own-vehicle headlights determined on the basis of a detection result of the detection unit,
wherein the image processing device further comprises a luminance calculation formula changing unit that changes the luminance calculation formula used by the luminance calculation unit according to a luminance index value of a predetermined region in the luminance image or a type of the predetermined object detected by the detection unit.
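One way the light distribution control unit of claim 11 could derive a light-shielding region from detection results is sketched below; the bounding-box representation, the margin, and the merging policy are assumptions for illustration only, not the patent's method:

```python
# Hypothetical sketch of the light distribution control unit: the high
# beam is shaded over the horizontal image span of each detected vehicle
# so that other drivers are not dazzled.
def shading_regions(detections, margin=10):
    """Return merged (left, right) pixel spans to shade, one per group
    of overlapping detections, each widened by a safety margin.
    `detections` is a list of (x, y, w, h) bounding boxes."""
    spans = sorted((x - margin, x + w + margin) for (x, y, w, h) in detections)
    merged = []
    for left, right in spans:
        if merged and left <= merged[-1][1]:
            # Overlapping spans are merged into one shaded region.
            merged[-1] = (merged[-1][0], max(merged[-1][1], right))
        else:
            merged.append((left, right))
    return merged

# Two nearby vehicles merge into one shaded span; a distant one stays separate.
regions = shading_regions([(100, 40, 50, 30), (140, 42, 60, 28), (600, 45, 40, 25)])
assert regions == [(90, 210), (590, 650)]
```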
JP2019565778A 2018-01-17 2018-12-20 Image processing system and light distribution control system Active JP6894536B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2018005794 2018-01-17
JP2018005794 2018-01-17
PCT/JP2018/046920 WO2019142586A1 (en) 2018-01-17 2018-12-20 Image processing system and light distribution control system

Publications (2)

Publication Number Publication Date
JPWO2019142586A1 true JPWO2019142586A1 (en) 2020-12-17
JP6894536B2 JP6894536B2 (en) 2021-06-30

Family

ID=67300985

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2019565778A Active JP6894536B2 (en) 2018-01-17 2018-12-20 Image processing system and light distribution control system

Country Status (3)

Country Link
JP (1) JP6894536B2 (en)
DE (1) DE112018005975T5 (en)
WO (1) WO2019142586A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024121911A1 (en) * 2022-12-05 2024-06-13 日立Astemo株式会社 Image processing device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09311927A (en) * 1996-05-24 1997-12-02 De-Shisu:Kk Parked vehicle detection device and its method
JPH1196367A (en) * 1997-09-19 1999-04-09 Nagoya Denki Kogyo Kk Method and device for detecting parked vehicle
JP2003032669A (en) * 2001-07-11 2003-01-31 Hitachi Ltd On-vehicle image processing camera device
JP2004194993A (en) * 2002-12-19 2004-07-15 Pentax Corp Electronic endoscopic apparatus
JP2007018154A (en) * 2005-07-06 2007-01-25 Honda Motor Co Ltd Vehicle and lane mark recognition device
JP2012155612A (en) * 2011-01-27 2012-08-16 Denso Corp Lane detection apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6085522B2 (en) 2013-05-29 2017-02-22 富士重工業株式会社 Image processing device


Also Published As

Publication number Publication date
JP6894536B2 (en) 2021-06-30
WO2019142586A1 (en) 2019-07-25
DE112018005975T5 (en) 2020-08-06

Similar Documents

Publication Publication Date Title
US10084967B1 (en) Systems and methods for regionally controlling exposure time in high dynamic range imaging
JP6132412B2 (en) Outside environment recognition device
US10037473B2 (en) Vehicle exterior environment recognition apparatus
JP6211614B2 (en) Imaging apparatus, imaging method, and in-vehicle imaging system
JP3909691B2 (en) In-vehicle image processing device
US10121083B2 (en) Vehicle exterior environment recognition apparatus
US20060215882A1 (en) Image processing apparatus and method, recording medium, and program
US8395698B2 (en) Exposure determining device and image processing apparatus
JP6701253B2 (en) Exterior environment recognition device
JP6420650B2 (en) Outside environment recognition device
US9506859B2 (en) Method and device for determining a visual range in daytime fog
US20170011271A1 (en) Malfunction diagnosis apparatus
JP7241772B2 (en) Image processing device
WO2013168744A1 (en) Method and device for detecting vehicle light sources
US11303817B2 (en) Active sensor, object identification system, vehicle and vehicle lamp
JP6894536B2 (en) Image processing system and light distribution control system
JP2017188851A (en) Face imaging method for vehicle interior camera and vehicle interior camera
JP2007124676A (en) On-vehicle image processor
JP2021114762A (en) Low-light imaging system
JP5427744B2 (en) Image processing device
JPH11278182A (en) Fog status detection device for vehicle
JP6252657B2 (en) ADJUSTMENT DEVICE, ADJUSTMENT METHOD, AND PROGRAM
JP2020177340A (en) Image processing system
KR20160040335A (en) Low Power Vehicle Video Recorder with Infrared Lighting Functions for Vision Recognition Enhancement in Low Level Light Environment and Vehicle Video Recording Method thereof
JP7142131B1 (en) Lane detection device, lane detection method, and lane detection program

Legal Events

Date       Code   Title
20200522   A621   Written request for application examination
20210112   A131   Notification of reasons for refusal
20210308   A521   Request for written amendment filed
           TRDD   Decision of grant or rejection written
20210511   A01    Written decision to grant a patent or to grant a registration (utility model)
20210603   A61    First payment of annual fees (during grant procedure)
           R150   Certificate of patent or registration of utility model (ref document number 6894536, country JP)
           R250   Receipt of annual fees