JP2014146164A - Object detection apparatus - Google Patents


Info

Publication number: JP2014146164A
Application number: JP2013014315A
Authority: JP (Japan)
Other versions: JP5892079B2 (en)
Legal status: Granted; Active
Inventor: Hiroaki Shimizu (清水 宏明)
Assignee: Toyota Motor Corp
Priority: JP2013014315A, application filed by Toyota Motor Corp
Prior art keywords: pedestrian, area, region, extracted, exposure time

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide an object detection apparatus that detects an object with high accuracy even when a high-luminance area other than the object is present.

SOLUTION: An object detection apparatus 1 for detecting an object (for example, a pedestrian) from an image includes: imaging means 10 for capturing images under different exposure conditions; comparison means 26, 27 that compare characteristics of the shape of a high-luminance area (for example, the position of the center of gravity of the high-luminance area and the number of high-luminance areas) between object candidate areas located at the same image position and extracted from a plurality of images captured by the imaging means 10 under different exposure conditions; and determination means 28, 29 that determine whether the object candidate area is the object on the basis of the comparison result of the comparison means. The determination means determine that the object candidate area is a light-emitting object when, according to the comparison result, the change in the shape characteristics of the high-luminance area between the object candidate areas with different exposure conditions is small.

Description

The present invention relates to an object detection apparatus that detects an object from an image.

The driver of a vehicle must pay attention to pedestrians and other persons around the vehicle while driving. To reduce this burden on the driver, apparatuses for detecting pedestrians around the vehicle have been developed. For example, Patent Document 1 discloses extracting a high-luminance area from an image captured by a far-infrared camera as a pedestrian candidate area and detecting a pedestrian using the candidate area.

JP 2008-65757 A

When a pedestrian is detected on the basis of high-luminance areas in an image, light-emitting objects such as street lamps and store lighting at night also appear with high luminance; if the shape of such a light-emitting object resembles a pedestrian, the light-emitting object may be erroneously detected as a pedestrian.

Accordingly, an object of the present invention is to provide an object detection apparatus that can detect an object with high accuracy even when a high-luminance area other than the object is present.

An object detection apparatus according to the present invention is an object detection apparatus that detects an object from an image, comprising: imaging means capable of capturing images under different exposure conditions; comparison means for comparing characteristics of the shape of a high-luminance area between object candidate areas at the same image position, each extracted from one of a plurality of images captured by the imaging means under different exposure conditions; and determination means for determining whether the object candidate area is the object on the basis of the comparison result of the comparison means.

In this object detection apparatus, the imaging means captures images under a plurality of different exposure conditions, yielding a plurality of images with different exposure conditions. Examples of exposure conditions include the exposure time (shutter speed) and the lens aperture. When the exposure condition differs, the amount of light detected by the imaging means changes even when the same scene is imaged. The object detection apparatus extracts object candidate areas at the same image position from the plurality of images with different exposure conditions, extracts a high-luminance area from each of these object candidate areas, and compares the shape characteristics of the high-luminance areas between the candidate areas. A light-emitting object (a non-target object) such as a lamp emits light itself, which spreads radially from the emission center; even when the exposure condition is changed, the area extracted as the high-luminance area therefore hardly changes, and the change in the shape of the high-luminance area is small. A target object such as a pedestrian, in contrast, is bright because it reflects light from lamps and the like; when the exposure condition is changed, the area extracted as the high-luminance area changes, and the change in the shape of the high-luminance area is large. For example, as the exposure time is shortened, parts that reflect little light are no longer of high luminance, so the high-luminance area shrinks and eventually splits. The object detection apparatus can therefore determine whether an object candidate area is the target object on the basis of the comparison of the shape characteristics of the high-luminance areas between the object candidate areas with different exposure conditions. By comparing the shape characteristics of the high-luminance areas between object candidate areas extracted from images captured under a plurality of exposure conditions in this way, the object detection apparatus can detect the target object with high accuracy even when high-luminance areas other than the target object are present.
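The shrink-and-split behavior described above can be sketched with a toy one-dimensional model (not from the patent; the pixel values, exposure factors, and thresholds below are all invented for illustration). The sketch assumes sensor saturation as the physical reason a lamp's extent is stable: a lamp's radiance clips the sensor at every exposure, while a pedestrian's reflected light falls below the per-exposure threshold unevenly, so its run of bright pixels splits:

```python
SATURATION = 255    # pixel values clip here, as on a real sensor

def captured(radiance, exposure):
    """Simulated 1-D scanline: scene radiance times exposure, clipped."""
    return [min(SATURATION, int(r * exposure)) for r in radiance]

def high_runs(pixels, threshold):
    """Connected runs of above-threshold pixels ('high-luminance regions')."""
    runs, start = [], None
    for i, v in enumerate(pixels + [0]):       # sentinel closes a trailing run
        if v > threshold and start is None:
            start = i
        elif v <= threshold and start is not None:
            runs.append((start, i - 1))
            start = None
    return runs

lamp = [0, 900, 1000, 900, 0]         # self-luminous: radiance far above clip
pedestrian = [0, 60, 55, 60, 0]       # reflected light: dimmer and uneven

# (exposure factor, threshold): a dimmer image gets a lower threshold,
# like the first/second luminance thresholds of the embodiment below.
for exposure, threshold in [(4.0, 200), (2.0, 110), (1.0, 58)]:
    print(exposure,
          high_runs(captured(lamp, exposure), threshold),
          high_runs(captured(pedestrian, exposure), threshold))
```

At every exposure the lamp yields the single run (1, 3), while the pedestrian's single run splits into (1, 1) and (3, 3) once the exposure shortens.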

In the object detection apparatus of the present invention, the determination means determines that the object candidate area is a light-emitting object when, according to the comparison result of the comparison means, the change in the shape characteristics of the high-luminance area between the plurality of object candidate areas with different exposure conditions is small.

As described above, in the case of a light-emitting object such as a lamp, the area extracted as the high-luminance area hardly changes even when the exposure condition is changed, so the change in the shape of the high-luminance area is small. The object detection apparatus therefore compares the shape characteristics of the high-luminance areas between a plurality of object candidate areas with different exposure conditions and, when the change in those characteristics is small, can determine that the object candidate area is a light-emitting object. Consequently, an object candidate area that is not determined to be a light-emitting object can be determined to be the target object.

In the object detection apparatus of the present invention, the comparison means compares the positions of the centers of gravity of the high-luminance areas between a plurality of object candidate areas with different exposure conditions.

In the case of a light-emitting object such as a lamp, the brightness spreads radially from the emission center, and the spread area is extracted as the high-luminance area. Since the emission center always has the highest luminance, the position of the center of gravity of the high-luminance area does not change (or changes only slightly) even when the exposure condition is changed. In the case of a target object such as a pedestrian, on the other hand, the object is bright because it reflects light, and the reflecting area is extracted as the high-luminance area. Since parts that reflect more light have higher luminance, the position of the center of gravity of the high-luminance area changes when the exposure condition is changed. The object detection apparatus can therefore determine with high accuracy whether the shape characteristics of the high-luminance area have changed by comparing the positions of the centers of gravity of the high-luminance areas between object candidate areas with different exposure conditions.
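A minimal sketch of this centroid criterion, with invented pixel values (a higher binarization cutoff stands in for a shorter exposure): a radially symmetric emitter keeps its centroid fixed as the cutoff rises, while a one-sidedly lit reflector's centroid drifts toward the bright side:

```python
def centroid(image, threshold):
    """Centroid (row, col) of the pixels brighter than threshold."""
    pts = [(r, c) for r, row in enumerate(image)
           for c, v in enumerate(row) if v > threshold]
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))

# Emitter: brightness falls off symmetrically from the emission center.
emitter = [[10, 40, 10],
           [40, 90, 40],
           [10, 40, 10]]
# Reflector: lit from the left, so brightness is skewed to one side.
reflector = [[90, 40, 10],
             [90, 40, 10],
             [90, 40, 10]]

for thr in (30, 60):    # a higher cutoff stands in for a shorter exposure
    print(thr, centroid(emitter, thr), centroid(reflector, thr))
```

The emitter's centroid stays at (1.0, 1.0) under both cutoffs, while the reflector's moves from (1.0, 0.5) to (1.0, 0.0).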

In the object detection apparatus of the present invention, the comparison means compares the number of high-luminance areas between a plurality of object candidate areas with different exposure conditions.

In the case of a light-emitting object such as a lamp, as described above, the shape of the high-luminance area changes little even when the exposure condition is changed, so the high-luminance area hardly ever splits and the number of high-luminance areas also hardly changes. In the case of a target object such as a pedestrian, on the other hand, the shape of the high-luminance area changes greatly when the exposure condition is changed; as the shape changes, the area shrinks and may eventually split. The object detection apparatus can therefore determine with high accuracy whether the shape characteristics of the high-luminance area have changed by comparing the number of high-luminance areas between object candidate areas with different exposure conditions.
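Counting high-luminance areas amounts to counting connected components in the binarized candidate area. A minimal sketch with hypothetical masks (the patent does not prescribe the connectivity; 4-connectivity is assumed here): the long-exposure mask holds one connected blob, the short-exposure mask the same blob split in two:

```python
from collections import deque

def count_regions(mask):
    """Number of 4-connected regions of True cells in a binary mask."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for r in range(h):
        for c in range(w):
            if mask[r][c] and not seen[r][c]:
                count += 1                      # new region found; flood-fill it
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count

# Binarized candidate areas: one blob at the long exposure,
# the same blob split in two at the short exposure.
long_mask  = [[True,  True,  True],
              [False, True,  False],
              [True,  True,  True]]
short_mask = [[True,  True,  True],
              [False, False, False],
              [True,  True,  True]]
print(count_regions(long_mask), count_regions(short_mask))   # 1 2
```

A stable count across exposures points to an emitter; a rising count points to a reflecting object such as a pedestrian.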

According to the present invention, by comparing the shape characteristics of the high-luminance areas between object candidate areas extracted from images captured under a plurality of exposure conditions, the target object can be detected with high accuracy even when high-luminance areas other than the target object are present.

FIG. 1 is a configuration diagram of the pedestrian detection apparatus according to the present embodiment.
FIG. 2 shows an example of the processing steps when a pedestrian is determined: (a) is a pedestrian candidate area extracted from an image with a long exposure time; (b) is the high-luminance area extracted from the pedestrian candidate area in the image with the long exposure time; (c) is the high-luminance area extracted from the pedestrian candidate area in an image with an intermediate exposure time; and (d) is the high-luminance area extracted from the pedestrian candidate area in an image with a short exposure time.
FIG. 3 shows an example of the processing steps when a light-emitting object is determined: (a) is a pedestrian candidate area extracted from an image with a long exposure time; (b) is the high-luminance area extracted from the pedestrian candidate area in the image with the long exposure time; (c) is the high-luminance area extracted from the pedestrian candidate area in an image with an intermediate exposure time; and (d) is the high-luminance area extracted from the pedestrian candidate area in an image with a short exposure time.
FIG. 4 is a flowchart showing the flow of the pedestrian detection process in the ECU of FIG. 1.

Hereinafter, an embodiment of the object detection apparatus according to the present invention will be described with reference to the drawings. In the drawings, identical or corresponding elements are given the same reference numerals, and duplicate descriptions are omitted.

In the present embodiment, the object detection apparatus according to the present invention is applied to a pedestrian detection apparatus mounted on a vehicle. The pedestrian detection apparatus according to the present embodiment detects a pedestrian from an image captured by a visible-light camera and provides information on the detected pedestrian to various driving support devices. In particular, the present embodiment describes the pedestrian detection process of the pedestrian detection apparatus at night (it may also be applied in the daytime when the surroundings are dark, such as inside a tunnel). For daytime pedestrian detection, the method described in this embodiment may be applied, or any other well-known pedestrian detection method suited to daytime may be applied.

The pedestrian detection apparatus 1 according to the present embodiment will be described with reference to FIGS. 1 to 3. FIG. 1 is a configuration diagram of the pedestrian detection apparatus according to the present embodiment. FIG. 2 is an example of the processing steps when a pedestrian is determined. FIG. 3 is an example of the processing steps when a light-emitting object is determined.

The pedestrian detection apparatus 1 extracts pedestrian candidate areas from an image and detects a pedestrian on the basis of the high-luminance areas within the extracted pedestrian candidate areas. In particular, so as not to erroneously detect light-emitting objects such as street lamps and store lighting at night as pedestrians, the pedestrian detection apparatus 1 determines whether a pedestrian candidate area is a light-emitting object or a non-light-emitting object by comparing the shape characteristics of the high-luminance areas between pedestrian candidate areas at the same image position extracted from images captured under three different exposure conditions (exposure times). The pedestrian detection apparatus 1 comprises a visible light camera 10 and an ECU [Electronic Control Unit] 20.

The visible light camera 10 is a visible-light camera that images the area ahead of the host vehicle using a CCD [Charge Coupled Device], CMOS [Complementary Metal Oxide Semiconductor], or the like. It is mounted at a predetermined height at the center of the front of the host vehicle. The visible light camera 10 can capture images while varying the exposure condition; an example of such a condition is the exposure time (shutter speed). Since at least luminance information suffices, the camera may be either a color camera or a monochrome camera. At fixed time intervals, the visible light camera 10 images the area ahead of the host vehicle and transmits the captured image data to the ECU 20 as an image signal. In the present embodiment, the visible light camera 10 corresponds to the imaging means recited in the claims.

The ECU 20 is an electronic control unit comprising a CPU [Central Processing Unit], ROM [Read Only Memory], RAM [Random Access Memory], and the like. A pedestrian model database 21 is constructed in a predetermined storage area of the ECU 20. By loading various programs stored in the ROM into the RAM and executing them on the CPU, the ECU 20 implements an exposure control unit 22, a multiple-exposure image acquisition unit 23, a first pedestrian determination unit 24, a first high-luminance area extraction unit 25, a second high-luminance area extraction unit 26, a third high-luminance area extraction unit 27, a light-emitting object determination unit 28, and a second pedestrian determination unit 29. In the present embodiment, the second high-luminance area extraction unit 26 and the third high-luminance area extraction unit 27 correspond to the comparison means recited in the claims, and the light-emitting object determination unit 28 and the second pedestrian determination unit 29 correspond to the determination means recited in the claims.

The pedestrian model database 21 is a database that stores the pedestrian models used for pattern matching in the first pedestrian determination unit 24; it is constructed in a predetermined storage area of the ECU 20. Various pedestrian models are stored, for example: a walking pedestrian model, a running pedestrian model, a standing pedestrian model, a male pedestrian model, a female pedestrian model, an adult pedestrian model, a child pedestrian model, an elderly pedestrian model, a front-view pedestrian model, a rear-view pedestrian model, and a side-view pedestrian model, as well as pedestrian models combining these.

The exposure control unit 22 is a processing unit that controls imaging by the visible light camera 10 while varying the exposure condition (in particular, the exposure time). For bidirectional communication with the visible light camera 10, the exposure control unit 22 is connected to it via a bus such as IEEE 1394. At fixed time intervals, the exposure control unit 22 controls the exposure of the visible light camera 10 so that images are captured with three different exposure times (long exposure time > intermediate exposure time > short exposure time). The long exposure time is set so that a pedestrian in an urban area at night can be imaged sufficiently, and the intermediate and short exposure times are successively shorter; even the short exposure time is set so that the light of light-emitting objects such as street lamps and store lighting at night can be imaged sufficiently. The longer the exposure time, the brighter the resulting image.

The multiple-exposure image acquisition unit 23 is a processing unit that acquires the images captured by the visible light camera 10 under the exposure control of the exposure control unit 22. At fixed time intervals, the multiple-exposure image acquisition unit 23 acquires the image with the long exposure time (brightest), the image with the intermediate exposure time (intermediate brightness), and the image with the short exposure time (darkest) captured by the visible light camera 10.

The first pedestrian determination unit 24 is a processing unit that identifies pedestrian-like areas (pedestrian candidate areas) in the long-exposure (brightest) image acquired by the multiple-exposure image acquisition unit 23. A well-known general pedestrian recognition method is used for this determination, for example pattern matching against pedestrian models. When the multiple-exposure image acquisition unit 23 acquires the long-exposure image, the first pedestrian determination unit 24 performs pattern matching on it for each pedestrian model stored in the pedestrian model database 21 and, when there is a matching area, extracts a rectangular pedestrian candidate area. The pattern matching can be realized, for example, by the method disclosed in N. Dalal and B. Triggs, "Histograms of Oriented Gradients for Human Detection", CVPR, 2005.

FIG. 2(a) shows a pedestrian candidate area C1L extracted from the long-exposure (brightest) image after a pedestrian was determined to be pedestrian-like. This pedestrian is lit by store lighting or the like at night, and the light is reflected by white clothing and the like, making the pedestrian bright. In an image with a long exposure time, even reflected light yields a pedestrian-like shape, so pattern matching against the pedestrian models is possible. FIG. 3(a) shows a pedestrian candidate area C2L extracted from the long-exposure image after two light-emitting objects were determined to be pedestrian-like. The two light-emitting objects spread light radially from their emission centers, appearing bright in a circular shape and a rectangular shape, respectively. The combination of these two shapes resembled a pedestrian and was matched by the pattern matching against the pedestrian models.

The first high-luminance area extraction unit 25 is a processing unit that extracts high-luminance areas from each pedestrian candidate area extracted from the long-exposure (brightest) image by the first pedestrian determination unit 24 and calculates the position of the center of gravity of each high-luminance area. The first high-luminance area extraction unit 25 selects the pedestrian candidate areas extracted by the first pedestrian determination unit 24 one at a time. Within the selected pedestrian candidate area, it extracts the areas whose luminance is higher than a first luminance determination threshold (high-luminance areas) and calculates the position of the center of gravity of each extracted area. The first luminance determination threshold is a threshold for determining whether a pixel in a long-exposure image is of high luminance; it is set in advance, based on experiments and the like, in consideration of the brightness of images obtained with the long exposure time. As a method for this area extraction and center-of-gravity calculation, for example, each pixel in the pedestrian candidate area is binarized into pixels whose luminance is higher than the first luminance determination threshold and pixels whose luminance is at or below it, the areas consisting of pixels above the threshold are labeled using the run-length method, and the position of the center of gravity of each area with the same label is calculated.
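The binarize, label, and centroid procedure just described can be sketched as follows (a minimal pure-Python illustration; the pixel values and threshold are invented, and the patent does not fix these implementation details). Runs of above-threshold pixels are labeled row by row, runs touching a run in the previous row are merged with a small union-find, and a centroid is then computed per label:

```python
def label_runs(binary):
    """Run-length connected-component labeling (4-connectivity).

    Returns the label image and a dict mapping each label to the
    centroid (row, col) of its region.
    """
    parent = {}

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra

    height, width = len(binary), len(binary[0])
    labels = [[0] * width for _ in range(height)]
    next_label = 1
    prev_runs = []                       # (start, end, label) runs in the row above
    for r in range(height):
        runs, c = [], 0
        while c < width:
            if binary[r][c]:
                start = c
                while c < width and binary[r][c]:
                    c += 1
                end = c - 1
                run_label = None
                for ps, pe, pl in prev_runs:
                    if ps <= end and pe >= start:      # touches a run above
                        if run_label is None:
                            run_label = pl
                        else:
                            union(run_label, pl)       # two labels meet: merge
                if run_label is None:                  # isolated run: new label
                    run_label = next_label
                    parent[next_label] = next_label
                    next_label += 1
                for cc in range(start, end + 1):
                    labels[r][cc] = run_label
                runs.append((start, end, run_label))
            else:
                c += 1
        prev_runs = runs
    # Resolve merged labels and accumulate member pixels per region.
    points = {}
    for r in range(height):
        for c in range(width):
            if labels[r][c]:
                root = find(labels[r][c])
                labels[r][c] = root
                points.setdefault(root, []).append((r, c))
    centroids = {lab: (sum(p[0] for p in pts) / len(pts),
                       sum(p[1] for p in pts) / len(pts))
                 for lab, pts in points.items()}
    return labels, centroids

FIRST_THRESHOLD = 200                    # hypothetical value
gray = [[10, 220, 230,  10,  10],
        [10, 210,  10,  10,  10],
        [10,  10,  10, 240, 250],
        [10,  10,  10, 230,  10]]
binary = [[v > FIRST_THRESHOLD for v in row] for row in gray]
labels, centroids = label_runs(binary)
print(len(centroids), sorted(centroids.values()))
```

For this toy candidate area the procedure finds two high-luminance regions with centroids near (0.33, 1.33) and (2.33, 3.33).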

FIG. 2(b) shows the high-luminance area H1L1 extracted from the long-exposure (brightest) pedestrian candidate area C1L of FIG. 2(a). The pedestrian was lit from one side by store lighting or the like at night, so that side was particularly bright. That side was therefore extracted as one high-luminance area H1L1, and the position of its center of gravity G1L1 was calculated. FIG. 3(b) shows the high-luminance areas H2L1 and H2L2 extracted from the long-exposure pedestrian candidate area C2L of FIG. 3(a). The two light-emitting objects spread light from their emission centers and were bright in a circular shape and a rectangular shape, respectively. The circular and rectangular shapes were therefore extracted unchanged as the high-luminance areas H2L1 and H2L2, and the positions of their centers of gravity G2L1 and G2L2 were calculated.

The second high-luminance area extraction unit 26 is a processing unit that, when the first high-luminance area extraction unit 25 has extracted high-luminance areas from the long-exposure pedestrian candidate area, determines whether the positions of the centers of gravity of the high-luminance areas extracted from the same area of the intermediate-exposure (intermediate-brightness) image are the same as those of the high-luminance areas extracted from the long-exposure pedestrian candidate area. When the processing of the first high-luminance area extraction unit 25 for the selected pedestrian candidate area is finished, the second high-luminance area extraction unit 26 determines whether the first high-luminance area extraction unit 25 extracted any high-luminance area from the long-exposure pedestrian candidate area; if so, it extracts from the intermediate-exposure image acquired by the multiple-exposure image acquisition unit 23 the area at the same position and with the same rectangular shape as that long-exposure pedestrian candidate area (the intermediate-exposure pedestrian candidate area). Then, by the same method as the first high-luminance area extraction unit 25, it extracts from the intermediate-exposure pedestrian candidate area the areas whose luminance is higher than a second luminance determination threshold and calculates the position of the center of gravity of each extracted area. The second luminance determination threshold is a threshold for determining whether a pixel in an intermediate-exposure (intermediate-brightness) image is of high luminance; it is set in advance, based on experiments and the like, in consideration of the brightness of images obtained with the intermediate exposure time. Because it extracts high-luminance areas from an image of intermediate brightness, the second luminance determination threshold is smaller than the first luminance determination threshold.

Furthermore, the second high-luminance region extraction unit 26 calculates the distance between the centroid position of each extracted high-luminance region and the centroid position of the high-luminance region extracted by the first high-luminance region extraction unit 25. The centroids to be compared are chosen as follows: the centroid of each high-luminance region lying within the area corresponding to the high-luminance region extracted by the first high-luminance region extraction unit 25 is used. When there are multiple high-luminance regions within that area, a distance is calculated for each of them, and the sum of those distances is used for the determination. Since the second high-luminance region extraction unit 26 uses the intermediate-exposure (intermediate-brightness) image, the high-luminance regions it extracts never become larger than the high-luminance region extracted by the first high-luminance region extraction unit 25 from the long-exposure (brightest) image; to guarantee this, the second luminance determination threshold is kept from becoming too small. The second high-luminance region extraction unit 26 then determines whether the calculated distance is equal to or less than the distance determination threshold; if so, it determines that the centroid position of the high-luminance region in the long-exposure pedestrian candidate region and the centroid position of the high-luminance region in the intermediate-exposure pedestrian candidate region are the same position. The distance determination threshold is a threshold for judging whether the centroid positions of high-luminance regions may be regarded as the same position between pedestrian candidate regions of images with different exposure times (different brightness), and is set in advance based on experiments or the like.
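The summed-distance test just described might be sketched as below. The function name, the use of Euclidean distance, and the example values are our illustrative assumptions; the patent only specifies summing the centroid distances and comparing the sum with an experimentally chosen threshold.

```python
import math

def centroids_coincide(long_centroid, shorter_exposure_centroids, dist_threshold):
    """Sum the distances from the long-exposure region's centroid to each
    centroid found at a shorter exposure within its footprint, and regard
    the positions as 'the same' when the sum stays within the threshold."""
    total = sum(math.dist(long_centroid, c) for c in shorter_exposure_centroids)
    return total <= dist_threshold

# Light-emitter-like case: a single centroid that barely moved.
lamp_same = centroids_coincide((10.0, 10.0), [(10.1, 10.0)], 2.0)

# Pedestrian-like case: the region split into two distant fragments,
# so the summed distance exceeds the threshold.
walker_same = centroids_coincide((10.0, 10.0), [(6.0, 6.0), (14.0, 14.0)], 2.0)
```

With a single surviving region the sum degenerates to the ordinary centroid distance, so the same test covers both the FIG. 2 and FIG. 3 scenarios.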

FIG. 2(c) shows four high-luminance regions H1M1, H1M2, H1M3, and H1M4 extracted from the intermediate-exposure (intermediate-brightness) pedestrian candidate region C1M at the same image position as the pedestrian candidate region C1L of FIG. 2(a). Since a pedestrian does not emit light, the region is not brightest at its center; only the portions reflecting light from, for example, nighttime store lighting appear bright. Consequently, as the exposure time becomes shorter, only the portions reflecting a large amount of light are detected as high-luminance regions: the high-luminance region shrinks (with the centroid position shifting accordingly) and eventually splits apart. Accordingly, the high-luminance region H1L1 obtained at the long exposure time shown in FIG. 2(b) splits and is extracted as the four small high-luminance regions H1M1, H1M2, H1M3, and H1M4. In this case, the distances between the centroid position G1L1 of the high-luminance region H1L1 and the centroid positions G1M1, G1M2, G1M3, and G1M4 of the high-luminance regions H1M1, H1M2, H1M3, and H1M4 are each calculated, the sum of the four distances is compared with the distance determination threshold and determined to be larger than it, and the centroid position is judged to have changed.

FIG. 3(c) shows two high-luminance regions H2M1 and H2M2 extracted from the intermediate-exposure (intermediate-brightness) pedestrian candidate region C2M at the same image position as the pedestrian candidate region C2L of FIG. 3(a). A light-emitting object emits light itself, so its brightness spreads radially from the emission center. Therefore, even when the exposure time is shortened, the emission center remains the point of highest luminance: the centroid position of the high-luminance region does not change, and the size of the region does not shrink very much. Accordingly, the high-luminance regions H2L1 and H2L2 obtained at the long exposure time shown in FIG. 3(b) shrink only slightly and are extracted as the high-luminance regions H2M1 and H2M2. In this case, the distance between the centroid position G2L1 of the high-luminance region H2L1 and the centroid position G2M1 of the high-luminance region H2M1, and the distance between the centroid position G2L2 of the high-luminance region H2L2 and the centroid position G2M2 of the high-luminance region H2M2, are each calculated and compared with the distance determination threshold; both distances are determined to be equal to or less than the threshold, and the centroid positions are judged to be the same.

The third high-luminance region extraction unit 27 is a processing unit that, when the second high-luminance region extraction unit 26 has determined that there is a high-luminance region at the same centroid position, determines whether the centroid position of the high-luminance region extracted from the region of the short-exposure (darkest) image at the same position as the long-exposure pedestrian candidate region coincides with the centroid position of the high-luminance region extracted from the long-exposure pedestrian candidate region. When the processing by the second high-luminance region extraction unit 26 for the selected pedestrian candidate region is completed, the third high-luminance region extraction unit 27 checks whether the second high-luminance region extraction unit 26 has determined that the centroid position of the long-exposure high-luminance region and the centroid position of the intermediate-exposure high-luminance region are the same position; if so, it extracts from the short-exposure image acquired by the multiple-exposure image acquisition unit 23 the region at the same position and with the same rectangular shape as the long-exposure pedestrian candidate region (the short-exposure pedestrian candidate region). Then, using the same method as the first high-luminance region extraction unit 25, the third high-luminance region extraction unit 27 extracts, within the extracted short-exposure pedestrian candidate region, regions whose luminance exceeds the third luminance determination threshold, and calculates the centroid position of each extracted region. The third luminance determination threshold is a threshold for judging whether a pixel is high-luminance in the short-exposure (darkest) image, and is set in advance based on experiments or the like, taking into account the brightness of images obtained at the short exposure time. Because high-luminance regions are extracted from the darkest image, the third luminance determination threshold is smaller than the second luminance determination threshold.

Furthermore, the third high-luminance region extraction unit 27 calculates the distance between the centroid position of each extracted high-luminance region and the centroid position of the high-luminance region extracted by the first high-luminance region extraction unit 25. Since the third high-luminance region extraction unit 27 uses the short-exposure (darkest) image, the high-luminance regions it extracts never become larger than the high-luminance region extracted by the first high-luminance region extraction unit 25 from the long-exposure (brightest) image or the high-luminance regions extracted by the second high-luminance region extraction unit 26 from the intermediate-exposure (intermediate-brightness) image; to guarantee this, the third luminance determination threshold is kept from becoming too small. The third high-luminance region extraction unit 27 then determines whether the calculated distance is equal to or less than the distance determination threshold; if so, it determines that the centroid position of the high-luminance region in the long-exposure pedestrian candidate region and the centroid position of the high-luminance region in the short-exposure pedestrian candidate region are the same position. Although the distance here is taken from the centroid position of the high-luminance region extracted by the first high-luminance region extraction unit 25 and the determination made on that distance, the distance from the centroid position of the high-luminance region extracted by the second high-luminance region extraction unit 26 may be calculated and used for the determination instead.

FIG. 2(d) shows one high-luminance region H1S1 extracted from the short-exposure (darkest) pedestrian candidate region C1S at the same image position as the pedestrian candidate region C1L of FIG. 2(a). In this case, the exposure time is even shorter than in FIG. 2(c), so the intermediate-exposure high-luminance regions H1M1, H1M3, and H1M4 shown in FIG. 2(c) are no longer extracted as high-luminance regions, and only the intermediate-exposure high-luminance region H1M2 of FIG. 2(c), further shrunken, is extracted as the high-luminance region H1S1. In this case, the distance between the centroid position G1L1 of the high-luminance region H1L1 and the centroid position G1S1 of the high-luminance region H1S1 is calculated, compared with the distance determination threshold, and determined to be larger than it, and the centroid position is judged to have changed.

FIG. 3(d) shows two high-luminance regions H2S1 and H2S2 extracted from the short-exposure (darkest) pedestrian candidate region C2S at the same image position as the pedestrian candidate region C2L of FIG. 3(a). In this case, although the exposure time is even shorter than in FIG. 3(c), the regions shrink only slightly from the intermediate-exposure high-luminance regions H2M1 and H2M2 shown in FIG. 3(c) and are extracted as the high-luminance regions H2S1 and H2S2. The distance between the centroid position G2L1 of the high-luminance region H2L1 and the centroid position G2S1 of the high-luminance region H2S1, and the distance between the centroid position G2L2 of the high-luminance region H2L2 and the centroid position G2S2 of the high-luminance region H2S2, are each calculated and compared with the distance determination threshold; both distances are determined to be equal to or less than the threshold, and the centroid positions are judged to be the same.

The light-emitting object determination unit 28 is a processing unit that determines, based on the distance determination result of the second high-luminance region extraction unit 26 and the distance determination result of the third high-luminance region extraction unit 27, whether a pedestrian candidate region is a light-emitting object (non-pedestrian). When the processing by the third high-luminance region extraction unit 27 for the selected pedestrian candidate region is completed, the light-emitting object determination unit 28 determines the pedestrian candidate region to be a light-emitting object (non-pedestrian) when the second high-luminance region extraction unit 26 has determined that the centroid position of the high-luminance region in the long-exposure pedestrian candidate region and the centroid position of the high-luminance region in the intermediate-exposure pedestrian candidate region are the same position, and the third high-luminance region extraction unit 27 has determined that the centroid position of the high-luminance region in the long-exposure pedestrian candidate region and the centroid position of the high-luminance region in the short-exposure pedestrian candidate region are the same position (that is, when the centroid positions of the high-luminance regions are the same across the three pedestrian candidate regions of different exposure times).
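The determination rule above — a candidate is a light-emitting object only when the centroids match in both the intermediate-exposure and short-exposure comparisons — can be condensed into a small sketch. For simplicity this takes a single long-exposure centroid; names and thresholds are illustrative assumptions, not the patent's.

```python
import math

def judge_candidate(long_centroid, mid_centroids, short_centroids, dist_threshold):
    """Classify one candidate region: 'light_emitter' when the summed
    centroid distances stay within the threshold at BOTH shorter
    exposures, otherwise 'pedestrian-like'."""
    same_mid = sum(math.dist(long_centroid, c)
                   for c in mid_centroids) <= dist_threshold
    same_short = sum(math.dist(long_centroid, c)
                     for c in short_centroids) <= dist_threshold
    return "light_emitter" if same_mid and same_short else "pedestrian-like"

# FIG. 3-style case: the centroid barely moves at either exposure.
lamp = judge_candidate((10.0, 10.0), [(10.1, 10.0)], [(10.0, 10.2)], 2.0)

# FIG. 2-style case: the region splits at the intermediate exposure.
walker = judge_candidate((10.0, 10.0), [(6.0, 6.0), (14.0, 14.0)], [(8.0, 8.0)], 2.0)
```

A mismatch at either exposure is enough to keep the candidate in the pedestrian pool, matching the conjunctive condition in the text.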

In the example shown in FIG. 2, as explained above, the centroid position of the high-luminance region was determined to change across the pedestrian candidate regions C1L, C1M, and C1S of different exposure times. In the example shown in FIG. 3, on the other hand, the centroid position of the high-luminance region was determined to be the same across the pedestrian candidate regions C2L, C2M, and C2S of different exposure times. The pedestrian candidate region shown in FIG. 3 is therefore determined to be a light-emitting object (non-pedestrian).

The second pedestrian determination unit 29 is a processing unit that determines, for each pedestrian candidate region, whether it is a pedestrian based on the determination result of the light-emitting object determination unit 28. When the light-emitting object determination unit 28 has finished judging all of the extracted pedestrian candidate regions, the second pedestrian determination unit 29 determines each pedestrian candidate region that has not been judged a light-emitting object to be a pedestrian. For each pedestrian candidate region determined to be a pedestrian, the second pedestrian determination unit 29 calculates pedestrian information such as the position relative to the host vehicle, and transmits a pedestrian detection signal composed of the pedestrian information for each pedestrian to the driving support device.

The flow of operation in the pedestrian detection apparatus 1 will be described with reference to FIG. 1. In particular, the flow of processing in the ECU 20 will be described along the flowchart of FIG. 4. FIG. 4 is a flowchart showing the flow of the pedestrian detection processing in the ECU of FIG. 1. The pedestrian detection apparatus 1 repeats the operation described below at regular intervals.

In accordance with exposure control from the ECU 20, the visible-light camera 10 captures an image ahead of the host vehicle with the long exposure time and transmits the captured image data to the ECU 20 as an image signal. The ECU 20 receives this image signal and acquires the long-exposure (brightest) image (S1). Likewise, the visible-light camera 10 captures an image ahead of the host vehicle with the intermediate exposure time and transmits the captured image data to the ECU 20 as an image signal; the ECU 20 receives it and acquires the intermediate-exposure (intermediate-brightness) image (S1). The visible-light camera 10 also captures an image ahead of the host vehicle with the short exposure time and transmits the captured image data to the ECU 20 as an image signal; the ECU 20 receives it and acquires the short-exposure (darkest) image (S1).

From the acquired images of different exposure times, the ECU 20 selects the image with the longest exposure time (the long-exposure image) (S2). The ECU 20 then extracts pedestrian-like pedestrian candidate regions from the selected image (S3). One pedestrian candidate region may be extracted, or several. If no pedestrian candidate region is extracted, the current cycle of processing ends here.

The ECU 20 selects one pedestrian candidate region from those extracted in S3 (S4). Then, within the selected pedestrian candidate region, the ECU 20 extracts regions whose luminance exceeds the first luminance determination threshold as high-luminance regions and calculates the centroid position of each extracted region (S5). One high-luminance region may be extracted, several, or none at all.

The ECU 20 determines whether any high-luminance region was extracted in S5 (S6). If it determines in S6 that no high-luminance region was extracted, the ECU 20 ends the processing for this pedestrian candidate region and proceeds to S14.

If it determines in S6 that a high-luminance region was extracted, the ECU 20 selects the image with the second-longest exposure time (the intermediate exposure time) from the images of different exposure times acquired in S1 (S7). From the selected image, the ECU 20 extracts the region at the same position and with the same rectangular shape as the pedestrian candidate region selected in S4 as the intermediate-exposure pedestrian candidate region, extracts from within it regions whose luminance exceeds the second luminance determination threshold as high-luminance regions, and calculates the centroid position of each extracted region (S8). The ECU 20 then calculates the distance between the centroid position of the long-exposure high-luminance region extracted in S5 and the centroid position of the intermediate-exposure high-luminance region extracted in S8, and determines whether that distance is smaller than the distance determination threshold (S9). If it determines in S9 that the distance between centroid positions is equal to or greater than the distance determination threshold, the ECU 20 judges that there is no high-luminance region at the same centroid position, ends the processing for this pedestrian candidate region, and proceeds to S14.

If it determines in S9 that the distance between centroid positions is smaller than the distance determination threshold, the ECU 20 judges that there is a high-luminance region at the same centroid position and selects the image with the third-longest exposure time (the short exposure time) from the images of different exposure times acquired in S1 (S10). From the selected image, the ECU 20 extracts the region at the same position and with the same rectangular shape as the pedestrian candidate region selected in S4 as the short-exposure pedestrian candidate region, extracts from within it regions whose luminance exceeds the third luminance determination threshold as high-luminance regions, and calculates the centroid position of each extracted region (S11). The ECU 20 then calculates the distance between the centroid position of the long-exposure high-luminance region extracted in S5 and the centroid position of the short-exposure high-luminance region extracted in S11, and determines whether that distance is smaller than the distance determination threshold (S12). If it determines in S12 that the distance between centroid positions is equal to or greater than the distance determination threshold, the ECU 20 judges that there is no high-luminance region at the same centroid position, ends the processing for this pedestrian candidate region, and proceeds to S14.

If it determines in S12 that the distance between centroid positions is smaller than the distance determination threshold, the ECU 20 judges that there is a high-luminance region at the same centroid position; since the centroid positions of the high-luminance regions in the pedestrian candidate regions at the three different exposure times are the same, the ECU 20 determines that the pedestrian candidate region is a non-pedestrian (light-emitting object) (S13). The ECU 20 then determines whether the processing has finished for all pedestrian candidate regions extracted in S3 (S14). If it determines in S14 that the processing has not finished for all pedestrian candidate regions, the ECU 20 returns to S4 and processes the next pedestrian candidate region.

If it determines in S14 that the processing has finished for all pedestrian candidate regions, the ECU 20 extracts, from all the pedestrian candidate regions extracted in S3, those that were not determined to be non-pedestrians in S13, and determines those pedestrian candidate regions to be pedestrians (S15). For each pedestrian candidate region determined to be a pedestrian, the ECU 20 calculates pedestrian information and transmits a pedestrian detection signal composed of the pedestrian information for each pedestrian to the driving support device.
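The loop S4 to S15 over candidate regions can be sketched end to end as follows. This is a simplified reconstruction under stated assumptions: each candidate is reduced to its pre-computed centroid lists per exposure, a single long-exposure centroid is used as the anchor, and the distance threshold value is hypothetical (the patent sets it experimentally).

```python
import math

DIST_THRESHOLD = 3.0  # hypothetical value; set by experiment in the patent

def detect_pedestrians(candidates):
    """candidates: {name: {'long': [...], 'mid': [...], 'short': [...]}},
    each value a list of high-luminance centroids at that exposure.
    Mirrors S4-S15: a candidate stays a pedestrian unless its centroids
    coincide across all three exposures (S13); the early exits at
    S6/S9/S12 likewise leave it classified as a pedestrian."""
    pedestrians = []
    for name, cs in candidates.items():
        if not cs['long']:                 # S6: no bright region at all
            pedestrians.append(name)
            continue
        anchor = cs['long'][0]
        def near(group):
            return bool(group) and sum(
                math.dist(anchor, c) for c in group) < DIST_THRESHOLD
        if near(cs['mid']) and near(cs['short']):   # S9 and S12 both "same"
            continue                                 # S13: non-pedestrian
        pedestrians.append(name)                     # S15: pedestrian
    return pedestrians

candidates = {
    'walker': {'long': [(5.0, 5.0)],
               'mid':  [(2.0, 2.0), (8.0, 8.0)],   # region split apart
               'short': [(2.0, 2.0)]},
    'lamp':   {'long': [(5.0, 5.0)],
               'mid':  [(5.1, 5.0)],                # centroid stays put
               'short': [(5.0, 5.2)]},
}
result = detect_pedestrians(candidates)
```

Only 'walker' survives as a pedestrian; 'lamp' is rejected at S13 because its centroid coincides at every exposure.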

According to the pedestrian detection apparatus 1, by comparing the features of the shape of the high-luminance regions between pedestrian candidate regions at the same image position in images captured with three different exposure times (exposure conditions), a pedestrian can be detected with high accuracy even when there is a high-luminance region other than a pedestrian (a light-emitting object). In particular, by comparing the centroid positions of the high-luminance regions between the pedestrian candidate regions of the three different exposure times, the pedestrian detection apparatus 1 can determine with high accuracy whether the features of the shape of the high-luminance region have changed.

Incidentally, since the pedestrian detection apparatus 1 uses the visible-light camera 10 as its imaging means, it can also discriminate light-emitting objects that have no wavelength component in the infrared region. If an infrared camera were used, it might not be possible to discriminate light-emitting objects, such as LEDs, that have no wavelength component in the infrared region.

Although an embodiment of the present invention has been described above, the present invention is not limited to the above embodiment and can be implemented in various forms.

For example, although the present embodiment applies the invention to a pedestrian detection apparatus that detects pedestrians, the invention is also applicable to apparatuses that detect other objects such as vehicles, bicycles, motorcycles, and traffic signs. Furthermore, although the present embodiment applies the invention to a pedestrian detection apparatus (object detection apparatus) mounted on a vehicle, the invention is also applicable to object detection apparatuses other than vehicle-mounted ones, such as one mounted on a moving body such as a robot.

Although a visible-light camera is used in the present embodiment, an infrared camera (such as a near-infrared camera) may be used instead. In the present embodiment, the visible-light camera images the area ahead of the host vehicle and pedestrians ahead of the host vehicle are detected; however, the visible-light camera may image other surrounding directions such as the rear or the sides, and pedestrians in those directions may be detected. Furthermore, although in the present embodiment each processing unit of the pedestrian detection apparatus is implemented on a dedicated vehicle-mounted ECU, each processing unit may instead be implemented on a general-purpose computer such as a personal computer.

In the present embodiment, images are captured at three different exposure times and the features of the shape of the high-luminance regions are compared between the pedestrian candidate regions extracted from the three images of different exposure times; alternatively, images may be captured at two, or at four or more, different exposure times, and the shape features of the high-luminance regions compared between the pedestrian candidate regions extracted from those images. Also, although the exposure time is varied as the exposure condition in the present embodiment, other exposure conditions such as the lens aperture may be varied instead.

In this embodiment, the distance between the centroid positions of the high-luminance regions extracted from the pedestrian-candidate regions at different exposure times is computed, and the determination is based on that distance; alternatively, the number of high-luminance regions extracted from each pedestrian-candidate region may be counted, and the determination made from the number of high-luminance regions in the candidate region. In that case, it is determined whether the change in the number of high-luminance regions is at or below a threshold; if the change is at or below the threshold both in the comparison with the candidate region at the intermediate exposure time and in the comparison with the candidate region at the short exposure time, the pedestrian-candidate region is determined to be a luminescent object. As described above, in the case of a pedestrian, the high-luminance region shrinks as the exposure time is shortened and eventually splits, so the number of high-luminance regions changes. In the case of a luminescent object, by contrast, the high-luminance region shrinks little even at short exposure times, and the number of high-luminance regions does not change.
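The count-based variant described above can be sketched as follows: count the connected high-luminance regions in each candidate-region crop and treat the candidate as a luminescent object when the count stays essentially constant as the exposure time shortens. This is a hypothetical sketch, not the patented implementation: the function names, the 4-connectivity choice, the nested-list crops, and the thresholds are assumed for illustration.

```python
def count_bright_regions(gray, threshold=200):
    """Count 4-connected regions of pixels at or above `threshold`."""
    h, w = len(gray), len(gray[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if gray[y][x] >= threshold and not seen[y][x]:
                count += 1
                seen[y][x] = True
                stack = [(y, x)]  # flood-fill this region
                while stack:
                    cy, cx = stack.pop()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and not seen[ny][nx]
                                and gray[ny][nx] >= threshold):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return count

def is_luminescent(crops_by_exposure, max_count_change=0):
    """crops_by_exposure: crops of one candidate region, longest exposure
    first.  Luminescent if the region count changes by no more than
    `max_count_change` between every consecutive pair of exposures."""
    counts = [count_bright_regions(c) for c in crops_by_exposure]
    return all(abs(a - b) <= max_count_change
               for a, b in zip(counts, counts[1:]))
```

In this sketch, a pedestrian's single bright blob splitting into several smaller blobs at a shorter exposure changes the count and defeats the luminescent-object test, while a lamp whose single blob persists passes it, matching the behavior described in the paragraph above.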

DESCRIPTION OF SYMBOLS: 1 ... pedestrian detection apparatus, 10 ... visible-light camera, 20 ... ECU, 21 ... pedestrian model database, 22 ... exposure control unit, 23 ... multiple-exposure image acquisition unit, 24 ... first pedestrian determination unit, 25 ... first high-luminance region extraction unit, 26 ... second high-luminance region extraction unit, 27 ... third high-luminance region extraction unit, 28 ... luminescent object determination unit, 29 ... second pedestrian determination unit.

Claims (4)

1. An object detection apparatus for detecting an object from an image, the apparatus comprising:
imaging means capable of capturing images under different exposure conditions;
comparison means for comparing shape features of high-luminance regions between object-candidate regions at the same image position, the object-candidate regions being extracted respectively from a plurality of images captured by the imaging means under different exposure conditions; and
determination means for determining, based on a comparison result of the comparison means, whether or not the object-candidate region is the object.

2. The object detection apparatus according to claim 1, wherein the determination means determines that the object-candidate region is a luminescent object when the comparison result of the comparison means shows little change in the shape features of the high-luminance regions between the object-candidate regions under different exposure conditions.

3. The object detection apparatus according to claim 1 or 2, wherein the comparison means compares centroid positions of the high-luminance regions between the object-candidate regions under different exposure conditions.

4. The object detection apparatus according to any one of claims 1 to 3, wherein the comparison means compares the number of high-luminance regions between the object-candidate regions under different exposure conditions.
JP2013014315A 2013-01-29 2013-01-29 Object detection device Active JP5892079B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2013014315A JP5892079B2 (en) 2013-01-29 2013-01-29 Object detection device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2013014315A JP5892079B2 (en) 2013-01-29 2013-01-29 Object detection device

Publications (2)

Publication Number Publication Date
JP2014146164A true JP2014146164A (en) 2014-08-14
JP5892079B2 JP5892079B2 (en) 2016-03-23

Family

ID=51426382

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2013014315A Active JP5892079B2 (en) 2013-01-29 2013-01-29 Object detection device

Country Status (1)

Country Link
JP (1) JP5892079B2 (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0981753A (en) * 1995-09-11 1997-03-28 Matsushita Electric Ind Co Ltd Moving body extracting device
JP2005092861A (en) * 2003-08-11 2005-04-07 Hitachi Ltd Vehicle control system
JP2007213191A (en) * 2006-02-08 2007-08-23 Fujitsu Ltd Motion detection program, motion detection method, motion detection device
JP2008054200A (en) * 2006-08-28 2008-03-06 Olympus Corp Imaging apparatus and image processing program
JP2008065757A (en) * 2006-09-11 2008-03-21 Kawasaki Heavy Ind Ltd Driving assist system
JP2012240530A (en) * 2011-05-18 2012-12-10 Koito Mfg Co Ltd Image processing apparatus


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Toru Tamaki, "Extraction of Human Regions from Image Sequences," IEEJ Transactions on Electronics, Information and Systems, Vol. 119-C, No. 1, pp. 37-43, 1 January 1999, JP, JPN6015048325, ISSN: 0003242298 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016081271A (en) * 2014-10-16 2016-05-16 セコム株式会社 Object detection device
WO2018008461A1 (en) * 2016-07-05 2018-01-11 日立オートモティブシステムズ株式会社 Image processing device
JP2018005682A (en) * 2016-07-05 2018-01-11 日立オートモティブシステムズ株式会社 Image processor
CN109800683A (en) * 2018-12-30 2019-05-24 昆明物理研究所 A kind of infrared pedestrian detection method and device based on FPGA
JP2021144689A (en) * 2020-03-12 2021-09-24 株式会社豊田中央研究所 On-vehicle sensing device and sensor parameter optimization device
JP7230896B2 (en) 2020-03-12 2023-03-01 株式会社豊田中央研究所 In-vehicle sensing device and sensor parameter optimization device.

Also Published As

Publication number Publication date
JP5892079B2 (en) 2016-03-23

Similar Documents

Publication Publication Date Title
US10929693B2 (en) Vehicular vision system with auxiliary light source
US10635896B2 (en) Method for identifying an object in a surrounding region of a motor vehicle, driver assistance system and motor vehicle
US10552688B2 (en) Method and device for detecting objects in the surroundings of a vehicle
JP4853437B2 (en) Vehicle perimeter monitoring system
JP7268001B2 (en) Arithmetic processing unit, object identification system, learning method, automobile, vehicle lamp
KR101511853B1 (en) Night-time vehicle detection and positioning system and method using multi-exposure single camera
EP3118831B1 (en) Traffic light detection device and traffic light detection method
US20190196494A1 (en) Autonomous driving system and autonomous driving method
JP5892079B2 (en) Object detection device
JP2008040615A (en) Vehicle detection device and head lamp controller
JP6236039B2 (en) Outside environment recognition device
JP6420650B2 (en) Outside environment recognition device
CN108197523A (en) Vehicle detection at night method and system based on image conversion and profile neighborhood difference
JP2009015759A (en) Traffic light recognition apparatus
JP2012222762A (en) Image processing apparatus
EP3220637B1 (en) Vehicle-mounted camera system
CN110087946A (en) Vehicle lighting system and vehicle
US20170083775A1 (en) Method and system for pattern detection, classification and tracking
JP2019020956A (en) Vehicle surroundings recognition device
US11518465B2 (en) Motorcycle providing visual light warning to detected vehicles to notify detected vehicles of presence of motorcycle
CN104008518A (en) Object detection apparatus
JP6339840B2 (en) Outside environment recognition device
Tang et al. Robust vehicle surveillance in night traffic videos using an azimuthally blur technique
CN116234720A (en) Method for operating a lighting device and motor vehicle
JP6853890B2 (en) Object detection system

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20150303

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20151125

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20151201

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20160112

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20160126

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20160208

R151 Written notification of patent or utility model registration

Ref document number: 5892079

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R151