WO2023089686A1 - External environment recognition device - Google Patents

External environment recognition device

Info

Publication number
WO2023089686A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
image feature
feature amount
recognition device
function
Prior art date
Application number
PCT/JP2021/042199
Other languages
French (fr)
Japanese (ja)
Inventor
将一 坂本
直也 多田
裕史 大塚
Original Assignee
日立Astemo株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日立Astemo株式会社 filed Critical 日立Astemo株式会社
Priority to PCT/JP2021/042199 priority Critical patent/WO2023089686A1/en
Publication of WO2023089686A1 publication Critical patent/WO2023089686A1/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00: Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22: Real-time viewing arrangements for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23: Real-time viewing arrangements for viewing an area outside the vehicle with a predetermined field of view
    • B60R1/24: Real-time viewing arrangements for viewing an area outside the vehicle with a predetermined field of view in front of the vehicle
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/02: Estimation or calculation of such parameters related to ambient conditions
    • B60W40/04: Traffic conditions
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/50: Depth or shape recovery
    • G06T7/55: Depth or shape recovery from multiple images
    • G06T7/593: Depth or shape recovery from stereo images

Definitions

  • the present invention relates to an external world recognition device such as an in-vehicle stereo camera, and more particularly to an external world recognition device that accurately determines whether to stop driving support control using an image captured by an in-vehicle camera.
  • Driving support controls such as adaptive cruise control (ACC) and collision damage mitigation braking (AEB) are well known.
  • In these controls, the vehicle is controlled by recognizing the surrounding environment (outside world) from images captured by a camera installed behind the windshield, with the area in front of the vehicle as the subject, for example.
  • In Patent Document 1, the stop determination process uses the number of edges, the number of distance data items, and the luminance in the current frame. Because only information from the current frame is used, it cannot be determined whether the environment is one in which luminance and edges are inherently difficult to obtain, or whether luminance and edges are missing because of image degradation caused by heavy rain or the like. As a result, in an environment such as a snow-covered road where luminance and edges are inherently scarce, slight image deterioration caused by adhering raindrops or snow is erroneously judged as HALT, and the function (driving support control) is stopped.
  • To solve this problem, a representative external world recognition device of the present invention includes: a camera installed in the vehicle interior; an image feature amount calculation unit that obtains an image feature amount from an image acquired by the camera;
  • a recognition unit that recognizes an object outside the vehicle from the image acquired by the camera;
  • a memory that stores a first image feature amount of a first image acquired at a timing determined in a predetermined cycle; and a function stop determination unit that determines whether to suspend the function of the recognition unit based on the first image feature amount, or on a comparison result between the first image feature amount and a second image feature amount of the current image.
  • According to the present invention, by making the determination using time-series data, it can be judged whether the environment is one in which image features such as parallax are inherently difficult to obtain, or whether such features are missing because of image degradation caused by heavy rain or the like. Therefore, when image deterioration is severe, as in heavy rain, a HALT determination is made and the function is stopped; when deterioration is mild, as in light rain, no HALT determination is made and the function (driving support control) is not stopped, so driving assistance continues and keeps reducing the burden of the driver's driving operation.
  • FIG. 1 is a configuration diagram of an in-vehicle stereo camera according to an embodiment of the present invention.
  • FIG. 2 shows explanatory diagrams of (a) a left camera image, (b) a right camera image, and (c) a parallax image in a normal state.
  • FIG. 3 shows explanatory diagrams of (a) a left camera image, (b) a right camera image, and (c) a parallax image with raindrops adhering.
  • FIG. 4A is an explanatory diagram of the wiper positions before wiper operation.
  • FIG. 4B is an explanatory diagram of the wiper positions immediately after wiper wiping.
  • FIG. 5A is an explanatory diagram of the camera image before wiper operation.
  • FIG. 5B is an explanatory diagram of the camera image immediately after wiper wiping.
  • FIG. 6 is a flowchart of the function stop determination unit of the in-vehicle stereo camera.
  • FIGS. 7A, 7B, and 7C illustrate the behavior of the stop determination of this system during heavy rain: time-series changes in the number of parallaxes (FIG. 7A), the deterioration determination result in each frame (FIG. 7B), and the HALT occurrence counter (FIG. 7C).
  • FIGS. 8A, 8B, and 8C illustrate the behavior of the stop determination of this system during light rain: time-series changes in the number of parallaxes (FIG. 8A), the deterioration determination result in each frame (FIG. 8B), and the HALT occurrence counter (FIG. 8C).
  • FIG. 1 is a representative diagram of this embodiment, and is a configuration diagram of an in-vehicle stereo camera in this embodiment.
  • The in-vehicle stereo camera 1 is mounted on a vehicle, recognizes objects outside the vehicle from images acquired (captured) by the cameras installed in the vehicle interior, and realizes driving support functions such as ACC and AEB; it thus functions as an external world recognition device.
  • The in-vehicle stereo camera 1 is composed of cameras 2A and 2B (left camera 2A and right camera 2B) as a pair of left and right imaging units, an image feature amount calculation unit 3, a recognition unit 4, a memory 5 that stores a first image feature amount, and a function stop determination unit 6.
  • the in-vehicle stereo camera 1 includes hardware such as a camera, CPU, memory, and electronic circuit, and software that cooperates with the hardware to assist driving of the vehicle.
  • the cameras 2A and 2B in FIG. 1 take images.
  • the cameras 2A and 2B are fixed to the vehicle so as to be spaced apart in the vehicle width direction, and can image the front through the windshield of the vehicle so as to image the same area.
  • The recognition unit 4 in FIG. 1 detects and recognizes objects such as vehicles and pedestrians in the surrounding environment (here, in front of the vehicle) from the images acquired by the cameras 2A and 2B, and realizes driving support functions such as ACC and AEB.
  • The image feature amount calculation unit 3 receives an image captured by the left camera 2A (hereinafter also referred to as a left camera image) and an image captured by the right camera 2B (hereinafter also referred to as a right camera image), and calculates an image feature amount by analyzing the pixel values (luminance values) of each image.
  • The image feature amount may be calculated using the pixel values themselves, or by defining a local area of multiple pixels and using the average luminance, weighted average luminance, representative luminance, or the like within that local area. In this example, the image feature amount calculation unit 3 calculates the number of parallaxes and the number of edges as the image feature amounts.
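As an illustrative sketch only (not code from the patent), a local-area feature of the kind described above, the average luminance per block, could be computed as follows; the block size of 16 pixels and the use of an unweighted mean are assumptions:

```python
import numpy as np

def local_luminance_features(img: np.ndarray, block: int = 16) -> np.ndarray:
    """Average luminance of each block-sized local area of a grayscale image.

    Each returned entry is one candidate image feature amount: the mean
    luminance of a local area defined from multiple pixels.
    """
    h, w = img.shape
    h2, w2 = h - h % block, w - w % block          # crop to whole blocks
    blocks = img[:h2, :w2].astype(np.float32).reshape(
        h2 // block, block, w2 // block, block)
    return blocks.mean(axis=(1, 3))                # one mean per local area
```

A weighted average or a representative luminance (for example, a per-block median) could be substituted for the mean without changing the structure.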
  • The number of edges in an image can be calculated from the image of the left camera 2A or the right camera 2B in the current frame using known techniques.
  • FIG. 2 shows an image of parallax calculation in the image feature quantity calculation unit 3.
  • a parallax image is calculated by pattern matching of (a) the left camera image and (b) the right camera image.
  • Parallax is calculated in areas where edges appear, such as people, white lines, and preceding vehicles; in areas such as the sky where edges rarely appear and pattern matching is difficult, parallax cannot be calculated.
  • The number of valid parallax values calculated at the edge portions over the entire parallax image is called the number of parallaxes.
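To make the edge count and parallax count concrete, here is a minimal sketch; this is an assumption-laden illustration, not the patented implementation. Edges are counted by thresholding the horizontal luminance gradient, and parallaxes by SAD block matching between the left and right images; the window size, disparity range, and thresholds are all invented for illustration:

```python
import numpy as np

def edge_count(img: np.ndarray, grad_thresh: float = 20.0) -> int:
    """Count pixels whose horizontal luminance gradient exceeds a threshold."""
    gx = np.abs(np.diff(img.astype(np.float32), axis=1))
    return int((gx > grad_thresh).sum())

def parallax_count(left: np.ndarray, right: np.ndarray,
                   block: int = 8, max_disp: int = 32,
                   match_thresh: float = 10.0) -> int:
    """Count blocks where SAD block matching yields a confident disparity.

    Texture-less regions (e.g. sky) are skipped, so no valid parallax is
    produced there; when rain washes out edges, the count drops.
    """
    h, w = left.shape
    count = 0
    for y in range(0, h - block, block):
        for x in range(max_disp, w - block, block):
            ref = left[y:y + block, x:x + block].astype(np.float32)
            if ref.std() < 2.0:             # no edges in this block: skip
                continue
            costs = [np.abs(ref - right[y:y + block, x - d:x - d + block]
                            .astype(np.float32)).mean()
                     for d in range(max_disp)]
            if min(costs) < match_thresh:   # confident match: one valid parallax
                count += 1
    return count
```

With a right image that is simply a shifted copy of the left, every textured block matches at the shift disparity, while a uniform image yields zero parallaxes, mirroring the sky example above.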
  • FIG. 3 shows the example of FIG. 2 with raindrops adhering on the left camera side (specifically, on the windshield within the field of view of the left camera). In this case, pattern matching with the right camera image becomes difficult, and the overall number of parallaxes decreases, as shown in the parallax image of FIG. 3(c).
  • The memory 5 stores the first image feature amount of the first image, calculated by the image feature amount calculation unit 3. The first image, from which the first image feature amount is calculated, is acquired at a timing defined within a predetermined cycle determined by wiper wiping.
  • FIGS. 4A and 4B show wiper positions, and FIGS. 5A and 5B show camera images, before wiper operation and immediately after wiper wiping. In FIGS. 4A and 4B, 2A and 2B are the left and right cameras, 10 is the windshield, and 11A and 11B are the left and right wipers.
  • FIG. 4A shows the state before the wipers are activated, and FIG. 4B shows the state immediately after wiper wiping.
  • When activated, the wipers 11A and 11B move across the windshield in one direction (clockwise in the illustrated example) from the position before wiper operation (the normal, retracted position) shown in FIG. 4A. After reaching the farthest positions (positions AA and BB in FIG. 4B), they move in the direction opposite to the one direction (counterclockwise in the illustrated example) and return to the position before wiper operation shown in FIG. 4A. Raindrops adhering to the windshield are wiped off by repeating this series of operations one or more times.
  • In this series of operations, the wipers 11A and 11B each pass in front of the left camera 2A and in front of the right camera 2B twice (once in the one direction and once in the other direction).
  • the wipers 11A and 11B may each pass in front of the left camera 2A twice (4 times in total for the wipers 11A and 11B), or the wipers 11A and 11B may each pass in front of the right camera 2B twice (four times in total with the wipers 11A and 11B).
  • In this embodiment, wiper wiping is considered to be completed when the wipers 11A and 11B have each passed in front of the left camera 2A and the right camera 2B in both the one direction and the other direction (rather than only once in one direction).
  • the timing of the wiper wiping operation described above can be the reference for the timing of function stop determination (image deterioration determination). Therefore, the operation cycle (driving cycle) of the wipers 11A and 11B is defined from immediately after wiper wiping in a series of operations to immediately after wiper wiping in the next series of operations.
  • The operation cycle of the wipers 11A and 11B can also be defined as the period from when the wipers leave the position before wiper operation until they operate (sweep) across the windshield and return to that position. Alternatively, a predetermined timing within a series of wiping operations may be set as the reference timing, and the operation cycle of the wipers 11A and 11B may be defined as the period from that predetermined timing in one series of operations to the same predetermined timing in the next series of operations.
  • The image feature amount calculation unit 3 overwrites the memory 5 with the image feature amount of the image captured immediately after wiper wiping, saving it as the first image feature amount.
  • The function stop determination unit 6 in FIG. 1 determines whether to stop the function (the driving support function of the recognition unit 4) by comparing the first image feature amount stored in the memory 5 with the image feature amount of the current frame.
  • FIG. 6 is a flow chart showing the procedure of the function stop determination section 6 of the in-vehicle stereo camera 1 according to this embodiment.
  • The first image feature amount immediately after wiper wiping, stored in the memory 5, is compared with the second image feature amount, which is the feature amount of the current frame, to obtain the reduction rate d.
  • In S103, it is determined whether the reduction rate d is greater than the reference value dTH. If d is greater than the reference value dTH, the current frame is determined to be degraded, and the HALT occurrence counter is incremented in S104. If d is less than or equal to the reference value dTH, the current frame is determined to be non-degraded, and the HALT occurrence counter is decremented in S105.
  • In S106, it is determined whether to suspend the camera function (HALT) based on the counter value of the HALT occurrence counter updated in S104 and S105. If the counter value is equal to or greater than the reference value, the process proceeds to S107 and the camera function is temporarily stopped (HALT). In this state, the driving support function performed by the recognition unit 4 is temporarily stopped.
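The S102–S107 flow can be sketched as a small per-frame state machine. This is a hedged illustration: the threshold dTH, the counter reference value, and the asymmetric increment/decrement sizes are placeholder values, since the text gives no concrete numbers.

```python
class HaltJudge:
    """Sketch of the function stop determination of FIG. 6 (S102-S107)."""

    def __init__(self, d_th: float = 0.5, counter_th: int = 10,
                 inc: int = 3, dec: int = 1):
        self.d_th = d_th              # reference value dTH for reduction rate d
        self.counter_th = counter_th  # HALT counter reference value (S106)
        self.inc, self.dec = inc, dec # increment > decrement, so repeats accumulate
        self.first_feature = None     # first image feature amount (memory 5)
        self.counter = 0              # HALT occurrence counter

    def on_wiper_wiped(self, feature: int) -> None:
        """Overwrite the stored feature amount just after wiper wiping."""
        self.first_feature = feature

    def on_frame(self, feature: int) -> bool:
        """Per-frame S102-S107; returns True while the function is HALTed."""
        if not self.first_feature:
            return False
        d = (self.first_feature - feature) / self.first_feature  # reduction rate d
        if d > self.d_th:                                   # S103: degraded
            self.counter += self.inc                        # S104
        else:                                               # S103: non-degraded
            self.counter = max(0, self.counter - self.dec)  # S105
        return self.counter >= self.counter_th              # S106 -> S107
```

With these placeholder values, a sustained 70% drop in the parallax count (heavy rain) trips HALT after a few frames, while a mild 20% drop (light rain) never accumulates in the counter.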
  • FIG. 7A is a graph of time-series changes in the number of parallaxes.
  • The number of parallaxes peaks immediately after wiper wiping because the image appears sharp (see FIG. 5B). Thereafter, raindrops gradually adhere to the windshield in front of the cameras until the next wiper wiping (see FIG. 5A), so the number of parallaxes decreases.
  • FIG. 7B shows the result of the deterioration determination for each frame, representing the result of the determination in S103.
  • FIG. 7C shows the HALT occurrence counter updated in S104 and S105.
  • In heavy rain, the reduction rate of the number of parallaxes periodically exceeds the reference value multiple times, so frames are repeatedly determined to be degraded; when the reduction rate is at or below the reference value, frames are determined to be non-degraded. In this embodiment, the increment applied to the counter for each degraded frame is made larger than the decrement applied for each non-degraded frame, so repeated degradation accumulates in the counter. The counter value is compared with the reference value (S106), and when the counter value exceeds the reference value, the camera function is temporarily stopped (HALT) (S107).
  • FIGS. 8A, 8B, and 8C are image diagrams of the behavior during light rain in an environment where parallax (or edges) is inherently difficult to obtain, such as a snow-covered road.
  • FIG. 8A is a graph of time-series changes in the number of parallaxes.
  • Immediately after wiper wiping, the image appears sharp (see FIG. 5B), so the number of parallaxes reaches a peak and then decreases, but the reduction rate is small.
  • FIG. 8B shows the result of the deterioration determination for each frame, representing the result of the determination in S103.
  • the reduction rate of the number of parallaxes is not large, so it is determined as non-degraded (0).
  • In this embodiment, the reduction rate of the number of parallaxes is calculated using the frame immediately after wiper wiping (the state in which the wipers have completely passed in front of the left and right cameras and the image appears clearly) as the reference for comparison with the current frame.
  • A HALT occurrence counter is used here for the function stop determination, but another method of capturing the periodic changes in the number of parallaxes shown in FIG. 7A may be used instead. For example, the function stop may be determined from the periodic change (periodicity) of the rising and falling intervals in the per-frame deterioration determination results shown in FIG. 7B.
  • In the above embodiment, the timing immediately after wiper wiping (the state in which the wipers have completely passed in front of the left and right cameras and the image is captured clearly) is used as the reference timing, the reference frame is compared with the current frame, and the function stop is determined by calculating the reduction rate of the image feature amount of the current frame relative to the image feature amount of the reference image.
  • However, the timing used for the function stop determination is not limited to that of the above embodiment.
  • For example, the function stop may be determined by using, as the reference, a frame captured a predetermined time (or a predetermined number of frames) after wiper wiping, comparing it with the current frame, and calculating the reduction rate of the image feature amount.
  • Alternatively, the image immediately before wiper wiping (immediately before the wipers completely pass in front of the left and right cameras; the state in which parallax is least likely to be obtained) may be used as the reference, and the function stop may be determined by comparing it with the current frame and calculating the increase rate of the image feature amount of the current frame relative to the image feature amount of the image immediately before wiper wiping.
  • Further, the current frame may be compared with a frame a predetermined time (or a predetermined number of frames) earlier (in other words, for example, a frame immediately before wiper wiping, or a frame a predetermined time after wiper wiping), and the function stop may be determined by calculating the rate of change (reduction rate or increase rate) of the image feature amount.
  • The predetermined time (or predetermined number of frames) used in this determination can be defined from the operation cycle of the wipers (for example, the period from immediately after one wiper wiping to immediately after the next wiper wiping).
  • In other words, the function stop determination unit 6 of the present embodiment identifies two different points (two times) within the wiper operation cycle, and determines the function stop by calculating the rate of change (reduction rate or increase rate) of the image feature amount between those two points.
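The generalized comparison described above, a change rate between any two sampling points in the wiper cycle, reduces to one small formula; the sign convention below (positive means a decrease relative to the first point) is an assumption:

```python
def change_rate(feature_a: float, feature_b: float) -> float:
    """Rate of change of the image feature amount between two points in the
    wiper operation cycle; feature_a is the earlier/reference sample.

    Positive values indicate a decrease (e.g. reference just after wiping),
    negative values an increase (e.g. reference just before wiping).
    """
    if feature_a == 0:
        return 0.0            # avoid division by zero; no usable information
    return (feature_a - feature_b) / feature_a
```

Using the frame just after wiping as feature_a gives the reduction rate of the embodiment; using the frame just before wiping as feature_a gives a negative value whose magnitude is the increase rate.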
  • In the above description, the rate of change between the first and second image feature amounts is used for the function stop determination; however, the determination may also be based on the first image feature amount alone, for example by determining that the function should be stopped when the first image feature amount falls below a predetermined value.
  • As described above, the in-vehicle stereo camera (external world recognition device) 1 of the present embodiment includes: the cameras (left camera 2A, right camera 2B) installed in the vehicle interior; the image feature amount calculation unit 3 that obtains an image feature amount from the images acquired by the cameras; the recognition unit 4 that recognizes objects outside the vehicle from the images acquired by the cameras; the memory 5 that stores the first image feature amount of the first image acquired at a timing determined in a predetermined cycle (a timing defined by the operation of the wipers 11A and 11B); and the function stop determination unit 6 that determines whether to suspend the function of the recognition unit 4 (HALT determination) based on the first image feature amount, or on a comparison result between the first image feature amount and the second image feature amount of the current image.
  • Specifically, the in-vehicle stereo camera (external world recognition device) 1 of this embodiment stores the image feature amount immediately after wiper wiping, calculates the reduction rate relative to the current image feature amount, and compares the reduction rate with the reference value; when the reduction rate is larger than the reference value and periodically exceeds it multiple times, the function of the recognition unit 4 is temporarily stopped.
  • According to the present embodiment, by making the determination using time-series data (for example, two time-series frames referenced to the moment immediately after wiper wiping), it can be judged whether the environment is one in which image features such as parallax are inherently difficult to obtain, or whether such features are missing because of image degradation caused by heavy rain or the like. Therefore, when image deterioration is severe, as in heavy rain, a HALT determination is made and the function is stopped; when deterioration is mild, as in light rain, no HALT determination is made and the function (driving support control) is not stopped, so driving assistance continues and keeps reducing the burden of the driver's driving operation.
  • the present invention is not limited to the above-described embodiments, and includes various modifications.
  • the above-described embodiments have been described in detail in order to explain the present invention in an easy-to-understand manner, and are not necessarily limited to those having all the configurations described.
  • part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
  • Each of the above configurations, functions, processing units, processing means, and the like may be realized in hardware, for example by designing some or all of them as an integrated circuit, or may be realized in software, by a processor interpreting and executing a program that implements each function.
  • Information such as the programs, tables, and files that implement each function can be stored in a recording device such as a memory, hard disk, or SSD (Solid State Drive), or on a recording medium such as an IC card, SD card, or DVD.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Traffic Control Systems (AREA)

Abstract

Provided is an external environment recognition device that accurately determines a situation in which a function (driver assistance control) should be temporarily stopped, and determines such situation as a HALT situation and stops the function if image degradation is severe, such as in heavy rain, and does not determine such situation as a HALT situation and does not stop the function if the image degradation is mild, such as in light rain. An image feature value immediately after wiper wiping is saved and the rate of decrease from a current image feature value is calculated, and if the rate of decrease is greater than a standard value and becomes greater than the standard value multiple times cyclically, the function (driver assistance control) of a recognition unit (4) is temporarily stopped.

Description

External world recognition device
 The present invention relates to an external world recognition device such as an in-vehicle stereo camera, and more particularly to an external world recognition device that accurately determines whether to stop driving support control using an image captured by an in-vehicle camera.
 Driving support controls such as adaptive cruise control (ACC) and collision damage mitigation braking (AEB) are well known. In these controls, the vehicle is controlled by recognizing the surrounding environment (outside world) from images captured by a camera installed behind the windshield, with the area in front of the vehicle as the subject, for example.
 In such driving support control, if the camera's field of view is obstructed by heavy rain, a fogged windshield, or the like and the image deteriorates, missed detections and false detections occur, leading to behaviors such as AEB failing to brake, erroneous acceleration during ACC, and erroneous braking by AEB or during ACC. A HALT function is therefore needed: a function that temporarily stops the driving support control before such behavior occurs and notifies the user of the temporary stop. However, temporarily stopping the driving support control means that the burden of the driver's driving operation can no longer be reduced, so it is desirable to accurately determine the situations in which the driving support control should be temporarily stopped.
 Patent Document 1 describes that, in order to accurately determine situations in which driving support control should be temporarily stopped, an edge or luminance difference is obtained from first and second captured image data during wiper operation, and the stop determination process is not performed when the difference is equal to or greater than a specified value.
Patent Document 1: JP 2015-088047 A
 In Patent Document 1, the stop determination process uses the number of edges, the number of distance data items, and the luminance in the current frame. Because only information from the current frame is used for the stop determination (image deterioration determination), it cannot be determined whether the environment is one in which luminance and edges are inherently difficult to obtain, or whether luminance and edges are missing because of image degradation caused by heavy rain or the like. As a result, in an environment such as a snow-covered road where luminance and edges are inherently scarce, slight image deterioration caused by adhering raindrops or snow is erroneously judged as HALT, and the function (driving support control) is stopped.
 Therefore, an object of the present invention is to provide an external world recognition device that, when image deterioration is severe as in heavy rain, makes a HALT determination and stops the function, and that, when image deterioration is mild as in light rain, does not make a HALT determination and does not stop the function.
 To solve the above problem, a representative external world recognition device of the present invention includes: a camera installed in the vehicle interior; an image feature amount calculation unit that obtains an image feature amount from an image acquired by the camera; a recognition unit that recognizes an object outside the vehicle from the image acquired by the camera; a memory that stores a first image feature amount of a first image acquired at a timing determined in a predetermined cycle; and a function stop determination unit that determines whether to suspend the function of the recognition unit based on the first image feature amount, or on a comparison result between the first image feature amount and a second image feature amount of the current image.
 According to the present invention, by making the determination using time-series data, it can be judged whether the environment is one in which image features such as parallax are inherently difficult to obtain, or whether such features are missing because of image degradation caused by heavy rain or the like. Therefore, when image deterioration is severe, as in heavy rain, a HALT determination is made and the function is stopped; when deterioration is mild, as in light rain, no HALT determination is made and the function (driving support control) is not stopped, so driving assistance continues and keeps reducing the burden of the driver's driving operation.
 Problems, configurations, and effects other than those described above will be clarified by the following description of the embodiments.
FIG. 1 is a configuration diagram of an in-vehicle stereo camera according to an embodiment of the present invention.
FIG. 2 shows explanatory diagrams of (a) a left camera image, (b) a right camera image, and (c) a parallax image in a normal state.
FIG. 3 shows explanatory diagrams of (a) a left camera image, (b) a right camera image, and (c) a parallax image with raindrops adhering.
FIG. 4A is an explanatory diagram of the wiper positions before wiper operation.
FIG. 4B is an explanatory diagram of the wiper positions immediately after wiper wiping.
FIG. 5A is an explanatory diagram of the camera image before wiper operation.
FIG. 5B is an explanatory diagram of the camera image immediately after wiper wiping.
FIG. 6 is a flowchart of the function stop determination unit of the in-vehicle stereo camera.
FIGS. 7A, 7B, and 7C illustrate the behavior of the stop determination of this system during heavy rain: time-series changes in the number of parallaxes (FIG. 7A), the deterioration determination result in each frame (FIG. 7B), and the HALT occurrence counter (FIG. 7C).
FIGS. 8A, 8B, and 8C illustrate the behavior of the stop determination of this system during light rain: time-series changes in the number of parallaxes (FIG. 8A), the deterioration determination result in each frame (FIG. 8B), and the HALT occurrence counter (FIG. 8C).
 本発明の実施形態について、図面を参照しながら以下に説明する。 An embodiment of the present invention will be described below with reference to the drawings.
 図1は、本実施形態の代表図であり、本実施形態における車載ステレオカメラの構成図である。車載ステレオカメラ1は、車両に搭載され、車室内に取り付けられた(複数の)カメラで取得(撮像)された画像から、車外の対象物を認識し、ACCやAEB等の運転支援機能を実現する外界認識装置としての機能を有する。車載ステレオカメラ1は、左右一対の撮像部としてのカメラ2A、2B(左カメラ2A、右カメラ2B)と、画像特徴量算出部3と、認識部4と、第一の画像特徴量を記憶するメモリ5と、機能停止判定部6と、で構成される。車載ステレオカメラ1は、カメラやCPU、メモリ、電子回路などのハードウェアと、ハードウェアと協働して車両の運転支援を行うソフトウェアとを備えている。 FIG. 1 is the representative drawing of this embodiment and is a configuration diagram of the in-vehicle stereo camera in this embodiment. The in-vehicle stereo camera 1 is mounted on a vehicle, and functions as an external environment recognition device that recognizes objects outside the vehicle from images acquired (captured) by the camera(s) mounted in the vehicle interior and realizes driving support functions such as ACC (adaptive cruise control) and AEB (automatic emergency braking). The in-vehicle stereo camera 1 is composed of cameras 2A and 2B (left camera 2A, right camera 2B) as a pair of left and right imaging units, an image feature amount calculation unit 3, a recognition unit 4, a memory 5 that stores a first image feature amount, and a function stop determination unit 6. The in-vehicle stereo camera 1 includes hardware such as the cameras, a CPU, memory, and electronic circuits, and software that cooperates with the hardware to provide driving support for the vehicle.
 図1のカメラ2A、2Bは、画像を撮像する。カメラ2A、2Bは車幅方向に離間して配置されるように車両に固定されており、車両のフロントガラス越しに前方を撮像して、互いに同じ領域を撮像できるようになっている。 The cameras 2A and 2B in FIG. 1 take images. The cameras 2A and 2B are fixed to the vehicle so as to be spaced apart in the vehicle width direction, and can image the front through the windshield of the vehicle so as to image the same area.
 図1の認識部4は、カメラ2A、2Bより取得した画像から、周辺環境(ここでは前方)の車両や歩行者等の対象物を検知・認識して、ACCやAEB等の運転支援機能を実現する。 The recognition unit 4 in FIG. 1 detects and recognizes objects such as vehicles and pedestrians in the surrounding environment (here, ahead of the vehicle) from the images acquired by the cameras 2A and 2B, and realizes driving support functions such as ACC and AEB.
 画像特徴量算出部3は、左カメラ2Aで撮像された画像(以下、左カメラ画像とも呼ぶ)と右カメラ2Bで撮像された画像(以下、右カメラ画像とも呼ぶ)を入力として、画像中の画素値(輝度値)を解析することで画像特徴量を算出する。画像特徴量は、画素値そのものを利用して算出してもよいし、複数のピクセルから局所領域を定義し、局所領域内の平均輝度、重み付きの平均輝度、代表輝度などを利用して算出してもよい。本例では、画像特徴量算出部3は、画像特徴量として、視差数やエッジ数を算出する。 The image feature amount calculation unit 3 takes as inputs an image captured by the left camera 2A (hereinafter also called the left camera image) and an image captured by the right camera 2B (hereinafter also called the right camera image), and calculates an image feature amount by analyzing the pixel values (luminance values) in the images. The image feature amount may be calculated using the pixel values themselves, or by defining local regions of multiple pixels and using the average luminance, weighted average luminance, representative luminance, or the like within each local region. In this example, the image feature amount calculation unit 3 calculates the number of parallaxes and the number of edges as the image feature amounts.
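As a rough illustration of the local-region variant described above, the sketch below computes a per-block mean-luminance feature map. The block size and the use of a plain mean are assumptions for illustration; the text equally allows raw pixel values, weighted means, or representative luminances.

```python
import numpy as np

def local_mean_luminance(image, block=4):
    """Split a grayscale image into block x block local regions and
    return the mean luminance of each region as a feature map."""
    h, w = image.shape
    h2, w2 = h - h % block, w - w % block              # drop ragged edges
    tiles = image[:h2, :w2].reshape(h2 // block, block, w2 // block, block)
    return tiles.mean(axis=(1, 3))                     # one value per region

img = np.arange(64, dtype=float).reshape(8, 8)         # toy 8x8 "image"
feat = local_mean_luminance(img, block=4)
print(feat.shape)  # (2, 2)
```

Any of the alternative statistics (weighted mean, representative luminance) would replace only the `mean` reduction in the last line of the function.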
 画像中のエッジ数は、公知の技術として現フレームにおける左カメラ2Aまたは右カメラ2B(の画像)から算出することができる。 The number of edges in an image can be calculated from (the image of) the left camera 2A or the right camera 2B in the current frame by known techniques.
 視差は、左右カメラ2A、2Bの画像の対応点をパターンマッチングにより検出し、検出された対応点間の座標のずれを算出する。この視差を用いることで、三角測量の原理により実空間上における対応点までの距離を算出することができる。図2に、画像特徴量算出部3における視差計算のイメージを表す。図2では、(a)左カメラ画像と(b)右カメラ画像のパターンマッチングによって、(c)視差画像を算出する。視差は、人や白線、先行車等、エッジが出る部分で算出されるが、空等のエッジが出にくくパターンマッチングが困難な部分では視差が算出できない。この視差画像全体で、エッジが出る部分で算出された有効な視差の数を視差数と呼ぶ。 For parallax, corresponding points in the images of the left and right cameras 2A and 2B are detected by pattern matching, and the coordinate shift between the detected corresponding points is calculated. Using this parallax, the distance to the corresponding point in real space can be calculated by the principle of triangulation. FIG. 2 illustrates the parallax calculation in the image feature amount calculation unit 3: (c) a parallax image is calculated by pattern matching between (a) the left camera image and (b) the right camera image. Parallax is calculated where edges appear, such as on people, white lines, and preceding vehicles, but cannot be calculated where edges are scarce and pattern matching is difficult, such as in the sky. The number of valid parallaxes calculated at edge portions across the entire parallax image is called the number of parallaxes.
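The per-pixel matching and parallax counting described here can be sketched as simple SAD (sum of absolute differences) block matching over a rectified left/right pair, counting only positions where a confident match lands on textured content. The block size, search range, and thresholds below are illustrative assumptions, not values from this specification.

```python
import numpy as np

def disparity_count(left, right, block=5, max_disp=16, sad_thresh=50.0):
    """Count pixels with a valid disparity between rectified grayscale
    left/right images.  Flat regions (e.g. sky) are rejected because
    they match everywhere equally well."""
    h, w = left.shape
    half = block // 2
    count = 0
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            best = min(
                np.abs(patch - right[y - half:y + half + 1,
                                     x - d - half:x - d + half + 1]).sum()
                for d in range(max_disp)
            )
            if best < sad_thresh and patch.std() > 1.0:  # match on texture
                count += 1
    return count

left = np.add.outer(np.arange(20), np.arange(40)) % 7 * 10.0
right = np.roll(left, -3, axis=1)        # simulate a uniform 3-px disparity
print(disparity_count(left, right) > 0)  # True
```

A raindrop on one lens would distort patches in that image and push the best SAD above the threshold, reducing the count, which is the behavior exploited in FIG. 3.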
 図3は、図2の例で雨滴が付着した場合の例を示す。図3の(a)左カメラ画像のように左カメラ(詳しくは、左カメラの視野内のフロントガラス)に雨滴が付着した場合は、雨滴によって周辺環境の撮像画像がゆがむ、あるいは、隠されるため、(b)右カメラ画像とのパターンマッチングが困難となり、(c)視差画像のように全体の視差数は低下する。 FIG. 3 shows the example of FIG. 2 with raindrops attached. When raindrops adhere to the left camera (more precisely, to the windshield within the left camera's field of view) as in (a) the left camera image of FIG. 3, the captured image of the surroundings is distorted or hidden by the raindrops, so (b) pattern matching with the right camera image becomes difficult and, as in (c) the parallax image, the overall number of parallaxes decreases.
 メモリ5は、画像特徴量算出部3により取得した第一の画像における第一の画像特徴量を記憶する。第一の画像特徴量を算出するための第一の画像は、ワイパー払拭によって規定される所定周期で定められたタイミングで取得する。図4A、B、図5A、Bに、ワイパー払拭直後について説明するワイパー位置、カメラ画像を示す。図4A、B中、2A、2Bは左右カメラ、10はフロントガラス、11A、11Bは左右ワイパーである。 The memory 5 stores the first image feature amount of the first image acquired by the image feature amount calculation unit 3. The first image for calculating the first image feature amount is acquired at a timing determined by a predetermined cycle defined by wiper wiping. FIGS. 4A, 4B, 5A, and 5B show the wiper positions and camera images that explain the state immediately after wiper wiping. In FIGS. 4A and 4B, 2A and 2B are the left and right cameras, 10 is the windshield, and 11A and 11B are the left and right wipers.
 図4Aはワイパー作動前のイメージであり、雨天時にはこの時、カメラ前のフロントガラスに雨滴が付着し、カメラから得られる画像は、図5Aのようになる。一方、図5Bは、ワイパー払拭直後の画像である。図4Bのようにワイパー11A、11Bが共に左右カメラ前を通過してフロントガラスに付着した雨滴を払拭し、図5Bのような鮮明な画像が取得できる場合を、ワイパー払拭直後と定義する。例えば、一般的な車両では、ワイパー11A、11Bは、図4Aのようなワイパー作動前の位置(通常時ないし収納時の位置)から、フロントガラス上を一方向(図4A、Bに示す例では、時計回り)に動作(回転)する。そして、ワイパー作動前の位置から最も離間した位置(図4BのAA、BBの位置)に到達した後、フロントガラス上を一方向とは逆方向(図4A、Bに示す例では、反時計回り)に動作(回転)し、図4Aのようなワイパー作動前の位置(通常時ないし収納時の位置)に再び戻る。このような一連の動作を1回または複数回繰り返し実行することで、フロントガラスに付着した雨滴を払拭する。このような車両において、前記一連の動作の中で、ワイパー11A、11Bはそれぞれ、左カメラ2A前、右カメラ2B前を2回(一方向と他方向の2回)通過することになる。なお、カメラの設置位置とワイパーの払拭範囲(動作範囲)によっては、ワイパー11A、11Bがそれぞれ、左カメラ2A前を2回(ワイパー11A、11Bで合計4回)通過したり、ワイパー11A、11Bがそれぞれ、右カメラ2B前を2回(ワイパー11A、11Bで合計4回)通過したりする場合もある。一般的には、ワイパー11A、11Bがそれぞれ、左カメラ2A前、右カメラ2B前を(一方向に1回ではなく)一方向と他方向に2回通過することで、前記一連の動作の中で、ワイパー払拭が完了すると考えられる。そのため、前記一連の動作の中で、ワイパー11A、11Bがそれぞれ、左カメラ2A前、右カメラ2B前を2回目に完全に通過した直後(換言すると、一方向に通過した後に他方向に通過した直後)を、ワイパー払拭直後と定義する。 FIG. 4A shows the state before wiper operation; in rainy weather, raindrops adhere to the windshield in front of the cameras at this point, and the image obtained from a camera looks like FIG. 5A. FIG. 5B, on the other hand, is an image immediately after wiper wiping. The state in which both wipers 11A and 11B have passed in front of the left and right cameras as in FIG. 4B, wiped the raindrops off the windshield, and a clear image such as FIG. 5B can be obtained is defined as immediately after wiper wiping. For example, in a typical vehicle, the wipers 11A and 11B move (rotate) across the windshield in one direction (clockwise in the example of FIGS. 4A and 4B) from the pre-operation position (the normal, stowed position) shown in FIG. 4A. After reaching the positions farthest from the pre-operation position (positions AA and BB in FIG. 4B), they move (rotate) back across the windshield in the opposite direction (counterclockwise in the example of FIGS. 4A and 4B) and return to the pre-operation position shown in FIG. 4A. Raindrops on the windshield are wiped off by performing this series of operations one or more times. In such a vehicle, within one series of operations, the wipers 11A and 11B each pass in front of the left camera 2A and the right camera 2B twice (once in each direction). Depending on the camera mounting positions and the wiping (operating) range of the wipers, the wipers 11A and 11B may each pass in front of the left camera 2A twice (four times in total for the two wipers), or may each pass in front of the right camera 2B twice (four times in total). In general, wiping is considered complete within the series of operations once the wipers 11A and 11B have each passed in front of the left camera 2A and the right camera 2B twice (once in each direction), rather than only once in one direction. Therefore, the moment immediately after the wipers 11A and 11B have completely passed in front of the left camera 2A and the right camera 2B for the second time within the series of operations (in other words, immediately after passing in the other direction following the pass in the first direction) is defined as immediately after wiper wiping.
 また、本実施形態では、上述のワイパー払拭動作のタイミングが、機能停止判定(画像劣化判定)タイミングの基準となり得る。そのため、ある一連の動作の中のワイパー払拭直後から次の一連の動作の中のワイパー払拭直後までを、ワイパー11A、11Bの動作周期(駆動周期)と定義する。換言すると、ワイパー11A、11Bがワイパー作動前の位置からフロントガラス上を動作(回転)してワイパー作動前の位置に再び戻るまでを、ワイパー11A、11Bの動作周期と定義することもできる。さらに換言すると、ワイパー払拭動作の一連の動作の中の所定のタイミングを基準タイミングとして設定し、ある一連の動作の中の所定のタイミングから次の一連の動作の中の所定のタイミングまでを、ワイパー11A、11Bの動作周期と定義することもできる。 In this embodiment, the timing of the wiper wiping operation described above can serve as the reference for the timing of the function stop determination (image degradation determination). Therefore, the interval from immediately after wiper wiping in one series of operations to immediately after wiper wiping in the next series of operations is defined as the operation cycle (drive cycle) of the wipers 11A and 11B. In other words, the operation cycle of the wipers 11A and 11B can also be defined as the interval from when the wipers leave their pre-operation position, move (rotate) across the windshield, and return to the pre-operation position. In still other words, a predetermined timing within the series of wiping operations may be set as a reference timing, and the interval from that predetermined timing in one series of operations to the same predetermined timing in the next series of operations can also be defined as the operation cycle of the wipers 11A and 11B.
 ワイパー払拭直後かどうかは、カメラの画像や、ワイパーのモーター信号、ワイパーの駆動周期、雨滴センサからの信号などから判定することができる。画像特徴量算出部3は、ワイパー払拭直後の画像での画像特徴量をメモリ5へ第一の画像の画像特徴量として上書き保存する。 Whether the wipers have just finished wiping can be determined from the camera image, the wiper motor signal, the wiper drive cycle, a signal from a raindrop sensor, and so on. The image feature amount calculation unit 3 overwrites the memory 5 with the image feature amount of the image immediately after wiper wiping, saving it as the image feature amount of the first image.
 図1の機能停止判定部6は、画像特徴量算出部3により取得した現在のフレーム(画像)での画像特徴量を第二の画像特徴量として、メモリ5に保存された第一の画像特徴量と比較して、機能(認識部4の運転支援機能)の停止判定を行う。 The function stop determination unit 6 in FIG. 1 takes the image feature amount of the current frame (image) acquired by the image feature amount calculation unit 3 as a second image feature amount, compares it with the first image feature amount stored in the memory 5, and determines whether to stop the function (the driving support function of the recognition unit 4).
 次に、図6を参照して、上述の車載ステレオカメラ1の機能停止判定部6の具体的な動作内容について説明する。図6は、本実施形態に関わる車載ステレオカメラ1の機能停止判定部6の手順を示すフローチャートである。 Next, with reference to FIG. 6, specific operation contents of the function stop determination unit 6 of the above-described in-vehicle stereo camera 1 will be described. FIG. 6 is a flow chart showing the procedure of the function stop determination section 6 of the in-vehicle stereo camera 1 according to this embodiment.
 まず、S101でワイパー作動中かどうかを判定し、ワイパー作動中である場合は、S102へ移行する。ワイパー作動中でない場合には、機能停止判定を終了する。ワイパー作動中かどうかを判定するためには、ワイパースイッチ信号などを取得する。 First, in S101, it is determined whether the wipers are in operation, and if the wipers are in operation, the process proceeds to S102. If the wiper is not in operation, the function stop determination is terminated. A wiper switch signal or the like is acquired in order to determine whether the wiper is in operation.
 S102では、メモリ5へ保存したワイパー払拭直後の第一の画像特徴量と現在のフレームの特徴数である第二の画像特徴量を比較して、減少率dを求める。第一の画像特徴量をf1、第二の画像特徴量をf2とする時、減少率d[%]は以下の式(1)で求められる。
[数1]
  d = ( f1 - f2 ) ÷ f1 × 100  ・・・(1)
In S102, the first image feature amount immediately after wiper wiping, stored in the memory 5, is compared with the second image feature amount, i.e. the feature count of the current frame, to obtain the reduction rate d. When the first image feature amount is f1 and the second image feature amount is f2, the reduction rate d [%] is obtained by the following equation (1).
[Number 1]
d = ( f1 - f2 ) ÷ f1 × 100 (1)
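Equation (1) translates directly into code; a trivial sketch, assuming the post-wipe baseline f1 is non-zero:

```python
def reduction_rate(f1, f2):
    """Equation (1): percentage drop of the current feature count f2
    relative to the post-wipe baseline f1 (assumes f1 > 0)."""
    return (f1 - f2) / f1 * 100.0

print(reduction_rate(2000, 500))  # 75.0
```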
 S103では、減少率dが基準値dTHより大きいかどうかを判定する。基準値dTHより大きい場合は、現在フレームを劣化と判定し、S104にてHALT発生カウンタのカウンタ値を加算する。基準値dTH以下の場合は、現在フレームを非劣化と判定し、S105にてHALT発生カウンタのカウンタ値を減算する。 In S103, it is determined whether the reduction rate d is greater than the reference value dTH. If it is greater than the reference value dTH, the current frame is judged deteriorated, and the counter value of the HALT occurrence counter is incremented in S104. If it is less than or equal to the reference value dTH, the current frame is judged non-deteriorated, and the counter value of the HALT occurrence counter is decremented in S105.
 S106では、S104、S105で算出したHALT発生カウンタのカウンタ値によってカメラ機能を一時停止(HALT)するかどうかを判定する。カウンタ値が基準値以上の場合はS107へ移行し、カメラ機能を一時停止(HALT)する。この状態の場合、認識部4で行う運転支援機能を一時的に停止する。 In S106, it is determined whether to suspend the camera function (HALT) based on the counter value of the HALT generation counter calculated in S104 and S105. If the counter value is equal to or greater than the reference value, the process proceeds to S107 to temporarily stop (HALT) the camera function. In this state, the driving support function performed by the recognition unit 4 is temporarily stopped.
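Steps S101 through S107 can be sketched as one per-frame update of the HALT occurrence counter. The degradation threshold d_th, the counter threshold, and the increment/decrement sizes are hypothetical tuning constants; the specification only requires that the per-frame increment be larger than the decrement.

```python
def update_halt_state(wiper_on, f1, f2, counter,
                      d_th=50.0, counter_th=10, inc=3, dec=1):
    """One pass through the S101-S107 flow; returns (counter, halt)."""
    if not wiper_on:                       # S101: no wiping, no judgement
        return counter, False
    d = (f1 - f2) / f1 * 100.0             # S102: reduction rate, eq. (1)
    if d > d_th:                           # S103: frame judged degraded
        counter += inc                     # S104: increment (larger step)
    else:
        counter = max(0, counter - dec)    # S105: decrement, clamped at 0
    halt = counter >= counter_th           # S106: compare with reference
    return counter, halt                   # S107: HALT when halt is True

c, halt = 0, False
for f2 in [400, 380, 420, 390, 410]:       # heavy-rain frames, f1 = 2000
    c, halt = update_halt_state(True, 2000, f2, c)
print(halt)  # True
```

With mildly reduced feature counts, as in light rain, d stays below d_th, the counter never accumulates, and halt remains False, matching the FIG. 8 behavior.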
 次に、上記フローの具体的な挙動について、図7A、B、C、図8A、B、Cを用いて説明する。 Next, the specific behavior of the above flow will be explained using FIGS. 7A, B, C and 8A, B, C.
 図7A、B、Cは、豪雨シーンでの挙動のイメージ図である。図7Aは、視差数の時系列変化のグラフである。豪雨の場合では、ワイパー払拭直後(▲)は画像が鮮明に映るため(図5B参照)、視差数がピークに達する。その後、次回のワイパー払拭まで、カメラ前のフロントガラスに雨滴が少しずつ付着するため(図5A参照)、視差数は減少する。次のワイパー払拭直後のタイミングで、また視差数が増加し、以降は同様の挙動を繰り返す。図7Bは、各フレームでの劣化判定の結果であり、S103の判定の結果を表す。図7Bのグラフでは、S104の劣化と判定した場合を1、S105の非劣化と判定した場合を0として表す。豪雨の場合は、視差数の減少率が大きくなるので、劣化(1)、非劣化(0)を周期的に繰り返す挙動となる。図7Cは、S104、S105でのHALT発生カウンタを表す。図7Bの場合には、劣化(視差数の減少率が周期的に複数回基準値より大きくなる)、非劣化(視差数の減少率が周期的に複数回基準値以下となる)を周期的に繰り返すことで、カウンタ値は増加、減少を繰り返す。本例では、各フレームでの減少値と比較して増加値を大きくしている。ここで、カウンタ値と基準値を比較して(S106)、カウンタ値が基準値を上回った場合に、カメラの機能を一時停止(HALT)する(S107)。これにより、認識部4で行う運転支援機能が未作動や誤作動を起こす前に運転支援機能を一時的に停止する。 FIGS. 7A, 7B, and 7C illustrate the behavior in a heavy-rain scene. FIG. 7A is a graph of the time-series change in the number of parallaxes. In heavy rain, the image is sharp immediately after wiper wiping (▲) (see FIG. 5B), so the number of parallaxes peaks. Raindrops then gradually accumulate on the windshield in front of the cameras until the next wipe (see FIG. 5A), so the number of parallaxes decreases. Immediately after the next wipe, the number of parallaxes rises again, and the same behavior repeats thereafter. FIG. 7B shows the per-frame deterioration determination, i.e. the result of the judgement in S103. In the graph of FIG. 7B, a frame judged deteriorated in S104 is shown as 1, and a frame judged non-deteriorated in S105 as 0. In heavy rain, the reduction rate of the number of parallaxes becomes large, so deterioration (1) and non-deterioration (0) alternate periodically. FIG. 7C shows the HALT occurrence counter of S104 and S105. In the case of FIG. 7B, deterioration (the reduction rate of the number of parallaxes periodically exceeds the reference value multiple times) and non-deterioration (the reduction rate periodically falls to or below the reference value multiple times) repeat, so the counter value repeatedly rises and falls. In this example, the per-frame increment is made larger than the decrement. The counter value is then compared with a reference value (S106), and when the counter value exceeds the reference value, the camera function is temporarily stopped (HALT) (S107). The driving support function performed by the recognition unit 4 is thus stopped temporarily before it fails to act or acts erroneously.
 図8A、B、Cは、積雪等の視差(またはエッジ)が出にくい環境での小雨の挙動のイメージ図である。図8Aは、視差数の時系列変化のグラフである。小雨の場合では、豪雨と同様にワイパー払拭直後(▲)は画像が鮮明に映るため(図5B参照)、視差数がピークに達し、その後は減少するが、その減少率は小さい。図8Bは、各フレームでの劣化判定の結果であり、S103の判定の結果を表す。小雨の場合は、視差数の減少率が大きくないので、非劣化(0)と判定される。図8Cは、S104、S105でのHALT発生カウンタを表す。非劣化であるため、カウンタ値は減少する(具体的には、カウンタ値は0以上であるため、変化せず、横ばいとなる)。ここで、カウンタ値と基準値を比較して(S106)、カウンタ値は基準値を下回る。これにより、認識部4で行う運転支援機能を停止させず、運転支援を継続してドライバの運転操作の負担を軽減し続けることができる。  Figs. 8A, B, and C are image diagrams of the behavior of light rain in an environment where parallax (or edge) is difficult to appear, such as snow. FIG. 8A is a graph of time-series changes in the number of parallaxes. In the case of light rain, as in heavy rain, the image appears sharp immediately after the wiper is wiped off (▴) (see FIG. 5B), so the number of parallaxes reaches a peak and then decreases, but the rate of decrease is small. FIG. 8B shows the result of the deterioration determination for each frame, representing the result of the determination in S103. In the case of light rain, the reduction rate of the number of parallaxes is not large, so it is determined as non-degraded (0). FIG. 8C shows HALT occurrence counters in S104 and S105. Since there is no deterioration, the counter value decreases (specifically, the counter value is 0 or more, so it does not change and stays the same). Here, the counter value is compared with the reference value (S106), and the counter value is below the reference value. As a result, the driving assistance function performed by the recognition unit 4 is not stopped, and the driving assistance can be continued to reduce the burden of the driver's driving operation.
 このように機能停止判定部6において、ワイパー払拭直後(左右のカメラ前をワイパーが完全に通過し、画像が鮮明に映る状態)を基準として現在フレームと比較して視差数の減少率を算出することで、もともと視差が出にくい環境なのか、豪雨等による画像劣化によって視差が出ていないのかが判断(言い換えると、通常の雨と積雪等のもともと視差が出にくい環境を区別)できる。 In this way, by having the function stop determination unit 6 calculate the reduction rate of the number of parallaxes between the current frame and the reference taken immediately after wiper wiping (the state in which the wipers have completely passed in front of the left and right cameras and the image is sharp), it can be judged whether the environment inherently yields little parallax or whether parallax is missing because of image degradation caused by heavy rain or the like (in other words, ordinary rain can be distinguished from environments such as snow where parallax is inherently hard to obtain).
 なお、上記実施形態では、HALT発生カウンタを用いて機能停止を判定したが、図7Aのような視差数の周期的な変化をとらえる別の手法を用いてもよい。例えば、図7Bのような各フレームでの劣化判定の結果の立上り間隔や立下り間隔の周期的な変化(周期性)を基に、機能停止を判定してもよい。 In the above embodiment, a HALT occurrence counter is used to determine whether a function has stopped. However, another method of capturing periodic changes in the number of parallaxes as shown in FIG. 7A may be used. For example, it is possible to determine the stoppage of a function based on the periodic change (periodicity) of the rising interval and the falling interval as a result of deterioration determination in each frame as shown in FIG. 7B.
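One hedged sketch of this alternative: record the frame indices of 0→1 transitions in the per-frame degradation flags (FIG. 7B) and examine the spacing between them; roughly constant intervals indicate the wiper-synchronized degradation pattern of heavy rain. This is an illustration of the idea, not a method defined in the text.

```python
def rising_edge_intervals(flags):
    """Frame counts between successive 0->1 transitions of the
    per-frame degradation flags."""
    edges = [i for i in range(1, len(flags))
             if flags[i] == 1 and flags[i - 1] == 0]
    return [b - a for a, b in zip(edges, edges[1:])]

flags = [0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 1, 1]   # periodic degradation
print(rising_edge_intervals(flags))  # [4, 4]
```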
 また、上記実施形態では、機能停止を判定するタイミングとして、ワイパー払拭直後(左右のカメラ前をワイパーが完全に通過し、画像が鮮明に映る状態)を基準として現在フレームと比較して、ワイパー払拭直後の画像の画像特徴量に対する現在フレームの画像特徴量の減少率を算出することで、機能停止を判定した。ただし、機能停止を判定するタイミングは、上記実施形態に限られない。 In the above embodiment, function stop was determined by taking the frame immediately after wiper wiping (the state in which the wipers have completely passed in front of the left and right cameras and the image is sharp) as the reference, comparing it with the current frame, and calculating the reduction rate of the image feature amount of the current frame relative to the image feature amount of the image immediately after wiper wiping. However, the timing for determining function stop is not limited to this embodiment.
 例えば、ワイパー払拭直後から所定時間後(または所定フレーム数後)を基準として現在フレームと比較して、画像特徴量の減少率を算出することで、機能停止を判定してもよい。また、ワイパー払拭直前(左右カメラ前をワイパーが完全に通過する直前で、視差が出にくい画像の状態)を基準として現在フレームと比較して、ワイパー払拭直前の画像の画像特徴量に対する現在フレームの画像特徴量の増加率を算出することで、機能停止を判定してもよい。また、ワイパー払拭直後などを基準として現在フレームより所定時間前(または所定フレーム数前)のフレーム(言い換えると、例えばワイパー払拭直前やワイパー払拭直後から所定時間後のフレーム)と比較して、画像特徴量の変化率(減少率または増加率)を算出することで、機能停止を判定してもよい。ここでの判定で用いる所定時間(または所定フレーム数)は、ワイパーの動作周期(例えば、あるワイパー払拭直後から次のワイパー払拭直後までの期間)などから規定することができる。 For example, function stop may be determined by calculating the reduction rate of the image feature amount, taking as the reference a point a predetermined time (or a predetermined number of frames) after the moment immediately after wiper wiping and comparing it with the current frame. Function stop may also be determined by taking as the reference the moment immediately before wiper wiping (just before the wipers completely pass in front of the left and right cameras, when the image yields little parallax), comparing it with the current frame, and calculating the increase rate of the image feature amount of the current frame relative to the image feature amount of the image immediately before wiper wiping. Alternatively, function stop may be determined by taking a reference such as the moment immediately after wiper wiping and comparing the current frame with a frame a predetermined time (or a predetermined number of frames) earlier (in other words, for example, a frame immediately before wiper wiping, or a frame a predetermined time after the moment immediately after wiper wiping), and calculating the rate of change (reduction rate or increase rate) of the image feature amount. The predetermined time (or predetermined number of frames) used in this determination can be defined from the wiper operation cycle (for example, the period from immediately after one wiper wipe to immediately after the next).
 すなわち、本実施形態の機能停止判定部6は、ワイパーの動作周期内の異なる2点(2時刻)を特定し、そのワイパーの動作周期内の異なる2点(2時刻)の画像特徴量の変化率(減少率または増加率)を算出することで、機能停止を判定することが可能である。 That is, the function stop determination unit 6 of this embodiment can determine function stop by identifying two different points (two times) within the wiper operation cycle and calculating the rate of change (reduction rate or increase rate) of the image feature amount between those two points.
 さらに、上記実施形態では、第一、第二の画像特徴量の変化率を用いて機能停止を判定したが、例えば、第一、第二の画像特徴量の絶対値同士の比較結果に基づいて、機能停止を判定するようにしてもよい。 Furthermore, in the above embodiment, function stop was determined using the rate of change between the first and second image feature amounts, but function stop may instead be determined, for example, based on the result of comparing the absolute values of the first and second image feature amounts.
 以上で説明したように、従来技術では、もともと視差等の画像特徴が出にくい環境なのか、豪雨等による画像劣化によって視差等の画像特徴が出ていないのかが判断できない。その場合、積雪路のようなもともと視差等の画像特徴が出にくい環境で、雨滴や雪が付着して画像の軽度の劣化が生じた場合に、誤ってHALTとして判定して機能(運転支援制御)を停止してしまう。 As described above, the conventional technology cannot determine whether the environment inherently yields few image features such as parallax, or whether image features such as parallax are missing because of image degradation caused by heavy rain or the like. In that case, in an environment where image features such as parallax are inherently scarce, such as a snow-covered road, mild image degradation caused by adhering raindrops or snow leads to an erroneous HALT judgement that stops the function (driving support control).
 豪雨等によって画像が劣化したため(画面全体がゆがんだため)、視差等の画像特徴が出なくなってHALTとなり機能(運転支援制御)が一時的に停止した場合、ユーザは違和感がない。一方、雪道等ではもともと視差等の画像特徴が出にくく、雨滴によって画像に軽度の劣化が生じたため(画面の一部がゆがんだため)、視差等の画像特徴が出なくなってHALTとなり機能(運転支援制御)が一時的に停止した場合、ユーザは違和感がある。言い換えると、視差等の画像特徴が出にくい場所(雪道や真っ暗な路面)でHALTする程度ではない軽度の劣化が画像に発生した場合に、誤ってHALTを発生させて機能(運転支援制御)を一時的に停止してしまうと、ユーザに違和感を与える。 When the image has degraded due to heavy rain or the like (the whole screen is distorted), so that image features such as parallax disappear, HALT occurs, and the function (driving support control) is temporarily stopped, the user does not find this strange. On the other hand, on a snowy road or the like, image features such as parallax are inherently scarce, and if raindrops cause only mild degradation (part of the screen is distorted), so that image features such as parallax disappear, HALT occurs, and the function (driving support control) is temporarily stopped, the user does find this strange. In other words, in places where image features such as parallax are scarce (snowy roads or pitch-dark road surfaces), erroneously triggering HALT and temporarily stopping the function (driving support control) over mild image degradation that does not warrant a HALT gives the user a sense of incongruity.
 そのため、もともと視差等の画像特徴が出にくい環境なのか、豪雨等による画像劣化によって視差等の画像特徴が出ていないのかを判断し、機能(運転支援制御)を一時的に停止すべき状況を正確に判定し、豪雨のように画像の劣化が激しい場合はHALTとして判定して機能を停止させ、画像の劣化が軽度である小雨の場合はHALT判定せずに機能を停止させないことが重要である。 Therefore, it is important to judge whether the environment inherently yields few image features such as parallax or whether image features such as parallax are missing due to image degradation caused by heavy rain or the like, to accurately determine the situations in which the function (driving support control) should be temporarily stopped, to judge HALT and stop the function when image degradation is severe as in heavy rain, and not to judge HALT or stop the function in light rain where image degradation is mild.
 以上で説明した本実施形態の車載ステレオカメラ(外界認識装置)1は、車室内に取り付けられたカメラ(左カメラ2A、右カメラ2B)と、前記カメラで取得された画像から、画像特徴量(視差数、エッジ数等)を求める画像特徴量算出部3と、前記カメラで取得された画像から、車外の対象物を認識する認識部4と、所定周期で定められたタイミング(ワイパー11A、11Bの動作で規定されるタイミング)で取得された第一の画像における第一の画像特徴量を記憶するメモリ5と、前記第一の画像特徴量、または当該第一の画像特徴量と現在の画像における第二の画像特徴量との比較結果に基づいて、前記認識部4の機能を一時停止するか否かを判定(HALT判定)する機能停止判定部6と、を有する。 The in-vehicle stereo camera (external environment recognition device) 1 of this embodiment described above has: cameras mounted in the vehicle interior (left camera 2A, right camera 2B); an image feature amount calculation unit 3 that obtains image feature amounts (the number of parallaxes, the number of edges, etc.) from the images acquired by the cameras; a recognition unit 4 that recognizes objects outside the vehicle from the images acquired by the cameras; a memory 5 that stores a first image feature amount of a first image acquired at a timing determined by a predetermined cycle (a timing defined by the operation of the wipers 11A and 11B); and a function stop determination unit 6 that determines (HALT determination) whether to temporarily stop the function of the recognition unit 4 based on the first image feature amount, or on the result of comparing the first image feature amount with a second image feature amount of the current image.
 より詳しくは、本実施形態の車載ステレオカメラ(外界認識装置)1は、ワイパー払拭直後の画像特徴量を保存して、現在の画像特徴量との減少率を算出して、減少率が基準値より大きく、かつ、周期的に複数回減少率が基準値より大きくなる場合に、認識部4の機能を一時的に停止する。 More specifically, the in-vehicle stereo camera (external environment recognition device) 1 of this embodiment stores the image feature amount immediately after wiper wiping, calculates the reduction rate relative to the current image feature amount, and temporarily stops the function of the recognition unit 4 when the reduction rate is larger than a reference value and periodically exceeds the reference value multiple times.
 本実施形態によれば、時系列データ(例えば、ワイパー払拭直後を基準とした2点の時系列のフレーム)を用いて判定を行うことで、もともと視差等の画像特徴が出にくい環境なのか、豪雨等による画像劣化によって視差等の画像特徴が出ていないのかが判断できるため、豪雨のように画像の劣化が激しい場合はHALTとして判定して機能を停止させ、画像の劣化が軽度である小雨の場合はHALT判定せずに機能を停止させない。そのため、豪雨のように画像の劣化が激しい場合は、機能(運転支援制御)を一時的に停止する一方、HALTする程度ではない画像の劣化が軽度である小雨の場合は、機能(運転支援制御)を停止させず、運転支援を継続してドライバの運転操作の負担を軽減し続けることができる。 According to this embodiment, by making the determination using time-series data (for example, two time-series frames referenced to the moment immediately after wiper wiping), it can be judged whether the environment inherently yields few image features such as parallax or whether image features such as parallax are missing due to image degradation caused by heavy rain or the like. Therefore, when image degradation is severe as in heavy rain, HALT is judged and the function is stopped, while in light rain where image degradation is mild, no HALT judgement is made and the function is not stopped. As a result, the function (driving support control) is temporarily stopped when image degradation is severe as in heavy rain, while in light rain, where the mild image degradation does not warrant a HALT, the function (driving support control) is not stopped, so driving support continues and keeps reducing the burden of the driver's driving operations.
 なお、本発明は上述の実施形態に限定されるものではなく、様々な変形形態が含まれる。例えば、上記した実施形態は本発明を分かりやすく説明するために詳細に説明したものであり、必ずしも説明した全ての構成を備えるものに限定されるものではない。また、ある実施形態の構成の一部を他の実施形態の構成に置き換えることが可能であり、また、ある実施形態の構成に他の実施形態の構成を加えることも可能である。また、各実施形態の構成の一部について、他の構成の追加・削除・置換をすることが可能である。また、上記の各構成、機能、処理部、処理手段等は、それらの一部又は全部を、例えば集積回路で設計する等によりハードウェアで実現してもよい。また、上記の各構成、機能等は、プロセッサがそれぞれの機能を実現するプログラムを解釈し、実行することによりソフトウェアで実現してもよい。各機能を実現するプログラム、テーブル、ファイル等の情報は、メモリや、ハードディスク、SSD(Solid State Drive)等の記録装置、または、ICカード、SDカード、DVD等の記録媒体に置くことができる。 It should be noted that the present invention is not limited to the above-described embodiments, and includes various modifications. For example, the above-described embodiments have been described in detail in order to explain the present invention in an easy-to-understand manner, and are not necessarily limited to those having all the configurations described. Also, part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment. Moreover, it is possible to add, delete, or replace a part of the configuration of each embodiment with another configuration. Further, each of the above configurations, functions, processing units, processing means, and the like may be realized by hardware, for example, by designing them in an integrated circuit. Moreover, each of the above configurations, functions, etc. may be realized by software by a processor interpreting and executing a program for realizing each function. Information such as programs, tables, files, etc. that realize each function can be stored in memory, hard disks, SSD (Solid State Drives) and other recording devices, or IC cards, SD cards, DVDs and other recording media.
1…車載ステレオカメラ(外界認識装置)
2A、2B…カメラ(撮像部)(2A:左カメラ、2B:右カメラ)
3…画像特徴量算出部
4…認識部
5…メモリ
6…機能停止判定部
10…フロントガラス
11A、11B…ワイパー
1... In-vehicle stereo camera (external world recognition device)
2A, 2B... Cameras (imaging units) (2A: left camera, 2B: right camera)
3... Image feature amount calculation unit
4... Recognition unit
5... Memory
6... Function stop determination unit
10... Windshield
11A, 11B... Wipers

Claims (10)

  1.  車室内に取り付けられたカメラと、
     前記カメラで取得された画像から、画像特徴量を求める画像特徴量算出部と、
     前記カメラで取得された画像から、車外の対象物を認識する認識部と、
     所定周期で定められたタイミングで取得された第一の画像における第一の画像特徴量を記憶するメモリと、
     前記第一の画像特徴量、または当該第一の画像特徴量と現在の画像における第二の画像特徴量との比較結果に基づいて、前記認識部の機能を一時停止するか否かを判定する機能停止判定部と、
     を有することを特徴とする外界認識装置。
    a camera mounted inside the vehicle,
    an image feature amount calculation unit that obtains an image feature amount from the image acquired by the camera;
    a recognition unit that recognizes an object outside the vehicle from the image acquired by the camera;
    a memory for storing a first image feature amount in the first image acquired at a timing determined in a predetermined cycle;
    a function stop determination unit that determines whether to suspend the function of the recognition unit based on the first image feature amount, or on a comparison result between the first image feature amount and a second image feature amount in the current image;
    An external world recognition device characterized by having:
  2.  The external environment recognition device according to claim 1, wherein the first image is acquired at a timing defined by the operation of a wiper.
  3.  The external environment recognition device according to claim 1, wherein the image feature amount calculation unit uses, as the image feature amount, the number of parallax points calculated in the current frame from two or more cameras.
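As an illustrative sketch only (not code from the patent), the "number of parallax points" feature of claim 3 can be read as the count of pixels for which stereo matching yields a confident disparity: a rain-blurred windshield degrades matching, so the count drops. Window size, disparity range, and the SAD acceptance threshold below are assumed tuning values:

```python
import numpy as np

def disparity_count(left, right, max_disp=16, block=5, max_sad=10.0):
    """Count pixels with a valid stereo disparity (toy SAD block matcher).

    A blurred or occluded image yields fewer confident matches, so this
    count falls when the windshield is wet -- the intuition behind using
    it as an image feature amount.
    """
    h, w = left.shape
    half = block // 2
    count = 0
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            best = np.inf
            for d in range(max_disp):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1]
                sad = np.abs(patch.astype(float) - cand.astype(float)).sum()
                best = min(best, sad)
            if best <= max_sad:  # confident match -> count as valid disparity
                count += 1
    return count
```

A production implementation would use an optimized matcher (e.g. semi-global matching) rather than this brute-force loop; only the counting idea carries over.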
  4.  The external environment recognition device according to claim 1, wherein the image feature amount calculation unit uses, as the image feature amount, the number of edges calculated in the current frame from a single camera.
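A minimal monocular sketch of the claim-4 feature, again not from the patent: count pixels whose Sobel gradient magnitude exceeds a threshold. The kernel and the threshold value are assumptions:

```python
import numpy as np

def edge_count(img, thresh=50.0):
    """Count edge pixels via Sobel gradient magnitude (monocular feature).

    Raindrops or fog blur the scene, so the edge count changes sharply
    relative to a clear reference image.
    """
    img = img.astype(float)
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(3):          # 3x3 cross-correlation without SciPy
        for j in range(3):
            patch = img[i:i + h - 2, j:j + w - 2]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    mag = np.hypot(gx, gy)
    return int((mag > thresh).sum())
```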
  5.  The external environment recognition device according to claim 1, wherein the memory storing the first image feature amount stores an image feature amount of an image captured at a timing defined by the operation of a wiper.
  6.  The external environment recognition device according to claim 1, wherein the function stop determination unit calculates, from the first image feature amount stored in the memory and the second image feature amount of the current image, a rate of change of the second image feature amount relative to the first image feature amount.
  7.  The external environment recognition device according to claim 6, wherein the function stop determination unit stops the function of the recognition unit if the rate of change is greater than a reference value, and does not stop the function of the recognition unit if the rate of change is less than or equal to the reference value.
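The claims 6-7 logic can be sketched as follows; the exact definition of the change rate and the reference value of 0.5 are illustrative assumptions, not values given in the patent:

```python
def should_stop(first_feat, second_feat, ref_rate=0.5):
    """Decide whether to suspend recognition (sketch of claims 6-7).

    The change rate is the relative change of the current (second)
    feature amount against the stored reference (first) one; exceeding
    the reference value suggests sudden degradation, e.g. water on the
    windshield.
    """
    if first_feat == 0:       # no usable reference -> keep running
        return False
    rate = abs(first_feat - second_feat) / first_feat
    return rate > ref_rate
```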
  8.  The external environment recognition device according to claim 7, wherein the function stop determination unit stops the function of the recognition unit when the rate of change periodically exceeds the reference value a plurality of times.
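One way to read claim 8 is that a single exceedance (one wiper pass, a momentary occlusion) should not suspend recognition; the exceedance must repeat. A hedged sketch, where the required count of consecutive exceedances is an assumed tuning parameter:

```python
class PeriodicStopJudge:
    """Sketch of claim 8: stop only after the change rate exceeds the
    reference value on several consecutive evaluation cycles, filtering
    one-off spikes.

    `required` is an illustrative parameter, not a value from the patent.
    """
    def __init__(self, ref_rate=0.5, required=3):
        self.ref_rate = ref_rate
        self.required = required
        self.streak = 0

    def update(self, rate):
        """Feed one cycle's change rate; return True when the stop
        condition (sustained exceedance) is met."""
        if rate > self.ref_rate:
            self.streak += 1
        else:
            self.streak = 0      # exceedance must be sustained
        return self.streak >= self.required
```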
  9.  The external environment recognition device according to claim 2, wherein the function stop determination unit determines whether to temporarily suspend the function of the recognition unit based on the rate of change of the image feature amount between two time-series points within the operation cycle of the wiper.
  10.  The external environment recognition device according to claim 1, wherein the function stop determination unit determines whether to temporarily suspend the function of the recognition unit based on a result of comparing the absolute values of the first image feature amount and the second image feature amount.
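Claim 10 replaces the ratio of claim 6 with a direct comparison of the feature amounts themselves. A minimal sketch under assumptions: the decision rule (drop by more than an absolute margin) and the margin value are illustrative, not specified by the patent:

```python
def should_stop_absolute(first_feat, second_feat, margin=500):
    """Sketch of claim 10: compare absolute feature amounts directly.

    Suspend recognition when the current feature amount has fallen
    below the stored reference by more than an absolute margin
    (`margin` is a hypothetical tuning value).
    """
    return abs(first_feat) - abs(second_feat) > margin
```

Compared with a rate-based rule, an absolute comparison behaves more predictably when the reference feature amount is small, at the cost of needing a margin tuned to the feature's typical scale.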
PCT/JP2021/042199 2021-11-17 2021-11-17 External environment recognition device WO2023089686A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/042199 WO2023089686A1 (en) 2021-11-17 2021-11-17 External environment recognition device

Publications (1)

Publication Number Publication Date
WO2023089686A1 (en)

Family

ID=86396396

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015088047A * 2013-10-31 2015-05-07 Fuji Heavy Industries Ltd. Driving support device
JP2020201876A * 2019-06-13 2020-12-17 Denso Corporation Information processing device and driving support system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 21964702; Country of ref document: EP; Kind code of ref document: A1
ENP Entry into the national phase
    Ref document number: 2023561975; Country of ref document: JP; Kind code of ref document: A