WO2012004938A1 - Vehicle periphery monitoring device - Google Patents
Vehicle periphery monitoring device
- Publication number
- WO2012004938A1 (PCT/JP2011/003417)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- candidate
- vehicle
- image
- size
- target
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/579—Depth or shape recovery from multiple images from motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/584—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
Definitions
- The present invention relates to an apparatus for monitoring the periphery of a vehicle, and more specifically to an apparatus that, when monitoring the periphery of a vehicle, uses the moving distance of the vehicle to exclude, from the object candidates in a captured image, those other than a desired object.
- Patent Document 1 describes a method in which a single infrared camera is mounted on a vehicle, an object around the vehicle is detected from an image captured by the camera, and the distance to the object is obtained based on the rate of change of the object's size in the image.
- After an object is detected, its type is usually determined, that is, whether the object is a person such as a pedestrian or an artificial structure such as a building. Since this determination generally includes image processing for examining the shape characteristics of the object and for tracing the object in time series to examine its behavior, its calculation load is relatively high. If such type determination processing is performed on all detected objects, the calculation load may become excessive.
- When determining a desired type of object in the above-described type determination process, it is desirable to reduce as much as possible the possibility of erroneously determining another type of object as the desired type.
- For example, when the desired type of object is a pedestrian, it is desirable to reduce as much as possible the possibility of erroneously determining an artificial structure as a pedestrian.
- Further, when a single camera is mounted on a vehicle to detect objects, cost can be reduced compared to mounting a pair of cameras.
- Accordingly, one object of the present invention is to propose a method that filters the object candidates detected from an image prior to the type determination process described above and excludes those that are unlikely to be objects of the desired type, thereby reducing the load of the subsequent image processing and the possibility of erroneously determining another type of object as the desired type.
- For example, when the desired type is a pedestrian, the possibility of erroneously determining an artificial structure as a pedestrian is reduced.
- Another object of the present invention is to propose a technique capable of performing the above filtering process simply and efficiently even when a single camera is used.
- According to one aspect of the present invention, an apparatus mounted on a vehicle for monitoring the periphery of the vehicle includes an imaging unit that images the periphery of the vehicle, means for extracting, based on the captured image, candidates for objects existing outside the vehicle, excluding means for filtering the extracted object candidates, and type determination means for determining whether an object candidate is an object of the desired type.
- The excluding means assumes that an extracted object candidate has a predetermined size set in advance for object candidates of the desired type, and includes: means for calculating the distance (Z0) from the vehicle to the object candidate based on the size (Ghd0) of the object candidate in the current image; means for calculating the moving distance (ΔZ) of the vehicle over the time interval from when the past image was captured to when the current image was captured; means for reducing the size of the object candidate in the current image at a change rate based on the calculated distance and the moving distance; comparison means for comparing the size (Ghd1) of the object candidate in the past image with the reduced size (Ghv); and means for determining that an object candidate whose size difference in the comparison is larger than a predetermined value is unlikely to be an object candidate of the desired type and excluding it.
- In another embodiment, the excluding unit assumes that an extracted object candidate has the predetermined size set in advance for object candidates of the desired type, and includes: means for calculating the distance (Z1) from the vehicle to the object candidate based on the size (Ghd1) of the extracted candidate in the past image; means for calculating the moving distance (ΔZ) of the vehicle over the time interval from when the past image was captured to when the current image was captured; means for enlarging the size of the object candidate in the past image at a change rate based on the calculated distance and the moving distance; comparison means for comparing the size (Ghd0) of the object candidate in the current image with the enlarged size (Ghv); and means for determining that an object candidate whose size difference in the comparison is larger than a predetermined value is unlikely to be an object candidate of the desired type and excluding it.
- In this way, on the assumption that the object candidate in the image has the predetermined size of the desired type, it is checked, based on the moving distance of the vehicle, whether the candidate is consistent with an object of the desired type, and filtering is performed accordingly. If the object candidate in the image does not actually represent an object of the desired type, the assumption will not hold, the difference in size will be large as a result of the comparison, and the object candidate can be excluded.
- The filtering process only needs to check the size of the object candidate in the image and does not require complicated processing such as examining the shape characteristics or time-series behavior of the object, so candidates other than the desired type of object can be excluded by a simple calculation.
- target candidates that may become noise in the subsequent type determination process can be excluded, so that the calculation load of the type determination can be reduced and a reduction in erroneous determination of the type determination can be expected.
- When the desired type of object is a pedestrian, the possibility of erroneously determining another type of object, such as an artificial structure, as a pedestrian can be reduced.
- In one embodiment, the imaging unit consists of a single infrared camera, and a high-luminance region obtained by binarizing the image acquired from the single infrared camera is extracted as the object candidate.
- Since only a single camera is used, cost can be reduced. Further, since the image from the infrared camera is binarized, relatively high-temperature object candidates such as pedestrians can be easily extracted.
- A block diagram showing the structure of the vehicle periphery monitoring apparatus according to one embodiment of this invention.
- A figure conceptually showing the comparison process between the current image and the virtual image generated from it.
- FIG. 1 is a block diagram showing a configuration of an apparatus for monitoring the periphery of a vehicle according to an embodiment of the present invention.
- The apparatus is mounted on a vehicle and includes a camera 1 capable of detecting far infrared rays, an image processing unit 2 for detecting objects around the vehicle based on the image data obtained by the camera 1 and determining their type, a speaker 3 that generates an audible alarm based on the result of the type determination, and a head-up display 4 (referred to as HUD) that displays the image captured by the camera 1 and presents an alarm based on the result of the type determination.
- The periphery monitoring device further includes a yaw rate sensor 6 that detects the yaw rate of the vehicle and a vehicle speed sensor 7 that detects the traveling speed (vehicle speed) of the vehicle; the detection results of these sensors are sent to the image processing unit 2 and used for predetermined image processing as required.
- the camera 1 is arranged on the central axis passing through the center of the vehicle width at the front of the vehicle 10 so as to capture the front of the vehicle 10.
- the infrared camera 1 has a characteristic that the higher the temperature of the object, the higher the level of the output signal (that is, the higher the luminance in the captured image).
- The image processing unit 2 includes an A/D conversion circuit that converts input analog signals into digital signals, an image memory that stores the digitized image signal, a central processing unit (CPU) that performs various arithmetic processing, a RAM (Random Access Memory) used by the CPU to store data during computation, a ROM (Read Only Memory) that stores the programs executed by the CPU and the data they use (including tables and maps), and output circuits that supply driving signals for the speaker 3, display signals for the HUD 4, and the like.
- the output signal of the camera 1 is converted into a digital signal and input to the CPU.
- The HUD 4 is provided so that its screen 4a is displayed on the front window of the vehicle 10 at a position ahead of the driver, so that the driver can visually recognize the screen displayed on the HUD 4.
- a display device of a so-called navigation device can be used as the display device.
- the navigation device is a device capable of detecting the current position of the vehicle, calculating an optimum route to the destination, and displaying the current position and the route on map information.
- FIG. 3 shows a functional block diagram of the image processing unit 2 according to an embodiment of the present invention.
- the object candidate extraction unit 21 acquires an image captured by the camera 1 and extracts an image area of the object candidate from the image.
- Since the image captured by the camera 1 is a grayscale image having luminance values corresponding to the temperature of the imaged objects, the extraction is realized based on the luminance values of the grayscale image. Specifically, an image region having luminance values higher than a predetermined luminance threshold ITH in the captured image is extracted as an object candidate.
- an object having a relatively high temperature such as a living body can be extracted, but an artificial structure may also be extracted, for example, when a heat source is provided.
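The luminance-threshold extraction described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name, the toy image, and the threshold value are assumptions for illustration only.

```python
import numpy as np

def extract_candidates(gray, ith):
    """Binarize a grayscale IR image: pixels brighter than the
    luminance threshold ITH become 1 (candidate), others 0."""
    return (gray > ith).astype(np.uint8)

# Toy 4x4 "image": a warm blob (high luminance) on a cool background.
frame = np.array([[10, 10, 10, 10],
                  [10, 200, 210, 10],
                  [10, 205, 220, 10],
                  [10, 10, 10, 10]])
mask = extract_candidates(frame, ith=128)  # only the 2x2 warm blob survives
```

In a real system the high-luminance regions of `mask` would then be grouped into candidate regions, as the labeling step described later does.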
- the type determination process represented by the functional block 25 is performed on the target candidate extracted in this manner.
- the type determination process is a process of finally determining what type of target object the target candidate is.
- The types include, for example, pedestrians (people) and artificial structures; by this determination process it can be specified whether the object candidate is a pedestrian or an artificial structure.
- Since a warning is issued based on the result of the type determination, relatively high accuracy is generally required. For this reason, it is often necessary to examine the shape of the object candidate in the image and its behavior in time-series images, which is processing with a relatively high calculation load.
- For pedestrian determination, processing such as checking whether features related to shapes unique to pedestrians can be extracted from the candidate is performed. For example, it can be examined whether a circular shape representing the head can be detected from the candidate, or whether a predetermined high-luminance region can be detected for each part of the body (head, torso, legs, etc.). In addition, it can be examined whether walking behavior is detected from the behavior of the candidate in time-series images. If such detailed type determination processing is performed for every extracted object candidate, the calculation load may increase.
- an exclusion processing unit (filtering unit) 23 is provided between the object candidate extraction unit 21 and the type determination unit 25.
- The exclusion processing unit 23 roughly (coarsely) filters the extracted object candidates prior to the type determination process, and operates so that candidates with a low possibility of being of the desired type are excluded from the subsequent type determination process. Thereby, object candidates that could become noise in the type determination process can be removed in advance.
- the “desired type” is determined in advance, and is a pedestrian (person) in this embodiment.
- For each object candidate not excluded, the subsequent type determination unit 25 finally determines whether the candidate is an object of the desired type, that is, a pedestrian, as described above. If it is finally determined to be a pedestrian, an alarm notifying the occupant of the pedestrian's presence is issued via the speaker 3 and/or the HUD 4.
- Since the exclusion processing unit 23 considers only the size of the object candidate, and not features related to its shape or its time-series behavior, candidates can be screened by a simple calculation. Therefore, providing the exclusion processing unit 23 reduces the calculation load of the image processing as a whole. Moreover, since a candidate not excluded by the exclusion processing unit 23 has some possibility of being an object of the desired type, a reduction in misjudgments by the type determination unit 25 can be expected. For example, when the desired type of object is a pedestrian as described above, the possibility of erroneously determining an artificial structure as a pedestrian can be reduced.
- In this embodiment, a pedestrian is used as the desired type: the exclusion processing unit 23 excludes object candidates determined to be unlikely to be pedestrians, and it is finally determined whether each candidate remaining after the exclusion is a pedestrian.
- For the pedestrian type determination process, any known appropriate method can be used, for example those described in JP 2007-241740 A, JP 2007-264778 A, and JP 2007-334751 A.
- In the type determination process, an artificial-structure determination and an animal determination may be performed together with the pedestrian determination. Thereby, for candidates not determined to be pedestrians, it can be determined whether they are artificial structures or animals.
- a method described in JP2003-016429A, JP2008-276787A, or the like can be used for the type determination process of the artificial structure.
- a method described in Japanese Patent Application Laid-Open No. 2007-310705, Japanese Patent Application Laid-Open No. 2007-310706, or the like can be used for the animal type determination processing.
- FIG. 4(a) shows the relationship in the distance direction between the host vehicle and an object at the past time t1 and the current time t0.
- the host vehicle 100 is indicated by a triangle.
- the object 101 that is a pedestrian is positioned in front of the host vehicle 100 and is imaged by the camera 1 mounted on the host vehicle 100.
- the distance from the own vehicle 100 to the target object 101 is Z1 at the past time point t1, and the distance from the own vehicle 100 to the target object 101 is Z0 at the current time point t0.
- the host vehicle 100 travels between the past time point t1 and the current time point t0, and the movement distance is represented by ⁇ Z.
- the travel distance ⁇ Z can be calculated by multiplying the time interval (t0-t1) between the time points t1 and t0 by the speed Vs of the vehicle 100 (detected by the vehicle speed sensor 7).
- (b) schematically shows the image captured by the camera 1 at the past time t1 of (a) (referred to as the past image) and the image captured by the camera 1 at the current time t0 (referred to as the current image).
- the object candidate extraction unit 21 has extracted the object candidate 105 that is a high luminance area from the past image and the current image.
- Note that although a pedestrian is shown as the object 101, at the time of extraction it is unknown to the image processing unit 2 whether or not the object candidate 105 is the pedestrian 101.
- The height of the candidate object 105 in the past image is Ghd1 (expressible, for example, as a number of pixels; the same applies hereinafter), and the height of the candidate object 105 in the current image is Ghd0.
- Here, it is assumed that the extracted object candidate has a predetermined size H set in advance for the desired type of object.
- In this embodiment, the pedestrian is the desired type of object to be finally determined in the image. Therefore, a predetermined size is set for the pedestrian (in this example a height of 170 cm is assumed, but for example the average height of an adult can be used). It is then assumed that the object candidate 105 extracted from the image as described above has this predetermined size H in real space.
- Under this assumption, the distance Z0 from the host vehicle 100 to the object 101 corresponding to the object candidate 105 of height Ghd0 in the current image can be calculated as follows, where f represents the focal length of the camera 1 and H the predetermined size (height in this embodiment) based on the above assumption:
- Z0 = (H × f) / Ghd0 (1)
- If the above assumption is correct, the distance Z0 calculated in this way should be the correct value.
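Equation (1) can be evaluated directly; a small sketch under the stated assumption (the function name, the 1.70 m height, and the pixel focal length are illustrative assumptions):

```python
def estimate_distance(h_m, f_px, ghd0_px):
    """Equation (1): Z0 = (H x f) / Ghd0, assuming the candidate has
    the preset real-space height H of the desired type (e.g. a pedestrian)."""
    return (h_m * f_px) / ghd0_px

# A 100 px tall candidate with f = 1000 px and H = 1.70 m:
z0 = estimate_distance(h_m=1.70, f_px=1000.0, ghd0_px=100.0)  # -> 17.0 m
```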
- Next, a virtual image is generated from the current image, estimating at what size the object should have been imaged at the past time t1. That is, the height Ghv at which the object should appear in the past image is estimated from the current image. Since the past distance is Z1 = Z0 + ΔZ, the height Ghv should correspond to the ratio of the current distance Z0 to the past distance:
- Ghv = Ghd0 × Z0 / (Z0 + ΔZ) (2)
- If the object candidate 105 actually represents a pedestrian, the above assumption is correct (and thus the distance Z0 of equation (1) is correct), so the height Ghv of the object candidate 105 in the virtual image should substantially match the height Ghd1 of the object candidate 105 in the past image actually acquired at the past time t1. Therefore, as shown in the comparison block of (b), the height Ghd1 of the object candidate 105 in the past image is compared with the height Ghv of the object candidate 105 in the virtual image.
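The scaling and comparison just described can be sketched as follows. This is a minimal illustration of equation (2) and the threshold test; the function names, tolerance, and numeric values are assumptions.

```python
def virtual_past_height(ghd0_px, z0_m, dz_m):
    """Equation (2): shrink the current height by Z0 / (Z0 + dZ),
    since the object was farther away (Z1 = Z0 + dZ) at the past time."""
    return ghd0_px * z0_m / (z0_m + dz_m)

def keep_candidate(ghd1_px, ghv_px, tol_px):
    """Keep the candidate only if the actual past height Ghd1 and the
    virtual height Ghv agree within the allowed error."""
    return abs(ghd1_px - ghv_px) <= tol_px

ghv = virtual_past_height(ghd0_px=100.0, z0_m=17.0, dz_m=3.0)  # -> 85.0 px
keep_candidate(84.0, ghv, tol_px=5.0)   # True: plausibly a pedestrian
keep_candidate(60.0, ghv, tol_px=5.0)   # False: assumption violated, exclude
```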
- If the magnitude of the difference between the two is larger than a predetermined threshold, the above-mentioned assumption, that is, the assumption that the extracted object candidate 105 has a size equivalent to a pedestrian in real space, is judged not to hold, and the object candidate 105 is determined to be unlikely to be a pedestrian.
- In this case, the object candidate 105 is excluded from the type determination process performed by the subsequent type determination unit 25.
- If the magnitude of the difference is equal to or less than the threshold, the object candidate 105 is passed to the subsequent type determination unit 25, where a final determination is made as to whether or not the object candidate 105 is a pedestrian.
- Thus, prior to the type determination process with its relatively high calculation load, the exclusion processing unit 23 can exclude (filter out) object candidates that cannot be said to be objects of the desired type (pedestrians in this embodiment). Therefore, the possibility of erroneous determination in the subsequent type determination process, that is, in this embodiment the possibility of erroneously determining an artificial structure as a pedestrian, can be reduced. Since the filtering process can be realized by a simple calculation as described above, it is a very effective process for selecting in advance the targets of the type determination process.
- In the embodiment described above, a virtual image is generated from the current image and compared with the past image. Alternatively, a virtual image may be generated from the past image and compared with the current image. This alternative is shown in FIG. 5.
- In this alternative form as well, the distance relationship of FIG. 4(a) applies, and the above assumption (premise) is the same. Under this premise, the distance Z1 from the host vehicle 100 to the object 101 corresponding to the object candidate 105 of height Ghd1 in the past image can be calculated as follows, where f represents the focal length of the camera 1 and H the predetermined size (height in this embodiment) under the premise:
- Z1 = (H × f) / Ghd1 (3)
- If the premise is correct, the distance Z1 calculated in this way should be the correct value.
- Next, a virtual image is generated from the past image, estimating at what size the object should be imaged at the current time t0. That is, the height Ghv at which the object should appear in the current image is estimated from the past image. Since the current distance is Z0 = Z1 − ΔZ, the height Ghv should correspond to the ratio of the past distance Z1 to the current distance:
- Ghv = Ghd1 × Z1 / (Z1 − ΔZ) (4)
- Since the distance-dependent ratio (Z1 / (Z1 − ΔZ)) is larger than 1, the height Ghv of the candidate 105 in the virtual image is an enlargement of the height Ghd1 of the candidate 105 in the past image.
- If the object candidate 105 actually represents a pedestrian, the above premise is correct (and thus the distance Z1 of equation (3) is correct), so the height Ghv of the object candidate 105 in the virtual image should substantially match the height Ghd0 of the object candidate 105 in the current image actually acquired at the current time t0. Therefore, as shown in the comparison block of FIG. 5, the height Ghd0 of the object candidate 105 in the current image is compared with the height Ghv of the object candidate 105 in the virtual image.
- If the magnitude of the difference between the two is larger than a predetermined threshold, the object candidate 105 is determined to be unlikely to be a pedestrian and is excluded from the type determination process performed by the subsequent type determination unit 25.
- If the magnitude of the difference is equal to or less than the threshold, it indicates that the premise is correct, so it is determined that the object candidate 105 may represent a pedestrian. The object candidate 105 is therefore passed to the subsequent type determination unit 25, where a final determination is made as to whether or not it is a pedestrian.
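The alternative direction, equations (3) and (4), differs only in which image is scaled; a sketch under the same assumptions (function name and values are illustrative):

```python
def virtual_current_height(ghd1_px, h_m, f_px, dz_m):
    """Eq. (3): Z1 = (H x f) / Ghd1, then eq. (4): enlarge the past
    height by Z1 / (Z1 - dZ) to predict the height in the current image."""
    z1 = (h_m * f_px) / ghd1_px
    return ghd1_px * z1 / (z1 - dz_m)

# An 85 px candidate in the past image, f = 1000 px, H = 1.70 m, dZ = 3 m:
ghv = virtual_current_height(ghd1_px=85.0, h_m=1.70, f_px=1000.0, dz_m=3.0)
# Z1 = 20 m, ratio 20/17, so Ghv = 100 px; compare this with Ghd0.
```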
- FIG. 6 is a flowchart showing the process executed by the image processing unit 2 according to an embodiment of the present invention. The process is performed at predetermined time intervals, follows the form of FIG. 4(b), and takes the pedestrian (person) as the desired type of object to be finally determined.
- an output signal of the camera 1 (that is, captured image data) is received as input, A / D converted, and stored in the image memory.
- the stored image data is a gray scale image including luminance information.
- In step S14, an image region representing the object candidate is extracted from the image captured by the camera 1. As described above, this can be realized, for example, by binarization of the image.
- a process is performed in which a brighter area than the luminance threshold ITH is set to “1” (white) and a dark area is set to “0” (black).
- the luminance threshold value ITH can be determined by any appropriate technique.
- Further, the binarized image data is converted into run-length data. That is, for each region turned white by binarization, the run-length data consists of the coordinates of the start point of each white pixel row (called a line; the start point is the leftmost pixel of the line) and the length from the start point to the end point (the rightmost pixel of the line), represented by the number of pixels.
- For example, with the y-axis taken in the vertical direction of the image and the x-axis in the horizontal direction, suppose the white region in the pixel row at y-coordinate y1 is a line from (x1, y1) to (x3, y1). Since this line consists of three pixels, it is represented by the run-length data (x1, y1, 3).
- Next, labeling of objects is performed to extract them. That is, among the lines converted into run-length data, lines that have overlapping portions in the y direction are regarded as one object, and a label is given to it. In this way, the image area of the candidate object is extracted.
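The run-length encoding and the three-pixel example above can be sketched as follows (the function name is an assumption, and the labeling of overlapping lines is omitted for brevity):

```python
def run_lengths(row, y):
    """Encode the white (1) runs of one binarized pixel row as
    (x_start, y, length) tuples, as in the text's (x1, y1, 3) example."""
    runs, x = [], 0
    while x < len(row):
        if row[x] == 1:
            start = x
            while x < len(row) and row[x] == 1:
                x += 1
            runs.append((start, y, x - start))
        else:
            x += 1
    return runs

run_lengths([0, 1, 1, 1, 0], y=1)  # -> [(1, 1, 3)]: a 3-pixel line starting at x=1
```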
- In step S15, the height Ghd0 of the candidate extracted as described above is calculated on the image captured this time, that is, the current image. It can be calculated as the length in the height (y) direction of the image region extracted as the object candidate (representable by the number of pixels). The calculated height is stored in a memory or the like.
- In step S16, assuming that the extracted object candidate has the predetermined size set in advance for the desired type of object, the real-space distance Z0 to the object candidate in the current image is estimated. Specifically, the distance Z0 can be estimated by substituting the height value preset for a pedestrian into H in equation (1).
- In step S17, the moving distance ΔZ of the vehicle from a past time (in this embodiment, the execution time of the previous cycle, that is, the previous image acquisition time) to the current time is calculated. It can be calculated by multiplying the time interval Δt at which the process is executed (that is, the interval at which images captured by the camera 1 are acquired) by the host vehicle speed Vs detected by the vehicle speed sensor 7.
- In step S18, the height Ghd1 of the image region extracted as the object candidate in the image acquired in the previous cycle (referred to as the past image) is obtained. It suffices to read out the value calculated and stored in step S15 of the previous cycle.
- In step S19, a virtual image is generated from the current image based on the distance Z0 and the moving distance ΔZ calculated in steps S16 and S17. Specifically, according to equation (2), the height Ghd0 in the current image is multiplied by the distance-dependent ratio (change rate) Z0 / (Z0 + ΔZ) to estimate the height Ghv that the object candidate should have in the past image.
- In step S20, the height Ghd1 of the object candidate in the past image is compared with the height Ghv of the object candidate in the virtual image, and the difference between the two is calculated.
- In step S21, it is determined whether or not the magnitude of the difference is equal to or less than a predetermined threshold value.
- the threshold value can be determined in advance in consideration of an allowable error (for example, it can be determined based on how much height error is allowed).
- If the magnitude of the difference is larger than the predetermined threshold value (S21: No), the premise that the object candidate in the image has a size corresponding to the desired type of object does not hold. It is therefore determined that the image region extracted as the candidate in step S14 is unlikely to represent a pedestrian, and in step S22 the object candidate is excluded from the subsequent type determination process.
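Steps S15 through S22 for a single candidate can be combined into one sketch. This is a simplified illustration of the FIG. 4(b) form, not the patent's implementation; the function name, default height H, tolerance, and numeric values are assumptions.

```python
def exclusion_filter(ghd0_px, ghd1_px, f_px, vs_mps, dt_s,
                     h_m=1.70, tol_px=5.0):
    """Return True if the candidate survives filtering (proceeds to the
    type determination of step S23), False if it is excluded (step S22)."""
    z0 = (h_m * f_px) / ghd0_px          # S16: eq. (1) with assumed height H
    dz = vs_mps * dt_s                   # S17: vehicle travel distance
    ghv = ghd0_px * z0 / (z0 + dz)       # S19: eq. (2), virtual past height
    return abs(ghd1_px - ghv) <= tol_px  # S20/S21: compare with past image

# Consistent with a 1.70 m pedestrian (height shrinks as predicted) -> kept:
exclusion_filter(ghd0_px=100.0, ghd1_px=85.0, f_px=1000.0, vs_mps=20.0, dt_s=0.15)
# A candidate whose size did not change as predicted -> excluded:
exclusion_filter(ghd0_px=100.0, ghd1_px=40.0, f_px=1000.0, vs_mps=20.0, dt_s=0.15)
```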
- If the magnitude of the difference is equal to or less than the threshold value (S21: Yes), in step S23 a type determination process is executed on the object candidate to finally determine whether or not the object candidate is a pedestrian.
- The type determination process can determine whether or not the candidate is a pedestrian in accordance with any known appropriate method, for example by examining features related to the shape of the candidate, its time-series behavior, and so on. If the candidate is determined to be a pedestrian, an alarm notifying the occupant of the pedestrian's presence can be issued via the speaker 3 or the HUD 4.
- When the process follows the form of FIG. 5 instead, in step S15 the height Ghd1 of the object candidate in the past image acquired in the previous cycle is obtained, and in step S16 the distance Z1 to the object candidate in the past image is calculated.
- In step S18, the height Ghd0 of the object candidate in the current image is calculated, and in step S19 a virtual image is generated from the past image; that is, the height Ghv of the object candidate in the virtual image is calculated according to equation (4) based on the distance Z1 and the moving distance ΔZ.
- In step S20, the height Ghv of the object candidate in the virtual image is compared with the height Ghd0 of the object candidate in the current image, and it is determined whether or not the magnitude of the difference between the two is equal to or less than a predetermined value.
- the desired type of object is a pedestrian (person).
- the desired type is further classified into adults and children, and the above exclusion process is performed for each type. May be.
- the predetermined size H is, for example, the average height of adults
- alternatively, the predetermined size H may be the average height of children of a predetermined age
- a type determination process for determining whether or not the candidate is a pedestrian is performed.
- the desired type of object is not limited to a pedestrian, and may be a predetermined animal (for example, a quadruped animal such as a bear or a deer).
- the predetermined size H indicates the size in the height (vertical) direction.
- by using the size in the height (vertical) direction, for example in the case of a pedestrian, the above exclusion process can be realized more accurately regardless of the pedestrian's orientation with respect to the vehicle.
- the size in the height direction is not necessarily used, and the size in the width (horizontal) direction may be used instead. This is because the width of the target object candidate in the captured image also changes according to the distance.
- the above exclusion process may also be realized by assuming that a candidate object extracted from a captured image has a predetermined width W set in advance for the desired type of object.
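The width-based variant relies on the same pinhole relation, with an assumed real-world width W in place of the height H. A minimal sketch, where W and f are illustrative values not given in this document:

```python
# Counterpart of equation (1) using the candidate's image width instead of its
# height: gwd = W * f / Z, so Z = (W * f) / gwd. W (assumed real-world width, m)
# and f (focal length, px) are illustrative values.

def distance_from_width(gwd, W=0.6, f=1000.0):
    """Estimate distance (m) from the candidate's width gwd (px) in the image."""
    return (W * f) / gwd
```

With these values, a candidate 30 px wide is estimated to be 20 m away; the virtual-image prediction and comparison then proceed exactly as in the height-based case.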
- an infrared camera is used as the camera, whereby a relatively high-temperature object such as a living body can be easily extracted.
- a visible camera may be used.
- the image area of the candidate object may be extracted from the captured image by any appropriate method, and the above-described exclusion process may be performed on the extracted candidate object.
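One such appropriate method is the one recited in claim 3: binarize the infrared image and take each connected high-luminance region as an object-candidate image area. The following is a minimal, hypothetical sketch of that extraction; the threshold value and the toy frame in the test are illustrative.

```python
# Hypothetical sketch of candidate extraction by binarization (cf. claim 3):
# threshold a grayscale frame and return the bounding box of each 4-connected
# high-luminance region. The threshold value is illustrative.

def extract_candidates(frame, threshold=200):
    """Return bounding boxes (top, left, bottom, right) of bright regions."""
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    boxes = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                # Flood-fill one 4-connected high-luminance region.
                stack = [(y, x)]
                seen[y][x] = True
                t, l, b, r = y, x, y, x
                while stack:
                    cy, cx = stack.pop()
                    t, l, b, r = min(t, cy), min(l, cx), max(b, cy), max(r, cx)
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and frame[ny][nx] >= threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                boxes.append((t, l, b, r))
    return boxes
```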
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Databases & Information Systems (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Image Processing (AREA)
- Traffic Control Systems (AREA)
- Image Analysis (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
Description
Z0=(H×f)/Ghd0 (1)
Ghv=Ghd0×(Z0/(Z0+ΔZ)) (2)
Z1=(H×f)/Ghd1 (3)
Ghv=Ghd1×(Z1/(Z1-ΔZ)) (4)
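These four relations follow from the pinhole-camera model and are mutually consistent: predicting the past size from the current image (equations (1) and (2)) and predicting the current size from the past image (equations (3) and (4)) reproduce each other's observations. A quick numeric check, with illustrative values for H, f, ΔZ, and the distances (none are specified in this document):

```python
H, f = 1.7, 1000.0   # assumed object height (m) and focal length (px); illustrative
dz = 2.0             # vehicle travel between the two frames (m); illustrative

z_past, z_now = 20.0, 18.0   # ground-truth distances for an object of height H
ghd1 = H * f / z_past        # observed height (px) in the past image
ghd0 = H * f / z_now         # observed height (px) in the current image

# Equations (1)-(2): estimate distance from the current image, shrink to the past size.
z0 = (H * f) / ghd0                   # (1)
ghv_past = ghd0 * (z0 / (z0 + dz))    # (2)

# Equations (3)-(4): estimate distance from the past image, enlarge to the current size.
z1 = (H * f) / ghd1                   # (3)
ghv_now = ghd1 * (z1 / (z1 - dz))     # (4)

assert abs(ghv_past - ghd1) < 1e-9    # (2) reproduces the past observation
assert abs(ghv_now - ghd0) < 1e-9     # (4) reproduces the current observation
```

For an object that really has height H, both predictions match the corresponding observation; a mismatch is exactly what the exclusion step detects.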
2 image processing unit
3 speaker
4 HUD
Claims (3)
- A vehicle periphery monitoring apparatus mounted on a vehicle, comprising:
imaging means for capturing an image of the surroundings of the vehicle;
extraction means for extracting, based on the captured image, object candidates present outside the vehicle;
exclusion means for excluding, from the extracted object candidates, any object candidate judged unlikely to be an object candidate of a desired type; and
type determination means for determining, for each object candidate remaining after the exclusion, whether or not the object candidate is an object of the desired type,
wherein the exclusion means further comprises:
means for calculating a distance (Z0) from the vehicle to the object candidate based on the size (Ghd0) of the extracted object candidate in the current image, on the assumption that the extracted object candidate has a predetermined size set in advance for object candidates of the desired type;
means for calculating a movement distance (ΔZ) of the vehicle occurring in the time interval from the time at which a past image was captured to the time at which the current image was captured;
means for reducing the size of the object candidate in the current image at a change rate based on the calculated distance to the object candidate and the movement distance;
comparison means for comparing the size (Ghd1) of the object candidate in the past image with the size (Ghv) of the object candidate reduced at the change rate; and
means for excluding, as unlikely to be an object candidate of the desired type, any object candidate for which the comparison by the comparison means determines the difference between the sizes to be larger than a predetermined value. - A vehicle periphery monitoring apparatus mounted on a vehicle, comprising:
imaging means for capturing an image of the surroundings of the vehicle;
extraction means for extracting, based on the captured image, object candidates present outside the vehicle;
exclusion means for excluding, from the extracted object candidates, any object candidate judged unlikely to be an object candidate of a desired type; and
type determination means for determining, for each object candidate remaining after the exclusion, whether or not the object candidate is an object of the desired type,
wherein the exclusion means further comprises:
means for calculating a distance (Z1) from the vehicle to the object candidate based on the size (Ghd1) of the extracted object candidate in the past image, on the assumption that the extracted object candidate has a predetermined size set in advance for object candidates of the desired type;
means for calculating a movement distance (ΔZ) of the vehicle occurring in the time interval from the time at which the past image was captured to the time at which the current image was captured;
means for enlarging the size of the object candidate in the past image at a change rate based on the calculated distance to the object candidate and the movement distance;
comparison means for comparing the size (Ghd0) of the object candidate in the current image with the size (Ghv) of the object candidate enlarged at the change rate; and
means for excluding, as unlikely to be an object candidate of the desired type, any object candidate for which the comparison by the comparison means determines the difference between the sizes to be larger than a predetermined value. - The vehicle periphery monitoring apparatus according to claim 1 or 2, wherein the imaging means comprises a single infrared camera, and
the extraction means extracts, as the object candidates, high-luminance regions obtained by binarizing an image acquired from the single infrared camera.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201180033722.1A CN102985957B (zh) | 2010-07-09 | 2011-06-15 | 车辆周围监测装置 |
US13/808,120 US9158738B2 (en) | 2010-07-09 | 2011-06-15 | Apparatus for monitoring vicinity of a vehicle |
EP11803280.4A EP2579230A4 (en) | 2010-07-09 | 2011-06-15 | DEVICE FOR MONITORING THE PROXIMITY OF A VEHICLE |
JP2012523512A JP5576937B2 (ja) | 2010-07-09 | 2011-06-15 | 車両の周辺監視装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010156705 | 2010-07-09 | ||
JP2010-156705 | 2010-07-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012004938A1 true WO2012004938A1 (ja) | 2012-01-12 |
Family
ID=45440934
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/003417 WO2012004938A1 (ja) | 2010-07-09 | 2011-06-15 | 車両の周辺監視装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9158738B2 (ja) |
EP (1) | EP2579230A4 (ja) |
JP (1) | JP5576937B2 (ja) |
CN (1) | CN102985957B (ja) |
WO (1) | WO2012004938A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014074951A (ja) * | 2012-10-02 | 2014-04-24 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP2014164461A (ja) * | 2013-02-25 | 2014-09-08 | Denso Corp | 歩行者検出装置、および歩行者検出方法 |
CN104380039A (zh) * | 2012-07-31 | 2015-02-25 | 哈曼国际工业有限公司 | 使用单个相机检测障碍物的系统和方法 |
US20210256728A1 (en) * | 2018-11-09 | 2021-08-19 | Denso Corporation | Object detection apparatus |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2827318A4 (en) * | 2012-03-12 | 2016-01-13 | Honda Motor Co Ltd | DEVICE FOR MONITORING VEHICLE PERIPHERY |
US9476970B1 (en) * | 2012-03-19 | 2016-10-25 | Google Inc. | Camera based localization |
US9460354B2 (en) | 2012-11-09 | 2016-10-04 | Analog Devices Global | Object detection |
US9672431B2 (en) * | 2012-11-09 | 2017-06-06 | Analog Devices Global | Object detection |
US9811738B2 (en) * | 2012-12-06 | 2017-11-07 | Nec Corporation | Appearance presentation system, method, and program |
JP6547292B2 (ja) | 2014-02-05 | 2019-07-24 | 株式会社リコー | 画像処理装置、機器制御システム、および画像処理プログラム |
KR20160032586A (ko) * | 2014-09-16 | 2016-03-24 | 삼성전자주식회사 | 관심영역 크기 전이 모델 기반의 컴퓨터 보조 진단 장치 및 방법 |
KR102310286B1 (ko) * | 2014-11-07 | 2021-10-07 | 현대모비스 주식회사 | 특정물체 감지 장치 및 특정물체 감지 방법 |
KR101960644B1 (ko) * | 2015-09-18 | 2019-03-20 | 닛산 지도우샤 가부시키가이샤 | 차량용 표시 장치 및 차량용 표시 방법 |
JP6782192B2 (ja) * | 2017-05-17 | 2020-11-11 | 株式会社デンソーアイティーラボラトリ | 物体検出装置、物体検出方法、及びプログラム |
CN110996065B (zh) * | 2019-12-18 | 2021-07-13 | 广州澳盾智能科技有限公司 | 一种基于物联网的远程生物调查指挥方法 |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003016429A (ja) | 2001-06-28 | 2003-01-17 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP2007213561A (ja) * | 2006-01-16 | 2007-08-23 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP2007241740A (ja) | 2006-03-09 | 2007-09-20 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP2007264778A (ja) | 2006-03-27 | 2007-10-11 | Honda Motor Co Ltd | 歩行者認識装置 |
JP2007310706A (ja) | 2006-05-19 | 2007-11-29 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP2007310705A (ja) | 2006-05-19 | 2007-11-29 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP2007334751A (ja) | 2006-06-16 | 2007-12-27 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP2008113296A (ja) * | 2006-10-31 | 2008-05-15 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP2008276787A (ja) | 2008-05-14 | 2008-11-13 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP2009193390A (ja) * | 2008-02-15 | 2009-08-27 | Honda Motor Co Ltd | 車両周辺監視装置、車両、車両周辺監視用プログラム、車両周辺監視方法 |
JP2009199138A (ja) * | 2008-02-19 | 2009-09-03 | Honda Motor Co Ltd | 車両周辺監視装置、車両、車両周辺監視プログラム |
JP2009265883A (ja) * | 2008-04-24 | 2009-11-12 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP2009276200A (ja) * | 2008-05-14 | 2009-11-26 | Hitachi Ltd | 車載用物体検知装置 |
JP2009276910A (ja) * | 2008-05-13 | 2009-11-26 | Toyota Central R&D Labs Inc | 画像処理装置、方法及びプログラム |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101593418A (zh) * | 2009-05-31 | 2009-12-02 | 上海宝康电子控制工程有限公司 | 嫌疑车辆关联查找方法 |
2011
- 2011-06-15 CN CN201180033722.1A patent/CN102985957B/zh not_active Expired - Fee Related
- 2011-06-15 EP EP11803280.4A patent/EP2579230A4/en not_active Withdrawn
- 2011-06-15 WO PCT/JP2011/003417 patent/WO2012004938A1/ja active Application Filing
- 2011-06-15 JP JP2012523512A patent/JP5576937B2/ja active Active
- 2011-06-15 US US13/808,120 patent/US9158738B2/en active Active
Patent Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003016429A (ja) | 2001-06-28 | 2003-01-17 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP2007213561A (ja) * | 2006-01-16 | 2007-08-23 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP2007241740A (ja) | 2006-03-09 | 2007-09-20 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP2007264778A (ja) | 2006-03-27 | 2007-10-11 | Honda Motor Co Ltd | 歩行者認識装置 |
JP2007310706A (ja) | 2006-05-19 | 2007-11-29 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP2007310705A (ja) | 2006-05-19 | 2007-11-29 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP2007334751A (ja) | 2006-06-16 | 2007-12-27 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP2008113296A (ja) * | 2006-10-31 | 2008-05-15 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP4267657B2 (ja) | 2006-10-31 | 2009-05-27 | 本田技研工業株式会社 | 車両周辺監視装置 |
JP2009193390A (ja) * | 2008-02-15 | 2009-08-27 | Honda Motor Co Ltd | 車両周辺監視装置、車両、車両周辺監視用プログラム、車両周辺監視方法 |
JP2009199138A (ja) * | 2008-02-19 | 2009-09-03 | Honda Motor Co Ltd | 車両周辺監視装置、車両、車両周辺監視プログラム |
JP2009265883A (ja) * | 2008-04-24 | 2009-11-12 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP2009276910A (ja) * | 2008-05-13 | 2009-11-26 | Toyota Central R&D Labs Inc | 画像処理装置、方法及びプログラム |
JP2008276787A (ja) | 2008-05-14 | 2008-11-13 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP2009276200A (ja) * | 2008-05-14 | 2009-11-26 | Hitachi Ltd | 車載用物体検知装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2579230A4 |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104380039A (zh) * | 2012-07-31 | 2015-02-25 | 哈曼国际工业有限公司 | 使用单个相机检测障碍物的系统和方法 |
US20150078619A1 (en) * | 2012-07-31 | 2015-03-19 | Harman International Industries, Incorporated | System and method for detecting obstacles using a single camera |
US9798936B2 (en) * | 2012-07-31 | 2017-10-24 | Harman International Industries, Incorporated | System and method for detecting obstacles using a single camera |
JP2014074951A (ja) * | 2012-10-02 | 2014-04-24 | Honda Motor Co Ltd | 車両周辺監視装置 |
JP2014164461A (ja) * | 2013-02-25 | 2014-09-08 | Denso Corp | 歩行者検出装置、および歩行者検出方法 |
US20210256728A1 (en) * | 2018-11-09 | 2021-08-19 | Denso Corporation | Object detection apparatus |
Also Published As
Publication number | Publication date |
---|---|
US9158738B2 (en) | 2015-10-13 |
JPWO2012004938A1 (ja) | 2013-09-02 |
US20130103299A1 (en) | 2013-04-25 |
CN102985957A (zh) | 2013-03-20 |
EP2579230A4 (en) | 2013-11-06 |
CN102985957B (zh) | 2015-03-04 |
JP5576937B2 (ja) | 2014-08-20 |
EP2579230A1 (en) | 2013-04-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5576937B2 (ja) | 車両の周辺監視装置 | |
JP4456086B2 (ja) | 車両周辺監視装置 | |
JP4173901B2 (ja) | 車両周辺監視装置 | |
JP5577398B2 (ja) | 車両の周辺監視装置 | |
JP3934119B2 (ja) | 車両周辺監視装置 | |
JP3987057B2 (ja) | 車両周辺監視装置 | |
JP4173902B2 (ja) | 車両周辺監視装置 | |
JP4528283B2 (ja) | 車両周辺監視装置 | |
JP4988781B2 (ja) | 車両の周辺監視装置 | |
JP4644273B2 (ja) | 車両周辺監視装置 | |
US9160986B2 (en) | Device for monitoring surroundings of a vehicle | |
JP4094604B2 (ja) | 車両周辺監視装置 | |
JP4128562B2 (ja) | 車両周辺監視装置 | |
JP4813304B2 (ja) | 車両周辺監視装置 | |
JP4425852B2 (ja) | 車両周辺監視装置 | |
US9030560B2 (en) | Apparatus for monitoring surroundings of a vehicle | |
JP2011134119A (ja) | 車両周辺監視装置 | |
JP5559121B2 (ja) | 対象物種別判定装置 | |
JP2004348645A (ja) | 赤外線画像認識装置、及び赤外線画像認識装置を用いた警報装置 | |
JP5383246B2 (ja) | 車両の周辺監視装置 | |
JP4472623B2 (ja) | 車両周辺監視装置 | |
JP2006185430A (ja) | 車両周辺監視装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180033722.1 Country of ref document: CN |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11803280 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 2012523512 Country of ref document: JP |
WWE | Wipo information: entry into national phase |
Ref document number: 13808120 Country of ref document: US |
WWE | Wipo information: entry into national phase |
Ref document number: 2011803280 Country of ref document: EP |
NENP | Non-entry into the national phase |
Ref country code: DE |