JPH08156723A - Vehicle obstruction detecting device - Google Patents

Vehicle obstruction detecting device

Info

Publication number
JPH08156723A
JPH08156723A (application numbers JP6302292A, JP30229294A)
Authority
JP
Japan
Prior art keywords
vehicle
obstacle
means
distance
object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP6302292A
Other languages
Japanese (ja)
Inventor
Ryota Shirato
Yasushi Ueno
裕史 上野
良太 白土
Original Assignee
Nissan Motor Co Ltd
日産自動車株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nissan Motor Co Ltd
Priority to JP6302292A priority Critical patent/JPH08156723A/en
Publication of JPH08156723A publication Critical patent/JPH08156723A/en
Pending legal-status Critical Current

Abstract

PURPOSE: To enhance the accuracy of obstacle detection in the traveling direction of a vehicle, and to prevent a road sign or the like that poses no risk of collision from being regarded as an obstacle. CONSTITUTION: A lane area, and the position of an object ahead of the vehicle within that lane, are detected from the image signal of a CCD camera, and the distance to the object ahead is measured with a laser beam (step S21). If data are obtained from both, it is judged that a preceding vehicle is present; the distances detected by the two means are compared within the lane area and an inter-vehicle distance is determined (steps S22 to S28). The degree of approach is then determined (steps S29 to S31).

Description

Detailed Description of the Invention

[0001]

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a vehicle obstacle detection device, and more particularly to a technique for improving the obstacle detection accuracy.

[0002]

2. Description of the Related Art A conventional vehicle obstacle detection device is disclosed, for example, in Japanese Patent Laid-Open No. 4-193641. As means for recognizing the situation ahead of the own vehicle, it comprises an image processing device that processes an image of the forward scene, and a separate distance measuring device that detects the distance and azimuth of an object existing ahead of the own vehicle.
The area of the lane in which the own vehicle is traveling is recognized from the image signal obtained by the image processing device, and from the lateral position and distance of the object detected by the distance measuring device it is determined whether the detected object lies inside or outside the recognized lane area. If it is determined to be inside the area, the detected object is regarded as an obstacle. It is then determined whether or not the own vehicle may come into contact with the detected obstacle, and an alarm is issued to the driver when the possibility (degree) of contact is judged to be high.

[0003]

However, the conventional vehicle obstacle detection device has the following problem, because object detection relies on the distance measuring device alone. A distance measuring device such as a laser radar has a range-finding range Z that extends in the vertical direction of the vehicle 1 as well, as shown in FIGS. 12 and 13. Therefore, a road sign 2 standing close to the road near the crest of an uphill grade, as in FIG. 12, or a road sign 3 mounted on a pedestrian bridge, as in FIG. 13, can enter the range Z. If, on the basis of the azimuth and distance data obtained from such signs, they are judged to lie within the own lane area, the road signs 2 and 3, which have little likelihood of contact, may be recognized as obstacles.

[0004]

The present invention has been made in view of this problem of the prior art. Its object is to provide a vehicle obstacle detection device with improved obstacle detection accuracy, in which the position of an object existing in the recognized own lane area is detected not only on the distance measuring device side but also on the image processing device side, and the existence of an obstacle is confirmed from both detection results.

[0005]

Therefore, the vehicle obstacle detection device according to claim 1 of the present invention comprises, as shown in FIG. 1: an image pickup means A for picking up an image of the road condition in the traveling direction of the host vehicle; a traveling road area detecting means B for detecting the traveling road area on which the host vehicle travels, based on the image signal from the imaging means A; an object position detecting means C for detecting the position of an object within the traveling road area detected by the traveling road area detecting means B, based on the image signal from the imaging means A; a distance measuring means D for measuring, independently of the image signal, the distance between the host vehicle and an object in the traveling road area; an obstacle determining means E for determining that an object existing in the traveling road area is an obstacle when there is both an object position detection output from the object position detecting means C and a distance measurement output from the distance measuring means D; a vehicle speed detecting means F for detecting the vehicle speed of the host vehicle; and an approach determining means G for determining, when the obstacle determining means E has determined that an obstacle exists, the degree of approach to the obstacle by comparing the distance to the obstacle with a predetermined inter-vehicle distance calculated from the relative speed obtained from the measured value of the distance measuring means D and the vehicle speed from the vehicle speed detecting means F.
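For readers tracing the claim language, the following Python sketch wires means A to G into a single decision path. It is only an illustration of the claimed structure; the function and parameter names (lane_detector, object_locator, safe_gap and so on) are inventions of this summary, not terms from the patent.

```python
def detect_obstacle(image_frame, radar_returns, vehicle_speed,
                    lane_detector, object_locator, safe_gap):
    """Illustrative wiring of means A-G from claim 1.

    image_frame    output of the image pickup means A
    radar_returns  list of (distance, azimuth) pairs from distance measuring means D
    vehicle_speed  host vehicle speed from vehicle speed detecting means F
    lane_detector / object_locator / safe_gap are assumed helpers standing in
    for means B, C and the predetermined-distance calculation used by means G.
    """
    lane_area = lane_detector(image_frame)                    # means B
    image_positions = object_locator(image_frame, lane_area)  # means C
    radar_distances = [d for d, _az in radar_returns]         # means D (objects in the traveling road area)
    # Means E: an obstacle is declared only when BOTH C and D produce an output.
    if not image_positions or not radar_distances:
        return None
    gap = min(image_positions)                                # nearest confirmed object
    # Means G: compare the measured gap with the predetermined inter-vehicle distance.
    too_close = gap < safe_gap(radar_distances, vehicle_speed)
    return gap, too_close
```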

In the invention according to claim 2, the obstacle determining means E is configured so that, when a plurality of object position detection outputs and distance measurement outputs are obtained from the object position detecting means C and the distance measuring means D respectively, the output values are compared in order from the one closest to the own vehicle, and when a pair of output values match, the object corresponding to that value is determined to be an obstacle. In the invention according to claim 3, when the distance to the detected object calculated from the object position detection output of the object position detecting means C does not match the distance measurement output obtained from the distance measuring means D, the obstacle determining means E determines that the object corresponding to the output value farther from the own vehicle is the obstacle.

Further, in the invention according to claim 4, when there is an object that the obstacle determining means E has judged to be an obstacle, the object position detecting means C and the distance measuring means D are controlled so as to detect only the vicinity of that object.

[0008]

According to the configuration of the present invention, the image pickup means A images the road condition in the traveling direction of the own vehicle, and the traveling road area detecting means B detects the traveling road area of the own vehicle from the image signal. The object position detecting means C detects the position of an object (its distance from the own vehicle) within that traveling road area, again from the image signal. At the same time, independently of the object position detecting means C, the distance measuring means D detects the distance and azimuth to an object in the traveling road area of the own vehicle. When both the object position detecting means C and the distance measuring means D detect the object, the obstacle determining means E determines that the object is an obstacle that interferes with the traveling of the own vehicle. When an obstacle has been recognized in this way, the approach determining means G determines the degree of approach to the obstacle by comparing a predetermined inter-vehicle distance, calculated from the relative speed obtained from the detection data of the distance measuring means D and the own vehicle speed, with the distance to the actually detected object.

Thus, because the object position is detected not only by the distance measuring means D but also from the image captured by the image pickup means A, a road sign or the like is not detected as an object in the traveling road area on the image processing side and is therefore not determined to be an obstacle. Concretely, as in the invention of claim 2, the obstacle determining means E compares the data from the object position detecting means C and the distance measuring means D in order from the data closest to the own vehicle, and when the two detection results match, the object corresponding to that data is determined to be an obstacle.

In this case, the object position detecting means C and the distance measuring means D may each be tuned sensitively enough that they can also pick up objects other than an obstacle. It can then be determined that no obstacle exists when detection data are obtained from only one of them, and that an obstacle exists when the detection data of both match, which improves the reliability of these judgments. If detection data are present from both means C and D but do not match, the data farther from the own vehicle can be regarded as the more reliable.
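As a compact illustration of these decision rules, here is a small Python sketch; the matching tolerance `tol` is an assumption introduced for the example (the text itself speaks only of the values matching).

```python
def judge_obstacle(image_distances, radar_distances, tol=1.0):
    """Decision rules described above.

    - data from only one of means C / D           -> treat as no obstacle
    - data from both and the nearest values match -> obstacle at that distance
    - data from both but disagreeing              -> the value farther from the
                                                     own vehicle is the reliable one
    """
    if not image_distances or not radar_distances:
        return None                                   # only one side detected anything
    li, ll = min(image_distances), min(radar_distances)  # candidates nearest the vehicle
    if abs(li - ll) <= tol:
        return li                                     # matching data: obstacle confirmed
    return max(li, ll)                                # mismatch: trust the farther reading
```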

Therefore, as in the invention of claim 3, when the detection data of the object position detecting means C and the distance measuring means D do not match, the object corresponding to the data farther from the own vehicle can be determined to be the obstacle, which increases the processing speed up to the determination of the presence or absence of an obstacle. Also, according to the invention of claim 4, once it has been determined that an obstacle exists, the detected obstacle can be followed.

[0012]

Embodiments of the present invention will be described below with reference to the drawings. FIG. 2 is a system configuration diagram of the first embodiment of the vehicle obstacle detection device according to the present invention. In FIG. 2, the apparatus of this embodiment comprises: a CCD camera 11 as the image pickup means, which photographs the scene in the traveling direction of the vehicle, for example the scene ahead of the vehicle; an image processing device 12 which, based on the image signal obtained from the CCD camera 11, serves as the traveling road area detecting means and the object position detecting means, detecting the own lane area as the traveling road area and detecting the position of an object in the detected own lane area, as described later; a laser radar 13 as the distance measuring means, which measures the distance, azimuth and relative speed between the vehicle and an object existing within the imaging range of the CCD camera 11; a vehicle speed sensor 14 as the vehicle speed detecting means, which detects the speed of the vehicle; a microcomputer 15 which controls the image processing device 12 and the laser radar 13 and which, based on the detection outputs from the image processing device 12, the laser radar 13 and the vehicle speed sensor 14, has the functions of the obstacle judging means and the approach judging means that execute the processing for judging the presence or absence of an obstacle and the degree of approach; and an alarm device 16 which issues an alarm to the driver of the own vehicle based on the output from the microcomputer 15.

The CCD camera 11 is fixed, for example, at a position near the rearview mirror so that it can photograph the scene ahead of the vehicle 20; it has an imaging range X as shown in FIG. 3 and produces an image such as that shown in FIG. 5. In the figure, 21 is a preceding vehicle and 23 is a lane marker drawn on the road surface 22. The laser radar 13 is, for example, a multi-beam type that emits laser beams in three directions and is installed near the front grille; its three beams give it a range-finding range Y as shown in FIG. 4.

Next, the method of detecting the own lane area and the method of detecting the object position in the image processing device 12 will be described. First, to detect the own lane area, the part of the screen below the horizon line of the image obtained as in FIG. 5 is divided into four horizontal areas a to d, as shown in FIG. 6. Because the position of the lane marker in the lowermost area a changes little, straight-line detection areas 31 and 32 are set at the positions where the lane markers on both sides can be expected to lie, and the lane markers within them are detected by straight-line approximation. In the horizontal area b, one above area a, the upper end point of the lane marker approximated in area a is taken as the lower end point, and a straight line whose inclination does not differ greatly from that of the lane marker detected in area a is searched for within area b; this straight line is taken as the lane marker in area b. Similarly, for the horizontal areas c and d, the upper end point of the lane marker approximated in the area immediately below (b and c, respectively) is taken as the lower end point, and a straight line whose inclination does not differ greatly from that of the lane marker in the area below is detected and taken as the lane marker in areas c and d. In this way, the lane markers on both sides of the own vehicle are recognized as the polygonal-line approximations shown in FIG. 7, and the area sandwiched between the two polygonal lines 41 and 42 is recognized as the own lane area S in which the own vehicle travels.
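A minimal sketch of this band-by-band polyline construction follows, assuming that a straight segment has already been fitted to each lane marker in every horizontal area (for instance by the straight-line detection in areas 31 and 32); the data layout is an assumption made for the example.

```python
def build_lane_polylines(band_segments):
    """Chain per-band line fits into the polygonal lines 41 and 42.

    band_segments: list over horizontal areas a-d (bottom to top); each item is
    (left_segment, right_segment), where a segment is ((x_low, y_low), (x_up, y_up)).
    The upper end point of the segment in one area serves as the lower end point
    of the segment searched for in the area above, as described in the text.
    """
    left_line, right_line = [], []
    for left_seg, right_seg in band_segments:
        if not left_line:                     # lowermost area a: keep its lower end points
            left_line.append(left_seg[0])
            right_line.append(right_seg[0])
        left_line.append(left_seg[1])         # every area contributes its upper end point
        right_line.append(right_seg[1])
    return left_line, right_line              # own lane area S lies between the two polylines
```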

Next, a method of detecting the object position will be described. To calculate the object position, that is, the distance between the own vehicle and an object ahead, a process of extracting the lateral (horizontal) edge components of the image is executed on the image data, as shown for example in FIG. 8, and features that appear on the screen as horizontal edge components 51, such as the shadow cast under the vehicle body and the bumper, are extracted. Next, the lateral edge components 51 existing in the own lane area S are searched for in order from the bottom of the screen; a lateral edge component 51 is taken to be a feature of the preceding vehicle (obstacle), and its y coordinate (the upward direction in the figure) is obtained.

If the CCD camera 11 that captured the image is installed with the optical axis of its lens horizontal to the ground, then, taking the height of the CCD camera 11 as H and the focal length of the lens as F, the inter-vehicle distance L to the preceding vehicle imaged at position y0 on the y coordinate of the screen is calculated as L = F·(H/y0) … (1).
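Equation (1) is simple enough to check numerically; the lens and mounting values below are purely illustrative and are not taken from the patent.

```python
def inter_vehicle_distance(focal_length_m, camera_height_m, y0_m):
    """Equation (1): L = F * (H / y0), valid when the optical axis is level.

    y0_m is the image-plane y coordinate of the preceding vehicle's feature,
    expressed in the same units as the focal length.
    """
    return focal_length_m * camera_height_m / y0_m

# Example with made-up values: an 8 mm lens mounted 1.2 m above the road,
# feature imaged 0.32 mm below the optical axis  ->  L = 30 m.
print(inter_vehicle_distance(0.008, 1.2, 0.00032))
```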

The distance to the feature of the preceding vehicle (lateral edge component) detected at the position closest to the own vehicle by the above method is calculated as the first candidate, and thereafter distances are calculated as the second candidate, the third candidate, and so on. When extracting the lateral edge components, specifying the length (in the horizontal direction of the figure) of the components to be extracted prevents an object far shorter than the vehicle width, such as a cat's-eye stud installed on the road surface, from being recognized as an obstacle.

The distance measurement data from the laser radar 13 are handled in the same way as the distance data from the image signal: the obtained data are ordered from the shortest distance (the data closest to the own vehicle) as the first candidate, second candidate, third candidate, and so on. The object detection sensitivity of the image processing device 12 and the laser radar 13 is set on the side of over-detection, by loosening the extraction condition for the lateral edge components in the image processing device 12 and by raising the output of the laser radar 13 or lowering the threshold for its reflected wave, so that whenever a preceding vehicle exists it is detected, even at the cost of also detecting objects other than the preceding vehicle. The erroneous detections that can then occur fall into two cases: an object other than a preceding vehicle is detected and a distance is calculated although no preceding vehicle exists, and an object closer than the preceding vehicle is detected and its distance is calculated although a preceding vehicle exists. With the object detection sensitivity of the image processing device 12 and the laser radar 13 set in this way, when the distance data from the image processing device 12 and the distance data from the laser radar 13 are compared, it can be determined that no preceding vehicle exists if distance data are obtained from only one of the two; that the longer distance (the one farther from the own vehicle) is the more reliable if the two sets of distance data do not match; and that a preceding vehicle is definitely present if the two sets of distance data match.

Next, the operation of detecting the preceding vehicle (obstacle) in the first embodiment will be described with reference to the flowcharts of FIGS. 9 and 10. The system is started by turning on its switch, installed near the driver's seat, and the processing described below is executed by the microcomputer 15. First, in step 1 (indicated by S1 in the figures; the same applies hereinafter), when the own lane area S in which the vehicle is traveling and an object have been detected by the image processing device 12 from the image signal of the CCD camera 11 as described above, the inter-vehicle distance Li to the object is input. The inter-vehicle distance data Li are input, in order from the detection closest to the own vehicle, as the first candidate Li1, the second candidate Li2, the third candidate Li3, and so on. Likewise, from the laser radar 13, the inter-vehicle distance and detected azimuth are input, in order from the detection closest to the own vehicle and in the same number as detected, as the first candidate (Ll1, d1), the second candidate (Ll2, d2), the third candidate (Ll3, d3), and so on.

In step 2, from the detection data (Ll1, d1), (Ll2, d2), (Ll3, d3), ... of the laser radar 13, those that lie within the own lane area S detected by the image processing device 12 are selected on the basis of their azimuth data, and their inter-vehicle distance data are renumbered, in order from the one closest to the own vehicle, as the first candidate Ll1, the second candidate Ll2, the third candidate Ll3, and so on.

In step 3, writing the detection data Ll1, Ll2, Ll3, ... of the laser radar 13 selected in step 2 as Llm, and the inter-vehicle distance data Li1, Li2, Li3, ... obtained from the image processing device 12 as Lin, the candidates are compared in order starting from the first candidate closest to the own vehicle, searching for data for which the two match (Lin = Llm). In step 4, it is judged whether or not data satisfying Lin = Llm exist. If they do not exist, the process returns to step 1; if they exist, the process proceeds to step 5.
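Steps 2 to 4 can be summarised in a few lines of Python; `in_lane` is an assumed helper standing in for the azimuth test against lane area S, and the exact-equality test of the flowchart is kept (a real implementation would presumably allow a small tolerance).

```python
def find_matching_gap(image_candidates, radar_candidates, in_lane):
    """Steps 2-4 of FIG. 9 (sketch).

    image_candidates: distances Li1, Li2, ... from the image processing device 12,
                      nearest first
    radar_candidates: (distance, azimuth) pairs from the laser radar 13, nearest first
    in_lane(d, az):   assumed predicate; True if the return lies in own lane area S
    """
    # Step 2: keep only laser-radar returns whose azimuth puts them in lane area S,
    # renumbered nearest-first as Ll1, Ll2, Ll3, ...
    llm = [d for d, az in radar_candidates if in_lane(d, az)]
    # Steps 3-4: compare candidates from the one closest to the own vehicle and
    # look for a pair with Lin = Llm.
    for lin in image_candidates:
        if lin in llm:
            return lin          # step 5 uses this value as the inter-vehicle distance
    return None                 # no matching data: return to step 1
```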

In step 5, the data value for which Lin = Llm is taken as the inter-vehicle distance to the preceding vehicle. In step 6, it is determined on the basis of the inter-vehicle distance obtained in step 5 whether or not the possibility of contact is high. The degree of approach is determined using equation (2) below.

[0023]

L = Δα·T²/2 + ΔV²/(2·Δα) … (2)

where L: safe inter-vehicle distance, T: free running time, ΔV: relative speed, Δα: relative acceleration (rate of change of the relative speed). The relative speed ΔV is calculated from the range data obtained from the laser radar 13, and the free running time T is set on the basis of the vehicle speed measured by the vehicle speed sensor 14.
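Read with ΔV²/(2·Δα) as the braking term (the dimensionally consistent reading of the original), equation (2) becomes the short function below; the numerical values in the example are illustrative only.

```python
def safe_inter_vehicle_distance(rel_accel, free_running_time, rel_speed):
    """Equation (2): L = Δα·T²/2 + ΔV²/(2·Δα).

    rel_accel          Δα, relative acceleration (rate of change of ΔV), m/s²
    free_running_time  T, set from the vehicle speed given by sensor 14, s
    rel_speed          ΔV, relative speed from the laser-radar data, m/s
    """
    return rel_accel * free_running_time ** 2 / 2 + rel_speed ** 2 / (2 * rel_accel)

# Example with made-up values: Δα = 3 m/s², T = 1 s, ΔV = 12 m/s  ->  25.5 m
print(safe_inter_vehicle_distance(3.0, 1.0, 12.0))
```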

The safe inter-vehicle distance L, the predetermined inter-vehicle distance calculated by equation (2), is compared with the actual inter-vehicle distance obtained in step 5. If the measured inter-vehicle distance is shorter than the safe inter-vehicle distance L, it is judged that the possibility of contact is high, and in step 7 the alarm device 16 is driven to warn the driver. If the measured inter-vehicle distance is longer than the safe inter-vehicle distance L, it is judged that the possibility of contact is low, and the process proceeds to step 8.

In step 8, the detection range of the inter-vehicle distance in the image processing device 12 is limited to the vicinity of the inter-vehicle distance Lin, and the detection range of the inter-vehicle distance in the laser radar 13 is limited to the vicinity of the inter-vehicle distance Llm, and the newly detected data are denoted Linew and Llnew respectively. In step 9, the values of Linew and Llnew are compared. If Linew = Llnew, the process proceeds to step 10; if Linew ≠ Llnew, the process returns to step 1.

In step 10, the value Linew = Llnew is taken as the inter-vehicle distance to the preceding vehicle. In step 11, the same judgment as in step 6 is performed. If the possibility of contact is high, the process proceeds to step 12 and the alarm device 16 is driven. If the possibility of contact is low, the process proceeds to step 13, where Linew is substituted for Lin and Llnew for Llm, and the process returns to step 8 to repeat the operations from step 8 onward.
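The tracking loop of steps 8 to 13 might look like the sketch below. The measurement callables, the search `window`, and the behaviour after the alarm has been driven (the flowchart does not spell it out; the sketch simply keeps tracking) are all assumptions of the example.

```python
def track_preceding_vehicle(measure_image_gap, measure_radar_gap,
                            is_contact_likely, drive_alarm, gap, window=5.0):
    """Steps 8-13 of FIG. 10 (sketch).

    measure_image_gap / measure_radar_gap: assumed callables that re-measure the
    inter-vehicle distance with detection limited to gap +/- window (step 8).
    """
    while True:
        li_new = measure_image_gap(near=gap, window=window)   # Linew
        ll_new = measure_radar_gap(near=gap, window=window)   # Llnew
        if li_new is None or ll_new is None or li_new != ll_new:
            return None                      # step 9: values no longer match, back to step 1
        gap = li_new                         # step 10: confirmed inter-vehicle distance
        if is_contact_likely(gap):           # step 11: compare with the safe distance
            drive_alarm()                    # step 12: warn the driver via alarm device 16
        # step 13: carry Linew/Llnew forward as Lin/Llm and repeat from step 8
```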

As described above, when the object position is detected by both the image processing device 12 and the laser radar 13, a road sign or the like is not detected as an object in the lane area on the image processing device side, so even if it is detected on the laser radar side it is not judged to be an obstacle. Therefore, unlike the conventional device that detects positions with the laser radar alone, a road sign is not recognized as an obstacle, and the reliability of obstacle detection can be improved.

Further, by taking the mutually matching value among the plural candidates as the inter-vehicle distance, the distance to the preceding vehicle can be measured reliably. In addition, by performing the processing from step 8 onward, a preceding vehicle that has once been detected can be followed, and the processing speed after the preceding vehicle has been detected can be increased. Next, a second embodiment of the present invention will be described. The system configuration of the second embodiment is the same as that shown in FIG. 2; only the software processing for detecting the preceding vehicle in the microcomputer 15 differs. Therefore, only that software processing will be described below.

FIG. 11 shows a flowchart of the preceding-vehicle detection processing of the second embodiment, which is described below. In step 21, when a preceding vehicle has been detected from the own lane area S and the image supplied by the image processing device 12, the distance data closest to the own vehicle is input as the inter-vehicle distance Li; when a preceding vehicle has been detected by the laser radar 13, the inter-vehicle distance Ll, the relative speed ΔV and the detected azimuth d are input.

In step 22, it is determined whether both the image processing device 12 and the laser radar 13 have detected a preceding vehicle. If only one of them has detected a vehicle, it is determined that there is no preceding vehicle and the process returns to step 21. If both have detected one, the process proceeds to step 23. In step 23, it is determined from the azimuth data whether the vehicle detected by the laser radar 13 lies within the own lane area S. If it does not, it is determined that the possibility of contact is low and the process returns to step 21. If it lies within the own lane area S, the data closest to the own vehicle is taken as the inter-vehicle distance data Ll of the laser radar 13, and the process proceeds to step 24.

In step 24, the inter-vehicle distance data Li obtained from the image processing device 12 and the inter-vehicle distance data Ll obtained from the laser radar 13 are compared. If Li = Ll, the process proceeds to step 26 and Li = Ll is taken as the inter-vehicle distance. If Li < Ll, the process proceeds to step 27 and the distance data Ll on the laser radar 13 side is taken as the inter-vehicle distance; if Li > Ll, the process proceeds to step 28 and the distance data Li on the image processing device 12 side is taken as the inter-vehicle distance. That is, when the two sets of distance data do not match, the distance data farther from the own vehicle is taken as the estimated inter-vehicle distance.
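The rule of steps 24 to 28 reduces to a couple of lines, shown here only to make the "take the farther value" estimate explicit.

```python
def estimate_inter_vehicle_distance(li, ll):
    """Steps 24-28 of FIG. 11: Li from the image processing device 12,
    Ll from the laser radar 13 (each the candidate nearest the own vehicle)."""
    if li == ll:
        return li          # step 26: the two readings agree
    return max(li, ll)     # steps 27/28: when they disagree, the farther value is used
```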

In step 29, the safe inter-vehicle distance is calculated from equation (2), as in the first embodiment. In step 30, the inter-vehicle distance determined in one of steps 26 to 28 is compared with the safe inter-vehicle distance to judge whether the possibility of contact is high or low. If the actual inter-vehicle distance is shorter than the safe inter-vehicle distance, it is judged that the possibility of contact is high, the process proceeds to step 31, and the alarm device 16 is driven to warn the driver. If the possibility of contact is low, the process returns to step 21.

As described above, when the closest distance data from the image processing device 12 and from the laser radar 13 do not match, the data farther from the own vehicle is regarded as the inter-vehicle distance. The detection accuracy of the inter-vehicle distance is then inferior to that of the first embodiment, but the obstacle detection processing speed can be increased.

[0034]

As described above, according to the invention of claim 1, not only is the traveling road area of the own vehicle recognized from the image signal, but the position of an object existing in that area is also detected from the image signal; in parallel, a distance measuring means that does not rely on image processing measures the distance to an object in the traveling road area, and only when both detect the object is it recognized as an obstacle. This prevents an object that cannot become an obstacle, such as a road sign, from being detected as an obstacle.

According to the invention of claim 2, the accuracy of detecting the distance to the obstacle ahead can be improved. According to the invention of claim 3, the distance detection accuracy is lower than in the invention of claim 2, but the processing speed of obstacle detection can be increased. According to the invention of claim 4, an obstacle that has once been detected can be followed, and the processing speed after the obstacle has been detected can be increased.

[Brief description of drawings]

FIG. 1 is a block diagram illustrating a configuration of the present invention.

FIG. 2 is a system configuration diagram showing a first embodiment of the present invention.

FIG. 3 is a bird's-eye view showing the imaging range of the CCD camera.

FIG. 4 is a bird's-eye view showing the distance measurement range of the laser radar.

FIG. 5 is a diagram showing an example of a forward landscape screen imaged by a CCD camera.

FIG. 6 is an explanatory diagram of own lane area detection processing in the image processing apparatus.

FIG. 7 is an explanatory diagram of own lane area detection processing in the image processing apparatus.

FIG. 8 is an explanatory diagram of detection of a front vehicle by the image processing device.

FIG. 9 is a flowchart of an obstacle detection operation of the above embodiment.

FIG. 10 is a flowchart following FIG. 9.

FIG. 11 is a flowchart of an obstacle detection operation according to the second embodiment of the present invention.

FIG. 12 is an explanatory diagram of problems of the conventional device.

FIG. 13 is an explanatory diagram of problems of the conventional device.

[Explanation of symbols]

11 CCD camera, 12 Image processing device, 13 Laser radar, 14 Vehicle speed sensor, 15 Microcomputer


Claims (4)

[Claims]
1. A vehicle obstacle detection device comprising: an image pickup means for picking up an image of the road condition in the traveling direction of the vehicle; a traveling road area detecting means for detecting the traveling road area on which the vehicle travels, based on an image signal from the image pickup means; an object position detecting means for detecting, based on the image signal from the image pickup means, the position of an object within the traveling road area detected by the traveling road area detecting means; a distance measuring means for measuring, independently of the image signal, the distance between the vehicle and an object in the traveling road area; an obstacle determining means for determining that an object existing in the traveling road area is an obstacle when there is both an object position detection output from the object position detecting means and a distance measurement output from the distance measuring means; a vehicle speed detecting means for detecting the vehicle speed of the vehicle; and an approach determining means for determining, when the obstacle determining means has determined that an obstacle exists, the degree of approach to the obstacle by comparing the distance to the obstacle with a predetermined inter-vehicle distance calculated on the basis of the relative speed calculated from the measured value of the distance measuring means and the vehicle speed from the vehicle speed detecting means.
2. The vehicle obstacle detection device according to claim 1, wherein the obstacle determining means is configured so that, when a plurality of object position detection outputs and distance measurement outputs are obtained from the object position detecting means and the distance measuring means respectively, the output values are compared with each other in order from the one closest to the own vehicle, and when a pair of output values match, the object corresponding to that output value is determined to be an obstacle.
3. The vehicle obstacle detection device according to claim 1, wherein, when the distance to the detected object calculated from the object position detection output obtained from the object position detecting means does not match the distance measurement output obtained from the distance measuring means, the obstacle determining means determines that the object corresponding to the output value farther from the vehicle is an obstacle.
4. The vehicle obstacle detection device according to claim 1, wherein, when there is an object that the obstacle determining means has judged to be an obstacle, the object position detecting means and the distance measuring means are controlled so as to detect only the vicinity of that object.
JP6302292A 1994-12-06 1994-12-06 Vehicle obstruction detecting device Pending JPH08156723A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP6302292A JPH08156723A (en) 1994-12-06 1994-12-06 Vehicle obstruction detecting device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP6302292A JPH08156723A (en) 1994-12-06 1994-12-06 Vehicle obstruction detecting device

Publications (1)

Publication Number Publication Date
JPH08156723A true JPH08156723A (en) 1996-06-18

Family

ID=17907240

Family Applications (1)

Application Number Title Priority Date Filing Date
JP6302292A Pending JPH08156723A (en) 1994-12-06 1994-12-06 Vehicle obstruction detecting device

Country Status (1)

Country Link
JP (1) JPH08156723A (en)


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6990216B2 (en) 2000-09-22 2006-01-24 Nissan Motor Co., Ltd. Method and apparatus for estimating inter-vehicle distance using radar and camera
JP2002163784A (en) * 2000-11-29 2002-06-07 Fujitsu Ten Ltd Deciding device for advancable range
EP1338477A2 (en) 2002-02-26 2003-08-27 Toyota Jidosha Kabushiki Kaisha Obstacle detection device for vehicle and method thereof
US6888447B2 (en) 2002-02-26 2005-05-03 Toyota Jidosha Kabushiki Kaisha Obstacle detection device for vehicle and method thereof
EP1338477A3 (en) * 2002-02-26 2006-03-08 Toyota Jidosha Kabushiki Kaisha Obstacle detection device for vehicle and method thereof
JP2006517659A (en) * 2003-02-13 2006-07-27 IEE International Electronics & Engineering S.A. Automotive device used for 3D detection of inside / outside scene of automobile
EP1703299A3 (en) * 2005-03-15 2007-08-22 Omron Corporation Object detector for a vehicle
EP1703299A2 (en) 2005-03-15 2006-09-20 Omron Corporation Object detector for a vehicle
JP2007240277A (en) * 2006-03-07 2007-09-20 Olympus Corp Distance measuring device/imaging device, distance measuring method/imaging method, distance measuring program/imaging program, and storage medium
JP2008040819A (en) * 2006-08-07 2008-02-21 Toyota Motor Corp Obstacle recognition device
JP2010501952A (en) * 2007-04-19 2010-01-21 Robert Bosch GmbH Driver assistance system and method for determining the validity of an object
JP2009230229A (en) * 2008-03-19 2009-10-08 Toyota Motor Corp Object detection device
WO2018168182A1 (en) * 2017-03-17 2018-09-20 NEC Corporation Mobile body detection device, mobile body detection method, and mobile body detection program

Similar Documents

Publication Publication Date Title
US20190073783A1 (en) Estimating distance to an object using a sequence of images recorded by a monocular camera
EP2803944B1 (en) Image Processing Apparatus, Distance Measurement Apparatus, Vehicle-Device Control System, Vehicle, and Image Processing Program
US9401028B2 (en) Method and system for video-based road characterization, lane detection and departure prevention
US9443154B2 (en) Fusion of far infrared and visible images in enhanced obstacle detection in automotive applications
EP2463843B1 (en) Method and system for forward collision warning
US8995723B2 (en) Detecting and recognizing traffic signs
JP5276637B2 (en) Lane estimation device
Srinivasa Vision-based vehicle detection and tracking method for forward collision warning in automobiles
EP1192597B1 (en) Method of detecting objects within a wide range of a road vehicle
US7266454B2 (en) Obstacle detection apparatus and method for automotive vehicle
EP0747870B1 (en) An object observing method and device with two or more cameras
JP3925488B2 (en) Image processing apparatus for vehicle
US7091837B2 (en) Obstacle detecting apparatus and method
US6493458B2 (en) Local positioning apparatus, and method therefor
US6888953B2 (en) Vehicle surroundings monitoring apparatus
US6122597A (en) Vehicle monitoring apparatus
US8175331B2 (en) Vehicle surroundings monitoring apparatus, method, and program
US5987152A (en) Method for measuring visibility from a moving vehicle
Ferryman et al. Visual surveillance for moving vehicles
JP3436074B2 (en) Car stereo camera
JP3503230B2 (en) Nighttime vehicle recognition device
JP3463858B2 (en) Perimeter monitoring device and method
US8040227B2 (en) Method for detecting moving objects in a blind spot region of a vehicle and blind spot detection device
JP3739693B2 (en) Image recognition device
US7545956B2 (en) Single camera system and method for range and lateral position measurement of a preceding vehicle