JP2009257981A - Device for generating distance image data for vehicle


Info

Publication number
JP2009257981A
Authority
JP
Japan
Prior art keywords
imaging
distance
image data
vehicle
distance image
Prior art date
Legal status
Pending
Application number
JP2008108556A
Other languages
Japanese (ja)
Inventor
Yoshihiro Edamoto
吉広 枝本
Original Assignee
Calsonic Kansei Corp
カルソニックカンセイ株式会社
Priority date
Filing date
Publication date
Application filed by Calsonic Kansei Corp
Priority to JP2008108556A
Publication of JP2009257981A
Legal status: Pending


Abstract

PROBLEM TO BE SOLVED: To provide a device for generating distance image data for a vehicle that enables the situation ahead of the host vehicle to be grasped continuously.

SOLUTION: The device is equipped with a projector 5 that projects pulsed light toward the area ahead of the host vehicle at a predetermined cycle, a high-speed camera 8 that images reflected light returning from an imaging area at an imaging timing set according to that imaging area, a timing controller 9 that controls the imaging timing so that the imaging area changes continuously, and an image processing unit 10 that generates distance image data representing the distance to an object for each pixel, based on the luminance of the same pixel in a plurality of captured images with different imaging areas obtained by the high-speed camera 8. The timing controller 9 varies the length of the imaging time for each target distance according to the traveling environment.

Description

  The present invention belongs to the technical field of vehicle distance image data generation devices.

Patent Literature 1 discloses a technique for detecting whether an object such as an obstacle exists at a target distance, based on an image captured in accordance with the timing at which pulsed light projected ahead of the host vehicle returns as reflected light from the target distance.
US Pat. No. 6,732,123

  However, the above prior art cannot detect objects at distances other than the target distance. That is, there is a problem that the situation ahead of the host vehicle is grasped only intermittently and cannot be grasped continuously.

  An object of the present invention is to provide a vehicular distance image data generation apparatus that can continuously grasp the situation ahead of the host vehicle.

In order to achieve the above object, in the vehicle distance image data generation device of the present invention,
Light projecting means for projecting pulsed light in front of the host vehicle at a predetermined period;
Imaging means for imaging reflected light returning from the target distance at an imaging timing set according to the target distance;
Timing control means for controlling the imaging timing so that the target distance changes continuously;
Distance image data generation means for generating distance image data representing a distance to an object for each pixel based on the luminance of the same pixel in a plurality of captured images with different target distances obtained by the imaging means;
With
The timing control means varies the length of the imaging time for each target distance according to the traveling environment.

  Here, the "traveling environment" refers to, for example, the speed of the host vehicle, the presence/absence and number of objects, the shape of the traveling road (straight road, curved road), the weather (clear weather, bad weather), the traveling time (day, night), and the like.

Therefore, in the present invention, since distance image data representing the distance to an object for each pixel is generated based on the luminance of the same pixel in a plurality of captured images with different target distances, the situation ahead of the host vehicle can be grasped continuously.
In addition, since the imaging time for each target distance is varied according to the traveling environment, the distance resolution of a specific target distance can be increased without increasing the number of captured images, reflecting the fact that the resolution required at each target distance depends on the traveling environment.

  DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS The best mode for realizing a vehicular distance image data generating apparatus according to the present invention will be described below with reference to the drawings.

First, the configuration will be described.
FIG. 1 is a block diagram showing the configuration of an obstacle detection apparatus 1 according to a first embodiment to which the vehicle distance image data generation apparatus of the present invention is applied. The obstacle detection apparatus 1 of the first embodiment includes a distance image data generation device 2, an object recognition processing unit 3, and a determination unit 4.

The distance image data generation device 2 includes a projector (light projection means) 5, an objective lens 6, a light multiplication unit 7, a high-speed camera (imaging means) 8, a timing controller (timing control means) 9, and an image processing unit (distance image data generation means) 10.
The projector 5 is a near-infrared LED disposed at the front end of the vehicle, and outputs pulsed light for a predetermined light projecting time tL (for example, 5 ns) according to the pulse signal output from the timing controller 9. The cycle of the pulse signal is the projection cycle tP of the projector 5, and the projection cycle tP is, for example, an interval of 1/100 s or less.
The objective lens 6 is for receiving reflected light from an object, and is disposed adjacent to the projector 5. For example, the optical system is set to have an angle of view capable of capturing a predetermined range in front of the host vehicle.

The light multiplication unit 7 includes a gate 7a and an image intensifier 7b.
The gate 7a opens and closes in response to an opening/closing command signal from the timing controller 9. In Example 1, the opening time (gate time) tG of the gate 7a is set to 5 ns, the same as the light projection time tL. The gate time tG corresponds to the imaging target width of the imaging area (target distance): the longer the gate time tG, the larger the imaging target width. In the first embodiment, since the gate time tG = 5 ns, the imaging target width is speed of light (about 3 × 10^8 m/s) × gate time (5 ns) = 1.5 m.
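To make the relation concrete, here is a minimal Python sketch of the width computation described above (illustrative only; the constant and function name are not from the patent):

```python
C = 3.0e8  # approximate speed of light, m/s (value used in the text)

def imaging_target_width(gate_time_s: float) -> float:
    """Imaging target width per the relation above: width = c * gate time."""
    return C * gate_time_s

# Example 1's gate time tG = 5 ns yields the 1.5 m width cited in the text:
assert abs(imaging_target_width(5e-9) - 1.5) < 1e-9
```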

The image intensifier 7b is a device that temporarily converts extremely weak light (such as reflected light from an object) into electrons, electrically amplifies them, and converts them back into a fluorescent image, thereby multiplying the amount of light so that an image with contrast can be viewed. Photoelectrons emitted from the photocathode of the image intensifier 7b by the photoelectric effect are accelerated by a voltage on the order of kV and strike the phosphor screen on the anode side, emitting fluorescence with 100 times or more the number of photons. The fluorescence generated on the phosphor screen is guided to the image sensor of the high-speed camera 8 by a fiber optic plate while maintaining the same positional relationship, without being scattered.
The high-speed camera 8 captures the image emitted from the light multiplication unit 7 in response to a command signal from the timing controller 9 and outputs the captured image (color image) to the image processing unit 10. In the first embodiment, a camera with a resolution of 640 × 480 (horizontal × vertical), luminance values of 1 to 255 (256 levels), and 100 fps or higher is used.

The timing controller 9 controls the imaging timing by setting a delay time tD, the time from the start of light projection by the projector 5 until the gate 7a is opened, so that the image captured by the high-speed camera 8 corresponds to the timing of the reflected light returning from the targeted imaging area, and by outputting the opening/closing command signal according to this delay time. In other words, the delay time tD determines the distance from the host vehicle to the imaging area (imaging target distance). The relationship between the delay time tD and the imaging target distance is:
imaging target distance = speed of light (about 3 × 10^8 m/s) × delay time tD / 2
FIG. 2 shows a temporal relationship between the operation of the projector 5 (light projection operation) and the operation of the gate 7a (camera gate operation) when imaging one imaging area.

  The timing controller 9 increases the delay time tD by a predetermined interval (for example, 10 ns) so that the imaging area moves continuously from the near side of the vehicle toward the far side, thereby changing the imaging range of the high-speed camera 8. The timing controller 9 starts the imaging operation of the high-speed camera 8 immediately before the gate 7a opens and ends the imaging operation after the gate 7a has completely closed.
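The delay-time relation and the sweep can be sketched the same way; the following hedged example (function names are illustrative, not the patent's) derives tD from a desired imaging target distance and steps it at a fixed interval:

```python
C = 3.0e8  # approximate speed of light, m/s

def delay_for_distance(distance_m: float) -> float:
    """Delay time tD (s) for an imaging target distance, from distance = c * tD / 2."""
    return 2.0 * distance_m / C

def sweep_delays(first_delay_s: float, step_s: float, n_areas: int) -> list[float]:
    """Delay times for areas 1..n, increased by a fixed interval (e.g. 10 ns)."""
    return [first_delay_s + i * step_s for i in range(n_areas)]

# An imaging area 25 m ahead needs tD = 2 * 25 / c, roughly 166.7 ns;
# stepping by 10 ns then moves the area outward by 1.5 m per step.
delays = sweep_delays(delay_for_distance(25.0), 10e-9, 5)
```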

  Further, in the first embodiment, as illustrated in FIG. 3, when the imaging target distance is changed continuously from B1 → B2 → B3 → ..., the increase amount (B2 − B1) of the imaging target distance is set smaller than the imaging target width A of the imaging area, so that the imaging areas change while partially overlapping.

  FIG. 4 is a schematic diagram showing the temporal luminance change when the increase amount of the imaging target distance is reduced to its limit, in other words, when imaging is performed while the number of imaging areas is increased toward infinity. Because successive imaging areas overlap, the luminance value of the same pixel in a series of consecutive captured images gradually increases, reaches a peak, and then gradually decreases. In practice the number of imaging areas is finite (1 to n), but by partially overlapping consecutive imaging areas, the temporal luminance change approaches the characteristic of FIG. 4.

  Furthermore, the timing controller 9 performs imaging time setting control that sets the gate time tG of each imaging area so that the length of the imaging time differs between imaging areas according to the traveling environment. In the first embodiment, the vehicle speed and the presence/absence and number of objects are used as the traveling environment.

  The timing controller 9 outputs an image processing command signal to the image processing unit 10 when captured images for one frame, that is, for the entire set predetermined range (area 1, area 2, ..., area n), have been acquired.

  The image processing unit 10 generates distance image data representing the distance to an object for each pixel based on the luminance of the same pixel in the one frame's worth of captured images acquired by the high-speed camera 8, and outputs the generated distance image data to the object recognition processing unit 3.

The object recognition processing unit 3 identifies objects included in the distance image data; a well-known technique such as pattern matching can be used for object identification.
Based on the relationship (distance, relative speed, and the like) between an object (person, car, sign, etc.) identified by the object recognition processing unit 3 and the host vehicle, the determination unit 4 determines whether information presentation to the driver by an alarm or vehicle control such as automatic braking is necessary.

[Distance image data generation control processing]
FIG. 5 is a flowchart illustrating the flow of the distance image data generation control process executed by the image processing unit 10 according to the first embodiment. Each step will be described below. This process is repeatedly executed at a predetermined calculation cycle.

  In step S1, area 1, the imaging area closest to the host vehicle, is imaged without light projection by the projector 5, the luminance value data of an arbitrary pixel of the captured image is stored as minimum value data, and the process proceeds to step S2.

  In step S2, a captured image is input, and the process proceeds to step S3.

  In step S3, the minimum value data stored in step S1 is compared with the lowest luminance value data among the luminance value data of the pixels of the captured image input in step S2, and it is determined whether that luminance value data is lower than the minimum value data. If YES, the process proceeds to step S4; if NO, the process proceeds to step S5.

  In step S4, luminance value data lower than the minimum value data is stored as new minimum value data, and the process proceeds to step S5.

  In step S5, the difference between the luminance value data and the minimum value data is computed for each pixel and taken as the current luminance value data of that pixel, and the process proceeds to step S6.

  In step S6, it is determined whether image input for one frame (area 1, area 2,..., Area n) has been completed. If YES, the process proceeds to step S7. If NO, the process proceeds to step S2.

  In step S7, the luminance value data of the same pixel across the n captured images are compared, distance image data is generated from the frame number (delay time tD) of the captured image with the highest luminance value data, and the process returns.
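As a reading aid, the S1–S7 flow can be condensed into the following NumPy sketch; this is our own hedged reconstruction (array shapes and names are assumptions, not the patent's implementation):

```python
import numpy as np

def generate_distance_image(frames, delays_ns, ambient):
    """Hedged sketch of steps S1-S7.

    frames:    list of n captured images (2-D arrays), areas 1..n
    delays_ns: delay time tD of each area, in nanoseconds
    ambient:   image of area 1 captured with the projector off (step S1)
    """
    min_val = float(ambient.min())          # S1: store minimum value data
    corrected = []
    for frame in frames:                    # S2: input a captured image
        low = float(frame.min())
        if low < min_val:                   # S3/S4: keep the lower floor
            min_val = low
        corrected.append(frame.astype(float) - min_val)  # S5: per-pixel difference
    stack = np.stack(corrected)             # S6: one frame's worth (areas 1..n)
    peak_area = stack.argmax(axis=0)        # S7: area with the highest luminance
    c = 3.0e8                               # convert that area's tD to a distance
    return c * np.asarray(delays_ns)[peak_area] * 1e-9 / 2.0
```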

[Imaging time setting control processing]
FIG. 6 is a flowchart illustrating the flow of the imaging time setting control process executed by the timing controller 9 according to the first embodiment. Each step will be described below.

  In step S21, the current vehicle speed is read, and the process proceeds to step S22.

  In step S22, it is determined whether an object is detected based on the distance image data. If YES, the process proceeds to step S23, and if NO, the process proceeds to step S24.

In step S23, the imaging target width (imaging time) of each imaging area is set according to the position of the object, and the process returns.
Specifically, the gate time tG of each imaging area is set so that the imaging target width of the imaging area where the object is detected is 1 m, that of the imaging areas immediately before and after it is 2 m, and that of the other imaging areas is 5 m.

  Here, when a plurality of objects are detected, the gate time tG of each imaging area is set so that the imaging target width of the imaging area where the object closest to the host vehicle is detected is 1 m, that of the imaging areas immediately before and after it and of the imaging areas where the other objects are detected is 2 m, and that of the remaining imaging areas is 5 m.

In step S24, the imaging target width of each imaging area is set according to the vehicle speed, and the process returns.
Specifically, in the low vehicle speed range where the vehicle speed is below a low vehicle speed threshold (for example, 40 km/h), the gate time tG of each imaging area is set so that the imaging area closest to the host vehicle (area 1) has an imaging target width of 1 m, area 2 has 2 m, and the other imaging areas (area 3 to area n) have 5 m. When the vehicle speed exceeds 40 km/h, the imaging area given the 1 m imaging target width is moved farther out as the vehicle speed increases, and in the high vehicle speed range at or above a high vehicle speed threshold (for example, 40 km/h), the gate time tG is set so that far imaging areas have narrower imaging target widths than near imaging areas.
In the first embodiment, the delay time tD is also set so that the imaging target distance of area 1 becomes longer as the vehicle speed becomes higher.
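Collecting steps S21–S24 into one hedged Python sketch: the widths and the 40 km/h threshold come from the text, while the speed-to-area mapping in the high-speed branch is our own assumption, chosen so that 100 km/h reproduces FIG. 12 (area n at 2 m, the rest at 5 m).

```python
C = 3.0e8  # m/s

def width_to_gate_ns(width_m: float) -> float:
    """Gate time for a desired imaging target width (width = c * tG)."""
    return width_m / C * 1e9

def set_area_widths(n_areas: int, speed_kmh: float, detected: list[int]) -> list[float]:
    """Imaging target width (m) per area, per steps S21-S24. `detected` holds
    0-based indices of areas where objects were found (empty if none)."""
    widths = [5.0] * n_areas                      # default coarse areas
    if detected:                                  # S23: widths from object positions
        nearest = min(detected)
        for a in detected:
            widths[a] = 2.0                       # other detected objects: 2 m
        for adj in (nearest - 1, nearest + 1):    # areas before/after nearest: 2 m
            if 0 <= adj < n_areas:
                widths[adj] = 2.0
        widths[nearest] = 1.0                     # nearest object: 1 m
    elif speed_kmh < 40.0:                        # S24, low vehicle speed range
        widths[0] = 1.0                           # area 1, nearest the vehicle
        if n_areas > 1:
            widths[1] = 2.0
    else:                                         # S24, high speed: fine area moves out
        frac = min((speed_kmh - 40.0) / 60.0, 1.0)   # assumed: 40..100 km/h -> 0..1
        widths[round(frac * (n_areas - 1))] = 2.0    # at 100 km/h: area n gets 2 m
    return widths
```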

Next, the operation will be described.
[Distance image data generation function]
The timing controller 9 controls the imaging timing of the high-speed camera 8 by setting the delay time tD so that the image captured by the high-speed camera 8 corresponds to the timing of the reflected light returning from the targeted imaging area. When an object exists in the targeted imaging area, the time for the light emitted from the projector 5 to return from the imaging area is the time for light to make a round trip over the distance between the host vehicle and the imaging area (imaging target distance), so the delay time tD can be obtained from the imaging target distance and the speed of light.

  In a captured image of the high-speed camera 8 obtained by the above method, when an object exists in the imaging area, the luminance value data of the pixels corresponding to the position of the object is affected by the reflected light and shows a higher value than the luminance value data of the other pixels. Thus, based on the luminance value data of each pixel, the distance to an object existing in the targeted imaging area can be obtained.

  Furthermore, in the first embodiment, captured images of imaging areas 1 to n are acquired while the delay time tD is changed. The luminance value data of the same pixel in each captured image are then compared, and the imaging area with the highest luminance value data is taken as the distance of that pixel, generating distance data (distance image data) for all the pixels (640 × 480).

  Conventional distance detection using laser radar or stereo cameras is susceptible to rain, fog, snow, and the like; the noise level relative to the signal level is large (the S/N ratio is small), so reliability in bad weather is low. A millimeter-wave radar is less affected by bad weather and gives highly reliable distance detection, but object recognition (object identification) from the millimeter-wave radar signal is difficult, so a separate means such as a camera is required. Moreover, since camera images become unclear in bad weather, accurate object recognition is then difficult.

In contrast, in the first embodiment, only reflected light returning from the targeted imaging area appears in the captured image, so light bent by rain, fog, snow, and the like, that is, noise, is kept at a low mixing level, and a high signal-to-noise ratio is obtained. In other words, high distance detection accuracy is obtained regardless of bad weather or nighttime.
Since the distances of all pixels are known from the generated distance image data, the distance to an object can be grasped instantly when performing object recognition using a technique such as pattern matching.

  Furthermore, in the first embodiment, the imaging area is changed continuously to acquire a plurality of captured images, and the captured images are compared to detect the distance of each pixel, so the situation ahead can be grasped over a wide range rather than at a single distance. For example, even in a situation where a pedestrian has stepped in between the host vehicle and a preceding vehicle, the distances to the preceding vehicle and to the pedestrian can be grasped at the same time, and appropriate warning and vehicle control can be performed.

FIG. 7 shows a situation in which four pedestrians A to D exist at different positions ahead of the host vehicle, with the distances from the host vehicle satisfying A < B < C < D.
In Example 1, the imaging areas partially overlap, so the reflected light from one object appears in the pixels of the captured images of a plurality of consecutive imaging areas. As a result, the temporal luminance change of the pixel corresponding to each pedestrian shows a triangular characteristic that peaks at the position of the pedestrian, as shown in FIG. 8.

  Since the distance image data is used for warnings and vehicle control, a certain calculation speed is required and the imaging areas cannot be set infinitely finely. However, by making the reflected light from one object appear in a plurality of captured images, the temporal luminance change of the pixel approximates the above characteristic, as shown in FIG. 9, and detection accuracy can be increased by taking the imaging area corresponding to the peak of the triangular portion as the distance of the object in that pixel.
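A toy numerical illustration of this triangular characteristic (our own simplified model, not the patent's): treat a pixel's luminance as falling off linearly as the area centre moves away from the object, then pick the peak area as the pixel's distance.

```python
def triangular_response(object_m: float, centres_m: list[float], base_m: float) -> list[float]:
    """Toy model of FIG. 8: relative luminance per imaging area for one pixel,
    peaking where the area centre coincides with the object distance."""
    return [max(0.0, 1.0 - abs(object_m - c) / base_m) for c in centres_m]

# Areas centred every 2 m from 10 m to 28 m; a pedestrian at 17.3 m
centres = [10.0 + 2.0 * i for i in range(10)]
lum = triangular_response(17.3, centres, 6.0)
best = max(range(len(lum)), key=lum.__getitem__)
print(centres[best])  # 18.0 -> the peak area's distance is taken as the pixel's distance
```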

[Imaging time setting according to the driving environment]
In the first embodiment, a predetermined range in front of the host vehicle is divided into a plurality (n) of imaging areas, and the distance image data is generated with the captured image of each imaging area as one frame. Since the distance image data is used for obstacle warning and vehicle control, the length of one frame (frame rate) is preferably as short as possible.
On the other hand, the shorter the imaging target width (imaging time) of each imaging area, the closer the distance to the object for each pixel in the distance image data comes to the actual value. That is, the shorter the imaging target width, the higher the distance resolution.

  However, the number of imagings per frame is limited by the calculation processing capability of the image processing unit 10 and by the performance of the high-speed camera 8 and the gate 7a, so the frame length and the distance resolution trade off against each other. If priority is given to a short frame, the number of imagings must be reduced, lowering the distance resolution; if priority is given to a large number of imagings, one frame must be lengthened, degrading the responsiveness of the various controls.

  In contrast, in the first embodiment, the imaging target widths of the imaging areas are not divided equally; instead, areas with high distance resolution and areas with low distance resolution are set according to the traveling environment. That is, the imaging areas that require high distance resolution are limited to part of the range, depending on the traveling environment, and the imaging area that requires high distance resolution is the one corresponding to the driver's forward gazing point.

  Therefore, in the first embodiment, the imaging area containing the driver's forward gazing point is determined according to the traveling environment, such as the vehicle speed and obstacles (objects), and the resolution is raised only for the imaging area containing the forward gazing point, or for that area and the areas immediately before and after it, while the resolution of the other areas is lowered; this makes it possible to achieve both a short frame and high distance resolution.

(Resolution changing action when detecting an object)
FIG. 10 is a diagram showing the imaging target distance of each imaging area at the time of object detection; pedestrians are detected in area 3 and area n. In this case, in the flowchart of FIG. 6, the flow proceeds from step S21 to step S22 to step S23, and the resolution (imaging target width) of area 3, where the pedestrian closest to the host vehicle is detected, is set to 1 m, the resolution of areas 2 and 4 and area n to 2 m, and the resolution of the other areas to 5 m.

  That is, when a pedestrian is detected in area 3, the relative position and relative speed with respect to that pedestrian must be determined accurately, so the distance resolution of area 3 is raised (resolution 1 m) while that of the other areas is lowered (resolution 2 m or 5 m); this achieves both a short frame and high distance resolution while accurately grasping the distance of the pedestrian.

  In addition, since the pedestrian may move into area 2 or area 4 adjacent to area 3 before the next distance image data is generated, the distance resolution of areas 2 and 4 is raised to some extent (resolution 2 m), so that a large drop in distance resolution is avoided even when the pedestrian moves out of area 3.

  Furthermore, in Example 1, when pedestrians are detected in both area 3 and area n, the distance resolution of area 3 (1 m) is set higher than that of area n (2 m). This is because, when a plurality of pedestrians are detected, warning and vehicle control are determined according to the relative position and relative speed of the pedestrian close to the host vehicle, so the distance of the pedestrian closer to the host vehicle is the more important.

(Resolution changing action at low speed)
FIG. 11 is a diagram showing the imaging target distance of each imaging area during low-speed traveling; the vehicle speed of the host vehicle is 30 km/h. In this case, in the flowchart of FIG. 6, the flow proceeds from step S21 to step S22 to step S24, and the resolution of area 1 is set to 1 m, that of area 2 to 2 m, and that of areas 3 to n to 5 m.

  When driving at low speed, the driver drives while watching the situation in area 1, closest to the host vehicle. That is, for the driver, information on area 1 is the most important, and information on other areas, particularly areas far from the host vehicle, is less important. Thus, in the first embodiment, during low-speed traveling the distance resolution of area 1 is made the highest (1 m), so the situation of the area most important to the driver can be detected more accurately and presented to the driver.

(Resolution change function during high-speed driving)
FIG. 12 is a diagram showing the imaging target distance of each imaging area during high-speed traveling; the vehicle speed of the host vehicle is 100 km/h. In this case, in the flowchart of FIG. 6, the flow proceeds from step S21 to step S22 to step S24, and the resolution of area n is set to 2 m and that of areas 1 to n−1 to 5 m.

  When driving at high speed, the driver drives while watching the situation in area n, farthest from the host vehicle. That is, for the driver, information on area n is the most important, and information on other areas, particularly areas close to the host vehicle, is less important. Therefore, in the first embodiment, during high-speed traveling the distance resolution of area n is made the highest (2 m), so the situation of the area most important to the driver can be detected more accurately and presented to the driver.

  In Example 1, when the vehicle speed exceeds 40 km/h, the distance resolution of a farther imaging area is raised as the vehicle speed increases. As the vehicle speed increases, the driver's gazing point moves farther ahead; by giving the highest distance resolution to the imaging area containing the driver's gazing point as it moves, the situation of the imaging area most important to the driver can be detected more accurately and presented to the driver.

  Also in the first embodiment, the imaging target distance of area 1 is increased as the vehicle speed increases; in other words, the non-detection range on the near side of the vehicle (between the host vehicle and area 1) is widened as the vehicle speed increases. As described above, the driver's gazing point moves forward as the vehicle speed increases, so widening the near-side non-detection range with increasing vehicle speed avoids setting an unnecessarily wide imaging range.

Next, the effect will be described.
The distance image data generation device 2 according to the first embodiment has the following effects.

  (1) The device includes a projector 5 that projects pulsed light ahead of the host vehicle at a predetermined period, a high-speed camera 8 that images reflected light returning from an imaging area at an imaging timing set according to the imaging area, a timing controller 9 that controls the imaging timing so that the imaging area changes continuously, and an image processing unit 10 that generates distance image data representing the distance to an object for each pixel based on the luminance of the same pixel in a plurality of captured images with different imaging areas obtained by the high-speed camera 8; the timing controller 9 varies the length of the imaging time for each target distance according to the traveling environment. As a result, both a short frame and high distance resolution can be achieved.

  (2) The timing controller 9 makes the imaging time of the imaging area corresponding to the driver's forward gazing point, which changes according to the traveling environment, shorter than the imaging time of the other imaging areas. Thereby, the situation of the area most important to the driver can be detected more accurately and presented to the driver.

  (3) In the low vehicle speed range, the timing controller 9 shortens the imaging time of imaging areas close to the host vehicle and lengthens the imaging time of imaging areas far from it, so the situation of the imaging area that the driver needs most during low-speed driving can be detected more accurately and presented to the driver.

  (4) The timing controller 9 sets which imaging areas get shorter and which get longer imaging times according to the vehicle speed, shortening the imaging time of farther imaging areas as the vehicle speed increases. Thereby, the situation of the imaging area that the driver needs most at each vehicle speed can be detected more accurately and presented to the driver.

  (5) The timing controller 9 increases the imaging target distance of the imaging area closest to the host vehicle as the vehicle speed increases. That is, by widening the near-side non-detection range, which is not important to the driver, as the vehicle speed increases, the distance resolution of the imaging areas that are needed can be raised further.

  (6) Since the timing controller 9 makes the imaging time of the imaging area where an object is detected the shortest, the distance of the object, the most important information for the driver, can be grasped accurately while both a short frame and high distance resolution are achieved.

  (7) The timing controller 9 makes the imaging time of the imaging areas adjacent to the one where the object is detected shorter than the imaging time of the other imaging areas, so a certain distance resolution is secured even when the detected object moves into an adjacent imaging area, and a significant drop in distance resolution is avoided.

  (8) When a plurality of objects are detected, the timing controller 9 makes the imaging time of the imaging area where the object closest to the host vehicle is detected the shortest, so the distance of the object most important to the driver can be detected accurately.

(Other examples)
The best mode for carrying out the present invention has been described above based on the embodiment, but the specific configuration of the present invention is not limited to that of the embodiment; design changes and additions are allowed without departing from the gist of the invention according to each claim.

  For example, the light projection period, the light projection time, the gate time, the imaging target width, the amount of change in the imaging target distance, and the number of imaging areas per frame may be set appropriately according to the performance of the imaging means and of the distance image data generation means.

  In the embodiment, the imaging target width of each imaging area is set according to the position of the object when an object is detected and according to the vehicle speed when no object is detected; however, when an object is detected, the imaging target width of each imaging area may be set according to both the position of the object and the vehicle speed.

  In the embodiment, the speed of the host vehicle and the presence/absence and number of objects are used as the "traveling environment"; however, the shape of the traveling road (straight road, curved road), the weather (clear weather, bad weather), and the traveling time (day, night) may also be used.

(Road shape)
On a curved road, the imaging area given higher resolution is set closer to the host vehicle than on a straight road, because the driver's gazing point on a curved road is closer to the host vehicle than on a straight road. Thereby, the distance resolution of the area the driver needs can be raised.

(Weather)
In rain or fog, the imaging area given higher resolution is set closer to the host vehicle than in clear weather, because the driver's gazing point in bad weather is closer to the vehicle than in clear weather. Thereby, the distance resolution of the area the driver needs can be raised.

(Traveling time)
At night, the imaging area given higher resolution is set closer to the host vehicle than in the daytime, because the driver's gazing point at night is closer to the vehicle than during the day. Thereby, the distance resolution of the area the driver needs can be raised.

Brief Description of the Drawings

FIG. 1 is a block diagram showing the configuration of the obstacle detection apparatus 1 of Example 1.
FIG. 2 is a diagram showing the temporal relationship between the operation of the projector 5 (light projection operation) and the operation of the gate 7a (camera gate operation) when imaging one imaging area.
FIG. 3 is a diagram showing a state in which parts of the imaging areas overlap.
FIG. 4 is a schematic diagram showing the temporal luminance change when imaging is performed while the number of imaging areas is increased toward infinity.
FIG. 5 is a flowchart showing the flow of the distance image data generation control process executed by the image processing unit 10 of Example 1.
FIG. 6 is a flowchart showing the flow of the imaging time setting control process executed by the timing controller 9 of Example 1.
FIG. 7 is a diagram showing a situation in which four pedestrians A to D exist at different positions ahead of the host vehicle.
FIG. 8 is a schematic diagram showing the temporal luminance change of the pixels corresponding to pedestrians A to D.
FIG. 9 is a diagram showing the distance image data generation action of Example 1.
FIG. 10 is a diagram showing the imaging target distance of each imaging area at the time of object detection.
FIG. 11 is a diagram showing the imaging target distance of each imaging area during low-speed traveling.
FIG. 12 is a diagram showing the imaging target distance of each imaging area during high-speed traveling.

Explanation of symbols

1 Obstacle detection apparatus
2 Distance image data generation device
3 Object recognition processing unit
4 Determination unit
5 Projector (light projection means)
6 Objective lens
7 Light multiplication unit
7a Gate
7b Image intensifier
8 High-speed camera (imaging means)
9 Timing controller (timing control means)
10 Image processing unit (distance image data generation means)

Claims (8)

  1. Light projecting means for projecting pulsed light in front of the host vehicle at a predetermined period;
    Imaging means for imaging reflected light returning from the target distance at an imaging timing set according to the target distance;
    Timing control means for controlling the imaging timing so that the target distance changes continuously;
    Distance image data generation means for generating distance image data representing a distance to an object for each pixel based on the luminance of the same pixel in a plurality of captured images with different target distances obtained by the imaging means;
    With
    The vehicle distance image data generation apparatus characterized in that the timing control means varies the length of the imaging time of each target distance according to the traveling environment.
  2. The vehicle distance image data generation device according to claim 1,
    The vehicle distance image data generation apparatus characterized in that the timing control means makes the length of the imaging time of the target distance corresponding to the driver's forward gazing point, which changes according to the traveling environment, shorter than the imaging time of the other target distances.
  3. In the vehicular distance image data generation device according to claim 1 or 2,
    The vehicle distance image data generation apparatus characterized in that the timing control means shortens the imaging time of a target distance close to the own vehicle and lengthens the imaging time of a target distance far from the own vehicle in a low vehicle speed range.
  4. The vehicle distance image data generation device according to any one of claims 1 to 3,
    The vehicle distance image data generation apparatus characterized in that the timing control means sets a target distance for which the imaging time is shortened and a target distance for which it is lengthened according to the vehicle speed, and shortens the imaging time of a farther target distance as the vehicle speed increases.
  5. In the vehicular distance image data generation device according to any one of claims 1 to 4,
    The vehicle distance image data generating apparatus characterized in that the timing control means increases the target distance closest to the host vehicle as the vehicle speed increases.
  6. In the vehicular distance image data generation device according to any one of claims 1 to 5,
    The vehicle distance image data generation apparatus characterized in that the timing control means sets the imaging time of the target distance at which the object is detected to be the shortest.
  7. The vehicle distance image data generation device according to claim 6,
    The vehicle distance image data generation apparatus characterized in that the timing control means makes the imaging time of a target distance adjacent to the target distance where the object is detected shorter than the imaging time of another target distance.
  8. In the vehicular distance image data generation device according to claim 6 or 7,
    The vehicle distance image data generation apparatus characterized in that, when a plurality of objects are detected, the timing control means shortens the imaging time of the target distance at which the object closest to the host vehicle is detected.
JP2008108556A 2008-04-18 2008-04-18 Device for generating distance image data for vehicle Pending JP2009257981A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008108556A JP2009257981A (en) 2008-04-18 2008-04-18 Device for generating distance image data for vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2008108556A JP2009257981A (en) 2008-04-18 2008-04-18 Device for generating distance image data for vehicle

Publications (1)

Publication Number Publication Date
JP2009257981A true JP2009257981A (en) 2009-11-05

Family

ID=41385582

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2008108556A Pending JP2009257981A (en) 2008-04-18 2008-04-18 Device for generating distance image data for vehicle

Country Status (1)

Country Link
JP (1) JP2009257981A (en)


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009300133A (en) * 2008-06-11 2009-12-24 Japan Aerospace Exploration Agency Airborne optical remote air current measuring apparatus
JP2014213776A (en) * 2013-04-26 2014-11-17 株式会社デンソー Collision determination device, and collision mitigation device
US9460627B2 (en) 2013-04-26 2016-10-04 Denso Corporation Collision determination device and collision mitigation device
JP2016037098A (en) * 2014-08-06 2016-03-22 オムロンオートモーティブエレクトロニクス株式会社 Vehicular imaging system
WO2017110413A1 (en) * 2015-12-21 2017-06-29 株式会社小糸製作所 Image acquisition device for vehicles, control device, vehicle provided with image acquisition device for vehicles and control device, and image acquisition method for vehicles

Similar Documents

Publication Publication Date Title
US10257432B2 (en) Method for enhancing vehicle camera image quality
JP6536984B2 (en) Ranging imaging system, solid-state imaging device, and ranging imaging method
US9800779B2 (en) Bundling night vision and other driver assistance systems (DAS) using near infra-red (NIR) illumination and a rolling shutter
US9836657B2 (en) System and method for periodic lane marker identification and tracking
US9626570B2 (en) Vehicle control system and image sensor
EP2682898B1 (en) Lens-attached matter detector, lens-attached matter detection method, and vehicle system
US10377322B2 (en) In-vehicle camera and vehicle control system
US9352690B2 (en) Apparatus and method for detecting obstacle adaptively to vehicle speed
US9753141B2 (en) Gated sensor based imaging system with minimized delay time between sensor exposures
US9690997B2 (en) Recognition object detecting apparatus
KR101030763B1 (en) Image acquisition unit, acquisition method and associated control unit
US10295667B2 (en) Object detection system
DE102006020192B4 (en) Apparatus and method for predicting collision
DE60207395T2 (en) System for image analysis
JP4914234B2 (en) Leading vehicle detection device
JP6416085B2 (en) Gated imaging using applicable depth of field
JP3671825B2 (en) Inter-vehicle distance estimation device
US9904859B2 (en) Object detection enhancement of reflection-based imaging unit
US7839272B2 (en) Vehicle surroundings monitoring apparatus
KR101071362B1 (en) Vehicular object ranging system and method of operation
EP2815383B1 (en) Time to collision using a camera
JP5409929B2 (en) Control method for headlight device for vehicle and headlight device
EP2870031B1 (en) Gated stereo imaging system and method
US10564267B2 (en) High dynamic range imaging of environment with a high intensity reflecting/transmitting source
JP4993322B2 (en) Identification and classification of light spots around the vehicle by a camera