JPH06295601A - Headlight for vehicle - Google Patents

Headlight for vehicle

Info

Publication number
JPH06295601A
Authority
JP
Japan
Prior art keywords
vehicle
cut line
preceding vehicle
detected
step
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP8169293A
Other languages
Japanese (ja)
Inventor
Masashi Mizukoshi
雅司 水越
Original Assignee
Toyota Motor Corp
トヨタ自動車株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toyota Motor Corp, トヨタ自動車株式会社 filed Critical Toyota Motor Corp
Priority to JP8169293A priority Critical patent/JPH06295601A/en
Publication of JPH06295601A publication Critical patent/JPH06295601A/en
Pending legal-status Critical Current

Classifications

    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21 LIGHTING
    • F21S NON-PORTABLE LIGHTING DEVICES; SYSTEMS THEREOF; VEHICLE LIGHTING DEVICES SPECIALLY ADAPTED FOR VEHICLE EXTERIORS
    • F21S41/00 Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps
    • F21S41/60 Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by a variable light distribution
    • F21S41/68 Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by a variable light distribution by acting on screens
    • F21S41/683 Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by a variable light distribution by acting on screens by moving screens
    • F21S41/698 Shaft-shaped screens rotating along its longitudinal axis

Abstract

(57) [Summary] [Purpose] To prevent glare from being given to the preceding vehicle. A preceding vehicle is detected based on an image obtained by capturing the situation in front of the vehicle, and a window area W S is set based on the position of the preceding vehicle in the image. Next, the inside of the window area W S is binarized and then differentiated along the vertical direction, and points whose differential value is at or above a certain value (selection diagram (B)) are detected as the boundary (cut line) between the area irradiated with the light of the headlamp and the unirradiated area; these points are then linearly approximated. An approximate straight line extending substantially horizontally in the area W S is determined to be the cut line 70, and the height position of the cut line 70 is moved upward or downward by a predetermined amount. The above process is repeated every predetermined period, and when the cut line is no longer detected, the movement direction is reversed, so that the cut line 70 makes a round trip within the area W S. As a result, the height position of the cut line 70 with respect to the preceding vehicle is always kept at or below a predetermined height.

Description

Detailed Description of the Invention

[0001]

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a vehicular headlamp device, and more particularly to a vehicular headlamp device for controlling the light distribution of headlamps that illuminate the front of the vehicle while the vehicle is traveling.

[0002]

2. Description of the Related Art A vehicle is provided with a pair of headlamps on the right and left sides of its front end. The headlamps are turned on when it is difficult to visually recognize the situation ahead, such as at night, and improve the driver's visibility of the area in front of the vehicle. Such headlamps generally allow the irradiation range to be switched only between two settings, high beam and low beam, and when another vehicle such as a preceding vehicle or an oncoming vehicle is present, the low beam is usually selected so as not to give dazzling, unpleasant glare to the driver of the other vehicle. However, with the low beam, when the distance to the preceding vehicle is long, for example, the driver continuously looks at a dark area outside the irradiation range of the headlamps, while with the high beam, glare is given to the preceding vehicle. There is thus a problem that it is difficult to always irradiate an appropriate area ahead.

For this reason, it has been proposed to provide a light-shielding plate for blocking part of the irradiation light inside the headlamp and to move the light-shielding plate so that a sufficient irradiation range can be obtained without giving glare to other vehicles, that is, to control the position of the boundary between the irradiated area and the unirradiated area (hereinafter, this boundary is referred to as the cut line). Further, as a technique for controlling the position of the cut line so as not to give glare to other vehicles, it has been proposed to image the situation in front of the vehicle with a CCD camera or the like, recognize the preceding vehicle based on the image signal output from the CCD camera, detect the inter-vehicle distance to the preceding vehicle, and control the light distribution of the headlamps according to the inter-vehicle distance (Japanese Patent Laid-Open Nos. 6-58242 and 2-131837).

In addition, instead of detecting the inter-vehicle distance based on the image signal, it has been proposed to detect the inter-vehicle distance by a radar and perform the same control as described above.

[0005]

However, in the above-described control of the cut line based on the inter-vehicle distance, the actual position of the cut line may differ from the target cut line position due to, for example, a deviation in the mounting position of the headlamps, and the relationship between the inter-vehicle distance and the appropriate cut line position changes when the relative position with respect to the preceding vehicle changes due to the inclination of the vehicle, the slope of the road surface, or the like. As a result, glare may be given to the preceding vehicle, or the irradiation range may become insufficient and the visibility in front of the vehicle may be reduced.

The present invention has been made in consideration of the above facts, and an object of the present invention is to obtain a vehicle headlight device capable of preventing glare from being given to a preceding vehicle.

[0007]

In order to achieve the above object, a vehicle headlamp device according to the present invention comprises: a headlamp capable of changing at least one of an irradiation direction and an irradiation range; image pickup means for picking up the situation in front of the vehicle and outputting an image signal; detecting means for detecting a preceding vehicle based on the image signal output from the image pickup means and for detecting the height position of the boundary, adjacent in the vehicle vertical direction to a region in the image corresponding to the preceding vehicle, between a portion irradiated with the light of the headlamp and a portion not irradiated; and control means for controlling at least one of the irradiation direction and the irradiation range of the headlamp so that the height position of the boundary detected by the detecting means is at or below a predetermined height with respect to the preceding vehicle.

[0008]

For example, in a projector-type headlamp, the boundary (that is, the cut line) between the portion irradiated with the light of the headlamp (bright portion) and the portion not irradiated (dark portion) appears relatively clearly. Therefore, in the present invention, the preceding vehicle is detected on the basis of the image signal representing the situation in front of the vehicle output from the image pickup means, and the height position of the boundary, adjacent in the vehicle vertical direction to the area corresponding to the preceding vehicle in the image, between the part irradiated with the light of the headlamp and the part not irradiated is detected.

In the area corresponding to the preceding vehicle in the image, a bright portion and a dark portion may also be adjacent to each other along the vehicle vertical direction at, for example, the edge of a tail lamp of the preceding vehicle. However, the boundary between the illuminated and non-illuminated portions produced by the headlamp is usually continuous along the vehicle width direction, so by detecting only boundaries that are continuous for a predetermined length or more, for example, the edge of the tail lamp is not erroneously detected as the cut line, and the height position of the cut line alone can be detected.

In the present invention, at least one of the irradiation direction and the irradiation range of the headlamp is controlled so that the detected height position of the boundary is at or below a predetermined height with respect to the preceding vehicle (for example, the height of the tail lamps of the preceding vehicle). Since the height position of the cut line itself is detected and the light distribution of the headlamp is controlled based on the detected height position, the position of the cut line with respect to the preceding vehicle can be kept at or below the predetermined height even when the inter-vehicle distance changes, the mounting position of the headlamp is deviated, or the relative position with respect to the preceding vehicle changes due to the inclination of the vehicle, the slope of the road surface, or the like, so that glare to the preceding vehicle can be reliably prevented.

[0011]

Embodiments of the present invention will now be described in detail with reference to the drawings. As shown in FIG. 1, an engine hood 12 is arranged on the upper surface of a front body 10A of a vehicle 10, and a front bumper 16 is fixed to the front end portion of the front body 10A from one end to the other end in the vehicle width direction. A pair of headlamps 18 and 20 are disposed between the front bumper 16 and the front edge portion of the engine hood 12, at both ends in the vehicle width direction.

A windshield glass 14 is provided in the vicinity of the rear end of the engine hood 12, and a room mirror 15 is provided inside the vehicle 10 near the upper portion of the windshield glass 14. A TV camera 22 for picking up an image of the situation in front of the vehicle is arranged near the room mirror 15. The TV camera 22 is connected to an image processing device 48 (see FIG. 4). In the present embodiment, a TV camera that includes a CCD element detecting only the amount of light and that outputs an image signal representing a monochrome image is used as the TV camera 22.

The TV camera 22 is preferably arranged at a position as close as possible to the driver's viewpoint position (the so-called eye point) so that the road shape in front of the vehicle can be recognized accurately and in a way that matches the driver's own view. The road shape in the present embodiment includes the shape of the traveling road, for example the shape corresponding to one lane defined by the center line, the curb, or the like.

The vehicle 10 is provided with a speedometer (not shown), and a vehicle speed sensor 66 (see FIG. 4) for detecting the vehicle speed V of the vehicle 10 is attached to the cable of the speedometer. The vehicle speed sensor 66 is connected to the image processing device 48 and outputs the detection result of the vehicle speed V.

As shown in FIGS. 2 and 3, the headlamp 18 is a projector-type headlamp and includes a convex lens 30, a bulb 32, and a lamp house 34. The lamp house 34 is fixed substantially horizontally to a frame (not shown) of the vehicle 10. The convex lens 30 is fixed to one opening of the lamp house 34, and the bulb 32 is fixed to the other opening via a socket 36 such that the light emitting point of the bulb 32 is located on the optical axis L of the convex lens 30.

On the bulb side inside the lamp house 34, a reflector 38 having an elliptical reflection surface is formed. The light emitted from the bulb 32 is reflected by the reflector 38 and condensed between the convex lens 30 and the bulb 32. An actuator 40 is arranged near this condensing point. The actuator 40 includes a light-shielding cam 40A that is rotatably supported by a rotary shaft 44 fixed in the lamp house 34 along the vehicle width direction, and a gear 40B is fixed to the light-shielding cam 40A. A gear 40C fixed to the drive shaft of a motor 40D meshes with the gear 40B. The motor 40D is connected to a driver 64 of a control device 50.

Part of the light of the bulb 32 that has been reflected and condensed by the reflector 38 is blocked by the light-shielding cam 40A of the actuator 40, and the remaining light is emitted through the convex lens 30. The light-shielding cam 40A has a cam shape in which the distance from the rotary shaft 44 to its outer circumference changes continuously along the circumferential direction, and it is rotated by driving the motor 40D in response to a signal from the control device 50. With the rotation of the light-shielding cam 40A, the position of the boundary at which the light of the bulb 32 is divided into passing light and blocked light moves up and down. This boundary appears as a cut line (cut line 70 shown in FIG. 5), that is, a light/dark boundary in the light distribution in front of the vehicle 10.

As shown in FIG. 5, the position of the cut line 70 can be moved in parallel from a position corresponding to the uppermost position of the light-shielding cam 40A (the position shown by the solid line as the cut line 70 in FIG. 5, a position equivalent to a so-called high beam) to a position corresponding to the lowermost position (the position indicated by the imaginary line in FIG. 5, a position equivalent to a so-called low beam). Since the headlamp 20 has the same configuration as the headlamp 18, a detailed description thereof is omitted, but as shown in FIG. 4, an actuator 41 is attached to it. The actuator 41 includes a light-shielding cam 41A (not shown), and the position of the cut line is moved as the light-shielding cam 41A rotates.

As shown in FIG. 4, the control device 50 includes a read-only memory (ROM) 52, a random access memory (RAM) 54, a central processing unit (CPU) 56, an input port 58, an output port 60, and a bus 62, such as a data bus and a control bus, connecting these components. The ROM 52 stores a map and a control program described later.

A vehicle speed sensor 66 and the image processing device 48 are connected to the input port 58. As described later, the image processing device 48 processes the image captured by the TV camera 22 based on signals input from the TV camera 22 and the control device 50. The output port 60 is connected to the actuator 40 of the headlamp 18 and the actuator 41 of the headlamp 20 via a driver 64. The output port 60 is also connected to the image processing device 48.

Next, the operation of this embodiment will be described with reference to the flow charts of FIGS. 6 and 7. When the driver turns on a light switch (not shown) of the vehicle 10 to turn on the headlamps 18 and 20, the control main routine shown in FIG. 6 is executed at predetermined intervals. In step 300 of this control main routine, preceding vehicle recognition processing is executed to recognize a preceding vehicle traveling ahead of the host vehicle. This preceding vehicle recognition processing will be described with reference to the flowchart in FIG. 7.

FIG. 8A shows an example of an image (image 120) taken by the TV camera 22 while the vehicle 10 is traveling on a road 122, an image that roughly matches what the driver sees. The road 122 has white lines 124 on both sides of the lane in which the vehicle 10 travels. Each pixel in this image is identified by the coordinates (Xn, Yn) of a coordinate system defined by an X-axis and a Y-axis set on the image. In the following, the preceding vehicle is recognized based on this image.

At step 400, as shown in FIG. 9, an area of a predetermined width γ on the image is set as a white line detection window area W sd. In this embodiment, considering that only an image of approximately 40 to 50 m ahead of the vehicle 10 can be obtained when the vehicle 10 travels at night, white lines beyond about 60 m ahead of the vehicle 10 are not detected. In addition, a preceding vehicle is unlikely to be present in the lower area of the image. Therefore, the white line detection window area W sd is set by removing the area beyond 60 m ahead of the vehicle 10 and the area below a lower limit line 130, so that white lines can be detected up to 60 m ahead of the vehicle 10.

In the next step 402, the brightness inside the window area W sd is differentiated, and the peak points (maximum points) of the differential values are extracted as edge points, that is, white line candidate points. Specifically, for each pixel column in the horizontal direction, the brightness is differentiated in the vertical direction (the direction of arrow A in FIG. 9) within the window area W sd, from the lowermost pixel to the uppermost pixel, and the peak points of the differential value, where the brightness changes greatly, are extracted as edge points. As a result, continuous edge points are extracted as indicated by the broken line 132 in the window area W sd in FIG. 9.
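
The step-402 processing can be pictured with a short sketch. The following Python fragment (function names, the threshold value, and the use of NumPy are illustrative assumptions, not part of the patent) differentiates the brightness of each pixel column inside the window area W sd and keeps the peak of the differential value as a white line candidate point:

```python
import numpy as np

def extract_white_line_candidates(gray, wsd_mask, diff_threshold=30):
    """Sketch of step 402: differentiate brightness vertically inside the
    white line detection window W_sd and keep the peak of the differential
    value in each pixel column as an edge (white line candidate) point."""
    img = gray.astype(np.int16)
    vdiff = np.zeros_like(img)
    # Brightness change from each pixel to the pixel directly above it
    # (a bottom-to-top scan in image coordinates, where smaller y is higher).
    vdiff[1:, :] = img[:-1, :] - img[1:, :]
    vdiff[~wsd_mask] = 0                      # restrict the search to W_sd
    edge_points = []
    for x in range(img.shape[1]):             # one peak search per column
        col = vdiff[:, x]
        y = int(np.argmax(col))
        if col[y] >= diff_threshold:          # keep only pronounced peaks
            edge_points.append((x, y))
    return edge_points
```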

In step 404, a linear approximation process is performed.
In this processing, the edge points extracted by the white line candidate point extraction processing are linearly approximated using a Hough transform, and approximate straight lines 142 and 144 along the lines estimated to be the white lines are obtained. In the next step 405, the intersection point P N (X coordinate value = X N) of the approximate straight lines is obtained, and the horizontal displacement amount A (A = X N − X 0) between the obtained intersection point P N and the intersection point P 0 (X coordinate value = X 0) of the approximate straight lines for a predetermined straight road taken as a reference is calculated. The displacement amount A corresponds to the degree of curvature of the road 122.
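
As a rough illustration of steps 404 and 405 (not the patent's own implementation), the candidate points can be drawn into a blank image and fed to a Hough line transform, after which the intersection of the two approximate lines gives X_N and the displacement A = X_N − X_0. OpenCV's HoughLinesP is used here purely for convenience; all parameter values are assumptions:

```python
import numpy as np
import cv2

def approximate_lines(edge_points, image_shape):
    """Sketch of step 404: fit straight lines to the white line candidate
    points with a (probabilistic) Hough transform."""
    canvas = np.zeros(image_shape[:2], dtype=np.uint8)
    for x, y in edge_points:
        canvas[y, x] = 255
    return cv2.HoughLinesP(canvas, rho=1, theta=np.pi / 180, threshold=20,
                           minLineLength=30, maxLineGap=10)

def displacement_a(left_line, right_line, x0_reference):
    """Sketch of step 405: X coordinate X_N of the intersection P_N of the two
    approximate lines, and the displacement A = X_N - X_0 relative to the
    reference intersection for a straight road."""
    (x1, y1, x2, y2), (x3, y3, x4, y4) = left_line, right_line
    a1, b1 = y2 - y1, x1 - x2
    c1 = a1 * x1 + b1 * y1
    a2, b2 = y4 - y3, x3 - x4
    c2 = a2 * x3 + b2 * y3
    det = a1 * b2 - a2 * b1                   # zero only if the lines are parallel
    x_n = (c1 * b2 - c2 * b1) / det
    return x_n, x_n - x0_reference
```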

In the next step 406, it is determined whether the road 122 is a substantially straight road by determining whether the displacement amount A is within the range A 2 ≧ A ≧ A 1. The judgment reference value A 1 is a reference value representing the boundary between a straight road and a right curve road, and the judgment reference value A 2 is a reference value representing the boundary between a straight road and a left curve road. If it is determined in step 406 that the road is a straight road, the vehicle speed V of the host vehicle 10 is read in step 408.

In the next step 410, correction widths α L and α R for correcting the positions of the approximate straight lines are determined according to the read vehicle speed V, in order to set a vehicle recognition area W P for recognizing the preceding vehicle. When traveling at high speed, the radius of curvature of the road on which the vehicle can turn is large, so the vehicle can be regarded as traveling on a substantially straight road; on the other hand, even if the road near the vehicle is nearly straight, the preceding vehicle may fall outside the vehicle recognition area W P if the radius of curvature of the road is small in the distance. Therefore, the correction widths α L and α R are set using the map shown in FIG. 12 so that their values increase as the vehicle speed V decreases.
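
A minimal sketch of the step-410 map lookup, assuming a piecewise-linear FIG. 12-style map; the breakpoints and units below are invented for illustration:

```python
import numpy as np

# Hypothetical stand-in for the FIG. 12 map: the correction width grows
# as the vehicle speed V decreases.  Breakpoints and units are illustrative.
SPEED_KMH   = np.array([  0.0,  40.0,  80.0, 120.0])
ALPHA_WIDTH = np.array([ 60.0,  40.0,  20.0,  10.0])   # in pixels

def correction_widths_straight(vehicle_speed_kmh):
    """Step 410 sketch: correction widths alpha_L, alpha_R for the vehicle
    recognition area W_P; on a straight road both sides get the same value."""
    alpha = float(np.interp(vehicle_speed_kmh, SPEED_KMH, ALPHA_WIDTH))
    return alpha, alpha
```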

In the next step 412, the area surrounded by the lower limit line 130 and the approximate straight lines 142 and 144 whose positions have been corrected by the correction widths α L and α R is determined as the vehicle recognition area W P for recognizing the preceding vehicle (see FIG. 10). Since the correction widths α L and α R are changed according to the vehicle speed V, the area of the vehicle recognition area W P becomes larger as the vehicle travels at a lower speed (see FIG. 11).

On the other hand, if the determination in step 406 is negative, it is determined in step 414 whether the road is a right curve road or a left curve road by determining whether or not A > A 2. When this determination is affirmative, the road is determined to be a right curve road, the vehicle speed V of the vehicle 10 is read in step 416, and in step 418 the correction values α L ' and α R ' for the correction widths α L and α R according to the read vehicle speed V are determined using the map shown in FIG. 12. In the next step 420, the gains GL and GR that determine the correction widths α L and α R of the left and right approximate straight lines according to the displacement amount A, which indicates the degree of the curve, are determined using the maps shown in FIGS. 13 and 14. In step 422, the final left and right correction widths α R and α L of the window area are determined based on the determined correction values α R ' and α L ' and the gains GL and GR.
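
Steps 418 to 422 can be sketched the same way. The gain maps below are invented stand-ins for FIGS. 13 and 14, and the patent does not state how the correction values and gains are combined, so a simple product is assumed here purely for illustration:

```python
import numpy as np

# Hypothetical right-curve gain maps (stand-ins for FIGS. 13 and 14):
# a larger displacement A (sharper right curve) raises GR and lowers GL.
DISPLACEMENT_A = np.array([0.0, 50.0, 100.0, 200.0])
GAIN_GR        = np.array([1.0,  1.2,   1.5,   2.0])
GAIN_GL        = np.array([1.0,  0.9,   0.7,   0.5])

def right_curve_gains(displacement_a):
    """Step 420 sketch: gains GL, GR as functions of the displacement A."""
    gr = float(np.interp(displacement_a, DISPLACEMENT_A, GAIN_GR))
    gl = float(np.interp(displacement_a, DISPLACEMENT_A, GAIN_GL))
    return gl, gr

def final_correction_widths(alpha_l_prime, alpha_r_prime, gl, gr):
    """Step 422 sketch: combine the speed-based correction values with the
    curve-dependent gains (a product is assumed, not stated in the patent)."""
    return gl * alpha_l_prime, gr * alpha_r_prime
```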

At this time, since the road is a curved road, the left and right sides are asymmetric, and the approximate straight lines 142 and 144 have different slopes. Therefore, the left and right correction widths α R and α L are set to independent values. That is, when the road is a right-curved road and the radius of curvature is small (the displacement amount A is large), the probability that the preceding vehicle is present on the right side is high. Therefore, the correction width α R is increased by increasing the gain GR on the right side (see FIG. 13).
In addition, the correction width α L is reduced by reducing the gain GL on the left side (see FIG. 14). Conversely, when the road is a right curve road and the radius of curvature is large (the displacement amount A is small), the correction width α R is reduced by decreasing the gain GR on the right side, and the correction width α L is increased by increasing the gain GL on the left side. This change in the correction widths is illustrated in FIG. 15.

In step 424, the region surrounded by the approximate straight lines 142 and 144 whose positions have been corrected by the determined correction widths α L and α R is determined as the vehicle recognition region W P for the recognition processing of the preceding vehicle.

On the other hand, if the determination in step 414 is negative, it is determined that the road is a left curve road, the process proceeds to step 426, and the vehicle speed V of the vehicle 10 is read. In step 428, the left and right correction values α R ' and α L ' according to the read vehicle speed V are determined using the map of FIG. 12, and in step 430 the left and right gains GL and GR according to the displacement amount A are determined. That is, when the road is a left curve road and the radius of curvature is small (the displacement amount A is large), the preceding vehicle is highly likely to be on the left side; therefore, the correction width α R is reduced by decreasing the gain GR on the right side according to the map shown in FIG. 16, and the correction width α L is increased by increasing the gain GL on the left side according to the map shown in FIG. 17.

In the next step 432, the final left and right correction widths α R and α L of the window region are determined based on the determined correction values α R ' and α L ' and the gains GL and GR. In step 434, the area surrounded by the approximate straight lines 142 and 144 whose positions have been corrected by the determined left and right correction widths α R and α L is determined as the vehicle recognition area W P for recognizing the preceding vehicle.

When the vehicle recognition area W P has been determined as described above, the process proceeds to step 436, and horizontal edge detection processing within the vehicle recognition area W P is performed as the preceding vehicle recognition processing. In this horizontal edge detection processing, horizontal edge points are detected within the vehicle recognition area W P in the same manner as the edge detection processing of step 402. Next, the detected horizontal edge points are integrated in the lateral direction, and a peak point E P is detected at a position where the integrated value exceeds a predetermined value (see FIG. 8B). Such a horizontal edge is likely to appear when a preceding vehicle is present.
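
A hedged sketch of the step-436 idea (thresholds and names are assumptions): detect horizontal edge points inside W P, integrate them row by row, and keep the rows where the integral exceeds a set value as peak points E P:

```python
import numpy as np

def horizontal_edge_peaks(gray, wp_mask, edge_threshold=30, count_threshold=40):
    """Step 436 sketch: horizontal edge points in the vehicle recognition
    area W_P are integrated laterally; rows whose integrated value exceeds
    a predetermined value are returned as peak points E_P."""
    img = gray.astype(np.int16)
    hedge = np.zeros_like(img)
    # A horizontal edge shows up as a brightness change between vertically
    # adjacent pixels.
    hedge[1:, :] = np.abs(img[1:, :] - img[:-1, :])
    hedge[~wp_mask] = 0
    counts = (hedge >= edge_threshold).sum(axis=1)   # lateral integration per row
    return np.where(counts >= count_threshold)[0]    # candidate rows E_P
```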

In the next step 438, the position coordinates of the preceding vehicle are calculated. First, vertical edge detection processing is performed. When there are several peak points E P of the integrated value of the horizontal edge points, the peak points E P are processed in order from the one located lowest in the image, and window regions W R and W L for detecting vertical lines are set so as to include both ends of the horizontal edge points belonging to the peak point E P (see FIG. 8C). Vertical edges are detected in the window regions W R and W L, and when vertical lines 138R and 138L are stably detected, it is determined that a preceding vehicle exists in the area sandwiched between the window regions W R and W L.

Next, the vehicle width is obtained from the lateral distance between the vertical lines 138R and 138L detected in the window regions W R and W L, the coordinates of the center of that width are taken as the coordinates of the center of the vehicle, and the inter-vehicle distance Len is calculated. This completes the preceding vehicle recognition processing, and the process proceeds to step 302 in the flowchart of FIG. 6.
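
The patent does not give the formula for Len; as one plausible reading (purely an assumption), the apparent width between the vertical lines 138L and 138R can be converted to a distance with a pinhole-camera relation and a nominal real vehicle width:

```python
def vehicle_center_and_distance(x_left, x_right, focal_px=800.0, real_width_m=1.7):
    """Hedged sketch of the step-438 output: centre X coordinate of the
    preceding vehicle and an inter-vehicle distance estimate
    Len = f * W_real / w_pixels (pinhole model, assumed, not from the patent)."""
    width_px = float(x_right - x_left)
    x_center = (x_left + x_right) / 2.0
    len_m = focal_px * real_width_m / width_px
    return x_center, len_m
```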

In step 302, it is determined whether or not a preceding vehicle has been detected by the preceding vehicle recognition processing described above. If the determination in step 302 is negative, the process proceeds to step 306, where the angles of the light shielding cams 40A and 41A are changed according to the inter-vehicle distance Len with the preceding vehicle to control the position of the cut line 70. In this control, gains for the actuators 40 and 41 are obtained using a map such as that shown in FIG. 18, and the actuators 40 and 41 are driven according to the gains. As a result, the angles of the light shielding cams 40A and 41A are controlled so that the position of the cut line 70 moves upward as the inter-vehicle distance Len with the preceding vehicle increases. In this case, since there is no preceding vehicle, the light shielding cams are simply rotated to the predetermined angle corresponding to the high beam.
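
Step 306 itself can be sketched as a map lookup. The FIG. 18-style gain map below is invented for illustration; it only mirrors the stated behaviour that the cut line 70 moves upward as Len grows:

```python
import numpy as np

# Hypothetical stand-in for the FIG. 18 map: actuator gain rises with Len.
LEN_M    = np.array([ 0.0,  50.0, 100.0, 200.0])
ACT_GAIN = np.array([ 0.2,   0.5,   0.8,   1.0])   # 1.0 ~ high-beam cam angle

def distance_based_cam_angle(len_m, high_beam_angle_deg=30.0):
    """Step 306 sketch: choose the light-shielding cam angle from the
    inter-vehicle distance Len so the cut line rises as Len increases.
    All numeric values are illustrative assumptions."""
    gain = float(np.interp(len_m, LEN_M, ACT_GAIN))
    return gain * high_beam_angle_deg
```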

If the determination in step 302 is affirmative, the process proceeds to step 304, where it is determined whether the inter-vehicle distance Len with the preceding vehicle detected by the preceding vehicle recognition processing is smaller than a predetermined distance A (for example, 100 m). When the determination in step 304 is negative, the process proceeds to step 306, and the position of the cut line is controlled according to the inter-vehicle distance.

On the other hand, when the determination in step 304 is affirmative, the process proceeds to step 308, where a predetermined area of the image corresponding to the tail portion of the preceding vehicle is set as the window region W S based on the inter-vehicle distance Len and the vertical edges of the preceding vehicle detected by the preceding vehicle recognition processing. As shown for example in FIG. 19A, this window region W S is defined by setting a horizontal line 150A that connects the lower ends of the pair of vertical edges 138L and 138R detected in the preceding vehicle recognition processing, setting a horizontal line 150B at a position separated from the horizontal line 150A by a distance d corresponding to the inter-vehicle distance Len with the preceding vehicle, and taking the area bounded by the horizontal lines 150A and 150B and the vertical edges 138L and 138R. The distance d is set so as to decrease as the inter-vehicle distance Len increases, so that glare is not given to the preceding vehicle even when the position of the cut line 70 is slightly above the horizontal line 150B.
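
A sketch of the step-308 window follows, under the assumption (not stated explicitly in the text) that the line 150B lies a distance d above 150A on the tail of the preceding vehicle and that d shrinks linearly as Len grows; all pixel values are invented:

```python
def tail_window_ws(x_left, x_right, y_150a, len_m,
                   d_near_px=40, d_far_px=10, len_near_m=20.0, len_far_m=100.0):
    """Step 308 sketch: W_S is bounded left/right by the vertical edges
    138L/138R, below by the horizontal line 150A through their lower ends,
    and above by the line 150B a distance d away, where d decreases as the
    inter-vehicle distance Len increases (linear d(Len) model assumed)."""
    t = min(max((len_m - len_near_m) / (len_far_m - len_near_m), 0.0), 1.0)
    d = d_near_px + t * (d_far_px - d_near_px)       # d shrinks with Len
    return {"x_left": x_left, "x_right": x_right,
            "y_top": int(y_150a - d),                # horizontal line 150B
            "y_bottom": int(y_150a)}                 # horizontal line 150A
```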

In the next step 310, the image data corresponding to the inside of the window area W S set as described above is converted into binary data with reference to a threshold value. As an example, when the cut line 70 is located at the position shown in FIG. 19A, the portion irradiated with the light of the headlamps and the portion corresponding to the tail lamps of the preceding vehicle within the area W S are judged to be bright portions, the portion indicated by the hatching is judged to be a dark portion, and the data are converted into binary data accordingly. In step 312, the binary data in the window area W S are differentiated in the vertical direction, and points whose differential value is equal to or greater than a certain value are detected. In this differentiation, the differential value becomes equal to or greater than the certain value at points corresponding to the boundaries between bright and dark areas adjacent to each other along the vertical direction inside the window area W S, and as an example, points such as those shown in FIG. 19(B) are detected.
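
Steps 310 and 312 amount to a threshold followed by a vertical difference of the binary data; a minimal sketch, reusing the W S dictionary from the earlier window sketch (threshold value and names are assumptions):

```python
import numpy as np

def cut_line_candidate_points(gray, ws, binar_threshold=128):
    """Steps 310-312 sketch: binarize the image data inside W_S, differentiate
    the binary data vertically, and collect the points where the differential
    value reaches the set value, i.e. light/dark boundary points.  Only
    dark-above/bright-below transitions are kept, anticipating step 314."""
    patch = gray[ws["y_top"]:ws["y_bottom"], ws["x_left"]:ws["x_right"]]
    binary = (patch >= binar_threshold).astype(np.int8)   # 1 = bright, 0 = dark
    vdiff = np.zeros_like(binary)
    vdiff[1:, :] = binary[1:, :] - binary[:-1, :]         # +1: dark above -> bright below
    ys, xs = np.nonzero(vdiff == 1)
    # Return the points in full-image coordinates.
    return [(int(x) + ws["x_left"], int(y) + ws["y_top"]) for x, y in zip(xs, ys)]
```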

In step 314, the points detected in step 312 are linearly approximated along the horizontal direction using a Hough transform to obtain an approximate straight line estimated to be the cut line. This linear approximation is performed only for points corresponding to a boundary at which a dark portion changes to a bright portion when going from the upper side to the lower side of the window region W S. As a result, even when a point corresponding to the lower edge of a tail lamp of the preceding vehicle is detected as a point whose differential value is equal to or greater than the certain value, linear approximation is not performed for that point. The upper edge of a tail lamp does change from a dark portion to a bright portion from the top toward the bottom of the window area W S, but since this upper edge is not continuous over a predetermined length or more, erroneous detection as the cut line can be prevented by removing approximate straight lines of short length.
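
Step 314 can then be sketched as a Hough line fit over those points with short segments discarded, so that tail lamp edges, which are not continuous over the predetermined length, are rejected (parameter values and the use of OpenCV are assumptions):

```python
import numpy as np
import cv2

def detect_cut_line(points, ws, min_length_px=25):
    """Step 314 sketch: approximate the dark-to-bright boundary points by a
    straight line along the horizontal direction and reject short segments;
    returns the height position Xi of the approximate line, or None when the
    linear approximation fails (step 316 negative)."""
    h = ws["y_bottom"] - ws["y_top"]
    w = ws["x_right"] - ws["x_left"]
    canvas = np.zeros((h, w), dtype=np.uint8)
    for x, y in points:
        canvas[y - ws["y_top"], x - ws["x_left"]] = 255
    lines = cv2.HoughLinesP(canvas, rho=1, theta=np.pi / 180, threshold=15,
                            minLineLength=min_length_px, maxLineGap=5)
    if lines is None:
        return None
    # Take the segment with the largest horizontal extent as the cut line.
    x1, y1, x2, y2 = max(lines[:, 0, :], key=lambda l: abs(int(l[2]) - int(l[0])))
    return ws["y_top"] + (int(y1) + int(y2)) / 2.0       # height position Xi
```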

In step 316, it is determined whether or not the linear approximation has succeeded. When the determination in step 316 is affirmative, the process proceeds to step 320, and the height position Xi of the approximate straight line in the image is calculated. In step 322, it is determined whether or not the height position Xi of the approximate straight line detected this time has changed from the height position Xi-n of the approximate straight line detected n executions before. The control main routine of this embodiment is executed every predetermined period and, as described later, moves the position of the cut line on each execution. Therefore, when the height position Xi of the approximate straight line has not changed from the height position Xi-n of n executions before, the detected approximate straight line is regarded as a straight line appearing due to, for example, the bumper pattern of the preceding vehicle. In this case, the determination in step 322 is negative, the process proceeds to step 306, and the cut line position is controlled according to the inter-vehicle distance Len with the preceding vehicle.

On the other hand, when the determination in step 322 is affirmative, it can be judged that the detected approximate straight line is the cut line 70, and the process proceeds to step 324. The determination in step 322 is also affirmative when no approximate straight line was detected n executions before. In step 324, the value of a non-approximation counter FC provided in the memory is set to "0".

In step 326, it is determined whether or not the light shielding cams 40A and 41A were rotated by the previous cut line control so that the height position of the cut line 70 moved in the lowering direction. If the determination in step 326 is affirmative, the process proceeds to step 328, and the light-shielding cams 40A and 41A are rotated so that the position of the cut line 70 again moves by a predetermined amount in the lowering direction. When the determination in step 326 is negative, the process proceeds to step 330, and the light-shielding cams 40A and 41A are rotated so that the position of the cut line 70 moves by a predetermined amount in the raising direction. Therefore, while the cut line 70 is being detected, step 328 or step 330 of this main routine is executed repeatedly, and the angles of the light shielding cams 40A and 41A are controlled so that the cut line 70 moves in a fixed direction by a predetermined amount at a time.

When the position of the cut line 70 leaves the window region W S because the cut line 70 has continued to move in the fixed direction, the cut line 70 can no longer be detected, the determination in step 316 becomes negative, and the process proceeds to step 332. In step 332, "1" is added to the non-approximation counter FC, so the value of the non-approximation counter FC gradually increases while the cut line 70 is not detected. In the next step 334, it is determined whether or not the value of the non-approximation counter FC has exceeded a non-approximation limit value B. If the determination in step 334 is negative, the process proceeds to step 336, and it is determined whether or not the value of the non-approximation counter FC is "2" or more.

If the cut line 70 that had been detected up to the previous execution is no longer detected, the determination in step 336 is negative, and in step 338 it is determined whether the light shielding cams 40A and 41A were rotated by the previous cut line control so that the height position of the cut line 70 moved in the lowering direction. If the determination in step 338 is affirmative, the process proceeds to step 340, and the light shielding cams 40A and 41A are rotated so that the movement direction of the cut line 70 is reversed, that is, so that the position of the cut line 70 moves by a predetermined amount in the raising direction. When the determination in step 338 is negative, the light-shielding cams 40A and 41A are likewise rotated in step 342 so that the movement direction of the cut line 70 is reversed, that is, so that the position of the cut line 70 moves by a predetermined amount in the lowering direction.

When the main routine is executed next time, the determination in step 336 is affirmative, and in step 326 and then step 328 or step 330 the cut line 70 is moved in the same direction as its previous movement direction (in this case, the direction reversed last time). As a result, when the position of the cut line 70 has left the window region W S because the cut line 70 continued to move in a fixed direction as described above, the angles of the light shielding cams 40A and 41A are controlled so that the position of the cut line 70 returns to the inside of the window region W S.

When the cut line 70 is detected again in this way, the determination in step 316 becomes affirmative, and the cut line 70 is moved in the same direction as before in step 328 or step 330. Therefore, while the cut line 70 is normally detected, the position of the cut line 70 is controlled so as to move within the window region W S by a predetermined amount at a time in a fixed direction, and when the cut line 70 passes beyond the window region W S and is no longer detected, the movement direction of the cut line 70 is reversed and the cut line 70 is controlled so as to move back through the window region W S by the predetermined amount at a time in the reversed direction.
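
The sweep logic of steps 316-342 boils down to a small per-cycle state machine; the sketch below (which omits the step-322 check of Xi against the height n cycles earlier) keeps only the current movement direction and the non-approximation counter FC:

```python
def cut_line_sweep_step(state, cut_line_height, non_approx_limit_b=10):
    """Hedged sketch of steps 316-342.  `state` holds the moving direction
    (+1 = lower the cut line, -1 = raise it) and the counter FC.  While the
    cut line is detected it keeps moving the same way; on the first cycle it
    is lost the direction is reversed; after more than B lost cycles the
    caller should fall back to the distance-based control of step 306."""
    if cut_line_height is not None:      # step 316 affirmative
        state["fc"] = 0                  # step 324: reset non-approximation counter
        return state["direction"]        # steps 326-330: keep sweeping
    state["fc"] += 1                     # step 332
    if state["fc"] > non_approx_limit_b:
        return "use_distance_control"    # step 334 affirmative -> step 306
    if state["fc"] == 1:                 # step 336 negative: first missed cycle
        state["direction"] *= -1         # steps 338-342: reverse the sweep
    return state["direction"]

# usage sketch:
# state = {"direction": +1, "fc": 0}
# action = cut_line_sweep_step(state, detect_cut_line(points, ws))
```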

Thus, in this embodiment, the cut line 70 is controlled so as to be located within the window region W S set with the position of the preceding vehicle as a reference, and when it is judged that the position of the cut line 70 has become higher than the horizontal line 150B above the window region W S, the position of the cut line 70 is lowered. Therefore, even when the mounting position of the headlamps is deviated, or the relative position with respect to the preceding vehicle changes due to the inclination of the vehicle, the slope of the road surface, or the like, the position of the cut line 70 with respect to the preceding vehicle is always controlled so as to be at or below the predetermined height.

On the other hand, if the cut line 70 cannot be detected even after this main routine has been executed more than the non-approximation limit value B times since the cut line 70 was last detected, the determination in step 334 becomes affirmative, the process proceeds to step 306, and the cut line position is controlled according to the inter-vehicle distance Len with the preceding vehicle.

In this embodiment, the inter-vehicle distance Len with the preceding vehicle is detected based on the position of the preceding vehicle in the image. However, the present invention is not limited to this; the direction in which the preceding vehicle is present may be detected based on the image, and the inter-vehicle distance may be measured by a distance measuring means such as a radar.

In this embodiment, the case where the cut line 70 having the shape shown in FIG. 5 is detected has been described as an example. However, the present invention is not limited to this; it can also be applied, as shown in FIG. 20, to detecting a cut line 90 in which the portion corresponding to the left side of the irradiation region is inclined to the left with respect to the horizontal. It is also possible to control the height position of the cut line corresponding to the right side of the irradiation area and the height position of the cut line corresponding to the left side of the irradiation area independently of each other.

Further, in the above embodiment, the light distribution in front of the vehicle is controlled by the light blocking cam, but the light of the headlamp may be blocked by a light blocking plate or a shutter. Further, although the light distribution is controlled by blocking the light of the headlamp, the emission optical axis of the headlamp may be deflected.

[0054]

As described above, according to the present invention, the preceding vehicle is detected based on the image signal obtained by imaging the situation in front of the vehicle, the height position of the boundary between the portion irradiated with the light of the headlamp and the portion not irradiated, adjacent in the vehicle vertical direction to the area corresponding to the preceding vehicle in the image, is detected, and at least one of the irradiation direction and the irradiation range of the headlamp is controlled so that the detected height position of the boundary is at or below a predetermined height at which no glare is given to the preceding vehicle. This gives the excellent effect that glare to the preceding vehicle can be prevented in the various situations encountered while the vehicle is traveling.

[Brief description of drawings]

FIG. 1 is a perspective view of a vehicle showing a front portion of the vehicle used in the present embodiment as seen obliquely from the front.

FIG. 2 is a perspective view showing a schematic configuration of a headlamp to which the present invention can be applied.

FIG. 3 is a sectional view taken along line III-III in FIG.

FIG. 4 is a block diagram showing a schematic configuration of a control device.

FIG. 5 is an image diagram for explaining a cut line displaced by an actuator.

FIG. 6 is a flowchart illustrating a control main routine of this embodiment.

FIG. 7 is a flowchart illustrating details of a preceding vehicle recognition process.

8A is an image diagram of an image captured by a TV camera during the day, FIG. 8B is a conceptual diagram for explaining horizontal edge point integration processing, and FIG. 8C is a vertical edge detection processing. It is a conceptual diagram of.

FIG. 9 is a diagram showing a window area at the time of recognizing a white line.

FIG. 10 is a diagram showing a vehicle recognition area.

FIG. 11 is an image diagram for explaining that the vehicle recognition area is changed according to the vehicle speed.

FIG. 12 is a diagram showing a relationship between a vehicle speed and a correction width of an approximate straight line.

FIG. 13 is a diagram showing the relationship between the degree of a right curve road and the gain that determines the correction width of the approximate straight line on the right side.

FIG. 14 is a diagram showing a relationship between the degree of a right curve road and a gain that determines a correction width of an approximate straight line on the left side.

FIG. 15 is an image diagram showing window regions and correction widths for curved roads having different curvatures.

FIG. 16 is a diagram showing a relationship between a degree of a left curved road and a gain that determines a correction width of an approximate straight line on the right side.

FIG. 17 is a diagram showing the relationship between the degree of a left curved road and the gain that determines the correction width of the approximated straight line on the left side.

FIG. 18 is a diagram showing a relationship between an inter-vehicle distance and a gain for determining a rotation angle of a light shielding cam of an actuator.

FIG. 19 (A) is an image diagram showing a window region W S set in the tail portion of the preceding vehicle, and FIG. 19 (B) is an image diagram for explaining the process of the cut line detection processing.

FIG. 20 is an image diagram showing another example of the shape of a cut line.

[Explanation of symbols]

 18 headlamp 20 headlamp 22 TV camera 40 actuator 41 actuator 48 image processing device 50 control device 70 cut line 90 cut line 100 traveling vehicle detection device

Claims (1)

[Claims]
1. A vehicle headlight device comprising: a headlamp capable of changing at least one of an irradiation direction and an irradiation range; image pickup means for picking up a situation in front of a vehicle and outputting an image signal; detecting means for detecting a preceding vehicle based on the image signal output from the image pickup means and for detecting a height position of a boundary, adjacent in a vehicle vertical direction to an area corresponding to the preceding vehicle in the image, between a portion irradiated with light of the headlamp and a portion not irradiated with the light of the headlamp; and control means for controlling at least one of the irradiation direction and the irradiation range of the headlamp so that the height position of the boundary detected by the detecting means is at or below a predetermined height with respect to the preceding vehicle.
JP8169293A 1993-04-08 1993-04-08 Headlight for vehicle Pending JPH06295601A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP8169293A JPH06295601A (en) 1993-04-08 1993-04-08 Headlight for vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP8169293A JPH06295601A (en) 1993-04-08 1993-04-08 Headlight for vehicle

Publications (1)

Publication Number Publication Date
JPH06295601A true JPH06295601A (en) 1994-10-21

Family

ID=13753421

Family Applications (1)

Application Number Title Priority Date Filing Date
JP8169293A Pending JPH06295601A (en) 1993-04-08 1993-04-08 Headlight for vehicle

Country Status (1)

Country Link
JP (1) JPH06295601A (en)

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1998019886A1 (en) * 1996-11-06 1998-05-14 Pierre Ravussin Automatic control device for motor vehicle headlights
EP1491402A2 (en) 2003-06-25 2004-12-29 Hitachi, Ltd. Auto light system for vehicles
US7881496B2 (en) 2004-09-30 2011-02-01 Donnelly Corporation Vision system for vehicle
US8842176B2 (en) 1996-05-22 2014-09-23 Donnelly Corporation Automatic vehicle exterior light control
US8890955B2 (en) 2010-02-10 2014-11-18 Magna Mirrors Of America, Inc. Adaptable wireless vehicle vision system based on wireless communication error
US8908040B2 (en) 2007-10-04 2014-12-09 Magna Electronics Inc. Imaging system for vehicle
US8917169B2 (en) 1993-02-26 2014-12-23 Magna Electronics Inc. Vehicular vision system
US8993951B2 (en) 1996-03-25 2015-03-31 Magna Electronics Inc. Driver assistance system for a vehicle
US9008369B2 (en) 2004-04-15 2015-04-14 Magna Electronics Inc. Vision system for vehicle
US9014904B2 (en) 2004-12-23 2015-04-21 Magna Electronics Inc. Driver assistance system for vehicle
US9041806B2 (en) 2009-09-01 2015-05-26 Magna Electronics Inc. Imaging and display system for vehicle
US9117123B2 (en) 2010-07-05 2015-08-25 Magna Electronics Inc. Vehicular rear view camera display system with lifecheck function
US9126525B2 (en) 2009-02-27 2015-09-08 Magna Electronics Inc. Alert system for vehicle
US9146898B2 (en) 2011-10-27 2015-09-29 Magna Electronics Inc. Driver assist system with algorithm switching
US9171217B2 (en) 2002-05-03 2015-10-27 Magna Electronics Inc. Vision system for vehicle
US9191574B2 (en) 2001-07-31 2015-11-17 Magna Electronics Inc. Vehicular vision system
US9245448B2 (en) 2001-07-31 2016-01-26 Magna Electronics Inc. Driver assistance system for a vehicle
US9264672B2 (en) 2010-12-22 2016-02-16 Magna Mirrors Of America, Inc. Vision display system for vehicle
US9436880B2 (en) 1999-08-12 2016-09-06 Magna Electronics Inc. Vehicle vision system
US9495876B2 (en) 2009-07-27 2016-11-15 Magna Electronics Inc. Vehicular camera with on-board microcontroller
US9509957B2 (en) 2008-07-24 2016-11-29 Magna Electronics Inc. Vehicle imaging system
US9619720B2 (en) 2013-08-19 2017-04-11 Gentex Corporation Vehicle imaging system and method for distinguishing between vehicle tail lights and flashing red stop lights
US9796332B2 (en) 2007-09-11 2017-10-24 Magna Electronics Inc. Imaging system for vehicle
US9900522B2 (en) 2010-12-01 2018-02-20 Magna Electronics Inc. System and method of establishing a multi-camera image using pixel remapping
US10071676B2 (en) 2006-08-11 2018-09-11 Magna Electronics Inc. Vision system for vehicle
US10086747B2 (en) 2007-07-12 2018-10-02 Magna Electronics Inc. Driver assistance system for vehicle

Cited By (72)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8917169B2 (en) 1993-02-26 2014-12-23 Magna Electronics Inc. Vehicular vision system
US8993951B2 (en) 1996-03-25 2015-03-31 Magna Electronics Inc. Driver assistance system for a vehicle
US8842176B2 (en) 1996-05-22 2014-09-23 Donnelly Corporation Automatic vehicle exterior light control
US9131120B2 (en) 1996-05-22 2015-09-08 Magna Electronics Inc. Multi-camera vision system for a vehicle
WO1998019886A1 (en) * 1996-11-06 1998-05-14 Pierre Ravussin Automatic control device for motor vehicle headlights
US9436880B2 (en) 1999-08-12 2016-09-06 Magna Electronics Inc. Vehicle vision system
US10099610B2 (en) 2001-07-31 2018-10-16 Magna Electronics Inc. Driver assistance system for a vehicle
US10406980B2 (en) 2001-07-31 2019-09-10 Magna Electronics Inc. Vehicular lane change system
US10046702B2 (en) 2001-07-31 2018-08-14 Magna Electronics Inc. Control system for vehicle
US10611306B2 (en) 2001-07-31 2020-04-07 Magna Electronics Inc. Video processor module for vehicle
US9463744B2 (en) 2001-07-31 2016-10-11 Magna Electronics Inc. Driver assistance system for a vehicle
US9834142B2 (en) 2001-07-31 2017-12-05 Magna Electronics Inc. Driving assist system for vehicle
US9191574B2 (en) 2001-07-31 2015-11-17 Magna Electronics Inc. Vehicular vision system
US9656608B2 (en) 2001-07-31 2017-05-23 Magna Electronics Inc. Driver assist system for vehicle
US9376060B2 (en) 2001-07-31 2016-06-28 Magna Electronics Inc. Driver assist system for vehicle
US9245448B2 (en) 2001-07-31 2016-01-26 Magna Electronics Inc. Driver assistance system for a vehicle
US9555803B2 (en) 2002-05-03 2017-01-31 Magna Electronics Inc. Driver assistance system for vehicle
US9171217B2 (en) 2002-05-03 2015-10-27 Magna Electronics Inc. Vision system for vehicle
US9834216B2 (en) 2002-05-03 2017-12-05 Magna Electronics Inc. Vehicular control system using cameras and radar sensor
US10683008B2 (en) 2002-05-03 2020-06-16 Magna Electronics Inc. Vehicular driving assist system using forward-viewing camera
US10118618B2 (en) 2002-05-03 2018-11-06 Magna Electronics Inc. Vehicular control system using cameras and radar sensor
US9643605B2 (en) 2002-05-03 2017-05-09 Magna Electronics Inc. Vision system for vehicle
US10351135B2 (en) 2002-05-03 2019-07-16 Magna Electronics Inc. Vehicular control system using cameras and radar sensor
US7449997B2 (en) 2003-06-25 2008-11-11 Hitachi, Ltd. Auto light system
EP1491402A2 (en) 2003-06-25 2004-12-29 Hitachi, Ltd. Auto light system for vehicles
US9948904B2 (en) 2004-04-15 2018-04-17 Magna Electronics Inc. Vision system for vehicle
US9191634B2 (en) 2004-04-15 2015-11-17 Magna Electronics Inc. Vision system for vehicle
US9736435B2 (en) 2004-04-15 2017-08-15 Magna Electronics Inc. Vision system for vehicle
US9008369B2 (en) 2004-04-15 2015-04-14 Magna Electronics Inc. Vision system for vehicle
US10110860B1 (en) 2004-04-15 2018-10-23 Magna Electronics Inc. Vehicular control system
US9428192B2 (en) 2004-04-15 2016-08-30 Magna Electronics Inc. Vision system for vehicle
US10187615B1 (en) 2004-04-15 2019-01-22 Magna Electronics Inc. Vehicular control system
US9609289B2 (en) 2004-04-15 2017-03-28 Magna Electronics Inc. Vision system for vehicle
US10306190B1 (en) 2004-04-15 2019-05-28 Magna Electronics Inc. Vehicular control system
US10462426B2 (en) 2004-04-15 2019-10-29 Magna Electronics Inc. Vehicular control system
US10015452B1 (en) 2004-04-15 2018-07-03 Magna Electronics Inc. Vehicular control system
US7881496B2 (en) 2004-09-30 2011-02-01 Donnelly Corporation Vision system for vehicle
US8977008B2 (en) 2004-09-30 2015-03-10 Donnelly Corporation Driver assistance system for vehicle
US10623704B2 (en) 2004-09-30 2020-04-14 Donnelly Corporation Driver assistance system for vehicle
US10509972B2 (en) 2004-12-23 2019-12-17 Magna Electronics Inc. Vehicular vision system
US9014904B2 (en) 2004-12-23 2015-04-21 Magna Electronics Inc. Driver assistance system for vehicle
US9193303B2 (en) 2004-12-23 2015-11-24 Magna Electronics Inc. Driver assistance system for vehicle
US9940528B2 (en) 2004-12-23 2018-04-10 Magna Electronics Inc. Driver assistance system for vehicle
US10071676B2 (en) 2006-08-11 2018-09-11 Magna Electronics Inc. Vision system for vehicle
US10086747B2 (en) 2007-07-12 2018-10-02 Magna Electronics Inc. Driver assistance system for vehicle
US9796332B2 (en) 2007-09-11 2017-10-24 Magna Electronics Inc. Imaging system for vehicle
US8908040B2 (en) 2007-10-04 2014-12-09 Magna Electronics Inc. Imaging system for vehicle
US10616507B2 (en) 2007-10-04 2020-04-07 Magna Electronics Inc. Imaging system for vehicle
US10003755B2 (en) 2007-10-04 2018-06-19 Magna Electronics Inc. Imaging system for vehicle
US9509957B2 (en) 2008-07-24 2016-11-29 Magna Electronics Inc. Vehicle imaging system
US9126525B2 (en) 2009-02-27 2015-09-08 Magna Electronics Inc. Alert system for vehicle
US9911050B2 (en) 2009-02-27 2018-03-06 Magna Electronics Inc. Driver active safety control system for vehicle
US10106155B2 (en) 2009-07-27 2018-10-23 Magna Electronics Inc. Vehicular camera with on-board microcontroller
US9495876B2 (en) 2009-07-27 2016-11-15 Magna Electronics Inc. Vehicular camera with on-board microcontroller
US10053012B2 (en) 2009-09-01 2018-08-21 Magna Electronics Inc. Imaging and display system for vehicle
US9789821B2 (en) 2009-09-01 2017-10-17 Magna Electronics Inc. Imaging and display system for vehicle
US10300856B2 (en) 2009-09-01 2019-05-28 Magna Electronics Inc. Vehicular display system
US9041806B2 (en) 2009-09-01 2015-05-26 Magna Electronics Inc. Imaging and display system for vehicle
US8890955B2 (en) 2010-02-10 2014-11-18 Magna Mirrors Of America, Inc. Adaptable wireless vehicle vision system based on wireless communication error
US9117123B2 (en) 2010-07-05 2015-08-25 Magna Electronics Inc. Vehicular rear view camera display system with lifecheck function
US9900522B2 (en) 2010-12-01 2018-02-20 Magna Electronics Inc. System and method of establishing a multi-camera image using pixel remapping
US10486597B1 (en) 2010-12-22 2019-11-26 Magna Electronics Inc. Vehicular vision system with rear backup video display
US10336255B2 (en) 2010-12-22 2019-07-02 Magna Electronics Inc. Vehicular vision system with rear backup video display
US9731653B2 (en) 2010-12-22 2017-08-15 Magna Electronics Inc. Vision display system for vehicle
US10144352B2 (en) 2010-12-22 2018-12-04 Magna Electronics Inc. Vision display system for vehicle
US9469250B2 (en) 2010-12-22 2016-10-18 Magna Electronics Inc. Vision display system for vehicle
US9598014B2 (en) 2010-12-22 2017-03-21 Magna Electronics Inc. Vision display system for vehicle
US10589678B1 (en) 2010-12-22 2020-03-17 Magna Electronics Inc. Vehicular rear backup vision system with video display
US9264672B2 (en) 2010-12-22 2016-02-16 Magna Mirrors Of America, Inc. Vision display system for vehicle
US9146898B2 (en) 2011-10-27 2015-09-29 Magna Electronics Inc. Driver assist system with algorithm switching
US9919705B2 (en) 2011-10-27 2018-03-20 Magna Electronics Inc. Driver assist system with image processing and wireless communication
US9619720B2 (en) 2013-08-19 2017-04-11 Gentex Corporation Vehicle imaging system and method for distinguishing between vehicle tail lights and flashing red stop lights

Similar Documents

Publication Publication Date Title
US9371031B2 (en) Method for controlling a headlamp system for a vehicle, and headlamp system
US8629768B2 (en) Vehicle vision system
US20140049648A1 (en) Systems and methods for detecting obstructions in a camera field of view
EP1513103B1 (en) Image processing system and vehicle control system
EP1451038B1 (en) Headlamp control to prevent glare
EP1538024B1 (en) Apparatus for controlling auxiliary equipment of vehicle
EP1683668B1 (en) Variable transmissivity window system
CN1133554C (en) Lamp device for vehicle
US8870424B2 (en) Automotive headlamp apparatus for controlling light distribution pattern
ES2735548T3 (en) Procedure for the control of a headlight device for a vehicle and headlight device
EP0869031B1 (en) Device for controlling the light beam and/or the lighting direction
US7566851B2 (en) Headlight, taillight and streetlight detection
US9415718B2 (en) Vehicular headlight apparatus
JP4496964B2 (en) Tunnel detection device for vehicle and light control device for vehicle
EP2508392B1 (en) Method and device for controlling the light emission of a front headlamp of a vehicle
US6765353B2 (en) Method and apparatus for detecting the state of humidity on a road on which a vehicle is travelling
JP4557537B2 (en) Apparatus and method for controlling direction of headlamp of vehicle
DE4436684C2 (en) Vehicle headlights with a controllable shielding device for a low beam
US8768576B2 (en) Undazzled-area map product, and system for determining whether to dazzle person using the same
US8425092B2 (en) Headlamp control device and vehicle headlamp having headlamp control device
JP2014013449A (en) On-vehicle device
JP4473232B2 (en) Vehicle front environment detecting device for vehicle and lighting device for vehicle
US9481292B2 (en) Method and control unit for influencing a lighting scene ahead of a vehicle
EP0230620B1 (en) Headlight device for vehicles, especially for motor vehicles
JP4258385B2 (en) Road surface reflection detector