JP2009012608A - Vehicular lighting device - Google Patents

Vehicular lighting device

Info

Publication number
JP2009012608A
JP2009012608A (application number JP2007176519A)
Authority
JP
Japan
Prior art keywords
vehicle, line of sight, acquired, driver
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2007176519A
Other languages
Japanese (ja)
Inventor
Shunichi Doi
Naoki Fukaya
Kenji Sato
Original Assignee
Denso Corp
Kagawa Univ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Denso Corp and Kagawa Univ
Priority to JP2007176519A
Publication of JP2009012608A
Legal status: Pending

Abstract

PROBLEM TO BE SOLVED: To provide a vehicular lighting device whose irradiation angle and irradiation range are appropriate while the vehicle is traveling.

SOLUTION: While the illumination 50 is turned off, the vehicle speed, the vehicle position, and the road condition ahead of the traveling vehicle are acquired, and the driver's line of sight is obtained by image processing from a face image of the driver captured by a camera 30. The acquired vehicle speed, vehicle position, front road condition, and driver's line of sight are accumulated in an external storage device 60, and the relationship among them is learned and stored as a learning result. When the illumination 50 is turned on at night, the curvature of the road ahead of the vehicle is obtained from an on-vehicle GPS unit 20 and a map database 10, and, based on the learning result accumulated in the external storage device 60, i.e., the relationship among the vehicle speed, the vehicle position, and the line of sight, the irradiation angle of the illumination 50 during traveling is controlled to an angle suitable for the curvature of the road ahead.

COPYRIGHT: (C)2009,JPO&INPIT

Description

  The present invention relates to a vehicular illumination device capable of controlling the irradiation direction and irradiation range of the illumination while the vehicle is traveling.

  Conventionally, there is an adaptive front-lighting system (hereinafter abbreviated as AFS) that controls the irradiation angle of a headlamp in accordance with the steering angle of an automobile. Because the AFS determines the irradiation angle uniquely from the steering angle, a gap arises between the irradiation range of the headlamp and the range at which the driver is actually gazing, so there was a problem that the point the driver actually wants to see could not be illuminated.

On the other hand, in recent years, a method for controlling the irradiation angle and irradiation range of the headlamp in accordance with the driver's point of sight has been proposed (see, for example, Patent Document 1).
JP-A-6-72234

  However, the driver's gaze distribution fluctuates from moment to moment, and the point of sight jumps to oncoming vehicles and pedestrians; thus, if the headlamp irradiation angle is controlled to follow the gaze distribution, there is a problem that the road ends up becoming difficult for the driver to see.

  Also, roads contain curves of various curvatures. In actual driving, the extent to which the driver shifts the gazing point from straight ahead depends on the curvature: on some curves the gazing point barely moves from the front, while on others it is turned well into the curve. Furthermore, the line of sight taken while driving varies from driver to driver.

  The method described in the above-mentioned Patent Document 1 (controlling the irradiation angle and irradiation range of the headlamps in accordance with the driver's point of sight) is not designed to cope with such road conditions (the curvature of the curve ahead).

  That is, in the method described in Patent Document 1, since the light distribution is not optimized for the movement of the gazing point that follows the shape of the curve, there was a problem that the irradiation angle and irradiation range of the headlamp during curve driving did not reflect the characteristics of the individual operator.

  The present invention has been made in view of these problems, and an object of the present invention is to provide a vehicular illumination device in which the illumination angle and illumination range of the illumination while the vehicle is traveling are appropriate for the driver.

  The vehicular lighting device according to claim 1, which has been made to solve this problem (in this section, the reference numerals used in the "Best Mode for Carrying Out the Invention" column are attached where needed to facilitate understanding of the invention, but this does not mean that the scope of the claims is limited by those reference numerals), comprises traveling situation acquisition means (5), road situation acquisition means (10, 20), line-of-sight detection means (40), information storage means (60), line-of-sight pattern learning means (40), and illumination control means (40).

  The traveling condition acquisition means (5) acquires the traveling condition of the vehicle, and the road condition acquisition means (10, 20) acquires the road condition in front of the vehicle. The line-of-sight detection means (40) detects the line-of-sight direction of the driver of the vehicle, and the information accumulation means (60) is for accumulating information.

  The line-of-sight pattern learning means (40) stores in the information storage means (60) the driver's line-of-sight direction detected by the line-of-sight detection means (40) while the vehicle is traveling, together with the vehicle traveling situation acquired by the traveling situation acquisition means (5), and, based on the line-of-sight directions and traveling situations stored in the information storage means (60), learns the driver's line-of-sight pattern corresponding to the vehicle traveling situation.

  When the vehicle travels with the illumination (50) turned on, the illumination control means (40) controls the irradiation angle of the vehicle illumination (50) based on the line-of-sight pattern learned by the line-of-sight pattern learning means (40), according to the road condition ahead of the vehicle detected by the road condition acquisition means (10, 20).

  In such a vehicular lighting device (1), the traveling situation acquisition means (5) acquires the traveling situation of the vehicle, for example, whether the vehicle is traveling on a curve and, if so, what the curvature of that curve is.

  Further, the road condition acquisition means (10, 20) acquires the road condition ahead of the vehicle, for example, whether the road ahead is a curve and, if so, its curvature. In addition, the line-of-sight pattern learning means (40) learns the driver's line-of-sight pattern corresponding to the traveling situation while the vehicle is traveling.

  Thus, for example, if learning is performed according to the vehicle traveling situation during the daytime, then when the vehicle travels with the illumination (50) turned on, such as at night, the irradiation angle of the vehicle illumination (50) is controlled based on the line-of-sight pattern learned according to the daytime traveling situation.

  In other words, when driving at night on a curved road with the same curvature as one driven in the daytime, the irradiation angle of the illumination (50) is controlled so as to illuminate the road along the same line-of-sight pattern that the driver takes in the daytime.

  Thus, since the road is illuminated by the illumination (50) based on the driver's line-of-sight pattern when it is difficult for the driver to grasp the road conditions, such as at night, the irradiation angle and irradiation range become appropriate while the vehicle is traveling. That is, the driver can secure a proper field of view even at night. Therefore, safety when traveling with the illumination (50) turned on can be improved.

  Moreover, since the driver's past line-of-sight directions are accumulated and learned, rather than the irradiation angle of the illumination (50) being controlled directly according to the current line-of-sight direction as in the prior art, the irradiation direction of the illumination (50) does not swing as the driver's line of sight changes. That is, since the irradiation direction of the illumination (50) is stable, comfort for the driver increases and safety becomes higher.

  By the way, various methods are conceivable for acquiring the road condition ahead of the vehicle. According to a second aspect of the present invention, the road condition acquisition means includes: map data storage means (10) for storing map data; and current position acquisition means (20) for acquiring the current position of the vehicle. The road condition ahead of the vehicle may be acquired based on the current position of the vehicle acquired by the current position acquisition means (20) and the map data stored in the map data storage means (10).

Thus, if the current position of the vehicle is known, the road condition ahead of the vehicle can be acquired from the map data.
Further, as described in claim 3, front image acquisition means (70) for acquiring an image ahead of the vehicle may be provided, and the road condition ahead of the vehicle may also be acquired from the image acquired by the front image acquisition means (70).
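As an illustrative sketch only (the patent gives no algorithm), the curvature of the road ahead in the claim-2 arrangement could be estimated from three consecutive map waypoints near the position reported by the current position acquisition means, using the radius of the circumscribed circle; the function name and the planar (x, y) waypoint format are assumptions:

```python
import math

def curvature_radius(p1, p2, p3):
    """Radius [m] of the circle through three consecutive map waypoints
    given as planar (x, y) coordinates in metres.
    Returns math.inf for (near-)collinear points, i.e. a straight road."""
    ax, ay = p1
    bx, by = p2
    cx, cy = p3
    # Side lengths of the triangle formed by the three waypoints.
    a = math.dist(p2, p3)
    b = math.dist(p1, p3)
    c = math.dist(p1, p2)
    # Cross product = twice the triangle area; zero means collinear.
    cross = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)
    if abs(cross) < 1e-9:
        return math.inf
    # Circumradius formula: R = abc / (4 * area) = abc / (2 * |cross|).
    return (a * b * c) / (2 * abs(cross))
```

For example, three waypoints lying on a circle of radius 150 m yield a radius of 150 m, matching the R150 curve used in the examples later in the text.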

  Furthermore, in order to detect the driver's line of sight, the line-of-sight detection means (40) may be provided with face image acquisition means (30) that acquires the driver's face image, as described in claim 4; if the driver's line of sight is acquired from the face image acquired by the face image acquisition means (30), the line of sight can be acquired easily.

  Incidentally, although various things are conceivable as the vehicle traveling situation acquired by the traveling situation acquisition means (5), as described in claim 5, the traveling situation acquisition means (5) may be provided with traveling direction acquisition means (5) that acquires the traveling direction of the vehicle, and the traveling direction acquired by it may be used as the traveling situation of the vehicle.

  In this way, the traveling direction of the vehicle is acquired as the traveling state of the vehicle, and the driver's line of sight is accumulated together with the acquired traveling direction of the vehicle. Further, based on the accumulated traveling direction of the vehicle and the driver's line of sight, the driver's line-of-sight pattern corresponding to the traveling direction of the vehicle is learned.

  Then, when the vehicle travels with the illumination (50) turned on, the irradiation angle of the vehicle illumination (50) is controlled based on the line-of-sight pattern learned by the line-of-sight pattern learning means (40), according to the road condition ahead of the vehicle detected by the road condition acquisition means (10, 20).

Thus, since the irradiation angle of the illumination (50) of the vehicle at the time of lighting is controlled based on the traveling direction of the vehicle, the traveling direction of the vehicle can be appropriately irradiated.
Further, as described in claim 6, speed acquisition means (7) for acquiring the vehicle speed may be provided in the traveling situation acquisition means (5), and the vehicle speed may be acquired by the speed acquisition means (7) in addition to the traveling direction as the traveling situation of the vehicle.

  In this way, the vehicle speed is acquired in addition to the traveling direction. Since the traveling situation can then be estimated from the traveling direction supplemented by the speed, it can be grasped more accurately.

  Incidentally, the vehicle traveling situation and the driver's line-of-sight pattern must be accumulated before learning by the line-of-sight pattern learning means (40) has progressed. It is therefore convenient to have a standard line-of-sight pattern available before learning for the individual driver has progressed.

  Therefore, as described in claim 7, if the line-of-sight pattern learning means (40) learns the average line-of-sight pattern of a plurality of drivers based on the line-of-sight directions of those drivers detected by the line-of-sight detection means (40) during driving, a standard line-of-sight pattern can be obtained, and the irradiation direction of the illumination (50) can be controlled using it.
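A minimal sketch of how such a standard pattern might be formed, assuming each driver's learned pattern is a table from a discretized driving situation to an average gaze angle (the data shapes are assumptions, not from the patent):

```python
from statistics import mean

def standard_pattern(driver_patterns):
    """Average several drivers' learned gaze tables into one standard table.
    driver_patterns: list of dicts mapping a situation key, e.g.
    (curvature_bin, speed_bin, time_to_curve_bin), to a gaze angle [deg]."""
    merged = {}
    for pattern in driver_patterns:
        for key, angle in pattern.items():
            merged.setdefault(key, []).append(angle)
    # A situation seen by only some drivers is averaged over those drivers.
    return {key: mean(angles) for key, angles in merged.items()}
```

The resulting table can serve as the default until enough individual data has been accumulated.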

  In addition, as described in claim 8, the line-of-sight pattern learning means (40) may use a driver's line-of-sight pattern corresponding to a predetermined vehicle traveling situation as the learning result until learning has been performed with a predetermined number of line-of-sight patterns.

  In this way, until learning results based on a predetermined number or more of line-of-sight patterns are obtained, the irradiation angle of the illumination (50) is controlled using a driver's line-of-sight pattern for a predetermined traveling situation as the learning result. If this default is set appropriately, the illumination can still be controlled to an appropriate irradiation angle.

  Incidentally, depending on the road conditions, the driver may intentionally perform a driving operation different from the usual one, for example, steering although the road ahead is not a curve, or maneuvering to avoid an obstacle on the road. In such cases, the illumination (50) needs to be controlled to an irradiation angle different from the one given by the learning result.

  Therefore, as described in claim 9, the illumination control means (40) compares the vehicle traveling situation acquired by the traveling situation acquisition means (5) with the traveling situation in the line-of-sight pattern learned by the line-of-sight pattern learning means (40), and when the acquired traveling situation deviates from the traveling situation in the learned line-of-sight pattern by a predetermined value or more, it may control the irradiation angle of the illumination (50) based on the vehicle traveling situation acquired by the traveling situation acquisition means (5).

In this way, when the driver intentionally performs a driving operation different from the usual one, such as a danger-avoidance maneuver, the driver's operation deviates by a predetermined value or more from the traveling situation in the learned line-of-sight pattern. In such a case, the irradiation angle of the illumination (50) is controlled based on the vehicle traveling situation acquired by the traveling situation acquisition means (5). That is, when an operation other than the usual driving operation is performed, lighting control that matches the driver's intention is carried out, which is preferable from the viewpoint of driver acceptance, and as a result safety during driving can be ensured.
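The claim-9 behaviour could be sketched as follows; the 5-degree threshold, the dictionary shapes, and the fallback mapping from steering angle to lamp angle are all assumptions for illustration:

```python
def choose_illumination_angle(current, learned, pattern_angle,
                              steering_to_angle, threshold_deg=5.0):
    """Return the lamp swivel angle [deg].
    current / learned: dicts holding a 'steering_deg' entry describing the
    live and learned driving situations. If the live steering deviates from
    the learned situation by threshold_deg or more (e.g. an avoidance
    manoeuvre), fall back to an angle derived from the live situation
    instead of using the learned line-of-sight pattern."""
    deviation = abs(current["steering_deg"] - learned["steering_deg"])
    if deviation >= threshold_deg:
        return steering_to_angle(current["steering_deg"])
    return pattern_angle
```

The fallback path corresponds to lighting control that follows the driver's live intention rather than the learned pattern.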

In addition, after a predetermined number or more of line-of-sight patterns have been obtained, the irradiation angle of the illumination (50) is controlled based on the learning result, so the illumination can be controlled to an appropriate irradiation angle.
Incidentally, a specific driver does not necessarily always drive the same vehicle. Therefore, as described in claim 10, the information storage means (60) may be configured to be portable.

  In this way, since the learning result of the line-of-sight pattern learning means (40) can be stored and carried, even when the driver drives another vehicle, the irradiation angle of the illumination (50) when it is turned on is controlled based on the past learning result, so safety during driving can be improved.

Embodiments to which the present invention is applied will be described below with reference to the drawings. The embodiment of the present invention is not limited to the following embodiment, and can take various forms as long as they belong to the technical scope of the present invention.
[First Embodiment]
FIG. 1 is a block diagram showing a schematic configuration of the vehicular lighting device 1. As shown in FIG. 1, the vehicular lighting device 1 includes a steering angle detection device 5, a speed detection device 7, a map database 10, a GPS vehicle-mounted device 20, an indoor camera 30, a control device 40, an illumination 50, and an external storage device 60.

  The steering angle detection device 5 is for acquiring the traveling state of the vehicle, and acquires the traveling direction of the vehicle by detecting the steering angle of the vehicle. An encoder or a potentiometer is used to detect the steering angle.

  The speed detection device 7 is a device for acquiring the vehicle speed, and acquires the vehicle speed from the rotational speed of the axle. Further, the speed of the vehicle may be obtained from the rate of change of the current position of the vehicle obtained from the GPS onboard device 20.
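The GPS-based fallback mentioned above amounts to differencing successive position fixes; a minimal sketch (the planar coordinate format and the function name are assumptions):

```python
import math

def speed_from_gps(p_prev, p_now, dt_s):
    """Estimate vehicle speed [m/s] from two GPS fixes given as planar
    (x, y) coordinates in metres (e.g. after projecting lat/lon),
    taken dt_s seconds apart."""
    if dt_s <= 0:
        raise ValueError("dt_s must be positive")
    return math.dist(p_prev, p_now) / dt_s
```

Multiplying the result by 3.6 converts it to km/h, the unit used in the learning examples below.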

  The map database 10 stores map data including road curvature, and the GPS onboard device 20 acquires the current position of the vehicle. The map database 10 and the GPS onboard device 20 can acquire the road situation ahead of the vehicle.

  That is, the current position of the vehicle in the map database 10 can be determined from the current position of the vehicle acquired by the GPS onboard device 20. Here, since the curvature of the road is included in the map database 10, the curvature of the road ahead of the vehicle can be acquired.

  The indoor camera 30 is a CCD camera for acquiring the driver's face image, and is installed on the dashboard of the vehicle (for example, in the meter cluster or on the steering column) or near the rear-view mirror so that the driver's face image can be acquired.

The control device 40 includes a CPU, ROM, RAM, I/O, and the like (not shown), and executes a line-of-sight detection process, a learning process, and an illumination control process.
In the line-of-sight detection process, the driver's line of sight is acquired from the driver's face image acquired by the indoor camera 30. As a method of acquiring the line of sight from a face image, a method using a neural network (e.g., "Detection of Driver's Driving State 1: Line-of-Sight Detection by a Camera in the Inner Mirror", Image Recognition and Understanding Symposium (MIRU 2004), July 2004, pp. I-63-68) is used.

  In the learning process, the driver's line-of-sight direction detected via the indoor camera 30 while the vehicle travels without the illumination 50 turned on is stored together with the road condition ahead of the vehicle detected from the map database 10 and the GPS in-vehicle device 20, and with the vehicle steering angle and speed detected by the steering angle detection device 5 and the speed detection device 7. Based on the accumulated line-of-sight direction, vehicle position, front road condition, and vehicle speed, the driver's line-of-sight pattern corresponding to the vehicle speed and vehicle position for the front road condition is learned.

  In the illumination control process, when the vehicle travels with the illumination 50 turned on, the irradiation angle of the illumination 50 is controlled based on the line-of-sight pattern learned in the learning process, according to the road condition ahead of the vehicle detected from the map database 10 and the GPS vehicle-mounted device 20.

  The illumination 50 irradiates the periphery of the vehicle in response to commands from the control device 40: it turns on and off upon receiving an on/off command, and changes its irradiation direction upon receiving an angle command.

The external storage device 60 is a detachable storage device for storing the line-of-sight pattern learned in the learning process, and is a memory stick, a small hard disk device, or the like.
(Processing in the control device 40)
Next, the gaze detection process, the learning process, and the illumination control process in the control device 40 will be described. Since each process is executed by one CPU in the control device 40, each process will not be described separately and will be described as one control process based on FIG. FIG. 2 is a flowchart showing a flow of control processing executed by the control device 40.

  This control process starts when the vehicular lighting device 1 is turned on. When the process starts, it is first determined in S100 whether the illumination 50 is lit. If it is not lit (S100: No), the process proceeds to S105; if it is lit (S100: Yes), the process proceeds to S145.

  In S105, the vehicle speed is acquired from the speed detection device 7, and in subsequent S110, the vehicle position is acquired from the GPS vehicle-mounted device 20. In subsequent S115, the front road condition in the vehicle traveling direction is acquired from the map database 10 based on the vehicle speed acquired in S105 and the vehicle position acquired in S110.

  In S120, the driver's face image is acquired from the indoor camera 30, and in subsequent S125, the driver's line of sight is detected by image processing from the driver's face image acquired in S120.

  In S130, the vehicle speed acquired in S105, the vehicle position acquired in S110, the front road condition acquired in S115, and the driver's line of sight acquired in S125 are accumulated in the external storage device 60.

  Here, FIG. 3 shows an example of the relationship between the steering angle of the vehicle and the driver's line of sight. FIGS. 3A to 3D show the steering angle and the line of sight of each subject (subjects 1 to 4) when the subjects drove on the same road, which curves to the left and right.

In FIGS. 3A to 3D, the horizontal axis indicates the elapsed time [s], the left vertical axis the steering angle [deg], and the right vertical axis the driver's line-of-sight angle [deg].
As shown in FIGS. 3A to 3D, the line of sight starts moving along the forward curve before the vehicle reaches the curve and a steering angle is applied, and the line-of-sight angle taken on the curve differs from person to person. The line-of-sight movement of each subject is accumulated together with the vehicle speed, the vehicle position, and the road condition ahead.

  In S135, the driver's line-of-sight pattern corresponding to the vehicle traveling situation, that is, the vehicle speed and vehicle position for the front road condition, is learned from the vehicle speed, vehicle position, front road condition, and driver's line of sight stored in the external storage device 60 in S130.

The contents of the learning are explained concretely. As described above, the external storage device 60 stores the vehicle speed, the vehicle position, the front road condition, and the line of sight of each driver as the traveling situation of the vehicle.
Here, from the vehicle speed, the vehicle position, and the front road condition, data is obtained indicating what curvature the curve ahead has, how far away it is, and in how many seconds it will be reached. Therefore, if the curvature of the forward curve, the vehicle speed, and the positional relationship to the curve (time to reach it) stored in the external storage device 60 are divided into predetermined ranges and the line of sight is averaged within each range, a line-of-sight pattern corresponding to the driving situation (vehicle speed, vehicle position) for the forward curve is obtained.

  For example, a table such as "when a left curve with a curvature of R130 to 170 m lies ahead, the vehicle speed is 30 to 35 km/h, and the time to reach the curve entrance is about 3.5 to 4.5 seconds, driver A moves the line of sight about 3 degrees to the left on average" is obtained. This is the learning result.
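The binning-and-averaging step that produces such a table could look roughly like this; the bin widths are illustrative assumptions chosen to resemble the ranges in the example above:

```python
from collections import defaultdict
from statistics import mean

def learn_gaze_pattern(samples, curv_step=40.0, speed_step=5.0, ttc_step=1.0):
    """Build the learning-result table by binning logged samples and
    averaging the gaze angle within each bin.
    samples: iterable of (curvature_m, speed_kmh, time_to_curve_s, gaze_deg),
    where gaze_deg is negative to the left (a sign convention assumed here)."""
    bins = defaultdict(list)
    for curv, speed, ttc, gaze in samples:
        key = (int(curv // curv_step),
               int(speed // speed_step),
               int(ttc // ttc_step))
        bins[key].append(gaze)
    # One averaged gaze angle per (curvature, speed, time-to-curve) bin.
    return {key: mean(gazes) for key, gazes in bins.items()}
```

Two logged samples at roughly R150 m, 30-35 km/h, and 4 s to the curve entrance fall into the same bin and are averaged into a single table entry.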

In S140, the learning result obtained in S135 is accumulated in the external storage device 60, and then the process returns to S100.
On the other hand, if it is determined in S100 that the illumination 50 is lit and the process proceeds to S145, the vehicle speed is acquired from the speed detection device 7 in S145, and the current vehicle position is acquired from the GPS onboard device 20 in S150.

In S155, the current front road condition in the vehicle traveling direction is acquired from the vehicle speed acquired in S145, the vehicle position acquired in S150, and the map database 10.
Specifically, from the vehicle speed acquired in S145 and the vehicle position acquired in S150, it is determined on the map database 10 what curvature the road ahead in the traveling direction has and in how many seconds the vehicle will reach the curve entrance.

  In subsequent S160, the driver's line-of-sight pattern corresponding to the vehicle speed acquired in S145, the vehicle position acquired in S150, and the front road condition acquired in S155 is read from the external storage device 60. That is, from the learning results stored in the external storage device 60, the driver's line-of-sight pattern whose road curvature, vehicle speed, and vehicle position match those obtained in S145 to S155 is acquired.

  In S165, the irradiation direction of the illumination 50 is controlled based on the driver's line-of-sight pattern acquired in S160. For example, if the road ahead is a left curve of R150 m, the vehicle speed is 30 km/h, the vehicle will reach the curve entrance after about 4.5 seconds, and the acquired line-of-sight pattern indicates that the driver's line of sight turns 3 degrees to the left of the vehicle front, then the direction of the illumination 50 is controlled so that the illumination covers that angle from the vehicle front.
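Looking up the learned table and issuing the swivel command could then be sketched as below, with the same assumed bin widths as in the learning sketch and straight ahead (0 degrees) as the fallback when no pattern has been learned yet:

```python
def control_illumination(table, curvature_m, speed_kmh, time_to_curve_s,
                         curv_step=40.0, speed_step=5.0, ttc_step=1.0,
                         default_deg=0.0):
    """Return the lamp swivel command [deg] for the current situation by
    looking up the learned table (situation bin -> average gaze angle).
    Falls back to default_deg when the situation was never learned."""
    key = (int(curvature_m // curv_step),
           int(speed_kmh // speed_step),
           int(time_to_curve_s // ttc_step))
    return table.get(key, default_deg)
```

With a table entry of -3 degrees for the R150 m / 30 km/h / 4.5 s bin, the lamp would be swivelled 3 degrees to the left, mirroring the example in the text.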

After the illumination direction is controlled in S165, the process returns to S100.
(Characteristics of vehicle lighting device 1)
In the vehicular lighting device 1 described above, the driver's line-of-sight pattern corresponding to the traveling situation of the vehicle (vehicle speed, vehicle position, front road condition) is learned while the vehicle travels without the illumination 50 turned on, such as in the daytime.

Then, when the vehicle is traveling with the illumination 50 turned on, such as at night, the irradiation angle of the illumination 50 of the vehicle is controlled based on the line-of-sight pattern learned according to the traveling state of the vehicle.
In this way, when it is difficult for the driver to grasp the road conditions, such as at night, the road is illuminated by the illumination 50 based on the line-of-sight pattern learned when the driver could grasp the road conditions well, such as in the daytime, so the driver can secure the same field of view at night as in the daytime. Therefore, safety when traveling with the illumination 50 turned on can be improved.

  Further, the driver's past line-of-sight directions are accumulated and learned, rather than the irradiation angle of the illumination 50 being controlled directly according to the driver's current line-of-sight direction as in the prior art. Therefore, the irradiation direction of the illumination 50 is controlled according to the road conditions so as to give a stable field of view similar to the daytime, without fluctuating with changes in the driver's line of sight, and driving becomes safer.

  Further, the average line-of-sight pattern of a plurality of drivers can be learned based on the line-of-sight directions of those drivers detected by the indoor camera 30 during driving. Therefore, a standard line-of-sight pattern can be obtained, and the irradiation direction of the illumination 50 can be controlled using it.

  Furthermore, since the external storage device 60 is configured to be portable, the learning result in the control device 40 can be stored and carried. Therefore, even when the driver drives another vehicle, the irradiation angle of the illumination 50 when it is turned on is controlled based on the past learning result, so safety during driving can be improved.

[Second Embodiment]
Next, a second embodiment will be described based on FIG. 4 and FIG. 4 and 5 are flowcharts showing a flow of illumination control processing executed by the control device 40 in the second embodiment.

  The illumination control process in the second embodiment starts when the vehicular lighting device 1 is turned on. When the process starts, it is first determined in S200 whether the illumination 50 is lit. If it is not lit (S200: No), the process proceeds to S205 and S250; if it is lit (S200: Yes), the process proceeds to S280 (FIG. 5). Note that S205 to S245 and S250 to S275, described below, are processed in parallel.

  In S205, the vehicle speed is acquired from the speed detection device 7, and in subsequent S210, the vehicle position is acquired from the GPS onboard device 20. In subsequent S215, the steering angle of the vehicle is acquired from the steering angle detection device 5.

In subsequent S220, the vehicle speed acquired in S205, the vehicle position acquired in S210, and the front road condition in the vehicle traveling direction are acquired from the map database 10.
In S225, the driver's face image is acquired from the indoor camera 30, and in subsequent S230, the driver's line of sight is detected by image processing from the driver's face image acquired in S225.

  In S235, the vehicle speed acquired in S205, the vehicle position acquired in S210, the steering angle acquired in S215, the front road condition acquired in S220, and the driver's line of sight acquired in S230 are accumulated in the external storage device 60.

  In S240, from the vehicle speed, vehicle position, steering angle, front road condition, and driver's line of sight accumulated in the external storage device 60 in S235, the driver's line-of-sight pattern according to the driving situation, that is, the front road condition, steering angle, vehicle speed, and vehicle position, is learned.

In subsequent S245, the learning result in S240 is accumulated in the external storage device 60, and then the process returns to S200.
On the other hand, in S250, the steering angle of the vehicle is acquired from the steering angle detection device 5; in S255, the driver's face image is acquired from the indoor camera 30; and in S260, the driver's line of sight is detected by image processing from the face image acquired in S255.

In S265, the steering angle acquired in S250 and the driver's line of sight acquired in S260 are accumulated in the external storage device 60.
In S270, the driver's line-of-sight pattern corresponding to the steering angle is learned from the steering angle and the driver's line of sight stored in the external storage device 60 in S265.

  The contents of the learning are explained concretely. As shown in FIG. 3, the line-of-sight movement toward a curve ahead starts before a steering angle is generated. That is, the steering-angle input observed when the line-of-sight movement starts is learned as lying within a very small range, and at the same time the relationship between the steering angle and the line-of-sight angle is learned.

  For example, a table such as "when driver A approaches a left curve with a curvature of R130 to 170 m ahead at a vehicle speed of 30 to 35 km/h, with about 3.5 to 4.5 seconds until reaching the curve entrance, he moves his line of sight to the left by 3 degrees on average, while the steering angle in this time zone stays within ±1 degree" is obtained. Similarly, a table such as "when the steering angle is 2.5 to 3.5 degrees to the left, driver A turns his gaze to the left by 5 degrees on average" is obtained. These are the learning results.
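One way to picture how such a table could be built from the accumulated records (a hedged sketch; the patent does not specify the learning algorithm, and the bucketing scheme and all names here are illustrative assumptions):

```python
from collections import defaultdict

def learn_gaze_table(records):
    """Average the observed gaze angle per driving-situation bucket.

    Each record is (curvature_bucket, speed_bucket, gaze_deg), e.g.
    ("L_R130-170m", "30-35km/h", -3.0) for a 3-degree leftward gaze.
    The result approximates tables like the one in the text:
    driving situation -> average line-of-sight angle.
    """
    sums = defaultdict(lambda: [0.0, 0])
    for curvature, speed, gaze in records:
        entry = sums[(curvature, speed)]
        entry[0] += gaze   # running sum of gaze angles
        entry[1] += 1      # number of observations in this bucket
    return {key: total / n for key, (total, n) in sums.items()}

# Two observations of leftward gaze while approaching the same curve:
records = [("L_R130-170m", "30-35km/h", -2.5),
           ("L_R130-170m", "30-35km/h", -3.5)]
table = learn_gaze_table(records)
# table[("L_R130-170m", "30-35km/h")] averages to -3.0 degrees
```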

In S275, the learning result obtained in S270 is accumulated in the external storage device 60, and then the process returns to S200.
On the other hand, if it is determined in S200 that the illumination 50 is lit and the process proceeds to S280, then, as shown in FIG. 5, the vehicle speed is acquired from the speed detection device 7 in S280, and the current vehicle position is acquired from the GPS vehicle-mounted device 20 in S285.

  In S290, the steering angle of the vehicle is acquired from the steering angle detection device 5, and in S295 the current front road condition in the vehicle traveling direction, corresponding to the vehicle speed acquired in S280 and the vehicle position acquired in S285, is acquired from the map database 10.

  In S300, it is determined from the vehicle speed, vehicle position, steering angle, and front road condition acquired in S280 to S295 whether the steering angle deviates from the pattern of vehicle speed, vehicle position, and front road condition accumulated in S245 or S275.

  If the steering angle matches that pattern (S300: No), the process proceeds to S305; if not (S300: Yes), the process proceeds to S310.

In S305, a line-of-sight pattern corresponding to the vehicle speed, the vehicle position, and the road condition ahead is acquired. In S310, a line-of-sight pattern corresponding to the steering angle is acquired.
In subsequent S315, the irradiation direction of the illumination 50 is controlled in accordance with the line-of-sight pattern acquired in S305 or S310, and then the process returns to S200 (see FIG. 4), and the illumination control process is repeated.
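The selection in S300 to S310 can be summarized in code (an illustrative sketch only; the band representation and all names are assumptions, with left steering written as a negative angle):

```python
def select_gaze_pattern(steering_deg, expected_band_deg,
                        road_pattern, steering_pattern):
    """Choose which learned line-of-sight pattern drives the lamp.

    S300: if the actual steering angle stays inside the band that the
    learned road-situation pattern predicts, use the pattern indexed by
    vehicle speed, position, and front road condition (S305); otherwise
    the driver is steering intentionally, so the pattern indexed by the
    steering angle takes priority (S310).
    """
    lo, hi = expected_band_deg
    if lo <= steering_deg <= hi:      # S300: No (matches the pattern)
        return road_pattern           # S305
    return steering_pattern           # S310

# Expected band is +/-1 degree, but 3 degrees of left steering is applied,
# so the steering-indexed pattern is chosen:
chosen = select_gaze_pattern(-3.0, (-1.0, 1.0),
                             road_pattern="gaze_from_road",
                             steering_pattern="gaze_from_steering")
```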

  In the vehicular lighting device 1 described above, depending on the road condition ahead of the vehicle, for example when the road ahead curves, if the driver intentionally generates steering before the point where steering normally starts when approaching the curve, the illumination angle is controlled according to the steering angle, giving priority to the steering input.

  For example, suppose the road ahead of the vehicle is a left curve of R150 m, the vehicle speed is 30 km/h, and the vehicle will reach the curve entrance after approximately 4.5 seconds. The steering angle would normally stay within ±1 degree, but a steering angle of 3 degrees to the left is currently being input. Priority is therefore given to the steering input, and the illumination is controlled to an irradiation range that covers the line-of-sight angle of 5 degrees to the left that normally appears when the steering angle is 3 degrees to the left.
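As an illustrative arithmetic check on this example (not part of the patent; the function name is hypothetical): at 30 km/h the vehicle travels 30/3.6 ≈ 8.3 m/s, so about 4.5 seconds to the curve entrance corresponds to roughly 37.5 m.

```python
def time_to_curve_entrance(distance_m: float, speed_kmh: float) -> float:
    """Seconds until the curve entrance, assuming constant speed."""
    return distance_m / (speed_kmh / 3.6)

# A vehicle 37.5 m before the entrance at 30 km/h reaches it in 4.5 s.
t = time_to_curve_entrance(37.5, 30.0)
```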

Similarly, when there is no curve ahead, the illumination angle is controlled in accordance with the input when the driver intentionally inputs a steering angle.
For example, illumination control is performed that "covers the line-of-sight range of 5 degrees to the left that normally appears when the steering angle is 3 degrees to the left, even though no curve exists ahead." That is, by controlling the illumination angle with priority given to the driver's steering angle, it is possible to assist in ensuring nighttime visibility in intentional driving situations such as accident avoidance or lane changes.

[Third Embodiment]
Next, a third embodiment will be described with reference to FIG. FIG. 6 is a block diagram showing a schematic configuration of the vehicular lighting device 2.

  In the vehicular lighting device 1 according to the first embodiment, the road condition ahead of the vehicle is acquired using the map database 10 and the GPS on-vehicle device 20. In the vehicular lighting device 2 according to the third embodiment, a front camera 70 that acquires an image ahead of the vehicle is used instead.

  That is, the front camera 70 is installed in the vehicle so that an image ahead of the vehicle can be acquired, and the front image of the vehicle is acquired by the front camera 70. The front camera 70 is a visible-light camera such as a CCD camera. The control device 40 then performs the following processing.

  In the first embodiment, the vehicle position is acquired from the GPS onboard device 20 in S135, and the forward road situation (curvature of the road) corresponding to the current position of the vehicle is acquired from the map database 10 in S115 and S155. Instead, an image ahead of the vehicle is acquired from the front camera 70 in S115 and S155.

  Then, a road image ahead of the vehicle is extracted from the acquired image ahead of the vehicle by image processing, and the curvature of the road immediately before the vehicle is calculated from the extracted road image. Then, assuming that the vehicle travels along the curve, the front road condition is obtained from the calculated curvature of the road immediately before the vehicle.
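The patent does not specify how the curvature is computed from the extracted road image; one common sketch, under the assumption that three road-centerline points have already been extracted by image processing, fits the circumscribed circle through them (all names and the method itself are illustrative assumptions):

```python
import math

def curvature_from_points(p1, p2, p3):
    """Curvature (1/m) of the circle through three (x, y) road points.

    Uses the circumscribed-circle relation kappa = 4*A / (a*b*c),
    where A is the triangle area and a, b, c are the side lengths.
    Returns 0.0 for collinear points (a straight road).
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a = math.dist(p1, p2)
    b = math.dist(p2, p3)
    c = math.dist(p3, p1)
    # Twice the triangle area via the cross product.
    area2 = abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))
    if area2 == 0.0:
        return 0.0
    return 2.0 * area2 / (a * b * c)

# Three points lying on a circle of radius 150 m (center at (0, 150)):
kappa = curvature_from_points((0.0, 0.0), (10.0, 0.3337), (20.0, 1.3393))
# radius = 1 / kappa is approximately 150 m
```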

  In such a vehicular lighting device 2, the road condition ahead of the vehicle can be acquired without using the map data stored in the map database 10. Since road conditions often change, the map data must be updated frequently when the road condition is acquired from the map database 10. When the front camera 70 is used, by contrast, the road condition ahead of the vehicle is acquired from an image captured in real time, so the map data need not be updated.

[Fourth Embodiment]
In the above embodiments, when the illumination 50 is off, such as in the daytime, the vehicle speed, vehicle position, front road condition, and driver's line of sight are accumulated in the external storage device 60 and learned, and the irradiation angle of the illumination 50 is controlled using the learning result when the illumination 50 is lit at night. In this case, when the number of line-of-sight data items stored in the external storage device 60 is small, the learning result may not match the characteristics of the individual driver.

  Therefore, a driver's line-of-sight pattern corresponding to predetermined values of the vehicle speed, vehicle position, and front road condition is stored in advance in the external storage device 60 as a learning result, and until the number of line-of-sight data items stored in the external storage device 60 reaches a predetermined number, the irradiation angle of the illumination 50 is controlled based on this predetermined line-of-sight pattern.

  Then, when the number of line-of-sight data items exceeds the predetermined number, the irradiation angle of the illumination 50 is controlled based on the learning result obtained from the line-of-sight patterns accumulated in the external storage device 60. A specific flow of the control process is shown in FIG. 7. In FIG. 7, the processing in S100 to S125, S135 to S155, S160, and S165 is the same as in FIG. 2.

In the control process according to the fourth embodiment, after the various data are accumulated in S130, the number of accumulated data items is also recorded.
Further, after the front road condition is acquired in S155, it is determined in S157 whether the number of data items accumulated in S130 is equal to or greater than a predetermined number. If it is (S157: Yes), the process proceeds to S160, and the subsequent processing is executed as in the first embodiment. If it is less than the predetermined number (S157: No), the process proceeds to S163.

  In S163, a predetermined line-of-sight pattern is acquired from the external storage device 60. The predetermined line-of-sight pattern is data in which the relationships among the vehicle speed, vehicle position, front road condition, and line-of-sight pattern are set, and it is stored in the external storage device 60 in advance. After the predetermined line-of-sight pattern is acquired, the process proceeds to S165, and the subsequent processing is executed as in the first embodiment.
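The fallback in S157 and S163 amounts to a simple threshold test (an illustrative sketch only; the names and the threshold value are assumptions):

```python
def choose_pattern(n_accumulated: int, threshold: int,
                   learned_pattern, default_pattern):
    """S157: use the learned line-of-sight pattern only once enough
    gaze records have been accumulated; until then fall back to the
    pre-stored default pattern (S163)."""
    if n_accumulated >= threshold:    # S157: Yes -> S160
        return learned_pattern
    return default_pattern            # S157: No -> S163

# With only 42 of the required 100 records, the default pattern is used.
pattern = choose_pattern(42, 100, "learned", "default")
```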

  In such a vehicular lighting device, until learning results based on a predetermined number or more of line-of-sight patterns are obtained, the irradiation angle of the illumination 50 is controlled based on the predetermined driver's line-of-sight pattern according to the driving situation of the vehicle. Therefore, if that line-of-sight pattern is set appropriately, the irradiation angle can be controlled appropriately.

In addition, after a predetermined number or more of line-of-sight patterns have been obtained, the irradiation angle of the illumination 50 is controlled based on the learning result, so the irradiation angle can likewise be controlled appropriately.
[Other Embodiments]
Although an embodiment of the present invention has been described above, the present invention is not limited to this embodiment and can take various forms.

  (1) In the first to fourth embodiments, the vehicle speed, vehicle position, front road condition, and line-of-sight data are accumulated and learned while the illumination 50 is off. However, the data need not always be accumulated and learned while the illumination 50 is off; the driver may instead set when data are accumulated and learned.

That is, a data accumulation setting switch may be provided on the dashboard of the vehicle, and the steering angle and line-of-sight data may be accumulated and learned while the switch is on.
(2) In the first to third embodiments, the steering angle and the driver's line of sight are stored in the portable external storage device 60. However, the external storage device 60 is not strictly required; the data may instead be stored in a RAM or the like inside the control device 40. In this case, the power supply must be backed up by a secondary battery or the like so that the stored data are not lost when the power supply to the control device 40 is turned off.

  (3) In the third embodiment, a visible-light CCD camera is used as the front camera 70, but an infrared camera may be used instead. With an infrared camera, a road image can be acquired from the vehicle front image even when the amount of light around the vehicle is small (for example, at night), so the curvature of the road and the steering angle of the vehicle can still be calculated.

FIG. 1 is a block diagram showing a schematic configuration of the vehicular lighting device 1.
FIG. 2 is a flowchart showing the flow of the control process executed by the control device 40 in the first embodiment.
FIG. 3 shows the steering angle of the vehicle and the line of sight of the subject when the subject drives a road that curves left and right.
FIG. 4 is a flowchart showing the flow of the control process executed by the control device 40 in the second embodiment.
FIG. 5 is a flowchart showing the flow of the control process executed by the control device 40 in the second embodiment.
FIG. 6 is a block diagram showing a schematic configuration of the vehicular lighting device 2.
FIG. 7 is a flowchart showing the flow of the control process executed by the control device 40 of the vehicular lighting device 2.

Explanation of symbols

  1, 2 ... vehicular lighting device; 5 ... steering angle detection device; 7 ... speed detection device; 10 ... map database; 20 ... GPS vehicle-mounted device; 30 ... indoor camera; 40 ... control device; 50 ... illumination; 60 ... external storage device; 70 ... front camera.

Claims (10)

  1. A driving status acquisition means for acquiring the driving status of the vehicle;
    Road condition acquisition means for acquiring the road condition ahead of the vehicle;
    Line-of-sight detection means for detecting the line-of-sight direction of the driver of the vehicle;
    Information storage means for storing information;
    Line-of-sight pattern learning means for storing, in the information storage means, the driver's line-of-sight direction detected by the line-of-sight detection means during vehicle travel together with the vehicle travel status acquired by the travel status acquisition means, and for learning the line-of-sight pattern of the driver according to the travel status of the vehicle based on the driver's line-of-sight direction and the vehicle travel status stored in the information storage means; and
    When the vehicle travels with lighting turned on, illumination of the vehicle illumination is performed based on the line-of-sight pattern learned by the line-of-sight pattern learning means according to the road situation ahead of the vehicle detected by the road condition acquisition means. Lighting control means for controlling the angle;
    A vehicular lighting device comprising:
  2. The vehicle lighting device according to claim 1,
    The road condition acquisition means
    Map data storage means for storing map data;
    Current position acquisition means for acquiring the current position of the vehicle;
    With
    A vehicle lighting device, wherein a road condition ahead of the vehicle is acquired based on the current position of the vehicle acquired by the current position acquisition means and map data stored in the map data storage means.
  3. The vehicle lighting device according to claim 1,
    The road condition acquisition means
    A forward image acquisition means for acquiring an image ahead of the vehicle;
    A vehicle lighting device, wherein a road condition ahead of the vehicle is acquired from an image ahead of the vehicle acquired by the front image acquisition means.
  4. In the vehicle lighting device according to any one of claims 1 to 3,
    The line-of-sight detection means includes
    Comprising a face image acquisition means for acquiring the driver's face image;
    The vehicular illumination device characterized in that the driver's line of sight is acquired from the driver's face image acquired by the face image acquisition means.
  5. In the vehicle lighting device according to any one of claims 1 to 4,
    The travel status acquisition means includes
    A travel direction acquisition means for acquiring the travel direction of the vehicle;
    The vehicular lighting device, wherein the travel direction of the vehicle is acquired by the travel direction acquisition means as the travel status of the vehicle.
  6. The vehicle lighting device according to claim 5,
    The travel status acquisition means includes
    Comprising speed acquisition means for acquiring the speed of the vehicle;
    The vehicle lighting device, wherein the speed of the vehicle is acquired by the speed acquisition means in addition to the travel direction as the travel status of the vehicle.
  7. In the vehicle lighting device according to any one of claims 1 to 6,
    The line-of-sight pattern learning means includes
    The vehicle lighting device, wherein an average line-of-sight pattern of a plurality of drivers is learned based on the line-of-sight directions of the plurality of drivers during driving detected by the line-of-sight detection means.
  8. In the vehicle lighting device according to any one of claims 1 to 7,
    The line-of-sight pattern learning means includes
    The vehicle lighting device according to claim 1, wherein the driver's line-of-sight pattern corresponding to a predetermined traveling state of the vehicle is used as a learning result until learning is performed with a predetermined number or more of line-of-sight patterns.
  9. In the vehicle lighting device according to any one of claims 1 to 8,
    The illumination control means includes
    Comparing the traveling state of the vehicle acquired by the traveling state acquisition unit and the traveling state in the line-of-sight pattern according to the traveling state of the vehicle learned by the line-of-sight pattern learning unit;
    When the traveling state of the vehicle acquired by the traveling state acquisition unit deviates from the traveling state in the line-of-sight pattern according to the traveling state of the vehicle learned by the line-of-sight pattern learning unit, the traveling state An illumination device for a vehicle, wherein an illumination angle of the illumination is controlled based on a traveling state of the vehicle acquired by an acquisition unit.
  10. In the vehicle lighting device according to any one of claims 1 to 9,
    The vehicle lighting device, wherein the information storage means is configured to be portable.
JP2007176519A 2007-07-04 2007-07-04 Vehicular lighting device Pending JP2009012608A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007176519A JP2009012608A (en) 2007-07-04 2007-07-04 Vehicular lighting device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2007176519A JP2009012608A (en) 2007-07-04 2007-07-04 Vehicular lighting device

Publications (1)

Publication Number Publication Date
JP2009012608A true JP2009012608A (en) 2009-01-22

Family

ID=40354039

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007176519A Pending JP2009012608A (en) 2007-07-04 2007-07-04 Vehicular lighting device

Country Status (1)

Country Link
JP (1) JP2009012608A (en)


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010204795A (en) * 2009-03-02 2010-09-16 Nissan Motor Co Ltd Visual line guide device
JP2010208581A (en) * 2009-03-12 2010-09-24 Nissan Motor Co Ltd Steering device for vehicle and method of operating the same
JP2010256554A (en) * 2009-04-23 2010-11-11 Pioneer Electronic Corp Information processor and image display control method
CN103786632A (en) * 2012-10-31 2014-05-14 现代摩比斯株式会社 Lighting system for vehicle and control method thereof
CN103786632B (en) * 2012-10-31 2017-04-12 现代摩比斯株式会社 Lighting system for vehicle and control method thereof
KR101486893B1 (en) 2013-05-02 2015-01-28 황선초 Search-light system for patrol car and method for processing thereof
JP2013225131A (en) * 2013-05-23 2013-10-31 Pioneer Electronic Corp Information processing device and image display control method

Similar Documents

Publication Publication Date Title
US10513269B2 (en) Road profile along a predicted path
US9682708B2 (en) Driving support controller
JP6642972B2 (en) Vehicle image display system and method
US9649936B2 (en) In-vehicle device, control method of in-vehicle device, and computer-readable storage medium
KR101714185B1 (en) Driver Assistance Apparatus and Vehicle Having The Same
JP5652364B2 (en) Vehicle behavior control device
JP5427202B2 (en) Vehicle driving support device
JP4432930B2 (en) Parking assistance device and parking assistance method
US8566017B2 (en) Driving support apparatus for vehicle
CN107077792B (en) Travel control system
DE102007010325B4 (en) Transmission neutral state management in vehicle safety and ride comfort systems
JP2017109740A (en) Vehicle control system and control method
JP2015000722A (en) Vehicle activation method and vehicle activation system
CN104380060B (en) Paddle and sense display control program
JP4929777B2 (en) Vehicle travel control device
US10748428B2 (en) Vehicle and control method therefor
US9248796B2 (en) Visually-distracted-driving detection device
JP4021344B2 (en) Vehicle driving support device
US20170313324A1 (en) Vehicle automated driving system
US10239528B2 (en) Vehicle and method of controlling the vehicle
US8560200B2 (en) Driving support apparatus for vehicle
EP2741270A1 (en) Driving assistance apparatus and driving assistance method
JP5151452B2 (en) Information display device
CN103097196B (en) For adjusting the device and method of the illumination of vehicle in unsharp bend
CN107787282A (en) The cognition driver assistance with variable auxiliary for automated vehicle