WO2024047777A1 - Headlight control device and headlight control method - Google Patents

Headlight control device and headlight control method Download PDF

Info

Publication number
WO2024047777A1
WO2024047777A1 · PCT/JP2022/032693
Authority
WO
WIPO (PCT)
Prior art keywords
depth distance
information
driver
vehicle
unit
Prior art date
Application number
PCT/JP2022/032693
Other languages
French (fr)
Japanese (ja)
Inventor
紫織 島谷
晃史 山本
真 宗平
尚嘉 竹裏
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to PCT/JP2022/032693
Publication of WO2024047777A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60Q: ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00: Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/02: Arrangement of optical signalling or lighting devices, the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
    • B60Q1/04: Arrangement of optical signalling or lighting devices, the devices being headlights
    • B60Q1/06: Arrangement of optical signalling or lighting devices, the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle
    • B60Q1/076: Arrangement of optical signalling or lighting devices, the devices being headlights adjustable by electrical means including means to transmit the movements, e.g. shafts or joints

Definitions

  • the present disclosure relates to a headlight control device and a headlight control method.
  • The technique of Patent Document 1 has a problem in that the headlights cannot be controlled in consideration of how far ahead, in the direction the driver is actually facing, the driver is trying to see. As a result, a situation could occur in which the area that the driver was actually trying to see was not illuminated.
  • The present disclosure has been made to solve the above-mentioned problem, and an object thereof is to provide a headlight control device that, in controlling the lighting of headlights of a vehicle based on the direction in which the driver is facing, can control the lighting in consideration of how far ahead in that direction the driver is actually trying to visually check.
  • A headlight control device according to the present disclosure includes: a direction detection unit that detects the orientation of a driver of a vehicle based on a captured image of the driver; a driving-related information acquisition unit that acquires driving-related information related to the driving of the vehicle; a depth distance estimation unit that, based on the orientation information regarding the driver's orientation detected by the direction detection unit and the driving-related information acquired by the driving-related information acquisition unit, estimates a depth distance, which is the distance from the installation position of a headlight provided on the vehicle to the estimated visible position, that is, the position that the driver is estimated to be trying to see in the direction in which the driver is facing; an irradiation determining unit that determines the irradiation range of the light emitted by the headlight based on the depth distance estimated by the depth distance estimation unit; and a headlight control unit that causes the headlight to irradiate light onto the irradiation range determined by the irradiation determining unit.
  • According to the present disclosure, when controlling the lighting of headlights of a vehicle based on the direction in which the driver is facing, the lighting can be controlled in consideration of how far ahead in that direction the driver is actually trying to see.
  • FIG. 1 is a diagram showing a configuration example of a headlight control device according to Embodiment 1.
  • FIG. 2 is a diagram illustrating an example of the content of depth distance estimation information used by the depth distance estimating unit to estimate the depth distance in the first embodiment, and illustrating, for the depth distances corresponding to the input information of conditions No. 1 to No. 6 set in the depth distance estimation information, an example of the driving state of the vehicle estimated from the driver's behavior and vehicle information and of the driver's estimated visible object, which are the basis for deriving each depth distance.
  • FIG. 6 is a diagram for explaining an example of how the headlight control unit causes the headlight to irradiate the irradiation range determined by the irradiation determination unit in the first embodiment.
  • FIG. 3 is a flowchart for explaining the operation of the headlight control device according to the first embodiment.
  • FIGS. 6A and 6B are diagrams illustrating an example of the hardware configuration of the headlight control device according to the first embodiment.
  • FIG. 3 is a diagram showing a configuration example of a headlight control device according to a second embodiment.
  • A diagram illustrating an example of the content of depth distance estimation information used by the depth distance estimating unit to estimate the depth distance in Embodiment 2, together with an example of the driving state of the vehicle and the object visible to the driver, estimated from the driver's behavior, vehicle information, and map information, which are the basis for deriving the depth distance.
  • FIG. 10A is a diagram for explaining an example of the depth distance estimated by the depth distance estimating section in the second embodiment, and FIG. 10B is a diagram for explaining an example of how light is irradiated onto the irradiation range determined by the irradiation determining unit based on the depth distance shown in FIG. 10A.
  • FIGS. 11A and 11B are diagrams for explaining an example of a method by which the depth distance estimating unit specifically calculates from how many meters to how many meters the depth distance range extends so as to include, with a margin of 5 meters, the sidewalk near an intersection and the crosswalk at the intersection.
  • FIG. 12A is a diagram for explaining another example of the depth distance estimated by the depth distance estimating section in the second embodiment, and FIG. 12B is a diagram for explaining an example of how light is irradiated onto the irradiation range determined by the irradiation determining unit based on the depth distance shown in FIG. 12A.
  • A flowchart for explaining the operation of the headlight control device according to the second embodiment.
  • FIG. 7 is a diagram illustrating a configuration example of a headlight control device according to a third embodiment.
  • FIG. 12 is a diagram showing an example of the content of depth distance estimation information used by the depth distance estimating unit to estimate the depth distance in Embodiment 3.
  • A diagram for explaining an example of how, in Embodiment 3, the irradiation determining unit determines the irradiation range based on the depth distance estimated by the depth distance estimating unit, and the headlight control unit causes the headlight to irradiate light onto the determined irradiation range.
  • A diagram for explaining another example of how the irradiation determining unit determines the irradiation range and the headlight control unit causes the headlight to irradiate light onto it.
  • A diagram for explaining yet another example of how the irradiation determining unit determines the irradiation range and the headlight control unit causes the headlight to irradiate light onto it.
  • A flowchart for explaining the operation of the headlight control device according to Embodiment 3.
  • FIG. 7 is a diagram illustrating a configuration example of a headlight control device according to a fourth embodiment.
  • FIG. 12 is a diagram showing an example of the contents of depth distance estimation information used by the depth distance estimating unit to estimate the depth distance in Embodiment 4.
  • FIG. 24A is a diagram for explaining an example of the depth distance estimated by the depth distance estimating section in the fourth embodiment
  • FIG. 24B is a diagram for explaining an example of the depth distance estimated by the depth distance estimating section in the fourth embodiment.
  • A diagram for explaining an example of how light is irradiated onto the irradiation range determined by the irradiation determining unit based on the depth distance estimated by the depth distance estimating unit as shown in FIG. 24A.
  • A flowchart for explaining the operation of the headlight control device according to Embodiment 4.
  • FIG. 7 is a diagram illustrating a configuration example of a headlight control device according to a fifth embodiment.
  • FIG. 12 is a diagram for explaining an example of a scene in which the reliability determination unit determines that the reliability of the driver's orientation detected by the orientation detection unit is low in the fifth embodiment.
  • A flowchart for explaining the operation of the headlight control device according to Embodiment 5.
  • FIG. 7 is a diagram showing a configuration example of a headlight control device according to a sixth embodiment.
  • FIG. 7 is a diagram for explaining an example of how the headlight control unit causes the headlight to irradiate light onto the first irradiation range and the second irradiation range determined by the irradiation determination unit in the sixth embodiment;
  • FIG. 30A is a side view showing how the first irradiation range and the second irradiation range are irradiated with light
  • FIG. 30B is a diagram showing how the first irradiation range and the second irradiation range are irradiated with light.
  • FIG. 12 is a flowchart for explaining the operation of the headlight control device according to the sixth embodiment.
  • FIG. 7 is a diagram illustrating a configuration example of a headlight control device according to a seventh embodiment.
  • FIG. 12 is a diagram for explaining an example of the content of the surroundings confirmation determination condition used by the surroundings confirmation determination unit to determine whether the driver is checking the surroundings in Embodiment 7;
  • FIG. 12 is a diagram for explaining an example of how the irradiation range is irradiated with light after the surroundings confirmation determination unit expands the second irradiation range set by the irradiation determination unit in Embodiment 7.
  • FIGS. 36A and 36B are diagrams for explaining examples of the irradiation range of light that the headlight control device causes the headlights to irradiate in Embodiment 7; FIG. 36A is a bird's-eye view of the surroundings of the vehicle viewed from above.
  • A flowchart for explaining the operation of the headlight control device according to Embodiment 7.
  • FIG. 1 is a diagram showing a configuration example of a headlight control device 1 according to the first embodiment.
  • the headlight control device 1 is mounted on a vehicle 100.
  • the headlight control device 1 controls the headlights 2 provided in the vehicle 100 based on the orientation of the driver of the vehicle 100.
  • The "driver's orientation" is expressed by the driver's face orientation or the driver's line-of-sight direction. The "driver's orientation" may also include, in addition to the driver's face orientation or line-of-sight direction, the driver's body orientation, in other words, the driver's posture.
  • The control of the headlights 2 based on the driver's orientation performed by the headlight control device 1 is assumed to be performed when the surroundings of the vehicle 100 are dark, such as in a parking lot or a city area at night, and the driver has turned on the headlights 2, or when the headlight control device 1 has determined, based on the surrounding brightness, that the headlights 2 should be turned on automatically.
  • the headlight control device 1 is connected to the headlights 2, the in-vehicle imaging device 3, and the driving-related information acquisition device 4.
  • the headlight 2, the in-vehicle imaging device 3, and the driving-related information acquisition device 4 are provided in the vehicle 100.
  • the headlight 2 is a lighting device that illuminates the front of the vehicle 100.
  • the headlight 2 is a general headlight that can emit, for example, a high beam, a low beam, and an auxiliary light, so a detailed description of the configuration example will be omitted.
  • The headlights 2 include a left light (not shown) mounted on the left side of the vehicle 100 with respect to the traveling direction of the vehicle 100, and a right light (not shown) mounted on the right side of the vehicle 100 with respect to the traveling direction of the vehicle 100.
  • the left light and the right light each include a high beam unit (not shown) that illuminates a distant area, a low beam unit (not shown) that illuminates a nearby area, and an auxiliary light unit (not shown).
  • the high beam unit, low beam unit, and auxiliary light unit each include, for example, a plurality of light sources (not shown) such as LED light sources arranged in an array, and each light source can be turned on individually.
  • being arranged in an array means that the light sources are arranged in a line in the width direction of the vehicle 100.
  • the high beam unit, low beam unit, and auxiliary light unit may each use, for example, MEMS (Micro Electro Mechanical Systems).
  • the high beam unit, low beam unit, and auxiliary light unit can control the light distribution range by reflecting light with a MEMS mirror.
  • the area in front of the vehicle 100 where the high beam unit can emit high beams is referred to as a "high beam irradiable area.” How far in front of the vehicle 100 and in what range the high beam irradiation possible area extends is determined in advance according to the specifications of the high beam unit and the like.
  • the area in front of the vehicle 100 where the low beam unit can emit a low beam is referred to as a "low beam irradiation possible area.” How far in front of the vehicle 100 and in what range the low beam irradiation possible area extends is determined in advance according to the specifications of the low beam unit and the like.
  • the area in front of the vehicle 100 where the auxiliary light unit can irradiate the auxiliary light is referred to as the "auxiliary light irradiation possible area.”
  • How far in front of the vehicle 100 and in what range the auxiliary light irradiation possible area extends is determined in advance according to the specifications of the auxiliary light unit and the like.
  • the headlight control device 1 performs control to emit or block high beam, low beam, or auxiliary light by, for example, turning each light source on or off. Thereby, the headlight control device 1 controls the range of light irradiated by the headlights 2. Note that the headlight control device 1 can not only turn on and turn off each light source, but also control the amount of light when turned on. For example, the headlight control device 1 can also control the amount of light from the headlights 2 by controlling the current value of each light source of the headlights 2.
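The per-source on/off control described above can be sketched as follows. This is an illustrative example, not the disclosure's implementation: the function name, the number of sources, and the assumption that each source covers one equal-width horizontal sector of the unit's total coverage are all invented for clarity.

```python
def select_sources(num_sources, coverage_deg, target_deg):
    """Return an on/off flag per light source in an arrayed beam unit.

    coverage_deg: (left, right) total horizontal coverage of the unit, degrees.
    target_deg:   (left, right) angular range that should be illuminated.
    A source is turned on when its sector overlaps the target range.
    """
    left, right = coverage_deg
    width = (right - left) / num_sources
    flags = []
    for i in range(num_sources):
        sector_lo = left + i * width
        sector_hi = sector_lo + width
        overlaps = sector_lo < target_deg[1] and sector_hi > target_deg[0]
        flags.append(overlaps)
    return flags
```

For instance, with 8 sources covering -20 to +20 degrees, requesting the range -5 to +10 degrees turns on only the three sources whose 5-degree sectors overlap that range; blocking light outside the range is then a matter of leaving the other sources off.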
  • the in-vehicle imaging device 3 is a camera or the like installed in the vehicle 100 for the purpose of monitoring the inside of the vehicle 100, and is installed so as to be able to image at least the driver's face.
  • the in-vehicle imaging device 3 is an infrared camera, a visible light camera, or the like.
  • the in-vehicle imaging device 3 outputs the captured image (hereinafter referred to as “in-vehicle captured image”) to the headlight control device 1.
  • The in-vehicle imaging device 3 may be shared with, for example, an imaging device included in a so-called Driver Monitoring System (DMS) that is installed in the vehicle 100 to monitor the condition of the driver inside the vehicle 100.
  • the driving-related information acquisition device 4 acquires information related to the driving of the vehicle 100 (hereinafter referred to as "driving-related information").
  • the driving-related information acquisition device 4 is assumed to be, for example, a vehicle speed sensor (not shown) or a steering wheel angle sensor (not shown).
  • a device such as a vehicle speed sensor or a steering angle sensor acquires information regarding the vehicle 100 (hereinafter referred to as “vehicle information"), such as vehicle speed or steering angle, as travel-related information.
  • the driving-related information acquisition device 4 here a device such as a vehicle speed sensor or a steering wheel angle sensor, outputs the acquired driving-related information, here vehicle information, to the headlight control device 1 .
  • the headlight control device 1 includes a direction detection section 11 , a driving-related information acquisition section 12 , a depth distance estimation section 13 , an irradiation determination section 14 , a headlight control section 15 , and a storage section 16 .
  • the driving related information acquisition section 12 includes a vehicle information acquisition section 121.
  • the orientation detection unit 11 acquires an in-vehicle captured image from the in-vehicle imaging device 3. Then, the orientation detection unit 11 detects the orientation of the driver based on the in-vehicle captured image acquired from the in-vehicle imaging device 3.
  • The orientation detection unit 11 detects the driver's facial parts (e.g., eyes, nose, and mouth) from the in-vehicle captured image acquired from the in-vehicle imaging device 3, and detects the direction of the driver's face or the direction of the driver's line of sight using a known image recognition technique for detecting the direction of a person's face, or the direction of a person's line of sight, from a captured image of the person's face.
  • For example, the direction detection unit 11 irradiates the driver's eye with a near-infrared point light source and detects the direction of the driver's line of sight from the positional relationship between the Purkinje image reflected by the cornea and the pupil.
  • The orientation detection unit 11 can also detect the direction of the driver's face by calculating, through pattern matching between the in-vehicle captured image and standard patterns of face images prepared in advance for each face orientation angle and stored in the orientation detection unit 11, the face orientation angle with the highest degree of similarity.
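A minimal sketch of the pattern-matching idea above: the captured image is compared against standard patterns prepared per face-orientation angle, and the angle with the highest similarity is adopted. Representing images as flattened grayscale vectors and using cosine similarity are assumptions for illustration; the disclosure does not specify the similarity measure.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length grayscale vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def estimate_face_angle(image, patterns):
    """patterns: dict mapping face-orientation angle (deg) -> template vector.

    Returns the angle whose template is most similar to the captured image.
    """
    return max(patterns, key=lambda ang: cosine_similarity(image, patterns[ang]))
```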
  • the orientation detection unit 11 detects the driver's posture using a known image recognition technique that detects the posture of a person from a captured image of the person.
  • the orientation detection unit 11 may be configured to detect at least one of the driver's face orientation and line-of-sight direction. For example, the orientation detection unit 11 may detect both the driver's face orientation and line of sight direction, and select the one with higher reliability. For example, if the driver is wearing sunglasses or glasses, the orientation detection unit 11 determines that the direction of the driver's face is more reliable than the direction of the driver's line of sight. For example, the orientation detection unit 11 may detect both the driver's face orientation and line-of-sight direction, prioritize them, and adopt either one as the driver's orientation.
  • For example, when the difference between the detected face direction and line-of-sight direction is large, the direction detection unit 11 adopts the driver's line-of-sight direction as the driver's orientation, and when the difference is small, it adopts the driver's face direction.
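The selection between face direction and line-of-sight direction described above can be sketched as follows. This is a hedged illustration: the 10-degree divergence threshold is an invented example value, and representing an undetected direction as `None` (e.g., gaze hidden by sunglasses) is an assumption, not the patent's specification.

```python
DIVERGENCE_THRESHOLD_DEG = 10.0  # hypothetical example threshold

def choose_orientation(face_deg, gaze_deg):
    """Pick one horizontal angle as the driver's orientation.

    face_deg / gaze_deg: detected angles in degrees, or None if undetected.
    Prefers the gaze when it diverges strongly from the face direction
    (the eyes have moved ahead of the head); otherwise uses the face.
    """
    if gaze_deg is None:          # e.g., sunglasses: gaze undetected/unreliable
        return face_deg
    if face_deg is None:
        return gaze_deg
    if abs(gaze_deg - face_deg) > DIVERGENCE_THRESHOLD_DEG:
        return gaze_deg
    return face_deg
```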
  • the orientation detection unit 11 detects, for example, the driver's face orientation or line-of-sight direction among the driver's orientations, with the center of the driver's head as a reference. Note that since the installation position and viewing angle of the in-vehicle imaging device 3 are known in advance, the orientation detection unit 11 can calculate the position of the center of the driver's head.
  • the position of the center of the driver's head is a point in real space, and is represented by, for example, coordinate values that can be mapped on a map.
  • The driver's face direction or line-of-sight direction is expressed, for example, as a horizontal angle and a vertical angle with respect to a straight line passing through the center of the driver's head and a point directly in front of the center of the head.
  • “horizontal” is not limited to strictly horizontal, but includes substantially horizontal.
  • “vertical” is not limited to strictly vertical, but includes substantially vertical.
  • The driver's face direction or line-of-sight direction is expressed, for example, as an angle with the direction in which the driver faces straight ahead with respect to the traveling direction of the vehicle 100 as the reference (0 degrees); the angle increases as the direction goes to the right or upward with respect to the traveling direction of the vehicle 100, and decreases as it goes to the left or downward. Note that in the first embodiment, straight ahead is not limited to strictly directly in front, but includes substantially directly in front.
  • Hereinafter, the “driver's face direction” refers to the “driver's face direction or line-of-sight direction,” including the direction of the driver's line of sight.
  • the orientation detection unit 11 outputs information regarding the detected orientation of the driver (hereinafter referred to as “orientation information”) to the depth distance estimation unit 13 and causes the storage unit 16 to store the information.
  • The orientation information includes information indicating the driver's face direction and posture; more specifically, it includes information indicating the horizontal and vertical angles of the driver's face direction and information indicating the horizontal and vertical angles of the driver's posture.
  • the orientation detection unit 11 adds a detection time to the orientation information and stores the orientation information.
  • The storage unit 16 may store a plurality of (e.g., 50) pieces of orientation information; for example, the orientation detection unit 11 may cause the storage unit 16 to store the latest 50 pieces of orientation information. In this case, the orientation detection unit 11 does not need to add the detection time to the orientation information when storing it in the storage unit 16.
  • When the direction detection unit 11 no longer detects the driver's orientation, it stores the orientation information in the storage unit 16 as, for example, an invalid value, and determines whether a preset time (hereinafter referred to as the “direction detection determination time”) has elapsed since the driver's orientation was no longer detected. If a detection time is not added to the orientation information stored in the storage unit 16, the direction detection unit 11 may determine whether the direction detection determination time has elapsed based on how many consecutive invalid values are stored in the storage unit 16.
  • Until it determines that the direction detection determination time has elapsed, the orientation detection unit 11 refers to the storage unit 16 and outputs the orientation information indicating the driver's orientation detected immediately before to the depth distance estimation unit 13. If it determines that the direction detection determination time has elapsed since the driver's orientation was no longer detected, the orientation detection unit 11 outputs to the headlight control unit 15 orientation information in which an invalid value is set for the driver's orientation (hereinafter referred to as “direction invalid information”).
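The fallback behaviour above (keep using the last detected orientation, then report invalidity after the determination time) can be sketched with a fixed-size buffer of recent samples, in the spirit of the 50-entry storage described earlier. The capture period and determination time below are illustrative values, and `None` standing in for the invalid value is an assumption.

```python
from collections import deque

SAMPLE_PERIOD_S = 0.1        # assumed capture period of the in-vehicle camera
DETERMINATION_TIME_S = 2.0   # assumed "direction detection determination time"
INVALID = None               # stand-in for the stored invalid value

class OrientationStore:
    """Keeps the latest N orientation samples, newest last."""

    def __init__(self, capacity=50):
        self.buf = deque(maxlen=capacity)

    def push(self, sample):
        self.buf.append(sample)

    def latest_valid(self):
        """Most recently detected orientation, or INVALID if none stored."""
        for s in reversed(self.buf):
            if s is not INVALID:
                return s
        return INVALID

    def timed_out(self):
        """True once consecutive invalid entries span the determination time."""
        n = 0
        for s in reversed(self.buf):
            if s is not INVALID:
                break
            n += 1
        return n * SAMPLE_PERIOD_S >= DETERMINATION_TIME_S
```

While `timed_out()` is false, a caller would forward `latest_valid()` downstream; once it becomes true, the caller would emit direction invalid information instead.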
  • the orientation detection section 11 may output the orientation invalidation information to the headlight control section 15 via the depth distance estimation section 13 and the illumination determination section 14, or directly to the headlight control section 15. In addition, in FIG. 1, illustration of an arrow from the direction detection section 11 to the headlight control section 15 is omitted.
  • the travel-related information acquisition unit 12 acquires travel-related information from the travel-related information acquisition device 4 .
  • the vehicle information acquisition unit 121 acquires vehicle information from the travel-related information acquisition device 4.
  • the vehicle information acquisition unit 121 outputs the acquired vehicle information to the depth distance estimation unit 13 as travel-related information.
  • The depth distance estimating unit 13 estimates the depth distance based on the orientation information regarding the driver's orientation detected by the orientation detecting unit 11 and the travel-related information acquired by the travel-related information acquisition unit 12. Specifically, the depth distance estimating unit 13 estimates the depth distance by comparing the orientation information and the travel-related information (here, vehicle information) with depth distance estimation information in which information regarding the driver's behavior and travel-related information are associated with depth distances. In the first embodiment, the depth distance refers to the distance from the installation position of the headlight 2 provided on the vehicle 100 to the position that the driver is estimated to be trying to see in the direction in which the driver is facing (hereinafter referred to as the “estimated visual recognition position”).
  • the depth distance is the distance from a point indicating the installation position of the headlight 2 to a point indicating the estimated visual recognition position on a virtual straight line indicating the direction in which the driver is facing.
  • the depth distance may be the distance from the installation position of the right light to the estimated visual recognition position, or may be the distance from the installation position of the left light to the estimated visual recognition position.
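Given the angle convention above, the estimated visual recognition position follows from the headlight installation position, the driver's horizontal and vertical angles, and the depth distance by simple ray geometry. The coordinate frame below (x forward, y right, z up, in metres) is an assumed vehicle frame for illustration, not taken from the disclosure.

```python
import math

def estimated_visible_position(headlight_pos, h_angle_deg, v_angle_deg, depth_m):
    """Point at `depth_m` along the driver's facing direction from the headlight.

    headlight_pos: (x, y, z) installation position in an assumed vehicle frame.
    h_angle_deg / v_angle_deg: 0 = straight ahead; positive = right / upward.
    """
    h = math.radians(h_angle_deg)
    v = math.radians(v_angle_deg)
    x0, y0, z0 = headlight_pos
    # Unit vector along the driver's facing direction in the vehicle frame.
    dx = math.cos(v) * math.cos(h)
    dy = math.cos(v) * math.sin(h)
    dz = math.sin(v)
    return (x0 + depth_m * dx, y0 + depth_m * dy, z0 + depth_m * dz)
```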
  • the depth distance estimation information is information for estimating the depth distance, is set in advance by an administrator, etc., and is stored in a location that can be referenced by the depth distance estimation unit 13.
  • FIG. 2 is a diagram showing an example of the contents of depth distance estimation information used by the depth distance estimation unit 13 to estimate the depth distance in the first embodiment.
  • the depth distance estimation information is information in a table format in which input information and estimation information are defined and the input information and estimation information are associated with each other.
  • driver behavior and vehicle information are defined as input information
  • depth distance is defined as estimated information.
  • the driver's behavior includes, for example, the vertical direction of the driver's face direction, the horizontal direction of the driver's face direction, and the driver's state.
  • the vehicle information includes, for example, vehicle speed and steering angle.
  • the vertical direction of the driver's face orientation is expressed as "front" when the driver's face is facing forward, "upward" when the driver's face is facing upward, or "downward" when the driver's face is facing downward.
  • the left-right direction of the driver's face orientation is expressed as "front" when the driver's face is facing forward, "right" when the driver's face is facing to the right, or "left" when the driver's face is facing to the left.
  • the driver's state includes, for example, a state in which the driver is leaning forward, a state in which the amount of change in the driver's face direction is less than or equal to a preset threshold value, and the like.
  • the depth distance estimating unit 13 determines the driver's behavior based on the orientation information output from the orientation detecting unit 11.
  • the depth distance estimating unit 13 then estimates the depth distance by matching the determined behavior, together with the vehicle speed and steering angle included in the vehicle information acquired by the vehicle information acquisition unit 121, against the driver's behavior and vehicle information set as input information in the depth distance estimation information as shown in FIG. 2, and reading out the associated depth distance as the estimated information.
  • the depth distance estimating unit 13 determines whether the vertical direction of the driver's face orientation is "front", "upward", or "downward" from the information indicating the driver's vertical face orientation included in the orientation information. For example, information defining the angle ranges regarded as "front", "upward", and "downward" (hereinafter referred to as "vertical angle range information") is set in advance and stored in a location that can be referenced by the depth distance estimation unit 13. Based on the driver's vertical face orientation detected by the orientation detection unit 11, the depth distance estimation unit 13 refers to the vertical angle range information and determines whether the vertical direction of the driver's face orientation is "front", "upward", or "downward".
  • similarly, the depth distance estimating unit 13 determines whether the left-right direction of the driver's face orientation is "front", "right", or "left" from the information indicating the driver's horizontal face orientation included in the orientation information. For example, information defining the angle ranges regarded as "front", "right", and "left" (hereinafter referred to as "left-right angle range information") is set in advance and stored in a location that can be referenced by the depth distance estimation unit 13. Based on the driver's horizontal face orientation detected by the orientation detection unit 11, the depth distance estimating unit 13 refers to the left-right angle range information and determines whether the left-right direction of the driver's face orientation is "front", "right", or "left".
  • the depth distance estimating unit 13 also determines whether the driver's posture is a preset "leaning-forward state" based on the information indicating the driver's posture included in the orientation information. The conditions for determining how far the driver's posture must be tilted to be judged as "leaning forward" are set in advance by an administrator or the like and stored in a location that the depth distance estimating unit 13 can reference. Further, the depth distance estimating unit 13 calculates, for example, the amount of change in the driver's face orientation from the time-series orientation information stored in the storage unit 16, and determines whether the amount of change is less than or equal to a threshold value. Note that the threshold value used for this determination is set in advance by an administrator or the like and stored in a location that can be referenced by the depth distance estimation unit 13.
  • for example, suppose the vehicle 100 is traveling on an expressway at a speed of "80 km/h or more" with the steering wheel angle "5 degrees or less to the right or left". In this case, the driver is assumed to be visually checking a sign ahead while driving, so the depth distance estimation unit 13 estimates, using the orientation information, the travel-related information (here, vehicle information), and the depth distance estimation information, that the depth distance is "80 to 100 m", which is the assumed depth distance to the sign (see No. 6 of the depth distance estimation information in FIG. 2).
  • the depth distance is set based on the driving state of the vehicle 100 estimated from the driver's behavior and vehicle information and the estimated visible object of the driver.
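The table-driven estimation described above can be sketched as follows. This is a minimal illustration in Python; the entry contents, field names, and the initial value are assumptions for illustration (only conditions comparable to No. 2 and No. 6 are hinted at in the text), not the patent's actual depth distance estimation information.

```python
# Sketch of FIG. 2-style depth distance estimation information:
# input information (driver behaviour + vehicle info) -> estimated depth range.

def make_entry(face_vert, face_horiz, speed_kmh, steer_deg, depth_m):
    """One table row. Ranges are (min, max), inclusive."""
    return {
        "face_vert": face_vert,    # "front" / "upward" / "downward"
        "face_horiz": face_horiz,  # "front" / "right" / "left"
        "speed_kmh": speed_kmh,    # vehicle speed range
        "steer_deg": steer_deg,    # absolute steering wheel angle range
        "depth_m": depth_m,        # estimated depth distance range
    }

DEPTH_TABLE = [
    # cf. No. 2 in the text: turning right at low speed -> near depth
    make_entry("front", "right", (20, 40), (10, 45), (15, 20)),
    # cf. No. 6 in the text: expressway, nearly straight -> sign at 80-100 m
    make_entry("front", "front", (80, 200), (0, 5), (80, 100)),
]

DEFAULT_DEPTH_M = (40, 60)  # assumed initial value used when nothing matches

def estimate_depth(face_vert, face_horiz, speed_kmh, steer_deg):
    """Return the depth range of the first matching entry, else the initial value."""
    for e in DEPTH_TABLE:
        if (e["face_vert"] == face_vert and e["face_horiz"] == face_horiz
                and e["speed_kmh"][0] <= speed_kmh <= e["speed_kmh"][1]
                and e["steer_deg"][0] <= abs(steer_deg) <= e["steer_deg"][1]):
            return e["depth_m"]
    return DEFAULT_DEPTH_M
```

With these assumed entries, a driver facing forward at 90 km/h with a 3-degree steering angle would be estimated at 80 to 100 m, while an unmatched combination falls back to the initial value, mirroring the behavior described in the text.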
  • FIG. 3 is a diagram showing an example in which, for each entry set in the depth distance estimation information shown in FIG. 2, the driving state of the vehicle 100 estimated from the driver's behavior and vehicle information and the driver's estimated visible object, which are the basis for deriving the depth distance, are associated with each other.
  • the administrator or the like generates the depth distance estimation information based on the correspondence relationship, as shown in FIG. 3, between the driving state of the vehicle 100 estimated from the driver's behavior and vehicle information and examples of the driver's estimated visible objects.
  • the administrator or the like may, for example, verify the above-mentioned correspondence from information obtained by test driving the vehicle 100 and generate the depth distance estimation information.
  • the depth distance estimation information may be any information that allows depth distance information to be obtained from orientation information and travel-related information (vehicle information in this case).
  • a value with a range may be set for the depth distance. For example, when the depth distance estimating unit 13 estimates that the depth distance is "80 to 100 m", this means in detail that the depth distance estimating unit 13 estimated that the distance from the installation position of the headlight 2 to the estimated visual recognition position is in the range of 80 to 100 m.
  • the depth distance estimation unit 13 outputs information regarding the estimated depth distance (hereinafter referred to as “depth distance information”) to the irradiation determination unit 14.
  • depth distance information is information in which a depth distance and an estimated visual recognition position are associated with each other. Note that since the installation position of the headlight 2 and the position of the center of the driver's head are known, the depth distance estimating unit 13 can estimate the estimated visual recognition position based on the positional relationship between the installation position of the headlight 2 and the center of the driver's head, the estimated depth distance, and the driver's orientation.
  • the estimated visual recognition position is a point in real space, and is expressed, for example, by coordinate values with the right lamp or left lamp as the origin, or by coordinate values that can be mapped on a map.
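As a sketch of how the estimated visual recognition position could be computed from the known lamp and head positions, the estimated depth distance, and the driver's orientation: the gaze is modeled as a ray from the head center, and the point on that ray whose distance from the lamp equals the depth distance is solved for. The coordinate frame (x forward, y left, z up, lamp as origin of the frame, yaw positive to the left, pitch positive upward) and the quadratic formulation are illustrative assumptions, not the patent's prescribed method.

```python
import math

def estimated_visual_position(lamp, head, yaw_deg, pitch_deg, depth_m):
    """Point P = head + t*d on the driver's gaze line with |P - lamp| = depth_m.
    lamp/head are (x, y, z); solves |o + t*d|^2 = depth^2 for the forward root."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    d = (math.cos(pitch) * math.cos(yaw),   # unit gaze direction
         math.cos(pitch) * math.sin(yaw),
         math.sin(pitch))
    o = tuple(h - l for h, l in zip(head, lamp))  # head relative to lamp
    b = sum(oi * di for oi, di in zip(o, d))      # o . d  (|d| = 1)
    c = sum(oi * oi for oi in o) - depth_m ** 2
    t = -b + math.sqrt(b * b - c)                 # forward intersection distance
    return tuple(hi + t * di for hi, di in zip(head, d))
```

By construction, the returned point lies exactly the depth distance away from the lamp, measured along the driver's facing direction, as in the definition of the depth distance above.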
  • if the driver's behavior and vehicle information, determined based on the orientation information regarding the driver's orientation detected by the orientation detecting unit 11 and the travel-related information acquired by the travel-related information acquisition unit 12 (here, the vehicle information acquired by the vehicle information acquisition unit 121), do not match any of the driver's behavior and vehicle information set as input information in the depth distance estimation information, the depth distance estimating unit 13 assumes that the depth distance could not be estimated and sets the initial value of the depth distance as the depth distance.
  • the initial value of the depth distance is set in advance by an administrator or the like, for example, and is stored in a location that can be referenced by the depth distance estimation unit 13.
  • the depth distance estimation unit 13 outputs depth distance information regarding the depth distance for which the initial value is set to the irradiation determination unit 14.
  • the irradiation determining unit 14 determines the range of light irradiated by the headlight 2 based on the depth distance estimated by the depth distance estimating unit 13. Specifically, the irradiation determining unit 14 determines which part of the irradiable area of the headlight 2, in other words, the high beam irradiable area, the low beam irradiable area, and the auxiliary light irradiable area, is to be irradiated with light, as the range of light irradiation by the headlight 2. In the following description, the range of light irradiated by the headlight 2 is also simply referred to as the "irradiation range".
  • the irradiation determining unit 14 determines the vertical and left-right directions of the irradiation range with respect to the direction in which the driver is facing. Specifically, with the installation position of the headlight 2 as a reference position and the irradiation angle of light irradiated straight ahead from that position as a reference (0 degrees), the irradiation determining unit 14 determines from what angle to what angle in the vertical direction the irradiation range extends, and from what angle to what angle in the left-right direction it extends.
  • regarding the vertical direction of the irradiation range, the irradiation determining unit 14 first calculates the vertical angle (hereinafter referred to as the "depth distance vertical angle") formed by a virtual line segment from the installation position of the headlight 2 to the estimated visual recognition position separated by the depth distance and a virtual straight line drawn from the installation position of the headlight 2 in the traveling direction of the vehicle 100.
  • the irradiation determining unit 14 can calculate the depth distance vertical angle using the same method as for the depth distance horizontal angle described below. Then, the irradiation determining unit 14 sets, for example, an angular range widened by a preset angle in the vertical direction from the depth distance vertical angle as the vertical direction of the irradiation range.
  • regarding the left-right direction of the irradiation range, the irradiation determining unit 14 first calculates the horizontal angle (hereinafter referred to as the "depth distance horizontal angle") formed by a virtual line segment from the installation position of the headlight 2 to the estimated visual recognition position separated by the depth distance and a virtual straight line drawn from the installation position of the headlight 2 in the traveling direction of the vehicle 100.
  • the irradiation determining unit 14 can determine the depth distance and the estimated visible position based on the depth distance information.
  • the installation position of the headlight 2 is known in advance.
  • the irradiation determination unit 14 can therefore calculate the depth distance horizontal angle based on the depth distance, the estimated visual recognition position, and the installation position of the headlight 2. Then, the irradiation determining unit 14 sets, for example, an angular range widened by a preset angle in the horizontal direction from the depth distance horizontal angle as the left-right direction of the irradiation range.
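Under the same assumed frame as before (x in the traveling direction, y left, z up), the depth distance vertical angle and depth distance horizontal angle reduce to elementary trigonometry. This is a sketch under those frame assumptions, not the patent's prescribed computation:

```python
import math

def depth_distance_angles(lamp, target):
    """Angles (degrees) between the segment lamp -> target (the estimated
    visual recognition position) and a virtual straight line drawn from the
    lamp in the traveling direction (+x). Returns (vertical, horizontal)."""
    dx, dy, dz = (t - l for t, l in zip(target, lamp))
    horizontal = math.degrees(math.atan2(dy, dx))                # depth distance horizontal angle
    vertical = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # depth distance vertical angle
    return vertical, horizontal
```

A target straight ahead yields (0, 0); a target offset equally to the left or upward yields a 45-degree horizontal or vertical angle respectively.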
  • the depth distance estimation unit 13 may estimate the depth distance as a value with a range.
  • in that case, the irradiation determining unit 14 first calculates the depth distance vertical angle (referred to as the first depth distance vertical angle) based on the smallest distance in the depth distance range (for example, "80 m" in the example described above using FIG. 2). Further, the irradiation determining unit 14 calculates the depth distance vertical angle (referred to as the second depth distance vertical angle) based on the largest distance in the depth distance range (for example, "100 m" in the example described above using FIG. 2).
  • the irradiation determining unit 14 sets, for example, the range from the smaller to the larger of the first depth distance vertical angle and the second depth distance vertical angle as the vertical direction of the irradiation range.
  • for the left-right direction of the irradiation range, the irradiation determining unit 14 similarly calculates a first depth distance horizontal angle and a second depth distance horizontal angle, and sets, for example, the range from the smaller to the larger of the two horizontal angles as the left-right direction of the irradiation range.
  • the irradiation determining unit 14 may determine the vertical direction of the irradiation range based on the center of the depth distance range. For example, when the depth distance is estimated to be "80 m to 100 m", the irradiation determining unit 14 calculates the depth distance vertical angle from the central depth distance "90 m". The irradiation determining unit 14 may set the calculated depth distance vertical angle as the center and set the angular range that is widened by a preset angle in the vertical direction from the depth distance vertical angle as the vertical direction of the irradiation range. Further, for example, the irradiation determining unit 14 may determine the horizontal direction of the irradiation range based on the center of the depth distance range using a similar method.
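The two ways described above of turning a ranged depth distance into an angular span, from the smaller to the larger of the first and second angles, or a preset widening around the angle at the centre of the depth range, can be sketched as follows. Function names and parameters are illustrative:

```python
def span_from_first_and_second(angle_a, angle_b):
    """Vertical (or left-right) span of the irradiation range: from the
    smaller to the larger of the first and second depth distance angles."""
    lo, hi = sorted((angle_a, angle_b))
    return lo, hi

def span_from_center(center_angle, half_width_deg):
    """Alternative: an angular range widened by a preset angle on both sides
    of the angle computed from the centre of the depth range (e.g. 90 m
    when the depth distance is estimated as "80 to 100 m")."""
    return center_angle - half_width_deg, center_angle + half_width_deg
```

Both variants produce a (lower, upper) pair of angles that the irradiation determining unit would output as one direction of the irradiation range.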
  • if the depth distance could not be estimated, the irradiation determining unit 14 determines the irradiation range based on the initial value. Specifically, when the depth distance estimating unit 13 outputs depth distance information regarding the depth distance for which the initial value has been set, the irradiation determining unit 14 determines the vertical and left-right directions of the irradiation range corresponding to the initial value of the depth distance.
  • the irradiation determining unit 14 outputs information regarding the determined irradiation range (hereinafter referred to as “irradiation information”) to the headlight control unit 15.
  • the irradiation information includes information indicating the angular range in the vertical direction and the angular range in the horizontal direction of the irradiation range, with the installation position of the headlight 2 as a reference position.
  • the headlight control unit 15 causes the headlight 2 to irradiate the irradiation range determined by the irradiation determination unit 14.
  • for example, the irradiation determining unit 14 outputs to the headlight control unit 15 irradiation information indicating that the range of θ1 degrees to θ2 degrees is the vertical direction of the irradiation range and the range of φ1 degrees to φ2 degrees is the left-right direction of the irradiation range.
  • in that case, the headlight control unit 15 individually controls the light sources of the low beam unit, high beam unit, or auxiliary light unit of the headlight 2 so that light is irradiated in the range of φ1 degrees to φ2 degrees in the left-right direction and θ1 degrees to θ2 degrees in the vertical direction.
  • the headlight control unit 15 may perform control so that the above range is irradiated with light by, for example, individually changing the optical axes of a plurality of light sources of the low beam unit, high beam unit, or auxiliary light unit, or by selecting which of the plurality of light sources of the low beam unit, high beam unit, or auxiliary light unit to turn on.
  • FIG. 4 is a diagram for explaining an example of how the headlight control unit 15 causes the headlight 2 to irradiate the irradiation range determined by the irradiation determination unit 14 in the first embodiment.
  • the driver is indicated by "D”
  • the range of light irradiated by the headlight 2 is indicated by "LA”.
  • FIG. 4 is a side view of the road on which the vehicle 100 is traveling. Further, in FIG. 4, it is assumed that the driver's face direction is within the "upward" range, the traveling speed of the vehicle 100 is 90 km/h, and the steering wheel angle of the vehicle 100 is 3 degrees to the right.
  • the headlight control unit 15 causes the headlight 2 to irradiate the irradiation range determined by the irradiation determination unit 14. In the situation of FIG. 4, the depth distance estimating unit 13 estimates the depth distance to be 80 to 100 m based on the depth distance estimation information as shown in FIG. 2, and the irradiation determining unit 14 determines the range corresponding to the depth distance of 80 to 100 m as the irradiation range (the irradiation range in the left-right direction is not shown in FIG. 4).
  • here, it is assumed that the irradiation determination unit 14 calculates a first depth distance vertical angle based on the depth distance "80 m" and a second depth distance vertical angle based on the depth distance "100 m", and thereby determines the vertical direction of the irradiation range.
  • in this way, the headlight control unit 15 irradiates light onto the irradiation range determined by the irradiation determining unit 14 based on the depth distance, which the depth distance estimation unit 13 estimated in the direction in which the driver is facing while taking into consideration the driver's orientation and the travel-related information (here, vehicle information). As a result, the headlight control unit 15 can control the headlight 2 so that light is irradiated onto the estimated visible object that is estimated to exist at the estimated visual recognition position, and the driver can visually recognize that object. In the situation shown in FIG. 4, the estimated visible object is estimated to be a sign (see FIG. 3).
  • therefore, the headlight control unit 15 can cause light to be irradiated onto the sign, which is the driver's estimated visible object. Note that a sign is also illustrated in FIG. 4 for ease of understanding, but the sign is an object that is presumed to exist and does not necessarily actually exist.
  • when the driver's orientation cannot be obtained, the headlight control section 15 causes the headlight 2 to emit light in an irradiation direction predetermined for that case.
  • the irradiation direction for when the driver's orientation cannot be obtained is, for example, the front direction.
  • Information regarding the range of the front direction is generated in advance by an administrator or the like, and is stored in a location where the headlight control unit 15 can refer to it.
  • the storage unit 16 stores various information. Note that in FIG. 1, the storage unit 16 is included in the headlight control device 1, but this is only an example. The storage unit 16 may be provided at a location outside the headlight control device 1 that can be referenced by the headlight control device 1.
  • FIG. 5 is a flowchart for explaining the operation of the headlight control device 1 according to the first embodiment.
  • when the headlight control device 1 determines to perform lighting control of the headlight 2 based on the driver's orientation, it starts the operation shown in the flowchart of FIG. 5.
  • the headlight control device 1 operates according to the flowchart of FIG. 5 until, for example, the headlights 2 are turned off, the headlight control device 1 is turned off, or the power of the vehicle 100 is turned off. Repeat the actions shown in .
  • the control unit (not shown) of the headlight control device 1 acquires information indicating the state of the headlights 2 from a headlight switch mounted on the vehicle 100, and determines whether the headlights 2 are on. Alternatively, if a switch for the driver direction tracking function is provided, the control unit of the headlight control device 1 determines whether the headlight control device 1 is in an on state.
  • when the headlights 2 are on, the control unit determines to start lighting control of the headlights 2 based on the driver's orientation, and outputs information instructing the start of the lighting control to the orientation detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determining unit 14, and the headlight control unit 15. Further, when the control unit determines that the headlights 2 are off, the headlight control device 1 is off, or the power of the vehicle 100 is turned off, the control unit determines to end the lighting control of the headlights 2 based on the driver's orientation, and outputs information instructing the end of the lighting control to the orientation detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determining unit 14, and the headlight control unit 15.
  • the orientation detection unit 11 acquires an in-vehicle captured image from the in-vehicle imaging device 3, and detects the orientation of the driver based on the acquired in-vehicle captured image (step ST1-1).
  • the orientation detection unit 11 outputs orientation information to the depth distance estimation unit 13 and stores it in the storage unit 16.
  • in step ST1-1, if the orientation detection unit 11 does not detect the driver's orientation, the orientation detection unit 11 outputs, to the depth distance estimating unit 13, the orientation information stored in the storage unit 16 that indicates the driver's orientation detected immediately before, until the orientation detection determination time elapses after the driver's orientation is no longer detected.
  • if the orientation detection determination time has elapsed since the driver's orientation was last detected, the orientation detection unit 11 outputs orientation invalidation information to the headlight control unit 15. When the orientation detection unit 11 outputs orientation invalidation information to the headlight control unit 15, the headlight control device 1 skips the processing of steps ST2 to ST3, which will be described later, and proceeds to the processing of step ST4.
  • the travel-related information acquisition unit 12 acquires travel-related information from the travel-related information acquisition device 4.
  • the vehicle information acquisition unit 121 acquires vehicle information from the driving-related information acquisition device 4 (step ST1-2).
  • the vehicle information acquisition unit 121 outputs the acquired vehicle information to the depth distance estimation unit 13.
  • the depth distance estimating unit 13 estimates the depth distance using the orientation information regarding the driver's orientation detected by the orientation detecting unit 11 in step ST1-1, the travel-related information acquired by the travel-related information acquisition unit 12 in step ST1-2, and the depth distance estimation information (step ST2).
  • the depth distance estimation unit 13 outputs depth distance information to the irradiation determination unit 14.
  • the irradiation determining unit 14 determines the range of light irradiated by the headlight 2 based on the depth distance estimated by the depth distance estimating unit 13 in step ST2 (step ST3).
  • the irradiation determining section 14 outputs irradiation information to the headlight control section 15.
  • the headlight control section 15 causes the headlight 2 to irradiate the irradiation range determined by the irradiation determining section 14 in step ST3 (step ST4). If the orientation detection section 11 output orientation invalidation information to the headlight control section 15 in step ST1-1, the headlight control section 15 causes the headlight 2 to irradiate light, in step ST4, in the irradiation direction for when the driver's orientation cannot be obtained.
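One pass through steps ST1-1 to ST4 of FIG. 5 can be expressed structurally as below. The processing units are passed in as callables and all names are illustrative, including the convention that a failed detection yields None (standing in for the orientation invalidation information):

```python
def headlight_cycle(detect_orientation, get_vehicle_info,
                    estimate_depth, decide_range, irradiate, fallback_range):
    """One cycle of the FIG. 5 flow. Returns the range that was irradiated."""
    orientation = detect_orientation()                 # step ST1-1
    vehicle_info = get_vehicle_info()                  # step ST1-2
    if orientation is None:                            # orientation invalidation
        irradiate(fallback_range)                      # skip ST2-ST3, go to ST4
        return fallback_range
    depth = estimate_depth(orientation, vehicle_info)  # step ST2
    irradiation_range = decide_range(depth)            # step ST3
    irradiate(irradiation_range)                       # step ST4
    return irradiation_range
```

In the device, this cycle would repeat until the headlights 2 are turned off, the headlight control device 1 is turned off, or the power of the vehicle 100 is turned off.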
  • if the driver's behavior and vehicle information do not match any input information in the depth distance estimation information, the depth distance estimating unit 13 sets the initial value of the depth distance as the depth distance.
  • for example, assume that the depth distance estimation information is as shown in FIG. 2, the driver's face orientation is within the range of "front" in the vertical direction and "right" in the left-right direction, the traveling speed of the vehicle 100 is 30 km/h, and the steering wheel angle is 5 degrees. In this case, the driver's face orientation and the vehicle speed match No. 2 of the depth distance estimation information, but the steering wheel angle does not match. Therefore, the depth distance estimation unit 13 sets the initial value as the depth distance.
  • in this case, in step ST3, the irradiation determining unit 14 determines the irradiation range based on the depth distance for which the initial value has been set, and in step ST4, the headlight control unit 15 irradiates light onto the irradiation range determined based on that depth distance. If the depth distance estimation unit 13 subsequently becomes able to estimate the depth distance using the depth distance estimation information, it estimates the depth distance using that information. For example, in the above example, assume that the steering wheel angle becomes 20 degrees while the driver's face direction and vehicle speed remain the same, because the driver turns the steering wheel to the right in order to make a right turn.
  • then, the depth distance estimating unit 13 determines that the driver's face orientation, vehicle speed, and steering wheel angle match No. 2 of the depth distance estimation information, and estimates the depth distance to be 15 to 20 m. The irradiation determination unit 14 then determines the irradiation range based on the depth distance estimated from the depth distance estimation information, and the headlight control unit 15 causes the headlight 2 to irradiate the determined irradiation range with light.
  • in this way, the headlight control device 1 switches from lighting control of the headlights 2 based on the initial value of the depth distance to lighting control according to the depth distance estimated by the depth distance estimation unit 13, so that light is irradiated onto the estimated visible object that is estimated to exist at the estimated visual recognition position.
  • the distance from the installation position of the headlight 2 to the position the driver is estimated to actually be looking at (the estimated visual recognition position), that is, the depth distance, cannot be determined from the driver's orientation alone. If how far ahead the headlights 2 should emit light is fixedly determined, it is difficult to irradiate light onto the location where the driver is estimated to actually be looking, or onto objects present at that location. Note that the position of the driver's head and the installation position of the headlight 2 differ.
  • moreover, if the irradiation range were simply widened to compensate, it would be too wide, giving glare to pedestrians and drivers of other vehicles.
  • if the above matters are not taken into consideration, a situation may occur in which the light of the headlights 2 cannot be irradiated onto the driver's estimated visual recognition position.
  • for example, a situation may occur in which the driver's estimated visual recognition position is closer than the fixed range of light irradiation from the headlights 2, so that the headlights 2 cannot illuminate it.
  • specifically, for example, suppose the driver is checking whether a pedestrian will jump out from behind a parked vehicle ahead in an underground parking lot. In this case, the driver is actually checking a point a few meters ahead, near the parked vehicle. However, if the several tens of meters ahead that the driver would view during normal driving, such as when driving in an area with no parked vehicles, is fixedly determined as the distance at which the headlight 2 emits light, the light from the headlight 2 may not be irradiated onto the upper body of a pedestrian in the shadow of the stopped vehicle, because the point a few meters ahead may not be within the range of light irradiated by the headlights 2. In this way, if light is irradiated at a fixed distance from the headlight 2 without considering the depth distance, it may not be possible to irradiate a pedestrian who jumps out, and the driver will be delayed in discovering the pedestrian.
  • conversely, a situation may occur in which the driver's estimated visual recognition position is farther than the fixed range of light irradiation from the headlights 2, so that the light from the headlights 2 cannot illuminate it. Specifically, for example, suppose that a driver checks a sign ahead while driving on an expressway. The sign that the driver is trying to check is farther away than the fixed range of light irradiation from the headlights 2, and may not be within that range. If light is irradiated at a fixedly determined distance from the headlight 2 without considering the depth distance, the sign may not be irradiated with light, and the driver will be delayed in checking the sign.
  • the headlight control device 1 detects the direction of the driver based on the captured image inside the vehicle, and acquires driving-related information (here, vehicle information).
  • the headlight control device 1 estimates a depth distance based on the detected orientation information regarding the direction of the driver and the acquired travel-related information, and determines the range of light irradiated by the headlights 2 based on the estimated depth distance. .
  • the headlight control device 1 causes the headlight 2 to irradiate the determined irradiation range with light. Therefore, the headlight control device 1 can appropriately illuminate the estimated visible position of the driver, and can provide driving support when the vehicle 100 runs at night or the like.
  • the depth distance estimating section 13 uses the direction information regarding the direction of the driver detected by the direction detection section 11 and the driving-related information acquired by the driving-related information acquisition section 12. More specifically, the depth distance was estimated using the vehicle information acquired by the vehicle information acquisition unit 121 and the depth distance estimation information.
• the present invention is not limited to this, and, for example, the depth distance estimation unit 13 may estimate the width of the irradiation range (hereinafter referred to as the "ideal width of the irradiation range") in addition to estimating the depth distance.
  • the administrator or the like sets the depth distance in the depth distance estimation information, and also sets the upper limit in the vertical direction and the upper limit in the horizontal direction of the irradiation range as the ideal width of the irradiation range.
  • the depth distance estimation unit 13 estimates the depth distance and the ideal width of the irradiation range based on the orientation information, vehicle information, and depth distance estimation information.
  • the depth distance estimation section 13 outputs depth distance information and information regarding the ideal width of the irradiation range to the irradiation determination section 14.
  • the irradiation determining unit 14 determines the irradiation range based on the depth distance estimated by the depth distance estimating unit 13 and the ideal width of the irradiation range.
• the irradiation determining unit 14 determines the irradiation range with the ideal width of the irradiation range as its upper limit. For example, if the depth distance estimated by the depth distance estimation unit 13 is a value with a wide range, the irradiation range determined by the irradiation determination unit 14 based on that depth distance may be too wide, and the light from the headlights 2 may give glare to oncoming vehicles and the like.
• because the depth distance estimation unit 13 estimates the ideal width of the irradiation range and the irradiation determination unit 14 determines the irradiation range with the ideal width of the irradiation range as the upper limit, the headlight control device 1 can reduce the glare given to oncoming vehicles and the like.
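The upper-limit behavior described above can be sketched as a simple clamp. This is a minimal illustration only: the function name, the degree-based representation of the range, and all numeric values are assumptions, since the embodiment only specifies that the ideal width of the irradiation range acts as an upper limit on the range derived from the depth distance.

```python
def clamp_irradiation_range(width_deg, height_deg, ideal_width_deg, ideal_height_deg):
    """Clamp a candidate irradiation range to the ideal-width upper limits.

    The candidate range (width_deg, height_deg) is derived from the depth
    distance; the ideal width acts purely as an upper bound, so narrower
    candidate ranges pass through unchanged.
    """
    return (min(width_deg, ideal_width_deg), min(height_deg, ideal_height_deg))
```

For example, a candidate range of 40 by 20 degrees clamped against an ideal width of 30 by 25 degrees would yield 30 by 20 degrees, keeping the glare-limiting bound only where it is exceeded.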
  • the depth distance estimating unit 13 may estimate the depth distance and the amount of light irradiated by the headlights 2 (hereinafter referred to as "ideal light amount"). For example, an administrator or the like sets a depth distance in the depth distance estimation information, and also sets an ideal irradiation light amount according to the depth distance as the ideal light amount. For example, an administrator or the like sets a larger value for the ideal light amount as the depth distance becomes larger.
  • the depth distance estimation unit 13 estimates the depth distance and the ideal light amount based on the orientation information, vehicle information, and depth distance estimation information.
  • the depth distance estimation section 13 outputs depth distance information to the irradiation determination section 14 and also outputs information regarding the estimated ideal light amount to the headlight control section 15.
  • the headlight control unit 15 causes the headlight 2 to irradiate light at the ideal light amount estimated by the depth distance estimating unit 13 in the irradiation range determined by the irradiation determining unit 14.
• since the depth distance estimating unit 13 estimates the ideal amount of light and the headlight control unit 15 causes the headlights 2 to emit light at that ideal amount, the headlight control device 1 can irradiate light with the amount of light assumed to be necessary for the driver to visually recognize the estimated visible object, according to its distance from the vehicle 100.
  • the depth distance estimation unit 13 may estimate the depth distance and the ideal width and ideal light amount of the irradiation range.
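The relationship between depth distance and ideal light amount described above (a larger depth distance yields a larger set value) might be sketched as follows. The linear model, the constants, and the cap are illustrative assumptions: the embodiment only states that the administrator sets a larger ideal light amount for a larger depth distance.

```python
def ideal_light_amount(depth_distance_m, base_lumens=600.0, gain_per_m=20.0, max_lumens=1500.0):
    # A larger depth distance yields a larger ideal light amount, capped
    # at an assumed maximum headlight output. All constants are
    # illustrative placeholders for values an administrator would set.
    return min(base_lumens + gain_per_m * depth_distance_m, max_lumens)
```

Under these placeholder constants, a 10 m depth distance yields 800 lumens, while very large depth distances saturate at the 1500-lumen cap.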
• the headlight control device 1 is an in-vehicle device mounted on the vehicle 100, and it is assumed that the direction detection section 11, the travel-related information acquisition section 12, the depth distance estimation section 13, the irradiation determination section 14, the headlight control section 15, and a control section (not shown) are included in the in-vehicle device.
• the present invention is not limited to this; some of the orientation detection unit 11, driving-related information acquisition unit 12, depth distance estimation unit 13, irradiation determination unit 14, and headlight control unit 15 may be provided in the in-vehicle device of the vehicle 100, with the others provided in a server connected to the in-vehicle device via a network. Alternatively, the server may include all of the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown).
• FIGS. 6A and 6B are diagrams showing an example of the hardware configuration of the headlight control device 1 according to the first embodiment.
• the functions of the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit are realized by a processing circuit 1001. That is, the headlight control device 1 includes the processing circuit 1001 for estimating the depth distance based on the driving-related information and the orientation information regarding the direction of the driver detected from the in-vehicle captured image acquired from the in-vehicle imaging device 3, and for controlling the lighting of the headlights 2 based on the estimated depth distance.
  • Processing circuit 1001 may be dedicated hardware as shown in FIG. 6A, or may be processor 1004 that executes a program stored in memory 1005 as shown in FIG. 6B.
• when the processing circuit 1001 is dedicated hardware, the processing circuit 1001 is, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
• when the processing circuit is the processor 1004, the functions of the direction detection section 11, the travel-related information acquisition section 12, the depth distance estimation section 13, the irradiation determination section 14, the headlight control section 15, and the control section are realized by software, firmware, or a combination of software and firmware. Software or firmware is written as a program and stored in the memory 1005. The processor 1004 reads out and executes the program stored in the memory 1005, thereby executing the functions of the direction detection section 11, the travel-related information acquisition section 12, the depth distance estimation section 13, the irradiation determination section 14, the headlight control section 15, and the control section (not shown).
• the headlight control device 1 includes the memory 1005 that stores a program that, when executed by the processor 1004, results in steps ST1-1, ST1-2 to ST4 in FIG. being executed. It can also be said that the program stored in the memory 1005 causes a computer to execute the processing procedures or methods of the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown).
• the memory 1005 is, for example, a non-volatile or volatile semiconductor memory such as a RAM, a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), a magnetic disk, a flexible disk, an optical disk, a compact disk, a mini disk, a DVD (Digital Versatile Disc), or the like.
• the functions of the direction detection unit 11, driving-related information acquisition unit 12, depth distance estimation unit 13, irradiation determination unit 14, headlight control unit 15, and control unit may be realized partly by dedicated hardware and partly by software or firmware.
• for example, the functions of the direction detection unit 11 and the driving-related information acquisition unit 12 can be realized by the processing circuit 1001 as dedicated hardware, and the functions of the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit can be realized by the processor 1004 reading and executing a program stored in the memory 1005.
  • the storage unit 16 includes, for example, a memory 1005.
  • the headlight control device 1 also includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the headlight 2, the in-vehicle imaging device 3, or the driving-related information acquisition device 4.
• the headlight control device 1 includes: the orientation detection unit 11 that detects the orientation of the driver based on the captured image of the driver of the vehicle 100 (the in-vehicle captured image); the driving-related information acquisition unit 12 that acquires driving-related information related to the driving of the vehicle 100; the depth distance estimation unit 13 that estimates the depth distance, which is the distance from the installation position of the headlights 2 provided in the vehicle 100 to the estimated visible position estimated to be the position the driver can see in the direction the driver is facing, based on the orientation information regarding the driver's orientation detected by the orientation detection unit 11 and the driving-related information acquired by the driving-related information acquisition unit 12; the irradiation determining unit 14 that determines the range of light irradiated by the headlights 2 based on the depth distance estimated by the depth distance estimating unit 13; and the headlight control section 15 that causes the headlights 2 to irradiate the determined irradiation range with light. Therefore, in controlling the lighting of the headlights 2 of the vehicle 100 based on the direction in which the driver is facing, the headlight control device 1 can perform lighting control in consideration of how far ahead in that direction the driver is actually looking.
  • the headlight control device 1 can appropriately illuminate the estimated visual position of the driver, and can provide driving support when the vehicle 100 travels at night or the like.
• the driving-related information acquisition unit 12 includes the vehicle information acquisition unit 121 that acquires vehicle information regarding the vehicle 100 as driving-related information, and the depth distance estimating unit 13 estimates the depth distance based on the direction information and the vehicle information. Therefore, in controlling the lighting of the headlights 2 of the vehicle 100 based on the direction in which the driver is facing, the headlight control device 1 can perform lighting control in consideration of how far ahead in that direction the driver is actually looking.
  • the headlight control device 1 can appropriately illuminate the estimated visible position of the driver, and can provide driving support when the vehicle 100 travels at night or the like.
  • the headlight control device estimates the depth distance based on direction information and vehicle information.
• Embodiment 2 In the second embodiment, an embodiment in which the depth distance is estimated by further using map information will be described.
  • FIG. 7 is a diagram showing a configuration example of a headlight control device 1a according to the second embodiment.
  • the headlight control device 1a is mounted on the vehicle 100.
  • the headlight control device 1a controls the headlights 2 provided in the vehicle 100 based on the orientation of the driver of the vehicle 100.
  • the "driver's orientation” is expressed by the driver's face orientation or the driver's line of sight direction.
• the "driver's orientation" may include not only the driver's face orientation or the driver's line of sight direction, but also the driver's body orientation, in other words, the driver's posture.
• the light control of the headlights 2 based on the direction of the driver performed by the headlight control device 1a is assumed to be performed when the surroundings of the vehicle 100 are dark, such as in a parking lot at night or a city area at night, and the headlights 2 are turned on.
• in the following description, the "driver's face direction" refers to the "driver's face direction or line of sight direction," which also includes the direction of the driver's line of sight.
• the headlight control device 1a according to the second embodiment differs from the headlight control device 1 according to the first embodiment in that the driving-related information acquisition section 12a includes a map information acquisition section 122 in addition to the vehicle information acquisition section 121. Further, the specific operation of the depth distance estimation section 13a in the headlight control device 1a according to the second embodiment differs from that of the depth distance estimation section 13 in the headlight control device 1 according to the first embodiment.
• the driving-related information acquisition device 4 includes a vehicle speed sensor (not shown), a steering wheel angle sensor (not shown), and a positioning device such as a car navigation device, a GPS (Global Positioning System), or a high-precision locator. Positioning devices such as car navigation devices hold map information.
• the map information also includes the current location of the vehicle 100, route information regarding the route the vehicle 100 is traveling on, or information regarding the lane (here, a so-called traffic lane) in which the vehicle 100 is traveling.
  • the high-precision locator can acquire information about the current position of the vehicle 100 in units of several tens of centimeters, and therefore can acquire information about the lane in which the vehicle 100 is traveling.
  • the driving-related information acquisition device 4 outputs vehicle information and map information to the headlight control device 1a as driving-related information.
  • the map information acquisition unit 122 acquires map information from the travel-related information acquisition device 4.
  • the map information acquisition unit 122 outputs the acquired map information to the depth distance estimation unit 13a as travel-related information.
• the depth distance estimation unit 13a estimates the depth distance based on the driver's orientation detected by the orientation detection unit 11 and the driving-related information acquired by the driving-related information acquisition unit 12a (specifically, the vehicle information acquired by the vehicle information acquisition unit 121 and the map information acquired by the map information acquisition unit 122). Specifically, the depth distance estimating unit 13a estimates the depth distance by comparing the orientation information and the travel-related information (here, the vehicle information and the map information) with the depth distance estimation information.
  • FIG. 8 is a diagram showing an example of the contents of depth distance estimation information used by the depth distance estimating section 13a to estimate the depth distance in the second embodiment.
  • the depth distance estimation information is, for example, a table in which driver behavior, vehicle information, map information, and depth distance are associated with each other.
  • the driver's behavior includes, for example, the vertical direction of the driver's face direction and the horizontal direction of the driver's face direction.
• the vertical direction of the driver's face is expressed as "front", "above", or "down".
  • the left and right direction of the driver's face is expressed as "front", "right", or "left”.
  • the vehicle information includes, for example, vehicle speed.
  • the map information includes, for example, information on the traveling position of the vehicle 100, route information, and information on the lane position.
• the depth distance estimation unit 13a determines the driver's behavior based on the driver's orientation detected by the orientation detection unit 11, and estimates the depth distance by comparing the determined behavior, the information indicating the vehicle speed and steering angle included in the vehicle information acquired by the vehicle information acquisition unit 121, and the information indicating the traveling position of the vehicle 100, the route information, and the information indicating the lane position included in the map information acquired by the map information acquisition unit 122 with the driver's behavior, vehicle information, and map information set in the depth distance estimation information as shown in FIG. 8, thereby obtaining the depth distance information.
• for example, the depth distance estimating unit 13a determines whether the vertical direction of the driver's face is "front", "above", or "down" from the information indicating the driver's vertical face orientation included in the orientation information. Likewise, the depth distance estimating unit 13a determines whether the horizontal direction of the driver's face is "front", "right", or "left" from the information indicating the driver's horizontal face orientation included in the orientation information.
• the method by which the depth distance estimating unit 13a determines whether the vertical and horizontal directions of the driver's face are "front", "up", "down", "right", or "left" is the same as the method by which the depth distance estimating unit 13 makes this determination, which has already been explained in the first embodiment, so a duplicate explanation is omitted.
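As a rough illustration of how continuous face-orientation angles might be mapped to the discrete labels used in the depth distance estimation information, consider the following sketch. The ±10-degree dead band, the function name, and the angle conventions are illustrative assumptions; the embodiments do not specify the thresholds.

```python
def classify_direction(pitch_deg, yaw_deg, threshold_deg=10.0):
    # Map continuous face-orientation angles to the discrete labels
    # ("front"/"up"/"down" vertically, "front"/"right"/"left"
    # horizontally) used when matching against the estimation table.
    # Positive pitch is assumed to mean looking up; positive yaw, right.
    if pitch_deg > threshold_deg:
        vertical = "up"
    elif pitch_deg < -threshold_deg:
        vertical = "down"
    else:
        vertical = "front"
    if yaw_deg > threshold_deg:
        horizontal = "right"
    elif yaw_deg < -threshold_deg:
        horizontal = "left"
    else:
        horizontal = "front"
    return vertical, horizontal
```

A face held level and straight ahead classifies as ("front", "front"), while a raised face turned past the threshold to the left classifies as ("up", "left").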
• for example, the depth distance estimation unit 13a uses the direction information, the travel-related information (here, the vehicle information and the map information), and the depth distance estimation information to estimate, as the depth distance, a range including the sidewalk and crosswalk at the intersection (see No. 2 of the depth distance estimation information in FIG. 8).
• if the driver's face orientation is "front to top" in the vertical direction and "right or left" in the horizontal direction, the vehicle speed is "15 to 35 km/h", and the vehicle 100 is "within 20 m of an intersection" and is about to make a "right or left turn at the next intersection", it is estimated that the vehicle 100 is about to turn right or left at the intersection and that the driver's estimated visible object is a pedestrian. Therefore, in the depth distance estimation information, for the case where the vehicle 100 is about to turn right or left at an intersection while traveling at "15 to 35 km/h" within 20 m of the intersection and the driver tries to check for pedestrians, a range with a margin of 5 m including the intersection sidewalk and crosswalk, at which a pedestrian is assumed to be, is set as the depth distance (No. 2 of the depth distance estimation information in FIG. 8). Similarly, in conditions No. 1 and No. 3 to No. 6 of the depth distance estimation information in FIG. 8, the depth distance is set based on the driving state of the vehicle 100 estimated from the driver's behavior, the vehicle information, and the map information, and on the driver's estimated visible object.
• FIG. 9 is a diagram showing an example in which the driving states of the vehicle 100 and the driver's estimated visible objects are associated with conditions No. 1 to No. 6 set in the depth distance estimation information shown in FIG. 8. Note that in the depth distance estimation information shown in FIG. 8, the six conditions No. 1 to No. 6 are set, but this is merely an example.
  • the content of the depth distance estimation information as shown in FIG. 8 is only an example.
  • a traveling position, route information, and lane position are set as map information, but this is only an example.
  • information indicating the distance to a certain position may be set.
• the depth distance estimation information only needs to be information that allows the depth distance information to be obtained from the orientation information and the travel-related information (here, the vehicle information and the map information); for example, it is only necessary that the face direction, the vehicle speed, and the driving position information be associated with the depth distance.
  • the depth distance estimation unit 13a outputs depth distance information to the irradiation determination unit 14.
• if the direction information regarding the driver's orientation detected by the direction detection unit 11 and the driving-related information acquired by the driving-related information acquisition unit 12a (here, the vehicle information acquired by the vehicle information acquisition unit 121 and the map information acquired by the map information acquisition unit 122) do not match any of the input information set in the depth distance estimation information, the depth distance estimation unit 13a assumes, for example, that the depth distance could not be estimated from the depth distance estimation information, and sets an initial value as the depth distance. The depth distance estimation unit 13a then outputs the depth distance information regarding the depth distance set to the initial value to the irradiation determination unit 14.
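A minimal sketch of this table lookup with an initial-value fallback might look like the following. The single condition shown is modeled loosely on No. 2 of the depth distance estimation information in FIG. 8; all names, the unit conventions, and the string return labels are illustrative assumptions rather than the actual table format.

```python
def estimate_depth_range(vertical, horizontal, speed_kmh,
                         dist_to_intersection_m, turning):
    # Illustrative subset of the depth distance estimation information,
    # modeled on condition No. 2 in FIG. 8: vehicle within 20 m of an
    # intersection at 15-35 km/h, about to turn, driver looking aside.
    if (vertical in ("front", "up") and horizontal in ("right", "left")
            and 15 <= speed_kmh <= 35
            and dist_to_intersection_m <= 20 and turning):
        return "intersection sidewalk and crosswalk, 5 m margin"
    # No condition matched: fall back to the initial depth distance.
    return "initial value"
```

When the input information matches no row of the table, the fallback branch models the initial value that the depth distance estimation unit 13a sets before outputting the depth distance information to the irradiation determination unit 14.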
  • the irradiation determining unit 14 determines the range of light irradiated by the headlight 2 based on the depth distance estimated by the depth distance estimating unit 13a, and outputs irradiation information to the headlight control unit 15.
  • the headlight control section 15 causes the headlight 2 to irradiate the irradiation range determined by the irradiation determination section 14 with light.
  • FIG. 10A is a diagram for explaining an example of the depth distance estimated by the depth distance estimation unit 13a in the second embodiment.
• FIG. 10B is a diagram for explaining an example of how, in the second embodiment, the headlight control unit 15 causes the headlight 2 to irradiate light onto the irradiation range determined by the irradiation determination unit 14 based on the depth distance as shown in FIG. 10A.
  • FIG. 10A is an overhead view of the road on which the vehicle 100 is traveling.
• FIG. 10B is a side view of the road on which the vehicle 100 is traveling. In FIG. 10B, the driver is indicated by "D", and the range of light irradiated by the headlight 2 is indicated by "LA". Note that in FIG. 10B, the depth distance is the distance from the right light to the estimated visible position of the driver; although FIG. 10B shows the vehicle 100 as viewed from the left for convenience, the irradiation range shown in FIG. 10B is the irradiation range of the right light.
• suppose that the driver's face is facing "front" in the vertical direction and "right" in the left-right direction, and that the vehicle 100 is 15 m before the next intersection. In this case, the depth distance estimating section 13a estimates the depth distance using the depth distance estimation information, and the headlight control section 15 causes the headlights 2 to irradiate light onto the irradiation range determined by the irradiation determining unit 14. Note that it is assumed that the depth distance estimation information has the contents shown in FIG. 8.
• based on the depth distance estimation information having the contents shown in FIG. 8, the depth distance estimating unit 13a determines that the driver's behavior determined from the direction information and the driving-related information (here, the vehicle information and the map information) corresponds to No. 2 of the depth distance estimation information, and estimates the depth distance to be "a range with a 5 m margin, including the intersection sidewalk and crosswalk." As a result, in the direction of the driver, a range of depth distance that includes the sidewalk near the intersection and the crosswalk at the intersection and has a margin of 5 m is estimated (see FIG. 10A). At this time, the depth distance estimating unit 13a calculates specifically from how many meters to how many meters this range of depth distance extends.
• FIGS. 11A and 11B are diagrams for conveniently explaining an example of the method by which the depth distance estimating unit 13a specifically calculates from how many meters to how many meters the range of depth distance extends that includes the sidewalk near the intersection and the crosswalk at the intersection and has a margin of 5 m; the roads shown in FIGS. 11A and 11B do not match the road shown in FIG. 10A.
  • FIGS. 11A and 11B are overhead views of the road on which the vehicle 100 is traveling.
• in FIGS. 11A and 11B, illustration of the vehicle 100 is omitted for simplicity of explanation, but it is assumed that the vehicle 100 is traveling on a general road and is about to turn right at the next intersection, and that the driver is looking to the right. It is assumed that the direction of the driver at this time is θd degrees.
  • the depth distance estimating unit 13a acquires information indicating the number of lanes of the road including the current lane and the lane to which the vehicle is turning right, and the distance to the intersection, from the map information.
  • the depth distance estimating unit 13a has acquired information that the road including the own lane and the right turn destination lane is a road with two lanes on each side, and that the distance to the intersection is 15 meters.
  • the driver is facing to the right.
• the depth distance estimation unit 13a estimates that the driver is looking at the sidewalk in front of the vehicle 100 based on the direction information and the map information. Furthermore, since the roads shown in FIGS. 11A and 11B are general roads, the depth distance estimation unit 13a assumes that the lane width is 3.5 m. Note that it is assumed that the assumed lane width is determined in advance depending on the road type. When a high-precision locator is used as the travel-related information acquisition device 4, the depth distance estimation unit 13a may use lane width information obtained from the high-precision locator.
• the depth distance estimation unit 13a calculates the distance from the installation position of the headlight 2 to the front of the sidewalk. Specifically, the depth distance estimation unit 13a calculates the distance along a virtual straight line indicating the direction the driver is facing, from the center of the driver's head to the front of the sidewalk (indicated by "D11" in FIG. 11A), and based on this, calculates the distance from the installation position of the headlight 2 to the front of the sidewalk. Since the vehicle 100 is about to turn right, the depth distance estimating unit 13a estimates that the vehicle 100 is currently traveling in the right lane of a road with two lanes on each side.
• the depth distance estimation unit 13a may estimate the lane in which the vehicle 100 is traveling using information on the actual driving lane.
• the depth distance estimation unit 13a calculates the distance along the virtual straight line indicating the direction the driver is facing, from the center of the driver's head to the front of the sidewalk, based on the position of the center of the driver's head, the orientation of the driver, the assumed lane width, and the number of road lanes obtained from the map information. Here, it is assumed that the lane in which the vehicle 100 is currently traveling is the right lane of a road with two lanes on each side, and that the lane width is 3.5 m.
• the direction of the driver is set to θd degrees. For example, if the position of the center of the driver's head is at the widthwise center of the travel lane of the vehicle 100, the depth distance estimation unit 13a calculates (3.5 × 2.5)/sin θd (m) as the distance along the virtual straight line indicating the direction the driver is facing, from the center of the driver's head to the front of the sidewalk. Then, based on this calculated distance, the depth distance estimating unit 13a calculates the distance from the installation position of the headlight 2 to the front of the sidewalk.
• based on the distance along the virtual straight line indicating the direction the driver is facing, from the center of the driver's head to the front of the sidewalk, the depth distance estimation unit 13a can calculate the distance from the installation position of the headlight 2 to the front of the sidewalk.
• the depth distance estimation unit 13a calculates the distance from the installation position of the headlight 2 to the back of the sidewalk using the same method as for the distance from the installation position of the headlight 2 to the front of the sidewalk. Specifically, the depth distance estimation unit 13a calculates the distance along the virtual straight line indicating the direction the driver is facing, from the center of the driver's head to the back of the sidewalk (indicated by "D12" in FIG. 11B), and based on this, calculates the distance from the installation position of the headlight 2 to the back of the sidewalk.
• the depth distance estimation unit 13a calculates {15 − (3.5 × 2)}/sin θd (m) as the distance along the virtual straight line indicating the direction the driver is facing to the back of the sidewalk, and based on this distance, calculates the distance from the installation position of the headlight 2 to the back of the sidewalk.
  • the depth distance estimation unit 13a calculates the range of depth distance based on the calculated distance from the installation position of the headlight 2 to the front of the sidewalk and the distance from the installation position of the headlight 2 to the back of the sidewalk. It is assumed that the depth distance estimating unit 13a calculates, for example, a range of depth distances that includes the sidewalk near the intersection and the crosswalk at the intersection and has a margin of 5 meters.
• the depth distance estimation unit 13a adds a margin of 5 m to these distances, and calculates the range from "A m − 5 m" to "B m + 5 m" as the range of depth distance with a 5 m margin that includes the sidewalk near the intersection and the crosswalk at the intersection.
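The calculation above (a lateral offset divided by sin θd for the near and far edges of the sidewalk, then a 5 m margin added on both ends) can be sketched as follows. The function and parameter names are illustrative, and the conversion from the driver's head position to the headlight installation position is omitted for brevity.

```python
import math

def depth_range_with_margin(near_lateral_m, far_lateral_m, gaze_angle_deg, margin_m=5.0):
    # Distances along the driver's gaze to the near and far edges of the
    # sidewalk, computed from the lateral offsets and the gaze angle
    # theta_d, with a margin added on both ends (cf. FIGS. 11A and 11B).
    s = math.sin(math.radians(gaze_angle_deg))
    near = near_lateral_m / s
    far = far_lateral_m / s
    return near - margin_m, far + margin_m
```

For example, with the assumed 3.5 m lanes, the near lateral offset of 3.5 × 2.5 = 8.75 m from the text could be passed as `near_lateral_m`; at a gaze angle of 90 degrees the gaze distance equals the lateral offset, and the returned range is simply that offset widened by the margin.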
• the irradiation determining unit 14 determines the range of φ3 degrees to φ4 degrees in the left-right direction and θ3 degrees to θ4 degrees in the vertical direction as the irradiation range, based on the depth distance "10 m to 30 m" estimated by the depth distance estimating unit 13a (see FIG. 10B; in FIG. 10B, the irradiation range in the left-right direction is not shown).
  • The irradiation determining unit 14 calculates a first depth distance vertical angle based on the depth distance of "10 m" and a second depth distance vertical angle based on the depth distance of "30 m", and thereby determines the vertical extent of the irradiation range.
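One plausible geometric reading of the first and second depth distance vertical angles is the downward angle from the headlight to the road surface at each end of the depth range. This is a sketch under that assumption; the 0.7 m mounting height is hypothetical, since the patent does not give one.

```python
import math

def depth_distance_vertical_angle(depth_m: float,
                                  mount_height_m: float = 0.7) -> float:
    """Downward angle (degrees below horizontal) at which light from a
    headlight mounted mount_height_m above the road meets the road surface
    at the given depth distance (hypothetical model)."""
    return math.degrees(math.atan2(mount_height_m, depth_m))

# Vertical extent of the irradiation range for the depth range 10 m..30 m:
first_angle = depth_distance_vertical_angle(10.0)   # near edge, steeper
second_angle = depth_distance_vertical_angle(30.0)  # far edge, shallower
```

The vertical irradiation range then spans from the steeper first angle down to the shallower second angle.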
  • The headlight control unit 15 causes the headlight 2 to irradiate light onto the irradiation range determined by the irradiation determining unit 14 based on the depth distance estimated by the depth distance estimating unit 13a, which takes into account the driver's orientation and the driving-related information (here, the vehicle information and the map information). As a result, the headlight control unit 15 can control the headlight 2 so that light is irradiated onto the estimated visible object that is estimated to exist at the estimated visible position, and the driver can visually recognize the estimated visible object.
  • the estimated visible object is estimated to be a pedestrian (see FIG. 9).
  • The headlight control unit 15 can irradiate light onto pedestrian A (indicated by W1 in FIGS. 10A and 10B), who is estimated to be a visible object of the driver and is estimated to exist in the direction the driver is facing. The direction the driver is currently facing includes the sidewalk in front of the intersection and the road in the direction of travel; therefore, the pedestrian who is the driver's estimated visible object is estimated to be pedestrian A, who is crossing the sidewalk in front of the intersection or the crosswalk on the road in the direction of travel.
  • In other words, the headlight control unit 15 controls the headlights 2 so that light is irradiated onto the object that is estimated to be visible to the driver and is estimated to be located at the position (the estimated visible position) that is the estimated depth distance away from the driver in the direction in which the driver is facing.
  • In FIGS. 10A and 10B, pedestrian A and pedestrian B are illustrated for ease of understanding; however, pedestrian A and pedestrian B are merely estimated to exist and do not necessarily actually exist.
  • FIG. 12A is a diagram for explaining another example of the depth distance estimated by the depth distance estimation unit 13a in the second embodiment.
  • FIG. 12B is a diagram for explaining an example of how, in the second embodiment, the headlight control unit 15 causes the headlight 2 to irradiate light onto the irradiation range determined by the irradiation determination unit 14 based on the depth distance shown in FIG. 12A estimated by the depth distance estimation unit 13a.
  • FIG. 12A is an overhead view of the road on which the vehicle 100 is traveling.
  • FIG. 12B is a side view of the road on which the vehicle 100 is traveling. In FIG. 12B, the driver is indicated by "D", and the range of light irradiated by the headlight 2 is indicated by "LA".
  • the depth distance is the distance from the right light to the driver's estimated visual recognition position.
  • Although FIG. 12B shows the vehicle 100 as viewed from the left side with respect to the traveling direction for convenience, the irradiation range shown in FIG. 12B is the irradiation range of the right light. Also, in FIGS. 12A and 12B, as in FIGS. 10A and 10B, it is assumed that the driver's face is facing "front" in the vertical direction and "right" in the horizontal direction, that the depth distance estimation unit 13a estimates the depth distance using the depth distance estimation information, and that the headlight control unit 15 causes the headlight 2 to irradiate light onto the irradiation range determined by the irradiation determining unit 14. Further, it is assumed that the depth distance estimation information has the contents shown in FIG. 8.
  • the direction of the driver's face is different between FIG. 10 and FIG. 12.
  • The depth distance estimating unit 13a determines, based on the depth distance estimation information shown in FIG. 8, that No. 2 of the depth distance estimation information applies to the driver's behavior determined from the orientation information and the driving-related information (here, the vehicle information and the map information), and therefore estimates the depth distance to be "an area with a 5 m margin including the intersection sidewalk and crosswalk". As a result, in the direction in which the driver is facing, a depth distance is estimated that includes the sidewalk near the intersection and the crosswalk at the intersection with a margin of 5 m (see FIG. 12B).
  • the depth distance estimation unit 13a estimates the depth distance to be, for example, "20 m to 37 m" based on the map information.
  • The irradiation determining unit 14 determines, based on the depth distance "20 m to 37 m" estimated by the depth distance estimating unit 13a, the range of 5 degrees to 6 degrees in the left-right direction and the range of 5 degrees to 6 degrees in the vertical direction as the irradiation range (see FIG. 12B).
  • The irradiation determining unit 14 calculates the first depth distance vertical angle based on the depth distance "20 m" and the second depth distance vertical angle based on the depth distance "37 m", and thereby determines the vertical extent of the irradiation range.
  • The headlight control unit 15 causes the headlight 2 to irradiate light onto the irradiation range determined by the irradiation determining unit 14 based on the depth distance estimated by the depth distance estimating unit 13a, which takes into account the driver's orientation and the travel-related information (here, the vehicle information and the map information).
  • The estimated visible object is estimated to be a pedestrian (see FIG. 9). Now, as shown in FIGS. 12A and 12B, the direction the driver is facing includes the road in the traveling direction of the vehicle 100 and the sidewalk on the other side of the intersection. Therefore, the driver's estimated visible object is estimated to be pedestrian B (indicated by W2 in FIGS. 12A and 12B).
  • The headlight control unit 15 can irradiate light onto pedestrian B, who is estimated to be a visible object of the driver and is estimated to exist in the direction the driver is facing. Even if pedestrian A (indicated by W1 in FIG. 12A) is present on the sidewalk before the intersection on the route of the vehicle 100, pedestrian A does not exist in the direction the driver is facing; therefore, pedestrian A is not estimated to be an object visible to the driver. In other words, pedestrian A is not irradiated with light.
  • As described above, the headlight control device 1a can perform appropriate lighting control of the headlights 2 so that light is directed to the driver's estimated visible object according to the depth distance estimated using the orientation information, the vehicle information, the map information, and the depth distance estimation information.
  • Here, in the headlight control device 1 according to the first embodiment, the depth distance estimation unit 13 estimated the depth distance from the orientation information and the vehicle information using the depth distance estimation information, and did not take map information into consideration. Suppose, for example, that the driver's face is facing "front" in the vertical direction and "right" in the horizontal direction, and the vehicle 100 is traveling at a speed of 30 km/h at a point 15 meters before the next intersection in order to turn right at that intersection. In this case, the depth distance estimation unit 13 cannot estimate the depth distance until the driver turns the steering wheel to a certain degree (for example, 20 degrees or more) to make the right turn. Only once the steering wheel has been turned to the right to some extent can the depth distance estimating unit 13 estimate the depth distance to be, for example, "15 to 20 m" (see No. 2 of the depth distance estimation information). This is because it is difficult to estimate the running state of the vehicle 100 and the driver's estimated visible object until the steering wheel has been turned to the right to some extent.
  • the depth distance estimation information used by the depth distance estimating section 13a to estimate the depth distance includes map information.
  • the administrator or the like can, for example, estimate the estimated visible object based on the route of the vehicle 100 and set the depth distance.
  • The depth distance estimation unit 13a estimates the depth distance using the orientation information, the vehicle information, the map information, and the depth distance estimation information. As a result, in the above example, the depth distance estimating unit 13a can estimate the depth distance at an earlier timing than the point at which the driver has turned the steering wheel to some extent to make the right turn.
  • the headlight control device 1a according to the second embodiment can control the lighting of the headlights 2 according to the depth distance at an earlier timing than the headlight control device 1 according to the first embodiment.
  • Alternatively, the headlight control device 1a according to the second embodiment can start the lighting control of the headlight 2 with the depth distance as an initial value at an earlier timing than the headlight control device 1 according to the first embodiment, and can then switch to lighting control of the headlight 2 according to the depth distance estimated by the depth distance estimation unit 13a, in other words, lighting control in which light is irradiated onto the estimated visible object that is estimated to exist at the driver's estimated visible position.
  • FIG. 13 is a flowchart for explaining the operation of the headlight control device 1a according to the second embodiment.
  • The headlight control device 1a determines that lighting control of the headlights 2 based on the direction of the driver is to be performed, and starts the operation shown in the flowchart of FIG. 13.
  • the headlight control device 1a repeats the operation shown in the flowchart of FIG. 13, for example, until the headlights 2 are turned off or the power of the vehicle 100 is turned off.
  • The control unit (not shown) of the headlight control device 1a acquires information indicating the state of the headlights 2 from a headlight switch mounted on the vehicle 100, and determines whether or not the headlights 2 are turned on.
  • When the control unit determines that the headlights 2 are in the on state, the control unit determines to start the lighting control of the headlights 2 based on the driver's direction, and outputs information instructing the start of the lighting control of the headlights 2 to the direction detection unit 11, the driving-related information acquisition unit 12a, the depth distance estimation unit 13a, the irradiation determining unit 14, and the headlight control unit 15.
  • When the control unit determines that the headlights 2 are in the off state or that the power of the vehicle 100 is turned off, the control unit determines to end the lighting control of the headlights 2 based on the driver's orientation, and outputs information instructing the end of the lighting control of the headlights 2 to the orientation detection unit 11, the travel-related information acquisition unit 12a, the depth distance estimation unit 13a, the irradiation determination unit 14, and the headlight control unit 15.
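The start/end determination described above amounts to a simple supervisory loop. The sketch below is an assumption about how such a loop could be wired, not the patent's implementation; the three callables are hypothetical stand-ins for the headlight switch state, the vehicle power state, and one cycle of steps ST1-1 to ST4.

```python
def lighting_control_loop(headlight_on, power_on, run_one_cycle):
    """Repeat orientation-based lighting control while the headlights are on
    and the vehicle power is on; end the control otherwise."""
    while headlight_on() and power_on():
        run_one_cycle()

# Simulated run: the headlight switch reads on for three polls, then off.
switch_states = iter([True, True, True, False])
cycles = []
lighting_control_loop(lambda: next(switch_states),
                      lambda: True,
                      lambda: cycles.append(1))
print(len(cycles))  # 3
```

Polling both the switch state and the vehicle power at the top of each cycle mirrors the two end conditions named in the text.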
  • The processing contents of step ST1-1, step ST1-2, and steps ST3 to ST4 are the same as those of steps ST1-1, ST1-2, and ST3 to ST4 of the operation of the headlight control device 1 shown in the flowchart of FIG. 5, which have already been explained in the first embodiment; therefore, duplicate explanations are omitted.
  • the map information acquisition unit 122 acquires map information from the travel-related information acquisition device 4 (step ST1-3).
  • the map information acquisition unit 122 outputs the acquired map information to the depth distance estimation unit 13a as travel-related information.
  • The depth distance estimating unit 13a estimates the depth distance using the orientation information regarding the driver's orientation detected by the orientation detecting unit 11 in step ST1-1, the vehicle information acquired by the vehicle information acquiring unit 121 in step ST1-2, the map information acquired by the map information acquisition unit 122 in step ST1-3, and the depth distance estimation information (step ST2a).
  • The irradiation determining unit 14 determines the range of light to be irradiated by the headlight 2 based on the depth distance estimated by the depth distance estimating unit 13a in step ST2a (step ST3).
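One iteration of the flowchart of FIG. 13 can be expressed as the following pipeline. Every callable here is a hypothetical placeholder for the corresponding unit (orientation detection, information acquisition, estimation, determination, control); the data shapes are illustrative assumptions.

```python
def one_control_cycle(detect_orientation, get_vehicle_info, get_map_info,
                      estimate_depth, determine_range, irradiate):
    """Steps ST1-1 through ST4 of FIG. 13 as a single pass."""
    orientation = detect_orientation()                           # ST1-1
    vehicle_info = get_vehicle_info()                            # ST1-2
    map_info = get_map_info()                                    # ST1-3
    depth = estimate_depth(orientation, vehicle_info, map_info)  # ST2a
    irradiation_range = determine_range(depth)                   # ST3
    irradiate(irradiation_range)                                 # ST4
    return irradiation_range

result = one_control_cycle(lambda: "front/right",
                           lambda: {"speed_kmh": 30},
                           lambda: {"intersection_ahead_m": 15},
                           lambda o, v, m: (20.0, 37.0),
                           lambda depth: {"depth_m": depth},
                           lambda rng: None)
print(result)  # {'depth_m': (20.0, 37.0)}
```

The only change from the first embodiment's flow is the extra map-information input feeding the estimation step (ST2a).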
  • the headlight control device 1a detects the direction of the driver based on the captured image inside the vehicle, and acquires travel-related information (here, vehicle information and map information).
  • The headlight control device 1a estimates the depth distance based on the detected orientation information regarding the direction of the driver and the acquired travel-related information, and determines the range of light irradiation by the headlights 2 based on the estimated depth distance.
  • the headlight control device 1a causes the headlight 2 to irradiate the determined irradiation range with light. Therefore, the headlight control device 1a can appropriately illuminate the estimated visible position of the driver, and can provide driving support when the vehicle 100 runs at night or the like.
  • Since the headlight control device 1a uses the map information to estimate the depth distance, it can estimate the depth distance even in situations where estimation using only the driver's orientation and the vehicle information is difficult, and can provide driving support when the vehicle 100 travels at night or the like.
  • the depth distance estimation unit 13a may estimate the depth distance and the ideal width of the irradiation range.
  • the depth distance estimation section 13a outputs depth distance information and information regarding the ideal width of the irradiation range to the irradiation determination section 14.
  • The irradiation determining unit 14 determines the irradiation range based on the depth distance estimated by the depth distance estimating unit 13a and the ideal width of the irradiation range. Specifically, when the irradiation range calculated based on the depth distance estimated by the depth distance estimation unit 13a exceeds the ideal width of the irradiation range, the irradiation determining unit 14 determines the range up to the ideal width of the irradiation range as the irradiation range.
  • Since the depth distance estimating unit 13a estimates the ideal width of the irradiation range and the irradiation determining unit 14 determines the irradiation range with the ideal width as the upper limit, the headlight control device 1a can reduce glare imparted to oncoming vehicles and the like.
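The width limitation described above reduces to a clamp against the ideal width. A minimal sketch, with arbitrary example values in degrees:

```python
def limit_to_ideal_width(computed_width_deg: float,
                         ideal_width_deg: float) -> float:
    """Use the irradiation width computed from the depth distance, but never
    wider than the ideal width, so glare to oncoming vehicles is reduced."""
    return min(computed_width_deg, ideal_width_deg)

print(limit_to_ideal_width(12.0, 8.0))  # 8.0 (clamped to the ideal width)
print(limit_to_ideal_width(5.0, 8.0))   # 5.0 (already within the limit)
```

The ideal width acts purely as an upper bound; narrower computed ranges pass through unchanged.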
  • the depth distance estimating unit 13a may estimate the ideal light amount by the headlights 2 as well as estimating the depth distance.
  • the depth distance estimation section 13a outputs depth distance information to the irradiation determination section 14, and also outputs information regarding the estimated ideal light amount to the headlight control section 15.
  • the headlight control unit 15 causes the headlight 2 to irradiate light in the irradiation range determined by the irradiation determination unit 14 at the ideal light amount estimated by the depth distance estimation unit 13a.
  • Since the depth distance estimating unit 13a estimates the ideal light amount and the headlight control unit 15 causes the headlights 2 to emit light at that light amount, the headlight control device 1a can irradiate light with the amount of light that is assumed to be necessary for the driver to visually recognize the estimated visible object, according to the distance from the vehicle 100 to the position irradiated with light in the direction in which the driver is facing.
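The text does not specify how the ideal light amount depends on distance. As one hypothetical model (an assumption, not the patent's method), the emitted amount can be scaled with the square of the depth distance so that the illuminance at the estimated visible position stays roughly constant:

```python
def ideal_light_amount(depth_m: float, base_amount: float = 1.0,
                       reference_depth_m: float = 10.0) -> float:
    """Inverse-square compensation: doubling the depth distance quadruples
    the required light amount (hypothetical model)."""
    return base_amount * (depth_m / reference_depth_m) ** 2

print(ideal_light_amount(20.0))  # 4.0
```

Any monotonically increasing distance-to-amount mapping would serve the same purpose of matching the light amount to the estimated visible position.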
  • the depth distance estimation unit 13a may estimate the depth distance and the ideal width and ideal light amount of the irradiation range.
  • In the second embodiment described above, the headlight control device 1a is an in-vehicle device mounted on the vehicle 100, and it is assumed that the direction detection unit 11, the driving-related information acquisition unit 12a, the depth distance estimation unit 13a, the irradiation determining unit 14, the headlight control unit 15, and the control unit (not shown) are included in the in-vehicle device.
  • The present invention is not limited to this; some of the direction detection unit 11, the driving-related information acquisition unit 12a, the depth distance estimation unit 13a, the irradiation determining unit 14, the headlight control unit 15, and the control unit (not shown) may be provided in an in-vehicle device of the vehicle 100, with the others provided in a server connected to the in-vehicle device via a network. Alternatively, the server may include all of the direction detection unit 11, the travel-related information acquisition unit 12a, the depth distance estimation unit 13a, the irradiation determining unit 14, the headlight control unit 15, and the control unit (not shown).
  • The hardware configuration of the headlight control device 1a according to the second embodiment is the same as the hardware configuration of the headlight control device 1 described using FIGS. 6A and 6B in the first embodiment, so its illustration is omitted.
  • In the second embodiment, the functions of the direction detection unit 11, the travel-related information acquisition unit 12a, the depth distance estimation unit 13a, the irradiation determining unit 14, the headlight control unit 15, and the control unit (not shown) are realized by a processing circuit 1001.
  • That is, the headlight control device 1a includes the processing circuit 1001 for estimating the depth distance based on the driving-related information and the orientation information regarding the direction of the driver detected based on the in-vehicle captured image acquired from the in-vehicle imaging device 3, and for controlling the lighting of the headlights 2 based on the estimated depth distance.
  • Processing circuit 1001 may be dedicated hardware as shown in FIG. 6A, or may be processor 1004 that executes a program stored in memory 1005 as shown in FIG. 6B.
  • The processing circuit 1001 reads out and executes the program stored in the memory 1005, thereby realizing the functions of the direction detection unit 11, the travel-related information acquisition unit 12a, the depth distance estimation unit 13a, the irradiation determining unit 14, the headlight control unit 15, and the control unit (not shown). That is, the headlight control device 1a includes the memory 1005 for storing a program which, when executed by the processing circuit 1001, results in the execution of steps ST1-1, ST1-2, and ST1-3 to ST4 of FIG. 13 described above. It can also be said that the program stored in the memory 1005 causes a computer to execute the procedures or methods of the direction detection unit 11, the travel-related information acquisition unit 12a, the depth distance estimation unit 13a, the irradiation determining unit 14, the headlight control unit 15, and the control unit (not shown).
  • the storage unit 16 includes, for example, a memory 1005.
  • the headlight control device 1a includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the headlight 2, the in-vehicle imaging device 3, or the driving-related information acquisition device 4.
  • As described above, the headlight control device 1a according to the second embodiment includes: the orientation detection unit 11 that detects the orientation of the driver based on the captured image of the driver of the vehicle 100 (the in-vehicle captured image); the driving-related information acquisition unit 12a that acquires driving-related information related to the driving of the vehicle 100; the depth distance estimation unit 13a that estimates the depth distance based on the orientation information regarding the driver's orientation detected by the orientation detection unit 11 and the driving-related information acquired by the driving-related information acquisition unit 12a; the irradiation determining unit 14 that determines the irradiation range based on the estimated depth distance; and the headlight control unit 15 that causes the headlight 2 to irradiate light onto the irradiation range determined by the irradiation determining unit 14. Therefore, in controlling the lighting of the headlights 2 of the vehicle 100 based on the direction in which the driver is facing, the headlight control device 1a can perform lighting control in consideration of how far ahead in that direction the driver is actually looking.
  • the headlight control device 1a can appropriately illuminate the estimated visible position of the driver, and can provide driving support when the vehicle 100 runs at night or the like.
  • Further, in the headlight control device 1a according to the second embodiment, the driving-related information acquisition unit 12a includes the vehicle information acquisition unit 121 that acquires vehicle information regarding the vehicle 100 as driving-related information and the map information acquisition unit 122 that acquires map information as driving-related information, and the depth distance estimation unit 13a estimates the depth distance based on the orientation information, the vehicle information, and the map information. Therefore, in controlling the lighting of the headlights 2 of the vehicle 100 based on the direction in which the driver is facing, the headlight control device 1a can perform lighting control in consideration of how far ahead in that direction the driver is actually looking.
  • the headlight control device 1a can appropriately illuminate the estimated visible position of the driver, and can provide driving support when the vehicle 100 runs at night or the like.
  • The headlight control devices according to the first and second embodiments estimate the depth distance based on the orientation information and the vehicle information. In the third embodiment, an embodiment will be described in which the depth distance is estimated based on the orientation information and information on the surroundings of the vehicle.
  • FIG. 14 is a diagram showing a configuration example of a headlight control device 1b according to the third embodiment.
  • the headlight control device 1b is mounted on the vehicle 100.
  • the headlight control device 1b controls the headlights 2 provided in the vehicle 100 based on the orientation of the driver of the vehicle 100.
  • the "driver's orientation" is expressed by the driver's face orientation or the driver's line of sight direction.
  • the "driver's orientation” includes not only the driver's face orientation or the driver's line of sight direction, but also the driver's body orientation, in other words, the driver's posture. good.
  • In the third embodiment, the lighting control of the headlights 2 based on the direction of the driver by the headlight control device 1b is assumed to be performed when the headlights 2 are turned on in a place where the surroundings of the vehicle 100 are dark, such as a parking lot at night or an urban area at night.
  • In the following description, the "driver's face direction" refers to the "driver's face direction or line-of-sight direction", which also includes the direction of the driver's line of sight.
  • The headlight control device 1b according to the third embodiment differs from the headlight control device 1 according to the first embodiment in that the driving-related information acquisition unit 12b includes the outside-vehicle information acquisition unit 123 instead of the vehicle information acquisition unit 121.
  • Also, the specific operation of the depth distance estimation unit 13b in the headlight control device 1b according to the third embodiment differs from that of the depth distance estimation unit 13 in the headlight control device 1 according to the first embodiment.
  • the driving-related information acquisition device 4 is assumed to be a device such as an external imaging device (not shown) or a radar (not shown).
  • The driving-related information acquisition device 4, such as the external imaging device or the radar, acquires information regarding objects around the vehicle 100 (hereinafter referred to as "outside-vehicle information") as driving-related information. The outside-vehicle information includes, for example, a captured image of the surroundings of the vehicle 100 (hereinafter referred to as the "outside-vehicle captured image") and information regarding the distance to objects existing around the vehicle 100 (hereinafter referred to as "distance information").
  • the external imaging device images at least the front of the vehicle 100.
  • the external imaging device may be configured to image not only the front of the vehicle 100 but also the side or rear of the vehicle 100.
  • the external imaging device outputs the captured external image to the headlight control device 1b as travel-related information. Note that although it is assumed here that one external imaging device is connected to the headlight control device 1b, this is only an example.
  • a plurality of external imaging devices may be mounted on the vehicle 100, and the plurality of external imaging devices may be connected to the headlight control device 1b.
  • the radar obtains at least the distance to an object that exists in front of the vehicle 100.
  • the radar may be configured to obtain the distance to an object not only in front of the vehicle 100 but also to the side or rear of the vehicle 100.
  • the radar outputs the acquired distance information regarding the distance to the object to the headlight control device 1b as travel-related information.
  • the distance information includes, for example, information indicating the presence of an object, the position of the object, the distance to the object, and the like. Note that although it is assumed here that one radar is connected to the headlight control device 1b, this is only an example.
  • a plurality of radars may be mounted on the vehicle 100, and the plurality of radars may be connected to the headlight control device 1b.
  • a plurality of radars of different types may be connected to the headlight control device 1b, such as a radar that obtains the distance to a nearby object and a radar that obtains the distance to a distant object.
  • the vehicle exterior information acquisition unit 123 acquires vehicle exterior information from the driving-related information acquisition device 4 . Specifically, the outside-vehicle information acquisition unit 123 acquires an outside-vehicle captured image from an outside-vehicle imaging device, and acquires distance information from a radar. The outside-vehicle information acquisition unit 123 outputs the acquired outside-vehicle information to the depth distance estimation unit 13b as travel-related information.
  • The depth distance estimating unit 13b estimates the depth distance based on the orientation information regarding the direction of the driver detected by the direction detecting unit 11 and the outside-vehicle information acquired by the outside-vehicle information acquisition unit 123. Specifically, the depth distance estimating unit 13b estimates the depth distance by comparing the orientation information and the driving-related information (here, the outside-vehicle information) with the depth distance estimation information.
  • FIG. 15 is a diagram showing an example of the contents of depth distance estimation information used by the depth distance estimation unit 13b to estimate the depth distance in the third embodiment.
  • the depth distance estimation information is, for example, a table in which the driver's behavior, external vehicle information, and depth distance are associated with each other.
  • the driver's behavior includes, for example, the vertical direction of the driver's face direction and the left/right direction of the driver's face direction.
  • the vertical direction of the driver's face is expressed as "front,”"above,” or “down.”
  • the left and right direction of the driver's face is expressed as "front", "right", or "left”.
  • the outside-vehicle information includes, for example, information regarding an object existing in front of the vehicle 100.
  • In the third embodiment, when the depth distance estimating unit 13b estimates the depth distance using the depth distance estimation information, it adjusts the estimated depth distance by referring to conditions for depth distance adjustment (described in detail later) based on the outside-vehicle information, and determines the adjusted depth distance as the final estimated depth distance.
  • a detailed description will be given of a flow in which the depth distance estimation unit 13b estimates a depth distance, adjusts the estimated depth distance, and finalizes the estimated depth distance.
  • The depth distance estimation unit 13b determines the driver's behavior based on the orientation information. For example, the depth distance estimating unit 13b determines, from the information indicating the driver's vertical face orientation included in the orientation information, whether the vertical direction of the driver's face is "front", "up", or "down", and determines, from the information indicating the driver's horizontal face orientation, whether the horizontal direction of the driver's face is "front", "right", or "left".
  • The method by which the depth distance estimating unit 13b determines whether the vertical and horizontal directions of the driver's face are "front", "up", "down", "right", or "left" is the same as the method by the depth distance estimating unit 13, which has already been explained in the first embodiment; therefore, duplicate explanation is omitted.
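A minimal sketch of this face-direction determination as angle thresholding; the 10-degree threshold and the angle sign convention are assumptions, since the first embodiment's actual criterion is not reproduced here.

```python
def classify(angle_deg: float, positive_label: str, negative_label: str,
             threshold_deg: float = 10.0) -> str:
    """Map a signed face angle to "front" or to one of two off-center
    labels (hypothetical threshold of 10 degrees)."""
    if abs(angle_deg) <= threshold_deg:
        return "front"
    return positive_label if angle_deg > 0 else negative_label

vertical = classify(15.0, "up", "down")        # pitch angle -> "up"
horizontal = classify(-25.0, "right", "left")  # yaw angle -> "left"
print(vertical, horizontal)  # up left
```

The same function serves both axes; only the label pair changes.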
  • The depth distance estimation unit 13b determines objects (for example, signs, pedestrians, white lines, vehicles, etc.) existing around the vehicle 100 based on the outside-vehicle information acquired by the outside-vehicle information acquisition unit 123. For example, the depth distance estimating unit 13b can determine objects existing around the vehicle 100 by performing image recognition processing using a known image recognition technique on the outside-vehicle captured image. In addition, since the installation position and angle of view of the external imaging device and the installation position and detection range of the radar are known in advance, the depth distance estimating unit 13b can associate objects on the outside-vehicle captured image with objects indicated by the distance information. By doing so, the depth distance estimating unit 13b can determine, based on the outside-vehicle captured image and the distance information, how far from the vehicle 100 an object imaged in the outside-vehicle captured image is located. Note that the depth distance estimating unit 13b may use various known methods to associate an object on the outside-vehicle captured image with an object indicated by the distance information, for example, a method of regarding objects that are close to each other as the same object based on the coordinates indicating the position of the object on the outside-vehicle captured image and the coordinates indicating the position of the object in the distance information.
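One common way to realize "regarding objects that are close to each other as the same object" is greedy nearest-neighbour matching in a shared coordinate frame. The function, its 1.5 m threshold, and the coordinates below are illustrative assumptions; the coordinate transform from image to vehicle frame (known camera/radar mounting) is assumed to have been applied already.

```python
import math

def associate_objects(camera_objs, radar_objs, max_dist_m=1.5):
    """Greedily pair each camera-detected object with the nearest unused
    radar-detected object within max_dist_m. Inputs are lists of (x, y)
    positions in a common vehicle coordinate frame; returns (camera_index,
    radar_index) pairs."""
    pairs = []
    used = set()
    for ci, (cx, cy) in enumerate(camera_objs):
        best, best_d = None, max_dist_m
        for ri, (rx, ry) in enumerate(radar_objs):
            if ri in used:
                continue
            d = math.hypot(cx - rx, cy - ry)
            if d < best_d:
                best, best_d = ri, d
        if best is not None:
            used.add(best)
            pairs.append((ci, best))
    return pairs

# A camera object at (0, 10) matches the radar object at (0.2, 10.1).
print(associate_objects([(0.0, 10.0)], [(0.2, 10.1), (5.0, 5.0)]))  # [(0, 0)]
```

Each matched pair then carries both the object class (from the image) and its distance (from the radar).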
  • The depth distance estimating unit 13b can also determine, based on the distance information, how far from the vehicle 100 an object detected by the radar is located.
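The association between objects on the outside-vehicle captured image and objects detected by the radar, described above, can be sketched as a nearest-neighbour match on position. The function name, the common vehicle coordinate frame, and the 1.5 m proximity threshold below are illustrative assumptions, not the embodiment's stated implementation:

```python
import math

def associate_detections(camera_objects, radar_objects, max_gap_m=1.5):
    """Pair each camera-detected object with the nearest radar detection.

    camera_objects: list of dicts like {"label": "pedestrian", "x": ..., "y": ...}
    radar_objects:  list of dicts like {"x": ..., "y": ..., "range_m": ...}
    Coordinates are assumed to be in a common vehicle frame (metres);
    max_gap_m is the "close enough to be the same object" threshold.
    """
    pairs = []
    for cam in camera_objects:
        best, best_d = None, float("inf")
        for rad in radar_objects:
            d = math.hypot(cam["x"] - rad["x"], cam["y"] - rad["y"])
            if d < best_d:
                best, best_d = rad, d
        if best is not None and best_d <= max_gap_m:
            pairs.append((cam, best))  # regarded as the same physical object
    return pairs
```

A matched pair then gives the recognized type (from the image) together with the measured distance (from the radar).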
  • The depth distance estimating unit 13b may store the outside-vehicle information output from the outside-vehicle information acquisition unit 123 in the storage unit 16 together with an acquisition date and time, and calculate the acceleration of a detected object from the past outside-vehicle information. If it determines that the calculated acceleration has changed significantly, in other words, that the object has moved implausibly rapidly, it may determine that the object is an erroneously detected object. The depth distance estimating unit 13b does not use the outside-vehicle information regarding an object determined to be an erroneously detected object for estimating the depth distance.
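The erroneous-detection check described above might look as follows; the finite-difference scheme and the 15 m/s² plausibility bound are assumptions for illustration only:

```python
def is_false_detection(track, accel_limit=15.0):
    """Flag a tracked object whose implied acceleration is implausibly large.

    track: chronological list of (timestamp_s, position_m) samples for one
    object, taken from the stored, timestamped outside-vehicle information.
    Needs at least three samples to form two velocity estimates.
    accel_limit is an assumed plausibility bound in m/s^2.
    """
    if len(track) < 3:
        return False
    # successive finite-difference velocities, timestamped at interval midpoints
    vels = []
    for (t0, p0), (t1, p1) in zip(track, track[1:]):
        vels.append(((p1 - p0) / (t1 - t0), (t0 + t1) / 2))
    # acceleration between successive velocity estimates
    for (v0, t0), (v1, t1) in zip(vels, vels[1:]):
        if abs((v1 - v0) / (t1 - t0)) > accel_limit:
            return True  # "moved rapidly": treat as an erroneous detection
    return False
```

Objects flagged this way would simply be excluded from the depth distance estimation.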
  • Alternatively, the outside-vehicle information acquisition unit 123 may associate the object on the outside-vehicle captured image with the object indicated by the distance information. In this case, for example, the outside-vehicle information acquisition unit 123 outputs the outside-vehicle information to the depth distance estimating unit 13b in a form that allows the associated objects to be identified. Further, the outside-vehicle information acquisition unit 123 may determine whether an object has been erroneously detected based on past outside-vehicle information, and in that case does not output outside-vehicle information regarding an object determined to be an erroneously detected object to the depth distance estimating unit 13b.
  • The depth distance estimating unit 13b estimates the depth distance by comparing the determined behavior and the information indicating objects existing around the vehicle 100 with the driver's behavior and the outside-vehicle information set in the depth distance estimation information as shown in FIG. 15, and obtaining the corresponding depth distance information.
  • For example, the depth distance estimating unit 13b determines that the depth distance is "5 to 15 m" from the orientation information regarding the driver's orientation, the travel-related information (here, the outside-vehicle information), and the depth distance estimation information (see No. 1 of the depth distance estimation information in FIG. 15). In this case, the vehicle 100 is about to park or stop, and the driver's estimated visible object is estimated to be a pedestrian (or a place where a pedestrian is likely to be present). Therefore, in the depth distance estimation information, on the assumption that the driver tries to check for a pedestrian (or a place where a pedestrian is likely to be present) when the vehicle 100 is about to park or stop, the depth distance is set to 5 to 15 m, the range of depths at which such an area is assumed to be located (No. 1 of the depth distance estimation information in FIG. 15). Similarly, for the entries of No. 2 onward, the depth distance is set based on the driving state of the vehicle 100 estimated from the driver's behavior and the outside-vehicle information, and on the driver's estimated visible object.
  • FIG. 16 shows examples of the situations corresponding to No. 1 and the subsequent entries set in the depth distance estimation information shown in FIG. 15.
  • For example, information indicating an object existing behind the vehicle and information indicating the type of the adjacent lane (passing lane or oncoming lane) may be set as the outside-vehicle information. For example, if the object behind the vehicle is a vehicle and the type of the adjacent lane is a passing lane, the depth distance is set on the assumption that the vehicle 100 is traveling on an expressway and the driver's estimated visible object is a sign.
  • Further, information indicating an object existing on the side of the vehicle and information indicating the height of the object may be set as the outside-vehicle information. In that case, the corresponding entry may assume that the vehicle 100 is traveling through an intersection with poor visibility, and the depth distance is set on the assumption that the driver's estimated visible object is a pedestrian.
  • the depth distance estimation information only needs to be information that allows depth distance information to be obtained from orientation information and travel-related information (here, information outside the vehicle).
  • If the input information determined from the orientation information regarding the driver's orientation detected by the orientation detection unit 11 and the travel-related information acquired by the travel-related information acquisition unit 12b (here, the outside-vehicle information acquired by the outside-vehicle information acquisition unit 123) does not match any of the input information set in the depth distance estimation information, the depth distance estimating unit 13b regards the depth distance as not having been estimated from the depth distance estimation information, and sets an initial value of the depth distance as the depth distance.
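The lookup against the depth distance estimation information, including the fall-back to an initial value when no entry matches, can be sketched as follows. The table entry, the key fields, and the 10 m initial value are hypothetical, loosely modelled on No. 1 of FIG. 15:

```python
DEFAULT_DEPTH_M = (10.0, 10.0)  # assumed initial value of the depth distance

# Hypothetical entry loosely modelled on No. 1 of FIG. 15: a driver checking
# for pedestrians while parking or stopping is assumed to look 5-15 m ahead.
DEPTH_ESTIMATION_INFO = [
    {"no": 1, "behavior": ("front", "front"), "outside": "parking_or_stopping",
     "depth_m": (5.0, 15.0)},
]

def estimate_depth(face_updown, face_leftright, outside_situation):
    """Return a (min, max) depth distance by matching the estimation table."""
    for entry in DEPTH_ESTIMATION_INFO:
        if (entry["behavior"] == (face_updown, face_leftright)
                and entry["outside"] == outside_situation):
            return entry["depth_m"]
    # no matching input information: fall back to the initial value
    return DEFAULT_DEPTH_M
```

A point estimate (a depth distance "without a width") would simply be stored as an equal pair such as `(10.0, 10.0)`.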
  • The depth distance estimating unit 13b then adjusts the estimated depth distance based on the outside-vehicle information, and finalizes the estimated depth distance. Note that the depth distance estimating unit 13b does not perform this adjustment when the depth distance estimated based on the depth distance estimation information does not have a width. Specifically, the depth distance estimating unit 13b adjusts the depth distance based on the estimated depth distance and the outside-vehicle information, with reference to depth distance adjustment conditions for adjusting the depth distance.
  • the depth distance adjustment conditions are generated in advance by an administrator or the like and stored in a location that can be referenced by the depth distance estimating unit 13b.
  • conditions such as (condition 1) to (condition 4) below are set as conditions for depth distance adjustment.
  • For example, if there is a moving object ahead of the vehicle in the direction the driver is facing and the distance from the headlight 2 to the moving object is within the range of the depth distance estimated based on the depth distance estimation information, that distance to the moving object is used as the depth distance; if the distance to the moving object is outside the range, the depth distance estimated based on the depth distance estimation information is used as the depth distance. Additionally, if there is no moving object ahead of the vehicle in the direction the driver is facing but there is a stationary object around the vehicle, and the distance from the headlight 2 to the stationary object is outside the range of the depth distance estimated based on the depth distance estimation information, a distance obtained by adding an addition distance to the distance to the stationary object is used as the depth distance; otherwise, the depth distance estimated based on the depth distance estimation information is used as the depth distance.
  • the moving object includes a person.
  • the addition distance is set in advance by an administrator or the like, and is stored in a location that can be referenced by the depth distance estimation unit 13b. For example, a distance between 2 and 5 meters is set as the addition distance.
  • the addition distance may be set to a value with a width.
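One plausible reading of the adjustment conditions, consistent with the worked examples of FIGS. 17 to 19 (a moving object within the estimated range fixes the depth distance to that object; a moving object outside it leaves the estimate unchanged; a stationary object outside the range receives the addition distance), can be sketched as follows. The 3 m addition distance is an assumed value within the stated 2 to 5 m range:

```python
def adjust_depth(est_range_m, moving_dist_m=None, stationary_dist_m=None,
                 addition_m=3.0):
    """Adjust an estimated depth range (lo, hi) against detected objects.

    moving_dist_m / stationary_dist_m: distance from the headlight to a moving
    or stationary object in the driver's facing direction (None if absent).
    Returns a (lo, hi) tuple; a point estimate is returned as (d, d).
    """
    lo, hi = est_range_m
    if lo == hi:                            # no width: no adjustment performed
        return est_range_m
    if moving_dist_m is not None:
        if lo <= moving_dist_m <= hi:       # e.g. pedestrian at 12 m in 5-15 m
            return (moving_dist_m, moving_dist_m)
        return est_range_m                  # e.g. pedestrian beyond 15 m
    if stationary_dist_m is not None and not lo <= stationary_dist_m <= hi:
        d = stationary_dist_m + addition_m  # e.g. stopped vehicle at 4 m -> 7 m
        return (d, d)
    return est_range_m
```

The three branches reproduce the outcomes of the FIG. 17 (12 m), FIG. 18 (range kept), and FIG. 19 (7 m) examples described later in this section.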
  • The depth distance estimating unit 13b can determine, based on the outside-vehicle information acquired by the outside-vehicle information acquisition unit 123, whether or not there is an object ahead of the vehicle 100 in the direction the driver is facing, and can determine its type, here whether it is a moving object or a stationary object. Since the installation position and detection range of the radar are known in advance, the depth distance estimating unit 13b can determine whether or not there is an object in the direction the driver is facing. Note that the depth distance estimating unit 13b can determine the direction the driver is facing from the orientation information. Further, as described above, the depth distance estimating unit 13b is capable of associating an object on the outside-vehicle captured image with an object indicated by the distance information.
  • When determining that there is an object in the direction the driver is facing, the depth distance estimating unit 13b can determine the type of the object, in other words, whether it is a moving object or a stationary object, by, for example, performing image recognition processing using a known image recognition technique on the outside-vehicle captured image.
  • the depth distance estimation unit 13b may determine the type of object using distance information, for example.
  • the distance information includes, for example, information regarding the speed of the object.
  • Further, since the positional relationship between the radar and the headlight 2 is known in advance, the depth distance estimating unit 13b can determine, based on that positional relationship and the distance information, how far an object detected by the radar is from the headlight 2.
  • Note that the depth distance estimating unit 13b may treat, as "the direction in which the driver is facing" for the purpose of determining whether or not an object exists, a range expanded vertically and horizontally by a preset angle around the driver's face orientation.
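Treating "the direction in which the driver is facing" as a cone widened by a preset angle around the face orientation might be implemented as below; the 5-degree margin and the vehicle-frame coordinates are assumptions for illustration:

```python
import math

def bearing_deg(x_lateral_m, y_forward_m):
    """Horizontal bearing of an object position in an assumed vehicle frame."""
    return math.degrees(math.atan2(x_lateral_m, y_forward_m))

def in_facing_direction(face_yaw_deg, face_pitch_deg,
                        obj_yaw_deg, obj_pitch_deg, margin_deg=5.0):
    """True if an object's bearing lies within the driver's facing direction,
    expanded vertically and horizontally by a preset margin (assumed 5 deg)."""
    return (abs(obj_yaw_deg - face_yaw_deg) <= margin_deg
            and abs(obj_pitch_deg - face_pitch_deg) <= margin_deg)
```

Only objects passing this check would be considered when applying the depth distance adjustment conditions.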
  • the conditions for adjusting the depth distance as described above are only an example, and the conditions for adjusting the depth distance can be set as appropriate.
  • The depth distance adjustment conditions may be, for example, in a table format in which information indicating the type of the detected object, the distance from the headlight 2 to the object, and the difference from the depth distance estimated based on the depth distance estimation information is associated with information indicating the adjusted depth distance.
  • After adjusting the depth distance, the depth distance estimating unit 13b sets the adjusted depth distance as the estimated depth distance, and outputs the depth distance information to the irradiation determining unit 14.
  • If the driver's behavior and the outside-vehicle information determined based on the orientation information regarding the driver's orientation detected by the orientation detection unit 11 and the travel-related information acquired by the travel-related information acquisition unit 12b (here, the outside-vehicle information acquired by the outside-vehicle information acquisition unit 123) do not match the driver's behavior and the outside-vehicle information set as input information in the depth distance estimation information, the depth distance estimating unit 13b regards the depth distance as not having been estimated and sets the initial value of the depth distance as the depth distance. The depth distance estimating unit 13b outputs the depth distance information regarding the depth distance set to the initial value to the irradiation determining unit 14.
  • the irradiation determining unit 14 determines the range of light irradiated by the headlight 2 based on the depth distance estimated by the depth distance estimating unit 13b, and outputs irradiation information to the headlight control unit 15.
  • the headlight control section 15 causes the headlight 2 to irradiate the irradiation range determined by the irradiation determination section 14 with light.
  • FIGS. 17, 18, and 19 are diagrams for explaining, in the third embodiment, examples in which the irradiation determining unit 14 determines the irradiation range based on the depth distance estimated by the depth distance estimating unit 13b, and the headlight control unit 15 causes the headlight 2 to irradiate light onto the irradiation range determined by the irradiation determining unit 14.
  • the depth distance estimation information used by the depth distance estimation unit 13b to estimate the depth distance has contents as shown in FIG.
  • the depth distance estimating unit 13b is assumed to adjust the depth distance according to the depth distance adjustment conditions (condition 1) to (condition 4) described above.
  • FIGS. 17, 18, and 19 are views of the road on which the vehicle 100 is traveling as seen from the side. In FIGS. 17, 18, and 19, the driver is indicated by "D", and the range of light irradiated by the headlight 2 is indicated by "LA".
  • In FIG. 17, it is assumed that the vehicle 100 is traveling in a situation where the driver's face orientation is within the "front" range in the vertical direction, there are a plurality of parked vehicles (indicated by "C" in FIG. 17) in front of the vehicle 100, and there is a pedestrian (indicated by "W" in FIG. 17) in the direction the driver is facing. It is assumed that the pedestrian exists at a distance of 12 m from the headlight 2 in the direction the driver is facing.
  • In this case, based on the depth distance estimation information, the depth distance estimating unit 13b determines that the driver's face orientation and the travel-related information (here, the outside-vehicle information) match No. 1 of the depth distance estimation information, and estimates the depth distance to be 5 to 15 m.
  • The depth distance estimating unit 13b then adjusts the estimated depth distance of "5 to 15 m". Here, since the pedestrian, a moving object, exists in the direction the driver is facing at a distance of 12 m from the headlight 2, which is within the estimated range, the depth distance estimating unit 13b adjusts the depth distance to "12 m" and sets it as the estimated depth distance.
  • the irradiation determining unit 14 determines the irradiation range based on the depth distance “12 m” estimated by the depth distance estimating unit 13b.
  • Here, it is assumed that the irradiation determining unit 14 has determined, as the irradiation range, a range of θ7 degrees to θ8 degrees in the horizontal direction and a range of φ7 degrees to φ8 degrees in the vertical direction (in FIG. 17, the irradiation range in the horizontal direction is not shown). Note that, for the vertical direction of the irradiation range, the irradiation determining unit 14 calculates the depth distance vertical angle based on the depth distance "12 m", and takes, as the range of angles in the vertical direction of the irradiation range, the range of angles obtained by widening the depth distance vertical angle by a preset angle in the vertical direction.
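One way the depth distance vertical angle might be computed is as the depression angle from the headlight to the road surface at the depth distance, widened up and down by the preset angle. The 0.7 m mounting height and the 2-degree widening below are assumptions, since the embodiment does not state them:

```python
import math

def vertical_range_deg(depth_m, lamp_height_m=0.7, widen_deg=2.0):
    """Vertical angular range of the irradiation for one depth distance.

    The depth distance vertical angle is taken as the depression angle from
    the headlight (assumed mounted 0.7 m above the road) to the road surface
    at depth_m; the range is widened up and down by widen_deg (assumption).
    Angles are in degrees; negative means below horizontal.
    """
    depression = -math.degrees(math.atan2(lamp_height_m, depth_m))
    return (depression - widen_deg, depression + widen_deg)
```

For a range estimate such as "5 to 15 m", the vertical range would instead span from the angle for the near end to the angle for the far end, as described for FIG. 18 below.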
  • The headlight control unit 15 causes the headlight 2 to irradiate light onto the irradiation range determined by the irradiation determining unit 14 based on the depth distance, which the depth distance estimating unit 13b estimated as the distance in the direction the driver is facing with respect to the headlight 2 in consideration of the driver's orientation and the travel-related information (here, the outside-vehicle information). As a result, the headlight control unit 15 can control the headlight 2 so that light is irradiated onto the pedestrian in the direction the driver is facing, and the driver can visually recognize the pedestrian.
  • the depth distance estimated using the depth distance estimation information is not estimated in consideration of objects actually existing in the direction in which the driver is facing.
  • By the depth distance estimating unit 13b adjusting the depth distance based on a moving object actually existing in the direction the driver is facing, light comes to be irradiated onto the moving object (here, the pedestrian) that has a high probability of being the actual estimated visible object.
  • In the example of FIG. 18, the depth distance estimating unit 13b likewise estimates the depth distance to be "5 to 15 m" based on the depth distance estimation information, and here the estimated depth distance is used as the estimated depth distance as it is.
  • the irradiation determining unit 14 determines the irradiation range based on the depth distance “5 to 15 m” estimated by the depth distance estimating unit 13b.
  • Here, it is assumed that the irradiation determining unit 14 has determined, as the irradiation range, a range of θ9 degrees to θ10 degrees in the horizontal direction and a range of φ9 degrees to φ10 degrees in the vertical direction (in FIG. 18, the irradiation range in the horizontal direction is not shown).
  • Note that, for the vertical direction of the irradiation range, the irradiation determining unit 14 calculates a first depth distance vertical angle based on the depth distance "5 m" and a second depth distance vertical angle based on the depth distance "15 m", and determines the range in the vertical direction of the irradiation range from these two angles.
  • The headlight control unit 15 causes the headlight 2 to irradiate light onto the irradiation range determined by the irradiation determining unit 14 based on the depth distance, which the depth distance estimating unit 13b estimated as the distance in the direction the driver is facing with respect to the headlight 2 in consideration of the driver's orientation and the travel-related information (here, the outside-vehicle information). As a result, the headlight control unit 15 controls the headlight 2 so that light is irradiated onto the driver's estimated visible object.
  • In FIG. 18, the position where the pedestrian actually exists is not included in the light irradiation range of the headlight 2; that is, the pedestrian is not irradiated with the light of the headlight 2.
  • This is because it is estimated that a pedestrian located farther away than the depth distance estimated using the depth distance estimation information is not the estimated visible object, and that the estimated visible object exists at a distance closer than the pedestrian.
  • That is, even when the depth distance estimating unit 13b adjusts the depth distance, if a pedestrian actually existing in the direction the driver is facing is located farther away than the depth distance estimated using the depth distance estimation information, the depth distance is not extended, and the depth distance estimated using the depth distance estimation information is left unchanged.
  • In this way, the headlight control device 1b can control the headlight 2 so that light is not irradiated onto a pedestrian who, even though actually present, is unlikely to be the estimated visible object, and light is irradiated onto the range where the estimated visible object is estimated to be more likely to exist.
  • In FIG. 19, it is assumed that the vehicle 100 is traveling in a situation where the driver's face orientation is within the "front" range in the vertical direction and there are parked vehicles (indicated by "C" in FIG. 19) in front of the vehicle 100. For simplicity of explanation, only one stopped vehicle is shown in FIG. 19.
  • There are no moving objects such as pedestrians around the vehicle 100.
  • It is assumed that the distance from the headlight 2 to the stopped vehicle is "4 m". In this case, there is no moving object in the direction the driver is facing, but there is a stopped vehicle, and the distance from the headlight 2 to the stopped vehicle is outside the range of 5 to 15 m estimated based on the depth distance estimation information.
  • In this case, the depth distance estimating unit 13b adjusts the depth distance and sets "7 m", obtained by adding the addition distance (here, 3 m) to the distance to the stopped vehicle, as the estimated depth distance.
  • The irradiation determining unit 14 determines the irradiation range based on the depth distance "7 m" estimated by the depth distance estimating unit 13b. Here, it is assumed that the irradiation determining unit 14 has determined, as the irradiation range, a range of θ11 degrees to θ12 degrees in the horizontal direction and a range of φ11 degrees to φ12 degrees in the vertical direction (in FIG. 19, the irradiation range in the horizontal direction is not shown).
  • Note that, for the vertical direction of the irradiation range, the irradiation determining unit 14 calculates the depth distance vertical angle based on the depth distance "7 m", and takes, as the range of angles in the vertical direction of the irradiation range, the range of angles obtained by widening the depth distance vertical angle by a preset angle in the vertical direction.
  • The headlight control unit 15 causes the headlight 2 to irradiate light onto the irradiation range determined by the irradiation determining unit 14 based on the depth distance, which the depth distance estimating unit 13b estimated as the distance in the direction the driver is facing with respect to the headlight 2 in consideration of the driver's orientation and the travel-related information (here, the outside-vehicle information). As a result, the headlight control unit 15 controls the headlight 2 so that light is irradiated onto the driver's estimated visible object.
  • In FIG. 19, the light of the headlight 2 is irradiated onto a range behind the stopped vehicle as viewed from the vehicle 100.
  • By the depth distance estimating unit 13b adjusting the depth distance, when the direction the driver is facing is a situation where something is likely to dart out (for example, from behind a stopped vehicle), the depth distance is adjusted to the depth distance of the place likely to produce such a dart-out, as the driver's estimated visible position. Thereby, the headlight control device 1b can control the headlight 2 so that light is irradiated onto the range where the estimated visible object is estimated to be more likely to exist.
  • the headlight control device 1 does not estimate the depth distance in consideration of objects actually existing around the vehicle 100.
  • In contrast, the headlight control device 1b estimates the depth distance in consideration of objects actually existing around the vehicle 100; more specifically, it determines the depth distance by adjusting the depth distance estimated based on the depth distance estimation information on the basis of the outside-vehicle information.
  • the headlight control device 1b estimates a depth distance that is more in line with the actual situation, based on whether there is an object that actually exists in the direction that the driver is facing. Thereby, the headlight control device 1b can more appropriately illuminate the estimated visible position of the driver, and can provide driving support when the vehicle 100 runs at night or the like.
  • FIG. 20 is a flowchart for explaining the operation of the headlight control device 1b according to the third embodiment.
  • For example, when the headlights 2 are turned on, the headlight control device 1b determines that the lighting control of the headlights 2 based on the driver's orientation is to be performed, and starts the operation shown in the flowchart of FIG. 20.
  • the headlight control device 1b repeats the operation shown in the flowchart of FIG. 20, for example, until the headlights 2 are turned off or the power of the vehicle 100 is turned off.
  • The control unit (not shown) of the headlight control device 1b acquires information indicating the state of the headlights 2 from the headlight switch mounted on the vehicle 100, and determines whether or not the headlights 2 are in the on state.
  • If the control unit determines that the headlights 2 are in the on state, it determines to start the lighting control of the headlights 2 based on the driver's orientation, and outputs information instructing the start of the lighting control of the headlights 2 to the orientation detection unit 11, the travel-related information acquisition unit 12b, the depth distance estimating unit 13b, the irradiation determining unit 14, and the headlight control unit 15.
  • If the control unit determines that the headlights 2 are in the off state or that the power of the vehicle 100 has been turned off, it determines to end the lighting control of the headlights 2 based on the driver's orientation, and outputs information instructing the end of the lighting control of the headlights 2 to the orientation detection unit 11, the travel-related information acquisition unit 12b, the depth distance estimating unit 13b, the irradiation determining unit 14, and the headlight control unit 15.
  • The processing contents of step ST1-1 and steps ST3 to ST4 are the same as those of step ST1-1 and steps ST3 to ST4 of the headlight control device 1 shown in the flowchart of FIG. 5, which have already been explained in the first embodiment, so duplicate explanation is omitted.
  • the vehicle exterior information acquisition unit 123 acquires vehicle exterior information from the driving-related information acquisition device 4 (step ST1-4). Specifically, the outside-vehicle information acquisition unit 123 acquires an outside-vehicle captured image from an outside-vehicle imaging device, and acquires distance information from a radar. The outside-vehicle information acquisition unit 123 outputs the acquired outside-vehicle information to the depth distance estimation unit 13b as travel-related information.
  • The depth distance estimating unit 13b estimates the depth distance using the orientation information regarding the driver's orientation detected by the orientation detection unit 11 in step ST1-1, the outside-vehicle information acquired by the outside-vehicle information acquisition unit 123 in step ST1-4, and the depth distance estimation information (step ST2b). Specifically, when the depth distance estimating unit 13b has estimated the depth distance using the depth distance estimation information, it adjusts the estimated depth distance based on the outside-vehicle information with reference to the depth distance adjustment conditions, and determines the estimated depth distance.
  • the depth distance estimation section 13b outputs depth distance information to the irradiation determination section 14.
  • The irradiation determining unit 14 determines the range of light irradiated by the headlight 2 based on the depth distance estimated by the depth distance estimating unit 13b in step ST2b (step ST3).
  • As described above, the headlight control device 1b detects the orientation of the driver based on the in-vehicle captured image, and acquires the travel-related information (here, the outside-vehicle information). The headlight control device 1b estimates the depth distance based on the detected orientation information regarding the driver's orientation and the acquired travel-related information. At this time, the headlight control device 1b adjusts the depth distance estimated using the depth distance estimation information based on the outside-vehicle information and the depth distance adjustment conditions, and sets the adjusted depth distance as the estimated depth distance. Then, the headlight control device 1b determines the irradiation range of light by the headlight 2 based on the estimated depth distance, and causes the headlight 2 to irradiate the determined irradiation range with light. Therefore, the headlight control device 1b can illuminate the driver's estimated visible position more appropriately, and can provide driving support when the vehicle 100 travels at night or the like.
  • the depth distance estimation unit 13b may estimate the depth distance and also estimate the ideal width of the irradiation range.
  • the depth distance estimation section 13b outputs depth distance information and information regarding the ideal width of the irradiation range to the irradiation determination section 14.
  • the irradiation determining unit 14 determines the irradiation range based on the depth distance estimated by the depth distance estimating unit 13b and the ideal width of the irradiation range.
  • Specifically, when the irradiation range determined based on the depth distance would exceed the ideal width, the irradiation determining unit 14 determines the irradiation range only up to the ideal width of the irradiation range.
  • Since the depth distance estimating unit 13b estimates the ideal width of the irradiation range and the irradiation determining unit 14 determines the irradiation range with the ideal width as the upper limit, the headlight control device 1b can reduce the glare given to oncoming vehicles and the like.
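Determining the irradiation range with the ideal width as an upper limit could be sketched as a symmetric clamp of the angular range; this is an illustrative reading, not the embodiment's stated formula:

```python
def clamp_to_ideal(range_deg, ideal_width_deg):
    """Limit an angular irradiation range (lo, hi) so that its width does not
    exceed the ideal width, shrinking symmetrically about its centre."""
    lo, hi = range_deg
    width = hi - lo
    if width <= ideal_width_deg:
        return range_deg          # already within the ideal width: keep as-is
    centre = (lo + hi) / 2.0
    half = ideal_width_deg / 2.0
    return (centre - half, centre + half)
```

The same clamp would apply independently to the horizontal and vertical ranges.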
  • the depth distance estimating unit 13b may estimate the ideal amount of light from the headlights 2 as well as estimating the depth distance.
  • the depth distance estimation section 13b outputs depth distance information to the irradiation determination section 14, and also outputs information regarding the estimated ideal light amount to the headlight control section 15.
  • the headlight control unit 15 causes the headlight 2 to irradiate light at the ideal light amount estimated by the depth distance estimation unit 13b in the irradiation range determined by the irradiation determination unit 14.
  • Since the depth distance estimating unit 13b estimates the ideal light amount and the headlight control unit 15 causes the headlight 2 to emit light at the ideal light amount, the headlight control device 1b can irradiate light at the light amount assumed to be necessary for the driver to visually recognize the estimated visible object, depending on the distance from the vehicle 100 to the position irradiated with the light.
  • the depth distance estimation unit 13b may estimate the depth distance and the ideal width and ideal light amount of the irradiation range.
  • In the third embodiment described above, the headlight control device 1b is an in-vehicle device mounted on the vehicle 100, and the orientation detection unit 11, the travel-related information acquisition unit 12b, the depth distance estimating unit 13b, the irradiation determining unit 14, the headlight control unit 15, and the control unit (not shown) are assumed to be included in the in-vehicle device.
  • The configuration is not limited to this; some of the orientation detection unit 11, the travel-related information acquisition unit 12b, the depth distance estimating unit 13b, the irradiation determining unit 14, the headlight control unit 15, and the control unit (not shown) may be provided in the in-vehicle device of the vehicle 100, and the others may be provided in a server connected to the in-vehicle device via a network. Alternatively, the server may include all of the orientation detection unit 11, the travel-related information acquisition unit 12b, the depth distance estimating unit 13b, the irradiation determining unit 14, the headlight control unit 15, and the control unit (not shown).
  • The hardware configuration of the headlight control device 1b according to the third embodiment is the same as the hardware configuration of the headlight control device 1 described using FIGS. 6A and 6B in the first embodiment, so illustration thereof is omitted.
  • In the third embodiment, the functions of the orientation detection unit 11, the travel-related information acquisition unit 12b, the depth distance estimating unit 13b, the irradiation determining unit 14, the headlight control unit 15, and the control unit (not shown) are realized by a processing circuit 1001.
  • That is, the headlight control device 1b includes the processing circuit 1001 for estimating the depth distance based on the orientation information regarding the driver's orientation detected based on the in-vehicle captured image acquired from the in-vehicle imaging device 3 and on the travel-related information, and for controlling the lighting of the headlights 2 based on the estimated depth distance.
  • Processing circuit 1001 may be dedicated hardware as shown in FIG. 6A, or may be processor 1004 that executes a program stored in memory 1005 as shown in FIG. 6B.
  • The processing circuit 1001 reads out and executes a program stored in the memory 1005, thereby realizing the functions of the orientation detection unit 11, the travel-related information acquisition unit 12b, the depth distance estimating unit 13b, the irradiation determining unit 14, the headlight control unit 15, and the control unit (not shown). That is, the headlight control device 1b includes the memory 1005 for storing a program that, when executed by the processing circuit 1001, results in steps ST1-1 and ST1-4 to ST4 in FIG. 20 described above being executed. Further, the program stored in the memory 1005 can also be said to cause a computer to execute the procedures or methods of the orientation detection unit 11, the travel-related information acquisition unit 12b, the depth distance estimating unit 13b, the irradiation determining unit 14, the headlight control unit 15, and the control unit (not shown).
  • the storage unit 16 includes, for example, a memory 1005.
  • the headlight control device 1b includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the headlight 2, the in-vehicle imaging device 3, or the driving-related information acquisition device 4.
• as described above, the headlight control device 1b according to Embodiment 3 includes the orientation detection unit 11 that detects the driver's orientation based on a captured image of the driver of the vehicle 100 (in-vehicle captured image), the driving-related information acquisition unit 12b that acquires driving-related information related to the driving of the vehicle 100, the depth distance estimating unit 13b that estimates the depth distance based on the orientation information regarding the driver's orientation detected by the orientation detection unit 11 and the driving-related information acquired by the driving-related information acquisition unit 12b, the irradiation determining unit 14 that determines the range of light to be irradiated by the headlight 2 based on the estimated depth distance, and the headlight control unit 15 that causes the headlight 2 to irradiate the irradiation range determined by the irradiation determining unit 14 with light. Therefore, in controlling the lighting of the headlights 2 of the vehicle 100 based on the direction the driver is facing, the headlight control device 1b can perform lighting control in consideration of how far ahead the driver is actually looking in that direction.
  • the headlight control device 1b can more appropriately illuminate the estimated visible position of the driver, and can provide driving support when the vehicle 100 travels at night or the like.
• further, in the headlight control device 1b according to Embodiment 3, the driving-related information acquisition unit 12b includes the vehicle exterior information acquisition unit 123 that acquires, as driving-related information, vehicle exterior information regarding the area ahead of the vehicle 100, and the depth distance estimating unit 13b estimates the depth distance based on the orientation information and the vehicle exterior information. Therefore, in controlling the lighting of the headlights 2 of the vehicle 100 based on the direction the driver is facing, the headlight control device 1b can perform lighting control in consideration of how far ahead the driver is actually looking in that direction. The headlight control device 1b can more appropriately illuminate the driver's estimated visual recognition position, and can provide driving support when the vehicle 100 travels at night or the like.
• Embodiment 4 In Embodiment 2, the headlight control device according to Embodiment 1 acquires map information in addition to vehicle information, and estimates the depth distance based on the orientation information, the vehicle information, and the map information.
• in Embodiment 3, the headlight control device according to Embodiment 1 acquires vehicle exterior information instead of vehicle information, and estimates the depth distance based on the orientation information and the vehicle exterior information.
• in Embodiment 4, an embodiment that combines Embodiment 2 and Embodiment 3 will be described.
  • FIG. 21 is a diagram showing a configuration example of a headlight control device 1c according to the fourth embodiment.
  • the headlight control device 1c is mounted on the vehicle 100.
• in FIG. 21, configurations similar to those of the headlight control devices 1, 1a, and 1b described with reference to FIG. 1, FIG. 7, and FIG. 14 in Embodiment 1, Embodiment 2, and Embodiment 3, respectively, are given the same reference numerals, and redundant description is omitted.
  • the driving-related information acquisition section 12c includes a vehicle information acquisition section 121, a map information acquisition section 122, and an external information acquisition section 123.
• the depth distance estimating unit 13c estimates the depth distance based on the orientation information regarding the driver's orientation detected by the orientation detection unit 11, the vehicle information acquired by the vehicle information acquisition unit 121, the map information acquired by the map information acquisition unit 122, and the vehicle exterior information acquired by the vehicle exterior information acquisition unit 123.
• specifically, the depth distance estimating unit 13c estimates the depth distance by comparing the orientation information and the driving-related information (vehicle information, map information, and vehicle exterior information) with the depth distance estimation information.
  • FIG. 22 is a diagram illustrating an example of the contents of depth distance estimation information used by the depth distance estimation unit 13c to estimate the depth distance in the fourth embodiment.
• in Embodiment 4, the depth distance estimation information is, for example, a table in which driver behavior, vehicle information, map information, vehicle exterior information, and depth distance are associated with each other.
• the depth distance estimating unit 13c adjusts the estimated depth distance by referring to the depth distance adjustment conditions based on the vehicle exterior information.
  • the depth distance estimation unit 13c determines the adjusted depth distance as the estimated depth distance.
  • the depth distance estimation section 13c outputs depth distance information to the irradiation determination section 14.
• FIG. 23 is a diagram showing, for the depth distances corresponding to the input information of conditions No. 1 to No. 7 set in the depth distance estimation information shown in FIG. 22, examples of the driving situation of the vehicle 100 and the object presumed to be viewed by the driver, estimated from the driver's behavior, vehicle information, map information, and vehicle exterior information on which the derivation of each depth distance is based.
• the depth distance estimating unit 13c compares the driver's behavior, vehicle information, map information, and vehicle exterior information, determined based on the orientation information regarding the driver's orientation detected by the orientation detection unit 11 and the driving-related information acquired by the driving-related information acquisition unit 12c (here, the vehicle information acquired by the vehicle information acquisition unit 121, the map information acquired by the map information acquisition unit 122, and the vehicle exterior information acquired by the vehicle exterior information acquisition unit 123), with the input information set in the depth distance estimation information. If they do not match the driver's behavior, vehicle information, map information, and vehicle exterior information set as input information, for example, the depth distance cannot be estimated from the depth distance estimation information, so the depth distance estimating unit 13c sets an initial value for the depth distance. The depth distance estimating unit 13c then outputs depth distance information regarding the depth distance for which the initial value has been set to the irradiation determining unit 14.
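The table lookup with an initial-value fallback described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the table entries, field names, and the initial value of "0 to 20 m" are illustrative assumptions, not values taken from FIG. 22.

```python
# Hedged sketch of depth-distance estimation by table lookup (Embodiment 4 style).
# All entries and the fallback range are illustrative assumptions.
DEFAULT_DEPTH_RANGE = (0.0, 20.0)  # initial value used when no entry matches

# Each key is the input information (driver behavior, vehicle info, map info,
# vehicle exterior info); each value is an estimated depth-distance range in metres.
DEPTH_ESTIMATION_TABLE = {
    ("facing_left", "turn_signal_left", "intersection_ahead", "pedestrian_detected"): (5.0, 20.0),
    ("facing_front", "high_speed", "straight_road", "no_object"): (40.0, 60.0),
}

def estimate_depth_range(behavior, vehicle_info, map_info, outside_info):
    """Return the depth-distance range of the matching entry, or the initial value."""
    return DEPTH_ESTIMATION_TABLE.get(
        (behavior, vehicle_info, map_info, outside_info), DEFAULT_DEPTH_RANGE)
```

For instance, an unmatched combination of inputs falls back to `DEFAULT_DEPTH_RANGE`, mirroring the "set the initial value" behavior described above.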
  • the irradiation determining unit 14 determines the range of light irradiated by the headlight 2 based on the depth distance estimated by the depth distance estimating unit 13c, and outputs irradiation information to the headlight control unit 15.
  • the headlight control section 15 causes the headlight 2 to irradiate the irradiation range determined by the irradiation determination section 14 with light.
  • FIG. 24A is a diagram for explaining an example of the depth distance estimated by the depth distance estimation unit 13c in the fourth embodiment.
• FIG. 24B is a diagram for explaining an example of how the irradiation range determined by the irradiation determining unit 14 based on the depth distance shown in FIG. 24A is irradiated with light.
  • FIG. 24A is an overhead view of the road on which the vehicle 100 is traveling.
  • FIG. 24B is a side view of the road on which the vehicle 100 is traveling.
  • the driver is indicated by "D"
  • the range of light irradiated by the headlight 2 is indicated by "LA”.
  • the depth distance is the distance from the right light to the estimated visual position of the driver.
• although FIG. 24B shows the vehicle 100 as viewed from the left side with respect to the traveling direction for convenience, the irradiation range shown in FIG. 24B is the irradiation range of the right light.
• in FIGS. 24A and 24B, it is assumed that the driver's face is oriented within the "front" range in the vertical direction, and that two pedestrians (pedestrian C and pedestrian D, shown as "W3" and "W4" in FIGS. 24A and 24B, respectively) are walking around the vehicle 100 in the direction the driver is facing. It is assumed that the distance between pedestrian C and the headlight 2 is "6 m", and the distance between pedestrian D and the headlight 2 is "22 m". Further, it is assumed that there is a sign (not shown in FIGS. 24A and 24B) ahead of the vehicle 100, and that the white line is interrupted in the lane in which the vehicle 100 is traveling (the lane referred to here is a so-called traffic lane).
• assume that the depth distance estimating unit 13c determines that the driver's behavior determined from the orientation information and the driving-related information (here, vehicle information, map information, and vehicle exterior information) corresponds to condition No. 3 of the depth distance estimation information, and estimates the depth distance to be "a range including the intersection sidewalk and crosswalk with a margin of 5 m". Assume that the depth distance estimating unit 13c calculates this "range including the intersection sidewalk and crosswalk with a margin of 5 m" to be "5 m to 20 m" based on the map information (see FIG. 24A). Since the distance from the headlight 2 to pedestrian C is within this depth distance range, the depth distance estimating unit 13c adjusts the depth distance to "6 m" and sets "6 m" as the estimated depth distance.
  • the irradiation determining unit 14 determines the irradiation range based on the depth distance “6 m” estimated by the depth distance estimating unit 13c.
• for example, the irradiation determining unit 14 determines, as the irradiation range, the range of ⁇13 degrees to ⁇14 degrees in the horizontal direction and the range of ⁇13 degrees to ⁇14 degrees in the vertical direction (see FIG. 24B; illustration of the irradiation range is omitted).
• for the vertical direction of the irradiation range, for example, the irradiation determining unit 14 calculates the depth distance vertical angle based on the depth distance "6 m", and determines, as the vertical angle range of the irradiation range, a range obtained by widening the depth distance vertical angle by a preset angle in the vertical direction.
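One plausible way to compute such a depth distance vertical angle is the depression angle from the headlight toward the road surface at the estimated depth distance, widened by the preset margin. This is a hedged sketch only: the specification does not give the formula, and the lamp mounting height (0.7 m) and margin (2 degrees) here are assumptions.

```python
import math

def vertical_irradiation_range(depth_m, lamp_height_m=0.7, margin_deg=2.0):
    """Return (lower, upper) vertical angles in degrees (negative = below
    horizontal) for the irradiation range: the depression angle toward the
    road surface at the estimated depth distance, widened on both sides by a
    preset margin. lamp_height_m and margin_deg are illustrative assumptions."""
    depth_angle = math.degrees(math.atan2(lamp_height_m, depth_m))
    return (-(depth_angle + margin_deg), -(depth_angle - margin_deg))
```

With these assumed parameters, a shorter depth distance (such as the "6 m" to pedestrian C) yields a steeper downward aim than a longer one, which matches the qualitative behavior described above.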
• as a result, the headlight control unit 15 can control the headlight 2 so that pedestrian C, who is present in the direction the driver is facing, is irradiated with light, and the driver can visually recognize pedestrian C. Pedestrian D is not irradiated with light from the headlight 2.
  • the depth distance estimation unit 13c may estimate, for example, a range including the distance from the headlight 2 to the pedestrian C and the distance from the headlight 2 to the pedestrian D as the depth distance.
• in this case, the depth distance adjustment conditions include, for example, a condition that "if there are a plurality of moving objects whose distances from the headlight are within the depth distance range estimated based on the depth distance estimation information, the depth distance is adjusted so that the range including the distances to the plurality of moving objects becomes the estimated depth distance range."
  • the headlight control unit 15 controls the headlight 2 so that both the pedestrian C and the pedestrian D are irradiated with light.
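The adjustment of the estimated range against detected object distances can be sketched as below. The function name and the rule for objects outside the range are assumptions for illustration; the specification only states the condition for objects that lie within the estimated range.

```python
def adjust_depth_range(estimated_range, object_distances):
    """Adjust an estimated depth range (low, high) in metres against the
    distances to detected moving objects: if one or more objects lie within
    the range, shrink/extend it so it just covers all of those objects;
    otherwise return the estimated range unchanged (an assumption)."""
    low, high = estimated_range
    inside = [d for d in object_distances if low <= d <= high]
    if not inside:
        return estimated_range
    return (min(inside), max(inside))
```

With the FIG. 24 example, an estimated range of (5, 20) and pedestrians at 6 m and 22 m reduce to (6, 6), matching the "6 m" adjustment above; if both pedestrians were inside the range, the result would cover both of them.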
  • the headlight control device 1c according to the fourth embodiment is a headlight control device that is a combination of the headlight control device 1a according to the second embodiment and the headlight control device 1b according to the third embodiment.
• This makes it possible to control the headlights 2 in more situations, and the depth distance is estimated in a manner closer to the actual situation based on whether an object actually exists in the direction the driver is facing. Thereby, the headlight control device 1c can more appropriately illuminate the driver's estimated visual recognition position, and can provide driving support when the vehicle 100 travels at night or the like.
  • FIG. 25 is a flowchart for explaining the operation of the headlight control device 1c according to the fourth embodiment.
• when the headlight control device 1c determines to perform the lighting control of the headlights 2 based on the driver's orientation, it starts the operation shown in the flowchart of FIG. 25.
  • the headlight control device 1c repeats the operation shown in the flowchart of FIG. 25, for example, until the headlights 2 are turned off or the power of the vehicle 100 is turned off.
• specifically, the control unit (not shown) of the headlight control device 1c acquires information indicating the state of the headlights 2 from a headlight switch installed in the vehicle 100, and determines whether or not the headlights 2 are on.
• when the control unit determines that the headlights 2 are on, it determines to start the lighting control of the headlights 2 based on the driver's orientation, and outputs information instructing the start of the lighting control of the headlights 2 to the orientation detection unit 11, the driving-related information acquisition unit 12c, the depth distance estimating unit 13c, the irradiation determining unit 14, and the headlight control unit 15.
• when the control unit determines that the headlights 2 are off or that the power of the vehicle 100 is turned off, it determines to end the lighting control of the headlights 2 based on the driver's orientation, and outputs information instructing the end of the lighting control of the headlights 2 to the orientation detection unit 11, the driving-related information acquisition unit 12c, the depth distance estimating unit 13c, the irradiation determining unit 14, and the headlight control unit 15.
• regarding the operation shown in the flowchart of FIG. 25, the processing contents of steps ST1-1, ST1-2, ST1-3, ST3, and ST4 are the same as those of steps ST1-1, ST1-2, ST1-3, ST3, and ST4 of the operation of the headlight control device 1a shown in the flowchart of FIG. 13, which has already been described in Embodiment 2, so redundant description is omitted. Further, the processing content of step ST1-4 is the same as that of step ST1-4 of the operation of the headlight control device 1b shown in the flowchart of FIG. 20, which has already been described in Embodiment 3, so redundant description is omitted.
• the depth distance estimating unit 13c estimates the depth distance using the orientation information regarding the driver's orientation detected by the orientation detection unit 11 in step ST1-1, the vehicle information acquired by the vehicle information acquisition unit 121 in step ST1-2, the map information acquired by the map information acquisition unit 122 in step ST1-3, the vehicle exterior information acquired by the vehicle exterior information acquisition unit 123 in step ST1-4, and the depth distance estimation information (step ST2c). Specifically, when the depth distance estimating unit 13c estimates the depth distance using the depth distance estimation information, it adjusts the estimated depth distance based on the vehicle exterior information with reference to the depth distance adjustment conditions, and determines the estimated depth distance.
  • the depth distance estimation section 13c outputs depth distance information to the irradiation determination section 14.
• the irradiation determining unit 14 determines the range of light to be irradiated by the headlight 2 based on the depth distance estimated by the depth distance estimating unit 13c in step ST2c (step ST3).
  • the headlight control device 1c detects the direction of the driver based on the captured image inside the vehicle, and acquires travel-related information (here, vehicle information, map information, and information outside the vehicle).
  • the headlight control device 1c estimates the depth distance based on the detected orientation information regarding the driver's orientation and the acquired travel-related information.
• the headlight control device 1c adjusts the depth distance estimated using the depth distance estimation information, based on the vehicle exterior information and the depth distance adjustment conditions, and sets the adjusted depth distance as the estimated depth distance.
  • the headlight control device 1c determines the irradiation range of light by the headlight 2 based on the estimated depth distance, and causes the headlight 2 to irradiate the determined irradiation range with light. Therefore, the headlight control device 1c can illuminate the estimated visible position of the driver more appropriately, and can provide driving support when the vehicle 100 runs at night or the like.
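The per-cycle flow of steps ST1-1 through ST4 described above can be sketched as a single control cycle; the function names are placeholders standing in for the units of FIG. 21, not APIs from the specification.

```python
def headlight_control_cycle(detect_orientation, acquire_driving_info,
                            estimate_depth, determine_range, irradiate):
    """One cycle of the ST1-ST4 flow: detect the driver's orientation, gather
    driving-related information, estimate the depth distance, determine the
    irradiation range, and drive the headlight. The callables stand in for the
    orientation detection unit 11, driving-related information acquisition unit
    12c, depth distance estimating unit 13c, irradiation determining unit 14,
    and headlight control unit 15, respectively."""
    orientation = detect_orientation()                 # ST1-1
    driving_info = acquire_driving_info()              # ST1-2 to ST1-4
    depth = estimate_depth(orientation, driving_info)  # ST2c
    irradiation_range = determine_range(depth)         # ST3
    irradiate(irradiation_range)                       # ST4
    return irradiation_range
```

In practice this cycle would repeat until the headlights 2 are turned off or the power of the vehicle 100 is turned off, as the flowchart describes.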
  • the depth distance estimation unit 13c may estimate the ideal width of the irradiation range as well as the depth distance.
  • the depth distance estimation unit 13c outputs depth distance information and information regarding the ideal width of the irradiation range to the irradiation determination unit 14.
  • the irradiation determining unit 14 determines the irradiation range based on the depth distance estimated by the depth distance estimating unit 13c and the ideal width of the irradiation range.
  • the depth distance estimating unit 13c may estimate the ideal amount of light from the headlights 2 as well as estimating the depth distance.
  • the depth distance estimation section 13c outputs depth distance information to the irradiation determination section 14, and also outputs information regarding the estimated ideal light amount to the headlight control section 15.
  • the headlight control unit 15 causes the headlight 2 to irradiate light in the irradiation range determined by the irradiation determination unit 14 at the ideal light amount estimated by the depth distance estimation unit 13c.
  • the depth distance estimating unit 13c may estimate the depth distance and the ideal width and ideal light amount of the irradiation range.
• in Embodiment 4 above, the headlight control device 1c is an in-vehicle device mounted on the vehicle 100, and the orientation detection unit 11, the driving-related information acquisition unit 12c, the depth distance estimating unit 13c, the irradiation determining unit 14, the headlight control unit 15, and the control unit (not shown) are assumed to be included in the in-vehicle device.
• the present invention is not limited to this; some of the orientation detection unit 11, the driving-related information acquisition unit 12c, the depth distance estimating unit 13c, the irradiation determining unit 14, the headlight control unit 15, and the control unit (not shown) may be provided in an in-vehicle device of the vehicle 100, with the others provided in a server connected to the in-vehicle device via a network. Alternatively, the server may include all of the orientation detection unit 11, the driving-related information acquisition unit 12c, the depth distance estimating unit 13c, the irradiation determining unit 14, the headlight control unit 15, and the control unit (not shown).
• although Embodiment 4 above is an embodiment that combines Embodiment 2 and Embodiment 3, the present invention is not limited to this; for example, an embodiment that combines Embodiment 1 and Embodiment 3 is also possible.
  • the driving-related information acquisition section 12b of the headlight control device 1b includes a vehicle information acquisition section 121 and an external information acquisition section 123.
• in this case, the depth distance estimating unit 13b estimates the depth distance based on the orientation information, the vehicle information, and the vehicle exterior information. Specifically, the depth distance estimating unit 13b estimates the depth distance by comparing the orientation information, the vehicle information, and the vehicle exterior information with depth distance estimation information in which, unlike the depth distance estimation information shown in FIG. 22, map information is not set as input information.
• the hardware configuration of the headlight control device 1c according to Embodiment 4 is the same as the hardware configuration of the headlight control device 1 described with reference to FIGS. 6A and 6B in Embodiment 1, so illustration thereof is omitted.
• the functions of the orientation detection unit 11, the driving-related information acquisition unit 12c, the depth distance estimating unit 13c, the irradiation determining unit 14, the headlight control unit 15, and the control unit are realized by a processing circuit 1001.
• that is, the headlight control device 1c includes the processing circuit 1001 for estimating the depth distance based on the orientation information regarding the driver's orientation detected from the in-vehicle captured image acquired from the in-vehicle imaging device 3 and the driving-related information, and for controlling the lighting of the headlights 2 based on the estimated depth distance.
  • Processing circuit 1001 may be dedicated hardware as shown in FIG. 6A, or may be processor 1004 that executes a program stored in memory 1005 as shown in FIG. 6B.
• the processing circuit 1001 implements the functions of the orientation detection unit 11, the driving-related information acquisition unit 12c, the depth distance estimating unit 13c, the irradiation determining unit 14, the headlight control unit 15, and the control unit (not shown) by reading out and executing a program stored in the memory 1005. That is, the headlight control device 1c includes the memory 1005 for storing a program that, when executed by the processing circuit 1001, results in the execution of steps ST1-1, ST1-2, ST1-3, and ST1-4 to ST4 of FIG. 25 described above. It can also be said that the program stored in the memory 1005 causes a computer to execute the procedures or methods of the orientation detection unit 11, the driving-related information acquisition unit 12c, the depth distance estimating unit 13c, the irradiation determining unit 14, the headlight control unit 15, and the control unit (not shown).
  • the storage unit 16 includes, for example, a memory 1005.
  • the headlight control device 1c includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the headlight 2, the in-vehicle imaging device 3, or the driving-related information acquisition device 4.
• as described above, the headlight control device 1c according to Embodiment 4 includes the orientation detection unit 11 that detects the driver's orientation based on a captured image of the driver of the vehicle 100 (in-vehicle captured image), the driving-related information acquisition unit 12c that acquires driving-related information related to the driving of the vehicle 100, the depth distance estimating unit 13c that estimates the depth distance based on the driver's orientation detected by the orientation detection unit 11 and the driving-related information acquired by the driving-related information acquisition unit 12c, the irradiation determining unit 14 that determines the range of light to be irradiated by the headlight 2 based on the depth distance estimated by the depth distance estimating unit 13c, and the headlight control unit 15 that causes the headlight 2 to irradiate the irradiation range determined by the irradiation determining unit 14 with light.
• therefore, in controlling the lighting of the headlights 2 of the vehicle 100 based on the direction the driver is facing, the headlight control device 1c can perform lighting control in consideration of how far ahead the driver is actually looking in that direction.
  • the headlight control device 1c can more appropriately illuminate the estimated visual position of the driver, and can provide driving support when the vehicle 100 travels at night or the like.
• further, in the headlight control device 1c according to Embodiment 4, the driving-related information acquisition unit 12c includes the vehicle exterior information acquisition unit 123 that acquires, as driving-related information, vehicle exterior information regarding the area ahead of the vehicle 100, and the depth distance estimating unit 13c estimates the depth distance based on the orientation information, the vehicle information, the map information, and the vehicle exterior information. Therefore, in controlling the lighting of the headlights 2 of the vehicle 100 based on the direction the driver is facing, the headlight control device 1c can perform lighting control in consideration of how far ahead the driver is actually looking in that direction.
  • the headlight control device 1c can more appropriately illuminate the estimated visual position of the driver, and can provide driving support when the vehicle 100 travels at night or the like.
• Embodiment 5 For example, a driver may nod, momentarily look away in a manner that the headlights 2 need not follow, or change orientation without intending to view something, such as when squinting. In such cases, the driver's orientation should not be used to estimate the depth distance.
• in Embodiment 5, an embodiment will be described in which the irradiation of light from the headlights 2 in the direction the driver is facing is controlled based on a reliability indicating the degree to which the driver's orientation can be estimated to be an orientation for trying to visually recognize something.
  • FIG. 26 is a diagram showing a configuration example of a headlight control device 1d according to the fifth embodiment.
  • the headlight control device 1d is mounted on the vehicle 100.
  • the headlight control device 1d controls the headlights 2 provided in the vehicle 100 based on the orientation of the driver of the vehicle 100.
  • the "driver's orientation” is expressed by the driver's face orientation or the driver's line of sight direction.
  • the "driver's orientation” includes not only the driver's face orientation or the driver's line of sight direction, but also the driver's body orientation, in other words, the driver's posture. good.
• the lighting control of the headlights 2 based on the driver's orientation performed by the headlight control device 1d is assumed to be performed when the area around the vehicle 100 is dark, such as in a parking lot at night or an urban area at night, and the headlights 2 are turned on.
  • the headlight control device 1d according to the fifth embodiment differs from the headlight control device 1 according to the first embodiment in that it includes a reliability determination section 17.
  • the reliability determination unit 17 determines the reliability of the driver's orientation detected by the orientation detection unit 11.
  • the "reliability" of the driver's orientation indicates the degree to which it can be estimated that the orientation is the one in which the driver is trying to visually recognize something. This includes both a case where there is a possibility that the accuracy of the detected driver's direction is low and a case where the detected direction of the driver is a direction that does not require illumination by the headlights 2. Note that an example of a case where the accuracy of the detected direction of the driver may be low is, for example, a case where the driver is squinting.
  • the reliability determining unit 17 determines whether the reliability of the direction of the driver is "high” or "low.” For example, the reliability determination unit 17 determines the reliability of the driver's orientation based on the driver's orientation detected by the orientation detection unit 11 retroactively over a preset period (hereinafter referred to as "reliability determination period"). Determine the degree. The reliability determination unit 17 may determine the orientation of the driver detected by the orientation detection unit 11 from the orientation information stored in the storage unit 16 going back through the reliability determination period.
• for example, when the driver's orientation has changed by a certain amount or more during the reliability determination period, the reliability determination unit 17 determines that the reliability of the driver's orientation detected by the orientation detection unit 11 is "low".
• Scenes in which the driver's orientation changes by a certain amount or more during the reliability determination period include, for example, a scene in which the driver nods and a scene in which the driver momentarily looks away.
  • the reliability of the driver's orientation detected in a scene where the driver is squinting is also considered to be "low.” In this case, the reliability can be determined at any time from the detected orientation of the driver, regardless of whether or not the orientation of the driver has changed by more than a certain value during the reliability determination period. Therefore, for example, the reliability determination unit 17 determines the reliability in a scene where the driver is squinting based on the orientation of the driver detected by the orientation detection unit 11, regardless of the reliability determination period. You can always go.
• when the reliability determination unit 17 determines that the reliability of the driver's orientation detected by the orientation detection unit 11 is "low", it does not output orientation information indicating that orientation to the depth distance estimating unit 13. Instead, the reliability determination unit 17 rewrites the orientation information output from the orientation detection unit 11 to the orientation information most recently determined to have "high" reliability, and outputs the rewritten orientation information to the depth distance estimating unit 13. In other words, the depth distance estimating unit 13 does not use a driver orientation whose reliability has been determined to be low by the reliability determination unit 17 for estimating the depth distance. Note that when the reliability determination unit 17 determines that the reliability of the driver's orientation detected by the orientation detection unit 11 is "high", it outputs the orientation information output from the orientation detection unit 11 to the depth distance estimating unit 13 as-is.
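The reliability determination and the fallback to the most recent high-reliability orientation can be sketched as follows. The 15-degree threshold and the representation of orientation as a single yaw angle over the reliability determination period are illustrative assumptions; the specification does not state concrete thresholds.

```python
# Hedged sketch of the Embodiment 5 reliability check.
# The threshold is an illustrative assumption, not a value from the specification.
ORIENTATION_CHANGE_THRESHOLD_DEG = 15.0

def reliability_is_high(orientation_history_deg):
    """Reliability is 'low' when the orientation (here a yaw angle in degrees)
    changed by more than a threshold within the reliability determination
    period, e.g. when the driver nods or momentarily looks away."""
    return (max(orientation_history_deg) - min(orientation_history_deg)
            <= ORIENTATION_CHANGE_THRESHOLD_DEG)

def orientation_for_estimation(orientation_history_deg, last_reliable_deg):
    """Return the orientation to pass to the depth distance estimating unit:
    the current orientation if reliable, otherwise the orientation most
    recently determined to have high reliability."""
    if reliability_is_high(orientation_history_deg):
        return orientation_history_deg[-1]
    return last_reliable_deg
```

For a scene like FIG. 27, a nod produces a large swing in the history, the reliability is judged "low", and the pre-nod orientation is used instead.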
  • FIG. 27 is a diagram for explaining an example of a scene in which the reliability determination unit 17 determines that the reliability of the driver's orientation detected by the orientation detection unit 11 is low in the fifth embodiment.
  • the light irradiation range of the headlight 2 is indicated by "LA".
• in the scene shown in FIG. 27, the reliability determination unit 17 determines that the reliability of the driver's orientation is "low", and outputs to the depth distance estimating unit 13 the orientation information most recently determined to have "high" reliability, in this case the orientation information from before the driver nodded.
  • the depth distance estimating unit 13 estimates the depth distance using the orientation information before the driver nods, the vehicle information, and the depth distance estimation information.
• As a result, the headlight 2 is controlled by the headlight control unit 15 to irradiate light onto the presumed visible object in the direction the driver was facing before nodding.
• In this way, the headlight control device 1d prevents unnecessary leveling by excluding detected driver orientations with low reliability from the estimation of the depth distance, and can reduce the annoyance caused by the light from the headlights 2 following a direction other than the one the driver is trying to see.
• Here, unnecessary leveling refers, for example, to the headlight 2 following the driver's orientation and illuminating downward at the moment the driver nods, or to the headlight 2 illuminating a direction the driver is not looking in because the line-of-sight detection accuracy decreases when the driver squints.
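The reliability gating described above can be sketched in a few lines of Python. This is an illustrative sketch only, not the patented implementation: the class name, the look-back period, and the pitch-rate threshold used to detect a nod are all assumptions introduced here for demonstration.

```python
from collections import deque

class OrientationReliabilityFilter:
    """Illustrative sketch of the reliability determination unit 17.

    A detected orientation (pitch, yaw in degrees) is judged "low"
    reliability when its pitch changes faster than a threshold between
    consecutive samples (e.g. the driver nodding). A low-reliability
    orientation is replaced by the most recent orientation judged
    "high" reliability, or suppressed if none exists yet.
    """

    def __init__(self, period=5, pitch_rate_threshold=10.0):
        self.history = deque(maxlen=period)   # samples in the look-back period
        self.pitch_rate_threshold = pitch_rate_threshold
        self.last_reliable = None             # most recent "high" orientation

    def update(self, pitch, yaw):
        reliable = True
        if self.history:
            prev_pitch, _ = self.history[-1]
            # a sudden pitch jump (nod) marks the sample as "low" reliability
            if abs(pitch - prev_pitch) > self.pitch_rate_threshold:
                reliable = False
        self.history.append((pitch, yaw))
        if reliable:
            self.last_reliable = (pitch, yaw)
            return (pitch, yaw)
        return self.last_reliable             # None means output nothing
```

With this sketch, a sudden downward pitch (a nod) is suppressed and the orientation from before the nod continues to be output, matching the scene of FIG. 27.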
  • FIG. 28 is a flowchart for explaining the operation of the headlight control device 1d according to the fifth embodiment.
• In the fifth embodiment, the headlight control device 1d determines that the lighting control of the headlights 2 is to be performed based on the driver's orientation, and starts the operation shown in the flowchart of FIG. 28.
  • the headlight control device 1d repeats the operation shown in the flowchart of FIG. 28, for example, until the headlights 2 are turned off or the power of the vehicle 100 is turned off.
• For example, the control unit (not shown) of the headlight control device 1d acquires information indicating the state of the headlights 2 from a headlight switch installed in the vehicle 100, and determines whether or not the headlights 2 are on.
• When the control unit determines that the headlights 2 are on, it determines to start lighting control of the headlights 2 based on the driver's orientation, and outputs information instructing the start of the lighting control of the headlights 2 to the orientation detection unit 11, the reliability determination unit 17, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, and the headlight control unit 15.
• When the control unit determines that the headlights 2 are off or that the power of the vehicle 100 is turned off, it determines to end the lighting control of the headlights 2 based on the driver's orientation, and outputs information instructing the end of the lighting control of the headlights 2 to the orientation detection unit 11, the reliability determination unit 17, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, and the headlight control unit 15.
• The processing contents of step ST1-11, step ST1-2, and steps ST2 to ST4 are the same as those of step ST1-1, step ST1-2, and steps ST2 to ST4 in the operation of the headlight control device 1 shown in the flowchart of FIG. 5, which have already been explained in the first embodiment, so duplicate explanation will be omitted.
  • the reliability determination unit 17 determines the reliability of the driver's orientation detected by the orientation detection unit 11 in step ST1-11 (step ST1-12).
• The reliability determination unit 17 determines the reliability of the driver's orientation based, for example, on the driver's orientations detected by the orientation detection unit 11 going back over the reliability determination period.
• When the reliability determination unit 17 determines that the reliability of the driver's orientation detected by the orientation detection unit 11 is "low", it does not output the orientation information output from the orientation detection unit 11 to the depth distance estimation unit 13.
• Alternatively, the reliability determination unit 17 rewrites the orientation information output from the orientation detection unit 11 to the orientation information most recently determined to have "high" reliability, and outputs the rewritten orientation information to the depth distance estimation unit 13.
• When the reliability determination unit 17 determines that the reliability of the driver's orientation detected by the orientation detection unit 11 is "high", it outputs the orientation information output from the orientation detection unit 11 to the depth distance estimation unit 13.
• In this way, the headlight control device 1d determines the reliability of the detected driver orientation, and does not use a driver orientation determined to have low reliability for estimating the depth distance using the depth distance estimation information.
• As a result, the headlight control device 1d prevents unnecessary leveling by excluding detected driver orientations with low reliability from the estimation of the depth distance, and can reduce the annoyance caused by the light from the headlights 2 following a direction other than the one the driver is trying to see.
• In the fifth embodiment described above, the reliability determination unit 17 determines the reliability of the driver's orientation detected by the orientation detection unit 11, but this is only an example; the reliability determination unit 17 may instead determine the reliability of the parts of the driver's face detected by the orientation detection unit 11. In this case, for example, the orientation detection unit 11 outputs information regarding the parts of the driver's face used to detect the driver's orientation to the reliability determination unit 17 together with the orientation information.
• When the reliability determination unit 17 determines that the reliability of a detected facial part of the driver is "low", the driver's orientation may be re-detected based on the driver's facial parts whose reliability was most recently determined to be "high". Note that, for example, the reliability determination unit 17 may be unable to acquire some of the detected facial parts of the driver.
• In the fifth embodiment above, the headlight control device 1d is an in-vehicle device mounted on the vehicle 100, and the orientation detection unit 11, the reliability determination unit 17, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) are included in the in-vehicle device. However, the invention is not limited to this: some of the orientation detection unit 11, the reliability determination unit 17, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) may be provided in the in-vehicle device of the vehicle 100, with the rest provided in a server connected to the in-vehicle device via a network. Alternatively, all of these units may be provided in the server.
• Further, in the fifth embodiment, the headlight control device 1 according to the first embodiment is configured to include the reliability determination unit 17, but this is only an example. The headlight control device 1a according to the second embodiment, the headlight control device 1b according to the third embodiment, or the headlight control device 1c according to the fourth embodiment may also be configured to include the reliability determination unit 17.
• The hardware configuration of the headlight control device 1d according to the fifth embodiment is the same as the hardware configuration of the headlight control device 1 described using FIGS. 6A and 6B in the first embodiment, so illustration thereof is omitted.
• In the headlight control device 1d, the functions of the orientation detection unit 11, the reliability determination unit 17, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) are realized by the processing circuit 1001. That is, the headlight control device 1d includes the processing circuit 1001 for estimating the depth distance based on the travel-related information and the orientation information regarding the driver's orientation detected from the in-vehicle captured image acquired from the in-vehicle imaging device 3, and for controlling the lighting of the headlights 2 based on the estimated depth distance.
  • Processing circuit 1001 may be dedicated hardware as shown in FIG. 6A, or may be processor 1004 that executes a program stored in memory 1005 as shown in FIG. 6B.
• The processing circuit 1001 reads out and executes the programs stored in the memory 1005, thereby executing the functions of the orientation detection unit 11, the reliability determination unit 17, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown). That is, the headlight control device 1d includes the memory 1005 for storing programs that, when executed by the processing circuit 1001, result in the execution of steps ST1-11 to ST1-12, ST1-2, and ST2 to ST4 of FIG. 28 described above. Further, the programs stored in the memory 1005 can be said to cause a computer to execute the procedures or methods of the orientation detection unit 11, the reliability determination unit 17, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, and the headlight control unit 15.
  • the storage unit 16 includes, for example, a memory 1005.
  • the headlight control device 1d includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the headlight 2, the in-vehicle imaging device 3, or the driving-related information acquisition device 4.
• As described above, the headlight control device 1d according to the fifth embodiment includes the reliability determination unit 17 that determines the reliability of the driver's orientation detected by the orientation detection unit 11 based on the driver's orientations detected going back over the reliability determination period, and the depth distance estimation unit 13 is configured not to use a driver orientation that the reliability determination unit 17 determines to have low reliability for estimating the depth distance. Therefore, the headlight control device 1d prevents unnecessary leveling and can reduce the annoyance to the driver caused by the light from the headlights 2 following a direction other than the one the driver is trying to see.
  • Embodiment 6 will describe an embodiment in which a headlight control device sets a plurality of irradiation ranges with different light intensities.
  • FIG. 29 is a diagram showing a configuration example of a headlight control device 1e according to the sixth embodiment.
  • the headlight control device 1e is mounted on the vehicle 100.
  • the headlight control device 1e controls the headlights 2 provided in the vehicle 100 based on the direction of the driver of the vehicle 100.
  • the "driver's orientation" is expressed by the driver's face orientation or the driver's line of sight direction.
  • the "driver's orientation” includes not only the driver's face orientation or the driver's line of sight direction, but also the driver's body orientation, in other words, the driver's posture. good.
• It is assumed that the lighting control of the headlights 2 based on the driver's orientation by the headlight control device 1e is performed when the surroundings of the vehicle 100 are dark, such as in a parking lot at night or a city area at night, and the headlights 2 are turned on.
• In FIG. 29, the same components as those of the headlight control device 1 described in Embodiment 1 using FIG. 1 are denoted by the same reference numerals, and redundant explanation will be omitted.
• The headlight control device 1e differs from the headlight control device 1 according to the first embodiment in the specific operations of the irradiation determination unit 14a and the headlight control unit 15a, which correspond to the irradiation determination unit 14 and the headlight control unit 15.
• The irradiation determination unit 14a sets a plurality of irradiation ranges in which the intensity of the light irradiated by the headlight 2 differs.
  • "the light emitted by the headlights 2 is strong" means that the amount of light emitted by the headlights 2 is large.
• Specifically, the irradiation determination unit 14a determines an irradiation range in which the amount of light irradiated by the headlights 2 is increased (referred to as a first irradiation range) and an irradiation range in which the amount of light irradiated by the headlights 2 is decreased (referred to as a second irradiation range).
• The amount of irradiation light in the first irradiation range can be set as appropriate, but is assumed to be an amount of light that allows the driver to sufficiently see a presumed visible object that may exist in the first irradiation range.
• The amount of light irradiated in the second irradiation range is smaller than that in the first irradiation range, and is set to a small amount of light that does not cause glare to pedestrians or other vehicles that may be present in the second irradiation range.
• The irradiation determination unit 14a sets the vertical direction of the first irradiation range up to a preset vertical upper limit angle within the vertical direction of the irradiation range calculated based on the depth distance estimated by the depth distance estimation unit 13. For example, the irradiation determination unit 14a sets, as the vertical direction of the first irradiation range, the range expanded vertically from the vertical center of the irradiation range calculated based on the depth distance up to the upper limit angle. The irradiation determination unit 14a then sets the range other than the first irradiation range in the vertical direction of the irradiation range calculated based on the depth distance as the vertical direction of the second irradiation range.
• Similarly, the irradiation determination unit 14a sets the horizontal direction of the first irradiation range up to a preset horizontal upper limit angle within the horizontal direction of the irradiation range calculated based on the depth distance estimated by the depth distance estimation unit 13.
• For example, the irradiation determination unit 14a sets, as the horizontal direction of the first irradiation range, the range expanded from the horizontal center of the irradiation range calculated based on the depth distance up to the horizontal upper limit angle.
• The irradiation determination unit 14a sets the range other than the first irradiation range in the horizontal direction of the irradiation range calculated based on the depth distance as the horizontal direction of the second irradiation range.
  • the upper limit angle in the vertical direction and the upper limit angle in the horizontal direction of the first irradiation range are set by, for example, an administrator and stored in a location that can be referenced by the irradiation determining unit 14a.
• For example, assume that the vertical upper limit angle of the first irradiation range is "5 degrees" and the horizontal upper limit angle is "8 degrees".
• Also assume that the vertical direction of the irradiation range calculated based on the depth distance estimated by the depth distance estimation unit 13 is the range of "3 degrees to 13 degrees" and the horizontal direction is the range of "0.5 degrees to 12.5 degrees", with the installation position of the headlight 2 as the reference (0 degrees).
• In this case, the irradiation determination unit 14a determines the range of "5.5 degrees to 10.5 degrees", expanded about the vertical center of the irradiation range, "8 degrees", up to the upper limit angle of 5 degrees, as the vertical direction of the first irradiation range.
• The irradiation determination unit 14a also determines the range of "2.5 degrees to 10.5 degrees", expanded about the horizontal center of the irradiation range, "6.5 degrees", up to the upper limit angle of 8 degrees, as the horizontal direction of the first irradiation range.
• Then, the irradiation determination unit 14a determines the range of "3 degrees to 5.5 degrees" and the range of "10.5 degrees to 13 degrees" in the vertical direction, with the installation position of the headlight 2 as the reference, as the vertical direction of the second irradiation range, and determines the range of "0.5 degrees to 2.5 degrees" and the range of "10.5 degrees to 12.5 degrees" in the horizontal direction, with the installation position of the headlight 2 as the reference, as the horizontal direction of the second irradiation range.
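The angle arithmetic above can be checked with a short sketch. The helper below is illustrative only; it assumes the first irradiation range is centered within the depth-distance-based range and clipped to a total width equal to the upper limit angle, as in the example.

```python
def split_irradiation_range(full_lo, full_hi, upper_limit):
    """Split an irradiation range [full_lo, full_hi] (degrees, with the
    headlight installation position as 0 degrees) into a first range of
    width `upper_limit` centred in the full range, and the remaining
    second ranges on either side."""
    center = (full_lo + full_hi) / 2.0
    first = (center - upper_limit / 2.0, center + upper_limit / 2.0)
    second = [(full_lo, first[0]), (first[1], full_hi)]
    return first, second

# Vertical: range 3..13 degrees, upper limit angle 5 degrees
first_v, second_v = split_irradiation_range(3.0, 13.0, 5.0)
# -> first_v == (5.5, 10.5), second_v == [(3.0, 5.5), (10.5, 13.0)]

# Horizontal: range 0.5..12.5 degrees, upper limit angle 8 degrees
first_h, second_h = split_irradiation_range(0.5, 12.5, 8.0)
# -> first_h == (2.5, 10.5), second_h == [(0.5, 2.5), (10.5, 12.5)]
```

The returned pairs reproduce the worked numbers: the bright first range sits in the middle of the depth-distance-based range, and the dimmer second ranges cover the margins.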
  • the irradiation determining section 14a outputs irradiation information regarding the determined irradiation range to the headlight control section 15a.
  • the irradiation information output by the irradiation determining unit 14a includes information indicating the vertical angle range and horizontal angle range of the first irradiation range, and information indicating the vertical angle range and the horizontal angle range of the second irradiation range. Contains information.
• The headlight control unit 15a causes the headlight 2 to irradiate the irradiation range determined by the irradiation determination unit 14a. Specifically, the headlight control unit 15a causes the headlight 2 to irradiate the second irradiation range set by the irradiation determination unit 14a with light of a smaller irradiation light amount than the light irradiated onto the first irradiation range set by the irradiation determination unit 14a.
• FIG. 30 is a diagram for explaining an example of how the headlight control unit 15a causes the headlight 2 to irradiate light onto the first irradiation range and the second irradiation range determined by the irradiation determination unit 14a in the sixth embodiment.
• FIG. 30A is a diagram showing how the first irradiation range and the second irradiation range are irradiated with light as viewed from the side of the road on which the vehicle 100 is traveling, and FIG. 30B is a diagram showing how the first irradiation range and the second irradiation range are irradiated with light as viewed from the vehicle 100.
  • the driver is indicated by “D”
  • the first irradiation range of the light irradiation range by the headlight 2 is indicated by “LA1”
  • the second irradiation range is indicated by “LA2”.
• In FIG. 30B, illustration of the vehicle 100 and the like is omitted.
• When the estimated depth distance is a distance with a width, if the irradiation determination unit 14a determines the irradiation range so that light is irradiated over the entire width, the width of the irradiation range of the light that the headlight control unit 15a causes the headlight 2 to irradiate becomes large. In this case, the headlight control device 1e may give glare to pedestrians or drivers of other vehicles.
• Therefore, the headlight control device 1e sets the first irradiation range in which the amount of light irradiated by the headlight 2 is increased and the second irradiation range in which the amount of light is decreased, and causes the headlight 2 to irradiate the second irradiation range set by the irradiation determination unit 14a with light of a smaller light amount than the light irradiated onto the first irradiation range set by the irradiation determination unit 14a. As a result, the headlight control device 1e can reduce the glare that would otherwise be given to pedestrians or drivers of other vehicles, while irradiating light so that the driver of the vehicle 100 can visually recognize the presumed visible object in the direction in which the driver is facing.
• The irradiation determination unit 14a may determine the first irradiation range and the second irradiation range using other methods. For example, the proportion that the first irradiation range occupies in the vertical direction and the proportion that it occupies in the horizontal direction of the irradiation range calculated based on the depth distance estimated by the depth distance estimation unit 13 may each be preset, and the irradiation determination unit 14a may set the vertical and horizontal directions of the first irradiation range based on the preset proportions.
• Alternatively, the irradiation determination unit 14a may set how much of the irradiation range calculated based on the depth distance estimated by the depth distance estimation unit 13 the first irradiation range should occupy in the vertical and horizontal directions, based on the driving state of the vehicle 100 estimated from the driver's behavior and the travel-related information that were the basis for deriving the depth distance.
• For example, when the driving state of the vehicle 100 is estimated to be a state requiring a wide field of view, the vertical upper limit angle and the horizontal upper limit angle of the first irradiation range may be set to "10 degrees", and in other driving states the vertical upper limit angle and the horizontal upper limit angle of the first irradiation range may be set to "5 degrees". In this way, how wide the first irradiation range should be in the vertical direction and the horizontal direction may be determined in advance according to the driving state. In this case, for example, as described in Embodiment 1 using FIG., information in which the driving state of the vehicle 100 and the driver's presumed visible object are associated is generated in advance by an administrator or the like, and stored in a location that can be referenced by the irradiation determination unit 14a. Further, the depth distance estimation unit 13 outputs the orientation information and the travel-related information to the irradiation determination unit 14a together with the depth distance information.
• Then, the irradiation determination unit 14a may set, as the first irradiation range, the irradiation range corresponding to the ideal width of the irradiation range determined according to the driving state, and set, as the second irradiation range, the range other than the first irradiation range within the irradiation range based on the depth distance.
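The driving-state-dependent choice of upper limit angles could be realized as a simple lookup table. The sketch below is purely illustrative: the state names, the angle values, and the default are hypothetical placeholders, not values from this embodiment, where an administrator would author the actual association.

```python
# Hypothetical table associating an estimated driving state with the
# (vertical, horizontal) upper limit angles of the first irradiation
# range, in degrees.
UPPER_LIMIT_ANGLES = {
    "wide_view_needed": (10.0, 10.0),   # e.g. intersections, city driving
    "narrow_view_ok": (5.0, 5.0),       # e.g. steady cruising
}

def first_range_upper_limits(driving_state, default=(5.0, 8.0)):
    """Return the first-range upper limit angles for a driving state,
    falling back to a default when the state is not in the table."""
    return UPPER_LIMIT_ANGLES.get(driving_state, default)
```

The looked-up angles would then feed the range-splitting step that divides the depth-distance-based irradiation range into the first and second ranges.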
  • FIG. 31 is a flowchart for explaining the operation of the headlight control device 1e according to the sixth embodiment.
• In the sixth embodiment, the headlight control device 1e determines that the lighting control of the headlights 2 is to be performed based on the driver's orientation, and starts the operation shown in the flowchart of FIG. 31.
  • the headlight control device 1e repeats the operation shown in the flowchart of FIG. 31, for example, until the headlights 2 are turned off or the power of the vehicle 100 is turned off.
• For example, the control unit (not shown) of the headlight control device 1e acquires information indicating the state of the headlights 2 from a headlight switch mounted on the vehicle 100, and determines whether or not the headlights 2 are on.
• When the control unit determines that the headlights 2 are on, it determines to start lighting control of the headlights 2 based on the driver's orientation, and outputs information instructing the start of the lighting control of the headlights 2 to the orientation detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14a, and the headlight control unit 15a.
• When the control unit determines that the headlights 2 are off or that the power of the vehicle 100 is turned off, it determines to end the lighting control of the headlights 2 based on the driver's orientation, and outputs information instructing the end of the lighting control of the headlights 2 to the orientation detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14a, and the headlight control unit 15a.
• The processing contents of step ST1-1, step ST1-2, and step ST2 are the same as those of steps ST1-1, ST1-2, and ST2 in the operation of the headlight control device 1 shown in the flowchart of FIG. 5, which have already been explained in the first embodiment, so duplicate explanation will be omitted.
• Based on the depth distance estimated by the depth distance estimation unit 13 in step ST2, the irradiation determination unit 14a sets the first irradiation range in which the amount of light irradiated by the headlights 2 is increased and the second irradiation range in which the amount of light irradiated by the headlights 2 is decreased (step ST3a).
  • the irradiation determining section 14a outputs irradiation information regarding the determined irradiation range to the headlight control section 15a.
• The headlight control unit 15a causes the headlight 2 to irradiate the irradiation range determined by the irradiation determination unit 14a in step ST3a (step ST4a). Specifically, the headlight control unit 15a causes the headlight 2 to irradiate the second irradiation range set by the irradiation determination unit 14a with light of a smaller irradiation light amount than the light irradiated onto the first irradiation range set by the irradiation determination unit 14a.
• As described above, the headlight control device 1e sets, in the irradiation range determined based on the depth distance, the first irradiation range and the second irradiation range in which the amount of light irradiated by the headlight 2 is smaller than in the first irradiation range, and causes the headlight 2 to irradiate the second irradiation range with light of a smaller light amount than the light irradiated onto the first irradiation range. Therefore, the headlight control device 1e can irradiate light so that the driver of the vehicle 100 can visually recognize the presumed visible object in the direction in which the driver is facing, while reducing the glare that may be given to pedestrians or drivers of other vehicles.
• In the headlight control device 1e, the position of the irradiation range is changed to follow changes in the driver's orientation; here, the moving speed of the position of the first irradiation range may be made different from the moving speed of the position of the second irradiation range. For example, the irradiation determination unit 14a may control the moving speed of the first irradiation range, which is changed to follow changes in the driver's orientation, by a short-time average of the driver's orientation, and may control the moving speed of the second irradiation range by a long-time average of the driver's orientation.
• As a result, the headlight control device 1e can keep a direction that the driver is no longer facing within the second irradiation range for a while and continue to irradiate it with light from the headlights 2.
• Therefore, for example, when a pedestrian is present in a direction that the driver was facing until shortly before, the headlight control device 1e illuminates the pedestrian with the light of the headlight 2, so that the driver can notice the pedestrian early.
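One way to realize the different following speeds is to smooth the driver's orientation with two exponential moving averages, a fast one for the first range and a slow one for the second range. This is a sketch under that assumption; the class name and the smoothing factors are illustrative, not values from the embodiment.

```python
class DualRateFollower:
    """Track the driver's orientation (one angle, in degrees) with a fast
    exponential moving average for the first (bright) irradiation range
    and a slow one for the second (dim) range, so the dim range lags and
    keeps recently viewed directions lit for a while."""

    def __init__(self, fast_alpha=0.5, slow_alpha=0.1):
        self.fast_alpha = fast_alpha   # short-time average weight
        self.slow_alpha = slow_alpha   # long-time average weight
        self.fast = None               # centre of the first irradiation range
        self.slow = None               # centre of the second irradiation range

    def update(self, orientation_deg):
        if self.fast is None:
            self.fast = self.slow = orientation_deg
        else:
            self.fast += self.fast_alpha * (orientation_deg - self.fast)
            self.slow += self.slow_alpha * (orientation_deg - self.slow)
        return self.fast, self.slow
```

When the driver turns away, the fast average moves quickly toward the new direction while the slow average, and hence the second irradiation range, lingers near the previous direction.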
• In the sixth embodiment, the irradiation determination unit 14a sets two irradiation ranges with different amounts of light irradiated by the headlight 2, but this is only an example; the irradiation determination unit 14a may set three or more irradiation ranges in which the amount of light irradiated by the headlight 2 differs in stages.
• Also, in the sixth embodiment, the irradiation determination unit 14a sets two irradiation ranges with different irradiation light amounts (the first irradiation range and the second irradiation range) in both the vertical and horizontal directions of the irradiation range, but this is only an example.
  • the irradiation determining unit 14a may set an irradiation range in which the amount of irradiation light differs only in the vertical direction of the irradiation range, or may set an irradiation range in which the amount of irradiation light differs only in the left and right directions of the irradiation range.
• In the sixth embodiment above, the headlight control device 1e is an in-vehicle device mounted on the vehicle 100, and the orientation detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14a, the headlight control unit 15a, and the control unit (not shown) are included in the in-vehicle device. However, the invention is not limited to this: some of the orientation detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14a, the headlight control unit 15a, and the control unit (not shown) may be provided in the in-vehicle device of the vehicle 100, with the rest provided in a server connected to the in-vehicle device via a network. Alternatively, the server may include all of the orientation detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14a, the headlight control unit 15a, and the control unit (not shown).
• Further, in the sixth embodiment, the first irradiation range and the second irradiation range are set in the headlight control device 1 according to the first embodiment, but this is only an example. The first irradiation range and the second irradiation range may also be set in the headlight control devices according to the other embodiments described above.
• Note that the headlight control device 1b according to the third embodiment and the headlight control device 1c according to the fourth embodiment adjust the depth distance by taking into account objects that actually exist in the direction of the driver. Therefore, the depth distance is likely to be a distance without a width, and the width of the irradiation range of the light that the headlight control devices 1b and 1c cause the headlight 2 to irradiate does not become large.
• However, the position of an object measured by the external imaging device or the radar may include a measurement error. Therefore, for example, the headlight control devices 1b and 1c may set the range of the measurement error as the second irradiation range.
• The hardware configuration of the headlight control device 1e according to the sixth embodiment is the same as the hardware configuration of the headlight control device 1 described using FIGS. 6A and 6B in the first embodiment, so illustration thereof is omitted.
  • the functions of the direction detection section 11, the travel-related information acquisition section 12, the depth distance estimation section 13, the irradiation determination section 14a, the headlight control section 15a, and a control section are performed by a processing circuit. This is realized by 1001.
  • That is, the headlight control device 1e includes a processing circuit 1001 for estimating the depth distance based on the orientation information regarding the driver's orientation detected from the in-vehicle captured image acquired from the in-vehicle imaging device 3 and on the travel-related information, and for controlling the lighting of the headlight 2 based on the estimated depth distance.
  • Processing circuit 1001 may be dedicated hardware as shown in FIG. 6A, or may be processor 1004 that executes a program stored in memory 1005 as shown in FIG. 6B.
  • When the processing circuit 1001 is the processor 1004, it reads out and executes the program stored in the memory 1005, thereby performing the functions of the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14a, the headlight control unit 15a, and the control unit (not shown). That is, the headlight control device 1e includes the memory 1005 for storing a program that, when executed by the processing circuit 1001, results in the execution of steps ST1-1, ST1-2, and ST2 to ST4a shown in FIG. 31.
  • In other words, it can be said that the program stored in the memory 1005 causes a computer to execute the procedures or methods of the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14a, the headlight control unit 15a, and the control unit (not shown).
  • the storage unit 16 includes, for example, a memory 1005.
  • the headlight control device 1e includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the headlight 2, the in-vehicle imaging device 3, or the driving-related information acquisition device 4.
  • As described above, according to Embodiment 6, the headlight control device 1e is configured such that the irradiation determination unit 14a sets, within the irradiation range of the headlight 2, a first irradiation range and a second irradiation range, and the headlight control unit 15a causes the headlight 2 to irradiate the second irradiation range set by the irradiation determination unit 14a with a smaller amount of light than the light irradiated onto the first irradiation range.
  • Therefore, the headlight control device 1e can irradiate light so that the driver of the vehicle 100 can visually recognize the estimated visible object in the direction in which the driver is facing, while reducing the glare given to pedestrians or drivers of other vehicles.
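  The two-range control summarized above can be sketched as follows (a minimal illustration in Python; the specific light amounts, the 30% ratio, and the range representation are invented for the example and are not specified by the embodiment):

```python
def plan_light_amounts(first_range, second_range,
                       full_amount=1.0, reduced_ratio=0.3):
    """Assign a full light amount to the first irradiation range and a
    reduced amount (low enough not to glare others) to the second range."""
    return {
        "first": {"range_m": first_range, "amount": full_amount},
        "second": {"range_m": second_range, "amount": full_amount * reduced_ratio},
    }

# First range around the estimated depth distance, second range wider but dimmer.
plan = plan_light_amounts((18.0, 22.0), (15.0, 25.0))
```

  The point of the sketch is only the ordering constraint: the second range always receives less light than the first.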
  • Embodiment 7. In Embodiment 7, a form will be described in which, when setting a plurality of irradiation ranges with different light amounts, the headlight control device expands the irradiation range with the weaker light even further than the range set in Embodiment 6.
  • FIG. 32 is a diagram showing a configuration example of a headlight control device 1f according to the seventh embodiment.
  • In FIG. 32, the same components as those in the configuration example of the headlight control device 1e according to Embodiment 6 described with reference to FIG. 29 are denoted by the same reference numerals, and duplicate explanations are omitted.
  • The configuration example of the headlight control device 1f according to Embodiment 7 differs from that of the headlight control device 1e according to Embodiment 6 described with reference to FIG. 29 in that the irradiation determination unit 14b includes a surroundings confirmation determination unit 141.
  • The surroundings confirmation determination unit 141 determines whether the driver is checking the surroundings based on the orientation information regarding the driver's orientation detected by the orientation detection unit 11 and the travel-related information acquired by the travel-related information acquisition unit 12 (here, the vehicle information acquired by the vehicle information acquisition unit 121), and, if it determines that the driver is checking the surroundings, expands the second irradiation range set by the irradiation determination unit 14b.
  • Specifically, the surroundings confirmation determination unit 141 determines whether the driver is checking the surroundings using the orientation information, the travel-related information (here, the vehicle information), and a surroundings confirmation determination condition for determining whether the driver is checking the surroundings.
  • FIG. 33 is a diagram for explaining an example of the contents of the surroundings confirmation determination condition used by the surroundings confirmation determination unit 141 to determine whether the driver is checking the surroundings in Embodiment 7.
  • the surrounding confirmation determination condition is, for example, a table in which driver behavior and vehicle information are associated with each other.
  • the conditions for surroundings confirmation determination are generated in advance by an administrator or the like, and are stored in a location where the surroundings confirmation determination unit 141 can refer to them.
  • The surroundings confirmation determination unit 141 determines the driver's behavior, the vehicle speed, and the like from the orientation information and the vehicle information, and, if the determined driver's behavior, vehicle speed, and the like match the surroundings confirmation determination condition, determines that the driver is checking the surroundings.
  • If the determined driver's behavior, vehicle speed, and the like do not match the surroundings confirmation determination condition, the surroundings confirmation determination unit 141 determines that the driver is not checking the surroundings.
  • When it determines that the driver is checking the surroundings, the surroundings confirmation determination unit 141 expands the second irradiation range set by the irradiation determination unit 14b.
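  The table-matching determination described above might be sketched as follows (a hypothetical illustration; the listed behaviors and the speed thresholds are invented stand-ins for the conditions of FIG. 33, which are not reproduced here):

```python
# Hypothetical surroundings confirmation determination conditions:
# each entry pairs a driver behavior with a vehicle-information predicate.
CONDITIONS = [
    {"behavior": "looking left and right repeatedly", "max_speed_kmh": 10},
    {"behavior": "looking over shoulder", "max_speed_kmh": 5},
]

def is_checking_surroundings(behavior: str, speed_kmh: float) -> bool:
    """Return True if the (behavior, vehicle info) pair matches any condition."""
    return any(c["behavior"] == behavior and speed_kmh <= c["max_speed_kmh"]
               for c in CONDITIONS)
```

  A matching entry means the driver is judged to be checking the surroundings and the second irradiation range would then be expanded; no match leaves the range as determined from the depth distance.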
  • The irradiation determination unit 14b may set the first irradiation range and the second irradiation range in the same manner as the irradiation determination unit 14a included in the headlight control device 1e according to Embodiment 6, so a duplicate explanation is omitted.
  • When expanding the second irradiation range, the surroundings confirmation determination unit 141 expands it in the left-right direction to the limit of the area where the headlight 2 can irradiate light, in other words, to the limits of the high-beam irradiable area, the low-beam irradiable area, and the auxiliary-light irradiable area.
  • When the surroundings confirmation determination unit 141 expands the second irradiation range, the irradiation determination unit 14b outputs to the headlight control unit 15a irradiation information including information indicating the first irradiation range set by the irradiation determination unit 14b and information indicating the expanded second irradiation range. When the surroundings confirmation determination unit 141 does not expand the second irradiation range, the irradiation determination unit 14b outputs to the headlight control unit 15a irradiation information including information indicating the first irradiation range and the second irradiation range set by the irradiation determination unit 14b.
  • FIG. 34 is a diagram for explaining an example of how the irradiation ranges are irradiated with light after the surroundings confirmation determination unit 141 has expanded the second irradiation range set by the irradiation determination unit 14b in Embodiment 7.
  • FIG. 34 is a diagram showing how the first irradiation range and the second irradiation range are irradiated with light, as seen from the vehicle 100 side.
  • the first irradiation range is indicated by "LA1”
  • the second irradiation range is indicated by "LA2".
  • When the surroundings confirmation determination unit 141 does not expand the second irradiation range, the first irradiation range and the second irradiation range set by the irradiation determination unit 14b, in which light is irradiated, are the ranges shown in FIG. 30B in Embodiment 6.
  • By having the surroundings confirmation determination unit 141 expand the second irradiation range, the headlight control device 1f can, as shown in FIG. 34, also brighten places where there may be objects that the driver is likely not seeing. Note that what the headlight control device 1f widens is the second irradiation range, in other words, the range irradiated with a smaller amount of light than the first irradiation range, an amount that does not cause glare to pedestrians, drivers of other vehicles, and the like. Therefore, the headlight control device 1f brightens places where objects the driver is likely not seeing may be present, while avoiding giving glare to pedestrians or drivers of other vehicles in directions in which the driver is not looking.
  • FIGS. 35 and 36 are diagrams for explaining an example of the irradiation range of light that the headlight control device 1f causes the headlight 2 to irradiate in Embodiment 7.
  • FIG. 35 is an overhead view of the surroundings of the vehicle 100.
  • FIG. 36B is a diagram showing an example of the irradiation range of light that the headlight control device 1f causes the headlight 2 to irradiate in the surrounding situation of the vehicle 100 shown in FIG. 35.
  • For comparison, FIG. 36A is a diagram showing an example of the irradiation range of light that the headlight control device 1e according to Embodiment 6 causes the headlight 2 to irradiate in the surrounding situation of the vehicle 100 shown in FIG. 35.
  • Here, it is assumed that the vehicle 100 is traveling in an underground parking lot and that a first stopped vehicle and a second stopped vehicle (indicated by "C2" in FIG. 35) are stopped in front of the vehicle 100. Furthermore, as viewed from the vehicle 100, a pedestrian (indicated by "W" in FIG. 35) is about to come out from behind the first stopped vehicle on the right front. The driver is facing toward the left front as viewed from the vehicle 100.
  • In the case of the headlight control device 1e according to Embodiment 6, as shown in FIG. 36A, the second irradiation range of the irradiation range is set in the direction in which the driver is facing, that is, to the left front of the vehicle 100. The right front of the vehicle 100, from which the pedestrian (indicated by "W" in FIG. 36A) is about to come out, is on the opposite side to the direction in which the driver is facing, and no irradiation range is set there. As a result, the area near the pedestrian remains dark, and the driver is delayed in finding the pedestrian.
  • In contrast, the headlight control device 1f according to Embodiment 7 expands, in the left-right direction up to the limit of the irradiable area, the second irradiation range set to the left front of the vehicle 100 within the irradiation range determined based on the estimated depth distance. As a result, the right front of the vehicle 100, where the pedestrian (indicated by "W" in FIG. 36B) is about to come out, is included in the irradiation range, specifically, in the second irradiation range. Consequently, the headlight 2 illuminates the vicinity of the pedestrian, allowing the driver to spot the pedestrian early.
  • In the above description, the surroundings confirmation determination unit 141 expands the second irradiation range in the left-right direction to the limit of the area where the headlight 2 can irradiate light, but this is only an example.
  • For example, the surroundings confirmation determination unit 141 may widen the second irradiation range in the left-right direction by a preset amount.
  • That is, it is sufficient that the surroundings confirmation determination unit 141 makes the irradiation range wider than the second irradiation range that the irradiation determination unit 14b determined based on the depth distance estimated by the depth distance estimation unit 13.
  • The surroundings confirmation determination unit 141 may also set how much to expand the second irradiation range based on the driving state of the vehicle 100 estimated from the driver's behavior and the travel-related information that are the basis for deriving the depth distance (for example, while parked or stopped, or while turning right or left at an intersection), or based on the vehicle speed of the vehicle 100. In this case, for example, as described with reference to FIG. 3 in Embodiment 1, information in which the driving state of the vehicle 100 and the driver's estimated visible object are associated is generated in advance by an administrator or the like and stored in a location where the surroundings confirmation determination unit 141 can refer to it. Further, the depth distance estimation unit 13 outputs the orientation information and the travel-related information to the irradiation determination unit 14b along with the depth distance information.
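  Setting the expansion amount from the estimated driving state, as suggested above, could look like the following sketch (the state names, angle values, and clamp limit are all illustrative assumptions, not values from the embodiment):

```python
# Hypothetical mapping from the estimated driving state of the vehicle
# to how far (in degrees) the second irradiation range is widened per side.
EXPANSION_BY_STATE_DEG = {
    "parking_or_stopped": 30.0,
    "turning_at_intersection": 20.0,
    "cruising": 5.0,
}

def expand_second_range(left_deg, right_deg, state, limit_deg=45.0):
    """Widen the second irradiation range to the left and right, clamped to
    the limit of the area the headlight can irradiate."""
    widen = EXPANSION_BY_STATE_DEG.get(state, 0.0)
    return (max(-limit_deg, left_deg - widen), min(limit_deg, right_deg + widen))
```

  An unknown state falls back to no expansion, and the clamp plays the role of the irradiable-area limit described in the text.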
  • In the above description, the surroundings confirmation determination unit 141 widens the second irradiation range only in its left-right direction, but this is only an example; the surroundings confirmation determination unit 141 may also widen the second irradiation range in its up-down direction.
  • FIG. 37 is a flowchart for explaining the operation of the headlight control device 1f according to the seventh embodiment.
  • For example, when the headlight control device 1f determines that lighting control of the headlight 2 based on the driver's orientation is to be performed, it starts the operation shown in the flowchart of FIG. 37.
  • the headlight control device 1f repeats the operation shown in the flowchart of FIG. 37, for example, until the headlights 2 are turned off or the power of the vehicle 100 is turned off.
  • For example, the control unit (not shown) of the headlight control device 1f acquires information indicating the state of the headlight 2 from the headlight switch mounted on the vehicle 100, and determines whether the headlight 2 is in the on state.
  • If it determines that the headlight 2 is in the on state, the control unit determines to start lighting control of the headlight 2 based on the driver's orientation, and outputs information instructing the start of the lighting control of the headlight 2 to the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14b, and the headlight control unit 15a.
  • If the control unit determines that the headlight 2 is in the off state or that the power of the vehicle 100 is turned off, it determines to end the lighting control of the headlight 2 based on the driver's orientation, and outputs information instructing the end of the lighting control of the headlight 2 to the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14b, and the headlight control unit 15a.
  • The processing contents of step ST1-1, step ST1-2, steps ST2 to ST3a, and step ST4a are the same as those of step ST1-1, step ST1-2, steps ST2 to ST3a, and step ST4a in the operation of the headlight control device 1e shown in the flowchart of FIG. 31, which have already been explained in Embodiment 6, so duplicate explanations are omitted.
  • That is, the operation of the headlight control device 1f according to Embodiment 7 corresponds to the operation of the headlight control device 1e according to Embodiment 6 described using the flowchart of FIG. 31, to which steps ST3a-1 and ST3a-2 have been added.
  • In step ST3a, when the irradiation determination unit 14b sets the first irradiation range and the second irradiation range based on the depth distance estimated by the depth distance estimation unit 13 in step ST2, the surroundings confirmation determination unit 141 determines whether the driver is checking the surroundings based on the orientation information regarding the driver's orientation detected by the orientation detection unit 11 in step ST1-1 and the vehicle information acquired by the vehicle information acquisition unit 121 in step ST1-2 (step ST3a-1).
  • If it is determined in step ST3a-1 that the driver is checking the surroundings ("YES" in step ST3a-1), the surroundings confirmation determination unit 141 expands the second irradiation range set by the irradiation determination unit 14b in step ST3a (step ST3a-2).
  • The irradiation determination unit 14b then outputs to the headlight control unit 15a irradiation information including information indicating the first irradiation range set in step ST3a and information indicating the second irradiation range expanded by the surroundings confirmation determination unit 141 in step ST3a-2.
  • If it is determined in step ST3a-1 that the driver is not checking the surroundings ("NO" in step ST3a-1), the irradiation determination unit 14b outputs to the headlight control unit 15a irradiation information including information indicating the first irradiation range and the second irradiation range set in step ST3a. In this case, the operation of the headlight control device 1f skips the process of step ST3a-2 and proceeds to the process of step ST4a.
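  The branch at steps ST3a-1 and ST3a-2 can be summarized by the following sketch (hypothetical function and parameter names; the surroundings check and the expansion rule are passed in rather than implemented, since their details depend on the configuration):

```python
def decide_irradiation_info(first_range, second_range,
                            checking_surroundings, expand):
    """Sketch of steps ST3a-1/ST3a-2: expand the second irradiation range
    only when the driver is judged to be checking the surroundings, then
    output both ranges as the irradiation information."""
    if checking_surroundings:                # "YES" in step ST3a-1
        second_range = expand(second_range)  # step ST3a-2
    # irradiation information output to the headlight control unit
    return {"first": first_range, "second": second_range}
```

  When the check is negative, the expansion step is skipped and the ranges pass through unchanged, mirroring the "NO" path to step ST4a.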
  • As described above, the headlight control device 1f determines whether the driver is checking the surroundings based on the orientation information regarding the driver's orientation and the travel-related information (here, the vehicle information), and expands the second irradiation range when it determines that the driver is checking the surroundings. Therefore, the headlight control device 1f can brighten not only the estimated visible object in the direction in which the driver is looking, but also places where there may be objects that the driver is likely not seeing. In addition, while brightening such places, the headlight control device 1f can avoid giving glare to pedestrians or drivers of other vehicles in directions in which the driver is not looking.
  • In the above description, it is assumed that the headlight control device 1f is an in-vehicle device mounted on the vehicle 100, and that the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14b, the headlight control unit 15a, and the control unit (not shown) are included in the in-vehicle device.
  • the present invention is not limited to this; some of the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14b, the headlight control unit 15a, and the control unit (not shown) may be provided in the in-vehicle device mounted on the vehicle 100, while the others may be provided in a server connected to the in-vehicle device via a network. Alternatively, the server may include all of the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14b, the headlight control unit 15a, and the control unit (not shown).
  • The hardware configuration of the headlight control device 1f according to Embodiment 7 is the same as the hardware configuration of the headlight control device 1 described with reference to FIGS. 6A and 6B in Embodiment 1, and is therefore not illustrated.
  • In Embodiment 7, the functions of the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14b, the headlight control unit 15a, and the control unit (not shown) are realized by the processing circuit 1001.
  • That is, the headlight control device 1f includes a processing circuit 1001 for estimating the depth distance based on the orientation information regarding the driver's orientation detected from the in-vehicle captured image acquired from the in-vehicle imaging device 3 and on the travel-related information, and for controlling the lighting of the headlight 2 based on the estimated depth distance.
  • Processing circuit 1001 may be dedicated hardware as shown in FIG. 6A, or may be processor 1004 that executes a program stored in memory 1005 as shown in FIG. 6B.
  • When the processing circuit 1001 is the processor 1004, it reads out and executes the program stored in the memory 1005, thereby performing the functions of the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14b, the headlight control unit 15a, and the control unit (not shown). That is, the headlight control device 1f includes the memory 1005 for storing a program that, when executed by the processing circuit 1001, results in the execution of steps ST1-1, ST1-2, and ST2 to ST4a shown in FIG. 37.
  • In other words, it can be said that the program stored in the memory 1005 causes a computer to execute the procedures or methods of the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14b, the headlight control unit 15a, and the control unit (not shown).
  • the storage unit 16 includes, for example, a memory 1005.
  • the headlight control device 1f includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the headlight 2, the in-vehicle imaging device 3, or the driving-related information acquisition device 4.
  • As described above, according to Embodiment 7, the headlight control device 1f is configured such that the surroundings confirmation determination unit 141 determines whether the driver is checking the surroundings based on the orientation information regarding the driver's orientation detected by the orientation detection unit 11 and the travel-related information acquired by the travel-related information acquisition unit 12, and widens the second irradiation range when it determines that the driver is checking the surroundings. Therefore, the headlight control device 1f can brighten not only the estimated visible object in the direction in which the driver is looking, but also places where there may be objects that the driver is likely not seeing. In addition, while brightening such places, the headlight control device 1f can avoid giving glare to pedestrians or drivers of other vehicles in directions in which the driver is not looking.
  • In the embodiments described above, the depth distance estimation units 13, 13a, 13b, and 13c estimate the depth distance using the depth distance estimation information, but they may instead estimate the depth distance using a trained model (hereinafter referred to as a "machine learning model") that receives orientation information as input and outputs a depth distance.
  • the machine learning model may be a model that inputs orientation information and outputs depth distance and ideal width of the irradiation range, or a model that inputs orientation information and outputs depth distance and ideal light amount.
  • For example, an administrator or the like takes the vehicle 100 for a test run and collects orientation information and depth distance information to objects that the driver attempted to see during the test run, and a learning device generates the machine learning model using the collected orientation information and depth distance information as learning data.
  • the generated machine learning model is stored in a location where the headlight control devices 1, 1a, 1b, 1c, 1d, 1e, and 1f can refer to it.
  • the machine learning model may be a model that inputs not only orientation information but also map information, vehicle information, or information outside the vehicle, and outputs the depth distance and the ideal width of the irradiation range.
  • By using the machine learning model, the headlight control devices 1, 1a, 1b, 1c, 1d, 1e, and 1f can estimate a depth distance that corresponds to variations in the orientation information.
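  Purely as an assumption about what such a trained model could look like, the following sketch stands in a nearest-neighbor regressor for the machine learning model (the orientation encoding as yaw/pitch angles, the class name, and the sample values are invented for illustration and are not taken from the disclosure):

```python
class DepthDistanceModel:
    """Toy stand-in for the machine learning model: memorizes
    (orientation, depth distance) pairs collected on a test run and
    predicts the depth distance of the closest stored orientation."""

    def __init__(self):
        self.samples = []  # list of (yaw_deg, pitch_deg, depth_m)

    def fit(self, samples):
        self.samples = list(samples)

    def predict(self, yaw_deg, pitch_deg):
        # Nearest neighbor by squared angular distance.
        best = min(self.samples,
                   key=lambda s: (s[0] - yaw_deg) ** 2 + (s[1] - pitch_deg) ** 2)
        return best[2]

model = DepthDistanceModel()
model.fit([(0.0, -2.0, 40.0),    # looking straight ahead -> far
           (-25.0, -5.0, 10.0),  # glancing left -> near
           (25.0, -5.0, 10.0)])  # glancing right -> near
```

  A real learning device would of course use a proper regression model; the sketch only shows the input/output contract of orientation in, depth distance out.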
  • As described above, the headlight control device according to the present disclosure can, when controlling the lighting of the headlights of a vehicle based on the direction the driver is facing, perform lighting control that takes into consideration how far ahead in the direction of the driver's face or line of sight the driver is actually trying to see.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Lighting Device Outwards From Vehicle And Optical Signal (AREA)

Abstract

The present invention comprises: an orientation detection unit (11) which detects the orientation of a driver of a vehicle (100) on the basis of an image in which the driver is captured; a travel-related information acquisition unit (12, 12a, 12b, 12c) which acquires travel-related information; a depth distance estimation unit (13, 13a, 13b, 13c) which estimates a depth distance on the basis of the information about the driver's orientation detected by the orientation detection unit (11) and the travel-related information acquired by the travel-related information acquisition unit (12, 12a, 12b, 12c); an irradiation determination unit (14, 14a, 14b) which determines an irradiation range of light from a headlight (2) on the basis of the depth distance estimated by the depth distance estimation unit (13, 13a, 13b, 13c); and a headlight control unit (15, 15a) which causes the headlight (2) to irradiate the irradiation range determined by the irradiation determination unit (14, 14a, 14b) with light.

Description

Headlight control device and headlight control method
 The present disclosure relates to a headlight control device and a headlight control method.
 Conventionally, in lighting control of headlights in a vehicle, a technique is known that, in order to improve the driver's visibility, changes the irradiation range of the headlights based on the direction the driver is facing so that the headlight light is emitted in that direction (for example, Patent Document 1).
Japanese Patent Application Publication No. 2009-120148
 The conventional technology disclosed in Patent Document 1 has a problem in that the headlights cannot be controlled in consideration of how far ahead, in the direction the driver is facing, the driver is actually trying to see. As a result, situations could occur in which the area the driver is actually trying to see is not illuminated.
 The present disclosure has been made to solve the above problem, and an object thereof is to provide a headlight control device that, in lighting control of headlights in a vehicle based on the direction the driver is facing, enables lighting control that takes into consideration how far ahead in that direction the driver is actually trying to see.
 A headlight control device according to the present disclosure includes: a direction detection unit that detects the orientation of a driver of a vehicle based on a captured image of the driver; a travel-related information acquisition unit that acquires travel-related information related to travel of the vehicle; a depth distance estimation unit that estimates, based on the orientation information regarding the driver's orientation detected by the direction detection unit and the travel-related information acquired by the travel-related information acquisition unit, a depth distance that is the distance from the installation position of a headlight provided on the vehicle to an estimated visual recognition position, which is the position estimated to be where the driver is looking in the direction the driver is facing; an irradiation determination unit that determines an irradiation range of light from the headlight based on the depth distance estimated by the depth distance estimation unit; and a headlight control unit that causes the headlight to irradiate the irradiation range determined by the irradiation determination unit with the light.
 According to the present disclosure, in lighting control of headlights in a vehicle based on the direction the driver is facing, lighting control can be performed that takes into consideration how far ahead in that direction the driver is actually trying to see.
FIG. 1 is a diagram showing a configuration example of a headlight control device according to Embodiment 1.
FIG. 2 is a diagram showing an example of the content of the depth distance estimation information used by the depth distance estimation unit to estimate the depth distance in Embodiment 1.
FIG. 3 is a diagram showing, for the depth distances corresponding to the input information under conditions No. 1 to No. 6 set in the depth distance estimation information shown in FIG. 2, an example of the driving state of the vehicle and the driver's estimated visible object, estimated from the driver's behavior and the vehicle information, which are the basis for deriving those depth distances.
FIG. 4 is a diagram for explaining an example of how the headlight control unit causes the headlight to irradiate the irradiation range determined by the irradiation determination unit with light in Embodiment 1.
FIG. 5 is a flowchart for explaining the operation of the headlight control device according to Embodiment 1.
FIGS. 6A and 6B are diagrams showing an example of the hardware configuration of the headlight control device according to Embodiment 1.
FIG. 7 is a diagram showing a configuration example of a headlight control device according to Embodiment 2.
FIG. 8 is a diagram showing an example of the content of the depth distance estimation information used by the depth distance estimation unit to estimate the depth distance in Embodiment 2.
FIG. 9 is a diagram showing, for the depth distances corresponding to the input information under conditions No. 1 to No. 6 set in the depth distance estimation information shown in FIG. 8, an example of the driving state of the vehicle and the driver's visible object, estimated from the driver's behavior, the vehicle information, and the map information, which are the basis for deriving those depth distances.
FIG. 10A is a diagram for explaining an example of the depth distance estimated by the depth distance estimation unit in Embodiment 2, and FIG. 10B is a diagram for explaining an example of how the headlight control unit causes the headlight to irradiate light onto the irradiation range determined by the irradiation determination unit based on the depth distance shown in FIG. 10A.
FIGS. 11A and 11B are diagrams for explaining an example of how, in Embodiment 1, the depth distance estimation unit calculates the concrete extent (from how many meters to how many meters) of a depth distance range having a margin of 5 m that includes the sidewalk near an intersection and the crosswalk at the intersection.
FIG. 12A is a diagram for explaining another example of the depth distance estimated by the depth distance estimation unit in Embodiment 2, and FIG. 12B is a diagram for explaining an example of how the headlight control unit causes the headlight to irradiate light onto the irradiation range determined by the irradiation determination unit based on the depth distance shown in FIG. 12A.
FIG. 13 is a flowchart for explaining the operation of the headlight control device according to Embodiment 2.
FIG. 14 is a diagram showing a configuration example of a headlight control device according to Embodiment 3.
FIG. 15 is a diagram showing an example of the content of the depth distance estimation information used by the depth distance estimation unit to estimate the depth distance in Embodiment 3.
FIG. 16 is a diagram showing, for the depth distances corresponding to the input information under conditions No. 1 to No. 8 set in the depth distance estimation information shown in FIG. 15, an example of the driving state of the vehicle and the driver's visible object, estimated from the driver's behavior and the vehicle exterior information, which are the basis for deriving those depth distances.
FIG. 17 is a diagram for explaining an example of how, in Embodiment 3, the irradiation determination unit determines the irradiation range based on the depth distance estimated by the depth distance estimation unit and the headlight control unit causes the headlight to irradiate the determined irradiation range with light.
FIG. 18 is a diagram for explaining another such example in Embodiment 3.
FIG. 19 is a diagram for explaining yet another such example in Embodiment 3.
FIG. 20 is a flowchart for explaining the operation of the headlight control device according to Embodiment 3.
FIG. 21 is a diagram showing a configuration example of a headlight control device according to Embodiment 4.
FIG. 22 is a diagram showing an example of the content of the depth distance estimation information used by the depth distance estimation unit to estimate the depth distance in Embodiment 4.
12 is a diagram showing an example of the contents of depth distance estimation information used by a depth distance estimating unit to estimate a depth distance in Embodiment 4; 図22に示す奥行距離推定用情報において設定されている、No.1~No.7の条件にて入力情報に対応する奥行距離について、当該奥行距離が導き出される根拠となる、運転者の挙動と車両情報と地図情報と車外情報とから推定される車両の走行状態および運転者の視認対象物の一例と対応付けて示した図である。No. set in the depth distance estimation information shown in FIG. 1~No. Regarding the depth distance corresponding to the input information under the condition 7, the driving state of the vehicle and the driver's behavior estimated from the driver's behavior, vehicle information, map information, and external information are the basis for deriving the depth distance. It is a diagram shown in association with an example of a visible object. 図24Aは、実施の形態4において、奥行距離推定部が推定する奥行距離の一例について説明するための図であり、図24Bは、実施の形態4において、ヘッドライト制御部が、ヘッドライトに対して、奥行距離推定部が推定した図24Aに示すような奥行距離に基づき照射決定部が決定した照射範囲に、光を照射させた様子の一例を説明するための図である。FIG. 24A is a diagram for explaining an example of the depth distance estimated by the depth distance estimating section in the fourth embodiment, and FIG. 24B is a diagram for explaining an example of the depth distance estimated by the depth distance estimating section in the fourth embodiment. FIG. 24A is a diagram for explaining an example of how light is irradiated onto the irradiation range determined by the irradiation determining unit based on the depth distance as shown in FIG. 24A estimated by the depth distance estimating unit. 実施の形態4に係るヘッドライト制御装置の動作について説明するためのフローチャートである。7 is a flowchart for explaining the operation of the headlight control device according to Embodiment 4. FIG. 実施の形態5に係るヘッドライト制御装置の構成例を示す図である。FIG. 7 is a diagram illustrating a configuration example of a headlight control device according to a fifth embodiment. 実施の形態5において、信頼度判定部が、向き検出部によって検出された運転者の向きの信頼度が低いと判定する場面の一例を説明するための図である。FIG. 
12 is a diagram for explaining an example of a scene in which the reliability determination unit determines that the reliability of the driver's orientation detected by the orientation detection unit is low in the fifth embodiment. 実施の形態5に係るヘッドライト制御装置の動作について説明するためのフローチャートである。7 is a flowchart for explaining the operation of the headlight control device according to Embodiment 5. FIG. 実施の形態6に係るヘッドライト制御装置の構成例を示す図である。FIG. 7 is a diagram showing a configuration example of a headlight control device according to a sixth embodiment. 実施の形態6において、ヘッドライト制御部が、ヘッドライトに対して、照射決定部が決定した第1照射範囲および第2照射範囲に光を照射させた様子の一例を説明するための図であって、図30Aは、第1照射範囲および第2照射範囲に光が照射されている様子を横から見た図であり、図30Bは、第1照射範囲および第2照射範囲に光が照射されている様子を車両側から見た図である。FIG. 7 is a diagram for explaining an example of how the headlight control unit causes the headlight to irradiate light onto the first irradiation range and the second irradiation range determined by the irradiation determination unit in the sixth embodiment; FIG. 30A is a side view showing how the first irradiation range and the second irradiation range are irradiated with light, and FIG. 30B is a diagram showing how the first irradiation range and the second irradiation range are irradiated with light. FIG. 実施の形態6に係るヘッドライト制御装置の動作について説明するためのフローチャートである。12 is a flowchart for explaining the operation of the headlight control device according to the sixth embodiment. 実施の形態7に係るヘッドライト制御装置の構成例を示す図である。FIG. 7 is a diagram illustrating a configuration example of a headlight control device according to a seventh embodiment. 実施の形態7において、周囲確認判定部が、運転者が周囲の確認を行っているか否かの判定に用いる周囲確認判定用条件の内容の一例を説明するための図である。FIG. 12 is a diagram for explaining an example of the content of the surroundings confirmation determination condition used by the surroundings confirmation determination unit to determine whether the driver is checking the surroundings in Embodiment 7; 実施の形態7において、周囲確認判定部が、照射決定部が設定した第2照射範囲を広げた後の、照射範囲に光を照射させた様子の一例を説明するための図である。FIG. 
12 is a diagram for explaining an example of how the surrounding confirmation determination unit irradiates the irradiation range with light after expanding the second irradiation range set by the irradiation determination unit in Embodiment 7; 実施の形態7において、ヘッドライト制御装置がヘッドライトに対して照射させる光の照射範囲の一例について説明するための図であって、車両の周囲の状況を上から見た俯瞰図である。FIG. 7 is a diagram for explaining an example of the irradiation range of light that the headlight control device irradiates to the headlights in Embodiment 7, and is a bird's-eye view of the surroundings of the vehicle viewed from above. 実施の形態7において、ヘッドライト制御装置がヘッドライトに対して照射させる光の照射範囲の一例について説明するための図であって、図36Bは、図35に示すような車両の周囲の状況において、ヘッドライト制御装置がヘッドライトに対して照射させる光の照射範囲の一例を示す図であり、図36Aは、図35に示すような車両の周囲の状況において、実施の形態6に係るヘッドライト制御装置がヘッドライトに対して照射させる光の照射範囲の一例を示す図である。FIG. 36B is a diagram for explaining an example of the irradiation range of light that the headlight control device irradiates to the headlights in Embodiment 7, and FIG. 36A is a diagram illustrating an example of the irradiation range of light that the headlight control device irradiates to the headlights, and FIG. FIG. 3 is a diagram showing an example of an irradiation range of light emitted by a control device to a headlight. 実施の形態7に係るヘッドライト制御装置の動作について説明するためのフローチャートである。13 is a flowchart for explaining the operation of the headlight control device according to Embodiment 7.
Embodiments of the present disclosure will be described in detail below with reference to the drawings.
Embodiment 1.
FIG. 1 is a diagram showing a configuration example of a headlight control device 1 according to Embodiment 1.
In Embodiment 1, it is assumed that the headlight control device 1 is mounted on a vehicle 100.
The headlight control device 1 controls the lighting of headlights 2 provided on the vehicle 100 based on the orientation of the driver of the vehicle 100. In Embodiment 1, the "driver's orientation" is expressed by the direction of the driver's face or the direction of the driver's line of sight. The "driver's orientation" may further include, in addition to the driver's face direction or line-of-sight direction, the orientation of the driver's body, in other words, the driver's posture. In Embodiment 1 below, as an example, the "driver's orientation" is assumed to include the driver's posture.
In Embodiment 1, the lighting control of the headlights 2 based on the driver's orientation is assumed to be performed by the headlight control device 1 in places where the surroundings of the vehicle 100 are dark, such as a parking lot at night or an urban area at night, when the driver turns the headlights 2 on, or when the headlight control device 1 determines from the ambient brightness that the headlights 2 should be turned on automatically.
The headlight control device 1 is connected to the headlights 2, an in-vehicle imaging device 3, and a travel-related information acquisition device 4. The headlights 2, the in-vehicle imaging device 3, and the travel-related information acquisition device 4 are provided on the vehicle 100.
The headlights 2 are lighting fixtures that illuminate the area ahead of the vehicle 100. The headlights 2 are, for example, general headlights capable of emitting a high beam, a low beam, and an auxiliary light, so a detailed description of their configuration is omitted. The headlights 2 comprise a left light (not shown) mounted on the vehicle 100 on the left side with respect to the traveling direction of the vehicle 100, and a right light (not shown) mounted on the right side with respect to the traveling direction. The left light and the right light each comprise a high beam unit (not shown) that illuminates distant areas, a low beam unit (not shown) that illuminates nearby areas, and an auxiliary light unit (not shown).
The high beam unit, the low beam unit, and the auxiliary light unit are each composed of, for example, a plurality of light sources (not shown), such as LED light sources, arranged in an array, and each light source can be turned on individually. In Embodiment 1, "arranged in an array" means that the light sources are arranged in a row along the width direction of the vehicle 100. By turning on the individual light sources, the high beam, the low beam, or the auxiliary light is irradiated onto a region in front of the vehicle 100. The high beam unit, the low beam unit, and the auxiliary light unit may each use, for example, MEMS (Micro Electro Mechanical Systems); by reflecting the light off MEMS mirrors, they can control their light distribution ranges.
In Embodiment 1, the region in front of the vehicle 100 that the high beam unit can irradiate with the high beam is referred to as the "high beam irradiable region". How far ahead of the vehicle 100 this region extends, and over what range, is determined in advance according to the specifications of the high beam unit and the like. Similarly, the region in front of the vehicle 100 that the low beam unit can irradiate with the low beam is referred to as the "low beam irradiable region", and the region in front of the vehicle 100 that the auxiliary light unit can irradiate with the auxiliary light is referred to as the "auxiliary light irradiable region"; the extent and range of each are likewise determined in advance according to the specifications of the respective unit.
The headlight control device 1 according to Embodiment 1 controls whether the high beam, the low beam, or the auxiliary light is emitted or blocked by, for example, turning each light source on or off. In this way, the headlight control device 1 controls the range irradiated with light by the headlights 2.
Note that the headlight control device 1 can not only turn each light source on and off but also control the amount of light emitted while a source is on. For example, the headlight control device 1 can control the light output of the headlights 2 by controlling the current value of each light source of the headlights 2.
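The per-source on/off and current control described above can be sketched as follows. This is a minimal illustration only; the `LedArrayUnit` class, its angular model, and all numeric values are assumptions for the sketch, not the disclosed implementation.

```python
# Hypothetical model of one beam unit whose LEDs are arranged in a row across
# the vehicle width, each LED covering an equal horizontal angular slice.
class LedArrayUnit:
    def __init__(self, num_leds: int, left_deg: float, right_deg: float):
        self.num_leds = num_leds
        self.left_deg = left_deg            # leftmost edge of the irradiable region
        self.right_deg = right_deg          # rightmost edge of the irradiable region
        self.currents = [0.0] * num_leds    # normalized drive current per LED (0..1)

    def led_span(self, i: int) -> tuple[float, float]:
        """Horizontal angular slice [start, end) covered by LED i."""
        step = (self.right_deg - self.left_deg) / self.num_leds
        start = self.left_deg + i * step
        return start, start + step

    def irradiate(self, from_deg: float, to_deg: float, level: float = 1.0) -> None:
        """Drive only the LEDs whose slices overlap [from_deg, to_deg]; others off."""
        for i in range(self.num_leds):
            start, end = self.led_span(i)
            overlaps = (end > from_deg) and (start < to_deg)
            self.currents[i] = level if overlaps else 0.0

# Example: irradiate only the slice from -5 deg to +10 deg at 80% current.
unit = LedArrayUnit(num_leds=8, left_deg=-20.0, right_deg=20.0)
unit.irradiate(from_deg=-5.0, to_deg=10.0, level=0.8)
```

Turning the overlapping sources fully on and the others off yields the selective irradiation range; scaling `level` corresponds to the current-based light-output control mentioned above.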
The in-vehicle imaging device 3 is a camera or the like installed in the vehicle 100 for the purpose of monitoring the interior of the vehicle 100, and is installed so as to be able to capture at least the driver's face.
The in-vehicle imaging device 3 is an infrared camera, a visible light camera, or the like.
The in-vehicle imaging device 3 outputs the captured image (hereinafter referred to as the "in-vehicle captured image") to the headlight control device 1.
The in-vehicle imaging device 3 may be shared with the imaging device of a so-called "Driver Monitoring System (DMS)" mounted on the vehicle 100 to monitor the condition of the driver in the vehicle 100.
The travel-related information acquisition device 4 acquires information related to the travel of the vehicle 100 (hereinafter referred to as "travel-related information").
In Embodiment 1, the travel-related information acquisition device 4 is assumed to be, for example, a vehicle speed sensor (not shown) or a steering angle sensor (not shown). Such a device acquires, as travel-related information, information about the vehicle 100 (hereinafter referred to as "vehicle information"), such as the vehicle speed or the steering angle.
The travel-related information acquisition device 4, here a vehicle speed sensor, a steering angle sensor, or the like, outputs the acquired travel-related information, here the vehicle information, to the headlight control device 1.
The headlight control device 1 includes an orientation detection unit 11, a travel-related information acquisition unit 12, a depth distance estimation unit 13, an irradiation determining unit 14, a headlight control unit 15, and a storage unit 16. The travel-related information acquisition unit 12 includes a vehicle information acquisition unit 121.
The orientation detection unit 11 acquires the in-vehicle captured image from the in-vehicle imaging device 3, and detects the driver's orientation based on the acquired in-vehicle captured image.
The orientation detection unit 11 detects the driver's facial parts (for example, eyes, nose, and mouth) in the in-vehicle captured image acquired from the in-vehicle imaging device 3, and detects the driver's face direction or line-of-sight direction using a known image recognition technique for detecting a person's face direction from a captured image of the person's face, or a known image recognition technique for detecting a person's line-of-sight direction from such an image.
For example, when the in-vehicle imaging device 3 is an infrared camera, the orientation detection unit 11 can irradiate the driver with a near-infrared point light source and detect the driver's line-of-sight direction from the positional relationship between the pupil and the Purkinje image reflected by the cornea. As another example, the orientation detection unit 11 can detect the driver's face direction by pattern matching the in-vehicle captured image against standard face-image patterns prepared in advance for each face angle and stored in the orientation detection unit 11, and finding the face angle with the highest similarity.
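The pattern-matching idea mentioned above, choosing the face angle whose stored standard pattern is most similar to the captured image, can be sketched as follows. The similarity measure (normalized cross-correlation) and the toy patterns are illustrative assumptions, not the patent's method.

```python
# Toy sketch of face-angle estimation by template matching: compare the
# captured face region against one standard pattern per candidate angle
# and return the angle of the best-matching pattern.
import numpy as np

def face_angle_by_matching(face_img: np.ndarray, patterns: dict) -> float:
    """patterns: dict mapping candidate face angle (deg) -> template array."""
    def similarity(a: np.ndarray, b: np.ndarray) -> float:
        # Normalized cross-correlation of the two standardized images.
        a = (a - a.mean()) / (a.std() + 1e-9)
        b = (b - b.mean()) / (b.std() + 1e-9)
        return float((a * b).mean())
    return max(patterns, key=lambda angle: similarity(face_img, patterns[angle]))

# Example with two made-up 3x3 "patterns" for 0 deg and 30 deg.
patterns = {0: np.arange(9.0).reshape(3, 3),
            30: np.arange(9.0).reshape(3, 3)[::-1]}
angle = face_angle_by_matching(patterns[30], patterns)
```

A real detector would operate on aligned face crops and many candidate angles; the argmax-over-similarity structure is the point being illustrated.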
The orientation detection unit 11 also detects the driver's posture using a known image recognition technique for detecting a person's posture from a captured image of the person.
Note that the orientation detection unit 11 only needs to detect at least one of the driver's face direction and line-of-sight direction.
For example, the orientation detection unit 11 may detect both the driver's face direction and line-of-sight direction and adopt whichever has the higher reliability. For example, when the driver is wearing sunglasses or eyeglasses, the orientation detection unit 11 determines that the driver's face direction is more reliable than the driver's line-of-sight direction.
Alternatively, the orientation detection unit 11 may detect both the driver's face direction and line-of-sight direction, assign priorities, and adopt one of them as the driver's orientation. For example, when the difference between the detected face direction and line-of-sight direction is large, the orientation detection unit 11 adopts the line-of-sight direction as the driver's orientation, and when the difference is small, it adopts the face direction.
In Embodiment 1, the orientation detection unit 11 detects, for example, the driver's face direction or line-of-sight direction with the center of the driver's head as the reference. Since the installation position and angle of view of the in-vehicle imaging device 3 are known in advance, the orientation detection unit 11 can calculate the position of the center of the driver's head. The position of the center of the driver's head is a point in real space and is expressed, for example, as coordinate values that can be mapped onto a map.
Also, in Embodiment 1, the driver's face direction or line-of-sight direction is expressed, for example, as a horizontal angle and a vertical angle relative to a straight line passing through the center of the driver's head and a point directly in front of that center. In Embodiment 1, "horizontal" is not limited to the strictly horizontal and includes the substantially horizontal; likewise, "vertical" is not limited to the strictly vertical and includes the substantially vertical.
Specifically, the driver's face direction or line-of-sight direction is expressed as an angle that is zero when the driver faces straight ahead with respect to the traveling direction of the vehicle 100, takes on larger values the further the driver turns to the right or upward from that state, and takes on smaller values the further the driver turns to the left or downward. In Embodiment 1, "straight ahead" is not limited to the strictly straight ahead and includes the substantially straight ahead.
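The signed-angle convention above can be illustrated as follows, together with a simple discretization into the "front / right / left" and "front / up / down" labels used later in the depth distance estimation information. The dataclass and the dead-zone threshold are assumptions for the sketch.

```python
# Illustrative representation of driver orientation as signed angles:
# positive horizontal = right of straight ahead, positive vertical = upward.
from dataclasses import dataclass

FRONT_DEAD_ZONE_DEG = 10.0  # assumed tolerance for "substantially straight ahead"

@dataclass
class DriverOrientation:
    horizontal_deg: float  # > 0: right of straight ahead, < 0: left
    vertical_deg: float    # > 0: above straight ahead, < 0: below

    def horizontal_label(self) -> str:
        if abs(self.horizontal_deg) <= FRONT_DEAD_ZONE_DEG:
            return "front"
        return "right" if self.horizontal_deg > 0 else "left"

    def vertical_label(self) -> str:
        if abs(self.vertical_deg) <= FRONT_DEAD_ZONE_DEG:
            return "front"
        return "up" if self.vertical_deg > 0 else "down"

# Example: face turned 25 deg to the right, nearly level vertically.
o = DriverOrientation(horizontal_deg=25.0, vertical_deg=-3.0)
```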
In Embodiment 1 below, when the term "driver's face direction" is used on its own, it means "the driver's face direction or line-of-sight direction", and thus also covers the driver's line-of-sight direction.
The orientation detection unit 11 outputs information about the detected driver orientation (hereinafter referred to as "orientation information") to the depth distance estimation unit 13 and also stores it in the storage unit 16. The orientation information includes information indicating the driver's face direction and posture; specifically, information indicating the horizontal and vertical angles of the driver's face direction and information indicating the horizontal and vertical angles of the driver's posture.
When storing the orientation information in the storage unit 16, the orientation detection unit 11 attaches, for example, the detection time to the orientation information before storing it.
Alternatively, a plurality of storage units 16 (for example, 50) may be provided, and the orientation detection unit 11 may store the latest 50 pieces of orientation information in the respective storage units 16. In this case, the orientation detection unit 11 need not attach the detection time to the orientation information when storing it in the storage units 16.
Note that in Embodiment 1, when the orientation detection unit 11 fails to detect the driver's orientation, it stores, for example, an invalid value as the orientation information in the storage unit 16, and determines whether a preset time (hereinafter referred to as the "orientation detection determination time") has elapsed since the driver's orientation stopped being detected. When no detection time is attached to the stored orientation information, the orientation detection unit 11 can make this determination from how many invalid values have been stored consecutively in the storage unit 16. If the orientation detection determination time has not yet elapsed, the orientation detection unit 11 refers to the storage unit 16 and outputs the orientation information indicating the most recently detected driver orientation to the depth distance estimation unit 13.
If it determines that the orientation detection determination time has elapsed since the driver's orientation stopped being detected, the orientation detection unit 11 outputs, to the headlight control unit 15, orientation information in which the driver's orientation is set to an invalid value (hereinafter referred to as "orientation invalid information"). The orientation detection unit 11 may output the orientation invalid information to the headlight control unit 15 via the depth distance estimation unit 13 and the irradiation determining unit 14, or may output it directly to the headlight control unit 15. Note that the arrow from the orientation detection unit 11 to the headlight control unit 15 is omitted in FIG. 1.
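The buffering and timeout behavior described above, keep the latest 50 entries, count consecutive invalid values to detect a timeout, and otherwise fall back to the most recently detected orientation, can be sketched as follows. The buffer size matches the "latest 50 pieces" example; the frame count standing in for the orientation detection determination time is an assumption.

```python
# Sketch: fixed-capacity buffer of orientation entries with a
# consecutive-invalid-value timeout, as described in the text above.
from collections import deque

BUFFER_SIZE = 50    # "latest 50 pieces of orientation information"
INVALID = None      # invalid value stored when detection fails
TIMEOUT_COUNT = 15  # assumed frame count equivalent to the determination time

buffer = deque(maxlen=BUFFER_SIZE)

def record(orientation) -> None:
    """Store one detection result (an angle tuple, or INVALID on failure)."""
    buffer.append(orientation)

def orientation_for_output():
    """Orientation to pass downstream, or INVALID after a timeout."""
    consecutive_invalid = 0
    for entry in reversed(buffer):
        if entry is INVALID:
            consecutive_invalid += 1
        else:
            break
    if consecutive_invalid >= TIMEOUT_COUNT:
        return INVALID                  # emit "orientation invalid information"
    for entry in reversed(buffer):
        if entry is not INVALID:
            return entry                # most recently detected orientation
    return INVALID                      # nothing valid recorded yet

record((12.0, -2.0))
for _ in range(5):
    record(INVALID)  # short dropout: still below the timeout threshold
```

With only five consecutive invalid entries, the most recent valid orientation is still reported; once the run of invalid entries reaches the threshold, the invalid value is reported instead.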
The travel-related information acquisition unit 12 acquires the travel-related information from the travel-related information acquisition device 4.
Specifically, the vehicle information acquisition unit 121 acquires the vehicle information from the travel-related information acquisition device 4.
The vehicle information acquisition unit 121 outputs the acquired vehicle information to the depth distance estimation unit 13 as travel-related information.
The depth distance estimation unit 13 estimates the depth distance based on the orientation information about the driver's orientation detected by the orientation detection unit 11 and the travel-related information acquired by the travel-related information acquisition unit 12.
Specifically, the depth distance estimation unit 13 estimates the depth distance by comparing the orientation information and the travel-related information (here, the vehicle information) against depth distance estimation information in which information about the driver's behavior is associated with travel-related information.
In Embodiment 1, the depth distance is the distance from the installation position of the headlights 2 provided on the vehicle 100 to the position, in the direction the driver is facing, that the driver is estimated to be trying to view (hereinafter referred to as the "estimated viewing position"). In other words, the depth distance is the distance from the point indicating the installation position of the headlights 2 to the point indicating the estimated viewing position on a virtual straight line indicating the direction in which the driver is facing. The depth distance may be the distance from the installation position of the right light to the estimated viewing position, or the distance from the installation position of the left light to the estimated viewing position.
The depth distance estimation information is information for estimating the depth distance; it is set in advance by an administrator or the like and stored in a location that the depth distance estimation unit 13 can reference.
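For illustration, the estimated viewing position defined above is simply the point at the estimated depth distance along the ray from the headlight installation position in the direction the driver is facing. A minimal sketch, under an assumed coordinate convention (x forward, y left, z up) that is not specified in the text:

```python
# Illustrative geometry: point reached by travelling depth_m from the
# headlight position along the driver's facing direction, where the signed
# horizontal/vertical angles follow the convention described earlier
# (0 deg = straight ahead, positive = right/up).
import math

def estimated_viewing_position(headlight_pos, horiz_deg, vert_deg, depth_m):
    x0, y0, z0 = headlight_pos
    h = math.radians(horiz_deg)
    v = math.radians(vert_deg)
    dx = math.cos(v) * math.cos(h)
    dy = -math.cos(v) * math.sin(h)  # positive horizontal angle = to the right
    dz = math.sin(v)
    return (x0 + depth_m * dx, y0 + depth_m * dy, z0 + depth_m * dz)

# Facing straight ahead with a 20 m depth distance: the estimated viewing
# position is 20 m directly in front of the headlight.
p = estimated_viewing_position((0.0, 0.0, 0.7), 0.0, 0.0, 20.0)
```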
Here, FIG. 2 is a diagram showing an example of the contents of the depth distance estimation information that the depth distance estimation unit 13 uses to estimate the depth distance in Embodiment 1.
As shown in FIG. 2, the depth distance estimation information is table-format information in which input information and estimation information are defined and associated with each other. In Embodiment 1, as shown in FIG. 2, the driver's behavior and the vehicle information, for example, are defined as the input information, and the depth distance is defined as the estimation information.
In the depth distance estimation information, the driver's behavior includes, for example, the vertical direction of the driver's face, the horizontal direction of the driver's face, and the driver's state. In the depth distance estimation information, the vehicle information includes, for example, the vehicle speed and the steering angle.
The vertical direction of the driver's face is expressed as "front" when the face is directed straight ahead, "up" when it is directed upward, or "down" when it is directed downward. The horizontal direction of the driver's face is expressed as "front" when the face is directed straight ahead, "right" when it is directed to the right, or "left" when it is directed to the left.
The driver's state includes, for example, a state in which the driver is leaning forward, and a state in which the amount of change in the driver's face direction is less than or equal to a preset threshold value.
The depth distance estimation unit 13 determines the driver's behavior based on the orientation information output from the orientation detection unit 11, matches the determined behavior and the information indicating the vehicle speed and steering angle included in the vehicle information acquired by the vehicle information acquisition unit 121 against the driver's behavior and vehicle information set as the input information in the depth distance estimation information as shown in FIG. 2, and obtains the depth distance set as the estimation information, thereby estimating the depth distance.
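The matching described above can be sketched as a simple table lookup. This is a minimal illustration under stated assumptions: only condition No. 6 (the expressway sign case) is taken from the text, while the field names, the matching rule, and the fallback value are hypothetical stand-ins for the full contents of FIG. 2.

```python
# Hypothetical sketch of the depth distance estimation table lookup.
# Only condition No. 6 is taken from the text; field names and the
# fallback (initial) value are illustrative assumptions.

def matches(cond, behavior, vehicle):
    """Return True if the determined behavior and vehicle information
    satisfy one row (condition) of the depth distance estimation table."""
    return (cond["face_vertical"] == behavior["face_vertical"]
            and vehicle["speed_kmh"] >= cond["min_speed_kmh"]
            and abs(vehicle["steering_deg"]) <= cond["max_steering_deg"])

DEPTH_TABLE = [
    # No. 6: face "upward", speed >= 80 km/h, steering <= 5 deg -> 80-100 m
    {"no": 6, "face_vertical": "upward", "min_speed_kmh": 80.0,
     "max_steering_deg": 5.0, "depth_m": (80.0, 100.0)},
]

DEFAULT_DEPTH_M = (40.0, 60.0)  # assumed initial value for unmatched input

def estimate_depth(behavior, vehicle):
    for cond in DEPTH_TABLE:
        if matches(cond, behavior, vehicle):
            return cond["depth_m"]
    return DEFAULT_DEPTH_M  # no row matched: fall back to the initial value
```

With the worked example from the text (face "upward", 90 km/h, 3 degrees right), `estimate_depth` returns the range for condition No. 6.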
For example, the depth distance estimation unit 13 determines, from the information indicating the driver's vertical face orientation included in the orientation information, whether the vertical direction of the driver's face orientation is "front", "upward", or "downward". For example, information defining the angle range regarded as "front", the angle range regarded as "upward", and the angle range regarded as "downward" (hereinafter referred to as "vertical angle range information") is set in advance and stored in a location that the depth distance estimation unit 13 can reference. Based on the driver's vertical face orientation detected by the orientation detection unit 11, the depth distance estimation unit 13 refers to the vertical angle range information and determines whether the vertical direction of the driver's face orientation is "front", "upward", or "downward".
Further, for example, the depth distance estimation unit 13 determines, from the information indicating the driver's horizontal face orientation included in the orientation information, whether the horizontal direction of the driver's face orientation is "front", "right", or "left". For example, information defining the angle range regarded as "front", the angle range regarded as "right", and the angle range regarded as "left" (hereinafter referred to as "horizontal angle range information") is set in advance and stored in a location that the depth distance estimation unit 13 can reference. Based on the driver's horizontal face orientation detected by the orientation detection unit 11, the depth distance estimation unit 13 refers to the horizontal angle range information and determines whether the horizontal direction of the driver's face orientation is "front", "right", or "left".
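The angle-range classification above can be sketched as follows. The ±10-degree "front" boundary is an illustrative assumption; the embodiment leaves the actual angle ranges to the preset angle range information.

```python
# Classify a face-orientation angle into the categories used by the
# depth distance estimation information, using preset angle ranges.
# The +/-10 degree "front" boundary is a hypothetical choice.

FRONT_RANGE_DEG = (-10.0, 10.0)

def classify_vertical(pitch_deg):
    """Vertical face orientation: "front", "upward", or "downward"."""
    lo, hi = FRONT_RANGE_DEG
    if lo <= pitch_deg <= hi:
        return "front"
    return "upward" if pitch_deg > hi else "downward"

def classify_horizontal(yaw_deg):
    """Horizontal face orientation: "front", "right", or "left"
    (positive yaw assumed to mean rightward)."""
    lo, hi = FRONT_RANGE_DEG
    if lo <= yaw_deg <= hi:
        return "front"
    return "right" if yaw_deg > hi else "left"
```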
Further, for example, the depth distance estimation unit 13 determines, based on the information indicating the driver's posture included in the orientation information, whether the driver's posture corresponds to the preset "leaning forward" state. A condition specifying how far the driver's posture must be inclined to be judged as "leaning forward" is set in advance by an administrator or the like and stored in a location that the depth distance estimation unit 13 can reference.
Further, for example, the depth distance estimation unit 13 calculates the amount of change in the driver's face orientation from the time-series orientation information stored in the storage unit 16, and determines whether the amount of change is less than or equal to a threshold value. Note that the threshold value used for this determination is set in advance by an administrator or the like and stored in a location that the depth distance estimation unit 13 can reference.
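The change-amount check can be sketched as below. Taking the amount of change as the spread of the face angle over a recent window, and the 2-degree threshold, are assumptions; the embodiment only states that a change amount is compared against a preset threshold.

```python
# Determine whether the driver's face orientation is steady by
# comparing the change over time-series orientation samples with a
# preset threshold. The "max - min over a window" definition and the
# 2.0 degree default threshold are assumed for illustration.

def orientation_change(samples_deg):
    """Amount of change, taken here as max - min over the window."""
    return max(samples_deg) - min(samples_deg)

def is_steady(samples_deg, threshold_deg=2.0):
    """True if the change amount is less than or equal to the threshold."""
    return orientation_change(samples_deg) <= threshold_deg
```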
For example, assume now that the driver's face orientation is within the "upward" range, that the traveling speed of the vehicle 100 is 90 km/h, and that the steering angle of the vehicle 100 is 3 degrees to the right. Assume also that the depth distance estimation information has the contents shown in FIG. 2.
In this case, using the orientation information, the travel-related information (here, vehicle information), and the depth distance estimation information, the depth distance estimation unit 13 estimates the depth distance to be "80 to 100 m" (see No. 6 of the depth distance estimation information in FIG. 2).
For example, when the driver's face orientation is "upward", the vehicle speed is "80 km/h" or more, and the steering angle is "5 degrees or less to the right or left", the vehicle 100 is presumably traveling on an expressway, and the object presumed to lie ahead of the driver's face orientation and which the driver is likely trying to see, in other words, the object presumed to exist at the estimated visual recognition position (hereinafter referred to as the "estimated visual recognition target"), is presumed to be a road sign. Therefore, in the depth distance estimation information, the depth distance for this condition is set to "80 to 100 m", which is the depth at which such a sign is expected to be located when the driver checks it while the vehicle 100 travels on an expressway at "80 km/h" or more with a steering angle of "5 degrees or less to the right or left" (No. 6 of the depth distance estimation information in FIG. 2).
Similarly, for conditions No. 1 to No. 5 of the depth distance estimation information in FIG. 2, the depth distance is set based on the traveling state of the vehicle 100 estimated from the driver's behavior and the vehicle information, and on the driver's estimated visual recognition target.
FIG. 3 is a diagram showing, for the depth distances corresponding to the input information under conditions No. 1 to No. 6 set in the depth distance estimation information shown in FIG. 2, an example of the traveling state of the vehicle 100 and of the driver's estimated visual recognition target estimated from the driver's behavior and the vehicle information, which serve as the basis for deriving each depth distance.
An administrator or the like generates the depth distance estimation information based on, for example, the correspondence shown in FIG. 3 between the traveling state of the vehicle 100 estimated from the driver's behavior and vehicle information and an example of the driver's estimated visual recognition target. The administrator or the like may also verify this correspondence using information obtained by test-driving the vehicle 100 and then generate the depth distance estimation information.
Note that although six condition patterns, No. 1 to No. 6, are set in the depth distance estimation information shown in FIG. 2, this is merely an example, as are the contents of the depth distance estimation information shown in FIG. 2.
The depth distance estimation information may be any information from which depth distance information can be obtained from the orientation information and the travel-related information (here, vehicle information).
Further, as in the example described above, the depth distance may be set as a value having a range.
For example, when the depth distance estimation unit 13 estimates the depth distance to be "80 to 100 m", this means, more precisely, that it has estimated the distance from the installation position of the headlight 2 to the estimated visual recognition position to be in the range of 80 m to 100 m.
The depth distance estimation unit 13 outputs information regarding the estimated depth distance (hereinafter referred to as "depth distance information") to the irradiation determination unit 14.
The depth distance information is information in which the depth distance and the estimated visual recognition position are associated with each other. Since the installation position of the headlight 2 and the position of the center of the driver's head are known, the depth distance estimation unit 13 can estimate the estimated visual recognition position based on the positional relationship between the installation position of the headlight 2 and the center of the driver's head, the estimated depth distance, and the driver's orientation. The estimated visual recognition position is a single point in real space, expressed, for example, as coordinate values with the right lamp or left lamp as the origin, or as coordinate values that can be mapped onto a map.
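One way to realize this estimate can be sketched as follows: find the point on the driver's gaze ray (from the head center along the face direction) whose distance from the lamp equals the estimated depth distance. The coordinate conventions and the specific geometric construction are assumptions; the embodiment only states that the position follows from the positional relationship, the depth distance, and the orientation.

```python
import math

def _normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def estimated_position(head, direction, lamp, depth):
    """Point on the gaze ray (head + t * direction, t >= 0) whose
    distance from the lamp position equals the depth distance.
    Solves |head + t*d - lamp|^2 = depth^2 for the larger root t.
    Coordinates are arbitrary but consistent 3D; this construction
    is an illustrative assumption."""
    d = _normalize(direction)
    rel = [h - l for h, l in zip(head, lamp)]
    b = 2.0 * sum(r * di for r, di in zip(rel, d))
    c = sum(r * r for r in rel) - depth * depth
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # the ray never reaches that distance from the lamp
    t = (-b + math.sqrt(disc)) / 2.0
    return [h + t * di for h, di in zip(head, d)]
```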
Note that when the driver's behavior and vehicle information determined based on the orientation information detected by the orientation detection unit 11 and the travel-related information acquired by the travel-related information acquisition unit 12 (here, the vehicle information acquired by the vehicle information acquisition unit 121) do not match the driver's behavior and vehicle information set as the input information in the depth distance estimation information, the depth distance estimation unit 13 regards the depth distance as not estimable from the depth distance estimation information and sets the depth distance to its initial value. The initial value of the depth distance is, for example, set in advance by an administrator or the like and stored in a location that the depth distance estimation unit 13 can reference.
The depth distance estimation unit 13 then outputs depth distance information regarding the depth distance set to the initial value to the irradiation determination unit 14.
The irradiation determination unit 14 determines the range of light irradiation by the headlight 2 based on the depth distance estimated by the depth distance estimation unit 13.
Specifically, the irradiation determination unit 14 determines how much of the irradiable area of the headlight 2, in other words, of the high-beam irradiable area, the low-beam irradiable area, and the auxiliary-light irradiable area, is to be irradiated with light, as the irradiation range of the headlight 2.
In the following description, the range of light irradiation by the headlight 2 is also simply referred to as the "irradiation range".
An example of how the irradiation determination unit 14 determines the irradiation range will now be described.
The irradiation determination unit 14 determines the vertical extent and the horizontal extent of the irradiation range in the direction in which the driver is facing.
Specifically, taking the installation position of the headlight 2 as the reference position and the irradiation angle of light emitted straight ahead from that installation position as the reference (0 degrees), the irradiation determination unit 14 determines, in the direction in which the driver is facing, from what angle to what angle in the vertical direction constitutes the vertical extent of the irradiation range, and from what angle to what angle in the horizontal direction constitutes the horizontal extent of the irradiation range.
For example, for the vertical extent of the irradiation range, the irradiation determination unit 14 first calculates the vertical angle (hereinafter referred to as the "depth distance vertical angle") formed between a virtual line segment from the installation position of the headlight 2 to the estimated visual recognition position located at the depth distance, and a virtual straight line drawn from the installation position of the headlight 2 in the traveling direction of the vehicle 100. The irradiation determination unit 14 can calculate the depth distance vertical angle by the same method as the depth distance horizontal angle described next.
Then, the irradiation determination unit 14 sets, for example, the angular range obtained by widening the depth distance vertical angle by a preset angle in the vertical direction as the vertical extent of the irradiation range.
Further, for example, for the horizontal extent of the irradiation range, the irradiation determination unit 14 first calculates the horizontal angle (hereinafter referred to as the "depth distance horizontal angle") formed between a virtual line segment from the installation position of the headlight 2 to the estimated visual recognition position located at the depth distance, and a virtual straight line drawn from the installation position of the headlight 2 in the traveling direction of the vehicle 100. From the depth distance information, the irradiation determination unit 14 knows the depth distance and the estimated visual recognition position, and the installation position of the headlight 2 is known in advance. The irradiation determination unit 14 can therefore calculate the depth distance horizontal angle from the depth distance, the estimated visual recognition position, and the installation position of the headlight 2.
Then, the irradiation determination unit 14 sets, for example, the angular range obtained by widening the depth distance horizontal angle by a preset angle in the horizontal direction as the horizontal extent of the irradiation range.
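The two angle calculations above reduce to elementary trigonometry once a coordinate frame is fixed. The sketch below assumes the lamp is the origin, x points in the traveling direction, y points left, and z points up; these conventions and the widening margin are assumptions for illustration.

```python
import math

def depth_distance_angles(target, lamp):
    """Horizontal and vertical angles (degrees) between the line segment
    from the lamp to the estimated visual recognition position and a
    straight line drawn from the lamp in the traveling direction (+x).
    Frame: x forward, y left, z up (assumed conventions)."""
    dx = target[0] - lamp[0]
    dy = target[1] - lamp[1]
    dz = target[2] - lamp[2]
    horizontal = math.degrees(math.atan2(dy, dx))
    vertical = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return horizontal, vertical

def widen(angle_deg, margin_deg):
    """Angular range obtained by widening an angle by a preset margin."""
    return (angle_deg - margin_deg, angle_deg + margin_deg)
```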
As described above, the depth distance estimation unit 13 may estimate the depth distance as a value having a range.
In this case, for the vertical extent of the irradiation range, the irradiation determination unit 14, for example, first calculates a depth distance vertical angle (referred to as the first depth distance vertical angle) based on the smallest distance in the depth distance range (for example, "80 m" in the example described above using FIG. 2). The irradiation determination unit 14 also calculates a depth distance vertical angle (referred to as the second depth distance vertical angle) based on the largest distance in the depth distance range (for example, "100 m" in that example).
Then, the irradiation determination unit 14 sets, for example, the range from the smaller to the larger of the first depth distance vertical angle and the second depth distance vertical angle as the vertical extent of the irradiation range.
For the horizontal extent of the irradiation range, the irradiation determination unit 14 likewise calculates a first depth distance horizontal angle and a second depth distance horizontal angle, and sets, for example, the range from the smaller to the larger of the two as the horizontal extent of the irradiation range.
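For a ranged depth distance, the two endpoint angles can be sketched as follows. The lamp height of 0.9 m and the target height used in the example call are assumed values, not figures from the embodiment.

```python
import math

def vertical_angle_deg(depth_m, target_height_m, lamp_height_m):
    """Vertical angle from the lamp to a point at the given depth and
    height (positive above the lamp's horizontal plane)."""
    return math.degrees(math.atan2(target_height_m - lamp_height_m, depth_m))

def vertical_extent(depth_range_m, target_height_m, lamp_height_m=0.9):
    """First and second depth distance vertical angles for the endpoints
    of the depth range, returned as (smaller, larger).
    Lamp height default and target height are assumed values."""
    a1 = vertical_angle_deg(depth_range_m[0], target_height_m, lamp_height_m)
    a2 = vertical_angle_deg(depth_range_m[1], target_height_m, lamp_height_m)
    return (min(a1, a2), max(a1, a2))

# Example from the text: depth range 80-100 m, with a sign assumed 5 m high.
extent = vertical_extent((80.0, 100.0), target_height_m=5.0)
```

Note that the nearer endpoint (80 m) yields the larger vertical angle, so the smaller/larger ordering does not simply follow the distance ordering.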
Alternatively, for example, the irradiation determination unit 14 may determine the vertical extent of the irradiation range based on the center of the depth distance range. For example, when the depth distance is estimated to be "80 m to 100 m", the irradiation determination unit 14 calculates the depth distance vertical angle from the central depth distance of "90 m". The irradiation determination unit 14 may then set, as the vertical extent of the irradiation range, the angular range centered on the calculated depth distance vertical angle and widened from it by a preset angle in the vertical direction.
Similarly, for example, the irradiation determination unit 14 may determine the horizontal extent of the irradiation range based on the center of the depth distance range.
Note that when the depth distance estimation unit 13 has set the depth distance to its initial value, the irradiation determination unit 14 determines the irradiation range based on that initial value.
Specifically, when depth distance information regarding a depth distance set to the initial value is output from the depth distance estimation unit 13, the irradiation determination unit 14 determines the vertical extent and horizontal extent of the irradiation range corresponding to the initial value of the depth distance.
The irradiation determination unit 14 outputs information regarding the determined irradiation range (hereinafter referred to as "irradiation information") to the headlight control unit 15.
The irradiation information includes information indicating the vertical angular range and the horizontal angular range of the irradiation range, with the installation position of the headlight 2 as the reference position.
The headlight control unit 15 causes the headlight 2 to irradiate light onto the irradiation range determined by the irradiation determination unit 14.
For example, assume that the irradiation determination unit 14 outputs to the headlight control unit 15 irradiation information indicating that the range from θ1 degrees to θ2 degrees is the vertical extent and the range from φ1 degrees to φ2 degrees is the horizontal extent.
In this case, the headlight control unit 15 controls, for example, the light sources of the low-beam unit, high-beam unit, or auxiliary-light unit of the headlight 2 individually so that light is irradiated in the range from φ1 degrees to φ2 degrees in the horizontal direction and from θ1 degrees to θ2 degrees in the vertical direction. The headlight control unit 15 may perform this control by, for example, individually changing the optical axes of the plural light sources of the low-beam unit, high-beam unit, or auxiliary-light unit, or by selecting which of the plural light sources of the low-beam unit, high-beam unit, or auxiliary-light unit to turn on.
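The source-selection variant can be sketched as follows. The segment layout (a grid of fixed aiming directions, one per degree horizontally and three vertical rows) is a hypothetical simplification; real matrix headlamps differ.

```python
# Hypothetical sketch: turn on only the headlight light sources whose
# fixed aiming directions fall inside the requested irradiation range.
# The grid layout of aiming directions is an assumed simplification.

SEGMENT_AIMS_DEG = [(h, v) for h in range(-20, 21) for v in (-2, 0, 2)]

def select_segments(phi1, phi2, theta1, theta2):
    """Indices of segments aimed within [phi1, phi2] degrees horizontally
    and [theta1, theta2] degrees vertically."""
    return [i for i, (h, v) in enumerate(SEGMENT_AIMS_DEG)
            if phi1 <= h <= phi2 and theta1 <= v <= theta2]
```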
As a result, the light emitted by the headlight 2 is irradiated onto an irradiation range like the one described with reference to FIG. 4.
FIG. 4 is a diagram for explaining an example of the headlight control unit 15 causing the headlight 2 to irradiate light onto the irradiation range determined by the irradiation determination unit 14 in Embodiment 1.
In FIG. 4, the driver is indicated by "D", and the range of light irradiation by the headlight 2 is indicated by "LA". FIG. 4 is a side view of the road on which the vehicle 100 is traveling. In FIG. 4, as an example, the driver's face orientation is within the "upward" range, the traveling speed of the vehicle 100 is 90 km/h, and the steering angle of the vehicle 100 is 3 degrees to the right, and the headlight control unit 15 has caused the headlight 2 to irradiate light onto the irradiation range determined by the irradiation determination unit 14.
In the example shown in FIG. 4, the depth distance estimation unit 13 has determined, based on the depth distance estimation information with the contents shown in FIG. 2, that the driver's orientation and the travel-related information (here, vehicle information) fall under condition No. 6 of that information, and has estimated the depth distance to be "80 to 100 m"; the irradiation determination unit 14 has determined the range from φ1 degrees to φ2 degrees in the horizontal direction and from θ1 degrees to θ2 degrees in the vertical direction as the irradiation range (the horizontal extent of the irradiation range is not shown in FIG. 4).
For the vertical extent of the irradiation range, the irradiation determination unit 14 has calculated the first depth distance vertical angle based on the depth distance "80 m" and the second depth distance vertical angle based on the depth distance "100 m", thereby determining the vertical extent of the irradiation range.
The headlight control unit 15 causes the headlight 2 to irradiate light, in the direction in which the driver is facing, onto the irradiation range determined by the irradiation determination unit 14 based on the depth distance estimated by the depth distance estimation unit 13 in consideration of the driver's orientation and the travel-related information (here, vehicle information).
As a result, the headlight control unit 15 can control the headlight 2 so that light is irradiated onto the estimated visual recognition target presumed to exist at the estimated visual recognition position, and the driver can visually recognize that target.
In the situation shown in FIG. 4, the estimated visual recognition target is presumed to be a road sign (see FIG. 3). In other words, the headlight control unit 15 can cause light to be irradiated onto the sign that is the driver's estimated visual recognition target. Note that although a sign is illustrated in FIG. 4 for ease of understanding, the sign is an object presumed to exist and does not necessarily actually exist.
Further, when orientation invalidation information is output from the orientation detection unit 11, the headlight control unit 15 causes the headlight 2 to irradiate light in the irradiation direction used when the driver's orientation cannot be obtained. This irradiation direction is, for example, the front direction. Information defining the extent of this front direction is generated in advance by an administrator or the like and stored in a location that the headlight control unit 15 can reference.
The storage unit 16 stores various kinds of information.
Note that although the storage unit 16 is included in the headlight control device 1 in FIG. 1, this is merely an example. The storage unit 16 may be provided outside the headlight control device 1, in a location that the headlight control device 1 can reference.
The operation of the headlight control device 1 according to Embodiment 1 will now be described.
FIG. 5 is a flowchart for explaining the operation of the headlight control device 1 according to Embodiment 1.
For example, when the headlight 2 or the headlight control device 1 is turned on, the headlight control device 1 determines that lighting control of the headlight 2 based on the driver's orientation is to be performed, and starts the operation shown in the flowchart of FIG. 5. The headlight control device 1 repeats the operation shown in the flowchart of FIG. 5 until, for example, the headlight 2 is turned off, the headlight control device 1 is turned off, or the power of the vehicle 100 is turned off.
For example, a control unit (not shown) of the headlight control device 1 acquires information indicating the state of the headlight 2 from a headlight switch mounted on the vehicle 100 and determines whether the headlight 2 is on. Alternatively, if a driver-orientation tracking function switch is provided, the control unit of the headlight control device 1 determines whether the headlight control device 1 is on. When the control unit determines that the headlight 2 is on, it determines that lighting control of the headlight 2 based on the driver's orientation is to be started, and outputs information instructing the start of lighting control of the headlight 2 to the orientation detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, and the headlight control unit 15.
When the control unit determines that the headlight 2 is off, that the headlight control device 1 is off, or that the vehicle 100 has been turned off, it determines that lighting control of the headlight 2 based on the driver's orientation is to be ended, and outputs information instructing the end of lighting control of the headlight 2 to the orientation detection unit 11, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, and the headlight control unit 15.
The direction detection unit 11 acquires an in-vehicle captured image from the in-vehicle imaging device 3, and detects the direction of the driver based on the acquired in-vehicle captured image (step ST1-1).
The direction detection unit 11 outputs the direction information to the depth distance estimation unit 13 and also stores it in the storage unit 16.
In step ST1-1, if the direction detection unit 11 does not detect the direction of the driver, then, until the direction detection determination time elapses after the direction of the driver is no longer detected, the direction detection unit 11 refers to the storage unit 16 and outputs, to the depth distance estimation unit 13, the direction information indicating the direction of the driver detected most recently.
If the direction detection determination time has elapsed since the direction of the driver was no longer detected, the direction detection unit 11 outputs direction-invalid information to the headlight control unit 15. When the direction detection unit 11 outputs the direction-invalid information to the headlight control unit 15, the operation of the headlight control device 1 skips the processing of steps ST2 and ST3, described later, and proceeds to the processing of step ST4.
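For illustration only, the hold-and-timeout behavior of step ST1-1 described above can be sketched as follows. The class name, the timeout value, and the representation of a direction as a (yaw, pitch) pair are assumptions made for this sketch and are not specified by the embodiment.

```python
import time

class DirectionHold:
    """Keeps the last detected driver direction for a grace period.

    If detection drops out, the most recent direction is reused until
    timeout_s (the "direction detection determination time") elapses;
    after that, a direction-invalid result is reported.
    """

    def __init__(self, timeout_s=2.0, now=time.monotonic):
        self.timeout_s = timeout_s
        self.now = now            # injectable clock, for testing
        self.last_direction = None  # e.g. (yaw_deg, pitch_deg)
        self.last_seen = None

    def update(self, detected_direction):
        """Returns (direction, valid). detected_direction is None when
        the driver's direction could not be detected in this frame."""
        t = self.now()
        if detected_direction is not None:
            self.last_direction = detected_direction
            self.last_seen = t
            return detected_direction, True
        # Detection dropped: reuse the stored direction within the grace period.
        if self.last_seen is not None and (t - self.last_seen) <= self.timeout_s:
            return self.last_direction, True
        return None, False  # direction-invalid: skip ST2/ST3, go to ST4 fallback
```

When `valid` is False, the caller would skip steps ST2 and ST3 and fall back to the default irradiation direction of step ST4, as described above.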
The driving-related information acquisition unit 12 acquires driving-related information from the driving-related information acquisition device 4.
Specifically, the vehicle information acquisition unit 121 acquires vehicle information from the driving-related information acquisition device 4 (step ST1-2).
The vehicle information acquisition unit 121 outputs the acquired vehicle information to the depth distance estimation unit 13.
The depth distance estimation unit 13 estimates the depth distance using the direction information regarding the direction of the driver detected by the direction detection unit 11 in step ST1-1, the driving-related information acquired by the driving-related information acquisition unit 12 in step ST1-2, and the depth distance estimation information (step ST2).
The depth distance estimation unit 13 outputs depth distance information to the irradiation determination unit 14.
The irradiation determination unit 14 determines the irradiation range of light by the headlights 2 based on the depth distance estimated by the depth distance estimation unit 13 in step ST2 (step ST3).
The irradiation determination unit 14 outputs the irradiation information to the headlight control unit 15.
The headlight control unit 15 causes the headlights 2 to irradiate light onto the irradiation range determined by the irradiation determination unit 14 in step ST3 (step ST4).
If the direction detection unit 11 has output the direction-invalid information to the headlight control unit 15 in step ST1-1, then in this step ST4 the headlight control unit 15 causes the headlights 2 to irradiate light in the irradiation direction used when the direction of the driver cannot be obtained.
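Putting the steps together, one pass of the flowchart of FIG. 5 can be sketched as below. The callables are injected stand-ins for the direction detection unit 11, the depth distance estimation unit 13, the irradiation determination unit 14, and the headlight control unit 15; their names and signatures are assumptions for illustration only.

```python
def run_lighting_control(detect, estimate, decide, actuate, default_range):
    """One pass of the FIG. 5 flow (steps ST1-1/ST1-2 through ST4),
    written against injected callables so the sketch stays self-contained."""
    direction = detect()           # ST1-1: driver direction, or None if invalid
    if direction is None:
        actuate(default_range)     # ST4: fallback irradiation direction
        return default_range
    depth = estimate(direction)    # ST2: depth distance from direction + vehicle info
    rng = decide(depth)            # ST3: irradiation range from depth distance
    actuate(rng)                   # ST4: irradiate the decided range
    return rng
```

With a real device, `detect` would wrap the in-vehicle image processing, `estimate` the depth distance estimation information lookup, and `actuate` the headlight actuation.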
Note that, for example, in step ST2, when the depth distance cannot be estimated, the depth distance estimation unit 13 sets the depth distance to its initial value.
For example, assume that the depth distance estimation information has the contents shown in FIG. 2, that the driver's face direction is within the "front" range in the vertical direction and is "right" in the horizontal direction, that the traveling speed of the vehicle 100 is 30 km/h, and that the steering angle is 5 degrees. In this case, the driver's face direction and the vehicle speed match entry No. 2 of the depth distance estimation information, but the steering angle does not. Therefore, the depth distance estimation unit 13 sets the depth distance to the initial value.
In this case, in step ST3 the irradiation determination unit 14 determines the irradiation range based on the depth distance set to the initial value, and in step ST4 the headlight control unit 15 causes the headlights 2 to irradiate light onto the irradiation range determined based on that depth distance.
Thereafter, once the depth distance can be estimated using the depth distance estimation information, the depth distance estimation unit 13 estimates the depth distance using that information. For example, in the above example, assume that the steering angle becomes 20 degrees while the driver's face direction and the vehicle speed remain the same; this corresponds to a case where the driver turns the steering wheel to the right in order to make a right turn. In this case, the driver's face direction, the vehicle speed, and the steering angle all match entry No. 2 of the depth distance estimation information, so the depth distance estimation unit 13 estimates the depth distance to be "15 to 20 m".
Then, the irradiation determination unit 14 determines the irradiation range based on the depth distance that the depth distance estimation unit 13 estimated using the depth distance estimation information, and the headlight control unit 15 causes the headlights 2 to irradiate light onto the irradiation range determined by the irradiation determination unit 14.
In other words, the headlight control device 1 switches from lighting control of the headlights 2 with the depth distance set to the initial value to lighting control of the headlights 2 according to the depth distance estimated by the depth distance estimation unit 13, that is, lighting control in which light is irradiated onto the estimated visual target presumed to exist at the driver's estimated visual recognition position.
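A minimal sketch of this table-lookup estimation with the initial-value fallback is shown below. Row No. 2 mirrors the worked example in the text; every other value (the No. 1 row, the speed and steering bands, and the initial depth) is an illustrative placeholder, not taken from FIG. 2.

```python
# Each rule maps (face direction, speed range, steering range) to a depth
# distance band in meters.
DEPTH_RULES = [
    {"no": 1, "vert": "front", "horiz": "front",
     "speed": (20, 40), "steer": (-10, 10), "depth": (20, 30)},
    {"no": 2, "vert": "front", "horiz": "right",
     "speed": (20, 40), "steer": (10, 30), "depth": (15, 20)},
]

INITIAL_DEPTH = (40, 50)  # hypothetical initial value used when no rule matches


def estimate_depth(vert, horiz, speed_kmh, steer_deg):
    """Returns (depth_range_m, matched): the table row's depth band when all
    conditions match, otherwise the initial value, as described in the text."""
    for rule in DEPTH_RULES:
        if (rule["vert"] == vert and rule["horiz"] == horiz
                and rule["speed"][0] <= speed_kmh <= rule["speed"][1]
                and rule["steer"][0] <= steer_deg <= rule["steer"][1]):
            return rule["depth"], True
    return INITIAL_DEPTH, False
```

With face direction "front"/"right", 30 km/h, and a 5-degree steering angle, no row matches and the initial value is used; once the steering angle reaches 20 degrees, row No. 2 matches and the 15 to 20 m band is returned.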
The direction of the driver alone does not reveal the distance between the installation position of the headlights 2 and the position the driver is estimated to actually be looking at (the estimated visual recognition position), that is, the depth distance. If how far ahead the headlights 2 irradiate light is fixedly determined, it is difficult for the headlights 2 to irradiate light onto the location the driver is estimated to actually be looking at, or onto an object existing at that location.
Note that the position of the driver's head differs from the installation position of the headlights 2. Therefore, if the headlights 2 were made to irradiate light so as to cover the entire range in the direction the driver is facing, the irradiation range would be too wide, giving glare to pedestrians and the drivers of other vehicles.
The conventional technology described above does not take these points into consideration, and a situation may occur in which the light of the headlights 2 cannot be irradiated onto the driver's estimated visual recognition position.
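The parallax between the driver's head and the headlight installation position can be corrected with simple plane geometry once a depth distance is available. The following sketch assumes hypothetical top-down coordinates and mounting offsets; none of the numeric values come from the embodiment.

```python
import math

def headlight_aim(gaze_yaw_deg, depth_m, head_pos=(0.0, 0.0), lamp_pos=(0.8, 2.0)):
    """Converts the driver's horizontal gaze direction plus an estimated
    depth distance into a horizontal aim angle for one headlight.

    Coordinates are top-down: x to the right, y forward, in meters,
    with the driver's head at the origin. head_pos and lamp_pos are
    illustrative mounting offsets only.
    """
    yaw = math.radians(gaze_yaw_deg)
    # Point the driver is presumed to be looking at (estimated visual position).
    target_x = head_pos[0] + depth_m * math.sin(yaw)
    target_y = head_pos[1] + depth_m * math.cos(yaw)
    # Aim angle of the lamp toward that same point.
    return math.degrees(math.atan2(target_x - lamp_pos[0],
                                   target_y - lamp_pos[1]))
```

Aiming at the single estimated visual recognition position keeps the beam narrow; without a depth distance, the lamp would have to cover the whole direction the driver is facing, which is the over-wide, glare-prone case described above.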
For example, a situation may occur in which the driver's estimated visual recognition position is closer than the fixedly determined irradiation range of the headlights 2, so that the light of the headlights 2 cannot reach the driver's estimated visual recognition position.
To give a specific example, suppose that, in an underground parking lot, the driver is checking whether a pedestrian might dart out from behind a parked vehicle just ahead. In this case, the driver is actually looking a few meters ahead, where the parked vehicle is. If, however, the several tens of meters ahead that a driver is assumed to look at during normal driving, such as when driving where there are no parked vehicles nearby, is fixedly determined as the distance at which the headlights 2 irradiate light, the light of the headlights 2 may not reach the upper body of a pedestrian behind the parked vehicle, because a point only a few meters ahead may not be within the irradiation range of the headlights 2. Thus, if light is simply irradiated at a fixedly determined distance from the headlights 2 without considering the depth distance, it may not be possible to illuminate a pedestrian who darts out, and the driver would be slow to notice the pedestrian.
Conversely, a situation may also occur in which the driver's estimated visual recognition position is farther than the fixedly determined irradiation range of the headlights 2, so that the light of the headlights 2 cannot reach the driver's estimated visual recognition position.
For a specific example, suppose that the driver checks a road sign ahead while driving on an expressway. In this case, the sign the driver is trying to check is farther away than the fixedly determined irradiation range of the headlights 2 and may not be within that range. If light is irradiated only at a fixedly determined distance from the headlights 2 without considering the depth distance, the sign may not be illuminated, and the driver would be slow to confirm the sign.
In contrast, as described above, the headlight control device 1 according to the first embodiment detects the direction of the driver based on the in-vehicle captured image and acquires driving-related information (here, vehicle information). The headlight control device 1 estimates the depth distance based on the direction information regarding the detected direction of the driver and the acquired driving-related information, and determines the irradiation range of light by the headlights 2 based on the estimated depth distance. Then, the headlight control device 1 causes the headlights 2 to irradiate light onto the determined irradiation range.
Therefore, the headlight control device 1 can appropriately illuminate the driver's estimated visual recognition position, and can provide driving support when the vehicle 100 travels at night or in other dark conditions.
In the first embodiment described above, in the headlight control device 1, the depth distance estimation unit 13 estimates the depth distance using the direction information regarding the direction of the driver detected by the direction detection unit 11, the driving-related information acquired by the driving-related information acquisition unit 12 (more specifically, the vehicle information acquired by the vehicle information acquisition unit 121), and the depth distance estimation information. The present invention is not limited to this; for example, the depth distance estimation unit 13 may estimate the width of the irradiation range (hereinafter referred to as the "ideal width of the irradiation range") in addition to the depth distance. For example, an administrator or the like sets, in the depth distance estimation information, the depth distance together with the upper limit of the irradiation range in the vertical direction and the upper limit in the horizontal direction as the ideal width of the irradiation range. The depth distance estimation unit 13 then estimates the depth distance and the ideal width of the irradiation range based on the direction information, the vehicle information, and the depth distance estimation information.
In this case, the depth distance estimation unit 13 outputs the depth distance information and information regarding the ideal width of the irradiation range to the irradiation determination unit 14.
The irradiation determination unit 14 determines the irradiation range based on the depth distance estimated by the depth distance estimation unit 13 and the ideal width of the irradiation range. Specifically, for example, if the irradiation range calculated based on the depth distance estimated by the depth distance estimation unit 13 exceeds the ideal width of the irradiation range, the irradiation determination unit 14 determines the range up to the ideal width as the irradiation range.
For example, if the depth distance estimated by the depth distance estimation unit 13 is a value with a wide range, the irradiation range that the irradiation determination unit 14 determines based on that depth distance may be too wide, and the light from the headlights 2 may give glare to oncoming vehicles and others.
Since the depth distance estimation unit 13 estimates the ideal width of the irradiation range and the irradiation determination unit 14 determines the irradiation range with the ideal width as the upper limit, the headlight control device 1 can reduce the glare given to oncoming vehicles and others.
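The clamping of the computed irradiation range to the ideal width can be sketched as follows; representing a range as a pair of vertical and horizontal half-widths in degrees is an assumption of this sketch, not part of the embodiment.

```python
def clamp_to_ideal_width(calc_range, ideal_width):
    """Limits a computed irradiation range to the ideal width.

    Both arguments are (vertical_deg, horizontal_deg) half-widths around
    the aim direction; the ideal width acts as a per-axis upper limit.
    """
    v, h = calc_range
    v_max, h_max = ideal_width
    return (min(v, v_max), min(h, h_max))
```

A range computed from a wide depth band is reduced to the ideal width, while a range already within the limits passes through unchanged.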
Further, in the first embodiment described above, the depth distance estimation unit 13 may estimate, together with the depth distance, the amount of light to be irradiated by the headlights 2 (hereinafter referred to as the "ideal light amount"). For example, an administrator or the like sets, in the depth distance estimation information, the depth distance together with the ideal irradiation light amount corresponding to that depth distance as the ideal light amount; for example, the larger the depth distance, the larger the value set for the ideal light amount. The depth distance estimation unit 13 estimates the depth distance and the ideal light amount based on the direction information, the vehicle information, and the depth distance estimation information.
In this case, the depth distance estimation unit 13 outputs the depth distance information to the irradiation determination unit 14 and also outputs information regarding the estimated ideal light amount to the headlight control unit 15.
The headlight control unit 15 causes the headlights 2 to irradiate light at the ideal light amount estimated by the depth distance estimation unit 13 within the irradiation range determined by the irradiation determination unit 14.
Since the depth distance estimation unit 13 estimates the ideal light amount and the headlight control unit 15 causes the headlights 2 to irradiate light at that light amount, the headlight control device 1 can irradiate light, in the direction the driver is facing, at the light amount assumed to be necessary for the driver to visually recognize the estimated visual target, according to the distance from the vehicle 100 at which the light is irradiated.
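One possible shape for the ideal light amount, consistent with "the larger the depth distance, the larger the value", is a clamped linear mapping. The linear form and all numeric values below are illustrative assumptions only, not values from the embodiment.

```python
def ideal_light_amount(depth_m, lumens_min=300.0, lumens_max=1500.0,
                       depth_min=5.0, depth_max=100.0):
    """Maps a depth distance to an irradiation light amount: the farther
    the estimated visual position, the larger the light amount, clamped
    to [lumens_min, lumens_max] over [depth_min, depth_max]."""
    d = max(depth_min, min(depth_m, depth_max))
    frac = (d - depth_min) / (depth_max - depth_min)
    return lumens_min + frac * (lumens_max - lumens_min)
```

In the embodiment itself, the administrator sets the ideal light amount per table row rather than computing it from a formula; the mapping above only illustrates the monotone relationship.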
In the first embodiment described above, the depth distance estimation unit 13 may estimate the depth distance together with both the ideal width of the irradiation range and the ideal light amount.
Further, in the first embodiment described above, the headlight control device 1 is an in-vehicle device mounted on the vehicle 100, and the direction detection unit 11, the driving-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) are provided in the in-vehicle device. The present invention is not limited to this: some of the direction detection unit 11, the driving-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, and the headlight control unit 15 may be provided in the in-vehicle device of the vehicle 100, and the others may be provided in a server connected to the in-vehicle device via a network. Alternatively, all of the direction detection unit 11, the driving-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) may be provided in the server.
FIGS. 6A and 6B are diagrams showing an example of the hardware configuration of the headlight control device 1 according to the first embodiment.
In the first embodiment, the functions of the direction detection unit 11, the driving-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) are realized by a processing circuit 1001. That is, the headlight control device 1 includes the processing circuit 1001 for estimating the depth distance based on the driving-related information and the direction information regarding the direction of the driver detected based on the in-vehicle captured image acquired from the in-vehicle imaging device 3, and for controlling the lighting of the headlights 2 based on the estimated depth distance.
The processing circuit 1001 may be dedicated hardware as shown in FIG. 6A, or may be a processor 1004 that executes a program stored in a memory 1005 as shown in FIG. 6B.
When the processing circuit 1001 is dedicated hardware, the processing circuit 1001 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
When the processing circuit is the processor 1004, the functions of the direction detection unit 11, the driving-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) are realized by software, firmware, or a combination of software and firmware. The software or firmware is written as a program and stored in the memory 1005. The processor 1004 reads out and executes the program stored in the memory 1005, thereby executing the functions of the direction detection unit 11, the driving-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown). That is, the headlight control device 1 includes the memory 1005 for storing a program that, when executed by the processor 1004, results in the execution of steps ST1-1 and ST1-2 to ST4 of FIG. 5 described above.
It can also be said that the program stored in the memory 1005 causes a computer to execute the processing procedures or methods of the direction detection unit 11, the driving-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown). Here, the memory 1005 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM, a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read-Only Memory), or to a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD (Digital Versatile Disc), or the like.
Note that some of the functions of the direction detection unit 11, the driving-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) may be realized by dedicated hardware, and some by software or firmware. For example, the functions of the direction detection unit 11 and the driving-related information acquisition unit 12 can be realized by the processing circuit 1001 as dedicated hardware, while the functions of the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) can be realized by the processor 1004 reading out and executing a program stored in the memory 1005.
The storage unit 16 is configured by, for example, the memory 1005.
The headlight control device 1 also includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the headlights 2, the in-vehicle imaging device 3, and the driving-related information acquisition device 4.
As described above, the headlight control device 1 according to the first embodiment is configured to include: the direction detection unit 11 that detects the direction of the driver based on a captured image of the driver of the vehicle 100 (the in-vehicle captured image); the driving-related information acquisition unit 12 that acquires driving-related information related to the travel of the vehicle 100; the depth distance estimation unit 13 that estimates, based on the direction information regarding the direction of the driver detected by the direction detection unit 11 and the driving-related information acquired by the driving-related information acquisition unit 12, the depth distance, which is the distance from the installation position of the headlights 2 provided in the vehicle 100 to the estimated visual recognition position presumed to be the driver's visual recognition position in the direction the driver is facing; the irradiation determination unit 14 that determines the irradiation range of light by the headlights 2 based on the depth distance estimated by the depth distance estimation unit 13; and the headlight control unit 15 that causes the headlights 2 to irradiate light onto the irradiation range determined by the irradiation determination unit 14.
Therefore, in controlling the lighting of the headlights 2 of the vehicle 100 based on the direction the driver is facing, the headlight control device 1 can perform lighting control that takes into account how far ahead in that direction the driver is actually trying to look.
The headlight control device 1 can appropriately illuminate the driver's estimated visual recognition position, and can provide driving support when the vehicle 100 travels at night or in other dark conditions.
Specifically, in the headlight control device 1, the driving-related information acquisition unit 12 includes the vehicle information acquisition unit 121 that acquires vehicle information regarding the vehicle 100 as the driving-related information, and the depth distance estimation unit 13 estimates the depth distance based on the direction information and the vehicle information.
Therefore, in controlling the lighting of the headlights 2 of the vehicle 100 based on the direction the driver is facing, the headlight control device 1 can perform lighting control that takes into account how far ahead in that direction the driver is actually trying to look.
The headlight control device 1 can appropriately illuminate the driver's estimated visual recognition position, and can provide driving support when the vehicle 100 travels at night or in other dark conditions.
Embodiment 2.
In the first embodiment, the headlight control device estimates the depth distance based on direction information and vehicle information.
In the second embodiment, an embodiment in which the depth distance is estimated by further using map information will be described.
FIG. 7 is a diagram showing a configuration example of a headlight control device 1a according to the second embodiment.
In the second embodiment, it is assumed that the headlight control device 1a is mounted on the vehicle 100.
The headlight control device 1a controls the lighting of the headlights 2 provided in the vehicle 100 based on the direction of the driver of the vehicle 100. In the second embodiment, the "direction of the driver" is expressed by the driver's face direction or the driver's line-of-sight direction. In the second embodiment, the "direction of the driver" may also include, in addition to the driver's face direction or line-of-sight direction, the direction of the driver's body, in other words, the driver's posture.
In the second embodiment, the lighting control of the headlights 2 based on the direction of the driver performed by the headlight control device 1a is assumed to be performed when the headlights 2 are turned on in a place where the surroundings of the vehicle 100 are dark, such as a parking lot or an urban area at night.
Also in the second embodiment below, when the term "driver's face direction" is simply used, the "driver's face direction" means "the driver's face direction or line-of-sight direction", which also includes the driver's line-of-sight direction.
In FIG. 7, the same components as those of the headlight control device 1 described with reference to FIG. 1 in Embodiment 1 are given the same reference numerals, and redundant explanation will be omitted.
The headlight control device 1a according to Embodiment 2 differs from the headlight control device 1 according to Embodiment 1 in that the driving-related information acquisition unit 12a includes a map information acquisition unit 122 in addition to the vehicle information acquisition unit 121.
In addition, the specific operation of the depth distance estimation unit 13a in the headlight control device 1a according to Embodiment 2 differs from the specific operation of the depth distance estimation unit 13 in the headlight control device 1 according to Embodiment 1.
In Embodiment 2, the driving-related information acquisition device 4 includes, in addition to a vehicle speed sensor (not shown) and a steering wheel angle sensor (not shown), a positioning device such as a car navigation device, a GPS (Global Positioning System), or a high-precision locator. A positioning device such as a car navigation device holds map information.
The map information also includes the current position of the vehicle 100, route information regarding the route the vehicle 100 is traveling on, and information regarding the lane in which the vehicle 100 is traveling (a lane here means a so-called traffic lane). For example, a high-precision locator can acquire information on the current position of the vehicle 100 in units of several tens of centimeters, and can therefore acquire information on the lane in which the vehicle 100 is traveling.
The driving-related information acquisition device 4 outputs the vehicle information and the map information to the headlight control device 1a as driving-related information.
The map information acquisition unit 122 acquires the map information from the driving-related information acquisition device 4.
The map information acquisition unit 122 outputs the acquired map information to the depth distance estimation unit 13a as driving-related information.
The depth distance estimation unit 13a estimates the depth distance based on the driver's orientation detected by the orientation detection unit 11 and the driving-related information acquired by the driving-related information acquisition unit 12a (specifically, the vehicle information acquired by the vehicle information acquisition unit 121 and the map information acquired by the map information acquisition unit 122).
Specifically, the depth distance estimation unit 13a estimates the depth distance by comparing the orientation information and the driving-related information (here, the vehicle information and the map information) against the depth distance estimation information.
Here, FIG. 8 is a diagram showing an example of the contents of the depth distance estimation information used by the depth distance estimation unit 13a to estimate the depth distance in Embodiment 2.
As shown in FIG. 8, in Embodiment 2, the depth distance estimation information is, for example, a table in which driver behavior, vehicle information, map information, and depth distance are associated with one another.
In the depth distance estimation information, the driver's behavior includes, for example, the vertical direction of the driver's face and the horizontal direction of the driver's face. The vertical direction of the driver's face is expressed as "front", "up", or "down". The horizontal direction of the driver's face is expressed as "front", "right", or "left".
In the depth distance estimation information, the vehicle information includes, for example, the vehicle speed.
In the depth distance estimation information, the map information includes, for example, information on the traveling position of the vehicle 100, route information, and information on the lane position.
The depth distance estimation unit 13a determines the driver's behavior based on the driver's orientation detected by the orientation detection unit 11, and estimates the depth distance by matching the determined behavior, the information indicating the vehicle speed and steering wheel angle included in the vehicle information acquired by the vehicle information acquisition unit 121, and the information indicating the traveling position of the vehicle 100, the route information, and the information indicating the lane position included in the map information acquired by the map information acquisition unit 122 against the driver behavior, vehicle information, and map information set in the depth distance estimation information as shown in FIG. 8, thereby obtaining the depth distance information.
For example, the depth distance estimation unit 13a determines whether the vertical direction of the driver's face is "front", "up", or "down" from the information indicating the driver's vertical face direction included in the orientation information.
Also, for example, the depth distance estimation unit 13a determines whether the horizontal direction of the driver's face is "front", "right", or "left" from the information indicating the driver's horizontal face direction included in the orientation information.
The method by which the depth distance estimation unit 13a determines whether the vertical and horizontal directions of the driver's face are "front", "up", "down", "right", or "left" is the same as the method by which the depth distance estimation unit 13 makes these determinations, which has already been described in Embodiment 1, and redundant explanation will therefore be omitted.
For example, suppose that the driver's face is now facing "front" in the vertical direction and "right" in the horizontal direction. Also suppose that the traveling speed of the vehicle 100 is 30 km/h, that the traveling position of the vehicle 100 is 15 m before an intersection, and that the route of the vehicle 100 is "turn right at the next intersection".
It is assumed that the depth distance estimation information has the contents shown in FIG. 8.
In this case, using the orientation information, the driving-related information (here, the vehicle information and the map information), and the depth distance estimation information, the depth distance estimation unit 13a estimates that the depth distance is "a range with a margin of 5 m that includes the sidewalk and the crosswalk at the intersection" (see No. 2 of the depth distance estimation information in FIG. 8).
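The table lookup described above can be sketched as follows. This is a minimal illustration modeled on the No. 2 entry of a FIG. 8-style table; the field names, rule values, and fallback behavior are hypothetical simplifications, not definitions taken from the patent.

```python
# Sketch of a FIG. 8-style lookup: each rule pairs matching conditions
# (driver behavior, vehicle info, map info) with a depth distance.
# Field names and values are illustrative assumptions only.

RULES = [
    {   # corresponds roughly to No. 2 in FIG. 8
        "face_vertical": ("front", "up"),       # "front to up"
        "face_horizontal": ("right", "left"),   # "right or left"
        "speed_kmh": (15, 35),                  # inclusive speed range
        "near_intersection_m": 20,              # within 20 m of an intersection
        "route": "turn_at_next_intersection",
        "depth": "sidewalk_and_crosswalk_with_5m_margin",
    },
]

DEFAULT_DEPTH = "initial_value"  # used when no rule matches

def estimate_depth(face_v, face_h, speed, dist_to_intersection, route):
    """Match the inputs against the rule table and return a depth distance."""
    for rule in RULES:
        if (face_v in rule["face_vertical"]
                and face_h in rule["face_horizontal"]
                and rule["speed_kmh"][0] <= speed <= rule["speed_kmh"][1]
                and dist_to_intersection <= rule["near_intersection_m"]
                and route == rule["route"]):
            return rule["depth"]
    return DEFAULT_DEPTH  # depth distance could not be estimated from the table
```

For the worked example (face direction front/right, 30 km/h, 15 m before the intersection, turning right at the next intersection), this lookup matches the No. 2 rule; inputs that match no rule fall back to an initial value, mirroring the fallback behavior the section describes.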
For example, when the driver's face direction is "front to up" in the vertical direction and "right or left" in the horizontal direction, the vehicle speed is "15 to 35 km/h", and the vehicle 100 is "within 20 m of an intersection" and is about to "turn right or left at the next intersection", the vehicle 100 is about to turn right or left at the intersection, and the driver's estimated visible object is estimated to be a pedestrian. Therefore, in the depth distance estimation information, the depth distance is set to "a range with a margin of 5 m that includes the sidewalk and the crosswalk at the intersection", which is the range in which such a pedestrian is assumed to be when the driver tries to check for pedestrians while the vehicle 100 is traveling at "15 to 35 km/h" "within 20 m of an intersection" in order to turn right or left (No. 2 of the depth distance estimation information in FIG. 8).
Similarly, for the conditions No. 1 and No. 3 to No. 6 of the depth distance estimation information in FIG. 8, the depth distance is set based on the traveling state of the vehicle 100 and the driver's estimated visible object, which are estimated from the driver's behavior, the vehicle information, and the map information.
FIG. 9 is a diagram showing, for the depth distances corresponding to the input information under the conditions No. 1 to No. 6 set in the depth distance estimation information shown in FIG. 8, an example of the traveling state of the vehicle 100 and the driver's estimated visible object estimated from the driver's behavior, the vehicle information, and the map information, on which each depth distance is based.
Note that in the depth distance estimation information shown in FIG. 8, six patterns of conditions, No. 1 to No. 6, are set, but this is merely an example. The contents of the depth distance estimation information shown in FIG. 8 are also merely an example.
In the depth distance estimation information shown in FIG. 8, the traveling position, the route information, and the lane position are set as the map information, but this is merely an example; in the depth distance estimation information, for example, information indicating the distance to a certain position (for example, within 20 m of an intersection, within 50 m of an expressway interchange, etc.) may be set as the map information.
The depth distance estimation information only needs to be information from which depth distance information can be obtained from the orientation information and the driving-related information (here, the vehicle information and the map information); in the depth distance estimation information, at least the driver's face direction, the vehicle speed, and the traveling position information need to be associated with the depth distance.
The depth distance estimation unit 13a outputs the depth distance information to the irradiation determination unit 14.
Note that when the orientation information regarding the driver's orientation detected by the orientation detection unit 11 and the driving-related information acquired by the driving-related information acquisition unit 12a (here, the vehicle information acquired by the vehicle information acquisition unit 121 and the map information acquired by the map information acquisition unit 122) do not match any of the input information set in the depth distance estimation information, the depth distance estimation unit 13a determines, for example, that the depth distance could not be estimated from the depth distance estimation information, and sets the depth distance to its initial value.
The depth distance estimation unit 13a outputs the depth distance information regarding the depth distance set to the initial value to the irradiation determination unit 14.
The irradiation determination unit 14 determines the range irradiated with light by the headlights 2 based on the depth distance estimated by the depth distance estimation unit 13a, and outputs the irradiation information to the headlight control unit 15. The headlight control unit 15 causes the headlights 2 to irradiate the irradiation range determined by the irradiation determination unit 14 with light.
Here, FIG. 10A is a diagram for explaining an example of the depth distance estimated by the depth distance estimation unit 13a in Embodiment 2.
FIG. 10B is a diagram for explaining an example of how, in Embodiment 2, the headlight control unit 15 causes the headlights 2 to irradiate light onto the irradiation range determined by the irradiation determination unit 14 based on the depth distance shown in FIG. 10A estimated by the depth distance estimation unit 13a.
FIG. 10A is an overhead view of the road on which the vehicle 100 is traveling, seen from above.
FIG. 10B is a view of the road on which the vehicle 100 is traveling, seen from the side. In FIG. 10B, the driver is indicated by "D", and the range irradiated with light by the headlights 2 is indicated by "LA".
Note that in FIG. 10A, the depth distance is the distance from the right light to the driver's estimated visible position. Also, although FIG. 10B shows the vehicle 100 as viewed from the left for convenience, the irradiation range shown in FIG. 10B is the irradiation range of the right light.
In FIGS. 10A and 10B, as an example, the driver's face is facing "front" in the vertical direction and "right" in the horizontal direction, and the vehicle 100 is traveling at 30 km/h, 15 m before an intersection, intending to turn right at the next intersection; in this situation, the depth distance estimation unit 13a estimates the depth distance using the depth distance estimation information, and the headlight control unit 15 causes the headlights 2 to irradiate the irradiation range determined by the irradiation determination unit 14 with light. It is assumed that the depth distance estimation information has the contents shown in FIG. 8.
Based on the depth distance estimation information having the contents shown in FIG. 8, the depth distance estimation unit 13a determines that the driver's behavior determined from the orientation information and the driving-related information (here, the vehicle information and the map information) correspond to No. 2 of the depth distance estimation information, and estimates the depth distance to be "a range with a margin of 5 m that includes the sidewalk and the crosswalk at the intersection". As a result, in the driver's direction, a depth distance range with a margin of 5 m that includes the sidewalk near the intersection and the crosswalk at the intersection is estimated (see FIG. 10A). At this time, the depth distance estimation unit 13a calculates, based on the map information, the specific extent (from how many meters to how many meters) of the depth distance range with the margin of 5 m that includes the sidewalk near the intersection and the crosswalk at the intersection, for example by the method described below with reference to FIGS. 11A and 11B. Note that the method described below with reference to FIGS. 11A and 11B is merely an example, and the depth distance estimation unit 13a may calculate the depth distance range with the margin of 5 m that includes the sidewalk near the intersection and the crosswalk at the intersection by other methods.
FIGS. 11A and 11B are diagrams, provided for convenience, for explaining an example of the method by which the depth distance estimation unit 13a calculates the specific extent of the depth distance range with the margin of 5 m that includes the sidewalk near the intersection and the crosswalk at the intersection; the roads shown in FIGS. 11A and 11B do not match the road shown in FIG. 10A.
FIGS. 11A and 11B are overhead views of the road on which the vehicle 100 is traveling, seen from above. In FIGS. 11A and 11B, illustration of the vehicle 100 is omitted for simplicity of explanation, but the vehicle 100 is traveling on an ordinary road and is about to turn right at the next intersection, and the driver is facing to the right. It is assumed that the driver's orientation at this time is θd degrees.
<Acquisition of various information>
First, the depth distance estimation unit 13a acquires, from the map information, information indicating the number of lanes of the road including the vehicle's own lane and the lane at the destination of the right turn, and the distance to the intersection. Here, for example, it is assumed that the depth distance estimation unit 13a has acquired information that the road including the own lane and the right-turn destination lane has two lanes on each side and that the distance to the intersection is 15 m.
In the example shown in FIGS. 11A and 11B, the driver is facing to the right. In this case, the depth distance estimation unit 13a estimates, from the orientation information and the map information, that the driver is trying to look at the sidewalk on the near side as seen from the vehicle 100. Furthermore, since the roads shown in FIGS. 11A and 11B are ordinary roads, the depth distance estimation unit 13a assumes that the lane width is 3.5 m. Note that the lane width to be assumed for each road type is determined in advance. When a high-precision locator is used as the driving-related information acquisition device 4, the depth distance estimation unit 13a may use lane width information acquired from the high-precision locator.
<Calculation of the distance from the installation position of the headlights 2 to the near side of the sidewalk>
The depth distance estimation unit 13a calculates the distance from the installation position of the headlights 2 to the near side of the sidewalk. Specifically, the depth distance estimation unit 13a calculates the distance, along a virtual straight line extending from the center of the driver's head in the direction the driver is facing, to the near side of the sidewalk (indicated by "D11" in FIG. 11A), and based on this, calculates the distance from the installation position of the headlights 2 to the near side of the sidewalk.
Since the vehicle 100 is about to turn right, the depth distance estimation unit 13a estimates that the vehicle 100 is currently traveling in the right lane of the road with two lanes on each side. Note that when a high-precision locator is used as the driving-related information acquisition device 4, the depth distance estimation unit 13a may estimate the lane in which the vehicle 100 is traveling using information on the actual driving lane.
The depth distance estimation unit 13a calculates the distance, along the virtual straight line extending from the center of the driver's head in the direction the driver is facing, to the near side of the sidewalk, based on the position of the center of the driver's head, the driver's orientation, the assumed lane width, and the number of lanes of the road acquired from the map information. Here, the lane in which the vehicle 100 is traveling is the right lane of the road with two lanes on each side, and the lane width is assumed to be 3.5 m. The driver's orientation is θd degrees. For example, if the position of the center of the driver's head is at the center of the travel lane of the vehicle 100 in the width direction, the depth distance estimation unit 13a calculates (3.5×2.5)/sinθd (m) as the distance, along the virtual straight line extending from the center of the driver's head in the direction the driver is facing, to the near side of the sidewalk. Then, based on this calculated distance, the depth distance estimation unit 13a calculates the distance from the installation position of the headlights 2 to the near side of the sidewalk. Since the positional relationship between the position of the center of the driver's head and the installation position of the headlights 2 is known, the depth distance estimation unit 13a can calculate the distance from the installation position of the headlights 2 to the near side of the sidewalk based on the distance, along the virtual straight line, to the near side of the sidewalk.
<Calculation of the distance from the installation position of the headlights 2 to the far side of the sidewalk>
The depth distance estimation unit 13a calculates the distance from the installation position of the headlights 2 to the far side of the sidewalk by the same method as the calculation of the distance from the installation position of the headlights 2 to the near side of the sidewalk. Specifically, the depth distance estimation unit 13a calculates the distance, along the virtual straight line extending from the center of the driver's head in the direction the driver is facing, to the far side of the sidewalk (indicated by "D12" in FIG. 11B), and based on this, calculates the distance from the installation position of the headlights 2 to the far side of the sidewalk.
Here, the depth distance estimation unit 13a calculates {15-(3.5×2)}/sinθd (m) as the distance, along the virtual straight line indicating the direction the driver is facing, to the far side of the sidewalk, and calculates the distance from the installation position of the headlights 2 to the far side of the sidewalk based on this distance.
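The two sight-line projections above can be written out numerically. The following is a sketch under this section's stated assumptions (lane width 3.5 m, two lanes on each side, the driver's head at the lane's widthwise center, 15 m to the intersection); it reproduces the formulas (3.5×2.5)/sinθd and {15-(3.5×2)}/sinθd, converting θd from degrees to radians for the standard math library.

```python
import math

LANE_WIDTH_M = 3.5            # assumed lane width for an ordinary road
LANES_TO_NEAR_EDGE = 2.5      # half of the own lane plus two oncoming lanes
DIST_TO_INTERSECTION_M = 15.0  # from the map information in the example

def dist_to_near_edge(theta_d_deg):
    """D11: distance along the driver's sight line to the near side of
    the sidewalk, i.e. (3.5 x 2.5) / sin(theta_d)."""
    return (LANE_WIDTH_M * LANES_TO_NEAR_EDGE) / math.sin(math.radians(theta_d_deg))

def dist_to_far_edge(theta_d_deg):
    """D12: distance along the sight line to the far side of the sidewalk,
    i.e. {15 - (3.5 x 2)} / sin(theta_d)."""
    return (DIST_TO_INTERSECTION_M - LANE_WIDTH_M * 2) / math.sin(math.radians(theta_d_deg))
```

At θd = 90 degrees (looking straight sideways) the formulas reduce to the lateral offsets themselves, 8.75 m and 8 m; smaller angles lengthen both distances, since the sight line crosses the road more obliquely.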
<Calculation of the depth distance range>
The depth distance estimation unit 13a calculates the depth distance range based on the calculated distance from the installation position of the headlights 2 to the near side of the sidewalk and the calculated distance from the installation position of the headlights 2 to the far side of the sidewalk.
Here, the depth distance estimation unit 13a calculates, for example, a depth distance range with a margin of 5 m that includes the sidewalk near the intersection and the crosswalk at the intersection. For example, if the distance from the installation position of the headlights 2 to the near side of the sidewalk calculated by the depth distance estimation unit 13a is A m and the distance from the installation position of the headlights 2 to the far side of the sidewalk is B m, the depth distance estimation unit 13a adds a margin of 5 m to these distances and calculates the range from "A m - 5 m" to "B m + 5 m" as the depth distance range with the margin of 5 m that includes the sidewalk near the intersection and the crosswalk at the intersection.
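The final step is plain margin arithmetic: given the near-side distance A and the far-side distance B, both measured from the headlight installation position, the range is [A - 5, B + 5]. A minimal sketch (the clamp at zero is an added safeguard of this sketch, not something stated in the text):

```python
MARGIN_M = 5.0  # margin associated with this condition in the FIG. 8-style table

def depth_range(near_m, far_m, margin_m=MARGIN_M):
    """Return (min, max) of the depth distance range with the margin applied."""
    lo = max(near_m - margin_m, 0.0)  # never start the range behind the light
    return (lo, far_m + margin_m)
```

With A = 15 m and B = 25 m, as in the FIG. 10A example that follows, this yields the range 10 m to 30 m.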
Returning to the explanation using FIG. 10A.
Here, it is assumed that the depth distance estimation unit 13a has estimated the depth distance to be "10 m to 30 m", for example by the method described above with reference to FIGS. 11A and 11B. Specifically, after calculating that the distance from the installation position of the headlights 2 to the near side of the sidewalk is 15 m and the distance from the installation position of the headlights 2 to the far side of the sidewalk is 25 m, a margin of 5 m is added to each, giving a depth distance range of "15 m - 5 m = 10 m" to "25 m + 5 m = 30 m".
Also, for example, it is assumed that, based on the depth distance "10 m to 30 m" estimated by the depth distance estimation unit 13a, the irradiation determination unit 14 has determined the irradiation range to be from φ3 degrees to φ4 degrees in the horizontal direction and from θ3 degrees to θ4 degrees in the vertical direction (see FIG. 10B; the horizontal irradiation range is not shown in FIG. 10B).
Note that here, for the vertical direction of the irradiation range, the irradiation determination unit 14 calculates a first depth distance vertical angle based on the depth distance "10 m" and a second depth distance vertical angle based on the depth distance "30 m", and thereby determines the vertical direction of the irradiation range.
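How the first and second depth distance vertical angles are derived is not spelled out at this point. One plausible reading, shown here only as a hedged sketch, is that each bound of the depth range is converted to a downward angle from the headlight's horizontal axis using the mounting height, so that the beam edges land at the near and far bounds; the mounting height value is an assumption of this sketch.

```python
import math

HEADLIGHT_HEIGHT_M = 0.7  # assumed mounting height of the headlight above the road

def vertical_angle_deg(depth_m, height_m=HEADLIGHT_HEIGHT_M):
    """Downward angle (degrees below horizontal) at which light from the
    headlight reaches the road surface at the given depth distance."""
    return math.degrees(math.atan2(height_m, depth_m))

# First / second depth distance vertical angles for the 10 m to 30 m range:
theta_near = vertical_angle_deg(10.0)  # steeper (larger) downward angle
theta_far = vertical_angle_deg(30.0)   # shallower (smaller) downward angle
```

The near bound always produces the steeper angle, so the vertical span of the irradiation range runs from theta_far up to theta_near.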
The headlight control unit 15 causes the headlights 2 to irradiate light, in the direction the driver is facing, onto the irradiation range determined by the irradiation determination unit 14 based on the depth distance estimated by the depth distance estimation unit 13a in consideration of the driver's orientation and the driving-related information (here, the vehicle information and the map information).
As a result, the headlight control unit 15 can control the headlights 2 so that light is irradiated onto the estimated visible object that is estimated to exist at the estimated visible position. The driver can visually recognize the estimated visible object.
In the situation shown in FIGS. 10A and 10B, the estimated visible object is estimated to be a pedestrian (see FIG. 9). That is, the headlight control unit 15 can cause light to be irradiated onto pedestrian A (indicated by W1 in FIGS. 10A and 10B), the driver's estimated visible object, who is estimated to exist in the direction the driver is facing. Here, the driver's orientation is a direction that includes the sidewalk before the intersection and the road in the direction of travel. Therefore, the pedestrian who is the driver's estimated visible object is estimated to be pedestrian A, who is on the sidewalk before the intersection or is crossing the crosswalk on the road in the direction of travel.
For example, on the route of the vehicle 100, there may be a pedestrian B (indicated by W2 in FIG. 10A) on the sidewalk on the far side of the intersection. However, pedestrian B does not exist in the direction the driver is facing, and is therefore not estimated to be the driver's estimated visible object. That is, light is not irradiated onto pedestrian B.
The headlight control unit 15 controls the headlights 2 so that light is irradiated onto the driver's estimated visible object, which is estimated to exist at the position (the estimated visible position) separated by the estimated depth distance in the direction the driver is facing.
Note that although pedestrian A and pedestrian B are also illustrated in FIGS. 10A and 10B for ease of understanding, pedestrian A and pedestrian B are estimated to exist and do not necessarily actually exist.
Here, FIG. 12A is a diagram for explaining another example of the depth distance estimated by the depth distance estimation unit 13a in the second embodiment.
FIG. 12B is a diagram for explaining an example of a state in which the headlight control unit 15 causes the headlights 2 to irradiate light onto the irradiation range determined by the irradiation determination unit 14 based on the depth distance shown in FIG. 12A estimated by the depth distance estimation unit 13a.
FIG. 12A is an overhead view of the road on which the vehicle 100 is traveling.
FIG. 12B is a side view of the road on which the vehicle 100 is traveling. In FIG. 12B, the driver is indicated by "D", and the range irradiated with light by the headlights 2 is indicated by "LA".
Note that, in FIG. 12A, the depth distance is the distance from the right light to the driver's estimated visible position. Although FIG. 12B shows the vehicle 100 as viewed from the left side with respect to the direction of travel for convenience, the irradiation range shown in FIG. 12B is the irradiation range of the right light.
In FIGS. 12A and 12B, as in FIGS. 10A and 10B, it is assumed as an example that the driver's face is oriented "front" in the up-down direction and "right" in the left-right direction, and that the vehicle 100 is traveling at 30 km/h at a point 15 m before an intersection, intending to turn right at that intersection; in this situation, the depth distance estimation unit 13a estimates the depth distance using the depth distance estimation information, and the headlight control unit 15 causes the headlights 2 to irradiate light onto the irradiation range determined by the irradiation determination unit 14. It is also assumed that the depth distance estimation information has the contents shown in FIG. 8.
The driver's face orientation differs between FIG. 10 and FIG. 12.
In this case, based on the depth distance estimation information having the contents shown in FIG. 8, the depth distance estimation unit 13a determines that the driver's behavior determined from the orientation information, together with the travel-related information (here, the vehicle information and the map information), corresponds to No. 2 of the depth distance estimation information, and estimates the depth distance to be "a range with a 5 m margin that includes the sidewalk and the crosswalk at the intersection". As a result, in the direction in which the driver is facing, a depth distance is estimated that includes the sidewalk near the intersection and the crosswalk at the intersection, with a margin of 5 m (see FIG. 12B).
Here, it is assumed that the depth distance estimation unit 13a estimates the depth distance to be, for example, "20 m to 37 m" based on the map information. Specifically, after calculating that the distance from the installation position of the headlights 2 to the near edge of the sidewalk is 25 m and the distance from the installation position of the headlights 2 to the far edge of the sidewalk is 32 m, a margin of 5 m is applied to each, giving a depth distance range of "25 m - 5 m = 20 m" to "32 m + 5 m = 37 m".
Further, for example, the irradiation determination unit 14 determines, as the irradiation range, the range from φ5 degrees to φ6 degrees in the left-right direction and from θ5 degrees to θ6 degrees in the up-down direction, based on the depth distance "20 m to 37 m" estimated by the depth distance estimation unit 13a (see FIG. 12B; the irradiation range in the left-right direction is not shown in FIG. 12B).
Note that, for the up-down direction of the irradiation range, the irradiation determination unit 14 calculates the first depth distance vertical angle based on the depth distance "20 m" and the second depth distance vertical angle based on the depth distance "37 m", and thereby determines the up-down direction of the irradiation range.
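The margin arithmetic above ("25 m - 5 m = 20 m" to "32 m + 5 m = 37 m") can be expressed as a small helper; this is an illustrative sketch only, and the function name and the zero floor on the near edge are assumptions not stated in the embodiment.

```python
def depth_distance_range_with_margin(near_edge_m: float, far_edge_m: float,
                                     margin_m: float = 5.0):
    """Expand the map-derived span [near_edge_m, far_edge_m] by margin_m
    on each side, clamping the near edge at zero so the range never
    starts behind the headlight."""
    return (max(near_edge_m - margin_m, 0.0), far_edge_m + margin_m)

# Example from the text: sidewalk edges at 25 m and 32 m, 5 m margin.
rng = depth_distance_range_with_margin(25.0, 32.0)
```

With the example distances, this reproduces the "20 m to 37 m" depth distance range.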
The headlight control unit 15 causes the headlights 2 to irradiate light onto the irradiation range determined by the irradiation determination unit 14, in the direction in which the driver is facing, based on the depth distance estimated by the depth distance estimation unit 13a in consideration of the driver's orientation and the travel-related information (here, the vehicle information and the map information).
In the situation shown in FIGS. 12A and 12B, the estimated visible object is estimated to be a pedestrian (see FIG. 9). As shown in FIGS. 12A and 12B, however, the direction in which the driver is now facing includes the road in the direction of travel of the vehicle 100 and the sidewalk on the far side of the intersection. In this case, the driver's estimated visible object is estimated to be pedestrian B (indicated by W2 in FIGS. 12A and 12B), who is on the sidewalk on the far side of the intersection or crossing the crosswalk on the road in the direction of travel.
The headlight control unit 15 can cause light to be irradiated onto pedestrian B, who is the driver's estimated visible object and is estimated to exist in the direction in which the driver is facing.
Even if there were a pedestrian A (indicated by W1 in FIG. 12A) on the sidewalk before the intersection on the route of the vehicle 100, pedestrian A does not exist in the direction in which the driver is facing, and is therefore not estimated to be the driver's estimated visible object. In other words, light is not irradiated onto pedestrian A.
In this way, the headlight control device 1a can appropriately control the lighting of the headlights 2 so that light is irradiated onto the driver's estimated visible object in accordance with the depth distance estimated using the orientation information, the vehicle information, the map information, and the depth distance estimation information.
In the headlight control device 1 according to the first embodiment, the depth distance estimation unit 13 estimates the depth distance from the orientation information and the vehicle information using the depth distance estimation information. The depth distance estimation unit 13 does not take map information into consideration when estimating the depth distance.
Therefore, suppose, for example, that in the headlight control device 1 according to the first embodiment the depth distance estimation unit 13 estimates the depth distance using the depth distance estimation information shown in FIG. 2, and that the driver's face is oriented "front" in the up-down direction and "right" in the left-right direction while the vehicle 100 is traveling at 30 km/h at a point 15 m before an intersection, intending to turn right at that intersection. In this case, the depth distance estimation unit 13 cannot estimate the depth distance until the driver turns the steering wheel to some extent (for example, 20 degrees or more) to make the right turn.
Under such circumstances, the depth distance estimation unit 13 can estimate the depth distance to be, for example, "15 m to 20 m" only after the driver has turned the steering wheel to some extent in the direction of travel, here to the right (see No. 2 in FIG. 2). This is because, until the steering wheel has been turned to the right to some extent, it is difficult to estimate the traveling state of the vehicle 100 and the driver's estimated visible object.
In contrast, in the headlight control device 1a according to the second embodiment, the depth distance estimation information used by the depth distance estimation unit 13a to estimate the depth distance includes map information. Note that, when generating the depth distance estimation information, an administrator or the like can, for example, estimate the estimated visible object based on the route of the vehicle 100 and set the depth distance accordingly.
The depth distance estimation unit 13a estimates the depth distance using the orientation information, the vehicle information, the map information, and the depth distance estimation information.
As a result, in the above example, the depth distance estimation unit 13a can estimate the depth distance at an earlier timing than the point at which the driver turns the steering wheel to some extent to make the right turn.
That is, the headlight control device 1a according to the second embodiment can perform lighting control of the headlights 2 according to the depth distance at an earlier timing than the headlight control device 1 according to the first embodiment. For example, compared with the headlight control device 1 according to the first embodiment, the headlight control device 1a according to the second embodiment can switch, at an earlier timing, from lighting control of the headlights 2 using the initial value of the depth distance to lighting control of the headlights 2 according to the depth distance estimated by the depth distance estimation unit 13a, in other words, lighting control in which light is irradiated onto the estimated visible object that is estimated to exist at the driver's estimated visible position.
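The effect described above can be sketched as a rule lookup over depth distance estimation information in the style of FIG. 8, in which an entry matches on the driver's orientation and travel-related conditions, including a map-derived distance to the next intersection. The table contents, field names, and thresholds below are illustrative assumptions, not the actual contents of FIG. 8.

```python
# Each rule: conditions on face direction, speed, and map context, plus
# the depth distance it yields (illustrative, not the real FIG. 8 table).
RULES = [
    {"no": 1, "face_lr": "front", "max_speed_kmh": 60.0,
     "needs_intersection_within_m": None, "depth": "initial value"},
    {"no": 2, "face_lr": "right", "max_speed_kmh": 40.0,
     "needs_intersection_within_m": 30.0,
     "depth": "intersection sidewalk and crosswalk with 5 m margin"},
]

def estimate_depth_rule(face_lr, speed_kmh, dist_to_intersection_m):
    """Return the first rule matched by the orientation information,
    vehicle information, and map information, or None."""
    for rule in RULES:
        if rule["face_lr"] != face_lr:
            continue
        if speed_kmh > rule["max_speed_kmh"]:
            continue
        need = rule["needs_intersection_within_m"]
        if need is not None and dist_to_intersection_m > need:
            continue
        return rule
    return None

# Face "right", 30 km/h, 15 m before the intersection: because the map
# information supplies the intersection distance, rule No. 2 matches
# before any steering input occurs.
matched = estimate_depth_rule("right", 30.0, 15.0)
```

The point of including the map condition is visible here: without the intersection distance, the "right turn" context would have to be inferred from steering angle, which arrives later.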
The operation of the headlight control device 1a according to the second embodiment will now be described.
FIG. 13 is a flowchart for explaining the operation of the headlight control device 1a according to the second embodiment.
For example, when the headlights 2 are turned on, the headlight control device 1a determines that lighting control of the headlights 2 based on the driver's orientation is to be performed, and starts the operation shown in the flowchart of FIG. 13. The headlight control device 1a repeats the operation shown in the flowchart of FIG. 13 until, for example, the headlights 2 are turned off or the power supply of the vehicle 100 is turned off.
For example, a control unit (not shown) of the headlight control device 1a acquires information indicating the state of the headlights 2 from a headlight switch mounted on the vehicle 100, and determines whether the headlights 2 are on. When the control unit determines that the headlights 2 are on, it determines that lighting control of the headlights 2 based on the driver's orientation is to be started, and outputs information instructing the start of lighting control of the headlights 2 to the orientation detection unit 11, the travel-related information acquisition unit 12a, the depth distance estimation unit 13a, the irradiation determination unit 14, and the headlight control unit 15.
When the control unit determines that the headlights 2 are off or that the vehicle 100 has been turned off, it determines that lighting control of the headlights 2 based on the driver's orientation is to be ended, and outputs information instructing the end of lighting control of the headlights 2 to the orientation detection unit 11, the travel-related information acquisition unit 12a, the depth distance estimation unit 13a, the irradiation determination unit 14, and the headlight control unit 15.
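The outer start/stop gating described above can be sketched as a loop that repeats one control cycle while the headlight switch and vehicle power signals remain on. The callables below stand in for the actual switch and power signals and for the cycle body (steps ST1-1 through ST4); they are illustrative assumptions.

```python
def run_lighting_control(headlight_switch_on, vehicle_power_on, do_one_cycle):
    """Repeat one control cycle while the headlights are on and the
    vehicle power is on; returns the number of cycles executed."""
    cycles = 0
    while headlight_switch_on() and vehicle_power_on():
        do_one_cycle()
        cycles += 1
    return cycles

# Simulated signals: the headlights stay on for three cycles, then turn off.
state = {"remaining": 3}

def switch_on():
    return state["remaining"] > 0

def power_on():
    return True

def cycle():
    state["remaining"] -= 1

n = run_lighting_control(switch_on, power_on, cycle)
```

In the real device the loop condition comes from the headlight switch and the vehicle power state rather than a counter.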
Regarding the operation shown in the flowchart of FIG. 13, the processing of step ST1-1, step ST1-2, and steps ST3 to ST4 is the same as the processing of step ST1-1, step ST1-2, and steps ST3 to ST4 in the operation of the headlight control device 1 shown in the flowchart of FIG. 5, which has already been described in the first embodiment; duplicate description is therefore omitted.
The map information acquisition unit 122 acquires map information from the travel-related information acquisition device 4 (step ST1-3).
The map information acquisition unit 122 outputs the acquired map information to the depth distance estimation unit 13a as travel-related information.
The depth distance estimation unit 13a estimates the depth distance using the orientation information regarding the driver's orientation detected by the orientation detection unit 11 in step ST1-1, the vehicle information acquired by the vehicle information acquisition unit 121 in step ST1-2, the map information acquired by the map information acquisition unit 122 in step ST1-3, and the depth distance estimation information (step ST2a).
In step ST3, the irradiation determination unit 14 determines the range of light irradiation by the headlights 2 based on the depth distance estimated by the depth distance estimation unit 13a in step ST2a (step ST3).
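One pass through the flowchart of FIG. 13 can be sketched as a simple pipeline. All callables below are stand-ins for the actual units, and the example values are illustrative assumptions.

```python
def one_control_cycle(detect_orientation, get_vehicle_info, get_map_info,
                      estimate_depth, decide_range, irradiate):
    """One pass of FIG. 13: ST1-1 detect orientation, ST1-2 acquire
    vehicle information, ST1-3 acquire map information, ST2a estimate
    the depth distance, ST3 decide the irradiation range, ST4 irradiate."""
    orientation = detect_orientation()                            # ST1-1
    vehicle_info = get_vehicle_info()                             # ST1-2
    map_info = get_map_info()                                     # ST1-3
    depth = estimate_depth(orientation, vehicle_info, map_info)   # ST2a
    irr_range = decide_range(depth)                               # ST3
    return irradiate(irr_range)                                   # ST4

result = one_control_cycle(
    lambda: ("front", "right"),
    lambda: {"speed_kmh": 30.0},
    lambda: {"dist_to_intersection_m": 15.0},
    lambda o, v, m: (20.0, 37.0),
    lambda depth: {"depth_m": depth},
    lambda r: r)
```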
In this way, the headlight control device 1a detects the driver's orientation based on the captured image of the vehicle interior and acquires the travel-related information (here, the vehicle information and the map information). The headlight control device 1a estimates the depth distance based on the orientation information regarding the detected driver's orientation and the acquired travel-related information, and determines the range of light irradiation by the headlights 2 based on the estimated depth distance. The headlight control device 1a then causes the headlights 2 to irradiate light onto the determined irradiation range.
Therefore, the headlight control device 1a can appropriately illuminate the driver's estimated visible position, and can provide driving assistance when the vehicle 100 travels at night or in other dark conditions.
By using map information to estimate the depth distance, the headlight control device 1a can estimate the depth distance even in situations in which it is difficult to estimate the depth distance from the driver's orientation and the vehicle information alone, and can thereby provide driving assistance when the vehicle 100 travels at night or in other dark conditions.
In the second embodiment described above, for example, the depth distance estimation unit 13a may estimate not only the depth distance but also an ideal width of the irradiation range. In this case, the depth distance estimation unit 13a outputs the depth distance information and information regarding the ideal width of the irradiation range to the irradiation determination unit 14.
The irradiation determination unit 14 determines the irradiation range based on the depth distance estimated by the depth distance estimation unit 13a and the ideal width of the irradiation range. Specifically, when the irradiation range calculated based on the depth distance estimated by the depth distance estimation unit 13a exceeds the ideal width of the irradiation range, the irradiation determination unit 14 determines the range up to the ideal width as the irradiation range.
Because the depth distance estimation unit 13a estimates the ideal width of the irradiation range and the irradiation determination unit 14 determines the irradiation range with the ideal width as an upper limit, the headlight control device 1a can reduce glare imparted to oncoming vehicles and the like.
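The width clamp described above can be sketched as follows. The embodiment does not state how the clamped range is positioned within the original range, so a symmetric clamp about the range centre is assumed here purely for illustration.

```python
def clamp_to_ideal_width(phi_min_deg: float, phi_max_deg: float,
                         ideal_width_deg: float):
    """If the horizontal irradiation range [phi_min_deg, phi_max_deg]
    exceeds the ideal width, narrow it symmetrically about its centre
    to exactly the ideal width; otherwise return it unchanged."""
    width = phi_max_deg - phi_min_deg
    if width <= ideal_width_deg:
        return (phi_min_deg, phi_max_deg)
    centre = (phi_min_deg + phi_max_deg) / 2.0
    half = ideal_width_deg / 2.0
    return (centre - half, centre + half)

clamped = clamp_to_ideal_width(-10.0, 30.0, 20.0)
unchanged = clamp_to_ideal_width(0.0, 10.0, 20.0)
```

Capping the width this way is what limits stray light toward oncoming traffic while keeping the range centred on the estimated visible position.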
In the second embodiment described above, the depth distance estimation unit 13a may also estimate an ideal light amount for the headlights 2 in addition to the depth distance. In this case, the depth distance estimation unit 13a outputs the depth distance information to the irradiation determination unit 14 and outputs information regarding the estimated ideal light amount to the headlight control unit 15.
The headlight control unit 15 causes the headlights 2 to irradiate light onto the irradiation range determined by the irradiation determination unit 14 at the ideal light amount estimated by the depth distance estimation unit 13a.
Because the depth distance estimation unit 13a estimates the ideal light amount and the headlight control unit 15 causes the headlights 2 to irradiate light at the ideal light amount, the headlight control device 1a can irradiate light at the amount assumed to be necessary for the driver to visually recognize the estimated visible object, according to the distance from the vehicle 100 to the illuminated position in the direction in which the driver is facing.
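The embodiment does not specify how the ideal light amount is computed. One physically motivated sketch, offered here only as an assumption, uses the inverse-square law for a point source: to hold a target illuminance E (lux) at depth distance d (m), a luminous intensity of roughly I = E * d^2 (candela) is required, so farther estimated visible positions demand more light.

```python
def ideal_luminous_intensity_cd(target_illuminance_lux: float,
                                depth_distance_m: float) -> float:
    """Point-source approximation: illuminance falls off as 1/d**2,
    so holding E constant at distance d needs I = E * d**2 candela."""
    return target_illuminance_lux * depth_distance_m ** 2

# Holding 5 lux at the near and far edges of the "20 m to 37 m" range.
near = ideal_luminous_intensity_cd(5.0, 20.0)
far = ideal_luminous_intensity_cd(5.0, 37.0)
```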
In the second embodiment described above, the depth distance estimation unit 13a may estimate the depth distance together with both the ideal width of the irradiation range and the ideal light amount.
In the second embodiment described above, the headlight control device 1a is an in-vehicle device mounted on the vehicle 100, and the orientation detection unit 11, the travel-related information acquisition unit 12a, the depth distance estimation unit 13a, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) are provided in the in-vehicle device. The configuration is not limited to this: some of the orientation detection unit 11, the travel-related information acquisition unit 12a, the depth distance estimation unit 13a, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) may be provided in the in-vehicle device of the vehicle 100, and the others may be provided in a server connected to the in-vehicle device via a network. Alternatively, all of the orientation detection unit 11, the travel-related information acquisition unit 12a, the depth distance estimation unit 13a, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) may be provided in the server.
The hardware configuration of the headlight control device 1a according to the second embodiment is the same as the hardware configuration of the headlight control device 1 described with reference to FIGS. 6A and 6B in the first embodiment, and is therefore not illustrated.
In the second embodiment, the functions of the orientation detection unit 11, the travel-related information acquisition unit 12a, the depth distance estimation unit 13a, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) are realized by a processing circuit 1001. That is, the headlight control device 1a includes the processing circuit 1001 for estimating the depth distance based on the travel-related information and the orientation information regarding the driver's orientation detected based on the vehicle-interior image acquired from the vehicle-interior imaging device 3, and for performing lighting control of the headlights 2 based on the estimated depth distance.
The processing circuit 1001 may be dedicated hardware as shown in FIG. 6A, or may be a processor 1004 that executes a program stored in a memory 1005 as shown in FIG. 6B.
The processing circuit 1001 reads out and executes the program stored in the memory 1005, thereby performing the functions of the orientation detection unit 11, the travel-related information acquisition unit 12a, the depth distance estimation unit 13a, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown). That is, the headlight control device 1a includes the memory 1005 for storing a program that, when executed by the processing circuit 1001, results in the execution of steps ST1-1, ST1-2, and ST1-3 to ST4 of FIG. 13 described above. The program stored in the memory 1005 can also be said to cause a computer to execute the procedures or methods of the orientation detection unit 11, the travel-related information acquisition unit 12a, the depth distance estimation unit 13a, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown).
The storage unit 16 is constituted by, for example, the memory 1005.
The headlight control device 1a includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the headlights 2, the vehicle-interior imaging device 3, and the travel-related information acquisition device 4.
As described above, the headlight control device 1a according to the second embodiment includes: the orientation detection unit 11 that detects the driver's orientation based on a captured image (vehicle-interior image) in which the driver of the vehicle 100 is imaged; the travel-related information acquisition unit 12a that acquires travel-related information related to the traveling of the vehicle 100; the depth distance estimation unit 13a that estimates the depth distance based on the orientation information regarding the driver's orientation detected by the orientation detection unit 11 and the travel-related information acquired by the travel-related information acquisition unit 12a; the irradiation determination unit 14 that determines the range of light irradiation by the headlights 2 based on the depth distance estimated by the depth distance estimation unit 13a; and the headlight control unit 15 that causes the headlights 2 to irradiate light onto the irradiation range determined by the irradiation determination unit 14.
Therefore, in the lighting control of the headlights 2 based on the direction in which the driver of the vehicle 100 is facing, the headlight control device 1a can perform lighting control that takes into account how far ahead in that direction the driver is actually trying to see.
The headlight control device 1a can appropriately illuminate the driver's estimated visible position, and can provide driving assistance when the vehicle 100 travels at night or in other dark conditions.
More specifically, in the headlight control device 1a, the travel-related information acquisition unit 12a includes the vehicle information acquisition unit 121 that acquires vehicle information regarding the vehicle 100 as travel-related information and the map information acquisition unit 122 that acquires map information as travel-related information, and the depth distance estimation unit 13a estimates the depth distance based on the orientation information, the vehicle information, and the map information.
Therefore, in the lighting control of the headlights 2 based on the direction in which the driver of the vehicle 100 is facing, the headlight control device 1a can perform lighting control that takes into account how far ahead in that direction the driver is actually trying to see.
The headlight control device 1a can appropriately illuminate the driver's estimated visible position, and can provide driving assistance when the vehicle 100 travels at night or in other dark conditions.
Embodiment 3.
In the first embodiment, the headlight control device estimates the depth distance based on the orientation information and the vehicle information.
In the third embodiment, an embodiment in which the depth distance is estimated based on the orientation information and information about the surroundings of the vehicle will be described.
FIG. 14 is a diagram showing a configuration example of a headlight control device 1b according to Embodiment 3.
In Embodiment 3, the headlight control device 1b is assumed to be mounted on the vehicle 100.
The headlight control device 1b controls the lighting of the headlights 2 provided on the vehicle 100 based on the orientation of the driver of the vehicle 100. In Embodiment 3, the "driver's orientation" is expressed by the driver's face direction or the driver's line-of-sight direction. In Embodiment 3, the "driver's orientation" may also include, in addition to the driver's face direction or line-of-sight direction, the orientation of the driver's body, in other words, the driver's posture.
In Embodiment 3, the lighting control of the headlights 2 based on the driver's orientation performed by the headlight control device 1b is assumed to be performed when the headlights 2 are turned on in a place where the surroundings of the vehicle 100 are dark, such as a parking lot at night or an urban area at night.
In Embodiment 3 below, when the term "driver's face direction" is used by itself, it refers to "the driver's face direction or line-of-sight direction", which also includes the driver's line-of-sight direction.
In FIG. 14, components similar to those of the headlight control device 1 described with reference to FIG. 1 in Embodiment 1 are given the same reference numerals, and duplicate descriptions are omitted.
The headlight control device 1b according to Embodiment 3 differs from the headlight control device 1 according to Embodiment 1 in that the driving-related information acquisition unit 12b includes a vehicle exterior information acquisition unit 123 in place of the vehicle information acquisition unit 121.
In addition, the specific operation of the depth distance estimation unit 13b in the headlight control device 1b according to Embodiment 3 differs from the specific operation of the depth distance estimation unit 13 in the headlight control device 1 according to Embodiment 1.
In Embodiment 3, the driving-related information acquisition device 4 is assumed to be a device such as a vehicle exterior imaging device (not shown) or a radar (not shown). The driving-related information acquisition device 4, such as the vehicle exterior imaging device or the radar, acquires, as driving-related information, information about objects around the vehicle 100 (hereinafter referred to as "vehicle exterior information"). The vehicle exterior information includes, for example, a captured image of the surroundings of the vehicle 100 (hereinafter referred to as a "vehicle exterior captured image") and information about the distance to objects present around the vehicle 100 (hereinafter referred to as "distance information").
The vehicle exterior imaging device images at least the area in front of the vehicle 100. The vehicle exterior imaging device may be configured to image not only the front of the vehicle 100 but also the sides or rear of the vehicle 100.
The vehicle exterior imaging device outputs the captured vehicle exterior image to the headlight control device 1b as driving-related information.
Note that although it is assumed here that one vehicle exterior imaging device is connected to the headlight control device 1b, this is merely an example. For example, a plurality of vehicle exterior imaging devices may be mounted on the vehicle 100, and the plurality of vehicle exterior imaging devices may be connected to the headlight control device 1b.
The radar acquires at least the distance to objects present in front of the vehicle 100. The radar may be configured to acquire the distance to objects present not only in front of the vehicle 100 but also to the sides or rear of the vehicle 100.
The radar outputs the acquired distance information about the distance to objects to the headlight control device 1b as driving-related information. The distance information includes, for example, information indicating the presence of an object, the position of the object, the distance to the object, and the like.
Note that although it is assumed here that one radar is connected to the headlight control device 1b, this is merely an example. For example, a plurality of radars may be mounted on the vehicle 100, and the plurality of radars may be connected to the headlight control device 1b. For example, a plurality of radars of mutually different types, such as a radar that acquires the distance to nearby objects and a radar that acquires the distance to distant objects, may be connected to the headlight control device 1b.
The vehicle exterior information acquisition unit 123 acquires the vehicle exterior information from the driving-related information acquisition device 4.
Specifically, the vehicle exterior information acquisition unit 123 acquires the vehicle exterior captured image from the vehicle exterior imaging device, and acquires the distance information from the radar.
The vehicle exterior information acquisition unit 123 outputs the acquired vehicle exterior information to the depth distance estimation unit 13b as driving-related information.
The depth distance estimation unit 13b estimates the depth distance based on the orientation information about the driver's orientation detected by the orientation detection unit 11 and the vehicle exterior information acquired by the vehicle exterior information acquisition unit 123.
Specifically, the depth distance estimation unit 13b estimates the depth distance by comparing the orientation information and the driving-related information (here, the vehicle exterior information) against the depth distance estimation information.
Here, FIG. 15 is a diagram showing an example of the contents of the depth distance estimation information used by the depth distance estimation unit 13b to estimate the depth distance in Embodiment 3.
As shown in FIG. 15, in Embodiment 3, the depth distance estimation information is, for example, a table in which the driver's behavior, vehicle exterior information, and a depth distance are associated with one another.
In the depth distance estimation information, the driver's behavior includes, for example, the vertical direction of the driver's face and the horizontal direction of the driver's face. The vertical direction of the driver's face is expressed as "front", "upward", or "downward". The horizontal direction of the driver's face is expressed as "front", "right", or "left".
In the depth distance estimation information, the vehicle exterior information includes, for example, information about objects present in front of the vehicle 100.
In Embodiment 3, after estimating the depth distance using the depth distance estimation information, the depth distance estimation unit 13b adjusts the estimated depth distance based on the vehicle exterior information, with reference to depth distance adjustment conditions (described in detail later). The depth distance estimation unit 13b then finalizes the adjusted depth distance as the estimated depth distance. The flow in which the depth distance estimation unit 13b estimates the depth distance, adjusts the estimated depth distance, and finalizes the estimated depth distance is described in detail below.
<Estimation of the depth distance>
First, the depth distance estimation unit 13b determines the driver's behavior based on the orientation information. For example, the depth distance estimation unit 13b determines, from the information indicating the driver's vertical face direction included in the orientation information, whether the vertical direction of the driver's face is "front", "upward", or "downward". Similarly, the depth distance estimation unit 13b determines, from the information indicating the driver's horizontal face direction included in the orientation information, whether the horizontal direction of the driver's face is "front", "right", or "left".
The method by which the depth distance estimation unit 13b determines whether the vertical and horizontal directions of the driver's face are "front", "upward", "downward", "right", or "left" is the same as the determination method of the depth distance estimation unit 13 already described in Embodiment 1, and a duplicate description is therefore omitted.
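As a rough illustration of the classification described above, the following sketch maps continuous face angles to the discrete categories used by the depth distance estimation table. The angle thresholds (10 degrees vertical, 15 degrees horizontal) and function names are illustrative assumptions; the actual determination method is the one described in Embodiment 1 and is not specified here.

```python
# Hypothetical sketch: classify the driver's face direction into the discrete
# categories used by the depth distance estimation table. The thresholds are
# assumed values, not taken from the patent.

def classify_vertical(pitch_deg: float, threshold: float = 10.0) -> str:
    """Map a vertical face angle (positive = looking up) to a table category."""
    if pitch_deg > threshold:
        return "upward"
    if pitch_deg < -threshold:
        return "downward"
    return "front"

def classify_horizontal(yaw_deg: float, threshold: float = 15.0) -> str:
    """Map a horizontal face angle (positive = looking right) to a table category."""
    if yaw_deg > threshold:
        return "right"
    if yaw_deg < -threshold:
        return "left"
    return "front"
```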
Furthermore, the depth distance estimation unit 13b determines objects present around the vehicle 100 (for example, signs, pedestrians, white lines, vehicles, and the like) based on the vehicle exterior information acquired by the vehicle exterior information acquisition unit 123. For example, the depth distance estimation unit 13b can determine objects present around the vehicle 100 by performing image recognition processing on the vehicle exterior captured image using a known image recognition technique. In addition, since the installation position and angle of view of the vehicle exterior imaging device and the installation position and detection range of the radar are known in advance, the depth distance estimation unit 13b can associate an object in the vehicle exterior captured image with an object indicated by the distance information. That is, based on the vehicle exterior captured image and the distance information, the depth distance estimation unit 13b can link an object captured in the vehicle exterior captured image with the distance from the vehicle 100 at which that object is present.
Note that the depth distance estimation unit 13b may perform the association between an object in the vehicle exterior captured image and an object indicated by the distance information using any of various known methods, for example, a method of regarding objects whose positions are close to each other as the same object based on the coordinates indicating the object's position in the vehicle exterior captured image and the coordinates indicating the object's position in the distance information. Furthermore, since the relationship between the installation position of the radar and the position of the vehicle 100 is known in advance, the depth distance estimation unit 13b can determine, based on the distance information, at what distance from the vehicle 100 an object detected by the radar is present.
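The nearest-coordinate association mentioned above can be sketched as follows. The shared ground-plane coordinate frame, the detection record layouts, and the 1.5 m gating threshold are all illustrative assumptions rather than details from the patent.

```python
# Hypothetical sketch: match camera detections to radar detections by treating
# the pair with the smallest positional difference as the same object, so the
# radar range can be attached to the recognized object.

import math

def associate(camera_objs, radar_objs, max_gap_m=1.5):
    """Pair each camera detection (label, x, y) with the nearest radar
    detection (x, y, range_m) lying within max_gap_m; unmatched camera
    detections are dropped."""
    pairs = []
    for label, cx, cy in camera_objs:
        best, best_d = None, max_gap_m
        for rx, ry, rng in radar_objs:
            d = math.hypot(cx - rx, cy - ry)
            if d < best_d:
                best, best_d = (rx, ry, rng), d
        if best is not None:
            # The matched radar range gives the object's distance from the vehicle.
            pairs.append((label, best[2]))
    return pairs
```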
Note that the depth distance estimation unit 13b may store the vehicle exterior information output from the vehicle exterior information acquisition unit 123 in the storage unit 16 together with its acquisition date and time, calculate the acceleration of a detected object from the past vehicle exterior information, and, when it determines that the calculated acceleration has changed significantly, in other words, that the object has moved abruptly, determine that the object is an erroneously detected object. The depth distance estimation unit 13b does not use vehicle exterior information about an object it has determined to be erroneously detected for estimating the depth distance.
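The acceleration-based misdetection check described above might be sketched as follows. The finite-difference formulation and the 20 m/s^2 plausibility limit are illustrative assumptions; the patent only states that a large acceleration change marks the object as erroneously detected.

```python
# Hypothetical sketch: compute an object's acceleration from its timestamped
# position history and flag physically implausible jumps as misdetections.

def is_misdetection(samples, accel_limit=20.0):
    """samples: list of (t_seconds, position_m) for one tracked object,
    oldest first. Returns True if the implied acceleration is implausible."""
    if len(samples) < 3:
        return False  # not enough history to judge
    (t0, p0), (t1, p1), (t2, p2) = samples[-3:]
    v1 = (p1 - p0) / (t1 - t0)        # earlier velocity
    v2 = (p2 - p1) / (t2 - t1)        # later velocity
    accel = (v2 - v1) / (t2 - t1)     # finite-difference acceleration
    return abs(accel) > accel_limit
```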
Alternatively, the vehicle exterior information acquisition unit 123 may perform the association between an object in the vehicle exterior captured image and an object indicated by the distance information. In this case, for example, the vehicle exterior information acquisition unit 123 outputs the vehicle exterior information to the depth distance estimation unit 13b in a form in which the corresponding objects can be identified.
The vehicle exterior information acquisition unit 123 may also determine, based on past vehicle exterior information, whether an object has been erroneously detected. In that case, the vehicle exterior information acquisition unit 123 does not output vehicle exterior information about an object it has determined to be erroneously detected to the depth distance estimation unit 13b.
The depth distance estimation unit 13b then estimates the depth distance by matching the determined behavior and the information indicating objects present around the vehicle 100 against the driver's behavior and the vehicle exterior information set in the depth distance estimation information as shown in FIG. 15, and thereby obtaining the corresponding depth distance information.
For example, suppose that the driver's face is currently facing "front", that there are a plurality of "sideways stopped vehicles" in front of the vehicle 100, and that there is a "pedestrian". Assume that the depth distance estimation information has the contents shown in FIG. 15.
In this case, using the orientation information about the driver's orientation, the driving-related information (here, the vehicle exterior information), and the depth distance estimation information, the depth distance estimation unit 13b estimates that the depth distance is "5 to 15 m" (see No. 1 of the depth distance estimation information in FIG. 15).
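The table lookup in this example can be sketched as follows. Only row No. 1 is reproduced from the text; the key layout, the object labels, and the 10 m fallback initial value are illustrative assumptions (the patent only says an initial value is used when no row matches).

```python
# Hypothetical sketch of the FIG. 15-style lookup: (driver behavior, exterior
# objects) -> depth distance range. Ranges are (min_m, max_m) tuples.

DEPTH_TABLE = {
    # (vertical face dir., horizontal face dir., set of exterior objects)
    ("front", "front",
     frozenset({"sideways stopped vehicles", "pedestrian"})): (5.0, 15.0),
}
INITIAL_DEPTH = (10.0, 10.0)  # assumed fallback when no row matches

def estimate_depth(vertical, horizontal, exterior_objects):
    """Return a (min_m, max_m) depth distance range for the given inputs."""
    key = (vertical, horizontal, frozenset(exterior_objects))
    return DEPTH_TABLE.get(key, INITIAL_DEPTH)
```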
For example, when the vertical direction of the driver's face is in the range "front to upward", there are a plurality of "sideways stopped vehicles" in front of the vehicle 100, and there is a "pedestrian", it is estimated that the vehicle 100 is about to park or stop and that the driver's estimated gaze target is a pedestrian (or a place where a pedestrian is likely to be). Accordingly, in the depth distance estimation information, the depth distance is set to "5 to 15 m", the depth distance at which such a pedestrian (or place where a pedestrian is likely to be) is assumed to be located when the driver checks for pedestrians while the vehicle 100 is about to park or stop (No. 1 of the depth distance estimation information in FIG. 15).
Similarly, for conditions No. 2 to No. 8 of the depth distance estimation information in FIG. 15, the depth distance is set based on the traveling state of the vehicle 100 estimated from the driver's behavior and the vehicle exterior information, and on the driver's estimated gaze target.
FIG. 16 is a diagram showing, for the depth distances corresponding to the input information under conditions No. 1 to No. 8 set in the depth distance estimation information shown in FIG. 15, examples of the traveling state of the vehicle 100 and the driver's gaze target estimated from the driver's behavior and the vehicle exterior information, which are the basis from which those depth distances are derived.
Note that although eight patterns of conditions, No. 1 to No. 8, are set in the depth distance estimation information shown in FIG. 15, this is merely an example. The contents of the depth distance estimation information shown in FIG. 15 are likewise merely an example.
For example, in the depth distance estimation information, information indicating an object present behind the vehicle and information indicating the type of the adjacent lane (a passing lane or an oncoming lane) may be set as vehicle exterior information. For example, when the object present behind the vehicle is a vehicle and the adjacent lane is a passing lane, the depth distance is set to a value that assumes the vehicle 100 is traveling on an expressway and the driver's estimated gaze target is a sign.
Also, for example, in the depth distance estimation information, information indicating an object present to the side of the vehicle and information indicating the height of that object may be set as vehicle exterior information. For example, when the object present to the side of the vehicle is a stationary object such as a building wall and the height of the stationary object is equal to or greater than a predetermined threshold, the depth distance is set to a value that assumes the vehicle 100 is traveling through an intersection with poor visibility and the driver's estimated gaze target is a pedestrian.
The depth distance estimation information need only be information from which depth distance information can be obtained from the orientation information and the driving-related information (here, the vehicle exterior information).
In addition, when the orientation information about the driver's orientation detected by the orientation detection unit 11 and the driving-related information acquired by the driving-related information acquisition unit 12b (here, the vehicle exterior information acquired by the vehicle exterior information acquisition unit 123) do not match the input information set in the depth distance estimation information, the depth distance estimation unit 13b, for example, treats the depth distance as not having been estimable from the depth distance estimation information and sets the depth distance to its initial value.
<Adjustment of the estimated depth distance>
After estimating the depth distance, the depth distance estimation unit 13b adjusts the estimated depth distance based on the estimated depth distance and the vehicle exterior information, and finalizes the estimated depth distance. Note that the depth distance estimation unit 13b does not perform this adjustment when the depth distance estimated based on the depth distance estimation information does not have a range.
Specifically, the depth distance estimation unit 13b adjusts the depth distance based on the estimated depth distance and the vehicle exterior information, with reference to depth distance adjustment conditions for adjusting the depth distance.
The depth distance adjustment conditions are generated in advance by an administrator or the like and are stored in a location that the depth distance estimation unit 13b can reference.
For example, conditions such as the following (Condition 1) to (Condition 4) are set as the depth distance adjustment conditions.
(Condition 1)
When a moving object is present in front of the vehicle in the direction the driver is facing and the distance from the headlights to the moving object is within the range of the depth distance, the estimated depth distance is set to the distance from the headlights to the moving object.
(Condition 2)
When a moving object is present in front of the vehicle in the direction the driver is facing and the distance from the headlights to the moving object is outside the range of the depth distance, the depth distance based on the depth distance estimation information is used as the depth distance as it is.
(Condition 3)
When no moving object is present in front of the vehicle in the direction the driver is facing but an object other than a moving object, in other words, a stationary object, is present around the vehicle, and the distance from the headlights to the stationary object is smaller than the depth distance estimated based on the depth distance estimation information, the estimated depth distance is set to (the distance from the headlights to the stationary object + a preset addition distance).
(Condition 4)
When no moving object is present in front of the vehicle in the direction the driver is facing and no stationary object is present around the vehicle, the depth distance estimated based on the depth distance estimation information is used as the depth distance as it is.
Likewise, when no moving object is present in front of the vehicle in the direction the driver is facing and a stationary object is present around the vehicle but the distance from the headlights to the stationary object is outside the range of the depth distance estimated based on the depth distance estimation information, the depth distance estimated based on the depth distance estimation information is used as the depth distance as it is.
In the depth distance adjustment conditions, moving objects include people.
In the depth distance adjustment conditions, the addition distance is set in advance by an administrator or the like and is stored in a location that the depth distance estimation unit 13b can reference. The addition distance is set, for example, to a distance between 2 and 5 m. Here, as an example, it is assumed that the addition distance is set to "3 m". The addition distance may also be set to a value with a range.
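The four adjustment conditions above can be sketched as a single function. Representing the estimated depth distance as a (min_m, max_m) range, comparing the stationary-object distance in Condition 3 against the lower bound of that range, and the function and parameter names are all illustrative assumptions; the 3 m addition distance follows the text's example.

```python
# Hypothetical sketch of (Condition 1)-(Condition 4): adjust an estimated
# depth distance range using the distances to detected objects.

ADDITION_DISTANCE_M = 3.0  # example value from the text (set between 2 and 5 m)

def adjust_depth(depth_range, moving_obj_dist=None, stationary_obj_dist=None):
    """depth_range: (min_m, max_m) from the estimation table.
    moving_obj_dist: headlight-to-moving-object distance in the driver's
    facing direction, or None; stationary_obj_dist: likewise for a
    stationary object around the vehicle. Returns the final (min, max)."""
    lo, hi = depth_range
    if moving_obj_dist is not None:
        if lo <= moving_obj_dist <= hi:
            # Condition 1: snap the depth distance to the moving object.
            return (moving_obj_dist, moving_obj_dist)
        return (lo, hi)  # Condition 2: keep the table value as is.
    if stationary_obj_dist is not None and stationary_obj_dist < lo:
        # Condition 3: stationary object closer than the estimate.
        d = stationary_obj_dist + ADDITION_DISTANCE_M
        return (d, d)
    return (lo, hi)  # Condition 4: keep the table value as is.
```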
Based on the vehicle exterior information acquired by the vehicle exterior information acquisition unit 123, the depth distance estimation unit 13b can determine whether there is an object in front of the vehicle 100 in the direction the driver is facing and, if there is, its type, here whether it is a moving object or a stationary object.
Since the installation position and detection range of the radar are known in advance, the depth distance estimation unit 13b can determine whether there is an object in the direction the driver is facing. Note that the depth distance estimation unit 13b can determine the direction the driver is facing from the orientation information. Furthermore, as described above, the depth distance estimation unit 13b can associate an object in the vehicle exterior captured image with an object indicated by the distance information. Therefore, when the depth distance estimation unit 13b determines that there is an object in the direction the driver is facing, it can determine the type of the object, in other words, whether it is a moving object or a stationary object, for example, by performing image recognition processing on the vehicle exterior captured image using a known image recognition technique. The depth distance estimation unit 13b may also determine the type of the object using the distance information, which includes, for example, information about the object's speed.
Furthermore, since the relationship among the installation position of the radar, the position of the vehicle 100, and the installation position of the headlights 2 is known in advance, the depth distance estimation unit 13b can determine, based on this positional relationship and the distance information, at what distance from the headlights 2 an object detected by the radar is present. Note that the depth distance estimation unit 13b may treat, as the "direction the driver is facing" for which it determines whether an object is present, a range widened in the vertical and horizontal directions by preset angles centered on the driver's face direction.
Note that the depth distance adjustment conditions described above are merely an example, and the depth distance adjustment conditions can be set as appropriate.
The depth distance adjustment conditions may also be information in table form in which the type of a detected object, information indicating the difference between the distance from the headlights 2 to the object and the depth distance estimated based on the depth distance estimation information, and information indicating the adjusted depth distance are associated with one another.
After adjusting the depth distance, the depth distance estimation unit 13b sets the adjusted depth distance as the estimated depth distance and outputs the depth distance information to the irradiation determination unit 14.
Note that when the driver's behavior and vehicle exterior information determined based on the orientation information about the driver's orientation detected by the orientation detection unit 11 and the driving-related information acquired by the driving-related information acquisition unit 12b (here, the vehicle exterior information acquired by the vehicle exterior information acquisition unit 123) do not match the driver's behavior and vehicle exterior information set as input information in the depth distance estimation information, the depth distance estimation unit 13b, for example, treats the depth distance as not having been estimable from the depth distance estimation information and sets the depth distance to its initial value.
The depth distance estimation unit 13b outputs depth distance information about the depth distance set to the initial value to the irradiation determination unit 14.
The irradiation determination unit 14 determines the range irradiated with light by the headlight 2 based on the depth distance estimated by the depth distance estimation unit 13b, and outputs the irradiation information to the headlight control unit 15. The headlight control unit 15 causes the headlight 2 to irradiate the irradiation range determined by the irradiation determination unit 14 with light.
Here, FIGS. 17, 18, and 19 are diagrams for explaining examples in which, in the third embodiment, the irradiation determination unit 14 determines the irradiation range based on the depth distance estimated by the depth distance estimation unit 13b, and the headlight control unit 15 causes the headlight 2 to irradiate the determined irradiation range with light. It is assumed that the depth distance estimation information used by the depth distance estimation unit 13b to estimate the depth distance has the contents shown in FIG. 15, and that the depth distance estimation unit 13b adjusts the depth distance according to the depth distance adjustment conditions (Condition 1) to (Condition 4) described above.
FIGS. 17, 18, and 19 each show a side view of the road on which the vehicle 100 is traveling. In FIGS. 17, 18, and 19, the driver is indicated by "D", and the range irradiated with light by the headlight 2 is indicated by "LA".
In the example shown in FIG. 17, the driver's face is oriented within the "front" range in the vertical direction, and the vehicle 100 is traveling in a situation where there are a plurality of stopped vehicles ahead of it (indicated by "C" in FIG. 17; only one stopped vehicle is shown for simplicity of explanation) and a pedestrian (indicated by "W" in FIG. 17). The pedestrian is assumed to be at a distance of 12 m from the headlight 2 in the direction the driver is facing.
In this case, the depth distance estimation unit 13b determines, based on the depth distance estimation information, that the driver's face orientation and the travel-related information (here, the outside-vehicle information) correspond to No. 1 of the depth distance estimation information, and estimates the depth distance to be "5 to 15 m". The depth distance estimation unit 13b then adjusts the depth distance estimated as "5 to 15 m". In the example shown in FIG. 17, a pedestrian exists within the range of "5 to 15 m" in the direction the driver is facing, so the depth distance estimation unit 13b adjusts the depth distance and sets "12 m" as the estimated depth distance.
The irradiation determination unit 14 determines the irradiation range based on the depth distance "12 m" estimated by the depth distance estimation unit 13b. Here, the irradiation determination unit 14 is assumed to have determined the range from φ7 to φ8 degrees in the horizontal direction and from θ7 to θ8 degrees in the vertical direction as the irradiation range (the horizontal irradiation range is not shown in FIG. 17).
Note that, for the vertical direction of the irradiation range, the irradiation determination unit 14 here calculates the depth distance vertical angle based on the depth distance "12 m" and determines, as the vertical angle range of the irradiation range, the range obtained by widening that angle vertically by a preset angle.
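The "depth distance vertical angle" can be understood geometrically: for a headlight mounted at height h above the road, the depression angle toward a point on the road surface at depth distance d is atan(h/d). The sketch below illustrates this computation with a hypothetical mounting height and widening margin; none of the numeric values come from this publication:

```python
import math

def depth_distance_vertical_angle(depth_m: float, mount_height_m: float = 0.7) -> float:
    """Depression angle (degrees below horizontal) from a headlight at
    mount_height_m toward the road surface at the given depth distance.
    The mounting height is a hypothetical placeholder."""
    return math.degrees(math.atan2(mount_height_m, depth_m))

def vertical_irradiation_range(depth_m: float, margin_deg: float = 2.0):
    """Widen the depth distance vertical angle by a preset margin on
    both sides, as described for a single estimated depth distance
    such as 12 m. The margin value is illustrative."""
    angle = depth_distance_vertical_angle(depth_m)
    return (angle - margin_deg, angle + margin_deg)
```

Note that the depression angle shrinks as the depth distance grows, so a farther estimated visual recognition position corresponds to a beam aimed closer to horizontal.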
The headlight control unit 15 causes the headlight 2 to irradiate light, in the direction the driver is facing, onto the irradiation range determined by the irradiation determination unit 14 based on the depth distance estimated by the depth distance estimation unit 13b in consideration of the driver's orientation and the travel-related information (here, the outside-vehicle information).
As a result, the headlight control unit 15 can control the headlight 2 so that light is irradiated onto the pedestrian present in the direction the driver is facing, and the driver can visually recognize the pedestrian.
The depth distance estimated using the depth distance estimation information is not estimated in consideration of objects actually present in the direction the driver is facing. However, if a moving object actually exists within the estimated depth distance range in the direction the driver is facing, there is a high probability that the driver is trying to visually recognize that moving object. By having the depth distance estimation unit 13b adjust the depth distance based on a moving object actually present in the direction the driver is facing, light comes to be irradiated onto an actually present moving object (here, a pedestrian) that has a high probability of being the estimated visual recognition target.
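The adjustment behavior contrasted between FIGS. 17 and 18 can be sketched as follows: when a moving object is detected in the driver's direction within the estimated range, the depth distance collapses to that object's distance; otherwise the estimated range is kept unchanged. This is an illustrative reconstruction, not code from the publication:

```python
def adjust_for_moving_object(est_range, object_distance=None):
    """est_range: (near, far) in metres from the depth distance
    estimation information, e.g. (5.0, 15.0).
    object_distance: distance to a moving object detected in the
    direction the driver is facing, or None if there is none.
    Returns a single finalized depth distance when the object lies
    within the estimated range, otherwise the range unchanged."""
    near, far = est_range
    if object_distance is not None and near <= object_distance <= far:
        return object_distance   # e.g. FIG. 17: pedestrian at 12 m -> 12 m
    return est_range             # e.g. FIG. 18: pedestrian at 20 m -> (5, 15)
```

The same function covers the FIG. 18 case discussed below, since an object beyond the estimated range leaves the estimate untouched.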
In the example shown in FIG. 18, as in the example of FIG. 17, the driver's face is oriented within the "front" range in the vertical direction, and the vehicle 100 is traveling in a situation where there are a plurality of stopped vehicles ahead of it (indicated by "C" in FIG. 18; only one stopped vehicle is shown for simplicity of explanation) and a pedestrian (indicated by "W" in FIG. 18). However, the pedestrian is assumed to be at a distance of 20 m from the headlight 2 in the direction the driver is facing.
In this case, since there is no pedestrian within the range of "5 to 15 m" in the direction the driver is facing, the depth distance estimation unit 13b leaves the depth distance "5 to 15 m" estimated based on the depth distance estimation information unchanged as the estimated depth distance.
The irradiation determination unit 14 determines the irradiation range based on the depth distance "5 to 15 m" estimated by the depth distance estimation unit 13b. Here, the irradiation determination unit 14 is assumed to have determined the range from φ9 to φ10 degrees in the horizontal direction and from θ9 to θ10 degrees in the vertical direction as the irradiation range (the horizontal irradiation range is not shown in FIG. 18).
Note that, for the vertical direction of the irradiation range, the irradiation determination unit 14 here calculates a first depth distance vertical angle based on the depth distance "5 m" and a second depth distance vertical angle based on the depth distance "15 m", and determines the vertical extent of the irradiation range from these.
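When the estimated depth distance remains a range, the vertical extent is bounded by a first and second depth distance vertical angle computed from the near and far ends. A hedged sketch using the same atan geometry, with a hypothetical mounting height:

```python
import math

def vertical_range_from_depth_range(near_m: float, far_m: float,
                                    mount_height_m: float = 0.7):
    """Compute the first (near-end) and second (far-end) depth distance
    vertical angles, in degrees below horizontal, and return them as
    (shallower, steeper); the irradiation range spans between them.
    The mounting height is a hypothetical value."""
    first = math.degrees(math.atan2(mount_height_m, near_m))   # steeper
    second = math.degrees(math.atan2(mount_height_m, far_m))   # shallower
    return second, first
```

For the "5 to 15 m" range, the far end yields the upper (shallower) edge of the beam and the near end the lower (steeper) edge, so the whole range is lit at once.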
The headlight control unit 15 causes the headlight 2 to irradiate light, in the direction the driver is facing, onto the irradiation range determined by the irradiation determination unit 14 based on the depth distance estimated by the depth distance estimation unit 13b in consideration of the driver's orientation and the travel-related information (here, the outside-vehicle information).
As a result, the headlight control unit 15 controls the headlight 2 so that light is irradiated onto the driver's estimated visual recognition target. Here, the position where the pedestrian actually exists is not included in the irradiation range of the headlight 2; that is, the pedestrian is not irradiated with the light of the headlight 2. A pedestrian located, as seen from the headlight 2, farther than the depth distance estimated using the depth distance estimation information is presumed not to be the estimated visual recognition target; rather, the estimated visual recognition target is presumed to exist at a distance closer than that pedestrian. Therefore, when the depth distance estimation unit 13b adjusts the depth distance, even if a pedestrian actually exists in the direction the driver is facing, if that pedestrian is located farther than the depth distance estimated using the depth distance estimation information, the unit does not, for example, increase the depth distance, but leaves the estimated depth distance unchanged.
As a result, the headlight control device 1b can control the headlight 2 so that light is not irradiated onto a pedestrian who, even though actually present, is unlikely to be the estimated visual recognition target, and is instead irradiated onto the range where the estimated visual recognition target is presumed more likely to exist.
In the example shown in FIG. 19, the driver's face is oriented within the "front" range in the vertical direction, and the vehicle 100 is traveling in a situation where there are a plurality of stopped vehicles ahead of it (indicated by "C" in FIG. 19; only one stopped vehicle is shown for simplicity of explanation). There are no moving objects such as pedestrians around the vehicle 100. The distance to the stopped vehicle is assumed to be "4 m".
In this case, there is no moving object in the direction the driver is facing, but there is a stopped vehicle, and the distance from the headlight 2 to the stopped vehicle relates to the range of "5 to 15 m" estimated based on the depth distance estimation information; therefore, the depth distance estimation unit 13b adjusts the depth distance and sets "7 m" as the estimated depth distance.
The irradiation determination unit 14 determines the irradiation range based on the depth distance "7 m" estimated by the depth distance estimation unit 13b. Here, the irradiation determination unit 14 is assumed to have determined the range from φ11 to φ12 degrees in the horizontal direction and from θ11 to θ12 degrees in the vertical direction as the irradiation range (the horizontal irradiation range is not shown in FIG. 19).
Note that, for the vertical direction of the irradiation range, the irradiation determination unit 14 here calculates the depth distance vertical angle based on the depth distance "7 m" and determines, as the vertical angle range of the irradiation range, the range obtained by widening that angle vertically by a preset angle.
The headlight control unit 15 causes the headlight 2 to irradiate light, in the direction the driver is facing, onto the irradiation range determined by the irradiation determination unit 14 based on the depth distance estimated by the depth distance estimation unit 13b in consideration of the driver's orientation and the travel-related information (here, the outside-vehicle information).
As a result, the headlight control unit 15 controls the headlight 2 so that light is irradiated onto the driver's estimated visual recognition target. Here, the light of the headlight 2 is irradiated onto the range beyond the stopped vehicle as seen from the vehicle 100. For example, when there is a stationary object in the direction the driver is facing, a moving object such as a pedestrian may run out from behind the stationary object, and the driver is presumed to be visually checking for such a run-out. By having the depth distance estimation unit 13b adjust the depth distance, when the direction the driver is facing presents a situation where a run-out is likely to occur, the depth distance is adjusted so that the place where the run-out is likely to occur becomes the estimated visual recognition position.
Thereby, the headlight control device 1b can control the headlight 2 so that light is irradiated onto the range where the estimated visual recognition target is presumed more likely to exist.
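The FIG. 19 behavior, shifting the depth distance to a point beyond a stationary object where a run-out from behind it could occur, can be sketched as follows. The 3 m offset reproducing "4 m observed, 7 m estimated" is a guess for illustration only; the publication does not state how the adjusted value is derived:

```python
def adjust_for_stationary_object(est_range, object_distance,
                                 beyond_offset_m=3.0):
    """When only a stationary object (e.g. a stopped vehicle) is present
    in the driver's direction, place the estimated visual recognition
    position beyond it, where a pedestrian might run out. The offset is
    a hypothetical value chosen so that 4 m maps to 7 m as in FIG. 19;
    the result is kept within the estimated range where possible."""
    near, far = est_range
    adjusted = object_distance + beyond_offset_m
    return min(max(adjusted, near), far)
```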
For example, the headlight control device 1 according to the first embodiment did not estimate the depth distance in consideration of objects actually present around the vehicle 100.
In contrast, as described above, the headlight control device 1b according to the third embodiment estimates the depth distance based on the outside-vehicle information; more specifically, it adjusts the depth distance estimated based on the depth distance estimation information using the outside-vehicle information and thereby finalizes the depth distance, so that the estimation takes into account objects actually present around the vehicle 100. The headlight control device 1b thus estimates a depth distance that better matches the actual situation, based on whether an object actually exists in the direction the driver is facing.
Thereby, the headlight control device 1b can more appropriately illuminate the driver's estimated visual recognition position, and can provide driving support when the vehicle 100 travels at night or the like.
The operation of the headlight control device 1b according to the third embodiment will now be described.
FIG. 20 is a flowchart for explaining the operation of the headlight control device 1b according to the third embodiment.
For example, when the headlight 2 is turned on, the headlight control device 1b determines that lighting control of the headlight 2 based on the driver's orientation is to be performed, and starts the operation shown in the flowchart of FIG. 20. The headlight control device 1b repeats this operation, for example, until the headlight 2 is turned off or the power of the vehicle 100 is turned off.
For example, the control unit (not shown) of the headlight control device 1b acquires information indicating the state of the headlight 2 from a headlight switch mounted on the vehicle 100, and determines whether the headlight 2 is on. When the control unit determines that the headlight 2 is on, it determines to start lighting control of the headlight 2 based on the driver's orientation, and outputs information instructing the start of lighting control of the headlight 2 to the orientation detection unit 11, the travel-related information acquisition unit 12b, the depth distance estimation unit 13b, the irradiation determination unit 14, and the headlight control unit 15.
When the control unit determines that the headlight 2 is off or that the vehicle 100 has been turned off, it determines to end the lighting control of the headlight 2 based on the driver's orientation, and outputs information instructing the end of lighting control of the headlight 2 to the orientation detection unit 11, the travel-related information acquisition unit 12b, the depth distance estimation unit 13b, the irradiation determination unit 14, and the headlight control unit 15.
In the operation shown in the flowchart of FIG. 20, the processing contents of step ST1-1 and steps ST3 to ST4 are the same as those of the corresponding steps of the headlight control device 1 shown in the flowchart of FIG. 5 and already described in the first embodiment, so duplicate description is omitted.
The outside-vehicle information acquisition unit 123 acquires the outside-vehicle information from the travel-related information acquisition device 4 (step ST1-4).
Specifically, the outside-vehicle information acquisition unit 123 acquires an outside-vehicle captured image from the outside-vehicle imaging device and acquires distance information from the radar.
The outside-vehicle information acquisition unit 123 outputs the acquired outside-vehicle information to the depth distance estimation unit 13b as travel-related information.
The depth distance estimation unit 13b estimates the depth distance using the orientation information regarding the driver's orientation detected by the orientation detection unit 11 in step ST1-1, the outside-vehicle information acquired by the outside-vehicle information acquisition unit 123 in step ST1-4, and the depth distance estimation information (step ST2b).
Specifically, after estimating the depth distance using the depth distance estimation information, the depth distance estimation unit 13b adjusts the estimated depth distance based on the outside-vehicle information with reference to the depth distance adjustment conditions, and finalizes the estimated depth distance.
The depth distance estimation unit 13b outputs the depth distance information to the irradiation determination unit 14.
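Step ST2b, estimating from the depth distance estimation information, adjusting using the outside-vehicle information, and falling back to an initial value when no entry matches, could be organized as below. All names, keys, and the initial value are hypothetical, not taken from the publication:

```python
DEFAULT_DEPTH_M = 10.0  # hypothetical initial value of the depth distance

def estimate_depth_distance(orientation, outside_info, estimation_table):
    """Sketch of step ST2b: look up (driver orientation, scene derived
    from outside-vehicle information) in the estimation table; if no
    entry matches, fall back to the initial value; otherwise adjust the
    looked-up range using a detected moving object before finalizing."""
    entry = estimation_table.get((orientation, outside_info.get("scene")))
    if entry is None:
        return DEFAULT_DEPTH_M            # estimation failed -> initial value
    near, far = entry
    obj = outside_info.get("moving_object_distance")
    if obj is not None and near <= obj <= far:
        return obj                        # adjust to the moving object
    return (near, far)                    # keep the estimated range
```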
In step ST3, the irradiation determination unit 14 determines the range irradiated with light by the headlight 2 based on the depth distance estimated by the depth distance estimation unit 13b in step ST2b (step ST3).
In this way, the headlight control device 1b detects the driver's orientation based on the in-vehicle captured image and acquires the travel-related information (here, the outside-vehicle information). The headlight control device 1b estimates the depth distance based on the detected orientation information regarding the driver's orientation and the acquired travel-related information. In doing so, the headlight control device 1b adjusts the depth distance estimated using the depth distance estimation information based on the outside-vehicle information and the depth distance adjustment conditions, and treats the adjusted depth distance as the estimated depth distance. The headlight control device 1b then determines the range irradiated with light by the headlight 2 based on the estimated depth distance, and causes the headlight 2 to irradiate the determined irradiation range with light.
Therefore, the headlight control device 1b can more appropriately illuminate the driver's estimated visual recognition position, and can provide driving support when the vehicle 100 travels at night or the like.
In the third embodiment described above, the depth distance estimation unit 13b may, for example, estimate the ideal width of the irradiation range in addition to the depth distance. In this case, the depth distance estimation unit 13b outputs the depth distance information and information regarding the ideal width of the irradiation range to the irradiation determination unit 14.
The irradiation determination unit 14 determines the irradiation range based on the depth distance estimated by the depth distance estimation unit 13b and the ideal width of the irradiation range. Specifically, when the irradiation range calculated based on the estimated depth distance exceeds the ideal width, the irradiation determination unit 14 determines the range up to the ideal width as the irradiation range.
By having the depth distance estimation unit 13b estimate the ideal width of the irradiation range and the irradiation determination unit 14 determine the irradiation range with the ideal width as an upper limit, the headlight control device 1b can reduce glare given to oncoming vehicles and the like.
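Capping the irradiation range at the ideal width, as described, amounts to a simple clamp. A minimal sketch with hypothetical angular units:

```python
def cap_irradiation_width(calculated_width_deg: float,
                          ideal_width_deg: float) -> float:
    """Return the irradiation width actually used: the width calculated
    from the estimated depth distance, but never more than the ideal
    width, so that glare toward oncoming vehicles is limited."""
    return min(calculated_width_deg, ideal_width_deg)
```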
Also in the third embodiment described above, the depth distance estimation unit 13b may estimate the ideal light amount of the headlight 2 in addition to the depth distance. In this case, the depth distance estimation unit 13b outputs the depth distance information to the irradiation determination unit 14, and outputs information regarding the estimated ideal light amount to the headlight control unit 15.
The headlight control unit 15 causes the headlight 2 to irradiate light at the ideal light amount estimated by the depth distance estimation unit 13b over the irradiation range determined by the irradiation determination unit 14.
By having the depth distance estimation unit 13b estimate the ideal light amount and the headlight control unit 15 cause the headlight 2 to irradiate light at that amount, the headlight control device 1b can irradiate light, in the direction the driver is facing, at the amount of light assumed to be necessary for the driver to visually recognize the estimated visual recognition target, according to the distance from the vehicle 100 at which the light is irradiated.
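One plausible way to derive a distance-dependent ideal light amount, though the publication does not specify the relation, is to scale the required luminous intensity with the square of the depth distance, since illuminance at the target falls off as 1/d² (E = I / d²). The target illuminance below is a hypothetical placeholder:

```python
def ideal_light_amount(depth_m: float,
                       target_illuminance_lux: float = 10.0) -> float:
    """Luminous intensity (candela) needed to produce the target
    illuminance at the estimated depth distance, via the inverse-square
    law E = I / d^2. The target illuminance is an assumed value, not
    one given in the publication."""
    return target_illuminance_lux * depth_m ** 2
```

Under this model, doubling the depth distance quadruples the required light amount, which matches the intuition that a farther estimated visual recognition target needs a brighter beam.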
In the third embodiment described above, the depth distance estimation unit 13b may estimate the depth distance together with both the ideal width of the irradiation range and the ideal light amount.
Also, in the third embodiment described above, the headlight control device 1b is an in-vehicle device mounted on the vehicle 100, and the orientation detection unit 11, the travel-related information acquisition unit 12b, the depth distance estimation unit 13b, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) are provided in the in-vehicle device. The configuration is not limited to this: some of the orientation detection unit 11, the travel-related information acquisition unit 12b, the depth distance estimation unit 13b, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) may be provided in the in-vehicle device of the vehicle 100 and the others in a server connected to the in-vehicle device via a network, or all of these units may be provided in the server.
The hardware configuration of the headlight control device 1b according to the third embodiment is the same as that of the headlight control device 1 described using FIGS. 6A and 6B in the first embodiment, so illustration thereof is omitted.
In the third embodiment, the functions of the orientation detection unit 11, the travel-related information acquisition unit 12b, the depth distance estimation unit 13b, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) are realized by a processing circuit 1001. That is, the headlight control device 1b includes the processing circuit 1001 for estimating the depth distance based on the travel-related information and the orientation information regarding the driver's orientation detected based on the in-vehicle captured image acquired from the in-vehicle imaging device 3, and for performing lighting control of the headlight 2 based on the estimated depth distance.
The processing circuit 1001 may be dedicated hardware as shown in FIG. 6A, or may be a processor 1004 that executes a program stored in a memory 1005 as shown in FIG. 6B.
The processing circuit 1001 reads out and executes the program stored in the memory 1005, thereby executing the functions of the orientation detection unit 11, the travel-related information acquisition unit 12b, the depth distance estimation unit 13b, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown). That is, the headlight control device 1b includes the memory 1005 for storing a program which, when executed by the processing circuit 1001, results in the execution of step ST1-1 and steps ST1-4 to ST4 of FIG. 20 described above. The program stored in the memory 1005 can also be said to cause a computer to execute the processing procedures or methods of the orientation detection unit 11, the travel-related information acquisition unit 12b, the depth distance estimation unit 13b, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown).
The storage unit 16 is constituted by, for example, the memory 1005.
The headlight control device 1b includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the headlight 2, the in-vehicle imaging device 3, and the travel-related information acquisition device 4.
As described above, the headlight control device 1b according to the third embodiment includes the orientation detection unit 11 that detects the orientation of the driver based on a captured image in which the driver of the vehicle 100 is imaged (the in-vehicle captured image); the travel-related information acquisition unit 12b that acquires travel-related information related to the travel of the vehicle 100; the depth distance estimation unit 13b that estimates the depth distance based on the orientation information regarding the driver's orientation detected by the orientation detection unit 11 and the travel-related information acquired by the travel-related information acquisition unit 12b; the irradiation determination unit 14 that determines the irradiation range of light by the headlights 2 based on the depth distance estimated by the depth distance estimation unit 13b; and the headlight control unit 15 that causes the headlights 2 to irradiate light onto the irradiation range determined by the irradiation determination unit 14.
Therefore, in controlling the lighting of the headlights 2 of the vehicle 100 based on the direction in which the driver is facing, the headlight control device 1b can perform lighting control that takes into account how far ahead in that direction the driver is actually trying to see.
The headlight control device 1b can thus more appropriately illuminate the driver's estimated viewing position, and can provide driving support when the vehicle 100 travels at night or the like.
Specifically, in the headlight control device 1b, the travel-related information acquisition unit 12b has the vehicle-exterior information acquisition unit 123 that acquires, as travel-related information, vehicle-exterior information regarding the area ahead of the vehicle 100, and the depth distance estimation unit 13b estimates the depth distance based on the orientation information and the vehicle-exterior information.
Therefore, in controlling the lighting of the headlights 2 of the vehicle 100 based on the direction in which the driver is facing, the headlight control device 1b can perform lighting control that takes into account how far ahead in that direction the driver is actually trying to see.
The headlight control device 1b can thus more appropriately illuminate the driver's estimated viewing position, and can provide driving support when the vehicle 100 travels at night or the like.
Embodiment 4.
In the second embodiment, the headlight control device of the first embodiment acquires map information in addition to vehicle information, and estimates the depth distance based on the orientation information, the vehicle information, and the map information.
In the third embodiment, the headlight control device of the first embodiment acquires vehicle-exterior information instead of vehicle information, and estimates the depth distance based on the orientation information and the vehicle-exterior information.
In the fourth embodiment, an embodiment that combines the second embodiment and the third embodiment will be described.
FIG. 21 is a diagram showing a configuration example of a headlight control device 1c according to the fourth embodiment.
In the fourth embodiment, it is assumed that the headlight control device 1c is mounted on the vehicle 100.
In FIG. 21, configurations similar to those of the headlight control devices 1, 1a, and 1b described in the first, second, and third embodiments with reference to FIGS. 1, 7, and 14, respectively, are given the same reference numerals, and redundant explanations are omitted.
In the headlight control device 1c, the travel-related information acquisition unit 12c includes the vehicle information acquisition unit 121, the map information acquisition unit 122, and the vehicle-exterior information acquisition unit 123.
The depth distance estimation unit 13c estimates the depth distance based on the orientation information regarding the driver's orientation detected by the orientation detection unit 11, the vehicle information acquired by the vehicle information acquisition unit 121, the map information acquired by the map information acquisition unit 122, and the vehicle-exterior information acquired by the vehicle-exterior information acquisition unit 123.
Specifically, the depth distance estimation unit 13c estimates the depth distance by comparing the orientation information and the travel-related information (here, the vehicle information, the map information, and the vehicle-exterior information) against the depth distance estimation information.
FIG. 22 is a diagram illustrating an example of the contents of the depth distance estimation information used by the depth distance estimation unit 13c to estimate the depth distance in the fourth embodiment.
As shown in FIG. 22, in the fourth embodiment, the depth distance estimation information is, for example, a table in which driver behavior, vehicle information, map information, vehicle-exterior information, and depth distance are associated with one another.
Note that, in the fourth embodiment, after estimating the depth distance using the depth distance estimation information, the depth distance estimation unit 13c adjusts the estimated depth distance with reference to the depth distance adjustment conditions, based on the vehicle-exterior information. The depth distance estimation unit 13c finalizes the adjusted depth distance as the estimated depth distance.
The depth distance estimation unit 13c outputs the depth distance information to the irradiation determination unit 14.
FIG. 23 is a diagram showing, for the depth distances corresponding to the input information under conditions No. 1 to No. 7 set in the depth distance estimation information shown in FIG. 22, examples of the travel state of the vehicle 100 and the driver's viewing target estimated from the driver behavior, the vehicle information, the map information, and the vehicle-exterior information, which form the basis from which each depth distance is derived.
Note that, when the driver behavior, vehicle information, map information, and vehicle-exterior information determined based on the orientation information regarding the driver's orientation detected by the orientation detection unit 11 and the travel-related information acquired by the travel-related information acquisition unit 12c (here, the vehicle information acquired by the vehicle information acquisition unit 121, the map information acquired by the map information acquisition unit 122, and the vehicle-exterior information acquired by the vehicle-exterior information acquisition unit 123) do not match the driver behavior, vehicle information, map information, and vehicle-exterior information set as input information in the depth distance estimation information, the depth distance estimation unit 13c regards the depth distance as not being estimable from the depth distance estimation information and, for example, sets the depth distance to its initial value.
The depth distance estimation unit 13c outputs the depth distance information regarding the depth distance set to the initial value to the irradiation determination unit 14.
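The table lookup with an initial-value fallback described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the row contents, the string keys, and the initial value of 10 m are all hypothetical assumptions.

```python
# Initial value returned when no condition in the table matches (assumed value).
DEFAULT_DEPTH_M = (10.0, 10.0)

# Each row of the depth distance estimation information:
# (driver behavior, vehicle info, map info, vehicle-exterior info) -> depth range in meters.
# The rows below are invented examples loosely modeled on the FIG. 22 description.
DEPTH_TABLE = [
    (("front", "decelerating", "intersection_ahead", "pedestrian_detected"), (5.0, 20.0)),
    (("front", "cruising", "straight_road", "none"), (30.0, 60.0)),
]

def estimate_depth_range(behavior, vehicle, map_info, exterior):
    """Return the depth distance range of the matching table row,
    or the initial value when no row matches the input information."""
    key = (behavior, vehicle, map_info, exterior)
    for row_key, depth_range in DEPTH_TABLE:
        if row_key == key:
            return depth_range
    return DEFAULT_DEPTH_M
```

A matching input returns the associated range; any unmatched combination falls through to the initial value, mirroring the fallback behavior described in the text.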
The irradiation determination unit 14 determines the irradiation range of light by the headlights 2 based on the depth distance estimated by the depth distance estimation unit 13c, and outputs the irradiation information to the headlight control unit 15. The headlight control unit 15 causes the headlights 2 to irradiate the irradiation range determined by the irradiation determination unit 14 with light.
Here, FIG. 24A is a diagram for explaining an example of the depth distance estimated by the depth distance estimation unit 13c in the fourth embodiment.
FIG. 24B is a diagram for explaining an example of how the headlight control unit 15 causes the headlights 2 to irradiate light onto the irradiation range determined by the irradiation determination unit 14 based on the depth distance estimated by the depth distance estimation unit 13c as shown in FIG. 24A.
FIG. 24A is an overhead view of the road on which the vehicle 100 is traveling.
FIG. 24B is a side view of the road on which the vehicle 100 is traveling. In FIG. 24B, the driver is indicated by "D", and the range of light irradiated by the headlights 2 is indicated by "LA".
Note that in FIG. 24A, the depth distance is the distance from the right light to the driver's estimated viewing position. Further, for convenience, FIG. 24B shows the vehicle 100 as viewed from the left side with respect to the traveling direction, but the irradiation range shown in FIG. 24B is the irradiation range of the right light.
In FIGS. 24A and 24B, as an example, it is assumed that the driver's face orientation is within the "front" range in the vertical direction, and that two pedestrians (pedestrian C and pedestrian D, indicated by "W3" and "W4" in FIGS. 24A and 24B, respectively) are present around the vehicle 100 in the direction in which the driver is facing. The distance between pedestrian C and the headlights 2 is "6 m", and the distance between pedestrian D and the headlights 2 is "22 m". Further, it is assumed that there is a sign ahead of the vehicle 100 (not shown in FIGS. 24A and 24B) and that the white line is interrupted in the lane in which the vehicle 100 is traveling.
In this case, based on the depth distance estimation information with contents as shown in FIG. 22, the depth distance estimation unit 13c determines that the driver behavior determined from the orientation information, together with the travel-related information (here, the vehicle information, the map information, and the vehicle-exterior information), corresponds to condition No. 3 of the depth distance estimation information, and estimates the depth distance to be "a range with a 5 m margin including the sidewalk and crosswalk of the intersection". Suppose that the depth distance estimation unit 13c calculates this range to be "5 m to 20 m" based on the map information (see FIG. 24A). Since the distance from the headlights 2 to pedestrian C is within this depth distance range, the depth distance estimation unit 13c adjusts the depth distance to "6 m" and sets "6 m" as the estimated depth distance.
The irradiation determination unit 14 determines the irradiation range based on the depth distance of "6 m" estimated by the depth distance estimation unit 13c. Here, it is assumed that the range from φ13 degrees to φ14 degrees in the horizontal direction and from θ13 degrees to θ14 degrees in the vertical direction is determined as the irradiation range (see FIG. 24B; the horizontal irradiation range is not shown in FIG. 24B).
Note that here, for the vertical direction of the irradiation range, the irradiation determination unit 14 calculates the depth distance vertical angle based on the depth distance of "6 m", and determines, as the vertical angle range of the irradiation range, an angle range obtained by widening that depth distance vertical angle by a preset angle in the vertical direction.
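The patent does not state a formula for the depth distance vertical angle. One plausible sketch, assuming the headlight's mounting height above the road is known, is to take the downward angle from the headlight to the road point at the depth distance and widen it by the preset margin; the mounting height and margin values here are illustrative assumptions.

```python
import math

def vertical_angle_range(depth_m, mount_height_m=0.7, margin_deg=2.0):
    """Hypothetical sketch: compute the downward angle (in degrees below
    horizontal) from a headlight mounted at mount_height_m to the road point
    at depth_m, then widen it by margin_deg on each side to form the vertical
    angle range of the irradiation range."""
    center_deg = math.degrees(math.atan2(mount_height_m, depth_m))
    return (center_deg - margin_deg, center_deg + margin_deg)
```

For a depth distance of 6 m and a 0.7 m mounting height, the center angle is roughly 6.7 degrees below horizontal, widened to an approximately 4-degree-tall band.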
As a result, the headlight control unit 15 can control the headlights 2 so that pedestrian C, who is present in the direction the driver is facing, is irradiated with light. The driver can thus visually recognize pedestrian C.
Pedestrian D is not irradiated with light from the headlights 2.
Now suppose that pedestrian D were present at a distance of "12 m" from the headlights 2. Both pedestrian C and pedestrian D would then be within the depth distance range estimated by the depth distance estimation unit 13c based on the depth distance estimation information. In this case, the depth distance estimation unit 13c may, for example, estimate as the depth distance a range that includes both the distance from the headlights 2 to pedestrian C and the distance from the headlights 2 to pedestrian D. In this case, the depth distance adjustment conditions include, for example, the condition that "when there are multiple moving objects whose distances from the headlights are within the depth distance range estimated based on the depth distance estimation information, the range including the distances from the headlights to those moving objects is adjusted to be the depth distance range estimated based on the depth distance estimation information".
The headlight control unit 15 then controls the headlights 2 so that both pedestrian C and pedestrian D are irradiated with light.
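The adjustment step described above can be sketched as a small function: given the table-estimated depth range and the measured distances of detected moving objects, it narrows the range to the single object within it, covers all objects when several fall inside, and leaves the range unchanged when none do. This is a simplified reading of the adjustment condition, not the patent's exact rule.

```python
def adjust_depth(range_m, object_distances_m):
    """Adjust a table-estimated depth range (lo, hi) in meters using the
    distances of detected moving objects:
      - exactly one object in range  -> collapse to that object's distance
      - several objects in range     -> range spanning all of them
      - no object in range           -> keep the original range"""
    lo, hi = range_m
    in_range = sorted(d for d in object_distances_m if lo <= d <= hi)
    if not in_range:
        return range_m
    if len(in_range) == 1:
        return (in_range[0], in_range[0])
    return (in_range[0], in_range[-1])
```

With the example above, pedestrians at 6 m and 22 m against the range 5 m to 20 m yield an adjusted depth of 6 m; moving pedestrian D to 12 m yields the range 6 m to 12 m covering both.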
In this way, the headlight control device 1c according to the fourth embodiment combines the headlight control device 1a according to the second embodiment and the headlight control device 1b according to the third embodiment. It can therefore control the headlights 2 in more situations, and estimates a depth distance that better matches the actual situation, based on whether an object actually exists in the direction the driver is facing.
As a result, the headlight control device 1c can more appropriately illuminate the driver's estimated viewing position, and can provide driving support when the vehicle 100 travels at night or the like.
The operation of the headlight control device 1c according to the fourth embodiment will now be described.
FIG. 25 is a flowchart for explaining the operation of the headlight control device 1c according to the fourth embodiment.
For example, when the headlights 2 are turned on, the headlight control device 1c determines that lighting control of the headlights 2 based on the driver's orientation is to be performed, and starts the operation shown in the flowchart of FIG. 25. The headlight control device 1c repeats the operation shown in the flowchart of FIG. 25, for example, until the headlights 2 are turned off or the power of the vehicle 100 is turned off.
For example, the control unit (not shown) of the headlight control device 1c acquires information indicating the state of the headlights 2 from a headlight switch mounted on the vehicle 100, and determines whether the headlights 2 are on. When the control unit determines that the headlights 2 are on, it determines that lighting control of the headlights 2 based on the driver's orientation is to be started, and outputs information instructing the start of lighting control of the headlights 2 to the orientation detection unit 11, the travel-related information acquisition unit 12c, the depth distance estimation unit 13c, the irradiation determination unit 14, and the headlight control unit 15.
Further, when the control unit determines that the headlights 2 are off or that the vehicle 100 has been turned off, it determines that lighting control of the headlights 2 based on the driver's orientation is to be ended, and outputs information instructing the end of lighting control of the headlights 2 to the orientation detection unit 11, the travel-related information acquisition unit 12c, the depth distance estimation unit 13c, the irradiation determination unit 14, and the headlight control unit 15.
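The repeat-while-headlights-are-on behavior described above can be sketched as a control loop. All object and method names below are assumed interfaces invented for illustration; the estimation and range-determination steps are passed in as functions rather than implemented.

```python
import time

def headlight_control_loop(headlight_switch, camera, info_source, headlight,
                           estimate_depth, determine_range, period_s=0.05):
    """Hypothetical sketch of the flow of FIG. 25: while the headlights are on,
    detect the driver's orientation, acquire the travel-related information,
    estimate the depth distance, determine the irradiation range, and drive
    the headlights. Exits when the switch reports the headlights are off."""
    while headlight_switch.is_on():
        orientation = camera.detect_driver_orientation()   # step ST1-1
        vehicle = info_source.vehicle_info()               # step ST1-2
        map_info = info_source.map_info()                  # step ST1-3
        exterior = info_source.exterior_info()             # step ST1-4
        depth = estimate_depth(orientation, vehicle, map_info, exterior)  # step ST2c
        headlight.apply(determine_range(depth))            # steps ST3-ST4
        time.sleep(period_s)  # assumed control period
```

In use, `headlight_switch`, `camera`, `info_source`, and `headlight` would wrap the headlight switch, the in-vehicle imaging device 3, the travel-related information acquisition device 4, and the headlights 2, respectively.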
Regarding the operation shown in the flowchart of FIG. 25, the processing contents of steps ST1-1, ST1-2, ST1-3, and ST3 to ST4 are the same as those of the corresponding steps of the operation of the headlight control device 1a shown in the flowchart of FIG. 13, which have already been explained in the second embodiment, and redundant explanations are therefore omitted. Likewise, the processing content of step ST1-4 is the same as that of step ST1-4 of the operation of the headlight control device 1b shown in the flowchart of FIG. 20, which has already been explained in the third embodiment, and a redundant explanation is omitted.
The depth distance estimation unit 13c estimates the depth distance using the orientation information regarding the driver's orientation detected by the orientation detection unit 11 in step ST1-1, the vehicle information acquired by the vehicle information acquisition unit 121 in step ST1-2, the map information acquired by the map information acquisition unit 122 in step ST1-3, the vehicle-exterior information acquired by the vehicle-exterior information acquisition unit 123 in step ST1-4, and the depth distance estimation information (step ST2c).
Specifically, after estimating the depth distance using the depth distance estimation information, the depth distance estimation unit 13c adjusts the estimated depth distance with reference to the depth distance adjustment conditions, based on the vehicle-exterior information, and finalizes the estimated depth distance.
The depth distance estimation unit 13c outputs the depth distance information to the irradiation determination unit 14.
In step ST3, the irradiation determination unit 14 determines the irradiation range of light by the headlights 2 based on the depth distance estimated by the depth distance estimation unit 13c in step ST2c (step ST3).
In this way, the headlight control device 1c detects the driver's orientation based on the in-vehicle captured image, and acquires the travel-related information (here, the vehicle information, the map information, and the vehicle-exterior information). The headlight control device 1c estimates the depth distance based on the orientation information regarding the detected driver's orientation and the acquired travel-related information. In doing so, the headlight control device 1c adjusts the depth distance estimated using the depth distance estimation information, based on the vehicle-exterior information and the depth distance adjustment conditions, and treats the adjusted depth distance as the estimated depth distance. The headlight control device 1c then determines the irradiation range of light by the headlights 2 based on the estimated depth distance, and causes the headlights 2 to irradiate the determined irradiation range with light.
Therefore, the headlight control device 1c can more appropriately illuminate the driver's estimated viewing position, and can provide driving support when the vehicle 100 travels at night or the like.
In the fourth embodiment described above, for example, the depth distance estimation unit 13c may estimate the ideal width of the irradiation range as well as the depth distance. In this case, the depth distance estimation unit 13c outputs the depth distance information and information regarding the ideal width of the irradiation range to the irradiation determination unit 14.
The irradiation determination unit 14 determines the irradiation range based on the depth distance estimated by the depth distance estimation unit 13c and the ideal width of the irradiation range.
Further, in the fourth embodiment described above, the depth distance estimation unit 13c may estimate the ideal light amount of the headlights 2 as well as the depth distance. In this case, the depth distance estimation unit 13c outputs the depth distance information to the irradiation determination unit 14, and outputs information regarding the estimated ideal light amount to the headlight control unit 15.
The headlight control unit 15 causes the headlights 2 to irradiate light in the irradiation range determined by the irradiation determination unit 14 at the ideal light amount estimated by the depth distance estimation unit 13c.
In the fourth embodiment described above, the depth distance estimation unit 13c may estimate the depth distance together with both the ideal width of the irradiation range and the ideal light amount.
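The three outputs and their routing described in these variations can be illustrated with a small container. The field names, units, and the routing function are assumptions made for illustration only.

```python
from dataclasses import dataclass

@dataclass
class BeamRequest:
    """Illustrative container for the three quantities the depth distance
    estimation unit may output: the depth distance, the ideal width of the
    irradiation range, and the ideal light amount (units are assumptions)."""
    depth_m: float
    ideal_width_deg: float
    ideal_light_lm: float

def split_outputs(req):
    """Route the outputs as the text describes: the depth distance and ideal
    width go to the irradiation determination unit 14, while the ideal light
    amount goes to the headlight control unit 15."""
    to_irradiation_unit = (req.depth_m, req.ideal_width_deg)
    to_headlight_control = req.ideal_light_lm
    return to_irradiation_unit, to_headlight_control
```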
Further, in the fourth embodiment described above, the headlight control device 1c is an in-vehicle device mounted on the vehicle 100, and the orientation detection unit 11, the travel-related information acquisition unit 12c, the depth distance estimation unit 13c, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) are provided in the in-vehicle device. The invention is not limited to this; some of these units may be provided in the in-vehicle device of the vehicle 100 while the others are provided in a server connected to the in-vehicle device via a network. Alternatively, all of the orientation detection unit 11, the travel-related information acquisition unit 12c, the depth distance estimation unit 13c, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) may be provided in the server.
Further, while the fourth embodiment described above combines the second embodiment and the third embodiment, the invention is not limited to this, and an embodiment combining the first embodiment and the third embodiment is also possible. In this case, in the configuration example of the headlight control device 1b shown in FIG. 14, the travel-related information acquisition unit 12b of the headlight control device 1b includes the vehicle information acquisition unit 121 and the vehicle-exterior information acquisition unit 123. The depth distance estimation unit 13b estimates the depth distance based on the orientation information, the vehicle information, and the vehicle-exterior information. Specifically, the depth distance estimation unit 13b estimates the depth distance by comparing the orientation information, the vehicle information, and the vehicle-exterior information against depth distance estimation information in which, of the contents shown in FIG. 22, no map information is set as input information.
The hardware configuration of the headlight control device 1c according to the fourth embodiment is the same as the hardware configuration of the headlight control device 1 described in the first embodiment with reference to FIGS. 6A and 6B, and is therefore not illustrated.
In the fourth embodiment, the functions of the orientation detection unit 11, the travel-related information acquisition unit 12c, the depth distance estimation unit 13c, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) are realized by the processing circuit 1001. That is, the headlight control device 1c includes the processing circuit 1001 for estimating the depth distance based on the orientation information regarding the driver's orientation, detected from the in-vehicle captured image acquired from the in-vehicle imaging device 3, and on the travel-related information, and for controlling the lighting of the headlights 2 based on the estimated depth distance.
The processing circuit 1001 may be dedicated hardware as shown in FIG. 6A, or may be a processor 1004 that executes a program stored in a memory 1005 as shown in FIG. 6B.
The processing circuit 1001 reads out and executes the program stored in the memory 1005, thereby performing the functions of the orientation detection unit 11, the travel-related information acquisition unit 12c, the depth distance estimation unit 13c, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown). That is, the headlight control device 1c includes the memory 1005 for storing a program that, when executed by the processing circuit 1001, results in the execution of steps ST1-1, ST1-2, ST1-3, and ST1-4 to ST4 of FIG. 25 described above. It can also be said that the program stored in the memory 1005 causes a computer to execute the processing procedures or methods of the orientation detection unit 11, the travel-related information acquisition unit 12c, the depth distance estimation unit 13c, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown).
The storage unit 16 includes, for example, the memory 1005.
The headlight control device 1c includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the headlights 2, the in-vehicle imaging device 3, and the travel-related information acquisition device 4.
As described above, the headlight control device 1c according to the fourth embodiment includes the direction detection unit 11 that detects the direction of the driver based on a captured image of the driver of the vehicle 100 (in-vehicle captured image); the travel-related information acquisition unit 12c that acquires travel-related information related to the travel of the vehicle 100; the depth distance estimation unit 13c that estimates the depth distance based on the direction of the driver detected by the direction detection unit 11 and the travel-related information acquired by the travel-related information acquisition unit 12c; the irradiation determination unit 14 that determines the irradiation range of the light emitted by the headlights 2 based on the depth distance estimated by the depth distance estimation unit 13c; and the headlight control unit 15 that causes the headlights 2 to irradiate light onto the irradiation range determined by the irradiation determination unit 14.
Therefore, in controlling the lighting of the headlights 2 of the vehicle 100 based on the direction in which the driver is facing, the headlight control device 1c can perform lighting control that takes into account how far ahead in that direction the driver is actually trying to see.
The headlight control device 1c can thus more appropriately illuminate the estimated visual recognition position of the driver, and can provide driving support when the vehicle 100 travels at night or in similar conditions.
Specifically, in the headlight control device 1c, the travel-related information acquisition unit 12c has an outside-vehicle information acquisition unit 123 that acquires, as travel-related information, outside-vehicle information regarding the area ahead of the vehicle 100, and the depth distance estimation unit 13c estimates the depth distance based on the direction information, the vehicle information, the map information, and the outside-vehicle information.
Therefore, in controlling the lighting of the headlights 2 of the vehicle 100 based on the direction in which the driver is facing, the headlight control device 1c can perform lighting control that takes into account how far ahead in that direction the driver is actually trying to see.
The headlight control device 1c can thus more appropriately illuminate the estimated visual recognition position of the driver, and can provide driving support when the vehicle 100 travels at night or in similar conditions.
Embodiment 5.
For example, a driver may change direction without intending to visually recognize anything: the driver may nod, glance away so momentarily that the light of the headlights 2 need not follow, or squint. In such cases, the driver's direction should not be used to estimate the depth distance.
Embodiment 5 describes an embodiment in which the light of the headlights 2 is directed in the direction the driver is facing based on a reliability that indicates whether the driver's direction can be estimated to be a direction in which the driver is trying to visually recognize something.
FIG. 26 is a diagram showing a configuration example of a headlight control device 1d according to the fifth embodiment.
In the fifth embodiment, it is assumed that the headlight control device 1d is mounted on the vehicle 100.
The headlight control device 1d controls the lighting of the headlights 2 provided on the vehicle 100 based on the direction of the driver of the vehicle 100. In the fifth embodiment, the "driver's direction" is expressed by the driver's face direction or the driver's line-of-sight direction. In the fifth embodiment, the "driver's direction" may also include, in addition to the driver's face direction or line-of-sight direction, the direction of the driver's body, in other words, the driver's posture.
In the fifth embodiment, it is assumed that the lighting control of the headlights 2 based on the driver's direction performed by the headlight control device 1d is carried out when the headlights 2 are turned on in a place where the surroundings of the vehicle 100 are dark, such as a parking lot at night or a city area at night.
In FIG. 26, the same components as those of the headlight control device 1 described in Embodiment 1 with reference to FIG. 1 are given the same reference numerals, and redundant explanation is omitted.
The headlight control device 1d according to the fifth embodiment differs from the headlight control device 1 according to the first embodiment in that it includes a reliability determination unit 17.
The reliability determination unit 17 determines the reliability of the driver's direction detected by the direction detection unit 11. In the fifth embodiment, the "reliability" of the driver's direction indicates the degree to which the direction can be estimated to be one in which the driver is trying to visually recognize something. A low reliability covers both the case where the accuracy of the detected driver's direction may be low and the case where the detected driver's direction is a direction that does not need to be illuminated by the headlights 2. An example of a case where the accuracy of the detected driver's direction may be low is when the driver is squinting. Examples of directions that do not need to be illuminated by the headlights 2 include the driver's direction when the driver is nodding, or the driver's direction when the driver is looking away.
In the fifth embodiment, as an example, the reliability determination unit 17 determines whether the reliability of the driver's direction is "high" or "low".
The reliability determination unit 17 determines the reliability of the driver's direction based on, for example, the driver's directions detected by the direction detection unit 11 going back over a preset period (hereinafter referred to as the "reliability determination period"). The reliability determination unit 17 may determine the driver's directions detected by the direction detection unit 11 over the reliability determination period from the direction information stored in the storage unit 16.
For example, if the driver's direction has changed by a certain amount or more during the reliability determination period, the reliability determination unit 17 determines that the reliability of the driver's direction detected by the direction detection unit 11 is "low".
Scenes in which the driver's direction changes by a certain amount or more during the reliability determination period include, for example, a scene in which the driver nods and a scene in which the driver momentarily looks away.
Furthermore, the reliability of the driver's direction detected in a scene where the driver is squinting, for example, is also considered to be "low". The reliability in this case can be determined at any time from the detected driver's direction, regardless of whether the driver's direction has changed by a certain amount or more during the reliability determination period. Therefore, the reliability determination unit 17 may, for example, always perform the reliability determination for a scene in which the driver is squinting based on the driver's direction detected by the direction detection unit 11, irrespective of the reliability determination period.
When the reliability determination unit 17 determines that the reliability of the driver's direction detected by the direction detection unit 11 is "low", it does not output the direction information indicating that direction to the depth distance estimation unit 13. For example, the reliability determination unit 17 rewrites the direction information output from the direction detection unit 11 with the direction information most recently determined to have "high" reliability, and outputs the rewritten direction information to the depth distance estimation unit 13.
The depth distance estimation unit 13 does not use a driver's direction that the reliability determination unit 17 has determined to have low reliability for estimating the depth distance.
Note that when the reliability determination unit 17 determines that the reliability of the driver's direction detected by the direction detection unit 11 is "high", it outputs the direction information output from the direction detection unit 11 to the depth distance estimation unit 13.
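The reliability gate described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: the class name `ReliabilityGate`, the sample-count look-back window, and the 15-degree change threshold are all values introduced for the example, standing in for the preset reliability determination period and the "certain amount or more" of direction change.

```python
from collections import deque

class ReliabilityGate:
    """Holds the last "high"-reliability driver direction and substitutes it
    whenever the newly detected direction is judged unreliable."""

    def __init__(self, period_samples=5, max_change_deg=15.0):
        # Assumed values: number of samples covering the reliability
        # determination period, and the direction-change threshold.
        self.history = deque(maxlen=period_samples)
        self.max_change_deg = max_change_deg
        self.last_reliable = None

    def _is_reliable(self, pitch_deg, yaw_deg):
        # "Low" reliability if the direction moved more than the threshold
        # within the look-back period (e.g. a nod or a momentary glance).
        for past_pitch, past_yaw in self.history:
            if (abs(pitch_deg - past_pitch) > self.max_change_deg or
                    abs(yaw_deg - past_yaw) > self.max_change_deg):
                return False
        return True

    def filter(self, pitch_deg, yaw_deg):
        """Returns the direction to pass on to depth-distance estimation."""
        reliable = self._is_reliable(pitch_deg, yaw_deg)
        self.history.append((pitch_deg, yaw_deg))
        if reliable:
            self.last_reliable = (pitch_deg, yaw_deg)
            return self.last_reliable
        # Unreliable: fall back to the most recent "high"-reliability value,
        # as the reliability determination unit 17 does.
        return self.last_reliable
```

With this sketch, a steady gaze followed by a sudden downward nod keeps returning the pre-nod direction, so the headlights do not dip to follow the nod.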
FIG. 27 is a diagram for explaining an example of a scene in which the reliability determination unit 17 determines, in the fifth embodiment, that the reliability of the driver's direction detected by the direction detection unit 11 is low. In FIG. 27, the light irradiation range of the headlights 2 is indicated by "LA".
For example, when the driver nods, the direction of the driver's face momentarily changes downward.
In this case, the reliability determination unit 17 determines that the reliability of the driver's direction is "low", and outputs to the depth distance estimation unit 13 the direction information most recently determined to have "high" reliability, in this case the direction information from before the driver nodded.
The depth distance estimation unit 13 estimates the depth distance using the direction information from before the driver nodded, the vehicle information, and the depth distance estimation information.
As a result, the headlights 2 are controlled by the headlight control unit 15 to irradiate light onto the estimated visual recognition target in the direction the driver was facing before nodding.
By excluding detected driver's directions with low reliability from the directions used to estimate the depth distance, the headlight control device 1d prevents unnecessary leveling and reduces the annoyance to the driver caused by the light of the headlights 2 following a direction in which the driver was not trying to visually recognize anything. In the fifth embodiment, unnecessary leveling means, for example, that the headlights 2 follow the driver's direction and illuminate downward at the moment the driver nods, or that the line-of-sight detection accuracy decreases when the driver squints and the headlights 2 illuminate a direction the driver is not looking at.
The operation of the headlight control device 1d according to the fifth embodiment will now be described.
FIG. 28 is a flowchart for explaining the operation of the headlight control device 1d according to the fifth embodiment.
For example, when the headlights 2 are turned on, the headlight control device 1d determines that lighting control of the headlights 2 based on the driver's direction is to be performed, and starts the operation shown in the flowchart of FIG. 28. The headlight control device 1d repeats the operation shown in the flowchart of FIG. 28 until, for example, the headlights 2 are turned off or the power of the vehicle 100 is turned off.
For example, the control unit (not shown) of the headlight control device 1d acquires information indicating the state of the headlights 2 from a headlight switch mounted on the vehicle 100, and determines whether the headlights 2 are on. When the control unit determines that the headlights 2 are on, it determines that lighting control of the headlights 2 based on the driver's direction is to be started, and outputs information instructing the start of the lighting control of the headlights 2 to the direction detection unit 11, the reliability determination unit 17, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, and the headlight control unit 15.
Further, when the control unit determines that the headlights 2 are off or that the vehicle 100 has been turned off, it determines that the lighting control of the headlights 2 based on the driver's direction is to be ended, and outputs information instructing the end of the lighting control of the headlights 2 to the direction detection unit 11, the reliability determination unit 17, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, and the headlight control unit 15.
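The start/repeat/end behavior described above can be sketched as a simple gating loop. This is an illustrative sketch only: `headlight_switch`, `vehicle_power`, and `pipeline` are assumed stand-ins for the headlight switch, the vehicle power state, and the per-frame processing of steps ST1-11 to ST4; they are not names from the embodiment.

```python
def run_headlight_control(headlight_switch, vehicle_power, pipeline):
    """Runs the per-frame control steps (direction detection, reliability
    determination, depth-distance estimation, irradiation decision, headlight
    control) only while the headlights are on and the vehicle is powered."""
    started = False
    while vehicle_power.is_on():
        if headlight_switch.is_on():
            if not started:
                pipeline.start()   # instruct the units to start lighting control
                started = True
            pipeline.step()        # one pass of steps ST1-11 to ST4
        elif started:
            pipeline.stop()        # instruct the units to end lighting control
            started = False
    if started:
        pipeline.stop()            # vehicle powered off while control was active
```

The design choice here mirrors the text: the start and end instructions are edge-triggered (sent once on each on/off transition), while the flowchart body repeats every iteration in between.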
Regarding the operation shown in the flowchart of FIG. 28, the processing contents of step ST1-11, step ST1-2, and steps ST2 to ST4 are the same as those of step ST1-1, step ST1-2, and steps ST2 to ST4 of the operation of the headlight control device 1 shown in the flowchart of FIG. 5, which have already been described in the first embodiment, and redundant explanation is therefore omitted.
The reliability determination unit 17 determines the reliability of the driver's direction detected by the direction detection unit 11 in step ST1-11 (step ST1-12).
The reliability determination unit 17 determines the reliability of the driver's direction based on, for example, the driver's directions detected by the direction detection unit 11 going back over the reliability determination period.
When the reliability determination unit 17 determines that the reliability of the driver's direction detected by the direction detection unit 11 is "low", it does not output the direction information output from the direction detection unit 11 to the depth distance estimation unit 13. For example, the reliability determination unit 17 rewrites the direction information output from the direction detection unit 11 with the direction information most recently determined to have "high" reliability, and outputs the rewritten direction information to the depth distance estimation unit 13.
When the reliability determination unit 17 determines that the reliability of the driver's direction detected by the direction detection unit 11 is "high", it outputs the direction information output from the direction detection unit 11 to the depth distance estimation unit 13.
In this way, the headlight control device 1d determines the reliability of the detected driver's direction, and does not use a driver's direction determined to have low reliability for estimating the depth distance using the depth distance estimation information.
By excluding detected driver's directions with low reliability from the directions used to estimate the depth distance, the headlight control device 1d prevents unnecessary leveling and reduces the annoyance to the driver caused by the light of the headlights 2 following a direction in which the driver was not trying to visually recognize anything.
Note that in the fifth embodiment described above, the reliability determination unit 17 determines the reliability of the driver's direction detected by the direction detection unit 11, but this is only an example; the reliability determination unit 17 may instead determine the reliability of the driver's facial parts detected by the direction detection unit 11. In this case, for example, the direction detection unit 11 outputs information regarding the driver's facial parts used to detect the driver's direction to the reliability determination unit 17 together with the direction information.
When the reliability determination unit 17 determines that the reliability of the detected driver's facial parts is "low", it causes the driver's direction to be re-detected based on, for example, the driver's facial parts most recently determined to have "high" reliability. The reliability determination unit 17 itself may perform this re-detection of the driver's direction. A case where the reliability of the detected driver's facial parts is "low" is, for example, a case where some of the driver's facial parts could not be acquired.
Further, in the fifth embodiment described above, the headlight control device 1d is an in-vehicle device mounted on the vehicle 100, and the direction detection unit 11, the reliability determination unit 17, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) are provided in the in-vehicle device. The configuration is not limited to this: some of the direction detection unit 11, the reliability determination unit 17, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) may be provided in the in-vehicle device of the vehicle 100, with the others provided in a server connected to the in-vehicle device via a network. Alternatively, all of these units may be provided in the server.
Further, in the fifth embodiment described above, the headlight control device 1 according to the first embodiment is provided with the reliability determination unit 17, but this is only an example. For example, the headlight control device 1a according to the second embodiment, the headlight control device 1b according to the third embodiment, or the headlight control device 1c according to the fourth embodiment may be configured to include the reliability determination unit 17.
The hardware configuration of the headlight control device 1d according to the fifth embodiment is the same as the hardware configuration of the headlight control device 1 described in the first embodiment with reference to FIGS. 6A and 6B, and illustration thereof is therefore omitted.
In the fifth embodiment, the functions of the direction detection unit 11, the reliability determination unit 17, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown) are realized by the processing circuit 1001. That is, the headlight control device 1d includes the processing circuit 1001 for estimating the depth distance based on the travel-related information and on the direction information regarding the direction of the driver detected from the in-vehicle captured image acquired from the in-vehicle imaging device 3, and for controlling the lighting of the headlights 2 based on the estimated depth distance.
The processing circuit 1001 may be dedicated hardware as shown in FIG. 6A, or may be a processor 1004 that executes a program stored in a memory 1005 as shown in FIG. 6B.
The processing circuit 1001 reads out and executes a program stored in the memory 1005, thereby performing the functions of the direction detection unit 11, the reliability determination unit 17, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown). That is, the headlight control device 1d includes the memory 1005 for storing a program which, when executed by the processing circuit 1001, results in the execution of steps ST1-11 to ST1-12, step ST1-2, and steps ST2 to ST4 in FIG. 28 described above. It can also be said that the program stored in the memory 1005 causes a computer to execute the processing procedures or methods of the direction detection unit 11, the reliability determination unit 17, the travel-related information acquisition unit 12, the depth distance estimation unit 13, the irradiation determination unit 14, the headlight control unit 15, and the control unit (not shown).
The storage unit 16 includes, for example, the memory 1005.
The headlight control device 1d includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the headlight 2, the in-vehicle imaging device 3, or the travel-related information acquisition device 4.
As described above, the headlight control device 1d according to the fifth embodiment includes the reliability determination unit 17 that determines the reliability of the driver's direction detected by the direction detection unit 11 based on the driver's directions detected by the direction detection unit 11 going back over the reliability determination period, and the depth distance estimation unit 13 is configured not to use a driver's direction that the reliability determination unit 17 has determined to have low reliability for estimating the depth distance.
Therefore, the headlight control device 1d prevents unnecessary leveling and can reduce the annoyance to the driver caused by the light of the headlights 2 following a direction in which the driver was not trying to visually recognize anything.
Embodiment 6.
Embodiment 6 describes an embodiment in which the headlight control device sets a plurality of irradiation ranges with different light intensities.
FIG. 29 is a diagram showing a configuration example of a headlight control device 1e according to the sixth embodiment.
In the sixth embodiment, it is assumed that the headlight control device 1e is mounted on the vehicle 100.
The headlight control device 1e controls the lighting of the headlights 2 provided on the vehicle 100 based on the direction of the driver of the vehicle 100. In the sixth embodiment, the "driver's direction" is expressed by the driver's face direction or the driver's line-of-sight direction. In the sixth embodiment, the "driver's direction" may also include, in addition to the driver's face direction or line-of-sight direction, the direction of the driver's body, in other words, the driver's posture.
In the sixth embodiment, it is assumed that the lighting control of the headlights 2 based on the driver's direction performed by the headlight control device 1e is carried out when the headlights 2 are turned on in a place where the surroundings of the vehicle 100 are dark, such as a parking lot at night or a city area at night.
In FIG. 29, the same components as those of the headlight control device 1 described in Embodiment 1 with reference to FIG. 1 are given the same reference numerals, and redundant explanation is omitted.
In the headlight control device 1e according to the sixth embodiment, the specific operations of the irradiation determination unit 14a and the headlight control unit 15a differ from the specific operations of the irradiation determination unit 14 and the headlight control unit 15 in the headlight control device 1 according to the first embodiment.
The irradiation determination unit 14a sets a plurality of irradiation ranges in which the intensity of the light irradiated by the headlights 2 differs. In the sixth embodiment, strong light irradiated by the headlights 2 means a large amount of light irradiated by the headlights 2.
In the sixth embodiment, as an example, the irradiation determination unit 14a sets two irradiation ranges with different amounts of light irradiated by the headlights 2: an irradiation range in which the amount of light irradiated by the headlights 2 is increased (referred to as the first irradiation range) and an irradiation range in which the amount of light irradiated by the headlights 2 is decreased (referred to as the second irradiation range).
The amount of light irradiated in the first irradiation range can be set as appropriate, but is assumed to be large enough for the driver to sufficiently visually recognize an estimated visual recognition target that may exist in the first irradiation range. The amount of light irradiated in the second irradiation range is smaller than that in the first irradiation range, and is small enough not to cause glare to pedestrians or other vehicles that may be present in the second irradiation range.
For the vertical direction of the first irradiation range, the irradiation determining unit 14a limits the vertical extent of the irradiation range calculated based on the depth distance estimated by the depth distance estimating unit 13 to a preset vertical upper-limit angle of the first irradiation range. For example, the irradiation determining unit 14a sets, as the vertical extent of the first irradiation range, a range expanded vertically from the vertical center of the irradiation range calculated based on the depth distance until the upper-limit angle is reached. The irradiation determining unit 14a sets the portion of the vertical extent of the irradiation range calculated based on the depth distance other than the first irradiation range as the vertical extent of the second irradiation range.
Likewise, for the horizontal direction of the first irradiation range, the irradiation determining unit 14a limits the horizontal extent of the irradiation range calculated based on the depth distance estimated by the depth distance estimating unit 13 to a preset horizontal upper-limit angle of the first irradiation range. For example, the irradiation determining unit 14a sets, as the horizontal extent of the first irradiation range, a range expanded horizontally from the horizontal center of the irradiation range calculated based on the depth distance until the upper-limit angle is reached. The irradiation determining unit 14a sets the portion of the horizontal extent of the irradiation range calculated based on the depth distance other than the first irradiation range as the horizontal extent of the second irradiation range.
Note that the vertical upper-limit angle and the horizontal upper-limit angle of the first irradiation range are set, for example, by an administrator or the like, and are stored in a location that the irradiation determining unit 14a can refer to.
For example, suppose the vertical upper-limit angle of the first irradiation range is "5 degrees" and the horizontal upper-limit angle is "8 degrees". Suppose also that the vertical and horizontal extents of the irradiation range calculated based on the depth distance estimated by the depth distance estimating unit 13 are "3 degrees to 13 degrees" and "0.5 degrees to 12.5 degrees", respectively, with the installation position of the headlights 2 as the reference (0 degrees). In this case, the irradiation determining unit 14a determines, as the vertical extent of the first irradiation range, the range of "5.5 degrees to 10.5 degrees" obtained by expanding from the vertical center of the irradiation range ("8 degrees") until the width reaches 5 degrees, and determines, as the horizontal extent of the first irradiation range, the range of "2.5 degrees to 10.5 degrees" obtained by expanding from the horizontal center of the irradiation range ("6.5 degrees") until the width reaches 8 degrees. The irradiation determining unit 14a determines the vertical ranges of "3 degrees to 5.5 degrees" and "10.5 degrees to 13 degrees", with the installation position of the headlights 2 as the reference, as the vertical extent of the second irradiation range, and determines the horizontal ranges of "0.5 degrees to 2.5 degrees" and "10.5 degrees to 12.5 degrees", with the installation position of the headlights 2 as the reference, as the horizontal extent of the second irradiation range.
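The angle computation described above can be sketched as follows. This is an illustrative sketch only; the patent does not specify an implementation, and the function name and data layout are assumptions.

```python
def split_irradiation_range(full_range, upper_limit):
    """Split a depth-based angular range (degrees) into a first irradiation
    range, centered in the full range and clipped to the preset upper-limit
    width, and the remaining second irradiation range(s)."""
    lo, hi = full_range
    center = (lo + hi) / 2.0
    half = upper_limit / 2.0
    first = (max(lo, center - half), min(hi, center + half))
    second = []
    if lo < first[0]:
        second.append((lo, first[0]))      # below the first range
    if first[1] < hi:
        second.append((first[1], hi))      # above the first range
    return first, second

# Worked example from the text: vertical range 3-13 deg, upper limit 5 deg
first_v, second_v = split_irradiation_range((3.0, 13.0), 5.0)
# first_v is (5.5, 10.5); second_v is [(3.0, 5.5), (10.5, 13.0)]
```

The same function reproduces the horizontal case: `split_irradiation_range((0.5, 12.5), 8.0)` yields a first range of 2.5 to 10.5 degrees.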
The irradiation determining unit 14a outputs irradiation information regarding the determined irradiation ranges to the headlight control unit 15a.
The irradiation information output by the irradiation determining unit 14a includes information indicating the vertical and horizontal angle ranges of the first irradiation range and information indicating the vertical and horizontal angle ranges of the second irradiation range.
The headlight control unit 15a causes the headlights 2 to irradiate light onto the irradiation ranges determined by the irradiation determining unit 14a.
Specifically, the headlight control unit 15a causes the headlights 2 to irradiate the second irradiation range set by the irradiation determining unit 14a with light of a smaller irradiation amount than the light irradiated onto the first irradiation range set by the irradiation determining unit 14a.
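As one way to picture this two-level control, a segmented (ADB-style) headlight could drive each segment at a level chosen by which range its angle falls in. This is a hypothetical sketch; the light-amount values and segment layout are assumptions, not part of the patent.

```python
HIGH, LOW, OFF = 100, 20, 0  # light amounts in arbitrary units (assumed)

def in_range(angle, rng):
    lo, hi = rng
    return lo <= angle <= hi

def segment_levels(segment_angles, first_range, second_ranges):
    """Assign a light amount to each headlight segment: full amount inside
    the first irradiation range, reduced amount inside the second, off
    elsewhere."""
    levels = []
    for a in segment_angles:
        if in_range(a, first_range):
            levels.append(HIGH)
        elif any(in_range(a, r) for r in second_ranges):
            levels.append(LOW)
        else:
            levels.append(OFF)
    return levels

# Example: horizontal segments every 2 degrees
levels = segment_levels([0, 2, 4, 6, 8, 10, 12, 14],
                        (2.5, 10.5), [(0.5, 2.5), (10.5, 12.5)])
```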
Here, FIG. 30 is a diagram for explaining an example of how, in the sixth embodiment, the headlight control unit 15a causes the headlights 2 to irradiate light onto the first irradiation range and the second irradiation range determined by the irradiation determining unit 14a.
FIG. 30A shows the first irradiation range and the second irradiation range being irradiated with light, viewed from the side of the road on which the vehicle 100 is traveling, and FIG. 30B shows the same scene viewed from the vehicle 100 side.
In FIG. 30, the driver is indicated by "D"; of the light irradiation ranges of the headlights 2, the first irradiation range is indicated by "LA1" and the second irradiation range by "LA2". Note that in FIG. 30B, illustration of the vehicle 100 and the like is omitted.
For example, when the depth distance estimated by the depth distance estimating unit 13 is a distance with a width, such as "5 to 15 m" or "80 to 100 m", if the irradiation determining unit 14a were to determine the irradiation range so that light is irradiated over the entire width of the depth distance, the irradiation range of the light that the headlight control unit 15a causes the headlights 2 to emit would become wide. The headlight control device 1e might then give glare to pedestrians or drivers of other vehicles.
Therefore, the headlight control device 1e according to the sixth embodiment sets a first irradiation range in which the amount of irradiation light from the headlights 2 is made large and a second irradiation range in which it is made small, and causes the headlights 2 to irradiate the second irradiation range set by the irradiation determining unit 14a with light of a smaller irradiation amount than the light irradiated onto the first irradiation range set by the irradiation determining unit 14a.
The headlight control device 1e can thereby irradiate light so that the driver of the vehicle 100 can see the presumed visible object in the direction the driver is facing, while reducing the glare given to pedestrians or drivers of other vehicles.
Note that the above-described method by which the irradiation determining unit 14a determines the first irradiation range and the second irradiation range is merely an example. The irradiation determining unit 14a may determine the first irradiation range and the second irradiation range by other methods.
For example, a ratio of the vertical extent and a ratio of the horizontal extent that the first irradiation range occupies within the irradiation range calculated based on the depth distance estimated by the depth distance estimating unit 13 may each be set in advance, and the irradiation determining unit 14a may set the vertical and horizontal extents of the first irradiation range based on these preset ratios.
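The ratio-based alternative can be sketched as below. This is an assumption-laden illustration; the ratio value and centering rule are not specified by the patent.

```python
def first_range_by_ratio(full_range, ratio):
    """Take a preset fraction of the depth-based angular range (degrees)
    as the first irradiation range, centered in the full range."""
    lo, hi = full_range
    center = (lo + hi) / 2.0
    half = (hi - lo) * ratio / 2.0
    return (center - half, center + half)

# e.g. the first range occupies 50 % of a 3-13 degree vertical range
first_v = first_range_by_ratio((3.0, 13.0), 0.5)
```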
Also, for example, the irradiation determining unit 14a may set how much of the irradiation range calculated based on the depth distance estimated by the depth distance estimating unit 13 is assigned to the vertical and horizontal extents of the first irradiation range based on the traveling state of the vehicle 100, which is estimated from the driver's behavior and the travel-related information from which the depth distance is derived. For example, if the traveling state of the vehicle 100 is parked or stopped, the irradiation determining unit 14a may set the vertical and horizontal upper-limit angles of the first irradiation range to "10 degrees", and if the vehicle 100 is traveling, it may set them to "5 degrees", and set the vertical and horizontal extents of the first irradiation range accordingly. Note that the vertical and horizontal extents of the first irradiation range to be used for each traveling state of the vehicle 100 are determined in advance.
In this case, for example, information in which the traveling state of the vehicle 100 and the driver's presumed visible object are associated with each other, as described in Embodiment 1 with reference to FIG. 3, is generated in advance by an administrator or the like and stored in a location that the irradiation determining unit 14a can refer to. In addition, the depth distance estimating unit 13 outputs the orientation information and the travel-related information to the irradiation determining unit 14a together with the depth distance information.
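A minimal sketch of the state-dependent variant follows, assuming a simple lookup table; the state names and the table contents beyond the two examples given in the text are hypothetical.

```python
# Preset upper-limit angles (degrees) per estimated vehicle state.
# The two states and their angles follow the example in the text;
# the key names themselves are assumptions for illustration.
UPPER_LIMIT_BY_STATE = {
    "parked_or_stopped": {"vertical": 10.0, "horizontal": 10.0},
    "driving":           {"vertical": 5.0,  "horizontal": 5.0},
}

def first_range_limits(vehicle_state):
    """Look up the first-range upper-limit angles for the estimated
    vehicle state, falling back to the stricter 'driving' limits."""
    return UPPER_LIMIT_BY_STATE.get(vehicle_state,
                                    UPPER_LIMIT_BY_STATE["driving"])
```

The fallback to the "driving" entry for unknown states is a design choice of this sketch: when in doubt, the narrower (less glare-prone) first range is used.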
Also, for example, when the depth distance estimating unit 13 estimates the depth distance and also estimates an ideal width of the irradiation range, the irradiation determining unit 14a may set the irradiation range corresponding to the ideal width as the first irradiation range and set the portion of the depth-distance-based irradiation range other than the first irradiation range as the second irradiation range.
The operation of the headlight control device 1e according to the sixth embodiment will now be described.
FIG. 31 is a flowchart for explaining the operation of the headlight control device 1e according to the sixth embodiment.
For example, when the headlights 2 are turned on, the headlight control device 1e determines that lighting control of the headlights 2 based on the direction of the driver is to be performed, and starts the operation shown in the flowchart of FIG. 31. The headlight control device 1e repeats the operation shown in the flowchart of FIG. 31, for example, until the headlights 2 are turned off or the power of the vehicle 100 is turned off.
For example, a control unit (not shown) of the headlight control device 1e acquires information indicating the state of the headlights 2 from a headlight switch mounted on the vehicle 100 and determines whether the headlights 2 are on. When the control unit determines that the headlights 2 are on, it determines to start lighting control of the headlights 2 based on the direction of the driver, and outputs information instructing the start of the lighting control of the headlights 2 to the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimating unit 13, the irradiation determining unit 14a, and the headlight control unit 15a.
When the control unit determines that the headlights 2 are off or that the vehicle 100 has been turned off, it determines to end the lighting control of the headlights 2 based on the direction of the driver, and outputs information instructing the end of the lighting control of the headlights 2 to the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimating unit 13, the irradiation determining unit 14a, and the headlight control unit 15a.
Regarding the operation shown in the flowchart of FIG. 31, the processing contents of steps ST1-1, ST1-2, and ST2 are the same as those of steps ST1-1, ST1-2, and ST2 of the operation of the headlight control device 1 shown in the flowchart of FIG. 5 and already described in Embodiment 1, and redundant description is therefore omitted.
Based on the depth distance estimated by the depth distance estimating unit 13 in step ST2, the irradiation determining unit 14a sets a first irradiation range in which the amount of irradiation light from the headlights 2 is made large and a second irradiation range in which it is made small (step ST3a).
The irradiation determining unit 14a outputs irradiation information regarding the determined irradiation ranges to the headlight control unit 15a.
The headlight control unit 15a causes the headlights 2 to irradiate light onto the irradiation ranges determined by the irradiation determining unit 14a in step ST3a (step ST4a).
Specifically, the headlight control unit 15a causes the headlights 2 to irradiate the second irradiation range set by the irradiation determining unit 14a with light of a smaller irradiation amount than the light irradiated onto the first irradiation range set by the irradiation determining unit 14a.
In this way, within the irradiation range determined based on the depth distance, the headlight control device 1e sets a first irradiation range and a second irradiation range in which the amount of irradiation light from the headlights 2 is made smaller than in the first irradiation range, and causes the headlights 2 to irradiate the second irradiation range with light of a smaller irradiation amount than the light irradiated onto the first irradiation range.
Therefore, the headlight control device 1e can irradiate light so that the driver of the vehicle 100 can see the presumed visible object in the direction the driver is facing, while reducing the glare given to pedestrians or drivers of other vehicles.
In the sixth embodiment described above, the positions of the irradiation ranges are changed to follow changes in the direction of the driver; the headlight control device 1e may, however, make the moving speed of the position of the first irradiation range, which follows changes in the direction of the driver, different from the moving speed of the position of the second irradiation range, which also follows changes in the direction of the driver. For example, in the headlight control device 1e, the irradiation determining unit 14a may control the moving speed of the first irradiation range using a short-time average of the direction of the driver, and control the moving speed of the second irradiation range using a long-time average of the direction of the driver.
In this way, a direction the driver has stopped facing can remain within the second irradiation range for a while and continue to be irradiated with light from the headlights 2. As a result, when, for example, a pedestrian runs out in a direction the driver is no longer facing, the headlight control device 1e can illuminate the pedestrian with the light of the headlights 2 and allow the driver to detect the pedestrian early.
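One simple way to realize the short-time and long-time averages above is a pair of exponential moving averages with different smoothing factors. This is a sketch under assumed parameters; the patent does not prescribe the averaging method or the factor values.

```python
class RangeFollower:
    """Track the driver's gaze angle with two exponential moving averages:
    a fast one for the first irradiation range (reacts quickly) and a slow
    one for the second irradiation range (lingers on recent directions).
    The smoothing factors are illustrative assumptions."""

    def __init__(self, alpha_fast=0.5, alpha_slow=0.05):
        self.alpha_fast = alpha_fast
        self.alpha_slow = alpha_slow
        self.fast = None
        self.slow = None

    def update(self, gaze_angle):
        if self.fast is None:
            self.fast = self.slow = gaze_angle  # initialize both averages
        else:
            self.fast += self.alpha_fast * (gaze_angle - self.fast)
            self.slow += self.alpha_slow * (gaze_angle - self.slow)
        return self.fast, self.slow
```

After a sudden change of gaze direction, the fast average (and hence the first range) moves most of the way immediately, while the slow average keeps the second range near the previous direction for several update cycles.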
Note that, in the sixth embodiment described above, the irradiation determining unit 14a sets two irradiation ranges with different amounts of irradiation light from the headlights 2, but this is merely an example; the irradiation determining unit 14a may set three or more irradiation ranges in which the amount of irradiation light from the headlights 2 differs in stages.
Also, in the sixth embodiment described above, the irradiation determining unit 14a sets two irradiation ranges with different irradiation light amounts (the first irradiation range and the second irradiation range) in both the vertical and horizontal directions of the irradiation range, but this is merely an example. For example, the irradiation determining unit 14a may set irradiation ranges with different irradiation light amounts only in the vertical direction of the irradiation range, or only in the horizontal direction.
Also, in the sixth embodiment described above, the headlight control device 1e is an in-vehicle device mounted on the vehicle 100, and the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimating unit 13, the irradiation determining unit 14a, the headlight control unit 15a, and the control unit (not shown) are provided in the in-vehicle device. The configuration is not limited to this: some of the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimating unit 13, the irradiation determining unit 14a, the headlight control unit 15a, and the control unit (not shown) may be provided in the in-vehicle device of the vehicle 100, with the others provided in a server connected to the in-vehicle device via a network. Alternatively, all of the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimating unit 13, the irradiation determining unit 14a, the headlight control unit 15a, and the control unit (not shown) may be provided in the server.
Furthermore, in the sixth embodiment described above, the first irradiation range and the second irradiation range are set in the headlight control device 1 according to Embodiment 1, but this is merely an example. For example, the first irradiation range and the second irradiation range may be set in the headlight control device 1a according to Embodiment 2, the headlight control device 1b according to Embodiment 3, the headlight control device 1c according to Embodiment 4, or the headlight control device 1d according to Embodiment 5.
Note that the headlight control device 1b according to Embodiment 3 and the headlight control device 1c according to Embodiment 4 adjust the depth distance in consideration of objects actually present in the direction of the driver. Therefore, for example, when the distance between the headlights 2 and a moving object present in the direction of the driver is estimated as the depth distance, the depth distance is likely to be a distance without a width. In this case, it is assumed that the irradiation range of the light that the headlight control devices 1b and 1c cause the headlights 2 to emit does not become wide. However, the position of an object measured by, for example, a vehicle-exterior imaging device or a radar may include a measurement error. Therefore, for example, the headlight control devices 1b and 1c may set the range of the measurement error as the second irradiation range.
The hardware configuration of the headlight control device 1e according to the sixth embodiment is the same as the hardware configuration of the headlight control device 1 described in Embodiment 1 with reference to FIGS. 6A and 6B, and its illustration is therefore omitted.
In the sixth embodiment, the functions of the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimating unit 13, the irradiation determining unit 14a, the headlight control unit 15a, and the control unit (not shown) are realized by a processing circuit 1001. That is, the headlight control device 1e includes the processing circuit 1001 for estimating the depth distance based on the travel-related information and the orientation information regarding the direction of the driver detected from the in-vehicle captured image acquired from the in-vehicle imaging device 3, and for performing the lighting control of the headlights 2 based on the estimated depth distance.
The processing circuit 1001 may be dedicated hardware as shown in FIG. 6A, or may be a processor 1004 that executes a program stored in a memory 1005 as shown in FIG. 6B.
The processing circuit 1001 reads out and executes the program stored in the memory 1005, thereby executing the functions of the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimating unit 13, the irradiation determining unit 14a, the headlight control unit 15a, and the control unit (not shown). That is, the headlight control device 1e includes the memory 1005 for storing a program that, when executed by the processing circuit 1001, results in the execution of steps ST1-1, ST1-2, and ST2 to ST4a of FIG. 31 described above. It can also be said that the program stored in the memory 1005 causes a computer to execute the procedures or methods of the processing of the direction detection unit 11, the travel-related information acquisition unit 12, the depth distance estimating unit 13, the irradiation determining unit 14a, the headlight control unit 15a, and the control unit (not shown).
The storage unit 16 is constituted by, for example, the memory 1005.
The headlight control device 1e includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the headlights 2, the in-vehicle imaging device 3, and the travel-related information acquisition device 4.
As described above, in the headlight control device 1e according to the sixth embodiment, the irradiation determining unit 14a sets, within the irradiation range, a first irradiation range and a second irradiation range in which the amount of irradiation light from the headlights 2 is made smaller than in the first irradiation range, and the headlight control unit 15a causes the headlights 2 to irradiate the second irradiation range set by the irradiation determining unit 14a with light of a smaller irradiation amount than the light irradiated onto the first irradiation range set by the irradiation determining unit 14a.
Therefore, the headlight control device 1e can irradiate light so that the driver of the vehicle 100 can see the presumed visible object in the direction the driver is facing, while reducing the glare given to pedestrians or drivers of other vehicles.
Embodiment 7.
In the seventh embodiment, an embodiment is described in which, when the headlight control device sets a plurality of irradiation ranges with different light intensities, the irradiation range with weak light is made even wider than the range set in the sixth embodiment.
FIG. 32 is a diagram showing a configuration example of a headlight control device 1f according to the seventh embodiment.
Regarding the configuration example of the headlight control device 1f according to the seventh embodiment, the same components as those of the headlight control device 1e according to the sixth embodiment described with reference to FIG. 29 are denoted by the same reference numerals, and redundant description is omitted.
The configuration example of the headlight control device 1f according to the seventh embodiment differs from that of the headlight control device 1e according to the sixth embodiment described with reference to FIG. 29 in that the irradiation determining unit 14b includes a surroundings confirmation determining unit 141.
 周囲確認判定部141は、向き検出部11が検出した運転者の向きに関する情報と走行関連情報取得部12が取得した走行関連情報(ここでは車両情報取得部121が取得した車両関連情報)とに基づき、運転者が周囲の確認を行っているか否かを判定し、運転者が周囲の確認を行っていると判定した場合、照射決定部14bが設定した第2照射範囲を広げる。
 詳細には、例えば、周囲確認判定部141は、向き情報と、走行関連情報(ここでは車両情報)と、運転者が周囲の確認を行っているか否かを判定するための周囲確認判定用条件とをつきあわせることで、運転者が周囲の確認を行っているか否かを判定する。
The surroundings confirmation determining unit 141 determines whether the driver is checking the surroundings based on the information regarding the driver's orientation detected by the orientation detection unit 11 and the driving-related information acquired by the driving-related information acquisition unit 12 (here, the vehicle-related information acquired by the vehicle information acquisition unit 121), and, when it determines that the driver is checking the surroundings, expands the second irradiation range set by the irradiation determining unit 14b.
In detail, for example, the surroundings confirmation determining unit 141 determines whether the driver is checking the surroundings by comparing the orientation information and the driving-related information (here, the vehicle information) against surroundings confirmation determination conditions for determining whether the driver is checking the surroundings.
 ここで、図33は、実施の形態7において、周囲確認判定部141が、運転者が周囲の確認を行っているか否かの判定に用いる周囲確認判定用条件の内容の一例を説明するための図である。
 周囲確認判定用条件は、例えば、運転者の挙動と車両情報とが対応付けられたテーブルである。周囲確認判定用条件は、予め、管理者等によって生成され、周囲確認判定部141が参照可能な場所に記憶されている。
 周囲確認判定部141は、向き情報と車両情報とから運転者の挙動および車速等を判定し、判定した運転者の挙動および車速等が、周囲確認判定用条件と一致した場合、運転者は周囲の確認を行っていると判定する。
 周囲確認判定部141は、判定した運転者の挙動および車速等が、周囲確認判定用条件と一致しない場合は、運転者は周囲の確認を行っていないと判定する。
Here, FIG. 33 is a diagram for explaining an example of the contents of the surroundings confirmation determination conditions used by the surroundings confirmation determining unit 141 in the seventh embodiment to determine whether the driver is checking the surroundings.
The surroundings confirmation determination conditions are, for example, a table in which driver behavior and vehicle information are associated with each other. The surroundings confirmation determination conditions are generated in advance by an administrator or the like, and are stored in a location that the surroundings confirmation determining unit 141 can refer to.
The surroundings confirmation determining unit 141 determines the driver's behavior, the vehicle speed, and the like from the orientation information and the vehicle information, and, when the determined driver's behavior, vehicle speed, and the like match the surroundings confirmation determination conditions, determines that the driver is checking the surroundings.
When the determined driver's behavior, vehicle speed, and the like do not match the surroundings confirmation determination conditions, the surroundings confirmation determining unit 141 determines that the driver is not checking the surroundings.
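The matching described above can be sketched in code. This is a non-authoritative illustration only: the condition entries, the behavior classification, and the thresholds below are hypothetical stand-ins, not the actual contents of FIG. 33.

```python
# Hypothetical sketch of the surroundings-confirmation determination:
# the driver's behavior and vehicle state are derived from the orientation
# information and vehicle information, then matched against a condition
# table prepared in advance by an administrator or the like.

# Each entry pairs a driver behavior with a vehicle-information condition.
SURROUNDINGS_CHECK_CONDITIONS = [
    {"behavior": "looking_left_right", "max_speed_kmh": 10},  # e.g. creeping in a parking lot
    {"behavior": "looking_left_right", "turn_signal": True},  # e.g. before turning at an intersection
]

def derive_behavior(yaw_angles_deg):
    """Classify the driver's behavior from a short history of face-yaw angles."""
    # A wide left-right swing of the face is treated as scanning the surroundings.
    if max(yaw_angles_deg) - min(yaw_angles_deg) > 60:
        return "looking_left_right"
    return "facing_forward"

def is_checking_surroundings(yaw_angles_deg, speed_kmh, turn_signal):
    """Return True when the behavior and vehicle information match a condition entry."""
    behavior = derive_behavior(yaw_angles_deg)
    for cond in SURROUNDINGS_CHECK_CONDITIONS:
        if cond["behavior"] != behavior:
            continue
        if "max_speed_kmh" in cond and speed_kmh <= cond["max_speed_kmh"]:
            return True
        if "turn_signal" in cond and turn_signal == cond["turn_signal"]:
            return True
    return False
```

When no entry matches, the function returns False, which corresponds to the determination that the driver is not checking the surroundings and the second irradiation range is left as set.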
 そして、周囲確認判定部141は、運転者が周囲の確認を行っていると判定した場合、照射決定部14bが設定した第2照射範囲を広げる。
 なお、照射決定部14bは、実施の形態6に係るヘッドライト制御装置1eが備える照射決定部14aと同様の方法で、第1照射範囲および第2照射範囲を設定すればよいため、重複した説明を省略する。
 ここでは、例えば、周囲確認判定部141は、運転者が周囲の確認を行っていると判定した場合、第2照射範囲の左右方向を、ヘッドライト2が光を照射可能な領域、言い換えれば、ハイビーム照射可能領域、ロービーム照射可能領域、および、補助光照射可能領域の限界まで、広げる。
Then, when the surroundings confirmation determining unit 141 determines that the driver is checking the surroundings, it expands the second irradiation range set by the irradiation determining unit 14b.
Note that the irradiation determining unit 14b may set the first irradiation range and the second irradiation range in the same manner as the irradiation determining unit 14a included in the headlight control device 1e according to the sixth embodiment, so a duplicate explanation is omitted.
Here, for example, when the surroundings confirmation determining unit 141 determines that the driver is checking the surroundings, it widens the second irradiation range in the left-right direction up to the limit of the area where the headlight 2 can irradiate light, in other words, up to the limits of the high-beam irradiable area, the low-beam irradiable area, and the auxiliary-light irradiable area.
 照射決定部14bは、照射決定部14bが設定した第1照射範囲を示す情報と、周囲確認判定部141が第2照射範囲を広げた場合は、広げられた後の第2照射範囲を示す情報と含む照射情報を、ヘッドライト制御部15aに出力する。
 周囲確認判定部141が第2照射範囲を広げなかった場合は、照射決定部14bは、照射決定部14bが設定した第1照射範囲および第2照射範囲を示す情報を含む照射情報を、ヘッドライト制御部15aに出力する。
When the surroundings confirmation determining unit 141 has expanded the second irradiation range, the irradiation determining unit 14b outputs, to the headlight control unit 15a, irradiation information including information indicating the first irradiation range set by the irradiation determining unit 14b and information indicating the expanded second irradiation range.
When the surroundings confirmation determining unit 141 has not expanded the second irradiation range, the irradiation determining unit 14b outputs, to the headlight control unit 15a, irradiation information including information indicating the first irradiation range and the second irradiation range set by the irradiation determining unit 14b.
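A minimal sketch of this branching output logic, under the assumption that an irradiation range can be represented as a (left, right) angle pair and that the irradiable-area limit is a fixed pair; both representations and the angle values are illustrative, not taken from the specification.

```python
# Hypothetical sketch: assemble the irradiation information handed to the
# headlight control unit. When the surroundings check is detected, the
# second irradiation range is widened in the left-right direction up to the
# limit of the area the headlight can irradiate.

IRRADIABLE_LIMIT_DEG = (-45.0, 45.0)  # assumed left/right limit of the headlight

def build_irradiation_info(first_range, second_range, checking_surroundings):
    """Ranges are (left_deg, right_deg) tuples; returns the irradiation
    information output to the headlight control unit."""
    if checking_surroundings:
        # Widen the second range to the irradiable-area limit.
        second_range = IRRADIABLE_LIMIT_DEG
    return {"first_range": first_range, "second_range": second_range}
```

In either branch the first irradiation range passes through unchanged; only the second range differs between the expanded and non-expanded cases.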
 図34は、実施の形態7において、周囲確認判定部141が、照射決定部14bが設定した第2照射範囲を広げた後の、照射範囲に光を照射させた様子の一例を説明するための図である。
 図34は、第1照射範囲および第2照射範囲に光が照射されている様子を車両100側から見た図としている。
 図34において、第1照射範囲は「LA1」で示され、第2照射範囲は「LA2」で示されている。
 なお、周囲確認判定部141が第2照射範囲を広げなかった場合、光が照射される、照射決定部14bが設定した第1照射範囲および第2照射範囲は、実施の形態6にて図30Bで示したような範囲である。
FIG. 34 is a diagram for explaining an example of how light is irradiated onto the irradiation ranges after the surroundings confirmation determining unit 141 has expanded the second irradiation range set by the irradiation determining unit 14b in the seventh embodiment.
FIG. 34 is a diagram showing how the first irradiation range and the second irradiation range are irradiated with light, as seen from the vehicle 100 side.
In FIG. 34, the first irradiation range is indicated by "LA1", and the second irradiation range is indicated by "LA2".
Note that if the surroundings confirmation determining unit 141 does not expand the second irradiation range, the first irradiation range and the second irradiation range set by the irradiation determining unit 14b, onto which light is irradiated, are ranges such as those shown in FIG. 30B in the sixth embodiment.
 周囲確認判定部141が第2照射範囲を広げることで、図34に示すように、ヘッドライト制御装置1fは、運転者が見ている方向における推定視認対象物のみならず、運転者が見ていない可能性が高い物体が存在し得る場所も明るくすることができる。
 なお、ヘッドライト制御装置1fは、第2照射範囲、言い換えれば、第1照射範囲よりも小さい、歩行者または他車両の運転者等にグレアを与えない程度の照射光量が照射される範囲を広げる。よって、ヘッドライト制御装置1fは、運転者が見ていない可能性が高い物体が存在し得る場所を明るくするとともに、運転者が見ていない方向において、歩行者または他車両の運転者等にグレアを与えないようにできる。
By the surroundings confirmation determining unit 141 expanding the second irradiation range, as shown in FIG. 34, the headlight control device 1f can brighten not only the estimated visible object in the direction in which the driver is looking, but also places where there may be objects that the driver is highly likely not seeing.
Note that what the headlight control device 1f widens is the second irradiation range, in other words, the range irradiated with an amount of light that is smaller than that of the first irradiation range and that does not give glare to pedestrians, drivers of other vehicles, or the like. Therefore, the headlight control device 1f can brighten places where there may be objects that the driver is highly likely not seeing, while avoiding giving glare to pedestrians, drivers of other vehicles, or the like in directions in which the driver is not looking.
 図35および図36は、実施の形態7において、ヘッドライト制御装置1fがヘッドライト2に対して照射させる光の照射範囲の一例について説明するための図である。
 図35は、車両100の周囲100の状況を上から見た俯瞰図である。
 図36Bは、図35に示すような車両100の周囲の状況において、ヘッドライト制御装置1fがヘッドライト2に対して照射させる光の照射範囲の一例を示す図であり、図36Aは、図35に示すような車両100の周囲の状況において、実施の形態6に係るヘッドライト制御装置1eがヘッドライト2に対して照射させる光の照射範囲の一例を示す図である。
FIGS. 35 and 36 are diagrams for explaining an example of the irradiation range of light that the headlight control device 1f causes the headlight 2 to irradiate in the seventh embodiment.
FIG. 35 is an overhead view of the situation around the vehicle 100.
FIG. 36B is a diagram showing an example of the irradiation range of light that the headlight control device 1f causes the headlight 2 to irradiate in the situation around the vehicle 100 shown in FIG. 35, and FIG. 36A is a diagram showing an example of the irradiation range of light that the headlight control device 1e according to the sixth embodiment causes the headlight 2 to irradiate in the situation around the vehicle 100 shown in FIG. 35.
 例えば、今、車両100は地下駐車場を走行しており、車両100の前方において、車両100の進行方向に対して、車両100が走行している道路の左右には、第1停車車両(図35にて「C1」で示されている)と第2停車車両(図35にて「C2」で示されている)が停車しているとする。また、車両100からみて、右前方の第1停車車両の陰から、歩行者(図35にて「W」で示されている)が出てこようとしている。
 運転者は、車両100から見て左前方を向いている。
For example, assume that the vehicle 100 is currently traveling in an underground parking lot, and that, in front of the vehicle 100, a first stopped vehicle (indicated by "C1" in FIG. 35) and a second stopped vehicle (indicated by "C2" in FIG. 35) are stopped on the left and right of the road on which the vehicle 100 is traveling, with respect to the traveling direction of the vehicle 100. Furthermore, as viewed from the vehicle 100, a pedestrian (indicated by "W" in FIG. 35) is about to come out from behind the first stopped vehicle at the right front.
The driver is facing toward the left front as viewed from the vehicle 100.
この場合、例えば、実施の形態6に係るヘッドライト制御装置1eでは、図36Aに示すように、推定された奥行距離に基づいて決定された照射範囲において、当該照射範囲のうちの第2照射範囲は、運転者の向いている方向、すなわち、車両100の左前方に設定されるようになっていた。そうすると、歩行者(図36Aにおいて「W」で示されている)が出てこようとしている車両100の右前方は、運転者が向いている方向とは反対側となり、照射範囲は設定されない。その結果、歩行者付近は暗くなってしまい、運転者は歩行者の発見が遅れてしまう。
In this case, for example, in the headlight control device 1e according to the sixth embodiment, as shown in FIG. 36A, within the irradiation range determined based on the estimated depth distance, the second irradiation range is set in the direction in which the driver is facing, that is, at the left front of the vehicle 100. Then, the right front of the vehicle 100, from which the pedestrian (indicated by "W" in FIG. 36A) is about to come out, is on the side opposite to the direction in which the driver is facing, and no irradiation range is set there. As a result, the vicinity of the pedestrian becomes dark, and the driver is delayed in discovering the pedestrian.
これに対し、実施の形態7に係るヘッドライト制御装置1fは、図36Bに示すように、推定された奥行距離に基づいて決定された照射範囲では車両100の左前方に設定される第2照射範囲について、照射可能領域の限界まで、左右方向を広げる。これにより、歩行者(図36Bにおいて「W」で示されている)が出てこようとしている車両100の右前方が、照射範囲、詳細には、第2照射範囲に設定されることになる。その結果、歩行者付近にもヘッドライト2による光が照射されるようになり、運転者は、歩行者の早期発見が可能となる。
In contrast, as shown in FIG. 36B, the headlight control device 1f according to the seventh embodiment widens, in the left-right direction up to the limit of the irradiable area, the second irradiation range that is set at the left front of the vehicle 100 in the irradiation range determined based on the estimated depth distance. As a result, the right front of the vehicle 100, from which the pedestrian (indicated by "W" in FIG. 36B) is about to come out, is set as the irradiation range, specifically, as the second irradiation range. Consequently, light from the headlight 2 is also irradiated in the vicinity of the pedestrian, and the driver can discover the pedestrian early.
 なお、周囲確認判定部141がどこまで第2照射範囲を広げるかは、適宜設定可能である。
 以上の実施の形態7では、周囲確認判定部141は、ヘッドライト2が光を照射可能な領域の限界まで、第2照射範囲の左右方向を広げるものとしたが、これは一例に過ぎず、周囲確認判定部141は、予め設定された範囲だけ、第2照射範囲の左右方向を広げてもよい。周囲確認判定部141は、照射決定部14bが、奥行距離推定部13が推定した奥行距離に基づいて決定した第2照射範囲よりも広げるようになっていればよい。
Note that the extent to which the surrounding confirmation determination unit 141 expands the second irradiation range can be set as appropriate.
In the seventh embodiment described above, the surroundings confirmation determining unit 141 widens the second irradiation range in the left-right direction up to the limit of the area where the headlight 2 can irradiate light, but this is only an example; the surroundings confirmation determining unit 141 may widen the second irradiation range in the left-right direction by a preset amount. It suffices that the surroundings confirmation determining unit 141 widens the range beyond the second irradiation range that the irradiation determining unit 14b determined based on the depth distance estimated by the depth distance estimating unit 13.
また、周囲確認判定部141は、どれぐらい第2照射範囲を広げるかを、奥行距離が導き出される根拠となる、運転者の挙動と走行関連情報とから推定される車両100の走行状態(例えば、駐停車中、または、交差点右左折中)または車両100の車速に基づいて設定してもよい。この場合、例えば、実施の形態1にて図3を用いて説明したような、車両100の走行状態および運転者の推定視認対象物が対応付けられた情報が予め管理者等によって生成され、周囲確認判定部141が参照可能な場所に記憶されている。また、奥行距離推定部13は、奥行距離情報とともに、向き情報と走行関連情報とを、照射決定部14bに出力する。
Further, the surroundings confirmation determining unit 141 may set how much to widen the second irradiation range based on the traveling state of the vehicle 100 (for example, parked or stopped, or turning right or left at an intersection) estimated from the driver's behavior and the driving-related information, which are the basis from which the depth distance is derived, or based on the vehicle speed of the vehicle 100. In this case, for example, information in which the traveling state of the vehicle 100 and the estimated visible object of the driver are associated with each other, such as that described with reference to FIG. 3 in Embodiment 1, is generated in advance by an administrator or the like and stored in a location that the surroundings confirmation determining unit 141 can refer to. In addition, the depth distance estimating unit 13 outputs the orientation information and the driving-related information, together with the depth distance information, to the irradiation determining unit 14b.
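The state-dependent expansion width just described can be sketched as a small mapping. The traveling-state labels and the degree values below are hypothetical assumptions for illustration only; the specification leaves the concrete amounts to be set as appropriate.

```python
# Hypothetical sketch of choosing the expansion width of the second
# irradiation range from the estimated traveling state or the vehicle
# speed. States and widths are illustrative assumptions.

def expansion_width_deg(traveling_state, speed_kmh):
    """Return how far (in degrees per side) to widen the second irradiation range."""
    if traveling_state == "parking":            # parked or stopped: scan widely
        return 30.0
    if traveling_state == "intersection_turn":  # turning right or left at an intersection
        return 20.0
    # Otherwise taper the expansion as speed rises: at higher speed the
    # relevant area concentrates farther ahead, so widen less.
    return max(5.0, 15.0 - 0.1 * speed_kmh)
```

A lookup of this kind would sit between the surroundings-confirmation decision and the actual modification of the second irradiation range.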
また、以上の実施の形態7では、周囲確認判定部141は、第2照射範囲の左右方向にのみ、当該第2照射範囲を広げるものとしたが、これは一例に過ぎず、周囲確認判定部141は、当該第2照射範囲の上下方向に、当該第2照射範囲を広げてもよい。
Further, in the seventh embodiment described above, the surroundings confirmation determining unit 141 widens the second irradiation range only in the left-right direction, but this is only an example; the surroundings confirmation determining unit 141 may also widen the second irradiation range in the up-down direction.
 実施の形態7に係るヘッドライト制御装置1fの動作について説明する。
 図37は、実施の形態7に係るヘッドライト制御装置1fの動作について説明するためのフローチャートである。
 ヘッドライト制御装置1fは、例えば、ヘッドライト2がオンの状態になった場合、運転者の向きに基づくヘッドライト2の点灯制御を行うと判定し、図37フローチャートで示すような動作を開始する。ヘッドライト制御装置1fは、例えば、ヘッドライト2がオフの状態になるまで、または、車両100の電源がオフにされるまで、図37のフローチャートで示すような動作を繰り返す。
 例えば、ヘッドライト制御装置1fの制御部(図示省略)は、車両100に搭載されているヘッドライトスイッチから、ヘッドライト2の状態を示す情報を取得し、ヘッドライト2がオンの状態であるか否かを判定する。制御部は、ヘッドライト2がオンの状態であると判定すると、運転者の向きに基づくヘッドライト2の点灯制御を開始すると判定し、向き検出部11、走行関連情報取得部12、奥行距離推定部13、照射決定部14b、および、ヘッドライト制御部15aに、ヘッドライト2の点灯制御開始を指示する情報を出力する。
 また、制御部は、ヘッドライト2がオフの状態ある、または、車両100がオフにされたと判定すると、運転者の向きに基づくヘッドライト2の点灯制御を終了すると判定し、向き検出部11、走行関連情報取得部12、奥行距離推定部13、照射決定部14b、および、ヘッドライト制御部15aに、ヘッドライト2の点灯制御終了を指示する情報を出力する。
The operation of the headlight control device 1f according to the seventh embodiment will be described.
FIG. 37 is a flowchart for explaining the operation of the headlight control device 1f according to the seventh embodiment.
For example, when the headlight 2 is turned on, the headlight control device 1f determines that lighting control of the headlight 2 based on the driver's orientation is to be performed, and starts the operation shown in the flowchart of FIG. 37. The headlight control device 1f repeats the operation shown in the flowchart of FIG. 37, for example, until the headlight 2 is turned off or until the power of the vehicle 100 is turned off.
For example, the control unit (not shown) of the headlight control device 1f acquires information indicating the state of the headlight 2 from a headlight switch mounted on the vehicle 100, and determines whether the headlight 2 is in the on state. When the control unit determines that the headlight 2 is in the on state, it determines to start lighting control of the headlight 2 based on the driver's orientation, and outputs information instructing the start of lighting control of the headlight 2 to the orientation detection unit 11, the driving-related information acquisition unit 12, the depth distance estimating unit 13, the irradiation determining unit 14b, and the headlight control unit 15a.
Further, when the control unit determines that the headlight 2 is in the off state or that the power of the vehicle 100 has been turned off, it determines to end the lighting control of the headlight 2 based on the driver's orientation, and outputs information instructing the end of the lighting control of the headlight 2 to the orientation detection unit 11, the driving-related information acquisition unit 12, the depth distance estimating unit 13, the irradiation determining unit 14b, and the headlight control unit 15a.
 図37のフローチャートで示す動作について、ステップST1-1、ステップST1-2、ステップST2~ステップST3a、ステップST4aの処理内容は、それぞれ、実施の形態6にて説明済みの、図31のフローチャートで示すヘッドライト制御装置1eの動作のステップST1-1、ステップST1-2、ステップST2~ステップST3a、ステップST4aの処理内容と同様であるため、重複した説明を省略する。
 実施の形態7に係るヘッドライト制御装置1fの動作は、図31のフローチャートを用いて説明した実施の形態6に係るヘッドライト制御装置1eの動作から、ステップST3a-1~ステップST3a-2の処理が追加になっている。
Regarding the operation shown in the flowchart of FIG. 37, the processing contents of step ST1-1, step ST1-2, steps ST2 to ST3a, and step ST4a are the same as those of step ST1-1, step ST1-2, steps ST2 to ST3a, and step ST4a of the operation of the headlight control device 1e shown in the flowchart of FIG. 31, which have already been explained in the sixth embodiment; therefore, duplicate explanations are omitted.
The operation of the headlight control device 1f according to the seventh embodiment differs from the operation of the headlight control device 1e according to the sixth embodiment described using the flowchart of FIG. 31 in that the processing of steps ST3a-1 to ST3a-2 is added.
ステップST3aにて、照射決定部14aが、ステップST2にて奥行距離推定部13が推定した奥行距離に基づき、第1照射範囲と第2照射範囲を設定すると、周囲確認判定部141は、ステップST1-1にて向き検出部11が検出した運転者の向きに関する向き情報と、ステップST1-2にて車両情報取得部121が取得した車両関連情報とに基づき、運転者が周囲の確認を行っているか否かを判定する(ステップST3a-1)。
In step ST3a, when the irradiation determining unit 14b sets the first irradiation range and the second irradiation range based on the depth distance estimated by the depth distance estimating unit 13 in step ST2, the surroundings confirmation determining unit 141 determines whether the driver is checking the surroundings based on the orientation information regarding the driver's orientation detected by the orientation detection unit 11 in step ST1-1 and the vehicle-related information acquired by the vehicle information acquisition unit 121 in step ST1-2 (step ST3a-1).
 ステップST3a-1にて、運転者が周囲の確認を行っていると判定した場合(ステップST3a-1の“YES”の場合)、周囲確認判定部141は、ステップST3aにて照射決定部14bが設定した第2照射範囲を広げる(ステップST3a-2)。
 照射決定部14bは、ステップST3aにて設定した第1照射範囲を示す情報と、ステップST3a-2にて周囲確認判定部141によって広げられた後の第2照射範囲を示す情報と含む照射情報を、ヘッドライト制御部15aに出力する。
When it is determined in step ST3a-1 that the driver is checking the surroundings ("YES" in step ST3a-1), the surroundings confirmation determining unit 141 expands the second irradiation range set by the irradiation determining unit 14b in step ST3a (step ST3a-2).
The irradiation determining unit 14b outputs, to the headlight control unit 15a, irradiation information including information indicating the first irradiation range set in step ST3a and information indicating the second irradiation range expanded by the surroundings confirmation determining unit 141 in step ST3a-2.
ステップST3a-1にて、運転者が周囲の確認を行っていないと判定した場合(ステップST3a-1の“NO”の場合)、照射決定部14bは、ステップST3aにて設定した第1照射範囲および第2照射範囲を示す情報を含む照射情報を、ヘッドライト制御部15aに出力する。そして、ヘッドライト制御装置1bの動作は、ステップST3a-2の処理をスキップしてステップST4aの処理へ進む。
When it is determined in step ST3a-1 that the driver is not checking the surroundings ("NO" in step ST3a-1), the irradiation determining unit 14b outputs, to the headlight control unit 15a, irradiation information including information indicating the first irradiation range and the second irradiation range set in step ST3a. Then, the operation of the headlight control device 1f skips the processing of step ST3a-2 and proceeds to the processing of step ST4a.
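The branch added in this embodiment (steps ST3a-1 and ST3a-2) can be sketched as follows. Here `check_fn` stands in for the surroundings-confirmation determination and `widen_fn` for the range expansion; both names, and the tuple representation of a range, are assumptions made for illustration.

```python
# Hypothetical sketch of the added branch inside one control cycle:
# step ST3a-2 (widening) runs only when step ST3a-1 answers YES;
# otherwise the second irradiation range passes through unchanged
# to step ST4a.

def run_st3a_branch(second_range, check_fn, widen_fn):
    """Return the second irradiation range to include in the irradiation
    information output to the headlight control unit."""
    if check_fn():                      # ST3a-1: is the driver checking the surroundings?
        return widen_fn(second_range)   # ST3a-2: widen the second range
    return second_range                 # ST3a-2 skipped; proceed to ST4a
```

Injecting the two callables keeps the branch itself independent of how the determination and the expansion are actually implemented.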
 このように、ヘッドライト制御装置1fは、運転者の向きに関する向き情報と走行関連情報(ここでは車両情報)とに基づき、運転者が周囲の確認を行っているか否かを判定し、運転者が周囲の確認を行っていると判定した場合、第2照射範囲を広げる。
 そのため、ヘッドライト制御装置1fは、運転者が見ている方向における推定視認対象物のみならず、運転者が見ていない可能性が高い物体が存在し得る場所も明るくすることができる。また、ヘッドライト制御装置1fは、運転者が見ていない可能性が高い物体が存在し得る場所を明るくするとともに、運転者が見ていない方向において、歩行者または他車両の運転者等にグレアを与えないようにできる。
In this way, the headlight control device 1f determines whether the driver is checking the surroundings based on the orientation information regarding the driver's orientation and the driving-related information (here, the vehicle information), and, when it determines that the driver is checking the surroundings, expands the second irradiation range.
Therefore, the headlight control device 1f can brighten not only the estimated visible object in the direction in which the driver is looking, but also places where there may be objects that the driver is highly likely not seeing. In addition, the headlight control device 1f can brighten such places while avoiding giving glare to pedestrians, drivers of other vehicles, or the like in directions in which the driver is not looking.
なお、以上の実施の形態7では、ヘッドライト制御装置1fは、車両100に搭載される車載装置とし、向き検出部11と、走行関連情報取得部12と、奥行距離推定部13と、照射決定部14bと、ヘッドライト制御部15aと、図示しない制御部とは、車載装置に備えられているものとした。これに限らず、向き検出部11と、走行関連情報取得部12と、奥行距離推定部13と、照射決定部14bと、ヘッドライト制御部15aと、図示しない制御部のうち、一部が車両100の車載装置に備えられるものとし、その他が当該車載装置とネットワークを介して接続されるサーバに備えられてもよい。また、向き検出部11と、走行関連情報取得部12と、奥行距離推定部13と、照射決定部14bと、ヘッドライト制御部15aと、図示しない制御部の全部がサーバに備えられてもよい。
In the seventh embodiment described above, the headlight control device 1f is an in-vehicle device mounted on the vehicle 100, and the orientation detection unit 11, the driving-related information acquisition unit 12, the depth distance estimating unit 13, the irradiation determining unit 14b, the headlight control unit 15a, and the control unit (not shown) are provided in the in-vehicle device. However, the configuration is not limited to this; some of the orientation detection unit 11, the driving-related information acquisition unit 12, the depth distance estimating unit 13, the irradiation determining unit 14b, the headlight control unit 15a, and the control unit (not shown) may be provided in the in-vehicle device of the vehicle 100, and the others may be provided in a server connected to the in-vehicle device via a network. Alternatively, all of the orientation detection unit 11, the driving-related information acquisition unit 12, the depth distance estimating unit 13, the irradiation determining unit 14b, the headlight control unit 15a, and the control unit (not shown) may be provided in the server.
 実施の形態7に係るヘッドライト制御装置1fのハードウェア構成は、実施の形態1において図6Aおよび図6Bを用いて説明したヘッドライト制御装置1のハードウェア構成と同様であるため、図示を省略する。
 実施の形態7において、向き検出部11と、走行関連情報取得部12と、奥行距離推定部13と、照射決定部14bと、ヘッドライト制御部15aと、図示しない制御部の機能は、処理回路1001により実現される。すなわち、ヘッドライト制御装置1fは、車内撮像装置3から取得した車内撮像画像に基づいて検出した運転者の向きに関する向き情報と走行関連情報とに基づいて奥行距離を推定し、推定した奥行距離に基づいて、ヘッドライト2の点灯制御を行うための処理回路1001を備える。
 処理回路1001は、図6Aに示すように専用のハードウェアであっても、図6Bに示すようにメモリ1005に格納されるプログラムを実行するプロセッサ1004であってもよい。
The hardware configuration of the headlight control device 1f according to the seventh embodiment is the same as the hardware configuration of the headlight control device 1 described using FIGS. 6A and 6B in the first embodiment, and therefore illustration thereof is omitted.
In the seventh embodiment, the functions of the orientation detection unit 11, the driving-related information acquisition unit 12, the depth distance estimating unit 13, the irradiation determining unit 14b, the headlight control unit 15a, and the control unit (not shown) are realized by a processing circuit 1001. That is, the headlight control device 1f includes the processing circuit 1001 for estimating the depth distance based on the orientation information regarding the driver's orientation, detected based on the in-vehicle captured image acquired from the in-vehicle imaging device 3, and the driving-related information, and for controlling the lighting of the headlight 2 based on the estimated depth distance.
The processing circuit 1001 may be dedicated hardware as shown in FIG. 6A, or may be a processor 1004 that executes a program stored in a memory 1005 as shown in FIG. 6B.
 処理回路1001は、メモリ1005に記憶されたプログラムを読み出して実行することにより、向き検出部11と、走行関連情報取得部12と、奥行距離推定部13と、照射決定部14bと、ヘッドライト制御部15aと、図示しない制御部の機能を実行する。すなわち、ヘッドライト制御装置1fは、処理回路1001により実行されるときに、上述の図37のステップST1-1、ステップST1-2、ステップST2~ステップST4aが結果的に実行されることになるプログラムを格納するためのメモリ1005を備える。また、メモリ1005に記憶されたプログラムは、向き検出部11と、走行関連情報取得部12と、奥行距離推定部13と、照射決定部14bと、ヘッドライト制御部15aと、図示しない制御部の処理の手順または方法をコンピュータに実行させるものであるともいえる。
 記憶部16は、例えば、メモリ1005で構成される。
 ヘッドライト制御装置1fは、ヘッドライト2、車内撮像装置3、または、走行関連情報取得装置4等の装置と、有線通信または無線通信を行う入力インタフェース装置1002および出力インタフェース装置1003を備える。
The processing circuit 1001 reads out and executes the program stored in the memory 1005, thereby executing the functions of the orientation detection unit 11, the driving-related information acquisition unit 12, the depth distance estimating unit 13, the irradiation determining unit 14b, the headlight control unit 15a, and the control unit (not shown). That is, the headlight control device 1f includes the memory 1005 for storing a program that, when executed by the processing circuit 1001, results in the execution of step ST1-1, step ST1-2, and steps ST2 to ST4a of FIG. 37 described above. It can also be said that the program stored in the memory 1005 causes a computer to execute the procedures or methods of the orientation detection unit 11, the driving-related information acquisition unit 12, the depth distance estimating unit 13, the irradiation determining unit 14b, the headlight control unit 15a, and the control unit (not shown).
The storage unit 16 includes, for example, a memory 1005.
The headlight control device 1f includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the headlight 2, the in-vehicle imaging device 3, or the driving-related information acquisition device 4.
 以上のように、実施の形態7に係るヘッドライト制御装置1fは、照射決定部14bが、向き検出部11が検出した運転者の向きに関する情報と走行関連情報取得部12が取得した走行関連情報とに基づき、運転者が周囲の確認を行っているか否かを判定し、運転者が周囲の確認を行っていると判定した場合、第2照射範囲を広げる周囲確認判定部141を有するように構成した。
 そのため、ヘッドライト制御装置1fは、運転者が見ている方向における推定視認対象物のみならず、運転者が見ていない可能性が高い物体が存在し得る場所も明るくすることができる。また、ヘッドライト制御装置1fは、運転者が見ていない可能性が高い物体が存在し得る場所を明るくするとともに、運転者が見ていない方向において、歩行者または他車両の運転者等にグレアを与えないようにできる。
As described above, the headlight control device 1f according to the seventh embodiment is configured such that the irradiation determining unit 14b has the surroundings confirmation determining unit 141, which determines whether the driver is checking the surroundings based on the information regarding the driver's orientation detected by the orientation detection unit 11 and the driving-related information acquired by the driving-related information acquisition unit 12, and which, when it determines that the driver is checking the surroundings, widens the second irradiation range.
Therefore, the headlight control device 1f can brighten not only the estimated visible object in the direction in which the driver is looking, but also places where there may be objects that the driver is highly likely not seeing. In addition, the headlight control device 1f can brighten such places while avoiding giving glare to pedestrians, drivers of other vehicles, or the like in directions in which the driver is not looking.
 なお、以上の実施の形態1~7では、奥行距離推定部13、13a、13b、13cは、奥行距離推定用情報を用いて、奥行距離を推定するようにしていた。しかし、これは一例に過ぎない。
 例えば、奥行距離推定部13、13a、13b、13cは、向き情報を入力とし、奥行距離を出力する学習済みのモデル(以下「機械学習モデル」という。)を用いて、奥行距離を推定してもよい。
 機械学習モデルは、向き情報を入力とし、奥行距離および照射範囲の理想幅を出力するモデルであってもよいし、向き情報を入力とし、奥行距離および理想光量を出力するモデルであってもよいし、向き情報を入力とし、奥行距離、照射範囲の理想幅、および、理想光量を出力するモデルであってもよい。
 例えば、管理者等は、車両100を試走させて向き情報と、試走中に運転者が視認しようとした対象物までの奥行距離の情報とを収集し、収集した向き情報および奥行距離の情報を学習用データとして、学習装置に機械学習モデルを生成させておく。生成された機械学習モデルは、ヘッドライト制御装置1、1a、1b、1c、1d、1e、1fが参照可能な場所に記憶される。
 また、機械学習モデルは、向き情報だけでなく、地図情報、車両情報、または、車外情報も入力とし、奥行距離および照射範囲の理想幅を出力するモデルであってもよい。
 ヘッドライト制御装置1、1a、1b、1c、1d、1e、1fは、機械学習モデルを用いて奥行距離を推定するようにすることで、向き情報のバリエーションにより対応した奥行距離を推定できる。
Note that in the first to seventh embodiments described above, the depth distance estimating units 13, 13a, 13b, and 13c estimate the depth distance using the depth distance estimation information. However, this is just one example.
For example, the depth distance estimating units 13, 13a, 13b, and 13c may estimate the depth distance using a trained model (hereinafter referred to as a "machine learning model") that receives the orientation information as input and outputs the depth distance.
The machine learning model may be a model that receives the orientation information as input and outputs the depth distance and an ideal width of the irradiation range, a model that receives the orientation information as input and outputs the depth distance and an ideal amount of light, or a model that receives the orientation information as input and outputs the depth distance, the ideal width of the irradiation range, and the ideal amount of light.
For example, an administrator or the like performs a test run of the vehicle 100 to collect orientation information and information on the depth distance to an object that the driver attempted to visually recognize during the test run, and causes a learning device to generate a machine learning model in advance using the collected orientation information and depth distance information as learning data. The generated machine learning model is stored in a location that the headlight control devices 1, 1a, 1b, 1c, 1d, 1e, and 1f can refer to.
Furthermore, the machine learning model may be a model that receives not only the orientation information but also map information, vehicle information, or vehicle exterior information as input, and outputs the depth distance and the ideal width of the irradiation range.
By estimating the depth distance using a machine learning model, the headlight control devices 1, 1a, 1b, 1c, 1d, 1e, and 1f can estimate depth distances that better correspond to variations in the orientation information.
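To make the learned-model alternative concrete, the following sketch substitutes a 1-nearest-neighbour lookup for an arbitrary trained regressor. The feature choice (face pitch/yaw), the training samples, and the distances are all hypothetical assumptions, not data from the disclosure; any regression model trained on (orientation, depth) pairs collected during test runs would fill the same role.

```python
# Hypothetical sketch: a minimal 1-nearest-neighbour "model" standing in
# for a trained regressor that maps the driver's orientation to the depth
# distance the driver is trying to see.

TRAINING_DATA = [
    # (pitch_deg, yaw_deg) -> depth distance [m] the driver tried to see
    ((0.0, 0.0), 60.0),    # facing straight ahead: far down the road
    ((-10.0, 0.0), 15.0),  # face tilted down: near the vehicle
    ((0.0, 30.0), 25.0),   # facing to the side: e.g. an intersection corner
]

def estimate_depth(pitch_deg, yaw_deg):
    """Predict the depth distance for one orientation sample by returning
    the depth of the closest training sample in (pitch, yaw) space."""
    def dist2(sample):
        (p, y), _depth = sample
        return (p - pitch_deg) ** 2 + (y - yaw_deg) ** 2
    return min(TRAINING_DATA, key=dist2)[1]
```

The same interface extends naturally to the richer inputs mentioned above (map information, vehicle information, vehicle exterior information) by widening the feature tuple.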
 また、各実施の形態の自由な組み合わせ、あるいは各実施の形態の任意の構成要素の変形、もしくは各実施の形態において任意の構成要素の省略が可能である。 Furthermore, it is possible to freely combine the embodiments, to modify any component of each embodiment, or to omit any component in each embodiment.
本開示に係るヘッドライト制御装置は、車両における、運転者が向いている方向に基づくヘッドライトの点灯制御において、運転者が実際に顔向きまたは視線方向のどれぐらい先を視認しようとしているかを考慮した点灯制御ができる。
The headlight control device according to the present disclosure can perform, in lighting control of a headlight in a vehicle based on the direction in which the driver is facing, lighting control that takes into consideration how far ahead in the face direction or line-of-sight direction the driver is actually trying to see.
1,1a,1b,1c,1d,1e,1f ヘッドライト制御装置、11 向き検出部、12,12a,12b,12c 走行関連情報取得部、121 車両情報取得部、122 地図情報取得部、123 車外情報取得部、13,13a,13b,13c 奥行距離推定部、14,14a,14b 照射決定部、141 周囲確認判定部、15,15a ヘッドライト制御部、16 記憶部、17 信頼度判定部、2 ヘッドライト、3 車内撮像装置、4 走行関連情報取得装置、100 車両、1001 処理回路、1002 入力インタフェース装置、1003 出力インタフェース装置、1004 プロセッサ、1005 メモリ。
1, 1a, 1b, 1c, 1d, 1e, 1f headlight control device, 11 orientation detection unit, 12, 12a, 12b, 12c driving-related information acquisition unit, 121 vehicle information acquisition unit, 122 map information acquisition unit, 123 vehicle exterior information acquisition unit, 13, 13a, 13b, 13c depth distance estimating unit, 14, 14a, 14b irradiation determining unit, 141 surroundings confirmation determining unit, 15, 15a headlight control unit, 16 storage unit, 17 reliability determination unit, 2 headlight, 3 in-vehicle imaging device, 4 driving-related information acquisition device, 100 vehicle, 1001 processing circuit, 1002 input interface device, 1003 output interface device, 1004 processor, 1005 memory.

Claims (15)

  1.  A headlight control device comprising:
      an orientation detection unit to detect an orientation of a driver of a vehicle on a basis of a captured image in which the driver is imaged;
      a driving-related information acquisition unit to acquire driving-related information related to driving of the vehicle;
      a depth distance estimation unit to estimate, on a basis of orientation information regarding the orientation of the driver detected by the orientation detection unit and the driving-related information acquired by the driving-related information acquisition unit, a depth distance that is a distance from an installation position of a headlight provided in the vehicle to an estimated visual recognition position, the estimated visual recognition position being a position estimated to be visually recognized by the driver in a direction in which the driver is facing;
      an irradiation determination unit to determine an irradiation range of light emitted by the headlight on a basis of the depth distance estimated by the depth distance estimation unit; and
      a headlight control unit to cause the headlight to irradiate the irradiation range determined by the irradiation determination unit with the light.
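The claim above describes a pipeline: driver orientation plus driving-related information yields an estimated depth distance, which in turn fixes the irradiation range. The sketch below is a purely hypothetical illustration of that data flow; the heuristic formula, the constants, and the names (`estimate_depth_distance`, `irradiation_range`) are assumptions of this example, not anything specified in the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class Orientation:
    yaw_deg: float    # face/gaze yaw relative to straight ahead
    pitch_deg: float  # downward pitch toward the road surface

def estimate_depth_distance(orientation: Orientation, vehicle_speed_kmh: float) -> float:
    """Estimate how far ahead (metres) the driver is trying to see.

    Hypothetical heuristic: higher speed pushes the viewing point farther
    out; a steeper downward pitch pulls it closer to the vehicle.
    """
    base = max(10.0, vehicle_speed_kmh * 0.8)
    pitch_factor = max(0.2, math.cos(math.radians(orientation.pitch_deg * 4)))
    return base * pitch_factor

def irradiation_range(depth_distance_m: float, yaw_deg: float,
                      beam_width_deg: float = 15.0) -> dict:
    """Centre the beam on the estimated visual recognition position."""
    return {"center_yaw_deg": yaw_deg,
            "reach_m": depth_distance_m,
            "width_deg": beam_width_deg}
```

A headlight control loop would call these two functions each frame and hand the resulting range to the lamp driver; the actual estimation logic in the disclosure is defined by the dependent claims below.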
  2.  The headlight control device according to claim 1, wherein
      the depth distance estimation unit estimates the depth distance by comparing the orientation information and the driving-related information with depth distance estimation information in which information regarding behavior of the driver is associated with driving-related information.
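Claim 2's comparison against pre-associated "depth distance estimation information" can be pictured as a nearest-match table lookup. The table contents, the scoring function, and the name `estimate_by_table` below are invented for illustration only; the patent does not specify how the association or the comparison is implemented.

```python
# Hypothetical depth-distance-estimation table: each row associates a
# driver behaviour (face yaw/pitch) and a driving condition (speed) with
# a typical viewing distance observed for that combination.
ESTIMATION_TABLE = [
    # (yaw_deg, pitch_deg, speed_kmh, depth_distance_m)
    (0.0,  -2.0,  40.0, 35.0),
    (0.0,  -1.0,  60.0, 55.0),
    (15.0, -1.0,  40.0, 25.0),   # glancing aside: nearer viewing point
    (0.0,   0.0, 100.0, 90.0),
]

def estimate_by_table(yaw_deg: float, pitch_deg: float, speed_kmh: float) -> float:
    """Return the depth distance of the closest-matching table row."""
    def score(row):
        ty, tp, ts, _ = row
        # Weighted L1 distance between the query and the row's conditions.
        return abs(ty - yaw_deg) + abs(tp - pitch_deg) + abs(ts - speed_kmh) / 10.0
    return min(ESTIMATION_TABLE, key=score)[3]
```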
  3.  The headlight control device according to claim 1, wherein
      the depth distance estimation unit estimates the depth distance on a basis of the orientation information, the driving-related information, and a machine learning model that takes the orientation information and the driving-related information as input and outputs information regarding the depth distance.
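Claim 3 replaces the table of claim 2 with a learned model. As a stand-in, the sketch below uses a fixed linear function; in practice the weights would come from training on recorded (orientation, driving information, viewing distance) pairs, and nothing about the model class, features, or the name `model_depth_distance` is taken from the patent.

```python
# Hypothetical trained weights; a real system would load these from a
# model fitted offline on driving data.
WEIGHTS = {"bias": 12.0, "speed": 0.7, "yaw": -0.4, "pitch": 3.0}

def model_depth_distance(yaw_deg: float, pitch_deg: float, speed_kmh: float) -> float:
    """Linear-model stand-in for the machine learning model of claim 3."""
    w = WEIGHTS
    d = (w["bias"] + w["speed"] * speed_kmh
         + w["yaw"] * abs(yaw_deg) + w["pitch"] * pitch_deg)
    return max(5.0, d)  # never aim the beam closer than a floor distance
```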
  4.  The headlight control device according to any one of claims 1 to 3, wherein
      the driving-related information acquisition unit includes a vehicle information acquisition unit to acquire, as the driving-related information, vehicle information regarding the vehicle, and
      the depth distance estimation unit estimates the depth distance on a basis of the orientation information and the vehicle information.
  5.  The headlight control device according to any one of claims 1 to 3, wherein
      the driving-related information acquisition unit includes a vehicle information acquisition unit to acquire, as the driving-related information, vehicle information regarding the vehicle, and a map information acquisition unit to acquire map information as the driving-related information, and
      the depth distance estimation unit estimates the depth distance on a basis of the orientation information, the vehicle information, and the map information.
  6.  The headlight control device according to any one of claims 1 to 3, wherein
      the driving-related information acquisition unit includes a vehicle-exterior information acquisition unit to acquire, as the driving-related information, vehicle-exterior information regarding surroundings of the vehicle, and
      the depth distance estimation unit estimates the depth distance on a basis of the orientation information and the vehicle-exterior information.
  7.  The headlight control device according to any one of claims 1 to 3, wherein
      the driving-related information acquisition unit includes a vehicle information acquisition unit to acquire, as the driving-related information, vehicle information regarding the vehicle, and a vehicle-exterior information acquisition unit to acquire, as the driving-related information, vehicle-exterior information regarding surroundings of the vehicle, and
      the depth distance estimation unit estimates the depth distance on a basis of the orientation information, the vehicle information, and the vehicle-exterior information.
  8.  The headlight control device according to any one of claims 1 to 3, wherein
      the driving-related information acquisition unit includes a vehicle information acquisition unit to acquire, as the driving-related information, vehicle information regarding the vehicle, a map information acquisition unit to acquire map information as the driving-related information, and a vehicle-exterior information acquisition unit to acquire, as the driving-related information, vehicle-exterior information regarding an area ahead of the vehicle, and
      the depth distance estimation unit estimates the depth distance on a basis of the orientation information, the vehicle information, the map information, and the vehicle-exterior information.
  9.  The headlight control device according to any one of claims 6 to 8, wherein
      the vehicle-exterior information acquired by the vehicle-exterior information acquisition unit includes information regarding a distance to an object present around the vehicle, and
      the depth distance estimation unit adjusts the depth distance on a basis of the distance to the object present ahead of the vehicle, and sets the adjusted depth distance as the estimated depth distance.
  10.  The headlight control device according to any one of claims 1 to 9, further comprising
      a reliability determination unit to determine a reliability of the orientation of the driver detected by the orientation detection unit, wherein
      the depth distance estimation unit does not use, for estimation of the depth distance, an orientation of the driver whose reliability the reliability determination unit has determined to be low.
  11.  The headlight control device according to any one of claims 1 to 10, wherein
      the irradiation determination unit sets, in the irradiation range, a first irradiation range and a second irradiation range in which an amount of light emitted by the headlight is made smaller than in the first irradiation range, and
      the headlight control unit causes the headlight to irradiate, of the irradiation range, the second irradiation range set by the irradiation determination unit with light of an irradiation light amount smaller than that of the light irradiated onto the first irradiation range set by the irradiation determination unit.
  12.  The headlight control device according to claim 11, wherein
      the irradiation determination unit includes a surroundings confirmation determination unit to determine, on a basis of the orientation information and the driving-related information, whether the driver is checking the surroundings, and to widen the second irradiation range when determining that the driver is checking the surroundings.
  13.  The headlight control device according to any one of claims 1 to 12, wherein
      the depth distance estimation unit estimates the depth distance and also estimates an ideal width of the irradiation range, and
      the irradiation determination unit determines the irradiation range on a basis of the depth distance estimated by the depth distance estimation unit and the ideal width of the irradiation range.
  14.  The headlight control device according to any one of claims 1 to 13, wherein
      the depth distance estimation unit estimates the depth distance and also estimates an ideal light amount of the light emitted by the headlight, and
      the headlight control unit causes the headlight to irradiate the light at the ideal light amount estimated by the depth distance estimation unit within the irradiation range determined by the irradiation determination unit.
  15.  A headlight control method comprising:
      a step in which an orientation detection unit detects an orientation of a driver of a vehicle on a basis of a captured image in which the driver is imaged;
      a step in which a driving-related information acquisition unit acquires driving-related information related to driving of the vehicle;
      a step in which a depth distance estimation unit estimates, on a basis of orientation information regarding the orientation of the driver detected by the orientation detection unit and the driving-related information acquired by the driving-related information acquisition unit, a depth distance that is a distance from an installation position of a headlight provided in the vehicle to an estimated visual recognition position, the estimated visual recognition position being a position estimated to be visually recognized by the driver in a direction in which the driver is facing;
      a step in which an irradiation determination unit determines an irradiation range of light emitted by the headlight on a basis of the depth distance estimated by the depth distance estimation unit; and
      a step in which a headlight control unit causes the headlight to irradiate the irradiation range determined by the irradiation determination unit with the light.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/032693 WO2024047777A1 (en) 2022-08-31 2022-08-31 Headlight control device and headlight control method

Publications (1)

Publication Number Publication Date
WO2024047777A1 (en) 2024-03-07

Family

ID=90098903

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/032693 WO2024047777A1 (en) 2022-08-31 2022-08-31 Headlight control device and headlight control method

Country Status (1)

Country Link
WO (1) WO2024047777A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001347881A (en) * 2000-06-08 2001-12-18 Stanley Electric Co Ltd Driver's face direction detector
JP2009120148A (en) * 2007-11-19 2009-06-04 Aisin Seiki Co Ltd Vehicular lamp control system


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22957376

Country of ref document: EP

Kind code of ref document: A1