WO2024069676A1 - Headlight control device and headlight control method

Publication number: WO2024069676A1
Application number: PCT/JP2022/035616
Authority: WIPO (PCT)
Original assignee / applicant: 三菱電機株式会社 (Mitsubishi Electric Corporation)
Inventors: 裕 小野寺, 僚太郎 江原, 悟 井上
Other languages: French (fr), Japanese (ja)
Prior art keywords: brightness, vehicle, control unit, control device, area

Description

  • This disclosure relates to a headlamp control device and a headlamp control method.
  • ADB stands for adaptive driving beam, a function for adjusting the illumination range and illumination amount of the headlights.
  • In conventional headlight control, the brightness of the headlights rises sharply when they start to turn on in response to a decrease in illuminance around the vehicle, or when dimming or light blocking is released in the ADB.
  • This causes problems such as glare for pedestrians, and glare for the driver due to light reflected back from objects such as signs that contain reflective material.
  • The present disclosure has been made in consideration of the above-mentioned problems, and aims to provide a technique that can reduce glare caused to the vehicle driver while brightening the illumination area.
  • the headlight control device includes an acquisition unit that acquires object information about the object from a camera when an object is present in an area illuminated by the vehicle's headlights, and a control unit that illuminates the object with the headlights at a brightness corresponding to the type of object based on the object information while gradually increasing the brightness of the area with the headlights.
  • the object is illuminated by the headlights at a brightness that corresponds to the type of object based on the object information.
  • FIG. 1 is a functional block diagram showing a main part of a headlamp control device according to a first embodiment.
  • FIG. 2 is a diagram for explaining an example of control of the headlamp control device according to the first embodiment.
  • FIG. 3 is a diagram for explaining an example of control of the headlamp control device according to the first embodiment.
  • FIG. 4 is a diagram for explaining an example of control of the headlamp control device according to the first embodiment.
  • FIG. 5 is a diagram for explaining an example of control of the headlamp control device according to the first embodiment.
  • FIG. 6 is a flowchart showing an example of an operation of the headlamp control device according to the first embodiment.
  • FIG. 7 is a flowchart showing an example of an operation of a headlamp control device according to a second embodiment.
  • FIG. 8 is a functional block diagram showing a main part of a headlamp control device according to a third embodiment.
  • FIG. 9 is a diagram for explaining an example of a determination made by the headlamp control device according to the third embodiment.
  • FIG. 10 is a diagram for explaining an example of a determination made by the headlamp control device according to the third embodiment.
  • FIG. 11 is a diagram for explaining an example of a determination made by the headlamp control device according to the third embodiment.
  • FIG. 12 is a flowchart showing an example of an operation of the headlamp control device according to the third embodiment.
  • FIG. 13 is a functional block diagram showing a main part of a headlamp control device according to a fourth embodiment.
  • FIG. 14 is a top view for explaining an example of control of the headlamp control device according to the fourth embodiment.
  • FIG. 15 is a flowchart showing an example of the operation of the headlamp control device according to the fourth embodiment.
  • FIG. 16 is a top view for explaining an example of control of a headlamp control device according to a fifth embodiment.
  • FIG. 17 is a block diagram showing a hardware configuration of a headlamp control device according to another modified example.
  • FIG. 18 is a block diagram showing a hardware configuration of a headlamp control device according to another modified example.
  • <First embodiment> FIG. 1 is a functional block diagram showing a main part of a headlamp control device 1 according to the first embodiment.
  • The headlamp control device 1 according to the first embodiment is mounted on a vehicle such as an automobile.
  • In the following, the vehicle on which the headlamp control device 1 is mounted may be referred to as the "vehicle", and a vehicle other than the vehicle on which the headlamp control device 1 is mounted may be referred to as an "other vehicle".
  • the headlamp control device 1 is connected to an illuminance detection unit 2, an imaging unit 3, and a headlamp 4 so as to be able to communicate with them.
  • the headlamp 4 has a lamp with functions such as low beam, high beam, or spot beam, and has an ADB function that allows the headlamp 4's illumination range and illumination amount to be adjusted.
  • the headlamp 4 configured in this way is capable of emitting light in various light distribution patterns based on the light distribution control signal from the headlamp control device 1.
  • the imaging unit 3 is, for example, a forward monitoring camera that captures a visible light image in front of the vehicle.
  • the image is, for example, a moving image.
  • the imaging unit 3 captures an image in front of the vehicle, generates object information based on the image, and outputs it to the headlight control device 1.
  • the object information is information about objects that exist in the area illuminated by the headlights 4 (hereinafter sometimes referred to as the "illumination area").
  • the object information is information that allows the headlamp control device 1 to determine the type and position information of the object, and as an example, it will be described as an image signal showing an image in front of the vehicle. However, as will be described later, the object information is not limited to image signals, etc.
  • the illuminance detection unit 2 is, for example, an illuminance sensor.
  • the illuminance detection unit 2 outputs the illuminance around the vehicle to the headlamp control device 1 as an illuminance signal.
  • the illuminance signal is not limited to this, and for example, the illuminance detection unit 2 may output the luminance of an image captured by a camera possessed by the vehicle as an illuminance signal. In this case, the function of the imaging unit 3 can be realized by the illuminance detection unit 2.
  • the headlamp control device 1 is connected to a computer network within the vehicle (e.g., CAN (Controller Area Network)) and can appropriately acquire various information (hereinafter referred to as "vehicle information") from the vehicle.
  • vehicle information includes, for example, information indicating whether the light switch is on or off.
  • the light switch is configured to be able to switch between the on and off states of the headlamp 4, and to switch whether or not to automatically control the headlamp 4. For example, when an operation to execute automatic control of the headlamp 4 is performed on the light switch, automatic control of the headlamp 4 is executed by the headlamp control device 1.
  • the headlight control device 1 acquires an illuminance signal relating to the illuminance around the vehicle from the illuminance detection unit 2, acquires object information from the imaging unit 3, and outputs a light distribution control signal for controlling the headlight 4 to the headlight 4 based on the illuminance signal and the object information. This enables the headlight control device 1 to automatically control the light distribution of the vehicle's headlight 4.
  • the headlight control device 1 in Figure 1 includes a lighting determination unit 11, an acquisition unit 12, and a control unit 13.
  • The lighting determination unit 11 selectively determines whether to turn the headlights 4 on or off based on the illuminance signal from the illuminance detection unit 2.
  • The lighting determination unit 11 outputs a lighting signal to the control unit 13 when the illuminance indicated by the illuminance signal is equal to or lower than a threshold value, and outputs a turn-off signal to the control unit 13 when the illuminance is greater than the threshold value.
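The thresholding behaviour of the lighting determination unit 11 can be sketched in a few lines. This is a minimal illustration, not the patent's implementation; the function name, the signal strings, and the 1000 lx threshold are all assumptions for the example.

```python
def lighting_signal(illuminance: float, threshold: float = 1000.0) -> str:
    """Sketch of lighting determination unit 11: emit a lighting signal
    when ambient illuminance is at or below the threshold, and a
    turn-off signal when it is greater."""
    return "on" if illuminance <= threshold else "off"
```

For example, an illuminance of 500 lx would yield the lighting signal, while 2000 lx would yield the turn-off signal.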
  • the acquisition unit 12 acquires object information (image capture signals in this embodiment 1) from the image capture unit 3, which is a camera.
  • The control unit 13 starts lighting control of the headlights 4 when the lighting signal, indicating that the illuminance is equal to or lower than the threshold value, is output from the lighting determination unit 11.
  • the control unit 13 gradually increases the brightness of the area illuminated by the headlights 4, i.e., the brightness of the illumination area.
  • The control unit 13 controls the brightness of the headlights 4 by their luminance, so the brightness described in this embodiment 1 is substantially the same as luminance.
  • The control unit 13 increases the brightness of the illumination area linearly by having the headlights 4 illuminate the illumination area at a plurality of predetermined luminances in ascending order, starting from the darkest luminance.
  • the control unit 13 determines the type and position of an object present in the illumination area based on the object information (imaging signal in this embodiment 1) acquired by the acquisition unit 12.
  • the determination of the type and position of an object can be realized by known information processing techniques such as pattern recognition and machine learning.
  • the types of objects are, for example, other vehicles, people (e.g., cyclists and pedestrians), signs (e.g., road signs), road structures, road marks, etc.
  • the control unit 13 determining the type and position of an object is essentially the same as the control unit 13 detecting an object.
  • the control unit 13 illuminates the position of the object determined above with the headlights 4 at a brightness corresponding to the type of object determined above. Although details will be described later, the control unit 13 illuminates the object as part of the illumination area until the brightness of the illumination area gradually increased by the headlights 4 reaches a brightness corresponding to the type of object. Then, when the brightness of the illumination area gradually increased by the headlights 4 exceeds the brightness corresponding to the type of object, the control unit 13 illuminates the object and the illumination area at different brightnesses.
  • the control unit 13 configured as described above generates a light distribution control signal for illuminating an object with the headlight 4 at a brightness corresponding to the type of object based on the object information while gradually increasing the brightness of the illumination area with the headlight 4. In this way, the control unit 13 controls the light distribution of the headlight 4.
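The behaviour described above, a brightness ramp that is clamped per area at a cap chosen by object type, can be sketched as follows. The percentage values follow the examples given later in the description (B1 = 0% for other vehicles, B2 around 60% for signs, B3 around 80% for people, B4 = 100% where no object is detected), but the concrete numbers, names, and step size are assumptions for illustration only.

```python
from typing import Optional

# Assumed per-type brightness caps (percent); None = no object detected.
CAPS = {"other_vehicle": 0.0, "sign": 60.0, "person": 80.0, None: 100.0}

def area_brightness(ramp_brightness: float, object_type: Optional[str]) -> float:
    """Clamp the gradually increasing ramp brightness to the cap set
    for the type of object present in the area."""
    return min(ramp_brightness, CAPS[object_type])

def ramp(delta_b: float = 10.0):
    """Yield the overall ramp brightness, rising linearly by delta_b
    per step up to the standard brightness B4 (100 %)."""
    b = 0.0
    while b < 100.0:
        b = min(b + delta_b, 100.0)
        yield b
```

With the ramp at 70%, an area containing a sign would be held at 60% while an empty area follows the ramp at 70%, matching the "illuminate the object and the illumination area at different brightnesses" behaviour.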
  • FIGS. 2 to 5 are diagrams for explaining an example of control by the control unit 13 in this embodiment 1.
  • FIG. 2 is a diagram for explaining a series of controls in which the control unit 13 starts turning on the headlights 4 in response to a turn-on signal, and gradually increases the brightness of the headlights 4 while illuminating an object with the headlights 4 at a brightness corresponding to the type of object.
  • the horizontal axis of FIG. 2 indicates time, and the vertical axis indicates the light distribution control signal, i.e., the brightness of the headlights 4.
  • A linear slope with a gradient ΔB is set for the change in brightness of the headlights 4, and the brightness of the headlights 4 increases linearly.
  • brightness corresponding to the type of object is set for signs and people.
  • brightness B2 is set as the brightness of the headlight 4 corresponding to signs
  • brightness B3 is set as the brightness of the headlight 4 corresponding to people.
  • The brightness of the headlights 4 corresponding to the illumination area where no object is ultimately detected is set to brightness B4, whose upper limit is the standard brightness of the headlights 4.
  • The brightness B3 corresponding to people is greater than the brightness B2 corresponding to signs.
  • the brightness B4 of the headlight 4 corresponding to the illumination area where an object is not ultimately detected is greater than the brightnesses B2 and B3 corresponding to the object types.
  • brightness B4 is set to 100%.
  • Brightness B2 is set to a brightness that allows the vehicle driver to recognize the sign, within a range that is greater than the brightness at which the control unit 13 can detect the sign (e.g., 50%) and has an upper limit of brightness that does not cause glare to the vehicle driver or the like due to reflected light from the illuminated sign (e.g., 60%).
  • Brightness B3 is set to a brightness that allows the driver of the vehicle to recognize people, within a range that is greater than the brightness at which the control unit 13 can detect people (e.g., 70%) and has an upper limit of brightness that does not cause glare on illuminated people, etc. (e.g., 80%).
  • When the control unit 13 detects another vehicle based on the object information, it does not illuminate the other vehicle with the headlights 4. Therefore, the brightness B1 of the headlights 4 corresponding to the other vehicle shown in FIG. 2 is substantially the same as the brightness (0%) of the headlights 4 when they are turned off.
  • Figures 3, 4 and 5 are diagrams showing an example of a traffic scene as seen by a driver of a vehicle at a certain time.
  • Figures 3 to 5 show area A1 where there is another vehicle 31 traveling ahead of the vehicle, area A2 where there is a road sign 32, which is a sign, area A3 where there is a pedestrian 33, which is a person, and an area where no object is ultimately detected, for example area A4.
  • Before time t, the control unit 13 keeps the headlights 4 turned off.
  • FIG. 3 is a diagram showing a traffic scene at time t.
  • The control unit 13 detects an object based on the object information (the imaging signal in the present embodiment 1) acquired by the acquisition unit 12.
  • At time t, the headlights 4 are not yet turned on and the image indicated by the imaging signal is dark, so the control unit 13 cannot detect the road sign 32 and the pedestrian 33.
  • However, the tail lamps 31a, etc. of the other vehicle 31 are turned on, so the control unit 13 can detect the other vehicle 31 even from the dark image.
  • the control unit 13 is configured not to illuminate the other vehicle 31 and the area A1 with the headlights 4 when it is determined that the type of the object is the other vehicle 31 based on the object information. With this configuration, it is possible to suppress glare for the driver of the other vehicle 31.
  • FIG. 4 is a diagram showing a traffic scene at time t+1.
  • At time t+1, the illumination area other than area A1 is brighter than it is in FIG. 3.
  • By gradually increasing the brightness of the illumination area, it is possible to prevent the perceptual gap caused by a sudden change in light brightness, and therefore to reduce discomfort for the driver of the vehicle.
  • By gradually increasing the output of the headlights 4, it is also possible to reduce the load on the boost circuit, which is expected to relax the withstand-voltage requirements of the boost circuit.
  • If the control unit 13 determines that the type of object is a road sign 32 based on the object information, it sets brightness B2 for area A2 where the road sign 32 is present. At time t+1, when the brightness of area A2 reaches brightness B2 along with the other illuminated areas, it is maintained at brightness B2 without becoming brighter.
  • signs coated with retroreflective material have the advantage that they are easily recognized by drivers with only a small amount of light, but the disadvantage that they are easily dazzling when illuminated.
  • In conventional control, the brightness of the headlights 4 is first raised relatively high to detect the sign and then reduced. This causes a problem in that the driver of the vehicle experiences glare immediately after the headlights 4 are turned on.
  • brightness B2 is set to a brightness that allows the driver of the vehicle to recognize the sign, within a range that is greater than the brightness at which the control unit 13 can detect the sign and has an upper limit value of brightness that does not cause glare to the driver of the vehicle, etc. This makes it possible to achieve both safety in object recognition and suppression of glare for the driver of the vehicle, and further reduces power consumption.
  • From time t+1, the control unit 13 uses the headlights 4 to gradually brighten the illumination areas other than areas A1 and A2. During this time, the brightness of areas A3 and A4 gradually increases until the control unit 13 can detect the pedestrian 33. If the control unit 13 determines that the type of object is a pedestrian 33 based on the object information, it sets brightness B3 for area A3 where the pedestrian 33 is present. At time t+2, when the brightness of area A3, along with the other illumination areas, reaches brightness B3, it is maintained at brightness B3 without becoming brighter.
  • In conventional control, when the headlights 4 start to be turned on, their brightness is first raised relatively high to detect a person and then reduced. This causes a problem in that the person experiences glare immediately after the headlights 4 are turned on.
  • In the present embodiment 1, while the brightness of the illumination area is gradually increased by the headlights 4, that is, before the brightness of the illumination area reaches brightness B4, the headlights 4 illuminate the person at brightness B3 corresponding to a person. This makes it possible to suppress glare on the person.
  • brightness B3 is set to a brightness that allows the vehicle driver to recognize the person, within a range that is greater than the brightness at which the control unit 13 can detect the person and has an upper limit value of brightness that does not cause glare on people, etc. This makes it possible to achieve both safety in object recognition and suppression of glare on people, and further to reduce power consumption.
  • From time t+2 to time t+3, the control unit 13 gradually brightens the illumination areas other than areas A1 to A3 using the headlights 4.
  • FIG. 5 shows a traffic scene at time t+3. Because no new object types are detected, the control unit 13 sets brightness B4 for illumination areas where no objects are detected, such as area A4. Area A4, where no traffic participants (e.g., the driver of the vehicle, the driver of the other vehicle 31, and the pedestrian 33) are present, is illuminated at brightness B4, improving the visibility of the vehicle driver at night.
  • area A1 where other vehicles 31 exist is not illuminated, and area A2 where road signs 32 exist is illuminated with brightness B2 that is greater than brightness B1.
  • area A3 where pedestrians 33 exist is illuminated with brightness B3 that is greater than brightness B2, and area A4 where no object is detected is illuminated with brightness B4 that is greater than brightness B3.
  • <Operation> FIG. 6 is a flowchart showing an example of the operation of the headlamp control device 1 according to embodiment 1.
  • In step S1, the illuminance detection unit 2 detects the illuminance and outputs an illuminance signal, and the lighting determination unit 11 acquires the illuminance signal.
  • In step S2, the lighting determination unit 11 determines whether the illuminance indicated by the illuminance signal is equal to or less than a threshold value. If so, the process proceeds to step S3; if the illuminance is greater than the threshold value, the operation in FIG. 6 ends.
  • In step S3, the imaging unit 3 captures an image and generates an imaging signal as object information, and the acquisition unit 12 acquires the object information.
  • In step S4, the control unit 13 determines whether or not the type of an object present in the illumination area has been determined based on the object information. If the type has been determined, the process proceeds to step S5; otherwise, the process proceeds to step S6.
  • In step S5, the control unit 13 sets a brightness corresponding to the type of object for the area in which the object exists.
  • In step S6, the control unit 13 determines whether the headlights 4 are on. If they are on, the process proceeds to step S8; if not, the process proceeds to step S7.
  • In step S7, the control unit 13 turns on the headlights 4.
  • In step S8, it is determined whether the brightness of the illumination area has reached the brightness set for the illumination area where no object is ultimately detected (e.g., brightness B4 in FIG. 2). If it has, the operation in FIG. 6 ends; if not, the process proceeds to step S9.
  • In step S9, the control unit 13 increases the brightness of the illumination area by ΔB. However, if the type of an object has been determined and the brightness of the illumination area exceeds the brightness corresponding to that type, the control unit 13 illuminates the object with the headlights 4 at the brightness corresponding to the type of object. After that, the process returns to step S3.
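One pass of the loop in FIG. 6 can be condensed into a small function. This is a hedged sketch: the argument names, the brightness percentages, and the 1000 lx threshold are stand-ins of my own, and the actual device would of course drive real sensors and lamp hardware rather than pass plain values.

```python
def control_step(illuminance, headlights_on, area_brightness,
                 object_cap, delta_b=10.0, threshold=1000.0, b4=100.0):
    """Return (headlights_on, new_area_brightness) after one pass of
    steps S1-S9. `object_cap` is the brightness set for a detected
    object type (step S5), or None when no object type was determined."""
    if illuminance > threshold:              # S2: ambient light sufficient
        return headlights_on, area_brightness
    if not headlights_on:                    # S6/S7: turn the lights on
        headlights_on = True
    if area_brightness < b4:                 # S8/S9: keep ramping up by dB
        area_brightness = min(area_brightness + delta_b, b4)
    if object_cap is not None:               # S9: clamp at the object cap
        area_brightness = min(area_brightness, object_cap)
    return headlights_on, area_brightness
```

Calling the function repeatedly reproduces the ramp of FIG. 2: an area with a detected sign stops rising at its cap while an empty area continues toward B4.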
  • the object is illuminated by the headlights 4 at a brightness corresponding to the type of the object based on the object information.
  • the control unit 13 can easily determine the type of object, regardless of the brightness around the vehicle.
  • the brightness of the illuminated area is increased linearly, which reduces discomfort to the vehicle driver.
  • the headlights 4 are caused to illuminate the illumination area at a predetermined brightness. This allows the headlights 4 to be controlled in an open loop, simplifying the control of the headlight control device 1.
  • the brightness corresponding to a person is greater than the brightness corresponding to a sign, and if the type of object is determined to be another vehicle, the headlights 4 do not illuminate the other vehicle.
  • the object information is an imaging signal that enables the headlight control device 1 to determine the type and position information of an object, but is not limited to this.
  • the object information may be the determination result of the type and position of the object in the imaging unit 3.
  • the control unit 13 may use the object information from the imaging unit 3, i.e., the determination result of the type and position of the object by the imaging unit 3, as the determination result by the control unit 13.
  • the imaging unit 3 and the control unit 13 may work together to determine the type and position of an object.
  • the lighting determination unit 11 selectively determines whether the headlights 4 are on or off based on the illuminance signal from the illuminance detection unit 2, but this is not limited to the above.
  • the lighting determination unit 11 may selectively determine whether the headlights 4 are on or off based on the on/off state of the light switch included in the vehicle information, that is, based on the operation of the light switch.
  • In embodiment 1, when the control unit 13 determines that the type of object is another vehicle, it does not illuminate the other vehicle with the headlights 4. However, the control unit 13 may illuminate the other vehicle with the headlights 4 to an extent that can be regarded as not substantially illuminating it.
  • the object types were other vehicles, signs, and people, but are not limited to this.
  • the brightness corresponding to signs and people is not limited to the brightness described in the first embodiment.
  • the brightness with which the headlights 4 illuminate an object is set based on the type of object, but it may be changed as appropriate depending on the distance and direction of the object, etc.
  • the brightness described in the first embodiment may be set to the minimum brightness that can be perceived by the driver of the vehicle, or may be set to the minimum brightness at which the control unit 13 can detect an object.
  • The gradient ΔB of the change in brightness of the illumination area is a fixed value, but it may be changed based on the type of object, distance, direction, vehicle speed, etc.
  • The control unit 13 linearly increases the brightness of the illumination area according to a linear slope with gradient ΔB, but the brightness may also be increased in a stepped manner, i.e., in stages. With this configuration, the illumination area can be brightened quickly, and the type of object can be detected quickly.
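The stepped variant mentioned above can be sketched as a staircase generator. The step size and the number of ticks each level is held for are illustrative assumptions, not values from the disclosure.

```python
def stepped_ramp(step: float, hold: int, b_max: float = 100.0):
    """Yield brightness values that jump by `step` and hold each level
    for `hold` ticks, reaching b_max in fewer distinct levels than a
    fine-grained linear ramp."""
    b = 0.0
    while b < b_max:
        b = min(b + step, b_max)
        for _ in range(hold):
            yield b
```

A large step reaches full brightness in two levels, which is what allows objects to become detectable sooner than with a shallow linear slope.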
  • the control unit 13 may reduce the brightness of the object in the illumination area to the brightness corresponding to the type of object. For example, if a pedestrian is newly detected in the area A4 in FIG. 5 that is illuminated at brightness B4 without any object being detected, the control unit 13 may illuminate the area A4 with the headlights 4 at the brightness B3 corresponding to a pedestrian. Conversely, if an object that was previously present is no longer present, the control unit 13 may illuminate the area where the object was present with the headlights 4 at the illumination area brightness B4.
  • the above modified examples may be applied to embodiments other than the first embodiment.
  • <Second embodiment> In embodiment 1, the control unit 13 controls the brightness of the headlights 4 by luminance. That is, the brightness corresponding to an object is preset to a luminance that is assumed to enable the driver or the control unit 13 to recognize the object. However, unlike illuminance, luminance is not the brightness as received by the observer. For this reason, depending on the environment and conditions, the brightness of an object may be too high or too low for the driver or the control unit 13.
  • the brightness corresponding to the sign described in embodiment 1 may be insufficient for the driver or control unit 13.
  • the brightness corresponding to the person described in embodiment 1 may be insufficient for the driver or control unit 13.
  • When the headlights 4 are determined to be turned on but the brightness around the vehicle is somewhat high due to street lights or the like, the brightness corresponding to the area where no object is detected may be excessive for the driver or the control unit 13.
  • To solve this problem, the headlamp control device 1 according to embodiment 2 is configured to control the brightness of the headlamp 4 using illuminance rather than luminance.
  • the functional block diagram of the headlamp control device 1 according to the second embodiment is the same as the functional block diagram of the headlamp control device 1 according to the first embodiment (see FIG. 1).
  • the components that are the same as or similar to the components described above are given the same or similar reference numerals, and different components are mainly described.
  • the acquisition unit 12 acquires the illuminance of an object illuminated by the headlight 4 from the luminance of an image captured by the imaging unit 3.
  • the control unit 13 performs feedback control of the headlights 4 based on the illuminance acquired by the acquisition unit 12 and a predetermined illuminance so that the acquired illuminance is substantially the same as the predetermined illuminance. In other words, if the illuminance of an object acquired by the acquisition unit 12 is lower than the predetermined illuminance, the control unit 13 increases the brightness of the object. On the other hand, if the illuminance of an object acquired by the acquisition unit 12 is higher than the predetermined illuminance, the control unit 13 decreases the brightness of the object.
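The feedback control of embodiment 2 can be illustrated with a simple proportional update: raise the output when the measured illuminance of the object is below the predetermined target, lower it when above. The gain value and the 0-100% clamping are assumptions for the sketch; the disclosure does not specify a control law.

```python
def feedback_step(measured_lux: float, target_lux: float,
                  output: float, gain: float = 0.05) -> float:
    """One proportional feedback update for the headlight output (%):
    increase it when the object is darker than the target illuminance,
    decrease it when brighter, clamped to the 0..100 % range."""
    output += gain * (target_lux - measured_lux)
    return max(0.0, min(100.0, output))
```

Iterating this step drives the measured illuminance toward the target, which is the "substantially the same as the predetermined illuminance" condition described above.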
  • FIG. 7 is a flowchart showing an example of the operation of the headlamp control device 1 according to embodiment 2.
  • the flowchart in Fig. 7 is similar to the flowchart in Fig. 6 except that step S11 is added between step S7 and step S8 and step S9 is changed to step S12. Therefore, steps S11 and S12 will be mainly described below.
  • step S11 the imaging unit 3 captures an image, and the acquisition unit 12 acquires the illuminance of the object illuminated by the headlight 4 based on the image. Then, the process proceeds to step S8.
  • step S12 the control unit 13 performs substantially the same control as in step S9 in FIG. 6. However, the control unit 13 performs feedback control based on the illuminance acquired in step S11. After that, the process returns to step S3.
  • the headlamp 4 is feedback-controlled based on the acquired illuminance and the predetermined illuminance. With this configuration, it is possible to suppress the influence of the environment on the brightness of an object, and therefore it is possible to optimize the brightness of the object.
  • <Third embodiment> FIG. 8 is a functional block diagram showing a main part of the headlamp control device 1 according to the present embodiment 3.
  • Among the components according to the present embodiment 3, those that are the same as or similar to the components described above are given the same or similar reference numerals, and the differing components will be mainly described.
  • The configuration of FIG. 8 is the same as that of FIG. 1, except that a risk determination unit 13a is added to the control unit 13. Note that the risk determination unit 13a may be provided outside the control unit 13 so as to be able to communicate with the control unit 13.
  • the risk determination unit 13a determines whether or not there is a predetermined possibility of a collision between the vehicle and an object. An example of the risk determination unit 13a is described below.
  • The risk determination unit 13a tracks the movement of the object. Tracking the movement of an object can be achieved by known information processing technology, such as multi-object tracking, which tracks the movements of multiple specific objects in a moving image.
  • the risk determination unit 13a determines whether or not there is a predetermined possibility of collision between the vehicle and the object based on the tracking of the object.
  • Figures 9 to 11 are diagrams for explaining an example of collision possibility determination by the risk determination unit 13a.
  • Figures 9 to 11 show a roadway 36, a vehicle 37 traveling on the roadway 36, and an object 38 with which the vehicle 37 may collide.
  • In the example of FIG. 9, the danger determination unit 13a predicts a collision between the two based on the traveling direction and speed of the vehicle 37 and the traveling direction and speed of the object 38. Then, when the danger determination unit 13a determines that the two will collide, it determines that there is a predetermined possibility of a collision.
  • In the example of FIG. 10, the danger determination unit 13a likewise predicts a collision between the two based on the traveling direction and speed of the vehicle 37 and the traveling direction and speed of the object 38, and determines that there is a possibility of a collision when it determines that the two will collide.
  • In the example of FIG. 11, the danger determination unit 13a determines whether an object 38 whose brightness is limited exists within a predetermined distance from the roadway 36 or from the area in the traveling direction of the vehicle 37, without predicting a collision as described for FIGS. 9 and 10. When the danger determination unit 13a determines that the object 38 exists within the predetermined distance, as in FIG. 11, it determines that there is a possibility of a collision. For example, if the object 38 is a pedestrian checking the traffic before crossing the road, the object 38 exists within the predetermined distance from the roadway 36 or from the area in the traveling direction of the vehicle 37, and the danger determination unit 13a can therefore determine that there is a possibility of a collision.
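The two determination strategies above can be sketched as follows. This is an illustrative Python sketch under assumed names and thresholds (prediction horizon, collision radius, lateral margin), not the claimed method: a closest-approach prediction for the FIG. 9/10 style, and a simple proximity test for the FIG. 11 style.

```python
def predicts_collision(veh_pos, veh_vel, obj_pos, obj_vel,
                       horizon=5.0, radius=1.5):
    """FIGS. 9-10 style check: extrapolate both straight-line tracks and
    report whether they pass within `radius` metres inside `horizon` seconds."""
    rx, ry = obj_pos[0] - veh_pos[0], obj_pos[1] - veh_pos[1]   # relative position
    vx, vy = obj_vel[0] - veh_vel[0], obj_vel[1] - veh_vel[1]   # relative velocity
    speed2 = vx * vx + vy * vy
    if speed2 == 0.0:                                           # no relative motion
        return rx * rx + ry * ry <= radius * radius
    t = -(rx * vx + ry * vy) / speed2                           # time of closest approach
    t = max(0.0, min(horizon, t))
    cx, cy = rx + vx * t, ry + vy * t                           # separation at that time
    return cx * cx + cy * cy <= radius * radius

def near_path(obj_pos, path_y=0.0, margin=2.0):
    """FIG. 11 style check: object within a predetermined lateral distance
    of the roadway / travel direction, e.g. a pedestrian about to cross."""
    return abs(obj_pos[1] - path_y) <= margin
```

Either predicate returning true would correspond to the danger determination unit 13a deciding that a collision possibility exists.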
  • For an object for which it is determined that there is a collision possibility, the control unit 13 in FIG. 8 increases the upper limit value of the brightness corresponding to the type of the object. For example, the danger determination unit 13a determines whether or not there is a predetermined collision possibility between the vehicle and the object after the area in which the object exists has been illuminated with a brightness corresponding to the type of the object. Then, when it is determined that there is a collision possibility, the control unit 13 increases the brightness of the object by increasing the upper limit value of the brightness corresponding to the object.
  • For example, the control unit 13 outputs a light distribution control signal to the headlights 4 so that the brightness corresponding to the type of the object determined to be a potential collision target rises steeply and stepwise up to the raised upper limit. The brightness corresponding to the type of the object thus rises sharply, creating a perceptual contrast for the driver of the vehicle and drawing the driver's attention to the object. As long as this can be achieved, the manner of presenting an object determined to be a potential collision target is not limited to this.
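The contrast between the gradual ramp used for ordinary objects and the steep, stepwise rise used for a potential collision target can be sketched as follows; the frame counts and the two-step profile are illustrative assumptions, not values from this disclosure.

```python
def brightness_schedule(limit, frames, stepped=False):
    """Per-frame brightness targets: a smooth ramp for ordinary objects,
    or a steep two-step jump for an object judged to be a collision risk."""
    if stepped:
        half = frames // 2
        # coarse jumps are perceptually salient and attract the driver's eye
        return [limit * 0.5] * half + [limit] * (frames - half)
    # smooth ramp avoids glare while the area is brightened
    return [limit * (k + 1) / frames for k in range(frames)]
```

The control unit would emit one light distribution control signal per entry of the returned schedule.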
  • Fig. 12 is a flowchart showing the operation of the headlamp control device 1 according to the third embodiment.
  • The flowchart in Fig. 12 is similar to the flowchart in Fig. 6, except that step S21 is added between steps S4 and S5, and steps S22 and S23 are added between steps S5 and S6. For this reason, steps S21 to S23 will be mainly described below.
  • In step S21, if the object is the same as the previously detected object, the danger determination unit 13a tracks the movement of the object. Then, the process proceeds to step S5.
  • In step S22, the danger determination unit 13a determines whether or not there is a predetermined possibility of a collision between the vehicle and the object based on the tracking result. If it is determined that there is a predetermined possibility of a collision, the process proceeds to step S23; otherwise, the process proceeds to step S6.
  • In step S23, the control unit 13 increases the upper limit of the brightness corresponding to the type of the object for which it has been determined that there is a predetermined collision possibility. Then, the process proceeds to step S6.
  • According to the headlamp control device 1 of the third embodiment described above, for an object determined to have a collision possibility, the upper limit value of the brightness corresponding to the type of the object is increased. With this configuration, the driver of the vehicle can easily recognize an object that may interfere with the traveling of the vehicle.
  • In the third embodiment, the control unit 13 increases the upper limit value of the brightness corresponding to the type of the object for which a collision possibility has arisen, but the upper limit value may instead be removed. That is, the control unit 13 may set the upper limit value of the brightness of an object for which a collision possibility has arisen to be the same as the brightness of an illumination area in which no object is detected. In this case, the control unit 13 may increase the brightness corresponding to the type of the object steeply and stepwise.
  • <Fourth embodiment> When the brightness of the illumination area already exceeds the brightness corresponding to the type of an object and the object is newly detected, the control unit 13 reduces the brightness of the object in the illumination area to the brightness corresponding to the type of the object. In this case, glare may occur at the moment the object enters the illumination area, before its brightness is reduced.
  • The headlamp control device 1 according to the fourth embodiment described below is configured to be able to solve such a problem.
  • FIG. 13 is a functional block diagram showing the main parts of the headlamp control device 1 according to the fourth embodiment.
  • Components that are the same as or similar to the components described above are given the same or similar reference numerals, and different components are mainly described.
  • The configuration in FIG. 13 is the same as the configuration in FIG. 1, except that the detection unit 5 is added and connected to the control unit 13.
  • The detection unit 5 detects objects that exist farther away than the illumination area of the vehicle, that is, objects that at night exist beyond the detection range of the imaging unit 3, without relying on the illumination of the headlights 4, and generates object detection information including the detection result.
  • The detection of an object by the detection unit 5 includes detection of the position of the object.
  • The detection unit 5 is, for example, a distance measurement sensor mounted on the vehicle, such as a LiDAR (Light Detection and Ranging), a ToF (Time of Flight) camera, an infrared camera, a millimeter wave radar, or an ultrasonic sensor.
  • However, the detection unit 5 is not limited to a distance measurement sensor mounted on the vehicle.
  • For example, the detection unit 5 may be a communication device that acquires, using V2V (Vehicle-to-Vehicle) technology, object detection information detected by a sensor of another vehicle.
  • The detection unit 5 may also be a communication device that acquires, using V2X (Vehicle-to-Everything) technology, object detection information detected by devices external to the vehicle, such as infrastructure sensors including roadside cameras and roadside LiDAR, location information from mobile phones, and dynamic maps.
  • The acquisition unit 12 acquires the object detection information from the detection unit 5. When the control unit 13 detects, based on the acquired object detection information, that an object exists farther away from the vehicle than the illumination area, the control unit 13 darkens the portion of the illumination area into which the object will enter in the future by lowering the upper limit value of the brightness of that portion.
  • For example, the portion into which the object will enter in the future may be the portion of the illumination area that is closest to the position of the object indicated in the object detection information.
  • Alternatively, the portion into which the object will enter in the future may be the portion of the illumination area that is determined, by a determination similar to that made by the danger determination unit 13a according to the third embodiment, based on the position and speed of the object indicated in the object detection information.
  • FIG. 14 is a top view for explaining an example of control by the control unit 13 according to the fourth embodiment.
  • In FIG. 14, a vehicle 37, a road sign 32, a pedestrian 33, an illumination area 39 indicated by a thick line, and a detection area 40 are illustrated.
  • Strictly speaking, the illumination area 39 illuminated by the headlights 4 is different from the area in which the control unit 13 can detect objects based on the object information, but for simplicity the two are treated as the same here.
  • The detection area 40 is the area in which the detection unit 5 can detect objects. Here, it is assumed that the detection unit 5 is a millimeter wave radar and that the distance from the vehicle 37 to the edge of the detection area 40 is greater than the distance from the vehicle 37 to the edge of the illumination area 39.
  • The detection unit 5 detects the road sign 32 and the pedestrian 33 and outputs object detection information to the control unit 13. Based on the object information from the imaging unit 3 and the object detection information from the detection unit 5, the control unit 13 detects the road sign 32 and the pedestrian 33, which are not present in the illumination area 39 but are present in the detection area 40.
  • The control unit 13 then calculates the portions 39a and 39b of the illumination area 39 into which the road sign 32 and the pedestrian 33 will enter in the future, and makes the brightness of the portions 39a and 39b darker than the brightness of the rest of the illumination area 39.
  • When the road sign 32 and the pedestrian 33 come closer to the vehicle 37 as the vehicle 37 travels, they enter the illumination area 39, and the control unit 13 detects the road sign 32 and the pedestrian 33 present in the illumination area 39 based on the object information from the imaging unit 3. When such a detection result is obtained, the control unit 13 illuminates the road sign 32 and the pedestrian 33 with the headlights 4 at a brightness corresponding to each of them, as in the first embodiment.
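The pre-dimming of entry portions described above can be sketched as follows. This illustrative Python sketch uses the simplest rule from the fourth embodiment (the portion of the illumination area nearest to the detected object); the data layout and the dimmed limit value are assumptions for illustration only.

```python
def entry_portion(illum_portions, obj_pos):
    """Pick the portion of the illumination area nearest to the detected
    object's position, as the portion the object will enter in the future."""
    def dist2(portion):
        cx, cy = portion["center"]
        return (cx - obj_pos[0]) ** 2 + (cy - obj_pos[1]) ** 2
    return min(illum_portions, key=dist2)

def predim(illum_portions, detections, dim_limit=0.3):
    """Lower the brightness upper limit of each portion an object is expected
    to enter, before the object actually reaches the illumination area."""
    for obj_pos in detections:
        entry_portion(illum_portions, obj_pos)["limit"] = dim_limit
    return illum_portions
```

Once the object is actually detected inside the illumination area by the camera, control would hand back to the per-type brightness logic of the first embodiment.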
  • Fig. 15 is a flowchart showing the operation of the headlamp control device 1 according to the fourth embodiment.
  • The flowchart in Fig. 15 is similar to the flowchart in Fig. 6, except that steps S31 to S33 are added between steps S5 and S6. Therefore, steps S31 to S33 will be mainly described below.
  • In step S31, the detection unit 5 generates object detection information, and the acquisition unit 12 acquires the object detection information.
  • In step S32, the control unit 13 determines, based on the object information and the object detection information, whether the detected object will enter the illumination area in the future. For example, if an object is present in a part of the detection area of the detection unit 5 that is outside the illumination area and within a predetermined distance from the illumination area, the control unit 13 determines that the object will enter the illumination area in the future. If it is determined that the detected object will enter the illumination area in the future, the process proceeds to step S33; otherwise, the process proceeds to step S6.
  • In step S33, the control unit 13 lowers the upper brightness limit of the portion of the illumination area into which the object will enter in the future. Then, the process proceeds to step S6.
  • According to the above operation, if an object is present in a part of the detection area of the detection unit 5 that is relatively close to the illumination area, the brightness of the corresponding portion of the illumination area is dimmed, whereas if an object is present in a part relatively far from the illumination area, the brightness of the corresponding portion of the illumination area is maintained.
  • According to the headlight control device 1 of the fourth embodiment described above, when it is detected that an object exists farther away than the illumination area of the vehicle, the brightness of the portion of the illumination area into which the object will enter in the future is dimmed. With this configuration, even if a new object enters the illumination area due to the movement of the vehicle after the headlights 4 are turned on, the glare caused by the entry of the new object can be suppressed.
  • Note that the detection unit 5 may be a sensor capable of determining the type of an object to some extent, or a receiver capable of receiving a determination result of the type of the object, as described in the following fifth embodiment.
  • In that case, the control unit 13 may change the brightness of the portion of the illumination area into which the object will enter in the future based on the determination result.
  • <Fifth embodiment> In the embodiments above, the imaging unit 3 is a camera that captures visible light images, such as a forward monitoring camera. However, images from such a camera are easily affected by the environment outside the vehicle (e.g., rain, snow, or fog), which may reduce the reliability of the object type determination by the control unit 13.
  • The headlamp control device 1 according to the following fifth embodiment is configured to be able to solve such a problem.
  • The functional block diagram of the headlamp control device 1 according to the fifth embodiment is the same as that of the headlamp control device 1 according to the fourth embodiment (see FIG. 13).
  • Components that are the same as or similar to the components described above are given the same or similar reference numerals, and different components are mainly described.
  • The detection unit 5 in the fifth embodiment is a millimeter wave radar, that is, a distance measurement sensor mounted on the vehicle that does not depend on visible light, and it generates information regarding the type and position of an object as an object detection result.
  • The determination of the type of an object by the millimeter wave radar can be realized by conventional technology such as micro-Doppler analysis. A pedestrian moves in a characteristic way while walking, so the movement of the hands and feet appears as a Doppler shift in the frequency components of the exploration waves reflected from the pedestrian. By detecting this Doppler shift, the millimeter wave radar can distinguish not only relatively stationary objects, relatively moving objects, and unidentified objects, but also pedestrians.
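As a toy illustration of the micro-Doppler idea (not a real radar signal chain), the spread of Doppler shifts across successive returns can separate a walking pedestrian, whose limbs modulate the shift, from a rigid object; the thresholds below are assumed values for illustration only.

```python
from statistics import pstdev

def classify_by_micro_doppler(doppler_hz):
    """Coarse classification from a short history of Doppler shifts (Hz):
    swinging limbs spread the shifts of a pedestrian's returns, while a
    rigid object yields a nearly constant shift around its bulk velocity."""
    spread = pstdev(doppler_hz)              # limb-induced modulation
    mean = sum(doppler_hz) / len(doppler_hz) # bulk relative velocity cue
    if spread > 20.0:
        return "pedestrian"
    if abs(mean) < 5.0:
        return "stationary object"
    return "moving object"
```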
  • The control unit 13 determines the type of an object based on the object information of the imaging unit 3 acquired by the acquisition unit 12 and the object detection result of the detection unit 5 (a millimeter wave radar in the fifth embodiment).
  • Figure 16 is a diagram for explaining an example of the determination by the control unit 13 according to the fifth embodiment.
  • Figure 16 illustrates a vehicle 37, an object 38a which is a road sign, an object 38b which is a pedestrian, an illumination area 39 indicated by a thick line, and a detection area 40.
  • Strictly speaking, the illumination area 39 illuminated by the headlights 4 is different from the area in which the control unit 13 can detect objects based on the object information, but for simplicity the two are treated as the same here.
  • The detection area 40 is the area in which the millimeter wave radar serving as the detection unit 5 can detect objects.
  • It is assumed that the millimeter wave radar serving as the detection unit 5 can distinguish at least stationary objects, moving objects, unidentified objects, and pedestrians.
  • The acquisition unit 12 acquires the object information from the imaging unit 3 and the detection result from the detection unit 5, and the control unit 13 determines the type of an object based on the object information from the imaging unit 3 and the detection result from the detection unit 5.
  • For example, assume that the control unit 13 determines, based on the object information from the imaging unit 3, that the object 38a is a road sign and the object 38b is a pedestrian, and that the reliability of the determination based on the object information is greater than a threshold. In this case, illumination is performed in the same manner as in the first embodiment.
  • Next, assume that the control unit 13 has determined, based on the object information of the imaging unit 3, that the object 38a is a road sign and the object 38b is a pedestrian, but the reliability of the determination based on the object information is equal to or below the threshold.
  • In this case, the control unit 13 gradually increases the illumination of the headlights 4 up to a brightness B5 to illuminate the object 38a, and gradually increases the illumination of the headlights 4 up to the brightness B3 in FIG. 2 to illuminate the object 38b.
  • Here, the brightness B5 is set to a brightness that does not cause problems such as glare, regardless of the type of the object.
  • If the detection result of the detection unit 5 contradicts the determination based on the object information, the control unit 13 judges that the determination based on the object information is incorrect. Then, the control unit 13 illuminates the objects 38a and 38b with the same brightness as the illumination area where no object is detected.
  • If, for example, the control unit 13 is unable to determine the type of an object based on the object information from the imaging unit 3 due to severely poor visibility caused by the weather, it illuminates the objects 38a and 38b with a brightness corresponding to the type of the object indicated by the detection result of the detection unit 5.
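The decision order described above can be sketched as follows. This is an illustrative Python sketch; the function name, confidence threshold, and the glare-safe level standing in for brightness B5 are assumptions, not values from this disclosure.

```python
def choose_brightness(cam_type, cam_conf, radar_type, limits,
                      conf_threshold=0.8, safe_limit=0.5):
    """Fusion order of the fifth embodiment (values assumed): trust the
    camera when its confidence is high; fall back to a glare-safe level
    when confidence is low; use the radar's coarse class when the camera
    yields nothing; otherwise use the no-object brightness."""
    if cam_type is not None and cam_conf > conf_threshold:
        return limits[cam_type]              # behaviour of the first embodiment
    if cam_type is not None:
        return safe_limit                    # low confidence: brightness B5
    if radar_type is not None:
        return limits.get(radar_type, safe_limit)
    return limits["none"]                    # no object detected in the area
```

Used this way, the visible light camera and the radar complement each other exactly as argued in the surrounding paragraphs.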
  • A visible light camera, which captures images using visible light, can identify a large number of object types, but is susceptible to the effects of poor visibility and the like.
  • A distance measurement sensor is less susceptible to the effects of poor visibility and the like, but the number of object types it can identify is smaller than that of a visible light camera.
  • Therefore, the headlamp control device 1 determines the type of an object based on the object information acquired by the acquisition unit 12 and the information acquired by the distance measurement sensor serving as the detection unit 5.
  • In this way, the camera that captures visible light images and the distance measurement sensor can be used complementarily, thereby improving the reliability of the object type determination.
  • In the fifth embodiment, the distance measurement sensor serving as the detection unit 5 is a millimeter wave radar, but it is not limited to this; it may be a LiDAR, a ToF camera, an infrared camera, an ultrasonic sensor, or the like, and is not limited to a single sensor.
  • Likewise, the types of objects that can be detected by the distance measurement sensor are not limited to stationary objects, moving objects, unidentified objects, and pedestrians.
  • Furthermore, the detection unit 5 is not limited to a distance measurement sensor such as a millimeter wave radar; it may be a communication device that obtains information regarding the type and position of an object through communication from a device external to the vehicle using V2V technology or V2X technology.
  • A visible light camera can make determinations even when there are no other vehicles or infrastructure around the vehicle, but it is susceptible to the effects of poor visibility.
  • V2V or V2X technology is less susceptible to the effects of poor visibility, but cannot function unless there are other vehicles or infrastructure around the vehicle.
  • Thus, the camera that captures visible light images and the communication device can also be used complementarily, thereby improving the reliability of the object type determination.
  • <Other modifications> The acquisition unit 12 and the control unit 13 in FIG. 1 described above are hereinafter referred to as the "acquisition unit 12, etc."
  • The acquisition unit 12, etc. are realized by the processing circuit 81 shown in FIG. 17. That is, the processing circuit 81 includes the acquisition unit 12, which acquires object information related to an object from a camera when the object exists in an area illuminated by the headlights of a vehicle, and the control unit 13, which illuminates the object with the headlights at a brightness corresponding to the type of the object based on the object information while gradually increasing the brightness of the area with the headlights.
  • The processing circuit 81 may be dedicated hardware, or may be a processor that executes a program stored in a memory. Examples of the processor include a central processing unit, a processing unit, an arithmetic unit, a microprocessor, a microcomputer, and a DSP (Digital Signal Processor).
  • When the processing circuit 81 is dedicated hardware, the processing circuit 81 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination of these.
  • The functions of the respective units such as the acquisition unit 12 may each be realized by separate processing circuits, or may be realized together by a single processing circuit.
  • When the processing circuit 81 is a processor, the functions of the acquisition unit 12, etc. are realized in combination with software or the like.
  • The software or the like corresponds to, for example, software, firmware, or software and firmware.
  • The software or the like is written as a program and stored in a memory. As shown in FIG. 18, the processor 82 applied to the processing circuit 81 realizes the functions of each unit by reading and executing a program stored in the memory 83.
  • That is, the headlight control device 1 includes the memory 83 for storing a program that, when executed by the processing circuit 81, results in the execution of the steps of acquiring object information about an object from a camera when the object is present in an area illuminated by the headlights of the vehicle, and illuminating the object with the headlights at a brightness corresponding to the type of the object based on the object information while gradually increasing the brightness of the area with the headlights.
  • In other words, this program can be said to cause a computer to execute the procedures and methods of the acquisition unit 12 and the like.
  • Here, the memory 83 may be, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), or may be an HDD (Hard Disk Drive), a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD (Digital Versatile Disc), a drive device for these, or any storage medium to be used in the future.
  • The above describes a configuration in which the functions of the acquisition unit 12, etc. are realized by either hardware or software or the like. However, the configuration is not limited to this, and part of the acquisition unit 12, etc. may be realized by dedicated hardware while another part is realized by software or the like.
  • For example, the function of the acquisition unit 12 can be realized by the processing circuit 81 serving as dedicated hardware, and the other functions can be realized by the processor 82 of the processing circuit 81 reading and executing a program stored in the memory 83.
  • As described above, the processing circuit 81 can realize each of the above functions by hardware, software or the like, or a combination of these.
  • Note that the embodiments and modifications can be freely combined, and each embodiment and each modification can be modified or omitted as appropriate.

Abstract

The purpose of the present invention is to provide a technology that makes it possible to suppress the occurrence of glare for a driver of a vehicle while illuminating a region. This headlight control device comprises an acquisition unit and a control unit. When an object is present in a region illuminated by a headlight of a vehicle, the acquisition unit acquires, from a camera, object information pertaining to the object. While using the headlight to gradually increase the brightness in the region, the control unit uses the headlight to illuminate the object with a brightness corresponding to a type of the object, on the basis of the object information acquired by the acquisition unit.

Description

Headlamp control device and headlamp control method

This disclosure relates to a headlamp control device and a headlamp control method.

Auto light and auto high beam have been proposed as technologies that automatically turn a vehicle's headlights on and off based on the illuminance around the vehicle, for example at night or when the vehicle enters a tunnel or passes under an overpass (for example, Patent Document 1).

Also, to reduce glare for pedestrians and others, a technology that partially dims or blocks the headlight illumination, such as the adaptive driving beam (ADB), has been proposed (for example, Patent Document 2).

JP 2021-195042 A; Japanese Patent No. 5438410
However, with conventional technology, the headlights turn on and their brightness rises sharply when the headlights start to turn on due to a decrease in illuminance around the vehicle, or when dimming or blocking is released in the ADB. This causes problems such as glare for pedestrians and glare for the driver due to retroreflection from objects such as signs that contain reflective material.

The present disclosure has been made in view of the above problems, and aims to provide technology that can suppress glare for the driver of the vehicle while brightening an area.

The headlight control device according to the present disclosure includes an acquisition unit that acquires object information about an object from a camera when the object is present in an area illuminated by the headlights of a vehicle, and a control unit that, while gradually increasing the brightness of the area with the headlights, illuminates the object with the headlights at a brightness corresponding to the type of the object based on the object information.

According to the present disclosure, while the brightness of the area is gradually increased by the headlights, the object is illuminated by the headlights at a brightness corresponding to the type of the object based on the object information. With this configuration, glare for the driver of the vehicle can be suppressed while the area is being brightened.

The objects, features, aspects and advantages of the present disclosure will become more apparent from the following detailed description and the accompanying drawings.
FIG. 1 is a functional block diagram showing a main part of a headlamp control device according to a first embodiment.
FIGS. 2 to 5 are diagrams for explaining examples of control of the headlamp control device according to the first embodiment.
FIG. 6 is a flowchart showing an example of the operation of the headlamp control device according to the first embodiment.
FIG. 7 is a flowchart showing an example of the operation of a headlamp control device according to a second embodiment.
FIG. 8 is a functional block diagram showing a main part of a headlamp control device according to a third embodiment.
FIGS. 9 to 11 are diagrams for explaining examples of determination by the headlamp control device according to the third embodiment.
FIG. 12 is a flowchart showing an example of the operation of the headlamp control device according to the third embodiment.
FIG. 13 is a functional block diagram showing a main part of a headlamp control device according to a fourth embodiment.
FIG. 14 is a top view for explaining an example of control of the headlamp control device according to the fourth embodiment.
FIG. 15 is a flowchart showing an example of the operation of the headlamp control device according to the fourth embodiment.
FIG. 16 is a top view for explaining an example of control of a headlamp control device according to a fifth embodiment.
FIGS. 17 and 18 are block diagrams showing hardware configurations of a headlamp control device according to other modifications.
<First embodiment>
FIG. 1 is a functional block diagram showing a main part of the headlamp control device 1 according to the first embodiment. The headlamp control device 1 according to the first embodiment is mounted on a vehicle such as an automobile. In the following description, the vehicle on which the headlamp control device 1 is mounted may be referred to simply as the "vehicle," and a vehicle other than that vehicle may be referred to as an "other vehicle."
As shown in FIG. 1, the headlamp control device 1 is communicably connected to an illuminance detection unit 2, an imaging unit 3, and a headlamp 4.
The headlamp 4 includes lamps having functions such as a low beam, a high beam, or a spot beam, and has an ADB function capable of adjusting the illumination range and illumination amount of the headlamp 4. The headlamp 4 configured in this manner can emit light in various light distribution patterns based on a light distribution control signal from the headlamp control device 1.
The imaging unit 3 is, for example, a forward monitoring camera that captures visible-light images of the area ahead of the vehicle. The images are, for example, moving images. The imaging unit 3 captures an image of the area ahead of the vehicle, generates object information based on the image, and outputs it to the headlamp control device 1. The object information is information about an object present in the area illuminated by the headlamp 4 (hereinafter also referred to as the "illumination area").
In the first embodiment, the object information is information from which the headlamp control device 1 can determine the type and position of an object; as an example, it is described as an imaging signal representing an image of the area ahead of the vehicle. However, as described later, the object information is not limited to an imaging signal.
The illuminance detection unit 2 is, for example, an illuminance sensor. In the first embodiment, the illuminance detection unit 2 outputs the illuminance around the vehicle to the headlamp control device 1 as an illuminance signal. The illuminance signal is not limited to this; for example, the illuminance detection unit 2 may output, as the illuminance signal, the luminance of an image captured by a camera of the vehicle. In that case, the function of the imaging unit 3 can be realized by the illuminance detection unit 2.
Although not shown, the headlamp control device 1 is connected to a computer network within the vehicle (for example, a CAN (Controller Area Network)) and can acquire various information (hereinafter referred to as "vehicle information") from the vehicle as appropriate. The vehicle information includes, for example, information indicating whether a light switch is on or off. The light switch is configured to switch the headlamp 4 between its lit and unlit states, and to switch whether control of the headlamp 4 is executed automatically. For example, when the light switch is operated to enable automatic control of the headlamp 4, the headlamp control device 1 executes automatic control of the headlamp 4.
Next, the headlamp control device 1 will be described. The headlamp control device 1 acquires the illuminance signal relating to the illuminance around the vehicle from the illuminance detection unit 2, acquires the object information from the imaging unit 3, and, based on the illuminance signal and the object information, outputs a light distribution control signal for controlling the headlamp 4 to the headlamp 4. The headlamp control device 1 can thereby automatically control the light distribution of the headlamp 4 of the vehicle. The headlamp control device 1 in FIG. 1 includes a lighting determination unit 11, an acquisition unit 12, and a control unit 13.
The lighting determination unit 11 selectively determines whether to turn the headlamp 4 on or off based on the illuminance signal from the illuminance detection unit 2. When the illuminance indicated by the illuminance signal is equal to or less than a threshold, the lighting determination unit 11 outputs a turn-on signal to the control unit 13; when the illuminance is greater than the threshold, it outputs a turn-off signal to the control unit 13.
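The on/off determination above is a simple threshold comparison. As an illustration only (the disclosure does not prescribe code, and the threshold value here is a hypothetical example), it can be sketched as:

```python
def lighting_decision(illuminance_lux, threshold_lux=1000.0):
    """Return 'on' when ambient illuminance is at or below the threshold,
    'off' otherwise. threshold_lux is an illustrative value, not from the text."""
    return "on" if illuminance_lux <= threshold_lux else "off"
```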
The acquisition unit 12 acquires the object information (in the first embodiment, the imaging signal) from the imaging unit 3, which is a camera.
The control unit 13 starts lighting control of the headlamp 4 when the turn-on signal is output from the lighting determination unit 11. When lighting control starts, the control unit 13 gradually increases the brightness of the area illuminated by the headlamp 4, that is, the brightness of the illumination area. In the first embodiment, the control unit 13 controls the brightness of the headlamp 4 by its luminance, so the brightness described in the first embodiment is substantially the same as luminance. The control unit 13 linearly increases the brightness of the illumination area by causing the headlamp 4 to illuminate the illumination area at a plurality of predetermined luminances in order from the darkest.
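The gradual, linear rise in illumination-area brightness amounts to a ramp capped at a maximum level. A minimal sketch, in which the slope and cap values are illustrative assumptions rather than values from the disclosure:

```python
def ramp_brightness(elapsed_s, slope_per_s=0.2, b_max=1.0):
    """Brightness rises linearly with time at slope_per_s (fraction of full
    beam per second) and is held at b_max once reached."""
    return min(slope_per_s * elapsed_s, b_max)
```

A steeper slope shortens the ramp but weakens the effect described below of avoiding an abrupt brightness change.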
In parallel with the above processing, the control unit 13 determines the type and position of an object present in the illumination area based on the object information (in the first embodiment, the imaging signal) acquired by the acquisition unit 12. The type and position of an object can be determined by known information processing techniques such as pattern recognition and machine learning. Object types include, for example, other vehicles, people (for example, cyclists and pedestrians), signs (for example, road signs), road structures, and road markings. In the first embodiment, the control unit 13 determining the type and position of an object is substantially the same as the control unit 13 detecting the object.
The control unit 13 causes the headlamp 4 to illuminate the position of the object determined as described above at a brightness corresponding to the determined type of the object. As will be described in detail later, the control unit 13 illuminates the object as part of the illumination area until the brightness of the illumination area, which is gradually increased by the headlamp 4, reaches the brightness corresponding to the type of the object. When the brightness of the illumination area would exceed the brightness corresponding to the type of the object, the control unit 13 illuminates the object and the rest of the illumination area at different brightnesses.
The control unit 13 configured as described above generates, while gradually increasing the brightness of the illumination area with the headlamp 4, a light distribution control signal for illuminating the object with the headlamp 4 at the brightness corresponding to the type of the object, based on the object information. The control unit 13 thereby controls the light distribution of the headlamp 4.
FIGS. 2 to 5 are diagrams for explaining an example of control by the control unit 13 according to the first embodiment.
FIG. 2 is a diagram for explaining a series of control operations in which the control unit 13 starts turning on the headlamp 4 in response to the turn-on signal and, while gradually increasing the brightness of the headlamp 4, illuminates an object with the headlamp 4 at the brightness corresponding to the type of the object. The horizontal axis of FIG. 2 indicates time, and the vertical axis indicates the light distribution control signal, that is, the brightness of the headlamp 4.
In the example of FIG. 2, a linear slope with a gradient ΔB is set for the change in brightness of the headlamp 4, so the brightness of the headlamp 4 increases linearly.
In the first embodiment, brightnesses corresponding to the object types are set for signs and people. Specifically, a brightness B2 is set as the brightness of the headlamp 4 corresponding to a sign, and a brightness B3 is set as the brightness of the headlamp 4 corresponding to a person.
Similarly, in the example of FIG. 2, a brightness B4, whose upper limit is the standard brightness of the headlamp 4, is set as the brightness of the headlamp 4 for an illumination area in which no object is ultimately detected. The word "gradually" in the description of gradually increasing the brightness of the illumination area with the headlamp 4 means that the time until the illumination area is illuminated at the brightness B4 corresponding to the illumination area is longer than the time until an object is illuminated at the brightness corresponding to its type.
As shown in FIG. 2, the brightness B3 corresponding to an object of the type "person" is greater than the brightness B2 corresponding to an object of the type "sign." The brightness B4 of the headlamp 4 for an illumination area in which no object is ultimately detected is greater than the brightnesses B2 and B3 corresponding to the object types.
Here, let the brightness B4 be 100%. The brightness B2 is set to a brightness at which the driver of the vehicle can recognize a sign, within a range that is greater than the brightness at which the control unit 13 can detect a sign (for example, 50%) and whose upper limit is a brightness that does not cause glare to the driver of the vehicle or others due to light reflected from the illuminated sign (for example, 60%).
The brightness B3 is set to a brightness at which the driver of the vehicle can recognize a person, within a range that is greater than the brightness at which the control unit 13 can detect a person (for example, 70%) and whose upper limit is a brightness that does not cause glare to the illuminated person or others (for example, 80%).
In the first embodiment, when the control unit 13 detects another vehicle based on the object information, it does not illuminate the other vehicle with the headlamp 4. Therefore, the brightness B1 of the headlamp 4 corresponding to another vehicle, shown in FIG. 2, is substantially the same as the brightness of the headlamp 4 when off (0%).
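The brightness levels B1 to B4 thus form a per-object-type lookup. A sketch using the example percentages given in the text (60% for a sign, 80% for a person); the key names are invented for illustration and are not part of the disclosure:

```python
# Hypothetical brightness caps as fractions of B4 (= 100%, the standard beam):
# B1 = 0% (other vehicle, not illuminated), B2 = 60% (sign), B3 = 80% (person),
# B4 = 100% (area with no detected object).
BRIGHTNESS_CAP = {
    "other_vehicle": 0.0,
    "sign": 0.6,
    "person": 0.8,
    None: 1.0,  # no object detected in the area
}

def cap_for(object_type):
    """Return the brightness cap for a detected object type (None = no object);
    unknown types fall back to the full-beam brightness B4."""
    return BRIGHTNESS_CAP.get(object_type, 1.0)
```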
FIGS. 3, 4, and 5 are diagrams showing examples of the traffic scene seen by the driver of the vehicle at certain times. FIGS. 3 to 5 show an area A1 in which another vehicle 31 traveling ahead of the vehicle is present, an area A2 in which a road sign 32 (a sign) is present, an area A3 in which a pedestrian 33 (a person) is present, and an area in which no object is ultimately detected, for example an area A4.
The change in brightness of the headlamp 4 will now be described over time. At time t-1, the control unit 13 keeps the headlamp 4 off.
At time t, the lighting determination unit 11 outputs the turn-on signal, and the control unit 13 starts lighting control of the headlamp 4. FIG. 3 is a diagram showing the traffic scene at time t. The control unit 13 detects objects based on the object information (in the first embodiment, the imaging signal) acquired by the acquisition unit 12. At time t, the headlamp 4 is not yet lit and the image represented by the imaging signal is dark, so the control unit 13 cannot detect the road sign 32 or the pedestrian 33. On the other hand, because, for example, the tail lamps 31a of the other vehicle 31 are lit, the control unit 13 can detect the other vehicle 31 even in the dark image. The control unit 13 is configured not to illuminate the other vehicle 31 and the area A1 with the headlamp 4 when it determines, based on the object information, that the type of the object is the other vehicle 31. This configuration can suppress glare for the driver of the other vehicle 31.
From time t to time t+1, the control unit 13 uses the headlamp 4 to gradually brighten the illumination area other than the area A1. FIG. 4 is a diagram showing the traffic scene at time t+1. In FIG. 4, the illumination area other than the area A1 is brighter than in FIG. 3. Gradually increasing the brightness of the illumination area avoids the perceptual gap that an abrupt change in brightness at turn-on would cause for the driver of the vehicle, and thus reduces the driver's discomfort. In addition, gradually raising the output within the headlamp 4 reduces the load on the boost circuit, so simplification of the boost circuit's withstand-voltage design can be expected.
Between time t and time t+1, the brightness of the areas A2 to A4 gradually increases until it reaches a brightness at which the control unit 13 can detect the road sign 32. In general, the road sign 32 is coated with a retroreflective material, so the minimum brightness the control unit 13 needs to detect the road sign 32 is lower than the minimum brightness it needs to detect the pedestrian 33. When the control unit 13 determines, based on the object information, that the type of the object is the road sign 32, it sets the brightness B2 for the area A2 in which the road sign 32 is present. At time t+1, when the brightness of the area A2 reaches the brightness B2 together with the other illumination areas, it is held at the brightness B2 and does not become any brighter.
In general, a sign coated with a retroreflective material has the merit of being easy for the driver to recognize with a small amount of light, but the demerit of easily appearing dazzling when illuminated. Nevertheless, in the prior art, when the headlamp 4 starts to be turned on, its brightness is first made comparatively high to detect the sign and is reduced afterward. As a result, glare is caused to the driver of the vehicle immediately after the headlamp 4 is turned on.
In contrast, in the first embodiment, the sign is illuminated by the headlamp 4 at the brightness B2 corresponding to a sign while the brightness of the illumination area is still being gradually increased, that is, before the brightness of the illumination area reaches the brightness B4. Glare for the driver of the vehicle can therefore be suppressed. The brightness B2 is set to a brightness at which the driver of the vehicle can recognize the sign, within a range that is greater than the brightness at which the control unit 13 can detect the sign and whose upper limit is a brightness that does not cause glare to the driver of the vehicle or others. This makes it possible to achieve both safe object recognition and suppression of glare for the driver of the vehicle, and further to reduce power consumption.
From time t+1 to time t+2, the control unit 13 uses the headlamp 4 to gradually brighten the illumination area other than the areas A1 and A2. During this time, the brightness of the areas A3 and A4 gradually increases until it reaches a brightness at which the control unit 13 can detect the pedestrian 33. When the control unit 13 determines, based on the object information, that the type of the object is the pedestrian 33, it sets the brightness B3 for the area A3 in which the pedestrian 33 is present. At time t+2, when the brightness of the area A3 reaches the brightness B3 together with the other illumination areas, it is held at the brightness B3 and does not become any brighter.
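The holding behavior described for the areas A2 and A3 amounts to clamping the common ramp value to the cap of the object type detected in each area. A minimal sketch, with illustrative cap values taken from the example percentages in the text (the names are assumptions for illustration):

```python
# Example caps as fractions of B4: sign 60%, person 80%, no object 100%.
EXAMPLE_CAPS = {None: 1.0, "sign": 0.6, "person": 0.8, "other_vehicle": 0.0}

def area_command(ramp_value, object_type, caps=EXAMPLE_CAPS):
    """Brightness commanded for one area: the common ramp value, clamped to
    the cap for the object type detected there (None means no object)."""
    return min(ramp_value, caps.get(object_type, caps[None]))
```

While the ramp is below B2, a sign area simply brightens with the rest; once the ramp passes B2, the sign area is held at B2 while empty areas continue toward B4.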
In the prior art, when the headlamp 4 starts to be turned on, its brightness is first made comparatively high to detect a person and is reduced afterward, so glare is caused to the person immediately after the headlamp 4 is turned on. In contrast, in the first embodiment, the person is illuminated by the headlamp 4 at the brightness B3 corresponding to a person while the brightness of the illumination area is still being gradually increased, that is, before the brightness of the illumination area reaches the brightness B4. Glare for the person can therefore be suppressed. The brightness B3 is set to a brightness at which the driver of the vehicle can recognize the person, within a range that is greater than the brightness at which the control unit 13 can detect the person and whose upper limit is a brightness that does not cause glare to the person or others. This makes it possible to achieve both safe object recognition and suppression of glare for the person, and further to reduce power consumption.
From time t+2 to time t+3, the control unit 13 uses the headlamp 4 to gradually brighten the illumination area other than the areas A1 to A3. FIG. 5 is a diagram showing the traffic scene at time t+3. Since no new object type is detected, the control unit 13 sets the brightness B4 for the illumination area in which no object is detected, for example the area A4. Illuminating at the brightness B4 the area A4, in which there are no traffic participants (for example, the driver of the vehicle, the driver of the other vehicle 31, and the pedestrian 33), gives the driver of the vehicle good visibility at night.
As a result of the above, the area A1 in which the other vehicle 31 is present is not illuminated, and the area A2 in which the road sign 32 is present is illuminated at the brightness B2, which is greater than the brightness B1. The area A3 in which the pedestrian 33 is present is illuminated at the brightness B3, which is greater than the brightness B2, and the area A4 in which no object is detected is illuminated at the brightness B4, which is greater than the brightness B3.
<Operation>
FIG. 6 is a flowchart showing an example of the operation of the headlamp control device 1 according to the first embodiment. First, in step S1, the illuminance detection unit 2 detects the illuminance signal, and the lighting determination unit 11 acquires the illuminance signal.
In step S2, the lighting determination unit 11 determines whether the illuminance indicated by the illuminance signal is equal to or less than the threshold. If the illuminance is determined to be equal to or less than the threshold, the process proceeds to step S3; if the illuminance is determined to be greater than the threshold, the operation of FIG. 6 ends.
In step S3, the imaging unit 3 captures an image and generates the imaging signal as the object information, and the acquisition unit 12 acquires the object information.
In step S4, the control unit 13 determines, based on the object information, whether the type of an object present in the illumination area could be determined. If the type of an object could be determined, the process proceeds to step S5; if not, the process proceeds to step S6.
In step S5, the control unit 13 sets, for the area in which the object is present, the brightness corresponding to the type of the object.
In step S6, the control unit 13 determines whether the headlamp 4 is lit. If the headlamp 4 is determined to be lit, the process proceeds to step S8; if the headlamp 4 is determined not to be lit, the process proceeds to step S7.
In step S7, the control unit 13 turns on the headlamp 4.
In step S8, the control unit 13 determines whether the brightness of the illumination area has reached the brightness set for an illumination area in which no object is ultimately detected (for example, the brightness B4 in FIG. 2). If the brightness of the illumination area has reached the set brightness, the operation of FIG. 6 ends; if it has not, the process proceeds to step S9.
In step S9, the control unit 13 increases the brightness of the illumination area by ΔB. However, for an object whose type has been determined, when the brightness of the illumination area would exceed the brightness corresponding to that type, the control unit 13 illuminates the object with the headlamp 4 at the brightness corresponding to the type of the object. The process then returns to step S3.
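One pass through steps S2 to S9 of FIG. 6 can be summarized in a single control function. The names and the threshold below are hypothetical stand-ins for the camera pipeline and the brightness table described above, not the patented implementation:

```python
def control_step(illuminance, detections, ramp, delta_b, caps, threshold=1000.0):
    """One cycle: skip when ambient light is bright (S2), raise the common
    ramp by delta_b until it reaches the no-object brightness B4 (S8/S9),
    then clamp each area to the cap of its detected object type (S4/S5).
    detections maps area id -> object type (None where nothing is detected)."""
    if illuminance > threshold:                       # S2: lighting not needed
        return ramp, {}
    if ramp < caps[None]:                             # S8: B4 not yet reached
        ramp = min(ramp + delta_b, caps[None])        # S9: raise by ΔB
    commanded = {area: min(ramp, caps.get(obj_type, caps[None]))
                 for area, obj_type in detections.items()}   # S4/S5
    return ramp, commanded
```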
<Summary of the first embodiment>
According to the headlamp control device 1 of the first embodiment described above, while the brightness of the illumination area is gradually increased by the headlamp 4, an object is illuminated by the headlamp 4 at the brightness corresponding to the type of the object, based on the object information. With this configuration, glare for the driver of the vehicle and others can be suppressed while the illumination area is being brightened, and the driver of the vehicle can easily identify the object. Furthermore, because the brightness of the illumination area is increased gradually by the headlamp 4, the control unit 13 can easily determine the type of an object regardless of the brightness around the vehicle.
In addition, in the first embodiment, the brightness of the illumination area is increased linearly, which reduces discomfort for the driver of the vehicle.
In addition, in the first embodiment, the headlamp 4 is caused to illuminate the illumination area at predetermined luminances. This allows the headlamp 4 to be controlled open-loop, which simplifies the control performed by the headlamp control device 1.
In addition, in the first embodiment, the brightness corresponding to a person is greater than the brightness corresponding to a sign, and when the type of an object is determined to be another vehicle, the other vehicle is not illuminated by the headlamp 4. With this configuration, glare for the driver of the vehicle, for people, and for drivers of other vehicles can be suppressed, and the driver of the vehicle can easily identify other vehicles, signs, and people.
<Modification>
In the first embodiment, the object information was an imaging signal from which the headlamp control device 1 can determine the type and position of an object, but the object information is not limited to this. For example, when the imaging unit 3 can determine the type and position of an object based on the image of the area ahead of the vehicle, the object information may be the imaging unit 3's determination result of the type and position of the object. In this case, the control unit 13 may use the object information from the imaging unit 3, that is, the imaging unit 3's determination result of the type and position of the object, as its own determination result. The imaging unit 3 and the control unit 13 may also cooperate to determine the type and position of an object.
In the first embodiment, the lighting determination unit 11 selectively determined whether to turn the headlamp 4 on or off based on the illuminance signal from the illuminance detection unit 2, but this is not limiting. For example, the lighting determination unit 11 may selectively determine whether to turn the headlamp 4 on or off based on the on/off state of the light switch included in the vehicle information, that is, based on the operation of the light switch.
In the first embodiment, when the control unit 13 determined that the type of an object was another vehicle, it did not illuminate the other vehicle with the headlamp 4. However, the control unit 13 may illuminate the other vehicle with the headlamp 4 to an extent that can be regarded as substantially not illuminating it.
In the first embodiment, the object types were other vehicles, signs, and people, but the types are not limited to these. The brightness corresponding to a sign and the brightness corresponding to a person are also not limited to the brightnesses described in the first embodiment.
In the first embodiment, the brightness at which the headlamp 4 illuminates an object was set based on the type of the object, but it may be changed as appropriate according to, for example, the distance and direction of the object.
The luminances described in the first embodiment may be set to the minimum luminance perceivable by the driver of the vehicle, or to the minimum luminance at which the control unit 13 can detect an object.
In the first embodiment, the gradient ΔB of the change in brightness of the illumination area was a fixed value, but it may be changed based on, for example, the type, distance, and direction of the object and the speed of the vehicle.
 In the first embodiment, the control unit 13 raises the brightness of the illumination area linearly along a slope with gradient ΔB, but the brightness may instead be raised in a stepped, that is, stagewise, manner. With such a configuration, the illumination area can be brightened more quickly, so the type of an object can be detected sooner.
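The linear and stepped ramps can be contrasted with a short sketch. The function below is purely illustrative (this disclosure prescribes no implementation): it returns the commanded brightness at elapsed time t, either following the linear slope ΔB directly or quantizing the same slope upward into stairs of a given height, which reaches each level no later than the linear ramp does.

```python
import math

def ramp_brightness(t, target, slope, step=None):
    """Commanded brightness at elapsed time t (illustrative sketch).

    With step=None the brightness follows the linear slope ΔB = `slope`.
    With a step height, the linear value is rounded up to the next stair,
    so every level is reached at least as early as on the linear ramp.
    """
    b = slope * t
    if step is not None:
        b = math.ceil(b / step) * step  # jump early to the next stair
    return min(b, target)
```

For example, at t = 0.6 with slope 2.0, the linear ramp has reached 1.2 while a staircase with step 1.0 has already reached 2.0, which is why a stepped rise lets object-type detection start sooner.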
 In the first embodiment, when an object is newly detected after the brightness of the illumination area has exceeded the brightness corresponding to the type of that object, the control unit 13 may reduce the brightness of the portion of the illumination area containing the object down to the brightness corresponding to the type of the object. For example, when a pedestrian is newly detected in the area A4 of FIG. 5, which is illuminated at the brightness B4 because no object has been detected there, the control unit 13 may illuminate the area A4 with the headlights 4 at the brightness B3 corresponding to a person. Conversely, when an object that was previously present is no longer present, the control unit 13 may illuminate the area where the object was present with the headlights 4 at the illumination-area brightness B4. The above modifications may also be applied to embodiments other than the first embodiment.
 <Embodiment 2>
 In the first embodiment, the control unit 13 controlled the brightness of the headlights 4 in terms of luminance. That is, the brightness corresponding to an object was preset to a luminance at which the driver or the control unit 13 is expected to be able to recognize the object. However, unlike illuminance, luminance is not the brightness received on the observer's side. For this reason, depending on the environment and conditions, the brightness of an object may be excessive or insufficient for the driver or the control unit 13.
 For example, when a sign is not coated with a retroreflective material, the brightness corresponding to a sign described in the first embodiment may be insufficient for the driver or the control unit 13. As another example, when a pedestrian wears dark, poorly visible clothing, the brightness corresponding to a person described in the first embodiment may be insufficient for the driver or the control unit 13. As yet another example, when it is determined that the headlights 4 are to be turned on but the surroundings of the vehicle are somewhat bright because of street lights or the like, the brightness corresponding to an area in which no object is detected may be excessive for the driver or the control unit 13.
 In response, the headlamp control device 1 according to the second embodiment, described below, is configured to control the brightness of the headlights 4 by illuminance rather than by luminance so as to solve these problems.
 The functional block diagram of the headlamp control device 1 according to the second embodiment is the same as that of the first embodiment (see FIG. 1). Among the components of the second embodiment, those that are the same as or similar to the components described above are given the same or similar reference numerals below, and the description focuses mainly on the differing components.
 The acquisition unit 12 according to the second embodiment acquires the illuminance of an object illuminated by the headlights 4. For example, the acquisition unit 12 obtains the illuminance of the illuminated object from the luminance of an image captured by the imaging unit 3.
 Based on the illuminance acquired by the acquisition unit 12 and a predetermined illuminance, the control unit 13 feedback-controls the headlights 4 so that the acquired illuminance becomes substantially equal to the predetermined illuminance. That is, when the illuminance of an object acquired by the acquisition unit 12 is lower than the predetermined illuminance, the control unit 13 increases the brightness of the object. Conversely, when the acquired illuminance of the object is higher than the predetermined illuminance, the control unit 13 decreases the brightness of the object.
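One way to picture this feedback is a proportional correction applied once per control cycle. The update rule, the gain value, and the linear relation between command and measured illuminance assumed in the example are illustrative choices for the sketch, not details taken from this disclosure.

```python
def feedback_step(measured_lux, target_lux, command, gain=0.2):
    """One control cycle of the illuminance feedback described above.

    Raises the headlamp command when the measured illuminance on the
    object falls short of the target, and lowers it when the measurement
    exceeds the target. The proportional update and gain are
    illustrative choices.
    """
    error = target_lux - measured_lux
    return max(0.0, command + gain * error)  # command cannot go negative
```

Iterating this step while the object's measured illuminance responds to the command drives the measurement toward the target, which is the "substantially the same" condition described above.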
 <Operation>
 FIG. 7 is a flowchart showing an example of the operation of the headlamp control device 1 according to the second embodiment. The flowchart of FIG. 7 is the same as that of FIG. 6 except that step S11 is added between steps S7 and S8 and step S9 is replaced with step S12. Accordingly, steps S11 and S12 are mainly described below.
 In step S11, the imaging unit 3 captures an image, and the acquisition unit 12 acquires, based on the image, the illuminance of the object illuminated by the headlights 4. The process then proceeds to step S8.
 In step S12, the control unit 13 performs substantially the same control as in step S9 of FIG. 6, except that it performs feedback control based on the illuminance acquired in step S11. The process then returns to step S3.
 <Summary of Embodiment 2>
 According to the headlamp control device 1 of the second embodiment described above, the headlights 4 are feedback-controlled based on the acquired illuminance and the predetermined illuminance. With such a configuration, the influence of the environment on the brightness of an object can be suppressed, so the brightness of the object can be made appropriate.
 <Embodiment 3>
 FIG. 8 is a functional block diagram showing the main parts of the headlamp control device 1 according to the third embodiment. Among the components of the third embodiment, those that are the same as or similar to the components described above are given the same or similar reference numerals below, and the description focuses mainly on the differing components.
 The configuration of FIG. 8 is the same as that of FIG. 1 except that a danger determination unit 13a is added to the control unit 13. The danger determination unit 13a may instead be provided outside the control unit 13 so as to be able to communicate with it.
 The danger determination unit 13a determines whether there is a predetermined collision possibility between the vehicle and an object. An example of the danger determination unit 13a is described below.
 When the position (that is, the coordinates) of an object in the image changes after the control unit 13 detects the object, owing to movement of the vehicle, the object, or both, the danger determination unit 13a tracks the movement of the object. Such tracking can be realized by known information-processing techniques that track the movements of multiple specific objects in a moving image, such as Multi-Object Tracking.
 Based on the tracking of the object, the danger determination unit 13a determines whether there is the predetermined collision possibility between the vehicle and the object. FIGS. 9 to 11 are diagrams for explaining an example of the collision-possibility determination by the danger determination unit 13a, and show a roadway 36, a vehicle 37 traveling on the roadway 36, and an object 38 that may collide with the vehicle 37.
 As shown in FIG. 9, when the object 38 detected in the initial detection stage is present in the traveling direction of the vehicle 37, the danger determination unit 13a predicts a collision between the two based on the traveling direction and speed of the vehicle 37 and the traveling direction and speed of the object 38. When the danger determination unit 13a determines that the two will collide, it determines that there is the predetermined collision possibility.
 As shown in FIG. 10, when the object 38, whose brightness is limited, is moving toward the roadway 36, the danger determination unit 13a likewise predicts a collision between the two based on the traveling direction and speed of the vehicle 37 and those of the object 38, and determines that there is a collision possibility when it determines that the two will collide.
 When no collision as described with reference to FIGS. 9 and 10 is predicted, the danger determination unit 13a determines whether the object 38, whose brightness is limited, is within a predetermined distance of the roadway 36 or of the region in the traveling direction of the vehicle 37. When the danger determination unit 13a determines that the object 38 is within that predetermined distance, as in FIG. 11, it determines that there is a collision possibility. For example, when the object 38 is a pedestrian watching the traffic before crossing the road, the object 38 is within the predetermined distance of the roadway 36 or of the region in the traveling direction of the vehicle 37, so the danger determination unit 13a can determine that there is a collision possibility.
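The constant-velocity collision prediction of FIGS. 9 and 10 can be sketched as a closest-approach test. Everything below is an illustrative assumption: the object's position and velocity are expressed relative to the vehicle, and the prediction horizon and separation radius are invented thresholds; this disclosure does not fix a specific formula.

```python
def collision_possible(rel_pos, rel_vel, horizon=5.0, radius=2.0):
    """Flag a predetermined collision possibility (illustrative sketch).

    rel_pos / rel_vel: object position (m) and velocity (m/s) relative
    to the vehicle. The object is extrapolated at constant velocity, and
    a possibility is flagged when the minimum separation within
    `horizon` seconds falls below `radius` metres.
    """
    px, py = rel_pos
    vx, vy = rel_vel
    v2 = vx * vx + vy * vy
    # time of closest approach, clamped into [0, horizon]
    t = 0.0 if v2 == 0.0 else max(0.0, min(horizon, -(px * vx + py * vy) / v2))
    dx, dy = px + vx * t, py + vy * t
    return (dx * dx + dy * dy) ** 0.5 < radius
```

Under these assumptions, an object 10 m ahead closing at 3 m/s is flagged, while the same object moving away, or one that will pass 5 m to the side, is not.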
 The control unit 13 of FIG. 8 raises, for an object determined to have a collision possibility, the upper limit of the brightness corresponding to the type of the object. For example, after the area in which an object exists has been illuminated at the brightness corresponding to the type of the object, the danger determination unit 13a determines whether there is the predetermined collision possibility between the vehicle and that object. When it is determined that there is a collision possibility, the control unit 13 increases the brightness of the object by raising the upper limit of the brightness corresponding to the object.
 For an object determined to have a collision possibility, the control unit 13 outputs a light-distribution control signal to the headlights 4 so that the brightness corresponding to the type of the object is raised steeply, in a stepped, that is, stagewise, manner, up to the upper limit. The brightness corresponding to the type of the object thereby rises steeply, producing a perceptual gap for the driver of the vehicle and directing the driver's attention to the object. As long as this can be achieved, the manner of presenting an object determined to have a collision possibility is not limited to the above.
 <Operation>
 FIG. 12 is a flowchart showing the operation of the headlamp control device 1 according to the third embodiment. The flowchart of FIG. 12 is the same as that of FIG. 6 except that step S21 is added between steps S4 and S5 and steps S22 and S23 are added between steps S5 and S6. Accordingly, steps S21 to S23 are mainly described below.
 In step S21, when the object is the same as the previously detected object, the danger determination unit 13a tracks the movement of the object. The process then proceeds to step S5.
 In step S22, the danger determination unit 13a determines, based on the tracking result, whether there is the predetermined collision possibility between the vehicle and the object. When it is determined that there is the predetermined collision possibility, the process proceeds to step S23; otherwise, the process proceeds to step S6.
 In step S23, the control unit 13 raises, for the object determined to have the predetermined collision possibility, the upper limit of the brightness corresponding to the type of the object. The process then proceeds to step S6.
 <Summary of Embodiment 3>
 According to the headlamp control device 1 of the third embodiment described above, the upper limit of the brightness corresponding to the type of an object is raised for an object determined to have a collision possibility. With such a configuration, the driver of the vehicle can easily recognize an object that may interfere with the travel of the vehicle.
 <Modification>
 In the third embodiment, the control unit 13 raises the upper limit of the brightness corresponding to the type of an object determined to have a collision possibility, but the upper limit may instead be removed. That is, the control unit 13 may set the upper brightness limit of an object determined to have a collision possibility equal to the brightness of the illumination area in which no object type is detected. In that case, the control unit 13 may raise the brightness corresponding to the type of the object steeply, in a stepped, that is, stagewise, manner, up to the brightness of the illumination area.
 <Embodiment 4>
 The modification of the first embodiment described a configuration in which, when an object is newly detected after the brightness of the illumination area has exceeded the brightness corresponding to the type of that object, the control unit 13 reduces the brightness of the portion of the illumination area containing the object down to the brightness corresponding to the type of the object. However, a certain amount of time elapses between the object entering the illumination area and the brightness being reduced to the level corresponding to its type, and during that time glare may be caused to the driver of the vehicle or to others. In contrast, the headlamp control device 1 according to the fourth embodiment described below is configured to be able to solve this problem.
 FIG. 13 is a functional block diagram showing the main parts of the headlamp control device 1 according to the fourth embodiment. Among the components of the fourth embodiment, those that are the same as or similar to the components described above are given the same or similar reference numerals below, and the description focuses mainly on the differing components.
 The configuration of FIG. 13 is the same as that of FIG. 1 except that the control unit 13 is connected to a detection unit 5. The detection unit 5 detects, without relying on illumination by the headlights 4, objects farther from the vehicle than the illumination area, that is, objects beyond the nighttime detection range of the imaging unit 3, and generates object detection information including the detection result. In the fourth embodiment, detection of an object by the detection unit 5 includes detection of the position of the object. The detection unit 5 is, for example, a ranging sensor mounted on the vehicle, such as a LiDAR (Light Detection and Ranging) sensor, a ToF (Time of Flight) camera, an infrared camera, a millimeter-wave radar, or an ultrasonic sensor.
 The detection unit 5 is not limited to a ranging sensor mounted on the vehicle. For example, the detection unit 5 may be a communication device that acquires, by V2V (Vehicle-to-Vehicle) technology, object detection information detected by a sensor of another vehicle. Alternatively, the detection unit 5 may be a communication device that acquires, by V2X (Vehicle-to-Everything) technology, object detection information detected by devices external to the vehicle, such as infrastructure sensors including roadside cameras and roadside LiDAR, mobile-phone position information, and dynamic maps.
 The acquisition unit 12 acquires the object detection information from the detection unit 5. When it is detected, based on the acquired object detection information, that an object exists farther from the vehicle than the illumination area, the control unit 13 darkens the portion of the illumination area that the object will enter in the future by lowering the upper brightness limit of that portion. For example, the portion that the object will enter may be the portion of the illumination area closest to the position of the object indicated by the object detection information. Alternatively, it may be the portion of the illumination area that is determined, by a determination similar to that of the danger determination unit 13a of the third embodiment, to be a portion the object will enter based on the position and speed of the object indicated by the object detection information.
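The "closest portion" variant above can be sketched by modeling the illumination area as angular sectors, each with its own brightness cap. The sector model, the field names, and the dim level are all assumptions made for the example; this disclosure does not prescribe any data structure for the illumination area.

```python
def cap_future_entry_sector(sectors, object_bearing_deg, dim_cap):
    """Lower the brightness cap of the sector an outside object is
    expected to enter (illustrative sketch of embodiment 4).

    `sectors` is a list of dicts, each with a centre "bearing" in
    degrees and a brightness "cap". The affected portion is taken to be
    the sector whose bearing is nearest the detected object's bearing.
    """
    nearest = min(range(len(sectors)),
                  key=lambda i: abs(sectors[i]["bearing"] - object_bearing_deg))
    sectors[nearest]["cap"] = min(sectors[nearest]["cap"], dim_cap)
    return sectors
```

Only the sector the object is about to enter is capped; the rest of the illumination area keeps its brightness, matching the behavior described for the portions 39a and 39b below.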
 FIG. 14 is a top view for explaining an example of the control by the control unit 13 according to the fourth embodiment. FIG. 14 shows a vehicle 37, a road sign 32, a pedestrian 33, an illumination area 39 indicated by a thick line, and a detection area 40.
 In general, the illumination area 39 illuminated by the headlights 4 differs from the area in which the control unit 13 can detect objects based on the object information, but for simplicity the two are treated as identical here. The detection area 40 is the area in which the detection unit 5 can detect objects. The detection unit 5 is assumed here to be a millimeter-wave radar, and the distance from the vehicle 37 to the edge of the detection area 40 is greater than the distance from the vehicle 37 to the edge of the illumination area 39.
 When the road sign 32 and the pedestrian 33 approach the vehicle 37, for example as the vehicle 37 travels, they come to exist in the detection area 40 while not yet in the illumination area 39, as shown in FIG. 14. In this case, the detection unit 5 detects the road sign 32 and the pedestrian 33 and outputs object detection information to the control unit 13. Based on the object information from the imaging unit 3 and the object detection information from the detection unit 5, the control unit 13 detects the road sign 32 and the pedestrian 33, which are present in the detection area 40 but not in the illumination area 39. When such a detection result is obtained, the control unit 13 calculates the portions 39a and 39b of the illumination area 39 that the road sign 32 and the pedestrian 33 will enter in the future, and makes the brightness of the portions 39a and 39b lower than the brightness of the rest of the illumination area 39.
 When the road sign 32 and the pedestrian 33 approach the vehicle 37 further, they come to exist in the illumination area 39, and the control unit 13 detects them based on the object information from the imaging unit 3. When such a detection result is obtained, the control unit 13 illuminates the road sign 32 and the pedestrian 33 with the headlights 4 at the brightness corresponding to each, as in the first embodiment.
 <Operation>
 FIG. 15 is a flowchart showing the operation of the headlamp control device 1 according to the fourth embodiment. The flowchart of FIG. 15 is the same as that of FIG. 6 except that steps S31 to S33 are added between steps S5 and S6. Accordingly, steps S31 to S33 are mainly described below.
 In step S31, the detection unit 5 generates object detection information, and the acquisition unit 12 acquires it. In step S32, the control unit 13 determines, based on the object information and the object detection information, whether a detected object will enter the illumination area in the future. For example, when an object is present in a part of the detection area of the detection unit 5 that is outside the illumination area and within a predetermined distance of it, the control unit 13 determines that the object will enter the illumination area in the future. When it is determined that the detected object will enter the illumination area, the process proceeds to step S33; otherwise, the process proceeds to step S6.
 In step S33, the control unit 13 lowers the upper brightness limit of the portion of the illumination area that the object will enter in the future. The process then proceeds to step S6. According to the above operation, when an object is present in a part of the detection area of the detection unit 5 relatively close to the illumination area, the brightness of the illumination area associated with that object is reduced, whereas when the object is in a part relatively far from the illumination area, that brightness is maintained.
 <Summary of Embodiment 4>
 According to the headlamp control device 1 of the fourth embodiment described above, when it is detected that an object exists farther from the vehicle than the illumination area, the portion of the illumination area that the object will enter in the future is dimmed. With such a configuration, even when a new object enters the illumination area after the headlights 4 are turned on, for example because of movement of the vehicle, the glare caused by that entry can be suppressed.
 <Modification>
 In the fourth embodiment, the detection unit 5 may be a sensor that can determine the type of an object to some extent, as described in the fifth embodiment below, or a receiver that can receive a determination result of the type of an object. In that case, the control unit 13 may change the brightness of the portion of the illumination area that the object will enter in the future based on that determination result.
 <Embodiment 5>
 In the first embodiment, the imaging unit 3 was a camera, such as a forward-monitoring camera, that captures visible-light images. Images from such a camera, however, are easily affected by the environment outside the vehicle (for example rain, snow, or fog), which may reduce the reliability of the object-type determination by the control unit 13. In contrast, the headlamp control device 1 according to the fifth embodiment described below is configured to be able to solve this problem.
 The functional block diagram of the headlamp control device 1 according to the fifth embodiment is the same as that of the fourth embodiment (see FIG. 13). Among the components of the fifth embodiment, those that are the same as or similar to the components described above are given the same or similar reference numerals below, and the description focuses mainly on the differing components.
 The detection unit 5 according to the fifth embodiment is a millimeter-wave radar, a ranging sensor of the vehicle that does not depend on visible light, and it generates and acquires information on the type and position of an object as the object detection result. Determination of the type of an object by a millimeter-wave radar can be realized by conventional techniques, typified by micro-Doppler analysis. Because a pedestrian moves in a characteristic way, walking while swinging the arms and legs, the motion of the limbs appears as Doppler shifts in the frequency components of the probe wave reflected from the pedestrian. By detecting these Doppler shifts, the millimeter-wave radar can identify not only relatively stationary objects, relatively moving objects, and unidentified objects but also pedestrians.
 The control unit 13 determines the type of an object based on the object information of the imaging unit 3 acquired by the acquisition unit 12 and the object detection result acquired by the detection unit 5 (the millimeter-wave radar in the fifth embodiment). FIG. 16 is a diagram for explaining an example of the determination by the control unit 13 according to the fifth embodiment, and shows a vehicle 37, an object 38a that is a road sign, an object 38b that is a pedestrian, an illumination area 39 indicated by a thick line, and a detection area 40.
 In general, the illumination area 39 illuminated by the headlights 4 differs from the area in which the control unit 13 can detect objects based on the object information, but for simplicity the two are treated as identical here. The detection area 40 is the area in which the millimeter-wave radar serving as the detection unit 5 can detect objects. Here, the millimeter-wave radar serving as the detection unit 5 can identify at least stationary objects, moving objects, unidentified objects, and pedestrians.
 When the headlights 4 start to turn on, the acquisition unit 12 acquires the object information from the imaging unit 3 and the detection result from the detection unit 5, and the control unit 13 determines the type of the object based on the object information of the imaging unit 3 and the detection result of the detection unit 5.
 For example, when visibility is good, the control unit 13 determines, based on the object information of the imaging unit 3, that the object 38a is a road sign and the object 38b is a pedestrian; if the reliability of the determination based on the object information is greater than a threshold, illumination is performed in the same manner as in the first embodiment.
 As another example, assume that, owing to mildly poor visibility caused by weather or the like, the control unit 13 has determined based on the object information of the imaging unit 3 that the object 38a is a road sign and the object 38b is a pedestrian, but the reliability of the determination based on the object information is equal to or below the threshold. In this case, if the detection result of the detection unit 5 indicates that the object 38a is unidentified and the object 38b is a pedestrian, the control unit 13 gradually raises the illumination of the headlights 4 to a brightness B5 to illuminate the object 38a, and gradually raises the illumination of the headlights 4 to the brightness B3 of FIG. 2 to illuminate the object 38b. The brightness B5 is set to a level at which problems such as glare do not occur regardless of the type of the object.
 If, in addition to the above case where the reliability of the determination based on the object information of the imaging unit 3 is equal to or below the threshold, the detection result of the detection unit 5 indicates that no object is present, the control unit 13 judges the determination based on the object information to be erroneous. The control unit 13 then illuminates the objects 38a and 38b at the same brightness as an illumination area in which no object is detected.
 As yet another example, when the control unit 13 cannot determine the type of an object based on the object information of the imaging unit 3 owing to severely poor visibility caused by weather or the like, it illuminates the objects 38a and 38b at a brightness corresponding to the type of the object based on the detection result of the detection unit 5.
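The camera/radar arbitration described in the three cases above can be sketched as follows. The brightness constants, the reliability threshold, and all identifiers are illustrative stand-ins (B3 and B5 are defined elsewhere in the specification; the numeric values here are hypothetical):

```python
from dataclasses import dataclass
from typing import Optional

B3 = 0.4          # stand-in for the pedestrian brightness B3 of FIG. 2
B5 = 0.6          # stand-in: glare-safe brightness for unidentified objects
B_SIGN = 0.3      # stand-in: road-sign brightness
B_AREA = 1.0      # brightness of the area where no object is detected
RELIABILITY_THRESHOLD = 0.8  # stand-in value

BRIGHTNESS_BY_TYPE = {"pedestrian": B3, "road sign": B_SIGN, "unidentified": B5}

@dataclass
class CameraJudgement:
    kind: Optional[str]   # camera's type estimate; None if undeterminable
    reliability: float

def target_brightness(camera: CameraJudgement, radar_kind: Optional[str]) -> float:
    """Choose the brightness cap for one object from camera and radar results."""
    # Good visibility: the camera's determination is reliable, use it directly.
    if camera.kind is not None and camera.reliability > RELIABILITY_THRESHOLD:
        return BRIGHTNESS_BY_TYPE[camera.kind]
    # Camera unreliable and radar sees no object: treat the camera's
    # determination as erroneous and light the spot like the empty area.
    if radar_kind is None:
        return B_AREA
    # Otherwise defer to the radar's (coarser) classification.
    return BRIGHTNESS_BY_TYPE.get(radar_kind, B5)

# Mild fog: camera says "road sign" with low confidence, radar says unidentified
print(target_brightness(CameraJudgement("road sign", 0.5), "unidentified"))  # 0.6
# The radar confirms a pedestrian the camera was unsure about
print(target_brightness(CameraJudgement("pedestrian", 0.5), "pedestrian"))   # 0.4
```

The headlights would then ramp each object's illumination gradually toward the returned cap, as in the first embodiment.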
 <Summary of the Fifth Embodiment>
 A visible-light camera, which captures visible-light images, can identify many types of objects but is affected by poor visibility and the like. A ranging sensor, by contrast, is less susceptible to poor visibility and the like, but can identify fewer types of objects than a visible-light camera.
 In contrast, the headlight control device 1 according to the fifth embodiment determines the type of an object based on the object information acquired by the acquisition unit 12 and the information acquired by the ranging sensor serving as the detection unit 5. With this configuration, the camera that captures visible-light images and the ranging sensor can be used complementarily, which improves the reliability of the object type determination.
 <Modification>
 In the fifth embodiment the ranging sensor of the detection unit 5 is a millimeter-wave radar, but it is not limited to this; a LiDAR, a ToF camera, an infrared camera, an ultrasonic sensor, or the like may be used, and the sensor is not limited to a single one. Likewise, the types that the ranging sensor can detect are not limited to stationary objects, moving objects, unidentified objects, and pedestrians.
 The detection unit 5 is also not limited to a ranging sensor such as a millimeter-wave radar; it may be a communication device that acquires information on the type and position of an object through communication from a device external to the vehicle using V2V or V2X technology.
 A visible-light camera can make determinations even when neither other vehicles nor infrastructure are present around the vehicle, but it is affected by poor visibility and the like. V2V or V2X technology, by contrast, is less susceptible to poor visibility and the like, but cannot function unless other vehicles or infrastructure are present around the vehicle. With the configuration of the modification described above, the camera that captures visible-light images and the communication device can be used complementarily, which improves the reliability of the object type determination.
 <Other Modifications>
 The acquisition unit 12 and the control unit 13 of FIG. 1 described above are hereinafter referred to as the "acquisition unit 12 and the like." The acquisition unit 12 and the like are realized by a processing circuit 81 shown in FIG. 17. That is, the processing circuit 81 includes the acquisition unit 12 that acquires, when an object is present in an area illuminated by the headlights of a vehicle, object information on the object from a camera, and the control unit 13 that illuminates the object with the headlights at a brightness corresponding to the type of the object based on the object information while gradually raising the brightness of the area with the headlights. Dedicated hardware may be applied to the processing circuit 81, or a processor that executes a program stored in a memory may be applied. Examples of the processor include a central processing unit, a processing device, an arithmetic device, a microprocessor, a microcomputer, and a DSP (Digital Signal Processor).
 When the processing circuit 81 is dedicated hardware, the processing circuit 81 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination of these. The functions of the respective units such as the acquisition unit 12 may each be realized by distributed processing circuits, or the functions of the units may be realized collectively by a single processing circuit.
 When the processing circuit 81 is a processor, the functions of the acquisition unit 12 and the like are realized in combination with software or the like. The software or the like corresponds to, for example, software, firmware, or software and firmware. The software or the like is written as a program and stored in a memory. As shown in FIG. 18, the processor 82 applied as the processing circuit 81 realizes the functions of the respective units by reading and executing a program stored in a memory 83. That is, the headlight control device 1 includes the memory 83 for storing a program that, when executed by the processing circuit 81, results in the execution of a step of acquiring, when an object is present in an area illuminated by the headlights of a vehicle, object information on the object from a camera, and a step of illuminating the object with the headlights at a brightness corresponding to the type of the object based on the object information while gradually raising the brightness of the area with the headlights. In other words, this program can be said to cause a computer to execute the procedures and methods of the acquisition unit 12 and the like. Here, the memory 83 may be, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), an HDD (Hard Disk Drive), a magnetic disk, a flexible disk, an optical disc, a compact disc, a MiniDisc, a DVD (Digital Versatile Disc), a drive device for any of these, or any storage medium to be used in the future.
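The two steps that the stored program executes — gradually raising the area brightness while holding each detected object at its type-dependent brightness — can be sketched as follows. The linear ramp shape, the step count, and all brightness values are illustrative assumptions, not values from the specification:

```python
def ramp_headlights(area_target, object_caps, steps=20):
    """Yield per-step headlight commands while ramping the area brightness.

    area_target: final brightness for the empty illumination area.
    object_caps: {object_id: brightness cap chosen from the object's type}.
    Yields (area_brightness, {object_id: object_brightness}) for each step,
    so no object is ever lit brighter than its type allows during the ramp.
    """
    for i in range(1, steps + 1):
        b = area_target * i / steps                    # linear ramp
        yield b, {oid: min(b, cap) for oid, cap in object_caps.items()}

last_area, last_objs = None, None
for last_area, last_objs in ramp_headlights(1.0, {"sign": 0.3, "pedestrian": 0.6}):
    pass
print(last_area, last_objs)  # 1.0 {'sign': 0.3, 'pedestrian': 0.6}
```

A stepwise instead of linear ramp, as recited in claim 2, would only change how `b` is computed per iteration.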
 The above describes configurations in which each function of the acquisition unit 12 and the like is realized by either hardware or software or the like. However, the configuration is not limited to this; part of the acquisition unit 12 and the like may be realized by dedicated hardware and another part by software or the like. For example, the function of the acquisition unit 12 can be realized by the processing circuit 81 as dedicated hardware, and the remaining functions can be realized by the processing circuit 81 as the processor 82 reading and executing a program stored in the memory 83.
 As described above, the processing circuit 81 can realize each of the above functions by hardware, software or the like, or a combination of these.
 The embodiments and modifications can be freely combined, and each embodiment and each modification can be modified or omitted as appropriate.
 The above description is, in all aspects, illustrative and not restrictive. It is understood that countless variations not illustrated can be envisioned.
 1 headlight control device, 3 imaging unit, 4 headlight, 12 acquisition unit, 13 control unit, 13a danger determination unit, 31 other vehicle, 32 road sign, 33 pedestrian, 37 vehicle, 38 object.

Claims (11)

  1.  A headlight control device comprising:
     an acquisition unit that acquires, when an object is present in an area illuminated by a headlight of a vehicle, object information relating to the object from a camera; and
     a control unit that illuminates the object with the headlight at a brightness corresponding to a type of the object based on the object information while gradually increasing a brightness of the area with the headlight.
  2.  The headlight control device according to claim 1, wherein
     the control unit increases the brightness of the area linearly or stepwise.
  3.  The headlight control device according to claim 1, wherein
     the control unit causes the headlight to illuminate the area at a predetermined luminance.
  4.  The headlight control device according to claim 1, wherein
     the acquisition unit further acquires an illuminance of the object illuminated by the headlight, and
     the control unit feedback-controls the headlight based on the acquired illuminance and a predetermined illuminance.
  5.  The headlight control device according to claim 1, wherein
     the brightness corresponding to the type of the object when the type of the object is a person is greater than the brightness corresponding to the type of the object when the type of the object is a sign.
  6.  The headlight control device according to claim 5, wherein
     the control unit does not illuminate another vehicle with the headlight when the type of the object is determined, based on the object information, to be the other vehicle different from the vehicle.
  7.  The headlight control device according to claim 1, further comprising
     a danger determination unit that determines whether there is a predetermined possibility of a collision between the vehicle and the object, wherein
     the control unit raises or removes, for the object determined to have the collision possibility, an upper limit of the brightness corresponding to the type of the object.
  8.  The headlight control device according to claim 7, wherein
     the control unit raises, for the object determined to have the collision possibility, the brightness corresponding to the type of the object stepwise to the upper limit or to the brightness of the area.
  9.  The headlight control device according to claim 1, wherein
     when the object is detected to be present farther from the vehicle than the area, the control unit darkens a portion of the area that the object will enter in the future.
  10.  The headlight control device according to claim 1, wherein
     the control unit determines the type of the object based on the object information acquired by the acquisition unit and on information acquired by a ranging sensor or information acquired through communication from a device external to the vehicle.
  11.  A headlight control method comprising:
     acquiring, when an object is present in an area illuminated by a headlight of a vehicle, object information relating to the object from a camera; and
     illuminating the object with the headlight at a brightness corresponding to a type of the object based on the object information while gradually increasing a brightness of the area with the headlight.
PCT/JP2022/035616 2022-09-26 2022-09-26 Headlight control device and headlight control method WO2024069676A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/035616 WO2024069676A1 (en) 2022-09-26 2022-09-26 Headlight control device and headlight control method


Publications (1)

Publication Number Publication Date
WO2024069676A1 true WO2024069676A1 (en) 2024-04-04


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013032136A (en) * 2011-06-29 2013-02-14 Sharp Corp Light-projecting device, and vehicle headlamp including the light-projecting device
JP2017001453A (en) * 2015-06-08 2017-01-05 住友電気工業株式会社 Headlight control device and headlight control method
WO2020158391A1 (en) * 2019-01-28 2020-08-06 株式会社小糸製作所 Control device for vehicle lamp, vehicle lamp system and method for controlling vehicle lamp
JP2021181292A (en) * 2020-05-20 2021-11-25 株式会社小糸製作所 Light distribution control device, vehicular lamp fitting system and light distribution control method
JP2021193026A (en) * 2017-01-20 2021-12-23 株式会社小糸製作所 Control device for vehicle lighting fixture

