WO2022195836A1 - Backlight determination device, backlight determination method, and backlight determination program - Google Patents

Backlight determination device, backlight determination method, and backlight determination program Download PDF

Info

Publication number
WO2022195836A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
backlight
driver
event
information
Prior art date
Application number
PCT/JP2021/011312
Other languages
French (fr)
Japanese (ja)
Inventor
雅也 仁平
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社
Priority to JP2023506654A (JP7270866B2)
Priority to PCT/JP2021/011312
Publication of WO2022195836A1

Classifications

    • G — PHYSICS
    • G07 — CHECKING-DEVICES
    • G07C — TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 — Registering or indicating the working of vehicles

Definitions

  • the present disclosure relates to a backlight determination device, a backlight determination method, and a backlight determination program.
  • A vehicle periphery information collection device has been proposed that includes an image recognition unit that detects objects based on image information obtained by imaging the surroundings of the vehicle and acquires information about the objects, a vehicle periphery information generation unit that converts vehicle periphery information including the information about the objects into a predetermined code, and a storage unit that stores the codes converted by the vehicle periphery information generation unit. See, for example, Patent Document 1.
  • JP 2019-120969 A (for example, abstract, see FIG. 2)
  • However, the device of Patent Document 1 cannot accurately determine whether the driver who was driving the vehicle was in backlight when some event occurred in the vehicle.
  • An object of the present disclosure is to make it possible to accurately determine whether or not the driver is in a backlight state when a predetermined event occurs.
  • The backlight determination device of the present disclosure includes: an information acquisition unit that acquires event occurrence information indicating that a predetermined event has occurred in a vehicle, vehicle position information indicating the position of the vehicle at the time of occurrence of the event, an image of the outside of the vehicle captured at the time of occurrence, and distance information to surrounding objects obtained by detecting the outside of the vehicle at the time of occurrence; a light source position acquisition unit that acquires light source position information indicating the relative position of a light source with respect to the vehicle at the time of occurrence, based on a calculation using the vehicle position information and the time or based on the image; a shaded area acquisition unit that acquires shade information indicating a shaded area at the time of occurrence, based on map information including road and structure information, the vehicle position information, the light source position information, and the event occurrence information; and a determination unit that determines, based on the light source position information and the shade information, whether the state of the driver of the vehicle at the time of occurrence was a backlight state in which light from the light source was incident on the eyes of the driver.
  • Another backlight determination device of the present disclosure includes an acquisition unit that acquires an image output from an in-vehicle camera that captures the face of a driver of a vehicle and event occurrence information indicating that a predetermined event has occurred in the vehicle, and a determination unit that determines, using a learned model generated by prior machine learning to classify facial expressions when backlight enters the eyes, whether the driver's state at the time of occurrence of the event was a backlight state in which light from a light source was incident on the eyes of the driver of the vehicle.
  • Still another backlight determination device of the present disclosure includes an acquisition unit that acquires an image output from an in-vehicle camera that captures the eyes of a driver of a vehicle and event occurrence information indicating that a predetermined event has occurred in the vehicle, and a determination unit that determines, based on whether a corneal reflection and a pupillary light reflex occurred in sequence in the driver's eyes at the time of occurrence of the event, whether the driver's state was a backlight state in which light from a light source was incident on the eyes of the driver of the vehicle.
  • FIG. 1 is a functional block diagram schematically showing the configuration of a backlight determination device according to Embodiment 1.
  • FIG. 2 is a diagram showing the hardware configuration of the backlight determination device according to Embodiment 1.
  • FIG. 3(A) is a diagram showing an example at the time of occurrence of an event (when there is no shade), and FIG. 3(B) is a diagram showing the position of the sun relative to the vehicle.
  • FIG. 4(A) and FIG. 4(B) are diagrams showing examples at the time of occurrence of an event (when there is shade).
  • FIG. 5 is a flow chart showing the shadow-region detection operation of the backlight determination device according to Embodiment 1.
  • FIG. 6 is a functional block diagram schematically showing the configuration of a backlight determination device according to Embodiment 2.
  • FIG. 7 is a diagram showing the calculation process of a reflected light vector.
  • FIG. 8 is a flow chart showing the reflected-light detection operation of the backlight determination device according to Embodiment 2.
  • FIG. 9 is a functional block diagram schematically showing the configuration of a backlight determination device according to Embodiment 3.
  • FIG. 10 is a diagram showing the hardware configuration of the backlight determination device according to Embodiment 3.
  • FIG. 11 is a diagram showing a configuration for learning a facial expression determination model and a configuration for utilizing the facial expression determination model.
  • FIG. 12 is a flowchart showing facial expression detection processing.
  • FIG. 13 is a functional block diagram schematically showing the configuration of a backlight determination device according to Embodiment 4.
  • FIG. 14 is a diagram showing reflection detection processing.
  • FIG. 15 is a flow chart showing eyelid detection processing.
  • a backlight determination device, a backlight determination method, and a backlight determination program according to an embodiment will be described below with reference to the drawings.
  • the following embodiments are merely examples, and the embodiments can be combined as appropriate and each embodiment can be modified as appropriate.
  • FIG. 1 is a functional block diagram schematically showing the configuration of a backlight determination device 10 according to Embodiment 1.
  • the backlight determination device 10 is a device capable of executing the backlight determination method according to the first embodiment.
  • the backlight determination device 10 is, for example, a computer.
  • the backlight determination device 10 can execute the backlight determination method according to the first embodiment by executing the backlight determination program.
  • the backlight determination device 10 has an information acquisition section 11, a light source position acquisition section 12, a shade area acquisition section 13, and a determination section 14.
  • the backlight determination device 10 determines whether the driver was in a backlight state, in which light is incident on the driver's eyes, based on the information stored in the storage unit 62 of the drive recorder 60 of the vehicle and the map information 63.
  • the information stored in the storage unit 62 of the drive recorder 60 is obtained from the in-vehicle sensor 50 while the vehicle is running.
  • the in-vehicle sensor 50 includes, for example, an exterior camera 51 that captures the area outside the vehicle, a ranging sensor 52 that measures the distance to objects around the vehicle, a GPS 53 that is the position detection unit of a positioning system, and an acceleration sensor 54 that is used to detect a predetermined event. The ranging sensor 52 is, for example, LiDAR, millimeter-wave radar, sonar radar, or the like. When a stereo camera is used as the exterior camera 51, the ranging sensor 52 can be omitted.
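  • As a concrete illustration of the kind of drive-recorder record the determination device consumes, the following is a minimal sketch in Python; the class and field names (EventRecord, event_time, position, and so on) are hypothetical and are not taken from the patent.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional, Sequence, Tuple

@dataclass
class EventRecord:
    """One drive-recorder entry covering the moment an event E was detected."""
    event_time: datetime                    # time t at which the impact threshold was exceeded
    position: Tuple[float, float]           # vehicle position G: (latitude, longitude) from the GPS 53
    heading_deg: float                      # vehicle travel direction, degrees clockwise from north
    exterior_image: bytes                   # image I from the exterior camera 51
    distances_m: Sequence[float]            # distance information L from the ranging sensor 52
    peak_acceleration: float                # output of the acceleration sensor 54 that triggered the event
    interior_image: Optional[bytes] = None  # face/eye image from the in-vehicle camera 55 (Embodiments 3 and 4)
```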
  • the information acquisition unit 11 acquires event occurrence information A indicating that a predetermined event E has occurred in the vehicle, vehicle position information G indicating the position of the vehicle at the time event E occurred, an image I of the outside of the vehicle captured at that time, and distance information L indicating the distances to surrounding objects obtained by detecting the outside of the vehicle at the time event E occurred.
  • the light source position acquisition unit 12 acquires the position of the sun as a light source based on calculation using the vehicle position information G and time t. Further, the light source position acquisition unit 12 acquires light source position information indicating the relative position of the light source with respect to the vehicle when the event E occurs, based on the image I captured by the exterior camera 51 .
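  • A minimal sketch of how the sun's direction could be obtained from the vehicle position G and the time t is shown below; it uses a textbook declination/hour-angle approximation chosen for illustration (it is not a formula given in the patent) and ignores the equation of time and atmospheric refraction.

```python
import math
from datetime import datetime, timezone

def sun_position(lat_deg: float, lon_deg: float, when_utc: datetime) -> tuple[float, float]:
    """Return (azimuth_deg measured clockwise from north, elevation_deg) of the sun.

    Accuracy is on the order of a degree or two, which is sufficient for a
    coarse backlight check.
    """
    lat = math.radians(lat_deg)
    doy = when_utc.timetuple().tm_yday
    # Solar declination (Cooper's approximation).
    decl = math.radians(23.45) * math.sin(math.radians(360.0 / 365.0 * (284 + doy)))
    # Local solar time in hours, ignoring the equation of time.
    solar_time = (when_utc.hour + when_utc.minute / 60.0 + when_utc.second / 3600.0
                  + lon_deg / 15.0) % 24.0
    hour_angle = math.radians(15.0 * (solar_time - 12.0))
    # Standard spherical-trigonometry relations for elevation and azimuth.
    sin_elev = (math.sin(lat) * math.sin(decl)
                + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    elev = math.asin(max(-1.0, min(1.0, sin_elev)))
    az = math.atan2(math.sin(hour_angle),
                    math.cos(hour_angle) * math.sin(lat) - math.tan(decl) * math.cos(lat))
    azimuth_deg = (math.degrees(az) + 180.0) % 360.0  # 0 deg = north, clockwise
    return azimuth_deg, math.degrees(elev)

# Example: Tokyo, 2021-03-17 07:00 UTC (16:00 local time).
print(sun_position(35.68, 139.77, datetime(2021, 3, 17, 7, 0, tzinfo=timezone.utc)))
```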
  • the light source may be the headlights of an oncoming vehicle traveling in the oncoming lane. In this case, the position of the light source is calculated based on the image of the camera outside the vehicle or the detection signal of the ranging sensor.
  • the shade area acquisition unit 13 acquires shade information indicating the shaded area at the time event E occurred, based on map information M including information on roads and structures, the vehicle position information G, the light source position information S, and the event occurrence information A.
  • based on the light source position information and the shade information, the determination unit 14 determines whether the state of the driver of the vehicle when event E occurred was a backlight state in which light from the light source entered the driver's eyes.
  • the occurrence time of the event E includes a predetermined time immediately before the moment when the event E occurred.
  • the predetermined time is, for example, 0.5 seconds, 1 second, 2 seconds, or 3 seconds.
  • alternatively, the determination may be made using a plurality of different predetermined times.
  • FIG. 2 is a diagram showing the hardware configuration of the backlight determination device 10 according to Embodiment 1.
  • the backlight determination device 10 includes a processor 101 such as a CPU (Central Processing Unit), a memory 102 that is a volatile storage device, and a storage device 103 that is a non-volatile storage device such as a hard disk drive (HDD) or a solid state drive (SSD).
  • the memory 102 is, for example, a volatile semiconductor memory such as a RAM (Random Access Memory).
  • each function of the backlight determination device 10 is implemented by a processing circuit. The processing circuit may be dedicated hardware or the processor 101 executing a program stored in the memory 102.
  • the processor 101 may be any of a processing device, an arithmetic device, a microprocessor, a microcomputer, and a DSP (Digital Signal Processor).
  • if the processing circuit is dedicated hardware, the processing circuit is, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of any of these.
  • when the processing circuit is the processor 101, the backlight determination program is realized by software, firmware, or a combination of software and firmware.
  • Software and firmware are written as programs and stored in memory 102 .
  • the processor 101 reads and executes the backlight determination program stored in the memory 102, thereby implementing the functions of the units shown in FIG.
  • the backlight determination device 10 may be partially realized by dedicated hardware and partially realized by software or firmware.
  • the processing circuitry may implement each of the functions described above in hardware, software, firmware, or any combination thereof.
  • FIG. 3(A) is a diagram showing an example when an event occurs (when there is no shade)
  • FIG. 3(B) is a diagram showing the position of the sun relative to the vehicle.
  • as shown in FIG. 3(B), at the time event E occurred, if the first angle α, which is the horizontal angle that the central ray of the light entering the driver's eyes forms with the traveling direction of the vehicle, is within a predetermined first angle range, and the second angle β, which is the vertical angle that the central ray forms with the traveling direction, is within a predetermined second angle range, and the driver is outside the shaded area, the determination unit 14 determines that the driver was in the backlight state when event E occurred. The first angle α is an azimuth angle, and the second angle β is an elevation angle.
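  • A minimal sketch of the angle test described above follows; the threshold values (±30° of azimuth, 0–45° of elevation) are placeholders for illustration, not values specified in the patent.

```python
def is_backlit(sun_azimuth_deg: float, sun_elevation_deg: float,
               vehicle_heading_deg: float, in_shade: bool,
               azimuth_range_deg: float = 30.0,
               elevation_range_deg: tuple[float, float] = (0.0, 45.0)) -> bool:
    """Return True if the driver is judged to have been in the backlight state.

    alpha: horizontal angle between the sun direction and the travel direction.
    beta:  vertical angle of the sun above the horizon (elevation).
    """
    if in_shade:
        return False
    # First angle alpha: azimuth difference wrapped into [-180, 180].
    alpha = (sun_azimuth_deg - vehicle_heading_deg + 180.0) % 360.0 - 180.0
    # Second angle beta: elevation of the light source.
    beta = sun_elevation_deg
    return abs(alpha) <= azimuth_range_deg and elevation_range_deg[0] <= beta <= elevation_range_deg[1]
```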
  • FIG. 4(A) is a diagram showing an example when an event occurs (when there is a shade)
  • FIG. 4(B) is a diagram showing an example at the time of occurrence of an event (when there is shade). If the driver is within the shaded area when event E occurs, the determination unit determines that the driver was in a non-backlight state when event E occurred. For example, as shown in FIG. 4(A), if there is a shaded area at the time of event E caused by a building, which is a structure shown in the map information, the state of the driver of a vehicle in this shaded area is a non-backlight state. Likewise, as shown in FIG. 4(B), if the camera image shows that a large vehicle ahead casts a shadow at the time of event E, the driver of a vehicle in that shaded area is in a non-backlight state.
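  • The shade determination of FIG. 4 can be pictured with the simple geometry below: an obstacle (a building from the map information or a large vehicle detected ahead) is modelled as a vertical wall, and the vehicle is considered shaded if it lies within the shadow that wall casts for the computed sun direction. The flat-wall model and the local east/north coordinates are simplifying assumptions, not part of the patent.

```python
import math

def in_shadow_of(vehicle_xy: tuple[float, float],
                 obstacle_xy: tuple[float, float],
                 obstacle_height_m: float,
                 obstacle_halfwidth_m: float,
                 sun_azimuth_deg: float,
                 sun_elevation_deg: float) -> bool:
    """True if the vehicle lies in the shadow cast by a simplified obstacle.

    Coordinates are metres in a local (east, north) frame; the obstacle is a
    vertical wall of the given height and half-width centred at obstacle_xy.
    """
    if sun_elevation_deg <= 0.0:
        return True  # sun below the horizon: no direct sunlight anyway
    az = math.radians(sun_azimuth_deg)
    shadow_dir = (-math.sin(az), -math.cos(az))  # shadows point away from the sun
    dx = vehicle_xy[0] - obstacle_xy[0]
    dy = vehicle_xy[1] - obstacle_xy[1]
    along = dx * shadow_dir[0] + dy * shadow_dir[1]      # distance down-shadow
    across = -dx * shadow_dir[1] + dy * shadow_dir[0]    # lateral offset
    if along <= 0.0 or abs(across) > obstacle_halfwidth_m:
        return False
    shadow_length = obstacle_height_m / math.tan(math.radians(sun_elevation_deg))
    return along <= shadow_length

# Example: a 20 m building 15 m to the west, with the sun low in the western sky.
print(in_shadow_of((0.0, 0.0), (-15.0, 0.0), 20.0, 10.0,
                   sun_azimuth_deg=270.0, sun_elevation_deg=20.0))  # True
```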
  • FIG. 5 is a flow chart showing the shadow area detection operation of the backlight determination device 10 according to the first embodiment.
  • the backlight determination device 10 determines whether or not there is an impact based on the output of the acceleration sensor, and determines that an event has occurred when, for example, the impact exceeds a threshold value (step S101).
  • An event in this case is, for example, a situation where the vehicle has hit or been hit by something.
  • the backlight determination device 10 acquires vehicle position information by GPS (step S102).
  • next, the backlight determination device 10 detects structures (for example, a building) or surrounding objects (for example, a large vehicle ahead on the road) using the map information, the image from the exterior camera, and the ranging sensor (step S103).
  • the backlight determination device 10 estimates (that is, calculates) the position of the sun, which is the light source, from the vehicle position information, the surrounding object information, and the time (step S104).
  • if there is no surrounding object (NO in step S105), the backlight determination device 10 determines whether light enters the vehicle based on the positional relationship (that is, the relative position) between the vehicle and the light source (step S106).
  • if there is a surrounding object (YES in step S105), the backlight determination device 10 determines whether the surrounding object is a structure registered in the map information (step S107).
  • if the structure is not registered in the map information (NO in step S107), the backlight determination device 10 acquires the position and shape of the object using the exterior camera or the ranging sensor (step S108), and determines whether light enters the vehicle from the position information of the vehicle, the light source, and the surrounding object (step S110).
  • if the structure is registered in the map information (YES in step S107), the backlight determination device 10 acquires the position and shape of the object from the map information (step S109), and determines whether light enters the vehicle from the position information of the vehicle, the light source, and the surrounding object (step S110).
  • as described above, when the backlight determination device 10, the backlight determination method, or the backlight determination program according to the first embodiment is used, it is possible to accurately determine whether the driver was in a backlight state or a non-backlight state when a predetermined event occurred.
  • Embodiment 2. In Embodiment 1, the case where light from the light source directly enters the eyes of the driver of the vehicle has been described. However, a structure may have a light-reflecting area, such as a building whose walls consist of glass windows. Embodiment 2 describes a backlight determination device 20 that enables a determination that takes into account reflected light from a structure having a light-reflecting area.
  • FIG. 6 is a functional block diagram schematically showing the configuration of the backlight determination device 20 according to the second embodiment.
  • the backlight determination device 20 is a device capable of executing the backlight determination method according to the second embodiment.
  • the backlight determination device 20 is, for example, a computer.
  • the backlight determination device 20 can execute the backlight determination method according to the second embodiment by executing the backlight determination program.
  • the backlight determination device 20 includes an information acquisition unit 21, a light source position acquisition unit 22, a reflected light vector acquisition unit 23, a shade area acquisition unit 24, and a determination unit 25.
  • the information acquisition unit 21 and the light source position acquisition unit 22 are the same as the information acquisition unit 11 and the light source position acquisition unit 12 in the first embodiment.
  • the backlight determination device 20 differs from the backlight determination device 10 according to the first embodiment in that it has a reflected light vector acquisition unit 23 and a shadow area acquisition unit 24 .
  • FIG. 7 is a diagram showing the calculation process of the reflected light vector.
  • first, the reflected light vector acquisition unit 23 grasps how light is reflected by surrounding objects, determines whether the reflected light enters the vehicle, and records the result. For the reflection area, for example, the presence or absence of reflection is determined from the image of the exterior camera. After the reflection area has been determined from the exterior camera image, the light-reflecting surface is calculated from the measurement result of the ranging sensor or from the map information. From the positional relationship between the light and the reflecting surface, the normal vector of the reflecting surface is calculated; the smaller the angles (α, β) between the traveling direction of the vehicle and the reflected light vector, the greater the influence.
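  • The reflected-light-vector geometry of FIG. 7 can be sketched as follows: given the incident sunlight direction and the normal vector of the reflecting surface (for example, a glass facade), the reflected ray is r = d − 2(d·n)n, and its direction is compared with the vehicle's traveling direction. The vector conventions and the example values here are illustrative assumptions.

```python
import math

Vec3 = tuple[float, float, float]

def reflect(incident: Vec3, normal: Vec3) -> Vec3:
    """Specular reflection of the incident light direction: r = d - 2 (d . n) n."""
    n_len = math.sqrt(sum(c * c for c in normal))
    n = tuple(c / n_len for c in normal)
    dot = sum(dc * nc for dc, nc in zip(incident, n))
    return tuple(dc - 2.0 * dot * nc for dc, nc in zip(incident, n))

def angle_deg(a: Vec3, b: Vec3) -> float:
    """Angle between two vectors, in degrees."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nb)))))

# Sunlight travelling east and slightly downward hits a glass wall whose normal
# points west (-x). The reflected ray comes back toward a vehicle heading east.
incident = (1.0, 0.0, -0.2)
reflected = reflect(incident, (-1.0, 0.0, 0.0))
travel_dir = (1.0, 0.0, 0.0)
glare_dir = tuple(-c for c in reflected)   # direction from the driver toward the apparent glare source
# A small angle between the glare direction and the travel direction means the
# reflection is nearly head-on for the driver, i.e. a large influence.
print(reflected, round(angle_deg(glare_dir, travel_dir), 1))
```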
  • FIG. 8 is a flowchart showing the reflected light detection operation of the backlight determination device 20 according to the second embodiment.
  • the backlight determination device 20 determines whether or not there is an impact based on the output of the acceleration sensor, and determines that an event has occurred when, for example, the impact exceeds a threshold value (step S201).
  • the backlight determination device 20 acquires vehicle position information by GPS (step S202).
  • next, the backlight determination device 20 detects structures (for example, a building) or surrounding objects (for example, a large vehicle ahead on the road) using the map information, the image from the exterior camera, and the ranging sensor (step S203).
  • the backlight determination device 20 estimates (that is, calculates) the position of the sun, which is the light source, from the vehicle position information, the surrounding object information, and the time (step S204).
  • if there is no surrounding object, the backlight determination device 20 determines whether light enters the vehicle based on the positional relationship (that is, the relative position) between the vehicle and the light source (step S206).
  • if there is a surrounding object, the backlight determination device 20 determines whether the surrounding object is a structure registered in the map information (step S207).
  • if the structure is not registered in the map information (NO in step S207), the backlight determination device 20 acquires the position, shape, and surfaces of the object using the exterior camera or the ranging sensor (step S208), and, for each surface of the object, determines in step S210 whether light enters the vehicle based on the position information of the overexposed area in the camera image.
  • if the structure is registered in the map information (YES in step S207), the backlight determination device 20 acquires the position, shape, and surfaces of the object from the map information (step S209), and determines whether light enters the vehicle based on the position information of the overexposed area (step S210).
  • the backlight determination device 20 then determines whether light enters the vehicle (step S212), and calculates the incident light vector to the vehicle from the light source position information and the surface of the object (step S213).
  • as described above, when the backlight determination device 20, the backlight determination method, or the backlight determination program according to the second embodiment is used, it is possible to accurately determine whether the driver was in a backlight state or a non-backlight state when a predetermined event occurred.
  • in other respects, the second embodiment is the same as the first embodiment.
  • Embodiment 3. Embodiments 1 and 2 above do not use the facial expression (for example, squinting) that the driver shows when dazzled. In Embodiment 3, an example will be described in which the presence or absence of backlight is determined based on the facial expression of the driver. For example, backlight determination can be performed using both the determination result of the backlight determination device according to the first or second embodiment and the determination result of the backlight determination device according to the third embodiment.
  • FIG. 9 is a functional block diagram schematically showing the configuration of the backlight determination device 30 according to Embodiment 3.
  • the backlight determination device 30 is a device capable of executing the backlight determination method according to the third embodiment.
  • the backlight determination device 30 is, for example, a computer.
  • the backlight determination device 30 can execute the backlight determination method according to the third embodiment by executing the backlight determination program.
  • the backlight determination device 30 has an information acquisition section 31 and an expression determination section 32 as a determination section.
  • the facial expression determination unit 32 uses a facial expression determination model, which is a learned model generated in advance by machine learning (for example, deep learning), to determine whether the driver's facial expression is that of a person experiencing glare.
  • the backlight determination device 30 determines whether the driver was in a backlight state, in which light is incident on the driver's eyes, based on the information stored in the storage unit 62 of the drive recorder 60 of the vehicle.
  • the information stored in the storage unit 62 of the drive recorder 60 is obtained from the in-vehicle sensor 50 while the vehicle is running.
  • the in-vehicle sensor 50 has, for example, an in-vehicle camera 55 that captures the face of the driver inside the vehicle, and an acceleration sensor 54 that is used to detect a predetermined event.
  • the in-vehicle sensor 50 may also include the ranging sensor 52, the GPS 53, and the exterior camera 51, as in Embodiment 1 or 2.
  • FIG. 10 is a diagram showing the hardware configuration of the backlight determination device 30 according to the third embodiment.
  • the backlight determination device 30 has a processor 301 such as a CPU, a memory 302 that is a volatile storage device, and a storage device 303 such as an HDD or SSD.
  • Memory 302 is, for example, a RAM.
  • Each function of the backlight determination device 30 is implemented by a processing circuit, as in the case of the first embodiment.
  • the processing circuit may be dedicated hardware or the processor 301 executing a program stored in the memory 302.
  • FIG. 11 is a diagram showing a configuration for learning the facial expression determination model and a configuration for utilizing the facial expression determination model.
  • the driver's facial expression is detected, and facial expressions under glare are used as teacher data; a model trained by deep learning classifies each detected facial expression into a face experiencing glare or another face.
  • the facial expression is recorded for several seconds immediately before event E, and the pre-trained facial expression determination model is used to determine whether the driver was experiencing glare. The degree of glare can also be determined in stages according to the classification accuracy (confidence).
  • FIG. 12 is a flowchart showing facial expression detection processing of the backlight determination device 30 according to the third embodiment.
  • the backlight determination device 30 determines whether or not there is an impact based on the output of the acceleration sensor, and determines that an event has occurred when, for example, the impact exceeds a threshold value (step S301).
  • next, the backlight determination device 30 acquires camera images of the driver's facial expression at the time of occurrence of the event (including several seconds immediately before it) (step S302).
  • next, the backlight determination device 30 determines the degree to which the driver's eyes are open or closed (step S303). This is because people tend to make facial expressions such as narrowing their eyes or lowering the corners of their eyes when the light is bright.
  • the backlight determination device 30 determines whether or not the length of time during which the driver feels glare is equal to or greater than a predetermined threshold (step S304).
  • if the length of time is less than the threshold (NO in step S304), the backlight determination device 30 determines that the driver did not have a dazzled facial expression. If it is equal to or greater than the threshold (YES in step S304), the backlight determination device 30 performs glare determination (value 0 or 1) using the facial expression determination model, which is a learned model, for each of a plurality of frames immediately before the occurrence of the event (step S306). At this time, the degree of glare is calculated by assigning a larger weight to the glare determination value of a frame closer to the occurrence of the event (step S307). The weight increases linearly with time, for example, the closer the frame is to the event occurrence time.
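  • A minimal sketch of the weighting in steps S306–S307: the per-frame 0/1 outputs of the facial expression determination model (obtained elsewhere; the model itself is not shown) are averaged with linearly increasing weights so that frames closer to the event count more. The function name and the example values are illustrative.

```python
from typing import Sequence

def glare_degree(frame_glare_values: Sequence[int]) -> float:
    """Weighted degree of glare over the frames immediately before the event.

    frame_glare_values[i] is the 0/1 glare decision for frame i, ordered oldest
    first, so the last element is the frame closest to the moment the event
    occurred. Weights grow linearly with time (step S307).
    """
    n = len(frame_glare_values)
    if n == 0:
        return 0.0
    weights = [i + 1 for i in range(n)]                  # 1, 2, ..., n
    weighted = sum(w * v for w, v in zip(weights, frame_glare_values))
    return weighted / sum(weights)                       # 0.0 (no glare) .. 1.0 (glare right up to the event)

# Example: the driver starts squinting only in the last few frames before the impact.
print(glare_degree([0, 0, 0, 1, 1, 1]))  # about 0.71
```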
  • as described above, when the backlight determination device 30, the backlight determination method, or the backlight determination program according to the third embodiment is used, a learned model of the facial expression shown when feeling glare makes it possible to accurately determine whether the driver was in a backlight state or a non-backlight state when a predetermined event occurred.
  • the determination of the third embodiment may be performed in addition to the determination of at least one of the first and second embodiments. If a plurality of determination methods are used together, the determination accuracy is improved.
  • Embodiment 4. Embodiments 1 to 3 described above do not utilize the change in the pupil that occurs when the driver is dazzled. In Embodiment 4, an example will be described in which the presence or absence of backlight is determined based on the driver's pupils. The configuration of Embodiment 4 can be combined with any one or more of the backlight determination devices of Embodiments 1 to 3. For example, backlight determination can be performed using both the determination result of the backlight determination device according to the first or second embodiment and the determination result of the backlight determination device according to the third embodiment.
  • FIG. 13 is a functional block diagram schematically showing the configuration of the backlight determination device 40 according to the fourth embodiment.
  • the backlight determination device 40 is a device capable of executing the backlight determination method according to the fourth embodiment.
  • the backlight determination device 40 is, for example, a computer.
  • the backlight determination device 40 can execute the backlight determination method according to the fourth embodiment by executing the backlight determination program.
  • the backlight determination device 40 includes an information acquisition unit 41, an eyelid opening/closing detection unit 42 that detects the opening and closing of the eyes (eyelids) based on the face image, a light reflection detection unit 43 that detects the pupillary light reflex, a corneal reflection detection unit 44 that detects reflection from the cornea, and an eye determination unit 45 as a determination unit. The rest of the configuration is the same as that of Embodiment 3.
  • FIG. 14 is a diagram showing reflection detection processing of the backlight determination device 40 according to the fourth embodiment.
  • Humans close their eyes (eyelids) when they feel glare, but it is not known whether the eyes were closed because of incoming light or by chance. Therefore, whether the eyes were closed because of backlight is determined in combination with the reflected light vector.
  • the backlight determination device 40 determines that the backlight state is present when there is reflected light entering the eyes detected by the method of the first or second embodiment and the eyes are closed.
  • when light enters the eye, the corneal reflection is detected first; then the pupil becomes smaller and the pupillary light reflex occurs. Therefore, when the corneal reflection is detected by the corneal reflection detection unit 44 and the pupillary light reflex is then detected by the light reflection detection unit 43, it is determined that the backlight state was present.
  • the nerve transmission until the pupil reacts follows the order below, which is why the pupil's reaction is delayed: light input → retina → optic nerve → optic chiasm → optic tract → pretectal olivary nucleus → Edinger-Westphal nucleus → oculomotor nerve → ciliary ganglion → sphincter pupillae.
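  • A minimal sketch of the sequence check this implies: per-frame detections of the corneal reflection and of the pupillary light reflex (pupil constriction) are scanned for the order "corneal reflection first, light reflex in a later frame". The frame-level detections themselves are assumed to come from image processing that is not shown here.

```python
from typing import Sequence

def backlight_by_reflex_order(corneal: Sequence[bool], light_reflex: Sequence[bool]) -> bool:
    """True if a corneal reflection is detected and the pupillary light reflex
    follows in a later frame, as expected when bright light actually entered
    the eye (the reflex lags because of the neural pathway described above)."""
    first_corneal = next((i for i, detected in enumerate(corneal) if detected), None)
    if first_corneal is None:
        return False
    return any(light_reflex[i] for i in range(first_corneal + 1, len(light_reflex)))

# Example: the corneal glint appears at frame 2 and the pupil constricts from frame 5.
corneal =      [False, False, True, True, True, True, True]
light_reflex = [False, False, False, False, False, True, True]
print(backlight_by_reflex_order(corneal, light_reflex))  # True
```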
  • FIG. 15 is a flowchart showing eyelid detection processing of the backlight determination device 40 according to the fourth embodiment.
  • the backlight determination device 40 determines whether or not there is an impact based on the output of the acceleration sensor, and determines that an event has occurred when, for example, the impact exceeds a threshold value (step S401).
  • the backlight determination device 40 acquires the camera image of the driver's eyes at the time of the occurrence of the event (including several seconds immediately before) (step S402).
  • next, the backlight determination device 40 determines the degree to which the driver's eyes (eyelids) are open or closed (step S403). This is because people tend to close their eyes when it is bright. Information for several seconds before the occurrence of the event is recorded, and if the time until the open eyelids become closed is equal to or greater than a threshold, the eyelids are judged to be open.
  • the backlight determination device 40 determines whether or not the eyelids are open (step S404).
  • the backlight determination device 40 determines whether light is received by the method described in the first or second embodiment.
  • the backlight determination device 40 detects the presence or absence of corneal reflection for each frame (step S406), detects the presence or absence of the pupillary light reflex (step S408), and proceeds to step S409.
  • the backlight determination device 40 assigns, for each frame, a larger weight to the eyelid open/close value the closer the frame is to the moment of event occurrence (step S407).
  • the weight increases linearly with time, for example, the closer the frame is to the event occurrence time.
  • as described above, when the backlight determination device 40, the backlight determination method, or the backlight determination program according to the fourth embodiment is used, the detection results for the corneal reflection and the pupillary light reflex make it possible to accurately determine whether the driver was in a backlight state or a non-backlight state when a predetermined event occurred.
  • the determination of Embodiment 4 may be performed in addition to the determination of at least one of Embodiments 1 to 3. If a plurality of determination methods are used together, the determination accuracy is improved.
  • in other respects, the fourth embodiment is the same as the third embodiment.
  • the judgment results of the backlight determination devices of Embodiments 1 to 4 can be applied to a vehicle driving support device. For example, for events whose number of occurrences should be reduced and that are determined to have been caused by backlight, a device can warn the driver of a vehicle traveling at a time and in a place where backlight is likely to occur. In this case, the driver can learn of the time and place where backlight is likely to occur before arriving there, and can take countermeasures.
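  • One way such judgment results could feed a driving-support function is sketched below under assumed data structures: previously recorded backlight-caused events (place and hour of day) are matched against the vehicle's current position and clock to decide whether to warn the driver in advance. The record format, thresholds, and function name are illustrative assumptions.

```python
import math
from datetime import datetime
from typing import Iterable, Tuple

def should_warn(current_pos: Tuple[float, float], now: datetime,
                backlight_events: Iterable[Tuple[Tuple[float, float], int]],
                radius_m: float = 300.0, hour_window: int = 1) -> bool:
    """True if the vehicle is near a place, and close to a time of day,
    where a backlight-caused event was previously recorded.

    backlight_events is an iterable of ((lat, lon), hour_of_day) pairs.
    """
    for (lat, lon), hour in backlight_events:
        # Rough metres-per-degree conversion, good enough for a proximity check.
        d_north = (current_pos[0] - lat) * 111_000.0
        d_east = (current_pos[1] - lon) * 111_000.0 * math.cos(math.radians(lat))
        near = math.hypot(d_north, d_east) <= radius_m
        hour_diff = min(abs(now.hour - hour), 24 - abs(now.hour - hour))
        if near and hour_diff <= hour_window:
            return True
    return False

# Example: a backlight-caused event was previously recorded nearby at around 17:00.
print(should_warn((35.681, 139.767), datetime(2021, 3, 17, 17, 10),
                  [((35.680, 139.766), 17)]))  # True
```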

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

A backlight determination device (10) comprises: an information acquisition unit (11) that acquires event occurrence information (A), vehicle position information (G) indicating the position of a vehicle at the time of occurrence of an event, an image (I) obtained by photographing the outside of the vehicle, and distance information (L) on a distance to a nearby object obtained by detecting the outside of the vehicle at the time of occurrence of the event; a light source position acquisition unit (12) that acquires light source position information indicating the relative position of a light source with respect to the vehicle at the time of occurrence of the event on the basis of the image (I) or calculation using the vehicle position information (G) and a time (t); a shady region acquisition unit (13) that acquires shade information indicating a shady region at the time of occurrence of the event on the basis of map information (M) including road and structure information, the vehicle position information (G), the light source position information (S), and the event occurrence information (A); and a determination unit (14) that determines whether or not the state of the driver of the vehicle at the time of occurrence of the event has been a backlight state in which light is incident from the light source on the eyes of the driver on the basis of the light source position information and the shade information.

Description

Backlight determination device, backlight determination method, and backlight determination program
 The present disclosure relates to a backlight determination device, a backlight determination method, and a backlight determination program.
 A vehicle periphery information collection device has been proposed that includes an image recognition unit that detects objects based on image information obtained by imaging the surroundings of the vehicle and acquires information about the objects, a vehicle periphery information generation unit that converts vehicle periphery information including the information about the objects into a predetermined code, and a storage unit that stores the codes converted by the vehicle periphery information generation unit. See, for example, Patent Document 1.
JP 2019-120969 A (see, for example, the abstract and FIG. 2)
 However, the device described in Patent Document 1 cannot accurately determine whether the driver who was driving the vehicle was in backlight when some event occurred in the vehicle.
 An object of the present disclosure is to make it possible to accurately determine whether the driver was in a backlight state when a predetermined event occurred.
 The backlight determination device of the present disclosure includes: an information acquisition unit that acquires event occurrence information indicating that a predetermined event has occurred in a vehicle, vehicle position information indicating the position of the vehicle at the time of occurrence of the event, an image of the outside of the vehicle captured at the time of occurrence, and distance information to surrounding objects obtained by detecting the outside of the vehicle at the time of occurrence; a light source position acquisition unit that acquires light source position information indicating the relative position of a light source with respect to the vehicle at the time of occurrence, based on a calculation using the vehicle position information and the time or based on the image; a shaded area acquisition unit that acquires shade information indicating a shaded area at the time of occurrence, based on map information including road and structure information, the vehicle position information, the light source position information, and the event occurrence information; and a determination unit that determines, based on the light source position information and the shade information, whether the state of the driver of the vehicle at the time of occurrence was a backlight state in which light from the light source was incident on the eyes of the driver.
 Another backlight determination device of the present disclosure includes an acquisition unit that acquires an image output from an in-vehicle camera that captures the face of a driver of a vehicle and event occurrence information indicating that a predetermined event has occurred in the vehicle, and a determination unit that determines, using a learned model generated by prior machine learning to classify facial expressions when backlight enters the eyes, whether the driver's state at the time of occurrence of the event was a backlight state in which light from a light source was incident on the eyes of the driver of the vehicle.
 Still another backlight determination device of the present disclosure includes an acquisition unit that acquires an image output from an in-vehicle camera that captures the eyes of a driver of a vehicle and event occurrence information indicating that a predetermined event has occurred in the vehicle, and a determination unit that determines, based on whether a corneal reflection and a pupillary light reflex occurred in sequence in the driver's eyes at the time of occurrence of the event, whether the driver's state was a backlight state in which light from a light source was incident on the eyes of the driver of the vehicle.
 According to the present disclosure, it becomes possible to accurately determine whether the driver was in a backlight state when a predetermined event occurred.
FIG. 1 is a functional block diagram schematically showing the configuration of a backlight determination device according to Embodiment 1.
FIG. 2 is a diagram showing the hardware configuration of the backlight determination device according to Embodiment 1.
FIG. 3(A) is a diagram showing an example at the time of occurrence of an event (when there is no shade), and FIG. 3(B) is a diagram showing the position of the sun relative to the vehicle.
FIG. 4(A) and FIG. 4(B) are diagrams showing examples at the time of occurrence of an event (when there is shade).
FIG. 5 is a flow chart showing the shadow-region detection operation of the backlight determination device according to Embodiment 1.
FIG. 6 is a functional block diagram schematically showing the configuration of a backlight determination device according to Embodiment 2.
FIG. 7 is a diagram showing the calculation process of a reflected light vector.
FIG. 8 is a flow chart showing the reflected-light detection operation of the backlight determination device according to Embodiment 2.
FIG. 9 is a functional block diagram schematically showing the configuration of a backlight determination device according to Embodiment 3.
FIG. 10 is a diagram showing the hardware configuration of the backlight determination device according to Embodiment 3.
FIG. 11 is a diagram showing a configuration for learning a facial expression determination model and a configuration for utilizing the facial expression determination model.
FIG. 12 is a flowchart showing facial expression detection processing.
FIG. 13 is a functional block diagram schematically showing the configuration of a backlight determination device according to Embodiment 4.
FIG. 14 is a diagram showing reflection detection processing.
FIG. 15 is a flow chart showing eyelid detection processing.
 A backlight determination device, a backlight determination method, and a backlight determination program according to embodiments will be described below with reference to the drawings. The following embodiments are merely examples; the embodiments can be combined as appropriate, and each embodiment can be modified as appropriate.
Embodiment 1.
 FIG. 1 is a functional block diagram schematically showing the configuration of a backlight determination device 10 according to Embodiment 1. The backlight determination device 10 is a device capable of executing the backlight determination method according to the first embodiment. The backlight determination device 10 is, for example, a computer. The backlight determination device 10 can execute the backlight determination method according to the first embodiment by executing the backlight determination program.
 As shown in FIG. 1, the backlight determination device 10 has an information acquisition section 11, a light source position acquisition section 12, a shade area acquisition section 13, and a determination section 14. The backlight determination device 10 determines whether the driver was in a backlight state, in which light is incident on the driver's eyes, based on the information stored in the storage unit 62 of the drive recorder 60 of the vehicle and the map information 63. The information stored in the storage unit 62 of the drive recorder 60 is obtained from the in-vehicle sensor 50 while the vehicle is running. The in-vehicle sensor 50 includes, for example, an exterior camera 51 that captures the area outside the vehicle, a ranging sensor 52 that measures the distance to objects around the vehicle, a GPS 53 that is the position detection unit of a positioning system, and an acceleration sensor 54 that is used to detect a predetermined event. The ranging sensor 52 is, for example, LiDAR, millimeter-wave radar, sonar radar, or the like. When a stereo camera is used as the exterior camera 51, the ranging sensor 52 can be omitted.
 The information acquisition unit 11 acquires event occurrence information A indicating that a predetermined event E has occurred in the vehicle, vehicle position information G indicating the position of the vehicle at the time event E occurred, an image I of the outside of the vehicle captured at that time, and distance information L indicating the distances to surrounding objects obtained by detecting the outside of the vehicle at the time event E occurred.
 The light source position acquisition unit 12 acquires the position of the sun as a light source based on a calculation using the vehicle position information G and the time t. The light source position acquisition unit 12 also acquires light source position information indicating the relative position of the light source with respect to the vehicle at the time event E occurred, based on the image I captured by the exterior camera 51. The light source may be the headlights of an oncoming vehicle traveling in the oncoming lane; in this case, the position of the light source is calculated based on the image of the exterior camera or the detection signal of the ranging sensor.
 The shade area acquisition unit 13 acquires shade information indicating the shaded area at the time event E occurred, based on map information M including information on roads and structures, the vehicle position information G, the light source position information S, and the event occurrence information A.
 Based on the light source position information and the shade information, the determination unit 14 determines whether the state of the driver of the vehicle when event E occurred was a backlight state in which light from the light source entered the driver's eyes. The time of occurrence of event E includes a predetermined time immediately before the moment event E occurred. The predetermined time is, for example, 0.5 seconds, 1 second, 2 seconds, or 3 seconds. Alternatively, the determination may be made using a plurality of different predetermined times.
 FIG. 2 is a diagram showing the hardware configuration of the backlight determination device 10 according to Embodiment 1. As shown in FIG. 2, the backlight determination device 10 includes a processor 101 such as a CPU (Central Processing Unit), a memory 102 that is a volatile storage device, and a storage device 103 that is a non-volatile storage device such as a hard disk drive (HDD) or a solid state drive (SSD). The memory 102 is, for example, a volatile semiconductor memory such as a RAM (Random Access Memory).
 Each function of the backlight determination device 10 is realized by a processing circuit. The processing circuit may be dedicated hardware or the processor 101 executing a program stored in the memory 102. The processor 101 may be any of a processing device, an arithmetic device, a microprocessor, a microcomputer, and a DSP (Digital Signal Processor).
 If the processing circuit is dedicated hardware, the processing circuit is, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of any of these.
 When the processing circuit is the processor 101, the backlight determination program is realized by software, firmware, or a combination of software and firmware. Software and firmware are written as programs and stored in the memory 102. The processor 101 reads and executes the backlight determination program stored in the memory 102, thereby implementing the functions of the units shown in FIG. 1.
 The backlight determination device 10 may be partially realized by dedicated hardware and partially realized by software or firmware. In this way, the processing circuit can implement each of the functions described above in hardware, software, firmware, or any combination thereof.
 FIG. 3(A) is a diagram showing an example at the time of occurrence of an event (when there is no shade), and FIG. 3(B) is a diagram showing the position of the sun relative to the vehicle. As shown in FIG. 3(B), at the time event E occurred, if the first angle α, which is the horizontal angle that the central ray of the light entering the driver's eyes forms with the traveling direction of the vehicle, is within a predetermined first angle range, and the second angle β, which is the vertical angle that the central ray forms with the traveling direction, is within a predetermined second angle range, and the driver is outside the shaded area, the determination unit 14 determines that the driver was in the backlight state when event E occurred. The first angle α is an azimuth angle, and the second angle β is an elevation angle.
 FIG. 4(A) and FIG. 4(B) are diagrams showing examples at the time of occurrence of an event (when there is shade). If the driver is within the shaded area when event E occurs, the determination unit determines that the driver was in a non-backlight state when event E occurred. For example, as shown in FIG. 4(A), if there is a shaded area at the time of event E caused by a building, which is a structure shown in the map information, the state of the driver of a vehicle in this shaded area is a non-backlight state. Likewise, as shown in FIG. 4(B), if the camera image shows that a large vehicle ahead casts a shadow at the time of event E, the driver of a vehicle in that shaded area is in a non-backlight state.
 FIG. 5 is a flow chart showing the shadow-region detection operation of the backlight determination device 10 according to the first embodiment. The backlight determination device 10 determines whether there was an impact based on the output of the acceleration sensor, and determines that an event has occurred when, for example, the impact exceeds a threshold value (step S101). An event in this case is, for example, a situation in which the vehicle has hit, or has been hit by, something.
 Next, the backlight determination device 10 acquires vehicle position information by GPS (step S102).
 Next, the backlight determination device 10 detects structures (for example, a building) or surrounding objects (for example, a large vehicle ahead on the road) using the map information, the image from the exterior camera, and the ranging sensor (step S103).
 Next, the backlight determination device 10 estimates (that is, calculates) the position of the sun, which is the light source, from the vehicle position information, the surrounding object information, and the time (step S104).
 If there is no surrounding object (NO in step S105), the backlight determination device 10 determines whether light enters the vehicle based on the positional relationship (that is, the relative position) between the vehicle and the light source (step S106).
 If there is a surrounding object (YES in step S105), the backlight determination device 10 determines whether the surrounding object is a structure registered in the map information (step S107).
 If the structure is not registered in the map information (NO in step S107), the backlight determination device 10 acquires the position and shape of the object using the exterior camera or the ranging sensor (step S108), and determines whether light enters the vehicle from the position information of the vehicle, the light source, and the surrounding object (step S110).
 If the structure is registered in the map information (YES in step S107), the backlight determination device 10 acquires the position and shape of the object from the map information (step S109), and determines whether light enters the vehicle from the position information of the vehicle, the light source, and the surrounding object (step S110).
 As described above, when the backlight determination device 10, the backlight determination method, or the backlight determination program according to the first embodiment is used, it is possible to accurately determine whether the driver was in a backlight state or a non-backlight state when a predetermined event occurred.
Embodiment 2.
In Embodiment 1, the case where light from the light source is directly incident on the eyes of the driver of the vehicle has been described. However, a structure may have a light reflection area, such as a building whose wall surfaces are made of glass windows. Embodiment 2 describes a backlight determination device 20 that enables a determination that takes into account light reflected from a structure having a light reflection area.
FIG. 6 is a functional block diagram schematically showing the configuration of the backlight determination device 20 according to Embodiment 2. The backlight determination device 20 is a device capable of executing the backlight determination method according to Embodiment 2. The backlight determination device 20 is, for example, a computer. The backlight determination device 20 can execute the backlight determination method according to Embodiment 2 by executing the backlight determination program.
As shown in FIG. 6, the backlight determination device 20 includes an information acquisition unit 21, a light source position acquisition unit 22, a reflected light vector acquisition unit 23, a shaded area acquisition unit 24, and a determination unit 25.
The information acquisition unit 21 and the light source position acquisition unit 22 are the same as the information acquisition unit 11 and the light source position acquisition unit 12 in Embodiment 1. The backlight determination device 20 differs from the backlight determination device 10 according to Embodiment 1 in that it includes the reflected light vector acquisition unit 23 and the shaded area acquisition unit 24.
FIG. 7 is a diagram showing the process of calculating the reflected light vector. First, the reflected light vector acquisition unit 23 grasps how light is reflected by surrounding objects, determines whether the reflected light is incident on the vehicle, and records the result. For the reflection area, the presence or absence of reflection is determined, for example, from the image of the exterior camera. After the reflection area has been determined from the exterior camera image, the light reflecting surface is calculated from the measurement result of the ranging sensor or from the map information. The normal vector of the reflecting surface is calculated from the positional relationship between the light source and the reflecting surface, and the smaller the angles (α, β) between the traveling direction of the vehicle and the reflected light vector, the greater the degree of influence.
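As a worked illustration of FIG. 7, the reflected ray can be obtained from the incident sunlight direction and the surface normal by the mirror-reflection formula, and the degree of influence can be graded by the angle the reflected ray makes with the traveling direction. The sketch below assumes vectors in an arbitrary road-fixed coordinate system; none of the names or numbers come from the patent.

```python
# Hedged sketch: mirror-reflect the incident sunlight at a planar reflecting surface
# and measure the angle between the reflected ray and the vehicle's traveling
# direction (a small angle suggests strong glare).
import numpy as np

def reflected_ray(incident_dir: np.ndarray, surface_normal: np.ndarray) -> np.ndarray:
    """R = D - 2 (D . N) N for incident direction D and surface normal N."""
    d = incident_dir / np.linalg.norm(incident_dir)
    n = surface_normal / np.linalg.norm(surface_normal)
    return d - 2.0 * np.dot(d, n) * n

def glare_angle_deg(reflected: np.ndarray, travel_dir: np.ndarray) -> float:
    """Angle between the reflected ray and the direction the driver is facing."""
    r = reflected / np.linalg.norm(reflected)
    t = travel_dir / np.linalg.norm(travel_dir)
    # Light that shines into the driver's eyes travels opposite to the gaze direction.
    cos_a = np.clip(np.dot(-r, t), -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_a)))

# Example: sunlight striking a glass facade beside the road; vehicle heading +x.
incident = np.array([0.5, -1.0, -0.3])   # sunlight direction toward the facade
normal = np.array([0.0, 1.0, 0.0])       # facade normal pointing at the road
angle = glare_angle_deg(reflected_ray(incident, normal), np.array([1.0, 0.0, 0.0]))
print(f"angle between reflected light and travel direction: {angle:.1f} deg")
```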
FIG. 8 is a flowchart showing the reflected light detection operation of the backlight determination device 20 according to Embodiment 2. The backlight determination device 20 determines whether an impact has occurred based on the output of the acceleration sensor, and determines that an event has occurred when, for example, the impact exceeds a threshold value (step S201).
Next, the backlight determination device 20 acquires the vehicle position information by GPS (step S202).
Next, the backlight determination device 20 detects structures (for example, buildings) or surrounding objects (for example, a large vehicle ahead on the road) using the map information, the image from the exterior camera, and the ranging sensor (step S203).
Next, the backlight determination device 20 estimates (that is, calculates) the position of the sun, which is the light source, from the vehicle position information, the surrounding-object information, and the time (step S204).
If there is no surrounding object (NO in step S205), the backlight determination device 20 determines whether light enters the vehicle based on the positional relationship (that is, the relative position) between the vehicle and the light source (step S206).
If there is a surrounding object (YES in step S205), the backlight determination device 20 determines whether the surrounding object is a structure registered in the map information (step S207).
If the structure is not registered in the map information (NO in step S207), the backlight determination device 20 acquires the position, shape, and surfaces of the object using the exterior camera or the ranging sensor (step S208), and determines, for each surface of the object, whether light enters the vehicle based on the position information of the overexposed (blown-out highlight) regions detected by the camera (step S210).
If the structure is registered in the map information (YES in step S207), the backlight determination device 20 acquires the position, shape, and surfaces of the object from the map information (step S209), and determines, for each surface of the object, whether light enters the vehicle based on the position information of the overexposed regions detected by the camera (step S210).
If there is an overexposed region (YES in step S211), the backlight determination device 20 determines whether the light enters the vehicle (step S212); if there is no overexposed region (NO in step S211), it calculates the incident light vector to the vehicle from the vehicle position information, the light source position information, and the surfaces of the object (step S213).
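Steps S210 to S212 hinge on detecting overexposed ("white-clipped") regions in the exterior-camera image, which indicate a surface reflecting strong light toward the vehicle. One possible detection sketch using OpenCV is shown below; the brightness threshold and minimum area are assumptions, not values given in the patent.

```python
# Hedged sketch: find blown-out (overexposed) regions in the exterior-camera image
# as a proxy for a reflecting surface throwing light toward the vehicle.
import cv2
import numpy as np

def overexposed_regions(bgr_image: np.ndarray, threshold: int = 250,
                        min_area_px: int = 500) -> list[tuple[int, int, int, int]]:
    """Return bounding boxes (x, y, w, h) of saturated regions in the image."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area_px]

# Example usage (file name is illustrative):
# boxes = overexposed_regions(cv2.imread("exterior_frame.png"))
```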
As described above, by using the backlight determination device 20, the backlight determination method, or the backlight determination program according to Embodiment 2, it is possible to accurately determine whether the driver was in a backlit state or a non-backlit state when a predetermined event occurred.
Except for the points described above, Embodiment 2 is the same as Embodiment 1.
Embodiment 3.
Embodiments 1 and 2 described above do not use the facial expression that a driver shows when dazzled (for example, squinting). Embodiment 3 describes an example in which the presence or absence of backlight is determined based on the facial expression of the driver. The configuration of Embodiment 3 can also be combined with the backlight determination device according to Embodiment 1 or 2; for example, backlight determination can be performed using both the determination result of the backlight determination device according to Embodiment 1 or 2 and the determination result of the backlight determination device according to Embodiment 3.
FIG. 9 is a functional block diagram schematically showing the configuration of the backlight determination device 30 according to Embodiment 3. The backlight determination device 30 is a device capable of executing the backlight determination method according to Embodiment 3. The backlight determination device 30 is, for example, a computer. The backlight determination device 30 can execute the backlight determination method according to Embodiment 3 by executing the backlight determination program.
As shown in FIG. 9, the backlight determination device 30 includes an information acquisition unit 31 and a facial expression determination unit 32 serving as a determination unit. The facial expression determination unit 32 determines whether the facial expression of the driver is the expression shown when dazzled, using a facial expression determination model, which is a trained model generated in advance by machine learning (for example, deep learning). The backlight determination device 30 determines whether the driver was in a backlit state, in which light is incident on the eyes of the driver, based on information stored in the storage unit 62 of the drive recorder 60 of the vehicle. The information stored in the storage unit 62 of the drive recorder 60 is acquired from the in-vehicle sensors 50 while the vehicle is traveling. The in-vehicle sensors 50 include, for example, an interior camera 55 that captures the face of the driver inside the vehicle and an acceleration sensor 54 used to detect a predetermined event. As in Embodiment 1 or 2, the in-vehicle sensors 50 may also include the ranging sensor 52, the GPS 53, and the exterior camera 51.
FIG. 10 is a diagram showing the hardware configuration of the backlight determination device 30 according to Embodiment 3. As shown in FIG. 10, the backlight determination device 30 includes a processor 301 such as a CPU, a memory 302 that is a volatile storage device, and a storage device 303 such as an HDD or SSD. The memory 302 is, for example, a RAM.
Each function of the backlight determination device 30 is implemented by a processing circuit, as in Embodiment 1. The processing circuit may be dedicated hardware, or may be the processor 301 executing a program stored in the memory 302.
FIG. 11 is a diagram showing a configuration for training the facial expression determination model and a configuration for using the facial expression determination model. In the training of FIG. 11, the facial expression of the driver is detected, and a model trained by deep learning with dazzled expressions as teacher data classifies each expression into a dazzled face or another face. The facial expression is recorded for several seconds immediately before the event E, and the facial expression determination model trained in advance is used to determine whether the driver is feeling glare. It is also possible to grade the glare in steps according to the classification accuracy rate.
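As one way to realize such a trained model, a small convolutional classifier could be trained on driver-face crops labelled "dazzled" versus "other". The sketch below is a minimal PyTorch example under those assumptions; the architecture, input size, and training step are illustrative and are not the configuration disclosed in the patent.

```python
# Hedged sketch: a tiny binary "dazzled vs. other" face classifier standing in for
# the facial expression determination model. Input: 64x64 grayscale face crops.
import torch
import torch.nn as nn

class DazzleNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, 2)  # classes: dazzled / other

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = DazzleNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One illustrative training step on random stand-in data; real training would use
# labelled driver-face crops taken from the interior camera.
faces = torch.randn(8, 1, 64, 64)
labels = torch.randint(0, 2, (8,))
optimizer.zero_grad()
loss = loss_fn(model(faces), labels)
loss.backward()
optimizer.step()
```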
FIG. 12 is a flowchart showing the facial expression detection processing of the backlight determination device 30 according to Embodiment 3. The backlight determination device 30 determines whether an impact has occurred based on the output of the acceleration sensor, and determines that an event has occurred when, for example, the impact exceeds a threshold value (step S301).
Next, the backlight determination device 30 acquires the camera video of the driver's facial expression at the time of occurrence of the event (including the several seconds immediately before it) (step S302).
Next, the backlight determination device 30 determines the degree to which the driver's eyes are open or closed (step S303). This is because, when dazzled, people tend to show expressions such as narrowing their eyes or lowering the outer corners of their eyes.
Next, the backlight determination device 30 determines whether the length of time during which the driver appears to feel glare is equal to or greater than a predetermined threshold (step S304).
If the length of time is less than the threshold (NO in step S304), the backlight determination device 30 determines that no dazzled expression of the driver has been detected. If the length of time is equal to or greater than the threshold (YES in step S304), the backlight determination device 30 performs glare determination (value 0 or 1) on a plurality of frames immediately before the occurrence of the event using the facial expression determination model, which is a trained model (step S306). At this time, the degree of glare is calculated by giving a larger weight to the glare determination value of a frame closer to the occurrence of the event (step S307). The weight increases, for example, linearly in proportion to time, so that frames closer to the occurrence of the event are weighted more heavily.
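A minimal sketch of the weighting in steps S306 and S307 is shown below; the normalization and the 0.5 decision threshold are illustrative assumptions rather than values from the patent.

```python
# Hedged sketch of steps S306-S307: per-frame glare decisions (0 or 1) from the
# expression model are combined with weights that grow linearly toward the event.
def weighted_glare_score(per_frame_glare: list[int]) -> float:
    """per_frame_glare is ordered oldest -> closest to the event."""
    n = len(per_frame_glare)
    if n == 0:
        return 0.0
    weights = [i + 1 for i in range(n)]        # linear increase toward the event
    total = sum(w * g for w, g in zip(weights, per_frame_glare))
    return total / sum(weights)                # 0.0 (no glare) .. 1.0 (strong glare)

# Example: glare detected only in the last few frames before the event.
score = weighted_glare_score([0, 0, 0, 1, 1, 1])
print(score, "-> dazzled" if score >= 0.5 else "-> not dazzled")
```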
As described above, by using the backlight determination device 30, the backlight determination method, or the backlight determination program according to Embodiment 3, it is possible, using a trained model of the facial expression shown when feeling glare, to accurately determine whether the driver was in a backlit state or a non-backlit state when a predetermined event occurred.
The determination of Embodiment 3 may be performed in addition to the determination of at least one of Embodiments 1 and 2. Using a plurality of determination methods together improves the determination accuracy.
Embodiment 4.
Embodiments 1 to 3 described above do not use the change in the pupils that a driver exhibits when dazzled. Embodiment 4 describes an example in which the presence or absence of backlight is determined based on the driver's pupils. The configuration of Embodiment 4 can also be combined with one or more of the backlight determination devices of Embodiments 1 to 3; for example, backlight determination can be performed using the determination result of the backlight determination device according to any one of Embodiments 1 to 3 together with the determination result of the backlight determination device according to Embodiment 4.
FIG. 13 is a functional block diagram schematically showing the configuration of the backlight determination device 40 according to Embodiment 4. The backlight determination device 40 is a device capable of executing the backlight determination method according to Embodiment 4. The backlight determination device 40 is, for example, a computer. The backlight determination device 40 can execute the backlight determination method according to Embodiment 4 by executing the backlight determination program. As shown in FIG. 13, the backlight determination device 40 includes an information acquisition unit 41, an eyelid open/close detection unit 42 that detects the opening and closing of the eyes (eyelids) based on the face video, a light reflex detection unit 43 that detects the pupillary light reflex, a corneal reflection detection unit 44 that detects reflection from the cornea, and an eye determination unit 45 serving as a determination unit. The hardware configuration of Embodiment 4 is the same as that of Embodiment 3.
FIG. 14 is a diagram showing the reflection detection processing of the backlight determination device 40 according to Embodiment 4. A person closes the eyes (eyelids) when feeling glare, but it is unclear whether the eyes were closed because of incoming light or merely by chance. Therefore, whether the eyes are closed because of backlight is determined in combination with the reflected light vector. That is, the backlight determination device 40 determines that the driver is in a backlit state if reflected light entering the eyes is detected by the method of Embodiment 1 or 2 and the eyes are closed.
When the eyes are not closed, the corneal reflection is detected first, and then the pupil constricts and the pupillary light reflex occurs. Therefore, when the corneal reflection detection unit 44 detects a corneal reflection and the light reflex detection unit 43 subsequently detects a pupillary light reflex, a backlit state is determined. The neural pathway until the pupil reacts is as follows, which is why the pupillary reaction is delayed: light input, retina, optic nerve, optic chiasm, optic tract, pretectal olivary nucleus, Edinger-Westphal nucleus, oculomotor nerve, ciliary ganglion, and sphincter pupillae.
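The ordering check "corneal reflection first, pupillary constriction afterwards" can be sketched as follows. The frame fields, the 30% constriction criterion, and the 1.5-second window are illustrative assumptions, not values disclosed in the patent.

```python
# Hedged sketch: decide "backlit" when a corneal reflection (glint) is observed and
# a pupillary light reflex (pupil constriction) follows it within a short window.
from dataclasses import dataclass

@dataclass
class EyeFrame:
    t: float                 # seconds relative to the event (negative = before)
    glint_detected: bool     # corneal reflection visible in the eye image
    pupil_diameter_mm: float

def backlit_by_reflex_order(frames: list[EyeFrame],
                            constriction_ratio: float = 0.7,
                            max_delay_s: float = 1.5) -> bool:
    frames = sorted(frames, key=lambda f: f.t)
    baseline = frames[0].pupil_diameter_mm
    glint_time = next((f.t for f in frames if f.glint_detected), None)
    if glint_time is None:
        return False
    for f in frames:
        if (glint_time < f.t <= glint_time + max_delay_s
                and f.pupil_diameter_mm <= constriction_ratio * baseline):
            return True      # constriction followed the glint: light reflex observed
    return False

# Example: glint at t = -2.0 s, pupil shrinks from 5 mm to 3 mm shortly afterwards.
frames = [EyeFrame(-3.0, False, 5.0), EyeFrame(-2.0, True, 5.0),
          EyeFrame(-1.2, False, 3.0), EyeFrame(-0.5, False, 3.0)]
print(backlit_by_reflex_order(frames))  # True
```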
FIG. 15 is a flowchart showing the eyelid detection processing of the backlight determination device 40 according to Embodiment 4. The backlight determination device 40 determines whether an impact has occurred based on the output of the acceleration sensor, and determines that an event has occurred when, for example, the impact exceeds a threshold value (step S401).
Next, the backlight determination device 40 acquires the camera video of the driver's eyes at the time of occurrence of the event (including the several seconds immediately before it) (step S402).
Next, the backlight determination device 40 determines the degree to which the driver's eyes (eyelids) are open or closed (step S403). This is because, when dazzled, people tend to close their eyes. The information for the several seconds before the occurrence of the event is recorded, and if the time taken for the open eyelids to become closed is equal to or greater than a threshold, the eyelids are determined to be open.
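A minimal sketch of this open/closed judgment from a recorded eyelid-openness time series is shown below; the openness levels (0.7 open, 0.2 closed) and the time threshold are illustrative values, not taken from the patent.

```python
# Hedged sketch of step S403: measure how long the eyelids take to go from open to
# closed in the pre-event window; if that time meets a threshold, treat the eyes
# as open (a slow closure rather than an abrupt one).
def eyes_judged_open(openness: list[tuple[float, float]],
                     open_level: float = 0.7, closed_level: float = 0.2,
                     min_close_time_s: float = 0.5) -> bool:
    """openness: (time_s, value in 0..1), ordered oldest first."""
    last_open_t = None
    for t, v in openness:
        if v >= open_level:
            last_open_t = t
        elif v <= closed_level and last_open_t is not None:
            return (t - last_open_t) >= min_close_time_s
    return True  # eyelids never closed during the window

# Example: eyes take 0.8 s to close -> judged open under these assumptions.
print(eyes_judged_open([(-3.0, 0.9), (-2.0, 0.9), (-1.2, 0.1)]))  # True
```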
Next, the backlight determination device 40 determines whether the eyelids are open (step S404).
If the eyelids are closed (NO in step S404), the backlight determination device 40 determines whether light is incident by the method described in Embodiment 1 or 2.
If the eyelids are open (YES in step S404), the backlight determination device 40 detects the presence or absence of a corneal reflection for each frame (step S406); if there is no corneal reflection (NO in step S407), it detects the presence or absence of the pupillary light reflex for each frame (step S408), and the processing proceeds to step S409.
If there is a corneal reflection (YES in step S407), the backlight determination device 40 calculates the result by giving a larger weight to the eyelid open/close value of a frame closer to the moment when the event occurred (step S407). The weight increases, for example, linearly in proportion to time, so that frames closer to the occurrence of the event are weighted more heavily.
As described above, by using the backlight determination device 40, the backlight determination method, or the backlight determination program according to Embodiment 4, it is possible to accurately determine, based on the results of detecting the presence or absence of the corneal reflection and the pupillary light reflex, whether the driver was in a backlit state or a non-backlit state when a predetermined event occurred.
The determination of Embodiment 4 may be performed in addition to the determination of at least one of Embodiments 1 to 3. Using a plurality of determination methods together improves the determination accuracy.
Except for the points described above, Embodiment 4 is the same as Embodiment 3.
Modification.
The determination results of the backlight determination devices of Embodiments 1 to 4 can be applied to a driving support device of a vehicle. For example, among the events whose number of occurrences is to be reduced, for those determined to have been caused by backlight, the device warns the driver of a vehicle traveling through a place, and at a time, where backlight is likely to occur. In this case, the driver can learn, before reaching that place, the times and places where backlight is likely to occur, and can take countermeasures.
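A minimal sketch of such a warning check is shown below; the record format, the 200 m radius, and the one-hour window are assumptions for illustration, not parameters given in the patent.

```python
# Hedged sketch of the modification: warn the driver when the vehicle is near a
# location, around the same time of day, where past events were judged backlight-related.
import math
from dataclasses import dataclass

@dataclass
class BacklightEvent:
    lat: float
    lon: float
    hour_of_day: int   # local hour when the backlit event occurred

def rough_distance_m(lat1, lon1, lat2, lon2):
    # Equirectangular approximation, adequate for a few hundred metres.
    k = 111_320.0
    dx = (lon2 - lon1) * k * math.cos(math.radians((lat1 + lat2) / 2))
    dy = (lat2 - lat1) * k
    return math.hypot(dx, dy)

def should_warn(vehicle_lat, vehicle_lon, local_hour, history: list[BacklightEvent],
                radius_m: float = 200.0, hour_window: int = 1) -> bool:
    for ev in history:
        near = rough_distance_m(vehicle_lat, vehicle_lon, ev.lat, ev.lon) <= radius_m
        same_time = abs(local_hour - ev.hour_of_day) <= hour_window
        if near and same_time:
            return True
    return False

# Example: one past backlight-related event nearby at a similar time of day.
print(should_warn(35.6810, 139.7670, 17,
                  [BacklightEvent(35.6812, 139.7675, 16)]))  # True
```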
10, 20, 30, 40: backlight determination device; 11, 21, 31, 41: information acquisition unit; 12, 22: light source position acquisition unit; 13: shaded area acquisition unit; 14, 25: determination unit; 23: reflected light vector acquisition unit; 32: facial expression determination unit; 42: eyelid open/close detection unit; 43: light reflex detection unit; 44: corneal reflection detection unit; 50: in-vehicle sensor; 51: exterior camera; 52: ranging sensor; 53: GPS; 54: acceleration sensor; 55: interior camera; 60: drive recorder; 62: storage unit; 63: map information.

Claims (18)

  1.  A backlight determination device comprising:
     an information acquisition unit that acquires event occurrence information indicating that a predetermined event has occurred in a vehicle, vehicle position information indicating the position of the vehicle at the time of occurrence of the event, an image obtained by imaging the outside of the vehicle at the time of occurrence, and distance information to a surrounding object obtained by detecting the outside of the vehicle at the time of occurrence;
     a light source position acquisition unit that acquires light source position information indicating the relative position of a light source with respect to the vehicle at the time of occurrence, based on a calculation using the vehicle position information and a time or based on the image;
     a shaded area acquisition unit that acquires shade information indicating a shaded area at the time of occurrence, based on map information including information on roads and structures, the vehicle position information, the light source position information, and the event occurrence information; and
     a determination unit that determines, based on the light source position information and the shade information, whether the state of the driver of the vehicle at the time of occurrence was a backlit state in which light from the light source was incident on the eyes of the driver.
  2.  The backlight determination device according to claim 1, further comprising a reflected light vector acquisition unit that calculates a reflected light vector from a light reflection area of the structure at the time of occurrence, based on the map information, the vehicle position information, the light source position information, and the event occurrence information,
     wherein the determination unit determines whether the state of the driver of the vehicle at the time of occurrence was a backlit state in which light from the light source was incident on the eyes of the driver via the light reflection area.
  3.  The backlight determination device according to claim 1 or 2, wherein, if the driver is outside the shaded area at the time of occurrence when a first angle, which is the horizontal angle that the central ray of the light incident on the eyes of the driver forms with the traveling direction of the vehicle, is within a predetermined first angle range and a second angle, which is the vertical angle that the central ray forms with the traveling direction, is within a predetermined second angle range, the determination unit determines that the state of the driver at the time of occurrence was the backlit state.
  4.  The backlight determination device according to any one of claims 1 to 3, wherein, if the driver is within the shaded area at the time of occurrence, the determination unit determines that the state of the driver at the time of occurrence was a non-backlit state.
  5.  The backlight determination device according to any one of claims 1 to 4, wherein the shaded area is formed by the structure indicated in the map information or by the surrounding object.
  6.  The backlight determination device according to any one of claims 1 to 5, wherein the time of occurrence includes a predetermined time immediately before the moment at which the event occurred.
  7.  The backlight determination device according to any one of claims 1 to 6, wherein the information acquisition unit acquires the event occurrence information, the vehicle position information, the image, and the distance information from a drive recorder of the vehicle.
  8.  A backlight determination device comprising:
     an acquisition unit that acquires an image output from an interior camera that captures the face of a driver of a vehicle, and event occurrence information indicating that a predetermined event has occurred in the vehicle; and
     a determination unit that determines, using a trained model generated in advance by machine learning to classify facial expressions made when backlight enters the eyes, whether the state of the driver at the time of occurrence of the event was a backlit state in which light from a light source was incident on the eyes of the driver of the vehicle.
  9.  A backlight determination device comprising:
     an acquisition unit that acquires an image output from an interior camera that captures the eyes of a driver of a vehicle, and event occurrence information indicating that a predetermined event has occurred in the vehicle; and
     a determination unit that determines, based on whether a corneal reflection and a pupillary light reflex occurred in sequence in the eyes of the driver at the time of occurrence of the event, whether the state of the driver was a backlit state in which light from a light source was incident on the eyes of the driver of the vehicle.
  10.  The backlight determination device according to claim 8 or 9, wherein the time of occurrence includes a predetermined time immediately before the moment at which the event occurred.
  11.  The backlight determination device according to claim 10, wherein the determination unit determines whether the state of the driver was the backlit state, using images of a plurality of frames within a predetermined time immediately before the moment at which the event occurred.
  12.  The backlight determination device according to claim 11, wherein the determination unit treats the image of a frame closer to the moment at which the predetermined event occurred as data having a larger weight value in determining whether the state of the driver was a backlit state in which light from the light source was incident on the eyes of the driver of the vehicle.
  13.  The backlight determination device according to any one of claims 8 to 12, wherein the event occurrence information and the image are acquired from a drive recorder of the vehicle.
  14.  The backlight determination device according to any one of claims 1 to 13, wherein the light source is the sun or a headlight of another vehicle traveling in an oncoming lane.
  15.  A backlight determination method executed by a backlight determination device, the method comprising:
     acquiring event occurrence information indicating that a predetermined event has occurred in a vehicle, vehicle position information indicating the position of the vehicle at the time of occurrence of the event, an image obtained by imaging the outside of the vehicle at the time of occurrence, and distance information to a surrounding object obtained by detecting the outside of the vehicle at the time of occurrence;
     acquiring light source position information indicating the relative position of a light source with respect to the vehicle at the time of occurrence, based on a calculation using the vehicle position information and a time or based on the image;
     acquiring shade information indicating a shaded area at the time of occurrence, based on map information including information on roads and structures, the vehicle position information, the light source position information, and the event occurrence information; and
     determining, based on the light source position information and the shade information, whether the state of the driver of the vehicle at the time of occurrence was a backlit state in which light from the light source was incident on the eyes of the driver.
  16.  A backlight determination method executed by a backlight determination device, the method comprising:
     acquiring an image output from an interior camera that captures the face of a driver of a vehicle, and event occurrence information indicating that a predetermined event has occurred in the vehicle; and
     determining, using a trained model generated in advance by machine learning to classify facial expressions made when backlight enters the eyes, whether the state of the driver at the time of occurrence of the event was a backlit state in which light from a light source was incident on the eyes of the driver of the vehicle.
  17.  A backlight determination method executed by a backlight determination device, the method comprising:
     acquiring an image output from an interior camera that captures the eyes of a driver of a vehicle, and event occurrence information indicating that a predetermined event has occurred in the vehicle; and
     determining, based on whether a corneal reflection and a pupillary light reflex occurred in sequence in the eyes of the driver at the time of occurrence of the event, whether the state of the driver was a backlit state in which light from a light source was incident on the eyes of the driver of the vehicle.
  18.  A backlight determination program that causes a computer serving as a backlight determination device to execute the backlight determination method according to any one of claims 15 to 17.
PCT/JP2021/011312 2021-03-19 2021-03-19 Backlight determination device, backlight determination method, and backlight determination program WO2022195836A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2023506654A JP7270866B2 (en) 2021-03-19 2021-03-19 Backlight determination device, backlight determination method, and backlight determination program
PCT/JP2021/011312 WO2022195836A1 (en) 2021-03-19 2021-03-19 Backlight determination device, backlight determination method, and backlight determination program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/011312 WO2022195836A1 (en) 2021-03-19 2021-03-19 Backlight determination device, backlight determination method, and backlight determination program

Publications (1)

Publication Number Publication Date
WO2022195836A1 (en)

Family

ID=83320232

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/011312 WO2022195836A1 (en) 2021-03-19 2021-03-19 Backlight determination device, backlight determination method, and backlight determination program

Country Status (2)

Country Link
JP (1) JP7270866B2 (en)
WO (1) WO2022195836A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013108748A1 (en) * 2012-01-20 2013-07-25 株式会社デンソー Method and device for controlling glare prevention device installed in vehicle
JP2018130342A (en) * 2017-02-15 2018-08-23 富士通株式会社 Wakefulness estimation device, wakefulness estimation method and wakefulness estimation system
WO2019053809A1 (en) * 2017-09-13 2019-03-21 三菱電機株式会社 Driver state determination device and driver state determination method


Also Published As

Publication number Publication date
JPWO2022195836A1 (en) 2022-09-22
JP7270866B2 (en) 2023-05-10

Similar Documents

Publication Publication Date Title
EP2288287B1 (en) Driver imaging apparatus and driver imaging method
US9286515B2 (en) Doze detection method and apparatus thereof
CN109758167A (en) Driver status detection device
EP2541493B1 (en) Pupil detection device and pupil detection method
US20120177266A1 (en) Pupil detection device and pupil detection method
BRPI0712837A2 (en) Method and apparatus for determining and analyzing a location of visual interest.
JPH07249197A (en) Detecting device for state of person
US11144756B2 (en) Method and system of distinguishing between a glance event and an eye closure event
US20200218878A1 (en) Personalized eye openness estimation
JP2008065776A (en) Doze detection device and doze detection method
JP4770385B2 (en) Automatic sun visor
CN108162859B (en) Vehicle rearview mirror, image display method, vehicle and storage medium
CN115393830A (en) Fatigue driving detection method based on deep learning and facial features
JP7270866B2 (en) Backlight determination device, backlight determination method, and backlight determination program
JP2021037216A (en) Eye closing determination device
CN112528793B (en) Method and device for eliminating jitter of obstacle detection frame of vehicle
US11318877B2 (en) Vehicle light-adjusting system
WO2014162654A1 (en) Monitoring system
WO2023108364A1 (en) Method and apparatus for detecting driver state, and storage medium
JP5050794B2 (en) Sleepiness detection device, sleepiness detection method
US11919522B2 (en) Apparatus and method for determining state
JP4781292B2 (en) Closed eye detection device, dozing detection device, closed eye detection method, and closed eye detection program
JP2022123787A (en) Obstacle detection device, method and program
JP3812355B2 (en) Driving attention detection device and drowsy driving detection device
WO2024057356A1 (en) Level of eyelid opening detection device, level of eyelid opening detection method, and drowsiness assessment system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21931583

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023506654

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21931583

Country of ref document: EP

Kind code of ref document: A1