WO2020183652A1 - Driving assistance device - Google Patents

Driving assistance device

Info

Publication number
WO2020183652A1
WO2020183652A1 (PCT/JP2019/010296)
Authority
WO
WIPO (PCT)
Prior art keywords
road traffic
traffic information
distance coefficient
viewpoint
viewpoint distance
Prior art date
Application number
PCT/JP2019/010296
Other languages
French (fr)
Japanese (ja)
Inventor
山口 悟史 (Satoshi Yamaguchi)
Original Assignee
Mitsubishi Electric Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to JP2021504710A (granted as JP7105985B2)
Priority to PCT/JP2019/010296
Publication of WO2020183652A1

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/09 - Arrangements for giving variable traffic instructions
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/16 - Anti-collision systems

Definitions

  • the present invention relates to a driving support device.
  • Patent Document 1 discloses a technique for calculating the visibility of a safety confirmation object by the driver based on the distance of the safety confirmation object with respect to the driver's line-of-sight direction. As a result, not only the visibility of the safety confirmation object being watched by the driver is calculated, but also the visibility of the safety confirmation object included in the so-called "peripheral vision" is calculated.
  • a "road information display" shows various kinds of information about roads.
  • a "traffic information display" shows various kinds of information related to traffic.
  • hereinafter, the road information display and the traffic information display are collectively referred to as the "road traffic information display".
  • the technique described in Patent Document 1 calculates the visibility of the safety confirmation object by the driver.
  • instead of calculating the visibility of the safety confirmation object by the driver, it is conceivable to divert the technique of Patent Document 1 so as to calculate the degree to which the driver recognizes the road traffic information display (hereinafter referred to as the "recognition degree"). Hereinafter, this adapted technique is referred to as the "diverted technology".
  • the degree of recognition of the road traffic information display by the driver differs not only depending on the distance of the road traffic information display with respect to the line-of-sight direction, but also depending on the content of the road traffic information display. Therefore, when calculating the recognition degree of the road traffic information display by the driver, it is preferable to consider not only the distance of the road traffic information display with respect to the line-of-sight direction but also the content of the road traffic information display.
  • Patent Document 1 considers the distance of the safety confirmation object with respect to the line-of-sight direction when calculating the visibility of the safety confirmation object by the driver, but does not consider the content of the safety confirmation object. Consequently, the diverted technology does not consider the content of the road traffic information display when calculating the recognition degree of the road traffic information display by the driver. As a result, there is a problem that the accuracy of determining whether or not the road traffic information display is recognized by the driver is low.
  • the present invention has been made to solve the above problems, and an object of the present invention is to improve the accuracy of determining whether or not the road traffic information display has been recognized by the driver.
  • the driving support device of the present invention includes: a road traffic information detection unit that detects an area corresponding to the road traffic information display in a front image and also detects the content of the road traffic information display; a viewpoint detection unit that uses a vehicle interior image to detect the position in the front image corresponding to the driver's viewpoint; a viewpoint distance coefficient calculation unit that calculates a viewpoint distance coefficient according to the distance of the area with respect to the position; a viewpoint distance coefficient correction unit that corrects the viewpoint distance coefficient based on the content of the road traffic information display; and a recognition degree calculation unit that calculates the recognition degree of the road traffic information display by the driver by integrating the corrected viewpoint distance coefficient.
  • since the present invention is configured as described above, it is possible to improve the accuracy of determining whether or not the road traffic information display is recognized by the driver.
  • FIG. 1 is a block diagram showing the main part of the driving support system including the driving support device according to the first embodiment.
  • FIG. 2A is an explanatory drawing showing an example of the front image. FIG. 2B is an explanatory drawing showing the area corresponding to the road traffic information display in the front image shown in FIG. 2A. FIG. 2C is an explanatory drawing showing the position corresponding to the driver's viewpoint in the front image shown in FIG. 2A. FIG. 2D is an explanatory drawing showing the distance of the area with respect to that position in the front image shown in FIG. 2A. FIG. 3A is an explanatory drawing showing another example of the front image. FIG. 3B is an explanatory drawing showing the area corresponding to the road traffic information display in the front image shown in FIG. 3A.
  • FIG. 1 is a block diagram showing a main part of a driving support system including a driving support device according to the first embodiment.
  • the driving support device 100 of the first embodiment will be described with reference to FIG. 1. Further, the driving support system 200 including the driving support device 100 will be described.
  • the vehicle 1 is provided with a control device 2, a first image pickup device 3, a second image pickup device 4, and a display device 5.
  • the main part of the driving support system 200 is composed of the control device 2, the first image pickup device 3, the second image pickup device 4, and the display device 5.
  • the first image pickup device 3 takes an image of the front of the vehicle 1 at predetermined time intervals, and outputs an image signal indicating the captured image (hereinafter referred to as "front image") I1.
  • the first imaging device 3 is composed of, for example, an infrared camera or a visible light camera.
  • the first imaging device 3 is provided, for example, on the front end of the vehicle 1, on the dashboard of the vehicle 1, or on the ceiling of the front of the vehicle interior of the vehicle 1.
  • the second imaging device 4 captures the interior of the vehicle 1 at predetermined time intervals and outputs an image signal indicating the captured image (hereinafter referred to as "in-vehicle image") I2.
  • the second imaging device 4 is composed of, for example, an infrared camera or a visible light camera.
  • the second imaging device 4 is provided, for example, on the dashboard of the vehicle 1.
  • the display device 5 is composed of, for example, a liquid crystal display or an organic EL (Electro Luminescence) display, and is provided on the dashboard of the vehicle 1.
  • alternatively, the display device 5 may be configured as a HUD (Head-Up Display).
  • the control device 2 is composed of, for example, an ECU (Electronic Control Unit).
  • the driving support device 100 is provided in the control device 2. Hereinafter, the driving support device 100 will be described.
  • the road traffic information detection unit 21 acquires the image signal output by the first imaging device 3.
  • the road traffic information detection unit 21 detects the road traffic information display S included in the front image I1 by executing an image recognition process for the front image I1 using the acquired image signal. More specifically, the road traffic information detection unit 21 detects the area A corresponding to the road traffic information display S in the front image I1 and also detects the content of the road traffic information display S.
  • hereinafter, the process in which the road traffic information detection unit 21 detects the area A and the content of the road traffic information display S is referred to as the "road traffic information detection process".
  • the viewpoint detection unit 22 acquires the image signal output by the second imaging device 4.
  • the viewpoint detection unit 22 uses the acquired image signal to execute an image recognition process on the in-vehicle image I2, thereby detecting the position P in the front image I1 corresponding to the driver's viewpoint (hereinafter referred to as the "viewpoint position").
  • hereinafter, the process in which the viewpoint detection unit 22 detects the viewpoint position P is referred to as the "viewpoint detection process".
  • the viewpoint detection unit 22 detects the position of the driver's head (more specifically, the position of the driver's eyes) by executing the image recognition process for the in-vehicle image I2, and also detects the driver's line of sight. Detect the vector.
  • the viewpoint detection unit 22 stores in advance a table showing the correspondence between the position of the driver's eyes, the driver's line-of-sight vector, and the viewpoint position P in the front image I1.
  • the viewpoint detection unit 22 detects the viewpoint position P in the front image I1 by using the stored table.
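The table-based mapping described above can be sketched as follows. This is only an illustrative sketch: the quantization step, the table entries, and all coordinate values are assumptions, not values from the patent.

```python
# Hypothetical sketch of the pre-stored table that maps (eye position,
# line-of-sight vector) to the viewpoint position P in the front image I1.
# All keys, values, and the quantization step are illustrative assumptions.

def quantize(v, step=0.1):
    """Quantize a vector so it can be used as a dictionary key."""
    return tuple(round(x / step) * step for x in v)

# Correspondence table: (eye position, gaze vector) -> viewpoint P in pixels.
VIEWPOINT_TABLE = {
    (quantize((0.0, 1.2, 0.4)), quantize((0.0, 0.0, 1.0))): (640, 360),
    (quantize((0.0, 1.2, 0.4)), quantize((0.2, 0.0, 1.0))): (880, 360),
}

def detect_viewpoint_position(eye_pos, gaze_vec):
    """Return the viewpoint position P, or None when no entry exists."""
    return VIEWPOINT_TABLE.get((quantize(eye_pos), quantize(gaze_vec)))
```

In practice such a table would be dense, or replaced by camera-geometry computation; the dictionary lookup merely stands in for the stored correspondence.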
  • the viewpoint distance coefficient calculation unit 23 acquires the detection result by the road traffic information detection unit 21 and the detection result by the viewpoint detection unit 22.
  • the viewpoint distance coefficient calculation unit 23 calculates the distance L of the region A with respect to the viewpoint position P by using the acquired detection result. Further, the viewpoint distance coefficient calculation unit 23 calculates a coefficient (hereinafter referred to as “viewpoint distance coefficient”) C corresponding to the calculated distance L. That is, the viewpoint distance coefficient C is a value that gradually decreases as the distance L increases. In other words, the viewpoint distance coefficient C is a value that gradually increases as the distance L decreases.
  • the viewpoint distance coefficient calculation unit 23 calculates the distance L in pixel units based on the difference between a coordinate value indicating the viewpoint position P in the front image I1 (hereinafter referred to as the "first coordinate value") and a coordinate value indicating the position of a predetermined portion of the area A in the front image I1 (for example, the central portion or the upper-left end portion; hereinafter referred to as the "second coordinate value").
  • the viewpoint distance coefficient calculation unit 23 calculates the viewpoint distance coefficient C corresponding to the calculated distance L in pixel units.
  • the viewpoint distance coefficient calculation unit 23 calculates the distance L in pixel units based on the difference value between the first coordinate value and the second coordinate value.
  • the viewpoint distance coefficient calculation unit 23 stores in advance a table showing the correspondence between the first coordinate value, the second coordinate value, and the coefficient for converting the distance L in pixel units into the distance L in meters.
  • the viewpoint distance coefficient calculation unit 23 calculates the distance L in meters by converting the calculated distance L in pixels into the distance L in meters using the stored table.
  • the viewpoint distance coefficient calculation unit 23 calculates the viewpoint distance coefficient C corresponding to the calculated distance L in meters.
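The calculation described above (a pixel distance from the coordinate difference, conversion to meters, then a coefficient that shrinks as the distance grows) might be sketched as follows. The conversion factor and the exponential decay are assumptions; the patent only requires that C gradually decrease as L increases.

```python
import math

PIXELS_PER_METER = 40.0  # assumed stand-in for the stored conversion table

def distance_pixels(first_coord, second_coord):
    """Distance L in pixel units between the viewpoint position P (first
    coordinate value) and a predetermined portion of area A (second)."""
    dx = first_coord[0] - second_coord[0]
    dy = first_coord[1] - second_coord[1]
    return math.hypot(dx, dy)

def viewpoint_distance_coefficient(l_meters, decay=0.5):
    """Coefficient C: gradually decreases as the distance L increases."""
    return math.exp(-decay * l_meters)

l_px = distance_pixels((640, 360), (700, 280))  # first vs. second coordinate
l_m = l_px / PIXELS_PER_METER                   # pixel units -> meters
c = viewpoint_distance_coefficient(l_m)
```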
  • FIG. 2A shows an example of the front image I1.
  • the front image I1 shown in FIG. 2A includes a road sign (more specifically, a regulation sign) having the meaning of “no vehicle intrusion”, that is, a road traffic information display S.
  • the regulation sign is composed of one symbol (hereinafter referred to as "mark") and does not include a character string.
  • the road traffic information detection unit 21 detects the area A corresponding to the road traffic information display S (see FIG. 2B). Further, the road traffic information detection unit 21 detects the content of the road traffic information display S. More specifically, the road traffic information detection unit 21 detects that the road traffic information display S is a regulatory sign having the meaning of "no vehicle entry". At this time, it is also detected that the road traffic information display S is composed of one mark and that the road traffic information display S does not include a character string.
  • the viewpoint detection unit 22 detects the viewpoint position P in the front image I1 (see FIG. 2C).
  • the viewpoint distance coefficient calculation unit 23 calculates the distance L of the region A with respect to the viewpoint position P (see FIG. 2D), and calculates the viewpoint distance coefficient C corresponding to the calculated distance L.
  • FIG. 3A shows another example of the front image I1.
  • the front image I1 shown in FIG. 3A includes a signboard including the character string “future / closed”, that is, the road traffic information display S.
  • the road traffic information detection unit 21 detects the area A corresponding to the road traffic information display S (see FIG. 3B). Further, the road traffic information detection unit 21 detects the content of the road traffic information display S. More specifically, the road traffic information detection unit 21 detects that the road traffic information display S is a signboard including the character string "future / closed”.
  • the viewpoint detection unit 22 detects the viewpoint position P in the front image I1 (see FIG. 3C).
  • the viewpoint distance coefficient calculation unit 23 calculates the distance L of the region A with respect to the viewpoint position P (see FIG. 3D), and calculates the viewpoint distance coefficient C corresponding to the calculated distance L.
  • the viewpoint distance coefficient correction unit 24 acquires the viewpoint distance coefficient C calculated by the viewpoint distance coefficient calculation unit 23. Further, the viewpoint distance coefficient correction unit 24 acquires the detection result by the road traffic information detection unit 21. The viewpoint distance coefficient correction unit 24 corrects the acquired viewpoint distance coefficient C based on the content of the road traffic information display S by using the acquired detection result.
  • the viewpoint distance coefficient correction unit 24 calculates the recognition difficulty level D1 of the road traffic information display S based on the number of marks included in the road traffic information display S, the presence or absence of a character string in the road traffic information display S, the number of characters in the character string included in the road traffic information display S, and the like.
  • the difficulty level D1 is also an index indicating the complexity of the road traffic information display S.
  • the road traffic information display S in the front image I1 shown in FIG. 2 is composed of one mark and does not include a character string.
  • the road traffic information display S in the front image I1 shown in FIG. 3 includes a character string.
  • therefore, the difficulty level D1 of the former (FIG. 2) is calculated to be smaller than the difficulty level D1 of the latter (FIG. 3). In other words, the difficulty level D1 of the latter (FIG. 3) is calculated to be larger than the difficulty level D1 of the former (FIG. 2).
  • the viewpoint distance coefficient correction unit 24 corrects the viewpoint distance coefficient C so that when the difficulty level D1 is large, the value becomes smaller than when the difficulty level D1 is small. In other words, the viewpoint distance coefficient correction unit 24 corrects the viewpoint distance coefficient C so that when the difficulty level D1 is small, the value becomes larger than when the difficulty level D1 is large.
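The content-based correction described above can be sketched as follows. The weighting inside difficulty() and the 1/(1 + D1) form of the correction are illustrative assumptions; the patent only requires that a larger difficulty D1 yield a smaller corrected coefficient C'.

```python
# Sketch of the content-based correction of the viewpoint distance
# coefficient C. Weights and the correction formula are assumptions.

def difficulty(num_marks, has_text, num_chars):
    """Recognition difficulty level D1 of a road traffic information display."""
    d1 = 1.0 * num_marks
    if has_text:
        d1 += 2.0 + 0.5 * num_chars  # a character string raises the difficulty
    return d1

def correct_coefficient(c, d1):
    """Corrected coefficient C': smaller when the difficulty D1 is larger."""
    return c / (1.0 + d1)

d1_sign = difficulty(1, False, 0)   # regulatory sign of FIG. 2: one mark only
d1_board = difficulty(0, True, 6)   # signboard of FIG. 3: a character string
```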
  • the forward image I1 is imaged at predetermined time intervals.
  • the road traffic information detection unit 21 executes the road traffic information detection process at predetermined time intervals. Therefore, when the road traffic information display S is included in the front image I1, the viewpoint distance coefficient C is calculated at a predetermined time interval, and the calculated viewpoint distance coefficient C is sequentially corrected.
  • the recognition degree calculation unit 25 calculates the recognition degree D2 of the road traffic information display S by the driver by temporally integrating the corrected viewpoint distance coefficient C'. That is, the recognition degree D2 is updated every time the viewpoint distance coefficient C is calculated and the calculated viewpoint distance coefficient C is corrected.
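The temporal integration described above might look like the following running accumulation; the sampling interval dt is an assumed value standing in for the predetermined imaging interval.

```python
# Sketch of the temporal integration of the corrected coefficient C' into the
# recognition degree D2. The interval dt is an illustrative assumption.

class RecognitionDegree:
    """Accumulates corrected viewpoint distance coefficients over time."""

    def __init__(self, dt=0.1):
        self.dt = dt   # assumed imaging interval in seconds
        self.d2 = 0.0

    def update(self, c_corrected):
        """Update D2 each time a corrected coefficient C' is produced."""
        self.d2 += c_corrected * self.dt
        return self.d2

rec = RecognitionDegree()
for c_prime in [0.8, 0.7, 0.6]:  # coefficients from successive front images
    rec.update(c_prime)
```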
  • the recognition degree determination unit 26 acquires the detection result by the road traffic information detection unit 21.
  • the connection line between the road traffic information detection unit 21 and the recognition degree determination unit 26 is not shown.
  • the road traffic information display S in the front image I1 goes out of the front image I1 due to the traveling of the vehicle 1 (that is, due to the passage of time during traveling).
  • the recognition degree determination unit 26 determines whether or not the road traffic information display S has gone out of the front image I1 by using the acquired detection result.
  • the recognition degree determination unit 26 acquires the recognition degree D2 calculated by the recognition degree calculation unit 25 when it is determined that the road traffic information display S has gone out of the front image I1.
  • the recognition degree determination unit 26 determines whether or not the road traffic information display S is recognized by the driver by determining whether or not the acquired recognition degree D2 is equal to or higher than a predetermined threshold value Dth. That is, the threshold value Dth is set to a value that makes it possible to determine whether or not the road traffic information display S is recognized by the driver.
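The threshold determination and the trigger for the subsequent display control can be sketched as follows; the threshold value Dth used here is an illustrative assumption.

```python
# Sketch of the recognition determination. The threshold Dth is assumed.

D_TH = 0.5  # assumed threshold value Dth

def is_recognized(d2, dth=D_TH):
    """True when the recognition degree D2 is equal to or above Dth."""
    return d2 >= dth

def on_display_left_image(d2):
    """Run when the road traffic information display S leaves the front
    image I1: request the first display control only for a missed display."""
    return "skip" if is_recognized(d2) else "first_display_control"
```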
  • when it is determined that the road traffic information display S is not recognized by the driver, the display control unit 27 controls the display device 5 to display an image I3 corresponding to the road traffic information display S (hereinafter referred to as the "first display control"). By visually recognizing the image I3, the driver can confirm the road traffic information display S that he or she missed.
  • the main part of the driving support device 100 is composed of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27.
  • the driving support device 100 has a processor 51 and a memory 52.
  • the memory 52 stores a program for realizing the functions of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27.
  • when the processor 51 reads out and executes the stored program, the functions of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27 are realized.
  • the driving support device 100 has a processing circuit 53.
  • the functions of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27 are dedicated. It is realized by the processing circuit 53.
  • alternatively, the driving support device 100 may have a processor 51, a memory 52, and a processing circuit 53 (not shown). In this case, some of the functions of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27 are realized by the processor 51 and the memory 52, and the remaining functions are realized by the dedicated processing circuit 53.
  • the processor 51 uses, for example, at least one of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a microprocessor, a microcontroller, or a DSP (Digital Signal Processor).
  • the memory 52 is composed of a non-volatile memory or a non-volatile memory and a volatile memory.
  • the volatile memory in the memory 52 is, for example, one using RAM (Random Access Memory).
  • the non-volatile memory in the memory 52 uses, for example, at least one of a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory), or an HDD (Hard Disc Drive).
  • the processing circuit 53 is composed of a digital circuit or a digital circuit and an analog circuit.
  • the processing circuit 53 uses, for example, at least one of an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), an FPGA (Field-Programmable Gate Array), an SoC (System-on-a-Chip), or a system LSI (Large-Scale Integration).
  • the first image pickup apparatus 3 continuously executes a process of capturing an image of the front of the vehicle 1 at predetermined time intervals and outputting an image signal indicating the front image I1.
  • the second imaging device 4 continuously executes a process of capturing an image of the interior of the vehicle 1 at predetermined time intervals and outputting an image signal indicating the in-vehicle image I2.
  • first, in step ST1, the road traffic information detection unit 21 executes the road traffic information detection process.
  • next, in step ST3, the viewpoint detection unit 22 executes the viewpoint detection process.
  • in step ST4, the viewpoint distance coefficient calculation unit 23 calculates the viewpoint distance coefficient C.
  • in step ST5, the viewpoint distance coefficient correction unit 24 corrects the viewpoint distance coefficient C.
  • in step ST6, the recognition degree calculation unit 25 calculates the recognition degree D2.
  • when the determination in step ST2 is "NO" (that is, when the road traffic information display S is not detected), the processing of the driving support device 100 returns to step ST1. Further, following step ST6, the processing of the driving support device 100 returns to step ST1. That is, as described above, the road traffic information detection process is executed at predetermined time intervals.
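The per-frame flow of steps ST1 to ST6 can be sketched as follows. Every unit function here is a trivial stub standing in for the corresponding unit of the driving support device; the formulas and the frame data are illustrative assumptions, not the patent's actual processing.

```python
# Flow sketch of steps ST1-ST6 with stub implementations of each unit.

def detect_road_traffic_info(front_image):        # ST1: detection process
    return front_image.get("area"), front_image.get("content")

def detect_viewpoint(cabin_image):                # ST3: viewpoint process
    return cabin_image["viewpoint"]

def coefficient_for(area, viewpoint):             # ST4: coefficient C
    l = abs(area[0] - viewpoint[0]) + abs(area[1] - viewpoint[1])
    return 1.0 / (1.0 + l)

def corrected(c, content):                        # ST5: content-based fix
    return c / (1.0 + len(content))

d2 = 0.0
frames = [
    ({"area": (10, 5), "content": "stop"}, {"viewpoint": (8, 5)}),
    ({"area": (9, 5), "content": "stop"}, {"viewpoint": (9, 5)}),
]
for front, cabin in frames:
    area, content = detect_road_traffic_info(front)
    if area is None:                              # ST2: no display detected
        continue
    p = detect_viewpoint(cabin)                   # ST3
    c = coefficient_for(area, p)                  # ST4
    d2 += corrected(c, content)                   # ST6 integrates ST5 output
```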
  • the operation of the driving support device 100 will be described focusing on the operations of the recognition degree determination unit 26 and the display control unit 27.
  • the recognition degree determination unit 26 determines whether or not the road traffic information display S has gone out of the front image I1 by using the detection result by the road traffic information detection unit 21. When it is determined that the road traffic information display S has gone out of the front image I1, the recognition degree determination unit 26 executes the process of step ST11.
  • in step ST11, the recognition degree determination unit 26 acquires the recognition degree D2 calculated by the recognition degree calculation unit 25, and determines whether or not the acquired recognition degree D2 is equal to or greater than the threshold value Dth.
  • when the determination in step ST11 is "NO", the display control unit 27 executes the first display control in step ST12.
  • when the determination in step ST11 is "YES", the process of step ST12 is skipped.
  • the driving support device 100 includes the viewpoint distance coefficient correction unit 24 that corrects the viewpoint distance coefficient C based on the content of the road traffic information display S.
  • by using the corrected viewpoint distance coefficient C' for the calculation of the recognition degree D2, compared with a case where the viewpoint distance coefficient C before the correction is used for the calculation of the recognition degree D2 (that is, compared with the conventional driving support device), the accuracy of determining whether or not the road traffic information display S is recognized by the driver can be improved.
  • when the front image I1 includes a plurality of road traffic information displays S, the driving support device 100 may execute the processes of steps ST4 to ST6 for each of the plurality of road traffic information displays S (see FIG. 7). Further, in this case, the driving support device 100 may execute the processes of steps ST11 and ST12 for each of the plurality of road traffic information displays S.
  • the recognition degree determination unit 26 and the display control unit 27 may be provided outside the driving support device 100. That is, the main part of the driving support device 100 may be composed of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, and the recognition degree calculation unit 25.
  • as described above, the driving support device 100 of the first embodiment includes: the road traffic information detection unit 21, which detects the area A corresponding to the road traffic information display S in the front image I1 and also detects the content of the road traffic information display S; the viewpoint detection unit 22, which detects the position P corresponding to the driver's viewpoint in the front image I1; the viewpoint distance coefficient calculation unit 23, which calculates the viewpoint distance coefficient C corresponding to the distance L of the area A with respect to the position P; the viewpoint distance coefficient correction unit 24, which corrects the viewpoint distance coefficient C based on the content of the road traffic information display S; and the recognition degree calculation unit 25, which calculates the recognition degree D2 of the road traffic information display S by the driver by integrating the corrected viewpoint distance coefficient C'. By using the corrected viewpoint distance coefficient C' for calculating the recognition degree D2, it is possible to improve the accuracy of determining whether or not the road traffic information display S is recognized by the driver.
  • further, the driving support device 100 includes the recognition degree determination unit 26, which determines whether or not the road traffic information display S is recognized by the driver by determining whether or not the recognition degree D2 is equal to or higher than the threshold value Dth, and the display control unit 27, which, when it is determined that the road traffic information display S is not recognized by the driver, executes the control (first display control) to display the image I3 corresponding to the road traffic information display S on the display device 5.
  • the driver can confirm the road traffic information display S that he / she missed.
  • the recognition degree determination unit 26 determines whether or not the recognition degree D2 is equal to or greater than the threshold value Dth when the road traffic information display S goes out of the front image I1. As a result, when the vehicle 1 passes the installation position of the road traffic information display S, it is possible to determine whether or not the recognition degree D2 is equal to or higher than the threshold value Dth. That is, it is possible to determine whether or not the recognition degree D2 is equal to or higher than the threshold value Dth for the road traffic information display S that the driver may have missed.
  • FIG. 9 is a block diagram showing a main part of the driving support system including the driving support device according to the second embodiment.
  • the driving support device 100a of the second embodiment will be described with reference to FIG. Moreover, the driving support system 200a including the driving support device 100a will be described.
  • the same blocks as those shown in FIG. 1 are designated by the same reference numerals, and the description thereof will be omitted.
  • the control device 2a is provided in the vehicle 1.
  • the control device 2a is composed of, for example, an ECU.
  • the vehicle 1 is provided with sensors 6.
  • the sensors 6 include various sensors.
  • the sensors 6 include a wheel speed sensor, an acceleration sensor, a gyro sensor, a steering sensor, and a wiper sensor.
  • the vehicle 1 is provided with a wireless communication device 7.
  • the wireless communication device 7 is composed of a transmitter and a receiver for wireless communication.
  • the wireless communication device 7 is capable of wirelessly communicating with the server device 8 outside the vehicle 1.
  • the storage device 9 is provided in the vehicle 1.
  • the storage device 9 is composed of a non-volatile memory.
  • the storage device 9 may be integrally configured with the memory (that is, the memory 52) of the control device 2a.
  • the vehicle 1 is provided with an operation input device 10.
  • the operation input device 10 is composed of, for example, at least one of a touch panel, a hardware key, and a microphone for voice input.
  • the operation input device 10 is provided on the dashboard of the vehicle 1, for example.
  • the operation input device 10 may be provided integrally with the display device 5, or may be provided adjacent to the display device 5.
  • the main part of the driving support system 200a is composed of the control device 2a, the first imaging device 3, the second imaging device 4, the display device 5, the sensors 6, the wireless communication device 7, the storage device 9, and the operation input device 10.
  • the driving support device 100a is provided in the control device 2a.
  • the vehicle information acquisition unit 31 acquires information indicating the running state of the vehicle 1 (hereinafter referred to as "vehicle information").
  • vehicle information includes, for example, information indicating the traveling speed of the vehicle 1 and information indicating the turning amount of the vehicle 1.
  • the vehicle information acquisition unit 31 calculates the traveling speed of the vehicle 1 by using the output signal from the wheel speed sensor of the sensors 6. As a result, information indicating the traveling speed of the vehicle 1 is acquired.
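Deriving the traveling speed from a wheel speed sensor signal might look like the following. The pulses-per-revolution count and the tire circumference are illustrative assumptions, not values from the patent.

```python
# Sketch: traveling speed of the vehicle from wheel speed sensor pulses.
# Both constants below are assumed example values.

PULSES_PER_REV = 48          # assumed encoder pulses per wheel revolution
TIRE_CIRCUMFERENCE_M = 1.9   # assumed tire circumference in meters

def speed_mps(pulse_count, interval_s):
    """Traveling speed in m/s from the wheel-pulse count over an interval."""
    revolutions = pulse_count / PULSES_PER_REV
    return revolutions * TIRE_CIRCUMFERENCE_M / interval_s
```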
  • the vehicle information acquisition unit 31 calculates the turning amount of the vehicle 1 by using the output signal from the acceleration sensor, the gyro sensor, or the steering sensor of the sensors 6. As a result, information indicating the turning amount of the vehicle 1 is acquired.
  • the environmental information acquisition unit 32 acquires information indicating the environment outside the vehicle 1 (hereinafter referred to as "environmental information").
  • the environmental information includes, for example, information indicating the brightness around the vehicle 1 and information indicating the weather around the vehicle 1.
  • the environmental information acquisition unit 32 detects the brightness around the vehicle 1 by executing an image recognition process for the front image I1. As a result, information indicating the brightness of the surroundings of the vehicle 1 is acquired.
  • the environmental information acquisition unit 32 determines the weather around the vehicle 1 by executing the image recognition process for the front image I1. As a result, information indicating the weather around the vehicle 1 is acquired.
  • the environmental information acquisition unit 32 determines whether or not the wiper of the vehicle 1 is operating by using the output signal from the wiper sensor of the sensors 6. When the wiper of the vehicle 1 is operating, the environmental information acquisition unit 32 determines that the weather around the vehicle 1 is rain or snow. Further, the environmental information acquisition unit 32 determines that the weather around the vehicle 1 is sunny or cloudy when the wiper of the vehicle 1 is stopped. As a result, information indicating the weather around the vehicle 1 is acquired.
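The wiper-based weather determination described above can be sketched as follows (a minimal illustration; the function name and return values are hypothetical, not from the source):

```python
def infer_weather_from_wiper(wiper_active: bool) -> str:
    """Infer the weather around the vehicle 1 from the wiper state,
    as described for the environmental information acquisition unit 32."""
    # When the wiper is operating, the weather is judged to be rain or snow;
    # when it is stopped, the weather is judged to be sunny or cloudy.
    return "rain_or_snow" if wiper_active else "sunny_or_cloudy"
```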
  • the server device 8 stores information indicating the weather in each area.
  • the wireless communication device 7 receives information indicating the weather in the area including the current position of the vehicle 1.
  • the environment information acquisition unit 32 acquires the received information. As a result, information indicating the weather around the vehicle 1 is acquired.
  • the driver information acquisition unit 33 acquires information about the driver of the vehicle 1 (hereinafter referred to as "driver information").
  • the driver information includes, for example, information indicating the driver's eyesight.
  • vehicle 1 is used by a plurality of users.
  • the storage device 9 stores a database in which information indicating the visual acuity of each user and information indicating the face image of each user are associated with each other.
  • the information indicating the visual acuity of each user is, for example, input in advance by each user using the operation input device 10.
  • the face image of each user is, for example, captured in advance by the second imaging device 4.
  • the driver information acquisition unit 33 acquires the face image of the driver of the vehicle 1 by executing the image recognition process for the in-vehicle image I2.
  • the driver information acquisition unit 33 determines which of the plurality of users the driver of the vehicle 1 is by matching the acquired face image with the face image in the database.
  • the driver information acquisition unit 33 acquires information indicating the user's visual acuity corresponding to the result of the determination from the database. As a result, information indicating the eyesight of the driver of the vehicle 1 is acquired.
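The database lookup performed by the driver information acquisition unit 33 can be sketched as follows (a minimal illustration; the data layout and the `match` predicate stand in for the face-matching process, which is not specified in the source):

```python
def get_driver_eyesight(driver_face, user_db, match):
    """Return the visual acuity of the user whose registered face image
    matches the driver's face image; None if no user matches.

    user_db: list of (face_image, eyesight) pairs, as associated with each
             other in the database stored in the storage device 9.
    match:   predicate comparing two face images (stands in for the image
             recognition / matching process).
    """
    for face_image, eyesight in user_db:
        if match(driver_face, face_image):
            return eyesight
    return None
```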
  • the storage device 9 stores the following trained model. That is, a trained model is stored in which a value indicating the driver's visual acuity is output when a value indicating the result of the image recognition process for the in-vehicle image I2 is input.
  • the trained model is pre-generated by machine learning.
  • the driver information acquisition unit 33 executes an image recognition process for the in-vehicle image I2, and inputs a value indicating the result of the image recognition process into the trained model. As a result, information indicating the driver's eyesight is acquired.
  • vehicle information, environmental information, and driver information are collectively referred to as "correction information".
  • the vehicle information acquisition unit 31, the environment information acquisition unit 32, and the driver information acquisition unit 33 constitute the correction information acquisition unit 34.
  • the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C based on the content of the road traffic information display S. That is, the viewpoint distance coefficient correction unit 24a executes the same correction processing as the viewpoint distance coefficient correction unit 24. In addition to this, the viewpoint distance coefficient correction unit 24a executes at least one of the following correction processes (1) to (6).
  • the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C so that its value is larger when the size of the area A is large than when the size of the area A is small. In other words, the viewpoint distance coefficient C is corrected to a smaller value when the size of the area A is small. The detection result by the road traffic information detection unit 21 is used for this correction process.
  • the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C so that its value is smaller when the traveling speed of the vehicle 1 is high than when the traveling speed of the vehicle 1 is low. In other words, the viewpoint distance coefficient C is corrected to a larger value when the traveling speed of the vehicle 1 is low. The vehicle information acquired by the vehicle information acquisition unit 31 is used for this correction process.
  • the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C so that its value is smaller when the turning amount of the vehicle 1 is large than when the turning amount of the vehicle 1 is small. In other words, the viewpoint distance coefficient C is corrected to a larger value when the turning amount of the vehicle 1 is small. The vehicle information acquired by the vehicle information acquisition unit 31 is used for this correction process.
  • the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C so that its value is smaller when the surroundings of the vehicle 1 are dark than when the surroundings of the vehicle 1 are bright. In other words, the viewpoint distance coefficient C is corrected to a larger value when the surroundings of the vehicle 1 are bright. The environmental information acquired by the environmental information acquisition unit 32 is used for this correction process.
  • the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C so that its value is smaller when the weather around the vehicle 1 is rain or snow than when the weather around the vehicle 1 is sunny or cloudy. In other words, the viewpoint distance coefficient C is corrected to a larger value when the weather around the vehicle 1 is sunny or cloudy. The environmental information acquired by the environmental information acquisition unit 32 is used for this correction process.
  • the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C so that its value is smaller when the driver's visual acuity is low than when the driver's visual acuity is high. In other words, the viewpoint distance coefficient C is corrected to a larger value when the driver's visual acuity is high. The driver information acquired by the driver information acquisition unit 33 is used for this correction process.
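Taken together, correction processes (1) to (6) each push the viewpoint distance coefficient C up or down. The following is a minimal sketch of that directional logic; the multiplicative factors `up` and `down` and the boolean inputs are illustrative assumptions, since the source specifies only the direction of each correction, not its magnitude:

```python
def corrected_coefficient(c: float, *, large_area: bool, high_speed: bool,
                          large_turning: bool, dark: bool,
                          rain_or_snow: bool, low_eyesight: bool,
                          up: float = 1.2, down: float = 0.8) -> float:
    """Apply correction processes (1)-(6) to the viewpoint distance coefficient C.

    Each condition scales C in the direction stated in the description;
    the factors `up`/`down` are placeholders for the actual correction amounts.
    """
    c *= up if large_area else down      # (1) size of the area A
    c *= down if high_speed else up      # (2) traveling speed of the vehicle 1
    c *= down if large_turning else up   # (3) turning amount of the vehicle 1
    c *= down if dark else up            # (4) brightness around the vehicle 1
    c *= down if rain_or_snow else up    # (5) weather around the vehicle 1
    c *= down if low_eyesight else up    # (6) driver's visual acuity
    return c
```

As expected, an unfavorable condition (for example, a high traveling speed) yields a smaller corrected coefficient than the corresponding favorable one.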
  • the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24a, the recognition degree calculation unit 25, the recognition degree determination unit 26, the display control unit 27, and the correction information acquisition unit 34 constitute the main part of the driving support device 100a.
  • each function of the units listed above may be realized by the processor 51 and the memory 52, or may be realized by the dedicated processing circuit 53.
  • in step ST1, the road traffic information detection unit 21 executes the road traffic information detection process.
  • the viewpoint detection unit 22 executes the viewpoint detection process in step ST3.
  • in step ST4, the viewpoint distance coefficient calculation unit 23 calculates the viewpoint distance coefficient C.
  • in step ST7, the correction information acquisition unit 34 acquires the correction information.
  • in step ST5a, the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C.
  • the viewpoint distance coefficient correction unit 24a executes at least one of the above correction processes (1) to (6) in addition to the correction process based on the content of the road traffic information display S.
  • when the viewpoint distance coefficient correction unit 24a executes only the correction process (1) among the above correction processes (1) to (6), the correction information is unnecessary. In this case, the process of step ST7 may be skipped.
  • in step ST6, the recognition degree calculation unit 25 calculates the recognition degree D2.
  • since the viewpoint distance coefficient correction unit 24a executes at least one of the above correction processes (1) to (6) in addition to the correction process based on the content of the road traffic information display S, and the corrected viewpoint distance coefficient C' is used for calculating the recognition degree D2, it is possible to further improve the accuracy of determining whether or not the road traffic information display S is recognized by the driver.
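The calculation of the recognition degree D2 (step ST6) integrates the corrected coefficient C' over successive front images, as stated in the abstract. A sketch of that running sum; representing the per-frame coefficients as a list is an illustrative assumption:

```python
def recognition_degree(corrected_coefficients):
    """Recognition degree D2: the integral (running sum) of the corrected
    viewpoint distance coefficients C' over successive front-image frames."""
    d2 = 0.0
    for c_prime in corrected_coefficients:
        d2 += c_prime
    return d2
```

The resulting D2 is then compared against the threshold value Dth by the recognition degree determination unit 26.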
  • the correction information acquisition unit 34 may be provided outside the driving support device 100a. That is, the main part of the driving support device 100a may be composed of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24a, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27.
  • the recognition degree determination unit 26 and the display control unit 27 may be provided outside the driving support device 100a. That is, the main part of the driving support device 100a may be composed of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24a, and the recognition degree calculation unit 25.
  • the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C based on the content of the road traffic information display S, and also corrects the viewpoint distance coefficient C based on the size of the area A. As a result, the accuracy of determining whether or not the road traffic information display S is recognized by the driver can be further improved.
  • the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C based on the content of the road traffic information display S, and corrects the viewpoint distance coefficient C based on the traveling state of the vehicle 1. More specifically, the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C based on the traveling speed of the vehicle 1 or the turning amount of the vehicle 1. As a result, the accuracy of determining whether or not the road traffic information display S is recognized by the driver can be further improved.
  • the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C based on the content of the road traffic information display S, and corrects the viewpoint distance coefficient C based on the environment outside the vehicle 1. More specifically, the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C based on the brightness around the vehicle 1 or the weather around the vehicle 1. As a result, the accuracy of determining whether or not the road traffic information display S is recognized by the driver can be further improved.
  • the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C based on the content of the road traffic information display S, and corrects the viewpoint distance coefficient C based on the driver's visual acuity. As a result, the accuracy of determining whether or not the road traffic information display S is recognized by the driver can be further improved.
  • FIG. 13 is a block diagram showing a main part of the driving support system including the driving support device according to the third embodiment.
  • the driving support device 100b of the third embodiment will be described with reference to FIG. Moreover, the driving support system 200b including the driving support device 100b will be described.
  • the same blocks as those shown in FIG. 1 are designated by the same reference numerals, and the description thereof will be omitted.
  • control device 2b is provided in the vehicle 1.
  • the control device 2b is composed of, for example, an ECU.
  • the vehicle 1 is provided with the voice output device 11.
  • the audio output device 11 is composed of, for example, a speaker.
  • the main part of the driving support system 200b is composed of the control device 2b, the first imaging device 3, the second imaging device 4, the display device 5, and the audio output device 11.
  • the driving support device 100b is provided in the control device 2b.
  • the importance determination unit 41 acquires the detection result by the road traffic information detection unit 21.
  • the importance determination unit 41 determines the importance D3 of the road traffic information display S based on the content of the road traffic information display S by using the acquired detection result.
  • the importance D3 is determined to be, for example, one of three values (that is, a first value, a second value, or a third value) according to the magnitude of the influence that the content of the road traffic information display S has on the driving of the vehicle 1. The larger the value of the importance D3, the greater the influence that the content of the road traffic information display S has on the driving of the vehicle 1.
  • the importance determination unit 41 determines that the importance D3 is the third value when the content of the road traffic information display S requests the stop of the vehicle 1 (for example, when the road traffic information display S is a regulatory sign meaning "closed"). Further, the importance determination unit 41 determines that the importance D3 is the second value when the content of the road traffic information display S requests deceleration or a lane change of the vehicle 1. In other cases, the importance determination unit 41 determines that the importance D3 is the first value.
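The three-level determination above can be sketched as follows (a minimal illustration; the content strings and their grouping into categories are hypothetical stand-ins for the detection result):

```python
def importance(content: str) -> int:
    """Importance D3 of the road traffic information display S.

    Returns 3 when the display requests the vehicle to stop (e.g. a "closed"
    regulatory sign), 2 when it requests deceleration or a lane change,
    and 1 otherwise.
    """
    if content in ("stop", "closed"):             # requests the stop of the vehicle 1
        return 3
    if content in ("decelerate", "lane_change"):  # requests deceleration or lane change
        return 2
    return 1
```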
  • when the importance determination unit 41 determines that the importance D3 is equal to or greater than a predetermined reference value Dref (for example, the second value), the voice output control unit 42 executes control for causing the voice output device 11 to output a voice corresponding to the road traffic information display S (hereinafter referred to as "voice output control").
  • the voice is, for example, a voice that reads out a sentence corresponding to the content of the road traffic information display S.
  • the output control unit 43 is composed of the display control unit 27 and the audio output control unit 42.
  • the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, the display control unit 27, the importance determination unit 41, and the voice output control unit 42 constitute the main part of the driving support device 100b.
  • the recognition degree determination unit 26 acquires the recognition degree D2 calculated by the recognition degree calculation unit 25, and determines whether or not the acquired recognition degree D2 is equal to or greater than the threshold value Dth.
  • the importance degree determination unit 41 determines the importance degree D3 of the road traffic information display S in step ST13.
  • the output control unit 43 executes the first display control in step ST12.
  • the output control unit 43 executes the first display control and the audio output control in step ST14.
  • the display control unit 27 executes the first display control.
  • the voice output control unit 42 executes voice output control.
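The branching of steps ST11 to ST14, as inferred from the description above, can be sketched as follows (the step mapping and the assumption that Dref equals the second value are illustrative):

```python
def decide_outputs(d2: float, d3: int, dth: float, dref: int = 2):
    """Return the set of controls executed by the output control unit 43.

    When the recognition degree D2 is at or above the threshold Dth, the
    road traffic information display S is considered recognized by the
    driver and no control is executed.
    """
    if d2 >= dth:
        return set()                       # recognized: nothing to do
    if d3 >= dref:
        return {"first_display", "voice"}  # step ST14: display plus voice output
    return {"first_display"}               # step ST12: display only
```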
  • the driving support device 100b may have the same viewpoint distance coefficient correction unit 24a as the driving support device 100a instead of the viewpoint distance coefficient correction unit 24. Further, the control device 2b may have the same correction information acquisition unit 34 as the control device 2a.
  • the driving support system 200b may include sensors 6, a wireless communication device 7, a storage device 9, and an operation input device 10 similar to the driving support system 200a.
  • the driving support device 100b can employ various modifications similar to those described in the first and second embodiments.
  • the driving support device 100b of the third embodiment includes the importance determination unit 41 that determines, based on the content of the road traffic information display S, whether or not the importance D3 of the road traffic information display S is equal to or higher than the reference value Dref, and the voice output control unit 42 that executes control for causing the voice output device 11 to output a voice corresponding to the road traffic information display S.
  • FIG. 15 is a block diagram showing a main part of the driving support system including the driving support device according to the fourth embodiment.
  • the driving support device 100c of the fourth embodiment will be described with reference to FIG. 15.
  • the driving support system 200c including the driving support device 100c will be described.
  • the same blocks as those shown in FIG. 1 are designated by the same reference numerals, and the description thereof will be omitted.
  • the same blocks as those shown in FIG. 9 are designated by the same reference numerals, and the description thereof will be omitted.
  • control device 2c is provided in the vehicle 1.
  • the control device 2c is composed of, for example, an ECU.
  • the main part of the driving support system 200c is composed of the control device 2c, the first imaging device 3, the second imaging device 4, the display device 5, the storage device 9, and the operation input device 10.
  • the driving support device 100c is provided in the control device 2c.
  • the road traffic information detection unit 21a executes the same road traffic information detection process as the road traffic information detection unit 21.
  • the road traffic information detection unit 21a stores the data indicating the road traffic information display S detected by the road traffic information detection process in the storage device 9.
  • the storage device 9 holds the stored data for a predetermined time.
  • the display control unit 27a executes the same first display control as the display control unit 27. In addition to this, when an operation instructing the display of the image I3 corresponding to the road traffic information display S indicated by the data stored in the storage device 9 is input to the operation input device 10, the display control unit 27a executes control for displaying the image I3 corresponding to the road traffic information display S indicated by the stored data on the display device 5 (hereinafter referred to as "second display control").
  • the driver can confirm the content of the road traffic information display S that the vehicle 1 has passed at any timing regardless of whether or not he / she overlooked it.
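The retention behavior described above (holding detected displays for a predetermined time) can be sketched as a time-stamped store; the class name, eviction scheme, and data format are illustrative assumptions:

```python
import time


class SignStore:
    """Holds data indicating detected road traffic information displays S
    for a predetermined time, as the storage device 9 does in this embodiment."""

    def __init__(self, retention_seconds: float):
        self.retention = retention_seconds
        self._entries = []  # list of (timestamp, sign_data)

    def add(self, sign_data, now: float = None):
        """Store data for a newly detected display (road traffic
        information detection unit 21a)."""
        now = time.time() if now is None else now
        self._entries.append((now, sign_data))

    def recent(self, now: float = None):
        """Return the stored data still within the retention period,
        dropping anything older."""
        now = time.time() if now is None else now
        self._entries = [(t, s) for t, s in self._entries
                         if now - t <= self.retention]
        return [s for _, s in self._entries]
```

The second display control would then draw the image I3 from the data returned by `recent()` when the driver requests it.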
  • the main part of the driving support device 100c is composed of the road traffic information detection unit 21a, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27a.
  • the functions of the road traffic information detection unit 21a, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27a are It may be realized by the processor 51 and the memory 52, or may be realized by the dedicated processing circuit 53.
  • the recognition degree determination unit 26 may be provided outside the driving support device 100c. That is, the main part of the driving support device 100c may be composed of the road traffic information detection unit 21a, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, and the recognition degree calculation unit 25.
  • the driving support device 100c may have the same viewpoint distance coefficient correction unit 24a as the driving support device 100a instead of the viewpoint distance coefficient correction unit 24. Further, the control device 2c may have the same correction information acquisition unit 34 as the control device 2a. In this case, the driving support system 200c may include the same sensors 6 and the wireless communication device 7 as the driving support system 200a.
  • the driving support device 100c may have the same importance determination unit 41 and voice output control unit 42 as the driving support device 100b.
  • the driving support system 200c may include an audio output device 11 similar to the driving support system 200b.
  • the driving support device 100c of the fourth embodiment includes the display control unit 27a that executes control for displaying the image I3 corresponding to the road traffic information display S on the display device 5 (second display control) in response to the operation input to the operation input device 10.
  • the driver can confirm the content of the road traffic information display S that the vehicle 1 has passed at any timing regardless of whether or not he / she overlooked it.
  • in the invention of the present application, the embodiments can be freely combined, any constituent element of each embodiment can be modified, or any constituent element can be omitted in each embodiment.
  • the driving support device of the present invention can be used for driving support of a vehicle.

Abstract

A driving assistance device (100) is provided with: a road traffic information detection unit (21) which detects, in a forward image (I1), a region (A) corresponding to a road traffic information sign (S), and also detects the content of the road traffic information sign (S); a viewpoint detection unit (22) which uses a vehicle interior image (I2) to detect, in the forward image (I1), a position (P) corresponding to the viewpoint of a driver; a viewpoint distance coefficient calculation unit (23) which calculates a viewpoint distance coefficient (C) corresponding to the distance (L) of the region (A) with respect to the position (P); a viewpoint distance coefficient correction unit (24) which corrects the viewpoint distance coefficient (C) on the basis of the content of the road traffic information sign (S); and an awareness level calculation unit (25) which calculates the driver's level of awareness (D2) of the road traffic information sign (S) by integrating corrected viewpoint distance coefficients (C').

Description

Driving support device
The present invention relates to a driving support device.
Patent Document 1 discloses a technique for calculating the visibility of a safety confirmation object by the driver based on the distance of the safety confirmation object with respect to the driver's line-of-sight direction. As a result, not only the visibility of the safety confirmation object being watched by the driver is calculated, but also the visibility of safety confirmation objects included in the so-called "peripheral vision" is calculated.
International Publication No. 2008/029802
Conventionally, signs, display boards, signboards, and the like showing various information about roads (hereinafter collectively referred to as "road information displays") have been installed in various places. In addition, signs, display boards, signboards, and the like showing various information related to traffic (hereinafter collectively referred to as "traffic information displays") are also installed in various places. Hereinafter, road information displays and traffic information displays are collectively referred to as "road traffic information displays".
As described above, the technique described in Patent Document 1 calculates the visibility of a safety confirmation object by the driver. By diverting the technique described in Patent Document 1, it is conceivable to calculate the degree of recognition of the road traffic information display by the driver (hereinafter referred to as the "recognition degree") instead of the visibility of the safety confirmation object (this approach is hereinafter referred to as the "diverted technology").
Here, the recognition degree of the road traffic information display by the driver takes different values not only according to the distance of the road traffic information display with respect to the line-of-sight direction but also according to the content of the road traffic information display. Therefore, when calculating the recognition degree of the road traffic information display by the driver, it is preferable to consider not only the distance of the road traffic information display with respect to the line-of-sight direction but also the content of the road traffic information display.
The technique described in Patent Document 1 considers the distance of the safety confirmation object with respect to the line-of-sight direction when calculating the visibility of the safety confirmation object by the driver, but does not consider the content of the safety confirmation object. Therefore, the diverted technology does not consider the content of the road traffic information display when calculating the recognition degree of the road traffic information display by the driver. As a result, there is a problem that the accuracy of determining whether or not the road traffic information display has been recognized by the driver is low.
The present invention has been made to solve the above problems, and an object of the present invention is to improve the accuracy of determining whether or not the road traffic information display has been recognized by the driver.
The driving support device of the present invention includes: a road traffic information detection unit that detects an area corresponding to the road traffic information display in a front image and also detects the content of the road traffic information display; a viewpoint detection unit that detects a position corresponding to the viewpoint of the driver in the front image by using an in-vehicle image; a viewpoint distance coefficient calculation unit that calculates a viewpoint distance coefficient according to the distance of the area with respect to the position; a viewpoint distance coefficient correction unit that corrects the viewpoint distance coefficient based on the content of the road traffic information display; and a recognition degree calculation unit that calculates the recognition degree of the road traffic information display by the driver by integrating the corrected viewpoint distance coefficients.
According to the present invention, since it is configured as described above, it is possible to improve the accuracy of determining whether or not the road traffic information display is recognized by the driver.
FIG. 1 is a block diagram showing the main part of a driving support system including a driving support device according to Embodiment 1.
FIG. 2A is an explanatory diagram showing an example of a forward image.
FIG. 2B is an explanatory diagram showing the region corresponding to a road traffic information display in the forward image shown in FIG. 2A.
FIG. 2C is an explanatory diagram showing the position corresponding to the driver's viewpoint in the forward image shown in FIG. 2A.
FIG. 2D is an explanatory diagram showing the distance of said region from said position in the forward image shown in FIG. 2A.
FIG. 3A is an explanatory diagram showing another example of a forward image.
FIG. 3B is an explanatory diagram showing the region corresponding to a road traffic information display in the forward image shown in FIG. 3A.
FIG. 3C is an explanatory diagram showing the position corresponding to the driver's viewpoint in the forward image shown in FIG. 3A.
FIG. 3D is an explanatory diagram showing the distance of said region from said position in the forward image shown in FIG. 3A.
FIG. 4A is an explanatory diagram showing a hardware configuration of the driving support device according to Embodiment 1.
FIG. 4B is an explanatory diagram showing another hardware configuration of the driving support device according to Embodiment 1.
FIG. 5 is a flowchart showing the operation of the driving support device according to Embodiment 1.
FIG. 6 is a flowchart showing another operation of the driving support device according to Embodiment 1.
FIG. 7A is an explanatory diagram showing another example of a forward image.
FIG. 7B is an explanatory diagram showing the region corresponding to a road traffic information display in the forward image shown in FIG. 7A.
FIG. 7C is an explanatory diagram showing the position corresponding to the driver's viewpoint in the forward image shown in FIG. 7A.
FIG. 7D is an explanatory diagram showing the distance of said region from said position in the forward image shown in FIG. 7A.
FIG. 8 is a block diagram showing the main part of a driving support system including another driving support device according to Embodiment 1.
FIG. 9 is a block diagram showing the main part of a driving support system including a driving support device according to Embodiment 2.
FIG. 10 is a flowchart showing the operation of the driving support device according to Embodiment 2.
FIG. 11 is a block diagram showing the main part of a driving support system including another driving support device according to Embodiment 2.
FIG. 12 is a block diagram showing the main part of a driving support system including another driving support device according to Embodiment 2.
FIG. 13 is a block diagram showing the main part of a driving support system including a driving support device according to Embodiment 3.
FIG. 14 is a flowchart showing the operation of the driving support device according to Embodiment 3.
FIG. 15 is a block diagram showing the main part of a driving support system including a driving support device according to Embodiment 4.
FIG. 16 is a block diagram showing the main part of a driving support system including another driving support device according to Embodiment 4.
 Hereinafter, in order to describe the present invention in more detail, modes for carrying out the invention will be described with reference to the accompanying drawings.
Embodiment 1.
 FIG. 1 is a block diagram showing the main part of a driving support system including a driving support device according to Embodiment 1. The driving support device 100 of Embodiment 1 will be described with reference to FIG. 1. A driving support system 200 including the driving support device 100 will also be described.
 As shown in FIG. 1, a vehicle 1 is provided with a control device 2, a first imaging device 3, a second imaging device 4, and a display device 5. The control device 2, the first imaging device 3, the second imaging device 4, and the display device 5 constitute the main part of the driving support system 200.
 The first imaging device 3 captures images of the area ahead of the vehicle 1 at predetermined time intervals and outputs an image signal representing each captured image (hereinafter referred to as a "forward image") I1. The first imaging device 3 is constituted by, for example, an infrared camera or a visible-light camera, and is provided, for example, at the front end of the vehicle 1, on the dashboard of the vehicle 1, or on the ceiling at the front of the vehicle interior.
 The second imaging device 4 captures images of the interior of the vehicle 1 at predetermined time intervals and outputs an image signal representing each captured image (hereinafter referred to as an "in-vehicle image") I2. The second imaging device 4 is constituted by, for example, an infrared camera or a visible-light camera, and is provided, for example, on the dashboard of the vehicle 1.
 The display device 5 is constituted by, for example, a liquid crystal display or an organic EL (Electro Luminescence) display provided on the dashboard of the vehicle 1. Alternatively, the display device 5 is constituted by, for example, a HUD (Head-Up Display).
 The control device 2 is constituted by, for example, an ECU (Electronic Control Unit). The driving support device 100 is provided in the control device 2. The driving support device 100 is described below.
 The road traffic information detection unit 21 acquires the image signal output by the first imaging device 3. Using the acquired image signal, the road traffic information detection unit 21 executes image recognition processing on the forward image I1, thereby detecting a road traffic information display S included in the forward image I1. More specifically, the road traffic information detection unit 21 detects the region A corresponding to the road traffic information display S in the forward image I1 and also detects the content of the road traffic information display S. Hereinafter, the processing in which the road traffic information detection unit 21 detects the region A and the content of the road traffic information display S is referred to as the "road traffic information detection processing".
 The viewpoint detection unit 22 acquires the image signal output by the second imaging device 4. Using the acquired image signal, the viewpoint detection unit 22 executes image recognition processing on the in-vehicle image I2, thereby detecting the position in the forward image I1 corresponding to the driver's viewpoint (hereinafter referred to as the "viewpoint position") P. Hereinafter, the processing in which the viewpoint detection unit 22 detects the viewpoint position P is referred to as the "viewpoint detection processing".
 For example, the viewpoint detection unit 22 executes image recognition processing on the in-vehicle image I2 to detect the position of the driver's head (more specifically, the position of the driver's eyes) and the driver's gaze vector. A table showing the correspondence between the position of the driver's eyes, the driver's gaze vector, and the viewpoint position P in the forward image I1 is stored in the viewpoint detection unit 22 in advance. Using the stored table, the viewpoint detection unit 22 detects the viewpoint position P in the forward image I1.
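The table-based mapping described above can be sketched as follows. This is a minimal illustration assuming a nearest-neighbour lookup over discrete table entries; the patent only states that a correspondence table between eye position, gaze vector, and viewpoint position P is stored, without specifying the table format or lookup method, so all of the below is an assumption.

```python
# Illustrative sketch of the stored correspondence table of the viewpoint
# detection unit 22. The entry layout and the nearest-neighbour lookup
# are assumptions; the patent does not specify them.

def nearest_viewpoint(eye_pos, gaze_vec, table):
    """Return the viewpoint position P (pixel coordinates in the forward
    image I1) of the table entry closest to the measured eye position
    and gaze vector.

    table: list of ((eye_pos, gaze_vec), viewpoint_px) tuples.
    """
    def sq_dist(a, b):
        # Squared Euclidean distance between two equal-length tuples.
        return sum((x - y) ** 2 for x, y in zip(a, b))

    key = tuple(eye_pos) + tuple(gaze_vec)
    entry = min(table,
                key=lambda e: sq_dist(tuple(e[0][0]) + tuple(e[0][1]), key))
    return entry[1]
```

In practice the stored table would likely be interpolated rather than snapped to the nearest entry, but the lookup structure is the same.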
 The viewpoint distance coefficient calculation unit 23 acquires the detection results of the road traffic information detection unit 21 and the viewpoint detection unit 22. Using the acquired detection results, the viewpoint distance coefficient calculation unit 23 calculates the distance L of the region A from the viewpoint position P. The viewpoint distance coefficient calculation unit 23 also calculates a coefficient (hereinafter referred to as the "viewpoint distance coefficient") C corresponding to the calculated distance L. That is, the viewpoint distance coefficient C is a value that gradually decreases as the distance L increases; in other words, it gradually increases as the distance L decreases.
 For example, the viewpoint distance coefficient calculation unit 23 calculates the distance L in pixels based on the difference between the coordinate value indicating the viewpoint position P in the forward image I1 (hereinafter the "first coordinate value") and the coordinate value indicating the position of a predetermined part of the region A in the forward image I1, such as its center or its upper-left corner (hereinafter the "second coordinate value"). The viewpoint distance coefficient calculation unit 23 then calculates the viewpoint distance coefficient C corresponding to the calculated distance L in pixels.
 Alternatively, for example, the viewpoint distance coefficient calculation unit 23 calculates the distance L in pixels based on the difference between the first coordinate value and the second coordinate value. A table showing the correspondence between the first coordinate value, the second coordinate value, and a coefficient for converting the distance L in pixels into a distance L in meters is stored in the viewpoint distance coefficient calculation unit 23 in advance. Using the stored table, the viewpoint distance coefficient calculation unit 23 converts the calculated distance L in pixels into a distance L in meters, and then calculates the viewpoint distance coefficient C corresponding to the distance L in meters.
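The pixel-distance calculation and the monotonically decreasing coefficient can be sketched as follows. The 1/(1 + L/scale) mapping and the scale value are illustrative assumptions; the patent only requires that C decrease as L increases, and does not give a formula.

```python
import math

def pixel_distance(viewpoint_p, region_a_point):
    """Distance L in pixels between the viewpoint position P (first
    coordinate value) and a reference point of region A, such as its
    center (second coordinate value)."""
    return math.hypot(region_a_point[0] - viewpoint_p[0],
                      region_a_point[1] - viewpoint_p[1])

def viewpoint_distance_coefficient(distance_l, scale=100.0):
    """Viewpoint distance coefficient C: largest when the driver looks
    directly at region A (L = 0) and shrinking as L grows. The exact
    functional form and the scale parameter are assumptions."""
    return 1.0 / (1.0 + distance_l / scale)
```

The same coefficient function applies unchanged whether L is expressed in pixels or converted to meters via the stored table; only the scale parameter would differ.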
 FIG. 2A shows an example of the forward image I1. The forward image I1 shown in FIG. 2A includes a road sign (more specifically, a regulatory sign) meaning "no entry for vehicles", that is, a road traffic information display S. This regulatory sign consists of a single symbol (hereinafter referred to as a "mark") and contains no character string.
 In this case, the road traffic information detection unit 21 detects the region A corresponding to the road traffic information display S (see FIG. 2B) and detects the content of the road traffic information display S. More specifically, the road traffic information detection unit 21 detects that the road traffic information display S is a regulatory sign meaning "no entry for vehicles". At this time, it is also detected that the road traffic information display S consists of a single mark and contains no character string. Next, the viewpoint detection unit 22 detects the viewpoint position P in the forward image I1 (see FIG. 2C). The viewpoint distance coefficient calculation unit 23 then calculates the distance L of the region A from the viewpoint position P (see FIG. 2D) and calculates the viewpoint distance coefficient C corresponding to the calculated distance L.
 FIG. 3A shows another example of the forward image I1. The forward image I1 shown in FIG. 3A includes a signboard containing the character string "この先/通行止め" (meaning "road closed ahead"), that is, a road traffic information display S.
 In this case, the road traffic information detection unit 21 detects the region A corresponding to the road traffic information display S (see FIG. 3B) and detects the content of the road traffic information display S. More specifically, the road traffic information detection unit 21 detects that the road traffic information display S is a signboard containing the character string "この先/通行止め" ("road closed ahead"). Next, the viewpoint detection unit 22 detects the viewpoint position P in the forward image I1 (see FIG. 3C). The viewpoint distance coefficient calculation unit 23 then calculates the distance L of the region A from the viewpoint position P (see FIG. 3D) and calculates the viewpoint distance coefficient C corresponding to the calculated distance L.
 The viewpoint distance coefficient correction unit 24 acquires the viewpoint distance coefficient C calculated by the viewpoint distance coefficient calculation unit 23, and also acquires the detection result of the road traffic information detection unit 21. Using the acquired detection result, the viewpoint distance coefficient correction unit 24 corrects the acquired viewpoint distance coefficient C based on the content of the road traffic information display S.
 That is, the viewpoint distance coefficient correction unit 24 calculates a difficulty D1 of recognizing the road traffic information display S based on factors such as the number of marks included in the road traffic information display S, the presence or absence of a character string in the road traffic information display S, and the number of characters in any included character string. The difficulty D1 is also an index of the complexity of the road traffic information display S.
 For example, the road traffic information display S in the forward image I1 shown in FIG. 2 consists of a single mark and contains no character string, whereas the road traffic information display S in the forward image I1 shown in FIG. 3 contains a character string. In this case, the difficulty D1 of the former (FIG. 2) is calculated to be smaller than that of the latter (FIG. 3); in other words, the difficulty D1 of the latter (FIG. 3) is calculated to be larger than that of the former (FIG. 2).
 The viewpoint distance coefficient correction unit 24 corrects the viewpoint distance coefficient C so that it takes a smaller value when the difficulty D1 is large than when the difficulty D1 is small; in other words, so that it takes a larger value when the difficulty D1 is small than when the difficulty D1 is large.
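The content-based correction can be sketched as follows. The weights in the difficulty score D1 and the division-based correction are illustrative assumptions: the patent specifies only which factors enter D1 and the direction of the correction (harder content yields a smaller corrected coefficient), not a formula.

```python
def difficulty(mark_count, char_count):
    """Difficulty D1 of recognizing display S: grows with the number of
    marks and the number of characters. The baseline of 1.0 and the
    weights 0.5 and 0.2 are assumptions, not values from the patent."""
    return 1.0 + 0.5 * max(mark_count - 1, 0) + 0.2 * char_count

def corrected_coefficient(c, d1):
    """Corrected viewpoint distance coefficient C': smaller for a larger
    difficulty D1, unchanged at the baseline difficulty of 1.0."""
    return c / d1
```

With these assumed weights, the single-mark sign of FIG. 2 gives the baseline difficulty, leaving C unchanged, while the text signboard of FIG. 3 gives a difficulty above 1.0 and thus C' < C.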
 As described above, the forward image I1 is captured at predetermined time intervals, and the road traffic information detection unit 21 executes the road traffic information detection processing at predetermined time intervals. Therefore, while the forward image I1 contains the road traffic information display S, the viewpoint distance coefficient C is calculated at predetermined time intervals, and each calculated viewpoint distance coefficient C is corrected in turn. The recognition degree calculation unit 25 calculates the driver's recognition degree D2 of the road traffic information display S by integrating the corrected viewpoint distance coefficients C' over time. That is, the recognition degree D2 is updated every time a viewpoint distance coefficient C is calculated and corrected.
 The recognition degree determination unit 26 acquires the detection result of the road traffic information detection unit 21 (in FIG. 1, the connection line between the road traffic information detection unit 21 and the recognition degree determination unit 26 is omitted). Normally, the road traffic information display S in the forward image I1 eventually moves out of the forward image I1 as the vehicle 1 travels (that is, with the passage of time during travel). Using the acquired detection result, the recognition degree determination unit 26 determines whether or not the road traffic information display S has moved out of the forward image I1.
 When it is determined that the road traffic information display S has moved out of the forward image I1, the recognition degree determination unit 26 acquires the recognition degree D2 calculated by the recognition degree calculation unit 25. The recognition degree determination unit 26 determines whether or not the road traffic information display S has been recognized by the driver by determining whether or not the acquired recognition degree D2 is equal to or greater than a predetermined threshold Dth. That is, the threshold Dth is set to a value that makes it possible to determine whether or not the road traffic information display S has been recognized by the driver.
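The per-frame accumulation of C' into the recognition degree D2 and the threshold test once the display leaves the forward image can be sketched as follows. The simple running sum and the example threshold are illustrative; the patent states only that C' is integrated over time and that D2 is compared with Dth.

```python
class RecognitionTracker:
    """Illustrative sketch: accumulates corrected viewpoint distance
    coefficients C' into a recognition degree D2 and compares it with a
    threshold Dth. The plain running sum is an assumption about how the
    temporal integration is realized."""

    def __init__(self, threshold_dth):
        self.threshold_dth = threshold_dth
        self.d2 = 0.0

    def update(self, corrected_c):
        # Called once per captured forward image while display S is visible.
        self.d2 += corrected_c

    def recognized(self):
        # Evaluated once display S has moved out of the forward image I1;
        # True means the first display control need not be executed.
        return self.d2 >= self.threshold_dth
```

A long glance near the display (large C' per frame) or many frames of peripheral vision (small C' per frame) can both push D2 past Dth, which matches the integration described above.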
 When the recognition degree determination unit 26 determines that the recognition degree D2 is less than the threshold Dth, the display control unit 27 executes control to display an image I3 corresponding to the road traffic information display S on the display device 5 (hereinafter referred to as the "first display control"). By viewing the image I3, the driver can confirm the road traffic information display S that he or she missed.
 The road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27 constitute the main part of the driving support device 100.
 Next, the hardware configuration of the main part of the driving support device 100 will be described with reference to FIG. 4.
 As shown in FIG. 4A, the driving support device 100 has a processor 51 and a memory 52. The memory 52 stores a program for realizing the functions of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27. These functions are realized by the processor 51 reading and executing the stored program.
 Alternatively, as shown in FIG. 4B, the driving support device 100 has a processing circuit 53. In this case, the functions of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27 are realized by the dedicated processing circuit 53.
 Alternatively, the driving support device 100 has a processor 51, a memory 52, and a processing circuit 53 (not shown). In this case, some of the functions of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27 are realized by the processor 51 and the memory 52, and the remaining functions are realized by the dedicated processing circuit 53.
 The processor 51 uses, for example, at least one of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a microprocessor, a microcontroller, and a DSP (Digital Signal Processor).
 The memory 52 is constituted by a nonvolatile memory, or by a nonvolatile memory and a volatile memory. The volatile memory in the memory 52 uses, for example, a RAM (Random Access Memory). The nonvolatile memory in the memory 52 uses, for example, at least one of a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read-Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), an SSD (Solid State Drive), and an HDD (Hard Disk Drive).
 The processing circuit 53 is constituted by a digital circuit, or by a digital circuit and an analog circuit. The processing circuit 53 uses, for example, at least one of an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), an FPGA (Field-Programmable Gate Array), an SoC (System-on-a-Chip), and a system LSI (Large-Scale Integration).
 Next, with reference to the flowchart of FIG. 5, the operation of the driving support device 100 will be described, focusing on the operations of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, and the recognition degree calculation unit 25.
 It is assumed that the power supply for the driving support system 200 (for example, the accessory power supply or the ignition power supply of the vehicle 1) is on. The first imaging device 3 continuously executes the processing of capturing images ahead of the vehicle 1 at predetermined time intervals and outputting image signals representing the forward images I1. The second imaging device 4 continuously executes the processing of capturing images of the interior of the vehicle 1 at predetermined time intervals and outputting image signals representing the in-vehicle images I2.
 First, in step ST1, the road traffic information detection unit 21 executes the road traffic information detection processing. If the forward image I1 contains a road traffic information display S (step ST2 "YES"), the viewpoint detection unit 22 executes the viewpoint detection processing in step ST3. Next, in step ST4, the viewpoint distance coefficient calculation unit 23 calculates the viewpoint distance coefficient C. Next, in step ST5, the viewpoint distance coefficient correction unit 24 corrects the viewpoint distance coefficient C. Next, in step ST6, the recognition degree calculation unit 25 calculates the recognition degree D2.
 If the forward image I1 contains no road traffic information display S (step ST2 "NO"), the processing of the driving support device 100 returns to step ST1. The processing also returns to step ST1 after step ST6. That is, as described above, the road traffic information detection processing is executed at predetermined time intervals.
 Next, with reference to the flowchart of FIG. 6, the operation of the driving support device 100 will be described, focusing on the operations of the recognition degree determination unit 26 and the display control unit 27.
 As described above, the recognition degree determination unit 26 uses the detection result of the road traffic information detection unit 21 to determine whether or not the road traffic information display S has moved out of the forward image I1. When it determines that the road traffic information display S has moved out of the forward image I1, the recognition degree determination unit 26 executes the processing of step ST11.
 First, in step ST11, the recognition degree determination unit 26 acquires the recognition degree D2 calculated by the recognition degree calculation unit 25 and determines whether or not the acquired recognition degree D2 is equal to or greater than the threshold Dth. If the recognition degree D2 is determined to be less than the threshold Dth (step ST11 "NO"), the display control unit 27 executes the first display control in step ST12. On the other hand, if the recognition degree D2 is determined to be equal to or greater than the threshold Dth (step ST11 "YES"), the processing of step ST12 is skipped.
 In this way, the driving support device 100 includes the viewpoint distance coefficient correction unit 24, which corrects the viewpoint distance coefficient C based on the content of the road traffic information display S. By using the corrected viewpoint distance coefficient C' to calculate the recognition degree D2, the accuracy of determining whether or not the road traffic information display S has been recognized by the driver can be improved compared with the case where the uncorrected viewpoint distance coefficient C is used (that is, compared with a conventional driving support device).
 Next, a modification of the driving support device 100 will be described with reference to FIG. 7.
 When the forward image I1 contains a plurality of road traffic information displays S, the driving support device 100 may execute the processing of steps ST4 to ST6 for each of the plurality of road traffic information displays S (see FIG. 7). In this case, the driving support device 100 may also execute the processing of steps ST11 and ST12 for each of the plurality of road traffic information displays S.
 Next, another modification of the driving support device 100 will be described with reference to FIG. 8.
 As shown in FIG. 8, the recognition degree determination unit 26 and the display control unit 27 may be provided outside the driving support device 100. That is, the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, and the recognition degree calculation unit 25 may constitute the main part of the driving support device 100.
As described above, the driving support device 100 of Embodiment 1 includes: the road traffic information detection unit 21, which detects the region A corresponding to the road traffic information display S in the front image I1 and also detects the content of the road traffic information display S; the viewpoint detection unit 22, which uses the in-vehicle image I2 to detect the position P corresponding to the driver's viewpoint in the front image I1; the viewpoint distance coefficient calculation unit 23, which calculates the viewpoint distance coefficient C corresponding to the distance L from the position P to the region A; the viewpoint distance coefficient correction unit 24, which corrects the viewpoint distance coefficient C based on the content of the road traffic information display S; and the recognition degree calculation unit 25, which calculates the driver's recognition degree D2 of the road traffic information display S by integrating the corrected viewpoint distance coefficient C'. By using the corrected viewpoint distance coefficient C' to calculate the recognition degree D2, the accuracy of determining whether the driver has recognized the road traffic information display S can be improved.
The driving support device 100 further includes: the recognition degree determination unit 26, which determines whether the road traffic information display S has been recognized by the driver by determining whether the recognition degree D2 is equal to or greater than the threshold value Dth; and the display control unit 27, which, when the recognition degree D2 is determined to be less than the threshold value Dth, executes control (first display control) that causes the display device 5 to display the image I3 corresponding to the road traffic information display S. This allows the driver to confirm a road traffic information display S that he or she has missed.
Further, the recognition degree determination unit 26 determines whether the recognition degree D2 is equal to or greater than the threshold value Dth when the road traffic information display S moves out of the front image I1. The determination is thus made when the vehicle 1 has passed the installation position of the road traffic information display S, that is, for a road traffic information display S that the driver may have missed.
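The accumulate-then-threshold flow described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: the per-frame coefficient values, the threshold Dth, and the function names are all invented for the example; the patent only specifies that D2 is obtained by integrating C' and is compared with Dth once S leaves the front image I1.

```python
# Minimal sketch of the recognition-degree flow: while the road traffic
# information display S is visible in the front image I1, a corrected
# viewpoint distance coefficient C' is produced each frame and accumulated
# into the recognition degree D2; once S leaves the image, D2 is compared
# with the threshold Dth. All numeric values below are illustrative.

def recognition_degree(corrected_coeffs):
    """Integrate the corrected viewpoint distance coefficients C'
    observed while S was visible (one value per frame)."""
    return sum(corrected_coeffs)

def driver_recognized(corrected_coeffs, threshold_dth):
    """Called when S moves out of the front image I1."""
    return recognition_degree(corrected_coeffs) >= threshold_dth

# High coefficients (viewpoint near S) accumulate quickly; low ones do not.
near_gaze = [0.9, 0.8, 0.9, 0.7]   # driver's viewpoint stayed close to S
far_gaze = [0.1, 0.05, 0.1, 0.1]   # driver's viewpoint stayed far from S

print(driver_recognized(near_gaze, threshold_dth=2.0))  # True
print(driver_recognized(far_gaze, threshold_dth=2.0))   # False
```

In the second case the device would then execute the first display control, showing the missed display S on the display device 5.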
Embodiment 2.
FIG. 9 is a block diagram showing the main part of a driving support system including the driving support device according to Embodiment 2. The driving support device 100a of Embodiment 2 will be described with reference to FIG. 9, together with the driving support system 200a including the driving support device 100a. In FIG. 9, blocks similar to those shown in FIG. 1 are denoted by the same reference numerals, and their description is omitted.
As shown in FIG. 9, the vehicle 1 is provided with a control device 2a. The control device 2a is constituted by, for example, an ECU.
The vehicle 1 is also provided with sensors 6, which include various sensors. For example, the sensors 6 include a wheel speed sensor, an acceleration sensor, a gyro sensor, a steering sensor, and a wiper sensor.
The vehicle 1 is also provided with a wireless communication device 7, which is constituted by a transmitter and a receiver for wireless communication. The wireless communication device 7 can communicate with a server device 8 outside the vehicle 1.
The vehicle 1 is also provided with a storage device 9, which is constituted by a non-volatile memory. The storage device 9 may be configured integrally with the memory of the control device 2a (that is, the memory 52).
The vehicle 1 is also provided with an operation input device 10, which is constituted by, for example, at least one of a touch panel, hardware keys, or a microphone for voice input. The operation input device 10 is provided, for example, on the dashboard of the vehicle 1. The operation input device 10 may be provided integrally with the display device 5, or may be provided adjacent to the display device 5.
The control device 2a, the first imaging device 3, the second imaging device 4, the display device 5, the sensors 6, the wireless communication device 7, the storage device 9, and the operation input device 10 constitute the main part of the driving support system 200a. The driving support device 100a is provided in the control device 2a.
The vehicle information acquisition unit 31 acquires information indicating the traveling state of the vehicle 1 (hereinafter referred to as "vehicle information"). The vehicle information includes, for example, information indicating the traveling speed of the vehicle 1 and information indicating the turning amount of the vehicle 1.
For example, the vehicle information acquisition unit 31 calculates the traveling speed of the vehicle 1 using the output signal of the wheel speed sensor among the sensors 6, thereby acquiring the information indicating the traveling speed of the vehicle 1.
Similarly, the vehicle information acquisition unit 31 calculates, for example, the turning amount of the vehicle 1 using the output signal of the acceleration sensor, the gyro sensor, or the steering sensor among the sensors 6, thereby acquiring the information indicating the turning amount of the vehicle 1.
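One plausible realization of these two calculations is sketched below. The sensor interfaces, units, wheel radius, and encoder resolution are assumptions made for illustration; the patent states only that the wheel speed sensor yields the traveling speed and that the gyro (or acceleration/steering) sensor yields the turning amount.

```python
import math

# Illustrative sketch of deriving vehicle information from the sensors 6.
# WHEEL_RADIUS_M and PULSES_PER_REV are assumed constants, not values
# specified by the patent.

WHEEL_RADIUS_M = 0.3      # assumed tire radius [m]
PULSES_PER_REV = 48       # assumed wheel-speed encoder resolution

def traveling_speed_kmh(pulse_count, dt_s):
    """Traveling speed from wheel speed sensor pulses counted over dt_s seconds."""
    revolutions = pulse_count / PULSES_PER_REV
    meters = revolutions * 2.0 * math.pi * WHEEL_RADIUS_M
    return meters / dt_s * 3.6

def turning_amount_deg(yaw_rates_deg_s, dt_s):
    """Turning amount as the integral of gyro yaw-rate samples taken every dt_s seconds."""
    return sum(rate * dt_s for rate in yaw_rates_deg_s)

print(round(traveling_speed_kmh(pulse_count=480, dt_s=1.0), 1))  # 67.9
print(turning_amount_deg([10.0, 10.0, 10.0], dt_s=0.5))          # 15.0
```

A production implementation would typically fuse several of the listed sensors rather than use a single one, but that choice is outside what the text specifies.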
The environment information acquisition unit 32 acquires information indicating the environment outside the vehicle 1 (hereinafter referred to as "environment information"). The environment information includes, for example, information indicating the brightness around the vehicle 1 and information indicating the weather around the vehicle 1.
For example, the environment information acquisition unit 32 detects the brightness around the vehicle 1 by executing image recognition processing on the front image I1, thereby acquiring the information indicating the brightness around the vehicle 1.
Likewise, for example, the environment information acquisition unit 32 determines the weather around the vehicle 1 by executing image recognition processing on the front image I1, thereby acquiring the information indicating the weather around the vehicle 1.
Alternatively, for example, the environment information acquisition unit 32 determines whether the wiper of the vehicle 1 is operating, using the output signal of the wiper sensor among the sensors 6. When the wiper of the vehicle 1 is operating, the environment information acquisition unit 32 determines that the weather around the vehicle 1 is rain or snow; when the wiper is stopped, it determines that the weather is sunny or cloudy. The information indicating the weather around the vehicle 1 is thereby acquired.
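The wiper-based determination is a simple two-way mapping and can be sketched directly; the label strings are assumptions, since the patent distinguishes only "rain or snow" (wiper operating) from "sunny or cloudy" (wiper stopped).

```python
# Minimal sketch of the wiper-based weather determination described above.
# The returned labels are illustrative names, not values from the patent.

def weather_from_wiper(wiper_operating: bool) -> str:
    return "rain_or_snow" if wiper_operating else "sunny_or_cloudy"

print(weather_from_wiper(True))   # rain_or_snow
print(weather_from_wiper(False))  # sunny_or_cloudy
```

This heuristic obviously cannot distinguish rain from snow, which is why the text offers the image-recognition and server-based alternatives as well.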
Alternatively, for example, the server device 8 stores information indicating the weather in each region. By communicating with the server device 8, the wireless communication device 7 receives information indicating the weather in the region containing the current position of the vehicle 1, and the environment information acquisition unit 32 acquires the received information. The information indicating the weather around the vehicle 1 is thereby acquired.
The driver information acquisition unit 33 acquires information about the driver of the vehicle 1 (hereinafter referred to as "driver information"). The driver information includes, for example, information indicating the driver's visual acuity.
For example, the vehicle 1 is used by a plurality of users. The storage device 9 stores a database that associates information indicating each user's visual acuity with information indicating each user's face image. The visual acuity information is, for example, entered in advance by each user using the operation input device 10, and each user's face image is, for example, captured in advance by the second imaging device 4.
The driver information acquisition unit 33 acquires a face image of the driver of the vehicle 1 by executing image recognition processing on the in-vehicle image I2. By matching the acquired face image against the face images in the database, the driver information acquisition unit 33 determines which of the plurality of users the driver of the vehicle 1 is, and then acquires from the database the information indicating the visual acuity of that user. The information indicating the driver's visual acuity is thereby acquired.
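The database lookup step can be sketched as a nearest-neighbour match over face feature vectors. This is a hedged illustration: the feature extraction itself (the image recognition processing on the in-vehicle image I2) is outside the sketch, and the vectors, the squared-distance metric, and the record layout are all assumptions, since the patent does not specify how the matching is performed.

```python
# Hedged sketch of the face-matching lookup: each registered user has a face
# feature vector (from a face image captured in advance by the second imaging
# device 4) and a visual acuity value (entered via the operation input
# device 10). The driver's features are matched by nearest neighbour.

def squared_distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def driver_acuity(driver_features, database):
    """database: list of (user_id, face_features, visual_acuity) records."""
    user_id, _, acuity = min(
        database, key=lambda rec: squared_distance(driver_features, rec[1]))
    return user_id, acuity

users = [
    ("user_a", [0.1, 0.9, 0.3], 1.2),
    ("user_b", [0.8, 0.2, 0.5], 0.6),
]
print(driver_acuity([0.78, 0.25, 0.52], users))  # ('user_b', 0.6)
```

A real system would also reject matches whose distance exceeds some threshold (an unknown driver), a case the text does not address.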
Alternatively, for example, the storage device 9 stores a trained model that, when given a value indicating the result of image recognition processing on the in-vehicle image I2, outputs a value indicating the driver's visual acuity. The trained model is generated in advance by machine learning. The driver information acquisition unit 33 executes image recognition processing on the in-vehicle image I2 and inputs a value indicating the result of that processing into the trained model, thereby acquiring the information indicating the driver's visual acuity.
Hereinafter, the vehicle information, the environment information, and the driver information are collectively referred to as "correction information". The vehicle information acquisition unit 31, the environment information acquisition unit 32, and the driver information acquisition unit 33 constitute the correction information acquisition unit 34.
The viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C based on the content of the road traffic information display S; that is, it executes the same correction processing as the viewpoint distance coefficient correction unit 24. In addition, the viewpoint distance coefficient correction unit 24a executes at least one of the following correction processes (1) to (6).
(1) Correction processing based on the size of the region A
The viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C so that it takes a larger value when the size of the region A is large than when the size of the region A is small. In other words, when the size of the region A is small, the viewpoint distance coefficient C is corrected to a smaller value than when the size of the region A is large. The detection result of the road traffic information detection unit 21 is used for this correction processing.
Normally, when the size of the region A is large, either the road traffic information display S itself is large or the distance from the vehicle 1 to the road traffic information display S is small. In such a case, the driver is unlikely to miss the road traffic information display S, so the viewpoint distance coefficient C is corrected to a larger value.
(2) Correction processing based on the traveling speed of the vehicle 1
The viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C so that it takes a smaller value when the traveling speed of the vehicle 1 is high than when the traveling speed is low. In other words, when the traveling speed of the vehicle 1 is low, the viewpoint distance coefficient C is corrected to a larger value than when the traveling speed is high. The vehicle information acquired by the vehicle information acquisition unit 31 is used for this correction processing.
Normally, when the traveling speed of the vehicle 1 is high, the time during which the driver can view the road traffic information display S is short. In such a case, the driver is likely to miss the road traffic information display S, so the viewpoint distance coefficient C is corrected to a smaller value.
(3) Correction processing based on the turning amount of the vehicle 1
The viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C so that it takes a smaller value when the turning amount of the vehicle 1 is large than when the turning amount is small. In other words, when the turning amount of the vehicle 1 is small, the viewpoint distance coefficient C is corrected to a larger value than when the turning amount is large. The vehicle information acquired by the vehicle information acquisition unit 31 is used for this correction processing.
Normally, when the vehicle 1 turns left with a large turning amount and the road traffic information display S is located on the right side of the vehicle 1, the time during which the driver can view the road traffic information display S is short. Similarly, when the vehicle 1 turns right with a large turning amount and the road traffic information display S is located on the left side of the vehicle 1, the viewable time is short. In such cases, the driver is likely to miss the road traffic information display S, so the viewpoint distance coefficient C is corrected to a smaller value.
(4) Correction processing based on the brightness around the vehicle 1
The viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C so that it takes a smaller value when the surroundings of the vehicle 1 are dark than when they are bright. In other words, when the surroundings of the vehicle 1 are bright, the viewpoint distance coefficient C is corrected to a larger value than when they are dark. The environment information acquired by the environment information acquisition unit 32 is used for this correction processing.
Normally, when the surroundings of the vehicle 1 are dark, the visibility of the road traffic information display S to the driver is reduced. In such a case, the driver is likely to miss the road traffic information display S, so the viewpoint distance coefficient C is corrected to a smaller value.
(5) Correction processing based on the weather around the vehicle 1
The viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C so that it takes a smaller value when the weather around the vehicle 1 is rain or snow than when the weather is sunny or cloudy. In other words, when the weather around the vehicle 1 is sunny or cloudy, the viewpoint distance coefficient C is corrected to a larger value than when the weather is rain or snow. The environment information acquired by the environment information acquisition unit 32 is used for this correction processing.
Normally, when the weather around the vehicle 1 is rain or snow, the visibility of the road traffic information display S to the driver is reduced. In such a case, the driver is likely to miss the road traffic information display S, so the viewpoint distance coefficient C is corrected to a smaller value.
(6) Correction processing based on the driver's visual acuity
The viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C so that it takes a smaller value when the driver's visual acuity is low than when the visual acuity is high. In other words, when the driver's visual acuity is high, the viewpoint distance coefficient C is corrected to a larger value than when the visual acuity is low. The driver information acquired by the driver information acquisition unit 33 is used for this correction processing.
Normally, when the driver's visual acuity is low, the driver is likely to miss the road traffic information display S, so the viewpoint distance coefficient C is corrected to a smaller value.
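The six correction processes above all adjust the coefficient in one direction or the other, so they can be sketched together as a single multiplicative correction. The multiplicative form and every factor value below are assumptions made for illustration; the patent specifies only the direction of each adjustment, not a formula.

```python
# Combined sketch of correction processes (1)-(6). Each condition scales the
# viewpoint distance coefficient C up or down in the direction the text
# describes. The factor values (1.2, 0.8) are illustrative assumptions.

def correct_coefficient(c, *, large_region=False, high_speed=False,
                        large_turn=False, dark=False,
                        rain_or_snow=False, low_acuity=False):
    if large_region:   # (1) large region A: sign unlikely to be missed
        c *= 1.2
    if high_speed:     # (2) high traveling speed: short viewing time
        c *= 0.8
    if large_turn:     # (3) large turning amount: short viewing time
        c *= 0.8
    if dark:           # (4) dark surroundings: reduced visibility
        c *= 0.8
    if rain_or_snow:   # (5) rain or snow: reduced visibility
        c *= 0.8
    if low_acuity:     # (6) low visual acuity: sign likely to be missed
        c *= 0.8
    return c

# A large, nearby sign raises C; night driving in rain at high speed lowers it.
print(correct_coefficient(1.0, large_region=True))   # 1.2
print(correct_coefficient(1.0, high_speed=True,
                          dark=True, rain_or_snow=True))  # approx. 0.512
```

Lowering C' under adverse conditions means more frames of near gaze are needed before D2 reaches Dth, which matches the stated intent: the harder the display is to see, the less readily the device concludes it was recognized.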
The road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24a, the recognition degree calculation unit 25, the recognition degree determination unit 26, the display control unit 27, and the correction information acquisition unit 34 constitute the main part of the driving support device 100a.
Since the hardware configuration of the main part of the driving support device 100a is the same as that described in Embodiment 1 with reference to FIG. 4, its illustration and description are omitted. That is, each of the functions of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24a, the recognition degree calculation unit 25, the recognition degree determination unit 26, the display control unit 27, and the correction information acquisition unit 34 may be realized by the processor 51 and the memory 52, or by the dedicated processing circuit 53.
Next, the operation of the driving support device 100a will be described with reference to the flowchart of FIG. 10, focusing on the operations of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the correction information acquisition unit 34, the viewpoint distance coefficient correction unit 24a, and the recognition degree calculation unit 25. In FIG. 10, steps similar to those shown in FIG. 5 are denoted by the same reference numerals, and their description is omitted.
First, in step ST1, the road traffic information detection unit 21 executes the road traffic information detection processing. When the front image I1 contains the road traffic information display S (step ST2 "YES"), the viewpoint detection unit 22 executes the viewpoint detection processing in step ST3. Next, in step ST4, the viewpoint distance coefficient calculation unit 23 calculates the viewpoint distance coefficient C.
Next, in step ST7, the correction information acquisition unit 34 acquires the correction information. Then, in step ST5a, the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C. At this time, the viewpoint distance coefficient correction unit 24a executes at least one of the correction processes (1) to (6) above in addition to the correction processing based on the content of the road traffic information display S. When the viewpoint distance coefficient correction unit 24a executes only correction process (1) among the correction processes (1) to (6), the correction information is unnecessary; in this case, the processing of step ST7 may be skipped.
Next, in step ST6, the recognition degree calculation unit 25 calculates the recognition degree D2.
As described above, the viewpoint distance coefficient correction unit 24a executes at least one of the correction processes (1) to (6) above in addition to the correction processing based on the content of the road traffic information display S. By using the viewpoint distance coefficient C' corrected in this way to calculate the recognition degree D2, the accuracy of determining whether the driver has recognized the road traffic information display S can be further improved.
Next, a modification of the driving support device 100a will be described with reference to FIG. 11.
As shown in FIG. 11, the correction information acquisition unit 34 may be provided outside the driving support device 100a. That is, the main part of the driving support device 100a may be constituted by the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24a, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27.
Next, another modification of the driving support device 100a will be described with reference to FIG. 12.
As shown in FIG. 12, the recognition degree determination unit 26 and the display control unit 27 may be provided outside the driving support device 100a. That is, the main part of the driving support device 100a may be constituted by the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24a, and the recognition degree calculation unit 25.
In addition, the driving support device 100a can adopt various modifications similar to those described in Embodiment 1.
As described above, in the driving support device 100a of Embodiment 2, the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C based on the content of the road traffic information display S and also corrects the viewpoint distance coefficient C based on the size of the region A. This further improves the accuracy of determining whether the driver has recognized the road traffic information display S.
Further, the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C based on the content of the road traffic information display S and also based on the traveling state of the vehicle 1; more specifically, based on the traveling speed of the vehicle 1 or the turning amount of the vehicle 1. This further improves the accuracy of determining whether the driver has recognized the road traffic information display S.
Further, the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C based on the content of the road traffic information display S and also based on the environment outside the vehicle 1; more specifically, based on the brightness around the vehicle 1 or the weather around the vehicle 1. This further improves the accuracy of determining whether the driver has recognized the road traffic information display S.
Further, the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C based on the content of the road traffic information display S and also based on the driver's visual acuity. This further improves the accuracy of determining whether the driver has recognized the road traffic information display S.
Embodiment 3.
FIG. 13 is a block diagram showing the main part of a driving support system including the driving support device according to Embodiment 3. The driving support device 100b of Embodiment 3 will be described with reference to FIG. 13, together with the driving support system 200b including the driving support device 100b. In FIG. 13, blocks similar to those shown in FIG. 1 are denoted by the same reference numerals, and their description is omitted.
 図13に示す如く、車両1に制御装置2bが設けられている。制御装置2bは、例えば、ECUにより構成されている。 As shown in FIG. 13, the control device 2b is provided in the vehicle 1. The control device 2b is composed of, for example, an ECU.
 また、車両1に音声出力装置11が設けられている。音声出力装置11は、例えば、スピーカにより構成されている。 Further, the vehicle 1 is provided with a voice output device 11. The voice output device 11 is composed of, for example, a speaker.
 制御装置2b、第1撮像装置3、第2撮像装置4、表示装置5及び音声出力装置11により、運転支援システム200bの要部が構成されている。運転支援装置100bは、制御装置2bに設けられている。 The main part of the driving support system 200b is composed of the control device 2b, the first imaging device 3, the second imaging device 4, the display device 5, and the audio output device 11. The driving support device 100b is provided in the control device 2b.
 重要度判定部41は、道路交通情報検出部21による検出結果を取得するものである。重要度判定部41は、当該取得された検出結果を用いて、道路交通情報表示Sの内容に基づき、道路交通情報表示Sの重要度D3を判定するものである。 The importance determination unit 41 acquires the detection result by the road traffic information detection unit 21. The importance determination unit 41 determines the importance D3 of the road traffic information display S based on the content of the road traffic information display S by using the acquired detection result.
 重要度D3は、例えば、道路交通情報表示Sの内容が車両1の運転に与える影響の大きさに応じて、3段階の値(すなわち第1値、第2値及び第3値)のうちのいずれかの値であると判定される。重要度D3の値が大きいほど、道路交通情報表示Sの内容が車両1の運転に与える影響が大きいことを示している。 The importance D3 is determined to be, for example, one of three values (that is, a first value, a second value and a third value) according to the magnitude of the influence that the content of the road traffic information display S has on the driving of the vehicle 1. The larger the value of the importance D3, the greater the influence that the content of the road traffic information display S has on the driving of the vehicle 1.
 例えば、重要度判定部41は、道路交通情報表示Sの内容が車両1の停止を要求するものである場合(例えば、道路交通情報表示Sが「通行止め」又は「車両通行止め」を意味する規制標識である場合)、重要度D3が第3値であると判定する。また、重要度判定部41は、道路交通情報表示Sの内容が車両1の減速又は車線変更を要求するものである場合、重要度D3が第2値であると判定する。また、重要度判定部41は、そのほかの場合、重要度D3が第1値であると判定する。 For example, when the content of the road traffic information display S requests the vehicle 1 to stop (for example, when the road traffic information display S is a regulatory sign meaning "road closed" or "closed to vehicles"), the importance determination unit 41 determines that the importance D3 is the third value. When the content of the road traffic information display S requests the vehicle 1 to decelerate or change lanes, the importance determination unit 41 determines that the importance D3 is the second value. In all other cases, the importance determination unit 41 determines that the importance D3 is the first value.
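The three-level determination described above can be sketched as follows. This is an illustrative sketch only: the function name, the sign labels, and the grouping into sets are assumptions for the example and are not taken from the specification.

```python
# Hypothetical sketch of the importance determination of unit 41.
# Sign labels and category sets are illustrative assumptions.

STOP_SIGNS = {"road_closed", "closed_to_vehicles"}           # content requests a stop
SLOW_OR_LANE_SIGNS = {"lane_closed", "reduced_speed_ahead"}  # content requests deceleration or a lane change

def determine_importance(sign_content: str) -> int:
    """Return the importance D3 (1, 2 or 3) for a road traffic information display."""
    if sign_content in STOP_SIGNS:
        return 3  # third value: the display requests the vehicle to stop
    if sign_content in SLOW_OR_LANE_SIGNS:
        return 2  # second value: the display requests deceleration or a lane change
    return 1      # first value: all other displays
```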
 音声出力制御部42は、認識度判定部26により認識度D2が閾値Dth未満であると判定された場合において、重要度判定部41により重要度D3が所定の基準値Dref(例えば第2値)以上であると判定されたとき、道路交通情報表示Sに対応する音声を音声出力装置11に出力させる制御(以下「音声出力制御」という。)を実行するものである。当該音声は、例えば、道路交通情報表示Sの内容に対応する文章を読み上げる音声である。 When the recognition degree determination unit 26 determines that the recognition degree D2 is less than the threshold value Dth, and the importance determination unit 41 determines that the importance D3 is equal to or higher than a predetermined reference value Dref (for example, the second value), the voice output control unit 42 executes control for causing the voice output device 11 to output a voice corresponding to the road traffic information display S (hereinafter referred to as "voice output control"). The voice is, for example, a voice that reads out a sentence corresponding to the content of the road traffic information display S.
 表示制御部27及び音声出力制御部42により、出力制御部43が構成されている。また、道路交通情報検出部21、視点検出部22、視点距離係数算出部23、視点距離係数補正部24、認識度算出部25、認識度判定部26、表示制御部27、重要度判定部41及び音声出力制御部42により、運転支援装置100bの要部が構成されている。 The output control unit 43 is composed of the display control unit 27 and the voice output control unit 42. The main part of the driving support device 100b is composed of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, the display control unit 27, the importance determination unit 41 and the voice output control unit 42.
 運転支援装置100bの要部のハードウェア構成は、実施の形態1にて図4を参照して説明したものと同様であるため、図示及び説明を省略する。すなわち、道路交通情報検出部21、視点検出部22、視点距離係数算出部23、視点距離係数補正部24、認識度算出部25、認識度判定部26、表示制御部27、重要度判定部41及び音声出力制御部42の各々の機能は、プロセッサ51及びメモリ52により実現されるものであっても良く、又は専用の処理回路53により実現されるものであっても良い。 Since the hardware configuration of the main part of the driving support device 100b is the same as that described with reference to FIG. 4 in the first embodiment, illustration and description thereof are omitted. That is, each function of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, the display control unit 27, the importance determination unit 41 and the voice output control unit 42 may be realized by the processor 51 and the memory 52, or may be realized by the dedicated processing circuit 53.
 次に、図14のフローチャートを参照して、運転支援装置100bの動作について、認識度判定部26、重要度判定部41及び出力制御部43の動作を中心に説明する。なお、図14において、図6に示すステップと同様のステップには同一符号を付して説明を省略する。 Next, with reference to the flowchart of FIG. 14, the operation of the driving support device 100b will be described focusing on the operations of the recognition degree determination unit 26, the importance determination unit 41, and the output control unit 43. In FIG. 14, the same steps as those shown in FIG. 6 are designated by the same reference numerals, and the description thereof will be omitted.
 まず、ステップST11にて、認識度判定部26は、認識度算出部25により算出された認識度D2を取得して、当該取得された認識度D2が閾値Dth以上であるか否かを判定する。認識度D2が閾値Dth未満であると判定された場合(ステップST11“NO”)、ステップST13にて、重要度判定部41は、道路交通情報表示Sの重要度D3を判定する。重要度D3が基準値Dref未満であると判定された場合(ステップST13“NO”)、ステップST12にて、出力制御部43が第1表示制御を実行する。他方、重要度D3が基準値Dref以上であると判定された場合(ステップST13“YES”)、ステップST14にて、出力制御部43が第1表示制御及び音声出力制御を実行する。 First, in step ST11, the recognition degree determination unit 26 acquires the recognition degree D2 calculated by the recognition degree calculation unit 25, and determines whether or not the acquired recognition degree D2 is equal to or greater than the threshold value Dth. When it is determined that the recognition degree D2 is less than the threshold value Dth (step ST11 "NO"), the importance determination unit 41 determines the importance D3 of the road traffic information display S in step ST13. When it is determined that the importance D3 is less than the reference value Dref (step ST13 "NO"), the output control unit 43 executes the first display control in step ST12. On the other hand, when it is determined that the importance D3 is equal to or higher than the reference value Dref (step ST13 "YES"), the output control unit 43 executes the first display control and the voice output control in step ST14.
 このように、認識度D2が閾値Dth未満であると判定された場合において、重要度D3が基準値Dref以上であると判定されたとき、表示制御部27が第1表示制御を実行するのに加えて、音声出力制御部42が音声出力制御を実行する。これにより、運転者は、自身が見逃した道路交通情報表示Sのうちの重要な道路交通情報表示Sをより確実に確認することができる。 In this way, when it is determined that the recognition degree D2 is less than the threshold value Dth and that the importance D3 is equal to or higher than the reference value Dref, the voice output control unit 42 executes the voice output control in addition to the display control unit 27 executing the first display control. As a result, the driver can more reliably confirm the important road traffic information displays S among the road traffic information displays S that he or she has overlooked.
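The branching of steps ST11 to ST14 described above can be sketched as a small decision function. The function name, the return values, and the default reference value are illustrative assumptions for the example, not part of the specification.

```python
# Hypothetical sketch of the decision flow of Fig. 14 (steps ST11, ST13, ST12/ST14).
# Names of the returned controls are illustrative assumptions.

def select_outputs(d2: float, dth: float, d3: int, dref: int = 2) -> list:
    """Return the controls to execute for one road traffic information display.

    If the recognition degree D2 is at or above the threshold Dth, nothing is
    output; otherwise the first display control runs, and the voice output
    control is added when the importance D3 reaches the reference value Dref.
    """
    if d2 >= dth:                       # ST11 "YES": display was recognised
        return []
    if d3 >= dref:                      # ST13 "YES": important display missed
        return ["first_display_control", "voice_output_control"]  # ST14
    return ["first_display_control"]    # ST12
```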
 次に、運転支援装置100bの変形例について説明する。 Next, a modified example of the driving support device 100b will be described.
 運転支援装置100bは、視点距離係数補正部24に代えて、運転支援装置100aと同様の視点距離係数補正部24aを有するものであっても良い。また、制御装置2bは、制御装置2aと同様の補正用情報取得部34を有するものであっても良い。この場合、運転支援システム200bは、運転支援システム200aと同様のセンサ類6、無線通信装置7、記憶装置9及び操作入力装置10を含むものであっても良い。 The driving support device 100b may have the same viewpoint distance coefficient correction unit 24a as the driving support device 100a instead of the viewpoint distance coefficient correction unit 24. Further, the control device 2b may have the same correction information acquisition unit 34 as the control device 2a. In this case, the driving support system 200b may include sensors 6, a wireless communication device 7, a storage device 9, and an operation input device 10 similar to the driving support system 200a.
 そのほか、運転支援装置100bは、実施の形態1,2にて説明したものと同様の種々の変形例を採用することができる。 In addition, the driving support device 100b can employ various modifications similar to those described in the first and second embodiments.
 以上のように、実施の形態3の運転支援装置100bは、道路交通情報表示Sの内容に基づき、道路交通情報表示Sの重要度D3が基準値Dref以上であるか否かを判定する重要度判定部41と、認識度D2が閾値Dth未満であると判定された場合において、重要度D3が基準値Dref以上であると判定されたとき、道路交通情報表示Sに対応する音声を音声出力装置11に出力させる制御を実行する音声出力制御部42と、を備える。これにより、運転者は、自身が見逃した道路交通情報表示Sのうちの重要な道路交通情報表示Sをより確実に確認することができる。 As described above, the driving support device 100b of the third embodiment includes the importance determination unit 41 that determines, based on the content of the road traffic information display S, whether or not the importance D3 of the road traffic information display S is equal to or higher than the reference value Dref, and the voice output control unit 42 that, when it is determined that the recognition degree D2 is less than the threshold value Dth and that the importance D3 is equal to or higher than the reference value Dref, executes control for causing the voice output device 11 to output a voice corresponding to the road traffic information display S. As a result, the driver can more reliably confirm the important road traffic information displays S among the road traffic information displays S that he or she has overlooked.
実施の形態4.
 図15は、実施の形態4に係る運転支援装置を含む運転支援システムの要部を示すブロック図である。図15を参照して、実施の形態4の運転支援装置100cについて説明する。また、運転支援装置100cを含む運転支援システム200cについて説明する。なお、図15において、図1に示すブロックと同様のブロックには同一符号を付して説明を省略する。また、図15において、図9に示すブロックと同様のブロックには同一符号を付して説明を省略する。
Embodiment 4.
FIG. 15 is a block diagram showing a main part of the driving support system including the driving support device according to the fourth embodiment. The driving support device 100c of the fourth embodiment will be described with reference to FIG. 15. The driving support system 200c including the driving support device 100c will also be described. In FIG. 15, the same blocks as those shown in FIG. 1 are designated by the same reference numerals, and the description thereof will be omitted. Further, in FIG. 15, the same blocks as those shown in FIG. 9 are designated by the same reference numerals, and the description thereof will be omitted.
 図15に示す如く、車両1に制御装置2cが設けられている。制御装置2cは、例えば、ECUにより構成されている。 As shown in FIG. 15, the control device 2c is provided in the vehicle 1. The control device 2c is composed of, for example, an ECU.
 制御装置2c、第1撮像装置3、第2撮像装置4、表示装置5、記憶装置9及び操作入力装置10により、運転支援システム200cの要部が構成されている。運転支援装置100cは、制御装置2cに設けられている。 The main part of the driving support system 200c is composed of the control device 2c, the first imaging device 3, the second imaging device 4, the display device 5, the storage device 9, and the operation input device 10. The driving support device 100c is provided in the control device 2c.
 道路交通情報検出部21aは、道路交通情報検出部21と同様の道路交通情報検出処理を実行するものである。道路交通情報検出部21aは、道路交通情報検出処理により検出された道路交通情報表示Sを示すデータを記憶装置9に記憶させるものである。記憶装置9は、当該記憶されたデータを所定時間保持するものである。 The road traffic information detection unit 21a executes the same road traffic information detection process as the road traffic information detection unit 21. The road traffic information detection unit 21a stores the data indicating the road traffic information display S detected by the road traffic information detection process in the storage device 9. The storage device 9 holds the stored data for a predetermined time.
 表示制御部27aは、表示制御部27と同様の第1表示制御を実行するものである。これに加えて、表示制御部27aは、記憶装置9に記憶されているデータが示す道路交通情報表示Sに対応する画像I3の表示を指示する操作が操作入力装置10に入力されたとき、当該記憶されているデータが示す道路交通情報表示Sに対応する画像I3を表示装置5に表示させる制御(以下「第2表示制御」という。)を実行するものである。 The display control unit 27a executes the same first display control as the display control unit 27. In addition, when an operation instructing display of the image I3 corresponding to the road traffic information display S indicated by the data stored in the storage device 9 is input to the operation input device 10, the display control unit 27a executes control for causing the display device 5 to display the image I3 corresponding to the road traffic information display S indicated by the stored data (hereinafter referred to as "second display control").
 第2表示制御により、運転者は、車両1が通り過ぎた道路交通情報表示Sの内容を、自身が見逃したか否かにかかわらず、任意のタイミングにて確認することができる。 By the second display control, the driver can confirm the content of the road traffic information display S that the vehicle 1 has passed at any timing regardless of whether or not he / she overlooked it.
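The storage and recall behind the second display control can be sketched as follows. This is a minimal sketch under assumptions: the class name, method names, and the fixed retention period are illustrative and are not taken from the specification, which only states that the storage device 9 holds the data for a predetermined time.

```python
# Hypothetical sketch of storing detected road traffic information displays
# for a predetermined time and recalling them on an operation input.
import time
from collections import deque

class SignStore:
    """Holds detected road traffic information displays for a limited time,
    so they can be redisplayed on request (second display control)."""

    def __init__(self, retention_s: float = 60.0):
        self.retention_s = retention_s
        self._entries = deque()  # (timestamp, sign data), oldest first

    def store(self, sign, now=None):
        """Called by the detection unit each time a display is detected."""
        now = time.monotonic() if now is None else now
        self._entries.append((now, sign))
        self._expire(now)

    def recall(self, now=None):
        """Called when the driver requests redisplay via the operation input."""
        now = time.monotonic() if now is None else now
        self._expire(now)
        return [sign for _, sign in self._entries]

    def _expire(self, now):
        # Drop entries older than the retention period.
        while self._entries and now - self._entries[0][0] > self.retention_s:
            self._entries.popleft()
```

Signs expire oldest-first, so a recall returns only the displays the vehicle passed within the retention window, regardless of whether the driver recognised them.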
 道路交通情報検出部21a、視点検出部22、視点距離係数算出部23、視点距離係数補正部24、認識度算出部25、認識度判定部26及び表示制御部27aにより、運転支援装置100cの要部が構成されている。 The main part of the driving support device 100c is composed of the road traffic information detection unit 21a, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26 and the display control unit 27a.
 運転支援装置100cの要部のハードウェア構成は、実施の形態1にて図4を参照して説明したものと同様であるため、図示及び説明を省略する。すなわち、道路交通情報検出部21a、視点検出部22、視点距離係数算出部23、視点距離係数補正部24、認識度算出部25、認識度判定部26及び表示制御部27aの各々の機能は、プロセッサ51及びメモリ52により実現されるものであっても良く、又は専用の処理回路53により実現されるものであっても良い。 Since the hardware configuration of the main part of the driving support device 100c is the same as that described with reference to FIG. 4 in the first embodiment, illustration and description thereof are omitted. That is, each function of the road traffic information detection unit 21a, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26 and the display control unit 27a may be realized by the processor 51 and the memory 52, or may be realized by the dedicated processing circuit 53.
 次に、図16を参照して、運転支援装置100cの変形例について説明する。 Next, a modified example of the driving support device 100c will be described with reference to FIG.
 図16に示す如く、認識度判定部26は、運転支援装置100c外に設けられているものであっても良い。すなわち、道路交通情報検出部21a、視点検出部22、視点距離係数算出部23、視点距離係数補正部24及び認識度算出部25により、運転支援装置100cの要部が構成されているものであっても良い。 As shown in FIG. 16, the recognition degree determination unit 26 may be provided outside the driving support device 100c. That is, the main part of the driving support device 100c may be composed of the road traffic information detection unit 21a, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24 and the recognition degree calculation unit 25.
 次に、運転支援装置100cの他の変形例について説明する。 Next, another modification of the driving support device 100c will be described.
 運転支援装置100cは、視点距離係数補正部24に代えて、運転支援装置100aと同様の視点距離係数補正部24aを有するものであっても良い。また、制御装置2cは、制御装置2aと同様の補正用情報取得部34を有するものであっても良い。この場合、運転支援システム200cは、運転支援システム200aと同様のセンサ類6及び無線通信装置7を含むものであっても良い。 The driving support device 100c may have the same viewpoint distance coefficient correction unit 24a as the driving support device 100a instead of the viewpoint distance coefficient correction unit 24. Further, the control device 2c may have the same correction information acquisition unit 34 as the control device 2a. In this case, the driving support system 200c may include the same sensors 6 and the wireless communication device 7 as the driving support system 200a.
 また、運転支援装置100cは、運転支援装置100bと同様の重要度判定部41及び音声出力制御部42を有するものであっても良い。この場合、運転支援システム200cは、運転支援システム200bと同様の音声出力装置11を含むものであっても良い。 Further, the driving support device 100c may have the same importance determination unit 41 and voice output control unit 42 as the driving support device 100b. In this case, the driving support system 200c may include an audio output device 11 similar to the driving support system 200b.
 そのほか、運転支援装置100cは、実施の形態1~3にて説明したものと同様の種々の変形例を採用することができる。 In addition, as the driving support device 100c, various modifications similar to those described in the first to third embodiments can be adopted.
 以上のように、実施の形態4の運転支援装置100cは、操作入力装置10に対する操作入力に応じて、道路交通情報表示Sに対応する画像I3を表示装置5に表示させる制御(第2表示制御)を実行する表示制御部27aを備える。これにより、運転者は、車両1が通り過ぎた道路交通情報表示Sの内容を、自身が見逃したか否かにかかわらず、任意のタイミングにて確認することができる。 As described above, the driving support device 100c of the fourth embodiment includes the display control unit 27a that executes control (second display control) for causing the display device 5 to display the image I3 corresponding to the road traffic information display S in response to an operation input to the operation input device 10. As a result, the driver can confirm the content of a road traffic information display S that the vehicle 1 has passed at any timing, regardless of whether or not he or she overlooked it.
 なお、本願発明はその発明の範囲内において、各実施の形態の自由な組み合わせ、あるいは各実施の形態の任意の構成要素の変形、もしくは各実施の形態において任意の構成要素の省略が可能である。 It should be noted that, within the scope of the invention, the embodiments may be freely combined, any constituent element of each embodiment may be modified, and any constituent element may be omitted in each embodiment.
 本発明の運転支援装置は、車両の運転支援に用いることができる。 The driving support device of the present invention can be used for driving support of a vehicle.
 1 車両、2,2a,2b,2c 制御装置、3 第1撮像装置、4 第2撮像装置、5 表示装置、6 センサ類、7 無線通信装置、8 サーバ装置、9 記憶装置、10 操作入力装置、11 音声出力装置、21,21a 道路交通情報検出部、22 視点検出部、23 視点距離係数算出部、24,24a 視点距離係数補正部、25 認識度算出部、26 認識度判定部、27,27a 表示制御部、31 車両情報取得部、32 環境情報取得部、33 運転者情報取得部、34 補正用情報取得部、41 重要度判定部、42 音声出力制御部、43 出力制御部、51 プロセッサ、52 メモリ、53 処理回路、100,100a,100b,100c 運転支援装置、200,200a,200b,200c 運転支援システム。 1 vehicle, 2, 2a, 2b, 2c control device, 3 first imaging device, 4 second imaging device, 5 display device, 6 sensors, 7 wireless communication device, 8 server device, 9 storage device, 10 operation input device , 11 voice output device, 21,21a road traffic information detection unit, 22 viewpoint detection unit, 23 viewpoint distance coefficient calculation unit, 24, 24a viewpoint distance coefficient correction unit, 25 recognition degree calculation unit, 26 recognition degree determination unit, 27, 27a Display control unit, 31 Vehicle information acquisition unit, 32 Environmental information acquisition unit, 33 Driver information acquisition unit, 34 Correction information acquisition unit, 41 Importance determination unit, 42 Voice output control unit, 43 Output control unit, 51 Processor , 52 memory, 53 processing circuit, 100, 100a, 100b, 100c operation support device, 200, 200a, 200b, 200c operation support system.

Claims (9)

  1.  前方画像における道路交通情報表示に対応する領域を検出するとともに、前記道路交通情報表示の内容を検出する道路交通情報検出部と、
     車内画像を用いて、前記前方画像における運転者の視点に対応する位置を検出する視点検出部と、
     前記位置に対する前記領域の距離に応じた視点距離係数を算出する視点距離係数算出部と、
     前記内容に基づき、前記視点距離係数を補正する視点距離係数補正部と、
     補正後の前記視点距離係数を積算することにより、前記運転者による前記道路交通情報表示の認識度を算出する認識度算出部と、
     を備える車両用の運転支援装置。
    A road traffic information detection unit that detects an area corresponding to the road traffic information display in the front image and detects the content of the road traffic information display,
    A viewpoint detection unit that detects a position corresponding to the driver's viewpoint in the front image using an in-vehicle image, and a viewpoint detection unit.
    A viewpoint distance coefficient calculation unit that calculates a viewpoint distance coefficient according to the distance of the region to the position,
    A viewpoint distance coefficient correction unit that corrects the viewpoint distance coefficient based on the above contents,
    A recognition degree calculation unit that calculates the recognition degree of the road traffic information display by the driver by integrating the corrected viewpoint distance coefficient.
    A driving support device for vehicles equipped with.
  2.  前記認識度が閾値以上であるか否かを判定することにより、前記道路交通情報表示が前記運転者により認識されたか否かを判定する認識度判定部と、
     前記認識度が前記閾値未満であると判定された場合、前記道路交通情報表示に対応する画像を表示装置に表示させる制御を実行する表示制御部と、
     を備えることを特徴とする請求項1記載の運転支援装置。
    A recognition degree determination unit that determines whether or not the road traffic information display has been recognized by the driver by determining whether or not the recognition degree is equal to or higher than a threshold value.
    When it is determined that the recognition degree is less than the threshold value, a display control unit that executes control to display an image corresponding to the road traffic information display on the display device, and
    The driving support device according to claim 1, further comprising.
  3.  前記認識度判定部は、前記道路交通情報表示が前記前方画像外に出たとき、前記認識度が前記閾値以上であるか否かを判定することを特徴とする請求項2記載の運転支援装置。 The driving support device according to claim 2, wherein the recognition degree determination unit determines whether or not the recognition degree is equal to or higher than the threshold value when the road traffic information display goes out of the front image.
  4.  前記視点距離係数補正部は、前記内容に基づき前記視点距離係数を補正するとともに、前記領域のサイズに基づき前記視点距離係数を補正することを特徴とする請求項1から請求項3のうちのいずれか1項記載の運転支援装置。 The driving support device according to any one of claims 1 to 3, wherein the viewpoint distance coefficient correction unit corrects the viewpoint distance coefficient based on the content and also corrects the viewpoint distance coefficient based on the size of the region.
  5.  前記視点距離係数補正部は、前記内容に基づき前記視点距離係数を補正するとともに、前記車両の走行状態に基づき前記視点距離係数を補正することを特徴とする請求項1から請求項3のうちのいずれか1項記載の運転支援装置。 The driving support device according to any one of claims 1 to 3, wherein the viewpoint distance coefficient correction unit corrects the viewpoint distance coefficient based on the content and also corrects the viewpoint distance coefficient based on the traveling state of the vehicle.
  6.  前記視点距離係数補正部は、前記内容に基づき前記視点距離係数を補正するとともに、前記車両の車外環境に基づき前記視点距離係数を補正することを特徴とする請求項1から請求項3のうちのいずれか1項記載の運転支援装置。 The driving support device according to any one of claims 1 to 3, wherein the viewpoint distance coefficient correction unit corrects the viewpoint distance coefficient based on the content and also corrects the viewpoint distance coefficient based on the environment outside the vehicle.
  7.  前記視点距離係数補正部は、前記内容に基づき前記視点距離係数を補正するとともに、前記運転者の視力に基づき前記視点距離係数を補正することを特徴とする請求項1から請求項3のうちのいずれか1項記載の運転支援装置。 The driving support device according to any one of claims 1 to 3, wherein the viewpoint distance coefficient correction unit corrects the viewpoint distance coefficient based on the content and also corrects the viewpoint distance coefficient based on the visual acuity of the driver.
  8.  前記内容に基づき、前記道路交通情報表示の重要度が基準値以上であるか否かを判定する重要度判定部と、
     前記認識度が前記閾値未満であると判定された場合において、前記重要度が前記基準値以上であると判定されたとき、前記道路交通情報表示に対応する音声を音声出力装置に出力させる制御を実行する音声出力制御部と、
     を備えることを特徴とする請求項2又は請求項3記載の運転支援装置。
    An importance determination unit that determines, based on the content, whether or not the importance of the road traffic information display is equal to or higher than a reference value, and
    A voice output control unit that, when it is determined that the recognition degree is less than the threshold value and that the importance is equal to or higher than the reference value, executes control for causing a voice output device to output a voice corresponding to the road traffic information display,
    The driving support device according to claim 2 or claim 3, further comprising.
  9.  操作入力装置に対する操作入力に応じて、前記道路交通情報表示に対応する画像を表示装置に表示させる制御を実行する表示制御部を備えることを特徴とする請求項1記載の運転支援装置。 The driving support device according to claim 1, further comprising a display control unit that executes control for displaying an image corresponding to the road traffic information display on the display device in response to an operation input to the operation input device.
PCT/JP2019/010296 2019-03-13 2019-03-13 Driving assistance device WO2020183652A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2021504710A JP7105985B2 (en) 2019-03-13 2019-03-13 Driving support device
PCT/JP2019/010296 WO2020183652A1 (en) 2019-03-13 2019-03-13 Driving assistance device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/010296 WO2020183652A1 (en) 2019-03-13 2019-03-13 Driving assistance device

Publications (1)

Publication Number Publication Date
WO2020183652A1 true WO2020183652A1 (en) 2020-09-17

Family

ID=72427228

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/010296 WO2020183652A1 (en) 2019-03-13 2019-03-13 Driving assistance device

Country Status (2)

Country Link
JP (1) JP7105985B2 (en)
WO (1) WO2020183652A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7400119B2 (en) 2021-04-06 2023-12-18 グーグル エルエルシー Use of resources based on geospatial information

Citations (7)

Publication number Priority date Publication date Assignee Title
JP2000099883A (en) * 1999-10-13 2000-04-07 Denso Corp Traffic information display
JP2004037149A (en) * 2002-07-01 2004-02-05 Mazda Motor Corp Apparatus and method for guiding path, and program for guiding path
JP2005182307A (en) * 2003-12-17 2005-07-07 Denso Corp Vehicle driving support device
JP2008082886A (en) * 2006-09-27 2008-04-10 Matsushita Electric Ind Co Ltd Device and method for displaying guide sign
JP2009110394A (en) * 2007-10-31 2009-05-21 Equos Research Co Ltd Road sign display device
JP2015170249A (en) * 2014-03-10 2015-09-28 株式会社デンソーアイティーラボラトリ Safety confirmation determination device and driving support device
JP2017111469A (en) * 2015-12-14 2017-06-22 富士通株式会社 Road sign visual recognition determination system, road sign visual recognition determination method, and program

Also Published As

Publication number Publication date
JP7105985B2 (en) 2022-07-25
JPWO2020183652A1 (en) 2021-09-13

Similar Documents

Publication Publication Date Title
JP6330903B2 (en) Information presentation device and information presentation method
CN107848416B (en) Display control device, display device, and display control method
US9589194B2 (en) Driving assistance device and image processing program
EP2936065B1 (en) A system for a vehicle
US8970451B2 (en) Visual guidance system
US10181308B2 (en) System and method for controlling the luminosity of a head-up display and display using said system
JP5689872B2 (en) Vehicle periphery monitoring device
JP5299026B2 (en) Vehicle display device
KR20180056867A (en) Display device and operating method thereof
US20110228980A1 (en) Control apparatus and vehicle surrounding monitoring apparatus
JP5286035B2 (en) Vehicle speed control device
JP2017111469A (en) Road sign visual recognition determination system, road sign visual recognition determination method, and program
CN107650639B (en) Visual field control device
KR101976106B1 (en) Integrated head-up display device for vehicles for providing information
CN106500716A (en) Automobile navigation optical projection system and its method
CN105224272B (en) image display method and automobile display device
WO2020183652A1 (en) Driving assistance device
JP2017202721A (en) Display system
US10946744B2 (en) Vehicular projection control device and head-up display device
US10474912B2 (en) Vehicle display controller, vehicle display system, vehicle display control method, and non-transitory storage medium
US11828947B2 (en) Vehicle and control method thereof
JP2007280203A (en) Information presenting device, automobile and information presenting method
JP4026598B2 (en) Image display device for vehicle
JP5192009B2 (en) Vehicle periphery monitoring device
CN113689358A (en) Vehicle windshield image enhancement display method, electronic equipment, storage medium and glass

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19918971

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021504710

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19918971

Country of ref document: EP

Kind code of ref document: A1