WO2021132250A1 - In-vehicle display device and program - Google Patents

In-vehicle display device and program

Info

Publication number: WO2021132250A1
Authority: WIPO (PCT)
Prior art keywords: vehicle, image, display, information, display device
Application number: PCT/JP2020/047985
Other languages: French (fr)
Inventors: Masato Kusanagi, Yuuki Suzuki, Kazuhiro Takazawa, Yuki Hori, Shin Sekiya
Original Assignee: Ricoh Company, Ltd.
Priority claimed from: JP2020002962A (JP2021109555A), JP2020204820A (JP2021105989A)
Priority to: EP20842331.9A (EP4081420A1)

Classifications

    • B60K35/00 Instruments specially adapted for vehicles; arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K35/21 Output arrangements using visual output, e.g. blinking lights or matrix displays
    • B60K35/23 Head-up displays [HUD]
    • B60K35/28 Output arrangements characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information, or by the purpose of the output information, e.g. for attracting the attention of the driver
    • B60K35/29 Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
    • B60K2360/149 Instrument input by detecting viewing direction not otherwise provided for
    • B60K2360/166 Navigation
    • B60K2360/177 Augmented reality
    • B60K2360/178 Warnings
    • B60K2360/179 Distances to obstacles or vehicles
    • B60K2360/186 Displaying information according to relevancy
    • B60K2360/1868 Displaying information according to relevancy according to driving situations

Definitions

  • the disclosure discussed herein relates to an in-vehicle display device and a program.
  • Patent Document 1 predicts the arrival time to reach an obstacle in order to reduce annoyance to the driver.
  • In this technique, when the predicted arrival time to reach the obstacle is 2 seconds or less, an audible alarm immediately goes off, and when the predicted arrival time is more than 2 seconds, a display alarm is presented.
  • The technique of Patent Document 1 therefore presents no attention-attracting display when the predicted arrival time to reach the obstacle is 2 seconds or less. Since it merely generates an alarm sound in that case, the driver cannot tell the type of the obstacle or the action to be taken. Thus, the related-art technique fails to present a more appropriate attention-attracting display to the driver of a vehicle.
  • An object of the present invention is to present a more appropriate attention-attracting display to the driver of a vehicle.
  • According to an aspect of the present invention, an in-vehicle display device installed in a vehicle includes a calculator configured to calculate a distance from the vehicle to an object; an image generator configured to generate a first image to attract attention to the object by using a character string alone or by using the character string and a graphic, in response to the distance calculated by the calculator being a predetermined first threshold value or less, or to generate a second image to attract attention to the object by using the graphic alone, in response to the distance calculated by the calculator being greater than the first threshold value; and a display unit configured to display, using a display device, the first image or the second image generated by the image generator.
  • According to the disclosed technique, a more appropriate attention-attracting display can be provided to the driver of a vehicle.
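As a concrete (hypothetical) reading of the selection rule above, the sketch below chooses between the two image types from the calculated distance. The threshold value, function names, and data structure are illustrative assumptions, not taken from the patent text.

```python
# Sketch of the claimed image-selection rule. The threshold value and all
# names here are illustrative assumptions, not taken from the patent.
from dataclasses import dataclass

FIRST_THRESHOLD_M = 30.0  # predetermined first threshold value (assumed: meters)

@dataclass
class AttentionImage:
    uses_character_string: bool  # True only for the "first image"
    uses_graphic: bool
    text: str = ""

def generate_attention_image(distance_m: float) -> AttentionImage:
    """First image (string, or string + graphic) at or below the threshold;
    second image (graphic alone) beyond it."""
    if distance_m <= FIRST_THRESHOLD_M:
        return AttentionImage(uses_character_string=True, uses_graphic=True, text="BRAKE!")
    return AttentionImage(uses_character_string=False, uses_graphic=True)

print(generate_attention_image(12.0))  # first image: explicit notification
print(generate_attention_image(80.0))  # second image: implicit notification
```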
  • FIG. 1 is a diagram illustrating an example of a virtual image displayed on a front windshield of an in-vehicle display device according to a first embodiment of the present invention.
  • FIG. 2 is a diagram schematically illustrating an example of an internal arrangement of a vehicle provided with the in-vehicle display device according to a first embodiment of the present invention.
  • FIG. 3 is a diagram schematically illustrating a configuration example of an optical system included in the in-vehicle display device according to the first embodiment of the present invention.
  • FIG. 4 is a diagram illustrating a configuration example of a control system included in the in-vehicle display device according to the first embodiment of the present invention.
  • FIG. 5 is a diagram illustrating a schematic configuration example of the in-vehicle display device according to the first embodiment of the present invention and peripheral devices.
  • FIG. 6 is a diagram illustrating a functional configuration of the in-vehicle display device according to the first embodiment of the present invention.
  • FIG. 7 is a flowchart illustrating a procedure of a process performed by the in-vehicle display device according to the first embodiment of the invention.
  • FIG. 8 is a diagram illustrating an example of an explicit risk notification image displayed by the in-vehicle display device according to the first embodiment of the present invention.
  • FIG. 9 is a diagram illustrating an example of an explicit risk notification image displayed by the in-vehicle display device according to the first embodiment of the present invention.
  • FIG. 10 is a diagram illustrating an example of an implicit risk notification image displayed by the in-vehicle display device according to the first embodiment of the present invention.
  • FIG. 11 is a flowchart illustrating a procedure of a process performed by the in-vehicle display device according to a second embodiment of the present invention.
  • FIG. 12 is a diagram illustrating a functional configuration of the in-vehicle display device according to a third embodiment of the present invention.
  • FIG. 13A is a photograph depicting an example of information associated with a steering operation displayed by a display system according to a vehicle speed, and the like.
  • FIG. 13B is a photograph depicting an example of information associated with a steering operation displayed by the display system according to a vehicle speed, and the like.
  • FIG. 14 is a diagram illustrating an example of the display system installed in a vehicle.
  • FIG. 15 is a diagram illustrating an example of a structure of a display unit.
  • FIG. 16 is a configuration diagram illustrating an example of an in-vehicle system in which the display system is installed in a moving body.
  • FIG. 17 is a diagram illustrating a configuration example of a detector in the in-vehicle system.
  • FIG. 18 is a diagram illustrating a hardware configuration example of a display controller.
  • FIG. 19 is a functional block diagram illustrating examples of functions of a recognition unit and the display controller in blocks.
  • FIG. 20A is a diagram illustrating a method of estimating curvature of a curve.
  • FIG. 20B is a diagram illustrating a method of estimating curvature of a curve.
  • FIG. 21 is a diagram illustrating a method of determining a target steering angle.
  • FIG. 22A is a diagram illustrating an example of a display indicating a current steering angle and a target steering angle.
  • FIG. 22B is a diagram illustrating an example of a display indicating a current steering angle and a target steering angle.
  • FIG. 22C is a diagram illustrating an example of a display indicating a current steering angle and a target steering angle.
  • FIG. 23 is a diagram illustrating a relationship between an actual steering angle of a vehicle and a target steering angle.
  • FIG. 24 is a flowchart illustrating an example of a procedure in which the display controller presents a current steering angle and a target steering angle.
  • FIG. 1 is a diagram illustrating an example of a virtual image on a front windshield 21 displayed by an in-vehicle display device 100 according to a first embodiment of the present invention.
  • FIG. 2 is a diagram schematically illustrating an example of an internal arrangement of a vehicle 30 provided with an in-vehicle display device 100 according to the first embodiment of the present invention.
  • FIG. 3 is a diagram schematically illustrating a configuration example of an optical system 230 included in the in-vehicle display device 100 according to the first embodiment of the present invention.
  • the in-vehicle display device 100 is installed inside a dashboard of the vehicle 30.
  • the vehicle 30 is a traveling object acting as an example of a moving object.
  • Projection light L, which is image light emitted from the in-vehicle display device 100 inside the dashboard, is reflected by the front windshield 21 acting as a light transmitting member, and the reflected light travels toward the driver 20 acting as a viewer. Accordingly, the driver 20 can view a route navigation image or the like as a virtual image G, which will be described later.
  • a combiner acting as a light transmitting member may be disposed on an inner wall of the front windshield 21 so as to allow the driver 20 to see the virtual image G via the projection light L reflected by the combiner.
  • a frontward monitoring camera 22 and a driver monitoring camera 23 are disposed on an upper part of the front windshield 21.
  • the frontward monitoring camera 22 captures a frontward scenery ahead of the vehicle.
  • the frontward scenery monitored by the frontward monitoring camera 22 includes display information displayed by the in-vehicle display device 100 and the background of the display information on the front windshield 21. That is, the frontward monitoring camera 22 captures the display information displayed by the in-vehicle display device 100 reflected on the front windshield 21, and also captures the background of the display information across the front windshield 21.
  • the background captured by the frontward monitoring camera 22 across the front windshield 21 is a frontward environment (i.e., a preceding vehicle, a road surface, etc.) of the vehicle 30.
  • the driver monitoring camera 23 monitors the driver 20 in order to detect a viewpoint position of the driver 20.
  • the optical system 230 or the like of the in-vehicle display device 100 is designed such that a distance from the driver 20 to the virtual image G is 5 m or more.
  • When the distance from the driver 20 to the virtual image G is 5 m or more, the amount of movement of the lens of the driver's eye is reduced, and the time to focus on the virtual image G is shortened, so that the driver 20 can recognize the contents of the virtual image G quickly.
  • This also reduces fatigue of the driver's eyes, and the driver 20 can more easily notice the contents of the virtual image G.
  • the virtual image G can facilitate the appropriate provision of information to the driver 20.
  • In addition, the driver 20 can focus on the virtual image G with little convergent movement of the eyes. In this condition, the convergent movement hardly diminishes the effect of perceiving distance (change in perceptual distance) and depth (difference in perceptual distance) by means of motion parallax. Thus, the information perception effect of the driver 20 that utilizes the distance or depth of an image can be exerted effectively.
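The advantage of the 5 m virtual-image distance can be illustrated with simple vergence geometry (a back-of-the-envelope sketch; the 65 mm interpupillary distance is an assumed typical value, not a figure from the patent):

```python
# Vergence angle when fixating at a given distance: the smaller the angle
# change between distant scenery and the virtual image, the less eye movement
# is needed to refocus between them.
import math

IPD_M = 0.065  # assumed typical interpupillary distance (65 mm)

def vergence_angle_deg(distance_m: float) -> float:
    """Angle between the two lines of sight when fixating at distance_m."""
    return math.degrees(2.0 * math.atan((IPD_M / 2.0) / distance_m))

for d in (2.0, 5.0, 100.0):
    print(f"fixating at {d:5.1f} m -> vergence {vergence_angle_deg(d):.3f} deg")
# A 5 m image (~0.74 deg) is far closer to distant traffic (~0.04 deg)
# than a 2 m image (~1.86 deg), so less convergent movement is required.
```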
  • the optical system 230 included in the in-vehicle display device 100 illustrated in FIG. 3 includes red, green and blue laser light sources 201R, 201G and 201B, collimator lenses 202, 203 and 204 disposed for the respective laser light sources, and two dichroic mirrors 205 and 206.
  • the optical system 230 further includes a light amount adjusting unit 207, an optical scanning device 208 as an optical scanner, a free-form mirror 209, a microlens array 210 as a light emitting member, and a projection mirror 211 as a light reflecting member.
  • a light source unit 220 includes the laser light sources 201R, 201G and 201B, the collimator lenses 202, 203 and 204, and the dichroic mirrors 205 and 206, which are unitized by an optical housing.
  • the laser light sources 201R, 201G and 201B may be LDs (laser diodes, i.e., semiconductor laser elements).
  • the wavelength of light flux emitted from the red laser light source 201R is, for example, 640 nm
  • the wavelength of light flux emitted from the green laser light source 201G is, for example, 530 nm
  • the wavelength of light flux emitted from the blue laser light source 201B is, for example, 445 nm.
  • an intermediate image to be formed on the microlens array 210 is projected onto the front windshield 21 of the vehicle 30, so that the driver 20 sees an enlarged image as a virtual image G.
  • Laser light of colors emitted from the laser light sources 201R, 201G and 201B is substantially collimated with the collimator lenses 202, 203 and 204, respectively, and is then combined by the two dichroic mirrors 205 and 206.
  • the combined laser light is adjusted in light amount by the light amount adjusting unit 207 and is then deflected by a mirror of the optical scanning device 208.
  • Scanning light L' deflected by the optical scanning device 208 two-dimensionally scans the free-form mirror 209.
  • the scanning light L' is reflected by the free-form mirror 209 to be compensated for distortion, and is then collected in the microlens array 210 to render an intermediate image G' on the microlens array 210.
  • the microlens array 210 is used as a light emitting member that individually diverges light flux of each pixel (one point of the intermediate image) of the intermediate image G' to emit light, but other light emitting members may be used.
  • the intermediate image G' may alternatively be formed by using a liquid crystal display (LCD) or a vacuum fluorescent display (VFD).
  • However, a laser scanning type as used in the first embodiment is preferable.
  • With the laser scanning type used in the first embodiment, turning off the laser light sources 201R, 201G, and 201B for a non-image portion within the virtual image G display area completely prevents the non-image portion from being illuminated. Accordingly, degradation in viewability of the frontward scenery of the vehicle 30, caused by light applied from the in-vehicle display device 100 through the non-image portion, can be avoided, and the viewability of the frontward scenery can be further improved.
  • The laser scanning type is also preferable for display control that partially increases the brightness of a portion of the image included in the virtual image G displayed by the in-vehicle display device 100.
  • the optical scanning device 208 tilts the mirror in the main and sub-scanning directions with a known actuator drive system such as MEMS (Micro Electro Mechanical Systems), and deflects the laser light incident on the mirror to two-dimensionally scan (raster-scan) the free-form mirror 209.
  • the drive control of the mirror is performed in synchronization with the emission timings of the laser light sources 201R, 201G and 201B.
  • the optical scanning device 208 is not limited to the above-described configuration.
  • the optical scanning device 208 may be a mirror type optical scanning device that includes two mirrors configured to oscillate or rotate around two axes perpendicular to each other.
  • FIG. 4 is a diagram illustrating a configuration example of a control system 250 included in the in-vehicle display device 100 according to the first embodiment of the present invention.
  • the control system 250 includes an FPGA (Field Programmable Gate Array) 251, a CPU (Central Processing Unit) 252, a ROM (Read-Only Memory) 253, a RAM (Random Access Memory) 254, an interface 255 (hereinafter referred to as I/F), a bus line 256, an LD driver 257, and a MEMS controller 258.
  • the FPGA 251 controls operations of the laser light sources 201R, 201G, and 201B of the light source unit 220 using the LD driver 257, and an operation of the MEMS 208a of the optical scanning device 208 using the MEMS controller 258.
  • the CPU 252 provides respective functions of the in-vehicle display device 100.
  • the ROM 253 stores various programs, such as an image processing program, which is executed by the CPU 252 in order to implement respective functions of the in-vehicle display device 100.
  • the RAM 254 is used as a work area of the CPU 252.
  • the I/F 255 is an interface for communicating with an external controller or the like. For example, the I/F 255 is connected to a vehicle navigation device 40, various sensors 50, or the like through the CAN (Controller Area Network) of the vehicle 30.
  • the I/F 255 is connected to the frontward monitoring camera 22 configured to monitor the frontward scenery of the vehicle 30. That is, the frontward monitoring camera 22 monitors the display information displayed by the in-vehicle display device 100 on the front windshield 21, as well as the background of the display information across the front windshield 21.
  • the I/F 255 is further connected to the driver monitoring camera 23 in order to detect a viewpoint position of the driver 20.
  • the control system 250 performs image processing on an image of the display information and its background captured by the frontward monitoring camera 22, based on the viewpoint position of the driver 20, and converts the processed image into an image viewed from the viewpoint position of the driver 20.
  • the control system 250 detects the viewpoint position of the driver 20, for example, by performing an image analysis of an image of the head of the driver 20 captured by the driver monitoring camera 23.
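A much-simplified sketch of this viewpoint conversion is shown below, treating the windshield display area as a plane so that the forward camera's image can be re-mapped to the driver's viewpoint with a single homography. The file names and corner coordinates are placeholders that calibration against the detected head position would supply in practice.

```python
# Simplified viewpoint conversion: warp the forward camera's view of the
# (assumed planar) windshield display area to the driver's viewpoint.
import cv2
import numpy as np

frame = cv2.imread("forward_camera.png")  # hypothetical capture from camera 22

# Display-area corners as seen by the forward monitoring camera ...
src = np.float32([[100, 80], [540, 60], [560, 400], [80, 420]])
# ... and the same corners as they should appear from the driver's viewpoint
# (offsets derived from the head position found by driver monitoring camera 23).
dst = np.float32([[90, 90], [550, 90], [550, 410], [90, 410]])

H = cv2.getPerspectiveTransform(src, dst)             # 3x3 planar homography
driver_view = cv2.warpPerspective(frame, H, (640, 480))
cv2.imwrite("driver_view.png", driver_view)
```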
  • FIG. 5 is a diagram illustrating a schematic configuration example of the in-vehicle display device 100 according to the first embodiment of the present invention and peripheral devices.
  • the vehicle navigation device 40 and the sensor 50 are provided as an information acquiring unit configured to acquire driver-use information provided via the virtual image G to the driver 20.
  • the in-vehicle display device 100 mainly includes the optical system 230, which is an example of a display unit, and the control system 250, which is an example of a controller.
  • As the vehicle navigation device 40 according to the first embodiment, any known vehicle navigation device installed in automobiles or the like may be used.
  • the vehicle navigation device 40 outputs information used to generate a route navigation image to be displayed as a virtual image G, and this output information is input to the control system 250.
  • the route navigation image includes an image illustrating information such as the number of lanes (travel lanes) of the road on which the vehicle 30 is traveling, the distance to the point at which the next route change (right turn, left turn, branch, etc.) should be made, the direction of that route change, and the like.
  • the above information is input from the vehicle navigation device 40 to the control system 250.
  • the in-vehicle display device 100 displays a route navigation image, such as a travel lane instruction image 711, an inter-vehicle distance presenting image 712, a route specification image 721, a remaining distance image 722, and a name image 723 of the intersection or the like, as the virtual image G in an upper image display area A.
  • an image representing road-specific information (road name, speed limit, etc.) is displayed as a virtual image G in a lower image display area B of the in-vehicle display device 100.
  • the road-specific information is also input from the vehicle navigation device 40 to the control system 250.
  • under the control of the control system 250, the in-vehicle display device 100 displays a road name display image 701, a speed limit display image 702, a no-passing display image 703, and the like, which correspond to the road-specific information, as a virtual image in the lower image display area B.
  • the sensor 50 in FIG. 5 includes one or more sensors configured to detect various information indicative of behaviors of the vehicle 30, a condition of the vehicle 30, surrounding conditions of the vehicle 30, and the like.
  • the sensor 50 outputs sensing information used to generate an image to be displayed as a virtual image G, and this sensing information is input to the control system 250.
  • a vehicle speed display image 704 (text image of "83 km/h" in the example of FIG. 1) representing the vehicle speed of the vehicle 30 is displayed as a virtual image in the lower image display area B of the in-vehicle display device 100.
  • the vehicle speed information included in the CAN information of the vehicle 30 is input to the control system 250 from the sensor 50, and the in-vehicle display device 100 displays a character image representing the vehicle speed as a virtual image G in the lower image display area B, under the control of the control system 250.
  • the sensor 50 may include, for example, sensors other than those configured to detect the vehicle speed of the vehicle 30.
  • Examples of such sensors include the following: (1) laser radar devices and imaging devices configured to detect the distance between the vehicle 30 and other vehicles, pedestrians, or structures (such as guardrails and utility poles) around (frontward, sideward, or rearward of) the vehicle 30, and sensors configured to detect external environmental information of the vehicle 30 (such as ambient temperature, brightness, and weather conditions); (2) sensors configured to detect operations by the driver 20 (such as brake operation and accelerator opening/closing); (3) sensors configured to detect the remaining amount of fuel in the fuel tank of the vehicle 30; and (4) sensors configured to detect the status of various vehicle-mounted equipment, such as the engine and batteries. By transmitting the information detected by the sensor 50 to the control system 250, the in-vehicle display device 100 can display the information to be provided to the driver 20 as a virtual image G.
  • the driver-use information provided via the virtual image G to the driver 20 may be any information useful for the driver 20.
  • the driver-use information is broadly classified into passive information and active information for convenience.
  • Passive information is information that is passively perceived by the driver 20 at a time when predetermined information provision conditions are met. Accordingly, the information to be provided to the driver 20 at a set timing of the in-vehicle display device 100 is included in the passive information. Information having a certain relationship between the timing at which the information is provided and content of the information provided is also included in the passive information. Examples of passive information include safety-related information and route navigation information while driving.
  • the safety-related information while driving includes information representing the distance between the vehicle 30 and a preceding vehicle 350 (inter-vehicle distance presenting image 712) and emergency information relating to driving (warning information or attention attracting information, such as emergency operation instruction information that instructs a driver to operate the vehicle in an emergency).
  • the route navigation information is information for guiding a travel route to a predetermined destination, and may be the same information as that provided to the driver by a known vehicle navigation device.
  • the route navigation information includes travel lane instruction information (a travel lane instruction image 711) indicating the travel lane to be driven at the nearest intersection, and route change operation instruction information indicating the operation to change the route from the straight-ahead direction at the intersection or the branch.
  • Examples of the route change operation instruction information include route designation information (route specification image 721) for specifying which route should be taken at an intersection, etc., information indicating the remaining distance to the intersection, etc. for performing the route change operation (remaining distance image 722), and information indicating the name of the intersection, etc. (name image 723 such as name of intersection).
  • Active information is information that is actively perceived by the driver 20 at a timing determined by the driver 20.
  • the active information may be information to be provided to the driver at a timing desired by the driver 20. Examples of the active information include information having a low or no relationship between the timing at which the information is provided and content of the information provided.
  • the active information is information that is acquired by the driver 20 at the timing desired by the driver 20, and thus may continue to be displayed for a certain length of time or at all times. Examples of such active information include road-specific information associated with a road on which the vehicle 30 is traveling, the vehicle speed information of the vehicle 30 (the vehicle speed display image 704), the current time information, and the like.
  • the road-specific information includes information relating to the road that is useful for the driver 20. Examples of the road-specific information include information indicating the name of the road (the road name display image 701) and information indicating the content of regulations such as the speed limit of the road (the speed limit display image 702 and the no-passing display image 703).
  • the broadly classified passive information and active information items are displayed in the respective display areas where the virtual image G can be displayed.
  • two display areas arranged in the vertical direction are set as the respective areas where the virtual image G is displayed.
  • a passive information image corresponding to the passive information is mainly displayed in the upper image display area A
  • an active information image corresponding to the active information is mainly displayed in the lower image display area B.
  • when a part of the active information image is displayed in the upper image display area A, the viewability of the passive information image is prioritized.
  • a stereoscopic image represented by using stereoscopic vision is used as the virtual image G displayed by the in-vehicle display device 100.
  • a perspective image represented by a perspective method is used as the inter-vehicle distance presenting image 712 and the travel lane instruction image 711 displayed in the upper image display area A, in which the in-vehicle display device 100 displays a virtual image.
  • the inter-vehicle distance presenting image 712 is a perspective image that is created by a perspective method, which is a method of drawing lines toward a single vanishing point.
  • the inter-vehicle distance presenting image 712 is formed such that the single vanishing point is determined near a gazing point of the driver 20. As a result, the driver 20 can more easily perceive the sense of depth of the inter-vehicle distance presenting image 712 while driving.
  • the inter-vehicle distance presenting image 712 serving as a perspective image is displayed such that the horizontal lines become thinner, or their brightness decreases, toward the upper side of the upper image display area A.
  • the driver 20 can more easily perceive the depth of the inter-vehicle distance presenting image 712 while driving.
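The thinning and dimming described above can be sketched in a few lines of drawing code; the geometry and falloff constants below are illustrative choices, not values from the patent:

```python
# Sketch of the inter-vehicle distance presenting image: horizontal rungs that
# get narrower, thinner, and dimmer toward a single upper vanishing point.
from PIL import Image, ImageDraw

W, H = 400, 300
vp = (W // 2, 40)  # vanishing point placed near the driver's gazing point
img = Image.new("RGB", (W, H), "black")
draw = ImageDraw.Draw(img)

for i in range(6):
    t = i / 5.0                                 # 0 = nearest rung, 1 = farthest
    y = int((H - 20) - t * ((H - 20) - vp[1]) * 0.8)
    half = int((1.0 - 0.7 * t) * 120)           # rungs converge toward the vanishing point
    width = max(1, round(6 * (1.0 - t)))        # lines get thinner with distance
    g = int(255 * (1.0 - 0.6 * t))              # and dimmer with distance
    draw.line([(vp[0] - half, y), (vp[0] + half, y)], fill=(g, g, g), width=width)

img.save("inter_vehicle_distance_image.png")
```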
  • FIG. 6 is a diagram illustrating a functional configuration of the in-vehicle display device 100 according to the first embodiment of the present invention.
  • the in-vehicle display device 100 illustrated in FIG. 6 is a device installed in a vehicle, such as an automobile.
  • the in-vehicle display device 100 may display information to attract a driver's attention to objects around the vehicle at a viewable location (e.g., a front windshield) from the driver inside the vehicle.
  • Objects include, for example, persons, animals, installed objects, obstacles, vehicles, and the like.
  • the in-vehicle display device 100 includes a vehicle information acquiring unit 102, an environmental information acquiring unit 104, a calculator 110, an image generator 112, and an image display unit 114.
  • the vehicle information acquiring unit 102 acquires information associated with the vehicle (hereinafter referred to as "vehicle information"). For example, the vehicle information acquiring unit 102 acquires vehicle speed information, steering angle information, and the like (but is not limited to these) through the CAN (Controller Area Network) from an electronic control unit (ECU) provided in the vehicle.
  • the environmental information acquiring unit 104 acquires information (hereinafter referred to as "environmental information") associated with the environment around the vehicle. For example, the environmental information acquiring unit 104 acquires, as examples of environmental information, a relative speed ⁇ Vt between a vehicle and an object, and a distance D between the vehicle and the object. For example, the environmental information acquiring unit 104 acquires the relative speed ⁇ Vt from a speed calculating device (e.g., an ECU) capable of calculating the relative speed ⁇ Vt. For example, the environmental information acquiring unit 104 acquires the relevant distance D from a distance calculating device (e.g., a distance sensor) capable of calculating the distance D.
  • the in-vehicle display device 100 may include at least one of the speed calculating device or the distance calculating device.
  • the environmental information acquiring unit 104 may acquire various environmental information from an in-vehicle device (e.g., an ECU, various sensors, or the like), or may acquire various environmental information from outside the vehicle through communication.
  • The calculator 110 calculates an arrival time t for the vehicle to reach the object, based on the relative speed ΔVt and the distance D acquired by the environmental information acquiring unit 104. The image generator 112 generates an image that attracts the driver's attention to the object based on the arrival time t computed by the calculator 110.
  • When the arrival time t is greater than a predetermined first threshold value th1, the image generator 112 generates an implicit risk notification image (an example of a "second image") that relatively lightly attracts the driver's attention to the object.
  • the implicit risk notification image is an image that attracts attention to an object by using a graphic alone.
  • When the arrival time t is the predetermined first threshold value th1 or less, the image generator 112 generates an explicit risk notification image (an example of a "first image") that relatively strongly attracts the driver's attention to the object.
  • the explicit risk notification image is an image that attracts attention to an object using a character string alone or using both a character string and a graphic.
  • the image display unit 114 displays an image (an implicit risk notification image or an explicit risk notification image) generated by the image generator 112 at a driver's viewable position of a front windshield (or transparent plate) of a vehicle using the head-up display 120 (optical system 230).
  • the image display unit 114 may display various vehicle information (e.g., vehicle speed information, steering angle information, or the like) acquired by the vehicle information acquiring unit 102 at a driver's viewable position of the front windshield (or transparent plate) of the vehicle using the head-up display 120 (the optical system 230).
  • the head-up display 120 (the optical system 230) is an example of a display device, but the display device is not limited to this. For example, a meter panel, a navigation device, or the like may also be used as the display device.
  • the display device may be integrally disposed on a main body of the in-vehicle display device 100, or may be externally connected to the main body of the in-vehicle display device 100.
  • the in-vehicle display device 100 includes a computer including a CPU 252, a ROM 253, and a RAM 254.
  • the in-vehicle display device 100 implements the above-described functions by executing a program stored in the ROM 253 by using the RAM 254 as a memory area.
  • the above-described functions of the in-vehicle display device 100 may be physically implemented by one device or by a plurality of devices. A portion of each of the above functions may also be implemented by utilizing functions provided in other devices (e.g., an ECU, an external server, etc.).

(Procedure of Process Performed by In-vehicle Display Device 100)
  • FIG. 7 is a flowchart illustrating a procedure of a process performed by the in-vehicle display device 100 according to the first embodiment of the present invention.
  • In step S201, the environmental information acquiring unit 104 acquires, as the environmental information, the relative speed ΔV between the vehicle and a danger-related object and the distance D between the vehicle and the object.
  • In step S202, the calculator 110 calculates the arrival time t to reach the object based on the relative speed ΔV and the distance D acquired in step S201.
  • In step S203, the image generator 112 determines whether the arrival time t calculated in step S202 is greater than a predetermined first threshold value th1 (e.g., 3 seconds).
  • When the image generator 112 determines in step S203 that the arrival time t is greater than the predetermined first threshold value th1 (Yes in step S203), the environmental information acquiring unit 104 acquires the relative position information of the object (step S204). In step S205, the image generator 112 generates an implicit risk notification image (potential risk notification image).
  • In step S206, the image display unit 114 displays the implicit risk notification image generated in step S205 at a position corresponding to the relative position information acquired in step S204 (i.e., a position superimposed on the object) on the front windshield of the vehicle using the head-up display 120 (the optical system 230). Thereafter, the in-vehicle display device 100 completes the series of steps illustrated in FIG. 7.
  • When the image generator 112 determines in step S203 that the arrival time t is not greater than the predetermined first threshold value th1 (No in step S203), the image generator 112 generates an explicit risk notification image (obvious risk notification image) (step S207).
  • In step S208, the image display unit 114 displays the explicit risk notification image generated in step S207 at a predetermined position on the front windshield of the vehicle using the head-up display 120 (the optical system 230). Thereafter, the in-vehicle display device 100 completes the series of steps illustrated in FIG. 7.
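Reading step S202 as t = D / ΔV (the text says only that t is calculated from ΔV and D, so the division is our assumption), the FIG. 7 branch can be sketched as follows; the `hud` interface is a hypothetical stand-in for the image generator and display unit:

```python
# Sketch of the FIG. 7 flow (first embodiment). th1 matches the example value
# in the text; t = D / dV and the hud interface are illustrative assumptions.
TH1_S = 3.0  # predetermined first threshold value th1 (e.g., 3 seconds)

def process_frame(relative_speed_mps: float, distance_m: float, hud) -> None:
    if relative_speed_mps <= 0:
        return                                   # not closing in on the object
    t = distance_m / relative_speed_mps          # S202: arrival time
    if t > TH1_S:                                # S203: Yes
        pos = hud.object_relative_position()     # S204 (hypothetical helper)
        hud.show_implicit_image(at=pos)          # S205-S206: graphic on the object
    else:                                        # S203: No
        hud.show_explicit_image()                # S207-S208: string (+ graphic)
```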
(Example of Display of Explicit Risk Notification Image)

  • FIG. 8 and FIG. 9 are diagrams illustrating examples of an explicit risk notification image displayed by the in-vehicle display device 100 according to the first embodiment of the present invention.
  • FIGS. 8 and 9 illustrate frontward sceneries ahead of the vehicle as viewed by the driver of the vehicle across the front windshield.
  • In FIGS. 8 and 9, the explicit risk notification image 300 or the explicit risk notification image 400 is displayed on the front windshield of the vehicle because the arrival time t until the vehicle reaches a person 10 is the predetermined first threshold value th1 (e.g., 3 seconds) or less.
  • the explicit risk notification image 300 is displayed above the vehicle speed display image 302 using only the character string to attract attention to the person 10.
  • the explicit risk notification image 300 is an image with only the character string "BRAKE!" representing the action to be taken by the driver.
  • the explicit risk notification image 400 is displayed on the front windshield of the vehicle above the vehicle speed display image 402 using a character string and a graphic to attract attention to the person 10.
  • the explicit risk notification image 400 is an image having the character string "BRAKE!" that represents the action to be taken by the driver, and a graphic that exhibits the periphery of the character string with a red color meaning "warning".
  • the in-vehicle display device 100 is capable of displaying the explicit risk notification images 300 and 400 using a character string alone or both a character string and a graphic on the front windshield of the vehicle using the head-up display 120 (optical system 230) for strongly attracting the driver's attention to the person 10, in response to the arrival time t being a predetermined first threshold value th1 (e.g., 3 seconds) or less.
  • The arrival time t indicates the time until the vehicle reaches the person 10.
  • the in-vehicle display device 100 according to the first embodiment enables the driver to accurately and quickly understand the contents of avoiding action instructions against the explicit risk.
  • the in-vehicle display device 100 according to the first embodiment can display an image or the like to more appropriately attract attention from the driver of the vehicle.
  • the in-vehicle display device 100 can display a plurality of explicit risk notification images in stepwise manner according to the distance D from the vehicle to the person 10, in response to the arrival time t being the predetermined first threshold value th1 or less.
  • The arrival time t indicates the time until the vehicle reaches the person 10. Accordingly, the in-vehicle display device 100 according to the first embodiment can more effectively enable the driver to understand the presence of the explicit risk.
  • In response to the arrival time t (the time until the vehicle reaches the person 10) being the predetermined first threshold value th1 or less, the in-vehicle display device 100 may alternatively display a single explicit risk notification image regardless of the distance D from the vehicle to the person 10.
  • the explicit risk notification images 300 and 400 may have animation effects (e.g., blinking, brightness change, color conversion, shape change, etc.). Accordingly, the in-vehicle display device 100 according to the first embodiment can more effectively enable the driver to understand the presence of the explicit risk.
  • the explicit risk notification images 300 and 400 may have a character string representing the type of the object (e.g., "Attention to Person!" or the like) in addition to, or instead of, a character string representing the action to be taken by the driver (e.g., "BRAKE!" or the like).
  • FIG. 10 is a diagram illustrating an example of an implicit risk notification image displayed by the in-vehicle display device 100 according to the first embodiment of the present invention.
  • FIG. 10, like FIGS. 8 and 9, illustrates a frontward scenery ahead of the vehicle, viewed from the driver of the vehicle across the front windshield of the vehicle.
  • In FIG. 10, an implicit risk notification image 500 for attracting attention to the person 10 is displayed on the front windshield of the vehicle at a position superimposed on the person 10 when viewed from the driver, because the arrival time t to reach the person 10 is greater than the predetermined first threshold value th1 (e.g., 3 seconds).
  • the implicit risk notification image 500 is an image using an annular shape that exhibits the perimeter of the person 10 with a yellow color, which means "attention".
  • the in-vehicle display device 100 can display the implicit risk notification image 500 at a position superimposed on the person 10 on the front windshield of the vehicle using the head-up display 120 (the optical system 230), in response to the arrival time t being greater than the predetermined first threshold value th1 (e.g., 3 seconds).
  • The arrival time t indicates the time until the vehicle reaches the person 10. Accordingly, the in-vehicle display device 100 according to the first embodiment can notify the driver of the implicit risk without troublesomeness, and can thus display an image or the like that more appropriately attracts the attention of the driver of the vehicle.
  • the implicit risk notification image 500 may have an animation effect (e.g., blinking, brightness change, color conversion, shape change, etc.). Accordingly, the in-vehicle display device 100 according to the first embodiment can more effectively enable the driver to understand the presence of the implicit risk.
  • In the second embodiment, a procedure of a process performed by the in-vehicle display device 100 differs from that of the first embodiment.
  • FIG. 11 is a flowchart illustrating a procedure of a process performed by the in-vehicle display device 100 according to the second embodiment of the present invention.
  • In step S601, the environmental information acquiring unit 104 acquires, as the environmental information, the relative speed ΔV between the vehicle and a danger-related object and the distance D between the vehicle and the object.
  • In step S602, the calculator 110 calculates the arrival time t to reach the object based on the relative speed ΔV and the distance D acquired in step S601.
  • In step S603, the image generator 112 determines whether the arrival time t calculated in step S602 is greater than a predetermined second threshold value th2 (e.g., 5 seconds). Note that the predetermined second threshold value th2 is greater than the predetermined first threshold value th1.
  • When the image generator 112 determines in step S603 that the arrival time t is greater than the predetermined second threshold value th2 (Yes in step S603), the in-vehicle display device 100 ends the series of steps illustrated in FIG. 11.
  • When the image generator 112 determines in step S603 that the arrival time t is not greater than the predetermined second threshold value th2 (No in step S603), the image generator 112 determines in step S604 whether the arrival time t calculated in step S602 is greater than the predetermined first threshold value th1 (e.g., 3 seconds).
  • When the image generator 112 determines in step S604 that the arrival time t is greater than the predetermined first threshold value th1 (Yes in step S604), the environmental information acquiring unit 104 acquires the relative position information of the object (step S605). In step S606, the image generator 112 generates the implicit risk notification image.
  • In step S607, the image display unit 114 displays the implicit risk notification image generated in step S606 at a position corresponding to the relative position information acquired in step S605 (i.e., a position superimposed on the object) on the front windshield of the vehicle using the head-up display 120 (the optical system 230). Thereafter, the in-vehicle display device 100 ends the series of steps illustrated in FIG. 11.
  • When the image generator 112 determines in step S604 that the arrival time t is not greater than the predetermined first threshold value th1 (No in step S604), the image generator 112 generates an explicit risk notification image in step S608.
  • In step S609, the image display unit 114 displays the explicit risk notification image generated in step S608 at a predetermined position on the front windshield of the vehicle using the head-up display 120 (the optical system 230). Thereafter, the in-vehicle display device 100 ends the series of steps illustrated in FIG. 11.
  • As described above, the in-vehicle display device 100 displays neither the implicit risk notification image nor the explicit risk notification image when the arrival time t is greater than the predetermined second threshold value th2 (e.g., 5 seconds). Accordingly, the in-vehicle display device 100 according to the second embodiment can prevent the implicit and explicit risk notification images from being displayed so frequently that they become troublesome to the driver.
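Extending the earlier sketch with the second threshold th2 gives the following hypothetical rendering of the FIG. 11 flow (same assumptions: t = D / ΔV and an illustrative `hud` interface):

```python
# Sketch of the FIG. 11 flow (second embodiment): above th2 nothing is shown,
# between th1 and th2 the implicit image, at or below th1 the explicit image.
TH1_S, TH2_S = 3.0, 5.0  # example values from the text; th2 > th1

def process_frame_v2(relative_speed_mps: float, distance_m: float, hud) -> None:
    if relative_speed_mps <= 0:
        return
    t = distance_m / relative_speed_mps                              # S602
    if t > TH2_S:                                                    # S603: Yes
        return                                                       # no display at all
    if t > TH1_S:                                                    # S604: Yes
        hud.show_implicit_image(at=hud.object_relative_position())   # S605-S607
    else:                                                            # S604: No
        hud.show_explicit_image()                                    # S608-S609
```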
  • FIG. 12 is a diagram illustrating a functional configuration of an in-vehicle display device 100A according to the third embodiment of the present invention.
  • the in-vehicle display device 100A according to the third embodiment illustrated in FIG. 12 differs from the in-vehicle display device 100 according to the first embodiment in that the in-vehicle display device 100A further includes a driver information acquiring unit 106.
  • the driver information acquiring unit 106 acquires information associated with the driver's mental and physical condition (hereinafter referred to as "mental and physical condition information").
  • For example, the driver information acquiring unit 106 acquires, as the mental and physical condition information, electrocardiographic information and heart rate information detected by an electrocardiographic sensor, blood pressure information detected by a blood pressure sensor, body temperature information detected by a body temperature sensor, pulse information detected by a pulse sensor, respiratory information detected by a respiratory sensor, sweat information detected by a sweat sensor, blinking information detected by a blinking sensor, pupil information detected by a pupil sensor, electroencephalographic information detected by an electroencephalographic sensor, muscle information detected by a muscle sensor, and the like.
  • The in-vehicle display device 100A may display the mental and physical condition information acquired by the driver information acquiring unit 106 on the front windshield of the vehicle by the image display unit 114. Further, the in-vehicle display device 100A may dynamically change its processing logic in accordance with the acquired mental and physical condition information. For example, when the driver is determined to be in an alert state based on the mental and physical condition information, the in-vehicle display device 100A may reduce the animation effects of the implicit and explicit risk notification images so as to be less troublesome to the driver.
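As one hypothetical form of this state-dependent adjustment, the sketch below maps an alertness estimate to an animation level; the 0-to-1 score and the 0.7 threshold are invented for illustration and do not appear in the patent:

```python
# Sketch of driver-state-dependent display tuning: when the driver already
# appears alert, tone the animation down to avoid troubling the driver.
def animation_level(alertness: float) -> str:
    """alertness: assumed 0.0-1.0 score derived from the physiological sensors."""
    if alertness >= 0.7:   # driver judged to be in an alert state
        return "static"    # suppress blinking / brightness-change animation
    return "animated"      # keep full animation to attract attention

print(animation_level(0.9))  # -> static
print(animation_level(0.3))  # -> animated
```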
  • FIGS. 13A and 13B are photographs depicting examples of steering information displayed by a display system according to a vehicle speed and the like.
  • the fourth embodiment illustrates a configuration in which information is presented by a HUD (Head-Up Display) device acting as a display system.
  • this configuration is merely an example, and the fourth embodiment may also be applied to other display systems such as liquid crystal displays.
  • FIG. 13A is an example of steering information presented while approaching a curve
  • FIG. 13B is an example of steering information presented while traveling along the curve.
  • the display system presents information as a virtual image ahead of a front windshield (e.g., a few meters).
  • a display component 301 representing a vehicle speed and a display component 302 representing a steering angle are displayed.
  • the center of the display component 302 corresponds to the center of a neutral steering angle, and a desired steering angle (hereinafter, referred to as a "target steering angle 304") is indicated based on a current steering angle 303 and a current vehicle traveling condition.
  • a vehicle is traveling along a right-turn curve.
  • each of the current steering angle 303 and the target steering angle 304 indicates a right side of the center of the display component 302.
  • the target steering angle 304 is indicated such that the required steering angle increases as the distance from the center increases. Accordingly, the target steering angle 304, the current steering angle 303, and the difference between them are displayed as analog information in FIGS. 13A and 13B.
  • Each of the current steering angle 303 and the target steering angle 304 corresponds to a steering wheel angle on a one-to-one basis. However, the correspondence between the current or target steering angle and the steering wheel angle need not be technically strict.
  • the actual steering angle gradually increases as the vehicle travels along the curve, and the display system indicates the target steering angle 304 in real time based on the current vehicle's traveling condition.
  • the target steering angle 304 indicates by what amount the driver should actually turn the steering wheel.
  • the target steering angle 304 is calculated based on the vehicle speed, curvature of the curve, a desired yaw rate, and the like.
  • the target steering angle 304 can present a steering angle in which the amount of change in the lateral G-force (hereinafter referred to as "lateral G") experienced by a vehicle occupant is a predetermined amount or less.
  • When the current steering angle 303 and the target steering angle 304 are indicated by numerical values, the driver must convert the values into angular values and apply them to steering. In contrast, according to the fourth embodiment, the current steering angle 303 and the target steering angle 304 are displayed as analog information. Thus, the driver can easily understand by what amount to actually turn the steering wheel.
  • the target steering angle 304 is detected by the vehicle based on the shape of the curve ahead of the vehicle, whereas the current steering angle 303 is the actual steering angle. Accordingly, when the road shape differs between the frontward portion and the currently traveling portion of the road (e.g., the curvature of the curve increases or decreases), the current steering angle 303 and the target steering angle 304 deviate from each other. For example, when the curvature of the curve ahead of the vehicle is large, current steering angle 303 < target steering angle 304 is frequently obtained; when the curvature of the curve ahead of the vehicle is small, current steering angle 303 > target steering angle 304 is frequently obtained. Since the target steering angle 304 is displayed based on the road shape ahead of the vehicle, the driver can predict by what amount the driver is required to turn the steering wheel.
  • FIGS. 13A and 13B illustrate cases where the curve is a right-turn curve. However, in a case of a left-turn curve, each of the current steering angle 303 and the target steering angle 304 indicates a left side of the center of the display component 302.
  • the display component 302 has the shape of a semicircle or an arc corresponding to the upper half of a circle, and substantially covers the steering angle range required while the vehicle travels along a curve.
  • the current steering angle 303 and the target steering angle 304 can match the actual steering angle.
  • alternatively, the current steering angle 303 and the target steering angle 304 may be multiplied by a coefficient less than 1, and the scaled angles may be reflected on the display component 302.
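  • A minimal sketch of such a scaled analog display follows; the arc geometry and the coefficient value are assumptions for illustration, not values from the embodiment:

    import math

    # Map a steering angle to an (x, y) position on the semicircular
    # display component 302; scale (< 1) compresses large angles onto the arc.
    def angle_to_arc_xy(steering_deg, radius=100.0, scale=0.5):
        display_deg = steering_deg * scale
        theta = math.radians(90.0 - display_deg)  # 0 deg = top (neutral), positive = right
        return radius * math.cos(theta), radius * math.sin(theta)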
  • the display system according to the fourth embodiment sequentially presents steering angles in which the amount of change in the lateral G is a predetermined amount or less while the vehicle travels along a curve, thereby helping the driver handle the vehicle such that the occupants can ride comfortably.
  • the phrase "a moving body travels along a road" means that the moving body travels on a road on which white lines, curbs, guardrails, and the like are preferably, but not necessarily, formed.
  • Information associated with a target steering angle of a steering wheel is information that assists the driver in steering the steering wheel to an appropriate steering angle.
  • the moving body may be the one that travels on land, in the air, at sea, or in the sea, with an occupant on board.
  • a vehicle will be described as an example of a moving body.
  • the display system according to the fourth embodiment can also be installed in aircraft, ships, industrial robots, and the like.
  • a vehicle occupant is a person who sees or views information associated with a display system.
  • the driver may be the only occupant.
  • in some cases, no occupant drives the moving body.
  • FIG. 14 is a schematic diagram illustrating an example of a display system 1 installed in a vehicle.
  • the display system 1 includes a display controller 20 (an example of a display controller) and a display unit 10 (an example of a display unit).
  • the display system 1 is embedded inside a dashboard, and projects an image toward a front windshield 91 (a transparent member) from a light-emitting window 3 disposed on a surface of the display system 1.
  • the projected image is displayed as a virtual image I ahead of the front windshield 91.
  • a vehicle occupant V can see useful information for driving while keeping his or her eyes (with little line-of-sight movement) on a preceding vehicle and on a road ahead of the vehicle.
  • the display system 1 may be disposed so as to project an image onto the front windshield 91, and thus the display system 1 may be disposed on a ceiling, a sun visor, or the like, in addition to the dashboard.
  • the display system 1 may be a general-purpose information processing terminal or a HUD-dedicated terminal.
  • the HUD-dedicated terminal is simply referred to as a head-up display device, or as a navigation device when the head-up display device is integrated with a navigation device.
  • the HUD-dedicated terminal may also be referred to as PND (Portable Navigation Device).
  • the HUD-dedicated terminal may be referred to as a display audio (or a connected audio).
  • the display audio is a device that provides mainly audio-visual functions and communication functions without incorporating navigation functions.
  • General-purpose information processing terminals include, for example, smartphones, tablets, cellular phones, PDAs (Personal Digital Assistants), laptop PCs, and wearable PCs (e.g., wristwatch, sunglass, etc.).
  • General-purpose information processing terminals are not limited thereto, but may be any devices insofar as those devices have functions of a general information processing apparatus.
  • the general-purpose information processing terminals are normally used as an information processing apparatus for executing various applications. However, in a case of executing application software for a display system, the information processing apparatus displays information useful for driving in the same way as the HUD-dedicated terminal, for example.
  • the display system 1 according to the fourth embodiment may be capable of switching between an in-vehicle state and a portable state when the display system 1 serves as either the general-purpose information processing terminal or the HUD-dedicated terminal.
  • the display system 1 includes a display unit 10 and a display controller 20 as main elements.
  • as display methods of the display unit 10, a laser scanning type and a panel type are known.
  • the laser scanning type is a type in which a laser beam emitted from a laser light source is scanned by a two-dimensional scanning device to form an intermediate image (a real image projected onto a screen to be described later).
  • the panel type is a type in which an imaging device such as a liquid crystal panel, a DMD panel (digital mirror device panel), or a fluorescent display tube (VFD) forms an intermediate image.
  • the laser scanning type enables allocation of light emission and non-light emission to each pixel, thereby generally forming a high contrast image.
  • since high contrast provides better visibility, the laser scanning type is preferable; it allows vehicle occupants to visually recognize information with fewer attention resources than panel-type HUD devices.
  • in the panel type, light that cannot be completely shielded is projected onto areas having no information, and a display frame (square-shaped leak light) is projected onto the area in which the HUD displays an image (this effect is called the postcard effect).
  • the laser scanning type does not have this kind of effect, and only the content can be projected.
  • with AR (Augmented Reality), the realism of displaying a generated image superimposed on the real landscape is improved.
  • AR is interpreted as "augmented reality" and is a technology that virtually expands the world at hand by displaying images of objects that do not exist in the real landscape.
  • the panel-type HUDs may be applied insofar as such HUD devices are capable of displaying information in a manner visible with less attention resources.
  • information associated with steering of a steering wheel according to the fourth embodiment may be displayed on a transparent display disposed on a dashboard or the like (in this case, not a virtual image).
  • FIG. 15 is a diagram illustrating a configuration example of a display unit 10.
  • the display unit 10 mainly includes a light source unit 101, a light deflector 102, a mirror 103, a screen 104, and a concave mirror 105.
  • FIG. 15 merely illustrates the main elements; the display unit 10 may have elements other than those illustrated in FIG. 15, or may not have some of the elements illustrated in FIG. 15.
  • the light source unit 101 includes, for example, three laser light sources corresponding to RGB (hereinafter referred to as LDs: laser diodes), coupling lenses, apertures, composite elements, lenses, and the like.
  • the light source unit 101 synthesizes laser beams emitted from the three LDs and directs the synthesized laser beams toward a reflective surface of the light deflector 102.
  • the laser beams directed toward the reflective surface of the light deflector 102 are two-dimensionally deflected by the light deflector 102.
  • the light deflector 102 may be, for example, a single microscopic mirror that oscillates with respect to two orthogonal axes, or two microscopic mirrors that oscillate or pivot with respect to one axis.
  • the light deflector 102 may be, for example, a MEMS (Micro Electro Mechanical Systems) mirror fabricated by a semiconductor process or the like.
  • the light deflector 102 can be driven, for example, by an actuator that utilizes the deformation force of a piezoelectric element.
  • the light deflector 102 may be a galvanic mirror, a polygon mirror, or the like.
  • a laser beam two-dimensionally deflected by the light deflector 102 enters the mirror 103 and is turned back by the mirror 103 to render a two-dimensional image (an intermediate image) on the surface (scanned surface) of the screen 104.
  • a concave mirror may be used as the mirror 103, but a convex mirror or a planar mirror may also be used.
  • the configuration of deflecting the direction of the laser beams with the light deflector 102 and the mirror 103 enables the size of the display unit 10 to be reduced, or enables the arrangement of the elements to be flexibly changed.
  • the screen 104 is preferably a microlens array or a micro-mirror array having the function of diverging a laser beam at a desired divergence angle.
  • the screen 104 may be a diffuser plate for diffusing a laser beam, a transparent plate having a smooth surface, a reflector plate, or the like.
  • elements from the light source unit 101 to the screen 104 in FIG. 15 are referred to as a HUD device. However, other elements may be included in the HUD device.
  • the laser beam emitted from the screen 104 is reflected by the concave mirror 105 and projected to the front windshield 91.
  • the concave mirror 105 acts like a lens and functions to form an image at a predetermined focal length.
  • the image on the screen 104 serves as an object for the concave mirror 105, so that an image of the screen 104 is formed at a distance R2 determined by the focal length of the concave mirror 105.
  • a virtual image I is displayed at a distance R1 + R2 from the front windshield 91.
  • At least a portion of light flux to the front windshield 91 is reflected toward the viewpoint E of the vehicle occupant V.
  • the vehicle occupant V can see the virtual image I, which is an enlargement of the intermediate image on the screen 104, through the front windshield 91. That is, the virtual image I, the enlarged intermediate image, is displayed across the front windshield 91 as viewed from the vehicle occupant V.
  • the front windshield 91 is slightly curved rather than planar. Therefore, the position at which the virtual image I is formed is determined not only by the focal length of the concave mirror 105 but also by the curved surface of the front windshield 91, although the distance to the virtual image is still approximately R1 + R2 as described above. The distance R1 or R2 is elongated to form the virtual image I remotely so that the line-of-sight movement of the vehicle occupant V is reduced. As a method for increasing the distance R1, the optical path is lengthened by being turned back by a mirror; as a method for increasing the distance R2, the focal length of the concave mirror 105 is adjusted.
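  • For background, the role of the focal length can be made concrete with the standard thin-mirror equation (textbook optics under an idealized approximation, not a formula from the embodiment; it also ignores the windshield curvature discussed above):

    1/d_o + 1/d_i = 1/f

    With the screen 104 at an object distance d_o inside the focal length f of the concave mirror 105, the image distance d_i is negative, i.e., an enlarged virtual image forms at |d_i| = d_o × f / (f − d_o) behind the mirror, with magnification |d_i| / d_o. This is why adjusting the focal length of the concave mirror 105 adjusts the distance R2.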
  • since a horizontal line of the intermediate image is optically distorted upward or downward in a convex shape due to the effect of the front windshield 91, it is preferable that at least one of the mirror 103 or the concave mirror 105 be designed and disposed to compensate for the distortion. Alternatively, it is preferable that the projected image be corrected in consideration of the distortion.
  • a combiner may be disposed on the viewpoint E side of the front windshield 91. Irradiating the combiner with light from the concave mirror 105 can also present information as a virtual image I in the same manner as when the front windshield 91 is irradiated with light from the concave mirror 105.
  • FIG. 16 is a diagram illustrating a configuration example of an in-vehicle system 2 in which the display system 1 is installed in a moving body.
  • the in-vehicle system 2 includes a car navigation system 11, an engine ECU (Electronic Control Unit) 12, a display system 1, a brake ECU 13, a steering ECU 14, a recognition unit 15, and a detector 16 that communicate with each other via an in-vehicle network NW such as a CAN (Controller Area Network) bus.
  • the car navigation system 11 has a GNSS (Global Navigation Satellite System) represented by GPS, which detects a present location of a vehicle and displays the location of the vehicle on an electronic map. In addition, the car navigation system 11 receives inputs of the departure and destination locations, searches for routes from the departure to the destination and displays the search results on an electronic map, and guides an occupant of the vehicle in the direction of travel by audio, text (displayed on a display), animation, etc. before changing course.
  • the car navigation system 11 may communicate with a server via a cellular network or the like. In this case, the server can send an electronic map to the vehicle or perform a route search.
  • the engine ECU 12 controls the ideal fuel injection amount, the advance and retard of the ignition timing, and the valve actuation mechanism according to information from each sensor and the vehicle conditions. In addition, the engine ECU 12 determines the necessity of a gear shift by referring to a map in which shift lines of the transmission gears are set in relation to the current vehicle speed and the accelerator opening. The engine ECU 12 also performs acceleration and deceleration control when traveling to follow a preceding vehicle.
  • the vehicle may be powered by an electric motor, with or without an engine.
  • the brake ECU 13 controls braking force of each wheel of the vehicle without any brake pedal operation by the vehicle occupant, such as ABS (Antilock Braking System), braking control while traveling to follow the preceding vehicle, automatic braking based on TTC (Time To Collision) with obstacles, and stopped state maintenance control when starting on a slope.
  • the steering ECU 14 detects a steering direction and a steering angle of the steering wheel operated by the occupant of the vehicle and performs power steering control for applying steering torque in a steering direction. In addition, the steering ECU 14 performs steering in the direction of avoiding deviation from the traveling lane, in the direction of maintaining the traveling at the center of the traveling lane, or in the direction of avoiding approaching obstacles, without steering wheel operation by the occupant of the vehicle.
  • the detector 16 includes a variety of sensors configured to detect obstacles around the vehicle.
  • the recognition unit 15 recognizes the shape of the road, such as white lines, from an image captured ahead of the vehicle, performs object recognition that identifies an obstacle detected by the detector 16, and recognizes the position (direction and distance) of the obstacle relative to the vehicle. Information such as the vehicle speed and the road shape is entered into the display system 1.
  • FIG. 17 is a diagram illustrating a configuration example of the detector 16 of the in-vehicle system 2.
  • the detector 16 includes a vehicle speed sensor 161 configured to detect a vehicle speed displayed by the display system 1, a vehicle information sensor 162 configured to acquire vehicle information displayed by the display system 1, a radar sensor 163 and a surround view camera 164 each configured to detect an obstacle, an occupant status information sensor 165 configured to acquire occupant information that is information associated with an occupant's condition, a VICS (registered trademark) receiving device 166 configured to receive traffic jam information (VICS: Vehicle Information and Communication System Center), and an external communication device 167 connected to the Internet.
  • the sensors of the detector 16 do not need to be integrated with the detector 16, but may be installed in the vehicle.
  • the vehicle speed sensor 161 detects, for example, a magnet that rotates with the rotation of a shaft of the drivetrain system, via a sensor unit fixed to the body, and generates a pulse wave proportional to the rotation speed. The vehicle speed can be detected from the number of pulses per unit time.
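  • A minimal sketch of this pulse-counting computation (the calibration values are hypothetical, and the pulse wheel is assumed to turn proportionally to the tires):

    # Derive vehicle speed from the pulse train described above.
    def vehicle_speed_kmh(pulse_count, dt_s, pulses_per_rev=4, tire_circumference_m=1.9):
        revs_per_s = pulse_count / pulses_per_rev / dt_s
        return revs_per_s * tire_circumference_m * 3.6  # m/s -> km/h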
  • the vehicle information sensor 162 includes one or more sensors configured to detect vehicle information other than the vehicle speed sensor 161. Examples of such sensors include fuel meter sensors, shift lever position sensors, odometers, trip meters, turn signal sensors, and water temperature sensors. These sensors may each have a general configuration to acquire vehicle information.
  • the fuel meter sensor detects current remaining fuel.
  • the shift lever position sensor detects a position of a shift lever operated by the occupant of the vehicle.
  • the odometer accumulates the distance traveled by the vehicle and provides the total distance traveled.
  • the trip meter provides a block distance traveled by the vehicle from the time of initialization performed by the occupant of the vehicle to the present.
  • the turn signal (winker) sensor detects the direction of the turn signal operated by the occupant of the vehicle.
  • the water temperature sensor detects engine cooling water temperature.
  • the above items are merely examples of information that can be obtained from a vehicle.
  • Other information that can be obtained from a vehicle may also be used as vehicle information.
  • for example, in the case of an electric vehicle or a hybrid vehicle, a remaining battery amount, a regenerated power amount, or a power consumption amount may be obtained as vehicle information.
  • the surround view camera 164 is an imaging device that captures the perimeter of the vehicle.
  • the surround view camera 164 is preferably located so as to image the front, both sides, and the rear of the vehicle.
  • at a minimum, the surround view camera 164 is located so as to image the front of the vehicle.
  • the surround view camera 164 is attached on or near the back of the rearview mirror such that an optical axis is directed toward a horizontal direction ahead of the vehicle.
  • the surround view camera 164 may be located in the left rear corner, right rear corner, and rearward of the roof and bumper of the vehicle.
  • An imaging device located at the rear of the vehicle is called a back monitor, but the surround view camera 164 at the rear of the vehicle is not limited to the back monitor.
  • the surround view camera 164 may be disposed on side mirrors, pillars, side portions of the roof, or doors.
  • the surround view camera 164 may be a monocular camera or a stereo camera. In the case of a monocular camera or a stereo camera capable of obtaining distance information, no radar sensor 163 is required. However, when the radar sensor 163 is provided in addition to the surround view camera 164 configured to acquire distance information, the distance information from the surround view camera 164 and the distance information from the radar sensor 163 can be fused (integrated) to compensate for each other's disadvantages and obtain highly accurate distance information. Note that a sonic sensor (ultrasonic sensor) or the like may be provided in addition to the radar sensor 163 and the surround view camera 164.
  • the radar sensor 163 transmits radar waves around the vehicle, such as ahead of, to the sides of, and to the rear of the vehicle, and receives the radar waves reflected back by an object.
  • the radar sensor 163 may be located such that obstacles around the vehicle can be detected in the same way as the surround view camera.
  • the radar sensor 163 employs a TOF (Time of Flight) technique in which a distance to an object is detected according to a time from transmission to reception, and a direction of an object is detected according to an illumination direction of the radar.
  • LIDAR (Light Detection and Ranging; also Laser Imaging, Detection, and Ranging) is known as a radar sensor employing the TOF technique. Radar sensors employing FMCW (Frequency Modulation Continuous Wave) or FCM (Frequency Chirp Modulation) techniques are also known.
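  • The TOF relation itself is simple; as a sketch (the constant applies to light-based sensors such as LIDAR; an ultrasonic sensor would use the speed of sound instead):

    C_M_PER_S = 299_792_458.0  # speed of light

    # Distance from the round-trip time between transmission and reception.
    def tof_distance_m(round_trip_s):
        return C_M_PER_S * round_trip_s / 2.0  # halve: the wave travels out and back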
  • the occupant status information sensor 165 is a sensor configured to detect occupant status information directly or indirectly from an occupant of a vehicle.
  • a typical example of the occupant status information sensor 165 is a face camera. The face camera captures and authenticates the face of an occupant of the vehicle and specifies or identifies the occupant of the vehicle.
  • the occupant status information sensor 165 can detect the face direction and the line of sight direction from the face image.
  • the occupant status information sensor 165 may include an electrocardiogram sensor, a heart rate sensor, a blood pressure sensor, a body temperature sensor, a pulse sensor, a respiration sensor, a perspiration sensor, a blinking sensor, a pupil sensor, an electroencephalogram sensor, or a myoelectric potential sensor.
  • the occupant status information sensor 165 may have a configuration worn by an occupant of the vehicle, such as a wristwatch-type wearable terminal (smartwatch).
  • the VICS receiving device 166 receives radio waves delivered by VICS.
  • VICS is a system that transmits traffic information, such as traffic congestion and traffic restrictions, to the in-vehicle device in real time using FM multiplex broadcasts or beacons.
  • the external communication device 167 connects to the Internet or the like via a network such as 3G, 4G, 5G, LTE, or a wireless LAN, and receives various information.
  • the external communication device 167 can receive weather information such as rain, snow and fog.
  • the external communication device 167 may also receive news, music, videos, etc.
  • the external communication device 167 can acquire, for example, traffic light state information and the time until the traffic light changes.
  • the VICS receiving device 166 and the external communication device 167 may perform roadside-to-vehicle communication.
  • the external communication device 167 may acquire information detected by another vehicle 6 via vehicle-to-vehicle communication.
  • the advanced driving support system not only displays information and provides warnings, but may also control the vehicle.
  • the control ECU links the engine ECU 12, the brake ECU 13, and the steering ECU 14 to provide various driving assistance based on the distance information associated with obstacles detected by at least one of the radar sensors 163 or the surround view camera 164.
  • the control ECU performs acceleration/deceleration control when traveling to follow a preceding vehicle, automatic braking, avoidance of deviation from the traveling lane, lane-keeping traveling, and steering to avoid obstacles.
  • the recognition unit 15 recognizes a road sign, road paint such as a white line, or the like, from the image data imaged by the imaging device configured to capture a scenery ahead of the vehicle.
  • the control ECU controls the power and braking power to maintain a target distance according to the vehicle speed.
  • Automatic braking includes a warning, an indication to push the brake pedal down, tightening of the seat belt, and braking to avoid a collision in the case of a high possibility of collision, which are performed according to the TTC.
  • a driving assist ECU 36 (not illustrated) recognizes a white line (travel segment line) from the image data and applies steering torque in the direction opposite to the direction of deviation from the traveling lane.
  • the center of the traveling lane is set as the target traveling line, and the steering torque proportional to the deviation from the target traveling line is applied in the direction opposite to the deviation direction.
  • a traveling line for avoiding the obstacle is determined, and a steering torque for travelling in the determined traveling line is applied.
  • FIG. 18 is a diagram illustrating a hardware configuration example of the display controller 20.
  • the display controller 20 includes an FPGA 201, a CPU 202, a ROM 203, a RAM 204, an I/F 205, a bus line 206, an LD driver 207, and a MEMS controller 208.
  • the FPGA 201, CPU 202, ROM 203, RAM 204, and I/F 205 are interconnected via the bus line 206.
  • the CPU 202 controls each of the functions of the display controller 20.
  • the ROM 203 stores the program 203p executed by the CPU 202 to control each of the functions of the display controller 20.
  • the program 203p is loaded into the RAM 204, and the CPU 202 uses the RAM 204 as a work area for executing the program 203p.
  • the RAM 204 also includes an image memory 209.
  • the image memory 209 is used to generate an image that is projected as virtual image I.
  • the I/F 205 is an interface for communicating with the recognition unit 15 and the detector 16, and the I/F 205 is connected to, for example, a CAN (Controller Area Network) bus or an Ethernet (registered trademark) of a vehicle.
  • the FPGA 201 controls the LD driver 207 based on an image created by the CPU 202.
  • the LD driver 207 drives the LD of the light source unit 101 of the display unit 10 to control the emission of the LD according to the image.
  • the FPGA 201 operates the light deflector 102 of the display unit 10 through the MEMS controller 208 such that the laser beam is deflected in a direction corresponding to a pixel position of the image.

Function of Display Controller
  • FIG. 19 is an example of a functional block diagram illustrating functions of the recognition unit 15 and the display controller 20 in blocks.
  • the recognition unit 15 includes a vehicle speed acquiring unit 31, a road shape estimator 32, and a steering angle determining unit 33.
  • the vehicle speed acquiring unit 31 preferably acquires a current vehicle speed from the detector 16 periodically.
  • the road shape estimator 32 acquires a road shape in a traveling direction of the vehicle.
  • when the recognition unit 15 recognizes a white line using a stereo camera or a laser, the recognition unit 15 obtains three-dimensional coordinates of the white line, so that the curvature of the curve can be estimated from the coordinates of the white line.
  • FIGS. 20A and 20B are diagrams illustrating a method for estimating curvature of a curve.
  • FIG. 20A schematically illustrates recognized white lines and FIG. 20B illustrates coordinates of the white lines in an XZ plane.
  • the XZ plane corresponds to a road surface.
  • the road shape estimator 32 detects white line coordinates Sn (where n is a natural number) at predetermined intervals from the internal edges or the like of the right and left white lines. Then, the coordinates of the center between the left and right white line coordinates Sn are calculated, and the center coordinates are graphed as illustrated in FIG. 20B.
  • FIG. 20B is merely a descriptive diagram, and a transformation to a graph may not necessarily be applied in actuality.
  • the road shape estimator 32 determines the radius of a circle by fitting the white line coordinates Sn to the equation of a circle.
  • the radius (or curvature) of a frontward curve can be estimated by periodically repeating such a process.
  • the white line coordinates Sn may be fitted to clothoid curves.
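  • A minimal sketch of the circle-fitting step using an algebraic least-squares (Kasa) fit; this is one standard way to fit a circle and is an assumption, since the embodiment does not specify the fitting algorithm:

    import numpy as np

    # pts: (N, 2) array of lane-center coordinates Sn in the XZ road plane.
    def fit_circle_curvature(pts):
        x, z = pts[:, 0], pts[:, 1]
        # Fit x^2 + z^2 + D*x + E*z + F = 0 by linear least squares.
        A = np.column_stack([x, z, np.ones_like(x)])
        b = -(x**2 + z**2)
        (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
        cx, cz = -D / 2.0, -E / 2.0
        r = np.sqrt(cx**2 + cz**2 - F)  # radius of the fitted circle
        return 1.0 / r                  # curvature k = 1/r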
  • the shape of a road, such as the radius of a curve, may be obtained from the car navigation system 11.
  • the display controller 20 estimates the road shape from the white lines, but the road shape may be estimated based on features and structures associated with the road shape, such as curbs and guardrails.
  • how far ahead the radius (or curvature) of the frontward curve is estimated may be predetermined or may vary with the vehicle speed. For example, when 3 seconds are required for the driver to change from the current steering angle to the target steering angle with a sufficient margin, how far ahead to find the radius of the frontward curve is determined by "vehicle speed v × 3 seconds".
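  • In code form, this lookahead rule is simply (the 3-second allowance is the example value from the text):

    # How far ahead to estimate the curve radius, per "vehicle speed v x 3 seconds".
    def lookahead_m(speed_mps, allowance_s=3.0):
        return speed_mps * allowance_s  # e.g., 20 m/s * 3 s = 60 m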
  • the steering angle determining unit 33 determines the target steering angle based on the vehicle speed of the vehicle and the estimated road shape.
  • Various ways of determining the correspondence between vehicle speed, road shape, and steering angle have been proposed. The following is an example.
  • FIG. 21 is a diagram illustrating a method of determining a target steering angle.
  • a yaw rate ω has a relationship with a vehicle speed v and the curvature k of a curve. This relationship is represented by the following equation (1).
  • ω = v × k ... (1)
  • the radius r of the curve and the curvature k have a relationship represented by the following equation (2).
  • r = 1/k ... (2)
  • the steering angle determining unit 33 transmits the target steering angle determined by the equation (5) to the image generator 34 of the display controller 20.
  • the preferred yaw rate can be experimentally determined by the vehicle manufacturer, etc. such that the lateral G has a predetermined amount of change or less.
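  • Equations (3) to (5) are not reproduced in this excerpt. One plausible reconstruction, assuming a kinematic bicycle model with wheelbase L and steering gear ratio n (neither of which is stated in the original), is:

    ω = v × tan(δ) / L ... (3)
    δ = arctan(L × k) ... (4)
    δ_sw = n × δ ... (5)

    Here δ is the road-wheel steering angle and δ_sw is the steering wheel angle; combining equations (1) and (3) gives equation (4). On this reading, the preferred yaw rate ω caps the curvature that may be commanded at k = ω / v, which keeps the change in the lateral G within the predetermined bound.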
  • the display controller 20 includes an image generator 34 and an output unit 35. These functions of the display controller 20 are implemented by the CPU 202 executing a program 203p, which is loaded from a ROM 203 of the display controller 20 to a RAM 204.
  • the image generator 34 applies the current steering angle 303 and the target steering angle 304 to the display component 302, and generates an image for causing the display unit 10 to project the display component 302.
  • An example of this display component 302 has been illustrated in FIGS. 13A and 13B. Since the display unit 10 projects an image in color, the image generator 34 can change the color of the current steering angle 303 and the target steering angle 304 to generate an image for facilitating understanding of the driver.
  • the output unit 35 outputs the display component representing the current steering angle 303 and the target steering angle 304 generated by the image generator 34. That is, the output unit 35 controls the LD driver 207 and the MEMS controller 208 so as to display the display component generated by the image generator 34 on the display unit 10.

Examples of Display Indicating Current and Target Steering Angles
  • FIGS. 22A to 22C are diagrams each illustrating a display example of the current steering angle 303 and the target steering angle 304.
  • the display component 310 of FIG. 22A has a shape that imitates a shape of a steering wheel.
  • FIG. 22A is black and white; however, a steering icon with a low brightness (or predetermined color) indicates a current steering angle 303, and a steering icon with a high brightness (or different color from the former) indicates a target steering angle 304.
  • the rotation angle of the steering icon matches the actual steering angle.
  • the current steering angle 303 may not be required. This also applies to FIGS. 13A and 13B.
  • a user may steer such that a low brightness steering icon is superimposed on a high brightness steering icon. This facilitates following and maintaining the appropriate steering angle.
  • FIG. 22B illustrates the display component 302 similar to that of FIGS. 13A and 13B.
  • when the difference between the current steering angle 303 and the target steering angle 304 is equal to or greater than a threshold value, the target steering angle 304 is displayed with emphasis by the image generator 34.
  • for example, when the difference between the current steering angle 303 and the target steering angle 304 is equal to or greater than the threshold value, the image generator 34 generates an image of a display component in which the target steering angle 304 blinks. That is, an image of the display component in which the brightness and color of the target steering angle 304 change over a short time is generated.
  • alternatively, the target steering angle 304 is formed in a color with higher alertness. For example, when the difference between the current steering angle 303 and the target steering angle 304 is less than the threshold value, the target steering angle 304 is displayed in yellow, and when the difference is greater than or equal to the threshold value, the target steering angle 304 is displayed in red.
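  • As a sketch, the emphasis rule reads as follows (the threshold value and style attributes are hypothetical placeholders):

    # Choose the style of the target steering angle marker from the angle difference.
    def target_marker_style(current_deg, target_deg, threshold_deg=10.0):
        if abs(current_deg - target_deg) >= threshold_deg:
            return {"color": "red", "blink": True}   # emphasized, higher alertness
        return {"color": "yellow", "blink": False}   # normal display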
  • the target steering angle 304 may also be indicated as larger or smaller than the actual steering angle. In this way, an excessive or insufficient steering amount can be emphasized.
  • the image generator 34 monitors the difference between the actual vehicle steering angle and the target steering angle 304 to understand the driver's steering tendency.
  • when the driver has a steering tendency to steer larger, the target steering angle 304 is displayed smaller than the actual steering angle, and when the driver has a steering tendency to steer smaller, the target steering angle 304 is displayed larger than the actual steering angle. In this way, the target steering angle 304 can be displayed according to the driver's steering tendency.
  • FIG. 22C illustrates the display component 302 when the current steering angle 303 > the target steering angle 304.
  • when the current steering angle 303 is larger than the target steering angle 304, the image generator 34 can inform the driver that it is desirable to reduce the steering wheel angle in order to travel along the curve ahead of the vehicle.
  • FIG. 23 is a diagram illustrating a relationship between an actual steering angle of a vehicle and a target steering angle 304.
  • the vehicle is about to travel along the right-turn curve.
  • the value of t seconds is determined as the number of seconds after which the steering angle should reach the target steering angle 304.
  • the image generator 34 stores the target steering angle 304 and the actual steering angle after t seconds to detect the driver's steering tendency. That is, the image generator 34 detects the difference between the target steering angle and the actual steering angle after t seconds, and generates an image having a target steering angle to which the difference is added or from which the difference is subtracted.
  • Target steering angle 304 - Actual steering angle after t seconds + ⁇ degrees.
  • in this case, since the driver has a steering tendency to steer smaller, the image generator 34 generates an image of the display component 302 in which the target steering angle 304 + α degrees is presented as the target steering angle 304.
  • Target steering angle 304 - Actual steering angle after t seconds - ⁇ degrees.
  • in this case, since the driver has a steering tendency to steer larger, the image generator 34 generates an image of the display component 302 in which the target steering angle 304 − α degrees is presented as the target steering angle 304.
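  • A minimal sketch of this tendency compensation (the sign convention follows the relations above; the variable names are hypothetical):

    # alpha: shortfall of the actual steering angle after t seconds
    # relative to the displayed target.
    def displayed_target(target_deg, actual_after_t_deg):
        alpha = target_deg - actual_after_t_deg
        # alpha > 0: driver tends to steer smaller -> display a larger target.
        # alpha < 0: driver tends to steer larger  -> display a smaller target.
        return target_deg + alpha

Operation Procedure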
  • FIG. 24 is a flowchart illustrating an example of a procedure in which the display controller 20 presents the current steering angle 303 and the target steering angle 304. The process of FIG. 24 is performed repeatedly while the ignition of the vehicle is ON or while the system is ON.
  • the detector 16 detects a current steering angle 303 (S1).
  • the vehicle speed acquiring unit 31 detects a vehicle speed (S2).
  • the road shape estimator 32 estimates a road shape ahead of the vehicle (S3). For example, a curvature and a turning direction of the curve are detected.
  • the steering angle determining unit 33 determines the target steering angle 304 from the vehicle speed, the curvature of the curve, and a preferred yaw rate (S4).
  • the image generator 34 generates an image of a display component 302 including a current steering angle 303 and a target steering angle 304 (S5). Since the output unit 35 outputs an image to the display unit 10, the display unit 10 projects the image including the display component 302 (S6).
  • the display component 302 is projected onto the windshield, and the current steering angle 303 and the target steering angle 304 are presented to the driver as virtual images.
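  • Tying the steps together, a hypothetical main loop corresponding to S1 to S6 of FIG. 24 might look like this (all object and method names are assumed for illustration; they are not APIs from the embodiment):

    # One display cycle, mirroring S1-S6.
    def display_cycle(detector, recognizer, image_gen, output):
        current = detector.current_steering_angle()           # S1
        v = recognizer.vehicle_speed()                        # S2
        k, direction = recognizer.estimate_road_shape()       # S3
        target = recognizer.determine_target_angle(v, k)      # S4
        frame = image_gen.render(current, target, direction)  # S5
        output.project(frame)                                 # S6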
  • the display system sequentially displays a target steering angle in which the amount of change in the lateral G is a predetermined amount or less while the vehicle travels along the curve. This facilitates the driver's steering so that the occupants can ride comfortably.
  • the fourth embodiment has illustrated the target steering angle while driving on the road; however, the fourth embodiment may illustrate the target steering angle when parking.
  • the recognition unit 15 recognizes the parking frame.
  • when the parking frame is not formed by white lines or the like, a space is recognized by an ultrasonic sensor or the like, and the recognition unit 15 determines a parking frame.
  • the recognition unit 15 sets a moving line to the parking frame, and the display system displays a steering angle that moves along the moving line.
  • the display system 1 may be a liquid crystal display or the like. Further, the display controller 20 and the display unit 10 may be separated from each other, and the housings may be separately distributed.
  • a smartphone may be used as the display controller 20, and information may be displayed on the display built into the smartphone, or a virtual image may be displayed on a combiner.
  • the information displayed by the display system 1 is not limited to the vehicle speed, and information that can be obtained from the inside of the vehicle or information that can be obtained from the outside of the vehicle via the Internet can be displayed.
  • the processing described above is performed by the display system 1 installed in the vehicle, but part of or all the processing may be performed by a server that communicates with the vehicle via a network.
  • for example, the vehicle can transmit a frontward image to the server, and the server can determine at least one of a road shape or a target steering angle and return it to the vehicle. In this way, the processing load on the display system 1 installed in the vehicle can be reduced.
  • the configuration example of FIG. 19 is divided according to main functions to facilitate understanding of the processing performed by the display controller 20.
  • the invention of the present application is not limited by the dividing method of processing units or by the names.
  • the processing of the display controller 20 may be divided into more processing units according to the processing contents.
  • alternatively, one processing unit of the display controller 20 may include a larger number of processing operations.
  • the term "processing circuit" includes a processor programmed to perform each function by software, such as a processor implemented by an electronic circuit, as well as devices designed to perform the functions described above, such as an ASIC (Application Specific Integrated Circuit), a DSP (digital signal processor), an FPGA (field programmable gate array), or a conventional circuit module.


Abstract

Disclosed is an in-vehicle display device installed in a vehicle. The in-vehicle display device includes a calculator configured to calculate a distance from the vehicle to an object; an image generator configured to generate a first image to attract attention to the object by using a character string alone or by using the character string and a graphic, in response to the distance calculated by the calculator being a predetermined first threshold value or less, or generate a second image to attract attention to the object by using the graphic alone, in response to the distance calculated by the calculator being greater than the first threshold value; and a display unit configured to display the first image or the second image, the first image or the second image being generated by the image generator using a display device.

Description

IN-VEHICLE DISPLAY DEVICE AND PROGRAM
The disclosure discussed herein relates to an in-vehicle display device and a program.
The technique disclosed in Patent Document 1 below predicts an arrival time to reach an obstacle in order to reduce the troublesomeness to a driver. In this technique, when the predicted arrival time to reach an obstacle is 2 seconds or less, an alarm immediately goes off, and when the predicted arrival time exceeds 2 seconds, a display alarm is presented.

[PTL 1]  Japanese Unexamined Patent Publication No. 2001-023094
[PTL 2]  Japanese Patent No. 5177105
However, the technique disclosed in Patent Document 1 is unable to present an attention attracting display when the predicted arrival time to reach an obstacle is 2 seconds or less. Since the technology disclosed in Patent Document 1 merely generates an alarm sound when the predicted arrival time to reach an obstacle is 2 seconds or less, a driver fails to understand the type of the obstacle and the action to be taken by the driver. Thus, the related art technique fails to present a more appropriate attention attracting display to a driver of a vehicle.
An object of the present invention is to enable a more appropriate attention attracting display to a driver of a vehicle.
According to one aspect of embodiments, an in-vehicle display device installed in a vehicle is provided. The in-vehicle display device includes a calculator configured to calculate a distance from the vehicle to an object; an image generator configured to generate a first image to attract attention to the object by using a character string alone or by using the character string and a graphic, in response to the distance calculated by the calculator being a predetermined first threshold value or less, or generate a second image to attract attention to the object by using the graphic alone, in response to the distance calculated by the calculator being greater than the first threshold value; and a display unit configured to display the first image or the second image, the first image or the second image being generated by the image generator using a display device.
Advantageous Effect of the Invention
In accordance with the embodiments of the present invention, a more appropriate attention attracting display can be provided to a driver of a vehicle.

FIG. 1 is a diagram illustrating an example of a virtual image displayed on a front windshield of an in-vehicle display device according to a first embodiment of the present invention. FIG. 2 is a diagram schematically illustrating an example of an internal arrangement of a vehicle provided with the in-vehicle display device according to a first embodiment of the present invention. FIG. 3 is a diagram schematically illustrating a configuration example of an optical system included in the in-vehicle display device according to the first embodiment of the present invention. FIG. 4 is a diagram illustrating a configuration example of a control system included in the in-vehicle display device according to the first embodiment of the present invention. FIG. 5 is a diagram illustrating a schematic configuration example of the in-vehicle display device according to the first embodiment of the present invention and peripheral devices. FIG. 6 is a diagram illustrating a functional configuration of the in-vehicle display device according to the first embodiment of the present invention. FIG. 7 is a flowchart illustrating a procedure of a process performed by the in-vehicle display device according to the first embodiment of the invention. FIG. 8 is a diagram illustrating an example of an explicit risk notification image displayed by the in-vehicle display device according to the first embodiment of the present invention. FIG. 9 is a diagram illustrating an example of an explicit risk notification image displayed by the in-vehicle display device according to the first embodiment of the present invention. FIG. 10 is a diagram illustrating an example of an implicit risk notification image displayed by the in-vehicle display device according to the first embodiment of the present invention. FIG. 11 is a flowchart illustrating a procedure of a process performed by the in-vehicle display device according to a second embodiment of the present invention. FIG. 12 is a diagram illustrating a functional configuration of the in-vehicle display device according to a third embodiment of the present invention. FIG. 13A is a photograph depicting an example of information associated with a steering operation displayed by a display system according to a vehicle speed, and the like. FIG. 13B is a photograph depicting an example of information associated with a steering operation displayed by the display system according to a vehicle speed, and the like. FIG. 14 is a diagram illustrating an example of the display system installed in a vehicle. FIG. 15 is a diagram illustrating an example of a structure of a display unit. FIG. 16 is a configuration diagram illustrating an example of an in-vehicle system in which the display system is installed in a moving body. FIG. 17 is a diagram illustrating a configuration example of a detector in the in-vehicle system. FIG. 18 is a diagram illustrating a hardware configuration example of a display controller. FIG. 19 is a functional block diagram illustrating examples of functions of a recognition unit and the display controller in blocks. FIG. 20A is a diagram illustrating a method of estimating curvature of a curve. FIG. 20B is a diagram illustrating a method of estimating curvature of a curve. FIG. 21 is a diagram illustrating a method of determining a target steering angle. FIG. 22A is a diagram illustrating an example of a display indicating a current steering angle and a target steering angle. FIG. 22B is a diagram illustrating an example of a display indicating a current steering angle and a target steering angle. FIG. 22C is a diagram illustrating an example of a display indicating a current steering angle and a target steering angle. FIG. 23 is a diagram illustrating a relationship between an actual steering angle of a vehicle and a target steering angle. FIG. 24 is a flowchart illustrating an example of a procedure in which the display controller presents a current steering angle and a target steering angle.
Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings.
FIRST EMBODIMENT
FIG. 1 is a diagram illustrating an example of a virtual image on a front windshield 21 displayed by an in-vehicle display device 100 according to a first embodiment of the present invention. FIG. 2 is a diagram schematically illustrating an example of an internal arrangement of a vehicle 30 provided with an in-vehicle display device 100 according to the first embodiment of the present invention.
FIG. 3 is a diagram schematically illustrating a configuration example of an optical system 230 included in the in-vehicle display device 100 according to the first embodiment of the present invention.
In FIG. 2, the in-vehicle display device 100 according to the present embodiment is installed inside a dashboard of the vehicle 30. The vehicle 30 is a traveling object acting as an example of a moving object. Projection light L, which is image light emitted from the in-vehicle display device 100 inside the dashboard, is reflected by a front windshield 21 acting as a light transmitting member, and the reflected light is then transmitted toward a driver 20 acting as a viewer. Accordingly, the driver 20 can view a route navigation image or the like as a virtual image G, which will be described later. Note that a combiner acting as a light transmitting member may be disposed on an inner wall of the front windshield 21 so as to allow the driver 20 to see the virtual image G via the projection light L reflected by the combiner.
A frontward monitoring camera 22 and a driver monitoring camera 23 are disposed on an upper part of the front windshield 21. The frontward monitoring camera 22 captures a frontward scenery ahead of the vehicle. The frontward scenery monitored by the frontward monitoring camera 22 includes display information displayed by the in-vehicle display device 100 and the background of the display information on the front windshield 21. That is, the frontward monitoring camera 22 captures the display information displayed by the in-vehicle display device 100 reflected on the front windshield 21, and also captures the background of the display information across the front windshield 21. The background captured by the frontward monitoring camera 22 across the front windshield 21 is a frontward environment (i.e., a preceding vehicle, a road surface, etc.) of the vehicle 30. The driver monitoring camera 23 monitors the driver 20 in order to detect a viewpoint position of the driver 20.
According to the first embodiment, the optical system 230 or the like of the in-vehicle display device 100 is designed such that the distance from the driver 20 to the virtual image G is 5 m or more. When the distance from the driver 20 to the virtual image G is 5 m or more, the moving amount of the lens of the eye of the driver 20 is reduced, and the time to focus on the virtual image G is shortened so that the driver 20 can recognize the contents of the virtual image G quickly. In this condition, fatigue of the driver's eyeballs is also reduced, and the driver 20 can more easily notice the contents of the virtual image G. Thus, the virtual image G can facilitate the appropriate provision of information to the driver 20. Further, when the distance from the driver 20 to the virtual image G is 5 m or more, the driver 20 can focus on the virtual image G with little convergent movement, so that a reduction, caused by convergent movement, in the effect of perceiving distance (change in perceptual distance) and depth (difference in perceptual distance) using motion parallax is prevented. Thus, the information perception effect of the driver 20 that utilizes the distance or depth of an image can be effectively exerted.
The optical system 230 included in the in-vehicle display device 100 illustrated in FIG. 3 includes red, green and blue laser light sources 201R, 201G and 201B, collimator lenses 202, 203 and 204 disposed for the respective laser light sources, and two dichroic mirrors 205 and 206. The optical system 230 further includes a light amount adjusting unit 207, an optical scanning device 208 as an optical scanner, a free-form mirror 209, a microlens array 210 as a light emitting member, and a projection mirror 211 as a light reflecting member. A light source unit 220 includes the laser light sources 201R, 201G and 201B, the collimator lenses 202, 203 and 204, and the dichroic mirrors 205 and 206, which are unitized by an optical housing.
The laser light sources 201R, 201G and 201B can utilize LD (semiconductor laser elements). The wavelength of light flux emitted from the red laser light source 201R is, for example, 640 nm, the wavelength of light flux emitted from the green laser light source 201G is, for example, 530 nm, and the wavelength of light flux emitted from the blue laser light source 201B is, for example, 445 nm.
In the in-vehicle display device 100 according to the first embodiment, an intermediate image formed on the microlens array 210 is projected onto the front windshield 21 of the vehicle 30, so that the driver 20 sees an enlarged image as the virtual image G. Laser light of each color emitted from the laser light sources 201R, 201G and 201B is substantially collimated by the collimator lenses 202, 203 and 204, respectively, and is then combined by the two dichroic mirrors 205 and 206. The combined laser light is adjusted by the light amount adjusting unit 207, is deflected by the mirror of the optical scanning device 208 and, as scanning light L', two-dimensionally scans the free-form mirror 209. The scanning light L' is reflected by the free-form mirror 209 to be compensated for distortion, and is then collected onto the microlens array 210 to render an intermediate image G' on the microlens array 210.
In the first embodiment, the microlens array 210 is used as a light emitting member that individually diverges light flux of each pixel (one point of the intermediate image) of the intermediate image G' to emit light, but other light emitting members may be used. The intermediate image G' may be formed by using a liquid crystal display (LCD) or a fluorescent display tube (VFD). However, in order to display a large virtual image G with high brightness, a laser scanning type is preferable as used in the first embodiment.
According to the laser scanning type as used in the first embodiment, regarding a non-image portion within a virtual image G display area, turning off the laser light sources 201R, 201G, and 201B will completely prevent the non-image portion from being illuminated. Accordingly, it is possible to avoid degradation in viewability of the frontward scenery of the vehicle 30, which is caused by light illuminated from the in-vehicle display device 100 to be applied through the non-image portion. Thus, the viewability of the frontward scenery can further be improved. Moreover, the laser scanning type is preferable to perform display control for partially increasing the brightness of a portion of the image included in the virtual image G displayed by the in-vehicle display device 100.
The optical scanning device 208 tilts the mirror in the main and sub-scanning directions with a known actuator drive system, such as a MEMS (Micro Electro Mechanical Systems), and deflects the laser light incident on the mirror to two-dimensionally scan (raster-scan) the free-form mirror 209. The drive control of the mirror is performed in synchronization with the emission timings of the laser light sources 201R, 201G and 201B. The optical scanning device 208 is not limited to the above-described configuration. For example, the optical scanning device 208 may be a mirror type optical scanning device that includes two mirrors configured to oscillate or rotate around two axes perpendicular to each other.
FIG. 4 is a diagram illustrating a configuration example of a control system 250 included in the in-vehicle display device 100 according to the first embodiment of the present invention. As illustrated in FIG. 4, the control system 250 includes an FPGA (Field Programmable Gate Array) 251, a CPU (Central Processing Unit) 252, a ROM (Read-Only Memory) 253, a RAM (Random Access Memory) 254, an interface 255 (hereinafter referred to as the I/F 255), a bus line 256, an LD driver 257, and a MEMS controller 258. The FPGA 251 controls the operations of the laser light sources 201R, 201G, and 201B of the light source unit 220 using the LD driver 257, and the operation of the MEMS 208a of the optical scanning device 208 using the MEMS controller 258. The CPU 252 provides the respective functions of the in-vehicle display device 100. The ROM 253 stores various programs, such as an image processing program, which the CPU 252 executes in order to implement the respective functions of the in-vehicle display device 100. The RAM 254 is used as a work area of the CPU 252. The I/F 255 is an interface for communicating with an external controller or the like. For example, the I/F 255 is connected to a vehicle navigation device 40, various sensors 50, or the like through the CAN (Controller Area Network) of the vehicle 30.
The I/F 255 is connected to the frontward monitoring camera 22 configured to monitor the frontward scenery of the vehicle 30. That is, the frontward monitoring camera 22 monitors the display information displayed by the in-vehicle display device 100 on the front windshield 21, as well as the background of the display information across the front windshield 21. The I/F 255 is further connected to the driver monitoring camera 23 in order to detect the viewpoint position of the driver 20. The control system 250 performs image processing on an image of the display information and its background captured by the frontward monitoring camera 22, based on the viewpoint position of the driver 20, and converts the processed image into an image viewed from the viewpoint position of the driver 20. The control system 250 detects the viewpoint position of the driver 20, for example, by performing an image analysis of an image of the head of the driver 20 captured by the driver monitoring camera 23.
FIG. 5 is a diagram illustrating a schematic configuration example of the in-vehicle display device 100 according to the first embodiment of the present invention and peripheral devices. According to the first embodiment, the vehicle navigation device 40 and the sensor 50 are provided as an information acquiring unit configured to acquire driver-use information provided via the virtual image G to the driver 20.
The in-vehicle display device 100 mainly includes the optical system 230, which is an example of a display unit, and the control system 250, which is an example of a controller.
As the vehicle navigation device 40 according to the first embodiment, any known vehicle navigation device installed in an automobile or the like may be widely used. The vehicle navigation device 40 outputs information used to generate a route navigation image to be displayed as the virtual image G, and this output information is input to the control system 250. For example, as illustrated in FIG. 1, the route navigation image includes an image illustrating information such as the number of lanes (travel lanes) of the road on which the vehicle 30 is traveling, the distance to the point at which the next route change (right turn, left turn, crossroad, etc.) should be made, the direction of the next route change, and the like. The above information is input from the vehicle navigation device 40 to the control system 250. As a result, under the control of the control system 250, the in-vehicle display device 100 displays a route navigation image, such as a travel lane instruction image 711, an inter-vehicle distance presenting image 712, a route specification image 721, a remaining distance image 722, and a name image 723 of the intersection or the like, as the virtual image G in an upper image display area A.
In the example of the image illustrated in FIG. 1, an image representing road-specific information (road name, speed limit, etc.) is displayed as a virtual image G in a lower image display area B of the in-vehicle display device 100. The road-specific information is also input from the vehicle navigation device 40 to the control system 250. As a result, the in-vehicle display device 100 displays a road name display image 701, a speed limit display image 702, a no-passing display image 703, and the like, which correspond to the road-specific information, as a virtual image in the lower image display area B, under the control of the control system 250.
The sensor 50 in FIG. 5 includes one or more sensors configured to detect various information indicative of behaviors of the vehicle 30, a condition of the vehicle 30, surrounding conditions of the vehicle 30, and the like. The sensor 50 outputs sensing information used to generate an image to be displayed as a virtual image G, and this sensing information is input to the control system 250. For example, in the example of the image illustrated in FIG. 1, a vehicle speed display image 704 (text image of "83 km/h" in the example of FIG. 1) representing the vehicle speed of the vehicle 30 is displayed as a virtual image in the lower image display area B of the in-vehicle display device 100. That is, the vehicle speed information included in the CAN information of the vehicle 30 is input to the control system 250 from the sensor 50, and the in-vehicle display device 100 displays a character image representing the vehicle speed as a virtual image G in the lower image display area B, under the control of the control system 250.
The sensor 50 may include, for example, sensors other than those configured to detect the vehicle speed of the vehicle 30. Examples of such sensors include the following.
(1) Laser radar devices and imaging devices configured to detect the distance between the vehicle 30 and other vehicles, pedestrians or structures (such as guardrails and utility poles) around (frontward, sideward, or rearward of) the vehicle 30, and sensors configured to detect external environmental information of the vehicle 30 (such as ambient temperature, brightness, weather conditions, etc.).
(2) Sensors configured to detect the operations of the driver 20 (such as brake operation and accelerator opening).
(3) Sensors configured to detect the remaining amount of fuel in the fuel tank of the vehicle 30.
(4) Sensors configured to detect the status of various vehicle-mounted equipment, such as engines and batteries.
By transmitting the information detected by the sensor 50 to the control system 250, the in-vehicle display device 100 can display the information to be provided to the driver 20 as a virtual image G.
Next, a virtual image G displayed by the in-vehicle display device 100 will be described. In the in-vehicle display device 100 according to the first embodiment, the driver-use information provided via the virtual image G to the driver 20 may be any information useful for the driver 20. The driver-use information is broadly classified into passive information and active information for convenience.
Passive information is information that is passively perceived by the driver 20 at a time when predetermined information provision conditions are met. Accordingly, information to be provided to the driver 20 at a timing set in the in-vehicle display device 100 is included in the passive information. Information having a certain relationship between the timing at which the information is provided and the content of the information is also included in the passive information. Examples of passive information include safety-related information used while driving and route navigation information. The safety-related information includes information representing the distance between the vehicle 30 and a preceding vehicle 350 (the inter-vehicle distance presenting image 712) and emergency information relating to driving (warning information or attention attracting information, such as emergency operation instruction information that instructs the driver to operate the vehicle in an emergency). The route navigation information is information for guiding a travel route to a predetermined destination, and may be the same information as that provided to the driver by a known vehicle navigation device. The route navigation information includes travel lane instruction information (the travel lane instruction image 711) indicating the travel lane to be driven in at the nearest intersection, and route change operation instruction information indicating the operation for changing the route from the straight-ahead direction at an intersection or branch point. Examples of the route change operation instruction information include route designation information (the route specification image 721) for specifying which route should be taken at an intersection or the like, information indicating the remaining distance to the intersection or the like at which the route change operation is to be performed (the remaining distance image 722), and information indicating the name of the intersection or the like (the name image 723).
Active information is information that is actively perceived by the driver 20 at a timing determined by the driver 20. The active information may be information to be provided to the driver at a timing desired by the driver 20. Examples of the active information include information having a low or no relationship between the timing at which the information is provided and the content of the information. Since the active information is acquired by the driver 20 at the timing desired by the driver 20, it may continue to be displayed for a certain length of time or at all times. Examples of such active information include road-specific information associated with the road on which the vehicle 30 is traveling, the vehicle speed information of the vehicle 30 (the vehicle speed display image 704), the current time information, and the like. The road-specific information includes information relating to the road that is useful for the driver 20, such as information indicating the name of the road (the road name display image 701) and information indicating the content of regulations such as the speed limit of the road (the speed limit display image 702 and the no-passing display image 703).
In the first embodiment, the broadly classified passive information and active information items are displayed in respective display areas in which the virtual image G can be displayed. Specifically, according to the first embodiment, two display areas arranged in the vertical direction are set as the areas in which the virtual image G is displayed. Of these two display areas, a passive information image corresponding to the passive information is mainly displayed in the upper image display area A, and an active information image corresponding to the active information is mainly displayed in the lower image display area B. Note that when a part of the active information image is displayed in the upper image display area A, the part is displayed such that the viewability of the passive information image is prioritized.
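For illustration only, the assignment of driver-use information to the two display areas described above can be expressed as a simple lookup. The embodiment defines this assignment only in prose; the item labels below are hypothetical names for the images described in this embodiment, and Python is used purely as sketch notation:

```python
# Hypothetical lookup of the display-area assignment described above:
# passive information images go to the upper image display area A,
# active information images to the lower image display area B.
PASSIVE_ITEMS = {
    "inter_vehicle_distance",   # inter-vehicle distance presenting image 712
    "travel_lane_instruction",  # travel lane instruction image 711
    "route_specification",      # route specification image 721
    "remaining_distance",       # remaining distance image 722
    "intersection_name",        # name image 723
}
ACTIVE_ITEMS = {
    "road_name",     # road name display image 701
    "speed_limit",   # speed limit display image 702
    "no_passing",    # no-passing display image 703
    "vehicle_speed", # vehicle speed display image 704
}

def display_area(item: str) -> str:
    if item in PASSIVE_ITEMS:
        return "A"  # upper image display area
    if item in ACTIVE_ITEMS:
        return "B"  # lower image display area
    raise ValueError(f"unclassified driver-use information: {item}")
```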
In the first embodiment, a stereoscopic image represented by using stereoscopic vision is used as the virtual image G displayed by the in-vehicle display device 100.
Specifically, a perspective image rendered by a perspective method is used as the inter-vehicle distance presenting image 712 and the travel lane instruction image 711 displayed in the upper image display area A, in which the in-vehicle display device 100 displays a virtual image.
More specifically, the lengths of the five horizontal lines included in the inter-vehicle distance presenting image 712 are displayed to be shorter toward the upper side of the upper image display area A. That is, the inter-vehicle distance presenting image 712 is a perspective image created by a perspective method, i.e., a method of drawing lines toward a single vanishing point. Specifically, in the first embodiment, the inter-vehicle distance presenting image 712 is formed such that the single vanishing point is located near a gazing point of the driver 20. As a result, the driver 20 can more easily perceive the sense of depth of the inter-vehicle distance presenting image 712 while driving. Further, in the first embodiment, the inter-vehicle distance presenting image 712 serving as a perspective image is displayed such that the thickness of the horizontal lines becomes thinner, or the brightness of the horizontal lines decreases, toward the upper side of the upper image display area A. As a result, the driver 20 can more easily perceive the depth of the inter-vehicle distance presenting image 712 while driving.
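The perspective rendering described above can be sketched numerically. In the following illustration, the number of lines (five) follows the description, while the base length, width, brightness, and per-line scale factor are assumed values chosen purely for illustration:

```python
# Illustrative geometry for the inter-vehicle distance presenting image 712:
# five horizontal lines drawn shorter, thinner, and dimmer toward the
# vanishing point near the driver's gazing point.
def perspective_lines(n=5, base_length=100.0, base_width=4.0,
                      base_brightness=1.0, scale_per_line=0.75):
    lines = []
    for i in range(n):
        k = scale_per_line ** i  # stronger shrink toward the upper side
        lines.append({
            "length": base_length * k,
            "width": base_width * k,
            "brightness": base_brightness * k,
        })
    return lines

for line in perspective_lines():
    print(line)
```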
(Functional Configuration of In-vehicle Display Device 100)
FIG. 6 is a diagram illustrating a functional configuration of the in-vehicle display device 100 according to the first embodiment of the present invention. As described with reference to FIGS. 1 to 5, the in-vehicle display device 100 illustrated in FIG. 6 is a device installed in a vehicle, such as an automobile. The in-vehicle display device 100 may display information for attracting the driver's attention to objects around the vehicle at a location viewable from the driver inside the vehicle (e.g., on the front windshield). Objects include, for example, persons, animals, installed objects, obstacles, vehicles, and the like.
As illustrated in FIG. 6, the in-vehicle display device 100 includes a vehicle information acquiring unit 102, an environmental information acquiring unit 104, a calculator 110, an image generator 112, and an image display unit 114.
The vehicle information acquiring unit 102 acquires information associated with the vehicle (hereinafter referred to as "vehicle information"). For example, the vehicle information acquiring unit 102 acquires vehicle speed information, steering angle information, and the like (but is not limited thereto) through the CAN (Controller Area Network) from an electronic control unit (ECU) provided in the vehicle.
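As one concrete illustration (not part of the embodiment), vehicle speed could be read from the CAN bus with the python-can library. The arbitration ID, byte layout, and scale factor below are hypothetical; actual values are defined by the vehicle's CAN database:

```python
# Sketch of acquiring vehicle speed over CAN with the python-can library.
# The arbitration ID (0x1A0), byte layout, and scale factor are hypothetical.
import can

bus = can.interface.Bus(channel="can0", bustype="socketcan")

def read_vehicle_speed_kmh(timeout: float = 1.0):
    """Return the vehicle speed in km/h, or None if no matching frame arrives."""
    msg = bus.recv(timeout)
    if msg is not None and msg.arbitration_id == 0x1A0:
        raw = (msg.data[0] << 8) | msg.data[1]  # hypothetical 16-bit field
        return raw * 0.01                       # hypothetical scale factor
    return None
```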
The environmental information acquiring unit 104 acquires information associated with the environment around the vehicle (hereinafter referred to as "environmental information"). For example, the environmental information acquiring unit 104 acquires, as examples of environmental information, a relative speed ΔV between the vehicle and an object, and a distance D between the vehicle and the object. For example, the environmental information acquiring unit 104 acquires the relative speed ΔV from a speed calculating device (e.g., an ECU) capable of calculating the relative speed ΔV, and acquires the distance D from a distance calculating device (e.g., a distance sensor) capable of calculating the distance D. The in-vehicle display device 100 may include at least one of the speed calculating device or the distance calculating device. The environmental information acquiring unit 104 may acquire various environmental information from in-vehicle devices (e.g., an ECU, various sensors, or the like), or may acquire various environmental information from outside the vehicle through communication.
The calculator 110 calculates an arrival time t until the vehicle reaches an object, based on the relative speed ΔV and the distance D acquired by the environmental information acquiring unit 104. For example, the calculator 110 calculates the arrival time t by the equation t = D / ΔV.
The image generator 112 generates an image that attracts a driver's attention to an object based on the arrival time t computed by the calculator 110.
For example, when the arrival time t is greater than a predetermined first threshold value th1, the image generator 112 generates an implicit risk notification image (an example of a "second image") that relatively lightly attracts the driver's attention to an object. The implicit risk notification image is an image that attracts attention to an object by using a graphic alone.
For example, when the arrival time t is the predetermined first threshold value th1 or less, the image generator 112 generates an explicit risk notification image (an example of a "first image") that relatively strongly attracts the driver's attention to an object. The explicit risk notification image is an image that attracts attention to an object using a character string alone or using both a character string and a graphic.
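Putting the above together, a minimal sketch of the calculator 110 and the threshold decision of the image generator 112 might look as follows, using the equation t = D / ΔV and the example threshold th1 = 3 seconds given later in this description:

```python
# Minimal sketch of the calculator 110 and the image generator 112 decision.
def arrival_time(distance_m: float, relative_speed_mps: float) -> float:
    """t = D / dV; treated as infinite when the gap is not closing."""
    if relative_speed_mps <= 0.0:
        return float("inf")
    return distance_m / relative_speed_mps

def choose_notification(t: float, th1: float = 3.0) -> str:
    # t > th1: implicit image (graphic alone, superimposed on the object);
    # t <= th1: explicit image (character string alone or with a graphic).
    return "implicit" if t > th1 else "explicit"
```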
The image display unit 114 displays an image (an implicit risk notification image or an explicit risk notification image) generated by the image generator 112 at a position of the front windshield (or a transparent plate) of the vehicle viewable by the driver, using the head-up display 120 (the optical system 230). The image display unit 114 may likewise display various vehicle information (e.g., vehicle speed information, steering angle information, or the like) acquired by the vehicle information acquiring unit 102 at a position of the front windshield (or the transparent plate) viewable by the driver, using the head-up display 120 (the optical system 230).
In the first embodiment, the head-up display 120 (the optical system 230) is an example of a display device, but the display device is not limited thereto. For example, a meter panel, a navigation device, or the like may also be used as the display device. The display device may be integrally disposed on the main body of the in-vehicle display device 100, or may be externally connected to the main body of the in-vehicle display device 100.
As illustrated in FIG. 4, the in-vehicle display device 100 includes a computer including the CPU 252, the ROM 253, and the RAM 254. The in-vehicle display device 100 implements the above-described functions by the CPU 252 executing a program stored in the ROM 253 while using the RAM 254 as a work area.
The above-described functions of the in-vehicle display device 100 may be physically implemented by one device or by a plurality of devices. A portion of each of the above functions may also be implemented by utilizing functions provided in other devices (e.g., an ECU, an external server, etc.).
(Procedure of Process Performed by In-vehicle Display Device 100)
FIG. 7 is a flowchart illustrating a procedure of a process performed by the in-vehicle display device 100 according to the first embodiment of the present invention.
First, in step S201, the environmental information acquiring unit 104 acquires, as the environmental information, a relative speed ΔV between the vehicle and an object related to a danger, and the distance D between the vehicle and the object. Note that instead of acquiring the relative speed ΔV directly, the in-vehicle display device 100 may acquire a speed V2 of the vehicle and a speed V1 of the object, and calculate the relative speed ΔV by the equation ΔV = V2 - V1.
Next, in step S202, the calculator 110 calculates the arrival time t to reach the object based on the relative speed ΔV and the distance D acquired in step S201.
Next, in step S203, the image generator 112 determines whether the arrival time t calculated in step S202 is greater than a predetermined first threshold value th1 (e.g., 3 seconds).
In step S203, when the image generator 112 determines that the arrival time t is greater than the predetermined first threshold value th1 (Yes in step S203), the environmental information acquiring unit 104 acquires the relative position information of the object (step S204). In step S205, the image generator 112 generates an implicit risk notification image (potential risk notification image).
In step S206, the image display unit 114 displays the implicit risk notification image generated in step S205 at a position (i.e., a position superimposed on the object) corresponding to the relative position information acquired in step S204 on the front windshield of the vehicle using the head-up display 120 (the optical system 230). Thereafter, the in-vehicle display device 100 completes a series of steps illustrated in FIG. 7.
Meanwhile, in step S203, when the image generator 112 determines that the arrival time t is not greater than the predetermined first threshold value th1 (No in step S203), the image generator 112 generates an explicit risk notification image (obvious risk notification image) (step S207). In step S208, the image display unit 114 displays the explicit risk notification image generated in step S207 at a predetermined position on a front windshield of the vehicle using the head-up display 120 (the optical system 230). Thereafter, the in-vehicle display device 100 completes a series of steps illustrated in FIG. 7.
(Example of Display of Explicit Risk Notification Image)
FIG. 8 and FIG. 9 are diagrams illustrating examples of the explicit risk notification image displayed by the in-vehicle display device 100 according to the first embodiment of the present invention. FIGS. 8 and 9 each illustrate a frontward scenery ahead of the vehicle as viewed from the driver of the vehicle across the front windshield of the vehicle.
In the examples illustrated in FIGS. 8 and 9, the explicit risk notification image 300 or the explicit risk notification image 400 is displayed on the front windshield of the vehicle because the arrival time t until the vehicle reaches a person 10 is the predetermined first threshold value th1 (e.g., 3 seconds) or less.
In the example illustrated in FIG. 8, since the distance D from the vehicle to the person 10 is short, the explicit risk notification image 300 is displayed above the vehicle speed display image 302 using a character string alone to attract attention to the person 10. The explicit risk notification image 300 is an image with only the character string "BRAKE!" representing the action to be taken by the driver.
In contrast, in the example illustrated in FIG. 9, since the distance D from the vehicle to the person 10 is even shorter, the explicit risk notification image 400 is displayed on the front windshield of the vehicle above the vehicle speed display image 402 using both a character string and a graphic to attract attention to the person 10. The explicit risk notification image 400 is an image having a character string "BRAKE!" that represents the action to be taken by the driver, and a graphic that surrounds the character string with a red color meaning "warning".
As described above, the in-vehicle display device 100 according to the first embodiment is capable of displaying the explicit risk notification images 300 and 400, using a character string alone or both a character string and a graphic, on the front windshield of the vehicle using the head-up display 120 (the optical system 230), so as to strongly attract the driver's attention to the person 10, in response to the arrival time t until the vehicle reaches the person 10 being the predetermined first threshold value th1 (e.g., 3 seconds) or less. Accordingly, the in-vehicle display device 100 according to the first embodiment enables the driver to accurately and quickly understand the instructions for action to avoid the explicit risk, and can thus display an image or the like that more appropriately attracts the attention of the driver of the vehicle.
Further, the in-vehicle display device 100 according to the first embodiment can display a plurality of explicit risk notification images in a stepwise manner according to the distance D from the vehicle to the person 10, in response to the arrival time t being the predetermined first threshold value th1 or less. Accordingly, the in-vehicle display device 100 according to the first embodiment can more effectively enable the driver to understand the presence of the explicit risk.
However, the in-vehicle display device 100 according to the first embodiment is not limited thereto. In response to the arrival time t being the predetermined first threshold value th1 or less, the in-vehicle display device 100 may display a single explicit risk notification image regardless of the distance D from the vehicle to the person 10.
The explicit risk notification images 300 and 400 may have animation effects (e.g., blinking, brightness change, color conversion, shape change, etc.). Accordingly, the in-vehicle display device 100 according to the first embodiment can more effectively enable the driver to understand the presence of the explicit risk.
Alternatively, the explicit risk notification images 300 and 400 may have a character string representing the type of object (e.g. "Attention to Person!" or the like) in addition to a character string representing the action to be taken by the driver (e.g. "BRAKE!" or the like), or instead of a character string representing the action to be taken by the driver.
(Examples of Displaying Implicit Risk Notification Images)
FIG. 10 is a diagram illustrating an example of the implicit risk notification image displayed by the in-vehicle display device 100 according to the first embodiment of the present invention. FIG. 10, like FIGS. 8 and 9, illustrates a frontward scenery ahead of the vehicle, viewed from the driver of the vehicle across the front windshield of the vehicle.
In the example illustrated in FIG. 10, the front windshield of the vehicle displays an implicit risk notification image 500 for attracting attention to the person 10 at a position superimposed on the person 10 when viewed from the driver, because the arrival time t until the vehicle reaches the person 10 is greater than the predetermined first threshold value th1 (e.g., 3 seconds). The implicit risk notification image 500 is an image that surrounds the person 10 with an annular graphic of a yellow color, which means "attention".
As described above, the in-vehicle display device 100 according to the first embodiment can display the implicit risk notification image 500 at a position superimposed on the person 10 on the front windshield of the vehicle using the head-up display 120 (the optical system 230), in response to the arrival time t until the vehicle reaches the person 10 being greater than the predetermined first threshold value th1 (e.g., 3 seconds). Accordingly, the in-vehicle display device 100 according to the first embodiment can notify the driver of the implicit risk without annoying the driver, and can thus display an image or the like that more appropriately attracts the attention of the driver of the vehicle.
The implicit risk notification image 500 may have an animation effect (e.g., blinking, brightness change, color conversion, shape change, etc.). Accordingly, the in-vehicle display device 100 according to the first embodiment can more effectively enable the driver to understand the presence of the implicit risk.
SECOND EMBODIMENT
Next, a second embodiment will be described with reference to FIG. 11. According to the second embodiment, a procedure of a process performed by the in-vehicle display device 100 differs from that of the process in the first embodiment.
(Procedure of Process Performed by In-vehicle Display Device 100)
FIG. 11 is a flowchart illustrating a procedure of a process performed by the in-vehicle display device 100 according to the second embodiment of the present invention.
First, in step S601, the environmental information acquiring unit 104 acquires, as the environmental information, a relative speed ΔV between the vehicle and an object related to a danger, and the distance D between the vehicle and the object.
Next, in step S602, the calculator 110 calculates the arrival time t to reach the object based on the relative speed ΔV and the distance D acquired in step S601.
Next, in step S603, the image generator 112 determines whether the arrival time t calculated in step S602 is greater than a predetermined second threshold value th2 (e.g., 5 seconds). Note that the predetermined second threshold value th2 is greater than the predetermined first threshold value th1.
In step S603, when the image generator 112 determines that the arrival time t is greater than the predetermined second threshold value th2 (Step S603: Yes), the in-vehicle display device 100 ends a series of steps illustrated in FIG. 11.
Meanwhile, in step S603, when the image generator 112 determines that the arrival time t is not greater than the predetermined second threshold value th2 (Step S603: No), the image generator 112 determines whether the arrival time t calculated in step S602 is greater than the predetermined first threshold value th1 (e.g., 3 seconds) in step S604.
In step S604, when the image generator 112 determines that the arrival time t is greater than the predetermined first threshold value th1 (Yes in step S604), the environmental information acquiring unit 104 acquires the relative position information of the object (Step S605). In step S606, the image generator 112 generates the implicit risk notification image.
In step S607, the image display unit 114 displays the implicit risk notification image generated in step S606 at a position (i.e., a position superimposed on the object) corresponding to the relative position information acquired in step S605 on the front windshield of the vehicle using the head-up display 120 (the optical system 230). Thereafter, the in-vehicle display device 100 ends a series of steps illustrated in FIG. 11.
Meanwhile, in step S604, when the image generator 112 determines that the arrival time t is not greater than the predetermined first threshold value th1 (Step S604: No), the image generator 112 generates an explicit risk notification image in step S608. In step S609, the image display unit 114 displays the explicit risk notification image generated in step S608 at a predetermined position on the front windshield of the vehicle using the head-up display 120 (the optical system 230). Thereafter, the in-vehicle display device 100 ends a series of steps illustrated in FIG. 11.
As described above, the in-vehicle display device 100 according to the second embodiment displays neither the implicit risk notification image nor the explicit risk notification image when the arrival time t is greater than the predetermined second threshold value th2 (e.g., 5 seconds). Accordingly, the in-vehicle display device 100 according to the second embodiment can prevent the implicit and explicit risk notification images from being displayed so frequently as to annoy the driver.
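A minimal sketch of the second embodiment's decision flow (FIG. 11) might look as follows; the example values th1 = 3 seconds and th2 = 5 seconds are taken from the description above:

```python
# Sketch of the two-threshold decision of FIG. 11: no notification above th2,
# an implicit image between th1 and th2, an explicit image at or below th1.
def choose_notification_v2(t: float, th1: float = 3.0, th2: float = 5.0) -> str:
    assert th2 > th1, "th2 must be greater than th1"
    if t > th2:
        return "none"      # step S603 Yes: avoid annoying the driver
    if t > th1:
        return "implicit"  # step S604 Yes: superimpose on the object
    return "explicit"      # step S604 No: display at a predetermined position
```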
THIRD EMBODIMENT
Next, a third embodiment will be described with reference to FIG. 12. FIG. 12 is a diagram illustrating a functional configuration of an in-vehicle display device 100A according to the third embodiment of the present invention. The in-vehicle display device 100A according to the third embodiment illustrated in FIG. 12 differs from the in-vehicle display device 100 according to the first embodiment in that the in-vehicle display device 100A further includes a driver information acquiring unit 106.
The driver information acquiring unit 106 acquires information associated with the driver's mental and physical condition (hereinafter referred to as "mental and physical condition information"). For example, the driver information acquiring unit 106 acquires the following information as the mental and physical condition information:
electrocardiographic information detected by an electrocardiographic sensor,
heart rate information detected by a heart rate sensor,
blood pressure information detected by a blood pressure sensor,
body temperature information detected by a body temperature sensor,
pulse information detected by a pulse sensor,
respiratory information detected by a respiratory sensor,
sweat information detected by a sweat sensor,
blinking information detected by a blinking sensor,
pupil information detected by a pupil sensor,
electroencephalographic information detected by an electroencephalographic sensor, and
muscle information detected by a muscle sensor, and the like.
For example, the in-vehicle display device 100A may display the mental and physical condition information acquired by the driver information acquiring unit 106 on the front windshield of the vehicle by the image display unit 114. Further, the in-vehicle display device 100A may dynamically change its processing logic in accordance with the mental and physical condition information acquired by the driver information acquiring unit 106. For example, when the driver is determined to be in an alert state based on the mental and physical condition information acquired by the driver information acquiring unit 106, the in-vehicle display device 100A may reduce the animation effects of the implicit risk notification image and the explicit risk notification image in order to reduce annoyance to the driver.
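As a purely hypothetical sketch of such a dynamic change to the processing logic, the animation settings might be toned down when the driver is judged to be alert; the criteria and settings below are assumptions, not part of the embodiment:

```python
# Hypothetical sketch: reduce animation effects for an alert driver, as
# described above. The two effect flags are illustrative placeholders.
def animation_settings(driver_is_alert: bool) -> dict:
    if driver_is_alert:
        return {"blinking": False, "brightness_change": False}
    return {"blinking": True, "brightness_change": True}
```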
FOURTH EMBODIMENT
Hereinafter, a display system, and an information display method performed by the display system will be described with reference to the accompanying drawings as examples of aspects of a fourth embodiment of the present invention.
Outline of Information Display
First, an outline of a display method of information associated with steering of a steering wheel will be described with reference to FIGS. 13A and 13B. FIGS. 13A and 13B are photographs depicting examples of steering information displayed by a display system according to a vehicle speed and the like. The fourth embodiment illustrates a configuration in which information is presented by a HUD (Head-Up Display) device acting as a display system. However, this configuration is merely an example, and the fourth embodiment may also be applied to other display systems such as liquid crystal displays.
FIG. 13A is an example of steering information presented while approaching a curve, and FIG. 13B is an example of steering information presented while traveling along the curve. The display system presents information as a virtual image a few meters ahead of the front windshield. In FIGS. 13A and 13B, a display component 301 representing a vehicle speed and a display component 302 representing a steering angle are displayed.
The center of the display component 302 corresponds to the neutral steering angle, and a desired steering angle (hereinafter referred to as a "target steering angle 304") is indicated based on a current steering angle 303 and the current traveling condition of the vehicle. In FIGS. 13A and 13B, the vehicle is traveling along a right-turn curve. Hence, each of the current steering angle 303 and the target steering angle 304 is indicated on the right side of the center of the display component 302. A position of the target steering angle 304 farther from the center indicates that the steering angle should be increased by a larger amount. Accordingly, the target steering angle 304, the current steering angle 303, and the difference between the two are displayed as analog information in FIGS. 13A and 13B. Each of the current steering angle 303 and the target steering angle 304 corresponds to a steering wheel angle on a one-to-one basis. However, the correspondence between the current or target steering angle and the steering wheel angle need not be technically strict.
As can be seen from the comparison of FIGS. 13A and 13B, the actual steering angle gradually increases as the vehicle travels along the curve, and the display system indicates the target steering angle 304 in real time based on the current traveling condition of the vehicle. The target steering angle 304 indicates by what amount the driver should actually turn the steering wheel. The target steering angle 304 is calculated based on the vehicle speed, the curvature of the curve, a desired yaw rate, and the like. Hence, the target steering angle 304 can present a steering angle at which the amount of change in the lateral G-force (hereinafter referred to as "lateral G") experienced by a vehicle occupant is a predetermined amount or less.
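The embodiment does not disclose the exact computation of the target steering angle 304. As one plausible sketch, the kinematic bicycle model relates curvature to a steering angle, with the curvature capped so that the lateral acceleration v² × κ stays at or below a limit, consistent with keeping the change in lateral G small; the wheelbase and the lateral-G limit below are assumed values:

```python
# Sketch under the kinematic bicycle model (not the patent's method).
import math

def target_steering_angle(v_mps: float, curvature_1pm: float,
                          wheelbase_m: float = 2.7,
                          lateral_g_limit_mps2: float = 2.0) -> float:
    """Return a target road-wheel steering angle in radians (sign = turn direction)."""
    if v_mps > 0.0:
        # Cap curvature so that v^2 * kappa <= lateral acceleration limit.
        kappa_cap = lateral_g_limit_mps2 / (v_mps ** 2)
        curvature_1pm = math.copysign(min(abs(curvature_1pm), kappa_cap),
                                      curvature_1pm)
    return math.atan(wheelbase_m * curvature_1pm)
```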
When the current steering angle 303 and the target steering angle 304 are indicated by numerical values, for example, the driver must convert the values into angular values and apply the angular values to steering. In contrast, according to the fourth embodiment, the current steering angle 303 and the target steering angle 304 are displayed as analog information. Thus, the driver can easily understand by what amount the driver should actually turn the steering wheel.
The target steering angle 304 is determined by the vehicle based on the shape of the curve ahead of the vehicle, whereas the current steering angle 303 is the actual steering angle. Accordingly, when the road shape differs between the frontward portion and the currently traveled portion of the road (e.g., the curvature of the curve increases or decreases), the current steering angle 303 and the target steering angle 304 deviate from each other. For example, when the curvature of the curve ahead of the vehicle is large, the relation current steering angle 303 < target steering angle 304 is frequently obtained; when the curvature of the curve ahead of the vehicle is small, the relation current steering angle 303 > target steering angle 304 is frequently obtained. Since the target steering angle 304 is displayed based on the road shape ahead of the vehicle, the driver can predict by what amount the driver is required to turn the steering wheel.
FIGS. 13A and 13B illustrate cases where the curve is a right-turn curve. However, in a case of a left-turn curve, each of the current steering angle 303 and the target steering angle 304 indicates a left side of the center of the display component 302.
The display component 302 has the shape of a semicircle or an arc corresponding to the upper half of a circle, and substantially covers the steering angle range required while the vehicle is traveling along a curve. Thus, in a curve where the steering angle does not exceed 90 degrees, the current steering angle 303 and the target steering angle 304 can match the actual steering angle. In a curve where the actual steering angle exceeds the range of the display component 302, the current steering angle 303 and the target steering angle 304 may be multiplied by a coefficient less than 1 so that they fit within the display component 302.
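The mapping onto the display component 302 can be sketched as follows; the 90-degree displayable range follows the description above, while the mapping function itself is illustrative:

```python
# Illustrative mapping of a steering angle onto the display component 302:
# within the displayable range the displayed angle equals the actual angle;
# beyond it, the angle is multiplied by a coefficient less than 1.
def displayed_angle_deg(actual_deg: float, display_range_deg: float = 90.0) -> float:
    if abs(actual_deg) <= display_range_deg:
        return actual_deg
    coefficient = display_range_deg / abs(actual_deg)  # coefficient < 1
    return actual_deg * coefficient
```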
As described above, the display system according to the fourth embodiment sequentially presents steering angles at which the amount of change in the lateral G is a predetermined amount or less when the vehicle travels along a curve, thereby helping the driver steer the vehicle such that the occupants can ride comfortably.
Terminology
The phrase "a moving body travels along a road" means that the moving body travels on a road on which white lines, curbs, guardrails, and the like may, but need not, be present.
"Information associated with a target steering angle of a steering wheel" is information that assists the driver in steering the steering wheel to an appropriate steering angle.
The moving body may be one that travels on land, in the air, on the sea, or under the sea, with an occupant on board.
In the fourth embodiment, a vehicle will be described as an example of a moving body. In addition, the display system according to the fourth embodiment can also be installed in aircraft, ships, industrial robots, and the like.
A vehicle occupant is a person who sees or views information presented by the display system. For example, the driver may be the only occupant. In the case of an autonomous (self-driving) moving body, no occupant drives the moving body.
Configuration Example
FIG. 14 is a schematic diagram illustrating an example of a display system 1 installed in a vehicle. The display system 1 includes a display controller 20 (an example of a display controller) and a display unit 10 (an example of a display unit).
The display system 1 is embedded inside a dashboard, and projects an image toward a front windshield 91 (a transparent member) from a light-emitting window 3 disposed on a surface of the display system 1. The projected image is displayed as a virtual image I ahead of the front windshield 91. The vehicle occupant V can see information useful for driving with little line-of-sight movement while keeping his or her eyes on a preceding vehicle and on the road ahead of the vehicle. Since the display system 1 only needs to be disposed so as to project an image onto the front windshield 91, the display system 1 may be disposed on a ceiling, a sun visor, or the like, in addition to the dashboard.
The display system 1 may be a general-purpose information processing terminal or a HUD-dedicated terminal. The HUD-dedicated terminal may be referred to simply as a head-up display device, or as a navigation device when integrated with a navigation device.
The HUD-dedicated terminal may also be referred to as a PND (Portable Navigation Device). Alternatively, the HUD-dedicated terminal may be referred to as a display audio (or a connected audio). The display audio is a device that mainly provides audio-visual and communication functions without incorporating navigation functions.
General-purpose information processing terminals include, for example, smartphones, tablets, cellular phones, PDAs (Personal Digital Assistants), laptop PCs, and wearable PCs (e.g., wristwatch-type or eyeglass-type). General-purpose information processing terminals are not limited thereto, and may be any devices insofar as they have the functions of a general information processing apparatus. General-purpose information processing terminals are normally used as information processing apparatuses that execute various applications. However, when executing application software for the display system, a general-purpose information processing terminal displays information useful for driving in the same manner as the HUD-dedicated terminal, for example.
The display system 1 according to the fourth embodiment may be capable of switching between an in-vehicle state and a portable state, whether the display system 1 serves as a general-purpose information processing terminal or as a HUD-dedicated terminal.
As illustrated in FIG. 14, the display system 1 includes a display unit 10 and a display controller 20 as main elements. As a projection type of the display unit 10, a laser scanning type and a panel type are known. The laser scanning type is a type in which a laser beam emitted from a laser light source is scanned by a two-dimensional scanning device to form an intermediate image (a real image projected onto a screen to be described later). The panel type is a type in which an imaging device such as a liquid crystal panel, a DMD (Digital Micromirror Device) panel, or a vacuum fluorescent display (VFD) forms an intermediate image.
Unlike the panel type, in which light emitted from the full screen is partially shielded to form an image, the laser scanning type enables light emission and non-light emission to be allocated to each pixel, thereby generally forming a high-contrast image. Since high contrast provides better visibility, the laser scanning type allows vehicle occupants to visually recognize information with fewer attention resources than panel-type HUD devices. Thus, the laser scanning type is preferable.
Specifically, in the panel type, light that cannot be completely shielded is projected onto an area having no information, and a display frame (square-shaped leaked light) is projected onto the area in which the HUD displays an image (this is called the postcard effect). The laser scanning type does not have this kind of effect and can project only the content. In particular, in AR (Augmented Reality), this improves the realism of a generated image displayed superimposed on the real landscape. AR is a technology that virtually augments the world at hand by displaying images of objects that do not exist in the real landscape. Nevertheless, panel-type HUDs may be applied insofar as such HUD devices are capable of displaying information in a manner visible with few attention resources.
Alternatively, although not strictly a HUD device, the information associated with steering of a steering wheel according to the fourth embodiment may be displayed on a transparent display disposed on a dashboard or the like (in this case, the display is not a virtual image).
FIG. 15 is a diagram illustrating a configuration example of the display unit 10. The display unit 10 mainly includes a light source unit 101, a light deflector 102, a mirror 103, a screen 104, and a concave mirror 105. Note that FIG. 15 merely illustrates the main elements; the display unit 10 may have elements other than those illustrated in FIG. 15, or may lack some of the elements illustrated in FIG. 15.
The light source unit 101 includes, for example, three laser light sources corresponding to RGB (hereinafter referred to as LDs: laser diodes), coupling lenses, apertures, combining elements, lenses, and the like. The light source unit 101 combines the laser beams emitted from the three LDs and directs the combined beam toward the reflective surface of the light deflector 102. The laser beam directed toward the reflective surface of the light deflector 102 is two-dimensionally deflected by the light deflector 102.
The light deflector 102 may be, for example, a single microscopic mirror that oscillates with respect to two orthogonal axes, or two microscopic mirrors that each oscillate or pivot with respect to one axis. The light deflector 102 may be, for example, a MEMS (Micro Electro Mechanical Systems) mirror fabricated by a semiconductor process or the like. The light deflector 102 can be driven, for example, by an actuator that uses the deformation force of a piezoelectric element. The light deflector 102 may also be a galvano mirror, a polygon mirror, or the like.
The laser beam two-dimensionally deflected by the light deflector 102 enters the mirror 103 and is turned back by the mirror 103 to render a two-dimensional image (an intermediate image) on the surface (scanned surface) of the screen 104. For example, a concave mirror may be used as the mirror 103, but a convex mirror or a planar mirror may also be used. The configuration of deflecting the direction of the laser beam with the light deflector 102 and the mirror 103 enables the size of the display unit 10 to be reduced, and enables the arrangement of the elements to be flexibly changed.
The screen 104 is preferably a microlens array or a micro-mirror array having the function of diverging a laser beam at a desired divergence angle. However, the screen 104 may be a diffuser plate for diffusing a laser beam, a transparent plate having a smooth surface, a reflector plate, or the like. Generally, elements from the light source unit 101 to the screen 104 in FIG. 15 are referred to as a HUD device. However, other elements may be included in the HUD device.
The laser beam emitted from the screen 104 is reflected by the concave mirror 105 and projected onto the front windshield 91. The concave mirror 105 acts like a lens and forms an image at a predetermined focal length. Regarding the concave mirror 105 as a lens, the intermediate image on the screen 104, which serves as the object, is formed as an image at a distance R2 determined by the focal length of the concave mirror 105. Thus, when viewed from an occupant of the vehicle, the virtual image I is displayed at a distance of R1 + R2 from the front windshield 91. When the distance from the occupant of the vehicle to the front windshield 91 is R3, the virtual image I is displayed at the distance R (= R1 + R2 + R3) from the viewpoint E of the vehicle occupant V in FIG. 15.
At least a portion of the light flux reaching the front windshield 91 is reflected toward the viewpoint E of the vehicle occupant V. As a result, the vehicle occupant V can see the virtual image I, an enlargement of the intermediate image on the screen 104, via the front windshield 91. That is, as viewed from the vehicle occupant V, the virtual image I, which is the enlarged intermediate image, is displayed across the front windshield 91.
Normally, the front windshield 91 is slightly curved rather than planar. Therefore, the position at which the virtual image I is formed is determined not only by the focal length of the concave mirror 105 but also by the curved surface of the front windshield 91; nevertheless, the distance R is approximately determined by the distance R1 + R2 as described above. To form the virtual image I remotely so that the line-of-sight movement of the vehicle occupant V is reduced, the distance R1 or R2 is increased. As a method for increasing the distance R1, the optical path is folded back by a mirror; as a method for increasing the distance R2, the focal length of the concave mirror 105 is adjusted.
Since a horizontal line of the intermediate image is optically distorted upward and downward in a convex shape, due to the effect of the front windshield 91, it is preferable that at least one of the mirror 103 or the concave mirror 105 be designed and disposed to compensate for distortion. Alternatively, it is preferable that the projected image be corrected in consideration of the distortion.
Further, as a transparent reflective member, a combiner may be disposed on the viewpoint E side of the front windshield 91. When the combiner is irradiated with light from the concave mirror 105, information can be presented as a virtual image I in the same manner as when the front windshield 91 is irradiated with light from the concave mirror 105.
Configuration Example of In-Vehicle System in which the Display System Is Installed
Next, a configuration example in which the display system 1 is installed in a moving body will be described with reference to FIG. 16. FIG. 16 is a diagram illustrating a configuration example of an in-vehicle system 2 in which the display system 1 is installed in a moving body. The in-vehicle system 2 (an example of a system according to the scope of the claims) includes a car navigation system 11, an engine ECU (Electronic Control Unit) 12, a display system 1, a brake ECU 13, a steering ECU 14, a recognition unit 15, and a detector 16 that communicate with each other via an in-vehicle network NW such as a CAN (Controller Area Network) bus.
The car navigation system 11 has a GNSS (Global Navigation Satellite System) receiver, represented by GPS, which detects the present location of the vehicle, and displays the location of the vehicle on an electronic map. In addition, the car navigation system 11 receives inputs of the departure and destination locations, searches for a route from the departure location to the destination, displays the search result on the electronic map, and guides the occupant of the vehicle in the direction of travel by audio, text (displayed on a display), animation, or the like before a course change. The car navigation system 11 may communicate with a server via a cellular network or the like. In this case, the server can send the electronic map to the vehicle and perform the route search.
The engine ECU 12 controls the fuel injection amount to an ideal value, the advance and retard of the ignition timing, and the valve actuation mechanism, according to information from each sensor and the vehicle conditions. In addition, the engine ECU 12 determines the necessity of a gear change by referring to a map in which shift lines of the transmission gears are set in relation to the current vehicle speed and the accelerator opening. The engine ECU 12 also performs acceleration and deceleration control when the vehicle travels to follow a preceding vehicle. Note that the vehicle may be powered by an electric motor together with, or instead of, the engine.
The brake ECU 13 controls the braking force of each wheel of the vehicle, even without a brake pedal operation by the vehicle occupant, for example in ABS (Antilock Braking System) control, braking control when the vehicle travels to follow a preceding vehicle, automatic braking based on the TTC (Time To Collision) with an obstacle, and control for maintaining a stopped state when starting on a slope.
The steering ECU 14 detects a steering direction and a steering angle of the steering wheel operated by the occupant of the vehicle and performs power steering control for applying steering torque in a steering direction. In addition, the steering ECU 14 performs steering in the direction of avoiding deviation from the traveling lane, in the direction of maintaining the traveling at the center of the traveling lane, or in the direction of avoiding approaching obstacles, without steering wheel operation by the occupant of the vehicle.
The detector 16 includes a variety of sensors configured to detect obstacles around the vehicle. The recognition unit 15 recognizes the shape of the road from white lines or the like appearing in an image captured ahead of the vehicle, performs object recognition to identify what the obstacle detected by the detector 16 is, and recognizes the position (direction and distance) of the obstacle relative to the vehicle. Information such as the vehicle speed and the road shape is entered into the display system 1.
Configuration Example of Detector
Subsequently, a configuration of the detector 16 will be described with reference to FIG. 17. FIG. 17 is a diagram illustrating a configuration example of the detector 16 of the in-vehicle system 2. The detector 16 includes a vehicle speed sensor 161 configured to detect the vehicle speed displayed by the display system 1, a vehicle information sensor 162 configured to acquire vehicle information displayed by the display system 1, a radar sensor 163 and a surround view camera 164 each configured to detect an obstacle, an occupant status information sensor 165 configured to acquire occupant information, which is information associated with an occupant's condition, a VICS receiving device 166 configured to receive traffic information from the VICS (Vehicle Information and Communication System, registered trademark) Center, and an external communication device 167 connected to the Internet.
The sensors of the detector 16 do not need to be integrated with the detector 16, and may be sensors already installed in the vehicle.
The vehicle speed sensor 161 detects, for example, the rotation of a magnet that rotates with a shaft of the drivetrain system, via a sensor unit fixed to the vehicle body, and generates a pulse wave whose frequency is proportional to the rotation speed. The vehicle speed can be detected from the number of pulses per unit time.
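As an illustration of this conversion, the pulse rate can be turned into a speed given two drivetrain-dependent parameters; the pulses per revolution and the tire circumference below are hypothetical values:

```python
# Sketch of converting the pulse train described above into a vehicle speed.
def vehicle_speed_kmh(pulses_per_second: float,
                      pulses_per_revolution: int = 4,
                      tire_circumference_m: float = 1.9) -> float:
    revolutions_per_second = pulses_per_second / pulses_per_revolution
    return revolutions_per_second * tire_circumference_m * 3.6  # m/s -> km/h
```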
The vehicle information sensor 162 includes one or more sensors configured to detect vehicle information other than the vehicle speed. Examples of such sensors include fuel meter sensors, shift lever position sensors, odometers, trip meters, turn signal sensors, and water temperature sensors. These sensors may each have a general configuration for acquiring vehicle information. The fuel meter sensor detects the current remaining fuel. The shift lever position sensor detects the position of the shift lever operated by the occupant of the vehicle. The odometer accumulates the distance traveled by the vehicle and provides the total distance traveled. The trip meter provides the distance traveled by the vehicle from the last initialization performed by the occupant of the vehicle to the present. The turn signal sensor detects the direction indicated by the turn signal operated by the occupant of the vehicle. The water temperature sensor detects the engine cooling water temperature. The above items are merely examples of information that can be obtained from a vehicle, and other information that can be obtained from a vehicle may also be used as vehicle information. For example, in the case of an electric vehicle or a hybrid vehicle, the remaining battery amount, the regenerated power amount, or the power consumption amount may be obtained as vehicle information.
The surround view camera 164 is an imaging device that captures the perimeter of the vehicle. The surround view camera 164 is preferably located so as to image the front, both sides, and the rear of the vehicle. In the fourth embodiment, the surround view camera 164 is located so as to image at least the area ahead of the vehicle. For example, the surround view camera 164 is attached on or near the back of the rearview mirror such that its optical axis is directed horizontally ahead of the vehicle. Alternatively, the surround view camera 164 may be located at the left rear corner or right rear corner of the vehicle, for example on the roof or the bumper. An imaging device located at the rear of the vehicle is called a back monitor, but the surround view camera 164 at the rear of the vehicle is not limited to the back monitor. The surround view camera 164 may also be disposed on the side mirrors, the pillars, the sides of the roof, or the doors.
The surround view camera 164 may be a monocular camera or a stereo camera. When a monocular camera or a stereo camera capable of obtaining distance information is used, the radar sensor 163 is not strictly required. However, when the radar sensor 163 is provided in addition to the surround view camera 164, the distance information from the surround view camera 164 and the distance information from the radar sensor 163 can be fused (integrated) so that the two sensors compensate for each other's weaknesses, yielding highly accurate distance information. Note that a sonic sensor (ultrasonic sensor) or the like may be provided in addition to the radar sensor 163 and the surround view camera 164.
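The embodiment does not specify a fusion method; as one possible scheme, an inverse-variance weighted average combines the two distance estimates so that the more reliable sensor dominates. A minimal sketch, with the variances assumed to come from each sensor's error model:

```python
def fuse_distances(d_camera_m: float, var_camera: float,
                   d_radar_m: float, var_radar: float) -> float:
    # Weight each distance estimate by the inverse of its assumed error variance.
    w_cam = 1.0 / var_camera
    w_rad = 1.0 / var_radar
    return (w_cam * d_camera_m + w_rad * d_radar_m) / (w_cam + w_rad)
```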
The radar sensor 163 transmits radar waves around the vehicle (ahead of, to the sides of, and behind the vehicle) and receives the waves reflected back by an object. The radar sensor 163 may be located such that obstacles around the vehicle can be detected in the same way as with the surround view camera 164. The radar sensor 163 employs a TOF (Time of Flight) technique, in which the distance to an object is detected from the time between transmission and reception, and the direction of the object is detected from the illumination direction of the radar. LIDAR ("Light Detection and Ranging" or "Laser Imaging, Detection, and Ranging") is known as a radar sensor employing the TOF technique. There are also the FMCW (Frequency Modulated Continuous Wave) technique and the FCM (Fast Chirp Modulation) technique, which mix the received waves with the transmitted waves while continuously increasing the frequency of the transmitted waves, and convert the beat frequency of the mixed waves, generated by the slightly different frequencies, into a distance. The FMCW and FCM techniques estimate the direction of an object by detecting the phase shift of the received waves across multiple receiving antennas.
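In outline, both ranging principles reduce to simple formulas. A minimal sketch, with the chirp sweep parameters as assumed inputs rather than values from the embodiment:

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_time_s: float) -> float:
    # TOF: the wave travels to the object and back, so halve the path.
    return C * round_trip_time_s / 2.0

def fmcw_distance_m(beat_frequency_hz: float,
                    sweep_bandwidth_hz: float,
                    sweep_time_s: float) -> float:
    # FMCW: the beat frequency is proportional to the round-trip delay;
    # the chirp slope is sweep bandwidth divided by sweep time.
    chirp_slope = sweep_bandwidth_hz / sweep_time_s
    return C * beat_frequency_hz / (2.0 * chirp_slope)
```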
The occupant status information sensor 165 is a sensor configured to detect occupant status information directly or indirectly from an occupant of the vehicle. A typical example of the occupant status information sensor 165 is a face camera. The face camera captures the face of an occupant of the vehicle and authenticates it to identify the occupant. The occupant status information sensor 165 can also detect the face direction and the line-of-sight direction from the face image.
Other examples of the occupant status information sensor 165 may include an electrocardiogram sensor, a heart rate sensor, a blood pressure sensor, a body temperature sensor, a pulse sensor, a respiration sensor, a perspiration sensor, a blinking sensor, a pupil sensor, an electroencephalogram sensor, or a myoelectric potential sensor.
The occupant status information sensor 165 may also take the form of a device worn by an occupant of the vehicle, such as a wristwatch-type wearable terminal (smartwatch).
The VICS receiving device 166 receives radio waves delivered by VICS. VICS is a system that transmits traffic information, such as traffic congestion and traffic restrictions, to in-vehicle devices in real time using FM multiplex broadcasts or beacons. The external communication device 167 connects to the Internet via a network such as 3G, 4G, LTE, 5G, or wireless LAN, and receives various information. For example, the external communication device 167 can receive weather information concerning rain, snow, fog, and the like. The external communication device 167 may also receive news, music, videos, and so on. The external communication device 167 can also acquire, for example, traffic light state information and the time until a traffic light changes. Thus, the VICS receiving device 166 and the external communication device 167 may perform roadside-to-vehicle communication. In addition, the external communication device 167 may acquire information detected by another vehicle 6 via vehicle-to-vehicle communication.
In addition, the advanced driver-assistance system (ADAS) not only displays information and provides warnings but may also control the vehicle. In this case, a control ECU links the engine ECU 12, the brake ECU 13, and the steering ECU 14 to provide various kinds of driving assistance based on the distance information associated with obstacles detected by at least one of the radar sensor 163 or the surround view camera 164. For example, the control ECU performs acceleration/deceleration control while following a preceding vehicle, automatic braking, avoidance of deviation from the traveling lane, lane-keeping traveling, and obstacle-avoidance steering. For such control, the recognition unit 15 recognizes road signs, road paint such as white lines, and the like from the image data captured by the imaging device that captures the scenery ahead of the vehicle.
In acceleration/deceleration control while following a preceding vehicle, the control ECU controls the driving power and braking power so as to maintain a target distance according to the vehicle speed. Automatic braking includes a warning, an indication to press the brake pedal, tightening of the seat belt, and braking to avoid a collision when the possibility of a collision is high, which are performed according to the TTC (time to collision). To avoid deviation from the traveling lane, a driving assist ECU 36 (not illustrated) recognizes a white line (travel segment line) from the image data and applies steering torque in the direction opposite to the direction of deviation. For lane-keeping traveling, the center of the traveling lane is set as the target traveling line, and steering torque proportional to the deviation from the target traveling line is applied in the direction opposite to the deviation. For obstacle-avoidance steering, when a collision cannot be avoided by braking, a traveling line for avoiding the obstacle is determined, and steering torque for traveling along the determined line is applied.
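The TTC-based staging could be sketched as below; the numeric thresholds are illustrative assumptions only, since the embodiment does not specify them.

```python
def time_to_collision_s(gap_m: float, closing_speed_m_s: float) -> float:
    # TTC = gap / closing speed; infinite when the gap is not closing.
    if closing_speed_m_s <= 0.0:
        return float("inf")
    return gap_m / closing_speed_m_s

def braking_stage(ttc_s: float) -> str:
    # Illustrative thresholds only; real systems tune these per vehicle.
    if ttc_s <= 0.6:
        return "automatic braking, seat belt tightening"
    if ttc_s <= 1.6:
        return "indication to press the brake pedal"
    if ttc_s <= 2.6:
        return "warning"
    return "no action"
```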
When the vehicle changes lanes, for example, and the radar sensor 163 or the surround view camera 164 detects another vehicle traveling in an area (blind area) of the adjacent lane that is not reflected in the door mirror, the occupant is warned. Such assistance is called a blind spot monitor.
Hardware Configuration Example of Display Controller
Next, a hardware configuration example of the display controller 20 will be described with reference to FIG. 18. FIG. 18 is a diagram illustrating a hardware configuration example of the display controller 20. The display controller 20 includes an FPGA 201, a CPU 202, a ROM 203, a RAM 204, an I/F 205, a bus line 206, an LD driver 207, and a MEMS controller 208. The FPGA 201, CPU 202, ROM 203, RAM 204, and I/F 205 are interconnected via the bus line 206.
The CPU 202 controls each function of the display controller 20. The ROM 203 stores the program 203p executed by the CPU 202 to control each function of the display controller 20. The program 203p is loaded into the RAM 204, which the CPU 202 uses as a work area for executing the program 203p. The RAM 204 also includes an image memory 209. The image memory 209 is used to generate the image that is projected as the virtual image I. The I/F 205 is an interface for communicating with the recognition unit 15 and the detector 16, and is connected to, for example, a CAN (Controller Area Network) bus or Ethernet (registered trademark) of the vehicle.
The FPGA 201 controls the LD driver 207 based on an image created by the CPU 202. The LD driver 207 drives the LD of the light source unit 101 of the display unit 10 to control the emission of the LD according to the image. The FPGA 201 operates the light deflector 102 of the display unit 10 through the MEMS controller 208 such that the laser beam is deflected in a direction corresponding to a pixel position of the image.
Function of Display Controller
Subsequently, functions of the recognition unit 15 and the display controller 20 will be described with reference to FIGS. 19 to 21. FIG. 19 is an example of a functional block diagram illustrating functions of the recognition unit 15 and the display controller 20 in blocks. First, the recognition unit 15 includes a vehicle speed acquiring unit 31, a road shape estimator 32, and a steering angle determining unit 33.
The vehicle speed acquiring unit 31 preferably acquires the current vehicle speed from the detector 16 periodically. The road shape estimator 32 acquires the road shape in the traveling direction of the vehicle. When the recognition unit 15 recognizes a white line using a stereo camera or a laser, it obtains the three-dimensional coordinates of the white line, so that the curvature of the curve can be estimated from those coordinates.
FIGS. 20A and 20B are diagrams illustrating a method for estimating curvature of a curve. FIG. 20A schematically illustrates recognized white lines and FIG. 20B illustrates coordinates of the white lines in an XZ plane.
The XZ plane corresponds to a road surface. The road shape estimator 32 detects white line coordinates Sn (where n is a natural number) at predetermined intervals from internal edges or the like of the right white line and the left white line.
Then, the coordinates of the center between the left and right white line coordinates Sn are calculated, and the center coordinates are graphed as illustrated in FIG. 20B. Note that FIG. 20B is merely a descriptive diagram, and the graphing step need not actually be performed.
The road shape estimator 32 determines the radius of a circle by fitting the white line coordinates Sn to the equation of a circle. The radius (or curvature) of the frontward curve can be estimated by periodically repeating this process. The white line coordinates Sn may instead be fitted to a clothoid curve. Alternatively, the shape of the road, such as the radius of a curve, may be obtained from the car navigation system 11.
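A minimal sketch of such a fit, using the algebraic (Kasa) least-squares method as one possible choice; the embodiment only states that the coordinates are fitted to the equation of a circle.

```python
import numpy as np

def fit_circle_radius(points_xz: np.ndarray) -> float:
    """Fit x^2 + z^2 + a*x + b*z + c = 0 to centerline points S_n
    and return the radius; the curvature is then k = 1 / radius."""
    p = np.asarray(points_xz, dtype=float)
    x, z = p[:, 0], p[:, 1]
    # Solve the linear least-squares system for (a, b, c).
    A = np.column_stack([x, z, np.ones_like(x)])
    rhs = -(x ** 2 + z ** 2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    # Center is (-a/2, -b/2); radius follows from completing the square.
    return float(np.sqrt(a ** 2 / 4 + b ** 2 / 4 - c))
```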
In FIG. 20, the road shape estimator 32 estimates the road shape from the white lines, but the road shape may also be estimated from features and structures associated with the road shape, such as curbs and guardrails.
How far ahead the radius (or curvature) of the frontward curve is estimated may be predetermined or may vary with the vehicle speed. When, for example, 3 seconds are required for the driver to move from the current steering angle to the target steering angle with a sufficient margin, how far ahead to estimate the radius of the frontward curve is determined by "vehicle speed v × 3 seconds".
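In code form, this lookahead rule is a one-liner; the 3-second settle time is the example value from the text, not a fixed requirement.

```python
def lookahead_distance_m(vehicle_speed_m_s: float,
                         settle_time_s: float = 3.0) -> float:
    # "vehicle speed v x 3 seconds": how far ahead to estimate the curve radius.
    return vehicle_speed_m_s * settle_time_s
```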
Referring back to FIG. 19, the following description will be given. The steering angle determining unit 33 determines the target steering angle based on the vehicle speed of the vehicle and the estimated road shape. Various ways of determining the correspondence between vehicle speed, road shape, and steering angle have been proposed. The following is an example.
FIG. 21 is a diagram illustrating a method of determining the target steering angle. First, the yaw rate ω is related to the vehicle speed v and the curvature k of the curve. This relationship is represented by the following equation (1).
ω = v × k ... (1)
The radius r of the curve and the curvature k have a relationship represented by the following equation (2).
r = 1/k ... (2)
Referring to FIG. 21, since the steering angle while traveling along the curve having the radius r is α, the following equation (3) is obtained:
sin(α) = W/r ... (3)
where W is the wheelbase.
Applying equation (2) to equation (3) gives the following equation (4).
sin(α) = W × k
α = sin⁻¹(W × k) ... (4)
Substituting equation (1) into equation (4) gives the following equation (5).
α = sin⁻¹(W × ω/v) ... (5)
Equation (5) shows that, when the vehicle travels along a curve with the radius r at the vehicle speed v, an appropriate steering angle can be calculated, provided that a preferred yaw rate is given.
The steering angle determining unit 33 transmits the target steering angle determined by equation (5) to the image generator 34 of the display controller 20. The preferred yaw rate can be determined experimentally by the vehicle manufacturer or the like such that the change in lateral G is a predetermined amount or less.
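Equation (5) translates directly into code. A minimal sketch, assuming the wheelbase W and the preferred yaw rate are supplied from the manufacturer's calibration:

```python
import math

def target_steering_angle_rad(wheel_base_m: float,
                              preferred_yaw_rate_rad_s: float,
                              vehicle_speed_m_s: float):
    # Equation (5): alpha = arcsin(W * omega / v).
    if vehicle_speed_m_s <= 0.0:
        return None  # undefined when the vehicle is stopped
    arg = wheel_base_m * preferred_yaw_rate_rad_s / vehicle_speed_m_s
    if abs(arg) > 1.0:
        return None  # outside the domain of arcsin
    return math.asin(arg)
```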
Referring back to FIG. 19, the following description will be given. The display controller 20 includes an image generator 34 and an output unit 35. These functions of the display controller 20 are implemented by the CPU 202 executing the program 203p, which is loaded from the ROM 203 of the display controller 20 into the RAM 204.
The image generator 34 applies the current steering angle 303 and the target steering angle 304 to the display component 302 and generates an image for causing the display unit 10 to project the display component 302. An example of this display component 302 is illustrated in FIGS. 13A and 13B. Since the display unit 10 projects images in color, the image generator 34 can render the current steering angle 303 and the target steering angle 304 in different colors to generate an image that the driver can understand easily.
The output unit 35 outputs the display component representing the current steering angle 303 and the target steering angle 304 generated by the image generator 34. That is, the output unit 35 controls the LD driver 207 and the MEMS controller 208 so as to display the display component generated by the image generator 34 on the display unit 10.
Examples of Display Indicating Current and Target Steering Angles
FIGS. 22A to 22C are diagrams each illustrating a display example of the current steering angle 303 and the target steering angle 304. First, unlike FIGS. 13A and 13B, the display component 310 of FIG. 22A has a shape that imitates a steering wheel. For purposes of illustration, FIG. 22A is black and white; however, a steering icon with low brightness (or a predetermined color) indicates the current steering angle 303, and a steering icon with high brightness (or a color different from the former) indicates the target steering angle 304. Since the rotation angle of the steering icon matches the actual steering angle, a separate indication of the current steering angle 303 may not be required. This also applies to FIGS. 13A and 13B.
With such a display component, the user steers such that the low-brightness steering icon is superimposed on the high-brightness steering icon. This facilitates following and maintaining the appropriate steering angle.
FIG. 22B illustrates a display component 302 similar to that of FIGS. 13A and 13B. However, when the difference between the current steering angle 303 and the target steering angle 304 is equal to or greater than a threshold value, the image generator 34 displays the target steering angle 304 with emphasis. For example, when the difference between the current steering angle 303 and the target steering angle 304 is equal to or greater than the threshold value, the image generator 34 generates an image of a display component in which the target steering angle 304 blinks, that is, an image in which the brightness and color of the target steering angle 304 change over a short period.
Alternatively, when the difference between the current steering angle 303 and the target steering angle 304 is equal to or greater than the threshold value, the target steering angle 304 is rendered in a color with higher alertness. For example, when the difference between the current steering angle 303 and the target steering angle 304 is less than the threshold value, the target steering angle 304 is displayed in yellow, and when the difference is equal to or greater than the threshold value, the target steering angle 304 is displayed in red.
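The emphasis rule can be summarized as below; the 15-degree threshold is a hypothetical value, since the embodiment only states that a threshold is used.

```python
def target_angle_style(current_deg: float, target_deg: float,
                       threshold_deg: float = 15.0) -> dict:
    # Emphasize the target steering angle 304 when the driver is far off it.
    if abs(current_deg - target_deg) >= threshold_deg:
        return {"color": "red", "blink": True}
    return {"color": "yellow", "blink": False}
```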
The target steering angle 304 may also be displayed as larger or smaller than the angle actually required.
In this way, an excessive or insufficient steering amount can be emphasized.
As a method of displaying the angle larger or smaller than the actual one, for example, the image generator 34 monitors the difference between the actual steering angle of the vehicle and the target steering angle 304 to learn the driver's steering tendency. When the driver tends to steer more than necessary, the target steering angle 304 is displayed smaller than the actually required angle, and when the driver tends to steer less than necessary, the target steering angle 304 is displayed larger than the actually required angle. In this way, the target steering angle 304 can be displayed according to the driver's steering tendency.
FIG. 22C illustrates the display component 302 when the current steering angle 303 exceeds the target steering angle 304. In FIG. 22C, the image generator 34 can inform the user that it is desirable to reduce the steering wheel angle, so that the current steering angle 303 becomes equal to or smaller than the target steering angle 304, in order to travel along the curve ahead of the vehicle.
FIG. 23 is a diagram illustrating the relationship between the actual steering angle of the vehicle and the target steering angle 304. In this case, the vehicle is about to travel along a right-turn curve. At t = 0, the target steering angle 304 is α0, and the actual steering angle after t seconds is αt. The value of t is determined by how many seconds later the steering angle is expected to reach the target steering angle 304. The image generator 34 stores the target steering angle 304 and the actual steering angle after t seconds to detect the driver's steering tendency. That is, the image generator 34 detects the difference between the target steering angle and the actual steering angle after t seconds, and generates an image in which the difference is added to, or subtracted from, the target steering angle. For example, consider the following case:
Target steering angle 304 - Actual steering angle after t seconds = +α degrees.
In this case, since the driver tends to steer less than necessary, the image generator 34 generates an image of the display component 302 in which the target steering angle 304 plus α degrees is displayed as the target steering angle 304.
Further, the following case is considered:
Target steering angle 304 - Actual steering angle after t seconds = -α degrees.
In this case, since the driver tends to steer more than necessary, the image generator 34 generates an image of the display component 302 in which the target steering angle 304 minus α degrees is displayed as the target steering angle 304.
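Both cases reduce to adding the observed error back onto the displayed target. A minimal sketch of this adjustment:

```python
def adjusted_target_deg(target_deg: float, actual_after_t_deg: float) -> float:
    # error = +alpha: the driver steers less than needed -> show a larger target.
    # error = -alpha: the driver steers more than needed -> show a smaller target.
    error = target_deg - actual_after_t_deg
    return target_deg + error
```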
Operation Procedure
FIG. 24 is a flowchart illustrating an example of a procedure in which the display controller 20 presents the current steering angle 303 and the target steering angle 304. The process of FIG. 24 is performed repeatedly while the ignition or the system of the vehicle is ON.
First, the detector 16 detects a current steering angle 303 (S1). In addition, the vehicle speed acquiring unit 31 detects a vehicle speed (S2).
The road shape estimator 32 estimates the road shape ahead of the vehicle (S3). For example, the curvature and the turning direction of the curve are detected.
The steering angle determining unit 33 determines the target steering angle 304 from the vehicle speed, the curvature of the curve, and a preferred yaw rate (S4).
The image generator 34 generates an image of the display component 302 including the current steering angle 303 and the target steering angle 304 (S5). The output unit 35 outputs the image to the display unit 10, which projects the image including the display component 302 (S6).
Thus, the display component 302 is projected onto the windshield, and the current steering angle 303 and the target steering angle 304 are presented to the driver as virtual images.
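As a compact summary of steps S1 to S6, one cycle of the procedure can be sketched as below, with each unit of FIG. 19 passed in as a callable; the parameter names are placeholders for illustration, not the embodiment's API.

```python
def display_cycle(detect_angle, get_speed, estimate_road,
                  determine_target, generate_image, project) -> None:
    current = detect_angle()                     # S1: current steering angle 303
    speed = get_speed()                          # S2: vehicle speed
    curvature, direction = estimate_road()       # S3: road shape ahead
    target = determine_target(speed, curvature)  # S4: target steering angle 304
    image = generate_image(current, target)      # S5: display component 302
    project(image)                               # S6: project as a virtual image
```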
Major Advantageous Effects
As described above, the display system according to the fourth embodiment sequentially displays a target steering angle at which the amount of change in the lateral G is a predetermined amount or less while the vehicle travels along a curve. This facilitates the driver's steering and helps the driver drive the vehicle comfortably.
Other Preferred Embodiments
While the preferred embodiment of the present invention has been described with reference to examples, various modifications and substitutions may be made without departing from the spirit and scope of the invention.
For example, the fourth embodiment has illustrated the target steering angle while driving on a road; however, the target steering angle may also be presented when parking. In this case, when a parking frame is marked by white lines or the like, the recognition unit 15 recognizes the parking frame. When the parking frame is not marked by white lines or the like, a space is recognized by an ultrasonic sensor or the like, and the recognition unit 15 determines a parking frame. The recognition unit 15 sets a moving line to the parking frame, and the display system displays the steering angle for moving along the moving line.
For example, the display system 1 may be a liquid crystal display or the like. Further, the display controller 20 and the display unit 10 may be separated from each other and distributed as separate housings. For example, a smartphone may be used as the display controller 20, and information may be displayed on the display built into the smartphone, or a virtual image may be displayed on a combiner.
In addition, the information displayed by the display system 1 is not limited to the vehicle speed; information obtained from inside the vehicle, or from outside the vehicle via the Internet, can also be displayed.
Further, not all the processing needs to be performed by the display system 1 installed in the vehicle; part or all of the processing may be performed by a server that communicates with the vehicle via a network. For example, the vehicle can transmit a frontward image to the server, and the server can determine at least one of the road shape or the target steering angle and return it to the vehicle. In this way, the processing load on the display system 1 installed in the vehicle can be reduced.
Further, the configuration example of FIG. 19 is divided according to main functions to facilitate understanding of the processing performed by the display controller 20. The invention of the present application is not limited by the way the processing units are divided or by their names. The processing of the display controller 20 may be divided into more processing units according to the processing contents. Alternatively, one processing unit may be divided so as to include more processing operations.
The functions of the embodiments described above may be implemented by one or more processing circuits.
As used herein, a "processing circuit" includes a processor programmed to perform each function by software, such as a processor implemented in electronic circuits, an ASIC (Application Specific Integrated Circuit) designed to perform each function as described above, a digital signal processor (DSP), a field programmable gate array (FPGA), or a conventional circuit module.
While the preferred embodiments of the invention have been described in detail above, the invention is not limited to these embodiments, and various modifications or variations are possible within the scope of the invention as defined in the appended claims.

1  display system
2  in-vehicle system
10  display unit
11  car navigation system
12  engine ECU
13  brake ECU
14  steering ECU
15  recognition unit
16  detector
20  display controller
100,100A  in-vehicle display device
102  vehicle information acquiring unit
104  environmental information acquiring unit
106  driver information acquiring unit
110  calculator
112  image generator
114  image display unit
120  head-up display
300  explicit risk notification image
400  explicit risk notification image
500  implicit risk notification image

The present application is based on and claims priority of Japanese Priority Application No. 2019-236313 filed on December 26, 2019, Japanese Priority Application No. 2020-002962 filed on January 10, 2020, and Japanese Priority Application No. 2020-204820 filed on December 10, 2020, the entire contents of which are hereby incorporated herein by reference.

Claims (7)

  1.     An in-vehicle display device installed in a vehicle, the in-vehicle display device comprising:
        a calculator configured to calculate a distance from the vehicle to an object;
        an image generator configured to
          generate a first image to attract attention to the object by using a character string alone or by using the character string and a graphic, in response to the distance calculated by the calculator being a predetermined first threshold value or less, or
          generate a second image to attract attention to the object by using the graphic alone, in response to the distance calculated by the calculator being greater than the first threshold value; and
        a display unit configured to display the first image or the second image, the first image or the second image being generated by the image generator using a display device.


  2.     The in-vehicle display device according to claim 1, wherein the display device is a head-up display configured to display the first image or the second image ahead of a driver of the vehicle, the first image or the second image being generated by the image generator.


  3.     The in-vehicle display device according to claim 2, wherein the display device displays the second image at a position superimposed on the object, from a viewpoint of the driver of the vehicle, in response to the second image being generated by the image generator.


  4.     The in-vehicle display device according to any one of claims 1 to 3, wherein the graphic has an animation effect.


  5.     The in-vehicle display device according to any one of claims 1 to 4, wherein neither the first image nor the second image is displayed, in response to the distance calculated by the calculator being greater than a predetermined second threshold value, the predetermined second threshold value being greater than the predetermined first threshold value.


  6.     The in-vehicle display device according to any one of claims 1 to 5, wherein the character string represents a type of the object or an action to be taken by the driver of the vehicle.


  7.     A program for causing a computer to function as elements of an in-vehicle display device installed in a vehicle, the elements comprising:
        a calculator configured to calculate a distance from the vehicle to an object;
        an image generator configured to
          generate a first image to attract attention to the object by using a character string alone or by using the character string and a graphic, in response to the distance calculated by the calculator being a predetermined first threshold value or less, or
          generate a second image to attract attention to the object by using the graphic alone, in response to the distance calculated by the calculator being greater than the first threshold value; and
        a display unit configured to display the first image or the second image, the first image or the second image being generated by the image generator using a display device.

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP20842331.9A EP4081420A1 (en) 2019-12-26 2020-12-22 In-vehicle display device and program

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2019-236313 2019-12-26
JP2019236313 2019-12-26
JP2020-002962 2020-01-10
JP2020002962A JP2021109555A (en) 2020-01-10 2020-01-10 Display control device, system, display system and information display method
JP2020-204820 2020-12-10
JP2020204820A JP2021105989A (en) 2019-12-26 2020-12-10 Onboard display device and program

Publications (1)

Publication Number Publication Date
WO2021132250A1

Family

ID=74186788

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/047985 WO2021132250A1 (en) 2019-12-26 2020-12-22 In-vehicle display device and program

Country Status (2)

Country Link
EP (1) EP4081420A1 (en)
WO (1) WO2021132250A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001023094A (en) 1999-07-12 2001-01-26 Nissan Motor Co Ltd Semi-automatic driving system
JP5177105B2 (en) 2009-09-24 2013-04-03 株式会社デンソー Driving support display device
US20180264940A1 (en) * 2017-03-15 2018-09-20 Subaru Corporation Vehicle display system and method of controlling vehicle display system
US20190204598A1 (en) * 2017-12-28 2019-07-04 Toyota Jidosha Kabushiki Kaisha Display control device and display control method
US20190213932A1 (en) * 2016-09-26 2019-07-11 Fujifilm Corporation Projection display device, projection display method, and projection display program
JP2020002962A (en) 2018-06-25 2020-01-09 ナブテスコ株式会社 Pressure control valve
JP2020204820A (en) 2019-06-14 2020-12-24 マツダ株式会社 Outside environment recognition device


Also Published As

Publication number Publication date
EP4081420A1 (en) 2022-11-02

Similar Documents

Publication Publication Date Title
EP3415373B1 (en) Information providing device
US8536995B2 (en) Information display apparatus and information display method
US11850941B2 (en) Display control apparatus, display apparatus, display system, moving body, program, and image generation method
CN110786004B (en) Display control device, display control method, and storage medium
US11547361B2 (en) Display controller, display device, display system, mobile object, image generation method, and carrier means
CN113165513A (en) Head-up display, display system for vehicle, and display method for vehicle
JP6504431B2 (en) IMAGE DISPLAY DEVICE, MOBILE OBJECT, IMAGE DISPLAY METHOD, AND PROGRAM
EP3770898A1 (en) Image display system, information processing device, information processing method, program, and moving body
JP7300112B2 (en) Control device, image display method and program
US11752940B2 (en) Display controller, display system, mobile object, image generation method, and carrier means
WO2021132259A1 (en) Display apparatus, display method, and program
WO2021132250A1 (en) In-vehicle display device and program
JP2021105989A (en) Onboard display device and program
JP2022152607A (en) Driving support device, driving support method, and program
JP2021117089A (en) Display device and method for display
JP2021117704A (en) Display device and method for display
WO2021149740A1 (en) Display apparatus, movable body, display method, program, and non-transitory recording medium
WO2021132408A1 (en) Display apparatus, moving body, display method, and program
JP2021109555A (en) Display control device, system, display system and information display method
WO2021153454A1 (en) Image display apparatus, image display method, program, and non-transitory recording medium
JP2021117220A (en) Display device, mobile body, method for display, and program
JP2021104803A (en) Display device, display method and program
JP2021117987A (en) Image display device, image display method, and program
JP2022044623A (en) Image display method and image display device
JP2021117938A (en) Image display device, image display method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20842331

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020842331

Country of ref document: EP

Effective date: 20220726