WO2016129219A1 - Vehicle display control device and vehicle display unit - Google Patents

Vehicle display control device and vehicle display unit

Info

Publication number
WO2016129219A1
WO2016129219A1 (PCT/JP2016/000371)
Authority
WO
WIPO (PCT)
Prior art keywords
virtual image
image display
vehicle
emphasized
display control
Prior art date
Application number
PCT/JP2016/000371
Other languages
English (en)
Japanese (ja)
Inventor
真吾 柴田
彩子 小谷
Original Assignee
株式会社デンソー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2015236915A external-priority patent/JP6520668B2/ja
Application filed by 株式会社デンソー
Priority to US15/549,489 priority Critical patent/US20180024354A1/en
Publication of WO2016129219A1 publication Critical patent/WO2016129219A1/fr

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement or adaptations of instruments
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems

Definitions

  • the present disclosure relates to a vehicle display control device and a vehicle display unit including the same.
  • Patent Documents 1 and 2 disclose vehicle display control techniques in which a head-up display (HUD) displays, as a virtual image, an emphasized image that emphasizes a front obstacle.
  • In Patent Document 1, the virtual image display position and the virtual image display size are controlled so that an annular, line-shaped emphasized image is superimposed on the front obstacle seen through the projection member. According to this, even if the virtual image display position of the emphasized image deviates within the range of the control error due to disturbance or the like, the association of the emphasized image with the front obstacle can be maintained in the superimposed state.
  • In Patent Document 2, even when the emphasized image shifts upward, leftward, or rightward with respect to the front obstacle, the shifted image still looks to the user as if it points at the obstacle on the same plane. This is because there is generally open space above, to the left, and to the right of the front obstacle, so it is difficult for the user to perceive a separation in the front-rear direction between the image and the front obstacle.
  • However, when the emphasized image shifts downward, the linear portion extending left and right below the front obstacle is easily perceived by the user as separated from the obstacle in the front-rear direction.
  • As a result, the association with the front obstacle may become ambiguous and the emphasis effect may be reduced, or the user may have the illusion that the front obstacle is farther away.
  • This disclosure is intended to provide a vehicle display control device that properly emphasizes a front obstacle by displaying a virtual image of an emphasized image, and a vehicle display unit including the vehicle display control device.
  • According to a first aspect of the present disclosure, a vehicle is equipped with a head-up display that projects a display image onto a projection member that transmits an outside scene, thereby displaying the display image as a virtual image in association with at least one front obstacle in the outside scene.
  • The vehicle display control device that controls this virtual image display includes an image storage device that stores, as the display image, an emphasized image that emphasizes the front obstacle by a linear portion which surrounds the obstacle at a virtual image display size leaving a margin with the obstacle, at a virtual image display position covering the entire range of less than one round around the front obstacle except its lower part.
  • The device further includes a virtual image display control device, constructed by at least one processor, that controls the virtual image display position and the virtual image display size.
  • According to this, the emphasized image serving as the display image that emphasizes the front obstacle in the outside scene is controlled to a virtual image display size that surrounds the front obstacle by the linear portion, which leaves the margin open, at the virtual image display position covering the entire range of less than one round around the obstacle except its lower part. Therefore, because the emphasized image is superimposed on the space of the outside scene above, to the left, and to the right of the front obstacle, it still appears to point at the obstacle even if the user perceives a deviation from it, and it becomes difficult for the user to feel a separation in the front-rear direction from the front obstacle.
  • Even if the virtual image display position of the emphasized image shifts within the range of the control error due to disturbance or the like, the association of the emphasized image with the front obstacle can be maintained, and the illusion that the front obstacle is farther away can also be avoided.
  • Moreover, even if the virtual image display position of the emphasized image deviates within the range of the control error, the margin opened by the linear portion prevents part of the front obstacle from being hidden by the emphasized image, so that user annoyance can be suppressed.
  • the front obstacle can be properly emphasized by the virtual image display of the emphasized image.
  • According to a second aspect of the present disclosure, a vehicle is likewise equipped with a head-up display that projects a display image onto a projection member that transmits an outside scene, thereby displaying the display image as a virtual image in association with at least one front obstacle in the outside scene.
  • The vehicle display control device that controls this virtual image display includes an image storage device that stores, as the display image, an emphasized image that emphasizes the front obstacle by a first linear portion at a first virtual image display position covering the entire range of less than one round around the front obstacle except its lower part, and by a second linear portion at a second virtual image display position.
  • The device further includes a virtual image display control device, constructed by at least one processor, that controls the virtual image display position, including the first and second virtual image display positions, and the virtual image display size, including the first and second virtual image display sizes.
  • According to this, the emphasized image serving as the display image that emphasizes the front obstacle in the outside scene is controlled to a virtual image display size that surrounds the front obstacle by the first linear portion, which leaves the margin open, at the virtual image display position covering the entire range of less than one round around the obstacle except its lower part. Therefore, because the first linear portion overlaps the open space of the external scene above, to the left, and to the right of the front obstacle, it still appears to point at the obstacle even if the user perceives a deviation from it, and it becomes difficult for the user to feel a separation in the front-rear direction from the front obstacle.
  • Furthermore, the emphasized image is controlled so that the second linear portion is displayed, at a virtual image display size surrounding the obstacle, at the second virtual image display position between both ends of the first linear portion around the front obstacle.
  • Since the second linear portion is displayed at a lower luminance, the user's gaze point tends to gather on the first linear portion side rather than the second linear portion side. Therefore, the association of the low-luminance second linear portion with the ground is weakened, and the user is less likely to feel a separation in the front-rear direction from the front obstacle.
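The second-aspect structure above (a bright first linear portion covering less than one round, plus a dimmer second linear portion closing the lower gap) can be sketched as data. This is an illustrative sketch only: the function name, the 90-degree gap, and the 0.4 luminance ratio are assumptions; the disclosure only requires the second linear portion to be lower-luminance than the first.

```python
def emphasized_image_segments(base_luminance=1.0, second_ratio=0.4,
                              gap_deg=90.0):
    """Describe the two linear portions of the second-aspect emphasized image.

    The first linear portion covers less than one round (360 - gap_deg
    degrees) around the front obstacle, leaving its lower side open; the
    second linear portion fills the remaining range between the first
    portion's ends and is drawn dimmer, so the user's gaze gathers on
    the first portion side. The 0.4 ratio is an illustrative assumption.
    """
    first = {"sweep_deg": 360.0 - gap_deg, "luminance": base_luminance}
    second = {"sweep_deg": gap_deg, "luminance": base_luminance * second_ratio}
    return first, second

first, second = emphasized_image_segments()
# Together the two portions close the full circle; the lower (second)
# portion is the low-luminance side.
```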
  • a vehicle display unit includes the vehicle display control device according to the first or second aspect and the head-up display.
  • According to this, since the virtual image display position and the virtual image display size of the emphasized image displayed by the HUD are controlled by the vehicle display control device of the first or second aspect, the front obstacle can be properly emphasized.
  • FIG. 1 is an interior view showing a passenger compartment of a host vehicle equipped with a travel assist system according to the first embodiment.
  • FIG. 2 is a block diagram showing the driving assist system according to the first embodiment.
  • FIG. 3 is a structural diagram schematically showing the detailed configuration of the HUD of FIGS.
  • FIG. 4 is a front view showing a virtual image display state by the HUD of FIGS.
  • FIG. 5 is a flowchart showing a display control flow by the HCU of FIG.
  • FIG. 6 is a front view for explaining the effects of the first embodiment.
  • FIG. 7 is a flowchart showing a display control flow according to the second embodiment.
  • FIG. 8 is a front view showing a virtual image display state according to the second embodiment.
  • FIG. 9 is a flowchart showing a display control flow according to the third embodiment.
  • FIG. 10 is a front view showing a virtual image display state according to the third embodiment.
  • FIG. 11 is a front view showing a virtual image display state according to the fourth embodiment.
  • FIG. 12 is a flowchart showing a display control flow according to the fourth embodiment.
  • FIG. 13 is a front view for explaining the effects of the fourth embodiment.
  • FIG. 14 is a flowchart showing a display control flow according to the fifth embodiment.
  • FIG. 15 is a front view showing a virtual image display state according to the fifth embodiment.
  • FIG. 16 is a flowchart showing a display control flow according to the sixth embodiment.
  • FIG. 17 is a front view showing a virtual image display state according to the sixth embodiment.
  • FIG. 18 is a flowchart showing a display control flow according to the seventh embodiment.
  • FIG. 19 is a front view showing a virtual image display state according to the seventh embodiment.
  • FIG. 20 is a flowchart showing a display control flow according to the eighth embodiment.
  • FIG. 21 is a front view showing a virtual image display state according to the eighth embodiment.
  • FIG. 22 is a flowchart showing a display control flow according to the ninth embodiment.
  • FIG. 23 is a front view showing a virtual image display state according to the ninth embodiment.
  • FIG. 24 is a flowchart showing a display control flow according to the tenth embodiment.
  • FIG. 25 is a front view showing a virtual image display state according to the tenth embodiment.
  • FIG. 26 is a flowchart showing a display control flow according to the eleventh embodiment.
  • FIG. 27 is a front view showing a virtual image display state according to the eleventh embodiment.
  • FIG. 28 is a flowchart showing a modification of FIG.
  • FIG. 29 is a flowchart showing a modification of FIG.
  • FIG. 30 is a flowchart showing a modification of FIG.
  • FIG. 31 is a flowchart showing a modification of FIG.
  • FIG. 32 is a flowchart showing a modification of FIG.
  • FIG. 33 is a flowchart showing a modification of FIG.
  • FIG. 34 is a flowchart showing a modification of FIG.
  • FIG. 35 is a flowchart showing a modification of FIG.
  • FIG. 36 is a flowchart showing a modification of FIG.
  • FIG. 37 is a front view showing a modification of FIG.
  • FIG. 38 is a front view showing a modification of FIG.
  • FIG. 39 is a front view showing a modification of FIG.
  • FIG. 40 is a front view showing a modification of FIG.
  • FIG. 41 is a block diagram showing a modification of FIG.
  • the travel assist system 1 according to the first embodiment to which the present disclosure is applied is mounted on the host vehicle 2 as illustrated in FIGS.
  • the travel assist system 1 includes a periphery monitoring system 3, a vehicle control system 4, and a display system 5.
  • the systems 3, 4, and 5 of the travel assist system 1 are connected via an in-vehicle network 6 such as a LAN (Local Area Network).
  • the periphery monitoring system 3 includes an external sensor 30 and a periphery monitoring ECU (Electronic Control Unit) 31.
  • The external sensor 30 detects, for example, other vehicles, artificial structures, human beings, animals, and the like as obstacles that exist in the external environment of the host vehicle 2 and may collide with it, as well as traffic displays that exist in the external environment.
  • the external sensor 30 is, for example, one type or plural types of sonar, radar, camera, and the like.
  • the sonar is an ultrasonic sensor installed in, for example, the front part or the rear part of the host vehicle 2.
  • The sonar detects an obstacle in the detection area by receiving the reflected wave of the ultrasonic wave transmitted to the detection area in the external environment of the host vehicle 2, and outputs a detection signal.
  • the radar is a millimeter wave sensor or a laser sensor installed in the front part or the rear part of the host vehicle 2.
  • The radar detects an obstacle in the detection area by receiving the reflected wave of the millimeter wave, quasi-millimeter wave, or laser transmitted to the detection area in the external environment of the host vehicle 2, and outputs a detection signal.
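For orientation, sonar and radar of the kind described both recover distance from the round-trip time of the transmitted wave. The conversion below is standard time-of-flight physics, not something specified by the disclosure; the function name and example values are illustrative.

```python
SPEED_OF_SOUND_M_S = 343.0          # in air at ~20 degC (sonar)
SPEED_OF_LIGHT_M_S = 299_792_458.0  # vacuum (radar / laser)

def echo_distance_m(round_trip_s: float, wave_speed_m_s: float) -> float:
    """Distance to a reflector from the round-trip time of a transmitted wave.

    The wave travels out to the obstacle and back, so the one-way
    distance is half the round-trip path length.
    """
    if round_trip_s < 0:
        raise ValueError("round-trip time must be non-negative")
    return wave_speed_m_s * round_trip_s / 2.0

# An ultrasonic echo returning after 10 ms corresponds to ~1.7 m.
sonar_range_m = echo_distance_m(10e-3, SPEED_OF_SOUND_M_S)
```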
  • The camera is a monocular or compound-eye camera installed on, for example, the rearview mirror or a door mirror of the host vehicle 2.
  • the camera captures a detection area in the external environment of the host vehicle 2 to detect an obstacle or traffic display in the detection area and outputs an image signal.
  • the periphery monitoring ECU 31 is mainly configured by a microcomputer having a processor and a memory, and is connected to the external sensor 30 and the in-vehicle network 6.
  • the periphery monitoring ECU 31 acquires sign information such as a speed limit sign and a lane sign, and lane marking information such as a white line and a yellow line based on an output signal of the external sensor 30.
  • The periphery monitoring ECU 31 acquires obstacle information, such as the type of obstacle, the moving direction and moving speed of the front obstacle 8b (see FIGS. 1 and 4), and the relative speed and relative distance of the front obstacle 8b with respect to the host vehicle 2, based on the output signal of the external sensor 30.
  • the vehicle control system 4 includes a vehicle state sensor 40, an occupant sensor 41, and a vehicle control ECU 42.
  • the vehicle state sensor 40 is connected to the in-vehicle network 6.
  • the vehicle state sensor 40 detects the traveling state of the host vehicle 2.
  • The vehicle state sensor 40 is, for example, one or more of a vehicle speed sensor, a rotation speed sensor, a steering angle sensor, a fuel sensor, a water temperature sensor, a radio wave receiver, and the like.
  • the vehicle speed sensor detects the vehicle speed of the host vehicle 2 and outputs a vehicle speed signal corresponding to the detection.
  • the rotation speed sensor detects the engine rotation speed in the host vehicle 2 and outputs a rotation speed signal corresponding to the detection.
  • the steering angle sensor detects the steering angle of the host vehicle 2 and outputs a steering angle signal corresponding to the detection.
  • the fuel sensor detects the remaining amount of fuel in the fuel tank of the host vehicle 2 and outputs a fuel signal corresponding to the detection.
  • the water temperature sensor detects the cooling water temperature of the internal combustion engine in the host vehicle 2 and outputs a water temperature signal corresponding to the detection.
  • The radio wave receiver outputs a traffic signal by receiving radio waves output from, for example, a positioning satellite, a transmitter of another vehicle for vehicle-to-vehicle communication, or a roadside device for road-to-vehicle communication.
  • the traffic signal is a signal representing traffic information related to the host vehicle 2 such as a travel position, a travel direction, a travel path state, a speed limit, and the like, and the obstacle information.
  • the passenger sensor 41 is connected to the in-vehicle network 6.
  • the occupant sensor 41 detects the state or operation of a user who has boarded in the passenger compartment 2c of the host vehicle 2 shown in FIG.
  • the occupant sensor 41 is, for example, one type or a plurality of types among a power switch, a user status monitor, a display setting switch, a turn switch, a cruise control switch, a lane control switch, and the like.
  • the power switch is turned on by the user in the passenger compartment 2c to start the internal combustion engine or the motor generator of the host vehicle 2, and outputs a power signal corresponding to the operation.
  • the user state monitor captures the user state on the driver's seat 20 in the passenger compartment 2c with an image sensor, thereby detecting the user state and outputting an image signal.
  • the display setting switch is operated by the user to set the display state in the passenger compartment 2c, and outputs a display setting signal corresponding to the operation.
  • the turn switch is turned on by the user in the passenger compartment 2c to operate the direction indicator of the host vehicle 2, and outputs a turn signal corresponding to the operation.
  • The cruise control switch is turned on by the user in the passenger compartment 2c in order to automatically control the inter-vehicle distance of the host vehicle 2 with respect to the vehicle ahead as the front obstacle 8b, or the vehicle speed of the host vehicle 2, and outputs a cruise control signal corresponding to the operation.
  • the lane control switch outputs a lane control signal corresponding to the operation by being turned on by the user in the passenger compartment 2c in order to automatically control the position in the width direction of the traveling lane of the host vehicle 2.
  • the vehicle control ECU 42 shown in FIG. 2 is mainly composed of a microcomputer having a processor and a memory, and is connected to the in-vehicle network 6.
  • The vehicle control ECU 42 is one or more of an engine control ECU, a motor control ECU, a brake control ECU, a steering control ECU, an integrated control ECU, and the like, including at least the integrated control ECU.
  • The engine control ECU controls the operation of the throttle actuator and the fuel injection valve of the internal combustion engine according to the operation of the accelerator pedal 26 in the passenger compartment 2c shown in FIG., or automatically, thereby accelerating or decelerating the vehicle speed of the host vehicle 2.
  • the motor control ECU accelerates or decelerates the vehicle speed of the host vehicle 2 by controlling the operation of the motor generator according to the operation of the accelerator pedal 26 in the passenger compartment 2c or automatically.
  • the brake control ECU accelerates or decelerates the vehicle speed of the host vehicle 2 by controlling the operation of the brake actuator according to the operation of the brake pedal 27 in the passenger compartment 2c or automatically.
  • The steering control ECU adjusts the steering angle of the host vehicle 2 by automatically controlling the operation of the electric power steering in accordance with the operation of the steering wheel 24 in the passenger compartment 2c.
  • the integrated control ECU synchronously controls the operation of the other control ECU based on, for example, control information in the other control ECU of the vehicle control ECU 42, output signals of the sensors 40 and 41, acquired information in the periphery monitoring ECU 31, and the like.
  • When the cruise control switch is turned on, the integrated control ECU realizes full speed range adaptive cruise control (FSRA), which automatically controls the inter-vehicle distance and the vehicle speed over the entire vehicle speed range of the host vehicle 2.
  • The integrated control ECU, mounted on the host vehicle 2 as an “inter-vehicle control unit” that realizes this FSRA, controls the operation of the brake control ECU based on the information acquired by the periphery monitoring ECU 31 and the output signal of the radio wave receiver.
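The disclosure does not state the FSRA control law. As a hedged illustration of how an inter-vehicle control unit might turn the acquired gap and relative speed into an acceleration command, a minimal proportional scheme could look like the following; all gains, the time-gap policy, and the names are assumptions, not the patent's method.

```python
def fsra_accel_cmd(gap_m, rel_speed_m_s, ego_speed_m_s,
                   time_gap_s=1.8, min_gap_m=5.0,
                   k_gap=0.2, k_rel=0.6, a_max=2.0):
    """Proportional full-speed-range following controller (illustrative).

    gap_m          measured distance to the vehicle ahead
    rel_speed_m_s  lead-vehicle speed minus ego speed (closing < 0)
    Returns a commanded acceleration in m/s^2, clamped to +/- a_max.
    """
    # Desired gap grows with ego speed (constant time-gap policy).
    desired_gap = min_gap_m + time_gap_s * ego_speed_m_s
    accel = k_gap * (gap_m - desired_gap) + k_rel * rel_speed_m_s
    return max(-a_max, min(a_max, accel))

# Too close and closing fast: the command saturates at full braking
# authority, -2.0 m/s^2.
cmd = fsra_accel_cmd(gap_m=20.0, rel_speed_m_s=-3.0, ego_speed_m_s=25.0)
```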
  • When the lane control switch is turned on, the integrated control ECU realizes lane keeping assist (LKA), which automatically controls the position of the host vehicle 2 in the width direction of the traveling lane by regulating deviation of the vehicle 2 from the white line or the yellow line.
  • The integrated control ECU, mounted on the host vehicle 2 also as the “lane control unit” that realizes this LKA, controls the operation of the steering control ECU based on the information acquired by the periphery monitoring ECU 31 and the output signal of the radio wave receiver.
  • the display system 5 as a “vehicle display unit” is mounted on the host vehicle 2 for visually presenting information.
  • The display system 5 includes a HUD 50, an MFD (Multi Function Display) 51, a combination meter 52, and an HCU (HMI (Human Machine Interface) Control Unit) 54.
  • the HUD 50 is installed on the instrument panel 22 in the passenger compartment 2c shown in FIGS.
  • The HUD 50 projects a display image 56, which is formed on a display 50i such as a liquid crystal panel or a projector so as to show predetermined information, onto the front windshield 21 as a “projection member” of the host vehicle 2 through an optical system 50o.
  • the front windshield 21 is formed of translucent glass, and thus transmits the outside scene 8 existing in front of the host vehicle 2 out of the passenger compartment 2c.
  • The luminous flux of the display image 56 reflected by the front windshield 21 and the luminous flux from the external scene 8 transmitted through the windshield 21 are perceived by the user on the driver's seat 20.
  • As a result, the virtual image of the display image 56, formed in front of the front windshield 21, is displayed superimposed on a part of the external scene 8, so that both the virtual image of the display image 56 and the external scene 8 can be visually recognized by the user on the driver's seat 20.
  • the emphasized image 560 as the display image 56 is displayed as a virtual image.
  • The emphasized image 560 forms, at the virtual image display position α, a linear portion 560p of constant width that curves and extends in an arc shape as a whole.
  • The virtual image display size β of the linear portion 560p is variably set so that the portion continuously surrounds the front obstacle 8b at the virtual image display position α, which covers the entire range of less than one round around the front obstacle 8b except its lower part.
  • The virtual image display size β of the linear portion 560p is also variably set so as to leave, on its inner peripheral side, a margin 560m that allows the user to directly view the external scene 8 other than the front obstacle 8b between the linear portion and the obstacle. Further, the virtual image display color of the linear portion 560p is fixedly set, or variably set by the user, to a predetermined high-intensity color tone that emphasizes the front obstacle 8b and enables alerting, chosen from among translucent colors that allow the user to see the overlapped portion of the external scene 8 while suppressing annoyance. For example, the virtual image display color of the linear portion 560p is set to bright yellow, bright red, bright green, or light amber.
  • As the virtual image display by the HUD 50, in addition to the emphasized image 560, display of images indicating one or more types of information such as navigation information, sign information, and obstacle information may be adopted. Further, the virtual image display can also be realized by projecting the display image 56 onto a translucent combiner that is arranged on the instrument panel 22 and transmits the outside scene 8 in cooperation with the front windshield 21. Furthermore, the navigation information can be acquired, for example in the HCU 54 described in detail later, based on the map information stored in the memory 54m and the output signal of the sensor 40.
  • the MFD 51 is installed in the center console 23 in the passenger compartment 2c shown in FIG.
  • the MFD 51 displays a real image of an image formed so as to show predetermined information on one or a plurality of liquid crystal panels so that it can be visually recognized by a user on the driver's seat 20.
  • As the real image display by the MFD 51, display of images indicating one or more types of information among navigation information, audio information, video information, communication information, and the like is employed.
  • the combination meter 52 is installed on the instrument panel 22 in the passenger compartment 2c.
  • the combination meter 52 displays vehicle information related to the host vehicle 2 so that the user on the driver's seat 20 can visually recognize the vehicle information.
  • the combination meter 52 is a digital meter that displays vehicle information by an image formed on a liquid crystal panel, or an analog meter that displays vehicle information by indicating a scale with a pointer.
  • As the display by the combination meter 52, for example, display indicating one or more types of information among the vehicle speed, the engine speed, the remaining fuel amount, the coolant temperature, and the operation states of the turn switch, the cruise control switch, and the lane control switch is adopted.
  • the HCU 54 shown in FIG. 2 is mainly composed of a microcomputer having a processor 54p and a memory 54m, and is connected to the display elements 50, 51, 52 of the display system 5 and the in-vehicle network 6.
  • the HCU 54 synchronously controls the operation of the display elements 50, 51, 52.
  • The HCU 54 controls the operation of the display elements 50, 51, 52 based on, for example, the output signals of the sensors 40 and 41, the information acquired by the periphery monitoring ECU 31, the control information of the vehicle control ECU 42, the information stored in the memory 54m, and the information acquired by the HCU 54 itself.
  • the memory 54m of the HCU 54 and the memories of other various ECUs are respectively configured by using one or a plurality of storage media such as a semiconductor memory, a magnetic medium, or an optical medium.
  • the data of the display image 56 including the emphasized image 560 is stored in the memory 54m as the “image storage device”, so that the HCU 54 functions as the “vehicle display control device”. Specifically, the HCU 54 executes a display control program by the processor 54p, thereby realizing a display control flow for reading and displaying the emphasized image 560 from the memory 54m as shown in FIG.
  • The “image storage device” for storing the display image 56 may, of course, also be realized by using the memory of the ECU built into each of the display elements 50, 51, 52, or by combining those memories with the memory 54m of the HCU 54.
  • the display control flow is started in response to an ON operation of the power switch in the occupant sensor 41 and is ended in response to an OFF operation of the switch.
  • “S” in the display control flow means each step.
  • In S101 of the display control flow, it is determined whether one front obstacle 8b to be emphasized and alerted by the emphasized image 560 has been detected. Specifically, the determination in S101 is made based on, for example, one or more types of information among the obstacle information acquired by the periphery monitoring ECU 31 and the obstacle information represented by the output signal of the radio wave receiver serving as the occupant sensor 41. While a negative determination is made in S101, S101 is repeatedly executed. On the other hand, if an affirmative determination is made in S101, the process proceeds to S102.
  • In S102, the necessary information I for displaying the emphasized image 560 as a virtual image is acquired.
  • the necessary information I is, for example, one type or a plurality of types of information obtained from the periphery monitoring ECU 31 and information based on the output signals of the sensors 40 and 41.
  • As the information acquired by the periphery monitoring ECU 31, obstacle information is exemplified.
  • As the information based on the output signal of the vehicle state sensor 40, the vehicle speed represented by the output signal of the vehicle speed sensor and the steering angle represented by the output signal of the steering angle sensor are exemplified.
  • The information based on the output signals of the occupant sensor 41 includes the display state setting value represented by the output signal of the display setting switch, the user state such as the eye state represented by the output signal of the user state monitor, and the traffic information and obstacle information represented by the output signal of the radio wave receiver.
  • In S103, the virtual image display position α and the virtual image display size β of the emphasized image 560 are set based on the necessary information I acquired in S102. Specifically, first, the gaze point or gaze line when the user gazes at the front obstacle 8b detected in S101 is estimated based on the necessary information I. Next, the virtual image display position α is set so as to cover the entire range of less than one round, except the lower part, around the front obstacle 8b at the estimated gaze point or on the gaze line, and the virtual image display size β is set so as to form the linear portion 560p leaving the margin 560m with respect to the obstacle 8b.
  • In S104, display data for displaying the emphasized image 560 as a virtual image with the virtual image display position α and the virtual image display size β set in S103 is generated.
  • the display data is generated by performing image processing on the data of the emphasized image 560 read from the memory 54m.
  • In S105, the display data generated in S104 is given to the HUD 50, and the emphasized image 560 is formed by the display 50i, thereby controlling the virtual image display position α and the virtual image display size β of the linear portion 560p.
  • As a result, the emphasized image 560 is visually recognized at a virtual image display size β that surrounds the front obstacle 8b, with the margin 560m, at the virtual image display position α covering the entire range of less than one round around the front obstacle 8b except its lower part.
  • the process returns to S101. As a result, if a negative determination is made in S101 immediately after returning, the virtual image display of the emphasized image 560 is terminated.
  • the portion of the HCU 54 that executes S101, S102, S103, S104, and S105 corresponds to a “virtual image display control device” constructed by the processor 54p.
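The S101 to S105 display control flow described above can be outlined in code. The following Python sketch is purely illustrative and not part of the disclosure; the function names, the margin ratio, and the display-data structure are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Highlight:
    """Placement of the arc-shaped emphasized image around an obstacle."""
    position: tuple  # virtual image display position (arc center, screen coords)
    size: float      # virtual image display size (arc radius)

MARGIN_RATIO = 0.2   # illustrative stand-in for the margin (reference numeral 560m)

def plan_highlight(obstacle_center, obstacle_radius):
    """S103: place the linear portion around the gazed obstacle, leaving a
    margin, over the entire range of less than one round except the lower part."""
    return Highlight(position=obstacle_center,
                     size=obstacle_radius * (1.0 + MARGIN_RATIO))

def display_control_step(detected, obstacle_center=None, obstacle_radius=None):
    """One pass of S101-S105: returns display data for the HUD, or None when
    no obstacle is detected and the virtual image display ends."""
    if not detected:                                       # S101: negative determination
        return None
    h = plan_highlight(obstacle_center, obstacle_radius)   # S102/S103
    return {"position": h.position, "size": h.size}        # S104: display data
```

Calling `display_control_step` once per frame mirrors the loop back to S101: a negative detection ends the virtual image display by returning no display data.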
  • according to the first embodiment described above, the emphasized image 560 that emphasizes the front obstacle 8b in the external scene 8 is controlled to the virtual image display size ⁇ in which the linear portion 560p, leaving a margin 560m, surrounds the front obstacle 8b at the virtual image display position ⁇ over the entire range of less than one round excluding the lower part of the periphery of the front obstacle 8b.
  • therefore, as shown in FIG. 6, according to the emphasized image 560 superimposed on the space 8s of the external scene 8 above, to the left of, and to the right of the front obstacle 8b, even if the user perceives a positional shift of the emphasized image 560 relative to the front obstacle 8b, the image still appears to point to the front obstacle 8b, so the user is unlikely to feel a separation in the front-rear direction from the front obstacle 8b.
  • as a result, the front obstacle 8b can be appropriately emphasized by the virtual image display of the emphasized image 560.
  • furthermore, the user can imagine an arc-shaped virtual linear portion 560v that complements the linear portion 560p below the front obstacle 8b (see FIG. 6). At this time, the user images the virtual linear portion 560v so as to overlap the ground 8g below the front obstacle 8b.
  • therefore, for the emphasized image 560, whose association with the ground 8g is weakened because no virtual image is actually displayed below the front obstacle 8b, the virtual linear portion 560v added in the user's mental image strengthens the association with the front obstacle 8b. As a result, the effect of appropriately emphasizing the front obstacle 8b can be increased.
  • the second embodiment of the present disclosure is a modification of the first embodiment. As shown in FIG. 7, in the display control flow of the second embodiment, it is determined in S2100 whether or not the cruise control switch of the occupant sensor 41 is turned on. As a result, while a negative determination is made, S2100 is repeatedly executed. When an affirmative determination is made, the process proceeds to S2101.
  • in S2101, it is determined whether or not the integrated control ECU of the vehicle control ECU 42 detects, as the front obstacle 8b, the immediately preceding vehicle traveling in the same direction in the same lane as the host vehicle 2 under the automatic control of the inter-vehicle distance by the FSRA.
  • the determination in S2101 is based on one or more types among, for example, the control information of the integrated control ECU, the obstacle information represented by the output signal of the radio receiver, and the mark information, lane marking information, obstacle information, and the like acquired by the periphery monitoring ECU 31. While a negative determination is made in S2101, the process returns to S2100. When an affirmative determination is made in S2101, the process returns to S2100 after execution of S102, S103, S104, and S105. If a negative determination is made in S2100 or S2101 immediately after returning from S105, the virtual image display of the emphasized image 560 ends.
  • according to the second embodiment described above, the inter-vehicle distance of the host vehicle 2 from the preceding vehicle as the front obstacle 8b is automatically controlled. Therefore, by controlling the position ⁇ and the size ⁇ of the emphasized image 560 as shown in FIG. 8, the preceding vehicle in the same lane, which requires the user's attention under the automatic control of the inter-vehicle distance, can be appropriately emphasized. As a result, it is possible to ensure the safety and security of the user.
  • the portion of the HCU 54 that executes S2100, S2101, S102, S103, S104, and S105 corresponds to a “virtual image display control device” constructed by the processor 54p.
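The gating of the virtual image display in S2100 and S2101 reduces to a simple conjunction, sketched below in illustrative Python (the function and parameter names are assumptions, not identifiers from the embodiment):

```python
def should_display_highlight(cruise_switch_on, preceding_vehicle_detected):
    """S2100/S2101 gating sketch: the emphasized image is displayed as a
    virtual image only while the cruise control switch is on AND a preceding
    vehicle in the same lane is detected under the automatic inter-vehicle
    distance control; a negative determination at either step ends it."""
    return bool(cruise_switch_on and preceding_vehicle_detected)
```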
  • the third embodiment of the present disclosure is a modification of the first embodiment. As shown in FIG. 9, in the display control flow of the third embodiment, it is determined in S3100 whether or not the lane control switch of the occupant sensor 41 is turned on. As a result, while a negative determination is made, S3100 is repeatedly executed. When an affirmative determination is made, the process proceeds to S3101.
  • in S3101, it is determined whether or not the integrated control ECU of the vehicle control ECU 42 detects, as the front obstacle 8b, the immediately preceding vehicle traveling in the same direction in the same or a different lane from the traveling lane of the host vehicle 2 under the automatic control by the LKA. Specifically, the determination in S3101 is based on one or more types among the control information of the integrated control ECU, the obstacle information represented by the output signal of the radio receiver, and the mark information, lane marking information, obstacle information, and the like acquired by the periphery monitoring ECU 31. While a negative determination is made in S3101, the process returns to S3100.
  • according to the third embodiment described above, the position of the host vehicle 2 in the width direction within its traveling lane is automatically controlled. Therefore, by controlling the position ⁇ and the size ⁇ of the emphasized image 560 as shown in FIG. 10, the preceding vehicle in the same or a different lane, which requires the user's attention under the automatic control of the position in the width direction, can be appropriately emphasized, and it is possible to ensure the safety and security of the user.
  • the part of the HCU 54 that executes S3100, S3101, S102, S103, S104, and S105 corresponds to a “virtual image display control device” constructed by the processor 54p.
  • the fourth embodiment of the present disclosure is a modification of the first embodiment.
  • the emphasized image 4560 includes a first linear portion 4560p1 that extends in a circular arc shape at the first virtual image display position ⁇1, and a second linear portion 4560p2 that extends in a circular arc shape at the second virtual image display position ⁇2 and is formed continuously with the first linear portion 4560p1 at the same width. That is, the emphasized image 4560 has an annular line shape as a whole.
  • the first virtual image display size ⁇1, which is the size of the first linear portion 4560p1, is variably set so that the first linear portion 4560p1 continuously surrounds the front obstacle 8b at the first virtual image display position ⁇1, which is the entire range of less than one round excluding the lower part of the periphery of the front obstacle 8b.
  • in addition, the virtual image display size ⁇1 of the first linear portion 4560p1 is variably set so as to leave, between it and the front obstacle 8b on the inner peripheral side, a margin 4560m1 that allows the user to directly view the external scenery 8 other than the front obstacle 8b.
  • the virtual image display color of the first linear portion 4560p1 is fixedly set, or variably set by the user, to a predetermined high-luminance color tone that, among the translucent colors that allow the portion overlapping the external scenery 8 to be visually recognized and can suppress annoyance to the user, emphasizes the front obstacle 8b and enables alerting.
  • for example, the virtual image display color of the first linear portion 4560p1 is set to bright yellow, bright red, bright green, light amber color, or the like.
  • the second virtual image display size ⁇2, which is the size of the second linear portion 4560p2, is variably set so that the second linear portion 4560p2 continuously surrounds the front obstacle 8b at the second virtual image display position ⁇2 between both ends of the first linear portion 4560p1 in the lower part of the periphery of the front obstacle 8b.
  • in addition, the virtual image display size ⁇2 of the second linear portion 4560p2 is variably set so as to leave, between it and the front obstacle 8b on the inner peripheral side, a margin 4560m2 that allows the user to directly view the external scenery 8 other than the front obstacle 8b.
  • the virtual image display color of the second linear portion 4560p2 is fixedly set, or variably set by the user, to a predetermined color tone that is lower in luminance than the first linear portion 4560p1 among the translucent colors that allow the portion overlapping the external scenery 8 to be visually recognized and suppress annoyance.
  • for example, the virtual image display color of the second linear portion 4560p2 is set to dark yellow, dark red, dark green, dark amber color, or the like.
  • the luminance difference between the linear portions 4560p1 and 4560p2 is adjusted, for example, by setting the gradation value of each of the linear portions 4560p1 and 4560p2 so that the luminance value of the luminance signal is lower in the second linear portion 4560p2 than in the first linear portion 4560p1.
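The two-tone annular emphasized image described above can be sketched as two arc segments whose gradation values differ. The Python below is illustrative only; the angular split, the concrete gradation values, and the dictionary layout are assumptions, not values from the embodiment.

```python
def make_annular_highlight(bottom_gap_deg=100.0, high_gradation=230, low_gradation=90):
    """Build the annular emphasized image as two arcs: the first linear
    portion covers less than one round (the lower part excluded) at a high
    gradation value, and the second linear portion closes the ring between
    both ends of the first at a lower gradation value. Angles are degrees,
    measured counterclockwise, with 270 deg pointing straight down."""
    assert low_gradation < high_gradation  # the second portion must be dimmer
    first = {"start_deg": 270.0 + bottom_gap_deg / 2.0,
             "sweep_deg": 360.0 - bottom_gap_deg,
             "gradation": high_gradation}
    second = {"start_deg": 270.0 - bottom_gap_deg / 2.0,
              "sweep_deg": bottom_gap_deg,
              "gradation": low_gradation}
    return first, second
```

Lowering only the gradation value of the second portion reproduces the luminance difference between the portions 4560p1 and 4560p2 without changing the ring geometry.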
  • in S4103, the virtual image display positions ⁇1 and ⁇2 and the virtual image display sizes ⁇1 and ⁇2 are set based on the necessary information I acquired in S102. Specifically, first, based on the necessary information I, the gaze point or gaze line when the user gazes at the front obstacle 8b detected in S101 is estimated. Next, the first virtual image display position ⁇1 is set over the entire range excluding the lower part of the periphery of the front obstacle 8b on the estimated gaze point or gaze line, and the first virtual image display size ⁇1 is set so as to form the first linear portion 4560p1 leaving a margin 4560m1 with respect to the obstacle 8b.
  • further, the second virtual image display position ⁇2 is set between both ends of the first linear portion 4560p1 below the front obstacle 8b on the estimated gaze point or gaze line, and the second virtual image display size ⁇2 is set so as to form the second linear portion 4560p2 leaving a margin 4560m2 with respect to the obstacle 8b.
  • in S4104, display data for displaying the linear portions 4560p1 and 4560p2 as virtual images at the virtual image display positions ⁇1 and ⁇2 and with the virtual image display sizes ⁇1 and ⁇2 set in S4103 is generated.
  • at this time, the display data is generated by performing image processing on the data of the emphasized image 4560 read from the memory 54m.
  • in S4105, the display data generated in S4104 is given to the HUD 50, and the emphasized image 4560 is formed by the display 50i, whereby the virtual image display positions ⁇1 and ⁇2 and the virtual image display sizes ⁇1 and ⁇2 of the linear portions 4560p1 and 4560p2 are controlled.
  • as a result, the emphasized image 4560 is visually recognized with the first virtual image display size ⁇1 in which the first linear portion 4560p1, leaving a margin 4560m1, surrounds the front obstacle 8b at the first virtual image display position ⁇1 over the entire range of less than one round excluding the lower part of the periphery of the front obstacle 8b, and with the second virtual image display size ⁇2 in which the second linear portion 4560p2, leaving a margin 4560m2, surrounds the front obstacle 8b at the second virtual image display position ⁇2 between both ends of the first linear portion 4560p1.
  • in the display control flow after execution of S4105, the process returns to S101. As a result, if a negative determination is made in S101 immediately after returning, the virtual image display of the emphasized image 4560 ends.
  • the portion of the HCU 54 that executes S101, S102, S4103, S4104, and S4105 corresponds to a “virtual image display control device” constructed by the processor 54p.
  • according to the fourth embodiment described above, the emphasized image 4560 that emphasizes the front obstacle 8b is controlled to the first virtual image display size ⁇1 in which the first linear portion 4560p1, leaving a margin 4560m1, surrounds the front obstacle 8b at the first virtual image display position ⁇1 over the entire range of less than one round excluding the lower part of the periphery of the front obstacle 8b.
  • therefore, as shown in FIG. 13, according to the first linear portion 4560p1 superimposed on the space 4008s of the external scenery 8 above, to the left of, and to the right of the front obstacle 8b, even if the user perceives a positional shift, the portion still appears to point to the front obstacle 8b, so it is difficult to feel a separation in the front-rear direction from the front obstacle 8b.
  • further, the emphasized image 4560 is controlled to the second virtual image display size ⁇2 in which the second linear portion 4560p2, leaving a margin 4560m2, surrounds the front obstacle 8b at the second virtual image display position ⁇2 between both ends of the first linear portion 4560p1 in the periphery of the front obstacle 8b.
  • here, since the second linear portion 4560p2, which is lower in luminance than the first linear portion 4560p1, is superimposed on the ground 4008g existing below the front obstacle 8b, the user's gaze points are more likely to gather on the first linear portion 4560p1 side than on the second linear portion 4560p2 side. Therefore, according to the second linear portion 4560p2 on the low luminance side, since the association with the ground 4008g is weakened, the user is less likely to feel a separation in the front-rear direction from the front obstacle 8b.
  • as a result, the front obstacle 8b can be appropriately emphasized by the virtual image display of the emphasized image 4560.
  • moreover, according to the second linear portion 4560p2 that curves and extends between both ends of the first linear portion 4560p1 at the second virtual image display position ⁇2, the user's gazing point is unlikely to gather there because of its low luminance.
  • even if the virtual image display positions ⁇1 and ⁇2 of the linear portions 4560p1 and 4560p2 are shifted within the range of the control error, the association of the second linear portion 4560p2 with the ground 4008g is weakened even below the front obstacle 8b, so the gazing point can be kept away from the portion 4560p2. According to this, since the association maintaining action and the illusion avoiding action can be reliably exhibited, the front obstacle 8b can be appropriately emphasized by the emphasized image 4560.
  • the fifth embodiment of the present disclosure is a modification of the first embodiment. As shown in FIG. 14, in the display control flow of the fifth embodiment, S5101a and S5101b are executed instead of S101.
  • in S5101a, it is determined whether or not at least one front obstacle 8b to be emphasized by the emphasized image 560 is detected. The determination at this time is made in the same manner as in S101. While a negative determination is made in S5101a, S5101a is repeatedly executed. On the other hand, if a positive determination is made in S5101a, the process proceeds to S5101b.
  • in S5101b, it is determined whether or not there are a plurality of front obstacles 8b detected in S5101a. As a result, if a negative determination is made, S102, S103, S104, and S105 are executed as processing for the single front obstacle 8b. On the other hand, if a positive determination is made, S5102, S5103, S5104, and S5105 are executed as individual processing for each front obstacle 8b.
  • in S5102, the necessary information I for displaying the emphasized image 560 as a virtual image is individually acquired for each front obstacle 8b detected in S5101a. At this time, the necessary information I for each front obstacle 8b is acquired in the same manner as in S102.
  • in S5103, the virtual image display position ⁇ and the virtual image display size ⁇ of the emphasized image 560 are individually set for each front obstacle 8b detected in S5101a.
  • at this time, as shown in FIG. 15, the virtual image display size ⁇ is set smaller as the front obstacle 8b emphasized by the emphasized image 560 is farther from the host vehicle 2.
  • for the other points, the virtual image display position ⁇ and the virtual image display size ⁇ are set in the same manner as in S103.
  • in S5104, display data for displaying the emphasized image 560 as a virtual image at the virtual image display position ⁇ and with the virtual image display size ⁇ set in S5103 is individually generated for each front obstacle 8b detected in S5101a. At this time, the display data for each front obstacle 8b is generated in the same manner as in S104.
  • in S5105, the display data generated in S5104 is given to the HUD 50, and the emphasized image 560 is formed by the display 50i. As a result, the virtual image display position ⁇ and the virtual image display size ⁇ of the linear portion 560p of each emphasized image 560 are individually controlled for each front obstacle 8b detected in S5101a.
  • each emphasized image 560 for each front obstacle 8b is visually recognized, in addition to the virtual image display position ⁇ similar to S105, with a virtual image display size ⁇ that becomes smaller as the emphasized front obstacle 8b is farther from the host vehicle 2.
  • in the display control flow after execution of S5105, the process returns to S5101a. As a result, if a negative determination is made in S5101a immediately after returning, the virtual image display of all the emphasized images 560 ends.
  • if some of the front obstacles 8b are no longer detected, the virtual image display of the emphasized image 560 for each front obstacle 8b that is no longer detected ends, while the virtual image display of the emphasized image 560 for each front obstacle 8b that remains detected is continued. Even after the execution of S105, the process returns to S5101a.
  • according to the fifth embodiment described above, the emphasized image 560 that individually emphasizes each of the plurality of front obstacles 8b is controlled to be smaller as the emphasized front obstacle 8b is farther from the host vehicle 2.
  • according to this, for the front obstacle 8b near the host vehicle 2, which requires particular attention, the degree of emphasis is enhanced by the large-size emphasized image 560, while for the front obstacle 8b far from the host vehicle 2, the emphasizing function can be ensured by the small-size emphasized image 560. Therefore, the plurality of obstacles 8b can be appropriately emphasized with sharpness by the individual emphasized images 560.
  • in the fifth embodiment, the portion of the HCU 54 that executes S5101a, S5101b, S102, S103, S104, S105, S5102, S5103, S5104, and S5105 corresponds to a “virtual image display control device” constructed by the processor 54p.
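The distance-dependent sizing described for S5103 can be sketched as a monotone falloff with a floor. The Python below is illustrative; the linear falloff and all parameter values are assumptions, not values from the embodiment.

```python
def highlight_size(distance_m, near_size=1.0, min_size=0.3, falloff_m=50.0):
    """S5103 sketch: the virtual image display size is set smaller as the
    emphasized front obstacle is farther from the host vehicle, down to a
    floor that preserves the emphasizing function for distant obstacles."""
    scale = max(0.0, 1.0 - distance_m / falloff_m)  # shrink with distance
    return max(min_size, near_size * scale)         # never below the floor
```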
  • the sixth embodiment of the present disclosure is a modification of the fifth embodiment. As shown in FIG. 16, in the display control flow of the sixth embodiment, S5203a, S5203b, S5204, and S5205 are executed after the execution of S5102.
  • in S5203a, the virtual image display position ⁇ and the virtual image display size ⁇ of the emphasized image 560 are individually set for each front obstacle 8b detected in S5101a. At this time, the virtual image display position ⁇ and the virtual image display size ⁇ are set in the same manner as in S5103.
  • in S5203b, the virtual image display shape ⁇ of the emphasized image 560 is individually set for each front obstacle 8b detected in S5101a.
  • at this time, as shown in FIG. 17, the virtual image display shape ⁇ of the linear portion 560p in each emphasized image 560 is made different depending on the type of the front obstacle 8b to be emphasized.
  • specifically, for the front obstacle 8b that is another vehicle, the virtual image display shape ⁇ of the linear portion 560p is set to a partial true circle as the arc shape.
  • on the other hand, for the front obstacle 8b that is a person, the virtual image display shape ⁇ of the linear portion 560p is set to a partial ellipse as the arc shape.
  • in S5204, display data for displaying the emphasized image 560 as a virtual image with the virtual image display shape ⁇ set in S5203b is generated. At this time, display data is individually generated for each front obstacle 8b detected in S5101a by performing image processing on the data of the emphasized image 560 read from the memory 54m, as in S5104.
  • as a result, each emphasized image 560 for each front obstacle 8b is visually recognized, in addition to the position ⁇ and the size ⁇ similar to S5105, in a virtual image display shape ⁇ that differs depending on the type of the front obstacle 8b to be emphasized. In the display control flow after execution of S5205, the process returns to S5101a.
  • according to the sixth embodiment described above, the virtual image display shape ⁇ of the emphasized image 560 that individually emphasizes each of the plurality of front obstacles 8b differs depending on the type of the front obstacle 8b to be emphasized.
  • according to this, the user can determine the type of each front obstacle 8b from the virtual image display shape ⁇ of each emphasized image 560. Therefore, it becomes possible to strengthen the association of the individual emphasized images 560 with the plurality of obstacles 8b and to appropriately emphasize the obstacles 8b.
  • in the sixth embodiment, the portion of the HCU 54 that executes S5101a, S5101b, S102, S103, S104, S105, S5102, S5203a, S5203b, S5204, and S5205 corresponds to a “virtual image display control device” constructed by the processor 54p.
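The per-type shape selection of S5203b can be sketched as a lookup from obstacle type to arc geometry. The Python below is illustrative; the type strings and aspect ratios are assumptions, not values from the embodiment.

```python
def highlight_shape(obstacle_type):
    """S5203b sketch: a partial true circle for another vehicle and a partial
    ellipse for a person, so the user can tell the obstacle type from the
    virtual image display shape alone."""
    if obstacle_type == "vehicle":
        return {"arc": "partial_true_circle", "aspect_ratio": 1.0}
    if obstacle_type == "person":
        return {"arc": "partial_ellipse", "aspect_ratio": 0.5}
    raise ValueError(f"unknown obstacle type: {obstacle_type!r}")
```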
  • the seventh embodiment of the present disclosure is a modification of the fifth embodiment. As shown in FIG. 18, in the display control flow of the seventh embodiment, S5303a, S5303b, S5303c, S5304, S5305, S5104, and S5105 are executed after S5102.
  • in S5303a, the virtual image display position ⁇ and the virtual image display size ⁇ of the emphasized image 560 are individually set for each front obstacle 8b detected in S5101a. At this time, the virtual image display position ⁇ and the virtual image display size ⁇ are set in the same manner as in S5103.
  • in S5303b, for the emphasized images 560 whose virtual image display positions ⁇ overlap as shown in FIG. 19, the virtual image display shape ⁇ is changed as the front obstacle 8b to be emphasized is farther from the host vehicle 2.
  • specifically, the virtual image display shape ⁇ is set so that the virtual image display of the linear portion 560p emphasizing the front obstacle 8b farther from the host vehicle 2 is cut at the place P where the virtual image display positions ⁇ overlap.
  • in addition to the virtual image display position ⁇ and the virtual image display size ⁇ set in S5303a, display data for displaying the emphasized image 560 as a virtual image with the virtual image display shape ⁇ set in S5303b is generated. At this time, display data is individually generated for each front obstacle 8b detected in S5101a by performing image processing on the data of the emphasized image 560 read from the memory 54m, as in S5104.
  • as a result, each emphasized image 560 for each front obstacle 8b is visually recognized, in addition to the position ⁇ and the size ⁇ similar to S5105, in the shape ⁇ in which the virtual image display of the emphasized image 560 for the front obstacle 8b farther from the host vehicle 2 is cut at the overlapping place P.
  • according to the seventh embodiment described above, among the emphasized images 560 whose virtual image display positions ⁇ overlap, the virtual image display of the emphasized image 560 that emphasizes the front obstacle 8b farther from the host vehicle 2 is cut at the overlapping place P.
  • according to this, for the front obstacle 8b that requires particular attention because it is close to the host vehicle 2, the degree of emphasis is enhanced by the emphasized image 560 without a cut, while for the front obstacle 8b far from the host vehicle 2, the emphasizing function can be ensured by the cut emphasized image 560.
  • in addition, the annoyance the user feels due to the superimposition of the virtual image display positions ⁇ can be suppressed.
  • in the seventh embodiment, the portion of the HCU 54 that executes S5101a, S5101b, S102, S103, S104, S105, S5102, S5303a, S5303b, S5303c, S5304, S5305, S5104, and S5105 corresponds to a “virtual image display control device” constructed by the processor 54p.
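The cut at the overlapping place P can be sketched in one dimension: where the near obstacle's highlight span covers part of the far one's, only the uncovered pieces of the far highlight are drawn. The Python below is illustrative; representing the highlights as (left, right) screen intervals is an assumption.

```python
def cut_overlap(near_span, far_span):
    """S5303b sketch: cut the far obstacle's highlight where the near
    obstacle's highlight overlaps it (place P); the near highlight is always
    drawn whole. Returns the sub-spans of far_span that remain visible."""
    (n0, n1), (f0, f1) = near_span, far_span
    if f1 <= n0 or f0 >= n1:      # no overlap: far highlight is kept whole
        return [far_span]
    pieces = []
    if f0 < n0:                   # part of the far span left of the overlap
        pieces.append((f0, n0))
    if f1 > n1:                   # part of the far span right of the overlap
        pieces.append((n1, f1))
    return pieces                 # empty if the far span is fully covered
```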
  • the eighth embodiment of the present disclosure is a modification of the sixth embodiment. As shown in FIG. 20, in the display control flow of the eighth embodiment, S5403a, S5403b, S5204, and S5205 are executed after the execution of S5102.
  • in S5403a, the virtual image display position ⁇ and the virtual image display size ⁇ of the emphasized image 560 are individually set for each front obstacle 8b detected in S5101a. At this time, the virtual image display position ⁇ and the virtual image display size ⁇ are set in the same manner as in S5103.
  • in S5403b, the virtual image display shape ⁇ of the emphasized image 560 is individually set for each front obstacle 8b detected in S5101a.
  • at this time, as shown in FIG. 21, the virtual image display shape ⁇ of each emphasized image 560 is set so as to limit the virtual image display range of the linear portion 560p to the range excluding both sides as well as the lower part of the periphery of the front obstacle 8b to be emphasized.
  • in other words, the virtual image display shape ⁇ of each emphasized image 560 is set to an arc shape in which the linear portion 560p curves and extends only substantially above the periphery of the front obstacle 8b to be emphasized.
  • as a result, each emphasized image 560 for each front obstacle 8b is visually recognized, in addition to the position ⁇ and the size ⁇ similar to S5105, in a shape ⁇ that restricts the virtual image display of the linear portion 560p to the range excluding the lower part and both sides of the periphery of the front obstacle 8b to be emphasized.
  • according to the eighth embodiment described above, the virtual image display of the emphasized image 560 that individually emphasizes each of the plurality of front obstacles 8b is limited to the range excluding not only the lower part of the periphery of the emphasized front obstacle 8b but also both sides. According to this, the virtual image display positions ⁇ of the emphasized images 560 corresponding to the respective front obstacles 8b are unlikely to be superimposed. Therefore, not only can the individual emphasized images 560 be associated with the plurality of obstacles 8b, but also the annoyance the user feels due to such superimposition can be suppressed. Therefore, the plurality of obstacles 8b can be appropriately emphasized by the individual emphasized images 560.
  • in the eighth embodiment, the portion of the HCU 54 that executes S5101a, S5101b, S102, S103, S104, S105, S5102, S5403a, S5403b, S5204, and S5205 corresponds to a “virtual image display control device” constructed by the processor 54p.
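Restricting the linear portion to "substantially above" the obstacle, as in S5403b, can be sketched as a short arc centered on the upward direction. The Python below is illustrative; the 120-degree sweep is an assumption, not a value from the embodiment.

```python
def upper_arc_only(sweep_deg=120.0):
    """S5403b sketch: an arc that curves and extends only substantially above
    the obstacle, excluding the lower part and both sides, so neighboring
    emphasized images are unlikely to overlap. 90 deg points straight up;
    the arc is centered on that direction."""
    return {"start_deg": 90.0 - sweep_deg / 2.0, "sweep_deg": sweep_deg}
```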
  • the ninth embodiment of the present disclosure is a modification of the second embodiment. As shown in FIG. 22, in the display control flow of the ninth embodiment, S6101, S6103, S6104, and S6105 are executed after the execution of S105.
  • in S6101, it is determined whether or not the front obstacle 8b being emphasized is no longer detected, that is, has been lost. While a negative determination is made in S6101, the process returns to S2100. On the other hand, if a positive determination is made in S6101, the process proceeds to S6103.
  • in S6103, the virtual image display luminance ⁇ of the emphasized image 560 is partially reduced.
  • at this time, the virtual image display luminance ⁇ is set so that normal luminance portions 9560pn and lower-luminance portions 9560pl are alternately formed for each predetermined length of the linear portion 560p.
  • in the ninth embodiment, the virtual image display luminance ⁇ is set so that the normal luminance portions 9560pn have the high luminance described in the first embodiment and the low luminance portions 9560pl have substantially zero luminance. Therefore, in FIG. 23, the emphasized image 560 appears as a broken line.
  • note that the luminance of the low luminance portions 9560pl may be set higher than zero luminance as long as it is lower than that of the normal luminance portions 9560pn.
  • in S6104, display data for displaying the emphasized image 560 as a virtual image with the virtual image display luminance ⁇ set in S6103 is generated.
  • at this time, the display data is generated by performing image processing on the data of the emphasized image 560 read from the memory 54m, as in S104.
  • in S6105, the display data generated in S6104 is given to the HUD 50, and the emphasized image 560 is formed by the display 50i, whereby the virtual image display position ⁇, the virtual image display size ⁇, and the virtual image display luminance ⁇ of the linear portion 560p are controlled.
  • as a result, the emphasized image 560 is visually recognized as a broken line formed by the linear portion 560p whose virtual image display luminance ⁇ is partially reduced, as shown in FIG. 23.
  • in the display control flow after execution of S6105, the process returns to S2100.
  • according to the ninth embodiment described above, when the host vehicle 2 loses detection of the front obstacle 8b, the virtual image display luminance ⁇ of a part of the emphasized image 560 that emphasizes the front obstacle 8b is decreased. Accordingly, even if the front obstacle 8b is visible to the user, the user can intuitively grasp from the luminance change of the emphasized image 560 that the host vehicle 2 has lost detection of the front obstacle 8b. Therefore, it is possible to ensure the safety and security of the user by means of the emphasized image 560.
  • in the ninth embodiment, the portion of the HCU 54 that executes S2100, S2101, S102, S103, S104, S105, S6101, S6103, S6104, and S6105 corresponds to a “virtual image display control device” constructed by the processor 54p.
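The alternating luminance pattern of S6103 can be sketched by slicing the linear portion into equal-length segments of alternating luminance. The Python below is illustrative; the segment length and luminance values are assumptions, not values from the embodiment.

```python
def dashed_luminance(arc_length, segment, normal=255, low=0):
    """S6103 sketch: alternate normal-luminance and lower-luminance portions
    for each predetermined length along the linear portion, so the emphasized
    image is seen as a broken line. Returns (start, end, luminance) slices."""
    pattern = []
    pos, bright = 0.0, True
    while pos < arc_length:
        end = min(pos + segment, arc_length)          # clip the last slice
        pattern.append((pos, end, normal if bright else low))
        pos, bright = end, not bright                 # alternate each slice
    return pattern
```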
  • the tenth embodiment of the present disclosure is a modification of the ninth embodiment. As shown in FIG. 24, in the display control flow of the tenth embodiment, when an affirmative determination is made in S6101, S6203, S6104, and S6105 are executed.
  • in S6203, the virtual image display luminance ⁇ of the emphasized image 560 is lowered over the entire image 560, as shown in FIG. 25.
  • at this time, the virtual image display luminance ⁇ is set so that the overall luminance of the linear portion 560p is lower than the high luminance described in the first embodiment and higher than zero luminance.
  • in FIG. 25, the dot hatching is coarser than that of FIG. 8 in the second embodiment, thereby schematically representing the decrease in the virtual image display luminance ⁇.
  • as a result, the emphasized image 560 is visually recognized at the same position ⁇ and size ⁇ as in S105, and also as a linear portion 560p whose virtual image display luminance ⁇ is lowered, as shown in FIG. 25.
  • according to the tenth embodiment described above, when the host vehicle 2 loses detection of the front obstacle 8b, the virtual image display luminance ⁇ of the entire emphasized image 560 that emphasizes the front obstacle 8b is lowered. Accordingly, even if the front obstacle 8b is visible to the user, the user can intuitively grasp from the luminance change of the emphasized image 560 that the host vehicle 2 has lost detection of the front obstacle 8b. Therefore, it is possible to ensure the safety and security of the user by means of the emphasized image 560.
  • in the tenth embodiment, the portion of the HCU 54 that executes S2100, S2101, S102, S103, S104, S105, S6101, S6203, S6104, and S6105 corresponds to a “virtual image display control device” constructed by the processor 54p.
  • the eleventh embodiment of the present disclosure is a modification of the second embodiment.
  • in the eleventh embodiment, the integrated control ECU performs adaptive cruise control (ACC: Adaptive Cruise Control), which automatically controls the inter-vehicle distance and the vehicle speed within a specific vehicle speed range such as a high-speed range, instead of the FSRA.
  • the integrated control ECU, as an “automatic control unit” that realizes the ACC, switches from the manual operation by the user to the automatic control operation when the cruise control switch is turned on and the vehicle speed of the host vehicle 2 enters the specific vehicle speed range.
  • on the other hand, the integrated control ECU switches from the automatic control operation to the manual operation when the cruise control switch is turned off during the automatic control operation, or when the vehicle speed goes out of the specific vehicle speed range during the automatic control operation.
  • in S7100, it is determined based on the output signal of the vehicle speed sensor of the vehicle state sensor 40 whether or not the vehicle speed of the host vehicle 2 is within the specific vehicle speed range. As a result, while a negative determination is made, the process returns to S2100. On the other hand, when an affirmative determination is made, S7101, S7102, S7103a, S7103b, S7104, and S7105 are executed after the execution of S2101, S102, S103, S104, and S105.
  • in S7102, the necessary information I for displaying the emphasized image 560 as a virtual image is acquired in the same manner as in S102.
  • in S7103a, the virtual image display position ⁇ and the virtual image display size ⁇ of the emphasized image 560 are set based on the necessary information I acquired in S7102. For the other points, the virtual image display position ⁇ and the virtual image display size ⁇ are set in the same manner as in S103.
  • in S7103b, the virtual image display color ⁇ of the emphasized image 560 is changed over the entire image 560, as shown in FIG. 27.
  • at this time, the virtual image display color ⁇ is set, for example, to blue or the like so that the color tone of the emphasized image 560 differs from the color tone described in the first embodiment.
  • in FIG. 27, cross hatching is shown instead of the dot hatching of FIG. 8 in the second embodiment, thereby schematically representing the change in the virtual image display color ⁇.
• in S7104, display data for displaying the emphasized image 560 as a virtual image with the virtual image display color set in S7103b is generated by performing image processing on the data of the emphasized image 560 read from the memory 54m, as in S104.
• in S7105, the display data generated in S7104 is provided to the HUD 50, and the emphasized image 560 is formed by the display 50i, whereby the virtual image display position, the virtual image display size, and the virtual image display color of the linear portion 560p are controlled. As a result, the emphasized image 560 is visually recognized with the virtual image display color changed as shown in FIGS. 8 to 27, in addition to a position and a size similar to S105. In the display control flow, after execution of S7105 the process returns to S2100.
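Taken together, S7100 through S7105 amount to one pass of a display control loop. The following is a highly simplified sketch of that pass; the function name, data shapes, margin factors, and the concrete color value are assumptions for illustration, not the patent's code:

```python
def display_control_pass(speed_kmh, obstacle_box, hud_frames,
                         speed_range=(80.0, 120.0)):
    """One sketched pass of S7100 -> S7105 (illustrative only)."""
    lo, hi = speed_range
    if not (lo <= speed_kmh <= hi):
        return None  # S7100 negative determination: return to S2100
    # S7102: acquire the "necessary information I" -- here reduced to the
    # front obstacle's bounding box (x, y, width, height).
    x, y, w, h = obstacle_box
    # S7103a: virtual image display position and size set around the
    # obstacle, leaving a gap (the margin factors are assumed values).
    position = (x - 0.1 * w, y - 0.1 * h)
    size = (1.2 * w, 1.2 * h)
    # S7103b: virtual image display color changed over the entire image,
    # e.g. a blue tone.
    color = (0, 0, 255)
    # S7104: generate the display data; S7105: provide it to the HUD.
    frame = {"position": position, "size": size, "color": color}
    hud_frames.append(frame)
    return frame
```

When the vehicle speed falls outside the assumed range, the pass produces no frame, mirroring the return to S2100 on a negative determination in S7100.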
• the virtual image display color of the emphasized image 560 changes with the switching from the automatic control operation by the integrated control ECU to the manual operation by the user.
• the user can intuitively grasp from the display color change of the emphasized image 560 that the automatic control operation has been switched to the manual operation. Therefore, the safety and security of the user can be ensured by using the emphasized image 560.
• the part of the HCU 54 that executes S2100, S7100, S2101, S102, S103, S104, S105, S7101, S7102, S7103a, S7103b, S7104, and S7105 corresponds to a “virtual image display control device” constructed by the processor 54p.
• the virtual image display control of the emphasized image 4560 according to the fourth embodiment may be applied to the second embodiment.
  • FIG. 28 shows a display control flow when the virtual image display control of the emphasized image 4560 according to the fourth embodiment is applied to the second embodiment. That is, in FIG. 28, S4103, S4104, and S4105 are executed in place of S103, S104, and S105.
  • the portion of the HCU 54 that executes S2100, S2101, S102, S4103, S4104, and S4105 corresponds to a “virtual image display control device” constructed by the processor 54p.
  • the virtual image display control of the emphasized image 4560 according to the fourth embodiment may be applied to the third embodiment.
  • FIG. 29 shows a display control flow when the virtual image display control of the emphasized image 4560 according to the fourth embodiment is applied to the third embodiment. That is, in FIG. 29, S4103, S4104, and S4105 are executed in place of S103, S104, and S105.
  • the portion of the HCU 54 that executes S3100, S3101, S102, S4103, S4104, and S4105 corresponds to a “virtual image display control device” constructed by the processor 54p.
• FIG. 30 shows a display control flow when the virtual image display control of the emphasized image 4560 according to the fourth embodiment is applied to the fifth embodiment. That is, in FIG. 30, S4103, S4104, and S4105 are executed in place of S103, S104, and S105. In addition, in FIG. 30, the position and size of the linear portion 560p are replaced with the respective positions and sizes of the linear portions 4560p1 and 4560p2, and S5103, S5104, and S5105 are executed.
• the portion of the HCU 54 that executes S5101a, S5101b, S102, S4103, S4104, S4105, S5102, S5103, S5104, and S5105 corresponds to a “virtual image display control device” constructed by the processor 54p.
• FIG. 31 shows a display control flow when the virtual image display control of the emphasized image 4560 according to the fourth embodiment is applied to the sixth embodiment. That is, in FIG. 31, S4103, S4104, and S4105 are executed in place of S103, S104, and S105. In addition, in FIG. 31, S5203a, S5204, and S5205 are executed by replacing the position and size of the linear portion 560p with the respective positions and sizes of the linear portions 4560p1 and 4560p2. Further, in FIG. 31, S5203b, S5204, and S5205 are executed by replacing the virtual image display shape of the linear portion 560p with the virtual image display shape of the entire emphasized image 4560 including the linear portions 4560p1 and 4560p2.
• the portion of the HCU 54 that executes S5101a, S5101b, S102, S4103, S4104, S4105, S5102, S5203a, S5203b, S5204, and S5205 corresponds to a “virtual image display control device” constructed by the processor 54p.
• the virtual image display control of the emphasized image 4560 according to the fourth embodiment may be applied to the seventh embodiment.
  • FIG. 32 shows a display control flow when the virtual image display control of the emphasized image 4560 according to the fourth embodiment is applied to the seventh embodiment. That is, in FIG. 32, S4103, S4104, and S4105 are executed in place of S103, S104, and S105.
• in addition, in FIG. 32, the position and size of the linear portion 560p are replaced with the respective positions and sizes of the linear portions 4560p1 and 4560p2, and S5303a, S5303b, S5304, S5305, S5104, and S5105 are executed. Further, in FIG. 32, S5303c, S5304, and S5305 are executed by replacing the virtual image display shape of the linear portion 560p with the virtual image display shape of the entire emphasized image 4560 including the linear portions 4560p1 and 4560p2.
• the part of the HCU 54 that executes S5101a, S5101b, S102, S4103, S4104, S4105, S5102, S5303a, S5303b, S5303c, S5304, S5305, S5104, and S5105 corresponds to a “virtual image display control device” constructed by the processor 54p.
• FIG. 33 shows a display control flow when the virtual image display control of the emphasized image 4560 according to the fourth embodiment is applied to the eighth embodiment. That is, in FIG. 33, S4103, S4104, and S4105 are executed in place of S103, S104, and S105. In addition, in FIG. 33, S5403a, S5204, and S5205 are executed by replacing the position and size of the linear portion 560p with the respective positions and sizes of the linear portions 4560p1 and 4560p2.
• FIG. 34 shows a display control flow when the virtual image display control of the emphasized image 4560 according to the fourth embodiment is applied to the ninth embodiment. That is, in FIG. 34, S4103, S4104, and S4105 are executed in place of S103, S104, and S105. At the same time, in FIG. 34, the position and size of the linear portion 560p are replaced with the respective positions and sizes of the linear portions 4560p1 and 4560p2, and S6104 and S6105 are executed. Further, in FIG. 34, S6103, S6104, and S6105 are executed by replacing the virtual image display luminance of the linear portion 560p with the virtual image display luminance of each of the linear portions 4560p1 and 4560p2.
• the part of the HCU 54 that executes S2100, S2101, S102, S4103, S4104, S4105, S6101, S6103, S6104, and S6105 corresponds to a “virtual image display control device” constructed by the processor 54p.
• the virtual image display control of the emphasized image 4560 according to the fourth embodiment may be applied to the tenth embodiment.
• FIG. 35 shows a display control flow when the virtual image display control of the emphasized image 4560 according to the fourth embodiment is applied to the tenth embodiment. That is, in FIG. 35, S4103, S4104, and S4105 are executed in place of S103, S104, and S105. At the same time, in FIG. 35, the position and size of the linear portion 560p are replaced with the respective positions and sizes of the linear portions 4560p1 and 4560p2, and S6104 and S6105 are executed. Further, in FIG. 35, S6203, S6104, and S6105 are executed by replacing the virtual image display luminance of the linear portion 560p with the virtual image display luminance of each of the linear portions 4560p1 and 4560p2.
• the part of the HCU 54 that executes S2100, S2101, S102, S4103, S4104, S4105, S6101, S6203, S6104, and S6105 corresponds to a “virtual image display control device” constructed by the processor 54p.
• the virtual image display control of the emphasized image 4560 according to the fourth embodiment may be applied to the eleventh embodiment.
  • FIG. 36 shows a display control flow when the virtual image display control of the emphasized image 4560 according to the fourth embodiment is applied to the eleventh embodiment. That is, in FIG. 36, S4103, S4104, and S4105 are executed in place of S103, S104, and S105.
• in addition, in FIG. 36, the position and size of the linear portion 560p are replaced with the respective positions and sizes of the linear portions 4560p1 and 4560p2, and S7103a, S7104, and S7105 are executed.
• further, S7103b, S7104, and S7105 are executed by replacing the virtual image display color of the linear portion 560p with the virtual image display color of each of the linear portions 4560p1 and 4560p2.
• the part of the HCU 54 that executes S2100, S7100, S2101, S102, S4103, S4104, S4105, S7101, S7102, S7103a, S7103b, S7104, and S7105 corresponds to a “virtual image display control device” constructed by the processor 54p.
• the linear portion 560p of the emphasized image 560 displayed as a virtual image according to the first to third and fifth to eleventh embodiments may be formed in a virtual image display shape other than a curved arc shape; for example, as shown in FIG. 37, it may be formed in a substantially inverted U shape that does not curve. FIG. 37 shows a tenth modification of the first embodiment.
• the first linear portion 4560p1 of the emphasized image 4560 displayed as a virtual image in the fourth embodiment and Modifications 1 to 9 may be formed in a virtual image display shape other than a curved arc shape, for example, a substantially inverted U shape that does not curve. Likewise, the second linear portion 4560p2 of the emphasized image 4560 may be formed in a virtual image display shape other than a curved arc shape, for example, a straight line that does not curve. FIGS. 38 to 40 show Modifications 11 and 12 of the fourth embodiment.
• like the fifth to eighth embodiments and Modifications 3 to 6, the emphasized image 560 or 4560 displayed as a virtual image in the second, third, and ninth to eleventh embodiments and Modifications 1, 2, and 7 to 9 may be displayed as virtual images around a plurality of front obstacles 8b.
• the virtual image display sizes that become smaller as the front obstacle 8b is farther from the host vehicle 2 may not be used in the sixth to eighth embodiments and Modifications 4 to 6.
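The distance-dependent sizing referred to here can be illustrated with a simple perspective-style scaling rule. This is only a sketch under assumed parameters; the patent does not prescribe this formula:

```python
def virtual_image_display_size(distance_m, base_size=1.0, ref_distance_m=10.0):
    """Sketch: the display size shrinks as the front obstacle gets farther.

    base_size and ref_distance_m are assumed illustration parameters.
    """
    if distance_m <= 0:
        raise ValueError("distance must be positive")
    # Inverse-proportional scaling, as in a simple perspective projection.
    return base_size * ref_distance_m / distance_m
```

With these assumed parameters, an obstacle four times farther away (40 m vs. 10 m) would receive a quarter of the base display size.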
• a virtual image display color of a color tone that differs depending on the type may be adopted.
• instead of or in addition to lowering the virtual image display luminance of at least a part of the emphasized image 560 according to the ninth and tenth embodiments and Modifications 7 and 8, the emphasized image 560 may be blinked.
• the virtual image display color may also be changed in accordance with the switching when the manual operation is switched to the automatic control operation.
• instead of the virtual image display color changed in accordance with the switching from the automatic control operation to the manual operation according to the eleventh embodiment and Modification 9, a virtual image display shape changed in accordance with the switching may be adopted.
  • the seventh embodiment and Modification 5 may be combined with the sixth embodiment and Modification 4, respectively.
  • the eighth embodiment and Modification 6 may be combined with the sixth embodiment and Modification 4, respectively.
  • the ninth embodiment and Modification 7 may be combined with the eleventh embodiment and Modification 9, respectively.
  • the tenth embodiment and Modification 8 may be combined with the eleventh embodiment and Modification 9, respectively.
  • the ACC according to the eleventh embodiment and the modified example 9 may be realized by the integrated control ECU in the vehicle control ECU 42 instead of the FSRA in the other embodiments and modified examples.
• the integrated control ECU among the vehicle control ECUs 42 according to the eleventh embodiment and Modification 9 may be caused to function as an “automatic control unit” that realizes the LKA, so that the virtual image display color is changed in accordance with the switching from the automatic control operation by the LKA to the manual operation. In this case, a combination with the third embodiment and Modification 2 is possible.
• the integrated control ECU among the vehicle control ECUs 42 according to the eleventh embodiment and Modification 9 may also be caused to function as an “automatic control unit” that realizes an automatic control operation other than ACC and LKA, and the virtual image display color may be changed in accordance with the switching from that automatic control operation to the manual operation.
• applicable automatic control operations other than ACC and LKA include, for example, automatic control of merging travel at a junction on a traveling road, branch travel at a branching point on a traveling road, travel from a gate to a merging point, and the like.
  • the HCU 54 may not be provided.
  • one type or a plurality of types among the ECUs 31 and 42 and the display ECU provided for controlling the display elements 50, 51, and 52 may be caused to function as the “vehicle display control device”.
  • the “virtual image display control device” may be constructed by realizing the display control flow of each embodiment by a processor included in one or more types of ECUs.
  • FIG. 41 shows a modified example 26 in which the display ECU 50e having the processor 54p and the memory 54m in the HUD 50 fulfills the function of “vehicle display control device”.
  • each section is expressed as S101, for example.
  • each section can be divided into a plurality of subsections, while a plurality of sections can be combined into one section.
  • each section configured in this manner can be referred to as a device, module, or means.

Abstract

The disclosure relates to a vehicle display control device (54, 50e) that controls the virtual image display of a display image (56) in a host vehicle (2) provided with a head-up display (50) for displaying a virtual image of the display image (56) on a projection member (21), the display image (56) being associated with a front obstacle (8b) in the outside scenery. The vehicle display control device includes an image storage (54m) and a virtual image display controller (S101, S102, S103, S104, S105, S2100, S2101, S3100, S3101, S5101a, S5101b, S5102, S5103, S5104, S5105, S5203a, S5203b, S5204, S5205, S5303a, S5303b, S5303c, S5304, S5305, S5403a, S5403b, S6101, S6103, S6104, S6105, S6203, S7101, S7102, S7103a, S7103b, S7104, S7105). At a virtual image display position that is the entire area of a range short of a full loop, excluding a lower part of the periphery of the front obstacle, the image storage stores, as the display image, an emphasized image (560) for emphasizing the front obstacle with a linear portion (560p) having a virtual image display size that surrounds the front obstacle with a gap (560m) left. The virtual image display controller is constituted by at least one processor (54p) and controls the virtual image display position and the virtual image display size.
PCT/JP2016/000371 2015-02-09 2016-01-26 Vehicle display control device and vehicle display unit WO2016129219A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/549,489 US20180024354A1 (en) 2015-02-09 2016-01-26 Vehicle display control device and vehicle display unit

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2015023621 2015-02-09
JP2015-023621 2015-02-09
JP2015236915A JP6520668B2 (ja) Vehicle display control device and vehicle display unit
JP2015-236915 2015-12-03

Publications (1)

Publication Number Publication Date
WO2016129219A1 true WO2016129219A1 (fr) 2016-08-18

Family

ID=56615135

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/000371 WO2016129219A1 (fr) Vehicle display control device and vehicle display unit

Country Status (1)

Country Link
WO (1) WO2016129219A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020009219A1 (fr) * 2018-07-05 2020-01-09 日本精機株式会社 Head-up display device
CN111316247A (zh) * 2017-09-07 2020-06-19 Lg电子株式会社 Error detection IC for vehicle AV system
US20210223058A1 (en) * 2018-12-14 2021-07-22 Denso Corporation Display control device and non-transitory computer-readable storage medium for the same

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11326519A (ja) * 1998-05-19 1999-11-26 Unisia Jecs Corp Obstacle detection device
JP2005343351A (ja) * 2004-06-04 2005-12-15 Olympus Corp Driving support device
JP2006163501A (ja) * 2004-12-02 2006-06-22 Denso Corp Appropriate inter-vehicle distance display control device
JP2009067368A (ja) * 2007-09-18 2009-04-02 Denso Corp Display device
WO2009072366A1 (fr) * 2007-12-05 2009-06-11 Bosch Corporation Vehicle information display device
JP2011079345A (ja) * 2009-10-02 2011-04-21 Denso Corp Vehicle head-up display


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111316247A (zh) * 2017-09-07 2020-06-19 Lg电子株式会社 Error detection IC for vehicle AV system
CN111316247B (zh) * 2017-09-07 2023-06-02 Lg电子株式会社 Error detection IC for vehicle AV system
WO2020009219A1 (fr) * 2018-07-05 2020-01-09 日本精機株式会社 Head-up display device
JPWO2020009219A1 (ja) 2018-07-05 2021-08-12 日本精機株式会社 Head-up display device
US11370304B2 (en) 2018-07-05 2022-06-28 Nippon Seiki Co., Ltd. Head-up display device
JP7338625B2 (ja) 2018-07-05 2023-09-05 日本精機株式会社 Head-up display device
US20210223058A1 (en) * 2018-12-14 2021-07-22 Denso Corporation Display control device and non-transitory computer-readable storage medium for the same

Similar Documents

Publication Publication Date Title
JP2016147652A (ja) Vehicle display control device and vehicle display unit
US10663315B2 (en) Vehicle display control device and vehicle display control method
EP3147149B1 (fr) Display device
US10272780B2 (en) Information display system and information display device
US10754153B2 (en) Vehicle display apparatus
WO2014208008A1 (fr) Head-up display and program product for head-up display
JP6443716B2 (ja) Image display device, image display method, and image display control program
WO2020003750A1 (fr) Vehicle display control device, vehicle display control method, and control program
JP6748947B2 (ja) Image display device, mobile object, image display method, and program
WO2016129219A1 (fr) Vehicle display control device and vehicle display unit
JP2017186008A (ja) Information display system
WO2019189393A1 (fr) Image control apparatus, display apparatus, mobile body, and image control method
CN113165510A (zh) Display control device, method, and computer program
EP3776152A1 (fr) Image control apparatus, display apparatus, mobile body, and image control method
JP6589775B2 (ja) Vehicle display control device and vehicle display system
WO2022168540A1 (fr) Display control device and display control program
JP6814416B2 (ja) Information providing device, information providing method, and control program for providing information
JP7054483B2 (ja) Information providing device, information providing method, and control program for providing information
JP6973462B2 (ja) Vehicle display control device
JP7014254B2 (ja) Vehicle display control device and vehicle display control method
WO2023145856A1 (fr) Display system
WO2023176737A1 (fr) Screen control device and method
JP7275985B2 (ja) Display control device
JP2018087852A (ja) Virtual image display device
JP2023003663A (ja) Virtual image display device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16748867

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15549489

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16748867

Country of ref document: EP

Kind code of ref document: A1