WO2022102374A1 - Vehicle display system - Google Patents

Vehicle display system

Info

Publication number
WO2022102374A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
display
position information
unit
display system
Application number
PCT/JP2021/039092
Other languages
English (en)
Japanese (ja)
Inventor
拓男 杉山
Original Assignee
株式会社小糸製作所 (Koito Manufacturing Co., Ltd.)
Application filed by 株式会社小糸製作所
Priority to JP2022561371A (JPWO2022102374A1)
Publication of WO2022102374A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
        • B60 VEHICLES IN GENERAL
            • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
                • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
                    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
                    • B60W30/14 Adaptive cruise control
    • G PHYSICS
        • G08 SIGNALLING
            • G08G TRAFFIC CONTROL SYSTEMS
                • G08G1/00 Traffic control systems for road vehicles
                    • G08G1/09 Arrangements for giving variable traffic instructions
                    • G08G1/16 Anti-collision systems

Definitions

  • This disclosure relates to a vehicle display system.
  • Patent Document 1 discloses a display system that identifies the traveling state of another vehicle existing around the own vehicle and brings it to the driver's attention.
  • the display system of Patent Document 1 includes a detection unit, an acquisition unit, and a display unit.
  • the acquisition unit acquires other vehicle information indicating the traveling state of the other vehicle.
  • the display unit generates an image of the other vehicle showing its driving state (including driving settings and behavior) based on the acquired other vehicle information, associates the generated image with the other vehicle, and displays the image so that it is superimposed on the scenery seen through the windshield from the driver's seat.
  • the display system of Patent Document 1 acquires and displays the other vehicle information by performing vehicle-to-vehicle communication with the other vehicle whenever another vehicle exists around the own vehicle.
  • however, since the other vehicle does not always affect the own vehicle, acquiring and displaying the other vehicle information for every surrounding vehicle increases the display load of the display system.
  • the vehicle display system of the present disclosure is a vehicle display system provided in a vehicle.
  • the vehicle display system includes: a display unit that displays the position information of at least one other vehicle on map information around the vehicle; a designation unit that designates the other vehicle based on an instruction from an occupant of the vehicle; and
  • a determination unit that determines whether or not the other vehicle is in the automatic driving mode.
  • when the determination unit determines that the designated other vehicle is not in the automatic driving mode, the display unit displays a driving support instruction for supporting the driving of the vehicle.
  • the vehicle display system of the present disclosure displays a driving support instruction when the determination unit determines that the other vehicle designated by the designation unit is not in the automatic driving mode. That is, the driving support instruction is displayed when another vehicle to be watched has been designated and that vehicle, not being in the automatic driving mode, may affect the own vehicle. Therefore, the occupants of the vehicle can concentrate their attention on the other vehicle. Further, because the driving support instruction is displayed only when the determination unit determines that the other vehicle is not in the automatic driving mode, the display unit does not have to display the driving support instruction at all times. Therefore, the display load of the vehicle display system can be reduced.
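The decision rule described above can be sketched in a few lines. The class and field names below are illustrative assumptions for this sketch, not terminology from the disclosure:

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class DrivingMode(Enum):
    MANUAL = auto()
    AUTOMATIC = auto()

@dataclass
class OtherVehicle:
    identifier: str    # e.g. the identifier the display shows next to the vehicle
    mode: DrivingMode  # driving mode reported over vehicle-to-vehicle communication

def needs_support_instruction(designated: Optional[OtherVehicle]) -> bool:
    """Display the driving support instruction X only when an occupant has
    designated another vehicle AND that vehicle is not in the automatic mode."""
    return designated is not None and designated.mode is not DrivingMode.AUTOMATIC
```

A manually driven designated vehicle triggers the instruction; with no designation (or an automatically driven vehicle) nothing extra is drawn, which is the display-load reduction the disclosure describes.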
  • FIG. 1 is a block diagram of a vehicle system including the vehicle display system of the present disclosure.
  • FIG. 2 is a schematic diagram of a head-up display (HUD) according to the present embodiment included in the vehicle display system.
  • FIG. 3 is a flowchart showing a processing flow of the vehicle display system.
  • FIG. 4 is an example of displaying map information, vehicle and other vehicle position information when the vehicle travels on a two-lane road.
  • FIG. 5 is a display example 1 of the HUD.
  • FIG. 6 is a display example 2 of the HUD.
  • FIG. 7 is a display example of map information, vehicle and other vehicle position information when the vehicle changes lanes.
  • FIG. 8 is an example of displaying map information, a vehicle, and position information of another vehicle when another vehicle changes lanes.
  • FIG. 9 is an example of displaying map information, vehicle and other vehicle position information when the vehicle and another vehicle travel toward a confluence.
  • FIG. 10 is an example of displaying map information, vehicle and position information of
  • In the description of the present embodiment, for convenience of explanation, the "left-right direction", "vertical direction", and "front-back direction" may be referred to as appropriate. These directions are relative directions set for the HUD (head-up display) 42 shown in FIG. 2.
  • the "left-right direction” is a direction including the “left direction” and the “right direction”.
  • the "vertical direction” is a direction including “upward” and “downward”.
  • the "front-back direction” is a direction including the "forward direction” and the "rear direction”.
  • the left-right direction is a direction orthogonal to the up-down direction and the front-back direction.
  • FIG. 1 is a block diagram of the vehicle system 2.
  • the vehicle 1 equipped with the vehicle system 2 is a vehicle (automobile) capable of traveling in a manual driving mode or an automatic driving mode.
  • the vehicle system 2 includes a vehicle control unit 3, a vehicle display system 4 (hereinafter simply referred to as the "display system 4"), a sensor 5, a camera 6, and a radar 7. Further, the vehicle system 2 includes an HMI (Human Machine Interface) 8, a GPS (Global Positioning System) 9, a wireless communication unit 10, a storage device 11, a steering actuator 12, a steering device 13, a brake actuator 14, a brake device 15, an accelerator actuator 16, and an accelerator device 17.
  • the vehicle control unit 3 is configured to control the running of the vehicle.
  • the vehicle control unit 3 is composed of, for example, at least one electronic control unit (ECU: Electronic Control Unit).
  • the electronic control unit includes a computer system including one or more processors and one or more memories (for example, SoC (System on a Chip) or the like), and an electronic circuit composed of active elements such as transistors and passive elements.
  • the processor includes, for example, at least one of a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), and a TPU (Tensor Processing Unit).
  • the CPU may be composed of a plurality of CPU cores.
  • the GPU may be composed of a plurality of GPU cores.
  • the memory includes a ROM (Read Only Memory) and a RAM (Random Access Memory).
  • the vehicle control program may be stored in the ROM.
  • the vehicle control program may include an artificial intelligence (AI) program for autonomous driving.
  • AI is a program (trained model) constructed by supervised or unsupervised machine learning (particularly deep learning) using a multi-layer neural network.
  • the RAM may temporarily store a vehicle control program, vehicle control data, and / or peripheral environment information indicating the surrounding environment of the vehicle.
  • the processor may be configured to develop a program designated from various vehicle control programs stored in the ROM on the RAM and execute various processes in cooperation with the RAM.
  • the computer system may be configured by a non-Von Neumann computer such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array). Further, the computer system may be composed of a combination of a Von Neumann computer and a non-Von Neumann computer.
  • the vehicle control unit 3 may control the operation of the display system 4. In this case, the vehicle control unit 3 constitutes a part of the display system 4.
  • the vehicle control unit 3 is an example of a determination unit.
  • the display system 4 includes a headlamp 20, a HUD 42, and a display control unit 43.
  • the headlamps 20 are arranged on the left and right sides of the front surface of the vehicle 1, and each includes a low beam lamp configured to irradiate a low beam in front of the vehicle 1 and a high beam lamp configured to irradiate a high beam in front of the vehicle 1.
  • Each of the low beam lamp and the high beam lamp has one or more light emitting elements such as an LED (Light Emitting Diode) and an LD (Laser Diode), and an optical member such as a lens and a reflector.
  • the HUD 42 is located inside the vehicle 1. Specifically, the HUD 42 is installed at a predetermined position in the cabin of the vehicle 1. For example, the HUD 42 may be located within the dashboard of the vehicle 1.
  • the HUD 42 is a visual interface between the vehicle 1 and the occupants.
  • the HUD 42 is configured to display predetermined information (hereinafter referred to as HUD information) to the occupants so that the HUD information is superimposed on the real space outside the vehicle 1 (particularly, the surrounding environment in front of the vehicle).
  • the HUD 42 is an AR (Augmented Reality) display.
  • the HUD information displayed by the HUD 42 is, for example, vehicle traveling information related to the traveling of the vehicle 1 and/or surrounding environment information related to the surrounding environment of the vehicle 1 (particularly, information related to an object existing outside the vehicle 1).
  • the HUD 42 is an example of a display unit. Details of the HUD 42 will be described later.
  • the display control unit 43 is configured to control the operation of the headlamp 20 and the HUD 42. Further, the display control unit 43 is configured to determine the driving mode of at least one other vehicle around the vehicle 1.
  • the display control unit 43 is composed of an electronic control unit (ECU).
  • the electronic control unit includes a computer system (for example, SoC) including one or more processors and one or more memories, and an electronic circuit composed of active elements such as transistors and passive elements.
  • the processor includes at least one of CPU, MPU, GPU and TPU.
  • the memory includes a ROM and a RAM. Further, the computer system may be configured by a non-Von Neumann computer such as an ASIC or FPGA.
  • the display control unit 43 is an example of a determination unit.
  • the display control unit 43 is configured to control the HUD 42 so that the HUD 42 displays the map information M around the vehicle 1, the position information of the vehicle 1, the position information of at least one other vehicle around the vehicle 1, and a driving support instruction X for supporting the driving of the vehicle 1.
  • the driving support instruction X includes a stop instruction X1 or a warning instruction X2.
  • the HUD 42 may display an identifier together with the position information of at least one other vehicle. When there are a plurality of other vehicles around the vehicle 1, the HUD 42 may display an identifier with the position information of each other vehicle.
  • the vehicle control unit 3 and the display control unit 43 are provided as separate configurations, but the vehicle control unit 3 and the display control unit 43 may be integrally configured.
  • the display control unit 43 and the vehicle control unit 3 may be configured by a single electronic control unit. At this time, the vehicle control unit 3 determines the driving mode of at least one other vehicle around the vehicle 1 as a determination unit.
  • the display control unit 43 may be composed of two electronic control units: an electronic control unit configured to control the headlamp 20 and an electronic control unit configured to control the operation of the HUD 42.
  • the control board 425 that controls the operation of the HUD 42 may be configured as a part of the display control unit 43.
  • the display control unit 43 is configured to control the sensor 5 and specify at least one other vehicle around the vehicle 1 based on the instruction of the occupant of the vehicle 1.
  • the sensor 5 is, for example, a sound sensor that detects the voice of an occupant. When the occupant of the vehicle 1 utters an identifier of at least one other vehicle, the sensor 5 is configured to receive an instruction corresponding to the identifier.
  • the sensor 5 may include at least one of an acceleration sensor, a speed sensor and a gyro sensor. The sensor 5 is configured to detect the traveling state of the vehicle 1 and output the traveling state information to the vehicle control unit 3.
  • the sensor 5 includes a seating sensor that detects whether the driver is sitting in the driver's seat, a face orientation sensor that detects the direction of the driver's face, an external weather sensor that detects the external weather condition, and whether or not there is a person in the vehicle. A human sensor or the like for detection may be further provided.
  • the sensor 5 is an example of a designation unit.
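A minimal sketch of the voice-based designation described above, assuming each surrounding vehicle is shown with a short alphanumeric identifier and that matching a spoken identifier is a simple token lookup (both are assumptions; the disclosure does not specify a recognition method):

```python
import re
from typing import List, Optional

def designate_by_voice(utterance: str, known_identifiers: List[str]) -> Optional[str]:
    """Return the first known vehicle identifier spoken in the utterance,
    or None if no identifier is recognized. The tokenizer and matching
    rule here are illustrative placeholders."""
    tokens = re.findall(r"[A-Za-z0-9]+", utterance.upper())
    for token in tokens:
        if token in known_identifiers:
            return token
    return None
```

In practice the utterance would come from a speech recognizer fed by the sound sensor; only the identifier-matching step is sketched here.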
  • the camera 6 is, for example, a camera including an image sensor such as a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal-Oxide-Semiconductor) sensor.
  • the camera 6 includes one or more external cameras 6A and an internal camera 6B.
  • the external camera 6A is configured to acquire image data indicating the surrounding environment of the vehicle and then transmit the image data to the vehicle control unit 3.
  • the vehicle control unit 3 acquires surrounding environment information based on the transmitted image data.
  • the surrounding environment information may include information on an object (pedestrian, other vehicle, sign, etc.) existing outside the vehicle 1.
  • the surrounding environment information may include information on the attributes of the object existing outside the vehicle 1 and information on the distance and position of the object with respect to the vehicle 1.
  • the external camera 6A may be configured as a monocular camera or a stereo camera. Further, the image pickup camera 30 may be substituted for the external camera 6A.
  • the internal camera 6B is arranged inside the vehicle 1 and is configured to acquire image data indicating an occupant.
  • the display control unit 43 is configured to control the internal camera 6B and specify at least one other vehicle around the vehicle 1 based on the instructions of the occupants of the vehicle 1.
  • the internal camera 6B is a tracking camera that detects and tracks the viewpoint P of the occupant.
  • the viewpoint P of the occupant may be either the viewpoint of the left eye or the viewpoint of the right eye of the occupant.
  • the viewpoint P may be defined as the midpoint of a line segment connecting the viewpoint of the left eye and the viewpoint of the right eye.
  • the display control unit 43 may specify the position of the occupant's viewpoint P based on the image data acquired by the internal camera 6B.
  • the position of the viewpoint P of the occupant may be updated at a predetermined cycle based on the image data, or may be determined only once when the vehicle 1 is started.
  • the internal camera 6B is an example of a designation unit.
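The midpoint definition of the viewpoint P given above can be written directly; the coordinates are assumed to be 3-D points in some common vehicle frame:

```python
from typing import Tuple

Point3 = Tuple[float, float, float]

def viewpoint_midpoint(left_eye: Point3, right_eye: Point3) -> Point3:
    """Viewpoint P as the midpoint of the line segment connecting the
    left-eye and right-eye viewpoints (one of the definitions above)."""
    return tuple((l + r) / 2.0 for l, r in zip(left_eye, right_eye))
```

The same function would be re-run at the predetermined update cycle whenever the tracking camera delivers new eye positions.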
  • the radar 7 includes at least one of a millimeter wave radar, a microwave radar and a laser radar (for example, a LiDAR unit).
  • the LiDAR unit is configured to detect the surrounding environment of the vehicle 1.
  • the LiDAR unit is configured to acquire 3D mapping data (point cloud data) indicating the surrounding environment of the vehicle 1 and then transmit the 3D mapping data to the vehicle control unit 3.
  • the vehicle control unit 3 identifies the surrounding environment information based on the transmitted 3D mapping data.
  • the HMI 8 is composed of an input unit that receives an input operation from the driver and an output unit that outputs driving information and the like to the driver.
  • the input unit includes a steering wheel, an accelerator pedal, a brake pedal, an operation mode changeover switch for switching the operation mode of the vehicle, and the like.
  • the output unit is a display (excluding the HUD 42) that displays various traveling information.
  • the GPS 9 is configured to acquire the position information of the vehicle 1 via communication with the satellite and output the acquired position information to the vehicle control unit 3.
  • GPS 9 is an example of the first communication unit.
  • a satellite is an example of an external device.
  • the wireless communication unit 10 is configured to receive information about at least one other vehicle around the vehicle 1 (for example, the position information and traveling information of the other vehicle) from the other vehicle, and to transmit information about the vehicle 1 (for example, the position information and traveling information of the vehicle 1) to the other vehicle (vehicle-to-vehicle communication).
  • the wireless communication unit 10 may acquire the position information of a plurality of other vehicles around the vehicle 1, respectively.
  • the wireless communication unit 10 is configured to receive map information M and infrastructure information from infrastructure equipment such as traffic lights and indicator lights, and to transmit traveling information of vehicle 1 to the infrastructure equipment (road-to-vehicle communication).
  • the wireless communication unit 10 is configured to receive information about a pedestrian from a portable electronic device (smartphone, tablet, wearable device, etc.) carried by the pedestrian, and to transmit the traveling information of the vehicle 1 to the portable electronic device (pedestrian-to-vehicle communication).
  • the wireless communication unit 10 may directly communicate with another vehicle, infrastructure equipment, or a portable electronic device, or may communicate via an access point. Further, the wireless communication unit 10 may communicate with another vehicle, infrastructure equipment, or a portable electronic device via a communication network (not shown).
  • the communication network includes at least one of the Internet, a local area network (LAN), a wide area network (WAN) and a radio access network (RAN).
  • the wireless communication standard is, for example, Wi-Fi (registered trademark), Bluetooth (registered trademark), ZigBee (registered trademark), LPWA, DSRC (registered trademark) or Li-Fi.
  • the vehicle 1 may communicate with another vehicle, infrastructure equipment, or a portable electronic device by using a fifth generation mobile communication system (5G).
  • the wireless communication unit 10 is an example of the second communication unit and the third communication unit.
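The information exchanged over vehicle-to-vehicle communication can be modelled as a small message. The field set and the JSON wire format below are assumptions for illustration only; production V2X stacks use standardized message sets rather than ad-hoc JSON:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class V2VMessage:
    vehicle_id: str    # identifier shown on the display
    latitude: float    # position information
    longitude: float
    speed_mps: float   # traveling information
    driving_mode: str  # e.g. "automatic" or "manual"

def encode(msg: V2VMessage) -> bytes:
    """Serialize a message for transmission (illustrative wire format)."""
    return json.dumps(asdict(msg)).encode("utf-8")

def decode(payload: bytes) -> V2VMessage:
    """Reconstruct the message on the receiving side."""
    return V2VMessage(**json.loads(payload.decode("utf-8")))
```

The `driving_mode` field is what would later let the determination unit decide whether the designated vehicle is in the automatic driving mode.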
  • the storage device 11 is an external storage device such as a hard disk drive (HDD) or a solid state drive (SSD).
  • the storage device 11 may store two-dimensional or three-dimensional map information M and/or a vehicle control program.
  • the three-dimensional map information may be composed of 3D mapping data (point cloud data).
  • the storage device 11 is configured to output the map information M and the vehicle control program to the vehicle control unit 3 in response to a request from the vehicle control unit 3.
  • the map information M and the vehicle control program may be updated via the wireless communication unit 10 and the communication network.
  • When the vehicle 1 travels in the automatic driving mode, the vehicle control unit 3 automatically generates at least one of a steering control signal, an accelerator control signal, and a brake control signal based on the traveling state information, the surrounding environment information, the position information of the vehicle 1, the map information M, and the like.
  • the steering actuator 12 is configured to receive a steering control signal from the vehicle control unit 3 and control the steering device 13 based on the received steering control signal.
  • the brake actuator 14 is configured to receive a brake control signal from the vehicle control unit 3 and control the brake device 15 based on the received brake control signal.
  • the accelerator actuator 16 is configured to receive an accelerator control signal from the vehicle control unit 3 and control the accelerator device 17 based on the received accelerator control signal.
  • the vehicle control unit 3 automatically controls the travel of the vehicle 1 based on the travel state information, the surrounding environment information, the position information of the vehicle 1, the map information M, and the like. That is, in the automatic driving mode, the traveling of the vehicle 1 is automatically controlled by the vehicle system 2.
  • When the vehicle 1 travels in the manual driving mode, the vehicle control unit 3 generates a steering control signal, an accelerator control signal, and a brake control signal according to the driver's manual operation of the accelerator pedal, the brake pedal, and the steering wheel.
  • since the steering control signal, the accelerator control signal, and the brake control signal are generated by the driver's manual operation, the driving of the vehicle 1 is controlled by the driver.
  • At least one other vehicle around the vehicle 1 also has the following driving modes.
  • the description of the driving mode of the other vehicle is omitted because it overlaps with the description of the driving mode of the vehicle 1.
  • the driving mode consists only of the automatic driving mode and the manual driving mode.
  • in the manual driving mode, the vehicle system 2 does not automatically control the driving, and the driver drives the vehicle 1 without the driving support of the vehicle system 2.
  • the manual driving mode is an example of the second driving mode.
  • the automatic driving mode includes a first automatic driving mode and a second automatic driving mode, whose automatic driving level is lower than that of the first automatic driving mode.
  • the automatic driving mode includes five levels.
  • the automatic driving mode includes a fully automatic driving mode at level 5, advanced driving support modes at level 4 and level 3, and driving support modes at level 2 and level 1.
  • the automatic driving level is highest at level 5 and becomes lower in the order of level 4, level 3, level 2, and level 1.
  • In the fully automatic driving mode (level 5), the vehicle system 2 automatically performs all driving control, such as steering control, brake control, and accelerator control, and the driver is not in a state capable of driving the vehicle 1.
  • in this case, the fully automatic driving mode (level 5) is the first automatic driving mode, and
  • the advanced driving support modes and driving support modes other than the fully automatic driving mode (level 4 to level 1) are the second automatic driving mode.
  • In the level 4 advanced driving support mode, when the vehicle 1 is under specific conditions, the vehicle system 2 automatically performs all of the steering control, brake control, and accelerator control; the driver is in a state capable of driving the vehicle 1 but does not drive it. For example, at level 4, the vehicle system 2 may automatically perform all driving control while the vehicle 1 travels in a specific area, and the driver may drive the vehicle 1 when it travels outside the specific area.
  • in this case, the level 4 advanced driving support mode is also the first automatic driving mode, and
  • the level 3 to level 1 automatic driving modes are the second automatic driving mode.
  • In the level 3 advanced driving support mode, when the vehicle 1 is under specific conditions, the vehicle system 2 automatically performs all driving control, and the driver may drive the vehicle 1 in response to a request from the vehicle system 2. For example, at level 3, while the vehicle 1 travels on a highway, the vehicle system 2 basically performs the driving control automatically, but it sends a request to the driver at a predetermined timing, and the driver may drive the vehicle 1 in response to that request.
  • In the level 2 driving support mode, the vehicle system 2 automatically performs part of the steering control, brake control, and accelerator control under specific conditions, and the driver drives the vehicle 1 under the driving support of the vehicle system 2. For example, at level 2, when the vehicle 1 is traveling on a highway and there is a slow preceding vehicle, the vehicle system 2 may control the traveling so as to automatically overtake the preceding vehicle. In the level 1 driving support mode, the vehicle system 2 performs only a small part of the driving control, and the driver basically drives the vehicle 1. For example, at level 1, the vehicle system 2 controls the traveling so that the vehicle 1 automatically stops so as not to collide with an object, keeps a certain distance from the preceding vehicle, and does not depart from its lane.
  • When at least one other vehicle around the vehicle 1 is in the level 5 fully automatic driving mode or the level 4 advanced driving support mode, the vehicle system 2 of the other vehicle performs most of its driving control, so the driving of the other vehicle is easy to predict.
  • in the level 3 to level 1 modes, the driver of the other vehicle controls the driving to a greater degree than when the vehicle system 2 controls it, and the driving of the other vehicle is difficult to predict. Therefore, in this example, the level 3 to level 1 automatic driving modes are set as the second automatic driving mode.
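The level split used in this example can be captured in one small function. The mapping of level 0 to manual driving is an assumption added for completeness; the level 5/4 versus 3..1 split follows the example above:

```python
def classify_driving_mode(level: int) -> str:
    """Classify a driving automation level per this example's convention:
    levels 5 and 4 (the system does most of the driving, so behaviour is
    easy to predict) form the first automatic driving mode; levels 3 to 1
    form the second automatic driving mode; level 0 is manual driving."""
    if level in (5, 4):
        return "first automatic driving mode"
    if level in (3, 2, 1):
        return "second automatic driving mode"
    if level == 0:
        return "manual driving mode"
    raise ValueError(f"unknown driving level: {level}")
```

A determination unit could use such a classification after reading the other vehicle's level over vehicle-to-vehicle communication.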
  • the display control unit 43 is configured to determine whether or not the other vehicle communicating with the wireless communication unit 10 is in the automatic driving mode. Further, the vehicle control unit 3 or the display control unit 43 is configured to determine whether the other vehicle is in the first automatic driving mode or the second automatic driving mode via the communication of the wireless communication unit 10. The determination process of the vehicle control unit 3 or the display control unit 43 will be described later.
  • FIG. 2 is a schematic diagram of the HUD 42 according to the present embodiment.
  • the HUD 42 includes a HUD main body portion 420.
  • the HUD main body 420 has a housing 422 and an exit window 423.
  • the exit window 423 is a transparent plate that transmits visible light.
  • the HUD main body 420 has an image generation unit (PGU: Picture Generation Unit) 424, a control board 425, a concave mirror 426, a drive mechanism 427, and a plane mirror 428 inside the housing 422.
  • the image generation unit 424 has a light source, an optical component, and a display device.
  • the light source is, for example, a laser light source or an LED light source.
  • the laser light source is, for example, an RGB laser light source configured to emit red laser light, green laser light, and blue laser light, respectively.
  • Optical components include prisms, lenses, diffusers, magnifying glasses and the like.
  • the display device is a liquid crystal display, a DMD (Digital Micromirror Device), or the like.
  • the drawing method of the image generation unit 424 may be a raster scan method, a DLP method, or an LCOS method.
  • the light source of the HUD 42 may be an LED light source.
  • the light source of the HUD 42 may be a white LED light source.
  • the control board 425 is configured to control the operation of the image generation unit 424 and the drive mechanism 427.
  • the control board 425 is equipped with a processor such as a CPU (Central Processing Unit) and a memory, and the processor executes a computer program read from the memory to control the operation of the image generation unit 424.
  • the control board 425 is configured to generate a control signal for controlling the operation of the image generation unit 424 based on the image data transmitted from the display control unit 43, and then to send the generated control signal to the image generation unit 424. Further, the control board 425 may control the direction of the concave mirror 426 via the drive mechanism 427.
  • the concave mirror 426 is arranged on the optical path of the light emitted from the image generation unit 424 and reflected by the plane mirror 428. Specifically, the concave mirror 426 is arranged in the HUD main body 420 in front of the image generation unit 424 and the plane mirror 428. The concave mirror 426 is configured to reflect the light emitted by the image generation unit 424 toward the windshield 18 (for example, the front window of the vehicle 1) through the exit window 423.
  • the concave mirror 426 has a concavely curved reflecting surface that forms a virtual image, and reflects an image of light emitted from the image generation unit 424 and formed into an image at a predetermined magnification.
  • the light emitted from the exit window 423 of the HUD main body 420 is applied to the windshield 18.
  • a part of the light emitted from the HUD main body 420 to the windshield 18 is reflected toward the occupant's viewpoint P.
  • the occupant recognizes the light (predetermined image) emitted from the HUD main body 420 as a virtual image formed at a predetermined distance in front of the windshield 18.
  • since the image displayed by the HUD 42 is superimposed on the real space in front of the vehicle 1 through the windshield 18, the occupant can visually recognize the virtual image object I formed by the predetermined image as if it were floating on the road located outside the vehicle.
  • When a 2D image (planar image) is used as the predetermined image, the image is projected so as to become a virtual image at a single, arbitrarily determined distance. When a 3D image (stereoscopic image) is used as the predetermined image, a plurality of predetermined images that are the same as or different from each other are projected so as to become virtual images at different distances.
  • The distance of the virtual image object I (the distance from the occupant's viewpoint P to the virtual image) can be adjusted by adjusting the optical path length from the image generation unit 424 to the occupant's viewpoint P (for example, by adjusting the distance between the image generation unit 424 and the concave mirror 426).
  • the HUD main body 420 does not have to have the plane mirror 428. In this case, the light emitted from the image generation unit 424 is incident on the concave mirror 426 without being reflected by the plane mirror 428.
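The virtual-image-distance adjustment described above follows from ordinary concave-mirror optics. The sketch below illustrates the relationship with the thin-mirror equation; the focal length and object distance used are illustrative assumptions, not values from this disclosure.

```python
def virtual_image_distance(focal_length_m: float, object_distance_m: float) -> float:
    """Solve the thin-mirror equation 1/f = 1/d_o + 1/d_i for the image distance d_i.

    When the object (the image emitted by the image generation unit 424) lies
    inside the focal length of the concave mirror 426, d_i is negative: the
    image is virtual and appears behind the mirror. Lengthening the optical
    path from the image generation unit to the mirror therefore moves the
    virtual image object I farther in front of the windshield 18.
    """
    return 1.0 / (1.0 / focal_length_m - 1.0 / object_distance_m)

# Illustrative numbers only: f = 0.30 m, object 0.25 m from the mirror.
d_i = virtual_image_distance(0.30, 0.25)
print(d_i)  # negative -> a virtual image behind the mirror
```

With these assumed numbers the image distance is about -1.5 m, i.e. a virtual image; moving the image generation unit changes this distance, which is the adjustment mechanism the paragraph above describes.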
  • FIG. 3 is a flowchart showing the processing flow of the display system 4.
  • First, the GPS 9 acquires the position information of the vehicle 1 via communication with satellites, and the wireless communication unit 10 acquires the position information of at least one other vehicle around the vehicle 1 (step 1). When there are a plurality of other vehicles around the vehicle 1, the wireless communication unit 10 acquires the position information of each of them.
  • the GPS 9 or the wireless communication unit 10 may update the position information of the vehicle 1 or the position information of another vehicle at predetermined time intervals.
  • the acquired position information of the vehicle 1 and the acquired position information of the other vehicle are transmitted to the display control unit 43 via the vehicle control unit 3.
  • Since the map information M is stored in the storage device 11, the map information M is also transmitted to the display control unit 43 via the vehicle control unit 3.
  • the map information M may be acquired by road-to-vehicle communication between the infrastructure equipment and the wireless communication unit 10, and may be transmitted to the display control unit 43.
  • When the display control unit 43 receives the position information of the other vehicle, it assigns an identifier to that position information (step 2). When there are a plurality of other vehicles around the vehicle 1, an identifier is assigned to the position information of each other vehicle. The identifier may instead be assigned to the position information of the other vehicle by the vehicle control unit 3 before the position information is transmitted to the display control unit 43.
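The identifier assignment of step 2 can be sketched as a simple mapping from sequential labels to position records. The function name and record fields below are hypothetical illustrations, not part of this disclosure.

```python
from string import ascii_uppercase

def assign_identifiers(other_vehicles: list) -> dict:
    """Step 2 sketch: label each nearby vehicle's position record A, B, C, ...

    The labels are what the HUD 42 would display next to each position so the
    occupant can designate one vehicle by uttering or gazing at its letter.
    """
    return {letter: vehicle for letter, vehicle in zip(ascii_uppercase, other_vehicles)}

# Hypothetical position records for the oncoming, preceding, and following vehicles.
nearby = [
    {"role": "oncoming", "lat": 35.6812, "lon": 139.7671},
    {"role": "preceding", "lat": 35.6813, "lon": 139.7673},
    {"role": "following", "lat": 35.6810, "lon": 139.7670},
]
labels = assign_identifiers(nearby)
print(sorted(labels))  # ['A', 'B', 'C']
```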
  • The display control unit 43 displays the map information M around the vehicle 1 on the HUD 42, and displays the acquired position information of the vehicle 1, the acquired position information of the other vehicle, and the assigned identifier on the map information M (step 3).
  • the display control unit 43 determines whether or not the sensor 5 or the internal camera 6B has received the instruction corresponding to the identifier (step 4). In other words, the display control unit 43 determines whether or not another vehicle corresponding to the identifier has been designated.
  • the other vehicle is designated by the occupant of the vehicle 1 sending an instruction corresponding to one identifier displayed on the HUD 42 to the sensor 5 or the internal camera 6B.
  • In this example, the occupant is the driver of the vehicle 1, but the occupant is not limited to the driver. The occupant may be an occupant sitting in the passenger seat or the back seat.
  • If another vehicle is not designated (NO in step 4), the display system 4 returns to the process of acquiring the position information of the vehicle 1 and the position information of the other vehicle (step 1). If another vehicle is designated (YES in step 4), the designation is transmitted to the display control unit 43.
  • the display control unit 43 determines whether or not the wireless communication unit 10 can communicate with one other designated vehicle based on the transmitted designation (step 5).
  • When communication with the designated other vehicle is not possible (NO in step 5), the display control unit 43 displays the warning instruction X2 on the HUD 42 (step 12).
  • When communication is possible (YES in step 5), the display control unit 43 determines, via the communication of the wireless communication unit 10, whether the driving mode of the other vehicle is the manual driving mode or the automatic driving mode (step 6).
  • When the display control unit 43 determines that the driving mode of the other vehicle with which the wireless communication unit 10 communicates is the manual driving mode (YES in step 6), the driver of the other vehicle drives it without the driving support of its vehicle system, so it is difficult for the occupant of the vehicle 1 to predict how the other vehicle will travel. In this case, in order to display the driving support instruction X on the HUD 42, the display control unit 43 determines, based on the map information M and the position information of the vehicle 1, whether the surroundings of the vehicle 1 are a general road or an expressway (step 10).
  • When the display control unit 43 determines that the driving mode of the other vehicle with which the wireless communication unit 10 communicates is the automatic driving mode (NO in step 6), the display control unit 43 determines whether or not the driving mode of the other vehicle is the fully automated driving mode (step 7).
  • When the display control unit 43 determines that the driving mode of the other vehicle is the fully automated driving mode (level 5) (YES in step 7), the vehicle system of the other vehicle controls its traveling, so it is easy for the occupant of the vehicle 1 to predict how the other vehicle will travel. In this case, since there is little need to call the occupant's attention to the other vehicle, the display control unit 43 does not display the driving support instruction X on the HUD 42 (step 13).
  • When the display control unit 43 determines that the driving mode of the other vehicle is not the fully automated driving mode (NO in step 7), the display control unit 43 determines whether or not the driving mode of the other vehicle is the advanced driving support mode (step 8).
  • When the display control unit 43 determines that the driving mode of the other vehicle is not the advanced driving support mode (NO in step 8), the other vehicle is in the level 2 driving support mode or the level 1 driving support mode. In this case, the driver of the other vehicle often drives it manually rather than relying on travel control by its vehicle system, so it is difficult for the occupant of the vehicle 1 to predict how the other vehicle will travel. Accordingly, in order to display on the HUD 42 the driving support instruction X urging the occupant to pay attention to the other vehicle, the display control unit 43 determines whether the surroundings of the vehicle 1 are a general road or an expressway (step 10).
  • When the display control unit 43 determines that the driving mode of the other vehicle is the advanced driving support mode (YES in step 8), the driving mode of the other vehicle is the level 4 advanced driving support mode or the level 3 advanced driving support mode. Next, the display control unit 43 determines whether or not the other vehicle is under a specific condition (step 9).
  • When the driving mode of the other vehicle is the level 4 advanced driving support mode and the other vehicle is under the specific condition (YES in step 9), the display control unit 43 does not display the driving support instruction X on the HUD 42 (step 13).
  • Otherwise (NO in step 9), the driving mode of the other vehicle is the level 3 advanced driving support mode or the level 4 advanced driving support mode not under the specific condition. In this case, in order to display the driving support instruction X on the HUD 42, the display control unit 43 determines whether the surroundings of the vehicle 1 are a general road or an expressway (step 10). Between step 8 and step 9, the display control unit 43 may determine whether the other vehicle is in the level 3 advanced driving support mode or the level 4 advanced driving support mode.
  • When the display control unit 43 determines that the surroundings of the vehicle 1 are a general road (general road in step 10), the display control unit 43 displays the temporary stop instruction X1 as the driving support instruction X on the HUD 42 in order to alert the occupant of the vehicle 1 to the other vehicle (step 11).
  • When the display control unit 43 determines that the surroundings of the vehicle 1 are an expressway (expressway in step 10), the display control unit 43 displays the warning instruction X2 as the driving support instruction X on the HUD 42 (step 12). After any of the processes of step 11, step 12, and step 13, the display system 4 returns to the process of acquiring the position information of the vehicle 1 and the position information of the other vehicle (step 1).
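The branching of steps 6 through 13 can be summarized as a single decision function. The sketch below mirrors the flow described above; the function signature and the string return values are illustrative assumptions.

```python
def driving_support_instruction(mode: str,
                                level: int,
                                under_specific_condition: bool,
                                on_expressway: bool):
    """Return which driving support instruction X the HUD 42 should show:
    'X1' (temporary stop), 'X2' (warning), or None (no display, step 13)."""
    # Step 6: the manual driving mode always needs the occupant's attention,
    # so it falls straight through to the road-type check (step 10).
    if mode == "automatic":
        # Step 7: level 5 fully automated driving mode -> no instruction (step 13).
        if level == 5:
            return None
        # Steps 8-9: level 4 advanced driving support under the specific
        # condition also needs no instruction (step 13).
        if level == 4 and under_specific_condition:
            return None
        # Level 1/2 driving support, level 3, or level 4 outside the specific
        # condition fall through to step 10.
    # Step 10: general road -> temporary stop instruction X1 (step 11);
    # expressway -> warning instruction X2 (step 12).
    return "X2" if on_expressway else "X1"

print(driving_support_instruction("manual", 0, False, False))   # X1
print(driving_support_instruction("automatic", 5, False, True)) # None
print(driving_support_instruction("automatic", 3, False, True)) # X2
```

For example, a manually driven vehicle on a general road yields X1, while a level 3 advanced-driving-support vehicle on an expressway yields X2.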
  • In this way, whenever at least one other vehicle exists around the vehicle 1, the display system 4 displays the map information M around the vehicle 1, the position information of the vehicle 1, and the position information of the other vehicle together with its identifier on the HUD 42, and displays the driving support instruction X on the HUD 42 according to the driving mode of the other vehicle.
  • FIG. 4 is an example of displaying the map information M, the position information of the vehicle 1, the position information of another vehicle, and the identifier when the vehicle 1 travels on a two-lane road.
  • the map information M around the vehicle 1 is displayed, and the position information of the vehicle 1 is displayed on the map information M.
  • the oncoming vehicle 2A is traveling in the lane opposite to the lane in which the vehicle 1 is traveling.
  • the preceding vehicle 2B and the following vehicle 2C are traveling in the same lane as the vehicle 1.
  • the identifier A is displayed on the map information M together with the position information of the oncoming vehicle 2A.
  • the identifier B is displayed on the map information M together with the position information of the preceding vehicle 2B.
  • the identifier C is displayed on the map information M together with the position information of the following vehicle 2C (step 3 in FIG. 3).
  • For example, the occupant of the vehicle 1 utters the identifier A to designate the oncoming vehicle 2A among the plurality of other vehicles.
  • the sensor 5 receives an instruction from the occupant by detecting the voice of the occupant and designates the oncoming vehicle 2A (YES in step 4 of FIG. 3).
  • Alternatively, the occupant fixes the viewpoint P on the position information or the identifier A of the oncoming vehicle 2A displayed on the HUD 42 for a certain period of time, and the internal camera 6B receives the instruction from the occupant by detecting the viewpoint P of the occupant and designates the oncoming vehicle 2A (YES in step 4 of FIG. 3).
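Gaze-based designation as described above amounts to a dwell-time check on the internal camera's gaze samples. The sketch below is an illustrative assumption: the dwell threshold, frame rate, and sample format are hypothetical, not specified by this disclosure.

```python
def designated_identifier(gaze_hits, dwell_s: float = 1.5, frame_hz: float = 30.0):
    """Return the identifier the occupant's viewpoint P has rested on for at
    least dwell_s seconds, or None if no identifier was fixated long enough.

    gaze_hits holds one entry per camera frame: the identifier (e.g. "A")
    under the viewpoint P, or None when the gaze is elsewhere.
    """
    needed = int(dwell_s * frame_hz)   # consecutive frames required
    current, run = None, 0
    for hit in gaze_hits:
        if hit is not None and hit == current:
            run += 1
        else:
            current = hit
            run = 1 if hit is not None else 0
        if current is not None and run >= needed:
            return current
    return None

# 45 consecutive frames on "A" at the assumed 30 Hz is 1.5 s of dwell.
print(designated_identifier(["A"] * 45))  # A
print(designated_identifier(["A"] * 20))  # None (gaze too short)
```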
  • the wireless communication unit 10 performs vehicle-to-vehicle communication with the oncoming vehicle 2A (YES in step 5 in FIG. 3).
  • the display control unit 43 determines the driving mode of the oncoming vehicle 2A (steps 6 to 9 in FIG. 3).
  • When the driving mode of the oncoming vehicle 2A is the level 5 fully automated driving mode (YES in step 7 of FIG. 3) or the level 4 advanced driving support mode under the specific condition (YES in step 9 of FIG. 3), there is little need to call the occupant's attention to the oncoming vehicle, so the display control unit 43 does not display the driving support instruction X on the HUD 42 (step 13).
  • Otherwise, the display control unit 43 displays the driving support instruction X on the HUD 42. At this time, the display control unit 43 determines whether the surroundings of the vehicle 1 are a general road or an expressway (step 10 in FIG. 3).
  • When the surroundings are a general road, the display control unit 43 displays the temporary stop instruction X1 on the HUD 42 (step 11 in FIG. 3). In this way, the occupant of the vehicle 1 can be alerted to the oncoming vehicle 2A, reducing the risk of a collision between the vehicle 1 and another vehicle on the general road.
  • FIG. 5 is an example of the virtual image object I when the surroundings of the vehicle 1 are general roads as the display example 1 of the HUD 42.
  • the map information M, the position information of the vehicle 1, and the position information of another vehicle are displayed in the lower right of the display area of the HUD 42.
  • As the driving support instruction X, the temporary stop instruction X1 is displayed in the upper right of the display area of the HUD 42.
  • FIG. 5 is an example of display, and the display size, shape, and display location of the map information M and the pause instruction X1 are not limited.
  • When the surroundings of the vehicle 1 are an expressway, the display control unit 43 displays the warning instruction X2 on the HUD 42 (step 12 in FIG. 3). If the vehicle 1 were to stop temporarily on the expressway, the vehicle 1 and the following vehicle 2C might collide. Therefore, when the vehicle 1 is traveling on an expressway, it is preferable to display the warning instruction X2 on the HUD 42 rather than the temporary stop instruction X1.
  • FIG. 6 is an example of the virtual image object I when the surroundings of the vehicle 1 are highways as another display example 2 of the HUD 42.
  • the map information M, the position information of the vehicle 1, and the position information of another vehicle are displayed in the lower right of the display area of the HUD 42.
  • As the driving support instruction X, the warning instruction X2 is displayed in the upper right of the display area of the HUD 42.
  • FIG. 6 is an example of display, and the display size, shape, and display location of the map information M and the warning instruction X2 are not limited.
  • The display system 4 of this example displays the driving support instruction X on the HUD 42 when the display control unit 43 determines that the designated other vehicle is not in the automatic driving mode. That is, the driving support instruction X is displayed when another vehicle to be watched has been designated and that vehicle, not being in the automatic driving mode, may affect the vehicle 1. Therefore, the occupant of the vehicle 1 can concentrate attention on the other vehicle. Further, since the driving support instruction X is displayed only when the display control unit 43 determines that the driving mode of the other vehicle is not the automatic driving mode, it is not necessary to always display the driving support instruction X on the HUD 42. Therefore, the display load of the display system 4 can be reduced.
  • the display system 4 of this example includes a GPS 9 for acquiring the position information of the vehicle 1 and a wireless communication unit 10 for acquiring the position information of the oncoming vehicle 2A.
  • the position information of the vehicle 1 and the position information of the oncoming vehicle 2A are displayed on the HUD 42. Therefore, the occupant of the vehicle 1 can easily grasp the positional relationship between the vehicle 1 and the oncoming vehicle 2A, and can easily determine whether or not to specify the identifier of the oncoming vehicle 2A.
  • the position information of the oncoming vehicle 2A may be acquired by the external camera 6A. However, in general, other vehicles that may affect the vehicle 1 do not always exist within the detection range of the external camera 6A.
  • When the wireless communication unit 10 acquires the position information of the oncoming vehicle 2A through vehicle-to-vehicle communication as in this example, the position information of the oncoming vehicle 2A can be acquired regardless of the performance and the detection range of the external camera 6A.
  • The display system 4 of this example includes the wireless communication unit 10 that communicates with the designated oncoming vehicle 2A, and the display control unit 43 determines via this communication whether or not the oncoming vehicle 2A is in the automatic driving mode. Therefore, the driving mode of the oncoming vehicle 2A can be acquired with high accuracy by vehicle-to-vehicle communication with the oncoming vehicle 2A. Further, the wireless communication unit 10 communicates only with the designated oncoming vehicle 2A, and does not need to communicate with the undesignated preceding vehicle 2B and following vehicle 2C. Therefore, the communication load of the display system 4 can be reduced.
  • Since the display system 4 of this example designates one of the other vehicles by using the sensor 5 or the internal camera 6B, it communicates only with that vehicle. Therefore, the communication load of the display system 4 can be reduced, and the occupant of the vehicle 1 can concentrate on the vehicle that most warrants attention.
  • the position information of the oncoming vehicle 2A acquired by the wireless communication unit 10 is displayed with an identifier attached, so that the occupant can easily instruct the oncoming vehicle 2A. Further, even when there are a plurality of other vehicles around the vehicle 1, since a plurality of identifiers are attached and displayed, it becomes easy to identify one other vehicle from the plurality of other vehicles.
  • Since the instruction corresponds to an identifier, the display system 4 can accurately recognize the other vehicle corresponding to that identifier. Even when there are a plurality of other vehicles around the vehicle, the display system 4 accurately recognizes the other vehicle corresponding to the designated identifier, which improves the recognition accuracy of the system.
  • When the driving mode of the designated other vehicle is the manual driving mode, that is, when the display control unit 43 determines that the other vehicle is in the manual driving mode, the driving support instruction X is displayed on the HUD 42, so it is not necessary to always display the driving support instruction X. Therefore, the display load of the display system 4 can be reduced.
  • Likewise, when the oncoming vehicle 2A to be watched is designated and its automation level is low, so that the oncoming vehicle 2A may affect the vehicle 1, the driving support instruction X is displayed on the HUD 42. Therefore, the occupant of the vehicle 1 can concentrate attention on the oncoming vehicle 2A. Further, since the driving support instruction X is displayed when the display control unit 43 determines that the oncoming vehicle 2A is in a driving mode other than the fully automated driving mode, it is not necessary to always display the driving support instruction X. Therefore, the display load of the display system 4 can be reduced.
  • Since the sensor 5 detects the voice of the occupant and the internal camera 6B detects the line of sight of the occupant of the vehicle 1, the occupant can instruct the designation of the oncoming vehicle 2A via the sensor 5 or the internal camera 6B without interrupting the driving operation of the vehicle 1.
  • Since the temporary stop instruction X1 or the warning instruction X2 is displayed on the HUD 42, the occupant of the vehicle 1 can be alerted to the oncoming vehicle 2A more accurately while the communication load and the display load of the display system 4 are reduced.
  • In this example, the display control unit 43 determines the driving mode of the oncoming vehicle 2A and whether the surroundings of the vehicle 1 are a general road or an expressway, but the vehicle control unit 3 may instead make these determinations.
  • the vehicle control unit 3 may control the GPS 9, the wireless communication unit 10, the sensor 5, and the internal camera 6B, or the display control unit 43 may control these via the vehicle control unit 3.
  • In this example, the GPS 9 acquires the position information of the vehicle 1, and the wireless communication unit 10 acquires the position information and the driving mode of the oncoming vehicle 2A, but the present invention is not limited to this. For example, the wireless communication unit 10 may acquire the position information and the driving mode from infrastructure equipment. In this case, the first communication unit, the second communication unit, and the third communication unit are the same device.
  • The display system 4 of this example designated the oncoming vehicle 2A, but the designation is not limited to the oncoming vehicle 2A. The display system 4 may designate the preceding vehicle 2B or the following vehicle 2C. Of the three other vehicles, two of them, the oncoming vehicle 2A and the preceding vehicle 2B, may be designated at the same time. In this case, since vehicle-to-vehicle communication is performed only with the two designated vehicles rather than with all three other vehicles, the communication load of the display system 4 can be reduced.
  • FIG. 4 shows a display example when the vehicle 1 travels on a two-lane road, but the present disclosure is not limited to this case.
  • FIG. 7 is an example of displaying the map information M, the position information of the vehicle 1, the position information of the other vehicles 2B and 2C, and the identifier when the vehicle 1 changes lanes.
  • the vehicle 1 is traveling in one lane, and the preceding vehicle 2B and the following vehicle 2C are traveling in the other lane as a plurality of other vehicles around the vehicle 1.
  • the occupant of the vehicle 1 may want to confirm the driving mode of the preceding vehicle 2B or the following vehicle 2C.
  • Even in the case of a lane change as shown in FIG. 7, the display system 4 of this example can receive an instruction of the identifier B of the preceding vehicle 2B or the identifier C of the following vehicle 2C from the occupant of the vehicle 1, and can determine the driving mode of the vehicle corresponding to the instruction by performing vehicle-to-vehicle communication with that vehicle.
  • FIG. 7 shows a display example when the vehicle 1 changes lanes, but the present disclosure is not limited to this case.
  • FIG. 8 is a display example of the map information M, the position information of the vehicle 1, the position information of the other vehicle 2D, and the identifier D when the other vehicle 2D changes lanes toward the lane in which the vehicle 1 is traveling.
  • the vehicle 1 is traveling in one lane, and the other vehicle 2D is traveling in the other lane as one other vehicle around the vehicle 1.
  • the occupant of the vehicle 1 may want to confirm the driving mode of the other vehicle 2D.
  • Even in the case of a lane change as shown in FIG. 8, the display system 4 of this example can receive an instruction of the identifier D of the other vehicle 2D from the occupant of the vehicle 1 and perform vehicle-to-vehicle communication with the other vehicle 2D, so the driving mode of the other vehicle 2D can be determined.
  • FIG. 9 is a display example of the map information M, the position information of the vehicle 1, the position information of the other vehicle 2E, and the identifier E when the vehicle 1 and the other vehicle 2E travel toward the confluence point J.
  • In FIG. 9, the vehicle 1 is traveling in one lane, and the other vehicle 2E is traveling in the other lane as one other vehicle around the vehicle 1.
  • the occupant of the vehicle 1 may want to confirm the driving mode of the other vehicle 2E.
  • Even in the case of the confluence point J as shown in FIG. 9, the display system 4 of this example can receive an instruction of the identifier E of the other vehicle 2E from the occupant of the vehicle 1 and perform vehicle-to-vehicle communication with the other vehicle 2E, so the driving mode of the other vehicle 2E can be determined.
  • the display system 4 does not need to constantly display the map information M, the position information of the vehicle 1, the position information of the other vehicle 2E, and its identifier E.
  • For example, the map information M and the like may not normally be displayed, and may be displayed when the vehicle 1 passes a point 300 m before the confluence point J.
  • The distance from the confluence point J may be set relatively short when the vehicle 1 is traveling on a general road, and relatively long when the vehicle 1 is traveling on an expressway.
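The road-type-dependent display trigger can be sketched as below. The 300 m figure comes from the example above; the longer expressway distance is an illustrative assumption (higher speeds call for an earlier display).

```python
def should_display_map(distance_to_merge_m: float, on_expressway: bool) -> bool:
    """Decide whether the HUD 42 should start showing the map information M,
    the position information, and the identifiers as the vehicle nears the
    confluence point J."""
    # 300 m before the merge on a general road (from the example above);
    # 600 m on an expressway is an assumed, relatively longer threshold.
    threshold_m = 600.0 if on_expressway else 300.0
    return distance_to_merge_m <= threshold_m

print(should_display_map(250.0, False))  # True
print(should_display_map(400.0, False))  # False
print(should_display_map(400.0, True))   # True
```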
  • Alternatively, the display system 4 may normally not display the map information M and the like, and may display them based on a display instruction from the occupant of the vehicle 1.
  • FIG. 10 shows an example of displaying the map information M, the position information of the vehicle 1, the position information of the other vehicles 2F and 2G, and the identifiers F and G when the vehicle 1 travels toward the first intersection IS1 and the second intersection IS2.
  • In FIG. 10, there are a first intersection IS1 and a second intersection IS2 in the traveling direction of the vehicle 1.
  • the other vehicle 2F is traveling toward the first intersection IS1.
  • the other vehicle 2F is traveling away from the stop line L1 at the first intersection IS1.
  • the other vehicle 2G is traveling toward the second intersection IS2.
  • the other vehicle 2G is traveling near the stop line L2 at the second intersection IS2.
  • The occupant of the vehicle 1 may want to confirm the driving mode of the other vehicle 2F or the other vehicle 2G. In particular, the occupant may want to confirm the driving mode of the other vehicle 2G approaching the second intersection IS2 before that of the other vehicle 2F traveling away from the first intersection IS1.
  • Even in the case of the first intersection IS1 and the second intersection IS2 as shown in FIG. 10, the display system 4 of this example can receive an instruction of the identifier G of the other vehicle 2G from the occupant of the vehicle 1 and determine the driving mode of the other vehicle 2G by performing vehicle-to-vehicle communication with it. In FIG. 10, the occupant of the vehicle 1 may designate the two vehicles, the other vehicle 2F and the other vehicle 2G, at the same time.
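The preference for confirming the vehicle approaching an intersection first can be expressed as a ranking over successive distance samples to each vehicle's stop line. The two-sample tracking scheme and the field layout below are illustrative assumptions, not part of this disclosure.

```python
def confirmation_priority(tracks: dict) -> list:
    """Rank identifiers so that vehicles closing on their stop line come
    first (nearest first), and vehicles moving away come last.

    tracks maps an identifier to (previous_m, current_m): two successive
    distances from that vehicle to its intersection's stop line.
    """
    def sort_key(item):
        _, (previous_m, current_m) = item
        approaching = current_m < previous_m
        # Approaching vehicles sort before receding ones; ties break on
        # the current distance to the stop line.
        return (0 if approaching else 1, current_m)
    return [ident for ident, _ in sorted(tracks.items(), key=sort_key)]

# 2F is moving away from the first intersection IS1; 2G is nearing IS2.
order = confirmation_priority({"F": (40.0, 90.0), "G": (120.0, 60.0)})
print(order)  # ['G', 'F']
```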
  • Also in this case, the display system 4 may not normally display the map information M and the like, and may display them when the vehicle 1 passes a point in front of the first intersection IS1, for example 300 m away.
  • the first intersection IS1 and the second intersection IS2 are shown as a plurality of intersections, but the number of intersections may be one or three or more.


Abstract

A vehicle display system mounted on a vehicle, comprising: a display unit that displays position information of at least one other vehicle on map information around the vehicle; a designation unit that designates one other vehicle based on an instruction given by a driver in the vehicle; and a determination unit that determines whether the other vehicle is in an automated driving mode. When the determination unit determines that the other vehicle is not in an automated driving mode, the display unit displays a driving support instruction for assisting the driving of the vehicle.
PCT/JP2021/039092 2020-11-16 2021-10-22 Vehicle display system WO2022102374A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022561371A JPWO2022102374A1 (fr) 2020-11-16 2021-10-22

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-190156 2020-11-16
JP2020190156 2020-11-16

Publications (1)

Publication Number Publication Date
WO2022102374A1 true WO2022102374A1 (fr) 2022-05-19

Family

ID=81601066

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/039092 WO2022102374A1 (fr) 2020-11-16 2021-10-22 Vehicle display system

Country Status (2)

Country Link
JP (1) JPWO2022102374A1 (fr)
WO (1) WO2022102374A1 (fr)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1086698A (ja) * 1996-09-12 1998-04-07 Hitachi Ltd 自動車の走行制御装置
JP2002163783A (ja) * 2000-11-22 2002-06-07 Mazda Motor Corp 車両の表示装置
JP2005115484A (ja) * 2003-09-17 2005-04-28 Fujitsu Ten Ltd 運転支援装置
JP2014203398A (ja) * 2013-04-09 2014-10-27 株式会社デンソー 危険車両通知装置、危険車両通知プログラム、危険車両通知プログラムを記録した記録媒体
JP2015044432A (ja) * 2013-08-27 2015-03-12 株式会社デンソー 運転支援装置、および運転支援方法
WO2017047176A1 (fr) * 2015-09-18 2017-03-23 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme
WO2018056103A1 (fr) * 2016-09-26 2018-03-29 ソニー株式会社 Dispositif de commande de véhicule, procédé de commande de véhicule et corps mobile
WO2018150580A1 (fr) * 2017-02-20 2018-08-23 三菱電機株式会社 Dispositif de correction de plan de déplacement, et procédé de correction de plan de déplacement
JP2019119262A (ja) * 2017-12-28 2019-07-22 株式会社小糸製作所 ヘッドアップディスプレイ装置
US20200090509A1 (en) * 2017-06-19 2020-03-19 Boe Technology Group Co., Ltd. Vehicle monitoring method and apparatus
JP2020154748A (ja) * 2019-03-20 2020-09-24 トヨタ自動車株式会社 制御システム


Also Published As

Publication number Publication date
JPWO2022102374A1 (fr) 2022-05-19

Similar Documents

Publication Publication Date Title
JP7254832B2 (ja) Head-up display, vehicle display system, and vehicle display method
US11597316B2 Vehicle display system and vehicle
US12005832B2 Vehicle display system, vehicle system, and vehicle
WO2021065617A1 (fr) Vehicle display system and vehicle
WO2020031915A1 (fr) Vehicle display system and vehicle
JP7295863B2 (ja) Vehicle display system and vehicle
JP2023175794A (ja) Head-up display
WO2021015171A1 (fr) Head-up display
WO2022102374A1 (fr) Vehicle display system
EP3961291B1 Vehicle head-up display and light source unit used therefor
US20240227664A1 Vehicle display system, vehicle system, and vehicle
WO2023190338A1 (fr) Image irradiation device
JP7492971B2 (ja) Head-up display
US20240069335A1 Head-up display
JP2023141162A (ja) Image display device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21891621; Country of ref document: EP; Kind code of ref document: A1)

ENP Entry into the national phase (Ref document number: 2022561371; Country of ref document: JP; Kind code of ref document: A)

NENP Non-entry into the national phase (Ref country code: DE)

122 Ep: pct application non-entry in european phase (Ref document number: 21891621; Country of ref document: EP; Kind code of ref document: A1)