WO2021039762A1 - Display device - Google Patents

Display device

Info

Publication number
WO2021039762A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
vehicle
front object
drawing control
Prior art date
Application number
PCT/JP2020/031957
Other languages
English (en)
Japanese (ja)
Inventor
誠 秦
Original Assignee
日本精機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本精機株式会社 filed Critical 日本精機株式会社
Priority to JP2021542915A priority Critical patent/JPWO2021039762A1/ja
Publication of WO2021039762A1 publication Critical patent/WO2021039762A1/fr


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory

Definitions

  • the present invention relates to a display device.
  • Patent Documents 1 to 3 disclose display devices that display an image visually recognized by a user (mainly a driver of a vehicle) as a virtual image overlapping the scenery in front of the vehicle.
  • The display devices disclosed in Patent Documents 1 to 3 display, as a virtual image, a notification image for notifying the user of the existence of a front object or the like at a position corresponding to the position of the front object existing in front of the vehicle (for example, a position superimposed on the front object).
  • These devices suppress the notification image from shifting from its desired display position due to external factors, such as vibration applied to the vehicle or the behavior of the front object, which would give the user a sense of discomfort.
  • However, factors that cause the notification image to deviate from its desired display position include not only the above-mentioned external factors but also internal factors such as the drawing control load and the sensing process for the front object. There is therefore room for improvement in preventing the user from feeling discomfort due to a shift in the display position of the notification image.
  • The present invention has been made in view of the above circumstances, and an object of the present invention is to provide a display device capable of suppressing the discomfort given to a user who visually recognizes the notification image.
  • The display device according to the present invention is a display device that displays a superimposed image visually recognized by the user as a virtual image overlapping the scenery in front of the vehicle, and includes:
  • an acquisition means for acquiring object information on a front object existing in front of the vehicle, the information including at least position information of the front object;
  • a drawing control means for drawing, based on the object information acquired by the acquisition means, a notification image for notifying information about the front object at a corresponding position corresponding to the position information in the display area of the superimposed image; and
  • a specifying means for specifying a display delay factor that causes the notification image drawn based on the object information acquired at a predetermined timing to be displayed later than the predetermined timing.
  • The display delay factor is specified based on at least one of the drawing control load by the drawing control means and the transmission delay of the object information.
  • When the value indicated by the specified display delay factor is equal to or greater than a predetermined value, the drawing control means executes an image enlargement process for drawing the notification image in a size larger than a predetermined reference size.
  • FIG. 1A shows an example of how the head-up display (HUD) device according to the first embodiment of the present invention is mounted in a vehicle. FIG. 1B shows the relationship between the user's viewpoint position and the virtual image. FIG. 2 is a block diagram of the vehicle display system. FIG. 3A shows a display example of the notification image according to the first embodiment, and FIG. 3B shows an example of the occurrence of a display shift. Further figures show a display example of the notification image when the image enlargement processing according to the first embodiment is executed, and another example of the occurrence of a display shift.
  • the display device is a HUD (Head-Up Display) device 10 included in the vehicle display system 100 shown in FIG.
  • The HUD device 10 is provided inside the dashboard 2 of the vehicle 1 (hereinafter also referred to as the own vehicle 1) and integrally notifies the user 4 (mainly the driver of the vehicle 1) not only of information about the vehicle 1 (hereinafter referred to as vehicle information) but also of information other than vehicle information.
  • the vehicle information includes not only the information of the vehicle 1 itself but also the external information of the vehicle 1 related to the operation of the vehicle 1.
  • The vehicle display system 100 is a system configured in the vehicle 1 and, as shown in FIG. 2, includes the HUD device 10, a front object detection unit 40, a viewpoint detection unit 50, an ECU (Electronic Control Unit) 60, and a car navigation device 70.
  • the HUD device 10 emits the display light Q toward the windshield 3 of the vehicle 1.
  • the display light Q reflected by the windshield 3 goes toward the user 4.
  • The user 4 can visually recognize the image represented by the display light Q as a virtual image A displayed in front of the windshield 3.
  • That is, the user 4 can visually recognize the virtual image A superimposed on the scenery ahead.
  • the HUD device 10 displays the virtual image A on the virtual surface set in front of the vehicle 1.
  • The virtual surface is set in front of the vehicle 1 at a position separated by a predetermined distance P (for example, about 5 to 10 m) from the viewpoint position 4a at which the user 4 places the viewpoint in the eye box 5.
  • the virtual surface corresponds to the display surface of the image on the display unit 20.
  • the virtual surface and the eye box 5 are set based on the size of the display surface and the optical system composed of various mirrors and the windshield 3 in the HUD device 10.
  • the HUD device 10 includes a display unit 20 and a control device 30 shown in FIG. 2, and a reflection unit (not shown).
  • the display unit 20 displays a superimposed image visually recognized by the user 4 as a virtual image A under the control of the control device 30.
  • the display unit 20 includes, for example, a TFT (Thin Film Transistor) type LCD (Liquid Crystal Display), a backlight that illuminates the LCD from behind, and the like.
  • the backlight is configured to include, for example, an LED (Light Emitting Diode).
  • Under the control of the control device 30, the display unit 20 generates the display light Q when the LCD illuminated by the backlight displays an image.
  • the generated display light Q is reflected by the reflecting portion and then emitted toward the windshield 3.
  • the reflecting portion is composed of, for example, two mirrors, a folded mirror and a concave mirror.
  • the folded mirror folds the display light Q emitted from the display unit 20 and directs it toward the concave mirror.
  • the concave mirror magnifies the display light Q from the folded mirror and reflects it toward the windshield 3.
  • the virtual image A visually recognized by the user 4 is an enlarged image displayed on the display unit 20.
  • the type and number of mirrors constituting the reflecting unit can be arbitrarily changed according to the design.
  • Displaying an image visually recognized by the user 4 as a virtual image A on the display unit 20 is also referred to as "displaying a superimposed image".
  • the control device 30 performing the display control of the display unit 20 is also referred to as “controlling the display of the superimposed image”.
  • As long as it can display a superimposed image, the display unit 20 is not limited to one using an LCD; it may use another display device such as an OLED (Organic Light Emitting Diodes) display, a DMD (Digital Micromirror Device), or an LCOS (Liquid Crystal On Silicon) device.
  • the control device 30 is composed of a microcomputer that controls the overall operation of the HUD device 10, and includes a control unit 31 and a storage unit 32. Further, the control device 30 includes a driver for driving the display unit 20 and an input / output circuit for communicating with various systems in the vehicle 1 as a configuration (not shown).
  • the storage unit 32 is composed of a ROM (Read Only Memory) that stores an operation program and various image data in advance, a RAM (Random Access Memory) that temporarily stores various calculation results, and the like.
  • the ROM of the storage unit 32 stores data of an operation program for executing the display control process described later, data of various tables and mathematical formulas used when executing the display control process, and the like.
  • the control unit 31 includes a CPU (Central Processing Unit) 31a that executes an operation program stored in the ROM of the storage unit 32, and a GDC (Graphics Display Controller) 31b that executes image processing in cooperation with the CPU 31a.
  • the GDC 31b is composed of, for example, a GPU (Graphics Processing Unit), an FPGA (Field-Programmable Gate Array), an ASIC (Application Specific Integrated Circuit), or the like.
  • the GDC 31b is capable of both drawing processing using a vector image and drawing processing using a raster image (bitmap image).
  • the configuration of the control device 30 and the control unit 31 is arbitrary as long as the functions described below are satisfied.
  • the control unit 31 drives and controls the display unit 20.
  • the control unit 31 drives and controls the backlight of the display unit 20 by the CPU 31a, and drives and controls the LCD of the display unit 20 by the GDC 31b that operates in cooperation with the CPU 31a.
  • the CPU 31a of the control unit 31 cooperates with the GDC 31b to control the superimposed image based on various image data stored in the ROM of the storage unit 32.
  • the GDC 31b determines the control content of the display operation of the display unit 20 based on the display control command from the CPU 31a.
  • the GDC 31b reads the image part data necessary for forming one screen to be displayed on the display unit 20 from the ROM and transfers it to the RAM of the storage unit 32. Further, the GDC 31b uses the RAM to create picture data for one screen based on the image part data and various image data received by communication from the outside of the HUD device 10.
  • the GDC 31b transfers the picture data to the display unit 20 in accordance with the image update timing (frame rate).
  • the superposed image visually recognized by the user 4 as a virtual image A is displayed on the display unit 20.
  • layers are assigned in advance to each image constituting the image visually recognized as the virtual image A, and the control unit 31 can individually control the display of each image.
  • the control unit 31 draws the content image in the display area of the virtual image A.
  • The rectangular frame denoted by the virtual image A in FIGS. 1B, 3, and the like indicates the display area of the virtual image A, and the content image is an image visually recognized as a part of the virtual image A in that display area. Drawing the content image in the display area of the virtual image A is synonymous with displaying the content image in the display area of the superimposed image on the display unit 20, which displays the superimposed image visually recognized by the user 4 as the virtual image A.
  • the content image is an arbitrary image drawn in the display area of the virtual image A, and includes a notification image C for notifying information about a front object F existing in front of the vehicle 1.
  • Under the control of the control unit 31, the notification image C is displayed at the corresponding position in the display area of the virtual image A corresponding to the position information (information indicating the position of the front object F) described later.
  • The corresponding position, as the display position of the notification image C in the display area of the virtual image A, is the position at which, in an ideal state where the display shift described later does not occur, the notification image C is visually recognized by the user 4 superimposed on at least a part of the front object F or adjacent to the front object F.
  • FIG. 3A is an example in which a notification image C for notifying the existence of a front object F as a preceding vehicle is displayed.
  • The content image includes not only the notification image C but also images showing the vehicle information and the like.
  • Examples of the images showing the vehicle information are the image A1 showing the vehicle speed and the image A2 showing the speed limit of the traveling lane, shown in FIG. 3B.
  • The front object F to be notified by the notification image C is not limited to the preceding vehicle; it may be a pedestrian, a building, or a part of a road (for example, a lane marking such as a solid or broken white line, an intersection, or a branch road), and can be set arbitrarily.
  • control unit 31 communicates with each of the front object detection unit 40, the viewpoint detection unit 50, the ECU 60, and the car navigation device 70.
  • communication methods such as CAN (Controller Area Network), Ethernet (registered trademark), MOST (Media Oriented Systems Transport) (registered trademark), and LVDS (Low Voltage Differential Signaling) can be applied.
  • the front object detection unit 40 detects the front object F located in front of the own vehicle 1, and supplies "object information" including at least the position information of the front object F to the control unit 31 as information regarding the detected front object F.
  • The position information included in the object information is, for example, as shown in FIG. 1B, coordinate data (x, y, z) in which the traveling direction of the vehicle 1 is the x-axis, the left-right direction of the vehicle 1 is the y-axis, and the height direction of the vehicle 1 is the z-axis.
  • the origin of the coordinates may be an arbitrarily set position such as a representative point at the tip of the vehicle 1 or a center point of the vehicle 1.
  • the front object detection unit 40 includes, for example, one or a plurality of combinations of various sensors described below and a processing unit that processes information detected by the sensors.
  • The sensors include, for example, a LiDAR (Light Detection And Ranging) sensor.
  • The front object detection unit 40 removes unnecessary point clouds from the point cloud data measured by the LiDAR and then performs a clustering process to detect one or a plurality of clusters (point cloud sets). Further, the front object detection unit 40 determines, using a classifier, whether or not a detected cluster presumed to correspond to a preset front object F is in fact the front object F.
  • For the clustering process, Euclidean Clustering in the PCL (Point Cloud Library) can be used.
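As an illustration of the clustering step, the following is a minimal pure-Python sketch of Euclidean clustering; in practice PCL's implementation, with its KD-tree neighbor search and minimum/maximum cluster-size limits, would be used, and the tolerance value here is an assumption:

```python
import math

def euclidean_clustering(points, tolerance):
    """Group 3-D points into clusters: two points belong to the same
    cluster when they lie within `tolerance` of each other, directly or
    through a chain of neighbors (the idea behind PCL's Euclidean
    Clustering, without the KD-tree acceleration)."""
    clusters = []
    unvisited = set(range(len(points)))
    while unvisited:
        seed = unvisited.pop()
        cluster = [seed]
        frontier = [seed]
        while frontier:
            i = frontier.pop()
            near = [j for j in unvisited
                    if math.dist(points[i], points[j]) <= tolerance]
            for j in near:
                unvisited.remove(j)
                cluster.append(j)
                frontier.append(j)
        clusters.append([points[i] for i in cluster])
    return clusters

# Two well-separated point groups ahead of the vehicle (x, y, z in meters)
cloud = [(20.0, 0.0, 0.5), (20.2, 0.1, 0.6), (20.1, -0.1, 0.4),
         (35.0, 3.0, 0.5), (35.1, 3.2, 0.6)]
clusters = euclidean_clustering(cloud, tolerance=1.0)
print(len(clusters))  # → 2
```

Each resulting cluster would then be passed to the classifier to decide whether it is a front object F.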
  • The front object detection unit 40 supplies to the control unit 31 object information including the position information of the front object F detected in this way (including information indicating the size of the front object F) and type information indicating the type of the front object F (preceding vehicle, pedestrian, building, etc.). Further, the front object detection unit 40 may include an image pickup device that captures the front landscape including the road surface on which the vehicle 1 travels.
  • the image pickup apparatus is composed of, for example, a stereo camera.
  • the front object detection unit 40 calculates the position information and type information of the front object F by analyzing the image data captured by the imaging device with the processing unit by a known method such as a pattern matching method. Then, the forward object detection unit 40 supplies these information to the control unit 31 as object information.
  • The front object detection unit 40 may also include, as sensors for detecting the front object F, a sonar, an ultrasonic sensor, a millimeter wave radar, and the like.
  • the control unit 31 acquires object information from the front object detection unit 40, and identifies the position of the front object F in the display area of the virtual image A based on the acquired object information.
  • When a plurality of front objects F are detected, the control unit 31 attaches identification information (ID) to each of the front objects F and specifies the position of a given front object F using the position information corresponding to its ID.
  • If the position information (coordinates) of the front object F is known, the direction of the front object F with respect to the own vehicle 1 can be specified. Further, if the position information is known, the distance from the own vehicle 1 to the front object F can be calculated. By time-differentiating the distance, the relative speed of the front object F with respect to the own vehicle 1 can be calculated.
  • the direction of the front object F and the calculation of the distance and the relative velocity may be performed by the front object detection unit 40 or the control unit 31.
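The geometry just described can be sketched as follows. The coordinate convention (x forward, y left-right, z up) follows FIG. 1B; the sampling interval and positions are illustrative assumptions:

```python
import math

def distance_and_bearing(pos):
    """Distance and horizontal bearing of a front object from the own
    vehicle, given object coordinates (x: forward, y: left-right, z: up)."""
    x, y, z = pos
    distance = math.sqrt(x * x + y * y + z * z)
    bearing = math.degrees(math.atan2(y, x))  # 0 deg = straight ahead
    return distance, bearing

def relative_speed(d_prev, d_now, dt):
    """Approximate the relative speed by time-differentiating the distance;
    negative means the front object is closing in on the own vehicle."""
    return (d_now - d_prev) / dt

# Object measured 30 m ahead, then 28 m ahead 0.1 s later
d0, _ = distance_and_bearing((30.0, 0.0, 0.0))
d1, _ = distance_and_bearing((28.0, 0.0, 0.0))
print(relative_speed(d0, d1, dt=0.1))  # ≈ -20 m/s (closing)
```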
  • the viewpoint detection unit 50 has a known configuration for detecting the viewpoint position (including the line-of-sight direction) of the user 4.
  • The viewpoint detection unit 50 includes, for example, an infrared camera that captures the face of the user 4 and a viewpoint position analysis unit that analyzes the image data (captured data) captured by the infrared camera.
  • the viewpoint position analysis unit identifies the viewpoint position of the user 4 by performing image analysis of the imaging data by a known method such as a pattern matching method, and supplies information indicating the viewpoint position to the control unit 31.
  • the ECU 60 controls each part of the vehicle 1, and supplies vehicle information about the own vehicle 1 to the control unit 31.
  • the control unit 31 may directly acquire vehicle speed information from various sensors such as a vehicle speed sensor.
  • The vehicle information related to the own vehicle 1 is, for example, the vehicle speed, the engine speed, and warning information (low fuel, abnormal engine oil pressure, etc.).
  • the control unit 31 can control the display of the superimposed image based on the vehicle information acquired from the ECU 60, and can display an image showing the predetermined vehicle information in the display area of the virtual image A. That is, in the display area of the virtual image A, an image showing vehicle information other than the notification image C can also be displayed.
  • the car navigation device 70 includes a GPS controller that calculates the position of the vehicle 1 based on a GPS (Global Positioning System) signal received from an artificial satellite or the like.
  • The car navigation device 70 has a storage unit that stores map data, reads map data in the vicinity of the current position from the storage unit based on the position information of the vehicle 1 from the GPS controller, and determines the guidance route to the destination set by the user 4. Then, the car navigation device 70 outputs information regarding the position of the vehicle 1 and the determined guidance route to the control unit 31. Further, by referring to the map data, the car navigation device 70 outputs to the control unit 31 information indicating the name and type of a facility in front of the vehicle 1 and the distance between the facility and the vehicle 1.
  • In the map data, various kinds of information, such as road shape information (lanes, road width, number of lanes, intersections, curves, branch roads, etc.), regulatory information on road signs such as speed limits, and information on each lane when there are multiple lanes, are associated with position data.
  • the car navigation device 70 outputs these various types of information as navigation data to the control unit 31.
  • The navigation data includes object information related to the front object F, information for displaying the guidance route as the notification image C, and information for specifying the distance from the vehicle 1 to the front object F. That is, the control unit 31 may acquire the object information from the car navigation device 70.
  • The notification image C may be an image showing a POI (Point of Interest), such as a destination or a facility, preset by the user 4 in the car navigation device 70.
  • The car navigation device 70 is not limited to one mounted on the vehicle 1, and may be realized by a mobile terminal (a smartphone, a tablet PC (Personal Computer), etc.) having a car navigation function that communicates with the control unit 31 by wire or wirelessly.
  • The control unit 31 calculates the display position of the notification image C in the display area of the virtual image A based on the position information of the front object F specified from the object information from the front object detection unit 40 and the information on the viewpoint position of the user 4 input from the viewpoint detection unit 50, and controls the display operation of the display unit 20 accordingly.
  • The control unit 31 thereby displays the notification image C at the desired position (the corresponding position described above) with respect to the front object F in the actual scene.
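The text does not specify how the corresponding position is computed from the object position and the viewpoint. A minimal perspective-projection sketch, assuming a flat virtual surface a fixed distance ahead of the viewpoint and ignoring the optical system, might look like:

```python
def corresponding_position(obj_pos, viewpoint, plane_distance):
    """Project a front object's 3-D position onto the virtual image plane
    set `plane_distance` ahead of the viewpoint, by intersecting the
    viewpoint-to-object line of sight with that plane (similar triangles).
    Coordinates: x forward, y left-right, z up, all in meters."""
    vx, vy, vz = viewpoint
    ox, oy, oz = obj_pos
    depth = ox - vx               # forward distance from eye to object
    if depth <= 0:
        raise ValueError("object must lie in front of the viewpoint")
    t = plane_distance / depth    # scale factor of the similar triangles
    # Offsets on the virtual surface, measured from the point straight
    # ahead of the viewpoint.
    return (vy + t * (oy - vy), vz + t * (oz - vz))

# Object 30 m ahead and 3 m to the left; virtual surface at P = 7.5 m.
y, z = corresponding_position((30.0, 3.0, 0.5), (0.0, 0.0, 1.2), 7.5)
print(round(y, 3), round(z, 3))  # → 0.75 1.025
```

This is only a geometric illustration of why both the object position and the viewpoint position are needed; the real mapping would also account for the mirrors and windshield.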
  • The control unit 31 may display the content image including the notification image C not only in two dimensions (2D) but also in three dimensions (3D) using 3D image data stored in advance in the ROM of the storage unit 32. Further, as processing for correcting the distortion of the virtual image A caused mainly by the curved shape of the windshield 3, the control unit 31 may perform warping, in which the image displayed on the display surface of the display unit 20 (the screen on which the display light Q is projected) is distorted in advance in the direction opposite to the distortion produced by the optical system downstream of the display surface. The control unit 31 may also execute dynamic warping, that is, warping according to the viewpoint position of the user 4. Dynamic warping realizes distortion correction optimized for the viewpoint position by using parameters set for each of a plurality of regions obtained by dividing the rectangular region of the eye box 5 into a grid.
  • FIG. 3A shows a display example of the notification image C for notifying the front object F.
  • FIG. 3A is an example in which the notification image C is displayed at the desired display position: an elliptical notification image C is visually recognized superimposed on a part of the front object F as a preceding vehicle, on the road surface side of the front object F.
  • However, the notification image C displayed according to the position of the front object F may be displayed later than the desired display timing due to the drawing control load (hereinafter also referred to as the drawing load) or the like.
  • FIG. 3B shows an example in which a display shift occurs in the notification image C because the number of content images displayed in the display area of the virtual image A has increased and the drawing load has risen.
  • In this example, the preceding vehicle F1 in the traveling lane of the own vehicle 1, the preceding vehicle F2 in the lane adjacent to the traveling lane, and the buildings F3 and F4 located in front of the own vehicle 1 are detected as front objects F.
  • In order to suppress such a display shift, the control unit 31 executes the image enlargement processing described below.
  • The control unit 31 mainly functions as an acquisition means, a drawing control means, and a specifying means.
  • The acquisition means acquires the above-mentioned object information from the front object detection unit 40. Based on the object information acquired by the acquisition means, the drawing control means draws the notification image C, visually recognized by the user 4, at the corresponding position corresponding to the position (position information) of the front object F in the display area of the superimposed image.
  • the correspondence relationship between the position information and the corresponding position is stored in advance in the ROM of the storage unit 32.
  • The specifying means specifies a display delay factor that causes the notification image C, drawn based on the object information acquired by the acquisition means at a predetermined timing, to be displayed later than the predetermined timing.
  • The control unit 31 as the specifying means specifies the display delay factor as follows.
  • First, the control unit 31 refers to the predicted drawing load determination table TA1 shown in FIG. 7A and determines the drawing load (predicted drawing load L) expected when the drawing control means (mainly the GDC 31b) draws a content image (including the notification image C) to be displayed in the display area of the virtual image A.
  • The predicted drawing load determination table TA1 is configured so that, for one content image, each combination of the content type (2D or 3D) and the drawing area S corresponds to a predicted drawing load L.
  • the predictive drawing load determination table TA1 is stored in advance in the ROM of the storage unit 32.
  • Among the images for one screen created by the GDC 31b and transferred to the display unit 20 at a predetermined update timing of the frame rate from the next frame onward (hereinafter referred to as the prediction target timing), the control unit 31 determines the predicted drawing load L for each content image (hereinafter referred to as a prediction target image) with reference to the predicted drawing load determination table TA1. For example, when the type of the prediction target image is "2D" and its drawing area S satisfies "S1 ≤ S < S2", the control unit 31 refers to the table TA1 and determines the predicted drawing load L to be "10%".
  • The control unit 31 determines the predicted drawing load L for each of the prediction target images, and sums the determined loads to obtain the total predicted drawing load ΣL. For example, as shown in FIG. 3B, when six content images are present in the display area of the virtual image A, namely four notification images C (C1 to C4), the image A1 representing the vehicle speed, and the image A2 representing the speed limit of the traveling lane, the predicted drawing load L is determined for each of the six content images, and these are summed to obtain the total predicted drawing load ΣL.
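The two-stage lookup just described (per-image load from TA1, then summation to ΣL) can be sketched as follows. The actual load percentages and area thresholds of TA1 in FIG. 7A are not given in the text, so the values below are invented placeholders:

```python
# Hypothetical predicted drawing load determination table TA1.
TA1 = [
    # (content type, area lower bound, area upper bound, load %)
    ("2D", 0,    1000, 5),
    ("2D", 1000, 4000, 10),   # e.g. "S1 <= S < S2" -> 10 %
    ("2D", 4000, None, 20),
    ("3D", 0,    1000, 15),
    ("3D", 1000, 4000, 30),
    ("3D", 4000, None, 50),
]

def predicted_load(content_type, area):
    """Look up the predicted drawing load L for one prediction target image."""
    for ctype, lo, hi, load in TA1:
        if ctype == content_type and lo <= area and (hi is None or area < hi):
            return load
    raise LookupError("no table entry for this content image")

def total_predicted_load(images):
    """Sum the per-image loads to obtain the total predicted drawing load ΣL."""
    return sum(predicted_load(t, s) for t, s in images)

# Six content images as in FIG. 3B: four notification images C1-C4, the
# vehicle-speed image A1, and the speed-limit image A2 (sizes are assumptions).
screen = [("2D", 1500), ("2D", 1500), ("2D", 800), ("2D", 800),
          ("2D", 2500), ("2D", 500)]
print(total_predicted_load(screen))  # → 45
```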
  • As described above, the control unit 31 as the specifying means specifies the drawing control load as a display delay factor based on the area, number, and dimension (2D/3D) of the content images (including the notification image C).
  • The control unit 31 may specify the drawing control load as a display delay factor based on at least one of the area, number, and dimension of the content images.
  • Having obtained the total predicted drawing load ΣL, the control unit 31 refers to the predicted delay time determination table TA2 shown in FIG. 7B and determines the predicted delay time TL, that is, the predicted delay of the content image display caused by the drawing load.
  • The predicted delay time determination table TA2 is configured such that the total predicted drawing load ΣL and the predicted delay time TL correspond to each other, and is stored in advance in the ROM of the storage unit 32.
  • The control unit 31 refers to the predicted delay time determination table TA2 and determines the predicted delay time TL corresponding to the obtained total predicted drawing load ΣL.
  • The display delay of the content image is also caused by the system configuration, apart from the drawing load.
  • The sensing delay time TS is the delay time caused by sensing.
  • The sensing delay time TS is mainly the transmission delay of the object information from the front object detection unit 40 to the control unit 31, and is stored as data in the ROM of the storage unit 32.
  • The predicted delay time TL may be set in advance in consideration of not only the drawing load but also the display control and display operation of the display unit 20. Further, the sensing delay time TS may be determined in consideration of the processing delay that occurs when the front object detection unit 40 calculates the object information. As a display delay factor caused by the system configuration, the processing load when the control unit 31 executes the above-mentioned warping or dynamic warping may also be considered.
  • The processing load when performing dynamic warping can be predicted, or obtained by experiment, in consideration of the number of preset viewpoint positions and the number of grid cells into which the eye box 5 is divided.
  • When the value indicated by the specified display delay factor is equal to or greater than a predetermined value, the control unit 31 as the drawing control means executes an image enlargement process for drawing the notification image C in a size larger than the reference size.
  • The reference size is the size of the notification image C at the predetermined distance P (see FIG. 1B), determined according to the type of the notification image C.
  • The reference size is not a fixed size in the display area of the virtual image A; it is variable according to reference information (mathematical formula data or table data) that defines the correspondence between the distance from the vehicle 1 to the front object F and the reference size.
  • The reference information is stored in the storage unit 32 in advance, and is set, using perspective for example, so that the reference size becomes smaller as the distance to the front object F to be notified becomes longer.
  • The control unit 31 draws the notification image C in the reference size based on the distance from the own vehicle 1 to the front object F and the reference information.
  • the control unit 31 as a drawing control means determines the magnification with respect to the reference size as follows when executing the image enlargement processing.
  • the control unit 31 determines the magnification (hereinafter, referred to as the first magnification) corresponding to the display delay factor obtained as described above with reference to the first magnification determination table TA3 shown in FIG. 8A.
  • the first magnification determination table TA3 is configured such that the display delay factor and the first magnification correspond to each other, and is stored in advance in the ROM of the storage unit 32.
  • when the control unit 31 determines the first magnification and the relative speed between the vehicle 1 and the front object F that is relatively approaching the vehicle 1 is equal to or higher than a predetermined speed, the control unit 31 determines the magnification corresponding to the relative speed (hereinafter referred to as the second magnification) with reference to the second magnification determination table TA4 shown in FIG. 8B.
  • the second magnification determination table TA4 is configured such that the relative speed and the second magnification correspond to each other, and is stored in advance in the ROM of the storage unit 32.
  • the shape of the notification image C is not limited to the elliptical shape shown in the illustrated example, and may be any other shape such as a rectangle, circle, triangle, or polygon. Further, the configuration of the notification image C is arbitrary as long as it can notify information about the front object F, and may be, for example, characters, symbols, figures, icons, or a combination thereof.
  • FIG. 4 shows a display example of the notification image C when the image enlargement processing is executed.
  • when the value indicated by the display delay factor is equal to or greater than the predetermined value, executing the image enlargement process that enlarges the notification image C of the reference size at the magnification determined as described above makes the deviation from the desired display position less noticeable, and the discomfort given to the user can be suppressed.
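As a rough illustration of the magnification determination described above (all table breakpoints, magnification values, and the threshold speed V1 are invented for this sketch and are not from the disclosure), the product of the first and second magnifications might be computed as:

```python
# Hypothetical contents for the magnification determination tables:
# TA3 maps the display delay factor T to the first magnification,
# TA4 maps the relative speed V to the second magnification.
TA3 = [(50, 1.0), (100, 1.1), (float("inf"), 1.2)]
TA4 = [(10, 1.0), (20, 1.1), (float("inf"), 1.25)]


def lookup(table, value):
    """Return the factor for the first band whose upper bound covers value."""
    for upper_bound, factor in table:
        if value <= upper_bound:
            return factor
    raise ValueError("value not covered by table")


def enlarged_size(reference_size, delay_factor_t, relative_speed_v, v_threshold=5):
    first = lookup(TA3, delay_factor_t)
    # The second magnification applies only when the approaching relative
    # speed is at or above the predetermined speed (stand-in for V1).
    second = lookup(TA4, relative_speed_v) if relative_speed_v >= v_threshold else 1.0
    magnification = first * second  # product of first and second magnifications
    width, height = reference_size
    return (width * magnification, height * magnification)
```

When the relative speed is below the threshold, only the delay-factor-driven first magnification acts, so a slow approach still yields a modest enlargement.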
  • when the vehicle 1 and the front object F are relatively approaching each other in a predetermined direction and the image enlargement process is executed, the control unit 31 that functions as the drawing control means draws the notification image C so that it is visually recognized as enlarged toward the vehicle 1 side with reference to the end portion B of the notification image C that is visually recognized at the position farther from the vehicle 1 in the predetermined direction.
  • when executing the image enlargement process, the notification image C is drawn so that it is visually recognized as enlarged toward the vehicle 1 side with reference to the end portion visually recognized at the position farther from the vehicle 1 in the front-rear direction. In this way, even when the notification image C shifts upward from the desired display position due to the display delay, the display shift can be made inconspicuous.
  • the front object F is a road marking line (white line in the example of the same figure), and the notification image C notifying the front object F is superimposed on the road marking line in order to emphasize the existence of the road marking line.
  • the front object F in this example is an immovable body that does not move with respect to the ground, but if the vehicle 1 approaches the lane marking line, the display deviation of the notification image C may occur as shown in FIG. 5A.
  • when the image enlargement process is executed, as shown in FIG. 5B, the notification image C is drawn so that it is visually recognized as enlarged toward the vehicle 1 side with reference to the end portion B visually recognized at the position farther from the vehicle 1 in the lateral direction (the right end portion in FIG. 5B). By doing so, even when the notification image C shifts to the right of the desired display position due to the display delay, the display shift can be made inconspicuous.
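The anchored enlargement described above, in which the image grows toward the vehicle while the far end portion B stays fixed, can be illustrated with a minimal geometric sketch (the screen-coordinate convention that a larger coordinate is closer to the vehicle is an assumption of this sketch, not taken from the disclosure):

```python
def enlarge_toward_vehicle(far_edge, near_edge, magnification):
    """Scale the image extent about the far edge (stand-in for end portion B).

    far_edge stays fixed; the near edge moves toward the vehicle so the
    image is visually recognized as enlarged on the vehicle side only.
    """
    extent = near_edge - far_edge
    return far_edge, far_edge + extent * magnification
```

For example, scaling an extent from coordinate 100 (far edge) to 200 (near edge) by 1.2 keeps the far edge at 100 and moves the near edge to 220.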
  • (Display control process) From here on, the display control process executed by the control unit 31 will be described mainly with reference to FIG.
  • the display control process is continuously executed, for example, with the ignition of the vehicle 1 turned on.
  • the control unit 31 first determines whether or not the front object F has been detected (step S101).
  • when the front object F has been detected (step S101; Yes), the control unit 31 proceeds to the process of step S102; otherwise, the control unit 31 waits in step S101.
  • control unit 31 calculates the relative speed between the own vehicle 1 and the front object F as described above based on the object information acquired from the front object detection unit 40 (step S102).
  • control unit 31 determines whether or not the notification image C for notifying the detected forward object F is an image of a specific type (step S103).
  • the specific type of image means an image for which, when the front object F is an immovable body such as a road, the notification remains acceptable even if the notification image C notifying information about the immovable body is visually recognized as displaced from the desired display position (the corresponding position described above) in the traveling direction of the vehicle 1; the type is stored in the ROM of the storage unit 32 in advance.
  • when the notification image C is simply a guidance display indicating straight ahead, even if a display delay occurs in the notification image C, it is unlikely that the user 4 will recognize that the display is offset from the road surface ahead. Therefore, a notification image C such as a guidance display indicating straight ahead is stored in advance as a specific type, and when the notification image C is an image of the specific type, the image enlargement process described later is not executed.
  • when the notification image C guides a direction other than straight ahead, the notification image C is not set as a specific type. This is because if the notification image C of the aspect shown in FIG. 6B is visually recognized with a deviation in the traveling direction of the vehicle 1, the guidance will be affected.
  • in step S103, when the notification image C is an image of a specific type as shown in FIG. 6A (step S103; Yes), the control unit 31 executes normal image control for displaying the notification image C of the reference size according to the above-mentioned reference information at the corresponding position corresponding to the position information included in the object information (step S104). When a plurality of front objects F are detected in step S101, the control unit 31 displays a notification image C corresponding to each of the plurality of front objects F.
  • in step S103, when the notification image C is not an image of a specific type (step S103; No), the control unit 31 determines, based on the relative speed calculated in step S102, whether or not the front object F is moving away at or above a predetermined relative speed (step S103A).
  • when the front object F specified by the object information acquired in step S101 is a moving body such as a preceding vehicle and the moving body is moving away from the own vehicle 1 at or above a predetermined relative speed stored in advance in the ROM of the storage unit 32 (step S103A; Yes), the control unit 31 executes the normal image control of step S104.
  • the moving body is an object that moves with respect to the ground, and is, for example, a preceding vehicle, a pedestrian, or the like.
  • control unit 31 determines whether or not the value T indicating the display delay factor specified in step S105 is equal to or greater than the predetermined value T1 predetermined in the ROM of the storage unit 32 (step S106).
  • when the value T is less than the predetermined value T1 (step S106; No), the control unit 31 executes the normal image control (step S104).
  • when the value T is equal to or greater than the predetermined value T1 (step S106; Yes), the control unit 31 determines the magnification with respect to the reference size (step S107).
  • the control unit 31 determines the magnification (first magnification) corresponding to the display delay factor specified in step S105 with reference to the first magnification determination table TA3.
  • when the relative speed V between the vehicle 1 and the front object F that is relatively approaching the vehicle 1 is equal to or higher than a predetermined speed (V1 in the example of FIG. 8B), the control unit 31 determines the magnification (second magnification) corresponding to the relative speed V with reference to the second magnification determination table TA4.
  • control unit 31 determines the product of the first magnification and the second magnification as the magnification in the image enlargement processing.
  • the control unit 31 executes the image enlargement process for drawing the notification image C whose reference size is enlarged by the magnification determined in step S107 (step S108).
  • at this time, the notification image C is drawn so that it is visually recognized as enlarged toward the vehicle 1 side with reference to the end portion B of the notification image C visually recognized at the position farther from the vehicle 1 in the predetermined direction.
  • the control unit 31 may enlarge the notification image C with reference to an arbitrary position.
  • when a plurality of front objects F are detected, the control unit 31 executes the image enlargement process, at the magnification determined in step S107, for the notification image C displayed corresponding to each of the plurality of front objects F.
  • after step S108 or step S104, the control unit 31 executes the process again from step S101.
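The branching order of steps S101 through S108 can be condensed into the following sketch; the predicates and drawing helpers are hypothetical stand-ins supplied by the caller, and only the order of the decisions follows the text:

```python
def display_control_step(detected, is_specific_type, moving_away_fast,
                         delay_factor_t, t1, decide_magnification,
                         draw_normal, draw_enlarged):
    """One pass of the first embodiment's display control flow (sketch)."""
    if not detected:                         # step S101; No -> keep waiting
        return "wait"
    if is_specific_type:                     # step S103; Yes
        return draw_normal()                 # step S104: normal image control
    if moving_away_fast:                     # step S103A; Yes
        return draw_normal()                 # step S104
    if delay_factor_t < t1:                  # step S106; No
        return draw_normal()                 # step S104
    magnification = decide_magnification()   # step S107
    return draw_enlarged(magnification)      # step S108: image enlargement
```

In an actual system this function would be invoked repeatedly while the ignition is on, matching the loop back to step S101 described above.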
  • the above display control process is continuously executed until, for example, the HUD device 10 is turned off. This concludes the description of the first embodiment.
  • the configuration of the vehicle display system 100 is the same as that of the first embodiment. Therefore, in the following, each configuration similar to that of the first embodiment will be described using the same reference numerals as those of the first embodiment, and the points different from those of the first embodiment will be mainly described.
  • the control unit 31 displays a notification image C for notifying the front object F, for example, as shown in FIG. 3A. Then, the control unit 31 according to the second embodiment executes the image position adjustment process described later in order to reduce the discomfort given to the user due to the display deviation as shown in FIGS. 3B and 10A.
  • the control unit 31 that executes the display control process including the image position adjustment process mainly functions as an acquisition means, a drawing control means, and a specific means.
  • the specifying means according to the second embodiment identifies the display delay factor as in the first embodiment.
  • when the value indicated by the specified display delay factor is equal to or greater than a predetermined value, the control unit 31 as the drawing control means executes an image position adjustment process for drawing the notification image C so that it is visually recognized at an adjustment position moved from the desired display position (the corresponding position described above) by a predetermined movement amount in the direction in which the front object F moves with respect to the vehicle 1.
  • when executing the image position adjustment process, the control unit 31 as the drawing control means determines the movement amount of the notification image C as follows.
  • the control unit 31 sets the movement amount to a predetermined value M (hereinafter, M is also referred to as a reference movement amount). Further, the control unit 31 obtains the relative speed V between the own vehicle 1 and the front object F, and determines the movement amount corresponding to the obtained relative speed V with reference to the movement amount determination table TA5 shown in FIG. 12.
  • the movement amount determination table TA5 is configured such that the relative speed and the movement amount correspond to each other, and is stored in advance in the ROM of the storage unit 32. For example, when the magnitude (absolute value) of the calculated relative speed V is 0 ≤ V < V1, the control unit 31 sets the movement amount of the notification image C to M as it is.
  • when the magnitude of the calculated relative speed V is V1 ≤ V < V2, the control unit 31 sets the movement amount of the notification image C to α × M (α > 1). That is, the control unit 31 makes the movement amount larger when the relative speed V is a second value larger than a first value than when the relative speed V is the first value.
  • the control unit 31 may variably control the reference movement amount M itself so that the movement amount of the notification image C increases as the value indicated by the display delay factor increases. Further, the control unit 31 may determine a coefficient (α or β in the example of FIG. 12) according to the obtained relative speed V and obtain the movement amount of the notification image C corresponding to the relative speed V by multiplying the determined coefficient by the reference movement amount M. Further, the relationship between the relative speed V and the movement amount of the notification image C may be stored in advance in the ROM of the storage unit 32 as function data (for example, a linear function or a quadratic function), and the control unit 31 may determine the movement amount of the notification image C based on the obtained relative speed V and the function.
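A minimal sketch of the movement amount determination from the relative speed V is given below; the reference movement amount M, the breakpoints V1 and V2, and the band coefficients are all invented stand-ins for a hypothetical table TA5, not values from the disclosure:

```python
M = 10.0          # reference movement amount (assumed units, e.g. pixels)
V1, V2 = 5.0, 15.0
ALPHA = 1.5       # coefficient for the middle speed band (> 1, assumed)
BETA = 2.0        # coefficient for the highest speed band (assumed)


def movement_amount(relative_speed_v):
    """Movement amount of the notification image as a function of |V|."""
    v = abs(relative_speed_v)
    if v < V1:
        return M            # 0 <= V < V1: use the reference amount as-is
    if v < V2:
        return ALPHA * M    # V1 <= V < V2: larger amount for a faster approach
    return BETA * M         # V >= V2: largest amount (band layout assumed)
```

The same shape of lookup could equally be replaced by the function-data variant the text mentions, e.g. a linear function of V clamped to a maximum.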
  • FIG. 10B shows a display example of the notification image C when the image position adjustment process is executed.
  • when the value indicated by the display delay factor is equal to or greater than the predetermined value, the control unit 31 executes the image position adjustment process for drawing the notification image C so that it is visually recognized at the adjustment position moved from the desired display position (the corresponding position described above) by the predetermined movement amount in the direction in which the front object F moves with respect to the vehicle 1 (downward in the example of FIG. 10).
  • when the front object F is in a specific moving state in which it approaches the vehicle 1 from a position separated from the vehicle 1 by a specific distance or more in an intersecting direction (for example, the y direction) that intersects the traveling direction (for example, the x direction) of the vehicle 1, the control unit 31 that functions as the drawing control means corrects the movement amount so that the movement amount when executing the image position adjustment process is larger than when the front object F is not in the specific moving state.
  • here, the specific moving state of the front object F will be described with reference to FIGS. 11A and 11B.
  • FIG. 11A is an example in which the front object F approaches the own vehicle 1 in a straight line with a relative velocity vector of magnitude v.
  • based on the position information (x, y, z) included in the acquired object information, when the front object F is approaching the own vehicle 1 and the y-coordinate of the front object F is at a position separated from the vehicle 1 by a specific distance or more, the control unit 31 specifies that the front object F is in the specific moving state.
  • the specific distance is stored in advance in the ROM of the storage unit 32 as a distance in the y direction. If the movement amount determined as described above is Mf, when the front object F is specified to be in the specific moving state, the control unit 31 adds the correction amount ΔM to Mf and sets the movement amount of the notification image C to (Mf + ΔM). By doing so, the notification image C can be displayed closer to the own vehicle 1, so that the presence of the front object F in the specific moving state, whose sense of distance is assumed to be difficult for the user 4 to grasp, can be satisfactorily notified.
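The specific-moving-state check and the (Mf + ΔM) correction described above can be sketched as follows, with the specific distance and ΔM values chosen arbitrarily for illustration (the real values live in the ROM of the storage unit):

```python
SPECIFIC_DISTANCE_Y = 2.0   # assumed stand-in for the y-direction specific distance
DELTA_M = 4.0               # assumed correction amount ΔM


def is_specific_moving_state(position, approaching):
    """True when the object approaches and its |y| exceeds the specific distance."""
    _x, y, _z = position    # position information (x, y, z) from object info
    return approaching and abs(y) >= SPECIFIC_DISTANCE_Y


def corrected_movement_amount(mf, position, approaching):
    """Return Mf + ΔM in the specific moving state, otherwise Mf unchanged."""
    if is_specific_moving_state(position, approaching):
        return mf + DELTA_M
    return mf
```

An object cutting in from far to the side thus gets a larger shift toward the vehicle than one already nearly ahead, matching the correction described in the text.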
  • the image position adjustment process according to the second embodiment can also be executed when the front object F is a lane marking line. If the vehicle 1 approaches the lane marking line, the display deviation of the notification image C may occur as shown in FIG. 5A. However, if the image position adjustment process according to the second embodiment is executed and the notification image C is drawn so as to be visually recognized at the adjustment position moved by the determined movement amount in the moving direction of the front object F with respect to the vehicle 1 (left direction in FIG. 5A), the display deviation can be made inconspicuous even when the notification image C shifts to the right of the desired display position due to the display delay.
  • the control unit 31 executes the processes of steps S201 to S206 in the same manner as steps S101 to S106 described in the first embodiment.
  • in step S206, when the value T indicated by the display delay factor is equal to or greater than the predetermined value T1 (step S206; Yes), the control unit 31 determines the movement amount of the notification image C from the corresponding position to the adjustment position (step S207).
  • the control unit 31 determines the movement amount corresponding to the relative velocity V obtained in step S202 with reference to the movement amount determination table TA5.
  • control unit 31 determines whether or not the front object F is in a specific moving state in which the front object F is approaching the own vehicle 1 from an angle (step S207A).
  • when the front object F is not in the specific moving state (step S207A; No), the control unit 31 executes the image position adjustment process for drawing the notification image C at the adjustment position moved from the corresponding position by the movement amount determined in step S207 (step S208).
  • when the result is No in step S207A and the process transitions to step S208, the control unit 31 may execute the image position adjustment process even when the front object F is moving away from the own vehicle 1.
  • when the front object F is in the specific moving state (step S207A; Yes), the control unit 31 corrects the movement amount of the notification image C to (Mf + ΔM) by adding the correction amount ΔM to the movement amount Mf determined in step S207 (step S207B), and executes the image position adjustment process with the corrected movement amount (step S208).
  • after step S208 or step S204, the control unit 31 executes the process again from step S201. This concludes the description of the second embodiment.
  • in the third embodiment, the control unit 31 displays a notification image C composed of linear images (images visually recognized as lines) as an image for notifying the front object F.
  • the notification image C is drawn by, for example, a vector image and, as shown in FIG. 14A, includes a specific linear image Cs visually recognized along a direction in which the vehicle 1 and the front object F intersect (the vertical direction in FIG. 14A), and predetermined linear images Cn visually recognized so as to extend toward the front of the vehicle 1 from each of both ends of the specific linear image Cs.
  • the predetermined linear image Cn in this embodiment is an image that is not subject to the line width enlargement processing described later.
  • the notification image C shown in FIG. 14A is an example of a mode in which a sense of perspective is created by tilting the tips of the two predetermined linear images Cn so as to approach each other, so that the image is visually recognized along the road surface ahead. Note that FIG. 14A shows an example in which the notification image C is displayed with a deviation from the desired display position due to the above-mentioned display delay.
  • the control unit 31 executes the line width expansion process described later in order to reduce the discomfort given to the user due to the display deviation as shown in FIG. 14A.
  • the control unit 31 that executes the display control process including the line width enlargement process mainly functions as an acquisition means, a drawing control means, and a specific means.
  • the specifying means according to the third embodiment identifies the display delay factor as in the first and second embodiments.
  • when the value indicated by the specified display delay factor is equal to or greater than a predetermined value, the control unit 31 as the drawing control means executes a line width enlargement process for drawing the notification image C so that the line width of the specific linear image Cs is visually recognized as enlarged by a predetermined enlargement amount in the direction in which the front object F moves with respect to the vehicle 1.
  • the line width expansion process is executed by enlarging a predetermined reference line width by a predetermined enlargement amount.
  • the reference line width is the width of the linear image (including the specific linear image Cs) at a predetermined distance P (see FIG. 1B) determined according to the type of the broadcast image C.
  • the reference line width does not have to be fixed within the display area of the virtual image A, and may be variable according to reference line width information (mathematical data or table data) that defines the correspondence between the distance between the vehicle 1 and the front object F and the reference line width.
  • This reference line width information is stored in the storage unit 32 in advance, and is set so that the reference line width becomes narrower as the distance to the front object F to be notified becomes longer by using, for example, perspective.
  • the control unit 31 draws a broadcast image C composed of a linear image with a reference line width based on the distance from the own vehicle 1 to the front object F and the reference line width information.
  • the control unit 31 sets the enlargement amount to a predetermined value E (hereinafter, E is also referred to as a reference enlargement amount). Further, the control unit 31 obtains the relative speed V between the own vehicle 1 and the front object F, and determines the enlargement amount corresponding to the obtained relative speed V with reference to the enlargement amount determination table TA6 shown in FIG. 16.
  • the enlargement amount determination table TA6 is configured such that the relative speed and the enlargement amount correspond to each other, and is stored in advance in the ROM of the storage unit 32.
  • for example, when the magnitude (absolute value) of the calculated relative speed V is 0 ≤ V < V1, the control unit 31 sets the enlargement amount of the specific linear image Cs to E as it is. Further, when the magnitude of the calculated relative speed V is V1 ≤ V < V2, the control unit 31 sets the enlargement amount of the specific linear image Cs to α × E (α > 1). That is, the control unit 31 makes the enlargement amount larger when the relative speed V is a second value larger than a first value than when the relative speed V is the first value.
  • the control unit 31 may variably control the reference enlargement amount E itself so that the enlargement amount of the specific linear image Cs increases as the value indicated by the display delay factor increases. Further, the control unit 31 may determine a coefficient (α or β in the example of FIG. 16) according to the obtained relative speed V and obtain the enlargement amount of the specific linear image Cs corresponding to the relative speed V by multiplying the determined coefficient by the reference enlargement amount E. Further, the relationship between the relative speed V and the enlargement amount of the specific linear image Cs may be stored in advance in the ROM of the storage unit 32 as function data (for example, a linear function or a quadratic function), and the control unit 31 may determine the enlargement amount of the specific linear image Cs based on the obtained relative speed V and the function.
  • FIG. 14B shows a display example of the notification image C when the line width enlargement processing is executed.
  • when the value indicated by the display delay factor is equal to or greater than the predetermined value, the control unit 31 executes the line width enlargement process for drawing the notification image C so that the line width of the specific linear image Cs is visually recognized as enlarged by the determined enlargement amount in the direction in which the front object F moves with respect to the vehicle 1 (downward in the example of FIG. 14).
  • as described in the second embodiment, when the front object F is in the specific moving state of approaching the vehicle 1 from a position separated by the specific distance or more in the intersecting direction (for example, the y direction) that intersects the traveling direction (for example, the x direction) of the vehicle 1, the control unit 31 that functions as the drawing control means corrects the enlargement amount so that the enlargement amount when executing the line width enlargement process is larger than when the front object F is not in the specific moving state.
  • specifically, if the enlargement amount determined as described above is Ef, the correction amount ΔE is added to Ef so that the enlargement amount of the specific linear image Cs becomes (Ef + ΔE).
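The enlargement amount determination and the (Ef + ΔE) correction of the line width of the specific linear image Cs can be sketched as follows; the reference enlargement amount E, the breakpoint V1, the band coefficient, and ΔE are all assumed values standing in for the contents of table TA6:

```python
E = 2.0          # reference enlargement amount (assumed units, e.g. pixels)
V1 = 5.0         # assumed speed breakpoint of table TA6
ALPHA = 1.5      # coefficient for V >= V1 (> 1, assumed)
DELTA_E = 1.0    # assumed correction amount ΔE for the specific moving state


def enlarged_line_width(reference_width, relative_speed_v, specific_moving_state):
    """Line width of the specific linear image Cs after the enlargement process."""
    ef = E if abs(relative_speed_v) < V1 else ALPHA * E  # Ef from the table
    if specific_moving_state:
        ef += DELTA_E                  # correct the enlargement amount: Ef + ΔE
    return reference_width + ef        # Cs line width grown by the final amount
```

The enlargement is thus additive on top of the distance-dependent reference line width, and grows both with the relative speed and with the specific moving state.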
  • the front object F is a road marking line (white line in the example of the figure), and the notification image C notifying the front object F is superimposed on the road marking line in order to emphasize the existence of the road marking line.
  • in this example, the notification image C is formed in a substantially rectangular box shape, and of the linear images along the direction in which the vehicle 1 and the front object F intersect (the direction in which the lane marking line that is the front object F in FIG. 15A extends), the one closer to the vehicle 1 is set as the specific linear image Cs. The linear images other than the specific linear image Cs are set as the predetermined linear images Cn.
  • in this case as well, the display deviation of the notification image C may occur, but if the line width enlargement process for drawing the notification image C is executed so that the line width of the specific linear image Cs is visually recognized as enlarged in the direction in which the front object F moves with respect to the vehicle 1 (left direction in FIG. 15B), the display deviation can be made inconspicuous even when the notification image C shifts to the right of the desired display position due to the display delay.
  • the control unit 31 executes the processes of steps S301 to S306 in the same manner as steps S101 to S106 described in the first embodiment.
  • in step S306, when the value T indicated by the display delay factor is equal to or greater than the predetermined value T1 (step S306; Yes), the control unit 31 determines whether or not the front object F is approaching the own vehicle 1 (step S306A). When the front object F is not approaching the own vehicle 1 (step S306A; No), the control unit 31 draws the notification image C by normal image control without executing the line width enlargement process (step S304). As a result, the control load can be reduced by not executing the line width enlargement process for a notification image C that is considered to be of low importance to the user 4 because the front object F is not approaching the own vehicle 1. When the front object F is approaching the own vehicle 1 (step S306A; Yes), the control unit 31 determines the enlargement amount corresponding to the relative speed V obtained in step S302 with reference to the enlargement amount determination table TA6 (step S307).
  • control unit 31 determines whether or not the front object F is in a specific moving state in which the front object F is approaching the own vehicle 1 from an angle, as in step S207A of the second embodiment (step S307A).
  • when the front object F is not in the specific moving state (step S307A; No), the control unit 31 executes the line width enlargement process for enlarging the line width of the specific linear image Cs in the notification image C by the enlargement amount determined in step S307 (step S308).
  • when the front object F is in the specific moving state (step S307A; Yes), the control unit 31 corrects the enlargement amount of the notification image C to (Ef + ΔE) by adding the correction amount ΔE to the enlargement amount Ef determined in step S307 (step S307B), and executes the line width enlargement process with the corrected enlargement amount (step S308).
  • after step S308 or step S304, the control unit 31 executes the process again from step S301. This concludes the description of the third embodiment.
  • in the above embodiments, an example was described in which the value indicated by the display delay factor is the delay time T obtained by summing the predicted delay time TL due to the drawing load and the sensing delay time TS, but the value is not limited thereto.
  • the display delay factor may be specified based on at least one of the drawing control load by the drawing control means and the transmission delay of the object information.
  • the value indicated by the display delay factor may be the predicted drawing load L shown in FIG. 7A.
  • the control unit 31 may determine the magnification according to the first embodiment, the movement amount according to the second embodiment, and the enlargement amount according to the third embodiment according to the predicted drawing load L.
  • control unit 31 may calculate the value indicated by the display delay factor using the data of the mathematical formula.
  • the data of the mathematical formula may be stored in the ROM of the storage unit 32 in advance, and may be configured so that the value indicated by the display delay factor can be calculated from various parameters such as the type, area, and number of the content images.
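As one possible shape of such formula data (the linear model, the per-type costs, and the weights below are purely illustrative assumptions, not from the disclosure), the display-delay-factor value could be computed from the type, area, and number of content images like this:

```python
# Hypothetical per-image-type base cost in milliseconds for content images.
TYPE_COST_MS = {"icon": 1.0, "text": 2.0, "guide": 4.0}


def delay_factor_from_formula(images, area_weight=0.01, per_image_ms=0.5):
    """Estimate the display-delay-factor value from drawing parameters.

    images: iterable of (image_type, area_px) for each content image to draw.
    The total is a linear combination of type cost, drawn area, and the
    number of images (all weights are assumed for this sketch).
    """
    total = 0.0
    for image_type, area_px in images:
        total += TYPE_COST_MS[image_type] + area_weight * area_px
    return total + per_image_ms * len(images)
```

Such a closed-form estimate avoids storing a lookup table, at the cost of calibrating the weights (e.g. by the experiments mentioned above for the warping load).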
  • the control unit 31 may execute a combination of two or more of the image enlargement process according to the first embodiment, the image position adjustment process according to the second embodiment, and the line width enlargement process according to the third embodiment.
  • for example, the control unit 31 may determine, according to the specified display delay factor, the magnification for executing the image enlargement process and the movement amount for executing the image position adjustment process, and may execute the image enlargement process for enlarging the notification image C from the reference size together with the image position adjustment process for moving the display position of the notification image C by the determined movement amount.
  • the control unit 31 may be capable of executing the line width enlargement process together with at least one of the image enlargement process and the image position adjustment process.
  • The shape and mode of the broadcast image C described in the third embodiment are arbitrary as long as the image is composed of linear images including the specific linear image Cs.
  • the broadcast image C may be a ring-shaped linear image.
  • In that case, any portion visually recognized along the direction in which the vehicle 1 and the front object F intersect with each other can be defined as the specific linear image Cs, and the other portions can be set as predetermined linear images Cn.
  • the specific linear image Cs in the broadcast image C described in the third embodiment may be a horizontal linear image along the left-right direction when viewed from the user 4.
  • at least a part of the specific linear image Cs whose width is expanded with respect to the reference line width may be blurred.
  • Either only the portion of the specific linear image Cs whose width is expanded beyond the reference line width may be blurred, or the entire enlarged specific linear image Cs may be blurred.
  • The line width of the predetermined linear image Cn (for example, a vertical line image substantially along the vertical direction when viewed from the user 4) may be enlarged together with the line width of the specific linear image Cs. The process of expanding the line width of the predetermined linear image Cn together with that of the specific linear image Cs is useful, for example, when the front object F moves diagonally with respect to the own vehicle 1.
  • the front object detection unit 40 may be configured to receive object information related to the front object F from the outside of the own vehicle 1 and supply the received object information to the control unit 31.
  • The front object detection unit 40 having this configuration may be composed of various modules that enable communication between the vehicle 1 and a wireless network (V2N: Vehicle-to-Network), communication between the vehicle 1 and another vehicle (V2V: Vehicle-to-Vehicle), communication between the vehicle 1 and a pedestrian (V2P: Vehicle-to-Pedestrian), and communication between the vehicle 1 and roadside infrastructure (V2I: Vehicle-to-Infrastructure).
  • (i) The front object detection unit 40 may include a communication module that can directly access a WAN (Wide Area Network), or a communication module for communicating with an external device (such as a mobile router) that can access the WAN, with an access point of a public wireless LAN (Local Area Network), or the like, in order to perform Internet communication; it may also include a GPS controller that calculates the position of the vehicle 1 based on GPS (Global Positioning System) signals received from artificial satellites or the like. These configurations enable communication by V2N. (ii) The front object detection unit 40 may include a wireless communication module conforming to a predetermined wireless communication standard, and perform communication by V2V or V2P.
  • (iii) The front object detection unit 40 may have a communication device that wirelessly communicates with roadside infrastructure, and may acquire information on the area ahead of the vehicle 1 via a roadside wireless device installed as infrastructure, for example from a base station of the Safe Driving Support Systems (DSSS). This enables communication by V2I.
  • The front object detection unit 40 may be configured to enable communication by V2X (Vehicle-to-Everything) between the vehicle 1 and the outside of the vehicle 1. The front object detection unit 40 may then be composed of a combination of two or more of the above-mentioned imaging device, the sensor group, and the various modules that enable communication by V2X, and may be configured to supply object information from each of those devices to the control unit 31.
  • The virtual surface on which the virtual image A is displayed may be set by tilting it forward with respect to the vertical direction of the vehicle 1.
  • the projection target of the display light Q is not limited to the windshield 3, and may be a combiner composed of a plate-shaped half mirror, a hologram element, or the like.
  • the display device that executes the display control process described above is not limited to the HUD device 10.
  • the display device may be configured as a head-mounted display (HMD: Head Mounted Display) mounted on the head of the user 4 of the vehicle 1. Then, in the image displayed by the HMD as a virtual image, the display control of the broadcast image C may be executed by the same method as described above. That is, the display device is not limited to the one mounted on the vehicle 1, and may be any one used in the vehicle 1.
  • the HUD device 10 as an example of the display device described above displays a superimposed image visually recognized by the user 4 as a virtual image A that overlaps with the scenery in front of the vehicle 1.
  • The HUD device 10 includes a control unit 31 that functions as acquisition means, drawing control means, and specifying means.
  • The acquisition means acquires object information about the front object F existing in front of the vehicle 1, the object information including at least position information of the front object F.
  • the drawing control means draws a broadcast image C for notifying information about the front object F at a corresponding position corresponding to the position information in the display area of the superimposed image based on the object information acquired by the acquisition means.
  • the specifying means identifies a display delay factor that causes the broadcast image C drawn based on the object information acquired at the predetermined timing to be displayed later than the predetermined timing.
  • the display delay factor is specified based on at least one of the drawing control load by the drawing control means and the transmission delay of the object information.
  • When the value indicated by the specified display delay factor is equal to or greater than a predetermined value, the drawing control means executes the image enlargement process for drawing the broadcast image C in a size larger than a predetermined reference size. According to this configuration, as described above, it is possible to prevent the user 4 who visually recognizes the broadcast image C from feeling a sense of discomfort.
  • When executing the image enlargement processing, the drawing control means draws the broadcast image C at a magnification, with respect to the reference size, determined based on the display delay factor.
  • When executing the image enlargement processing, the drawing control means draws the notification image C so that it is visually recognized as enlarged toward the vehicle 1 side with reference to the end portion B, which is visually recognized at a position far from the vehicle 1 in a predetermined direction. According to this configuration, even when the broadcast image C shifts from the desired display position due to the display delay, the display shift can be made inconspicuous.
  • When the vehicle 1 and the front object F are relatively approaching each other and the relative speed V between them is equal to or higher than a predetermined speed, the drawing control means executes the image enlargement processing by drawing the broadcast image C at a magnification, with respect to the reference size, determined based on the display delay factor and the relative speed V. According to this configuration, the followability of the broadcast image C with respect to the front object F can be ensured.
  • The drawing control means does not execute the image enlargement process for a specific type of image: a notification image C, predetermined as notifying information about an immovable body, for which the notification remains acceptable even if the image is visually recognized as deviating from the desired display position in the traveling direction of the vehicle 1. According to this configuration, the image enlargement processing need not be executed for the specific type of broadcast image C, so the control load can be reduced.
  • The drawing control means does not execute the image enlargement processing for a notification image C for notifying information about a moving body. According to this configuration, the image enlargement processing need not be executed for a broadcast image C that is considered to be of low importance to the user 4, so the control load can be reduced.
  • The specifying means identifies the drawing control load, as a display delay factor, based on at least one of the area, number, and dimensions of the content images, including the broadcast image C, drawn in the display area of the superimposed image by the drawing control means.
  • When the value indicated by the specified display delay factor is equal to or greater than a predetermined value, the drawing control means executes the image position adjustment process for drawing the broadcast image C so that it is visually recognized at an adjusted position, obtained by moving the corresponding position (the desired display position) by a predetermined movement amount in the direction in which the front object F moves with respect to the vehicle 1. According to this configuration, as described above, it is possible to prevent the user 4 who visually recognizes the broadcast image C from feeling a sense of discomfort.
  • the drawing control means determines the movement amount based on the display delay factor when executing the image position adjustment process.
  • The drawing control means controls the movement amount used in the image position adjustment process according to the relative speed V between the vehicle 1 and the front object F: the movement amount when the relative speed V is a second value larger than a first value is greater than when the relative speed V is the first value. According to this configuration, the followability of the broadcast image C with respect to the front object F can be ensured.
  • The specifying means identifies the drawing control load, as a display delay factor, based on at least one of the area, number, and dimensions of the content images, including the broadcast image C, drawn in the display area of the superimposed image by the drawing control means.
  • The broadcast image C is composed of linear images and includes a specific linear image Cs visually recognized along a direction in which the vehicle 1 and the front object F intersect with each other. When the value indicated by the specified display delay factor is equal to or greater than a predetermined value, the drawing control means executes a line width enlargement process for drawing the broadcast image C so that the line width of the specific linear image Cs is visually recognized as enlarged by a predetermined amount in the direction in which the front object F moves with respect to the vehicle 1. According to this configuration, as described above, it is possible to prevent the user 4 who visually recognizes the broadcast image C from feeling a sense of discomfort.
  • the drawing control means determines the enlargement amount based on the display delay factor when executing the line width enlargement process.
  • The drawing control means controls the enlargement amount used in the line width enlargement process according to the relative speed V between the vehicle 1 and the front object F: the enlargement amount when the relative speed V is a second value larger than a first value is greater than when the relative speed V is the first value. According to this configuration, the followability of the broadcast image C with respect to the front object F can be ensured.
  • When the front object F is in a specific movement state, the drawing control means corrects the enlargement amount used in the line width enlargement process so that it is larger than when the front object F is not in the specific movement state. According to this configuration, the notification image C can be displayed closer to the own vehicle 1, so the presence of a front object F in the specific movement state, for which the user 4 is assumed to have difficulty grasping the sense of distance, can be satisfactorily notified.
  • The specifying means identifies the drawing control load, as a display delay factor, based on at least one of the area, number, and dimensions of the content images, including the broadcast image C, drawn in the display area of the superimposed image by the drawing control means.
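
The delay model referred to in these embodiments (a delay time T obtained by summing a predicted drawing-load delay TL and a sensing delay TS, with TL derived from parameters such as the number and area of the content images) can be sketched as follows. The linear cost model and every coefficient are illustrative assumptions, not values from this specification.

```python
# Sketch of the delay model described above: the value indicated by the
# display delay factor is the delay time T obtained by summing a predicted
# delay T_L due to drawing load and a sensing (transmission) delay T_S.
# The linear cost model and its coefficients are illustrative assumptions.

def predicted_drawing_delay(num_images: int, total_area_px: float,
                            per_image_cost: float = 0.002,
                            per_pixel_cost: float = 1.0e-8) -> float:
    """Predicted delay T_L [s] due to drawing load (hypothetical model)."""
    return num_images * per_image_cost + total_area_px * per_pixel_cost

def display_delay_value(num_images: int, total_area_px: float,
                        sensing_delay_s: float) -> float:
    """Delay time T = T_L + T_S indicated by the display delay factor."""
    return predicted_drawing_delay(num_images, total_area_px) + sensing_delay_s

# Example: 5 content images totalling 200,000 px and a 30 ms sensing delay.
T = display_delay_value(5, 200_000.0, 0.030)  # 0.010 + 0.002 + 0.030 = 0.042 s
```

In practice the per-image and per-pixel costs would come from profiling the actual drawing pipeline, or from the formula data stored in the ROM of the storage unit 32.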
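
The image enlargement process described above can be sketched as follows: once the delay value reaches a threshold, a magnification is chosen from the delay value and the relative speed V, and the image is stretched toward the vehicle side while the far end portion B stays fixed. The threshold, the magnification formula, and the screen-coordinate convention (larger y is nearer the vehicle) are assumptions made for illustration, not values from this patent.

```python
# Sketch of the image enlargement process: at or above a delay threshold the
# broadcast image is drawn at a magnification derived from the delay value and
# the relative speed V, enlarged toward the vehicle side with the far end
# portion B held fixed. All constants here are hypothetical.

DELAY_THRESHOLD_S = 0.05  # hypothetical "predetermined value"

def magnification(delay_s: float, relative_speed: float, gain: float = 2.0) -> float:
    """Magnification (>= 1.0) with respect to the reference size."""
    if delay_s < DELAY_THRESHOLD_S:
        return 1.0  # below the predetermined value: draw at the reference size
    return 1.0 + gain * delay_s * max(relative_speed, 1.0) / 10.0

def enlarge_from_far_end(far_end_y: float, near_end_y: float, mag: float):
    """Keep the far end B fixed and stretch the image toward the vehicle side."""
    height = near_end_y - far_end_y
    return far_end_y, far_end_y + height * mag
```

Anchoring the enlargement at the far end B is what keeps a delayed image from appearing to overshoot the front object: any display shift is absorbed on the vehicle side of the image.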
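
The image position adjustment process can be sketched in the same spirit: the image is drawn at a position shifted from the desired (corresponding) position, in the direction the front object F moves relative to the vehicle, by an amount that grows with both the delay value and the relative speed V. Using the product V × T as the movement amount is an assumption for illustration, not a formula given in this specification.

```python
# Sketch of the image position adjustment process: shift the desired display
# position along the front object's direction of motion by an amount that
# grows with the delay value and the relative speed V. The choice of V * T
# as the movement amount is a hypothetical model.

def movement_amount(delay_s: float, relative_speed: float) -> float:
    """Movement amount; a larger relative speed V yields a larger movement."""
    return relative_speed * delay_s

def adjusted_position(desired_xy, unit_dir, delay_s, relative_speed):
    """Shift the desired display position along the object's motion direction."""
    d = movement_amount(delay_s, relative_speed)
    return (desired_xy[0] + unit_dir[0] * d, desired_xy[1] + unit_dir[1] * d)
```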
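
Finally, the line width enlargement process, including the upward correction applied when the front object F is in a specific movement state, can be sketched as follows. The threshold, gain, and correction factor are hypothetical values, not taken from this patent.

```python
# Sketch of the line width enlargement process for the specific linear image
# Cs: at or above a delay threshold, widen the line by an amount that grows
# with the delay and the relative speed V, corrected upward when the front
# object is in a "specific movement state". All constants are hypothetical.

def enlarged_line_width(reference_width: float, delay_s: float,
                        relative_speed: float, specific_state: bool,
                        threshold_s: float = 0.05, gain: float = 40.0,
                        correction: float = 1.5) -> float:
    """Line width of the specific linear image Cs after enlargement."""
    if delay_s < threshold_s:
        return reference_width  # below the predetermined value: no enlargement
    amount = gain * delay_s * (relative_speed / 10.0)  # grows with delay and V
    if specific_state:
        amount *= correction  # corrected upward in the specific movement state
    return reference_width + amount
```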

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Optics & Photonics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present invention prevents a user who visually recognizes a notification image from feeling a sense of discomfort. This display device displays a superimposed image that is visually recognized as a virtual image A superimposed on the scenery in front of a vehicle. The display device is provided with an acquisition means, a drawing control means, and a specifying means. The acquisition means acquires object information including position information about a front object F. The drawing control means draws a notification image C at a corresponding position that corresponds to the position information, based on the object information. The specifying means identifies a display delay factor that causes a delay in the display of the notification image C, based on at least one of a drawing control load of the drawing control means and a transmission delay of the object information. When a value represented by the identified display delay factor is equal to or greater than a predetermined value, the drawing control means executes an image enlargement process for drawing the notification image C in a size larger than a predetermined reference size.
PCT/JP2020/031957 2019-08-30 2020-08-25 Dispositif d'affichage WO2021039762A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2021542915A JPWO2021039762A1 (fr) 2019-08-30 2020-08-25

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019158121 2019-08-30
JP2019-158121 2019-08-30

Publications (1)

Publication Number Publication Date
WO2021039762A1 true WO2021039762A1 (fr) 2021-03-04

Family

ID=74685940

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/031957 WO2021039762A1 (fr) 2019-08-30 2020-08-25 Dispositif d'affichage

Country Status (2)

Country Link
JP (1) JPWO2021039762A1 (fr)
WO (1) WO2021039762A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024042762A1 (fr) * 2022-08-23 2024-02-29 マクセル株式会社 Dispositif d'affichage tête haute et procédé de traitement de données vidéo

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011119917A (ja) * 2009-12-02 2011-06-16 Denso Corp 車両用表示装置
JP2012218505A (ja) * 2011-04-05 2012-11-12 Denso Corp 車両用表示装置
JP2015048040A (ja) * 2013-09-04 2015-03-16 トヨタ自動車株式会社 注意喚起表示装置及び注意喚起表示方法
JP2017007481A (ja) * 2015-06-19 2017-01-12 日本精機株式会社 車載ヘッドアップディスプレイ装置及び車載表示システム



Also Published As

Publication number Publication date
JPWO2021039762A1 (fr) 2021-03-04

Similar Documents

Publication Publication Date Title
EP3031656B1 (fr) Dispositif et procédé de fourniture d'informations et support de stockage de programme de fourniture d'informations
JP6201690B2 (ja) 車両情報投影システム
US8536995B2 (en) Information display apparatus and information display method
JP6695049B2 (ja) 表示装置及び表示制御方法
US20180356630A1 (en) Head-up display
JP2015099469A (ja) 車両情報投影システム
WO2020241003A1 (fr) Dispositif de commande d'affichage et programme de commande d'affichage
JP6866875B2 (ja) 表示制御装置、及び表示制御プログラム
JP6186905B2 (ja) 車載表示装置およびプログラム
JP7459883B2 (ja) 表示制御装置、ヘッドアップディスプレイ装置、及び方法
WO2021039762A1 (fr) Dispositif d'affichage
US20210008981A1 (en) Control device, display device, display system, moving body, control method, and recording medium
JP2019202589A (ja) 表示装置
JP7310560B2 (ja) 表示制御装置及び表示制御プログラム
JP7379945B2 (ja) 表示装置
JP7379946B2 (ja) 表示装置
WO2021200914A1 (fr) Dispositif de commande d'affichage, dispositif d'affichage tête haute et procédé
WO2020040276A1 (fr) Dispositif d'affichage
JP7318431B2 (ja) 表示制御装置及び表示制御プログラム
JP7338632B2 (ja) 表示装置
JP2019206262A (ja) 表示装置
JP2019207632A (ja) 表示装置
WO2021200913A1 (fr) Dispositif de commande d'affichage, dispositif d'affichage d'image et procédé
JP6692981B1 (ja) 表示装置及び表示方法
JP2019148935A (ja) 表示制御装置及びヘッドアップディスプレイ装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20857946

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021542915

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20857946

Country of ref document: EP

Kind code of ref document: A1