WO2019123976A1 - Display control method, display control device, and heads-up display device - Google Patents

Display control method, display control device, and heads-up display device

Info

Publication number
WO2019123976A1
WO2019123976A1 (PCT/JP2018/043322)
Authority
WO
WIPO (PCT)
Prior art keywords
image
moving
information
moving image
display
Prior art date
Application number
PCT/JP2018/043322
Other languages
French (fr)
Japanese (ja)
Inventor
瑞翔 神保
Original Assignee
日本精機株式会社 (Nippon Seiki Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本精機株式会社 (Nippon Seiki Co., Ltd.)
Priority to DE112018006514.6T (published as DE112018006514T5)
Priority to JP2019560905A (published as JP7216898B2)
Published as WO2019123976A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/147Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement of adaptations of instruments
    • B60K35/23
    • B60K35/28
    • B60K2360/177
    • B60K2360/178
    • B60K2360/179
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/12Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2380/00Specific applications
    • G09G2380/10Automotive applications

Definitions

  • The present invention relates to a display control method and a display control device for a virtual image visually recognized superimposed on the foreground of a vehicle, and to a head-up display device that displays such a virtual image.
  • The head-up display device disclosed in Patent Document 1 generates augmented reality (AR) by displaying a virtual image for an object existing in the foreground, thereby adding information to the object or emphasizing it.
  • The present invention has been made in view of the above problems, and its object is to improve the recognizability of objects present in the real scene.
  • The display control method of the first aspect of the present invention is a display control method for a moving image (200) viewed superimposed on the foreground of the host vehicle (1), based on first object information including at least the position of a first object (311) existing around the host vehicle, and on second object information for a second object (312) existing around the host vehicle.
  • The display control apparatus is a display control apparatus for displaying a moving image (200) visually recognized superimposed on the foreground of the host vehicle (1).
  • It acquires, for a first object (311) existing around the host vehicle, first object information including at least the position of the first object, and second object information for a second object (312).
  • It includes a display processing unit (22) that switches the information image (210) included in the moving image from a first information image (211) related to the first object information to a second information image (212) related to the second object information, and moves the moving image along a movement path (400) away from the first object.
  • A head-up display device includes the display control device (90) according to the second aspect, and a display (10) for displaying the moving image under the control of the display control device (90).
  • A device provided with the function of the object information acquisition unit 30, which acquires the position of an object 310 present in the periphery 300 of the host vehicle 1, and the function of the display processing unit 22, which moves the moving image 200 and switches the information image 210 included in the moving image 200, is called a display control device according to the present invention.
  • The unit indicated by reference numeral 90 in FIG. 1 can be called a display control device.
  • Since the display control device essentially has the function (display processing unit 22) of moving the moving image 200 and switching the information image 210 included in it, the function (image generation unit 24) of generating display data for displaying the display image may be provided separately.
  • The display control device 90 of FIG. 1 can therefore be called a display control device according to the present invention even without the function of the image generation unit 24 that generates display data.
  • A device comprising a display control device according to the present embodiment (for example, reference numeral 90 in FIG. 1) and a display 10 for displaying a display image including the moving image 200 generated by the display control device 90 is the head-up display device according to the present invention.
  • FIG. 1 is a block diagram showing the overall configuration of a head-up display device 100 (hereinafter referred to as a HUD device) in the present embodiment.
  • The HUD device 100 projects the display light L of the display 10 onto a part of the front windshield (an example of the projection target 2) of the host vehicle 1.
  • The projection target 2 reflects the display light L toward the viewer, producing a virtual image.
  • By placing the eye point 4 (the position of the viewer's eyes) within the eye box 3, the viewer can visually recognize a virtual image within the display area 101 of the HUD device 100, ahead of the projection target 2 (see FIG. 2).
  • The HUD device 100 includes the display 10 for displaying the display image M, a relay optical unit 11 (the plane mirror 12 and the concave mirror 13 in FIG. 1) that directs the display light L of the display image M displayed by the display 10 to the projection target 2 of the host vehicle 1 (for example, its front windshield), a processing unit 20 that changes the virtual image (moving image 200) visually recognized by the viewer by controlling the display image M displayed by the display 10, an object information acquisition unit 30, an operation information acquisition unit 40, and an input/output interface 70.
  • Through its input interfaces (the object information acquisition unit 30, the operation information acquisition unit 40, and the input/output interface 70), the processing unit 20 is communicably connected to external devices: the periphery monitoring unit 31, the operation detection unit 41, the navigation system 500, a cloud server (external server) 600, the vehicle ECU 700, other on-vehicle devices in the host vehicle 1 (not shown), portable devices in the host vehicle 1, other vehicles (vehicle-to-vehicle communication V2V), communication infrastructure on the road (road-to-vehicle communication V2I), and pedestrians' portable devices (vehicle-to-pedestrian communication V2P).
  • The input interface may include wired communication functionality such as, for example, a USB port, a serial port, a parallel port, an OBD-II port, and/or any other suitable wired communication port.
  • The input interface may also include wireless communication functionality such as, for example, the Bluetooth (registered trademark) communication protocol, the IEEE 802.11 protocols, the IEEE 802.16 protocol, the DSRC (Dedicated Short Range Communications) protocol, or a shared wireless access protocol.
  • The object information acquisition unit 30 is an input interface for acquiring data (object information) including at least position information of a specific object 310 present in the periphery 300, which includes at least the area ahead of the host vehicle 1.
  • The object information is acquired from the periphery monitoring unit 31, which includes one or more cameras or sensors, and is output to the processing unit 20.
  • The periphery monitoring unit 31 has an analysis unit (not shown) that captures (detects) the periphery 300 of the host vehicle 1, analyzes the imaging (detection) data, and generates position information of the object 310 in the periphery 300 of the host vehicle 1 or in the display area 101 of the HUD device 100 (which may also include distance information to the object 310).
  • That is, the object information acquired by the object information acquisition unit 30 is the coordinates (position information) of the object 310 in the actual coordinate system of the periphery 300 of the host vehicle 1 or in the display area 101.
  • An analysis unit as described above may instead be provided in the HUD device 100; specifically, it may be provided in the processing unit 20.
  • In that case, the object information acquired by the object information acquisition unit 30 is the imaging (detection) data of the periphery 300 obtained by the periphery monitoring unit 31.
  • The object information acquisition unit 30 may also acquire type information of the object 310, which the periphery monitoring unit 31 identifies with a type identification unit (not shown) while observing the periphery 300 of the host vehicle 1.
  • The type of the object 310 is, for example, an obstacle (another vehicle, a person, an animal, or a thing), a road sign, a road, or a building, but is not limited to these as long as it is present in the periphery 300 and can be identified. That is, the object information acquisition unit 30 may acquire object information (position information, distance information, and type information of the object 310) and output it to the processing unit 20.
  • The input/output interface 70, which can communicate with external devices, may function as the object information acquisition unit 30.
  • The input/output interface 70 may acquire the object information from the navigation system 500, the cloud server 600, the vehicle ECU 700, other on-vehicle devices in the host vehicle 1 (not shown), portable devices in the host vehicle 1, other vehicles (vehicle-to-vehicle communication V2V), communication infrastructure on the road (road-to-vehicle communication V2I), and pedestrians' portable devices (vehicle-to-pedestrian communication V2P).
  • For example, the input/output interface 70 can obtain position information and type information of objects 310 such as road signs, roads, and buildings, together with map information, from the navigation system 500, the cloud server 600, and the vehicle ECU 700, and can obtain position information and type information of obstacles (other vehicles, people, animals, things), road signs, roads, and the like from other vehicles (V2V), road communication infrastructure (V2I), and pedestrians' portable devices (V2P).
  • The processing unit 20 may output position information of the host vehicle 1 or of the HUD device 100 (to which direction information of the host vehicle 1 may be added) to an external device via the input/output interface 70, and the external device may, based on the input position information (and direction information), output to the input/output interface 70 object information including the position information and type information of objects 310 present in the periphery 300 of the host vehicle 1.
  • The operation information acquisition unit 40 acquires operation information on parts of the host vehicle 1 (for example, the steering wheel, accelerator, brake, blinker, and headlights) detected by the operation detection unit 41, and outputs it to the processing unit 20.
  • The input/output interface 70, which can communicate with external devices, may function as the operation information acquisition unit 40, either instead of it or together with it.
  • The input/output interface 70 may obtain the operation information regarding the host vehicle 1 from the cloud server 600, the vehicle ECU 700, and communication infrastructure on the road (road-to-vehicle communication V2I, not shown).
  • The processing unit 20 includes a determination unit 21 that determines whether a movement condition of the moving image 200 (described later) is satisfied, a display processing unit 22 that changes the moving image 200, a program and image element data used by the display processing unit 22, and an image generation unit 24 that generates display data for causing the display 10 to display a display image based on the calculation result of the display processing unit 22.
  • The determination unit 21 determines, for each object 310, a priority for notifying the viewer (hereinafter also referred to as the notification priority), compares the notification priorities of a plurality of objects 310, and determines that the movement condition for moving the moving image 200 is satisfied when the notification priority of another object 310 is higher than the notification priority of the object 310 with which the moving image 200 is associated.
  • The notification priority may be determined for an object 310 from, for example, the degree of risk derived from the seriousness of the harm the object 310 poses, or the degree of urgency derived from the length of the response time available to take a corrective action; it may be determined based only on the object information acquired by the object information acquisition unit 30.
  • Alternatively, the determination may also be made based on the operation information acquired by the operation information acquisition unit 40 and the various information acquired by the input/output interface 70.
  • The determination unit 21 does not necessarily need the function of determining the notification priority itself, as long as it has the function of determining whether the movement condition is satisfied based on the notification priority.
  • Part or all of the function of determining the notification priority may be provided outside the processing unit 20 or outside the display control device 90.
  • The determination of the movement condition is not limited to methods using the degree of risk or the degree of urgency. Modifications are shown below.
  • The determination unit 21 may set a high notification priority for a newly detected object 310, and thereby determine that the movement condition for moving the moving image 200 toward the newly detected object 310 is satisfied.
  • In other words, the freshness of the information may be used to determine the movement condition instead of, or in addition to, the degree of risk or the degree of urgency.
  • This covers the case where the degree of risk (urgency) of the object 310 with which the moving image 200 is associated is higher than the degree of risk (urgency) of the other object 310 detected afterward.
  • The determination unit 21 may also lower the notification priority of an object 310 with which the moving image 200 has been continuously associated, and thereby determine that the movement condition for moving the moving image 200 toward another object 310 is satisfied. In other words, the duration of the moving image 200 may be used to determine the movement condition instead of, or in addition to, the degree of risk or the degree of urgency. Specifically, when the moving image 200 has been displayed continuously for the same object 310 for a predetermined threshold time or longer, the movement condition for moving the moving image 200 toward another object 310 is satisfied.
  • The method of determining the movement condition is not limited to the above examples. The determination unit 21 may, for example, comprehensively judge the situation the driver is in and change the notification priority when information newly acquired via the input interfaces (the object information acquisition unit 30, the operation information acquisition unit 40, and the input/output interface 70) is of a predetermined type, satisfies a predetermined condition, or forms a predetermined combination with other acquired information, and may determine the movement condition accordingly.
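The priority comparison and its modifications described above can be sketched as follows. This is an illustrative model only: the patent does not specify an implementation, so the field names, weights, and thresholds below are assumptions.

```python
from dataclasses import dataclass

@dataclass
class ObjectInfo:
    object_id: int
    risk: float               # degree of risk (seriousness of harm)
    urgency: float            # degree of urgency (short response time -> high)
    newly_detected: bool = False
    displayed_seconds: float = 0.0  # how long the moving image has tracked it

NEW_DETECTION_BOOST = 1.0  # raises the priority of a newly detected object
DWELL_THRESHOLD_S = 5.0    # threshold time for the duration-based modification
DWELL_PENALTY = 1.0        # lowers the priority of an object shown too long

def notification_priority(obj: ObjectInfo) -> float:
    """Combine risk and urgency; boost fresh detections, decay stale ones."""
    p = obj.risk + obj.urgency
    if obj.newly_detected:
        p += NEW_DETECTION_BOOST
    if obj.displayed_seconds >= DWELL_THRESHOLD_S:
        p -= DWELL_PENALTY
    return p

def movement_condition_satisfied(current: ObjectInfo, other: ObjectInfo) -> bool:
    """Movement condition: the other object outranks the associated one."""
    return notification_priority(other) > notification_priority(current)
```

With these weights, a moving image that has dwelt on one object past the threshold yields to a newly detected object even at comparable risk levels, matching the freshness and duration modifications described above.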
  • When the determination unit 21 determines that the movement condition is satisfied, the display processing unit 22 moves the moving image 200 and switches the information image included in it from the first information image 211 (see FIG. 5A), based on the first object information, to the second information image 212 (see FIG. 5B), based on the second object information.
  • The display processing unit 22 has a function (movement path determination unit 22a) of determining a movement path 400 from the position where the moving image 200 was originally displayed (movement start point 401) to the position to which it should move (movement end point 402), and a function (information image determination unit 22b) of determining the information image 210 of the moving image 200.
  • The movement path determination unit 22a determines the movement path 400 of the moving image 200 based on the position where the moving image 200 is currently displayed and the position of the second object 312 included in the second object information.
  • The movement path 400 runs from the position where the moving image 200 is currently displayed (movement start point 401) to a predetermined position around the second object 312 included in the second object information (movement end point 402).
  • Specifically, the display processing unit 22 determines a movement path 400 toward a predetermined position (movement end point 402) outside the second object 312 or outside the bounding box surrounding the outline of the second object 312.
  • The movement end point 402 is set at a position that does not overlap the outline of the second object 312 or the bounding box surrounding it when the moving image 200 moves along the straight movement path 400.
  • In other words, the movement path determination unit 22a may determine the movement path 400 so that the moving image 200 does not overlap the second object 312.
  • However, the movement end point 402 may instead be set inside the outline of the second object 312 (the movement path determination unit 22a may determine the movement path 400 so that the moving image 200 overlaps the second object 312).
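A minimal sketch of how the movement path determination unit 22a might pick a movement end point 402 outside the second object's bounding box and build a straight-line path. Coordinates are screen pixels (y grows downward), and the names and the fixed margin are assumptions for illustration, not details from the patent.

```python
from typing import NamedTuple

class Point(NamedTuple):
    x: float
    y: float

class Box(NamedTuple):  # bounding box surrounding the object's outline
    left: float
    top: float
    right: float
    bottom: float

MARGIN = 10.0  # gap kept between the moving image and the bounding box

def movement_end_point(box: Box) -> Point:
    """End point just below the bounding box, centred horizontally, so the
    arriving image does not overlap the second object."""
    return Point((box.left + box.right) / 2.0, box.bottom + MARGIN)

def movement_path(start: Point, end: Point, steps: int = 30) -> list:
    """Interpolate a straight movement path 400 from start to end (the
    patent also allows a visually refined curve instead)."""
    return [Point(start.x + (end.x - start.x) * t / steps,
                  start.y + (end.y - start.y) * t / steps)
            for t in range(steps + 1)]
```

The first and last points of the returned list correspond to the movement start point 401 and the movement end point 402.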
  • The display processing unit 22 may reduce the visibility of the moving image 200.
  • Specifically, the display processing unit 22 reduces the visibility of the information image 210 and of the paint image 220 of the moving image 200.
  • Alternatively, the visibility may be reduced by lowering the visibility of the information image 210 while maintaining that of the paint image 220, or by reducing the visibility of the information image 210 to a greater degree.
  • The movement path 400 need not be a mere straight line, and may be a visually refined curve.
  • The information image determination unit 22b determines the information image 210 (described later) included in the moving image 200, based on the object information (position information, distance information, type information) of the object 310 with which the moving image 200 is associated.
  • The information image determination unit 22b may also determine the information image 210 in consideration of the operation information acquired from the operation information acquisition unit 40 and the various information acquired from the input/output interface 70, in addition to the object information.
  • FIG. 2 shows an example of how the moving image 200 displayed by the HUD device 100 and the real scene appear when the driver faces forward in the host vehicle 1.
  • The HUD device 100 displays the moving image 200 at a predetermined position in the display area 101 on the projection target 2.
  • The display area 101 is depicted as a rectangular area surrounded by a dotted line, and another vehicle 311 (an example of the object 310) located ahead of the host vehicle 1 lies within the display area 101.
  • The HUD device 100 displays the moving image 200, drawn with the text "front caution" to call attention to the other vehicle 311, in the vicinity of the other vehicle 311 (below the other vehicle 311 in FIG. 2).
  • The viewer directs visual attention to the displayed moving image 200, recognizes from the "front caution" text of the information image 210 that the situation requires attention, and can quickly recognize that the target requiring attention is the other vehicle 311 near the moving image 200.
  • The position (coordinates) at which the moving image 200 is displayed may be fixed, or may change dynamically with the position of the object 310.
  • For example, the position may be determined from the rough position of the object 310 and left unchanged (fixed) for small position changes of the object 310. Alternatively, the position of the moving image 200 may be changed according to changes in the position information of the object 310 (it may be changed continuously so that the relative change of position with respect to the object 310 is reduced; in other words, the position of the object 310 may be tracked).
  • FIG. 3 is a diagram for explaining the configuration of the moving image 200.
  • The moving image 200 is rendered in a color such as white, orange, yellow, yellow-green, blue, or red, and, as shown in FIG. 3A, has an information image 210, which is a text image of about four characters such as "forward warning", and a paint image 220, which is rectangular (the shape being one example), surrounds the information image 210, and is filled with a color of a hue different from that of the information image 210.
  • Since the moving image 200 is a virtual image displayed overlapping the real scene, the viewer visually recognizes a mixed color of the real scene and the moving image 200.
  • FIG. 3B shows a state in which the moving image 200 having the paint image 220 overlaps a visual obstacle 320 (for example, a white line on the road) in the real scene. An example of the colors is white (RGB R255, G255, B255) for the information image 210, yellow-green (RGB R153, G255, B102) for the paint image 220, and white (RGB R255, G255, B255) for the visual obstacle 320.
  • FIG. 3C shows a state in which the moving image 200 without the paint image 220 overlaps the visual obstacle 320 (for example, a white line on the road) in the real scene.
  • When the color of the information image 210 and the color of the visual obstacle 320 are the same or similar, it is difficult, in the moving image 200 without the paint image 220 shown in FIG. 3C, to distinguish the information image 210 from the visual obstacle 320, and the visibility of the information image 210 is reduced.
  • In the moving image 200 having the paint image 220 shown in FIG. 3B, by contrast, the portion of the visual obstacle 320 overlapping the paint image 220 is perceived as a different color due to color mixture with the paint image 220; this makes the information image 210 easy to distinguish from the visual obstacle 320 and improves its visibility.
  • As the moving image 200 moves, the frequency with which it overlaps various visual obstacles 320 in the real scene increases, but providing the paint image 220 in the moving image 200 keeps the visibility high.
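The effect of the paint image 220 can be illustrated with a simple color-mixing model. The 50/50 weighted mix below is a simplifying assumption (a real combiner's mixing ratio depends on its reflectance), but it shows why the yellow-green fill over a white road line reads differently from the white text itself.

```python
def perceived_mix(scene_rgb, image_rgb, alpha=0.5):
    """Perceived colour where virtual-image light overlaps the real scene."""
    return tuple(round(alpha * i + (1 - alpha) * s)
                 for s, i in zip(scene_rgb, image_rgb))

WHITE_LINE = (255, 255, 255)   # visual obstacle 320 (white road line)
TEXT_WHITE = (255, 255, 255)   # information image 210
FILL_GREEN = (153, 255, 102)   # paint image 220 (yellow-green of FIG. 3B)

# White text over the white line stays white-on-white (FIG. 3C) ...
over_text = perceived_mix(WHITE_LINE, TEXT_WHITE)
# ... but the fill shifts the overlapped line towards green (FIG. 3B),
# so the white text stands out against the greenish background.
over_fill = perceived_mix(WHITE_LINE, FILL_GREEN)
```

Under this model the text-over-line mix remains pure white, while the fill-over-line mix moves clearly away from white, which is the distinguishability the paint image provides.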
  • FIG. 3D shows a state in which the information image 210 of the moving image 200 is switched: one is the information image 210a reading "right side caution", and the other is the information image 210b reading "vehicle approaching".
  • The moving image 200 may thus have a plurality of information images 210 (two types, 210a and 210b, in the example of FIG. 3D), and the information images 210 in the moving image 200 may be switched sequentially or back and forth continuously.
  • The moving image 200 need not have the paint image 220.
  • The moving image 200 may be configured from the information image 210 alone, or with an image element other than the paint image 220 added.
  • The information image 210 and the paint image 220 are not limited to a single color each, and may be configured in a plurality of colors.
  • The paint image 220 may be an image element surrounding the edges of each character of the information image 210. Although it is preferable that the paint image 220 surround the entire information image 210, it may leave part of the information image 210 unsurrounded.
  • FIG. 4 is a flowchart showing the main operation procedure of the HUD device 100 in the first and second embodiments.
  • FIG. 5 shows display examples of the moving image 200 of the first embodiment corresponding to the flowchart of FIG. 4.
  • FIG. 5(a) shows a display example of the moving image 200 before moving, and FIG. 5(b) shows a display example of the moving image 200 after steps S40 and S50 in FIG. 4 are executed when the movement condition is satisfied.
  • In step S10 (information acquisition step) of FIG. 4, the determination unit 21 acquires object information from the periphery monitoring unit 31 via the object information acquisition unit 30.
  • The object information includes at least position information of the object 310 in actual coordinates in the periphery 300 of the host vehicle 1 and/or position information of the object 310 in the display area 101 of the HUD device 100 (either of which may include distance information to the object 310).
  • The determination unit 21 may also acquire the operation information of the host vehicle 1 from the operation detection unit 41 via the operation information acquisition unit 40, and various information, including information from external devices (the navigation system 500, the cloud server 600, and the vehicle ECU 700), via the input/output interface 70.
  • In step S20, the determination unit 21 determines whether the movement condition of the moving image 200 is satisfied based on the various information acquired in step S10. For example, when the determination unit 21 acquires, in step S10, blinker operation information indicating a course change to the driver's right lane, it sets the notification priority for the second object 312 (see FIG. 5B) traveling in the right lane of the host vehicle 1 higher than the notification priority for the first object 311 (see FIG. 5A) traveling ahead of the host vehicle 1, and determines that the movement condition is satisfied.
  • When the movement condition is not satisfied, the display processing unit 22 updates the display without moving the moving image 200 away from the first object 311 (step S30). In other words, the display processing unit 22 keeps the object 310 with which the moving image 200 is associated unchanged.
  • "Not moving the moving image 200 away from the object 310" includes not only simply leaving the position (coordinates) at which the moving image 200 is displayed in the display area 101 of the HUD device 100 unchanged, but also continuing to change the position of the moving image 200 by tracking the position of the object 310.
  • When the movement condition is satisfied in step S20, the process proceeds to step S40 (information image updating step), in which the display processing unit 22 or the information image determination unit 22b determines, based on the various information input in step S10, the second information image 212 related to the second object 312; the image generation unit 24 then generates display data and switches the information image 210 from the first information image 211 to the second information image 212.
  • For example, the display processing unit 22 or the information image determination unit 22b switches the information image 210 from "forward warning" (see FIG. 5A), the first information image 211 related to the first object 311, to "right side caution" (see FIG. 5B), the second information image 212 related to the second object 312.
  • step S50 movement step
  • the display processing unit 22 or the movement path determination unit 22a moves the moving image 200 closer to the second object 312 based on the position of the second object 312 (in other words,
  • the movement path 400 is determined such that the movement image 200 is separated from the first object 311, the image generation unit 24 generates display data, and moves the information image 210 along the movement path 400.
  • the display processing unit 22 or the movement path determination unit 22a sets the position at which the movement image 200 is currently displayed as the movement start point 401, and moves the predetermined position determined based on the position of the second object 312
  • the end point 402 is set, and a line segment connecting the movement start point 401 and the movement end point 402 is determined as the movement path 400.
Note that steps S40 and S50 may be executed in reverse order or simultaneously. It is preferable that the processing unit 20 or the display processing unit 22 completes the switching of the information image 210 to the second information image 212 (step S40 (information image updating step)) before the movement of the moving image 200 (step S50 (moving step)) is completed. In other words, the switching of the information image 210 is preferably completed while the moving image 200 moves along the movement path 400, or before the movement starts. In this way, the new information image 210 (second information image 212) can be visually recognized before the moving image 200 finishes moving, so the viewer can take the action intended by the information image 210 more quickly.
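The cycle of steps S20 to S50 and the ordering constraint above (complete the information-image switch no later than the move) can be illustrated with a toy model. The priority comparison used here for the movement condition is a simplification of the notification-priority logic described later in the embodiments, and every name is hypothetical:

```python
class MovingImageController:
    """Toy model of steps S20-S50; not the actual implementation."""

    def __init__(self):
        self.info_image = "forward warning"   # first information image 211
        self.position = (0, 0)                # movement start point 401

    def cycle(self, first_priority, second_priority, second_object_pos):
        # Step S20: the movement condition holds when the second object's
        # notification priority exceeds the first object's.
        if second_priority <= first_priority:
            return False                      # step S30: ordinary display update
        # Step S40: switch the information image BEFORE the move completes,
        # as the patent recommends, so the new content is visible early.
        self.info_image = "right side attention"
        # Step S50: move toward the second object (a single hop here stands
        # in for the frame-by-frame movement along the path 400).
        self.position = second_object_pos
        return True
```

A run where the second object's priority dominates switches the image and moves it; otherwise both stay as they were.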
FIG. 6 shows display examples of the moving image 200 according to the second embodiment: FIG. 6(a) shows the moving image 200 before moving, FIG. 6(b) shows the moving image 200 moving from the movement start point 401 to the movement relay point 403, and FIG. 6(c) shows the moving image 200 moving from the movement relay point 403 to the movement end point 402.
In step S10 of FIG. 4, the determination unit 21 acquires the object information of the objects 310 in the periphery 300 of the host vehicle 1, and when the movement condition is not satisfied (NO in step S20), the display is updated in step S30. When the notification priority for the first object 311 located in front of the host vehicle 1 is high, the processing unit 20 or the display processing unit 22 displays the moving image 200 including the first information image 211 related to the first object 311, as shown in FIG. 6(a).
The determination unit 21 again acquires the object information of the objects 310 in the periphery 300 of the host vehicle 1 in step S10 of FIG. 4, and determines that the movement condition of the moving image 200 is satisfied when the notification priority for another second object 312 becomes higher than the notification priority for the first object 311 located in front of the host vehicle 1 (or when the notification priority for the first object 311 falls below a predetermined threshold) (YES in step S20 of FIG. 4). Subsequently, in step S40, the information image determination unit 22b determines the second information image 212 related to the second object 312, and the information image 210 is switched from the first information image 211 ("forward warning" in FIG. 6(a)) to the second information image 212 ("right side attention" in FIG. 6(b)).
Then, in step S50, the movement path determination unit 22a determines the first movement path 410 heading from the movement start point 401 to the movement relay point 403, and the processing unit 20 or the display processing unit 22 moves the moving image 200 to the movement relay point 403 along the first movement path 410, as shown in FIG. 6(b).
Thereafter, the determination unit 21 acquires the object information of the objects 310 in the periphery 300 of the host vehicle 1 in step S110 of FIG. 4, and when the notification priority for the second object 312 further increases and exceeds a predetermined threshold, determines that the movement condition of the moving image 200 is satisfied (YES in step S120 of FIG. 4). Subsequently, in step S140, the information image determination unit 22b updates the information image 210; however, the information image 210 has already been changed to the second information image 212 regarding the second object 312 by the previous processing (step S40), and is therefore left unchanged. Then, in step S150, the movement path determination unit 22a determines the second movement path 420 from the movement relay point 403 to the movement end point 402, and, as shown in FIG. 6(c), the moving image 200 is moved along the second movement path 420 to the movement end point 402.
In this manner, the movement conditions are determined individually when moving from the movement start point 401 to the movement relay point 403 and when moving from the movement relay point 403 to the movement end point 402 (see FIG. 4). However, the determination step (step S120) when moving from the movement relay point 403 to the movement end point 402 may be omitted. That is, the moving image 200 may pass through the movement relay point 403 from the movement start point 401 and then automatically continue from the movement relay point 403 to the movement end point 402. Alternatively, the moving image 200 may be moved toward the movement end point 402 after stopping at the movement relay point 403 for a predetermined time (for example, 10 seconds).
Since the movement relay point 403 is a specific position on the display area 101 stored in advance in the storage unit 23, the load on the movement path determination unit 22a or the storage unit 23 can be reduced, even though the movement path 400 between the movement start point 401 and the movement end point 402 changes with each situation, compared with storing and reading out in advance all the movement paths 400 between every possible movement start point 401 and movement end point 402.
The movement relay point 403 is not limited to a single point, and a plurality of positions on the display area 101 may be stored in advance in the storage unit 23. In this case, the movement relay point 403 closest to the position where the moving image 200 was originally displayed may be selected from among the plurality, or the movement relay point 403 closest to the position of the second object 312 may be selected from among the plurality.
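A minimal sketch of the nearest-relay-point selection described here, assuming relay points are stored as display-area coordinates (all names and example positions are hypothetical; `math.dist` computes the on-display Euclidean distance):

```python
import math

# Example relay-point positions pre-stored in the storage unit 23.
RELAY_POINTS = [(2, 1), (6, 3), (9, 1)]

def nearest_relay_point(reference_pos, relay_points=RELAY_POINTS):
    """Pick the stored relay point closest to a reference position: either
    the moving image's original position or the second object's position."""
    return min(relay_points, key=lambda p: math.dist(p, reference_pos))
```

The same function serves both selection policies in the paragraph above; only the reference position passed in differs.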
Next, the third embodiment will be described with reference to FIG. 7 and FIG. 8. The third embodiment differs from the above embodiments in that an instruction image 250 directed from the moving image 200 toward the second object 312 is displayed.
FIG. 7 is a flowchart showing the main operation procedure of the HUD device 100 in the third embodiment, and FIG. 8 is a diagram showing display examples of the moving image 200 in the third embodiment: FIG. 8(a) shows the moving image 200 before moving, FIG. 8(b) shows a display example of the instruction image 250, and FIG. 8(c) shows, after the instruction image 250 is displayed, the moving image 200 moving from the movement start point 401 to the movement end point 402 while the instruction image 250 remains displayed.
In step S210 (information acquisition step) of FIG. 7, the determination unit 21 acquires object information from the periphery monitoring unit 31 via the object information acquisition unit 30, and then determines, based on the various information acquired in step S210, whether the movement condition of the moving image 200 is satisfied (step S220 (determination step)). When the movement condition is not satisfied, the display processing unit 22 updates the display without moving the moving image 200 in the direction away from the first object 311 (step S230).
When the movement condition is satisfied, the process proceeds to step S240 (information image updating step), where the display processing unit 22 or the information image determination unit 22b determines the second information image 212 related to the second object 312 based on the various information input in step S210, and the image generation unit 24 generates display data and switches the information image 210 from the first information image 211 to the second information image 212. For example, the display processing unit 22 or the information image determination unit 22b switches the information image 210 from "forward warning" (see FIG. 8(a)), the first information image 211 related to the first object 311, to "right side attention" (see FIG. 8(b)), the second information image 212 related to the second object 312.
Next, in step S250 (instruction image display step), the display processing unit 22 or the image generation unit 24 determines the position and the shape of the instruction image 250 based on the position of the second object 312 so that the instruction image 250 points from the position at which the moving image 200 is currently displayed toward the second object 312, and the image generation unit 24 generates display data and displays the instruction image 250.
Specifically, the display processing unit 22 determines the instruction image 250 as a ripple-like image directed from the position where the moving image 200 is currently displayed toward the second object 312. However, the instruction image 250 may be any image that associates the moving image 200 with the position where the second object 312 exists, and is therefore not limited to a ripple-like image; it may be, for example, a line image or a dotted line connecting the moving image 200 and the second object 312, or a group image including a plurality of images. The instruction image 250 may also be a moving video in which one or more images move between the moving image 200 and the second object 312, for example a moving video in which a ripple travels from the moving image 200 toward the second object 312.
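As an illustration only, such a ripple-like moving instruction image travelling from the moving image 200 toward the second object 312 could be generated frame by frame as below. All names are hypothetical, and the shrinking radius is an arbitrary visual choice, not something the disclosure specifies:

```python
def ripple_frames(origin, target, n_frames=5):
    """Yield (center, radius) pairs for a ripple travelling from the moving
    image's display position toward the second object's position."""
    (x0, y0), (x1, y1) = origin, target
    for i in range(1, n_frames + 1):
        t = i / n_frames
        # The ripple's center advances linearly toward the target...
        center = (x0 + (x1 - x0) * t, y0 + (y1 - y0) * t)
        # ...while its radius shrinks, suggesting motion toward the object.
        yield center, 1.0 - t * 0.5
```

Rendering one frame per display refresh would produce the "ripple directed at the second object" effect described above.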
Next, in step S260 (moving step), the display processing unit 22 or the movement path determination unit 22a determines the movement path 400 based on the position of the second object 312 so that the moving image 200 approaches the second object 312 (in other words, so that the moving image 200 is separated from the first object 311), and the image generation unit 24 generates display data and moves the moving image 200 along the movement path 400. Specifically, the display processing unit 22 or the movement path determination unit 22a sets the position at which the moving image 200 is currently displayed as the movement start point 401, sets a predetermined position determined based on the position of the second object 312 as the movement end point 402, and determines the line segment connecting the movement start point 401 and the movement end point 402 as the movement path 400.
Note that steps S240 to S260 may be executed in a different order or simultaneously. It is preferable that the processing unit 20 or the display processing unit 22 completes the switching of the information image 210 to the second information image 212 (step S240 (information image updating step)) before the movement of the moving image 200 (step S260 (moving step)) is completed. In this way, the new information image 210 (second information image 212) can be visually recognized before the moving image 200 finishes moving, so the viewer can take the action intended by the information image 210 more quickly. It is also preferable that the processing unit 20 or the display processing unit 22 starts displaying the instruction image 250 (step S250 (instruction image display step)) after the switching of the information image 210 to the second information image 212 (step S240 (information image updating step)) is completed. In this way, since the instruction image 250 is viewed after the information image 210 has switched, the viewer can immediately recognize from the information image 210 why the instruction image 250 is displayed.
FIG. 9 is a flowchart showing the main operation procedure of the HUD device 100 according to the fourth embodiment, and FIG. 10 is a diagram showing display examples of the moving image 200 according to the fourth embodiment: FIG. 10(a) shows the moving image 200 before moving, FIG. 10(b) shows the moving image 200 moving from the movement start point 401 to the movement relay point 403, FIG. 10(c) shows a display example of the instruction image 250, and FIG. 10(d) shows the moving image 200 moving from the movement relay point 403 to the movement end point 402 after the instruction image 250 is displayed or while the instruction image 250 is displayed.
In step S310 of FIG. 9, the determination unit 21 acquires object information of the objects 310 in the periphery 300 of the host vehicle 1, and determines that the movement condition of the moving image 200 is satisfied when the notification priority for a second object 312 other than the first object 311 located in front of the host vehicle 1 becomes higher than the notification priority for the first object 311 (or when the notification priority for the first object 311 falls below a predetermined threshold) (YES in step S320 of FIG. 9). Subsequently, in step S340, the information image determination unit 22b determines the second information image 212 regarding the second object 312, and the information image 210 is switched from the first information image 211 ("forward warning" in FIG. 10(a)) to the second information image 212 ("right side attention" in FIG. 10(b)). Then, in step S350, the movement path determination unit 22a determines the first movement path 410 from the movement start point 401 to the movement relay point 403, and, as shown in FIG. 10(b), the processing unit 20 or the display processing unit 22 moves the moving image 200 to the movement relay point 403 along the first movement path 410.
Next, in step S360 (instruction image display step), the display processing unit 22 or the image generation unit 24 determines the position and the shape of the instruction image 250 based on the position of the second object 312 so that the instruction image 250 points from the position at which the moving image 200 is currently displayed (the movement relay point 403) toward the second object 312, and the image generation unit 24 generates display data and displays the instruction image 250.
Thereafter, the determination unit 21 acquires the object information of the objects 310 in the periphery 300 of the host vehicle 1 in step S410 of FIG. 9, and when the notification priority for the second object 312 becomes higher and exceeds the predetermined threshold, determines that the movement condition of the moving image 200 is satisfied (YES in step S420 of FIG. 9). Subsequently, in step S440, the information image determination unit 22b updates the information image 210; however, the information image 210 has already been changed to the second information image 212 regarding the second object 312 by the previous processing (step S340), and is therefore left unchanged. Then, in step S450, the movement path determination unit 22a determines the second movement path 420 from the movement relay point 403 to the movement end point 402, and, as shown in FIG. 10(d), the moving image 200 is moved along the second movement path 420 to the movement end point 402. When moving the moving image 200, the instruction image 250 may be hidden, or the moving image 200 may be moved while the instruction image 250 remains displayed.
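The two-stage movement of the fourth embodiment (move to the relay point, show the instruction image there, then re-check the movement condition before continuing to the end point) can be summarized in a short sketch. The names and the simplified threshold test are hypothetical stand-ins for the notification-priority logic:

```python
def two_stage_move(start, relay, end, second_priority, threshold=5):
    """Fourth-embodiment flow: move to the relay point (S350), display the
    instruction image there (S360), then continue to the end point only if
    the second object's priority exceeds the threshold (S420/S450)."""
    path = [start, relay]                 # first movement path 410
    instruction_shown = True              # step S360 at the relay point
    if second_priority > threshold:       # step S420: re-check the condition
        path.append(end)                  # second movement path 420
    return path, instruction_shown
```

If the re-check fails, the moving image simply waits at the relay point with the instruction image displayed; if it succeeds, the path gains its final leg to the end point 402.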
Since the movement relay point 403 is a specific position on the display area 101 stored in advance in the storage unit 23, the display processing unit 22 can display the instruction image 250 even though the instruction image 250 between the movement start point 401 and the movement end point 402 changes with each situation, and the load on the display processing unit 22 or the storage unit 23 can be reduced compared with generating (by the image generation unit 24) or storing (in the storage unit 23) and reading out all the instruction images 250 between every possible movement start point 401 and movement end point 402.
The movement relay point 403 is not limited to a single point, and a plurality of positions on the display area 101 may be stored in advance in the storage unit 23. In this case, the movement relay point 403 closest to the position where the moving image 200 was originally displayed may be selected from among the plurality, or the movement relay point 403 closest to the position of the second object 312 may be selected from among the plurality.
In the above embodiments, the display processing unit 22 or the movement path determination unit 22a determines the movement path 400 based on the position of the second object 312, but the present invention is not limited to this. For example, the display processing unit 22 or the movement path determination unit 22a may determine the movement path 400 based on the position of the first object 311 such that the moving image 200 is separated from the first object 311.

Abstract

In order to improve the recognizability of an object existing in an actual view, in the present invention an object information acquisition unit 30 acquires first object information for a first object 311 existing in the vicinity of a vehicle, said information including at least the position of the first object, and second object information for a second object 312, said information including at least the position of the second object. Furthermore, a determination unit 21 determines whether a moving condition for moving a moving image 200 has been satisfied. When the moving condition has been satisfied, a display processing unit 22 switches an information image 210 included in the moving image 200 from a first information image 211 related to the first object information to a second information image 212 related to the second object information, and moves the moving image 200 along a movement path 400 separated from the first object 311.

Description

Display control method, display control device, and head-up display device
The present invention relates to a display control method for a virtual image visually recognized superimposed on the foreground of a vehicle, a display control device, and a head-up display device for displaying the virtual image.
The head-up display device disclosed in Patent Document 1 generates augmented reality (AR) by displaying a virtual image for an object existing in the foreground, thereby adding information to the object or emphasizing the object.
Patent Document 1: JP 2006-242859 A
However, as shown in FIG. 10 of Patent Document 1, when virtual images are displayed for each of a plurality of objects, it may be difficult to know to which virtual image attention should be directed, and the virtual images themselves and the objects to which they correspond may become difficult to recognize.
The present invention has been made in view of the above problems, and an object of the present invention is to improve the recognizability of objects existing in a real scene.
A display control method according to a first embodiment of the present invention is a display control method for a moving image (200) visually recognized superimposed on the foreground of a host vehicle (1), comprising:
an information acquisition step (S10, S110, S310) of acquiring, for a first object (311) existing around the host vehicle, first object information including at least the position of the first object, and, for a second object (312) existing around the host vehicle, second object information including at least the position of the second object;
a determination step (S20, S120, S320) of determining whether a movement condition for moving the moving image is satisfied; and
the following steps executed when the movement condition is satisfied:
an information image updating step (S40, S140, S340) of switching an information image (210) included in the moving image from a first information image (211) related to the first object information to a second information image (212) related to the second object information; and
a moving step (S50, S160, S350) of moving the moving image along a movement path (400) away from the first object, based on at least one of the acquired position of the first object and the acquired position of the second object.
A display control device according to a second embodiment of the present invention is a display control device for displaying a moving image (200) visually recognized superimposed on the foreground of a host vehicle (1), comprising:
an object information acquisition unit (30) that acquires, for a first object (311) existing around the host vehicle, first object information including at least the position of the first object, and, for a second object (312), second object information including at least the position of the second object;
a determination unit (21) that determines whether a movement condition for moving the moving image is satisfied; and
a display processing unit (22) that, when the movement condition is satisfied, switches an information image (210) included in the moving image from a first information image (211) related to the first object information to a second information image (212) related to the second object information, and moves the moving image along a movement path (400) away from the first object.
A head-up display device according to a third embodiment of the present invention comprises: the display control device (90) of the second embodiment; and a display (10) that displays the moving image under the control of the display control device (90).
FIG. 1 is a block diagram functionally showing the configuration of a head-up display device according to an embodiment of the present invention. FIG. 2 is a diagram showing an example of how the moving image displayed by the head-up display device of the embodiment and the real scene are visually recognized when the driver faces forward of the host vehicle. FIG. 3 is a diagram showing an example of the moving image displayed by the head-up display device of the embodiment. FIG. 4 is a flowchart showing the operation of the head-up display device of the embodiment. FIG. 5 is a diagram showing how the moving image displayed by the head-up display device of the embodiment moves. FIG. 6 is a diagram showing how the moving image displayed by a head-up display device according to a modification of the present invention moves. FIG. 7 is a flowchart showing the operation of a head-up display device according to a modification of the present invention. FIG. 8 is a diagram showing how the moving image displayed by the head-up display device of the embodiment moves. FIG. 9 is a flowchart showing the operation of a head-up display device according to a modification of the present invention. FIG. 10 is a diagram showing how the moving image displayed by the head-up display device of the embodiment moves.
Hereinafter, embodiments according to the present invention will be described with reference to the drawings. The present invention is not limited by the following embodiments (including the contents of the drawings); modifications (including deletion of components) may of course be made to the following embodiments. In the following description, explanations of known technical matters are omitted as appropriate to facilitate understanding of the present invention.
In this specification, a device that has the function of the object information acquisition unit 30, which acquires the position of an object 310 present in the periphery 300 of the host vehicle 1, and the function of the display processing unit 22, which moves the moving image 200 and switches the information image 210 included in the moving image 200, is called a display control device according to the present invention. As one example, since the unit indicated by reference numeral 90 in FIG. 1 includes the object information acquisition unit 30 and the display processing unit 22, it can be called a display control device. Furthermore, since the display control device essentially need only have the function of moving the moving image 200 and switching the information image 210 included in it (the display processing unit 22), the function of generating display data for causing the display 10 to display a display image (the image generation unit 24) may be provided separately. In other words, the display control device 90 of FIG. 1 can be called a display control device according to the present invention even when it lacks the function of the image generation unit 24 that generates display data.
In addition, a device comprising a display control device according to the present embodiment (for example, reference numeral 90 in FIG. 1) and a display 10 that displays a display image including the moving image 200 generated by the display control device 90 can be called a head-up display device according to the present invention.
FIG. 1 is a block diagram showing the overall configuration of a head-up display device 100 (hereinafter referred to as the HUD device) in the present embodiment. For example, the HUD device 100 projects the display light L of the display 10 onto a part of the front windshield (an example of the projection target 2) of the host vehicle 1. The projection target 2 reflects the display light L toward the viewer to generate a predetermined eye box 3. By placing the eye point 4 (the position of the viewer's eyes) within the eye box 3, the viewer (the driver of the host vehicle 1) can visually recognize a virtual image within the display area 101 of the HUD device 100 (see FIG. 2) ahead through the projection target 2.
The HUD device 100 comprises: the display 10, which displays a display image M; a relay optical unit 11 (the plane mirror 12 and the concave mirror 13 in FIG. 1), which projects the display light L of the display image M displayed by the display 10 onto the projection target 2 of the host vehicle 1 (for example, the front windshield of the host vehicle 1); a processing unit 20, which changes the virtual image (moving image 200) visually recognized by the viewer by controlling the display image M displayed by the display 10; the object information acquisition unit 30; the operation information acquisition unit 40; and the input/output interface 70.
The processing unit 20 is communicably connected, via input interfaces (the object information acquisition unit 30, the operation information acquisition unit 40, and the input/output interface 70), to external devices: the periphery monitoring unit 31, the operation detection unit 41, the navigation system 500, a cloud server (external server) 600, the vehicle ECU 700, as well as other on-vehicle devices in the host vehicle 1 (not shown), portable devices in the host vehicle 1, other vehicles (vehicle-to-vehicle communication V2V), roadside communication infrastructure (road-to-vehicle communication V2I), and pedestrians' portable devices (vehicle-to-pedestrian communication V2P). The input interfaces can include wired communication functions such as, for example, a USB port, a serial port, a parallel port, OBDII, and/or any other suitable wired communication port. In other embodiments, the input interfaces can include wireless input interfaces using, for example, the Bluetooth (registered trademark) communication protocol, the IEEE 802.11 protocol, the IEEE 1002.11 protocol, the IEEE 1002.16 protocol, the DSRC (Dedicated Short Range Communications) protocol, a shared wireless access protocol, a wireless USB protocol, and/or any other suitable wireless telegraphy technology.
The object information acquisition unit 30 is an input interface that acquires data (object information) including at least positional information of a specific object 310 present in the periphery 300, including at least the area in front, of the host vehicle 1; for example, it acquires object information from the periphery monitoring unit 31, which consists of one or more cameras or sensors mounted on the host vehicle 1, and outputs it to the processing unit 20. For example, the periphery monitoring unit 31 has an analysis unit (not shown) that images (detects) the periphery 300 of the host vehicle 1 and analyzes the imaging (detection) data to generate positional information of the object 310 in the periphery 300 on real coordinates (which may also include distance information to the object 310) and/or positional information of the object 310 within the display area 101 of the HUD device 100 (which may also include distance information to the object 310). In this case, the object information acquired by the object information acquisition unit 30 is the real coordinates in the periphery 300 of the host vehicle 1 or the coordinates (positional information) of the object 310 in the display area 101. Note that such an analysis unit may instead be provided within the HUD device 100; specifically, the analysis unit may be provided in the processing unit 20, in which case the object information acquired by the object information acquisition unit 30 is the imaging (detection) data obtained by the periphery monitoring unit 31 imaging (detecting) the periphery 300.
 The object information acquisition unit 30 may also acquire type information of the object 310, obtained when the periphery monitoring unit 31 observes the surroundings 300 of the host vehicle 1 and a type identification unit (not shown) identifies the object. The type of the object 310 is, for example, an obstacle (another vehicle, a person, an animal, or an object), a road sign, a road, or a building, but is not limited to these as long as the object is present in the surroundings 300 and can be identified. That is, the object information acquisition unit 30 may acquire object information (position information, distance information, and type information of the object 310) and output it to the processing unit 20.
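Purely for illustration, the object information described above (position information, distance information, type information) could be modeled as a simple record; the names, types, and field structure here are hypothetical assumptions and not part of the disclosure:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ObjectInfo:
    # Position of the object 310, e.g. (x, y) in display-area coordinates.
    position: Tuple[float, float]
    # Distance from the host vehicle 1 to the object (may be absent).
    distance: Optional[float] = None
    # Type label such as "other_vehicle", "person", "road_sign".
    obj_type: Optional[str] = None

# Example: an object detected 32.5 m ahead, identified as another vehicle.
info = ObjectInfo(position=(120.0, 45.0), distance=32.5, obj_type="other_vehicle")
```

A record like this would be what the object information acquisition unit 30 passes to the processing unit 20 in this sketch.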
 In other embodiments, instead of or in addition to the object information acquisition unit 30, the input/output interface 70, which can communicate with external devices, may function as the object information acquisition unit 30. In other words, the input/output interface 70 may acquire object information from the navigation system 500, the cloud server 600, the vehicle ECU 700, other in-vehicle devices in the host vehicle 1 (not shown), portable devices in the host vehicle 1, other vehicles (vehicle-to-vehicle communication V2V), roadside communication infrastructure (vehicle-to-infrastructure communication V2I), and pedestrians' portable devices (vehicle-to-pedestrian communication V2P). For example, the input/output interface 70 can acquire, from the navigation system 500, the cloud server 600, and the vehicle ECU 700, position information and type information of objects 310 such as road signs, roads, and buildings together with map information, and can acquire position information and type information of obstacles (other vehicles, people, animals, objects), road signs, roads, and the like from other vehicles (vehicle-to-vehicle communication V2V), roadside communication infrastructure (vehicle-to-infrastructure communication V2I), and pedestrians' portable devices (vehicle-to-pedestrian communication V2P), none of which are shown. The processing unit 20 may output position information of the host vehicle 1 or the HUD device 100 (to which heading information of the host vehicle 1 may be added) to the external devices via the input/output interface 70, and the external devices may, based on the received position information (and any additional heading information), output object information including the position information and type information of objects 310 present in the surroundings 300 of the host vehicle 1 to the input/output interface 70.
 The operation information acquisition unit 40 acquires, for example, operation information detected by the operation detection unit 41 for components of the host vehicle 1 such as the steering wheel, accelerator, brake, turn signals, and headlights, and outputs the operation information to the processing unit 20.
 In other embodiments, instead of or in addition to the operation information acquisition unit 40, the input/output interface 70, which can communicate with external devices, may function as the operation information acquisition unit 40. In other words, the input/output interface 70 may acquire operation information concerning the host vehicle 1 from the cloud server 600, the vehicle ECU 700, and roadside communication infrastructure (vehicle-to-infrastructure communication V2I, not shown).
 The processing unit 20 includes: a determination unit 21 that determines whether a movement condition for the moving image 200 (described later) is satisfied; a display processing unit 22 that changes the moving image 200; a storage unit 23 that stores the programs executed by the display processing unit 22 and image element data; and an image generation unit 24 that generates display data for causing the display 10 to display a display image based on the computation results of the display processing unit 22.
 The determination unit 21 determines, for each object 310, a priority with which the viewer should be notified (hereinafter also called the notification priority), compares the notification priorities of multiple objects 310, and determines that the movement condition for moving the moving image 200 is satisfied when the notification priority of another object 310 becomes higher than the notification priority of the object 310 with which the moving image 200 is currently associated. The notification priority of an object 310 is, for example, a degree of danger derived from the severity of the situation that could occur, or a degree of urgency derived from how quickly a reactive action is required. It may be determined from the object information acquired by the object information acquisition unit 30 alone, or from that object information combined with the operation information acquired by the operation information acquisition unit 40 and the various information acquired by the input/output interface 70. Note that the determination unit 21 essentially only needs the function of determining whether the movement condition is satisfied based on the notification priority, and therefore need not itself have the function of determining the notification priority; part or all of the function of determining the notification priority may be provided outside the processing unit 20 or outside the display control device 90. The determination of the movement condition is not limited to methods using the degree of danger or the degree of urgency; modifications are described below.
 For example, as a modification, the determination unit 21 may raise the notification priority of a newly detected object 310 and determine that the movement condition for moving the moving image 200 toward the newly detected object 310 is satisfied. In other words, the freshness of the information may be used in the movement-condition determination instead of, or in addition to, the degree of danger or urgency. Specifically, even when the degree of danger (urgency) of the object 310 with which the moving image 200 is associated is higher than that of another object 310 detected later, the determination unit 21 may take the freshness of the information into account and determine that the movement condition for moving the moving image 200 toward the other object 310 is satisfied.
 As another modification, the determination unit 21 may lower the notification priority of the object 310 with which the moving image 200 has been continuously associated and determine that the movement condition for moving the moving image 200 toward another object 310 is satisfied. In other words, the display duration of the moving image 200 may be used in the movement-condition determination instead of, or in addition to, the degree of danger or urgency. Specifically, when the moving image 200 has been displayed continuously for the same object 310 for at least a preset threshold time, the determination unit 21 may determine that the movement condition for moving the moving image 200 toward another object 310 is satisfied. The method of determining the movement condition is not limited to these examples; for example, when information newly acquired via the input interfaces (the object information acquisition unit 30, the operation information acquisition unit 40, and the input/output interface 70) is of a predetermined type, satisfies a predetermined condition, or forms a predetermined combination with other acquired information, the determination unit 21 may comprehensively judge the situation the driver is in, change the notification priorities, and determine the movement condition accordingly.
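As a minimal sketch of how the determination unit 21 might combine the degree of danger or urgency with the freshness and display-duration modifications described above (the function name, scoring scheme, and numeric values are all illustrative assumptions, not taken from the disclosure):

```python
def movement_condition_met(current_priority, candidate_priority,
                           candidate_is_new=False, display_duration=0.0,
                           freshness_bonus=1.0, duration_threshold=5.0,
                           duration_penalty=1.0):
    """Return True when the moving image should move to the candidate object.

    Hypothetical scoring: a newly detected object gets a freshness bonus,
    and an image shown on the same object beyond a threshold time lowers
    the current object's effective priority.
    """
    effective_candidate = candidate_priority + (
        freshness_bonus if candidate_is_new else 0.0)
    effective_current = current_priority - (
        duration_penalty if display_duration >= duration_threshold else 0.0)
    return effective_candidate > effective_current
```

Under this sketch, a less dangerous but newly detected object, or a long-displayed current object, can still trigger the movement condition, matching the two modifications above.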
 When the determination unit 21 determines that the movement condition is satisfied, the display processing unit 22 moves the moving image 200 and switches the first information image 211 (see FIG. 5(a)), which is included in the moving image 200 and based on first object information, to the second information image 212 (see FIG. 5(b)), which is based on second object information. The display processing unit 22 has a function of determining the movement path 400 from the position where the moving image 200 was originally displayed (the movement start point 401) to the position to which the moving image 200 is moved (the movement end point 402) (the movement path determination unit 22a), and a function of determining the information image 210 of the moving image 200 (the information image determination unit 22b).
 The movement path determination unit 22a determines the movement path 400 of the moving image 200 based on the position where the moving image 200 is currently displayed and the position of the second object 312 included in the second object information. For example, the movement path 400 is a straight line from the position where the moving image 200 is currently displayed (the movement start point 401) toward a predetermined position around the second object 312 included in the second object information (the movement end point 402); the display processing unit 22 determines a movement path 400 toward a predetermined position (the movement end point 402) outside the outline of the second object 312 or outside a bounding box surrounding the second object 312. The movement end point 402 is preferably set at a position where the moving image 200, after moving along the straight movement path 400, does not overlap the outline of the second object 312 or the bounding box surrounding it. In other words, the movement path determination unit 22a may determine the movement path 400 so that the moving image 200 does not overlap the second object 312. However, the movement end point 402 may instead be set inside the outline of the second object 312 (that is, the movement path determination unit 22a may determine the movement path 400 so that the moving image 200 overlaps the second object 312). In this case, when the moving image 200 comes to overlap the first object 311 or the second object 312 as it moves, the display processing unit 22 may reduce the visibility of the moving image 200. Specifically, the display processing unit 22 may reduce the visibility of both the information image 210 and the fill image 220 of the moving image 200, or it may reduce the visibility of the information image 210 while maintaining the visibility of the fill image 220 or reducing it to a lesser degree than that of the information image 210. The movement path 400 need not be a simple straight line and may be a visually elaborate curve.
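A minimal sketch of choosing a movement end point 402 just outside the bounding box of the second object 312, as preferred above (the coordinate convention — screen y growing downward — and the margin value are assumptions for illustration):

```python
def choose_move_end_point(bbox, margin=10.0):
    """Pick an end point centered just below the target's bounding box so
    the moving image does not overlap the object.
    bbox = (left, top, right, bottom), with screen y growing downward."""
    left, top, right, bottom = bbox
    cx = (left + right) / 2.0
    return (cx, bottom + margin)

# Example: a bounding box of the second object 312 in display coordinates.
end_point = choose_move_end_point((100.0, 50.0, 200.0, 150.0))  # (150.0, 160.0)
```

Placing the end point outside the box by a margin keeps the moving image 200 clear of the object while remaining near it, matching the preference stated above.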
 The information image determination unit 22b determines the information image 210 (described later) included in the moving image 200, based on the object information (position information, distance information, and type information) of the object 310 with which the moving image 200 is associated. The information image determination unit 22b may also take into account the operation information acquired from the operation information acquisition unit 40 and the various information acquired from the input/output interface 70, in addition to the object information, when determining the information image 210.
 Next, the moving image 200 displayed by the HUD device 100 will be described with reference to FIGS. 2 and 3. FIG. 2 shows an example of how the moving image 200 displayed by the HUD device 100 and the real scene are viewed when the driver faces forward from the host vehicle 1. The HUD device 100 displays the moving image 200 at a predetermined position within the display area 101 on the projection target 2. In FIG. 2, the display area 101 is drawn as a rectangular region enclosed by a dotted line, and another vehicle 311 (an example of an object 310) located ahead of the host vehicle 1 lies within this display area 101. The HUD device 100 displays the moving image 200, on which "forward caution" is drawn to call attention to the other vehicle 311, near the other vehicle 311 (vertically below the other vehicle 311 in FIG. 2). The viewer thus directs visual attention to the displayed moving image 200, recognizes from the information image 210 drawn on it ("forward caution") that the situation requires attention, and can quickly recognize that the target requiring attention is the other vehicle 311 near the moving image 200. The position (coordinates) at which the moving image 200 is displayed may be fixed or dynamic with respect to changes in the position of the object 310. That is, the display position of the moving image 200 may be determined from the rough position of the object 310 and left unchanged (fixed) for small changes in the object's position, or it may be changed in response to changes in the position information of the object 310 (it may be changed continuously so that the relative displacement from the object 310 remains small; in other words, it may track the position of the object 310).
 FIG. 3 illustrates the composition of the moving image 200. The moving image 200 is rendered in colors such as white, orange, yellow, yellow-green, blue, or red and, as shown in FIG. 3(a), consists of an information image 210, a text image of about four characters such as "forward caution", and a fill image 220, a rectangle (the shape is one example) surrounding the information image 210 and filled with a color whose hue differs from that of the information image 210. Since the moving image 200 is a virtual image displayed over the real scene, the viewer perceives a mixture of the colors of the real scene and the moving image 200. FIG. 3(b) shows the moving image 200 with the fill image 220 overlapping a visual obstruction 320 in the real scene (for example, a white line on the road); in this example the information image 210 is white (RGB: R255, G255, B255), the fill image 220 is yellow-green (RGB: R153, G255, B102), and the visual obstruction 320 is white (RGB: R255, G255, B255). FIG. 3(c) shows the moving image 200 without the fill image 220 overlapping the visual obstruction 320 (for example, a white line on the road). As these figures show, when the color of the information image 210 is the same as or similar to the color of the visual obstruction 320, the moving image 200 without the fill image 220 (FIG. 3(c)) makes it hard to distinguish the information image 210 from the visual obstruction 320, reducing the visibility of the information image 210. In contrast, in the moving image 200 with the fill image 220 (FIG. 3(b)), the portion of the visual obstruction 320 overlapped by the fill image 220 is perceived as a different color due to color mixing with the fill image 220, making the information image 210 easier to distinguish from the visual obstruction 320 and improving its visibility. In the present embodiment, because the moving image 200 is moved, it more frequently overlaps various visual obstructions 320 in the real scene; providing the fill image 220 in the moving image 200 preserves the visibility of the information image 210.
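The color-mixing effect described above can be sketched with a simple alpha-compositing model (the blend formula and the alpha value are illustrative assumptions; an actual HUD virtual image mixes with the scene optically, not digitally):

```python
def blend(image_rgb, scene_rgb, alpha=0.6):
    """Simple alpha-compositing model of a translucent virtual image over
    the real scene: perceived = alpha*image + (1-alpha)*scene per channel."""
    return tuple(round(alpha * i + (1 - alpha) * s)
                 for i, s in zip(image_rgb, scene_rgb))

white_line = (255, 255, 255)    # visual obstruction 320, e.g. a lane marking
text_color = (255, 255, 255)    # information image 210 (white)
fill_color = (153, 255, 102)    # fill image 220 (yellow-green)

# Without the fill image: white text over the white line blends to the
# same white -> no contrast, as in FIG. 3(c).
no_fill = blend(text_color, white_line)    # (255, 255, 255)

# With the fill image: the line seen through the fill shifts toward
# yellow-green, so the white text stands out, as in FIG. 3(b).
with_fill = blend(fill_color, white_line)  # (194, 255, 163)
```

In this model the white line under the fill image is no longer perceived as pure white, which is the mechanism by which the fill image 220 restores the contrast of the information image 210.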
 FIG. 3(d) shows the information image 210 of the moving image 200 being switched: one image is the information image 210a reading "right-side caution", and the other is the information image 210b reading "vehicle approaching". The moving image 200 may thus have multiple information images 210 (two in the example of FIG. 3(d), the information images 210a and 210b) and may switch between them sequentially or back and forth in succession.
 As another example, the moving image 200 need not have the fill image 220. In other words, the moving image 200 may consist of the information image 210 alone, or may include image elements other than the fill image 220. The information image 210 and the fill image 220 are not limited to single colors and may each be composed of multiple colors. The fill image 220 may also be an image element outlining each character of the information image 210. Although the fill image 220 preferably surrounds the entire information image 210, it may leave part of the information image 210 unsurrounded.
 The first to third embodiments are specifically described below with reference to FIGS. 4 to 8.
(First Embodiment)
 First, refer to FIGS. 4 and 5. FIG. 4 is a flowchart showing the main operation procedure of the HUD device 100 in the first and second embodiments, and FIG. 5 shows display examples of the moving image 200 of the first embodiment corresponding to the flowchart of FIG. 4. FIG. 5(a) shows a display example of the moving image 200 before it is moved, and FIG. 5(b) shows a display example of the moving image 200 after steps S40 and S50 in FIG. 4 have been executed upon satisfaction of the movement condition.
 In step S10 of FIG. 4 (the information acquisition step), the determination unit 21 acquires object information from the periphery monitoring unit 31 via the object information acquisition unit 30. The object information includes at least position information of an object 310 in the surroundings 300 of the host vehicle 1 in real-world coordinates (which may also include distance information to the object 310) and/or position information of the object 310 within the display area 101 of the HUD device 100 (which may also include distance information to the object 310). In this information acquisition step, the determination unit 21 may also acquire operation information of the host vehicle 1 from the operation detection unit 41 via the operation information acquisition unit 40, and various other information, including information from external devices (the navigation system 500, the cloud server 600, and the vehicle ECU 700), via the input/output interface 70.
 Next, in step S20 (the determination step), the determination unit 21 determines whether the movement condition for the moving image 200 is satisfied based on the various information acquired in step S10. For example, when the operation information acquisition unit 40 has acquired, in step S10, turn-signal operation information indicating that the driver is changing course into the right lane, the determination unit 21 sets the notification priority of the second object 312 traveling in the right lane of the host vehicle 1 (see FIG. 5(b)) higher than the notification priority of the first object 311 traveling ahead of the host vehicle 1 (see FIG. 5(a)), and determines that the movement condition is satisfied.
 When the movement condition is not satisfied (NO in step S20), the display processing unit 22 updates the display without moving the moving image 200 in a direction away from the first object 311 (step S30). In other words, the display processing unit 22 keeps the moving image 200 associated with the same object 310. Here, "not moving the moving image 200 away from the object 310" includes not only keeping the position (coordinates) at which the moving image 200 is displayed within the display area 101 of the HUD device 100 unchanged, but also continuing to change the position of the moving image 200 so that it tracks the position of the object 310.
 When the movement condition is satisfied (YES in step S20), the process proceeds to step S40 (the information image update step): the display processing unit 22 or the information image determination unit 22b determines the second information image 212 related to the second object 312 based on the various information acquired in step S10, the image generation unit 24 generates the display data, and the information image 210 is switched from the first information image 211 to the second information image 212. For example, the display processing unit 22 or the information image determination unit 22b switches the information image 210 from "forward caution" (see FIG. 5(a)), the first information image 211 related to the first object 311, to "right-side caution" (see FIG. 5(b)), the second information image 212 related to the second object 312.
 The process then proceeds to step S50 (the movement step): the display processing unit 22 or the movement path determination unit 22a determines the movement path 400 based on the position of the second object 312 so that the moving image 200 approaches the second object 312 (in other words, moves away from the first object 311), the image generation unit 24 generates the display data, and the moving image 200 is moved along the movement path 400. For example, the display processing unit 22 or the movement path determination unit 22a sets the position where the moving image 200 is currently displayed as the movement start point 401, sets a predetermined position determined with reference to the position of the second object 312 as the movement end point 402, and determines the line segment connecting the movement start point 401 and the movement end point 402 as the movement path 400.
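The straight movement path 400 from the movement start point 401 to the movement end point 402 can be animated by simple linear interpolation; this sketch is illustrative only (the frame count and coordinates are assumptions):

```python
def point_on_path(start, end, t):
    """Linearly interpolate along the straight movement path 400 from the
    movement start point 401 (t=0) to the movement end point 402 (t=1)."""
    return tuple(s + t * (e - s) for s, e in zip(start, end))

# Frame-by-frame positions of the moving image 200 over five display updates.
path = [point_on_path((100.0, 200.0), (300.0, 120.0), k / 4) for k in range(5)]
```

Each element of `path` would be the position at which the image generation unit 24 renders the moving image 200 on successive frames.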
 Steps S40 and S50 may be executed in either order or simultaneously, but the processing unit 20 or the display processing unit 22 preferably completes the switch of the information image 210 to the second information image 212 (step S40, the information image update step) by the time the movement of the moving image 200 (step S50, the movement step) completes. In other words, the switch of the information image 210 preferably completes while the moving image 200 is moving along the movement path 400, or before the movement starts. In that case, the new information image 210 (the second information image 212) becomes visible before or during the movement of the moving image 200, so the viewer can take the action intended by the information image 210 sooner.
(Second Embodiment)
 Next, a second embodiment will be described with reference to FIGS. 4 and 6. The second embodiment differs from the above embodiment in that the moving image 200 is moved from the movement start point 401 via a movement relay point 403 (a first movement stage) and then to the movement end point 402 (a second movement stage). FIG. 6 shows display examples of the moving image 200 of the second embodiment: FIG. 6(a) shows the moving image 200 before it is moved, FIG. 6(b) shows the moving image 200 moving from the movement start point 401 to the movement relay point 403, and FIG. 6(c) shows the moving image 200 moving from the movement relay point 403 to the movement end point 402.
 First, the determination unit 21 acquires the object information of objects 310 in the surroundings 300 of the host vehicle 1 in step S10 of FIG. 4, and when the movement condition is not satisfied (NO in step S20), the display is updated in step S30. When the notification priority of the first object 311 located ahead of the host vehicle 1 is high, the processing unit 20 or the display processing unit 22 displays the moving image 200 including the first information image 211 related to the first object 311, as shown in FIG. 6(a).
 Thereafter, the determination unit 21 again acquires the object information on the objects 310 in the surroundings 300 of the host vehicle 1 in step S10 of FIG. 4, and determines that the movement condition of the moving image 200 is satisfied (YES in step S20 of FIG. 4) when the notification priority for another, second object 312 becomes higher than the notification priority for the first object 311 located ahead of the host vehicle 1 (or when the notification priority for the first object 311 falls below a predetermined threshold). Subsequently, in step S40, the information image determination unit 22b determines the second information image 212 relating to the second object 312 and switches the information image 210 from the first information image 211 ("Caution ahead" in FIG. 6(a)) to the second information image 212 ("Caution right" in FIG. 6(b)). In step S50, the movement path determination unit 22a determines the first movement path 410 from the movement start point 401 toward the movement relay point 403, and the processing unit 20 or the display processing unit 22 moves the moving image 200 along the first movement path 410 to the movement relay point 403, as shown in FIG. 6(b).
 After that, the determination unit 21 acquires the object information on the objects 310 in the surroundings 300 of the host vehicle 1 in step S110 of FIG. 4, and determines that the movement condition of the moving image 200 is satisfied (YES in step S120 of FIG. 4) when the notification priority for the second object 312 rises further and exceeds a predetermined threshold. Subsequently, in step S140, the information image determination unit 22b updates the information image 210; since the information image 210 has already been changed to the second information image 212 relating to the second object 312 by the earlier processing (step S40), this step is effectively skipped. In step S150, the movement path determination unit 22a determines the second movement path 420 from the movement relay point 403 toward the movement end point 402, and the moving image 200 is moved along the second movement path 420 to the movement end point 402, as shown in FIG. 6(c).
 In the embodiment above, separate movement conditions are evaluated both when moving from the movement start point 401 to the movement relay point 403 and when moving from the movement relay point 403 to the movement end point 402 (both steps S20 and S120 of FIG. 4 are executed), but the determination step for the leg from the movement relay point 403 to the movement end point 402 (step S120) may be omitted. In that case, once the movement condition is satisfied when leaving the movement start point 401, the moving image 200 may be moved from the movement start point 401 via the movement relay point 403 and then automatically from the movement relay point 403 to the movement end point 402. The moving image 200 may also be held at the movement relay point 403 for a predetermined time (for example, 10 seconds) before being moved toward the movement end point 402.
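The variant in which the second judgment (step S120) is omitted can be sketched as follows; the event tuples and the `dwell_s` parameter are illustrative assumptions, with the 10-second default mirroring the example in the text:

```python
def move_via_relay(condition_met, start, relay, end, dwell_s=10.0):
    """Once the single movement condition holds, run both legs:
    start -> relay, an optional hold at the relay, then relay -> end
    automatically, with no second condition check."""
    if not condition_met:
        return [("stay", start)]
    events = [("move", start, relay)]
    if dwell_s > 0:
        events.append(("dwell", relay, dwell_s))  # hold at the relay point
    events.append(("move", relay, end))           # automatic second leg
    return events


# With the condition satisfied, the second leg follows without a new judgment.
events = move_via_relay(True, (0, 0), (4, 1), (8, 2))
```

Keeping the dwell between the two legs gives the viewer time to register the relocated image before it moves on, which is presumably the purpose of the pause described in the text.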
 Because the movement relay point 403 is a specific position in the display area 101 stored in advance in the storage unit 23, the load on the movement path determination unit 22a or the storage unit 23 can be reduced compared with having the movement path determination unit 22a compute the movement path 400 between a movement start point 401 and a movement end point 402 that change with every situation, or with storing in the storage unit 23 in advance, and reading out, every possible movement path 400 between every possible movement start point 401 and movement end point 402. The movement relay point 403 is not limited to a single point; a plurality of positions in the display area 101 may be stored in advance in the storage unit 23. In that case, the movement relay point 403 closest to the position where the moving image 200 was originally displayed may be selected from among them, or the movement relay point 403 closest to the position of the second object 312 may be selected.
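Selecting the nearest of several stored relay points can be sketched as below; the Euclidean metric is an assumption, since the publication does not specify how "closest" is measured:

```python
import math

def nearest_relay(reference, relay_points):
    """Return the pre-stored relay point closest to a reference
    position: either the moving image's original display position
    or the position of the second object 312."""
    return min(relay_points, key=lambda p: math.dist(reference, p))


# Stored relay points are fixed positions in the display area (illustrative values).
relays = [(10.0, 2.0), (50.0, 2.0), (90.0, 2.0)]
```

For example, `nearest_relay((12.0, 3.0), relays)` picks the left-most stored point, while `nearest_relay((88.0, 0.0), relays)` picks the right-most one.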
 (Third Embodiment)
 Next, reference is made to FIGS. 7 and 8. The third embodiment differs from the embodiments above in that an indication image 250 directed from the moving image 200 toward the second object 312 is displayed. FIG. 7 is a flowchart showing the main operating procedure of the HUD device 100 in the third embodiment, and FIG. 8 shows display examples of the moving image 200 in the third embodiment: FIG. 8(a) shows the moving image 200 before it is moved, FIG. 8(b) shows the indication image 250, and FIG. 8(c) shows the moving image 200 moving from the movement start point 401 to the movement end point 402 after the indication image 250 has been displayed, or while the indication image 250 remains displayed.
 In step S210 of FIG. 7 (the information acquisition step), the determination unit 21 acquires object information from the surroundings monitoring unit 31 via the object information acquisition unit 30, and then determines, based on the various information acquired in step S210, whether the movement condition of the moving image 200 is satisfied (step S220, the determination step).
 When the movement condition is not satisfied (NO in step S220), the display processing unit 22 updates the display without moving the moving image 200 away from the first object 311 (step S230).
 When the movement condition is satisfied (YES in step S220), the process proceeds to step S240 (the information image updating step): based on the various information input in step S210, the display processing unit 22 or the information image determination unit 22b determines the second information image 212 relating to the second object 312, the image generation unit 24 generates display data, and the information image 210 is switched from the first information image 211 to the second information image 212. For example, the display processing unit 22 or the information image determination unit 22b switches the information image 210 from "Caution ahead" (see FIG. 8(a)), the first information image 211 relating to the first object 311, to "Caution right" (see FIG. 8(b)), the second information image 212 relating to the second object 312.
 The process then proceeds to step S250 (the indication image display step): based on the position of the second object 312, the display processing unit 22 or the image generation unit 24 determines the position and shape of the indication image 250 so that it points from the position where the moving image 200 is currently displayed toward the second object 312, and the image generation unit 24 generates display data and displays the indication image 250. For example, the display processing unit 22 renders the indication image 250 as a ripple-like image directed from the current position of the moving image 200 toward the second object 312. The indication image 250 need only be an image that associates the moving image 200 with the position of the second object 312, so it is not limited to a ripple-like image; it may be, for example, a line segment, a dotted line, or a group of images arranged between the moving image 200 and the second object 312. The indication image 250 may also be an animation in which one or more images move between the moving image 200 and the second object 312, such as a ripple traveling from the moving image 200 toward the second object 312.
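A minimal indication image need only tie the moving image to the second object's position; one illustrative representation is the connecting segment together with the unit direction a ripple animation would travel along (the dictionary layout is an assumption, not the publication's data format):

```python
import math

def indication_segment(image_pos, object_pos):
    """Describe an indication image as the line segment from the
    moving image's current position to the second object, plus the
    unit direction from the former toward the latter."""
    dx = object_pos[0] - image_pos[0]
    dy = object_pos[1] - image_pos[1]
    length = math.hypot(dx, dy)
    # Degenerate case: image and object coincide, so there is no direction.
    direction = (dx / length, dy / length) if length else (0.0, 0.0)
    return {"from": image_pos, "to": object_pos, "direction": direction}


seg = indication_segment((0.0, 0.0), (3.0, 4.0))
```

A line, dotted line, or moving ripple would all be drawn along this same segment; only the rendering differs.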
 The process then proceeds to step S260 (the moving step): based on the position of the second object 312, the display processing unit 22 or the movement path determination unit 22a determines the movement path 400 so that the moving image 200 approaches the second object 312 (in other words, moves away from the first object 311), and the image generation unit 24 generates display data and moves the moving image 200 along the movement path 400. For example, the display processing unit 22 or the movement path determination unit 22a sets the position where the moving image 200 is currently displayed as the movement start point 401, sets a predetermined position determined with reference to the position of the second object 312 as the movement end point 402, and determines the line segment connecting the movement start point 401 and the movement end point 402 as the movement path 400.
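The line-segment path of step S260 can be sketched as a simple linear interpolation between the start and end points; sampling the segment into discrete animation frames is an illustrative choice, not something the publication prescribes:

```python
def line_path(start, end, steps=10):
    """Sample the segment from the movement start point 401 (the
    current display position) to the movement end point 402 (a
    position derived from the second object) into steps + 1 frames."""
    (x0, y0), (x1, y1) = start, end
    return [(x0 + (x1 - x0) * i / steps,
             y0 + (y1 - y0) * i / steps) for i in range(steps + 1)]


path = line_path((0.0, 0.0), (10.0, 5.0), steps=5)
```

Feeding each sampled point to the display in sequence produces the animated movement along the straight movement path 400.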
 Provided that the indication image 250 of step S250 is displayed before the movement of the moving image 200 in step S260 is completed, steps S240 to S260 may be executed in any order or simultaneously; however, the processing unit 20 or the display processing unit 22 preferably completes the switch of the information image 210 to the second information image 212 (step S240, the information image updating step) by the time the movement of the moving image 200 (step S260, the moving step) is completed. In that case, the new information image 210 (the second information image 212) is visible before the moving image 200 moves, so the viewer can take the action the information image 210 calls for sooner. The processing unit 20 or the display processing unit 22 also preferably starts displaying the indication image 250 (step S250, the indication image display step) after the switch of the information image 210 to the second information image 212 (step S240, the information image updating step) is completed. In that case, the indication image 250 becomes visible after the information image 210 has been switched, so the reason the indication image 250 is displayed can be recognized immediately from the information image 210.
 (Fourth Embodiment)
 Next, reference is made to FIGS. 9 and 10. The fourth embodiment differs from the embodiments above in that the moving image 200 is first moved from the movement start point 401 to the movement relay point 403, after which an indication image 250 directed from the moving image 200 toward the second object 312 is displayed. FIG. 9 is a flowchart showing the main operating procedure of the HUD device 100 in the fourth embodiment, and FIG. 10 shows display examples of the moving image 200 in the fourth embodiment: FIG. 10(a) shows the moving image 200 before it is moved, FIG. 10(b) shows the moving image 200 moving from the movement start point 401 to the movement relay point 403, FIG. 10(c) shows the indication image 250, and FIG. 10(d) shows the moving image 200 moving from the movement relay point 403 to the movement end point 402 after the indication image 250 has been displayed, or while the indication image 250 remains displayed.
 In step S310 of FIG. 9, the determination unit 21 acquires object information on the objects 310 in the surroundings 300 of the host vehicle 1, and determines that the movement condition of the moving image 200 is satisfied (YES in step S320 of FIG. 9) when the notification priority for another, second object 312 becomes higher than the notification priority for the first object 311 located ahead of the host vehicle 1 (or when the notification priority for the first object 311 falls below a predetermined threshold). Subsequently, in step S340, the information image determination unit 22b determines the second information image 212 relating to the second object 312 and switches the information image 210 from the first information image 211 ("Caution ahead" in FIG. 10(a)) to the second information image 212 ("Caution right" in FIG. 10(b)). In step S350, the movement path determination unit 22a determines the first movement path 410 from the movement start point 401 toward the movement relay point 403, and the processing unit 20 or the display processing unit 22 moves the moving image 200 along the first movement path 410 to the movement relay point 403, as shown in FIG. 10(b).
 The process then proceeds to step S360 (the indication image display step): based on the position of the second object 312, the display processing unit 22 or the image generation unit 24 determines the position and shape of the indication image 250 so that it points from the position where the moving image 200 is currently displayed (the movement relay point 403) toward the second object 312, and the image generation unit 24 generates display data and displays the indication image 250.
 After that, the determination unit 21 acquires the object information on the objects 310 in the surroundings 300 of the host vehicle 1 in step S410 of FIG. 9, and determines that the movement condition of the moving image 200 is satisfied (YES in step S420 of FIG. 9) when the notification priority for the second object 312 rises further and exceeds a predetermined threshold. Subsequently, in step S440, the information image determination unit 22b updates the information image 210; since the information image 210 has already been changed to the second information image 212 relating to the second object 312 by the earlier processing (step S340), this step is effectively skipped. In step S450, the movement path determination unit 22a determines the second movement path 420 from the movement relay point 403 toward the movement end point 402, and the moving image 200 is moved along the second movement path 420 to the movement end point 402, as shown in FIG. 10(d). During this movement of the moving image 200, the indication image 250 may be hidden, or the moving image 200 may be moved while the indication image 250 remains displayed.
 Because the movement relay point 403 is a specific position in the display area 101 stored in advance in the storage unit 23, the load on the display processing unit 22 or the storage unit 23 can be reduced compared with having the display processing unit 22 (the image generation unit 24) generate an indication image 250 between a movement start point 401 and a movement end point 402 that change with every situation, or with storing in the storage unit 23 in advance, and reading out, every possible indication image 250 between every possible movement start point 401 and movement end point 402. The movement relay point 403 is not limited to a single point; a plurality of positions in the display area 101 may be stored in advance in the storage unit 23. In that case, the movement relay point 403 closest to the position where the moving image 200 was originally displayed may be selected from among them, or the movement relay point 403 closest to the position of the second object 312 may be selected.
 [Modification]
 The present invention is not limited by the embodiments and drawings above. Changes (including deletion of components) may be made to the embodiments and drawings as appropriate without departing from the gist of the present invention. One example of a modification is described below.
 In the embodiments above, the display processing unit 22 or the movement path determination unit 22a determines the movement path 400 based on the position of the second object 312, but the invention is not limited to this. As a modification, the display processing unit 22 or the movement path determination unit 22a may determine the movement path 400 based on the position of the first object 311 so that the moving image 200 moves away from the first object 311. Moving the moving image 200 away from the first object 311 while switching the information image 210 from the first information image 211 to the second information image 212 draws visual attention away from the first object 311 and directs it to the second object 312, which the second information image 212 indicates and which is present at another position.
1: Host vehicle
2: Projection-receiving member
3: Eye box
4: Eye point
10: Display
20: Processing unit
21: Determination unit
22: Display processing unit
22a: Movement path determination unit
22b: Information image determination unit
23: Storage unit
24: Image generation unit
30: Object information acquisition unit
31: Surroundings monitoring unit
40: Operation information acquisition unit
41: Operation detection unit
70: Input/output interface (object information acquisition unit, operation information acquisition unit)
90: Display control device
100: Head-up display device (HUD device)
101: Display area
200: Moving image
210: Information image
211: First information image
212: Second information image
220: Filled image
250: Indication image
300: Surroundings
310: Object
311: First object
312: Second object
320: Visual obstacle
400: Movement path
401: Movement start point
402: Movement end point
403: Movement relay point
500: Navigation system (external device)
600: Cloud server (external device)
700: Vehicle ECU (external device)
L: Display light
M: Display image

Claims (15)

  1.  A display control method for a moving image (200) viewed superimposed on the foreground of a vehicle (1), the method comprising:
     an information acquisition step (S10, S110, S210, S310, S410) of acquiring first object information including at least a position of a first object (311) present around the vehicle, and second object information including at least a position of a second object (312) present around the vehicle;
     a determination step (S20, S120, S220, S320, S420) of determining whether a movement condition for moving the moving image is satisfied; and
     the following steps, executed when the movement condition is satisfied:
     an information image updating step (S40, S140, S240, S340, S440) of switching an information image (210) included in the moving image from a first information image (211) relating to the first object information to a second information image (212) relating to the second object information; and
     a moving step (S50, S150, S260, S350, S450) of moving the moving image along a movement path (400) away from the first object, based on at least one of the acquired position of the first object and the acquired position of the second object.
  2.  The display control method according to claim 1, further comprising an indication image display step (S250, S360, S460) of displaying, before the moving step is completed, an indication image (250) that indicates, in the vicinity of the moving image, the position where the second object is present.
  3.  The display control method according to claim 1 or 2, wherein the moving step includes a first moving step (S50, S350) of moving the moving image to a movement relay point (403) away from the first object, and a second moving step (S150, S450) of moving the moving image from the movement relay point toward the second object.
  4.  The display control method according to claim 3, comprising an indication image display step (S360) of displaying, after the first moving step, an indication image (250) that indicates, in the vicinity of the moving image, the position where the second object is present.
  5.  The display control method according to any one of claims 1 to 4, wherein the information image updating step is completed before the moving step is completed.
  6.  The display control method according to claim 5, wherein the information image updating step is completed before the moving step is started.
  7.  The display control method according to any one of claims 1 to 6, wherein, in the moving step, when the moving image overlaps the first object or the second object, the visibility of the moving image is reduced.
  8.  A display control device for displaying a moving image (200) viewed superimposed on the foreground of a vehicle (1), the device comprising:
     an object information acquisition unit (30) that acquires first object information including at least a position of a first object (311) present around the vehicle, and second object information including at least a position of a second object (312);
     a determination unit (21) that determines whether a movement condition for moving the moving image is satisfied; and
     a display processing unit (22) that, when the movement condition is satisfied, switches an information image (210) included in the moving image from a first information image (211) relating to the first object information to a second information image (212) relating to the second object information, and moves the moving image along a movement path (400) away from the first object.
  9.  The display control device according to claim 8, wherein, when moving the moving image, the display processing unit (22) moves the moving image to a movement relay point (403) away from the first object (311) and then moves the moving image from the movement relay point toward the second object.
  10.  The display control device according to claim 8 or 9, wherein the display processing unit (22) displays an indication image that indicates, in the vicinity of the moving image, the position where the second object is present.
  11.  The display control device according to claim 10, wherein the display processing unit (22) displays the indication image indicating, in the vicinity of the moving image, the position where the second object is present before the movement of the moving image is completed.
  12.  The display control device according to any one of claims 8 to 11, wherein the display processing unit completes the switching of the information image before the movement of the moving image is completed.
  13.  The display control device according to claim 12, wherein the display processing unit completes the switching of the information image before the movement of the moving image starts.
  14.  The display control device according to any one of claims 8 to 13, wherein the display processing unit reduces the visibility of the moving image when it is determined that the moving image will overlap the first object or the second object on the movement path.
  15.  A head-up display device comprising:
     the display control device (90) according to any one of claims 8 to 14; and
     a display (10) that displays the moving image under control of the display control device (90).
PCT/JP2018/043322 2017-12-21 2018-11-26 Display control method, display control device, and heads-up display device WO2019123976A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
DE112018006514.6T DE112018006514T5 (en) 2017-12-21 2018-11-26 Method for controlling the display, device for controlling the display and head-up display
JP2019560905A JP7216898B2 (en) 2017-12-21 2018-11-26 Display control method, display control device and head-up display device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017244956 2017-12-21
JP2017-244956 2017-12-21

Publications (1)

Publication Number Publication Date
WO2019123976A1 true WO2019123976A1 (en) 2019-06-27

Family

ID=66994129

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/043322 WO2019123976A1 (en) 2017-12-21 2018-11-26 Display control method, display control device, and heads-up display device

Country Status (3)

Country Link
JP (1) JP7216898B2 (en)
DE (1) DE112018006514T5 (en)
WO (1) WO2019123976A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003291688A (en) * 2002-04-03 2003-10-15 Denso Corp Display method, driving support device and program
JP2016172469A (en) * 2015-03-16 2016-09-29 株式会社デンソー Image forming apparatus
WO2017022047A1 (en) * 2015-08-03 2017-02-09 三菱電機株式会社 Display control device, display device, and display control method
JP2017090996A (en) * 2015-11-04 2017-05-25 株式会社デンソー On-vehicle display apparatus

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021037894A (en) * 2019-09-04 2021-03-11 株式会社デンソー Display control device and display control program
JP7255429B2 (en) 2019-09-04 2023-04-11 株式会社デンソー Display controller and display control program
WO2021153454A1 (en) * 2020-01-29 2021-08-05 Ricoh Company, Ltd. Image display apparatus, image display method, program, and non-transitory recording medium
CN117162777A (en) * 2023-11-03 2023-12-05 西安信飞特信息科技有限公司 Content presentation method, device, equipment and storage medium
CN117162777B (en) * 2023-11-03 2024-02-20 西安信飞特信息科技有限公司 Content presentation method, device, equipment and storage medium

Also Published As

Publication number Publication date
JPWO2019123976A1 (en) 2021-01-28
DE112018006514T5 (en) 2020-09-24
JP7216898B2 (en) 2023-02-02

Similar Documents

Publication Publication Date Title
US11008016B2 (en) Display system, display method, and storage medium
CN108140311B (en) Parking assistance information display method and parking assistance device
JP6459205B2 (en) Vehicle display system
JP4807263B2 (en) Vehicle display device
US10410423B2 (en) Display control device for controlling stereoscopic display of superimposed display object, display system, display control method and computer readable medium
WO2016147547A1 (en) Image generation device
US20220107201A1 (en) Display control device and non-transitory computer-readable storage medium
US20220130296A1 (en) Display control device and display control program product
JP6443716B2 (en) Image display device, image display method, and image display control program
US20220118983A1 (en) Display control device and display control program product
WO2019123976A1 (en) Display control method, display control device, and heads-up display device
JP6504431B2 (en) IMAGE DISPLAY DEVICE, MOBILE OBJECT, IMAGE DISPLAY METHOD, AND PROGRAM
CN210191316U (en) Display system for vehicle and vehicle
JP2020093766A (en) Vehicle control device, control system and control program
JP6748947B2 (en) Image display device, moving body, image display method and program
WO2020189238A1 (en) Vehicular display control device, vehicular display control method, and vehicular display control program
CN112822348B (en) Vehicle-mounted imaging system
US20230373309A1 (en) Display control device
JP7127565B2 (en) Display control device and display control program
US11345364B2 (en) Attention calling device and attention calling method
JP2016197312A (en) Drive support display device
JP7456329B2 (en) Vehicle display control device, vehicle display control system, and vehicle display control method
WO2017179174A1 (en) Moving body surroundings display method and moving body surroundings display apparatus
US20220065649A1 (en) Head-up display system
JP2020073365A (en) Information providing device, information providing method, and control program for providing information

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18890319

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019560905

Country of ref document: JP

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 18890319

Country of ref document: EP

Kind code of ref document: A1