WO2017029761A1 - Display control device, display device, and route guidance method - Google Patents

Display control device, display device, and route guidance method

Info

Publication number
WO2017029761A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
image
guidance
display object
point
Prior art date
Application number
PCT/JP2015/073383
Other languages
English (en)
Japanese (ja)
Inventor
英一 有田
下谷 光生
Original Assignee
三菱電機株式会社
Priority date
Filing date
Publication date
Application filed by 三菱電機株式会社
Priority to PCT/JP2015/073383 priority Critical patent/WO2017029761A1/fr
Priority to JP2017535219A priority patent/JP6444514B2/ja
Publication of WO2017029761A1 publication Critical patent/WO2017029761A1/fr

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01C21/36: Input/output arrangements for on-board computers

Definitions

  • the present invention relates to a display control device for controlling a display unit and a display control method using the display unit.
  • A HUD (head-up display) is known as a device that displays an image on the windshield of a vehicle.
  • Such a HUD can display the image as a virtual image that looks as if it were actually present in the real landscape in front of the vehicle when viewed from the driver.
  • Patent Document 1 discloses a HUD that displays guidance information (such as an arrow indicating a right / left turn direction) indicating the content of route guidance performed by a navigation device as a virtual image.
  • With such a HUD, guidance information can be displayed at a position that is easy for the driver to see. However, when intersections follow one another at short intervals, for example, it is difficult to determine which intersection the guidance information corresponds to, and the driver may be unsure at which intersection to turn.
  • the present invention has been made in view of the above-described problems, and an object of the present invention is to provide a technology that allows a driver to easily recognize a guide point corresponding to guide information provided by a navigation device.
  • A display control device according to the present invention is a display control device that controls a virtual image display unit, the virtual image display unit being capable of displaying a display object, which is a virtual image visible from the driver's seat of the vehicle through the windshield of the vehicle, at a virtual image position defined by a virtual image direction, which is the direction of the virtual image with respect to a specific position of the vehicle, and a virtual image distance, which is the distance to the virtual image.
  • The display control device includes a guidance information acquisition unit that acquires, from a navigation device of the vehicle, guidance information that is route guidance information corresponding to a guidance point ahead of the vehicle, a guidance point position acquisition unit that acquires the relative position of the guidance point with respect to the vehicle, and a control unit that controls the display of the virtual image display unit.
  • The control unit displays a guidance information display object, which is a display object indicating the content of the guidance information, and a guidance point display object, which is a display object indicating the position of the guidance point.
  • A display control device according to another aspect of the present invention is a display control device that controls an image display unit, and includes an image acquisition unit that acquires a front image that is a captured image of the area ahead of the vehicle, a guidance information acquisition unit that acquires, from a navigation device of the vehicle, guidance information that is route guidance information corresponding to a guidance point ahead of the vehicle, a guidance point position acquisition unit that acquires the position of the guidance point in the front image, and a control unit that controls the display of the image display unit.
  • The control unit displays a guidance information display object, which is an image showing the content of the guidance information, on the image display unit, and displays a guidance point display object, which is an image showing the position of the guidance point in the front image, superimposed on the front image on the image display unit.
  • According to the present invention, the guidance point display object indicating the position of the guidance point is displayed together with the guidance information display object indicating the content of the guidance information, so the driver can easily grasp which guidance point the guidance information corresponds to.
  • FIG. 1 is a block diagram illustrating the configuration of the display control device according to the first embodiment, followed by diagrams for explaining the virtual images (display objects) displayed by the virtual image display unit.
  • A flowchart shows the operation of the display control device according to the first embodiment, together with a diagram showing an example of the positional relationship between the host vehicle and a guidance point and a diagram showing a display example of a guidance information display object and a guidance point display object.
  • Diagrams illustrate examples of the hardware configuration of the display control device according to Embodiment 1.
  • A block diagram illustrates the configuration of the display control device according to Embodiment 2, and a flowchart illustrates its operation.
  • A block diagram illustrates the configuration of the display control device according to a fourth embodiment.
  • FIG. 1 is a diagram showing a configuration of a display control apparatus 1 according to Embodiment 1 of the present invention.
  • the display control device 1 will be described as being mounted on a vehicle.
  • a vehicle equipped with the display control device 1 is referred to as “own vehicle”.
  • the display control device 1 controls the virtual image display unit 2 that displays an image as a virtual image in the driver's field of view, such as HUD.
  • the navigation device 3 of the host vehicle is connected to the display control device 1.
  • the virtual image display unit 2 may be configured integrally with the display control device 1. That is, the display control device 1 and the virtual image display unit 2 may be configured as one display device.
  • the virtual image displayed by the virtual image display unit 2 will be described with reference to FIGS.
  • the virtual image displayed by the virtual image display unit 2 is referred to as a “display object”.
  • the virtual image display unit 2 can display the display object 100 at a position that can be viewed through the windshield 201 from the position of the driver 200 of the host vehicle.
  • The position where the display object 100 is actually displayed is on the windshield 201, but when viewed from the driver 200, the display object 100 appears as if it exists in the landscape in front of the vehicle.
  • the apparent display position of the display object 100 viewed from the driver 200 is referred to as a “virtual image position”.
  • The virtual image position is defined by the "virtual image direction", which is the direction of the display object 100 with respect to the position of the driver 200, and the "virtual image distance", which is the apparent distance from the position of the driver 200 to the display object 100.
  • the reference point for defining the virtual image position is preferably the position of the driver 200, but may be any specific position of the vehicle that can be regarded as the position of the driver 200. Or the windshield 201 may be used as a reference point.
  • The virtual image direction substantially corresponds to the position of the display object 100 on the windshield 201 as viewed from the driver 200, and is expressed, for example, by the deviation angles (θi, φi) of a three-dimensional polar coordinate system as shown in FIG. 3.
  • The virtual image distance substantially corresponds to the apparent distance to the display object 100 as viewed from the driver 200, and is expressed, for example, by the radius vector (ri) of the three-dimensional polar coordinate system shown in FIG. 3.
  • The driver 200 can visually recognize the display object 100 at the virtual image position represented by the three-dimensional polar coordinates (ri, θi, φi) by adjusting the focal distance Fd of his or her eyes to the virtual image distance (ri), as illustrated by the sketch below.
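  • The following sketch illustrates how a point given in the vehicle frame maps to such a virtual image position (ri, θi, φi). It is an illustrative reconstruction rather than code from the patent; the axis convention (x forward, y left, z up) and the function name are assumptions.

```python
import math

def to_virtual_image_position(x, y, z):
    """Convert a target point in the vehicle frame (origin at the reference point,
    e.g. the driver 200; x: forward, y: left, z: up -- assumed axes) into the
    virtual image distance r and the deviation angles (theta, phi) of a
    three-dimensional polar coordinate system."""
    r = math.sqrt(x * x + y * y + z * z)                   # virtual image distance r_i
    theta = math.degrees(math.atan2(y, x))                 # horizontal deviation angle
    phi = math.degrees(math.atan2(z, math.hypot(x, y)))    # vertical deviation angle
    return r, theta, phi

# Example: a display object meant to appear 30 m ahead, 2 m to the left and 1 m up.
r, theta, phi = to_virtual_image_position(30.0, 2.0, 1.0)
print(f"virtual image distance {r:.1f} m, direction ({theta:.1f} deg, {phi:.1f} deg)")
```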
  • The navigation device 3 searches for an optimum route from the current position of the host vehicle to the destination, determines that route as the planned travel route, and provides route guidance to the user so that the host vehicle travels along the planned travel route.
  • the route guidance performed by the navigation device 3 is performed by instructing the driver of the traveling direction of the host vehicle when the host vehicle approaches a branch point or an intersection on the travel route.
  • a point where the navigation apparatus 3 performs route guidance is referred to as a “guidance point”, and information indicating the content of the route guidance (such as an instruction of a traveling direction) is referred to as “guidance information”.
  • the guide points may be junctions (intersections of expressways, etc.), lane change points, and the like in addition to junctions and intersections.
  • the display control device 1 includes a guidance information acquisition unit 11, a guidance point position acquisition unit 12, a display object storage unit 13, and a control unit 14.
  • the guidance information acquisition unit 11 acquires, from the navigation device 3, guidance information indicating the content of route guidance at a guidance point in front of the host vehicle (next guidance point).
  • When the host vehicle approaches a guidance point, the navigation device 3 outputs guidance information corresponding to that guidance point to the guidance information acquisition unit 11. "When the host vehicle approaches a guidance point" may be defined, for example, as the case where the distance from the host vehicle to the guidance point becomes smaller than a predetermined threshold, or as the case where the expected time until arrival at the guidance point becomes smaller than a predetermined threshold.
  • the guidance point position acquisition unit 12 acquires the relative position of the guidance point corresponding to the guidance information acquired by the guidance information acquisition unit 11 with respect to the host vehicle.
  • the relative position of the guide point with respect to the host vehicle can be calculated from the position information of the guide point included in the map information stored in the navigation device 3 and the position information of the host vehicle acquired by the navigation device 3.
  • Here, the navigation device 3 calculates the relative position of the guide point, and the guide point position acquisition unit 12 acquires the calculation result.
  • Alternatively, the guide point position acquisition unit 12 may acquire the position information of the guide point and of the host vehicle from the navigation device 3 and calculate the relative position of the guide point from that information, as in the sketch below.
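  • As a rough illustration of the latter alternative, the following sketch converts the latitude and longitude of a guide point into a position relative to the host vehicle using a local flat-earth (equirectangular) approximation. The function name and the approximation are assumptions for illustration only, not part of the patent.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius; adequate over guidance distances

def relative_position(vehicle_lat, vehicle_lon, vehicle_heading_deg,
                      point_lat, point_lon):
    """Return (forward_m, left_m): the offsets of a guide point relative to the
    host vehicle, with the heading measured clockwise from north."""
    d_lat = math.radians(point_lat - vehicle_lat)
    d_lon = math.radians(point_lon - vehicle_lon)
    # Offsets in an east/north frame (equirectangular approximation).
    east = EARTH_RADIUS_M * d_lon * math.cos(math.radians(vehicle_lat))
    north = EARTH_RADIUS_M * d_lat
    # Rotate into the vehicle frame.
    h = math.radians(vehicle_heading_deg)
    forward = north * math.cos(h) + east * math.sin(h)
    left = north * math.sin(h) - east * math.cos(h)
    return forward, left

# Example: a guide point roughly 150 m ahead and slightly to the right of a
# vehicle heading due north.
print(relative_position(35.6800, 139.7600, 0.0, 35.68135, 139.76002))
```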
  • the display object storage unit 13 stores image data of a plurality of display objects in advance.
  • the display object stored in the display object storage unit 13 includes an image for indicating the content of the guidance information (for example, an arrow graphic indicating a right turn or a left turn) and an image for indicating the position of the guidance point ( Specific examples will be described later).
  • the control unit 14 comprehensively controls each component of the display control device 1 and controls display of a virtual image by the virtual image display unit 2.
  • the control unit 14 can display the display object stored in the display object storage unit 13 in the visual field of the driver 200 using the virtual image display unit 2.
  • the control unit 14 can control the virtual image position of the display object displayed by the virtual image display unit 2.
  • the virtual image position is represented by the virtual image direction and the virtual image distance.
  • Here, it is assumed that the virtual image display unit 2 does not have a function of changing the virtual image distance of a display object; that is, the virtual image distance is constant.
  • the control unit 14 may have a function of generating a display object that is not stored in the display object storage unit 13, or may have a function of deforming the display object.
  • For example, if the virtual image distance is fixed at 30 m, the control unit 14 can cause the virtual image display unit 2 to display the display object 100 with a virtual image distance of 30 m. In that case, it appears to the driver that the display object 100 exists 30 m ahead through the windshield 201, as shown in FIG. 5 (the element 202 is the steering wheel of the host vehicle).
  • FIG. 6 is a flowchart showing the operation.
  • When the display control device 1 obtains, from the navigation device 3, the guidance information corresponding to the guidance point ahead of the host vehicle, it operates to display a display object indicating the content of the guidance information and a display object indicating the position of the guidance point corresponding to that guidance information.
  • the display object indicating the content of the guidance information is referred to as “guidance information display object”
  • the display object indicating the position of the guidance point is referred to as “guidance point display object”.
  • The guidance information acquisition unit 11 acquires the guidance information corresponding to the guidance point from the navigation device 3 (step S11). The guidance point position acquisition unit 12 acquires the relative position of that guidance point with respect to the host vehicle (step S12).
  • the control unit 14 acquires data of a guidance information display object (for example, a graphic of an arrow indicating a right turn or a left turn) from the display object storage unit 13 and displays it using the virtual image display unit 2 (step S13).
  • the virtual image position of the guidance information display object may be an arbitrary position as long as it is easy for the driver to see.
  • For example, the virtual image position of the guidance information display object may be fixed at a certain position on the windshield, or it may be a position that appears in the vicinity of the guidance point when viewed from the driver (for example, above the guidance point).
  • The control unit 14 acquires the data of the guide point display object from the display object storage unit 13 and displays it at a position corresponding to the guide point as viewed from the driver (step S14).
  • the display control device 1 repeatedly executes the above operation.
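  • The flow of steps S11 to S14 can be summarized as in the sketch below. The object and method names (navigation, display, object_storage and their methods) are illustrative assumptions; only the order of the steps reflects the operation described above.

```python
def route_guidance_cycle(navigation, display, object_storage):
    """One iteration of the route guidance operation (steps S11-S14), as a sketch."""
    guidance = navigation.get_guidance_for_next_point()        # step S11: guidance information
    if guidance is None:
        return                                                 # no guidance point being approached
    forward_m, left_m = navigation.get_guide_point_position()  # step S12: relative position

    # Step S13: display the guidance information display object (e.g. a turn arrow).
    arrow = object_storage.load(f"arrow_{guidance}")           # guidance is e.g. "right_turn"
    display.show(arrow, at=display.fixed_position())

    # Step S14: display the guidance point display object at the position that,
    # as seen from the driver, corresponds to the guide point.
    pole = object_storage.load("pole")
    display.show(pole, at=display.position_for(forward_m, left_m))
```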
  • intersection XP1 located in front of the host vehicle S is a guide point
  • intersection XP2 that is not a guide point exists in front of the intersection XP1.
  • the content of the guidance information corresponding to the intersection XP1 is “right turn”.
  • In this case, as shown in FIG. 8, a guidance information display object 31 (an arrow figure) indicating a right turn is displayed in the driver's field of view through the windshield 201, and a guidance point display object 32 indicating the guidance point is displayed at the position that corresponds to the intersection XP1 as viewed from the driver.
  • the guide point display object 32 is a virtual image of a nonexistent object that appears to be present at the guide point when viewed from the driver.
  • FIG. 8 is an example in which the guide point display object 32 is a virtual image of a pole.
  • the guide point display object 32 is not limited to the virtual image of the pole.
  • a virtual image of a nonexistent wall that appears at the guide point (XP1) when viewed from the driver may be used as the guide point display object 32.
  • a virtual image in which a road ahead of the guide point (XP1) when viewed from the driver is colored may be used as the guide point display object 32.
  • Since the guidance point display object 32 is displayed at the position corresponding to the intersection XP1 as viewed from the driver, the driver can easily recognize that the guidance point corresponding to the guidance information (the right-turn instruction) indicated by the guidance information display object 31 is the intersection XP1.
  • FIG. 11 and FIG. 12 are diagrams each showing an example of the hardware configuration of the display control device 1.
  • The guidance information acquisition unit 11, the guidance point position acquisition unit 12, and the control unit 14 in the display control device 1 are realized by, for example, the processing circuit 40 illustrated in FIG. 11. That is, the processing circuit 40 includes the guidance information acquisition unit 11 that acquires, from the navigation device 3, guidance information corresponding to a guidance point ahead of the host vehicle, the guidance point position acquisition unit 12 that acquires the relative position of the guide point with respect to the host vehicle, and the control unit 14 that displays a guidance information display object 31 indicating the content of the guidance information and a guidance point display object 32 indicating the position of the guidance point.
  • Dedicated hardware may be applied to the processing circuit 40, or a processor that executes a program stored in memory (a CPU (Central Processing Unit), central processing unit, processing unit, arithmetic unit, microprocessor, microcomputer, or DSP (Digital Signal Processor)) may be applied.
  • The processing circuit 40 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC, an FPGA, or a combination thereof.
  • The functions of the guidance information acquisition unit 11, the guidance point position acquisition unit 12, and the control unit 14 may be realized by separate processing circuits 40, or the functions of these units may be realized together by a single processing circuit 40.
  • FIG. 12 shows a hardware configuration of the display control device 1 when the processing circuit 40 is a processor.
  • In this case, the functions of the guidance information acquisition unit 11, the guidance point position acquisition unit 12, and the control unit 14 are realized in combination with software or the like (software, firmware, or both software and firmware).
  • Software or the like is described as a program and stored in the memory 42.
  • The processor 41 serving as the processing circuit 40 implements the functions of the respective units by reading and executing the program stored in the memory 42. That is, the display control device 1 includes the memory 42 for storing a program that, when executed by the processing circuit 40, results in acquiring the guidance information corresponding to the guidance point ahead of the host vehicle from the navigation device 3, acquiring the relative position of the guide point with respect to the host vehicle, displaying the guidance information display object 31 indicating the content of the guidance information, and displaying the guidance point display object 32 indicating the position of the guidance point.
  • In other words, this program can be said to cause a computer to execute the procedures and methods of the guidance information acquisition unit 11, the guidance point position acquisition unit 12, and the control unit 14.
  • The memory 42 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), or to an HDD (Hard Disk Drive) or the like.
  • The present invention is not limited to this; a part of the guide information acquisition unit 11, the guide point position acquisition unit 12, and the control unit 14 may be realized by dedicated hardware and another part by software or the like.
  • For example, the function of the control unit 14 can be realized by a processing circuit as dedicated hardware, while the functions of the other parts can be realized by the processing circuit 40 as the processor 41 reading and executing the program stored in the memory 42.
  • the processing circuit 40 can realize the functions described above by hardware, software, or the like, or a combination thereof.
  • The display object storage unit 13 is configured by the memory 42; it may share a single memory 42 with the program described above, or a plurality of memories 42 may be used.
  • The display control device described above can also be applied to a display control system constructed by appropriately combining an installed navigation device that can be mounted on a vehicle, a Portable Navigation Device, a communication terminal (for example, a mobile terminal such as a mobile phone, a smartphone, or a tablet), application functions installed in these devices, and a server. In this case, each function or each component of the display control device described above may be distributed among the devices constituting the system, or may be concentrated in one of the devices.
  • FIG. 13 is a block diagram illustrating a configuration of the display control apparatus 1 according to the second embodiment.
  • the display control apparatus 1 is further connected with a moving body detection unit 4 that detects a moving body (a vehicle, a motorcycle, etc.) that exists in front of the host vehicle.
  • The moving body detection unit 4 can detect the presence of a moving body using, for example, a millimeter wave radar of the own vehicle, a DSRC (Dedicated Short Range Communication) unit, or a camera (for example, an infrared camera).
  • The display control apparatus 1 of the second embodiment has a configuration in which a moving body position acquisition unit 15 is added to the configuration of FIG. 1.
  • the moving body position acquisition unit 15 acquires the relative position of the moving body detected by the moving body detection unit 4 with respect to the guide point.
  • The relative position between the guide point and the moving body can be calculated from the information on the relative position of the guide point with respect to the host vehicle acquired by the guide point position acquiring unit 12 and the information on the position of the moving body (its relative position with respect to the host vehicle) obtained from the output data of the millimeter wave radar of the host vehicle, the output data of the DSRC unit, or the analysis result of the video captured by the camera.
  • the display control device 1 of the second embodiment is also realized by the hardware configuration shown in FIG. 11 or FIG. That is, the function of the moving body position acquisition unit 15 is also realized by the processing circuit 40 or the processor 41 that executes a program.
  • an intersection XP1 located in front of the host vehicle S is a guide point
  • an intersection XP2 that is not a guide point exists in front of the intersection XP1.
  • the content of the guidance information corresponding to the intersection XP1 is “turn right”.
  • In the second embodiment, the guide point display object 32 does not directly indicate the position of the guide point; instead, the position of the guide point is indicated indirectly by attaching the guide point display object 32 to each moving body existing around the guide point.
  • Specifically, the guide point display object 32 includes a first display object 32a (a "○" mark) indicating moving bodies (the vehicles 51 and 52) located nearer than the guide point, and a second display object 32b (an "×" mark) indicating a moving body (the vehicle 53) located farther than the guide point. The driver can therefore see that the guide point exists between the vehicles 51 and 52, to which the first display object 32a is attached, and the vehicle 53, to which the second display object 32b is attached. The same effect as in the first embodiment can thus be obtained.
  • This embodiment is effective when it is difficult for the driver to visually recognize the guide point due to the presence of a moving body in front of the host vehicle.
  • FIG. 15 is a flowchart showing the operation.
  • The guidance information acquisition unit 11 acquires the guidance information corresponding to the guidance point from the navigation device 3 (step S21). The guidance point position acquisition unit 12 acquires the relative position of that guidance point with respect to the host vehicle (step S22).
  • The control unit 14 acquires the data of the guidance information display object from the display object storage unit 13 and displays it using the virtual image display unit 2 (step S23).
  • the virtual image position of the guidance information display object may be an arbitrary position as long as it is easy for the driver to see.
  • The moving body position acquisition unit 15 checks whether the moving body detection unit 4 has detected a moving body in front of the host vehicle (step S24). If a moving body is detected (YES in step S24), the moving body position acquisition unit 15 acquires the relative position of each moving body with respect to the guide point ahead of the host vehicle (step S25). The control unit 14 then displays, at a position corresponding to each moving body as viewed from the driver, a guidance point display object (the first display object or the second display object) according to the relative position of that moving body with respect to the guidance point (step S26).
  • If no moving body is detected by the moving body detection unit 4 (NO in step S24), the above steps S25 and S26 are not performed.
  • the display control device 1 repeatedly executes the above operation.
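  • Steps S25 and S26 amount to classifying each detected moving body as nearer or farther than the guide point and attaching the corresponding mark to it. The following sketch shows this classification under the assumption that each moving body is reported as a distance and a direction from the host vehicle; the names are illustrative, not taken from the patent.

```python
def attach_guide_point_markers(moving_bodies, guide_point_distance_m,
                               display, object_storage):
    """Attach the first display object (a circle mark) to moving bodies nearer than
    the guide point and the second display object (a cross mark) to moving bodies
    farther than it (sketch of steps S25-S26)."""
    nearer_mark = object_storage.load("circle_mark")    # first display object 32a
    farther_mark = object_storage.load("cross_mark")    # second display object 32b

    for distance_m, direction in moving_bodies:
        mark = nearer_mark if distance_m < guide_point_distance_m else farther_mark
        # Show the mark where the moving body appears from the driver's viewpoint.
        display.show(mark, at=display.position_above(direction, distance_m))
```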
  • the guide point display object 32 is attached to each mobile body according to the relative position of the mobile body with respect to the guide point.
  • The guide point display object 32 is not limited to the example shown in FIG. 14. For example, as shown in FIG. 16, the guide point display object 32 may be attached to a vehicle 54 that travels from the guide point (intersection XP1) in a direction matching the content of the guide information (a right turn).
  • A virtual image of the guidance information display object 31 may also be used as the guidance point display object 32. That is, as shown in FIG. 17, a virtual image of the guidance information display object 31 (an arrow indicating a right turn) may be attached, as the guidance point display object 32, to a vehicle 54 traveling from the guidance point (intersection XP1) in a direction matching the content of the guidance information (a right turn).
  • When no moving body is detected by the moving body detection unit 4 (NO in step S24), there is no target to which the guidance point display object 32 can be attached, so the guidance point display object 32 is not displayed. However, even in such a case, the guide point display object 32 may be displayed.
  • For example, the display control device 1 may display, as the guide point display object 32, virtual images 51v, 52v, and 53v of nonexistent vehicles that appear to exist around the intersection XP1 (the guide point), together with the first display object 32a indicating the vehicle virtual images 51v and 52v that appear nearer than the intersection XP1 and the second display object 32b indicating the vehicle virtual image 53v that appears farther than the intersection XP1. Thereby, even when there is no moving body in front of the host vehicle, the same effect as in the case of FIG. 14 can be obtained.
  • Alternatively, the display control device 1 may display, as the guidance point display object 32, a virtual image 54v of a nonexistent vehicle that appears to travel from the intersection XP1 (the guidance point) in a direction matching the content of the guidance information (a right turn), together with a display object (a "○" mark) indicating the vehicle virtual image 54v.
  • the virtual image display unit 2 has been described as not having the function of changing the virtual image distance, but the virtual image display unit 2 may have the function.
  • the virtual image display unit 2 can select and set the virtual image distance of the display object from 25 m, 50 m, and 75 m.
  • In this case, the control unit 14 can cause the virtual image display unit 2 to display a first display object 101a having a virtual image distance of 25 m, a second display object 101b having a virtual image distance of 50 m, and a third display object 101c having a virtual image distance of 75 m.
  • It then appears to the driver that, through the windshield 201, the first display object 101a exists 25 m ahead, the second display object 101b 50 m ahead, and the third display object 101c 75 m ahead (the element 202 is the steering wheel of the host vehicle).
  • For example, the virtual image distance of the first display object 32a (the "○" mark) attached to the moving bodies (the vehicles 51 and 52) located nearer than the intersection XP1 may be made shorter than the distance from the own vehicle to the intersection XP1, and the virtual image distance of the second display object 32b (the "×" mark) attached to the moving body (the vehicle 53) located farther than the intersection XP1 may be made longer than the distance from the host vehicle to the intersection XP1.
  • Alternatively, the virtual image distance of the first display object 32a attached to the vehicle 51 may be made equal to the distance from the own vehicle to the vehicle 51, the virtual image distance of the first display object 32a attached to the vehicle 52 equal to the distance from the own vehicle to the vehicle 52, and the virtual image distance of the second display object 32b attached to the vehicle 53 equal to the distance from the host vehicle to the vehicle 53.
  • the virtual image distance of the guide point display object 32 attached to the vehicle 54 may be equal to the distance from the host vehicle to the vehicle 54.
  • Likewise, the virtual image distances of the virtual images of the moving bodies that appear to be located nearer than the intersection XP1 (the virtual images 51v and 52v) and of the first display object 32a (the "○" mark) attached to them may be made shorter than the distance from the vehicle to the intersection XP1, while the virtual image distances of the virtual image of the moving body that appears to be located farther than the intersection XP1 (the virtual image 53v) and of the second display object 32b (the "×" mark) attached to it may be made longer than the distance from the host vehicle to the intersection XP1.
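  • When only a few discrete virtual image distances are available (25 m, 50 m and 75 m in the example above), the control unit has to pick one of them for each display object. The sketch below simply picks the selectable distance closest to the distance of the object the display object is attached to; this selection rule is an assumption used for illustration, not a rule specified in the patent.

```python
SELECTABLE_DISTANCES_M = (25.0, 50.0, 75.0)

def choose_virtual_image_distance(target_distance_m,
                                  selectable=SELECTABLE_DISTANCES_M):
    """Return the selectable virtual image distance closest to the target distance."""
    return min(selectable, key=lambda d: abs(d - target_distance_m))

# Example: an object 30 m ahead maps to the 25 m plane, one 68 m ahead to the 75 m plane.
print(choose_virtual_image_distance(30.0))  # -> 25.0
print(choose_virtual_image_distance(68.0))  # -> 75.0
```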
  • FIG. 22 is a diagram showing a configuration of the display control apparatus 1 according to Embodiment 4 of the present invention.
  • the display control device 1 according to the fourth embodiment controls an image display unit 5 that displays an image on a screen, such as a liquid crystal display device.
  • the navigation device 3 of the host vehicle is connected to the display control device 1.
  • the image display unit 5 may be configured integrally with the display control device 1. That is, the display control device 1 and the image display unit 5 may be configured as one display device.
  • the display control device 1 is connected to an in-vehicle camera 6 that captures a forward image of the host vehicle (hereinafter referred to as a “front image”).
  • The display control device 1 includes an image acquisition unit 16 that acquires the forward image captured by the in-vehicle camera 6.
  • the guide point position acquisition unit 12 operates to acquire the position of the guide point in the front image by image analysis processing. Since the other elements in FIG. 22 are the same as those shown in FIG. 1, the description thereof is omitted here.
  • the display control device 1 of the fourth embodiment is also realized by the hardware configuration shown in FIG. 11 or FIG. That is, the function of the image acquisition unit 16 is also realized by the processing circuit 40 or the processor 41 that executes a program.
  • FIG. 23 is a flowchart showing the operation.
  • the display control device 1 displays a forward image acquired from the in-vehicle camera 6 on the image display unit 5.
  • When the display control device 1 acquires, from the navigation device 3, guidance information corresponding to the guidance point (the next guidance point) ahead of the host vehicle, it displays an image indicating the content of the guidance information on the image display unit 5, and displays an image indicating the position of the guidance point corresponding to the guidance information on the image display unit 5 so as to be superimposed on the front image.
  • each image that the display control device 1 displays on the image display unit 5 together with the forward image is referred to as a “display object”.
  • an image indicating the content of the guidance information is referred to as a “guidance information display object”
  • an image indicating the position of the guidance point is referred to as a “guidance point display object”.
  • The image acquisition unit 16 acquires a front image captured by the in-vehicle camera 6 (step S31), and the control unit 14 causes the image display unit 5 to display the front image (step S32). The control unit 14 also checks whether the guidance information acquisition unit 11 has acquired, from the navigation device 3, guidance information corresponding to the guidance point (the next guidance point) ahead of the own vehicle (step S33). If the guidance information acquisition unit 11 has not acquired guidance information corresponding to the next guidance point (NO in step S33), the process returns to step S31.
  • If the guidance information has been acquired (YES in step S33), the guide point position acquisition unit 12 acquires the position of the guide point in the forward image (step S34).
  • the control unit 14 acquires data of a guidance information display object (for example, an arrow graphic representing a right turn or a left turn) from the display object storage unit 13, and displays it on the image display unit 5 (step S35).
  • the display position of the guidance information display object may be an arbitrary position as long as it is easy for the driver to see. For example, it may be fixed at a fixed position on the screen of the image display unit 5, or may be in the vicinity of the guidance point (for example, above the guidance point) in the forward video displayed on the image display unit 5.
  • The control unit 14 acquires the data of the guidance point display object from the display object storage unit 13 and displays it at a position corresponding to the guidance point in the forward image displayed on the image display unit 5 (step S36), and the process returns to step S31.
  • the display control device 1 repeatedly executes the above operation.
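  • As an illustration of steps S35 and S36, the sketch below composites a simple marker, standing in for the guidance point display object 32, into a front image held as a NumPy array. The marker shape, colour and function name are assumptions and are not taken from the patent.

```python
import numpy as np

def draw_guide_point_marker(front_image, cx, cy, half_size=12, color=(0, 200, 255)):
    """Superimpose a filled square marker onto the front image at pixel (cx, cy).
    `front_image` is an H x W x 3 uint8 array, e.g. a frame from the in-vehicle camera."""
    h, w, _ = front_image.shape
    y0, y1 = max(0, cy - half_size), min(h, cy + half_size)
    x0, x1 = max(0, cx - half_size), min(w, cx + half_size)
    front_image[y0:y1, x0:x1] = color   # overwrite the region with the marker colour
    return front_image

# Example: mark a guide point located at pixel (420, 310) in a dummy 640x480 frame.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
frame = draw_guide_point_marker(frame, cx=420, cy=310)
```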
  • an intersection XP1 located in front of the host vehicle S is a guide point
  • an intersection XP2 that is not a guide point exists in front of the intersection XP1.
  • the content of the guidance information corresponding to the intersection XP1 is “turn right”.
  • In this case, a guidance information display object 31 (an arrow figure) indicating a right turn is displayed on the image display unit 5, and a guidance point display object 32 indicating the guidance point is displayed at the position corresponding to the intersection XP1 in the forward image.
  • the guide point display object 32 is an image of a nonexistent object that appears to exist at the guide point.
  • FIG. 24 is an example in which the guide point display object 32 is an image of a pole.
  • However, the guide point display object 32 is not limited to the image of a pole.
  • For example, an image of a nonexistent wall that appears to stand at the guide point (XP1) may be used as the guide point display object 32.
  • Alternatively, an image in which the road ahead of the guide point (XP1) is colored may be used as the guide point display object 32.
  • the guidance point display object 32 is displayed at a position corresponding to the intersection XP1 in the forward image displayed on the image display unit 5. Accordingly, the driver can easily recognize that the guidance point corresponding to the right turn instruction represented by the guidance information display object 31 is the intersection XP1 by looking at the image display unit 5.
  • the display control apparatus 1 can also be realized by the hardware configuration shown in FIG. 11 or FIG.
  • That is, the processing circuit 40 includes the guidance information acquisition unit 11 that acquires, from the navigation device 3, guidance information corresponding to the guidance point ahead of the host vehicle, the guidance point position acquisition unit 12 that acquires the position of the guidance point in the front image, and the control unit 14 that displays a guidance information display object 31 indicating the content of the guidance information on the image display unit 5 and displays a guidance point display object 32 indicating the position of the guidance point in the forward image superimposed on the forward image on the image display unit 5.
  • Alternatively, the processor 41 reads out and executes a program stored in the memory 42, thereby executing a step of acquiring, from the navigation device 3, guidance information corresponding to a guidance point ahead of the host vehicle, a step of acquiring the position of the guidance point in the front image, and a step of displaying a guidance information display object 31 indicating the content of the guidance information on the image display unit 5 and displaying a guidance point display object 32 indicating the position of the guidance point in the forward image superimposed on the forward image on the image display unit 5.
  • The display control device of this embodiment can also be applied to a display control system constructed by appropriately combining a navigation device that can be mounted on a vehicle, a portable navigation device, a communication terminal (for example, a mobile terminal such as a mobile phone, a smartphone, or a tablet), application functions installed in these devices, and a server.
  • In this case, each function or each component of the display control device described above may be distributed among the devices constituting the system, or may be concentrated in one of the devices.
  • FIG. 27 is a block diagram illustrating a configuration of the display control apparatus 1 according to the fifth embodiment.
  • the display control apparatus 1 according to the fifth embodiment is further connected to a moving body detection unit 4 that detects a moving body (vehicle, motorcycle, etc.) that exists in front of the host vehicle.
  • the moving body detection unit 4 is the same as that shown in the second embodiment (FIG. 13).
  • The display control apparatus 1 of the fifth embodiment has a configuration in which a moving body position acquisition unit 15, which acquires the relative position with respect to the guide point of the moving body detected by the moving body detection unit 4, is added to the configuration of FIG. 22.
  • the moving body position acquisition unit 15 acquires the relative position of the moving body detected by the moving body detection unit 4 with respect to the guide point, and performs image analysis processing to determine the position of the moving body in the front image. Works to get.
  • an intersection XP1 located in front of the host vehicle S is a guide point
  • an intersection XP2 that is not a guide point exists in front of the intersection XP1.
  • the content of the guidance information corresponding to the intersection XP1 is “turn right”.
  • In the fifth embodiment, the guide point display object 32 does not directly indicate the position of the guide point; instead, the position of the guide point is indicated indirectly by attaching the guide point display object 32 to each moving body existing around the guide point.
  • Specifically, the guide point display object 32 includes a first display object 32a (a "○" mark) indicating moving bodies (the vehicles 51 and 52) located nearer than the guide point, and a second display object 32b (an "×" mark) indicating a moving body (the vehicle 53) located farther than the guide point. The driver can therefore see that the guide point exists between the vehicles 51 and 52, to which the first display object 32a is attached, and the vehicle 53, to which the second display object 32b is attached. The same effect as in the fourth embodiment can thus be obtained.
  • This embodiment is effective when it is difficult for the in-vehicle camera 6 to capture the guide point due to the presence of a moving body in front of the host vehicle.
  • FIG. 29 is a flowchart showing the operation.
  • The image acquisition unit 16 acquires a front image captured by the in-vehicle camera 6 (step S41), and the control unit 14 causes the image display unit 5 to display the front image (step S42). The control unit 14 also checks whether the guidance information acquisition unit 11 has acquired, from the navigation device 3, guidance information corresponding to the guidance point (the next guidance point) ahead of the own vehicle (step S43). If the guidance information acquisition unit 11 has not acquired guidance information corresponding to the next guidance point (NO in step S43), the process returns to step S41.
  • If the guidance information has been acquired (YES in step S43), the control unit 14 acquires the data of the guidance information display object from the display object storage unit 13 and displays it on the image display unit 5 (step S44).
  • the display position of the guidance information display object may be an arbitrary position as long as it is easy for the driver to see.
  • The moving body position acquisition unit 15 checks whether the moving body detection unit 4 has detected a moving body in front of the host vehicle (step S45). If a moving body is detected (YES in step S45), the moving body position acquisition unit 15 acquires the relative position of each moving body with respect to the guide point ahead of the host vehicle (step S46). Furthermore, the moving body position acquisition unit 15 acquires the position of each moving body in the front image by image analysis processing (step S47). The control unit 14 then displays, at the position corresponding to each moving body in the front image displayed on the image display unit 5, a guidance point display object (the first display object or the second display object) according to the relative position of that moving body with respect to the guidance point (step S48).
  • If no moving body is detected by the moving body detection unit 4 (NO in step S45), the above steps S46 to S48 are not performed.
  • the display control device 1 repeatedly executes the above operation.
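  • One way the position of a radar-detected moving body could be located in the front image (step S47) is to project its relative position through a pinhole camera model instead of, or in addition to, image analysis. The sketch below uses assumed camera intrinsics and is an illustrative alternative, not the method specified in the patent.

```python
def project_to_front_image(forward_m, left_m, up_m,
                           fx=800.0, fy=800.0, cx=320.0, cy=240.0):
    """Project a point given in the vehicle frame (metres ahead / left / above the
    camera) into pixel coordinates of the front image, using an ideal pinhole model
    with assumed intrinsics (fx, fy: focal lengths in pixels; cx, cy: principal point).
    Returns None if the point is behind the camera."""
    if forward_m <= 0.0:
        return None
    u = cx - fx * (left_m / forward_m)   # image x grows to the right, so "left" is negated
    v = cy - fy * (up_m / forward_m)     # image y grows downward, so "up" is negated
    return u, v

# Example: a vehicle 40 m ahead and 3 m to the left, roughly at road height.
print(project_to_front_image(40.0, 3.0, -1.2))  # -> (260.0, 264.0)
```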
  • the guide point display object 32 is attached to the moving body according to the relative position of the moving body with respect to the guide point.
  • The guide point display object 32 is not limited to the example shown in FIG. 28. For example, as shown in FIG. 30, the guide point display object 32 may be attached to a vehicle 54 that travels from the guide point (intersection XP1) in a direction matching the content of the guidance information (a right turn). In this way, the driver can grasp not only the position of the guidance point but also the traveling direction of the host vehicle at the guidance point.
  • The image of the guidance information display object 31 may also be used as the guidance point display object 32. That is, as shown in FIG. 31, a guidance information display object 31 (an arrow indicating a right turn) may be attached, as the guidance point display object 32, to a vehicle 54 that travels from the guidance point (intersection XP1) in a direction matching the content of the guidance information (a right turn).
  • Alternatively, the display control device 1 may display, as the guidance point display object 32, images 51i, 52i, and 53i of nonexistent vehicles that appear to exist around the guidance point (intersection XP1), together with the first display object 32a indicating the vehicle images 51i and 52i that appear nearer than the intersection XP1 and the second display object 32b indicating the vehicle image 53i that appears farther than the intersection XP1. Thereby, even when there is no moving body in front of the host vehicle, the same effect as in the case of FIG. 28 can be obtained.
  • Similarly, the display control device 1 may display, as the guide point display object 32, an image 54i of a nonexistent vehicle that appears to advance from the intersection XP1 in a direction matching the content of the guide information, together with a display object (a "○" mark) indicating the vehicle image 54i. Thereby, even when there is no moving body in front of the host vehicle, the same effect as in the case of FIG. 30 can be obtained.
  • the image of the guidance information display object 31 may also be used as the guidance point display object 32.
  • In the present invention, the embodiments can be freely combined with each other within the scope of the invention, and each embodiment can be appropriately modified or omitted.
  • For example, the display control device 1 may be configured to include the navigation device 3 (that is, the display control device 1 may have a navigation function).
  • 1 display control device, 2 virtual image display unit, 3 navigation device, 4 moving body detection unit, 5 image display unit, 6 in-vehicle camera, 11 guidance information acquisition unit, 12 guidance point position acquisition unit, 13 display object storage unit, 14 control unit, 15 moving body position acquisition unit, 16 image acquisition unit, 31 guidance information display object, 32 guidance point display object, 32a first display object, 32b second display object, 40 processing circuit, 41 processor, 42 memory, 100 display object, 200 driver, 201 windshield.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A virtual image display unit 2 is capable of displaying display objects, that is, virtual images visible from the driver's seat of a vehicle through the windshield, at virtual image positions specified by virtual image distances, that is, the distances to the virtual images, and virtual image directions, that is, the directions of the virtual images with a specific position of the vehicle as a reference. A display control device 1 includes: a guidance information acquisition unit 11 that acquires, from a navigation device 3 of the vehicle, guidance information, that is, route guidance information corresponding to a guidance point ahead of the vehicle; a guidance point position acquisition unit 12 that acquires the relative position of the guidance point with respect to the vehicle; and a control unit 14 that controls the display on the virtual image display unit. The control unit 14 displays a guidance information display object, that is, a display object indicating the content of the guidance information, and a guidance point display object, that is, a display object indicating the position of the guidance point.
PCT/JP2015/073383 2015-08-20 2015-08-20 Dispositif de commande d'affichage, dispositif d'affichage et procédé de guidage d'itinéraire WO2017029761A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2015/073383 WO2017029761A1 (fr) 2015-08-20 2015-08-20 Dispositif de commande d'affichage, dispositif d'affichage et procédé de guidage d'itinéraire
JP2017535219A JP6444514B2 (ja) 2015-08-20 2015-08-20 表示制御装置、表示装置および経路案内方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/073383 WO2017029761A1 (fr) 2015-08-20 2015-08-20 Dispositif de commande d'affichage, dispositif d'affichage et procédé de guidage d'itinéraire

Publications (1)

Publication Number Publication Date
WO2017029761A1 true WO2017029761A1 (fr) 2017-02-23

Family

ID=58051292

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/073383 WO2017029761A1 (fr) 2015-08-20 2015-08-20 Dispositif de commande d'affichage, dispositif d'affichage et procédé de guidage d'itinéraire

Country Status (2)

Country Link
JP (1) JP6444514B2 (fr)
WO (1) WO2017029761A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10281795A (ja) * 1997-04-07 1998-10-23 Toyota Motor Corp 車両用案内表示装置
JP2005201635A (ja) * 2004-01-13 2005-07-28 Denso Corp 車両用ナビゲーションシステム
JP2010173619A (ja) * 2009-02-02 2010-08-12 Denso Corp ヘッドアップディスプレイ装置
JP2011203053A (ja) * 2010-03-25 2011-10-13 Equos Research Co Ltd 運転アシストシステム


Also Published As

Publication number Publication date
JP6444514B2 (ja) 2018-12-26
JPWO2017029761A1 (ja) 2017-10-19

Similar Documents

Publication Publication Date Title
JP6381807B2 (ja) 表示制御装置、表示装置および表示制御方法
US10029700B2 (en) Infotainment system with head-up display for symbol projection
CN104848863B (zh) 产生所关注位置的扩增视图
JP6486474B2 (ja) 表示制御装置、表示装置及び表示制御方法
JP6176541B2 (ja) 情報表示装置、情報表示方法及びプログラム
JP6448804B2 (ja) 表示制御装置、表示装置および表示制御方法
JP2016090344A (ja) ナビゲーション装置、及びナビゲーションプログラム
WO2019224922A1 (fr) Dispositif de commande d'affichage tête haute, système d'affichage tête haute et procédé de commande d'affichage tête haute
JP6945933B2 (ja) 表示システム
JP6456504B2 (ja) 表示制御装置、表示装置及び表示制御方法
JP2014211431A (ja) ナビゲーション装置、及び、表示制御方法
JP6186905B2 (ja) 車載表示装置およびプログラム
JP6444508B2 (ja) 表示制御装置およびナビゲーション装置
JP6494764B2 (ja) 表示制御装置、表示装置及び表示制御方法
JP2017056747A (ja) 表示制御装置、表示装置および音像位置制御方法
JP6448806B2 (ja) 表示制御装置、表示装置及び表示制御方法
JP6482431B2 (ja) 表示制御装置、表示装置および表示制御方法
JP6328366B2 (ja) ヘッドアップディスプレイの表示制御装置および表示制御方法
JP6444514B2 (ja) 表示制御装置、表示装置および経路案内方法
WO2019111307A1 (fr) Dispositif et procédé de commande d'affichage
JPWO2018167815A1 (ja) 表示制御装置及び表示制御方法
JP7217804B2 (ja) 表示制御装置および表示制御方法
JP6432312B2 (ja) ナビゲーションシステム、ナビゲーション方法、及びナビゲーションプログラム
JP6388723B2 (ja) 表示制御装置およびナビゲーション装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15901743

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017535219

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15901743

Country of ref document: EP

Kind code of ref document: A1