WO2017013792A1 - Display control device and navigation device

Display control device and navigation device

Info

Publication number
WO2017013792A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
display
unit
guidance
road marking
Prior art date
Application number
PCT/JP2015/070991
Other languages
French (fr)
Japanese (ja)
Inventor
井上 裕二 (Yuji INOUE)
境 賢治 (Kenji SAKAI)
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to JP2017529419A (patent JP6444508B2)
Priority to PCT/JP2015/070991 (published as WO2017013792A1)
Publication of WO2017013792A1



Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network

Definitions

  • the present invention relates to a display control device for a head-up display (hereinafter referred to as HUD) that displays display information superimposed on a driver's forward view, and a navigation device equipped with the display control device.
  • Patent Document 1 describes a display system that changes the display position of information by HUD according to the importance of information.
  • In this display system, information of high importance is displayed in a central display area of the HUD, where the driver can read it without diverting his or her gaze from the road ahead, while information of low importance is displayed in a peripheral display area of the HUD. The highly important information displayed in the central display area can therefore be viewed without the driver moving his or her line of sight.
  • Patent Document 2 describes a HUD device that presents images with parallax to the driver's left and right eyes so as to display information three-dimensionally.
  • In this HUD device, map information such as roads, intersections, or buildings along the roads can be displayed three-dimensionally, superimposed at the focal position of the driver's forward view. The driver can thus view the information without shifting the line of sight or refocusing the eyes.
  • Patent Document 3 describes a HUD device that detects an object that is being watched by a driver and displays information so as to form an image overlapping the object.
  • In this HUD device, information is dynamically superimposed on the driver's gaze object, which can change from moment to moment, so that the driver can view the information without moving the line of sight or adjusting the focal length of the eyes.
  • Patent Document 4 describes a system that, only for facilities facing the road currently being traveled, superimposes information on each such facility at the position corresponding to that facility in the real scene or in an image representing the landscape. Each piece of information is displayed with perspective, as though attached to the portion of the corresponding facility that faces the road on which the vehicle is currently traveling. This improves the visibility of both the facility information and the actual scenery.
  • With a HUD, information such as vehicle information and route guidance is displayed superimposed at the focal position of the driver's forward field of view, so the driver can quickly view the information with little line-of-sight movement.
  • However, when an object the driver should pay attention to in the forward view and display information unrelated to that object are displayed overlapping each other, both the object and the information become difficult to see.
  • It is also desirable that information be displayed by the HUD at a timing at which the driver can easily recognize its content while the vehicle is traveling.
  • The display system described in Patent Document 1 narrows down, according to the importance of the information, the information to be displayed in the central display area where the driver can read it without diverting the line of sight, and thereby improves the visibility of the displayed information.
  • However, even when information of high importance is displayed in the central display area of the HUD, if an object the driver should pay attention to in the forward view and information irrelevant to that object are displayed overlapping each other, the visibility of both the object and the information is reduced.
  • Moreover, to read information displayed in the peripheral display area, the driver needs to move his or her line of sight considerably, which reduces attention to the area ahead of the vehicle.
  • The HUD device described in Patent Document 2 can display information three-dimensionally in the driver's forward field of view, but an object the driver should focus on in the forward field of view and information unrelated to that object may still overlap, in which case both the object and the information are difficult to see.
  • The HUD device described in Patent Document 3 dynamically superimposes information on the driver's gaze object, which can change from moment to moment, but gives no consideration to the display form of the superimposed information, so both the gaze object and the information may be difficult to see.
  • The device described in Patent Document 4 is premised on displaying information about a facility facing the road currently being traveled on the portion of that facility that faces the road, and cannot be applied to route guidance.
  • Furthermore, none of Patent Documents 1 to 3 considers the timing at which information is displayed on the HUD. Consequently, when information such as route guidance is displayed by the HUD at a timing that is difficult for the driver to recognize, the display may end before the driver has grasped the content of the information, or the driver's attention to driving may be reduced because attention is diverted to recognizing the content of the information displayed by the HUD.
  • Patent Document 4 does describe the timing at which the display form of facility information is changed from a state in which it is attached to the facility to a state in which it faces the front, but, as mentioned above, route guidance is not considered.
  • An object of the present invention is to provide a display control device and a navigation device that can display display information related to route guidance at a timing that makes it easy to see and recognize information content.
  • The display control device according to the present invention includes a sign information acquisition unit, a timing determination unit, a display information generation unit, and an output control unit.
  • the sign information acquisition unit acquires road sign information related to route guidance from road sign information detected in front of the vehicle.
  • the timing determination unit determines whether it is the timing of voice guidance in route guidance.
  • the display information generation unit generates display information for highlighting the road marking information acquired by the marking information acquisition unit so that the road marking information overlaps with the corresponding real road marking information in the forward field of view.
  • the output control unit outputs display information to the display device in synchronization with the voice guidance timing determined by the timing determination unit.
  • According to the present invention, since the display information of the road marking information related to the route guidance is highlighted so as to overlap the corresponding actual road marking information in the forward view, an easy-to-see route guidance display, emphasized in a natural form, can be realized. Furthermore, since the road marking information related to the route guidance is displayed in synchronization with the timing of the voice guidance, the content recognized by the driver's hearing matches the content recognized visually, so the content of the display information is easy to recognize.
  • FIG. 2 is a block diagram showing the functional configuration of the display control device according to Embodiment 1.
  • FIG. 3 is a block diagram showing the hardware configuration of the display control device according to Embodiment 1.
  • FIG. 4 is a flowchart showing an outline of the operation of the display control device according to Embodiment 1.
  • FIG. 5 is a flowchart showing the detailed operation of the display control device according to Embodiment 1.
  • FIG. 6 is a view showing the scenery ahead of the vehicle seen through the front window before voice guidance.
  • FIG. 7 is a view showing the scenery ahead of the vehicle seen through the front window at the timing at which voice guidance is given.
  • FIG. 8 is a block diagram showing the functional configuration of the display control device according to Embodiment 2 of the present invention.
  • FIG. 9 is a flowchart showing the detailed operation of the display control device according to Embodiment 2.
  • FIG. 10 is a view showing the scenery ahead of the vehicle seen through the front window in which additional display information is displayed in the empty space of an actual guide sign.
  • FIG. 11 is a view showing the scenery ahead of the vehicle seen through the front window in which additional display information is displayed alongside an actual guide sign.
  • FIG. 12 is a view showing the scenery ahead of the vehicle seen through the front window in which a detour around a traffic jam is displayed in the empty space of an actual guide sign and a traffic jam notice is displayed alongside the guide sign.
  • FIG. 13 is a block diagram showing the functional configuration of the display control device according to Embodiment 3 of the present invention.
  • FIG. 14 is a flowchart showing the detailed operation of the display control device according to Embodiment 3.
  • FIG. 15 is a view showing the scenery ahead of the vehicle seen through the front window in which information unrelated to the route guidance is mask-displayed.
  • FIG. 16 is a block diagram showing the functional configuration of the display control device according to Embodiment 4 of the present invention.
  • FIG. 17 is a flowchart showing the detailed operation of the display control device according to Embodiment 4.
  • FIG. 18 is a block diagram showing the functional configuration of the display control device according to Embodiment 5 of the present invention.
  • FIG. 19 is a flowchart showing the detailed operation of the display control device according to Embodiment 5.
  • FIG. 1 is a functional block diagram showing a configuration of a navigation device 1 according to Embodiment 1 of the present invention.
  • The navigation device 1 is mounted on a vehicle and performs route guidance to a destination, and includes an information acquisition unit 2, a control unit 3, a notification unit 4, an input unit 5, a map data storage unit 6, a route calculation unit 7, and a route guidance unit 8.
  • The operation of each part of the navigation device 1 is comprehensively controlled by the control unit 3.
  • The information acquisition unit 2 is a collective name for the components that acquire the current position of the host vehicle, information on the surroundings of the host vehicle, and information detected by other vehicles, and includes a current position detection unit 20, a wireless communication unit 21, and a surrounding information detection unit 22.
  • the current position detection unit 20 is a component that detects the current position of the host vehicle, and is connected to a GPS (Global Positioning System) reception unit 20a, a direction detection unit 20b, and a pulse detection unit 20c.
  • the GPS receiver 20a receives a GPS signal from a GPS satellite, and detects the current position (for example, latitude and longitude) of the host vehicle based on the GPS signal.
  • The direction detection unit 20b detects the traveling direction (for example, the heading) of the host vehicle, and includes, for example, a gyro sensor and a direction sensor.
  • the pulse detector 20c detects a pulse signal corresponding to the number of revolutions per unit time of the axle of the own vehicle, and detects the traveling speed and the traveling distance of the own vehicle based on the pulse signal.
  • The current position detection unit 20 corrects the current position detected by the GPS receiving unit 20a based on the traveling direction, traveling speed, and traveling distance of the host vehicle detected by the direction detection unit 20b and the pulse detection unit 20c. This makes it possible to detect the current position of the host vehicle accurately.
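  • Purely for illustration, this correction can be pictured as simple dead reckoning: the GPS fix is advanced along the detected heading by the distance obtained from the pulse signal. The sketch below assumes a flat-Earth approximation; the function name and parameters are hypothetical and are not taken from this document.

        import math

        def corrected_position(gps_lat, gps_lon, heading_deg, travelled_m):
            """Advance a GPS fix along the detected heading by the travelled
            distance (simple dead reckoning; illustrative stand-in for the
            correction done by the current position detection unit 20)."""
            earth_radius_m = 6_371_000.0
            d_north = travelled_m * math.cos(math.radians(heading_deg))
            d_east = travelled_m * math.sin(math.radians(heading_deg))
            lat = gps_lat + math.degrees(d_north / earth_radius_m)
            lon = gps_lon + math.degrees(
                d_east / (earth_radius_m * math.cos(math.radians(gps_lat))))
            return lat, lon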
  • the wireless communication unit 21 wirelessly communicates with a communication device mounted on another vehicle, and receives information indicating a situation around the own vehicle detected by the other vehicle.
  • For example, road marking information detected by another vehicle traveling ahead of the host vehicle is acquired in this way.
  • the road marking information is information relating to road traffic guidance.
  • For a guide sign, the position of the guide sign and the content of the guide sign constitute the road marking information.
  • The contents of a guide sign include place names indicating destinations and arrows indicating directions.
  • For a road marking, the position and the content of the marking constitute the road marking information.
  • The contents of a road marking include an arrow marking indicating the traffic division by traveling direction, such as a straight-ahead lane or a left/right-turn lane.
  • For an information board, the traffic information displayed on the board, such as “30 km to Osaka, traffic jam, required time 40 minutes”, is also road marking information.
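  • To make the notion of road marking information concrete, the sketch below shows one way such an item could be represented in software. The field names and values are illustrative assumptions rather than structures defined in this document.

        from dataclasses import dataclass
        from typing import Tuple

        @dataclass
        class RoadMarkingInfo:
            """One detected item of road marking information (hypothetical layout)."""
            kind: str                      # "guide_sign", "road_marking" or "info_board"
            position: Tuple[float, float]  # latitude/longitude of the sign or marking
            contents: Tuple[str, ...]      # e.g. ("Sannomiya", "right_arrow")

        # e.g. an information board reading "30 km to Osaka, traffic jam, required time 40 minutes"
        board = RoadMarkingInfo("info_board", (34.70, 135.20),
                                ("30 km to Osaka", "traffic jam", "required time 40 minutes"))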
  • the surrounding information detection unit 22 is connected to the external sensor 23 and detects information indicating the situation around the own vehicle from the information around the own vehicle detected by the external sensor 23.
  • the information indicating the situation around the host vehicle includes, for example, road marking information in front of the host vehicle.
  • the external sensor 23 is a sensor group that detects information around the host vehicle, and includes a camera 23a, an image processing unit 23b, a radar 23c, and a radar control unit 23d.
  • the camera 23a is a camera capable of photographing in the visible light region or the infrared light region.
  • the camera 23a is arranged in the vicinity of a room mirror on the vehicle interior side of the front window of the own vehicle, and photographs the outside of a predetermined range ahead of the own vehicle in the traveling direction through the front window.
  • As the camera 23a, a CCD (Charge Coupled Device) camera, a CMOS (Complementary Metal Oxide Semiconductor) camera, a stereo camera, or the like is used.
  • The image processing unit 23b processes the image captured by the camera 23a into image data that can be handled by the surrounding information detection unit 22.
  • For example, image data consisting of a two-dimensional array of pixels is generated by image processing such as filtering and binarization, and this image data is output to the surrounding information detection unit 22.
  • the radar 23c is a sensor that detects an object around the vehicle using a laser beam or a millimeter wave as a probe. For example, it is arranged near the nose portion of the vehicle body of the own vehicle or the front window on the vehicle interior side. Further, the radar 23c transmits a laser beam or millimeter wave transmission signal in the detection target direction under the control of the radar control unit 23d, and receives a reflection signal obtained by reflecting the transmission signal by an external object.
  • the detection target direction includes the traveling direction of the own vehicle.
  • the radar control unit 23d generates a beat signal obtained by mixing the reflected signal and the transmission signal received by the radar 23c, and outputs the beat signal to the ambient information detection unit 22.
  • the radar control unit 23d controls the operation of the radar 23c in accordance with a control command input from the surrounding information detection unit 22.
  • the surrounding information detection unit 22 determines whether or not the image data obtained from the external sensor 23 includes a predetermined detection target image.
  • the detection object is, for example, road marking information such as a guide sign, road marking, and information board in front of the host vehicle.
  • When a detection target is included, the surrounding information detection unit 22 calculates the distance between a reference position in the entire image indicated by the image data and the detection target (hereinafter referred to as the first distance).
  • the reference position is, for example, the center position in the horizontal direction of the entire image indicated by the image data.
  • The surrounding information detection unit 22 also calculates the distance between the detection target and the host vehicle (hereinafter referred to as the second distance) based on the beat signal input from the external sensor 23.
  • When the camera 23a is a stereo camera, the second distance can instead be calculated from the parallax of the detection target between the stereo images.
  • the surrounding information detection unit 22 calculates the relative position of the detection target object with respect to the position of the own vehicle in the horizontal direction based on the first distance and the second distance.
  • the relative position of the detection target is, for example, a relative position obtained from a difference in latitude and longitude coordinates.
  • the surrounding information detection unit 22 calculates the current position of the detection object based on the relative position of the detection object and the current position of the host vehicle.
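  • As an illustration of how the first distance (the lateral offset in the image) and the second distance (the range to the target) could be combined into an absolute position, the sketch below converts the pixel offset into a bearing via an assumed camera focal length and then offsets the vehicle position; the parameter names and the flat-Earth conversion are assumptions, not details given in this document.

        import math

        def target_position(own_lat, own_lon, heading_deg,
                            pixel_offset, focal_length_px, range_m):
            """Estimate the absolute position of a detection target from the
            first distance (pixel_offset, horizontal offset from the image
            center) and the second distance (range_m, radar or stereo range)."""
            bearing_deg = heading_deg + math.degrees(
                math.atan2(pixel_offset, focal_length_px))
            earth_radius_m = 6_371_000.0
            d_north = range_m * math.cos(math.radians(bearing_deg))
            d_east = range_m * math.sin(math.radians(bearing_deg))
            lat = own_lat + math.degrees(d_north / earth_radius_m)
            lon = own_lon + math.degrees(
                d_east / (earth_radius_m * math.cos(math.radians(own_lat))))
            return lat, lon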
  • When the detection object is a guide sign, a road marking, or an information board, the surrounding information detection unit 22 outputs the distance between the marking object and the host vehicle to the sign information acquisition unit 30 described later as part of the road marking information.
  • the notification unit 4 is a generic name for a configuration in which the navigation device 1 notifies the user of information, and includes a navigation display unit 4a, a HUD display unit 4b, a display control unit 4c, an audio output unit 4d, and an audio control unit 4e.
  • the navigation display unit 4a is a display on which map display and route guidance display are performed, and is arranged, for example, in the front center of the vehicle interior.
  • The HUD display unit 4b projects display information onto the front window or a combiner. However, since the road marking information related to the route guidance is displayed so as to overlap the corresponding real road marking information in the driver's forward field of view, it is desirable that the projection target be the front window.
  • the display control unit 4c causes the navigation display unit 4a to display the map image and the guidance image based on the image data including the map image and the guidance image input from the control unit 3. Further, the display control unit 4c causes the HUD display unit 4b to display the display information input from the control unit 3.
  • the HUD device 4A embodies the display device according to the present invention, and includes a HUD display unit 4b and a display control unit 4c.
  • the voice output unit 4d is an output unit that outputs a guidance voice, a warning sound, and the like, and is realized by a speaker mounted on the vehicle. Also, the voice control unit 4e causes the voice output unit 4d to output guidance voice, warning sound, and the like based on voice data such as guidance voice and warning sound input from the control unit 3.
  • the audio output device 4B embodies the audio output device of the present invention, and includes the audio output unit 4d and the audio control unit 4e described above. That is, in the audio output device 4B, the audio control unit 4e controlled by the control unit 3 outputs audio information to the audio output unit 4d to output audio.
  • the navigation device 1 controls the notification unit 4 to notify the driver of the own vehicle of information for assisting driving.
  • the input unit 5 is a component that receives information input from the user, and is realized by, for example, a push button device or a touch panel.
  • the touch panel may be integrated with the navigation display unit 4a.
  • A destination for route guidance from the current position is entered using the input unit 5. For example, when the user designates a point on the map displayed on the navigation display unit 4a, the input unit 5 accepts this point as the destination; when an address or telephone number is entered, the input unit 5 accepts it as destination information specifying the destination.
  • The map data storage unit 6 is a storage unit that stores map data, and is realized as a storage device such as an HDD (Hard Disk Drive) or a RAM (Random Access Memory). Note that the map data storage unit 6 may store map data acquired from outside the navigation device 1. For example, the map data storage unit 6 may store map data downloaded from an external device via a network, or map data read from a recording medium such as a DVD-ROM (Digital Versatile Disc-Read Only Memory) or a BD-ROM (Blu-ray (registered trademark) Disc-ROM).
  • the route calculation unit 7 calculates a route from the departure point to the destination based on the departure point, the destination, and the map data. For example, the current position of the host vehicle detected by the current position detection unit 20 is set as the departure place.
  • the destination may be a point received by the input unit 5 from the user, or may be a point registered in advance as a destination candidate.
  • The routes calculated by the route calculation unit 7 include, for example, a route with a short arrival time (time-priority route), a route with a short travel distance (distance-priority route), a route with low fuel consumption (fuel-priority route), and a route that travels on toll roads as much as possible.
  • the route guidance unit 8 provides route guidance for a route selected by the user (hereinafter, referred to as a scheduled travel route) among the routes calculated by the route calculation unit 7. For example, the route guidance unit 8 performs route guidance from the current position to the destination along the planned travel route by controlling the notification unit 4 to notify the driver of route guidance information.
  • FIG. 2 is a block diagram illustrating a functional configuration of the display control apparatus according to the first embodiment, and illustrates each function of the control unit 3 that is a function of the display control apparatus according to the first embodiment.
  • The control unit 3 includes a sign information acquisition unit 30, a timing determination unit 31, a display information generation unit 32, and an output control unit 33 as functions for controlling the display of the HUD device 4A.
  • the sign information acquisition unit 30 acquires road sign information related to route guidance from road sign information detected in front of the host vehicle.
  • Specifically, the sign information acquisition unit 30 classifies the road marking information detected in front of the vehicle by the surrounding information detection unit 22 into road marking information related to the route guidance performed by the route guidance unit 8 and road marking information not related to that guidance, and acquires the road marking information related to the route guidance.
  • As the classification method, for example, a method is adopted in which the guide signs, road markings, and information boards in the road marking information and their contents are compared with the route guidance information of the planned travel route to identify the information related to the route guidance.
  • the sign information acquisition unit 30 analyzes the image data obtained by the image processing of the image processing unit 23b via the surrounding information detection unit 22, and specifies the content of the road sign information.
  • When the road marking information is a guide sign, the contents written on the sign, such as place names and arrows, and the positions within the sign where those contents are written are specified.
  • When the road marking information is a road marking, the sign information acquisition unit 30 analyzes the image data of the road surface in front of the host vehicle acquired from the surrounding information detection unit 22 to identify the content of the road marking and the angle formed between the driver's line of sight and the road surface.
  • For example, when the route is guided to turn left at a guide point, the sign information acquisition unit 30 classifies the road marking indicating the left turn as road marking information related to the route guidance, and the road markings for the straight-ahead and right-turn lanes as road marking information not related to the route guidance. Similarly, when the route is guided toward “Sannomiya” at the guide point and “Sannomiya” and “Osaka” are identified as place names indicating directions on the guide sign, the sign information acquisition unit 30 classifies “Sannomiya” as road marking information related to the route guidance and “Osaka” as road marking information not related to the route guidance.
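  • A minimal sketch of this classification step, assuming the detected contents and the guidance information are available as plain strings (a hypothetical simplification; the document does not specify data formats):

        def classify_markings(detected_contents, guidance_terms):
            """Split detected road marking contents into items related and
            unrelated to the current route guidance."""
            related = [c for c in detected_contents if c in guidance_terms]
            unrelated = [c for c in detected_contents if c not in guidance_terms]
            return related, unrelated

        # classify_markings(["Sannomiya", "Osaka"], {"Sannomiya", "right"})
        # -> (["Sannomiya"], ["Osaka"])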
  • the timing determination unit 31 determines whether or not it is the timing at which the route guide unit 8 guides the planned travel route by voice.
  • a guidance point where voice guidance is performed is determined in advance.
  • the guide point is a point where it is necessary to determine the traveling direction of the vehicle on the planned travel route, and includes an intersection, an entrance / exit of an expressway, and the like.
  • The timing determination unit 31 acquires the route guidance information of the planned travel route used by the route guidance unit 8 and the current position of the host vehicle, and determines whether or not the host vehicle has come within a predetermined range of a guide point. When the host vehicle comes within this range, the timing determination unit 31 determines that it is the timing for voice guidance.
  • The predetermined range is defined in terms of time or distance between the guide point and the vehicle position, and is a range within which the output control unit 33 can display the display information related to the voice guidance on the HUD device 4A while that voice guidance is being output. In other words, the timing of voice guidance is a timing at which the period during which the voice guidance is output and the period during which the corresponding display information is displayed overlap.
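  • The predetermined range can thus be checked either by time or by distance to the guide point. The following is a rough sketch under assumed thresholds; the concrete values are not given in this document.

        def is_voice_guidance_timing(distance_to_guide_point_m, speed_mps,
                                     lead_time_s=8.0, lead_distance_m=100.0):
            """Return True once the host vehicle is within the predetermined
            range of the next guide point (illustrative time/distance margins)."""
            time_to_point_s = distance_to_guide_point_m / max(speed_mps, 0.1)
            return (time_to_point_s <= lead_time_s
                    or distance_to_guide_point_m <= lead_distance_m)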
  • The display information generation unit 32 generates display information for highlighting the road marking information related to the route guidance so that it appears to overlap the corresponding real road marking information in the forward view. For example, when the road marking information related to the route guidance is the destination and direction shown on a guide sign, image data in which the characters of the place name indicating the destination and the arrow indicating the direction are rendered in a highlight color is generated as display information to be shown overlapping the guide sign.
  • When the road marking information related to the route guidance is the traffic information “30 km to Osaka, traffic jam, required time 40 minutes” displayed on an information board, image data in which the characters of that traffic information are rendered in a highlight color is generated as display information to be shown overlapping the display portion of the information board.
  • When the road marking information related to the route guidance is a left- or right-turn arrow marking on the road surface, image data in which that arrow marking is rendered in a highlight color is generated as display information to be displayed overlapping the road marking.
  • At this time, based on the angle formed between the driver's line of sight and the road surface, the display information generation unit 32 generates an image in which the shape of the arrow is deformed so that it appears to the driver to lie on the actual road surface. In this way, the display information of a road marking can be given a shape close to that of the actual road marking as seen by the driver.
  • In addition, based on the content of the road marking information sequentially input from the sign information acquisition unit 30, the display information generation unit 32 generates, as display information, image data representing the road marking information with a size and shape corresponding to the distance between the position of the actual road marking information and the host vehicle.
  • The display information generated by the display information generation unit 32 is displayed for a relatively short period while the corresponding voice guidance is being output. Because the display information of the road marking information continues to be shown throughout this period, using image data whose size and shape follow the distance between the actual road marking information and the host vehicle reduces the deviation from the actual road marking information.
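  • As a rough illustration of how such an overlay could track the real marking, the sketch below scales the highlight with the remaining distance and foreshortens a road-surface arrow by the line-of-sight angle; the model and constants are assumptions, not details given in this document.

        import math

        def overlay_dimensions(base_width_px, reference_distance_m,
                               current_distance_m, sight_angle_deg):
            """Return width and height for a road-surface arrow overlay: the
            size grows as the marking gets closer, and the height is
            foreshortened by the angle between the line of sight and the road
            surface (90 degrees = seen from directly above)."""
            scale = reference_distance_m / max(current_distance_m, 1.0)
            width = base_width_px * scale
            height = base_width_px * scale * math.sin(math.radians(sight_angle_deg))
            return width, height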
  • the output control unit 33 outputs display information to the HUD device 4A in synchronization with the voice guidance timing determined by the timing determination unit 31. Thereby, in the HUD device 4A, the display information is projected on the front window or the combiner and highlighted so as to overlap the corresponding real road marking information in the front view. In other words, the output control unit 33 controls the display control unit 4c to output the display information input from the display information generation unit 32 to the HUD device 4A when the voice guidance timing comes.
  • The output control unit 33 also controls the display control unit 4c to adjust the distance between the lens of the HUD device 4A and the display source, and changes the display position of the display information according to the driving conditions.
  • FIG. 3 is a block diagram illustrating a hardware configuration of the display control apparatus according to the first embodiment.
  • FIG. 4 is a flowchart showing an outline of the operation of the display control apparatus according to the first embodiment.
  • a display 101 is the HUD device 4A shown in FIG. 1, and the display is controlled by the display control device according to the first embodiment.
  • the speaker 102 is the audio output unit 4d shown in FIG. 1, and is, for example, a speaker mounted on a vehicle.
  • each function related to display control in the control unit 3 corresponds to each function of the display control apparatus according to the first embodiment.
  • The sign information acquisition unit 30, the timing determination unit 31, the display information generation unit 32, and the output control unit 33 in the control unit 3 are realized by a processing circuit 100. That is, the display control device includes the processing circuit 100 by which the sign information acquisition unit 30 executes the process of step ST1 shown in FIG. 4, the timing determination unit 31 executes the process of step ST2, the display information generation unit 32 executes the process of step ST3, and the output control unit 33 executes the process of step ST4.
  • the processing circuit 100 is a component that implements the control unit 3 and may be dedicated hardware or a CPU (Central Processing Unit) that executes a program stored in a memory.
  • The processing circuit 100 corresponds, for example, to a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
  • The functions of the sign information acquisition unit 30, the timing determination unit 31, the display information generation unit 32, and the output control unit 33 may each be realized by a separate processing circuit, or the functions of all the units may be realized collectively by a single processing circuit.
  • When the processing circuit is a CPU 103, the functions of the sign information acquisition unit 30, the timing determination unit 31, the display information generation unit 32, and the output control unit 33 are realized by software, firmware, or a combination of software and firmware. The software and firmware are written as programs and stored in a memory 104.
  • The CPU 103 reads out and executes the programs stored in the memory 104, thereby realizing the functions of each unit. That is, the display control device includes the memory 104 for storing programs which, when executed by the CPU 103, result in the execution of the processing from step ST1 to step ST4 shown in FIG. 4.
  • The memory corresponds, for example, to a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM, a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (Electrically Erasable Programmable ROM), or to a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD, or the like.
  • Some of the functions of the sign information acquisition unit 30, the timing determination unit 31, the display information generation unit 32, and the output control unit 33 may be realized by dedicated hardware and the others by software or firmware. For example, the function of the sign information acquisition unit 30 can be realized by the processing circuit 100 as dedicated hardware, while the functions of the timing determination unit 31, the display information generation unit 32, and the output control unit 33 can be realized by the CPU 103 reading and executing programs stored in the memory 104.
  • the processing circuit can realize the above-described functions by hardware, software, firmware, or a combination thereof.
  • FIG. 5 is a flowchart showing a detailed operation of the display control apparatus according to the first embodiment.
  • The processing from step ST1a to step ST6a is an example of the detailed processing of step ST1 shown in FIG. 4, the processing of step ST7a is an example of the detailed processing of step ST2, the processing of step ST8a is an example of the detailed processing of step ST3, and the processing from step ST9a to step ST10a is an example of the detailed processing of step ST4. The operation of each step in FIG. 4 therefore becomes clear by explaining each step in FIG. 5, and the following description is based on FIG. 5.
  • the sign information acquisition unit 30 determines whether or not the planned travel route of the vehicle has been calculated (step ST1a).
  • For example, the sign information acquisition unit 30 inquires of the route guidance unit 8 whether route guidance along a planned travel route is being performed; if it is, the planned travel route is determined to have been calculated, and if it is not, the planned travel route is determined not to have been calculated. When it is determined that the planned travel route has not been calculated (step ST1a; NO), the process proceeds to step ST10a.
  • When it is determined that the planned travel route has been calculated (step ST1a; YES), the sign information acquisition unit 30 determines whether road marking information has been detected in front of the vehicle by the surrounding information detection unit 22 of the information acquisition unit 2 (step ST2a). When road marking information in front of the host vehicle has not been detected (step ST2a; NO), the process of step ST2a is repeated. When road marking information in front of the host vehicle has been detected (step ST2a; YES), the sign information acquisition unit 30 acquires from the surrounding information detection unit 22 the distance between the host vehicle and the marking object, such as a guide sign, road marking, or information board, in the road marking information (step ST3a).
  • the sign information acquisition unit 30 determines the content of the road sign information (step ST4a). For example, the sign information acquisition unit 30 performs image analysis on the image data obtained by the image processing of the image processing unit 23b via the surrounding information detection unit 22, and determines the content of the road sign information.
  • When the road marking information is a guide sign, the contents written on the sign, such as place names and arrows, and the positions within the sign where those contents are written are determined.
  • When the road marking information is a road marking, an arrow marking indicating the traffic division by traveling direction, such as a straight-ahead lane or a left/right turn, is determined.
  • When the road marking information is information displayed on an information board, the content of the traffic information shown on the board, such as “30 km to Osaka, traffic jam, required time 40 minutes”, is determined.
  • The sign information acquisition unit 30 then compares the content of the road marking information determined in step ST4a with the route guidance information of the planned travel route (step ST5a) and determines whether there is road marking information related to the route guidance (step ST6a). That is, the sign information acquisition unit 30 classifies the road marking information detected in front of the host vehicle by the surrounding information detection unit 22 into road marking information related to the route guidance performed by the route guidance unit 8 and road marking information not related to that guidance.
  • If there is no road marking information related to the route guidance in the road marking information detected in front of the vehicle by the surrounding information detection unit 22 (step ST6a; NO), the process returns to step ST2a.
  • If there is such information (step ST6a; YES), the sign information acquisition unit 30 acquires, from among the road marking information detected in front of the host vehicle by the surrounding information detection unit 22, the road marking information related to the route guidance and outputs it to the display information generation unit 32.
  • Next, the timing determination unit 31 determines whether or not it is the timing for voice guidance of the planned travel route by the route guidance unit 8 (step ST7a). As described above, for example, it is determined, based on the route guidance information of the planned travel route and the current position of the host vehicle, whether the host vehicle has come within a predetermined range of the guide point; when it has, it is determined that it is the timing for voice guidance. When it is determined that it is not the timing for voice guidance (step ST7a; NO), the process of step ST7a is repeated until the timing for voice guidance arrives.
  • When it is determined that it is the timing for voice guidance (step ST7a; YES), the display information generation unit 32 generates display information in which the road marking information related to this voice guidance is rendered in a highlight color and in a form matched to the corresponding actual marking object (step ST8a).
  • the output control unit 33 outputs voice guidance to the voice output device 4B, and outputs the display information generated by the display information generation unit 32 to the HUD device 4A in synchronization with this timing. As a result, the display information is highlighted so as to overlap the corresponding actual road marking information in the front view through the front window (step ST9a).
  • the output control unit 33 determines whether or not an operation for stopping the display by the HUD device 4A has been received by the input unit 5 (step ST10a).
  • When this operation is received by the input unit 5 (step ST10a; YES), the process of FIG. 5 is terminated.
  • When the operation is not received by the input unit 5 (step ST10a; NO), the process returns to step ST1a.
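  • For orientation, the flow of FIG. 5 can be summarized as the loop below. All of the collaborator objects, their methods, and the render_highlight helper are interfaces assumed only for this sketch; they are not named in this document.

        def display_control_loop(navigation, detector, hud, audio,
                                 render_highlight, stop_requested):
            """Hypothetical outline of steps ST1a to ST10a of Embodiment 1."""
            while not stop_requested():                              # ST10a
                if not navigation.has_planned_route():               # ST1a
                    continue
                markings = detector.detect_front_markings()          # ST2a-ST4a
                related = [m for m in markings
                           if navigation.is_related_to_guidance(m)]  # ST5a-ST6a
                if not related:
                    continue
                if navigation.is_voice_guidance_timing():            # ST7a
                    frames = [render_highlight(m) for m in related]  # ST8a
                    audio.play_guidance()                            # ST9a, synchronized
                    hud.show(frames)                                 # ST9a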
  • FIG. 6 is a diagram showing a landscape in front of the vehicle viewed through the front window before voice guidance, and shows a state in front of the host vehicle when the vehicle approaches an intersection where voice guidance is performed.
  • a guide sign 9 is arranged in the vicinity at the intersection shown in FIG. 6, and a right turn arrow 10 and a straight arrow 11 are marked on the road surface on which the vehicle travels.
  • display information related to route guidance is not displayed.
  • FIG. 7 is a diagram showing the scenery in front of the vehicle viewed through the front window at the timing of voice guidance, and shows the case where the vehicle approaches the intersection further from the state of FIG. 6 and the timing of voice guidance is reached.
  • In this example, the road marking information related to the route guidance is the information on the actual guide sign 9 indicating the destination “Sannomiya” and the right direction, and the road marking of the right turn arrow 10.
  • the display information generation unit 32 generates, as the display information 9a, image data obtained by applying highlight color to the characters “Sannomiya” indicating the direction and the arrow indicating the right direction. Further, the display information generation unit 32 generates, as display information 10a, image data obtained by adding a highlight color to the shape that matches the right turn arrow 10.
  • The output control unit 33 outputs the display information generated by the display information generation unit 32 to the HUD device 4A in synchronization with the timing at which the audio output device 4B outputs the voice guidance “Please turn right ahead. Right direction, toward Sannomiya”.
  • As a result, the display information 9a, in which the destination “Sannomiya” and the right-direction arrow are rendered in a highlight color, is displayed by the HUD device 4A so that it appears to be pasted over the location on the actual guide sign 9 where the corresponding destination and direction are written.
  • Similarly, the HUD device 4A displays the display information 10a, in which the right turn arrow is rendered in a highlight color, so that it appears to be pasted on the corresponding actual right turn arrow 10.
  • As described above, in Embodiment 1, the display information 9a and 10a of the road marking information related to the route guidance is highlighted, in synchronization with the timing of the voice guidance, so that it appears to overlap the corresponding road marking information in the driver's forward view.
  • The display information generation unit 32 generates the display information 9a and 10a for highlighting the road marking information related to the content of the voice guidance. This makes it possible to realize an easy-to-see route guidance display emphasized in a natural form. Furthermore, because the content recognized by the driver's hearing matches the content recognized visually, the content of the display information is easy to recognize.
  • FIG. 8 is a block diagram illustrating a functional configuration of the display control apparatus according to the second embodiment, and illustrates each function of the control unit 3A that is a function of the display control apparatus according to the second embodiment.
  • The control unit 3A includes a sign information acquisition unit 30a, a timing determination unit 31, a display information generation unit 32a, and an output control unit 33a as functions for controlling the display of the HUD device 4A.
  • The sign information acquisition unit 30a acquires, in addition to the road marking information related to the route guidance by the route guidance unit 8, the content of the route guidance that is not included in that road marking information.
  • Specifically, the road marking information detected in front of the vehicle by the surrounding information detection unit 22 is classified into road marking information related to the route guidance performed by the route guidance unit 8 and road marking information not related to that guidance.
  • The road marking information related to the route guidance is then compared with the content of the route guidance, and the content of the route guidance not included in the road marking information is extracted and acquired as information to be additionally displayed.
  • For example, when the road marking information related to the route guidance is the destination “Sannomiya” and its direction on a guide sign, and the content of the route guidance includes the remaining distance from the current position to the destination in the “Sannomiya” direction, this remaining distance is extracted as information to be additionally displayed.
  • The route guidance information compared with the road marking information related to the route guidance also includes traffic information on the planned travel route. For example, when the road to the right in the “Sannomiya” direction is congested, the sign information acquisition unit 30a extracts character information indicating “congested” and information indicating a detour for avoiding the traffic jam as information to be additionally displayed.
  • the display information generation unit 32a generates additional display information for displaying the contents of route guidance that is not included in the road marking information related to route guidance.
  • Take as an example the case where the road marking information related to the route guidance is the information indicating the destination “Sannomiya” and the right direction on the guide sign, and the content of the route guidance not included in this information is the remaining distance (5 km) from the current position to the destination in the “Sannomiya” direction. In this case, the display information generation unit 32a generates, as additional display information, image data in which the character information “5 km remaining” indicating the remaining distance is rendered in a highlight color.
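  • A minimal sketch of this extraction step, assuming both the sign contents and the guidance items are available as plain strings (a hypothetical simplification of the processing in the sign information acquisition unit 30a):

        def extract_additional_info(sign_contents, guidance_items):
            """Return the guidance items that are not already written on the
            sign and therefore become additional display information."""
            return [item for item in guidance_items if item not in sign_contents]

        # extract_additional_info(("Sannomiya", "right"),
        #                         ("Sannomiya", "right", "5 km remaining"))
        # -> ["5 km remaining"]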
  • the output control unit 33a outputs additional display information to the HUD device 4A.
  • the additional display information is displayed by the HUD device 4A as if it is pasted in the empty space of the corresponding real sign object in the front view.
  • the additional display information is displayed so as to be pasted at a position adjacent to the corresponding real sign in the forward view.
  • As for the marking object corresponding to the additional display information, for example, when the additional display information is the remaining distance in the “Sannomiya” direction, the guide sign on which the “Sannomiya” direction is written is determined to be the marking object corresponding to the additional display information.
  • FIG. 9 is a flowchart showing a detailed operation of the display control apparatus according to the second embodiment.
  • The processing from step ST1a to step ST6a, steps ST7a and ST8a, and step ST10a is the same as the corresponding processing shown in FIG. 5, and its description is therefore omitted.
  • The processing from step ST1a to step ST6a-3 in FIG. 9 is an example of the detailed processing of step ST1 shown in FIG. 4, the processing of step ST7a is an example of the detailed processing of step ST2, the processing from step ST8a to step ST8a-1 is an example of the detailed processing of step ST3, and the processing from step ST9b to step ST10a is an example of the detailed processing of step ST4. The operation of each step in FIG. 4 therefore becomes clear by explaining each step in FIG. 9, and the following description is based on FIG. 9.
  • The sign information acquisition unit 30a compares the content of the road marking information related to the route guidance with the route guidance information of the planned travel route (step ST6a-1) and determines whether there is voice guidance content not included in the road marking information related to the route guidance (step ST6a-2). That is, the sign information acquisition unit 30a compares the road marking information related to the route guidance with the content of the route guidance indicated by the route guidance information, and identifies the content of the route guidance not included in the road marking information.
  • If there is no voice guidance content that is not included in the road marking information related to the route guidance (step ST6a-2; NO), the process proceeds to step ST7a.
  • If there is such content (step ST6a-2; YES), the sign information acquisition unit 30a extracts the content of the voice guidance as information to be additionally displayed (step ST6a-3). For example, when the road marking information related to the route guidance is the destination “Sannomiya” and the right direction on the guide sign, the remaining distance to the destination, the estimated arrival time, traffic jam information, and the like in the route guidance information are extracted as information to be additionally displayed.
  • The display information generation unit 32a generates display information in which the information to be additionally displayed, input from the sign information acquisition unit 30a, is rendered in a highlight color and in a form matched to the actual marking object corresponding to this information (step ST8a-1).
  • The output control unit 33a outputs the display information and the additional display information generated by the display information generation unit 32a to the HUD device 4A in synchronization with the timing at which the voice guidance is output. As a result, the display information and the additional display information are highlighted so that they appear to overlap the corresponding actual road marking information in the driver's forward view (step ST9b).
  • FIG. 10 is a view showing the scenery in front of the vehicle seen through the front window when the additional display information 9b is displayed in the empty space of the actual guide sign 9, for the case where the vehicle has approached the intersection and the voice guidance timing has been reached.
  • FIG. 11 is a view showing a scenery in front of the vehicle viewed through the front window on which the additional display information 9c is displayed so as to be aligned with the guide sign 9.
  • These figures show the case where the road marking information related to the route guidance is the information indicating the destination “Sannomiya” and the right direction written on the guide sign, and the content of the route guidance not included in this information is the remaining distance (another 5 km) from the current position to the destination in the “Sannomiya” direction.
  • the display information generation unit 32a generates, as display information 9a, image data obtained by highlighting the characters “Sannomiya” indicating this direction and the arrow mark indicating the right direction in the actual guide sign 9. In addition, the display information generation unit 32a generates, as display information 10a, image data obtained by applying a highlight color to a shape that matches the actual right turn arrow 10. Further, the display information generation unit 32a generates additional display information 9b and 9c in which the character information “5 km remaining” indicating the remaining distance is expressed in highlight color.
  • The output control unit 33a outputs the display information 9a and 10a generated by the display information generation unit 32a to the HUD device 4A in synchronization with the output timing of the voice guidance “Please turn right ahead. There is another 5 km to the right, toward Sannomiya”. As a result, the display information 9a is displayed by the HUD device 4A so that it appears to be pasted over the location on the actual guide sign 9 where the corresponding destination and direction are written, and the display information 10a is displayed so that it appears to be pasted on the corresponding actual right turn arrow 10.
  • the output control unit 33a outputs additional display information 9b and 9c in which the character information “5 km” is expressed in highlight color to the HUD device 4A in synchronization with the voice guidance timing.
  • As a result, the additional display information 9b is displayed in the empty space of the guide sign 9 as shown in FIG. 10, or the additional display information 9c is displayed alongside the guide sign 9 as shown in FIG. 11.
  • FIG. 12 is a view showing the scenery in front of the vehicle seen through the front window when a detour around a traffic jam is displayed in the empty space of the actual guide sign 9 and a traffic jam notice is displayed alongside the guide sign 9.
  • FIG. 12 shows a case where the road marking information related to the route guidance is the information indicating the destination “Sannomiya” and the right direction written on the guide sign. The content of the route guidance not included in the road marking information is the fact that the road to the right, toward “Sannomiya”, is congested, together with information indicating a detour for avoiding the traffic jam.
  • the display information generation unit 32a generates, as display information 9a, image data in which characters of “Sannomiya” indicating this direction in the actual guide sign 9 are expressed in highlight color.
  • the display information generation unit 32a generates, as display information 10a, image data in which a highlight color is applied to a shape that matches the right turn arrow 10 in an actual road marking.
  • the display information generation unit 32a generates, as additional display information 9d, image data in which an arrow sign indicating a detour that detours from the right direction toward the “Sannomiya” direction is highlighted.
  • The display information generation unit 32a also generates, as additional display information 9e, image data in which the character information “detour” is rendered in a highlight color, and, as additional display information 9f, image data in which the character information “congested” is rendered in a highlight color.
  • In synchronization with the timing at which the voice guidance indicating that there is a traffic jam to the right in the direction of “Sannomiya” is output, the output control unit 33a outputs the display information 9a and 10a and the additional display information 9d to 9f to the HUD device 4A.
  • the display information 10a in which the right turn arrow sign is expressed in highlight color is displayed as if it is pasted on the actual right turn arrow 10 corresponding thereto.
  • the additional display information 9d is displayed as pasted on the arrow display of the actual guide sign 9 corresponding thereto.
  • the additional display information 9e is displayed in the empty space of the actual guide sign 9 corresponding thereto, and the additional display information 9f is displayed at a position adjacent to the guide sign 9.
  • In this way, the display information generation unit 32a generates the additional display information 9c to 9f for displaying contents of the route guidance that do not exist in the road marking information related to the route guidance.
  • The output control unit 33a outputs the additional display information 9c to 9f to the HUD device 4A.
  • FIG. 13 is a block diagram illustrating a functional configuration of the display control apparatus according to the third embodiment, and illustrates each function of the control unit 3B that is a function of the display control apparatus according to the third embodiment.
  • The control unit 3B includes a marking information acquisition unit 30b, a timing determination unit 31, a display information generation unit 32b, and an output control unit 33b as functions for controlling the display of the HUD device 4A.
  • The marking information acquisition unit 30b classifies the road marking information detected in front of the vehicle by the surrounding information detection unit 22 into road marking information related to the route guidance in the route guidance unit 8 and road marking information not related to the route guidance, and acquires both. For example, if the road marking information related to the route guidance is the destination "Sannomiya" on the guide sign, and the destination "Osaka" is also written on this guide sign, the destination "Osaka" is acquired as road marking information not related to the route guidance.
  • The display information generation unit 32b generates display information for highlighting road marking information related to the route guidance, and generates mask display information for mask-displaying, in the forward view, road marking information not related to the route guidance. Consider an example in which the road marking information related to the route guidance is the destination "Sannomiya" on the guide sign and the right direction, and the content not related to it is the destination "Osaka" and the straight-ahead direction. In this case, the display information generation unit 32b generates, as display information, an image in which the character information indicating "Sannomiya" and the right-direction arrow are expressed in a highlight color. In addition, the display information generation unit 32b generates mask display information for mask-displaying the character information indicating "Osaka" and the straight-ahead arrow.
  • The mask display information is image information that makes the underlying content over which it is superimposed appear translucent, or that shields it from view.
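A minimal sketch of how classified sign contents could be turned into highlight overlays and mask overlays follows. The SignItem structure, the dictionary format, and all field names are assumptions introduced for illustration and are not part of the patent.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SignItem:
    text: str        # e.g. "Sannomiya", "Osaka"
    arrow: str       # e.g. "right", "straight"
    bbox: tuple      # (x, y, w, h) of the item on the real sign as seen by the camera
    related: bool    # True if the item matches the current route guidance

def build_display_items(items: List[SignItem]):
    """Related items become highlight overlays; unrelated items become mask overlays."""
    highlights, masks = [], []
    for item in items:
        if item.related:
            # redraw the same text/arrow in a highlight colour over the real sign
            highlights.append({"kind": "highlight", "bbox": item.bbox,
                               "text": item.text, "arrow": item.arrow})
        else:
            # cover the unrelated item with a translucent (or opaque) patch
            masks.append({"kind": "mask", "bbox": item.bbox, "alpha": 0.7})
    return highlights, masks

# "Sannomiya"/right stays visible and emphasized, "Osaka"/straight is masked
highlights, masks = build_display_items([
    SignItem("Sannomiya", "right", (120, 40, 200, 60), True),
    SignItem("Osaka", "straight", (120, 110, 200, 60), False),
])
```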
  • the output control unit 33b outputs the display information generated by the display information generation unit 32b to the HUD device 4A, and outputs mask display information.
  • As a result, the display information in which the character information indicating "Sannomiya" and the right-direction arrow are expressed in a highlight color is displayed so as to overlap "Sannomiya" and the right-direction arrow on the actual guide sign, and the mask display information is displayed so as to overlap the position of the destination "Osaka" and the straight-ahead direction.
  • FIG. 14 is a flowchart showing a detailed operation of the display control apparatus according to the third embodiment.
  • the processing from step ST1a to step ST8a and step ST10a is the same as the processing from step ST1a to step ST8a and step ST10a shown in FIG.
  • The processing from step ST1a to step ST6a in FIG. 14 is an example of the detailed processing of step ST1 shown in FIG. 4.
  • The processing of step ST7a is an example of the detailed processing of step ST2 shown in FIG. 4.
  • The processing from step ST8a to step ST8b-1 is an example of the detailed processing of step ST3 shown in FIG. 4.
  • The processing from step ST9c to step ST10a is an example of the detailed processing of step ST4 shown in FIG. 4. Therefore, the operation of each step in FIG. 4 becomes clear by explaining each step in FIG. 14, and the operation will be described below with reference to FIG. 14.
  • In step ST8b-1, the display information generation unit 32b generates mask display information for mask-displaying road marking information not related to the route guidance.
  • The output control unit 33b outputs the display information generated by the display information generation unit 32b to the HUD device 4A in synchronization with the timing at which the voice guidance is output by the voice output device 4B.
  • the display information is highlighted by the HUD device 4A so as to be superimposed on the corresponding road marking information in the driver's front field of view, and the mask display information is displayed (step ST9c).
  • FIG. 15 is a diagram showing the scenery in front of the vehicle viewed through the front window on which information not related to route guidance is displayed as a mask, and shows the case where the vehicle approaches the intersection and the timing of voice guidance is reached.
  • In FIG. 15, the road marking information related to the route guidance is information indicating the destination "Sannomiya" and the right direction, and the road marking information not related to the route guidance is the destination "Osaka" and the straight-ahead arrow.
  • the display information generation unit 32b generates, as the display information 9a, an image obtained by highlighting the characters “Sannomiya” indicating this direction and the arrow mark indicating the right direction in the actual guide sign 9.
  • The display information generation unit 32b generates, as the display information 10a, an image in which a highlight color is applied to a shape that matches the right turn arrow 10 in the actual road marking.
  • The display information generation unit 32b also generates mask display information 9g that shields the destination "Osaka" on the actual guide sign 9 and its direction arrow.
  • Since the straight-ahead arrow 11 in the actual road marking is not related to the route guidance, the display information generation unit 32b further generates mask display information 11a that shields the straight-ahead arrow 11.
  • In synchronization with the timing at which the voice guidance "Please turn right" is output by the voice output device 4B, the output control unit 33b outputs the display information 9a and 10a to the HUD device 4A. Thereby, the display information 9a is displayed by the HUD device 4A so that it appears pasted onto the position of the actual guide sign 9 where the corresponding destination and direction are written, and the display information 10a is displayed so that it appears pasted onto the actual right turn arrow 10.
  • the output control unit 33b outputs the mask display information 9g and 11a to the HUD device 4A in synchronization with the voice guidance timing.
  • the mask display information 9g is displayed so as to overlap the position where the direction of “Osaka” and the direction of the guide sign 9 are described.
  • the mask display information 11a is displayed so as to be pasted on the straight-going straight arrow 11.
  • the display information generation unit 32b generates mask display information 9g and 11a for mask-displaying road marking information that is not related to the route guidance for the forward view.
  • the mask display information 9g and 11a are displayed by the HUD device 4A so as to be superimposed on the road marking information not related to the route guidance for the front view.
  • Since the road marking information not related to the route guidance is mask-displayed, it becomes easier for the driver to recognize the display information for the road marking information related to the route guidance.
  • FIG. 16 is a block diagram illustrating a functional configuration of the display control apparatus according to the fourth embodiment, and illustrates each function of the control unit 3C that is a function of the display control apparatus according to the fourth embodiment.
  • The control unit 3C includes a marking information acquisition unit 30, a timing determination unit 31a, a display information generation unit 32, and an output control unit 33 as functions for controlling the display of the HUD device 4A.
  • the timing determination unit 31a determines that the timing when the road marking information related to route guidance and the vehicle are in a predetermined range is the timing of voice guidance.
  • The predetermined range is a range, defined in terms of distance or time, within which the road marking information ahead of the vehicle can be visually recognized by the driver. For example, a range of about 30 m from the own vehicle is conceivable.
  • FIG. 17 is a flowchart showing a detailed operation of the display control apparatus according to the fourth embodiment.
  • the processing from step ST1a to step ST7a and from step ST8a to step ST10a is the same as the processing from step ST1a to step ST7a and from step ST8a to step ST10a shown in FIG.
  • The processing from step ST1a to step ST6a in FIG. 17 is an example of the detailed processing of step ST1 shown in FIG. 4.
  • The processing of step ST7b is an example of the detailed processing of step ST2 shown in FIG. 4.
  • The processing of step ST8a is an example of the detailed processing of step ST3 shown in FIG. 4.
  • The processing from step ST9a to step ST10a is an example of the detailed processing of step ST4 shown in FIG. 4.
  • In step ST7b, the timing determination unit 31a compares the current position of the host vehicle input from the information acquisition unit 2 with the position of the road marking information related to the route guidance ahead of the host vehicle, and determines whether or not the distance between them is within the predetermined range, that is, whether or not the vehicle has reached a distance at which the road marking information can be recognized. If it is determined that such a distance has not been reached (step ST7b; NO), the process of step ST7b is repeated until it is reached. If it is determined that the road marking information can now be recognized (step ST7b; YES), the process proceeds to step ST8a.
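The ST7b-style check can be pictured as a simple polling loop that waits until the sign is within a recognizable distance. The function names, the planar distance approximation, the 30 m default, and the timeout are illustrative assumptions only.

```python
import math
import time

def distance_m(a, b):
    """Rough planar distance in metres between two (lat, lon) points."""
    dlat = (a[0] - b[0]) * 111_320.0
    dlon = (a[1] - b[1]) * 111_320.0 * math.cos(math.radians(a[0]))
    return math.hypot(dlat, dlon)

def wait_until_recognizable(get_vehicle_pos, sign_pos,
                            visible_range_m=30.0, poll_s=0.2, timeout_s=120.0):
    """Repeat the ST7b-style check until the sign is close enough to be seen."""
    waited = 0.0
    while waited < timeout_s:
        if distance_m(get_vehicle_pos(), sign_pos) <= visible_range_m:
            return True            # proceed to guidance output (step ST8a)
        time.sleep(poll_s)         # not yet recognizable; check again
        waited += poll_s
    return False                   # sign never came into range
```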
  • In this way, the timing determination unit 31a determines that the timing at which the distance between the road marking information related to the route guidance and the own vehicle falls within the predetermined range is the timing of voice guidance. Thus, the voice guidance and the display information corresponding to the route guidance can be output at the timing when the driver can actually see the road marking information.
  • FIG. 18 is a block diagram illustrating a functional configuration of the display control apparatus according to the fifth embodiment, and illustrates each function of the control unit 3D that is a function of the display control apparatus according to the fifth embodiment.
  • The control unit 3D includes, as functions for controlling the display of the HUD device 4A, a marking information acquisition unit 30c, a timing determination unit 31, a display information generation unit 32, an output control unit 33c, and a voice information generation unit 34.
  • The marking information acquisition unit 30c operates in the same manner as in the first embodiment, and acquires road marking information that is not included in the content of the voice guidance among the road marking information related to the route guidance. For example, if the content of the voice guidance is "right direction" and the road marking information related to the route guidance is the destination "Sannomiya" and the right direction, the destination "Sannomiya" is extracted as road marking information not included in the content of the voice guidance.
  • the voice information generation unit 34 generates voice information for voice-outputting road marking information not included in the voice guidance content among road marking information related to the route guidance, and adds the voice information to the voice guidance.
  • the output control unit 33c causes the voice output device 4B to output the voice guidance to which the voice information is added by the voice information generation unit 34 in synchronization with the voice guidance timing.
  • FIG. 19 is a flowchart showing a detailed operation of the display control apparatus according to the fifth embodiment.
  • the processing from step ST1a to step ST6a, step ST7a to step ST8a, and step ST10a is the same as the processing from step ST1a to step ST6a, step ST7a to step ST8a, and step ST10a shown in FIG. Omitted.
  • The processing from step ST1a to step ST6b-2 in FIG. 19 is an example of the detailed processing of step ST1 shown in FIG. 4.
  • The processing of step ST7a is an example of the detailed processing of step ST2 shown in FIG. 4.
  • The processing of step ST8a is an example of the detailed processing of step ST3 shown in FIG. 4.
  • The processing from step ST9d to step ST10a is an example of the detailed processing of step ST4 shown in FIG. 4. Therefore, since the operation of each step in FIG. 4 becomes clear by explaining each step in FIG. 19, the operation will be described below with reference to FIG. 19.
  • the sign information acquisition unit 30c determines whether there is road sign information that is not included in the voice guidance content among the road sign information related to the route guidance (step ST6b-1). That is, the marking information acquisition unit 30c compares the road marking information related to the route guidance with the content of the voice guidance indicated by the route guidance information, and identifies the road marking information that is not included in the content of the voice guidance.
  • If there is no road marking information that is not included in the content of the voice guidance (step ST6b-1; NO), the process proceeds to step ST7a.
  • If there is such road marking information (step ST6b-1; YES), the marking information acquisition unit 30c outputs the road marking information not included in the content of the voice guidance, among the road marking information related to the route guidance, to the voice information generation unit 34.
  • The voice information generation unit 34 converts the road marking information that is not included in the voice guidance content, input from the marking information acquisition unit 30c, into voice and adds it to the content of the voice guidance (step ST6b-2). For example, when the content of the voice guidance is "right direction" and the road marking information related to the route guidance is the destination "Sannomiya" and the right direction on the guide sign, the voice information "Sannomiya direction" is generated, and the voice guidance becomes "To Sannomiya, right direction".
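As a rough sketch, composing the augmented guidance phrase could look like the following. The function name and the string-matching rule are assumptions for illustration; the patent does not specify how the missing items are detected or phrased.

```python
from typing import List

def augment_guidance(voice_text: str, related_sign_items: List[str]) -> str:
    """Prepend sign contents that the voice guidance does not already mention.

    Example:
        augment_guidance("right direction", ["Sannomiya", "right direction"])
        -> "To Sannomiya, right direction"
    """
    missing = [item for item in related_sign_items
               if item.lower() not in voice_text.lower()]
    if not missing:
        return voice_text                        # nothing to add (step ST6b-1: NO)
    prefix = ", ".join(f"To {item}" for item in missing)
    return f"{prefix}, {voice_text}"             # e.g. "To Sannomiya, right direction"
```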
  • In step ST9d, the output control unit 33c causes the voice output device 4B to output the guidance voice to which the voice information has been added. In synchronization with this timing, the output control unit 33c outputs the display information generated by the display information generation unit 32 to the HUD device 4A. Thus, the display information is highlighted so as to overlap the corresponding actual road marking information in the driver's forward view.
  • As described above, the display control apparatus according to Embodiment 5 includes the voice information generation unit 34, which generates voice information for voice-outputting road marking information that is not included in the content of the voice guidance, among the road marking information related to the route guidance, and adds this voice information to the voice guidance.
  • The output control unit 33c causes the voice output device 4B to output the voice guidance, with the voice information added, in synchronization with the timing of the voice guidance.
  • By voice-outputting, in this way, information that was not in the voice guidance among the road marking information detected in front of the vehicle, information that is not included in the map database, such as a newly installed guide sign, can be supplemented by the voice guidance. As a result, voice guidance that matches the actual road conditions becomes possible.
  • The embodiments may be combined in any manner, any component of each embodiment may be modified, and any component may be omitted in each embodiment.
  • Since the display control device according to the present invention can display the display information related to route guidance in an easy-to-see manner and at a timing at which the information content is easy to recognize, it is suitable for use as, for example, a display control device of an in-vehicle navigation device.
  • 1 navigation device, 2 information acquisition unit, 3, 3A-3D control unit, 4 notification unit, 4A HUD device, 4B audio output device, 4a navigation display unit, 4b HUD display unit, 4c display control unit, 4d audio output unit, 4e audio control unit, 5 input unit, 6 map data storage unit, 7 route calculation unit, 8 route guidance unit, 9 guide sign, 9a, 10a display information, 9b-9f additional display information, 9g mask display information, 10 right turn arrow, 11 straight-ahead arrow, 11a mask display information, 20 current position detection unit, 20a GPS reception unit, 20b direction detection unit, 20c pulse detection unit, 21 wireless communication unit, 22 surrounding information detection unit, 23 external sensor, 23a camera, 23b image processing unit, 23c radar, 23d radar control unit, 30, 30a-30c marking information acquisition unit, 31, 31a timing determination unit, 32, 32a, 32b display information generation unit, 33, 33a-33c output control unit, 34 voice information generation unit, 100 processing circuit, 101 display, 102 speaker, 103 CPU

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)
  • Instructional Devices (AREA)

Abstract

Display information for road marking information relating to route guidance is displayed in synchronization with the output timing of audio guidance so as to be emphasized and superimposed over corresponding road marking information in a front view.

Description

Display control device and navigation device
 この発明は、運転者の前方視界に表示情報を重ねて表示するヘッドアップディスプレイ(以下、HUDと記載する)の表示制御装置およびこれを備えたナビゲーション装置に関する。 The present invention relates to a display control device for a head-up display (hereinafter referred to as HUD) that displays display information superimposed on a driver's forward view, and a navigation device equipped with the display control device.
 特許文献1には、情報の重要度に応じてHUDによる情報の表示位置を変更する表示システムが記載されている。この表示システムでは、重要度が高い情報を、HUDにおける表示領域のうち、運転者が前方の走行路から視線を逸らすことなく情報を読み取り可能な中央表示領域に表示し、重要度が低い情報をHUDの周辺表示領域に表示している。これにより、中央表示領域に表示された重要度が高い情報は、運転者が視線を移動させることなく視認できる。 Patent Document 1 describes a display system that changes the display position of information by HUD according to the importance of information. In this display system, information with high importance is displayed in a central display area where the driver can read information without diverting his gaze from the driving road ahead of the display area in the HUD, and information with low importance is displayed. It is displayed in the peripheral display area of the HUD. Thereby, the information with high importance displayed in the central display area can be visually recognized without the driver moving his / her line of sight.
 また、特許文献2には、運転者の左目と右目に視差を与えて情報を3次元表示するHUD装置が記載されている。このHUD装置では、道路、交差点または道路沿いの建物などの地図情報を運転者の前方視界の焦点位置に重ねて3次元表示することができる。これにより、運転者が視線または目の焦点をずらすことなく情報を視認できる。 Further, Patent Document 2 describes a HUD device that displays parallax to the left and right eyes of a driver to display information three-dimensionally. In this HUD device, map information such as roads, intersections or buildings along the roads can be displayed in a three-dimensional manner by superimposing them on the focal position of the driver's front view. Thereby, the driver can visually recognize the information without shifting the line of sight or the focus of the eyes.
 特許文献3には、運転者の注視している物体を検出してその物体上に重なって結像するように情報を表示するHUD装置が記載されている。このHUD装置では、時々刻々と変わり得る運転者の注視物体に動的に情報が重ねて表示されるので、運転者は、視線を移動させず、かつ目の焦点距離を合わせ直すことなく、情報を視認できる。 Patent Document 3 describes a HUD device that detects an object that is being watched by a driver and displays information so as to form an image overlapping the object. In this HUD device, information is dynamically superimposed and displayed on the driver's gaze object, which can change from moment to moment, so that the driver does not move the line of sight and adjusts the focal length of the eyes. Can be visually recognized.
 さらに、特許文献4には、現在走行している道路に面する施設についてのみ、実空間の風景または風景を表す映像上に風景に含まれる施設の情報を、この施設に対応する位置に重ねて表示するシステムが記載されている。各情報は、対応する施設における、現在走行している道路に面する部分に貼り付いた形態で遠近感を持って表示される。これにより、施設の情報または実風景の視認性の向上を図ることができる。 Furthermore, in Patent Document 4, only for a facility facing a road that is currently running, the information on the facility included in the landscape is superimposed on the position corresponding to this facility on the scene representing the real space or the image representing the landscape. The system to display is described. Each piece of information is displayed with a sense of perspective in a form attached to a portion facing the road on which the vehicle is currently traveling in the corresponding facility. Thereby, the visibility of the facility information or the actual scenery can be improved.
JP 2005-199992 A; JP 2009-8722 A; JP 2005-313772 A; JP 2014-13989 A
In HUD, information such as vehicle information and route guidance is displayed superimposed on the focal position of the driver's forward field of view, so that the driver can quickly view the information with a small amount of line-of-sight movement.
However, when an object that the driver wants to pay attention to in front view and display information irrelevant to the object are displayed in an overlapping manner, both the object and the information are difficult to see.
Further, it is desirable that the display of information by the HUD is performed at a timing at which the driver can easily recognize the information content while the vehicle is traveling.
On the other hand, the display system described in Patent Document 1 narrows down the information to be displayed in the central display area, where the driver can read information without diverting the line of sight, according to the importance of the information, thereby improving the visibility of the information displayed by the HUD.
However, even for highly important information displayed in the central display area of the HUD, if an object that the driver wants to pay attention to in the forward view and information irrelevant to that object are displayed in an overlapping manner, the visibility of both the object and the information is reduced.
In addition, when visually recognizing information displayed in the peripheral display area of the HUD, the driver needs to move his / her line of sight greatly, which reduces attention to the front of the vehicle.
The HUD device described in Patent Document 2 can display information three-dimensionally in the driver's forward field of view, but there is a possibility that an object the driver wants to focus on in the forward field of view and information unrelated to this object will be displayed in an overlapping manner. In this case, both the object and the information are difficult to see.
The HUD device described in Patent Document 3 dynamically displays information superimposed on the driver's gaze object, which can change from moment to moment, but gives no consideration to the display form of the superimposed information, so both the gaze object and the information may become difficult to see.
The HUD device described in Patent Document 4 is intended to display information on a facility facing the road currently being traveled on the portion of the facility facing that road, and cannot be applied to route guidance.
Furthermore, none of Patent Documents 1 to 3 considers the timing at which information is displayed on the HUD. For this reason, when information such as route guidance is displayed by the HUD at a timing that is difficult for the driver to recognize, either the display ends before the driver can recognize the contents of the information, or the driver gazes at the displayed information in order to recognize its contents, and attention to driving may be reduced.
In addition, although Patent Document 4 describes the timing at which the display form of the facility information is changed from a state of appearing pasted on the facility to a state of facing the front, route guidance is not considered, as mentioned above.
SUMMARY OF THE INVENTION: An object of the present invention is to provide a display control device and a navigation device that can display display information related to route guidance at a timing that makes it easy to see and recognize the information content.
The display control apparatus according to the present invention includes a marking information acquisition unit, a timing determination unit, a display information generation unit, and an output control unit. The marking information acquisition unit acquires road marking information related to route guidance from road marking information detected in front of the vehicle. The timing determination unit determines whether it is the timing of voice guidance in the route guidance. The display information generation unit generates display information for highlighting the road marking information acquired by the marking information acquisition unit so that it appears to overlap the corresponding real road marking information in the forward field of view. The output control unit outputs the display information to the display device in synchronization with the voice guidance timing determined by the timing determination unit.
According to the present invention, since the display information for the road marking information related to the route guidance is highlighted so as to appear to overlap the corresponding actual road marking information in the forward view, an easy-to-see route guidance display that is emphasized in a natural form can be realized. Furthermore, since the road marking information related to the route guidance is displayed in synchronization with the timing of the voice guidance, the content to be recognized by the driver's hearing matches the content to be recognized visually, making the content of the display information easy to recognize.
FIG. 1 is a block diagram showing a functional configuration of a navigation device according to Embodiment 1 of the present invention.
FIG. 2 is a block diagram showing a functional configuration of the display control device according to Embodiment 1.
FIG. 3 is a block diagram showing a hardware configuration of the display control device according to Embodiment 1.
FIG. 4 is a flowchart showing an outline of the operation of the display control device according to Embodiment 1.
FIG. 5 is a flowchart showing a detailed operation of the display control device according to Embodiment 1.
FIG. 6 is a diagram showing the scenery in front of the vehicle seen through the front window before voice guidance.
FIG. 7 is a diagram showing the scenery in front of the vehicle seen through the front window at the timing of voice guidance.
FIG. 8 is a block diagram showing a functional configuration of a display control device according to Embodiment 2 of the present invention.
FIG. 9 is a flowchart showing a detailed operation of the display control device according to Embodiment 2.
FIG. 10 is a diagram showing the scenery in front of the vehicle seen through the front window in which additional display information is displayed in an empty space of an actual guide sign.
FIG. 11 is a diagram showing the scenery in front of the vehicle seen through the front window in which additional display information is displayed so as to be aligned with an actual guide sign.
FIG. 12 is a diagram showing the scenery in front of the vehicle seen through the front window in which a detour around a traffic jam is displayed in an empty space of an actual guide sign and an indication of the traffic jam is displayed so as to be aligned with the guide sign.
FIG. 13 is a block diagram showing a functional configuration of a display control device according to Embodiment 3 of the present invention.
FIG. 14 is a flowchart showing a detailed operation of the display control device according to Embodiment 3.
FIG. 15 is a diagram showing the scenery in front of the vehicle seen through the front window in which information not related to route guidance is mask-displayed.
FIG. 16 is a block diagram showing a functional configuration of a display control device according to Embodiment 4 of the present invention.
FIG. 17 is a flowchart showing a detailed operation of the display control device according to Embodiment 4.
FIG. 18 is a block diagram showing a functional configuration of a display control device according to Embodiment 5 of the present invention.
FIG. 19 is a flowchart showing a detailed operation of the display control device according to Embodiment 5.
Hereinafter, in order to describe the present invention in more detail, modes for carrying out the present invention will be described with reference to the accompanying drawings.
Embodiment 1.
FIG. 1 is a functional block diagram showing a configuration of a navigation device 1 according to Embodiment 1 of the present invention. The navigation device 1 is a device that is mounted on a vehicle and performs route guidance to a destination, and includes an information acquisition unit 2, a control unit 3, a notification unit 4, an input unit 5, a map data storage unit 6, a route calculation unit 7, and a route guidance unit 8. The operation of each part of the navigation device 1 is comprehensively controlled by the control unit 3.
The information acquisition unit 2 is a general term for the components that acquire the current position of the host vehicle, information on the vehicle's surroundings, and information detected by other vehicles, and includes a current position detection unit 20, a wireless communication unit 21, and a surrounding information detection unit 22.
The current position detection unit 20 is a component that detects the current position of the host vehicle, and is connected to a GPS (Global Positioning System) reception unit 20a, a direction detection unit 20b, and a pulse detection unit 20c.
The GPS receiver 20a receives a GPS signal from a GPS satellite, and detects the current position (for example, latitude and longitude) of the host vehicle based on the GPS signal. The direction detection unit 20b is a detection unit that detects the traveling direction (for example, the direction) of the host vehicle, and includes, for example, a gyro sensor and a direction sensor. The pulse detector 20c detects a pulse signal corresponding to the number of revolutions per unit time of the axle of the own vehicle, and detects the traveling speed and the traveling distance of the own vehicle based on the pulse signal.
The current position detecting unit 20 corrects the current position detected by the GPS receiving unit 20a based on the traveling direction, traveling speed, and traveling distance of the host vehicle detected by the direction detecting unit 20b and the pulse detecting unit 20c. Thereby, it becomes possible to detect the exact current position of the own vehicle.
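The correction described here is essentially dead reckoning on top of the GPS fix. A minimal sketch follows; the function name, the flat-earth conversion, and the parameter set are assumptions for illustration and not the patent's actual computation.

```python
import math

def dead_reckon(gps_pos, heading_deg, speed_mps, dt_s):
    """Advance the last GPS fix using heading and odometry.

    gps_pos     : (lat, lon) from the GPS reception unit (20a)
    heading_deg : travel direction from the direction detection unit (20b)
    speed_mps   : speed derived from axle pulses (20c)
    dt_s        : time elapsed since the GPS fix
    """
    dist = speed_mps * dt_s
    dlat = dist * math.cos(math.radians(heading_deg)) / 111_320.0
    dlon = dist * math.sin(math.radians(heading_deg)) / (
        111_320.0 * math.cos(math.radians(gps_pos[0])))
    return (gps_pos[0] + dlat, gps_pos[1] + dlon)

# e.g. 2 seconds at 10 m/s heading due east shifts the fix slightly in longitude
print(dead_reckon((34.69, 135.19), 90.0, 10.0, 2.0))
```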
The wireless communication unit 21 wirelessly communicates with a communication device mounted on another vehicle, and receives information indicating a situation around the own vehicle detected by the other vehicle. For example, road marking information detected by another vehicle traveling in front of the host vehicle is acquired.
The road marking information is marking information relating to road traffic guidance, and examples include guide signs, road surface markings, information boards, their contents, and the distance from these to the above-mentioned other vehicle when that vehicle is regarded as the own vehicle.
As for the guide sign, the position of the guide sign and the content of the guide sign become road marking information. The contents of the guide sign include a place name indicating the direction and an arrow sign indicating the direction.
As for the road marking, the position and the content of the road marking become road marking information. The contents of the road marking include an arrow marking indicating a traffic division according to the traveling direction, such as a straight lane or a left / right turn.
Regarding the information board, in addition to the position of the information board, the contents of the traffic information such as “30 km to Osaka, traffic jam, required time 40 minutes” displayed on the information board is also the road marking information.
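The kinds of road marking information listed above could be carried in a small record like the one below. The RoadMarking structure and its field names are hypothetical and only illustrate what such an information item might contain.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class RoadMarking:
    kind: str                                          # "guide_sign", "road_surface" or "info_board"
    position: tuple                                    # (lat, lon) of the sign / marking / board
    contents: List[str] = field(default_factory=list)  # e.g. ["Sannomiya: right", "Osaka: straight"]
    distance_m: Optional[float] = None                 # distance from the own vehicle, if known

# a guide sign reported by a vehicle ahead over vehicle-to-vehicle communication
ahead = RoadMarking(kind="guide_sign",
                    position=(34.694, 135.195),
                    contents=["Sannomiya: right", "Osaka: straight"],
                    distance_m=250.0)
```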
The surrounding information detection unit 22 is connected to the external sensor 23 and detects information indicating the situation around the own vehicle from the information around the own vehicle detected by the external sensor 23. The information indicating the situation around the host vehicle includes, for example, road marking information in front of the host vehicle.
The external sensor 23 is a sensor group that detects information around the host vehicle, and includes a camera 23a, an image processing unit 23b, a radar 23c, and a radar control unit 23d.
The camera 23a is a camera capable of photographing in the visible light region or the infrared light region. For example, the camera 23a is arranged in the vicinity of a room mirror on the vehicle interior side of the front window of the own vehicle, and photographs the outside of a predetermined range ahead of the own vehicle in the traveling direction through the front window. As the camera 23a, for example, a CCD (Charge Coupled Device) camera, a CMOS (Complementary Metal Oxide Semiconductor) camera, a stereo camera, or the like is used.
The image processing unit 23b processes the image captured by the camera 23a into image data that can be handled by the surrounding information detection unit 22. For example, image data consisting of a two-dimensional array of pixels is generated by performing image processing such as filtering and binarization. This image data is output to the surrounding information detection unit 22.
The radar 23c is a sensor that detects an object around the vehicle using a laser beam or a millimeter wave as a probe. For example, it is arranged near the nose portion of the vehicle body of the own vehicle or the front window on the vehicle interior side. Further, the radar 23c transmits a laser beam or millimeter wave transmission signal in the detection target direction under the control of the radar control unit 23d, and receives a reflection signal obtained by reflecting the transmission signal by an external object. The detection target direction includes the traveling direction of the own vehicle.
The radar control unit 23d generates a beat signal obtained by mixing the reflected signal and the transmission signal received by the radar 23c, and outputs the beat signal to the ambient information detection unit 22.
The radar control unit 23d controls the operation of the radar 23c in accordance with a control command input from the surrounding information detection unit 22.
The surrounding information detection unit 22 determines whether or not the image data obtained from the external sensor 23 includes a predetermined detection target image. The detection object is, for example, road marking information such as a guide sign, road marking, and information board in front of the host vehicle.
When it is determined that an image of the detection target is included in the image data, the surrounding information detection unit 22 calculates the distance between a reference position in the entire image indicated by the image data and the detection target (hereinafter referred to as the first distance). The reference position is, for example, the center position in the horizontal direction of the entire image indicated by the image data.
In addition, the ambient information detection unit 22 calculates a distance between the detection target and the host vehicle (hereinafter referred to as a second distance) based on the beat signal input from the external sensor 23.
When the camera 23a is a stereo camera, the second distance can be calculated using the parallax with respect to the detection target in the stereo camera.
The surrounding information detection unit 22 calculates the relative position of the detection target object with respect to the position of the own vehicle in the horizontal direction based on the first distance and the second distance. The relative position of the detection target is, for example, a relative position obtained from a difference in latitude and longitude coordinates. Thereafter, the surrounding information detection unit 22 calculates the current position of the detection object based on the relative position of the detection object and the current position of the host vehicle.
When the detection object is a guide sign, a road surface marking, or an information board, the surrounding information detection unit 22 outputs the distance between these marking objects and the own vehicle to the later-described marking information acquisition unit 30 as road marking information.
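Converting the first distance (the lateral offset in the image) and the second distance (the range from the radar beat signal or stereo parallax) into an absolute position for the detected sign could be sketched as below. The frame conventions, the flat-earth conversion, and the function name are assumptions made for illustration.

```python
import math

def object_position(vehicle_pos, vehicle_heading_deg, lateral_offset_m, range_m):
    """Estimate the absolute (lat, lon) of a detected sign.

    lateral_offset_m : offset of the object from the image centre line
                       (the 'first distance'), converted to metres
    range_m          : distance to the object from the radar or stereo
                       parallax (the 'second distance')
    """
    # relative position in the vehicle frame (x forward, y to the right)
    forward = math.sqrt(max(range_m ** 2 - lateral_offset_m ** 2, 0.0))
    hdg = math.radians(vehicle_heading_deg)
    # rotate into north/east offsets, then convert to degrees of latitude/longitude
    d_north = forward * math.cos(hdg) - lateral_offset_m * math.sin(hdg)
    d_east = forward * math.sin(hdg) + lateral_offset_m * math.cos(hdg)
    lat = vehicle_pos[0] + d_north / 111_320.0
    lon = vehicle_pos[1] + d_east / (111_320.0 * math.cos(math.radians(vehicle_pos[0])))
    return (lat, lon)
```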
The notification unit 4 is a generic name for a configuration in which the navigation device 1 notifies the user of information, and includes a navigation display unit 4a, a HUD display unit 4b, a display control unit 4c, an audio output unit 4d, and an audio control unit 4e. The navigation display unit 4a is a display on which map display and route guidance display are performed, and is arranged, for example, in the front center of the vehicle interior. The HUD display unit 4b projects and displays the display information on the front window or the combiner.
However, since in the present invention the road marking information related to the route guidance is displayed so as to appear superimposed on the corresponding real road marking information in the driver's forward field of view, the front window is the desirable surface onto which the information is projected.
The display control unit 4c causes the navigation display unit 4a to display the map image and the guidance image based on the image data including the map image and the guidance image input from the control unit 3. Further, the display control unit 4c causes the HUD display unit 4b to display the display information input from the control unit 3.
The HUD device 4A embodies the display device according to the present invention, and includes a HUD display unit 4b and a display control unit 4c.
The voice output unit 4d is an output unit that outputs a guidance voice, a warning sound, and the like, and is realized by a speaker mounted on the vehicle.
Also, the voice control unit 4e causes the voice output unit 4d to output guidance voice, warning sound, and the like based on voice data such as guidance voice and warning sound input from the control unit 3.
The audio output device 4B embodies the audio output device of the present invention, and includes the audio output unit 4d and the audio control unit 4e described above. That is, in the audio output device 4B, the audio control unit 4e controlled by the control unit 3 outputs audio information to the audio output unit 4d to output audio.
The navigation device 1 controls the notification unit 4 to notify the driver of the own vehicle of information for assisting driving.
The input unit 5 is a component that receives information input from the user, and is realized by, for example, a push button device or a touch panel. The touch panel may be integrated with the navigation display unit 4a. A destination to which route guidance should be provided from the current position is input using the input unit 5. For example, when the user designates a point on the map displayed on the navigation display unit 4a, the input unit 5 accepts this point as the destination, and when an address or a telephone number is input, the input unit 5 accepts this address or telephone number as destination information for specifying the destination.
The map data storage unit 6 is a storage unit that stores map data, and is realized as a storage device such as an HDD (Hard Disk Drive) or a RAM (Random Access Memory). Note that the map data storage unit 6 may store map data acquired from the outside of the navigation device 1.
For example, the map data storage unit 6 stores map data downloaded from an external device via a network. The map data storage unit 6 may also store map data read from a recording medium such as a DVD-ROM (Digital Versatile Disk-Read Only Memory) or a BD-ROM (Blu-Ray (registered trademark) Disc-ROM).
The route calculation unit 7 calculates a route from the departure point to the destination based on the departure point, the destination, and the map data. For example, the current position of the host vehicle detected by the current position detection unit 20 is set as the departure place. The destination may be a point received by the input unit 5 from the user, or may be a point registered in advance as a destination candidate.
Examples of routes calculated by the route calculation unit 7 include a route with a short arrival time (time priority route), a route with a short travel distance (distance priority route), a route that uses less fuel (fuel priority route), a route traveling on toll roads as much as possible (toll road priority route), a route traveling on general roads as much as possible (general road priority route), and a route with a good balance of time, distance, and cost (standard route).
The route guidance unit 8 provides route guidance for the route selected by the user (hereinafter referred to as the planned travel route) among the routes calculated by the route calculation unit 7. For example, the route guidance unit 8 performs route guidance from the current position to the destination along the planned travel route by controlling the notification unit 4 to notify the driver of route guidance information.
Each function of the display control device according to Embodiment 1 is realized by the functions related to display control in the control unit 3.
FIG. 2 is a block diagram illustrating a functional configuration of the display control apparatus according to the first embodiment, and illustrates each function of the control unit 3 that is a function of the display control apparatus according to the first embodiment.
As shown in FIG. 2, the control unit 3 includes a marking information acquisition unit 30, a timing determination unit 31, a display information generation unit 32, and an output control unit 33 as functions for controlling the display of the HUD device 4A.
The marking information acquisition unit 30 acquires road marking information related to route guidance from the road marking information detected in front of the host vehicle. For example, the marking information acquisition unit 30 classifies the road marking information detected in front of the vehicle by the surrounding information detection unit 22 into road marking information related to the route guidance in the route guidance unit 8 and road marking information not related to the route guidance, and acquires the road marking information related to the route guidance. As a classification method, for example, a method is adopted in which information related to the route guidance is identified by comparing the guide signs, road surface markings, information boards, and their contents in the road marking information with the route guidance information of the planned travel route.
For example, the sign information acquisition unit 30 analyzes the image data obtained by the image processing of the image processing unit 23b via the surrounding information detection unit 22, and specifies the content of the road sign information. Here, when the road marking information is a guide sign, the place name described in the guide sign, the contents of the sign such as an arrow sign, and the position in the guide sign where the contents of the sign are described are specified.
When the road marking information is a road surface marking, the marking information acquisition unit 30 analyzes the road surface image data in front of the own vehicle acquired from the surrounding information detection unit 22 to identify the content of the road surface marking, and calculates the angle formed between the driver's line of sight and the road surface.
When the route guidance at the guidance point is a left turn, the marking information acquisition unit 30 treats the road surface marking indicating a left turn as road marking information related to the route guidance, and classifies the road surface markings for going straight and turning right as road marking information not related to the route guidance. In addition, when the route guidance at the guidance point is toward "Sannomiya", and "Sannomiya" and "Osaka" have been identified as the place names indicating destinations on the guide sign, the marking information acquisition unit 30 treats "Sannomiya" as road marking information related to the route guidance and classifies "Osaka" as road marking information not related to the route guidance.
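The classification just described, comparing each detected sign item with the route guidance at the next guidance point, can be sketched as follows. The guidance dictionary, the matching rule, and the function name are illustrative assumptions; the patent only states that such a comparison is made, not how.

```python
def classify_sign_contents(sign_items, guidance):
    """Split sign items into those related to the route guidance and those that are not.

    sign_items : e.g. [("Sannomiya", "right"), ("Osaka", "straight")]
    guidance   : e.g. {"destination": "Sannomiya", "turn": "right"}
    """
    related, unrelated = [], []
    for name, direction in sign_items:
        if name == guidance.get("destination") or direction == guidance.get("turn"):
            related.append((name, direction))
        else:
            unrelated.append((name, direction))
    return related, unrelated

related, unrelated = classify_sign_contents(
    [("Sannomiya", "right"), ("Osaka", "straight")],
    {"destination": "Sannomiya", "turn": "right"})
# related   -> [("Sannomiya", "right")]
# unrelated -> [("Osaka", "straight")]
```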
The timing determination unit 31 determines whether or not it is the timing at which the route guidance unit 8 guides the planned travel route by voice. On the planned travel route, guidance points where voice guidance is performed are determined in advance. A guidance point is a point where the traveling direction of the vehicle on the planned travel route needs to be decided, such as an intersection or an entrance/exit of an expressway.
For example, the timing determination unit 31 acquires the route guidance information of the planned travel route used by the route guidance unit 8 for route guidance and the current position of the host vehicle, and determines whether or not the host vehicle has come within a predetermined range of the guidance point. When the host vehicle comes within this range, the timing determination unit 31 determines that it is the timing of voice guidance.
Note that the predetermined range is a range defined in terms of time or distance between the guidance point and the vehicle position, within which the output control unit 33 can cause the HUD device 4A to display the display information related to the voice guidance during the period when the voice guidance is being output.
That is, a timing at which the period during which the voice guidance is output overlaps the period during which the corresponding display information is displayed is determined to be the timing for performing the voice guidance.
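One way to read this timing rule: start the guidance when the remaining travel time to the guidance point is long enough for the whole voice message, but not much longer, so that the voice output period and the display period overlap. The sketch below is an assumption about how that could be checked; none of these names or thresholds come from the patent.

```python
def is_guidance_timing(dist_to_guide_point_m, speed_mps,
                       voice_duration_s, max_lead_s=10.0):
    """Decide whether to start the voice guidance and the synchronized HUD display."""
    if speed_mps <= 0.0:
        return False
    time_to_point_s = dist_to_guide_point_m / speed_mps
    # the whole voice message should finish before the guidance point is reached ...
    fits_before_point = time_to_point_s >= voice_duration_s
    # ... but guidance should not start much earlier than necessary
    close_enough = time_to_point_s <= voice_duration_s + max_lead_s
    return fits_before_point and close_enough

# e.g. 300 m ahead at 15 m/s with a 6 s message: 20 s to go, still too early
print(is_guidance_timing(300.0, 15.0, 6.0))   # False
print(is_guidance_timing(150.0, 15.0, 6.0))   # True (10 s to go)
```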
The display information generation unit 32 generates display information for highlighting the road marking information related to the route guidance so that the road marking information appears to overlap the corresponding real road marking information in the forward view.
For example, when the road marking information related to the route guidance is the destination and direction information on a guide sign, image data in which the characters of the place name indicating the destination and the arrow indicating the direction are expressed in a highlight color is generated as display information to be displayed so as to overlap the actual guide sign on which that destination and direction are written.
In addition, when the road marking information related to the route guidance is the traffic information "30 km to Osaka, traffic jam, required time 40 minutes" displayed on an information board, image data in which the characters indicating this traffic information are expressed in a highlight color is generated as display information to be displayed so as to overlap the display portion of the information board.
Furthermore, when the road marking information related to the route guidance is a right- or left-turn arrow painted on the road surface, image data is generated in which that arrow is rendered in a highlight color, as display information to be shown superimposed on the road surface marking. In this case, the display information generation unit 32 deforms the shape of the arrow, based on the angle between the driver's line of sight and the road surface, so that the arrow appears to the driver to be pasted onto the actual road surface marking. The displayed road marking thus takes a shape close to the actual marking as seen by the driver.
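One simple way to picture this deformation is to compress the arrow along its depth axis according to the viewing angle, so that the drawn arrow foreshortens the way a real painted arrow does. The sketch below is only one possible realisation under that assumption; the patent does not specify the transform, and the coordinate convention is illustrative.

```python
import math

def foreshorten_arrow(outline, sight_angle_deg):
    """Compress the depth axis of a road-surface arrow so that, drawn in the
    HUD, it appears to lie flat on the road rather than standing upright.

    `outline` is a list of (x, depth) points of the arrow in road-plane
    coordinates; `sight_angle_deg` is the angle between the driver's line of
    sight and the road surface (a shallow angle gives strong compression).
    """
    k = math.sin(math.radians(sight_angle_deg))
    return [(x, depth * k) for (x, depth) in outline]
```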
In addition, based on the content of the road marking information sequentially input from the sign information acquisition unit 30, the display information generation unit 32 generates, as display information, image data representing the road marking with a size and shape that correspond to the distance between the position of the real road marking and the host vehicle.
The display information generated by the display information generation unit 32 is shown only for the relatively short period during which the corresponding voice guidance is output; even within this period, however, the size and shape of the drawn marking image may cease to match and may drift away from the actual road marking.
Generating image data whose size and shape follow the distance between the real marking and the host vehicle therefore reduces this deviation from the actual road marking.
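A simple way to keep the drawn marking aligned with the real one while the vehicle approaches is to scale its apparent size inversely with distance, as sketched below; the reference distance is an assumption for illustration, not a value from the patent.

```python
def apparent_scale(distance_to_marking_m, reference_distance_m=50.0):
    """Scale factor for the drawn marking: nominal size at the reference
    distance, growing as the real marking comes closer, since apparent size
    is roughly inversely proportional to distance."""
    return reference_distance_m / max(distance_to_marking_m, 1.0)
```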
The output control unit 33 outputs the display information to the HUD device 4A in synchronization with the voice guidance timing determined by the timing determination unit 31. The HUD device 4A then projects the display information onto the windshield or combiner so that it is highlighted superimposed on the corresponding real road markings in the forward view. That is, when the voice guidance timing arrives, the output control unit 33 controls the display control unit 4c so as to output the display information received from the display information generation unit 32 to the HUD device 4A.
In the HUD device 4A, by adjusting the distance between the display source and the lens that projects the display light from the display source onto the windshield, the virtual image can be formed at an arbitrary distance determined by the relationship with the focal length of the lens. The output control unit 33 controls the display control unit 4c so as to adjust the distance between the lens of the HUD device 4A and the display source.
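Under the textbook thin-lens model, the source-to-lens distance needed to place the virtual image at a desired distance can be computed as below. This is offered only to illustrate the adjustment just described; the focal length and image distance used in the comment are assumed example values, not parameters prescribed by the patent.

```python
def source_distance_for_virtual_image(focal_length_m, image_distance_m):
    """Thin-lens relation for a virtual image: with the display source placed
    inside the focal length at d = f*D / (f + D), the virtual image appears
    at distance D on the far side of the windshield."""
    f, d_img = focal_length_m, image_distance_m
    return f * d_img / (f + d_img)

# e.g. a lens with f = 0.3 m and a virtual image 10 m ahead of the driver
# gives a source-to-lens distance of about 0.29 m.
```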
By adjusting the distance between the lens of the HUD device 4A and the display source, when the road marking information related to the route guidance is, for example, the direction name and direction on a guide sign, the display information in which the place name and the direction arrow are highlighted is shown as if pasted onto the part of the real guide sign where that direction is written.
When the road marking information related to the route guidance is a road surface marking, the display information in which the turn arrow is highlighted is shown superimposed as if pasted onto the corresponding actual road surface marking.
The output control unit 33 also changes the display position of the display information as the host vehicle travels, so that the display on the HUD device 4A follows the movement of the host vehicle.
FIG. 3 is a block diagram showing the hardware configuration of the display control device according to the first embodiment. FIG. 4 is a flowchart showing an outline of the operation of the display control device according to the first embodiment.
In FIG. 3(a), the display 101 is the HUD device 4A shown in FIG. 1, whose display is controlled by the display control device according to the first embodiment. The speaker 102 is the audio output unit 4d shown in FIG. 1, for example a speaker mounted on the vehicle.
As described above, the functions related to display control in the control unit 3 correspond to the functions of the display control device according to the first embodiment. The sign information acquisition unit 30, the timing determination unit 31, the display information generation unit 32, and the output control unit 33 in the control unit 3 are realized by a processing circuit 100.
That is, the display control device includes the processing circuit 100 with which the sign information acquisition unit 30 executes the processing of step ST1 shown in FIG. 4, the timing determination unit 31 executes the processing of step ST2, the display information generation unit 32 executes the processing of step ST3, and the output control unit 33 executes the processing of step ST4. The processing circuit 100 is the component that realizes the control unit 3, and may be dedicated hardware or a CPU (Central Processing Unit) that executes a program stored in a memory.
When the processing circuit 100 is dedicated hardware, the processing circuit 100 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these. The functions of the sign information acquisition unit 30, the timing determination unit 31, the display information generation unit 32, and the output control unit 33 may each be realized by a separate processing circuit, or the functions of these units may be realized collectively by a single processing circuit.
As shown in FIG. 3(b), when the processing circuit is the CPU 103, the functions of the sign information acquisition unit 30, the timing determination unit 31, the display information generation unit 32, and the output control unit 33 are realized by software, firmware, or a combination of software and firmware. The software and firmware are described as programs and stored in the memory 104. The CPU 103 realizes the function of each unit by reading and executing the programs stored in the memory 104. That is, the display control device includes the memory 104 for storing programs which, when executed by the CPU 103, result in the processing of steps ST1 to ST4 shown in FIG. 4 being performed.
These programs cause a computer to execute the procedures or methods of the sign information acquisition unit 30, the timing determination unit 31, the display information generation unit 32, and the output control unit 33. Here, the memory corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM, a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (Electrically EPROM), a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, or a DVD.
The functions of the sign information acquisition unit 30, the timing determination unit 31, the display information generation unit 32, and the output control unit 33 may also be realized partly by dedicated hardware and partly by software or firmware.
For example, the function of the sign information acquisition unit 30 can be realized by the processing circuit 100 as dedicated hardware, while the functions of the timing determination unit 31, the display information generation unit 32, and the output control unit 33 can be realized by the CPU 103 reading and executing the programs stored in the memory 104.
In this way, the processing circuit can realize the above functions by hardware, software, firmware, or a combination of these.
Next, the operation will be described.
FIG. 5 is a flowchart showing the detailed operation of the display control device according to the first embodiment. In FIG. 5, the processing from step ST1a to step ST6a is an example of the detailed processing of step ST1 shown in FIG. 4. The processing of step ST7a is an example of the detailed processing of step ST2 shown in FIG. 4, and the processing of step ST8a is an example of the detailed processing of step ST3 shown in FIG. 4. The processing from step ST9a to step ST10a is an example of the detailed processing of step ST4 shown in FIG. 4.
Accordingly, since the operation of each step in FIG. 4 becomes clear by explaining each step in FIG. 5, the following description is based on FIG. 5.
First, the sign information acquisition unit 30 determines whether a planned travel route of the host vehicle has been calculated (step ST1a). For example, it inquires of the route guidance unit 8 whether route guidance along a planned travel route is in progress; if route guidance is in progress, it determines that a planned travel route has been calculated, and if route guidance is not in progress, it determines that no planned travel route has been calculated. When it determines that no planned travel route has been calculated (step ST1a; NO), the processing proceeds to step ST10a.
Next, when it determines that a planned travel route has been calculated (step ST1a; YES), the sign information acquisition unit 30 determines whether road marking information ahead of the host vehicle has been detected by the surrounding information detection unit 22 of the information acquisition unit 2 (step ST2a). If no road marking information ahead of the host vehicle is detected (step ST2a; NO), the processing of step ST2a is repeated. When road marking information ahead of the host vehicle is detected (step ST2a; YES), the sign information acquisition unit 30 acquires from the surrounding information detection unit 22 the distance between the host vehicle and the signage objects of the road marking information, such as guide signs, road surface markings, and display boards (step ST3a).
Next, the sign information acquisition unit 30 determines the content of the road marking information (step ST4a).
For example, the sign information acquisition unit 30 analyzes the image data obtained by the image processing of the image processing unit 23b, received via the surrounding information detection unit 22, and determines the content of the road marking information.
When the road marking information is a guide sign, the contents written on the sign, such as place names and arrow markings, and the positions within the guide sign at which those contents are written, are determined.
When the road marking information is a road surface marking, the arrow markings indicating the traffic lanes by traveling direction, such as straight-ahead or right/left-turn lanes, are determined.
When the road marking information is the content shown on an information board, the content of the traffic information displayed on the board, such as "30 km to Osaka, congestion, about 40 minutes", is determined.
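The result of this content determination could be represented, for example, by a small record per detected object. The structure below is an illustrative assumption, not the patent's data format; `bbox` in the comments stands for an assumed position of the item within the image or sign.

```python
from dataclasses import dataclass, field

@dataclass
class MarkingContent:
    kind: str                   # "guide_sign", "road_surface", or "info_board"
    items: list = field(default_factory=list)
    # guide sign:   [("Sannomiya", "right", bbox), ("Osaka", "straight", bbox)]
    # road surface: [("right_turn_arrow", bbox)]
    # info board:   [("30 km to Osaka, congestion, about 40 minutes", bbox)]
```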
Subsequently, the sign information acquisition unit 30 compares the content of the road marking information determined in step ST4a with the route guidance information of the planned travel route (step ST5a), and determines whether there is any road marking information related to the route guidance (step ST6a).
That is, the sign information acquisition unit 30 classifies the road marking information detected ahead of the host vehicle by the surrounding information detection unit 22 into road marking information related to the route guidance by the route guidance unit 8 and road marking information not related to the route guidance.
If the road marking information detected ahead of the host vehicle by the surrounding information detection unit 22 contains no road marking information related to the route guidance (step ST6a; NO), the processing returns to step ST2a.
On the other hand, when there is road marking information related to the route guidance (step ST6a; YES), the sign information acquisition unit 30 acquires, from among the road marking information detected ahead of the host vehicle by the surrounding information detection unit 22, the road marking information related to the route guidance and outputs it to the display information generation unit 32.
Next, the timing determination unit 31 determines whether it is the timing at which the route guidance unit 8 guides the planned travel route by voice (step ST7a).
As described above, for example, based on the route guidance information of the planned travel route and the current position of the host vehicle, it is determined whether the host vehicle has come within a predetermined range of the guidance point; when it has entered this range, it is determined to be the timing for voice guidance.
When it is determined that it is not the timing for voice guidance (step ST7a; NO), the processing of step ST7a is repeated until that timing arrives.
When the timing determination unit 31 determines that it is the timing for voice guidance (step ST7a; YES), the display information generation unit 32 generates display information expressing the road marking information related to this voice guidance in a highlight color and in a form matching the real signage object corresponding to the voice guidance (step ST8a).
Next, the output control unit 33 outputs the voice guidance to the audio output device 4B and, in synchronization with this timing, outputs the display information generated by the display information generation unit 32 to the HUD device 4A. As a result, the display information is highlighted so that, through the windshield, it appears superimposed on the corresponding real road markings in the forward view (step ST9a).
The output control unit 33 then determines whether an operation to stop the display by the HUD device 4A has been accepted by the input unit 5 (step ST10a). When this operation has been accepted by the input unit 5 (step ST10a; YES), the processing of FIG. 5 ends.
When the operation has not been accepted by the input unit 5 (step ST10a; NO), the processing returns to step ST1a.
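Putting the steps of FIG. 5 together, the control flow can be summarised by a loop of the following shape. This is only a sketch under assumed interfaces; none of the collaborator names (`nav`, `detector`, `generator`, `hud`) or their methods are taken from the patent.

```python
def run_display_control(nav, detector, generator, hud, stop_event):
    """Rough shape of the loop of FIG. 5 (steps ST1a to ST10a); the
    collaborators are assumed interfaces, not classes named in the patent."""
    while not stop_event.is_set():                          # ST10a
        if nav.planned_route() is None:                     # ST1a
            continue
        markings = detector.forward_markings()              # ST2a to ST4a
        related = [m for m in markings if nav.is_related(m)]  # ST5a, ST6a
        if not related:
            continue
        if not nav.at_voice_guidance_timing():              # ST7a
            continue
        images = generator.highlight(related)               # ST8a
        hud.show_in_sync_with_voice(images)                 # ST9a
```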
Next, the display of the display information by the HUD device 4A will be described.
FIG. 6 is a diagram showing the scenery ahead of the vehicle seen through the windshield before voice guidance, and shows the state ahead of the host vehicle as it approaches an intersection at which voice guidance is given.
A guide sign 9 is installed near the intersection shown in FIG. 6, and a right-turn arrow 10 and a straight-ahead arrow 11 are marked on the road surface on which the host vehicle is traveling. As shown in FIG. 6, while it is not yet the timing for voice guidance at this intersection, no display information related to the route guidance is shown.
FIG. 7 is a diagram showing the scenery ahead of the vehicle seen through the windshield at the timing of voice guidance, in which the host vehicle has approached the intersection further from the state of FIG. 6 and the timing for voice guidance has arrived. In FIG. 7, the road marking information related to the route guidance is the information on the real guide sign 9 indicating the direction "Sannomiya" and the rightward direction, and the road surface marking of the right-turn arrow 10.
At this time, the display information generation unit 32 generates, as display information 9a, image data in which the characters "Sannomiya" indicating this direction and the arrow indicating the rightward direction are given a highlight color.
The display information generation unit 32 also generates, as display information 10a, image data in which a shape matching the right-turn arrow 10 is given a highlight color.
The output control unit 33 outputs the display information generated by the display information generation unit 32 to the HUD device 4A in synchronization with the timing at which the audio output device 4B outputs the voice guidance "Turn right ahead. To the right, toward Sannomiya."
In this way, the HUD device 4A displays the display information 9a, in which the direction "Sannomiya" and the rightward arrow are expressed in a highlight color, as if it were pasted onto the position of the real guide sign 9 where the corresponding direction is written.
Similarly, the HUD device 4A displays the display information 10a, in which the right-turn arrow is expressed in a highlight color, superimposed as if pasted onto the corresponding real right-turn arrow 10.
As described above, the display control device according to the first embodiment highlights the display information 9a, 10a of the road marking information related to the route guidance, in synchronization with the voice guidance timing, so that it appears superimposed on the corresponding road markings in the driver's forward view.
In particular, the display information generation unit 32 generates the display information 9a, 10a for highlighting the road marking information related to the content of the voice guidance.
This realizes an easy-to-read route guidance display that is emphasized in a natural manner. Furthermore, since the content of the information that the driver should recognize aurally matches the content recognized visually, the content of the display information becomes easier to recognize.
Embodiment 2.
FIG. 8 is a block diagram showing the functional configuration of the display control device according to the second embodiment, and shows the functions of the control unit 3A that constitute the display control device according to the second embodiment.
As shown in FIG. 8, the control unit 3A includes, as functions for controlling the display of the HUD device 4A, a sign information acquisition unit 30a, the timing determination unit 31, a display information generation unit 32a, and an output control unit 33a. In FIG. 8, the same reference numerals are given to the same components as in FIG. 2, and their description is omitted.
The sign information acquisition unit 30a acquires the content of the route guidance by the route guidance unit 8 that is not contained in the road marking information related to the route guidance.
Specifically, the road marking information detected ahead of the host vehicle by the surrounding information detection unit 22 is classified into road marking information related to the route guidance by the route guidance unit 8 and road marking information not related to the route guidance. The road marking information related to the route guidance is then compared with the content of the route guidance, and the content of the route guidance that is not contained in this road marking information is extracted and acquired as information to be displayed additionally.
For example, when the road marking information related to the route guidance is the direction "Sannomiya" and the rightward direction on the guide sign, and the content of the route guidance includes the remaining distance from the current position to the destination in the "Sannomiya" direction, this remaining distance is extracted as information to be displayed additionally.
The route guidance information compared with the road marking information related to the route guidance also includes traffic information related to the planned travel route.
For example, when the road to the right toward "Sannomiya" is congested, the sign information acquisition unit 30a extracts, as information to be displayed additionally, character information indicating the congestion and information indicating a detour that avoids the congestion.
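A minimal sketch of this extraction step: compare what the voice guidance will say with what is already written on the sign, and keep the difference as the items to display additionally. The dictionary keys and example values are assumptions made for illustration.

```python
def extract_additional_items(guidance_content, sign_content):
    """Return the guidance contents that the detected sign does not show.

    `guidance_content` might be {"direction": "Sannomiya", "maneuver": "right",
    "remaining": "5 km to go", "traffic": "congested"}; `sign_content` holds
    the items read from the sign, e.g. {"direction": "Sannomiya",
    "maneuver": "right"}.  The difference becomes the additional display
    information drawn in the sign's empty space or next to it.
    """
    return {k: v for k, v in guidance_content.items() if k not in sign_content}
```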
The display information generation unit 32a generates additional display information for displaying the content of the route guidance that is not contained in the road marking information related to the route guidance. Consider, as an example, the case where the road marking information related to the route guidance is the information indicating the direction "Sannomiya" and the rightward direction on the guide sign, and the content of the route guidance not contained in this information is the remaining distance from the current position to the destination in the "Sannomiya" direction (5 km to go). In this case, the display information generation unit 32a generates, as additional display information, image data in which the character information "5 km to go" indicating the remaining distance is expressed in a highlight color.
The output control unit 33a outputs the additional display information to the HUD device 4A. The HUD device 4A thereby displays the additional display information as if it were pasted into an empty space of the corresponding real signage object in the forward view, or as if it were pasted at a position adjacent to the corresponding real signage object in the forward view.
As for the signage object corresponding to the additional display information, when the additional display information is, for example, the remaining distance toward "Sannomiya", the guide sign on which the "Sannomiya" direction is written is determined to be the signage object corresponding to this additional display information.
Next, the operation will be described.
FIG. 9 is a flowchart showing the detailed operation of the display control device according to the second embodiment. In FIG. 9, the processing from step ST1a to step ST6a, steps ST7a and ST8a, and step ST10a is the same as the processing from step ST1a to step ST6a, steps ST7a and ST8a, and step ST10a shown in FIG. 5, and its description is omitted.
The processing from step ST1a to step ST6a-3 in FIG. 9 is an example of the detailed processing of step ST1 shown in FIG. 4.
The processing of step ST7a is an example of the detailed processing of step ST2 shown in FIG. 4, and the processing from step ST8a to step ST8a-1 is an example of the detailed processing of step ST3 shown in FIG. 4. The processing from step ST9b to step ST10a is an example of the detailed processing of step ST4 shown in FIG. 4.
Accordingly, since the operation of each step in FIG. 4 becomes clear by explaining each step in FIG. 9, the following description is based on FIG. 9.
The sign information acquisition unit 30a compares the content of the road marking information related to the route guidance with the route guidance information of the planned travel route (step ST6a-1), and determines whether there is any voice guidance content not contained in the road marking information related to the route guidance (step ST6a-2).
That is, the sign information acquisition unit 30a compares the road marking information related to the route guidance with the content of the route guidance indicated by the route guidance information, and identifies the content of the route guidance not contained in this road marking information.
If there is no voice guidance content that is not contained in the road marking information related to the route guidance (step ST6a-2; NO), the processing proceeds to step ST7a.
When there is voice guidance content not contained in the road marking information related to the route guidance (step ST6a-2; YES), the sign information acquisition unit 30a extracts this voice guidance content as information to be displayed additionally (step ST6a-3).
For example, when the road marking information related to the route guidance is the direction "Sannomiya" and the rightward direction on the guide sign, items of the route guidance information such as the remaining distance to the destination, the estimated arrival time, and congestion information are extracted as information to be displayed additionally.
The display information generation unit 32a generates display information expressing the information to be displayed additionally, input from the sign information acquisition unit 30a, in a highlight color and in a form matching the real signage object corresponding to this information (step ST8a-1).
Next, the output control unit 33a outputs the display information generated by the display information generation unit 32 and the additional display information to the HUD device 4A in synchronization with the timing at which the voice guidance is output.
The display information generated by the display information generation unit 32 and the additional display information are highlighted so that they appear superimposed on the corresponding real road markings in the driver's forward view (step ST9b).
Next, the display of the display information by the HUD device 4A will be described.
FIG. 10 is a diagram showing the scenery ahead of the vehicle seen through the windshield when additional display information 9b is displayed in an empty space of the real guide sign 9, in the case where the host vehicle approaches the intersection and the timing for voice guidance has arrived.
FIG. 11 is a diagram showing the scenery ahead of the vehicle seen through the windshield when additional display information 9c is displayed so as to line up with the guide sign 9.
Here, the road marking information related to the route guidance is the information indicating the direction "Sannomiya" and the rightward direction written on the guide sign, and the content of the route guidance not contained in this information is the remaining distance from the current position to the destination in the "Sannomiya" direction (5 km to go).
The display information generation unit 32a generates, as display information 9a, image data in which the characters "Sannomiya" indicating this direction and the arrow indicating the rightward direction on the real guide sign 9 are given a highlight color. The display information generation unit 32a also generates, as display information 10a, image data in which a shape matching the real right-turn arrow 10 is given a highlight color. In addition, the display information generation unit 32a generates additional display information 9b and 9c in which the character information "5 km to go" indicating the remaining distance is expressed in a highlight color.
The output control unit 33a outputs the display information 9a and 10a generated by the display information generation unit 32a to the HUD device 4A in synchronization with the timing at which the voice guidance "Turn right ahead. 5 km to go to the right, toward Sannomiya." is output. The HUD device 4A thereby displays the display information 9a as if it were pasted onto the position of the real guide sign 9 where the corresponding direction is written, and displays the display information 10a as if it were pasted onto the corresponding actual right-turn arrow 10.
Furthermore, in synchronization with the timing of this voice guidance, the output control unit 33a outputs to the HUD device 4A the additional display information 9b and 9c in which the character information "5 km to go" is expressed in a highlight color. As a result, the additional display information 9b is displayed in the empty space of the guide sign 9 as shown in FIG. 10, or the additional display information 9c is displayed so as to line up with the guide sign 9 as shown in FIG. 11.
FIG. 12 is a diagram showing the scenery ahead of the vehicle seen through the windshield when a detour around congestion is displayed in the empty space of the real guide sign 9 and an indication that the road is congested is displayed so as to line up with the guide sign 9. FIG. 12 shows the case where the road marking information related to the route guidance is the information indicating the direction "Sannomiya" and the rightward direction written on the guide sign, and the content of the route guidance not contained in this road marking information is the information that the road to the right toward "Sannomiya" is congested and information indicating a detour that avoids the congestion.
The display information generation unit 32a generates, as display information 9a, image data in which the characters "Sannomiya" indicating this direction on the real guide sign 9 are expressed in a highlight color.
The display information generation unit 32a also generates, as display information 10a, image data in which a shape matching the right-turn arrow 10 of the real road surface marking is given a highlight color.
Furthermore, the display information generation unit 32a generates, as additional display information 9d, image data in which an arrow indicating the detour from the rightward direction toward "Sannomiya" is expressed in a highlight color.
In addition, the display information generation unit 32a generates, as additional display information 9e, image data in which the character information "Detour" is expressed in a highlight color, and generates additional display information 9f in which the character information "Congested" is expressed in a highlight color.
The output control unit 33a outputs the display information 10a to the HUD device 4A in synchronization with the timing at which the audio output device 4B outputs voice guidance such as "The road to the right toward Sannomiya is congested. Turn right ahead, and then there is a detour."
As a result, for example, the display information 10a in which the right-turn arrow is expressed in a highlight color is displayed as if pasted onto the corresponding real right-turn arrow 10.
The additional display information 9d is displayed as if pasted onto the arrow shown on the corresponding real guide sign 9.
Furthermore, the additional display information 9e is displayed in the empty space of the corresponding real guide sign 9, and the additional display information 9f is displayed at a position adjacent to the guide sign 9.
As described above, in the display control device according to the second embodiment, the display information generation unit 32a generates the additional display information 9c to 9f for displaying the content of the route guidance that does not exist in the road marking information related to the route guidance. The output control unit 33a outputs the additional display information 9c to 9f to the HUD device 4A. By additionally displaying the content of the route guidance not contained in the road marking information related to the route guidance in this way, the content of the information that the driver should recognize aurally matches the content recognized visually, so the content of the additional display information becomes easier to recognize. Moreover, by displaying it additionally in the empty space of the guide sign 9 or at a position adjacent to the guide sign 9, an easy-to-read route guidance display that is emphasized in a natural manner can be realized.
Embodiment 3.
FIG. 13 is a block diagram showing the functional configuration of the display control device according to the third embodiment, and shows the functions of the control unit 3B that constitute the display control device according to the third embodiment.
As shown in FIG. 13, the control unit 3B includes, as functions for controlling the display of the HUD device 4A, a sign information acquisition unit 30b, the timing determination unit 31, a display information generation unit 32b, and an output control unit 33b. In FIG. 13, the same reference numerals are given to the same components as in FIG. 2, and their description is omitted.
The sign information acquisition unit 30b classifies the road marking information detected ahead of the host vehicle by the surrounding information detection unit 22 into road marking information related to the route guidance by the route guidance unit 8 and road marking information not related to the route guidance, and acquires both.
For example, when the road marking information related to the route guidance is the direction "Sannomiya" and the rightward direction on a guide sign, and the "Osaka" direction is also shown on this guide sign, the "Osaka" direction is acquired as road marking information not related to the route guidance.
The display information generation unit 32b generates display information for highlighting the road marking information related to the route guidance, and also generates mask display information for masking the road marking information in the forward view that is not related to the route guidance.
Consider, as an example, the case where the road marking information related to the route guidance is the direction "Sannomiya" and the rightward direction on the guide sign, and the content not related to this information is the "Osaka" direction and the straight-ahead direction.
In this case, the display information generation unit 32b generates, as display information, an image in which the character information indicating "Sannomiya" and the rightward arrow are expressed in a highlight color. The display information generation unit 32b also generates mask display information for masking the character information indicating "Osaka" and the straight-ahead arrow.
The mask display information is image information that renders the underlying content over which it is displayed semi-transparent or occludes it.
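A mask of this kind can be sketched as a semi-transparent patch drawn over the bounding box of each unrelated marking. The RGBA value, the `bbox` attribute, and the function name below are assumptions for illustration, not part of the patent.

```python
def make_mask_patches(unrelated_markings, alpha=180):
    """One semi-transparent grey patch per marking not related to the guidance.

    Each entry is (bounding_box, rgba); drawing the patch over the real text
    or arrow in the HUD dims it (alpha < 255) or hides it (alpha = 255).
    """
    grey = (128, 128, 128, alpha)
    return [(m.bbox, grey) for m in unrelated_markings]
```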
The output control unit 33b outputs the display information generated by the display information generation unit 32b and the mask display information to the HUD device 4A.
As a result, for example, when the road marking information related to the route guidance is the information indicating the direction "Sannomiya" and the rightward direction on the guide sign, the display information in which the character information indicating "Sannomiya" and the rightward arrow are expressed in a highlight color is displayed superimposed on "Sannomiya" and the rightward arrow on the real guide sign. Furthermore, when the road marking information not related to the route guidance is the information indicating the "Osaka" direction and the straight-ahead direction, the mask display information is displayed superimposed on the positions of the "Osaka" direction and the straight-ahead direction.
Next, the operation will be described.
FIG. 14 is a flowchart showing the detailed operation of the display control device according to the third embodiment. In FIG. 14, the processing from step ST1a to step ST8a and step ST10a is the same as the processing from step ST1a to step ST8a and step ST10a shown in FIG. 5, and its description is omitted.
The processing from step ST1a to step ST6a in FIG. 14 is an example of the detailed processing of step ST1 shown in FIG. 4.
The processing of step ST7a is an example of the detailed processing of step ST2 shown in FIG. 4, and the processing from step ST8a to step ST8b-1 is an example of the detailed processing of step ST3 shown in FIG. 4. The processing from step ST9c to step ST10a is an example of the detailed processing of step ST4 shown in FIG. 4.
Accordingly, since the operation of each step in FIG. 4 becomes clear by explaining each step in FIG. 14, the following description is based on FIG. 14.
In step ST8b-1, the display information generation unit 32b generates mask display information for masking the road marking information not related to the route guidance.
Next, the output control unit 33b outputs the display information generated by the display information generation unit 32 to the HUD device 4A in synchronization with the timing at which the voice guidance is output by the audio output device 4B. The HUD device 4A highlights the display information superimposed on the corresponding road markings in the driver's forward view, and also displays the mask display information (step ST9c).
Next, the display of the display information by the HUD device 4A will be described.
FIG. 15 is a diagram showing the scenery ahead of the vehicle seen through the windshield when information not related to the route guidance is masked, in the case where the host vehicle approaches the intersection and the timing for voice guidance has arrived. Here, the road marking information related to the route guidance is the information on the guide sign indicating the direction "Sannomiya" and the rightward direction, and the road marking information not related to the route guidance is the "Osaka" direction and the straight-ahead arrow.
The display information generation unit 32b generates, as display information 9a, an image in which the characters "Sannomiya" indicating this direction and the arrow indicating the rightward direction on the real guide sign 9 are given a highlight color.
The display information generation unit 32a also generates, as display information 10a, an image in which a shape matching the right-turn arrow 10 of the real road surface marking is given a highlight color.
Furthermore, the display information generation unit 32b generates mask display information 9g that covers the "Osaka" direction and its arrow on the real guide sign 9.
Since the road surface marking of the straight-ahead arrow 11 is also road marking information not related to the route guidance, the display information generation unit 32b generates mask display information 11a that covers the straight-ahead arrow 11 of the real road surface marking.
The output control unit 33b outputs the display information 9a and 10a to the HUD device 4A in synchronization with the timing at which the audio output device 4B outputs the voice guidance "Turn right ahead. To the right, toward Sannomiya."
The HUD device 4A thereby displays the display information 9a as if it were pasted onto the position of the real guide sign 9 where the corresponding direction is written, and displays the display information 10a as if it were pasted onto the corresponding actual right-turn arrow 10.
Furthermore, in synchronization with the timing of this voice guidance, the output control unit 33b outputs the mask display information 9g and 11a to the HUD device 4A.
As a result, the mask display information 9g is displayed superimposed on the position of the guide sign 9 where the "Osaka" direction and its arrow are written, and the mask display information 11a is displayed superimposed as if pasted onto the straight-ahead arrow 11.
As described above, in the display control device according to the third embodiment, the display information generation unit 32b generates the mask display information 9g and 11a for masking the road marking information in the forward view that is not related to the route guidance. The mask display information 9g and 11a is displayed by the HUD device 4A superimposed on the road marking information in the forward view that is not related to the route guidance.
Since the road marking information not related to the route guidance is masked in this way, the driver can more easily recognize the display information of the road marking information related to the route guidance.
Embodiment 4.
FIG. 16 is a block diagram showing the functional configuration of the display control device according to the fourth embodiment, and shows the functions of the control unit 3C that constitute the display control device according to the fourth embodiment.
As shown in FIG. 16, the control unit 3C includes, as functions for controlling the display of the HUD device 4A, the sign information acquisition unit 30, a timing determination unit 31a, the display information generation unit 32, and the output control unit 33. In FIG. 16, the same reference numerals are given to the same components as in FIG. 2, and their description is omitted.
The timing determination unit 31a determines that the timing at which the road marking information related to the route guidance and the vehicle come within a predetermined range of each other is the timing for voice guidance.
The predetermined range is a range, defined in terms of distance or time, within which the road marking information ahead of the host vehicle becomes visible to the driver; for example, a range of about 30 m from the host vehicle is conceivable.
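Assuming the range is given as a simple distance threshold, the check reduces to the following sketch; the 30 m figure is the example mentioned above, and the function name is an assumption.

```python
def marking_visible_timing(distance_to_marking_m, visibility_range_m=30.0):
    """True once the related marking is close enough for the driver to see it,
    which the fourth embodiment treats as the voice guidance timing."""
    return distance_to_marking_m <= visibility_range_m
```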
Next, the operation will be described.
FIG. 17 is a flowchart showing the detailed operation of the display control device according to the fourth embodiment.
In FIG. 17, the processing from step ST1a to step ST7a and from step ST8a to step ST10a is the same as the processing from step ST1a to step ST7a and from step ST8a to step ST10a shown in FIG. 5, and its description is omitted.
The processing from step ST1a to step ST6a in FIG. 17 is an example of the detailed processing of step ST1 shown in FIG. 4. The processing of step ST7b is an example of the detailed processing of step ST2 shown in FIG. 4, and the processing of step ST8a is an example of the detailed processing of step ST3 shown in FIG. 4. The processing from step ST9a to step ST10a is an example of the detailed processing of step ST4 shown in FIG. 4.
In step ST7b, the timing determination unit 31a compares the current position of the host vehicle input from the information acquisition unit 2 with the position of the road marking information ahead of the host vehicle that is related to the route guidance, and determines whether the distance between the host vehicle and the position of the road marking information has come within the predetermined range.
That is, it is determined whether the vehicle has reached a distance at which the road marking information can be recognized.
When it is determined that the vehicle has not yet reached the distance at which the road marking information can be recognized (step ST7b; NO), the processing of step ST7b is repeated until this distance is reached.
When it is determined that the vehicle has reached the distance at which the road marking information can be recognized (step ST7b; YES), the processing proceeds to step ST8a.
As described above, in the display control apparatus according to the fourth embodiment, the timing determination unit 31a determines that the timing at which the distance between the road marking information related to route guidance and the host vehicle comes within the predetermined range is the timing of the voice guidance. In this way, the voice guidance can be output and the display information corresponding to the route guidance can be displayed at the timing when the driver can actually see the road marking information.
Embodiment 5.
FIG. 18 is a block diagram illustrating the functional configuration of the display control apparatus according to the fifth embodiment, showing the functions of the control unit 3D that implements the display control apparatus according to the fifth embodiment.
As illustrated in FIG. 18, the control unit 3D includes a marking information acquisition unit 30c, a timing determination unit 31, a display information generation unit 32, an output control unit 33c, and a voice information generation unit 34 as functions for controlling the display of the HUD device 4A. In FIG. 18, the same components as those in FIG. 2 are denoted by the same reference numerals, and their description is omitted.
The marking information acquisition unit 30c operates in the same manner as in the first embodiment, and also acquires, from among the road marking information related to route guidance, road marking information that is not included in the content of the voice guidance.
For example, if the content of the voice guidance is "Right direction." and the road marking information related to the route guidance consists of the "Sannomiya" direction and the right direction indicated on a guide sign, the "Sannomiya" direction is extracted as road marking information not included in the content of the voice guidance.
The voice information generation unit 34 generates voice information for outputting, as voice, the road marking information not included in the content of the voice guidance among the road marking information related to route guidance, and adds it to the voice guidance.
The output control unit 33c causes the voice output device 4B to output the voice guidance to which the voice information has been added by the voice information generation unit 34, in synchronization with the timing of the voice guidance.
Next, the operation will be described.
FIG. 19 is a flowchart showing the detailed operation of the display control apparatus according to the fifth embodiment.
In FIG. 19, the processing from step ST1a to step ST6a, from step ST7a to step ST8a, and of step ST10a is the same as the processing from step ST1a to step ST6a, from step ST7a to step ST8a, and of step ST10a shown in FIG. 5, and its description is omitted.
Further, the processing from step ST1a to step ST6b-2 in FIG. 19 is an example of the detailed processing of step ST1 shown in FIG. 4. The processing of step ST7a is an example of the detailed processing of step ST2 shown in FIG. 4, and the processing of step ST8a is an example of the detailed processing of step ST3 shown in FIG. 4. The processing from step ST9d to step ST10a is an example of the detailed processing of step ST4 shown in FIG. 4.
Accordingly, since the operation of each step in FIG. 4 becomes clear by describing each step in FIG. 19, the following description is given with reference to FIG. 19.
The marking information acquisition unit 30c determines whether, among the road marking information related to route guidance, there is road marking information that is not included in the content of the voice guidance (step ST6b-1). That is, the marking information acquisition unit 30c compares the road marking information related to route guidance with the content of the voice guidance indicated by the route guidance information, and identifies road marking information that is not included in the content of the voice guidance.
If there is no road marking information that is absent from the content of the voice guidance (step ST6b-1; NO), the processing proceeds to step ST7a.
If there is road marking information that is absent from the content of the voice guidance (step ST6b-1; YES), the marking information acquisition unit 30c outputs, to the voice information generation unit 34, the road marking information related to route guidance that is not included in the content of the voice guidance.
The voice information generation unit 34 converts into voice the road marking information, input from the marking information acquisition unit 30c, that is not included in the content of the voice guidance, and adds it to the content of the voice guidance (step ST6b-2).
For example, if the content of the voice guidance is "Right direction." and the road marking information related to the route guidance consists of the "Sannomiya" direction and the right direction on the guide sign, the voice information "toward Sannomiya" is generated and added to the voice guidance, which then becomes "Toward Sannomiya, right direction."
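Steps ST6b-1 and ST6b-2 can be illustrated with the short sketch below. It is a hypothetical sketch only: representing the guidance content and the detected road marking items as plain strings, and the function names, are assumptions made for this example, and differences in wording (such as adding "toward") are ignored.

```python
def find_markings_missing_from_guidance(marking_items, guidance_text):
    """Step ST6b-1: among the road marking items related to route guidance,
    pick out those that do not appear in the voice guidance content."""
    text = guidance_text.lower()
    return [item for item in marking_items if item.lower() not in text]

def add_missing_markings_to_guidance(marking_items, guidance_text):
    """Step ST6b-2: turn the missing items into additional spoken words and
    place them before the original voice guidance content."""
    missing = find_markings_missing_from_guidance(marking_items, guidance_text)
    if not missing:
        return guidance_text
    return ", ".join(missing) + ", " + guidance_text

# Example corresponding to the description above: the guide sign shows the
# "Sannomiya" direction and the right direction, but the guidance only says
# "Right direction."
print(add_missing_markings_to_guidance(["Sannomiya", "right direction"],
                                       "Right direction."))
# -> "Sannomiya, Right direction."
```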
In step ST9d, the output control unit 33c causes the voice output device 4B to output the guidance voice to which the voice information has been added. In synchronization with this timing, the output control unit 33c outputs the display information generated by the display information generation unit 32 to the HUD device 4A.
As a result, the display information is highlighted so as to appear superimposed on the corresponding actual road marking information in the driver's forward view.
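The synchronized output of step ST9d might be organized as in the following sketch. The `speaker` and `hud` objects and their method names are hypothetical stand-ins for the voice output device 4B and the HUD device 4A; they do not describe an actual device API.

```python
class OutputControllerSketch:
    """Minimal stand-in for the output control unit 33c (step ST9d)."""

    def __init__(self, hud, speaker):
        self.hud = hud          # stand-in for the HUD device 4A
        self.speaker = speaker  # stand-in for the voice output device 4B

    def on_guidance_timing(self, augmented_guidance_text, display_info):
        # Output the guidance voice to which the voice information was added ...
        self.speaker.play(augmented_guidance_text)
        # ... and, in synchronization with that timing, output the display
        # information so it appears highlighted over the real road marking.
        self.hud.show(display_info)
```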
As described above, the display control apparatus according to the fifth embodiment includes the voice information generation unit 34, which generates voice information for outputting, as voice, the road marking information not included in the content of the voice guidance among the road marking information related to route guidance, and adds it to the voice guidance. The output control unit 33c causes the voice output device 4B to output the voice guidance to which the voice information has been added, in synchronization with the timing of the voice guidance.
By converting into voice and outputting the information, among the road marking information detected ahead of the host vehicle, that was not in the voice guidance, content that is not contained in the map database, such as when a guide sign has been newly added, can be supplemented by the voice guidance. This enables voice guidance that matches the actual road conditions.
Within the scope of the present invention, the embodiments may be freely combined, any component of each embodiment may be modified, and any component may be omitted in each embodiment.
The display control device according to the present invention can display the display information related to route guidance in a manner that is easy to see and at a timing at which the information content is easy to recognize, and is therefore suitable, for example, as a display control device of an in-vehicle navigation device.
1 navigation device, 2 information acquisition unit, 3 control unit, 3A to 3D control unit, 4 notification unit, 4A HUD device, 4B voice output device, 4a navigation display unit, 4b HUD display unit, 4c display control unit, 4d voice output unit, 4e voice control unit, 5 input unit, 6 map data storage unit, 7 route calculation unit, 8 route guidance unit, 9 guide sign, 9a, 10a display information, 9b to 9f additional display information, 9g mask display information, 10 right-turn arrow, 11 straight-ahead arrow, 11a mask display information, 20 current position detection unit, 20a GPS reception unit, 20b direction detection unit, 20c pulse detection unit, 21 wireless communication unit, 22 surrounding information detection unit, 23 external sensor, 23a camera, 23b image processing unit, 23c radar, 23d radar control unit, 30, 30a to 30c marking information acquisition unit, 31, 31a timing determination unit, 32, 32a, 32b display information generation unit, 33, 33a to 33c output control unit, 34 voice information generation unit, 100 processing circuit, 101 display, 102 speaker, 103 CPU, 104 memory.

Claims (7)

  1.  A display control device comprising:
     a marking information acquisition unit that acquires road marking information related to route guidance from road marking information detected ahead of a vehicle;
     a timing determination unit that determines whether it is a timing of voice guidance in the route guidance;
     a display information generation unit that generates display information for highlighting the road marking information acquired by the marking information acquisition unit so that the road marking information appears superimposed on corresponding actual road marking information in a forward view; and
     an output control unit that outputs the display information to a display device in synchronization with the timing of the voice guidance determined by the timing determination unit.
  2.  The display control device according to claim 1, wherein the display information generation unit generates display information for highlighting road marking information related to content of the voice guidance so that the road marking information appears superimposed on the corresponding actual road marking information in the forward view.
  3.  The display control device according to claim 1, wherein the display information generation unit generates additional display information for displaying content of the route guidance that does not exist in the road marking information related to the route guidance so that the content appears superimposed on at least one of an empty space of a corresponding actual sign in the forward view and a position adjacent to the sign.
  4.  The display control device according to claim 1, wherein the display information generation unit generates mask display information for masking road marking information in the forward view that is not related to the route guidance.
  5.  The display control device according to claim 1, wherein the timing determination unit determines that the timing at which a distance between the road marking information related to the route guidance and the vehicle comes within a predetermined range is the timing of the voice guidance.
  6.  The display control device according to claim 1, further comprising a voice information generation unit that generates voice information for outputting, as voice, road marking information not included in content of the voice guidance among the road marking information related to the route guidance, and adds the voice information to the voice guidance,
     wherein the output control unit causes a voice output device to output the voice guidance to which the voice information has been added, in synchronization with the timing of the voice guidance.
  7.  A navigation device comprising:
     a route calculation unit that calculates a route to a destination;
     a route guidance unit that provides route guidance along the route calculated by the route calculation unit;
     a marking information acquisition unit that acquires road marking information related to the route guidance performed by the route guidance unit from road marking information detected ahead of a vehicle;
     a timing determination unit that determines whether it is a timing of voice guidance by the route guidance unit;
     a display information generation unit that generates display information for highlighting the road marking information acquired by the marking information acquisition unit so that the road marking information appears superimposed on corresponding actual road marking information in a forward view; and
     an output control unit that outputs the display information to a display device in synchronization with the timing of the voice guidance determined by the timing determination unit.
PCT/JP2015/070991 2015-07-23 2015-07-23 Display control device and navigation device WO2017013792A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2017529419A JP6444508B2 (en) 2015-07-23 2015-07-23 Display control device and navigation device
PCT/JP2015/070991 WO2017013792A1 (en) 2015-07-23 2015-07-23 Display control device and navigation device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/070991 WO2017013792A1 (en) 2015-07-23 2015-07-23 Display control device and navigation device

Publications (1)

Publication Number Publication Date
WO2017013792A1 true WO2017013792A1 (en) 2017-01-26

Family

ID=57834249

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/070991 WO2017013792A1 (en) 2015-07-23 2015-07-23 Display control device and navigation device

Country Status (2)

Country Link
JP (1) JP6444508B2 (en)
WO (1) WO2017013792A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018036728A (en) * 2016-08-29 2018-03-08 株式会社デンソーテン Driving support device and driving support method
JP2019082382A (en) * 2017-10-30 2019-05-30 アイシン・エィ・ダブリュ株式会社 Superposition picture display device
CN111194397A (en) * 2017-10-05 2020-05-22 大众汽车有限公司 Method for operating a navigation system
JP2022063276A (en) * 2017-06-30 2022-04-21 パナソニックIpマネジメント株式会社 Display system and control method of display system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007183764A (en) * 2006-01-05 2007-07-19 Hitachi Ltd Onboard equipment
JP2009210431A (en) * 2008-03-04 2009-09-17 Alpine Electronics Inc Navigation system
WO2013088510A1 (en) * 2011-12-13 2013-06-20 パイオニア株式会社 Display device and display method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3732008B2 (en) * 1998-05-27 2006-01-05 富士通テン株式会社 Navigation device
JP4354217B2 (en) * 2003-06-11 2009-10-28 クラリオン株式会社 Road sign display device
JP2011192044A (en) * 2010-03-15 2011-09-29 Empire Technology Development LLC Traffic sign notification system and method for the same
WO2013145146A1 (en) * 2012-03-28 2013-10-03 パイオニア株式会社 Navigation device, navigation method and navigation program

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007183764A (en) * 2006-01-05 2007-07-19 Hitachi Ltd Onboard equipment
JP2009210431A (en) * 2008-03-04 2009-09-17 Alpine Electronics Inc Navigation system
WO2013088510A1 (en) * 2011-12-13 2013-06-20 パイオニア株式会社 Display device and display method

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018036728A (en) * 2016-08-29 2018-03-08 株式会社デンソーテン Driving support device and driving support method
JP2022063276A (en) * 2017-06-30 2022-04-21 パナソニックIpマネジメント株式会社 Display system and control method of display system
JP7266257B2 (en) 2017-06-30 2023-04-28 パナソニックIpマネジメント株式会社 DISPLAY SYSTEM AND METHOD OF CONTROLLING DISPLAY SYSTEM
CN111194397A (en) * 2017-10-05 2020-05-22 大众汽车有限公司 Method for operating a navigation system
US20200333144A1 (en) * 2017-10-05 2020-10-22 Volkswagen Aktiengesellschaft Method for Operating a Navigation System
US11663835B2 (en) * 2017-10-05 2023-05-30 Volkswagen Aktiengesellschaft Method for operating a navigation system
CN111194397B (en) * 2017-10-05 2024-02-02 大众汽车有限公司 Method for operating a navigation system
JP2019082382A (en) * 2017-10-30 2019-05-30 アイシン・エィ・ダブリュ株式会社 Superposition picture display device

Also Published As

Publication number Publication date
JPWO2017013792A1 (en) 2017-10-19
JP6444508B2 (en) 2018-12-26

Similar Documents

Publication Publication Date Title
JP6700623B2 (en) Driving support device and computer program
CN103969831B (en) vehicle head-up display device
JP4293917B2 (en) Navigation device and intersection guide method
JP6775188B2 (en) Head-up display device and display control method
WO2020261781A1 (en) Display control device, display control program, and persistent tangible computer-readable medium
WO2017056210A1 (en) Vehicular display device
JP7028228B2 (en) Display system, display control device and display control program
WO2019224922A1 (en) Head-up display control device, head-up display system, and head-up display control method
JP5621589B2 (en) Navigation device, navigation method, and navigation program
JP6444508B2 (en) Display control device and navigation device
CN108139223A (en) Display apparatus
JP2022058537A (en) Display control device
JP6627214B2 (en) Information display device, control method, program, and storage medium
JP2015128956A (en) Head-up display, control method, program and storage medium
WO2011135660A1 (en) Navigation system, navigation method, navigation program, and storage medium
JP2014052345A (en) Navigation device and display control method for navigation information
JP5702476B2 (en) Display device, control method, program, storage medium
KR20170014545A (en) Apparatus and method of displaying point of interest
WO2020149109A1 (en) Display system, display control device, and display control program
JP6785956B2 (en) Display control device and display control method
JP6388723B2 (en) Display control device and navigation device
JP2020112541A (en) Display controller and display control program
JP2018158725A (en) Head-up display, control method, program and storage medium
KR20040025150A (en) Route guide method in car navigation system
JP7230960B2 (en) display controller

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15898951

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017529419

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15898951

Country of ref document: EP

Kind code of ref document: A1