WO2017013793A1 - Display control device and navigation device - Google Patents

Display control device and navigation device

Info

Publication number
WO2017013793A1
WO2017013793A1 (PCT/JP2015/070992; JP2015070992W)
Authority
WO
WIPO (PCT)
Prior art keywords
information
display
display information
unit
sign
Prior art date
Application number
PCT/JP2015/070992
Other languages
English (en)
Japanese (ja)
Inventor
境 賢治
井上 裕二
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 filed Critical 三菱電機株式会社
Priority to PCT/JP2015/070992 priority Critical patent/WO2017013793A1/fr
Priority to JP2017529420A priority patent/JP6388723B2/ja
Publication of WO2017013793A1 publication Critical patent/WO2017013793A1/fr

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network

Description

  • the present invention relates to a display control device for a head-up display (hereinafter referred to as HUD) that displays display information superimposed on a driver's forward view, and a navigation device equipped with the display control device.
  • Patent Document 1 describes a display system that changes the display position of information on the HUD according to the importance of the information.
  • In this display system, information with high importance is displayed in a central display area of the HUD, where the driver can read it without diverting his or her gaze from the road ahead, while information with low importance is displayed in a peripheral display area of the HUD. The information with high importance displayed in the central display area can therefore be visually recognized without the driver moving his or her line of sight.
  • Patent Document 2 describes a HUD device that presents parallax images to the driver's left and right eyes to display information three-dimensionally.
  • In this HUD device, map information such as roads, intersections, or buildings along the roads can be displayed three-dimensionally by superimposing it at the focal position of the driver's forward view. The driver can thereby visually recognize the information without shifting the line of sight or refocusing the eyes.
  • Patent Document 3 describes a HUD device that detects an object that is being watched by a driver and displays information so as to form an image overlapping the object.
  • In this HUD device, information is dynamically superimposed on the object of the driver's gaze, which can change from moment to moment, so that the driver can visually recognize the information without moving the line of sight or adjusting the focal length of the eyes.
  • The present invention solves the above problem, and aims to provide a display control device and a navigation device that can display road marking information related to route guidance in an easily visible manner.
  • the display control device includes a sign information acquisition unit, a display information generation unit, and an output control unit.
  • the sign information acquisition unit acquires road sign information related to route guidance from road sign information detected in front of the vehicle.
  • The display information generation unit generates display information for highlighting the road marking information acquired by the sign information acquisition unit so that it appears to overlap the corresponding real road marking information in the forward field of view.
  • the output control unit outputs display information to the display device.
  • Since the display information of the road marking information related to the route guidance is highlighted and superimposed on the corresponding road marking information in the forward field of view, the road marking information related to the route guidance can be displayed in an easily visible manner.
  • FIG. 1 is a block diagram showing the functional configuration of a navigation device according to Embodiment 1.
  • FIG. 2 is a block diagram showing the functional configuration of the display control device according to Embodiment 1.
  • FIG. 3 is a block diagram showing the hardware configuration of the display control device according to Embodiment 1.
  • FIG. 4 is a flowchart showing an outline of the operation of the display control device according to Embodiment 1.
  • FIG. 5 is a flowchart showing the operation of the navigation device according to Embodiment 1.
  • FIG. 6 is a diagram showing the scenery in front of the vehicle seen through the front window on which display information of road marking information is displayed.
  • FIG. 7 is a block diagram showing the functional configuration of the display control device according to Embodiment 2 of the present invention.
  • FIG. 8 is a flowchart showing a detailed operation of the display control device according to Embodiment 2.
  • FIG. 9 is a diagram showing the scenery in front of the vehicle seen through the front window on which display information of route guidance information is displayed.
  • FIG. 10 is a flowchart showing another operation of the display control device according to Embodiment 2.
  • FIG. 11 is a diagram showing the scenery in front of the vehicle seen through the front window on which display information of other route guidance information is displayed.
  • FIG. 12 is a block diagram showing the functional configuration of the display control device according to Embodiment 3 of the present invention.
  • FIG. 13 is a flowchart showing a detailed operation of the display control device according to Embodiment 3.
  • FIG. 14 is a diagram showing the scenery in front of the vehicle seen through the front window on which mask display information is displayed.
  • FIG. 1 is a block diagram showing a functional configuration of a navigation device 1 according to Embodiment 1 of the present invention.
  • the navigation device 1 is a device that is mounted on a vehicle and performs route guidance to a destination.
  • The navigation device 1 includes a position measurement unit 2, a map data storage unit 3, an operation unit 4, a control unit 5, and a display unit 6, and is connected to a HUD 7, a distance measuring unit 8, and an imaging unit 9.
  • The operation of each part of the navigation device 1 is comprehensively controlled by the control unit 5.
  • the position measuring unit 2 is a component that measures the current position of the host vehicle, and is connected to, for example, a GPS (Global Positioning System) receiver, an orientation sensor, and a vehicle control device of the host vehicle.
  • the GPS receiver receives GPS signals from GPS satellites and detects the current position (for example, latitude and longitude) of the vehicle based on the GPS signals.
  • The direction sensor is a sensor that detects the traveling direction (for example, the heading) of the host vehicle, and includes, for example, a gyro sensor and an azimuth sensor.
  • the vehicle control device detects a pulse signal corresponding to the number of revolutions per unit time of the axle of the own vehicle, and detects the traveling speed and the traveling distance of the own vehicle based on the pulse signal.
  • The position measuring unit 2 corrects the current position detected by the GPS receiver based on the traveling direction, traveling speed, and traveling distance of the host vehicle detected by the direction sensor and the vehicle control device. The accurate current position of the host vehicle can thereby be detected.
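  • Purely as an illustration of this kind of correction (not part of the original disclosure), a minimal dead-reckoning sketch is shown below; the function name, field names, and constants are hypothetical assumptions.

```python
import math

def corrected_position(gps_lat, gps_lon, heading_deg, speed_mps, dt_s,
                       meters_per_deg_lat=111_320.0):
    """Advance the last GPS fix by dead reckoning (hypothetical helper).

    heading_deg: traveling direction from the direction sensor, clockwise from north.
    speed_mps:   traveling speed derived from the axle pulse signal.
    dt_s:        time elapsed since the GPS fix.
    """
    distance = speed_mps * dt_s                              # traveled distance
    north = distance * math.cos(math.radians(heading_deg))
    east = distance * math.sin(math.radians(heading_deg))
    meters_per_deg_lon = meters_per_deg_lat * math.cos(math.radians(gps_lat))
    return (gps_lat + north / meters_per_deg_lat,
            gps_lon + east / meters_per_deg_lon)

# Example: 1 s after a fix at 34.69 N / 135.19 E, heading 90 deg at 10 m/s
print(corrected_position(34.69, 135.19, 90.0, 10.0, 1.0))
```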
  • The map data storage unit 3 is a storage unit that stores map data, and is realized as a storage device such as an HDD (Hard Disk Drive) or a RAM (Random Access Memory).
  • the map data storage unit 3 may store map data acquired from the outside of the navigation device 1.
  • the map data storage unit 3 stores map data downloaded from an external device via a network.
  • the map data storage unit 3 may store map data read from a recording medium such as a DVD-ROM (Digital Versatile Disk-Read Only Memory) or Blu-Ray (registered trademark) Disc-ROM.
  • the operation unit 4 is a component that receives information input from the user, and is realized by, for example, a push button device, a touch panel, or the like.
  • the touch panel may be configured integrally with the display unit 6.
  • A destination for route guidance from the current position is input using the operation unit 4. For example, when the user designates a point on the map displayed on the display unit 6, the operation unit 4 accepts this point as the destination; when an address or a telephone number is input, it is accepted as destination information for specifying the destination.
  • the display unit 6 is a display for navigation processing in which map display and route guidance display are performed, and is arranged, for example, in the front center portion of the passenger compartment.
  • the HUD 7 is a display that projects and displays display information on a front window or a combiner. Further, a display method in which display information is three-dimensionally displayed by stereo parallax may be used. In other words, the HUD 7 only needs to be able to display information three-dimensionally in a superimposed manner on the driver's forward field of view, and the distance and depth viewed from the driver are variable. However, in the present invention, since the information is displayed so as to be superimposed on the corresponding actual road marking information in the driver's forward field of view, the front window is desirable as an object to be projected and displayed.
  • The distance measuring unit 8 is a measuring unit that measures the distance between a detection target in front of the host vehicle and the host vehicle using a laser beam or a millimeter wave as a probe, and is arranged in the nose portion of the body of the host vehicle or in the vicinity of the front window on the vehicle interior side.
  • the detection target object is, for example, a guide sign, a road marking, an information board, etc. in front of the host vehicle.
  • the imaging unit 9 is a camera capable of shooting in the visible light region or the infrared light region.
  • the imaging unit 9 is arranged in the vicinity of a room mirror on the vehicle interior side of the front window of the own vehicle, and images the outside of a predetermined range ahead of the traveling direction of the own vehicle through the front window.
  • the camera for example, a CCD (Charge Coupled Device) camera, a CMOS (Complementary Metal Oxide Semiconductor) camera, a stereo camera, or the like is used.
  • The imaging unit 9 performs image processing to convert the captured image into image data that can be processed by the sign information acquisition unit 51.
  • For example, image data consisting of a two-dimensional array of pixels is generated by performing image processing such as filtering and binarization.
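  • As an illustration only of the preprocessing mentioned above (filtering followed by binarization into a two-dimensional pixel array), a sketch using OpenCV might look like the following; the kernel size and threshold value are arbitrary assumptions, not values from the patent.

```python
import cv2  # OpenCV

def to_binary_image(frame_bgr, blur_kernel=5, threshold=128):
    """Convert a captured frame into a binarized 2-D pixel array (sketch)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)                 # single channel
    filtered = cv2.GaussianBlur(gray, (blur_kernel, blur_kernel), 0)   # noise filtering
    _, binary = cv2.threshold(filtered, threshold, 255, cv2.THRESH_BINARY)
    return binary  # numpy array of shape (rows, cols), values 0 or 255
```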
  • FIG. 2 is a block diagram illustrating a functional configuration of the display control apparatus according to the first embodiment, and illustrates functions of the control unit 5 serving as functions of the display control apparatus according to the first embodiment.
  • The control unit 5 includes a navigation processing unit 50 as a function for performing navigation processing, and includes a sign information acquisition unit 51, a display information generation unit 52, and an output control unit 53 as functions for controlling the display of the HUD 7.
  • the navigation processing unit 50 performs navigation processing such as map display, route search, and route guidance. For example, the navigation processing unit 50 searches for a route from the departure place to the destination based on the departure place, the destination, and the map data. For example, the current position of the vehicle measured by the position measurement unit 2 is set as the departure place.
  • the destination may be a point received from the user using the operation unit 4 or may be a point registered in advance as a destination candidate. Further, the navigation processing unit 50 provides route guidance for the route selected by the user among the route candidates of the search result.
  • the sign information acquisition unit 51 acquires road sign information related to route guidance from road sign information detected in front of the host vehicle.
  • the road marking information is marking information related to road traffic guidance, and includes, for example, guidance signs, road markings, information boards, their contents, and the distance between them and the vehicle.
  • the position of the guide sign and the content of the guide sign become road marking information.
  • the contents of the guide sign include a place name indicating the direction and an arrow sign indicating the direction.
  • the position and the content of the road marking become road marking information.
  • the contents of the road marking include an arrow marking indicating a traffic division according to the traveling direction, such as a straight lane or a left / right turn.
  • In the case of an information board, in addition to the position of the information board, the contents of the traffic information displayed on it, such as "30 km to Osaka, traffic jam, required time 40 minutes", are also road marking information.
  • The sign information acquisition unit 51 classifies the road sign information detected in front of the host vehicle by the imaging unit 9 into road sign information related to the route guidance by the navigation processing unit 50 and road sign information not related to the route guidance, and acquires the road sign information related to the route guidance. As a classification method, for example, a method of identifying the information related to the route guidance by comparing the guidance signs, road markings, information boards, and their contents included in the road sign information with the route guidance information of the planned travel route is adopted.
  • The sign information acquisition unit 51 performs image analysis on the image data obtained by the image processing of the imaging unit 9 to identify the content of the road sign information. For example, when the road sign information is a guide sign, the place names on the guide sign, the sign contents such as arrow marks, and the positions in the guide sign where those contents are described are identified.
  • For example, when the route guidance indicates a left turn at a guidance point, the sign information acquisition unit 51 classifies the road marking indicating the left turn as road marking information related to the route guidance, and classifies the road markings for going straight and turning right as road marking information not related to the route guidance. Similarly, when the route is guided toward "Sannomiya" at the guidance point, the sign information acquisition unit 51 identifies "Sannomiya" and "Osaka" as the place names indicating directions on the guidance sign, classifies "Sannomiya" as road marking information related to the route guidance, and classifies "Osaka" as road marking information not related to the route guidance.
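  • To make this classification step concrete, the following is a minimal sketch (not the patented implementation) that compares place names and arrows read from a sign with the route guidance information; the data shapes and field names are assumptions.

```python
def classify_sign_contents(sign_items, guidance):
    """Split recognized sign contents into guidance-related / unrelated (sketch).

    sign_items: e.g. [{"text": "Sannomiya", "kind": "place"},
                      {"text": "Osaka", "kind": "place"},
                      {"text": "left", "kind": "arrow"}]
    guidance:   e.g. {"place": "Sannomiya", "turn": "left"}
    """
    related, unrelated = [], []
    for item in sign_items:
        if item["kind"] == "place" and item["text"] == guidance.get("place"):
            related.append(item)       # place name on the planned route
        elif item["kind"] == "arrow" and item["text"] == guidance.get("turn"):
            related.append(item)       # arrow matching the guided turn
        else:
            unrelated.append(item)
    return related, unrelated

related, unrelated = classify_sign_contents(
    [{"text": "Sannomiya", "kind": "place"}, {"text": "Osaka", "kind": "place"}],
    {"place": "Sannomiya", "turn": "left"})
# related -> the "Sannomiya" entry; unrelated -> the "Osaka" entry
```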
  • The display information generation unit 52 generates display information for highlighting the road marking information related to the route guidance so that it appears to overlap the corresponding real road marking information in the forward view. For example, when the road marking information related to the route guidance is the direction and heading information of a guidance sign, image data in which the place name indicating the direction and the arrow indicating the heading are rendered in a highlight color is generated as the display information. When the road marking information related to the route guidance is the traffic information "30 km to Osaka, traffic jam, required time 40 minutes" displayed on an information board, image data in which the characters indicating this content are rendered in a highlight color is generated as the display information.
  • In addition, the display information generation unit 52 generates display information in which the shape of the arrow is deformed, based on the angle formed between the driver's line of sight and the road surface, so that it appears as if pasted onto the arrow marked on the actual road surface as seen from the driver. The display information of the road marking can thereby be given a shape close to that of the actual marking as seen from the driver.
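  • As a rough geometric sketch of this foreshortening (one possible way to do it, assumed here; the patent does not specify the formula), an arrow template on the road plane can be compressed along its depth axis by the sine of the angle between the line of sight and the road surface:

```python
import math

def foreshorten_arrow(points_xy, view_angle_deg):
    """Compress an arrow outline along its depth axis (sketch).

    points_xy:      arrow template as (x, depth) pairs on the road plane.
    view_angle_deg: angle between the driver's line of sight and the road surface;
                    a shallow angle flattens the arrow more strongly.
    """
    k = math.sin(math.radians(view_angle_deg))  # 1.0 when viewed from directly above
    return [(x, depth * k) for x, depth in points_xy]

# A 3 m long arrow seen at a 10-degree grazing angle appears about 0.52 m tall
print(foreshorten_arrow([(0.0, 0.0), (0.0, 3.0)], 10.0))
```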
  • Further, based on the contents of the road marking information sequentially input from the sign information acquisition unit 51, the display information generation unit 52 generates, as display information, image data of the road marking information with a size and shape corresponding to the distance between the position of the road marking information and the host vehicle.
  • As the host vehicle travels, the size and shape of the road marking image in the display information may otherwise cease to match, and the display information may drift away from the actual road marking information. Using image data of the road marking information with a size and shape corresponding to the distance between the position of the road marking information and the host vehicle as the display information therefore reduces the deviation from the actual road marking information.
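  • A minimal sketch of sizing the overlay by distance under a pinhole-camera assumption (an illustration only, not the patented method): the apparent size of the marking is inversely proportional to the distance between the marking and the host vehicle. The focal length in pixels is an assumed parameter.

```python
def apparent_size(real_width_m, real_height_m, distance_m, focal_px=1000.0):
    """Pinhole-model projection of a road marking onto the display (sketch)."""
    if distance_m <= 0:
        raise ValueError("distance must be positive")
    scale = focal_px / distance_m
    return real_width_m * scale, real_height_m * scale

# The same 0.6 m x 3 m arrow looks half as large at 60 m as at 30 m
print(apparent_size(0.6, 3.0, 30.0))   # (20.0, 100.0) pixels
print(apparent_size(0.6, 3.0, 60.0))   # (10.0, 50.0) pixels
```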
  • the output control unit 53 outputs display information to the HUD 7.
  • The HUD 7 can form a virtual image at an arbitrary distance by adjusting the distance between the display source and the lens that projects the display light from the display source onto the front window, based on the relationship with the focal length of the lens. For example, by adjusting the distance between the display source of the HUD 7 and the lens, the road marking information related to the route guidance can be highlighted so that it appears to overlap the corresponding real road marking information in the forward view.
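  • The relationship mentioned here follows the ordinary thin-lens equation 1/f = 1/d_o + 1/d_i: when the display source sits inside the focal length, a virtual image is formed farther away. The concrete optics of the HUD 7 are not given in the text, so the numbers below are illustrative assumptions only.

```python
def virtual_image_distance(focal_m, source_m):
    """Distance of the virtual image formed when the display source sits
    inside the focal length of the projection lens (thin-lens sketch)."""
    assert 0 < source_m < focal_m, "virtual image requires source inside focal length"
    d_i = 1.0 / (1.0 / focal_m - 1.0 / source_m)   # negative: same side as the source
    return -d_i                                     # magnitude, in metres

# Moving the source from 90 mm to 99 mm behind a 100 mm lens pushes the
# virtual image from about 0.9 m out to about 9.9 m in front of the driver.
print(virtual_image_distance(0.100, 0.090))   # ~0.9
print(virtual_image_distance(0.100, 0.099))   # ~9.9
```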
  • the output control unit 53 changes the display position of the display information in accordance with the traveling of the host vehicle, thereby causing the display information display on the HUD 7 to follow the movement of the host vehicle.
  • FIG. 3 is a block diagram illustrating a hardware configuration of the display control apparatus according to the first embodiment.
  • FIG. 4 is a flowchart showing an outline of the operation of the display control apparatus according to the first embodiment.
  • the display 101 is the HUD 7 shown in FIG. 1, and the display is controlled by the display control apparatus according to the first embodiment.
  • each function for controlling the display of the HUD 7 in the control unit 5 corresponds to each function of the display control apparatus according to the first embodiment.
  • The sign information acquisition unit 51, the display information generation unit 52, and the output control unit 53 are realized by the processing circuit 100. That is, the display control device includes the processing circuit 100 for executing the processing from step ST1 to step ST3 shown in FIG. 4.
  • the processing circuit 100 is a component that realizes the control unit 5 and may be dedicated hardware or a CPU (Central Processing Unit) that executes a program stored in a memory.
  • The processing circuit 100 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
  • The functions of the sign information acquisition unit 51, the display information generation unit 52, and the output control unit 53 may each be realized by a separate processing circuit, or the functions of the units may be realized collectively by a single processing circuit.
  • When the processing circuit is the CPU 102, the functions of the sign information acquisition unit 51, the display information generation unit 52, and the output control unit 53 are realized by software, firmware, or a combination of software and firmware. The software and firmware are described as programs and stored in the memory 103.
  • The CPU 102 implements the functions of each unit by reading and executing the programs stored in the memory 103. That is, the display control device includes the memory 103 for storing programs that, when executed by the CPU 102, result in the processing from step ST1 to step ST3 shown in FIG. 4 being performed.
  • these programs cause the computer to execute the procedure or method of the sign information acquisition unit 51, the display information generation unit 52, and the output control unit 53.
  • The memory corresponds to, for example, a RAM (Random Access Memory), a ROM, a flash memory, an EPROM (Erasable Programmable ROM), an EEPROM (Electrically Erasable Programmable ROM), or another non-volatile or volatile semiconductor memory, or a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD, and the like.
  • Part of the functions of the sign information acquisition unit 51, the display information generation unit 52, and the output control unit 53 may be realized by dedicated hardware, and part of them by software or firmware.
  • For example, the sign information acquisition unit 51 can realize its function by the processing circuit 100 as dedicated hardware, while the display information generation unit 52 and the output control unit 53 can realize their functions by the CPU 102 reading and executing the programs stored in the memory 103.
  • the processing circuit can realize the above-described functions by hardware, software, firmware, or a combination thereof.
  • FIG. 5 is a flowchart showing the operation of the navigation device according to the first embodiment.
  • The processing from step ST1a to step ST3a and step ST9a is the operation of the navigation processing unit 50.
  • The processing from step ST4a to step ST6a is an example of the detailed processing of step ST1 shown in FIG. 4, the processing of step ST7a is an example of the detailed processing of step ST2 shown in FIG. 4, and the processing of step ST8a is an example of the detailed processing of step ST3 shown in FIG. 4. The operation of each step in FIG. 4 therefore becomes clear by explaining each step in FIG. 5, and the description below is given with reference to FIG. 5.
  • When the system is activated (step ST1a), the navigation processing unit 50 checks whether a destination has been set (step ST2a). If the destination has not been set (step ST2a; NO), the process returns to step ST2a and repeats this check until the destination is set. If the destination has been set (step ST2a; YES), the navigation processing unit 50 searches for a route to the destination and starts route guidance (step ST3a). For example, the navigation processing unit 50 uses the current position of the host vehicle measured by the position measurement unit 2 as the departure point and searches for the route from the departure point to the destination based on the destination input from the operation unit 4 and the map data read from the map data storage unit 3. The navigation processing unit 50 then starts route guidance for the route selected by the user from among the route candidates of the search result.
  • Next, the sign information acquisition unit 51 acquires the distance between the road sign information detected in front of the host vehicle by the distance measuring unit 8 and the host vehicle (step ST4a). For example, the distance between a sign such as a guide sign, road marking, or information board and the host vehicle is acquired. Subsequently, the sign information acquisition unit 51 determines the content of the road sign information (step ST5a). For example, the sign information acquisition unit 51 performs image analysis on the image data obtained from the imaging unit 9 and determines the content of the road sign information. When the road sign information is a guide sign, the place names described on the guide sign, the sign contents such as arrow marks, and the positions in the guide sign where those contents are described are determined.
  • When the road sign information is a road marking, an arrow marking indicating a traffic classification according to the traveling direction, such as a straight-ahead lane or a left or right turn lane, is determined.
  • the sign information acquisition unit 51 compares the contents of the road sign information determined in step ST5a with the route guidance information of the planned travel route, and extracts road sign information related to the route guidance (step ST6a).
  • the display information generation unit 52 generates display information that expresses the road marking information related to the route guidance in a highlight color and in a form suitable for the actual marking corresponding to the route guidance (step ST7a).
  • This display information is display information for highlighting the road marking information related to the route guidance so as to appear to overlap the corresponding actual road marking information in the forward view.
  • the output control unit 53 outputs the display information generated by the display information generation unit 52 to the HUD 7. As a result, the display information is highlighted by the HUD 7 so as to overlap the corresponding real road marking information in the front view (step ST8a).
  • Next, the navigation processing unit 50 determines whether the route guidance has been completed (step ST9a). If it is determined that the route guidance has been completed (step ST9a; YES), the process returns to step ST2a. If it is determined that the route guidance has not been completed (step ST9a; NO), the above processing for displaying the road marking information is repeated.
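  • Putting steps ST4a to ST8a together, one iteration of the control flow could be summarized by the pseudocode-style sketch below; all of the unit interfaces shown (get_distance, analyze_frame, and so on) are hypothetical names introduced for illustration, not the patent's API.

```python
def display_control_cycle(distance_unit, imaging_unit, navi, display_info_gen, hud):
    """One iteration of the route-guidance HUD loop (illustrative sketch)."""
    # ST4a: distance between the detected road sign and the host vehicle
    distance = distance_unit.get_distance()
    # ST5a: determine the contents of the road sign information by image analysis
    sign_contents = imaging_unit.analyze_frame()
    # ST6a: keep only the contents related to the planned travel route
    related = [c for c in sign_contents if navi.is_on_planned_route(c)]
    # ST7a: highlighted display information shaped to match the real sign
    display_info = display_info_gen.generate(related, distance)
    # ST8a: output so that it appears superimposed on the real sign
    hud.show(display_info)
```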
  • FIG. 6 is a diagram showing a landscape in front of the vehicle viewed through the front window on which the display information of the road marking information is displayed, and shows a state in front of the vehicle when the vehicle approaches the intersection.
  • In FIG. 6, a guide sign 10 is arranged in the vicinity of the intersection, and a right turn arrow 11 and a straight-ahead arrow 12 are marked on the road surface on which the host vehicle travels.
  • The display information generation unit 52 generates, as display information 10a, image data in which the characters "Sannomiya" indicating the direction and the arrow mark indicating the right direction are highlighted.
  • the display information generation unit 52 generates, as display information 11a, image data obtained by applying a highlight color to a shape that matches the actual right turn arrow 11.
  • the output control unit 53 outputs the display information 10a and 11a generated by the display information generation unit 52 to the HUD 7.
  • the display information 10a and 11a are highlighted by the HUD 7 so that they appear to overlap the corresponding actual road marking information in the forward view.
  • For example, the display information 10a is displayed as if pasted at the position of the actual guide sign 10 where the corresponding place name and direction are described.
  • The display information 11a is displayed as if pasted on the corresponding actual right turn arrow 11.
  • the display information of the present invention emphasizes the actual road marking information related to the route guidance in a natural manner in the HUD 7. Therefore, the shape and size of the display information are within a range that does not deviate from the sign of the actual road sign information as viewed from the driver.
  • As described above, according to Embodiment 1, the display information for highlighting the road marking information related to the route guidance is displayed so as to appear to overlap the corresponding real road marking information in the forward view. The road marking information related to the route guidance can thereby be displayed in an easily visible manner.
  • FIG. 7 is a block diagram showing the functional configuration of the display control device according to Embodiment 2 of the present invention, and shows the functions of the control unit 5A, which are the functions of the display control device according to Embodiment 2.
  • The control unit 5A includes a navigation processing unit 50 as a function for performing navigation processing, and includes a sign information acquisition unit 51, a display information generation unit 52a, and an output control unit 53a as functions for controlling the display of the HUD 7.
  • The display information generation unit 52a generates display information for displaying route guidance information as if pasted onto the actual road surface area in the forward view. For example, when the route guidance indicates that the Mita City Miwa intersection lies 50 m straight ahead of the current position, an image expressing the content of this route guidance in characters is generated as display information. In addition, the display information generation unit 52a generates display information for displaying route guidance information so that it appears adjacent to an actual sign object in the forward field of view.
  • the output control unit 53a outputs the display information generated by the display information generation unit 52a to the HUD 7. Accordingly, the display information is displayed as being pasted on the road surface area of the front view, or is displayed so as to be seen adjacent to the sign object of the front view.
  • FIG. 8 is a flowchart showing a detailed operation of the display control apparatus according to the second embodiment.
  • The processing from step ST1b to step ST2b is an example of the detailed processing of step ST1 shown in FIG. 4, the processing of step ST3b is an example of the detailed processing of step ST2 shown in FIG. 4, and the processing of step ST4b is an example of the detailed processing of step ST3 shown in FIG. 4. The operation of each step in FIG. 4 therefore becomes clear by explaining each step in FIG. 8, and the description below is given with reference to FIG. 8.
  • First, the sign information acquisition unit 51 performs image analysis on the imaging information in front of the host vehicle obtained by the imaging unit 9, and determines the road surface area in front of the host vehicle (step ST1b). For example, assuming that the change in luminance value is larger in the parts of the image where vehicles ahead and features along the road appear than in the parts where the road surface area and the sky area appear, the parts where the road surface area and the sky area appear are determined from the imaging information. Then, based on the position of each such area in the imaging information, the upper area is determined to be the sky area and the lower area is determined to be the road surface area.
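  • A crude sketch of this luminance-based split is shown below (assumed thresholds and an assumed rough horizon line, not the disclosed algorithm): image rows whose luminance varies little are treated as sky or road surface, and their position in the image decides which of the two they are.

```python
import numpy as np

def split_sky_and_road(gray, var_threshold=200.0):
    """Label low-variance rows as sky (upper) or road surface (lower) (sketch).

    gray: 2-D numpy array of luminance values, row 0 at the top of the image.
    Returns (sky_rows, road_rows) as lists of row indices.
    """
    row_variance = gray.astype(float).var(axis=1)       # per-row luminance variance
    low_variance_rows = np.where(row_variance < var_threshold)[0]
    horizon = gray.shape[0] // 2                         # assumed rough horizon line
    sky_rows = [int(r) for r in low_variance_rows if r < horizon]
    road_rows = [int(r) for r in low_variance_rows if r >= horizon]
    return sky_rows, road_rows
```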
  • Next, the sign information acquisition unit 51 refers to the route guidance information input from the navigation processing unit 50 and extracts the route guidance information for the current position of the host vehicle as the route guidance information to be displayed in the road surface area (step ST2b). For example, when the route guidance indicates that the Mita City Miwa intersection lies 50 m straight ahead of the current position, the content of this route guidance is extracted as the information to be displayed in the road surface area up to the intersection.
  • the display information generation unit 52a generates display information for displaying the route guidance information extracted by the sign information acquisition unit 51 so as to be pasted on the actual road surface area of the front view (step ST3b). Thereafter, the output control unit 53a outputs the display information to the HUD 7. Thereby, the display information is displayed as pasted on the actual road surface area of the front view (step ST4b).
  • FIG. 9 is a diagram showing a scenery in front of the vehicle viewed through the front window on which the display information of the route guidance information is displayed, and shows a state in front of the own vehicle when the own vehicle approaches the intersection.
  • The display information generation unit 52a generates image data that expresses the content of this route guidance in characters as the display information 14.
  • the output control unit 53a outputs the display information 14 generated by the display information generation unit 52a to the HUD 7. Thereby, as shown in FIG. 9, the display information 14 is displayed by the HUD 7 as pasted on the actual road surface area 13 in the forward field of view. In this way, route guidance information can be displayed so as not to obstruct the driver when recognizing a vehicle or the like on the road surface.
  • FIG. 10 is a flowchart showing another operation of the display control apparatus according to the second embodiment, and the method for displaying the display information of the route guidance information is different from FIG.
  • The processing from step ST1c to step ST2c is an example of the detailed processing of step ST1 shown in FIG. 4, the processing of step ST3c is an example of the detailed processing of step ST2 shown in FIG. 4, and the processing of step ST4c is an example of the detailed processing of step ST3 shown in FIG. 4. The operation of each step in FIG. 4 therefore becomes clear by explaining each step in FIG. 10, and the description below is given with reference to FIG. 10.
  • the sign information acquisition unit 51 performs image analysis on the imaging information in front of the host vehicle obtained by the imaging unit 9, and determines the road surface area in front of the host vehicle (step ST1c).
  • Next, the sign information acquisition unit 51 refers to the route guidance information input from the navigation processing unit 50 and extracts the route guidance information for the current position of the host vehicle as route guidance information to be displayed adjacent to a sign object (step ST2c). For example, when an information board is detected as road marking information and traffic information that is not shown on this information board is obtained, the content of this traffic information is extracted as route guidance information to be displayed adjacent to the information board, which is the sign object.
  • the display information generation unit 52a generates display information for displaying the route guidance information extracted by the sign information acquisition unit 51 so as to be seen adjacent to the actual sign object in the forward field of view (step ST3c). For example, display information indicating traffic information that is not on the information board is generated.
  • The output control unit 53a outputs the display information to the HUD 7. The display information is thereby displayed by the HUD 7 at a position adjacent to the actual sign object in the forward field of view (step ST4c). For example, display information of traffic information that is not on the information board is displayed so as to appear adjacent to the information board. In this way, the driver can grasp the traffic information for the current position of the host vehicle collectively from the display contents of the actual information board in the forward field of view and the display information that appears adjacent to it. The amount of movement of the driver's line of sight needed to obtain traffic information is thereby reduced, and the driver can concentrate on driving.
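  • As a layout illustration only (a hypothetical helper, not taken from the patent), the extra traffic information could simply be anchored next to the detected bounding box of the sign object:

```python
def place_next_to_sign(sign_box, text_size, margin=10):
    """Return the top-left corner for display information placed to the
    right of a detected sign object (sketch).

    sign_box:  (x, y, width, height) of the sign in display coordinates.
    text_size: (width, height) of the rendered route guidance text.
    """
    x, y, w, h = sign_box
    text_w, text_h = text_size
    anchor_x = x + w + margin                 # just to the right of the sign
    anchor_y = y + (h - text_h) // 2          # vertically centred on the sign
    return anchor_x, anchor_y

# Example: information board at (400, 120, 200, 80), text of 180 x 40 px
print(place_next_to_sign((400, 120, 200, 80), (180, 40)))  # (610, 140)
```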
  • FIG. 11 is a diagram showing the scenery in front of the vehicle viewed through the front window on which display information of other route guidance information is displayed, and shows the state in front of the host vehicle when the host vehicle approaches an intersection.
  • a traffic light 15 is arranged as a sign.
  • When traffic information for the current position of the host vehicle is obtained, the sign information acquisition unit 51 acquires this traffic information as route guidance information to be displayed adjacent to the sign object.
  • the display information generation unit 52a generates image data expressing the contents of the route guidance information as characters as display information 16 for displaying the image data so as to be seen adjacent to the actual traffic light 15 in the forward field of view.
  • the output control unit 53a outputs the display information 16 to the HUD 7. Accordingly, the display information 16 is displayed by the HUD 7 so as to be seen adjacent to the traffic light 15 in the front field of view as shown in FIG.
  • In this way, the route guidance information can be conveyed accurately without obstructing the driving of the vehicle.
  • As described above, according to Embodiment 2, the display information generation unit 52a generates the display information 14 for displaying the route guidance information as if pasted onto the actual road surface area 13 in the forward field of view. Route guidance information can thereby be displayed on the HUD 7 without obstructing the driver when recognizing vehicles and the like on the road surface.
  • Further, the display information generation unit 52a generates the display information 16 for displaying the route guidance information so that it appears adjacent to the actual traffic light 15 in the forward field of view. By displaying the route guidance information adjacent to a sign toward which the driver's eyes naturally turn while driving, the route guidance information can be conveyed accurately without obstructing the driving of the vehicle.
  • In addition, by displaying the route guidance information adjacent to the sign related to that route guidance information, the related route guidance information can be shown to the driver collectively.
  • FIG. 12 is a block diagram showing the functional configuration of the display control apparatus according to Embodiment 3 of the present invention, and shows the function of the control unit 5B, which is the function of the display control apparatus according to Embodiment 3.
  • The control unit 5B includes a navigation processing unit 50 as a function for performing navigation processing, and includes a sign information acquisition unit 51, a display information generation unit 52b, and an output control unit 53b as functions for controlling the display of the HUD 7.
  • the display information generation unit 52b generates mask display information for mask-displaying an area that is not related to vehicle driving in the forward view.
  • Examples of the mask display information include image information that renders the overlapped background translucent or shields it. Further, as described later with reference to FIG. 14, an image of roadside trees may be generated as the mask display information.
  • the output control unit 53b outputs mask display information to the HUD 7.
  • the display information for mask is displayed by the HUD 7 in an area not related to the vehicle driving in the forward field of view.
  • the area not related to vehicle driving is an area in the field of view that does not hinder the driving of the vehicle even if the driver does not visually recognize it.
  • a roadside scenery of an expressway does not hinder driving even if the driver does not visually recognize it. Therefore, in the third embodiment, the display information for the mask is displayed so as to overlap the scenery on the roadside of the expressway.
  • FIG. 13 is a flowchart showing a detailed operation of the display control apparatus according to the third embodiment.
  • The processing of step ST1d is an example of the detailed processing of step ST1 shown in FIG. 4, the processing of step ST2d is an example of the detailed processing of step ST2 shown in FIG. 4, and the processing of step ST3d is an example of the detailed processing of step ST3 shown in FIG. 4. The operation of each step in FIG. 4 therefore becomes clear by explaining each step in FIG. 13, and the description below is given with reference to FIG. 13.
  • FIG. 14 is a diagram showing a scenery in front of the vehicle viewed through the front window on which the mask display information is displayed.
  • the sign information acquisition unit 51 performs image analysis on the imaging information in front of the host vehicle obtained by the imaging unit 9, and determines the road surface area 13 in front of the host vehicle (step ST1d).
  • the display information generation unit 52b generates mask display information 17 for mask display at positions along the left and right ends of the road surface area 13 in the forward view (step ST2d).
  • the mask display information 17 is an image of a roadside tree.
  • the output control unit 53b outputs the mask display information 17 to the HUD 7. Accordingly, the mask display information 17 is displayed by the HUD 7 at positions along the left and right ends of the road surface area 13 in the forward field of view (step ST3d). As a result, it appears to the driver that the vehicle is traveling on the road on which the roadside tree is virtually displayed.
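  • A simple placement sketch for this kind of mask display (illustrative only, with assumed data shapes): roadside-tree images are repeated along the detected left and right road edges at a fixed spacing.

```python
def mask_positions(left_edge, right_edge, spacing=5):
    """Pick anchor points for roadside-tree mask images along both road edges.

    left_edge / right_edge: lists of (x, y) display coordinates tracing the
    left and right boundaries of the road surface area, near to far.
    spacing: keep every `spacing`-th point so the tree images do not overlap.
    """
    return left_edge[::spacing] + right_edge[::spacing]

# Example with short synthetic edge polylines
left = [(100 + i, 400 - 2 * i) for i in range(20)]
right = [(540 - i, 400 - 2 * i) for i in range(20)]
print(mask_positions(left, right, spacing=10))
```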
  • As described above, according to Embodiment 3, the display information generation unit 52b generates the mask display information 17 for mask-displaying an area that is not related to driving the vehicle in the forward view. By shielding areas of the driver's field of view that are not related to driving with the mask display information 17 in this way, the driver can be prevented from looking aside while driving.
  • The present invention is not limited to the in-vehicle HUD described above; for example, it may also be applied to an eyeglass-shaped head mounted display for pedestrians.
  • Within the scope of the present invention, the embodiments may be freely combined, and any component of each embodiment may be modified or omitted.
  • the display control apparatus can display road marking information related to route guidance in an easy-to-see manner, it is suitable, for example, for a display control apparatus of an in-vehicle navigation system.
  • 1 navigation device, 2 position measurement unit, 3 map data storage unit, 4 operation unit, 5, 5A, 5B control unit, 6 display unit, 7 HUD, 8 distance measuring unit, 9 imaging unit, 10 guide sign, 10a, 11a, 14, 16 display information, 11 right turn arrow, 12 straight-ahead arrow, 13 road surface area, 15 traffic light, 17 mask display information, 50 navigation processing unit, 51 sign information acquisition unit, 52, 52a, 52b display information generation unit, 53, 53a, 53b output control unit, 100 processing circuit, 101 display, 102 CPU, 103 memory.

Abstract

The present invention relates to a display control device that displays road marking information relating to route guidance in a highlighted manner, superimposed on the corresponding road marking information in the forward field of view.
PCT/JP2015/070992 2015-07-23 2015-07-23 Dispositif de commande d'affichage et dispositif de navigation WO2017013793A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2015/070992 WO2017013793A1 (fr) 2015-07-23 2015-07-23 Dispositif de commande d'affichage et dispositif de navigation
JP2017529420A JP6388723B2 (ja) 2015-07-23 2015-07-23 表示制御装置およびナビゲーション装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/070992 WO2017013793A1 (fr) 2015-07-23 2015-07-23 Dispositif de commande d'affichage et dispositif de navigation

Publications (1)

Publication Number Publication Date
WO2017013793A1 true WO2017013793A1 (fr) 2017-01-26

Family

ID=57834269

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/070992 WO2017013793A1 (fr) 2015-07-23 2015-07-23 Dispositif de commande d'affichage et dispositif de navigation

Country Status (2)

Country Link
JP (1) JP6388723B2 (fr)
WO (1) WO2017013793A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009020089A (ja) * 2007-06-12 2009-01-29 Panasonic Corp ナビゲーション装置、ナビゲーション方法、及びナビゲーション用プログラム
JP2009210431A (ja) * 2008-03-04 2009-09-17 Alpine Electronics Inc ナビゲーションシステム
WO2013088510A1 (fr) * 2011-12-13 2013-06-20 パイオニア株式会社 Dispositif d'affichage et procédé d'affichage

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4935145B2 (ja) * 2006-03-29 2012-05-23 株式会社デンソー カーナビゲーション装置
US8878660B2 (en) * 2011-06-28 2014-11-04 Nissan North America, Inc. Vehicle meter cluster
JP2015172548A (ja) * 2014-03-12 2015-10-01 パイオニア株式会社 表示制御装置、制御方法、プログラム、及び記憶媒体


Also Published As

Publication number Publication date
JP6388723B2 (ja) 2018-09-12
JPWO2017013793A1 (ja) 2017-10-12


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15898952

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017529420

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15898952

Country of ref document: EP

Kind code of ref document: A1