WO2007142084A1 - Navigation device - Google Patents

Navigation device

Info

Publication number
WO2007142084A1
WO2007142084A1 (application PCT/JP2007/060922)
Authority
WO
WIPO (PCT)
Prior art keywords
control unit
navigation device
distance
guide
arrow
Prior art date
Application number
PCT/JP2007/060922
Other languages
English (en)
Japanese (ja)
Inventor
Takashi Akita
Takahiro Kudoh
Tsuyoshi Kindo
Original Assignee
Panasonic Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Corporation
Priority to JP2007548632A (published as JPWO2007142084A1)
Publication of WO2007142084A1

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3635 Guidance using 3D or perspective road maps
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3632 Guidance using simplified or iconic instructions, e.g. using arrows
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968 Systems involving transmission of navigation instructions to the vehicle
    • G08G1/0969 Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/10 Map spot or coordinate position indicators; Map reading aids
    • G09B29/106 Map spot or coordinate position indicators; Map reading aids using electronic means

Definitions

  • the present invention relates to a navigation device, and more specifically to a navigation device that displays a guidance object superimposed on a three-dimensional space image.
  • Conventionally, there are known navigation devices that search for a route from the current vehicle position to a set destination and, using stored map data, guide the vehicle through guide points existing on the route.
  • Recent navigation devices can store more detailed map data thanks to the large capacity of recording media such as HDDs (Hard Disk Drives) and DVDs (Digital Versatile Discs).
  • For map display, 3D bird's-eye-view display and driver's-view display using realistic 3D CG are increasingly used in pursuit of ease of understanding and realism.
  • Patent Document 1 JP-A-7-63572
  • Patent Document 2 Japanese Patent Laid-Open No. 2003-337033
  • In this prior art, the portion of the guide arrow that indicates the right/left turn direction is displayed standing perpendicular to the ground plane of the three-dimensional image, so that portion is displayed with height.
  • As a result, the background image is concealed by the guide arrow immediately before the guide point, which makes it difficult for the user to recognize the situation around the guide point.
  • The ground plane here refers to the tangent plane between the vehicle and the road; hereinafter it is simply referred to as the ground plane.
  • The present invention has been made in view of the above problems.
  • In the present invention, the height or transparency of the part of the guide arrow that indicates the left/right turn direction with respect to the ground plane of the three-dimensional space image is controlled.
  • An object of the present invention is thereby to provide a navigation device that reduces the extent to which the background image is hidden by the guide arrow.
  • Here, the 3D space image includes both photographed (real) images and 3D map images.
  • An aspect of the present invention is directed to a navigation apparatus.
  • The present invention provides a navigation device comprising: an object generation unit that generates a guide object; an object control unit that controls the display size or transparency of a predetermined portion of the guide object based on the distance from the vehicle position to the guide point; and a display control unit that performs a driver's-view navigation display based on the guide object obtained through the control of the object control unit.
  • It is preferable that the object control unit control the display size by controlling the inclination of the predetermined portion of the guide object based on the distance from the vehicle to the guide point.
  • It is preferable that the object control unit control the inclination by controlling the angle of the portion of the guide object that indicates a left or right turn, based on the distance from the vehicle to the guide point.
  • It is preferable that the object control unit reduce the angle as the distance from the vehicle position to the guide point becomes shorter.
  • It is preferable that the object control unit reduce the angle within a range from 90 degrees to 0 degrees.
  • It is also preferable that the object control unit control the display size by controlling the height of the predetermined portion of the guide object based on the distance from the vehicle position to the guide point.
  • It is preferable that the object control unit control the height of the portion of the guide object that indicates a left or right turn, based on the distance from the vehicle position to the guide point.
  • It is preferable that the object control unit lower the height as the distance from the vehicle position to the guide point becomes shorter.
  • It is preferable that the object control unit increase the transparency of the portion of the guide object that indicates a left or right turn as the vehicle position gets closer to the guide point.
  • It is preferable that the display control unit superimpose and display the guide object on a three-dimensional map image.
  • It is preferable that the display control unit superimpose and display the guide object on a photographed image.
  • According to the present invention, the display of the portion of the guide arrow that indicates the right/left turn direction with respect to the ground plane of the three-dimensional space image is controlled based on the distance from the vehicle position to the guide point, so that the background image is less likely to be hidden by the guide arrow immediately before the guide point.
  • FIG. 1 is a block diagram showing an overall configuration of a navigation device according to an embodiment of the present invention.
  • FIG. 2 is a diagram showing an example of information stored in a map database.
  • FIG. 3 is a diagram showing an example of a road and branch point network formed by nodes, interpolation nodes, and links stored in a map database.
  • FIG. 4 is a flowchart showing the operation of the navigation device according to the embodiment of the present invention.
  • FIG. 5 is a diagram showing a method of setting a camera visual field space in a three-dimensional map space.
  • FIG. 6 is a diagram showing roads detected by road detection processing.
  • FIG. 7 is a diagram showing definitions of a straight direction arrow and a right / left turn direction arrow.
  • FIG. 8 is a diagram showing the relationship between the distance D and the angle θ.
  • FIG. 9 is a cross-sectional view showing the angle θ between the right/left-turn-direction arrow and the ground plane.
  • FIG. 10 is a diagram showing a guide arrow at a point away from the guidance target intersection.
  • FIG. 11 is a diagram showing a guide arrow at a point approaching the guidance target intersection.
  • FIG. 12 is a diagram showing the relationship between distance and height.
  • FIG. 13 is a diagram showing a guide arrow at a point near the guidance target intersection.
  • FIG. 14 is a diagram showing the relationship between distance and transparency.
  • FIG. 15 is a diagram showing a prior-art guide arrow at a point away from the guidance target intersection.
  • FIG. 16 is a diagram showing a prior-art guide arrow at a point near the guidance target intersection.
  • FIG. 1 is a block diagram showing an overall configuration of a navigation apparatus 100 according to an embodiment of the present invention.
  • The navigation device 100 includes a video acquisition unit 101, a positioning unit 102, a map database 103, an input unit 104, a control unit 105, a route search unit 106, an arrow generation unit 107, an arrow control unit 108, a display control unit 109, and a display unit 110.
  • the video acquisition unit 101 is, for example, a camera that captures a live-action video in front of the vehicle.
  • the positioning unit 102 is a positioning sensor that acquires information related to the position of the host vehicle, such as a GPS (Global Positioning System) or a gyro attached to the vehicle.
  • the map database 103 is, for example, an HDD or a DVD that stores map information such as data on roads and intersections.
  • The present invention is not limited to this; the map information may instead be downloaded to the map database 103 as needed by communication means (for example, a mobile phone; not shown).
  • FIG. 2 shows data extracted from the map information stored in the map database 103, which is related to the present embodiment.
  • the map database 103 includes (A) node data, (B) interpolation node data, and (C) link data.
  • Node data represents points where the road branches in several directions, such as intersections and merge points. Each node consists of position information such as latitude and longitude, the number of links (described later) connected to the node, and the IDs of those links.
  • Interpolation node data exists on a link (described later) and represents the inflection points used to express the shape of the link when the link is not a straight line. Each interpolation node consists of position information such as latitude and longitude and the ID of the link on which it exists.
  • Link data represents a road connecting nodes. Each link consists of the start node and end node that are its endpoints, the link length (in units such as meters or kilometers), the number of interpolation nodes described above, the road width, and the link ID.
  • FIG. 3 shows an example of the road and intersection network formed by such map data. As shown in FIG. 3, three or more links connect to a node, each link connects to another node, and interpolation nodes that control the link shape exist on a link. If the link is a straight road, interpolation nodes do not necessarily exist. (A minimal data-structure sketch follows.)
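  • The sketch below illustrates one possible in-memory layout for the (A) node, (B) interpolation node, and (C) link records described above. The field names and types are assumptions made for illustration; the patent does not specify a schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:                 # (A) node data: an intersection or merge point
    node_id: int
    lat: float              # latitude
    lon: float              # longitude
    link_ids: List[int] = field(default_factory=list)  # IDs of connected links

@dataclass
class InterpolationNode:    # (B) interpolation node data: a shape point on a link
    lat: float
    lon: float
    link_id: int            # ID of the link on which this point exists

@dataclass
class Link:                 # (C) link data: a road connecting two nodes
    link_id: int
    start_node: int
    end_node: int
    length_m: float         # link length in meters
    interp_count: int       # number of interpolation nodes on the link
    width_m: float          # road width in meters
```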
  • the input unit 104 is, for example, a remote controller, a touch panel, or a voice input microphone for inputting information on the destination to the navigation device.
  • The control unit 105 controls the navigation device 100 as a whole.
  • The route search unit 106 refers to the map database 103 and searches for the optimum route to the destination based on the destination information input from the input unit 104 and the vehicle position information acquired by the positioning unit 102.
  • The arrow generation unit 107 generates a guide arrow along the road that corresponds to the guide route searched by the route search unit 106, among the roads detected by the road detection process in the three-dimensional space image.
  • The arrow control unit 108 controls the angle that the portion of the guide arrow indicating the left/right turn direction forms with the ground plane of the 3D map.
  • the display control unit 109 causes the display unit 110 to display the guidance arrow obtained by the control of the arrow control unit 108 on the photographed video.
  • FIG. 4 is a flowchart showing the operation of the navigation device 100 according to the embodiment of the present invention.
  • First, the control unit 105 determines whether a destination has been set from the input unit 104 (step S401). If no destination is set in step S401, the control unit 105 returns the process to step S401. On the other hand, when a destination is set in step S401, the route search unit 106 searches for a route to the destination with reference to the map database 103, based on the vehicle position information acquired by the positioning unit 102 (step S402). When a route to the destination has been found, the control unit 105 starts guidance (step S403).
  • Next, the control unit 105 calculates the visual field space of the camera in the 3D map space stored in the map database 103 (step S404).
  • The visual field space is calculated based on the camera position, camera angle (horizontal angle, elevation angle), focal length, and image size, which are the parameters that determine the imaging direction and imaging range of the video acquisition unit 101.
  • The position information is represented by latitude, longitude, and altitude.
  • the visual field space of the camera is obtained, for example, by a method as shown in FIG.
  • First, a point (point F) is obtained by advancing from the camera position (viewpoint E) by the focal length f in the camera-angle direction, and a plane of horizontal size x by vertical size y corresponding to the image size (the camera screen) is set there so as to be perpendicular to the vector connecting viewpoint E and point F.
  • Next, the control unit 105 obtains the three-dimensional space formed by the half-lines that start at viewpoint E and pass through the four corner points of the camera screen. This three-dimensional space theoretically extends to infinity, but it is truncated at an appropriate distance from viewpoint E to form the visual field space.
  • Note that the control unit 105 may instead calculate the camera's field of view in a 2D map space, using a 2D map obtained by excluding the altitude information from the 3D map.
  • The parameters for determining the imaging direction and imaging range are not limited to those described above; other parameters, such as the angle of view, may be used in the calculation as long as the imaging direction and imaging range can be determined. A minimal sketch of this visual-field computation follows.
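  • As a concrete illustration of the procedure above, the sketch below computes point F, the camera screen, and the truncated visual field space. The function name, parameter names, and axis conventions are assumptions for this sketch, not part of the patent.

```python
import numpy as np

def view_frustum(E, yaw_deg, pitch_deg, f, width, height, far):
    """Visual-field computation sketched from FIG. 5: point F lies at focal
    length f from viewpoint E along the camera angle; the camera screen is an
    x-by-y plane at F perpendicular to E-F; the field of view is the space
    swept by half-lines from E through the screen corners, cut off at `far`."""
    yaw, pitch = np.radians(yaw_deg), np.radians(pitch_deg)
    d = np.array([np.cos(pitch) * np.cos(yaw),      # unit vector of the
                  np.cos(pitch) * np.sin(yaw),      # camera-angle direction
                  np.sin(pitch)])
    E = np.asarray(E, dtype=float)
    F = E + f * d                                   # point F
    right = np.cross(d, [0.0, 0.0, 1.0])            # spans the screen plane,
    right /= np.linalg.norm(right)                  # perpendicular to E-F
    up = np.cross(right, d)
    corners = [F + sx * (width / 2) * right + sy * (height / 2) * up
               for sx in (-1, 1) for sy in (-1, 1)]
    # Half-lines from E through the four screen corners, truncated at `far`.
    far_corners = [E + far * (c - E) / np.linalg.norm(c - E) for c in corners]
    return corners, far_corners
```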
  • Next, the control unit 105 determines whether the vehicle has reached the destination (step S405). If the destination has been reached in step S405, the control unit 105 ends the process. On the other hand, if it is determined in step S405 that the destination has not been reached, the control unit 105 performs a road detection process that detects the roads existing in the camera's field of view and their positions in the 3D map space (step S406).
  • In the road detection process, the control unit 105 obtains the overlap between the camera's visual field space and the road area in the three-dimensional map space.
  • FIG. 6 shows the roads detected by the road detection process.
  • FIG. 6 shows the 3D map space and the camera's visual field space viewed from above.
  • The shapes and widths of the roads in the vicinity of the vehicle are extracted based on the node data, interpolation node data, and link data.
  • The roads enclosed by the visual field space (the hatched portion) are detected as roads existing in the camera's visual field space; a simplified sketch of this overlap test follows.
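  • Viewed from above as in FIG. 6, the overlap test reduces to asking whether road shape points fall inside a convex view polygon. The sketch below (reusing the Link record from the earlier sketch, with a hypothetical `shape_points` lookup) is one simple way to do this; the patent does not prescribe a particular algorithm.

```python
def inside_convex(poly, p):
    """True if 2D point p lies inside the convex polygon poly
    (vertices given counter-clockwise)."""
    n = len(poly)
    for i in range(n):
        ax, ay = poly[i]
        bx, by = poly[(i + 1) % n]
        # Cross product of edge a->b with a->p; negative means p is outside.
        if (bx - ax) * (p[1] - ay) - (by - ay) * (p[0] - ax) < 0:
            return False
    return True

def detect_roads(links, shape_points, view_poly):
    """Return the links having at least one shape point inside the top-down
    view polygon (viewpoint plus truncated far edge), as in FIG. 6."""
    return [link for link in links
            if any(inside_convex(view_poly, pt)
                   for pt in shape_points[link.link_id])]
```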
  • Next, the arrow generation unit 107 generates a guide arrow shaped along the road corresponding to the guide route searched by the route search unit 106, among the roads detected by the road detection process in the 3D map space (step S407).
  • Then, the arrow control unit 108 separates the guide arrow into a straight-direction arrow and a right/left-turn-direction arrow, and determines the arrangement of each on the road corresponding to the guide route in the three-dimensional map space. This separation is performed by determining, based on the information of the nodes constituting the guide arrow, which node is the separation point between the straight-direction arrow and the right/left-turn-direction arrow (a sketch of this separation appears after the FIG. 7 definitions below).
  • FIG. 7 is a diagram for explaining the definitions of the straight direction arrow and the right / left turn direction arrow.
  • As shown in FIG. 7, the arrow placed on the road the vehicle is currently traveling (road R1) is the straight-direction arrow, and the arrow placed on the road onto which the vehicle turns left or right at the guidance target intersection (road R2) is the right/left-turn-direction arrow.
  • FIG. 7 merely explains the definitions of the straight-direction arrow and the right/left-turn-direction arrow; it does not show the arrangement of guide arrows actually determined by the arrangement process.
  • The shape of the guide arrow is not limited to the arrow figure shown in FIG. 7; for example, a polyline figure obtained by removing the triangle at the tip of the arrow figure may be used.
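  • A minimal sketch of the separation step: given the points along the guide route and the index of the separation node (the guidance target intersection), the polyline splits into the two arrows defined in FIG. 7. The function and parameter names are illustrative assumptions.

```python
def split_guide_arrow(route_points, turn_index):
    """Split the guide-arrow polyline into the straight-direction arrow
    (on road R1, up to the guidance target intersection) and the
    right/left-turn-direction arrow (on road R2, beyond it)."""
    straight = route_points[:turn_index + 1]  # includes the separation node
    turn = route_points[turn_index:]          # from the separation node onward
    return straight, turn
```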
  • Then, the arrow control unit 108 arranges the straight-direction arrow so that it lies flat on the corresponding road in the three-dimensional map space, parallel to the ground plane (step S408).
  • On the other hand, the arrow control unit 108 changes the arrangement of the right/left-turn-direction arrow in accordance with the distance from the vehicle position to the guidance target intersection.
  • Specifically, the arrow control unit 108 first calculates the distance from the vehicle position to the next guidance target intersection through which the vehicle will pass (step S409). In the following, it is assumed that the distance from the vehicle position to the guidance target intersection is calculated periodically.
  • Next, the arrow control unit 108 determines whether the distance D from the vehicle position to the guidance target intersection is equal to or less than a predetermined distance D1 (step S410).
  • If the distance D is greater than the distance D1, the arrow control unit 108 arranges only the straight-direction arrow, without arranging the right/left-turn-direction arrow, and shifts the process to the projection process for the guide arrow composed of the straight-direction arrow (steps S410 → S412).
  • In this explanation, the distance D1 is 200 m. Also, if there is no guidance target intersection on the route from the vehicle position to the destination, the right/left-turn-direction arrow is not displayed.
  • On the other hand, if it is determined in step S410 that the distance D from the vehicle position to the guidance target intersection is equal to or less than the distance D1 (200 m), the arrow control unit 108 arranges the right/left-turn-direction arrow (steps S410 → S411).
  • The right/left-turn-direction arrow is arranged on the corresponding road in the 3D map space at an angle to the ground plane that corresponds to the distance from the vehicle position to the guidance target intersection.
  • Specifically, the right/left-turn-direction arrow is arranged so that the angle θ of the guide arrow with respect to the ground plane becomes smaller as the distance from the vehicle position to the guidance target intersection becomes shorter. For example, it is arranged so that the angle θ satisfies (Equation 1).
  • FIG. 9 is a view of a cross-section that passes through road R1 in the 3D map space and is perpendicular to the ground plane. As shown in FIG. 9, when the distance from the vehicle position to the guidance target intersection is 200 m, the angle θ is 90 degrees and the right/left-turn-direction arrow stands upright (see FIG. 9(a)).
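  • (Equation 1) is not reproduced in this text, so the sketch below assumes the simplest relation consistent with FIG. 8 and the description: θ falls linearly from 90 degrees at D = D1 (200 m) to 0 degrees at the intersection.

```python
def arrow_angle(D, D1=200.0):
    """Angle theta (degrees) between the right/left-turn-direction arrow and
    the ground plane. Assumes theta = 90 * D / D1, matching the described
    endpoints: upright (90 deg) at D = D1 and flat (0 deg) at the intersection."""
    D = max(0.0, min(D, D1))   # clamp to the controlled range 0..D1
    return 90.0 * D / D1
```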
  • Next, the arrow control unit 108 performs a projection process on the guide arrow composed of the straight-direction arrow and the right/left-turn-direction arrow, with the camera screen shown in FIG. 5 as the projection plane (step S412).
  • The display control unit 109 then superimposes the projected guide arrow, as shown in FIG. 10 and FIG. 11, on the road shown in the photographed image (the road corresponding to the guide route) and displays it on the display unit 110.
  • At a point away from the guidance target intersection, the display is performed with the right/left-turn-direction arrow having height, so the arrow is displayed large enough to be easily recognized from a distance.
  • At a point near the guidance target intersection, the right/left-turn-direction arrow is arranged on the ground plane with no height, so it does not hide the background image.
  • FIG. 12 shows cross-sectional views similar to FIG. 9; FIG. 12(a) shows the case where the vehicle position is farthest from the guidance target intersection, and FIG. 12(b) the case where the vehicle position is closest to it.
  • In this way, the height of the three-dimensionally displayed right/left-turn-direction arrow gradually decreases as the distance from the vehicle position to the guidance target intersection decreases.
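  • The patent states only that the height decreases with the distance D; a linear fall-off, mirroring the angle control above, is one natural reading. The maximum height h_max below is an illustrative value, not taken from the patent.

```python
def arrow_height(D, D1=200.0, h_max=5.0):
    """Height of the three-dimensionally displayed right/left-turn-direction
    arrow (FIG. 12), assumed to fall linearly from h_max at D = D1 to zero
    (flat on the ground plane) at the guidance target intersection."""
    D = max(0.0, min(D, D1))
    return h_max * D / D1
```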
  • Alternatively, the guide object may be composed of two separate guide objects: a guide object displayed on the ground plane and a guide object displayed three-dimensionally above it.
  • The guide object displayed separately and three-dimensionally above is, for example, the bowl-shaped object shown in FIG.
  • Here, three-dimensional display means displaying an object, which would otherwise be displayed on the ground plane, with a height extending away from the ground plane. In this case, if the object controlled by the object control unit 108 is the bowl-shaped object, the same effects as described above can be obtained.
  • Furthermore, the object control unit 108 may control the transparency, rather than the size, of the guide object displayed separately and three-dimensionally above.
  • In this case, the guide object displayed separately above is arranged so that its transparency α increases as the vehicle position gets closer to the guide point.
  • For example, the transparency α satisfies (Equation 2).
  • That is, the transparency α changes in accordance with the distance D to the guide point as shown in FIG. 14. This ensures the visibility of the separately displayed three-dimensional guide object at a point away from the intersection, while the guide object displayed three-dimensionally above becomes transparent at a point near the intersection, so the background image is not hidden. In other words, a guidance display is achieved that combines the ease of viewing the separately displayed three-dimensional guide object at a position away from the intersection with the visibility of the background image immediately before the intersection.
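  • Like (Equation 1), (Equation 2) is not reproduced in this text; the sketch below assumes a linear relation in which the object is opaque when the turn display appears at D = D1 and fully transparent immediately before the guide point.

```python
def arrow_alpha(D, D1=200.0):
    """Transparency alpha of the separately displayed three-dimensional guide
    object (FIG. 14): assumed 0.0 (opaque) at D = D1, rising linearly to 1.0
    (fully transparent) as the vehicle reaches the guide point."""
    D = max(0.0, min(D, D1))
    return 1.0 - D / D1
```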
  • The photographed image is not limited to live video; the present invention may also be applied when the guide object is superimposed on a photographed image stored in advance on a storage medium or obtained through communication.
  • In the present embodiment, the present invention is applied to a driver's-view display using live-action video; the same effect can be obtained when it is applied to a driver's-view display using a three-dimensional map display.
  • As described above, according to the present invention, the display of the portion of the guide arrow that indicates the right/left turn direction with respect to the ground plane of the three-dimensional space image is controlled based on the distance from the vehicle position to the guide point.
  • The navigation device of the present invention is useful as a car navigation device installed in a vehicle, and is also useful as a navigation device for mobile phones.

Abstract

A navigation device in which the display size or transparency of the portion of a guide arrow indicating a direction to turn left or right is controlled based on the distance from the vehicle position to the target point to which the vehicle is to be guided. This configuration reduces the possibility that the background image is concealed by the guide arrow immediately before the target point. The navigation device comprises an object generation section (107) for generating a guide object, an object control section (108) for controlling, based on the distance from the vehicle position to the target point, the display size and transparency of a specific portion of the guide object, and a display control means (109) for performing, based on the guide object obtained through the control of the object control section (108), a navigation display in driver's-view display mode.
PCT/JP2007/060922 2006-06-05 2007-05-29 Navigation device WO2007142084A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007548632A JPWO2007142084A1 (ja) 2006-06-05 2007-05-29 ナビゲーション装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006156154 2006-06-05
JP2006-156154 2006-06-05

Publications (1)

Publication Number Publication Date
WO2007142084A1 true WO2007142084A1 (fr) 2007-12-13

Family

ID=38801341

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/060922 WO2007142084A1 (fr) 2006-06-05 2007-05-29 Navigation device

Country Status (2)

Country Link
JP (1) JPWO2007142084A1 (fr)
WO (1) WO2007142084A1 (fr)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07262492A (ja) * 1994-03-25 1995-10-13 Alpine Electron Inc In-vehicle navigation device
JPH09325042A (ja) * 1996-06-05 1997-12-16 Matsushita Electric Ind Co Ltd Travel position display device
JPH1089990A (ja) * 1996-09-13 1998-04-10 Alpine Electron Inc Navigation device
JPH10103997A (ja) * 1996-09-27 1998-04-24 Toyota Motor Corp Vehicle route guidance device
JP2000155895A (ja) * 1998-11-24 2000-06-06 Sony Corp Navigation device
JP2003083761A (ja) * 2001-09-13 2003-03-19 Alpine Electronics Inc Navigation device
JP2004233153A (ja) * 2003-01-29 2004-08-19 Xanavi Informatics Corp In-vehicle navigation device and map image display method

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011121401A (ja) * 2009-12-08 2011-06-23 Toshiba Corp Display device, display method, and moving body
WO2014050172A1 (fr) * 2012-09-28 2014-04-03 Aisin AW Co., Ltd. Intersection guide system, method, and program
CN104603578A (zh) * 2012-09-28 2015-05-06 Aisin AW Co., Ltd. Intersection guide system, method, and program
US9508258B2 (en) 2012-09-28 2016-11-29 Aisin Aw Co., Ltd. Intersection guide system, method, and program
JP2017021019A (ja) * 2015-07-08 2017-01-26 Nissan Motor Co., Ltd. Vehicle display device and vehicle display method
JP7048202B2 (ja) 2015-07-08 2022-04-05 Nissan Motor Co., Ltd. Vehicle display device and vehicle display method
US11209285B2 (en) 2015-09-30 2021-12-28 Nissan Motor Co., Ltd. Vehicular display device
WO2017056211A1 (fr) * 2015-09-30 2017-04-06 Nissan Motor Co., Ltd. Vehicular display device
JPWO2017056211A1 (ja) 2015-09-30 2018-09-13 Nissan Motor Co., Ltd. Vehicular display device
JP2017091504A (ja) * 2016-08-26 2017-05-25 OPTiM Corporation Remote instruction method and program for remote terminal
WO2019097762A1 (fr) * 2017-11-17 2019-05-23 Aisin AW Co., Ltd. Superimposed-image display device and computer program
JP2019095215A (ja) * 2017-11-17 2019-06-20 Aisin AW Co., Ltd. Superimposed-image display device and computer program
US11525694B2 (en) 2017-11-17 2022-12-13 Aisin Corporation Superimposed-image display device and computer program
US11193785B2 (en) 2017-12-28 2021-12-07 Alpine Electronics, Inc. In-vehicle system
EP3505382A1 (fr) * 2017-12-28 2019-07-03 Alpine Electronics, Inc. In-vehicle system
JP2021099343A (ja) * 2019-06-21 2021-07-01 Pioneer Corporation Display device, program, and storage medium

Also Published As

Publication number Publication date
JPWO2007142084A1 (ja) 2009-10-22

Similar Documents

Publication Publication Date Title
JP4560090B2 (ja) Navigation device and navigation method
JP4550927B2 (ja) Navigation device
JP2009020089A (ja) Navigation device, navigation method, and navigation program
WO2007142084A1 (fr) Navigation device
WO2007129382A1 (fr) Navigation device and navigation method
JP2007121001A (ja) Navigation device
US20110288763A1 (en) Method and apparatus for displaying three-dimensional route guidance
JP4899746B2 (ja) Route guidance display device
JP2008128827A (ja) Navigation device, navigation method, and program therefor
JP2007198962A (ja) Vehicle guidance display device
JP5218607B2 (ja) Navigation device
KR102222102B1 (ko) Augmented reality navigation system and route guidance method of augmented reality navigation system
JP4105609B2 (ja) Three-dimensional display method for navigation and navigation device
JP2007206014A (ja) Navigation device
JP2013225275A (ja) Three-dimensional image display system
JPWO2006109527A1 (ja) Navigation device and navigation method
JP2007256048A (ja) Navigation device
JP6968069B2 (ja) Display control device and display control method
JP2008002965A (ja) Navigation device and method
KR100886330B1 (ко) User view output system and method
JPWO2011121788A1 (ja) Navigation device, information display device, navigation method, navigation program, and recording medium
JP2007263849A (ja) Navigation device
JP2008157680A (ja) Navigation device
JP2009264835A (ja) Navigation device, method, and program
JP2007322371A (ja) Navigation device

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2007548632

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07744341

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 07744341

Country of ref document: EP

Kind code of ref document: A1