WO2007142084A1 - Navigation device - Google Patents

Navigation device Download PDF

Info

Publication number
WO2007142084A1
WO2007142084A1 (PCT/JP2007/060922)
Authority
WO
WIPO (PCT)
Prior art keywords
control unit
navigation device
distance
guide
arrow
Prior art date
Application number
PCT/JP2007/060922
Other languages
French (fr)
Japanese (ja)
Inventor
Takashi Akita
Takahiro Kudoh
Tsuyoshi Kindo
Original Assignee
Panasonic Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Corporation filed Critical Panasonic Corporation
Priority to JP2007548632A priority Critical patent/JPWO2007142084A1/en
Publication of WO2007142084A1 publication Critical patent/WO2007142084A1/en

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/3635Guidance using 3D or perspective road maps
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/3632Guidance using simplified or iconic instructions, e.g. using arrows
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968Systems involving transmission of navigation instructions to the vehicle
    • G08G1/0969Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/10Map spot or coordinate position indicators; Map reading aids
    • G09B29/106Map spot or coordinate position indicators; Map reading aids using electronic means

Definitions

  • the present invention relates to a navigation device, and more specifically to a navigation device that displays a guidance object superimposed on a three-dimensional space image.
  • a navigation system that performs a route search from a current vehicle position to a set destination and guides a route using a stored map data to a guide point existing on the route.
  • Recent navigation devices can store more detailed map data owing to the large capacity of recording media such as HDDs (Hard Disk Drives) and DVDs (Digital Versatile Disks).
  • HDD Hard Disk Drive
  • DVD Digital Versatile Disk
  • 3D bird's-eye view displays and driver's-view displays using realistic 3D CG are being used for map display in pursuit of ease of understanding and realism.
  • Patent Document 1 JP-A-7-63572
  • Patent Document 2 Japanese Patent Laid-Open No. 2003-337033
  • because the portion of the guide arrow indicating the right/left turn direction is displayed standing perpendicular to the ground plane of the three-dimensional image, that portion is displayed with a large height.
  • the background image is concealed by the guidance arrow immediately before the guidance point. For this reason, it is difficult for the user to recognize the situation around the guide point.
  • the ground plane refers to the tangent plane between the vehicle and the road. Hereinafter, it is referred to as a ground plane.
  • the present invention has been made in view of the above problems.
  • the height or transparency of the portion of the guide arrow that indicates the right/left turn direction relative to the ground plane of the three-dimensional space image is controlled based on the distance from the vehicle position to the guide point, so that
  • An object of the present invention is to provide a navigation device that reduces the extent to which the background image is hidden by the guide arrow.
  • the 3D spatial image includes a real image and a 3D map image.
  • An aspect of the present invention is directed to a navigation apparatus.
  • the present invention provides an object generation unit that generates a guidance object; an object control unit that controls the display size or transparency of a predetermined portion of the guidance object based on the distance from the vehicle position to the guide point; and a display control unit that performs navigation display in a driver's view based on the guidance object obtained by the control of the object control unit.
  • the object control unit preferably controls the display size by controlling the inclination of a predetermined portion of the guidance object based on the distance from the vehicle position to the guide point.
  • the object control unit preferably controls the inclination by controlling the angle of the portion of the guidance object that indicates a right or left turn, based on the distance from the vehicle position to the guide point.
  • the object control unit preferably reduces the angle as the distance from the vehicle position to the guide point becomes shorter.
  • the object control unit preferably reduces the angle within a range from 90 degrees to 0 degrees.
  • the object control unit preferably controls the display size by controlling the height of a predetermined portion of the guidance object based on the distance from the vehicle position to the guide point.
  • the object control unit preferably controls the height of the portion of the guidance object that indicates a right or left turn, based on the distance from the vehicle position to the guide point.
  • the object control unit preferably lowers the height as the distance from the vehicle position to the guide point becomes shorter.
  • the object control unit preferably increases the transparency of the portion of the guidance object that indicates a right or left turn as the distance from the vehicle position to the guide point becomes shorter.
  • the display control unit superimposes and displays the guidance object on the three-dimensional map image.
  • the display control unit superimposes and displays the guidance object on the photographed image.
  • the display of the portion of the guide arrow that indicates the right/left turn direction relative to the ground plane of the three-dimensional space image is controlled based on the distance from the vehicle position to the guide point.
  • FIG. 1 is a block diagram showing an overall configuration of a navigation device according to an embodiment of the present invention.
  • FIG. 2 is a diagram showing an example of information stored in a map database.
  • FIG. 3 is a diagram showing an example of a road and branch point network formed by nodes, interpolation nodes, and links stored in a map database.
  • FIG. 4 is a flowchart showing the operation of the navigation device according to the embodiment of the present invention.
  • FIG. 5 is a diagram showing a method of setting a camera visual field space in a three-dimensional map space.
  • FIG. 6 is a diagram showing roads detected by road detection processing.
  • FIG. 7 is a diagram showing definitions of a straight direction arrow and a right / left turn direction arrow.
  • FIG. 8 is a diagram showing the relationship between the distance D and the angle ⁇ .
  • FIG. 9 is a cross-sectional view showing an angle ⁇ between a right / left turn direction arrow and the ground plane.
  • FIG. 10 is a diagram showing a guidance arrow at a point away from the guidance target intersection.
  • FIG. 11 is a diagram showing a guidance arrow at a point approaching the guidance target intersection.
  • FIG. 12 is a diagram showing the relationship between distance and height.
  • FIG. 13 is a diagram showing a guide arrow at a point away from the guidance target intersection according to the prior art.
  • FIG. 14 is a diagram showing the relationship between distance and transparency.
  • FIG. 15 is a diagram showing a guide arrow at a point away from the guidance target intersection according to the prior art.
  • FIG. 16 is a diagram showing a guide arrow at a point near the guidance target intersection according to the prior art.
  • FIG. 1 is a block diagram showing an overall configuration of a navigation apparatus 100 according to an embodiment of the present invention.
  • a navigation device 100 includes a video acquisition unit 101, a positioning unit 102, a map database 103, an input unit 104, a control unit 105, a route search unit 106, an arrow generation unit 107, an arrow control unit 108, a display control unit 109, and a display unit 110.
  • the video acquisition unit 101 is, for example, a camera that captures a live-action video in front of the vehicle.
  • the positioning unit 102 is a positioning sensor that acquires information related to the position of the host vehicle, such as a GPS (Global Positioning System) or a gyro attached to the vehicle.
  • GPS Global Positioning System
  • the map database 103 is, for example, an HDD or a DVD that stores map information such as data on roads and intersections.
  • the present invention is not limited to this; the map information may be downloaded as appropriate to the map database 103 by a communication means (for example, a mobile phone; not shown).
  • FIG. 2 shows data extracted from the map information stored in the map database 103, which is related to the present embodiment.
  • the map database 103 includes (A) node data, (B) interpolation node data, and (C) link data.
  • the node data represents points where a road branches in several directions, such as intersections and merge points, and consists of position information such as latitude and longitude for each node, the number of links (described later) connected to the node, and the IDs of those links.
  • the interpolation node data represents turning points that exist on a link (described later) to express the shape of the link when the link is not a straight line, and consists of position information and the ID of the link on which the interpolation node exists.
  • the link data represents a road connecting nodes, and consists of a start node and an end node that are the end points of the link, the link length (in units such as meters or kilometers), the number of interpolation nodes mentioned above, the road width, and the link ID.
  • FIG. 3 shows an example of a road and intersection network formed by such map data. As shown in FIG. 3, a node is connected to other nodes by three or more links, and interpolation nodes for controlling the link shape exist on a link. If the link is a straight road, interpolation nodes do not necessarily exist.
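As a rough illustration, the node, interpolation node, and link records described above can be modeled as follows. This is a hypothetical Python sketch; the field names and types are illustrative assumptions, since the patent does not specify a concrete schema.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A branch point such as an intersection or merge point."""
    node_id: int
    lat: float                      # latitude
    lon: float                      # longitude
    link_ids: list = field(default_factory=list)  # IDs of connected links

@dataclass
class InterpolationNode:
    """A shape point on a link, used when the link is not straight."""
    lat: float
    lon: float
    link_id: int                    # ID of the link this point lies on

@dataclass
class Link:
    """A road segment connecting two nodes."""
    link_id: int
    start_node: int
    end_node: int
    length_m: float                 # link length in meters
    width_m: float                  # road width in meters
    interpolation_nodes: list = field(default_factory=list)

# A straight link needs no interpolation nodes; a curved one carries them.
n1 = Node(1, 35.0000, 135.0000, [10])
n2 = Node(2, 35.0010, 135.0010, [10])
curve = Link(10, 1, 2, 140.0, 6.0,
             [InterpolationNode(35.0005, 135.0004, 10)])
```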
  • the input unit 104 is, for example, a remote controller, a touch panel, or a voice input microphone for inputting information on the destination to the navigation device.
  • the control unit 105 controls the navigation device 100 as a whole.
  • the route search unit 106 searches for the optimum route to the destination with reference to the map database 103, based on the destination information input from the input unit 104 and the vehicle position information acquired by the positioning unit 102.
  • the arrow generation unit 107 generates a guide arrow along the road that corresponds to the guide route searched by the route search unit 106, among the roads detected by the road detection process in the three-dimensional space image.
  • the arrow control unit 108 controls the angle that the portion of the guide arrow indicating the right/left turn direction forms with the ground plane of the 3D map.
  • the display control unit 109 causes the display unit 110 to display the guidance arrow obtained by the control of the arrow control unit 108 on the photographed video.
  • FIG. 4 is a flowchart showing the operation of the navigation device 100 according to the embodiment of the present invention.
  • the control unit 105 determines whether or not a destination has been set from the input unit 104 (step S401). If the destination has not been set in step S401, the control unit 105 returns the process to step S401. On the other hand, if the destination has been set in step S401, the route search unit 106 searches for a route to the destination with reference to the map database 103, based on the vehicle position information acquired by the positioning unit 102 (step S402). When a route to the destination has been found, the control unit 105 starts guidance (step S403).
  • the control unit 105 calculates the visual field space of the camera in the 3D map space based on the 3D map stored in the map database 103 and on the parameters that determine the imaging direction and imaging range of the video acquisition unit 101: the camera position, the camera angle (horizontal angle and elevation angle), the focal length, and the image size (step S404).
  • in the 3D map, position information is represented by latitude, longitude, and altitude.
  • the visual field space of the camera is obtained, for example, by a method as shown in FIG.
  • a point (point F) is obtained by advancing from the camera position (viewpoint E) by the focal length f in the camera angle direction, and a horizontal x by vertical y plane (camera screen) corresponding to the image size is set there so as to be perpendicular to the vector connecting viewpoint E and point F.
  • the control unit 105 obtains the three-dimensional space formed by the half-lines drawn from the viewpoint E through the four corner points of the camera screen. This three-dimensional space theoretically extends to infinity, but it is cut off at an appropriate distance from the viewpoint E to form the visual field space.
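The construction of point F can be sketched as below. This is a hypothetical Python sketch in a local east-north-up coordinate frame; the coordinate convention is an assumption, since the patent describes the geometry only in words.

```python
import math

def camera_screen_center(E, yaw_deg, pitch_deg, f):
    """Return point F: the point advanced from viewpoint E by the focal
    length f along the camera angle (horizontal angle yaw_deg, elevation
    pitch_deg). The camera screen is then placed at F, perpendicular to
    the vector from E to F."""
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    direction = (math.cos(pitch) * math.sin(yaw),   # east component
                 math.cos(pitch) * math.cos(yaw),   # north component
                 math.sin(pitch))                   # up component
    return tuple(e + f * d for e, d in zip(E, direction))

# Camera 1.2 units above ground, looking due north and level:
F = camera_screen_center((0.0, 0.0, 1.2), 0.0, 0.0, 0.05)
# F lies 0.05 units north of E at the same height.
```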
  • the control unit 105 may calculate the visual field space of the camera in a 2D map space, using a 2D map obtained by excluding the altitude information from the 3D map instead of the 3D map.
  • the parameters for determining the imaging direction and the imaging range are not limited to those described above, and may be calculated using other parameters such as the angle of view as long as the imaging direction and the imaging range are determined.
  • the control unit 105 determines whether or not the vehicle has reached the destination (step S405). If the destination has been reached in step S405, the control unit 105 ends the process. On the other hand, if it is determined in step S405 that the destination has not been reached, the control unit 105 performs a road detection process for detecting the roads existing in the visual field space of the camera and their positions in the 3D map space (step S406).
  • the control unit 105 obtains an overlap between the camera view space and the road area in the three-dimensional map space.
  • Figure 6 shows the roads detected by the road detection process.
  • FIG. 6 shows the 3D map space and the camera visual field space viewed from above.
  • the shapes and widths of the roads in the vicinity of the vehicle are extracted based on the node data, the interpolation node data, and the link data.
  • the roads enclosed by the visual field space (the hatched portion) are detected as roads existing in the visual field space of the camera.
  • the arrow generation unit 107 generates a guide arrow shaped along the road corresponding to the guide route searched by the route search unit 106, among the roads detected by the road detection process in the 3D map space (step S407).
  • the arrow control unit 108 separates the guide arrow into a straight direction arrow and a right/left turn direction arrow, and determines their arrangement on the road corresponding to the guide route in the three-dimensional map space. This separation is performed by determining which node is the separation point between the straight direction arrow and the right/left turn direction arrow, based on the information of the nodes constituting the guide arrow.
  • FIG. 7 is a diagram for explaining the definitions of the straight direction arrow and the right / left turn direction arrow.
  • the arrow placed on the road being traveled (road R1) is the straight direction arrow, and the arrow placed on the road entered by turning right or left at the guidance target intersection (road R2) is the right/left turn direction arrow.
  • FIG. 7 merely explains the definitions of the straight direction arrow and the right / left turn direction arrow, and does not show the arrangement of the guide arrows actually determined by the arrangement process.
  • the shape of the guide arrow is not limited to the arrow figure shown in FIG. 7; for example, a polygonal line figure obtained by removing the triangle at the tip of the arrow figure may be used.
  • the arrow control unit 108 arranges the straight direction arrow so as to lie parallel to the ground plane on the corresponding road in the three-dimensional map space (step S408).
  • the arrow control unit 108 changes the arrangement of the right/left turn direction arrow in accordance with the distance from the vehicle position to the guidance target intersection.
  • the arrow control unit 108 first calculates the distance from the own vehicle position to the next guidance target intersection through which the vehicle will pass (step S409). In the following, it is assumed that the distance from the vehicle position to the guidance target intersection is calculated periodically.
  • the arrow control unit 108 determines whether the distance D from the vehicle position to the guidance target intersection is equal to or less than a predetermined distance D1 (step S410).
  • if the distance D is greater than the distance D1, the arrow control unit 108 arranges only the straight direction arrow without arranging the right/left turn direction arrow, and shifts the process to the projection process for the guide arrow composed of the straight direction arrow alone (steps S410 → S412). In this explanation, the distance D1 is 200 m. Also, if there is no guidance target intersection on the route from the vehicle position to the destination, the right/left turn direction arrow is not displayed.
  • if it is determined in step S410 that the distance D from the vehicle position to the guidance target intersection is equal to or less than the distance D1 (200 m), the arrow control unit 108 arranges the right/left turn direction arrow (steps S410 → S411).
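The decision made in steps S409 through S411 can be summarized with the following sketch. This is hypothetical Python; the function name is illustrative, and D1 = 200 m as in the explanation.

```python
D1 = 200.0  # meters; threshold distance given in the explanation

def arrows_to_arrange(distance_m, has_guidance_intersection=True):
    """Which arrows the arrow control unit 108 arranges.

    Beyond D1 (or with no guidance target intersection on the route),
    only the straight direction arrow is arranged; within D1 the
    right/left turn direction arrow is arranged as well."""
    if not has_guidance_intersection or distance_m > D1:
        return ["straight"]              # steps S410 -> S412
    return ["straight", "turn"]          # steps S410 -> S411
```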
  • the right/left turn direction arrow is arranged on the corresponding road in the 3D map space at an angle, relative to the ground plane, that corresponds to the distance from the vehicle position to the guidance target intersection.
  • the right/left turn direction arrow is arranged so that the angle θ of the guide arrow with respect to the ground plane becomes smaller as the distance from the vehicle position to the guidance target intersection becomes shorter; for example, it is arranged so that the angle θ satisfies (Equation 1).
  • FIG. 9 is a view of a cross-section that passes through the road R1 in the 3D map space and is perpendicular to the ground plane. As shown in FIG. 9, when the distance from the vehicle position to the guidance target intersection is 200 m, the angle θ is 90 degrees and the right/left turn direction arrow is placed upright (see FIG. 9(a)).
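(Equation 1) itself is not reproduced in this text, but the stated boundary conditions (θ = 90 degrees at 200 m, decreasing toward 0 as the vehicle approaches the intersection) admit a simple sketch such as the following; the linear form is an assumption.

```python
D1 = 200.0  # meters; distance at which the turn arrow stands upright

def turn_arrow_angle(distance_m):
    """Angle theta (degrees) between the right/left turn direction arrow
    and the ground plane. Linear ramp assumed: 90 degrees at D1,
    0 degrees at the guidance target intersection, clamped outside."""
    d = max(0.0, min(distance_m, D1))
    return 90.0 * d / D1
```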
  • the arrow control unit 108 performs a projection process on the guide arrow composed of the straight direction arrow and the right/left turn direction arrow, with the camera screen shown in FIG. 5 as the projection plane (step S412).
  • the display control unit 109 then superimposes the guide arrow on the road shown in the photographed image (corresponding to the guide route) as shown in FIG. 10 and FIG. 11, and displays it on the display unit 110. At a point away from the guidance target intersection, the right/left turn direction arrow is displayed with height, so it is large enough to be easily recognized from a distance. At a point near the guidance target intersection, the right/left turn direction arrow is arranged on the ground plane with no height, so it does not hide the background image.
  • FIG. 12 is a cross-sectional view similar to FIG. 9; FIG. 12(a) shows the case where the vehicle position is farthest from the guidance target intersection, and FIG. 12(b) shows the case where the vehicle position is closest to the guidance target intersection.
  • the height of the three-dimensionally displayed right/left turn direction arrow gradually decreases as the distance from the vehicle position to the guidance target intersection decreases.
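The height control illustrated in FIG. 12 can be sketched in the same way. H_MAX is a hypothetical maximum arrow height and the linear form is an assumption; the text states only that the height decreases monotonically with the remaining distance.

```python
D1 = 200.0    # meters; display threshold for the turn arrow
H_MAX = 5.0   # hypothetical maximum arrow height, in scene units

def turn_arrow_height(distance_m):
    """Height of the 3D right/left turn direction arrow above the
    ground plane, shrinking to zero as the vehicle reaches the
    guidance target intersection."""
    d = max(0.0, min(distance_m, D1))
    return H_MAX * d / D1
```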
  • the guidance object may be composed of two separate guidance objects: a guidance object displayed on the ground plane, and a three-dimensional guidance object displayed separately above it. The guidance object displayed separately in the three-dimensional upward direction is, for example, the bowl-shaped object shown in the figure.
  • the three-dimensional display here means displaying an object in the height direction above the object displayed on the ground plane. In this case, if the object to be controlled by the object control unit 108 is the bowl-shaped object, the same effects as described above can be obtained.
  • the object control unit 108 may control the transparency of the separately displayed three-dimensional guidance object instead of controlling its size.
  • the separately displayed guidance object is arranged so that its transparency α increases as the vehicle position comes closer to the guide point.
  • for example, the transparency α satisfies (Equation 2).
  • the transparency α changes in accordance with the distance D to the guide point, as shown in FIG. 14. This ensures the visibility of the separately displayed three-dimensional guidance object at a point away from the intersection, while at a point near the intersection the guidance object displayed three-dimensionally above becomes transparent, so the background image is not hidden. In other words, a guidance display is achieved that provides both ease of viewing the separately displayed three-dimensional guidance object at a position away from the intersection and visibility of the background image immediately before the intersection.
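(Equation 2) is likewise not reproduced here, but the described behavior — the object visible far from the intersection and fully transparent immediately before it — can be sketched with a linear ramp. The linear form is an assumption, and α = 1.0 here denotes full transparency.

```python
D1 = 200.0  # meters; distance at which the 3D guide object appears

def guide_object_alpha(distance_m):
    """Transparency alpha of the separately displayed 3D guide object:
    0.0 (opaque) at D1 or farther, 1.0 (fully transparent) at the
    guide point, linear in between."""
    d = max(0.0, min(distance_m, D1))
    return 1.0 - d / D1
```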
  • the present invention may also be applied when the guidance object is superimposed on a photographed image stored in advance in a storage medium or obtained through communication.
  • in the above, the present invention has been described as applied to a driver's view display using a live-action image.
  • when the present invention is applied to a driver's view display using a three-dimensional map display, the same effect can be obtained.
  • as described above, the display of the portion of the guide arrow that indicates the right/left turn direction relative to the ground plane of the three-dimensional space image is controlled based on the distance from the vehicle position to the guide point.
  • the navigation device of the present invention is useful as a car navigation device installed in a vehicle. It is also useful as a navigation device for mobile phones.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Navigation (AREA)
  • Instructional Devices (AREA)

Abstract

A navigation device in which the size or transparency of the portion of a guidance arrow that indicates the right/left turning direction is controlled based on the distance from the vehicle position to a target guide point. This construction reduces the possibility that the background image is covered by the guidance arrow immediately before the target point. The navigation device has an object creation section (107) for creating a guidance object, an object control section (108) for controlling, based on the distance from the vehicle position to the target point, the display size or transparency of a specific portion of the guidance object, and a display control section (109) for performing, based on the guidance object obtained by the control of the object control section (108), navigation display in a driver's view display mode.

Description

Specification

Navigation Device

Technical Field

[0001] The present invention relates to a navigation device, and more specifically to a navigation device that displays a guidance object superimposed on a three-dimensional space image.

Background Art

[0002] Conventionally, navigation devices are known that perform a route search from the current vehicle position to a set destination and, using stored map data, provide route guidance for guide points existing on the route.

[0003] Recent navigation devices can store more detailed map data owing to the large capacity of recording media such as HDDs (Hard Disk Drives) and DVDs (Digital Versatile Disks). For this reason, map displays that pursue ease of understanding and realism are used, such as 3D bird's-eye views and driver's-view displays using realistic 3D CG.
[0004] When such a three-dimensional map display is performed, the visibility of the guide arrow (route guidance line) is important. In particular, it is preferable that the three-dimensional map be displayed as seen from the driver's eye level, with the viewpoint lowered to about the height of the vehicle.

[0005] Conventionally, a driving guidance image display method for an in-vehicle navigation device has been proposed in which a front image of the vehicle is acquired by imaging means such as a video camera and, when the vehicle approaches a guide point (an intersection or branch point on the route where a right or left turn is to be made), an arrow image is superimposed as route guidance information indicating the course of the vehicle at the guide point (see, for example, Patent Document 1).

[0006] In such a three-dimensional map display, if the guide point is far from the current vehicle position, the portion of the guide arrow indicating the right/left turn direction is displayed thin, as shown in FIG. 15, and becomes difficult for the user to see.

[0007] To address this, a technique has been proposed in which a guidance object extending along the road after a right or left turn, which is normally displayed thin and has poor visibility, is displayed so as to have height relative to the ground plane (see Patent Document 2).

Patent Document 1: JP-A-7-63572
Patent Document 2: JP-A-2003-337033
Disclosure of the Invention

Problems to Be Solved by the Invention

[0008] However, in the prior art, the portion of the guide arrow indicating the right/left turn direction is displayed standing perpendicular to the ground plane of the three-dimensional image, so that portion has a large height, and as shown in FIG. 16, immediately before the guide point the background image is concealed by the guide arrow. For this reason, it is difficult for the user to recognize the situation around the guide point. Note that the ground plane refers to the tangent plane between the vehicle and the road; hereinafter it is referred to as the ground plane.

[0009] The present invention has been made in view of the above problems. That is, an object of the present invention is to provide a navigation device that reduces the concealment of the background image by the guide arrow immediately before the guide point, by controlling the height or transparency of the portion of the guide arrow that indicates the right/left turn direction relative to the ground plane of the three-dimensional space image, based on the distance from the vehicle position to the guide point. The three-dimensional space image includes photographed images and three-dimensional map images.
課題を解決するための手段  Means for solving the problem
[0010] 本発明の局面は、ナビゲーシヨン装置に向けられている。本発明は、案内オブジェ タトを生成するオブジェクト生成部と、自車位置力 案内点までの距離に基づいて、 案内オブジェクトの所定部分について、表示上の大きさ、又は透明度を制御するォ ブジェクト制御部と、オブジェクト制御部が制御して得られる案内オブジェクトに基づ V、て、ドライバーズビューによってナビゲーシヨン表示する表示制御部とを備える。  [0010] An aspect of the present invention is directed to a navigation apparatus. The present invention provides an object generation unit that generates a guide object, and an object control unit that controls the display size or transparency of a predetermined portion of the guide object based on the distance to the vehicle position force guide point. And a display control unit that displays navigation based on a driver's view based on the guidance object obtained by the control of the object control unit.
[0011] また、オブジェクト制御部は、自車位置力も案内点までの距離に基づいて、案内ォ ブジエタトの所定部分について、傾きを制御することにより、表示上の大きさを制御す ることが好ましい。  [0011] Further, it is preferable that the object control unit controls the size of the display by controlling the inclination of a predetermined portion of the guide object based on the distance from the vehicle to the guide point. .
[0012] Preferably, the object control unit controls the inclination by controlling the angle of the portion of the guide object that indicates a right or left turn, based on the distance from the vehicle position to the guide point.

[0013] Preferably, the object control unit makes the angle smaller as the distance from the vehicle position to the guide point becomes shorter.
[0014] Preferably, the object control unit reduces the angle within a range from 90 degrees to 0 degrees.
[0015] Preferably, the object control unit controls the displayed size of the predetermined portion of the guide object by controlling its height, based on the distance from the vehicle position to the guide point.
[0016] Preferably, the object control unit controls the height of the portion of the guide object that indicates a right or left turn, based on the distance from the vehicle position to the guide point.
[0017] Preferably, the object control unit lowers the height as the distance from the vehicle position to the guide point becomes shorter.
[0018] Preferably, for the portion of the guide object that indicates a right or left turn, the object control unit increases the transparency as the distance from the vehicle position to the guide point becomes shorter.
[0019] Preferably, the display control unit superimposes the guide object on a three-dimensional map image.
[0020] Preferably, the display control unit superimposes the guide object on a live-action image.
Effects of the Invention
[0021] As described above, according to an aspect of the present invention, it is possible to provide a navigation device that, by controlling the displayed size or the transparency of the portion of the guide arrow indicating the right/left-turn direction with respect to the ground plane of a three-dimensional space image based on the distance from the vehicle position to the guide point, reduces the concealment of the background image by the guide arrow immediately before the guide point.
Brief Description of the Drawings
[0022]
[FIG. 1] FIG. 1 is a block diagram showing the overall configuration of a navigation device according to an embodiment of the present invention.
[FIG. 2] FIG. 2 is a diagram showing an example of the information stored in the map database.
[FIG. 3] FIG. 3 is a diagram showing an example of a network of roads and branch points formed by the nodes, interpolation nodes, and links stored in the map database.
[FIG. 4] FIG. 4 is a flowchart showing the operation of the navigation device according to the embodiment of the present invention.
[FIG. 5] FIG. 5 is a diagram showing a method of setting the camera visual field space in a three-dimensional map space.
[FIG. 6] FIG. 6 is a diagram showing roads detected by the road detection process.
[FIG. 7] FIG. 7 is a diagram showing the definitions of the straight-ahead arrow and the right/left-turn arrow.
[FIG. 8] FIG. 8 is a diagram showing the relationship between the distance D and the angle Θ.
[FIG. 9] FIG. 9 is a cross-sectional view showing the angle Θ between the right/left-turn arrow and the ground plane.
[FIG. 10] FIG. 10 is a diagram showing the guide arrow at a point far from the guidance target intersection.
[FIG. 11] FIG. 11 is a diagram showing the guide arrow at a point approaching the guidance target intersection.
[FIG. 12] FIG. 12 is a diagram showing the relationship between distance and height.
[FIG. 13] FIG. 13 is a diagram showing a prior-art guide arrow at a point far from the guidance target intersection.
[FIG. 14] FIG. 14 is a diagram showing the relationship between distance and transparency.
[FIG. 15] FIG. 15 is a diagram showing a prior-art guide arrow at a point far from the guidance target intersection.
[FIG. 16] FIG. 16 is a diagram showing a prior-art guide arrow at a point near the guidance target intersection.
Explanation of Symbols
100 Navigation device
101 Video acquisition unit
102 Positioning unit
103 Map database
104 Input unit
105 Control unit
106 Route search unit
107 Arrow generation unit
108 Arrow control unit
109 Display control unit
110 Display unit
Best Mode for Carrying Out the Invention
[0024] Hereinafter, the navigation device 100 of the present invention will be described with reference to the drawings. In each drawing, components not related to the present invention are omitted.
[0025] FIG. 1 is a block diagram showing the overall configuration of the navigation device 100 according to an embodiment of the present invention. In FIG. 1, the navigation device 100 includes a video acquisition unit 101, a positioning unit 102, a map database 103, an input unit 104, a control unit 105, a route search unit 106, an arrow generation unit 107, an arrow control unit 108, a display control unit 109, and a display unit 110.
[0026] The video acquisition unit 101 is, for example, a camera that captures live-action video of the scene ahead of the vehicle. The positioning unit 102 is a positioning sensor that acquires information on the vehicle position, typified by a GPS (Global Positioning System) receiver or a gyro mounted on the vehicle.
[0027] The map database 103 stores map information such as data on roads and intersections, and is, for example, an HDD or a DVD. The database is not limited to these; the map information may instead be downloaded to the map database 103 from center facilities as needed via communication means not shown (for example, a mobile phone).
[0028] FIG. 2 shows an excerpt of the map information stored in the map database 103 that is relevant to the present embodiment. The map database 103 contains (A) node data, (B) interpolation node data, and (C) link data.
[0029] Node data represent points where roads branch in several directions, such as intersections and merge points. Each node consists of position information such as latitude and longitude, the number of links (described later) connected to the node, and the IDs of those links.
[0030] Interpolation node data represent bending points that lie on a link (described later) and express its shape when, for example, the link is not straight. Each interpolation node consists of position information such as latitude and longitude and the ID of the link on which it lies.
[0031] Link data represent a road connecting two nodes. Each link consists of a start node and an end node, which are the end points of the link, the link length (in units such as meters or kilometers), the number of interpolation nodes described above, the road width, and the link ID.
[0032] FIG. 3 shows an example of a network of roads and intersections formed by such map data. As shown in FIG. 3, three or more links connect to each node, joining it to other nodes, and interpolation nodes governing the link shape lie on those links. When a link is a straight road, interpolation nodes do not necessarily exist.
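As a minimal sketch, the three record types of (A)–(C) above can be modeled as follows. The field names (`link_ids`, `length_m`, and so on) are illustrative choices, not identifiers from the patent:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class InterpolationNode:
    """Bending point that shapes a non-straight link (FIG. 2(B))."""
    lat: float
    lon: float
    link_id: int  # ID of the link this point lies on

@dataclass
class Node:
    """Branch point such as an intersection or merge point (FIG. 2(A))."""
    node_id: int
    lat: float
    lon: float
    link_ids: List[int] = field(default_factory=list)  # connected links

@dataclass
class Link:
    """Road connecting two nodes (FIG. 2(C))."""
    link_id: int
    start_node: int
    end_node: int
    length_m: float
    road_width_m: float
    interpolation_nodes: List[InterpolationNode] = field(default_factory=list)
```

A node holding three or more link IDs corresponds to the branch points of FIG. 3, and a `Link` with an empty `interpolation_nodes` list corresponds to a straight road.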
[0033] The input unit 104 is used to input information on the destination to the navigation device, and is, for example, a remote controller, a touch panel, or a microphone for voice input. The control unit 105 controls the navigation device 100 as a whole.
[0034] The route search unit 106 searches for an optimum route to the destination by referring to the map database 103, based on the destination information input from the input unit 104 and the vehicle position information acquired from the positioning unit 102.
[0035] The arrow generation unit 107 generates, in the three-dimensional space image, a guide arrow shaped along the road corresponding to the guide route searched by the route search unit 106, among the roads detected by the road detection process.
[0036] The arrow control unit 108 controls the angle that the portion of the guide arrow indicating the right/left-turn direction forms with the ground plane of the three-dimensional map. The display control unit 109 superimposes the guide arrow obtained under the control of the arrow control unit 108 on the live-action video and displays it on the display unit 110.
[0037] Next, the operation of the navigation device 100 according to the embodiment of the present invention will be described with reference to FIG. 4. FIG. 4 is a flowchart showing the operation of the navigation device 100 according to the embodiment of the present invention.
[0038] First, the control unit 105 determines whether a destination has been set via the input unit 104 (step S401). If no destination has been set in step S401, the control unit 105 returns the process to S401. If, on the other hand, a destination has been set in step S401, the route search unit 106 searches for a route to the destination by referring to the map database 103 based on the vehicle position information acquired by the positioning unit 102 (step S402). When a route to the destination has been found, the control unit 105 starts guidance (step S403).
[0039] Furthermore, the control unit 105 calculates the visual field space of the camera in the three-dimensional map space, based on the three-dimensional map stored in the map database 103 and on the camera position, camera angles (horizontal angle and elevation angle), focal length, and image size, which are parameters that determine the imaging direction and imaging range of the video acquisition unit 101 (step S404). In the three-dimensional map, position information is expressed in terms of latitude, longitude, and altitude.
[0040] The visual field space of the camera is obtained, for example, by the method shown in FIG. 5. In the three-dimensional space, a point (point F) is found by advancing from the camera position (viewpoint E) in the camera-angle direction by the focal length f, and a plane of width x and height y corresponding to the image size (the camera screen) is set there so as to be perpendicular to the vector connecting viewpoint E and point F. Next, the control unit 105 obtains the three-dimensional space formed by the half-lines from viewpoint E through the four corner points of the camera screen. This three-dimensional space theoretically extends to infinity, but it is truncated at an appropriate distance from viewpoint E and taken as the visual field space.
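The construction above can be sketched as follows. The world-up vector and the order in which the corners are returned are assumptions for illustration; the patent only specifies that the screen is perpendicular to the E–F vector at distance f:

```python
import math

def _add(a, b): return (a[0] + b[0], a[1] + b[1], a[2] + b[2])
def _scale(v, s): return (v[0] * s, v[1] * s, v[2] * s)
def _cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
def _norm(v):
    n = math.sqrt(v[0]**2 + v[1]**2 + v[2]**2)
    return (v[0]/n, v[1]/n, v[2]/n)

def camera_screen_corners(E, view_dir, f, width, height):
    """Corners of the camera screen: a width-by-height rectangle centred on
    point F (viewpoint E advanced by focal length f along the viewing
    direction) and perpendicular to the viewing vector.  The visual field
    space is bounded by the half-lines from E through these corners,
    truncated at a suitable distance."""
    d = _norm(view_dir)
    F = _add(E, _scale(d, f))            # point F on the screen centre
    up = (0.0, 0.0, 1.0)                 # assumed world-up (altitude axis)
    right = _norm(_cross(d, up))         # horizontal screen axis
    down = _cross(d, right)              # vertical screen axis
    corners = []
    for sx in (-1.0, 1.0):
        for sy in (-1.0, 1.0):
            c = _add(F, _add(_scale(right, sx * width / 2.0),
                             _scale(down, sy * height / 2.0)))
            corners.append(c)
    return corners
```

By construction every corner lies on the plane through F perpendicular to the viewing vector, which is the property step S404 relies on.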
[0041] Note that, instead of the three-dimensional map, the control unit 105 may use a two-dimensional map obtained by removing the altitude information from the three-dimensional map and calculate the visual field space of the camera in the two-dimensional map space. Moreover, the parameters that determine the imaging direction and imaging range are not limited to those above; other parameters, such as the angle of view, may be used in the calculation as long as the imaging direction and imaging range can be determined.
[0042] Next, the control unit 105 determines whether the vehicle has reached the destination (step S405). If the destination has been reached in step S405, the control unit 105 ends the process. If the destination has not been reached in step S405, the control unit 105 performs a road detection process that detects, within the three-dimensional map space, the roads existing in the visual field space of the camera and their positions (step S406).
[0043] In the road detection process, the control unit 105 obtains the overlap between the visual field space of the camera and the road areas in the three-dimensional map space. FIG. 6 shows the roads detected by the road detection process; it views the three-dimensional map space and the camera visual field space from above. In the road detection process, the shapes and widths of the roads near the vehicle are extracted based on the node data, interpolation node data, and link data described above. Then, as shown in FIG. 6, the roads enclosed by the visual field space (the hatched portions) are detected as the roads existing in the visual field space of the camera.

[0044] Next, the arrow generation unit 107 generates, in the three-dimensional map space, a guide arrow shaped along the road corresponding to the guide route searched by the route search unit 106, among the roads detected by the road detection process (step S407).
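A much-simplified, top-view version of the overlap test of step S406 can be sketched as follows, treating the truncated visual field as a 2D triangle and sampling each link at its node points. The real process intersects road polygons (with width) against the frustum, so this is only an illustrative approximation under those assumptions:

```python
def _cross2d(o, a, b):
    # z-component of (a - o) x (b - o); its sign tells which side of line o-a the point b is on
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def point_in_view_triangle(p, apex, left, right):
    """True if 2D point p lies inside (or on) the top-view triangle formed by
    the viewpoint (apex) and the two far corners of the truncated view space."""
    d1 = _cross2d(apex, left, p)
    d2 = _cross2d(left, right, p)
    d3 = _cross2d(right, apex, p)
    has_neg = (d1 < 0) or (d2 < 0) or (d3 < 0)
    has_pos = (d1 > 0) or (d2 > 0) or (d3 > 0)
    return not (has_neg and has_pos)

def detect_links(links, apex, left, right):
    """Return the IDs of links any of whose sampled points (start node, end
    node, interpolation nodes) fall inside the view triangle."""
    return [lid for lid, pts in links.items()
            if any(point_in_view_triangle(p, apex, left, right) for p in pts)]
```

A link sampled only at sparse points can be missed when it merely clips a triangle edge; a full implementation would clip each road polygon against the frustum instead.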
[0045] The arrow control unit 108 separates the guide arrow into a straight-ahead arrow and a right/left-turn arrow, and determines the arrangement of each on the road corresponding to the guide route in the three-dimensional map space. This separation is performed by determining, from the information of the nodes that make up the guide arrow, which node serves as the separation point between the straight-ahead arrow and the right/left-turn arrow.
[0046] FIG. 7 explains the definitions of the straight-ahead arrow and the right/left-turn arrow. As shown in FIG. 7, the arrow placed on the road being traveled (road R1) is the straight-ahead arrow, and the arrow placed on the road beyond the right/left turn at the guidance target intersection (road R2) is the right/left-turn arrow. Note that FIG. 7 merely explains these definitions and does not show the arrangement of guide arrows actually determined by the arrangement process. Furthermore, the shape of the guide arrow is not limited to the arrow figure shown in FIG. 7; for example, a polyline figure obtained by removing the triangle at the tip of the arrow figure may be used.
[0047] In the guide arrow arrangement process, the arrow control unit 108 arranges the straight-ahead arrow so that it lies flat on the ground plane of the corresponding road in the three-dimensional map space (step S408). Meanwhile, the arrow control unit 108 changes the arrangement of the right/left-turn arrow according to the distance from the vehicle position to the guidance target intersection. Specifically, the arrow control unit 108 first calculates the distance from the vehicle position to the next guidance target intersection the vehicle will pass (step S409). In the following, it is assumed that the distance from the vehicle position to the guidance target intersection is calculated periodically.
[0048] Next, the arrow control unit 108 determines whether the distance D from the vehicle position to the guidance target intersection is equal to or less than a predetermined value (distance D1) (step S410). If the arrow control unit 108 determines in step S410 that the distance D is greater than the distance D1, it arranges only the straight-ahead arrow, without arranging the right/left-turn arrow.
[0049] Having arranged the straight-ahead arrow, the arrow control unit 108 then proceeds to the projection process for a guide arrow consisting only of the straight-ahead arrow (steps S410 → S412). In the following description, the distance D1 is assumed to be 200 m. The right/left-turn arrow is likewise not displayed when no guidance target intersection exists on the route from the vehicle position to the destination.
[0050] If, on the other hand, it is determined in step S410 that the distance D from the vehicle position to the guidance target intersection is equal to or less than the distance D1 (200 m), the arrow control unit 108 arranges the right/left-turn arrow (steps S410 → S411). At this time, the right/left-turn arrow is placed on the corresponding road in the three-dimensional map space at an angle to the ground plane that depends on the distance from the vehicle position to the guidance target intersection. Specifically, the right/left-turn arrow is arranged so that the angle Θ of the guide arrow with respect to the ground plane becomes smaller as the distance from the vehicle position to the guidance target intersection becomes shorter. For example, it is arranged so that the angle Θ satisfies (Equation 1).
(Equation 1)
Θ = 90 (D > 200)
Θ = D/2 − 10 (20 ≤ D ≤ 200)
Θ = 0 (0 ≤ D < 20)
At this time, the angle Θ changes according to the distance D from the vehicle position to the guidance target intersection, as shown in FIG. 8. FIG. 9 is a view from a cross-section that passes through road R1 in the three-dimensional map space and is perpendicular to the ground plane. As shown in FIG. 9, when the distance from the vehicle position to the guidance target intersection is 200 m, the angle Θ is 90 degrees and the right/left-turn arrow is placed standing upright (see FIG. 9(a)).
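A direct transcription of (Equation 1), assuming distances in meters and angles in degrees:

```python
def turn_arrow_angle(distance_m: float) -> float:
    """Angle Θ (degrees) between the right/left-turn arrow and the ground
    plane, per (Equation 1): upright beyond 200 m, flat within 20 m, and
    varying linearly in between."""
    if distance_m > 200:
        return 90.0
    if distance_m >= 20:
        return distance_m / 2.0 - 10.0
    return 0.0
```

The two boundary cases agree (200/2 − 10 = 90 and 20/2 − 10 = 0), so the arrow's tilt changes continuously as the vehicle approaches.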
[0051] As the distance from the vehicle position to the guidance target intersection decreases, the guide arrow gradually changes from an upright state to a reclined state (when the distance from the vehicle position to the guidance target intersection is 100 m, the angle Θ is 40 degrees; see FIG. 9(b)), and when the distance from the vehicle position to the guidance target intersection reaches 20 m, the right/left-turn arrow is placed completely flat, lying parallel to the plane of the road (see FIG. 9(c)).
[0052] Next, having arranged the right/left-turn arrow, the arrow control unit 108 performs a projection process on the guide arrow consisting of the straight-ahead arrow and the right/left-turn arrow, using the camera screen shown in FIG. 5 as the projection plane (step S412).
[0053] Since the projection plane onto which the guide arrow is projected in the projection process coincides with the camera screen of the video acquisition unit 101, the display control unit 109 superimposes the guide arrow on the road shown in the live-action image (the road corresponding to the guide route) and displays it on the display unit 110, as shown in FIGS. 10 and 11.
[0054] As shown in FIG. 10, at a point far from the intersection (for example, 200 m before the intersection), the right/left-turn arrow is displayed with height. At this time, the right/left-turn arrow is displayed large enough to be easily recognized even from a distance.
[0055] Also, as shown in FIG. 11, at a point near the intersection (for example, 20 m before the intersection), the right/left-turn arrow is placed on the ground plane without height, so the guide arrow does not conceal the background image.
[0056] In the above embodiment, an example was described in which the angle Θ of the guide arrow with respect to the ground plane is changed according to (Equation 1); however, the invention is not limited to this, and any formula may be used as long as the angle Θ decreases as the distance D from the vehicle position to the guidance target intersection decreases. The angle Θ may also be controlled with reference not only to the distance to the guide point but also to the vehicle speed and other factors.
[0057] FIG. 12 is a cross-sectional view similar to FIG. 9; FIG. 12(a) shows the case where the vehicle position is farthest from the guidance target intersection, and FIG. 12(c) shows the case where the vehicle position is closest to it. As shown in FIG. 12, the height of the three-dimensionally displayed right/left-turn arrow gradually decreases as the distance from the vehicle position to the guidance target intersection decreases.
[0058] In this case as well, as when controlling the angle Θ of the right/left-turn arrow with respect to the ground plane, the visibility of the right/left-turn arrow at points far from the intersection is ensured, while the right/left-turn arrow does not conceal the background video at points near the intersection. In other words, a guidance display is achieved that balances the visibility of the right/left-turn arrow at positions far from the intersection with the visibility of the background image immediately before the intersection.
[0059] Note that, as described in the third embodiment of Japanese Patent Laid-Open No. 2003-337033, the guide object may be composed of two separate guide objects: a guide object displayed so as to lie on the ground plane, and a separate guide object displayed three-dimensionally above it. The guide object displayed three-dimensionally above is, for example, the arrowhead-shaped object in FIG. 13. Here, three-dimensional display means displaying an object that would, in its default form, be displayed within the ground plane in the height direction instead, away from the ground plane. In this case, if the object to be controlled by the object control unit 108 is the arrowhead-shaped object, the same effects as described above can be obtained.
[0060] When the guide object is composed of two separate guide objects, the object control unit 108 may control the transparency of the separate guide object displayed three-dimensionally above, rather than controlling its size.
[0061] Specifically, the separate guide object displayed three-dimensionally above is arranged so that its transparency α increases as the distance from the vehicle position to the guide point decreases. For example, it is arranged so that the transparency α satisfies (Equation 2).
(Equation 2)
α = 0 (D > 200)
α = −5D/9 + 1000/9 (20 ≤ D ≤ 200)
α = 100 (0 ≤ D < 20)
At this time, the transparency α changes according to the distance D from the vehicle position to the guide point, as shown in FIG. 14. This ensures the visibility of the separately displayed three-dimensional guide object above at points far from the intersection, while at points near the intersection that upper guide object becomes transparent and does not conceal the background video. In other words, a guidance display is achieved that balances the visibility of the guide object displayed three-dimensionally above at positions far from the intersection with the visibility of the background image immediately before the intersection.
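A direct transcription of (Equation 2). The interpretation of α as a 0–100 scale (0 = opaque, 100 = fully transparent) follows from the formula's range:

```python
def upper_object_alpha(distance_m: float) -> float:
    """Transparency α of the guide object displayed above the ground plane,
    per (Equation 2): opaque beyond 200 m, fully transparent within 20 m,
    and varying linearly in between."""
    if distance_m > 200:
        return 0.0
    if distance_m >= 20:
        return -5.0 * distance_m / 9.0 + 1000.0 / 9.0
    return 100.0
```

As with the angle control, the linear segment meets both boundary values (α = 0 at D = 200 and α = 100 at D = 20), so the fade-out is continuous.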
[0062] In the above embodiment, an example was described in which a live-action image captured by a camera facing the front of the vehicle is used; however, the invention may also be applied when the guide object is superimposed on a live-action image stored in advance on a storage medium or on a live-action image acquired via communication.
[0063] Further, the above embodiment described application to a driver's-view display based on live-action images; however, the same effects can be obtained when the present invention is applied to a driver's-view display based on a three-dimensional map rendered with the viewpoint altitude lowered to about the height of the vehicle, matching the driver's line of sight.

[0064] Thus, according to the present invention, it is possible to provide a navigation device that, by controlling the displayed size or the transparency of the portion of the guide arrow indicating the right/left-turn direction with respect to the ground plane of a three-dimensional space image based on the distance from the vehicle position to the guide point, reduces the concealment of the background image by the guide arrow immediately before the guide point. In particular, in a display method that superimposes a guide object on a live-action image, this avoids the problem that comparison between the actual scenery and the guidance screen becomes difficult, making intuitive route guidance for the user hard to achieve.
[0065] In the embodiment described above, an example was shown in which the angle or the height of the portion indicating the turning direction is controlled. However, when the road is curved, for example, the configuration may instead control the angle or the height of the portion of the curved section that is shaped to indicate the turning direction.
[0066] The configurations described in the above embodiments are merely specific examples and do not limit the technical scope of the present invention. Any configuration may be adopted within the scope in which the effects of the present application are achieved.
Industrial Applicability
[0067] The navigation device of the present invention is useful as a car navigation device installed in a vehicle, and is also useful as a navigation device in a mobile phone or the like.

Claims

[1] A navigation device comprising:
an object generation unit that generates a guidance object;
an object control unit that controls, based on the distance from the vehicle position to a guide point, the displayed size or the transparency of a predetermined portion of the guidance object; and
a display control unit that performs navigation display in a driver's view based on the guidance object obtained under the control of the object control unit.
[2] The navigation device according to claim 1, wherein the object control unit controls the displayed size of the predetermined portion of the guidance object by controlling its inclination, based on the distance from the vehicle position to the guide point.
[3] The navigation device according to claim 2, wherein the object control unit controls the inclination by controlling the angle of the portion of the guidance object that indicates a right or left turn, based on the distance from the vehicle position to the guide point.
[4] The navigation device according to claim 3, wherein the object control unit decreases the angle as the distance from the vehicle position to the guide point decreases.
[5] The navigation device according to claim 4, wherein the object control unit decreases the angle within a range from 90 degrees to 0 degrees.
[6] The navigation device according to claim 1, wherein the object control unit controls the displayed size of the predetermined portion of the guidance object by controlling its height, based on the distance from the vehicle position to the guide point.
[7] The navigation device according to claim 6, wherein the object control unit controls the height of the portion of the guidance object that indicates a right or left turn, based on the distance from the vehicle position to the guide point.
[8] The navigation device according to claim 7, wherein the object control unit lowers the height as the distance from the vehicle position to the guide point decreases.
[9] The navigation device according to claim 1, wherein the object control unit increases the transparency of the portion of the guidance object that indicates a right or left turn as the distance from the vehicle position to the guide point decreases.
[10] The navigation device according to any one of claims 1 to 9, wherein the display control unit superimposes the guidance object on a three-dimensional map image.
[11] The navigation device according to any one of claims 1 to 9, wherein the display control unit superimposes the guidance object on a live-action image.
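The height control of claims 6 to 8 can be sketched in the same spirit: scale the vertical coordinate of the turn-indicating part of the arrow geometry toward the ground plane as the guide point approaches. The vertex layout, full height, and distance thresholds below are hypothetical values chosen only to make the idea concrete.

```python
from dataclasses import dataclass

@dataclass
class Vertex:
    x: float  # lateral position (m)
    y: float  # height above the road plane (m)
    z: float  # distance ahead along the route (m)

def scale_turn_indicator(vertices, distance_m, start=100.0, end=10.0):
    """Lower the turn-indicating part of the guide arrow (claims 6-8).

    The y (height) coordinate of each vertex is scaled toward 0 as the
    vehicle's distance to the guide point falls from `start` to `end`
    (illustrative values), so the arrow flattens onto the road surface
    and hides less of the background image near the intersection.
    """
    t = max(0.0, min(1.0, (distance_m - end) / (start - end)))
    return [Vertex(v.x, v.y * t, v.z) for v in vertices]
```

Scaling only the y coordinate leaves the arrow's footprint on the road unchanged, which matches the claims' intent of shrinking the displayed size of just the turn-indicating portion.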
PCT/JP2007/060922 2006-06-05 2007-05-29 Navigation device WO2007142084A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007548632A JPWO2007142084A1 (en) 2006-06-05 2007-05-29 Navigation device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006156154 2006-06-05
JP2006-156154 2006-06-05

Publications (1)

Publication Number Publication Date
WO2007142084A1 true WO2007142084A1 (en) 2007-12-13

Family

ID=38801341

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/060922 WO2007142084A1 (en) 2006-06-05 2007-05-29 Navigation device

Country Status (2)

Country Link
JP (1) JPWO2007142084A1 (en)
WO (1) WO2007142084A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07262492A (en) * 1994-03-25 1995-10-13 Alpine Electron Inc On-vehicle navigator
JPH09325042A (en) * 1996-06-05 1997-12-16 Matsushita Electric Ind Co Ltd Apparatus for displaying running position
JPH1089990A (en) * 1996-09-13 1998-04-10 Alpine Electron Inc Navigation apparatus
JPH10103997A (en) * 1996-09-27 1998-04-24 Toyota Motor Corp Course guide system for vehicle
JP2000155895A (en) * 1998-11-24 2000-06-06 Sony Corp Navigation device
JP2003083761A (en) * 2001-09-13 2003-03-19 Alpine Electronics Inc Navigation system
JP2004233153A (en) * 2003-01-29 2004-08-19 Xanavi Informatics Corp On-vehicle navigation device and map image displaying method

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011121401A (en) * 2009-12-08 2011-06-23 Toshiba Corp Display device, display method, and movement body
WO2014050172A1 (en) * 2012-09-28 2014-04-03 アイシン・エィ・ダブリュ株式会社 Intersection navigation system, method, and program
CN104603578A (en) * 2012-09-28 2015-05-06 爱信艾达株式会社 Intersection navigation system, method, and program
US9508258B2 (en) 2012-09-28 2016-11-29 Aisin Aw Co., Ltd. Intersection guide system, method, and program
JP2017021019A (en) * 2015-07-08 2017-01-26 日産自動車株式会社 Vehicular display apparatus and vehicular display method
JP7048202B2 (en) 2015-07-08 2022-04-05 日産自動車株式会社 Display device for vehicles and display method for vehicles
US11209285B2 (en) 2015-09-30 2021-12-28 Nissan Motor Co., Ltd. Vehicular display device
WO2017056211A1 (en) * 2015-09-30 2017-04-06 日産自動車株式会社 Vehicular display device
JPWO2017056211A1 (en) * 2015-09-30 2018-09-13 日産自動車株式会社 Vehicle display device
JP2017091504A (en) * 2016-08-26 2017-05-25 株式会社オプティム Remote instruction method and remote terminal program
WO2019097762A1 (en) * 2017-11-17 2019-05-23 アイシン・エィ・ダブリュ株式会社 Superimposed-image display device and computer program
JP2019095215A (en) * 2017-11-17 2019-06-20 アイシン・エィ・ダブリュ株式会社 Superimposed image display device and computer program
US11525694B2 (en) 2017-11-17 2022-12-13 Aisin Corporation Superimposed-image display device and computer program
US11193785B2 (en) 2017-12-28 2021-12-07 Alpine Electronics, Inc. In-vehicle system
EP3505382A1 (en) * 2017-12-28 2019-07-03 Alpine Electronics, Inc. In-vehicle system
JP2021099343A (en) * 2019-06-21 2021-07-01 パイオニア株式会社 Display device, program, and storage medium

Also Published As

Publication number Publication date
JPWO2007142084A1 (en) 2009-10-22

Similar Documents

Publication Publication Date Title
JP4560090B2 (en) Navigation device and navigation method
JP4550927B2 (en) Navigation device
JP2009020089A (en) System, method, and program for navigation
WO2007142084A1 (en) Navigation device
WO2007129382A1 (en) Navigation device and method
JP2007121001A (en) Navigation device
US20110288763A1 (en) Method and apparatus for displaying three-dimensional route guidance
JP4899746B2 (en) Route guidance display device
JP2008128827A (en) Navigation device, navigation method, and program thereof
JP2007198962A (en) Guidance display device for vehicle
JP5218607B2 (en) Navigation device
KR102222102B1 (en) An augment reality navigation system and method of route guidance of an augment reality navigation system
JP4105609B2 (en) 3D display method for navigation and navigation apparatus
JP2007206014A (en) Navigation device
JP2013225275A (en) Three-dimensional image display system
JPWO2006109527A1 (en) Navigation device and navigation method
JP2007256048A (en) Navigation system
JP6968069B2 (en) Display control device and display control method
JP2008002965A (en) Navigation device and method therefor
KR100886330B1 System and method for user's view
JPWO2011121788A1 (en) Navigation device, information display device, navigation method, navigation program, and recording medium
JP2007263849A (en) Navigation device
JP2009264835A (en) Navigation device, navigation method and navigation program
JP2007322371A (en) Navigation apparatus
KR20080019690A (en) Navigation device with camera-info

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2007548632

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07744341

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 07744341

Country of ref document: EP

Kind code of ref document: A1