WO2009118911A1 - Map display, navigation device, generation method, map image generation program, and computer-readable recording medium - Google Patents

Map display, navigation device, generation method, map image generation program, and computer-readable recording medium

Info

Publication number
WO2009118911A1
WO2009118911A1 (PCT/JP2008/056231, JP2008056231W)
Authority
WO
WIPO (PCT)
Prior art keywords
map
viewpoint
dimensional object
angle
dimensional
Prior art date
Application number
PCT/JP2008/056231
Other languages
English (en)
Japanese (ja)
Inventor
昌義 鈴木
Original Assignee
パイオニア株式会社
パイオニアシステムテクノロジー株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パイオニア株式会社, パイオニアシステムテクノロジー株式会社
Priority to PCT/JP2008/056231
Publication of WO2009118911A1

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3635 Guidance using 3D or perspective road maps
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G06T15/20 Perspective computation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/003 Maps
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/10 Map spot or coordinate position indicators; Map reading aids

Definitions

  • the present invention relates to a map display device, a navigation device, a generation method, a map image generation program, and a computer-readable recording medium that display a map image with a bird's eye view of a three-dimensional map.
  • Conventionally, there has been a navigation device that displays a map image on a display and guides the user to a desired destination using the map image.
  • Such a device displays on the display a three-dimensional map image on which three-dimensional objects are arranged, and guides the user using a realistic, landscape-like three-dimensional map image (for example, see Patent Document 1 below).
  • In some cases, all of the information of the three-dimensional map (hereinafter referred to as "3D map information") is expressed by three-dimensional objects.
  • However, the capacity of the recording device provided in the navigation device is limited, and expressing every feature as a three-dimensional object enlarges the capacity occupied by the 3D map information. Therefore, when expressing a predetermined facility such as a convenience store or a gas station, a POI (Point of Interest) mark, which is a two-dimensional object, has been used instead of a three-dimensional object so as to keep the occupied capacity as small as possible.
  • However, when a two-dimensional object such as a POI mark is arranged along a virtual reference plane corresponding to the ground and the map is viewed obliquely from a bird's-eye viewpoint, the display content of the two-dimensional object becomes difficult to see. This inferior visibility is one example of the problems with the conventional technique.
  • In order to solve the above problem, the map display device according to the present invention includes: storage means for storing information on a three-dimensional map in which two-dimensional objects and three-dimensional objects are arranged; viewpoint setting means for setting a viewpoint and a gazing point; adjustment means for adjusting an arrangement angle of the two-dimensional object with respect to the reference plane of the three-dimensional map according to the angle, with respect to that reference plane, of the straight line connecting the viewpoint and the gazing point; generating means for generating a map image in which the three-dimensional map after the adjustment of the arrangement angle is viewed as a bird's-eye view from the viewpoint in the direction of the gazing point; and display means for displaying the map image.
  • The navigation device according to the present invention comprises the map display device according to any one of the above aspects, detection means for detecting a current position, route setting means for setting a route, and guidance means for guiding the route. The viewpoint setting means sets the viewpoint and the gazing point according to the current position and a guidance location on the route, the adjustment means adjusts the arrangement angle of the two-dimensional object located in the vicinity of the guidance location, the generation means generates a map image in which the three-dimensional map near the guidance location after the adjustment of the arrangement angle is viewed as a bird's-eye view from the viewpoint in the direction of the gazing point, and the display means displays the map image as a guidance image of the guidance location.
  • The generation method according to claim 9 is a method for generating a map image using a three-dimensional map in which two-dimensional objects and three-dimensional objects are arranged, and includes: a viewpoint setting step of setting a viewpoint and a gazing point; an adjustment step of adjusting an arrangement angle of the two-dimensional object with respect to the reference plane of the three-dimensional map according to the angle, with respect to that reference plane, of the straight line connecting the viewpoint and the gazing point; and a generation step of generating a map image in which the three-dimensional map after the adjustment of the arrangement angle is viewed as a bird's-eye view from the viewpoint in the direction of the gazing point.
  • the map image generation program according to claim 10 causes the computer to execute the generation method according to claim 9.
  • the computer-readable recording medium according to claim 11 records the map image generation program according to claim 10.
  • FIG. 1 is a block diagram showing a functional configuration of a map display device according to an embodiment of the present invention.
  • FIG. 2 is a flowchart showing a processing procedure of the map display device according to the embodiment of the present invention.
  • FIG. 3 is a block diagram illustrating a hardware configuration of the navigation apparatus according to the present embodiment.
  • FIG. 4 is an explanatory diagram showing an overview of viewpoints and gaze points in the navigation device of the present embodiment.
  • FIG. 5 is a flowchart showing the processing procedure of the navigation device of this embodiment.
  • FIG. 6 is an explanatory diagram (part 1) illustrating a specific example of adjusting the arrangement angle of the two-dimensional object by the navigation device of the present embodiment.
  • FIG. 7 is an explanatory diagram (part 2) illustrating a specific example of adjusting the arrangement angle of the two-dimensional object by the navigation device of the present embodiment.
  • FIG. 8 is an explanatory diagram (part 1) illustrating the rotation axis when adjusting the arrangement angle of the two-dimensional object by the navigation device of the present embodiment.
  • FIG. 9 is an explanatory diagram (part 2) illustrating the rotation axis when adjusting the arrangement angle of the two-dimensional object by the navigation device of the present embodiment.
  • FIG. 10 is an explanatory diagram (part 3) illustrating the rotation axis when adjusting the arrangement angle of the two-dimensional object by the navigation device of the present embodiment.
  • FIG. 11 is an explanatory diagram (part 4) illustrating the rotation axis when adjusting the arrangement angle of the two-dimensional object by the navigation device of the present embodiment.
  • FIG. 12 is an explanatory diagram illustrating a specific display example of the navigation device of the present embodiment.
  • FIG. 1 is a block diagram showing a functional configuration of a map display device according to an embodiment of the present invention.
  • the map display device 100 includes a storage unit 101, a viewpoint setting unit 102, an adjustment unit 103, a generation unit 104, and a display unit 105.
  • the storage unit 101 has a function of storing 3D map information in which 2D objects and 3D objects are arranged.
  • the two-dimensional object is expressed by two-dimensional CG (Computer Graphics) such as a POI mark representing a predetermined facility, a traffic light, and a road sign.
  • the three-dimensional object is represented by a three-dimensional CG such as a polygon (group) representing a feature (shape) such as a building.
  • The 3D map information is information representing a map configured by arranging the above-described two-dimensional objects and three-dimensional objects (for example, objects corresponding to actual buildings) at predetermined positions on a predetermined reference plane (for example, a plane corresponding to the actual ground).
  • The 3D map information includes both two-dimensional objects and three-dimensional objects because, as described above, representing every feature as a three-dimensional object enlarges the occupied capacity, and because, for a predetermined type of facility such as a convenience store, information such as the chain it belongs to is more useful to the user than the shape of the facility.
  • the storage unit 101 is realized by a magnetic disk and a magnetic disk drive that performs reading / writing on the magnetic disk.
  • the viewpoint setting unit 102 has a function of setting a viewpoint and a gazing point.
  • the viewpoint and the gazing point are predetermined points (coordinates) in the three-dimensional map information.
  • the gazing point is a point obtained by projecting the viewpoint onto the reference plane.
  • the viewpoint setting unit 102 sets the viewpoint and the gazing point using, for example, a predetermined setting value (at a predetermined position).
  • the viewpoint setting unit 102 may receive a user's operation using various input devices, and may set a viewpoint and a gazing point (such as a position thereof) based on the operation.
  • the viewpoint setting unit 102 is realized by causing a computer device to execute a program prepared in advance.
  • The adjustment unit 103 has a function of adjusting the arrangement angle of the two-dimensional object with respect to the reference plane according to the angle, with respect to the reference plane of the 3D map, of the straight line connecting the viewpoint and the gazing point.
  • The reference plane is a plane of constant height in the three-dimensional map (for example, the ground when no terrain height is set, or the plane at sea level 0).
  • the adjustment unit 103 adjusts the arrangement angle of the two-dimensional object so that the display surface of the two-dimensional object becomes an angle within a predetermined range including 90 degrees with respect to a straight line connecting the viewpoint and the gazing point.
  • the display surface is a surface on which display content is displayed (for example, a surface on which a predetermined pattern in the POI mark is displayed).
  • the adjustment unit 103 adjusts the arrangement angle by rotating the two-dimensional object by a predetermined angle along an axis corresponding to the arrangement position of the two-dimensional object.
  • the adjustment unit 103 may adjust the arrangement angle of the two-dimensional object when the angle of the straight line connecting the viewpoint and the gazing point with respect to the reference plane of the three-dimensional map is a predetermined angle or more.
  • the arrangement angle of only a specific type of two-dimensional object may be adjusted.
  • The two-dimensional object of the specific type may be a two-dimensional object of a predetermined type set in advance by the manufacturer of the map display device 100, or a two-dimensional object of an arbitrary type set by the user of the map display device 100.
  • the adjustment unit 103 is realized by causing a computer device to execute a program prepared in advance.
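  • For concreteness only (this is an editor's sketch, not the patent's implementation): the adjustment described above can be pictured as computing the elevation angle of the viewpoint-to-gazing-point line and tilting an initially upright two-dimensional marker backward by that angle, so that its display surface ends up roughly perpendicular to the line. The names, the 30-degree threshold, and the "POI" type filter below are assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def elevation_angle_deg(viewpoint: Vec3, gaze_point: Vec3) -> float:
    """Angle (degrees) between the viewpoint-to-gazing-point line and the reference plane (z = const)."""
    dx, dy, dz = gaze_point.x - viewpoint.x, gaze_point.y - viewpoint.y, gaze_point.z - viewpoint.z
    return math.degrees(math.atan2(abs(dz), math.hypot(dx, dy)))

def placement_rotation_deg(obj_type: str,
                           viewpoint: Vec3,
                           gaze_point: Vec3,
                           threshold_deg: float = 30.0,
                           adjustable_types: frozenset = frozenset({"POI"})) -> float:
    """Rotation (degrees) to apply to a 2D object that initially stands upright on the reference plane.

    Tilting the marker backward by the elevation angle of the line leaves its display
    surface roughly perpendicular to that line, i.e. facing the viewpoint.
    """
    if obj_type not in adjustable_types:
        return 0.0                      # only selected object types are adjusted
    angle = elevation_angle_deg(viewpoint, gaze_point)
    if angle < threshold_deg:
        return 0.0                      # below the threshold, keep the default placement
    return angle

# Example: a high viewpoint looking down steeply at the gazing point.
print(round(placement_rotation_deg("POI", Vec3(0, -50, 80), Vec3(0, 0, 0)), 1))  # 58.0
```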
  • The generation unit 104 has a function of generating a map image in which the three-dimensional map after the adjustment of the arrangement angle is viewed as a bird's-eye view from the viewpoint in the direction of the gazing point. That is, the generation unit 104 generates a map image in which the three-dimensional map, whose two-dimensional objects have had their arrangement angles adjusted by the adjustment unit 103, is viewed as a bird's-eye view from the viewpoint in the direction of the gazing point.
  • the generation unit 104 is realized by causing a computer device to execute a program prepared in advance.
  • the display unit 105 has a function of displaying a map image.
  • the display unit 105 displays the map image generated by the generation unit 104 on various displays.
  • the display unit 105 is realized by a liquid crystal display, a plasma display, or the like.
  • the map display device 100 may be provided as a part of the navigation device 110.
  • the navigation device 110 includes the map display device 100, a detection unit 111, a route setting unit 112, and a guide unit 113.
  • the detection unit 111 has a function of detecting the current position of the navigation device 110 (a mobile body equipped with the navigation device 110).
  • the detection unit 111 is realized by a GPS (Global Positioning System) unit or the like.
  • the route setting unit 112 has a function of setting a route. For example, the route setting unit 112 searches for a route to the destination input by the user using the Dijkstra method and sets the searched route.
  • the route setting unit 112 is realized, for example, by causing a computer to execute a program prepared in advance.
  • the guidance unit 113 has a function of guiding a route.
  • the guide unit 113 guides the route by generating various types of guide information for guiding the searched route to the user and displaying the guide information on the display unit 105.
  • the guidance information is, for example, information expressing the searched route as a line.
  • the guide unit 113 is realized by causing a computer device to execute a program prepared in advance.
  • the viewpoint setting unit 102 sets a viewpoint and a gazing point in accordance with, for example, the current position and the guidance point on the route.
  • the guidance location is, for example, a predetermined waypoint on the route (for example, a right or left turn point on the set route or an arbitrary point set as a waypoint by the user).
  • The adjustment unit 103 adjusts the arrangement angle of the two-dimensional object located near the guidance location. For example, the viewpoint setting unit 102 sets a viewpoint and a gazing point such that the angle of the straight line connecting them with respect to the reference plane of the three-dimensional map differs according to the distance between the current position and the guidance location.
  • The generation unit 104 generates a map image in which the three-dimensional map in the vicinity of the guidance location, after the adjustment of the arrangement angle, is viewed as a bird's-eye view from the viewpoint in the direction of the gazing point, and the display unit 105 displays the map image as a guidance image of the guidance location.
  • the guidance image is an image displayed on the display unit 105 to guide the guidance location.
  • FIG. 2 is a flowchart showing a processing procedure of the map display device according to the embodiment of the present invention. The flowchart shown in FIG. 2 is started, for example, when the map display device 100 is powered on.
  • the map display device 100 first sets a viewpoint and a gazing point (step S201). Specifically, the map display device 100 sets the viewpoint and the gazing point by the viewpoint setting unit 102 based on a predetermined setting value or an operation by the user.
  • After setting the viewpoint and the gazing point in step S201, the map display device 100 adjusts the arrangement angle of the two-dimensional object (step S202). Specifically, the map display device 100 uses the adjustment unit 103 to adjust the arrangement angle, with respect to the reference plane, of a two-dimensional object (for example, a POI mark representing a convenience store) according to the angle of the straight line connecting the viewpoint and the gazing point with respect to the reference plane of the three-dimensional map.
  • After adjusting the arrangement angle of the two-dimensional object in step S202, the map display device 100 generates a map image after the adjustment of the arrangement angle (step S203). Specifically, the map display device 100 uses the generation unit 104 to generate a map image of the three-dimensional map whose arrangement angles have been adjusted by the adjustment unit 103.
  • After generating the map image in step S203, the map display device 100 displays the generated map image (step S204) and ends the series of processes. Specifically, the map display device 100 causes the display unit 105 to display the map image generated by the generation unit 104.
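  • Purely as an illustration of the flow of steps S201 to S204 (not code from the patent), the toy sketch below strings the four steps together; the class name MapDisplay, the preset coordinates, and the tilt rule are assumptions.

```python
import math
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float, float]  # (x, y, z) in map coordinates

@dataclass
class TwoDObject:
    kind: str                  # e.g. "POI"
    position: Point
    tilt_deg: float = 90.0     # arrangement angle with respect to the reference plane (90 = upright)

@dataclass
class MapDisplay:
    """Toy counterpart of the storage / viewpoint setting / adjustment / generation / display units."""
    objects_2d: List[TwoDObject] = field(default_factory=list)

    def set_viewpoint(self) -> Tuple[Point, Point]:                           # step S201
        return (0.0, -50.0, 80.0), (0.0, 0.0, 0.0)                            # fixed preset for the sketch

    def adjust_angles(self, viewpoint: Point, gaze_point: Point) -> None:     # step S202
        dx, dy, dz = (g - v for g, v in zip(gaze_point, viewpoint))
        elevation = math.degrees(math.atan2(abs(dz), math.hypot(dx, dy)))
        for obj in self.objects_2d:
            if obj.kind == "POI":
                obj.tilt_deg = 90.0 - elevation   # lean the marker back toward the viewpoint

    def generate_image(self, viewpoint: Point, gaze_point: Point) -> str:     # step S203
        return f"bird's-eye image from {viewpoint} toward {gaze_point}"

    def display(self, image: str) -> None:                                    # step S204
        print(image)

display = MapDisplay(objects_2d=[TwoDObject("POI", (10.0, 5.0, 0.0))])
vp, gp = display.set_viewpoint()
display.adjust_angles(vp, gp)
display.display(display.generate_image(vp, gp))
```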
  • As described above, according to the map display device 100, the arrangement angle of the two-dimensional object can be adjusted according to the angle of the straight line connecting the viewpoint and the gazing point with respect to the reference plane. As a result, the visibility of the two-dimensional object is improved and convenience can be improved.
  • Further, the arrangement angle of the two-dimensional object can be adjusted so that the display surface of the two-dimensional object is at an angle within a predetermined range including 90 degrees with respect to the straight line connecting the viewpoint and the gazing point. As a result, the visibility of the two-dimensional object is further improved and convenience can be improved.
  • Furthermore, the arrangement angle of the two-dimensional object can be adjusted only when the angle of the straight line connecting the viewpoint and the gazing point with respect to the reference plane of the three-dimensional map is equal to or greater than a predetermined angle. Because the arrangement angle is adjusted only when necessary, the visibility of the two-dimensional object is further improved and convenience can be improved.
  • the arrangement angle can be adjusted by rotating the two-dimensional object by a predetermined angle along the axis corresponding to the arrangement position of the two-dimensional object.
  • the arrangement angle of only a predetermined two-dimensional object (for example, only a POI mark representing a predetermined facility) can be adjusted.
  • Because only the arrangement angles of the two-dimensional objects desired by the user are adjusted, convenience can be further improved.
  • the map display device 100 according to the above-described embodiment is applied to a navigation device mounted on a vehicle (including two wheels and four wheels).
  • FIG. 3 is a block diagram illustrating a hardware configuration of the navigation apparatus according to the present embodiment.
  • The navigation device 300 includes a CPU 301, a ROM 302, a RAM 303, a magnetic disk drive 304, a magnetic disk 305, an optical disk drive 306, an optical disk 307, an audio I/F (interface) 308, a microphone 309, a speaker 310, an input device 311, a video I/F 312, a display 313, a communication I/F 314, a GPS unit 315, and various sensors 316.
  • The components 301 to 316 are connected to one another by a bus 320.
  • the CPU 301 governs overall control of the navigation device 300.
  • The ROM 302 stores various programs such as a boot program, a current position detection program, a route search program, a route guidance program, a voice generation program, a viewpoint setting program, an arrangement angle adjustment program, a map image generation program, and a map image display program.
  • these various programs may be recorded in a nonvolatile memory such as a magnetic disk 305 and an optical disk 307 described later.
  • the RAM 303 is used as a work area for the CPU 301.
  • the CPU 301 controls the entire navigation device 300 by executing various programs recorded in the ROM 302 or the like while using the RAM 303 as a work area.
  • the current position detection program detects the current position of the navigation device 300 based on, for example, output information from a GPS unit 315 and various sensors 316 described later.
  • The route search program searches, using for example the three-dimensional map information recorded on the magnetic disk 305 or the optical disk 307 described later, for an optimal route from the departure point (for example, the current position of the navigation device 300) to the destination, and searches for a detour route when the vehicle deviates from the optimal route.
  • the optimum route is a route that has the lowest cost (such as required time) to the destination or a route that best meets the conditions specified by the passenger. Since the route search program is a known technique, a detailed description thereof is omitted, but an optimal route is searched using the Dijkstra method or the like.
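  • Since the document only names Dijkstra's method and treats it as a known technique, the following is a generic, minimal Dijkstra over a link-cost graph, given purely for orientation; the graph encoding and cost values are invented for the example.

```python
import heapq
from typing import Dict, List, Tuple

Graph = Dict[str, List[Tuple[str, float]]]  # node -> [(neighbour, link cost)]

def dijkstra(graph: Graph, start: str, goal: str) -> Tuple[float, List[str]]:
    """Return (total cost, node sequence) of the lowest-cost route from start to goal."""
    queue: List[Tuple[float, str, List[str]]] = [(0.0, start, [start])]
    best: Dict[str, float] = {}
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in best and best[node] <= cost:
            continue
        best[node] = cost
        for neighbour, link_cost in graph.get(node, []):
            heapq.heappush(queue, (cost + link_cost, neighbour, path + [neighbour]))
    return float("inf"), []

# Costs here stand in for required time on each road link.
road_network: Graph = {
    "A": [("B", 4.0), ("C", 2.0)],
    "B": [("D", 5.0)],
    "C": [("B", 1.0), ("D", 8.0)],
    "D": [],
}
print(dijkstra(road_network, "A", "D"))  # (8.0, ['A', 'C', 'B', 'D'])
```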
  • the route information of the route searched by executing the route search program is output to the audio I / F 308 and the video I / F 312 via the CPU 301.
  • the route guidance program includes route information of a route searched by executing the route search program, current position information of the current position of the navigation device 300 detected by executing the current position detection program, the magnetic disk 305 or the optical disk 307. Based on the three-dimensional map information read out from, real-time route guidance information is generated. The route guidance information generated by executing the route guidance program is output to the audio I / F 308 and the video I / F 312 via the CPU 301.
  • The voice generation program generates tone and voice information corresponding to guidance patterns. That is, based on the route guidance information generated by executing the route guidance program, it sets a virtual sound source corresponding to a guidance point and generates voice guidance information.
  • The voice guidance information includes, for example, guidance to turn right or left at a turn point on the route, a warning to decelerate before the turn point, information on a detour route when the turn is missed, and guidance to turn back when the turn is missed.
  • the generated voice guidance information is output to the voice I / F 308 via the CPU 301.
  • The viewpoint setting program sets the viewpoint and the gazing point using the current position information detected by executing the current position detection program, the route guidance information generated by executing the route guidance program, and the like.
  • For example, the viewpoint setting program selects, from among a plurality of preset viewpoints, a predetermined viewpoint and the gazing point corresponding to that viewpoint according to the positional relationship between the current position of the navigation device 300 and the guidance location on the route being guided.
  • the viewpoint and the gazing point set by executing the viewpoint setting program are used when a map image generation program described later is executed.
  • The arrangement angle adjustment program adjusts the arrangement angle of two-dimensional objects in the 3D map information. Specifically, it adjusts the arrangement angle of the two-dimensional object with respect to the reference plane according to the angle of the straight line connecting the viewpoint and the gazing point set by executing the viewpoint setting program. The adjustment result obtained by executing the arrangement angle adjustment program is used when the map image generation program is executed.
  • The map image generation program generates a map image in which the three-dimensional map is viewed as a bird's-eye view from the set viewpoint in the direction of the gazing point, using the 3D map information stored on the magnetic disk 305, the viewpoint and gazing point set by executing the viewpoint setting program, the adjustment result obtained by executing the arrangement angle adjustment program, and the like.
  • the map image display program displays the map image generated by executing the map image generation program on the display 313 via the video I / F 312. For example, the map image display program causes the display 313 to display a map image around the current position of the navigation device 300.
  • the map image display program may display a map image around an arbitrary position designated by the passenger on the display 313, for example.
  • The magnetic disk drive 304 controls reading and writing of data with respect to the magnetic disk 305 according to the control of the CPU 301. Data written under the control of the magnetic disk drive 304 is recorded on the magnetic disk 305.
  • As the magnetic disk 305, for example, an HD (hard disk) or an FD (flexible disk) can be used.
  • the optical disc drive 306 controls reading / writing of data with respect to the optical disc 307 according to the control of the CPU 301.
  • the optical disk 307 is a detachable recording medium from which data is read according to the control of the optical disk drive 306.
  • As the optical disc 307, for example, a CD (Compact Disc) or a DVD can be used.
  • a writable recording medium can be used as the optical disc 307.
  • the removable recording medium may be an MO (Magneto Optical Disk), a memory card, or the like.
  • The three-dimensional map information includes background data representing features such as buildings, rivers, and the ground surface, road shape data representing the shapes of roads, and the like, and is drawn in three dimensions on the display screen of the display 313.
  • the road shape data further has traffic condition data.
  • The traffic condition data includes, for example, information on the presence or absence of traffic lights and pedestrian crossings, highway entrances and junctions, the length (distance) of each link, the road width, the direction of travel, and the road type (highway, toll road, or general road).
  • Past congestion information, obtained by statistically processing past traffic information by season, day of the week, major holidays, time of day, and so on, is also recorded.
  • The navigation device 300 obtains information on currently occurring congestion from the road traffic information received by the communication I/F 314 described later, and can predict the congestion situation at a specified time by using the past congestion information.
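  • The prediction described above can be pictured as a keyed lookup that blends the congestion observed now with the statistical value recorded for the same season, day type, and time slot. The keying scheme and blending weight in this sketch are assumptions, not the method disclosed here.

```python
from typing import Dict, Tuple

# (season, day_type, hour) -> average congestion level recorded in the past (0.0 .. 1.0)
PastKey = Tuple[str, str, int]
past_congestion: Dict[PastKey, float] = {
    ("winter", "weekday", 8): 0.8,
    ("winter", "weekday", 14): 0.3,
    ("winter", "holiday", 8): 0.2,
}

def predict_congestion(current_level: float, season: str, day_type: str, hour: int,
                       weight_history: float = 0.6) -> float:
    """Blend the congestion observed now with the statistical value for the requested time slot."""
    historical = past_congestion.get((season, day_type, hour), current_level)
    return weight_history * historical + (1.0 - weight_history) * current_level

# Predict the situation for 8:00 on a winter weekday, given light traffic right now.
print(round(predict_congestion(0.2, "winter", "weekday", 8), 2))  # 0.56
```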
  • the three-dimensional map information is recorded on the magnetic disk 305 or the optical disk 307.
  • The three-dimensional map information is not limited to information recorded integrally with the hardware of the navigation device 300, and may be provided outside the navigation device 300.
  • the navigation device 300 acquires 3D map information from an external computer device connected via the communication I / F 314, for example.
  • the acquired 3D map information is recorded in the RAM 303, the magnetic disk 305, etc., and is read out as necessary.
  • the voice I / F 308 is connected to a microphone 309 for voice input and a speaker 310 for voice output.
  • the sound received by the microphone 309 is A / D converted in the sound I / F 308.
  • sound is output from the speaker 310. Note that the sound input from the microphone 309 can be recorded on the magnetic disk 305 or the optical disk 307 as sound data.
  • the input device 311 may be a remote controller, a keyboard, a mouse, a touch panel, etc. provided with a plurality of keys for inputting characters, numerical values, various instructions, and the like.
  • the input device 311 inputs data corresponding to the key selected by the passenger into the apparatus.
  • the video I / F 312 is connected to the display 313.
  • The video I/F 312 includes, for example, a graphic controller that controls the entire display 313, a buffer memory such as a VRAM (Video RAM) that temporarily stores image information that can be displayed immediately, and a control IC that controls the display 313 based on image data output from the graphic controller.
  • the display 313 displays icons, cursors, menus, windows, or various data such as characters and images.
  • As the display 313, for example, a CRT, a TFT liquid crystal display, a plasma display, an organic EL display, or the like can be used.
  • The communication I/F 314 is connected to a network wirelessly and functions as an interface between the network and the CPU 301.
  • the communication I / F 314 is further connected to a communication network such as the Internet via wireless, and also functions as an interface between the communication network and the CPU 301.
  • The communication I/F 314 may also receive television broadcasts or radio broadcasts.
  • Examples of the communication network include a LAN, a WAN, a public line network, and a mobile phone network.
  • The communication I/F 314 is constituted by, for example, an FM tuner, a VICS/beacon receiver, a wireless navigation device, or another navigation device, and acquires road traffic information such as congestion and traffic regulations. VICS is a registered trademark.
  • the GPS unit 315 receives radio waves from GPS satellites and calculates information indicating the current position of the vehicle.
  • the output information of the GPS unit 315 is used when the current position of the vehicle is specified by the CPU 301 together with output values of various sensors 316 described later.
  • the information indicating the current position is information for specifying one point on the map data such as latitude / longitude and altitude.
  • The various sensors 316, such as a vehicle speed sensor, an acceleration sensor, and an angular velocity sensor, output information from which the position and behavior of the vehicle can be determined.
  • the output values of the various sensors 316 are used by the CPU 301 for specifying the current position of the vehicle, measuring the amount of change in speed and direction, and the like.
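  • How the CPU 301 combines the GPS output with the speed and angular-velocity sensors is not spelled out here; a common textbook approach is dead reckoning between GPS fixes, sketched below purely for illustration (the update equations are the standard ones and the gain value is an assumption).

```python
import math
from dataclasses import dataclass

@dataclass
class VehicleState:
    x: float        # metres east
    y: float        # metres north
    heading: float  # radians, 0 = east

def dead_reckon(state: VehicleState, speed_mps: float, yaw_rate_rps: float, dt: float) -> VehicleState:
    """Advance the position estimate from the speed and angular-velocity sensors."""
    heading = state.heading + yaw_rate_rps * dt
    return VehicleState(
        x=state.x + speed_mps * dt * math.cos(heading),
        y=state.y + speed_mps * dt * math.sin(heading),
        heading=heading,
    )

def fuse_gps(state: VehicleState, gps_x: float, gps_y: float, gain: float = 0.2) -> VehicleState:
    """Nudge the dead-reckoned position toward a fresh GPS fix."""
    return VehicleState(
        x=state.x + gain * (gps_x - state.x),
        y=state.y + gain * (gps_y - state.y),
        heading=state.heading,
    )

state = VehicleState(0.0, 0.0, 0.0)
state = dead_reckon(state, speed_mps=10.0, yaw_rate_rps=0.05, dt=1.0)
state = fuse_gps(state, gps_x=10.5, gps_y=0.3)
print(round(state.x, 2), round(state.y, 2))  # about 10.09 0.46
```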
  • The functions of the units of the map display device 100 shown in FIG. 1 are realized as follows: the storage unit 101 by the magnetic disk drive 304 and the magnetic disk 305; the viewpoint setting unit 102, the adjustment unit 103, and the generation unit 104 by the CPU 301 and the ROM 302; and the display unit 105 by the video I/F 312 and the display 313.
  • FIG. 4 is an explanatory diagram showing an overview of viewpoints and gaze points in the navigation device of the present embodiment.
  • In FIG. 4, reference numeral 400 denotes the reference plane of the three-dimensional map represented by the 3D map information stored on the magnetic disk 305.
  • In the navigation device 300, two viewpoints with different heights above the reference plane 400, indicated by reference numerals VP11 and VP21 in FIG. 4, are set in advance.
  • the viewpoint VP11 is set at a height h1 from the reference plane 400
  • the viewpoint VP21 is set at a height h2 from the reference plane 400 (h1 < h2).
  • the viewpoint VP11 is a viewpoint in the normal mode
  • the viewpoint VP21 is a viewpoint in the short distance mode.
  • The navigation device 300 switches to the viewpoint VP21 of the short-distance mode only when the distance to a guidance location (including the destination) becomes equal to or less than a predetermined distance while traveling on the set route; otherwise, the viewpoint VP11 of the normal mode is used.
  • each of the viewpoints VP11 and VP21 corresponds to a predetermined gaze point.
  • the viewpoint VP11 corresponds to the gazing point VP12
  • the viewpoint VP21 corresponds to the gazing point VP22.
  • a straight line connecting VP11 and VP12 is L1
  • a straight line connecting VP21 and VP22 is L2.
  • The angle between L1 and the reference plane 400 is θ1,
  • and the angle between L2 and the reference plane 400 is θ2. At this time, θ1 < θ2.
  • The navigation device 300 generates a map image in which the map is viewed as a bird's-eye view from the viewpoint in the direction of the corresponding gazing point. That is, in the normal mode the navigation device 300 generates a map image in which the three-dimensional map is viewed from the viewpoint VP11 toward the gazing point VP12, and in the short-distance mode it generates a map image in which the three-dimensional map is viewed from the viewpoint VP21 toward the gazing point VP22.
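  • To relate the two preset viewpoints to the angles θ1 and θ2, the sketch below (with made-up heights, offsets, and a made-up distance threshold) selects the mode from the distance to the next guidance location and reports the elevation angle of the viewpoint-to-gazing-point line; none of the numbers come from the description.

```python
import math

# Hypothetical presets: (height above the reference plane, horizontal offset to the gazing point)
VIEWPOINTS = {
    "normal": (300.0, 600.0),          # VP11: lower height h1, shallow angle theta1
    "short_distance": (800.0, 400.0),  # VP21: greater height h2, steep angle theta2
}

def select_mode(distance_to_guidance_m: float, threshold_m: float = 300.0) -> str:
    """Short-distance mode only while the next guidance location is within the threshold."""
    return "short_distance" if distance_to_guidance_m <= threshold_m else "normal"

def line_elevation(mode: str) -> float:
    """Angle (degrees) between the viewpoint-to-gazing-point line and the reference plane."""
    height, horizontal = VIEWPOINTS[mode]
    return math.degrees(math.atan2(height, horizontal))

for d in (1200.0, 150.0):
    mode = select_mode(d)
    print(d, mode, round(line_elevation(mode), 1))
# 1200.0 normal 26.6         (theta1)
# 150.0 short_distance 63.4  (theta2)
```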
  • FIG. 5 is a flowchart showing the processing procedure of the navigation device of this embodiment. Note that the flowchart shown in FIG. 5 is started when the navigation apparatus is powered on.
  • the navigation apparatus 300 of the present embodiment first detects the current position of the navigation apparatus 300 (step S501).
  • Step S501 is performed by the CPU 301 of the navigation device 300 executing a current position detection program stored in the ROM 302 or the like.
  • the navigation device 300 periodically detects the current position at a predetermined cycle (for example, one second cycle).
  • After detecting the current position in step S501, the navigation device 300 determines whether or not a destination has been set (step S502). For example, the navigation device 300 determines whether or not a destination has been set by the user operating the input device 311.
  • If it is determined in step S502 that the destination has been set (step S502: Yes), the navigation device 300 searches for a route from the current position to the destination (step S503). For example, the CPU 301 of the navigation device 300 searches for a route to the destination by executing the route search program recorded in the ROM 302 or the like. If it is determined in step S502 that the destination has not been set (step S502: No), the navigation device 300 returns to step S501 and repeats the above processing.
  • After the route from the current position to the destination has been searched in step S503, the navigation device 300 determines whether or not the distance from the current position to the next guidance location on the route is equal to or less than a predetermined distance (step S504).
  • the navigation device 300 uses the current position information indicating the current position detected above, the route information of the searched route, and the 3D map information, from the current position to the guide point on the route. It is determined whether the distance is equal to or less than a predetermined distance.
  • the next guide point is a predetermined transit point (a destination when there is no transit point) such as a next turn point on the route.
  • If it is determined in step S504 that the distance to the next guidance location is not equal to or less than the predetermined distance (step S504: No), the navigation device 300 sets the viewpoint of the normal mode (that is, the viewpoint VP11 shown in FIG. 4) (step S505).
  • After setting the viewpoint in step S505, the navigation device 300 sets the gazing point corresponding to the set viewpoint (that is, the gazing point VP12 shown in FIG. 4) (step S506).
  • If it is determined in step S504 that the distance to the next guidance location is equal to or less than the predetermined distance (step S504: Yes), the navigation device 300 sets the viewpoint of the short-distance mode (that is, the viewpoint VP21 shown in FIG. 4) (step S507).
  • After setting the viewpoint in step S507, the navigation device 300 sets the gazing point corresponding to the set viewpoint (that is, the gazing point VP22 shown in FIG. 4) (step S508).
  • the above steps S505 to S508 are performed by the CPU 301 of the navigation device 300 executing the viewpoint setting program stored in the ROM 302 or the like.
  • Next, the navigation device 300 adjusts the arrangement angle of the two-dimensional object based on the set viewpoint and gazing point (step S509). For example, the navigation device 300 adjusts the arrangement angle of the two-dimensional object so that its display surface is at an angle within a predetermined range including 90 degrees with respect to the straight line (L1 or L2) connecting the set viewpoint and gazing point. Step S509 is performed by the CPU 301 of the navigation device 300 executing the arrangement angle adjustment program stored in the ROM 302 or the like.
  • After adjusting the arrangement angle of the two-dimensional object in step S509, the navigation device 300 generates a map image in which the three-dimensional map after the adjustment of the arrangement angle is viewed from the viewpoint in the direction of the gazing point (step S510).
  • Step S510 is performed by the CPU 301 of the navigation device 300 executing a map image generation program stored in the ROM 302 or the like.
  • After generating the map image in step S510, the navigation device 300 displays the generated map image (step S511). Step S511 is performed by the CPU 301 of the navigation device 300 executing the map image display program stored in the ROM 302 or the like.
  • After displaying the map image in step S511, the navigation device 300 guides the user to the destination (guidance location) using the map image (step S512) and ends the series of processes.
  • After passing the guidance location, the navigation device 300 returns the viewpoint to the viewpoint VP11 of the normal mode and continues guidance toward the next guidance location. When the distance to the next guidance location again becomes equal to or less than the predetermined distance, the navigation device 300 sets the viewpoint to the viewpoint VP21 of the short-distance mode again.
  • FIG. 6 is an explanatory diagram (part 1) illustrating a specific example of adjusting the arrangement angle of the two-dimensional object by the navigation device of the present embodiment.
  • reference numeral 603 in FIG. 6 indicates a two-dimensional object 603 disposed on the reference plane 400. Note that reference numeral 603a in FIG. 6 is the display surface 603a of the two-dimensional object 603. On the display surface 603a, a predetermined pattern corresponding to the facility represented by the two-dimensional object 603 is displayed.
  • In the normal mode, the navigation device 300 generates a map image in which the three-dimensional map (the reference plane 400 and the various objects arranged on it) is viewed from the viewpoint VP11 toward the gazing point VP12.
  • In the normal mode, the navigation device 300 does not adjust the arrangement angle of the two-dimensional object 603 and uses the default arrangement angle (a predetermined angle with respect to the reference plane 400, for example 45 degrees).
  • FIG. 7 is an explanatory diagram (part 2) illustrating a specific example of adjusting the arrangement angle of the two-dimensional object by the navigation device of the present embodiment.
  • In FIG. 7, the same parts as those shown in FIG. 6 are denoted by the same reference numerals.
  • In the short-distance mode, the navigation device 300 generates a map image in which the three-dimensional map (the reference plane 400 and the various objects arranged on it) is viewed from the viewpoint VP21 toward the gazing point VP22.
  • In the short-distance mode, the navigation device 300 of the present embodiment adjusts the arrangement angle of the two-dimensional object 603. That is, it adjusts the arrangement angle by rotating the two-dimensional object 603 in the direction indicated by the arrow 701 in FIG. 7. If the rotation angle at this time is θ3, then θ3 is, for example, approximately the same angle as θ2 (it does not have to be exactly the same).
  • In other words, the navigation device 300 rotates the two-dimensional object 603 by θ3 so that the angle formed by L2 and the display surface 603a falls within a predetermined range including 90 degrees. The navigation device 300 then generates a map image after this adjustment of the arrangement angle of the two-dimensional object 603.
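  • As a numerical sanity check on the relation θ3 ≈ θ2 (with assumed coordinates, not values from the description), the short computation below rotates an initially upright display surface backward by the elevation angle of L2 and confirms that its normal ends up parallel to L2, i.e. that L2 meets the display surface at roughly 90 degrees.

```python
import math

def angle_between(v, w):
    """Angle in degrees between two 3D vectors."""
    dot = sum(a * b for a, b in zip(v, w))
    norm = math.hypot(*v) * math.hypot(*w)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Assumed geometry: viewpoint VP21 above and behind the gazing point VP22 at the origin.
vp21, vp22 = (0.0, -400.0, 800.0), (0.0, 0.0, 0.0)
toward_viewpoint = tuple(a - b for a, b in zip(vp21, vp22))   # direction of L2, gazing point -> viewpoint
theta2 = math.degrees(math.atan2(vp21[2], abs(vp21[1])))      # elevation of L2 above the reference plane

# An upright marker at the gazing point faces the viewer along -y.
# Rotating it backward about the x axis by theta3 = theta2 tilts its normal to:
theta3 = math.radians(theta2)
normal = (0.0, -math.cos(theta3), math.sin(theta3))

# If the normal is parallel to L2, the line meets the display surface at 90 degrees.
print(round(theta2, 1), round(angle_between(toward_viewpoint, normal), 1))  # 63.4 0.0
```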
  • FIG. 8 is an explanatory diagram (part 1) illustrating the rotation axis when adjusting the arrangement angle of the two-dimensional object by the navigation device of the present embodiment.
  • the position of each facility is registered by points (coordinates).
  • a point 801 indicates the position of a certain police station.
  • the manufacturing side of the navigation apparatus 300 arranges the POI mark 802 that is a two-dimensional object with the point 801 as the center.
  • FIG. 9 is an explanatory diagram (part 2) illustrating the rotation axis when adjusting the arrangement angle of the two-dimensional object by the navigation device of the present embodiment.
  • As shown in FIG. 9, the navigation device 300 adjusts the arrangement angle by rotating the POI mark 802 by a predetermined angle about the axis 901, which passes through the point 801 and crosses the POI mark 802, as the rotation axis.
  • FIG. 10 is an explanatory diagram (part 3) illustrating the rotation axis when adjusting the arrangement angle of the two-dimensional object by the navigation device of the present embodiment.
  • a point 1001 indicates the position of a police station similar to FIG. 8 (that is, the point 801 and the point 1001 are points on the same coordinate).
  • In FIG. 10, the manufacturer of the navigation device 300 arranges the POI mark 1002 so that the point 1001 is located at the lower end of the POI mark 1002 in its longitudinal direction (and at the center in the direction orthogonal to the longitudinal direction).
  • That is, the point 801 and the point 1001 represent the same police station position, but how the POI mark is arranged with respect to the point (an arrangement like that of the POI mark 802 or an arrangement like that of the POI mark 1002) is set arbitrarily by the manufacturer of the navigation device 300.
  • FIG. 11 is an explanatory diagram (part 4) illustrating a rotation axis when adjusting the arrangement angle of the two-dimensional object by the navigation device of the present embodiment.
  • As shown in FIG. 11, the navigation device 300 adjusts the arrangement angle by rotating the POI mark 1002 by a predetermined angle about the axis 1101, which passes through the point 1001 and crosses the POI mark 1002, as the rotation axis.
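  • The rotation about an axis through the registered point can be written as an ordinary rotation in the vertical plane containing the mark; the sketch below rotates the corner points of a rectangular mark about a horizontal axis through its anchor point. The quad coordinates and the choice of axis direction are assumptions for the example.

```python
import math
from typing import List, Tuple

Point3 = Tuple[float, float, float]

def rotate_about_horizontal_axis(vertices: List[Point3], anchor: Point3, angle_deg: float) -> List[Point3]:
    """Rotate vertices about the horizontal axis through `anchor` that runs along x.

    The rotation happens in the y-z plane, which tilts an upright mark backward or forward.
    """
    c, s = math.cos(math.radians(angle_deg)), math.sin(math.radians(angle_deg))
    ax, ay, az = anchor
    out = []
    for x, y, z in vertices:
        dy, dz = y - ay, z - az
        out.append((x, ay + c * dy - s * dz, az + s * dy + c * dz))
    return out

# A 2 x 1 upright POI mark anchored at its lower edge centre (cf. the point 1001 / POI mark 1002 case).
anchor: Point3 = (10.0, 5.0, 0.0)
mark = [(9.0, 5.0, 0.0), (11.0, 5.0, 0.0), (11.0, 5.0, 1.0), (9.0, 5.0, 1.0)]
for v in rotate_about_horizontal_axis(mark, anchor, 60.0):
    print(tuple(round(c, 2) for c in v))
```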
  • FIG. 12 is an explanatory diagram illustrating a specific display example of the navigation device of the present embodiment.
  • The specific display example shown in FIG. 12 is a display example in the short-distance mode, in which the distance from the current position of the vehicle on which the navigation device 300 is mounted to the guidance location is equal to or less than the predetermined distance.
  • a map image 1200 around the current position of the navigation device 300 is displayed on the display 313 of the navigation device 300.
  • the display 313 displays various windows (for example, a guidance window 1210 that displays the distance to the guidance location, etc.).
  • reference numeral 1201 indicates a route 1201 set in the navigation device 300.
  • a guide place icon 1202 indicating a guide place is arranged.
  • reference numerals 1203 and 1204 are POI marks 1203 and 1204 which are two-dimensional objects.
  • the POI mark 1203 is a POI mark representing a convenience store.
  • the POI mark 1204 is a POI mark representing a gas station.
  • the navigation apparatus 300 adjusts the arrangement angle of the POI marks 1203 and 1204 when the short distance mode is set. Therefore, in the navigation device 300, even when the short distance mode is set, the display contents of the POI marks 1203 and 1204 around the current position are clearly visible, and the visibility is improved.
  • the arrangement angle of the two-dimensional object can be adjusted according to the angle of the straight line connecting the viewpoint and the gazing point with respect to the reference plane. Therefore, the visibility of the two-dimensional object is improved, and the convenience can be improved.
  • Further, according to the navigation device 300 of the present embodiment, the arrangement angle of the two-dimensional object can be adjusted so that the display surface of the two-dimensional object is at an angle within a predetermined range including 90 degrees with respect to the straight line connecting the viewpoint and the gazing point. As a result, the visibility of the two-dimensional object is further improved and convenience can be improved.
  • Furthermore, the arrangement angle of the two-dimensional object can be adjusted only when the angle of the straight line connecting the viewpoint and the gazing point with respect to the reference plane is equal to or greater than a predetermined angle. Because the arrangement angle of the two-dimensional object is adjusted only when necessary, its visibility can be further improved and convenience can be improved.
  • the arrangement angle of only a predetermined two-dimensional object (for example, only a POI mark representing a predetermined facility) can be adjusted.
  • the user can set only the desired POI mark as an adjustment target, improve the visibility of only the desired POI mark, and improve the convenience.
  • the arrangement angle of the two-dimensional object can be adjusted according to the angle of the straight line connecting the viewpoint and the gazing point with respect to the reference plane. Therefore, the visibility of the two-dimensional object is improved, and the convenience can be improved.
  • The navigation device 300 may also select one of the viewpoints VP11 and VP21 according to the distance to the guidance location. Furthermore, the viewpoints are not limited to the viewpoint VP11 and the viewpoint VP21, and an arbitrary viewpoint set by the user may be used.
  • In that case, the navigation device 300 sets a gazing point VP32 corresponding to the arbitrary viewpoint VP31 on the reference plane. Then, if the straight line connecting the viewpoint VP31 and the gazing point VP32 is L3, the navigation device 300 adjusts the arrangement angle of the two-dimensional object in the same manner as described above when the angle of L3 with respect to the reference plane is equal to or greater than the predetermined angle. At this time, as described above, the navigation device 300 rotates the two-dimensional object so that the angle formed by L3 and the display surface of the two-dimensional object falls within a predetermined range including 90 degrees.
  • the generation method described in the present embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation.
  • This program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, and a DVD, and is executed by being read from the recording medium by the computer.
  • The program may also be distributed via a transmission medium over a network such as the Internet.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Engineering & Computer Science (AREA)
  • Educational Administration (AREA)
  • Automation & Control Theory (AREA)
  • Computer Graphics (AREA)
  • Mathematical Physics (AREA)
  • Business, Economics & Management (AREA)
  • Educational Technology (AREA)
  • Geometry (AREA)
  • Computer Hardware Design (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Navigation (AREA)

Abstract

A map display (100) comprises a storage section (101), a viewpoint setting section (102), an adjustment section (103), a generation section (104), and a display section (105). The storage section (101) stores information on a three-dimensional map in which a two-dimensional object and a three-dimensional object are arranged. The viewpoint setting section (102) sets a viewpoint and a gazing point. The adjustment section (103) adjusts the arrangement angle of the two-dimensional object with respect to a reference surface according to the angle that a straight line connecting the viewpoint and the gazing point forms with the reference surface of the three-dimensional map. The generation section (104) generates a map image in which the three-dimensional map, after the adjustment of the arrangement angle, is viewed in the direction from the viewpoint to the gazing point. The display section (105) displays the map image.
PCT/JP2008/056231 2008-03-28 2008-03-28 Map display, navigation device, generation method, map image generation program, and computer-readable recording medium WO2009118911A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2008/056231 WO2009118911A1 (fr) 2008-03-28 2008-03-28 Map display, navigation device, generation method, map image generation program, and computer-readable recording medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2008/056231 WO2009118911A1 (fr) 2008-03-28 2008-03-28 Map display, navigation device, generation method, map image generation program, and computer-readable recording medium

Publications (1)

Publication Number Publication Date
WO2009118911A1 true WO2009118911A1 (fr) 2009-10-01

Family

ID=41113138

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2008/056231 WO2009118911A1 (fr) 2008-03-28 2008-03-28 Map display, navigation device, generation method, map image generation program, and computer-readable recording medium

Country Status (1)

Country Link
WO (1) WO2009118911A1 (fr)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09127862A (ja) * 1995-10-30 1997-05-16 Zanavy Informatics:Kk Vehicle map display device
JPH09292254A (ja) * 1996-04-26 1997-11-11 Matsushita Electric Ind Co Ltd Traveling position display device
JP2001027535A (ja) * 1999-05-12 2001-01-30 Denso Corp Map display device
JP2003269972A (ja) * 1999-05-14 2003-09-25 Denso Corp Map display device

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8566020B2 (en) 2009-12-01 2013-10-22 Nokia Corporation Method and apparatus for transforming three-dimensional map objects to present navigation information
US20130222364A1 (en) * 2012-02-29 2013-08-29 Daniel Kraus Manipulation of User Attention with Respect to a Simulated Field of View for Geographic Navigation Via Constrained Focus on, Perspective Attraction to, and/or Correction and Dynamic Adjustment of, Points of Interest
WO2013127475A1 (fr) * 2012-02-29 2013-09-06 Navteq B.V. Manipulation de l'attention d'un utilisateur par rapport à un champ de vision simulé pour une navigation géographique par l'intermédiaire d'une focalisation limitée sur, une attraction en perspective vis-à-vis de, et/ou une correction et un ajustement dynamique de, points d'intérêt
US9791290B2 (en) * 2012-02-29 2017-10-17 Here Global B.V. Manipulation of user attention with respect to a simulated field of view for geographic navigation via constrained focus on, perspective attraction to, and/or correction and dynamic adjustment of, points of interest
CN103234546A (zh) * 2013-04-18 2013-08-07 易图通科技(北京)有限公司 Method and device for lane-change guidance in true three-dimensional navigation
JP2016540284A (ja) 2013-10-11 2016-12-22 TomTom Navigation B.V. Apparatus and method for displaying navigation instructions
US10527445B2 (en) 2013-10-11 2020-01-07 Tomtom Navigation B.V. Apparatus and methods of displaying navigation instructions
US11493358B2 (en) 2013-10-11 2022-11-08 Tomtom Navigation B.V. Apparatus and methods of displaying navigation instructions

Similar Documents

Publication Publication Date Title
US11493358B2 (en) Apparatus and methods of displaying navigation instructions
US7636634B2 (en) On-vehicle information terminal
JP4809900B2 (ja) Navigation device, map display method, and map display program
US8880343B2 (en) System for digital map labeling
US8862392B2 (en) Digital map landmarking system
JP2012221459A (ja) Map image display system, map image display device, map image display method, and computer program
JP2009156759A (ja) Navigation device and computer program
CN101573590A (zh) Navigation device and method for displaying navigation information
JP4105609B2 (ja) Three-dimensional display method for navigation and navigation device
JP2006113047A (ja) In-vehicle electronic device
JP5554045B2 (ja) Map display device and map display method
WO2009118911A1 (fr) Map display, navigation device, generation method, map image generation program, and computer-readable recording medium
JP2010060294A (ja) Map display method and navigation device using the same
JP4572235B2 (ja) Position setting device, position setting method, position setting program, and recording medium
JP5451428B2 (ja) Navigation device and data structure of network data
JP4682210B2 (ja) Navigation device, processing control method, processing control program, and recording medium
JP2008249655A (ja) Navigation device, navigation method, navigation program, and recording medium
JP2005326333A (ja) Intersection guidance method and navigation device
JP2005326210A (ja) Map information display control device, map information display device, system and methods thereof, program therefor, and recording medium recording the program
JP2007078774A (ja) Vehicle guidance device
JP4563708B2 (ja) Navigation device, method, and program
KR20100084855A (ko) Navigation device and route display method using the same
JPH1183503A (ja) Navigation device
JP3856038B2 (ja) Navigation device
JP4833709B2 (ja) Navigation device, route guidance method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08739349

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 08739349

Country of ref document: EP

Kind code of ref document: A1