WO2010038328A1 - Map display device, map display method, and computer program - Google Patents

Map display device, map display method, and computer program Download PDF

Info

Publication number
WO2010038328A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
data
map
note
annotation
Prior art date
Application number
PCT/JP2009/000978
Other languages
English (en)
Japanese (ja)
Inventor
大原聡一
Original Assignee
株式会社野村総合研究所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社野村総合研究所 filed Critical 株式会社野村総合研究所
Publication of WO2010038328A1 publication Critical patent/WO2010038328A1/fr

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/05 Geographic models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/60 Editing figures and text; Combining figures or text
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 29/00 Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B 29/003 Maps
    • G09B 29/005 Map projections or methods associated specifically therewith

Definitions

  • The present invention relates to a technique for displaying annotations such as various names and marks on a map image, and particularly relates to a technique for displaying annotations at an appropriate density.
  • Patent Document 1 discloses a technique for adjusting the density of annotation display (spot display) on a map when a map image is displayed using a computer or the like. That is, Patent Document 1 describes that, when the area occupied by annotation name displays on the display screen is greater than or equal to a specified allowable limit, the display is thinned out so that the area becomes less than the allowable limit.
  • JP 2005-227340 A
  • The number of annotations displayed varies greatly depending on the characteristics of each region. For example, an area where people are concentrated, such as an urban area where development has progressed, has a larger number of annotations, while suburbs and mountainous areas where development has not progressed have fewer.
  • an object of the present invention is to determine regional characteristics based on a note display displayed on a map.
  • Another object of the present invention is to display notes according to the characteristics of the area where the map is displayed.
  • A map display device according to one aspect of the present invention is a map display device provided with a display unit, and comprises: storage means for storing map data composed of a plurality of layers, the map data having map image data divided into a plurality of meshes for each layer and a plurality of annotation display data for displaying a plurality of annotations on the map image of each mesh; display point determining means for determining a display point to be displayed on the display unit; counting means for specifying, in a predetermined reference layer among the plurality of layers, one or more meshes including the display point determined by the display point determining means and its surrounding area as a search range, and counting the number of annotation displays displayed based on the annotation data included in the meshes of the specified search range; and annotation display selection means for selecting, when a display layer to be displayed on the display unit and a display range including the display point are determined, annotation display data to be included in the display range from the annotation display data within the display range of the display layer according to the counting result of the counting means, so that a display image including annotation displays based on the selected annotation display data is synthesized and displayed.
  • the apparatus further comprises city type determination means for determining a city type of the display point based on a counting result of the counting means, and the note display data includes attribute data indicating an attribute of each note display.
  • The annotation display selection means may specify the attribute of the annotation display data according to the city type determined by the city type determination means, the display layer, and the scale of the map to be displayed, and may select the annotation display data having the specified attribute from the annotation display data within the display range of the display layer.
  • Each time a display point is determined by the display point determination means, the city type determination means may determine the city type, and the annotation display selection means may specify the attribute of the annotation display data according to the determination result of the city type determination means.
  • When the city type determined by the city type determination means changes from a first city type to a second city type, the annotation display selection means may specify the attribute of the annotation display data according to the first city type until the second city type has been detected continuously for a predetermined number of times or more, and may specify the attribute of the annotation display data according to the second city type after the second city type has been detected continuously for the predetermined number of times or more.
  • When the city type determination means does not determine the city type, the annotation display selection means may specify the attribute of the annotation display data according to the city type corresponding to the previously determined display point, the display layer, and the scale of the map to be displayed.
  • The counting means may include, in the search range, a first mesh including the display point and one or more second meshes around the first mesh, and may count the number of annotation displays by weighting the number of annotation displays in the first mesh more heavily than the number of annotation displays in the second meshes.
  • The annotation display data may be constituted by a plurality of data entities respectively corresponding to the plurality of annotation displays on the map image, and the counting of the number of annotation displays performed by the counting means may be performed by counting the number of these data entities.
  • A city type determination device according to another aspect of the present invention comprises: storage means for storing map data composed of a plurality of layers, the map data having map image data divided into a plurality of meshes for each layer and annotation display data for displaying a plurality of annotations on the map image of each mesh; designation means for designating an urbanization determination reference point; counting means for specifying, in a predetermined one of the plurality of layers, one or more meshes including the urbanization determination reference point and its surrounding area as a search range, and counting the number of annotation displays displayed based on the annotation data included in the meshes of the specified search range; and city type determination means for determining a city type of the urbanization determination reference point based on the counting result of the counting means.
  • A city type determination means may likewise be provided together with storage means for storing map data composed of a plurality of layers, each having map image data divided into a plurality of meshes for each layer and annotation display data for displaying a plurality of annotations on the map image of each mesh, designation means for designating an urbanization determination reference point, and a predetermined one of the
  • Hereinafter, a map display system according to an embodiment of the present invention will be described with reference to the drawings.
  • In the present embodiment, display of a map image will be mainly described.
  • the map display system according to the present embodiment may be a subsystem of a navigation system.
  • FIG. 1 shows the overall configuration of the map display system according to this embodiment.
  • This map display system includes one or more user terminals 1, 1, ... that are map display devices, access points 7, 7 that perform wireless communication, and a server 3 connected to the access points 7, 7 via a network 9.
  • The server 3 is configured by, for example, a general-purpose computer system having a communication function with the network 9, and stores map data. The server 3 provides map data via the network 9 in response to requests from the user terminals 1, 1, ....
  • The user terminals 1, 1, ... may be any computer device having a communication function with the server 3, for example, a dedicated terminal for a navigation system, a mobile phone, a portable information terminal, or a general-purpose personal computer.
  • The functions or configurations of the user terminals 1, 1, ... described below can be realized by predetermined hardware included in the user terminals 1, 1, ... and predetermined software (a computer program) that controls or operates that hardware.
  • the software (computer program) can be stored in a computer-readable recording medium.
  • FIG. 2 is a configuration diagram of the user terminal 1.
  • The user terminal 1 includes a display unit 111, an input unit 113, a GPS processing unit 115, a communication processing unit 117, a display point determination unit 119, a city type determination unit 121, a display image synthesis unit 123, a map data storage unit 125, a city type table 131, and a note data selection table 133.
  • the display unit 111 includes a display device such as a liquid crystal panel.
  • the display unit 111 displays a screen to be provided to the user.
  • the input unit 113 includes an operation unit such as a push button, a keyboard, a touch panel, or a pointing device.
  • the input unit 113 receives input from the user.
  • the GPS processing unit 115 is a current position detection unit of the user terminal 1.
  • the GPS processing unit 115 includes a GPS antenna and a GPS signal processing unit that processes the GPS signal received by the GPS antenna and identifies the coordinates (latitude and longitude) of the current position of the user terminal 1.
  • the user terminal 1 may include position detection means other than the GPS processing unit 115.
  • position detection means using WiFi radio waves of a wireless LAN may be used.
  • the GPS processing unit 115 can be omitted. That is, the user terminal 1 may not include the GPS processing unit 115 (position detecting unit).
  • the communication processing unit 117 communicates with the server 3 by wireless or wired communication. For example, the communication processing unit 117 acquires map data from the server 3 and stores it in the map data storage unit 125.
  • the map data storage unit 125 stores the map data 200 acquired from the server 3 by the communication processing unit 117.
  • FIG. 3 shows the data structure of the map data 200.
  • the map data 200 is composed of a plurality of layers 210, 210,.
  • the layers 210, 210,... are hierarchized according to the scale of the map image, for example, as shown in FIG. 4A.
  • the data of each layer 210 is configured in units of a mesh 220 obtained by dividing a map image into a predetermined size.
  • For example, the layer having the finest mesh 220, that is, the layer having the most detailed map image, may be defined as layer 1; as the mesh becomes coarser, the layers become layer 2, layer 3, and so on, and there may be six layers up to layer 6, which has the coarsest mesh.
  • Each mesh 220 includes a map image data block 221 of each mesh and a note data group 223 on the map image of each mesh.
  • the map image data block 221 includes a plurality of polygon data entities and polyline data entities. Each polygon data entity and polyline data entity has an attribute information item and a position information item indicating their characteristics.
  • the note data group 223 includes a plurality of note entities 225, 225,.
  • Each annotation entity 225, 225, ... has data for performing one annotation display (spot display).
  • FIG. 4B is an example of a display screen displayed on the display unit 111.
  • The annotation display refers to, for example, various characters, symbols, figures, marks, and the like displayed on the map image as shown in FIG. 4B. Accordingly, each annotation entity 225, 225, ... may include data such as characters, symbols, figures, and marks.
  • Each note entity 225, 225, ... may include text data of various names (for example, place names (addresses), area names, river names, and names of various landmarks (buildings, structures)) or image data such as a store mark.
  • each note entity 225, 225,... Includes a data item indicating an attribute of each note display.
  • the attribute includes, for example, the type (address, intersection, convenience store, gas station, station, etc.) of each note display, and information on the position where each note display is arranged.
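  • The layered mesh structure described above can be summarized in code. The following is a minimal sketch, in Python, of how the map data 200, its layers 210, meshes 220, map image data blocks 221, and note entities 225 might be organized; all class and field names are illustrative assumptions, not part of the disclosed implementation.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class NoteEntity:
    """One annotation (spot) display: entity 225."""
    text: str                      # e.g. place name, landmark name
    attribute: str                 # e.g. "address", "intersection", "convenience store"
    position: Tuple[float, float]  # (latitude, longitude) where the note is placed

@dataclass
class MapImageBlock:
    """Map image data block 221: polygon and polyline entities."""
    polygons: List[dict] = field(default_factory=list)
    polylines: List[dict] = field(default_factory=list)

@dataclass
class Mesh:
    """One mesh 220 of a layer: image block plus its note data group 223."""
    mesh_id: Tuple[int, int]       # grid coordinates within the layer
    image_block: MapImageBlock = field(default_factory=MapImageBlock)
    notes: List[NoteEntity] = field(default_factory=list)

@dataclass
class Layer:
    """One layer 210; layer 1 has the finest mesh, layer 6 the coarsest."""
    level: int
    meshes: dict = field(default_factory=dict)  # mesh_id -> Mesh

@dataclass
class MapData:
    """Map data 200 composed of a plurality of layers."""
    layers: List[Layer] = field(default_factory=list)
```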
  • The display point determination unit 119 determines, based on the output of the input unit 113 or the GPS processing unit 115, the point included in the map to be displayed on the display unit 111, that is, the point about which the map is to be displayed (the display point). For example, when a notification of position information to be displayed on the display unit 111 (for example, latitude and longitude, an address, or the name of a landmark) is received from the input unit 113, when an input for scrolling a map already displayed on the display unit 111 is received, or when the position information of the current position is received from the GPS processing unit 115, the display point determination unit 119 determines the display point A to be displayed on the display unit 111 based on that information.
  • the display point A may be, for example, the latitude and longitude of a point (see FIG. 4B) displayed at a specific position (for example, the center) of the display unit 111.
  • the city type determination unit 121 performs the note display counting process and the city type determination process to determine the city type of the display point determined by the display point determination unit 119.
  • the city type indicates the degree of congestion of the annotation display on the map, that is, the number (density) of the annotation entities 225 per unit area. This is because the note display on the map is denser in the region where development (urbanization) has progressed, that is, the number of note entities 225 per unit area is large. On the other hand, in regions such as mountainous areas where development is not progressing, note display is sparse, that is, the number of note entities 225 per unit area is small. Therefore, in this embodiment, the city type determination is performed on the assumption that the degree of congestion of the annotation display substantially indicates the degree of urbanization.
  • FIG. 5 shows a map image divided into predetermined reference layer meshes.
  • the reference layer may be determined in advance as any one of the plurality of layers.
  • For example, the mesh 310 including the display point A and all meshes in contact with the mesh 310 (that is, all meshes around the mesh 310: above, below, left, right, and diagonal) may constitute the search range 320.
  • Alternatively, the search range 320 may consist of only the mesh 310, or of five meshes comprising the mesh 310 and the four meshes adjacent to it above, below, and to its left and right.
  • the city type determination unit 121 acquires the annotation data 223 of each mesh 220 belonging to the search range 320 from the map data storage unit 125. Then, the city type determination unit 121 counts the number of annotation entities 225 of each mesh 220 for each mesh. Furthermore, the city type determination unit 121 calculates the total number of entities in the entire search range 320. When determining the total number of entities in the entire search range 320, the city type determination unit 121 may calculate the total after applying a predetermined weighting factor to the number of entities for each mesh.
  • For example, the weighting factor of the mesh 310 to which the display point A belongs may be made larger than that of the surrounding meshes, or the weighting factor of meshes adjacent to the mesh 310 vertically or horizontally may be made larger than the weighting factor of meshes adjacent diagonally.
  • the city type determination unit 121 calculates the number of entities per mesh in the search range 320 (average number of entities) by dividing the total number of entities calculated here by the number of meshes in the search range 320.
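  • As a hedged illustration of the counting process just described, the sketch below counts note entities per mesh in the search range 320, applies per-mesh weighting factors, and derives the average number of entities per mesh; the weighting scheme and function names are assumptions for illustration only.

```python
def average_entity_count(search_range_meshes, weights=None):
    """Count note entities per mesh, apply optional weights, and return the
    average number of entities per mesh in the search range 320.

    search_range_meshes: list of Mesh objects in the search range
    weights: optional list of weighting factors, one per mesh; for example,
             the mesh 310 containing display point A may be given a larger weight.
    """
    if weights is None:
        weights = [1.0] * len(search_range_meshes)

    # Weighted total number of note entities over all meshes in the range.
    weighted_total = sum(
        w * len(mesh.notes) for mesh, w in zip(search_range_meshes, weights)
    )
    # Average number of entities per mesh in the search range.
    return weighted_total / len(search_range_meshes)
```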
  • the city type determination unit 121 determines a city type by referring to the city type table 131 based on the average number of entities calculated by the above-described annotation display counting process.
  • The above processing is preferably performed with a layer of medium (middle) mesh fineness (for example, layer 3) as the reference layer.
  • FIG. 6 shows an example of the city type table 131. That is, the city type table 131 stores the average number of entities 1311 and the city type 1313 in association with each other.
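  • The city type table 131 thus associates ranges of the average entity count with city types. A minimal lookup sketch follows; the numeric thresholds and most type labels are invented for illustration and do not reflect values actually used in the embodiment.

```python
# Hypothetical city type table 131: (minimum average entity count, city type),
# ordered from densest to sparsest.
CITY_TYPE_TABLE = [
    (500.0, "around the big city terminal"),
    (200.0, "urban area"),
    (50.0,  "suburb"),
    (0.0,   "mountainous / rural area"),
]

def determine_city_type(average_entities: float) -> str:
    """Return the city type 1313 associated with an average entity count 1311."""
    for threshold, city_type in CITY_TYPE_TABLE:
        if average_entities >= threshold:
            return city_type
    return CITY_TYPE_TABLE[-1][1]
```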
  • In this way, the city type of the display point A determined by the display point determination unit 119 is determined.
  • The city type determination unit 121 can also determine the city type of other points. That is, the city type determination unit 121 can determine the city type of any designated point by the same procedure as described above.
  • Each time the map image to be displayed on the display unit 111 is switched, the city type determination unit 121 may determine the city type of the display point A. Conversely, when the map image displayed on the display unit 111 is merely scrolled and the scroll width is small, the city type determination unit 121 does not have to determine the city type of the display point A every time.
  • In that case, the city type determination by the city type determination unit 121 may be omitted, and the display image synthesis unit 123 may use a past city type determination result.
  • The display image composition unit 123 performs an annotation display selection process and an image synthesis process.
  • In the image synthesis process, a display image including the annotation display selected in the annotation display selection process is synthesized and displayed on the display unit 111.
  • the display image composition unit 123 performs, for example, the following processing as the note display selection processing.
  • When the display range to be displayed on the display unit 111 is determined, the display image composition unit 123 acquires the corresponding map image data block 221 from the map data storage unit 125.
  • The display range is determined by, for example, the display layer, the display point A, and the scale of the map to be displayed. Therefore, the display image composition unit 123 may acquire the map image data block 221 from the map data storage unit 125 according to the display layer to be displayed on the display unit 111, the display point A, and the scale of the map to be displayed.
  • The display layer and the scale of the map displayed on the display unit 111 may be determined according to a user input or an instruction from another application (not shown) in the user terminal 1.
  • The display image composition unit 123 further acquires the annotation entities 225 included in the display range described above from the map data storage unit 125, and selects the annotation displays to be displayed in the display range according to the city type (or the average number of entities) determined by the city type determination unit 121. For example, the display image composition unit 123 acquires the annotation entities 225 for the annotations included in the display range based on the display point A, the display layer, and the scale. Then, the display image synthesis unit 123 selects annotation entities 225 according to the city type determined by the city type determination unit 121 and the note data selection table 133.
  • the note data selection table 133 defines the attribute of the note display included in the display image according to the layer and city type.
  • FIG. 7 shows an example of the note data selection table 133.
  • a table is configured for each layer.
  • In each table, a scale range for displaying the annotation display is defined for each attribute and each city type. For example, in the example of the figure, when the city type is “around the big city terminal”, the “address name” is displayed when the scale is “120” or less, the “gas station mark” is displayed when the scale is “100” or less, the “expanded address name” is displayed when the scale is “80” or less, and the “convenience store display” is displayed when the scale is “60” or less.
  • the scale shown in the example of FIG. 7 is a display scale for each layer.
  • the scale is larger as the numerical value is larger, and the map is smaller as the numerical value is smaller.
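  • The selection against the note data selection table 133 can be pictured as a scale threshold per (city type, attribute) pair within a layer's table: an annotation attribute is included only while the current display scale does not exceed its threshold. The sketch below is illustrative only; the thresholds mirror the FIG. 7 example for the “around the big city terminal” column, and the function names are assumptions.

```python
# Hypothetical slice of the note data selection table 133 for one layer:
# city type -> {annotation attribute: maximum scale at which it is shown}.
NOTE_SELECTION_TABLE = {
    "around the big city terminal": {
        "address name": 120,
        "gas station mark": 100,
        "expanded address name": 80,
        "convenience store display": 60,
    },
    # ... other city types would define their own thresholds ...
}

def select_note_attributes(city_type: str, scale: int) -> list:
    """Return the annotation attributes to display for this city type and scale."""
    thresholds = NOTE_SELECTION_TABLE.get(city_type, {})
    return [attr for attr, max_scale in thresholds.items() if scale <= max_scale]

def select_notes(notes, city_type, scale):
    """Keep only note entities whose attribute is selected for this city type/scale."""
    allowed = set(select_note_attributes(city_type, scale))
    return [n for n in notes if n.attribute in allowed]
```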
  • the display image composition unit 123 performs, for example, the following processing as image processing. That is, the display image synthesizing unit 123 generates a synthesized image (see FIG. 4B) to be displayed on the display unit 111 by superimposing the annotation display on the map image. Then, the display image synthesis unit 123 causes the display unit 111 to display the image synthesized as described above.
  • the display image composition unit 123 may be configured as follows, for example. That is, when switching from one city type to another city type, the city type adopted by the display image composition unit 123 in the note data selection process may be switched as follows, for example.
  • That is, when the city type determined by the city type determination unit 121 changes from a first city type to a second city type, the display image synthesizing unit 123 continues processing with the first city type until the second city type has been detected continuously for a predetermined number of times or more.
  • After the second city type has been detected continuously for the predetermined number of times or more, the display image composition unit 123 performs subsequent processing using the second city type.
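  • This switching rule is a simple form of hysteresis: the newly detected city type is adopted only after it has been observed a predetermined number of consecutive times. A small sketch of that logic follows; the class and its internal counter are assumptions for illustration.

```python
class CityTypeHysteresis:
    """Adopt a new city type only after it is detected N consecutive times."""

    def __init__(self, initial_type: str, required_consecutive: int = 3):
        self.current_type = initial_type       # city type used for selection
        self.required = required_consecutive   # the predetermined number of times
        self._candidate = None
        self._count = 0

    def update(self, detected_type: str) -> str:
        """Feed the latest determination result; return the city type to use."""
        if detected_type == self.current_type:
            # Same as the currently adopted type: reset any pending candidate.
            self._candidate, self._count = None, 0
        elif detected_type == self._candidate:
            # Candidate detected again consecutively; switch once the count is reached.
            self._count += 1
            if self._count >= self.required:
                self.current_type = detected_type
                self._candidate, self._count = None, 0
        else:
            # A new candidate city type starts its consecutive count at 1.
            self._candidate, self._count = detected_type, 1
        return self.current_type
```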
  • FIG. 8 is a flowchart showing a procedure for determining a city type based on map data.
  • It is assumed that the communication processing unit 117 acquires the required map data from the server 3 in advance and stores it in the map data storage unit 125.
  • the display point determination unit 119 determines the display point A (S11).
  • The city type determination unit 121 acquires the display point A determined by the display point determination unit 119 and determines the meshes 220 of the search range including the display point A and its surrounding area in a predetermined reference layer. Then, referring to the map data storage unit 125, the city type determination unit 121 acquires the annotation data 223 from the meshes 220 within the search range of the reference layer (S13).
  • the city type determination unit 121 counts the total number of annotation entities 225 acquired in step S13 (S15). At this time, the city type determination unit 121 may perform weighting using the weighting coefficient as described above.
  • the city type determination unit 121 calculates the average number of entities from the total number of entities counted in step S15 (S17).
  • the city type determination unit 121 refers to the city type table 131 and identifies a city type corresponding to the average number of entities calculated in step S17 (S19).
  • the city type of the display point can be determined based on the map data.
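  • Putting steps S11 to S19 together, the flow of FIG. 8 might be orchestrated as in the sketch below, reusing the counting and table-lookup sketches shown earlier; the helper get_search_range_meshes and the default reference layer are hypothetical and only stand in for the corresponding steps.

```python
def determine_city_type_for_display_point(map_data, display_point,
                                           reference_layer_level=3):
    """Sketch of the FIG. 8 flow: S13 gather search-range meshes and their notes,
    S15/S17 count entities and compute the per-mesh average, S19 look up the
    city type in the city type table 131."""
    # Assumes map_data.layers is ordered by level starting at layer 1.
    reference_layer = map_data.layers[reference_layer_level - 1]

    # S13: meshes of the search range (display point A plus its surrounding area).
    meshes = get_search_range_meshes(reference_layer, display_point)  # hypothetical helper

    # S15 + S17: count note entities (optionally weighted) and average per mesh.
    average = average_entity_count(meshes)

    # S19: look up the corresponding city type.
    return determine_city_type(average)
```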
  • FIG. 9 is a flowchart showing the procedure for synthesizing and displaying the display image.
  • the display image composition unit 123 determines a display range to be displayed on the display unit 111 (S21).
  • the display image composition unit 123 acquires the city type determined by the city type determination unit 121 (S23).
  • the display image composition unit 123 refers to the note data selection table 133 and specifies the attribute of the note display to be included and displayed on the display screen from the city type, the display layer, and the display scale (S25).
  • the display image composition unit 123 acquires the map image data block 221 and the note data group 223 in the display range of the display layer from the map data storage unit 125 (S27).
  • the display image composition unit 123 extracts the annotation entity 225 having the annotation display attribute specified in step S25 from the annotation data group 223 acquired in step S27. Then, the display image obtained by superimposing the annotation display based on the entity extracted here on the map image based on the map image data block 221 acquired in step S27 is synthesized (S29).
  • the display image composition unit 123 displays the display image on the display unit 111 (S31).
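  • Similarly, the FIG. 9 flow might look like the following sketch, which strings together the selection-table lookup and a rendering step; get_display_range_data and render_map are placeholder helpers, not part of the disclosure.

```python
def compose_and_show_display_image(map_data, display_unit, display_point,
                                   display_layer, scale, city_type):
    """Sketch of the FIG. 9 flow: S21 display range, S25 attribute selection,
    S27 data acquisition, S29 synthesis, S31 display."""
    # S21 / S27: map image blocks and note data group for the display range.
    image_blocks, notes = get_display_range_data(            # hypothetical helper
        map_data, display_layer, display_point, scale)

    # S25: keep the annotations whose attribute is selected by the table 133.
    selected_notes = select_notes(notes, city_type, scale)

    # S29: superimpose the selected annotations on the map image.
    image = render_map(image_blocks, selected_notes)          # hypothetical renderer

    # S31: show the synthesized image on the display unit 111.
    display_unit.show(image)
```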
  • FIG. 1 is an overall configuration diagram of a map display system according to an embodiment of the present invention.
  • FIG. 2 is a configuration diagram of the user terminal 1.
  • FIG. 3 shows the data structure of the map data 200.
  • FIG. 4A is an explanatory diagram of layers, and FIG. 4B is an example of a composite image.
  • FIG. 5 is an explanatory diagram of the search range.
  • FIG. 6 shows an example of the city type table 131.
  • FIG. 7 shows an example of the annotation data selection table 133.
  • FIG. 8 is a flowchart showing the procedure for determining a city type based on map data.
  • FIG. 9 is a flowchart showing the procedure for synthesizing and displaying the display image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Business, Economics & Management (AREA)
  • Educational Technology (AREA)
  • Educational Administration (AREA)
  • Remote Sensing (AREA)
  • Mathematical Physics (AREA)
  • Computer Graphics (AREA)
  • Navigation (AREA)
  • Instructional Devices (AREA)
  • Processing Or Creating Images (AREA)
  • Traffic Control Systems (AREA)

Abstract

According to the invention, annotations are displayed according to regional characteristics determined from the annotations displayed on a map. A map display device comprises a storage unit (125) for storing, for each layer, map data including map image data of a map divided into a plurality of meshes and annotation display data for each of the meshes; a display point determination unit (119) for determining a display point to be displayed on a display unit (111); a city type determination unit (121) for setting, in a predetermined reference layer among the layers, one or more meshes including the display point and its surrounding region as a search range, and for counting the number of annotations displayed; and a display image composition unit (123) for selecting, when a display layer and a display range including a display point to be displayed on the display unit (111) are determined, annotation display data to be included in the display range from among the annotation display data within the display range of the display layer according to the counting result, and for composing and displaying a display image including the annotation display.
PCT/JP2009/000978 2008-09-30 2009-03-04 Dispositif d'affichage de carte, procédé d'affichage de carte et programme informatique WO2010038328A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008-252960 2008-09-30
JP2008252960A JP4326583B1 (ja) 2008-09-30 2008-09-30 地図表示装置、地図表示方法及びコンピュータプログラム

Publications (1)

Publication Number Publication Date
WO2010038328A1 true WO2010038328A1 (fr) 2010-04-08

Family

ID=41149083

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/000978 WO2010038328A1 (fr) 2008-09-30 2009-03-04 Dispositif d'affichage de carte, procédé d'affichage de carte et programme informatique

Country Status (2)

Country Link
JP (1) JP4326583B1 (fr)
WO (1) WO2010038328A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5383417B2 (ja) * 2009-10-07 2014-01-08 株式会社ゼンリンデータコム 地図情報処理装置、地図情報処理方法及び地図情報処理プログラム
JP7346979B2 (ja) * 2019-07-30 2023-09-20 トヨタ自動車株式会社 タグ付与装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08210862A (ja) * 1995-02-07 1996-08-20 Mitsubishi Electric Corp ナビゲーション用地図の表示方法
JPH10333555A (ja) * 1997-06-02 1998-12-18 Nissan Motor Co Ltd 車両用表示装置
JP2000337894A (ja) * 1999-05-24 2000-12-08 Fujitsu Ten Ltd ナビゲーション装置における市街図有無判定方法及び装置及び地図データ記憶
JP2004163592A (ja) * 2002-11-12 2004-06-10 Zenrin Datacom Co Ltd 地図情報提供システム
JP2007156849A (ja) * 2005-12-06 2007-06-21 Sony Corp 画像管理装置および画像表示装置

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2602938Y2 (ja) * 1993-11-05 2000-02-07 クラリオン株式会社 車載用ナビゲーション装置
JPH1173099A (ja) * 1997-08-29 1999-03-16 Denso Corp 地図表示装置
JP2003114615A (ja) * 2001-08-03 2003-04-18 Sony Corp 地図提供装置、地図提供方法及び地図提供システム
JP2003244748A (ja) * 2001-12-14 2003-08-29 Hitachi Ltd 移動端末の位置検出方法およびシステム
JP3753110B2 (ja) * 2002-08-07 2006-03-08 株式会社デンソー カーナビゲーション装置
WO2004088248A1 (fr) * 2003-04-02 2004-10-14 Wong, Lai Wan Affichage d'une carte numerique
JP4290156B2 (ja) * 2005-11-07 2009-07-01 株式会社ゼンリンデータコム 地図情報提供システム

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08210862A (ja) * 1995-02-07 1996-08-20 Mitsubishi Electric Corp ナビゲーション用地図の表示方法
JPH10333555A (ja) * 1997-06-02 1998-12-18 Nissan Motor Co Ltd 車両用表示装置
JP2000337894A (ja) * 1999-05-24 2000-12-08 Fujitsu Ten Ltd ナビゲーション装置における市街図有無判定方法及び装置及び地図データ記憶
JP2004163592A (ja) * 2002-11-12 2004-06-10 Zenrin Datacom Co Ltd 地図情報提供システム
JP2007156849A (ja) * 2005-12-06 2007-06-21 Sony Corp 画像管理装置および画像表示装置

Also Published As

Publication number Publication date
JP4326583B1 (ja) 2009-09-09
JP2010085587A (ja) 2010-04-15

Similar Documents

Publication Publication Date Title
US10170084B2 (en) Graphical representation generation for multiple points of interest
US7359798B2 (en) Method of controlling display of point information on map
US7869938B2 (en) Method and apparatus for displaying simplified map image for navigation system
US20080162031A1 (en) Information processing apparatus, information processing method and information processing program
US9110573B2 (en) Personalized viewports for interactive digital maps
US8525851B2 (en) Method of displaying labels on maps of wireless communications devices using pre-rendered characters
JP2012018468A (ja) 表示装置、および、プログラム
JP2002340588A (ja) ナビゲーション装置及びpoiアイコン表示方法
JP2003337041A (ja) 地図表示システム、地図表示方法、およびプログラム
JP2014519606A (ja) 短距離において複数の曲がり角を曲がるための支援を備えるナビゲーションシステム
US9835459B2 (en) Electronic device, and method and program for displaying name of search object candidate
CN105405355B (zh) 在电子地图上进行信息点筛选的方法
WO2010023963A1 (fr) Procédé d'affichage d'avatar, dispositif d'affichage d'avatar et support d'enregistrement
JP4326583B1 (ja) 地図表示装置、地図表示方法及びコンピュータプログラム
KR101307349B1 (ko) 모바일 단말기의 지도 디스플레이 장치 및 방법
CN106796498B (zh) 为用户渲染地图的方法、系统和存储介质
JP4884458B2 (ja) 比較物で面積表示する地図表示装置及び方法
JP4340326B1 (ja) ヘディングアップを行うナビゲーション装置
JP2010101838A (ja) ナビゲーション装置
JP4275928B2 (ja) 地図表示装置、地図表示方法、プログラムおよび記録媒体
JP2011008019A (ja) 制御装置、投影装置、制御方法、投影方法、制御プログラム、投影プログラムおよび記録媒体
JP4812609B2 (ja) ナビゲーションシステムおよびナビゲーション装置
JP5640598B2 (ja) 情報表示装置、情報表示システム、情報表示方法、携帯端末およびプログラム
JP2009210446A (ja) 地図情報表示装置及びスポット情報分割一覧表示プログラム及びスポット情報分割一覧表示提供サーバ
JP2000231375A (ja) 地理情報システム及びこのシステムを実現するためのプログラムを記録した記録媒体

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09817372

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 07.09.11)

122 Ep: pct application non-entry in european phase

Ref document number: 09817372

Country of ref document: EP

Kind code of ref document: A1