WO2007125658A1 - Navigation device - Google Patents
Navigation device
- Publication number
- WO2007125658A1 (PCT/JP2007/050094)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- map data
- guide
- map
- acquired
- storage unit
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3667—Display of a road map
- G01C21/367—Details, e.g. road map scale, orientation, zooming, illumination, level of detail, scrolling of road map or positioning of current position marker
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3667—Display of a road map
- G01C21/3673—Labelling using text of road map data items, e.g. road names, POI names
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/003—Maps
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/10—Map spot or coordinate position indicators; Map reading aids
Definitions
- The present invention relates to a navigation device that guides a user at a guide point set near, for example, an intersection or branch point by displaying the scene at the guide point as a guide map, and more particularly to a technique for displaying such a guide map with excellent visibility.
- A navigation apparatus that displays a map on a display and performs route guidance is conventionally known.
- When the user sets a destination, this navigation device searches for the optimum route from the current location to the destination and stores it as a travel route.
- The current position is then sequentially detected, and the current location and the travel route are displayed for guidance on a map that includes the detected current position.
- At a guide point, the device displays a guidance map consisting of a three-dimensional image that is easy to match with the scene the driver actually sees ahead, and also provides voice guidance.
- The guidance using the map and the guide map allows the user to easily select the traveling direction when traveling along the travel route.
- Patent Document 1 discloses an image display method for displaying an image such as a map with a sense of time and season.
- This image display method uses a calendar timer to monitor the current date and time; a multiplication constant generator generates R, G, B multiplication constants CR, CG, CB according to the current time, season, and so on; multipliers for R, G, and B multiply the map color information (RGB data) output from the color palette by the multiplication constants CR, CG, and CB, respectively; and the image is displayed based on the color information obtained by this operation.
- With this image display method, the display color of the map screen can be changed in conjunction with a preset time zone and season to express a day or night atmosphere and improve visibility.
- Patent Document 1: Japanese Patent Laid-Open No. 9-311623
- In another conventional approach, color palettes with color tones suited to various situations are prepared in advance, and when displaying, the optimum color palette for that moment is selected and the image is displayed with it.
- However, if n is the number of situations to be supported and m is the maximum number of colors that can be defined in a color palette, the number of colors that can be expressed in one situation is limited to, for example, m/n.
- In addition, the maximum number of colors m that can be defined in a color palette is typically limited to 256, so a realistic image cannot be displayed.
- Furthermore, because a color palette with a different color tone is merely applied to the image data excluding the color information, the image itself cannot be changed depending on the situation.
- The present invention has been made to solve the above-described problems, and an object of the invention is to provide a navigation device capable of displaying a guide map suited to the situation without being limited by the number of colors.
- In order to achieve this object, a navigation device according to the present invention comprises: a map data storage unit that stores map data in which the data representing each guide point includes a guide map storage position pointer; a guide map data storage unit that stores, for a single guide point, a plurality of guide map data for drawing different scenes as guide maps; a position detection unit that detects the current position; a situation acquisition unit that acquires the current situation; a processing unit that, when the current position detected by the position detection unit reaches a guide point indicated by the acquired map data, acquires the guide map data corresponding to the situation acquired by the situation acquisition unit from the location in the guide map data storage unit indicated by the guide map storage position pointer included in the data representing that guide point, and combines it with the map data acquired from the map data storage unit to generate display data; and a display output unit that displays a map and a guide map based on the display data generated by the processing unit.
- According to the navigation device of the present invention, a plurality of guide map data for drawing different scenes as guide maps are stored for one guide point, and when the current position reaches a guide point indicated by the map data, the guide map data corresponding to the situation at that time is acquired, combined with the map data, and displayed.
- Since no color palette is used, the number of colors that can be displayed can be increased, and a different image can be prepared for each situation.
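The claimed selection-and-composition flow can be sketched in a few lines. This is an illustrative model only; the function name, dictionary keys, and string composite are hypothetical stand-ins, not taken from the patent.

```python
# Hypothetical sketch: one guide point holds several full-color guide-map
# images keyed by situation; the matching image is combined with the map
# image to form the display data. No color palette is involved.
def make_display_data(map_image: str, guide_point: dict, situation: str) -> str:
    """Pick the guide image for the current situation and composite it."""
    guide_image = guide_point["guide_maps"][situation]  # via storage pointer
    return f"{map_image}+{guide_image}"  # stand-in for the composited screen
```

Because each situation refers to a distinct full-color image rather than a palette applied to shared image data, the displayable colors are not divided among situations.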
- FIG. 1 is a block diagram showing a functional configuration of a navigation device according to Embodiment 1 of the present invention.
- FIG. 2 is a diagram showing a data structure of map data and guide map data used in the navigation device according to Embodiment 1 of the present invention.
- FIG. 3 is a flowchart showing a schematic operation of the navigation device according to the first embodiment of the present invention.
- FIG. 4 is a diagram showing a data structure of map data and guide map data used in the navigation device according to Embodiment 2 and Embodiment 4 of the present invention.
- FIG. 5 is a flowchart showing details of a display target image acquisition process performed by the navigation device according to Embodiment 2 and Embodiment 4 of the present invention.
- FIG. 6 is a diagram showing a data structure of map data and guide map data used in the navigation device according to Embodiment 3 of the present invention.
- FIG. 7 is a flowchart showing details of a display target image acquisition process performed by a navigation device according to Embodiment 3 of the present invention.
- FIG. 8 is a flowchart showing details of a display target image acquisition process performed by the navigation device according to the fourth embodiment of the present invention.
- FIG. 9 is a diagram showing a data structure of map data and guide map data used in the navigation device in accordance with Embodiment 5 of the present invention.
- FIG. 10 is a flowchart showing details of a display target image acquisition process performed by the navigation device according to the fifth embodiment of the present invention.
- FIG. 1 is a block diagram showing a functional configuration of a navigation apparatus according to Embodiment 1 of the present invention.
- the navigation device includes a map information storage unit 11, a storage unit 12, a position detection unit 13, a situation acquisition unit 14, a display output unit 15, an input unit 16, and a processing unit 17.
- The map information storage unit 11 includes, for example, a storage medium such as a CD-ROM, a DVD-ROM, or a hard disk, and a disk drive that reads data recorded on the storage medium.
- The map information storage unit 11 can also be configured as a communication device that receives data from outside by communication.
- The map information storage unit 11 includes a map data storage unit 11a and a guide map data storage unit 11b.
- The map data storage unit 11a stores map data in which the data representing each guide point of the road network data includes a guide map storage position pointer.
- The guide map data storage unit 11b stores a plurality of guide map data for drawing guide maps consisting of three-dimensional images of scenes.
- The map data read from the map data storage unit 11a and the guide map data read from the guide map data storage unit 11b are sent to the processing unit 17.
- The storage unit 12 is configured with, for example, a RAM, and is accessed from the processing unit 17.
- the storage unit 12 is used for temporarily storing data processed by the processing unit 17.
- the storage unit 12 temporarily stores data representing the optimum route obtained by the route search process performed in the processing unit 17 as travel route data.
- The position detection unit 13 includes, for example, a GPS receiver, a vehicle speed sensor, and a gyro.
- The current position of the host vehicle is detected based on position data representing the current position detected by the GPS receiver, speed data representing the vehicle speed detected by the vehicle speed sensor, and direction data representing the traveling direction of the vehicle detected by the gyro.
- Current position data representing the current position of the vehicle detected by the position detection unit 13 is sent to the processing unit 17.
- The situation acquisition unit 14 includes, for example, various sensors attached to the inside or outside of the vehicle, and acquires situation data representing the situation used as a condition for switching the guide map. Specifically, the situation acquisition unit 14 acquires the date and time from a clock mechanism (not shown), acquires an illuminance value from an illuminometer (not shown), and acquires the on/off state of an illumination switch (not shown) that turns on the night illumination. The situation data obtained by the situation acquisition unit 14 is sent to the processing unit 17.
- The display output unit 15 includes, for example, a CRT display device or a liquid crystal display device, and displays a map including the current position of the vehicle and the guide route according to display data sent from the processing unit 17. When the vehicle reaches a guidance point, a guidance map is also displayed.
- The input unit 16 includes, for example, a touch panel placed on the screen of the display output unit 15, a remote controller, and operation switches, and is used by the user to input a destination and make various settings. Data input from the input unit 16 is sent to the processing unit 17.
- The processing unit 17 is configured with, for example, a microcomputer, and controls the entire navigation device. For example, the processing unit 17 executes a route search process that searches for the optimum route from the current position of the vehicle detected by the position detection unit 13 to the destination input from the input unit 16, based on the map data acquired from the map data storage unit 11a. In addition, when the current position indicated by the current position data sent from the position detection unit 13 reaches a guide point on the map data acquired from the map data storage unit 11a, the processing unit 17 executes a guide map display process that displays a guide map according to the situation data acquired from the situation acquisition unit 14 (details will be described later).
- The map data stored in the map data storage unit 11a has a well-known structure, and the data representing each guide point of the road network data included in the map data contains a guide map storage position pointer 20.
- The guide map storage position pointer 20 holds the storage position of the image data management table 21.
- The guide map data storage unit 11b stores guide map data for each guide point.
- the guide map data is composed of an image data management table 21 and image data 22.
- The image data management table 21 holds pointers indicating the storage positions of a plurality of pieces of image data (images 1, 2, ...) respectively corresponding to a plurality of pieces of situation data (situations 0, 1, ...).
- The image data 22 is data for drawing the guide maps displayed at a guide point, and consists of a plurality of pieces of image data for drawing full-color three-dimensional images (guide maps) of different scenes at one guide point.
- The image data 22 can be created in various formats such as a bitmap format and a JPEG format. Note that the pieces of image data included in the image data 22 need not be stored in one place, and their storage order need not follow the order of the pointers stored in the image data management table 21.
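As a rough illustration of this data structure, the following sketch models the guide map storage position pointer 20, the image data management table 21, and the image data 22. All class names, field names, and storage offsets are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ImageDataManagementTable:
    # situation code (0, 1, ...) -> storage position (pointer) of its image
    pointers: dict

@dataclass
class GuidePoint:
    name: str
    # plays the role of the guide map storage position pointer 20
    management_table: ImageDataManagementTable

# stand-in for the image data 22: storage position -> image bytes
image_data_store = {0x10: b"<morning jpeg>", 0x20: b"<night jpeg>"}

def fetch_guide_image(point: GuidePoint, situation: int) -> bytes:
    """Follow the pointer chain: guide point -> management table -> image."""
    storage_position = point.management_table.pointers[situation]
    return image_data_store[storage_position]
```

Note how the indirection mirrors the patent's description: the images themselves can live anywhere and in any order, because only the table's pointers matter.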
- The operation of the navigation device according to Embodiment 1 of the present invention configured as described above will now be described with reference to the flowchart of FIG. 3, focusing on the process of displaying a guide map at a guide point on the travel route.
- First, route search processing is executed (step ST31). That is, the processing unit 17 searches for the optimum route from the current position of the vehicle detected by the position detection unit 13 to the destination input by the input unit 16, based on the map data read from the map information storage unit 11.
- Data representing the optimum route obtained by this search is stored in the storage unit 12 as travel route data.
- Next, map data is read from the map data storage unit 11a, stored in the storage unit 12, and map display processing is performed. As a result, the map is displayed on the display output unit 15. After that, route guidance starts when the vehicle starts moving.
- Next, the vehicle position is detected (step ST32). That is, the processing unit 17 acquires current position data representing the current position of the host vehicle from the position detection unit 13. Next, it is checked whether or not a guide map needs to be displayed (step ST33). Specifically, the processing unit 17 collates the travel route data obtained in step ST31 with the current position data obtained in step ST32, and judges whether the vehicle has reached a point where the guide map should be displayed, that is, a guide point. The guide point can be, for example, a point 500 m before a branch point of the travel route. If it is determined in step ST33 that the guide map display is unnecessary, the process returns to step ST32 and the above-described processing is repeated until the guide point is reached.
- If it is determined in step ST33 that the guide map needs to be displayed, the processing unit 17 acquires situation data at that time from the situation acquisition unit 14 (step ST34).
- Next, the display target image is acquired (step ST35). That is, the processing unit 17 acquires the guide map storage position pointer 20 included in the data representing the guide point in the road network data of the map data already read into the storage unit 12 for map display, and acquires the image data management table 21 from the position in the guide map data storage unit 11b indicated by the contents of the guide map storage position pointer 20.
- Then, the situation data acquired in step ST34 is collated with the image data management table 21, a pointer corresponding to the situation data is selected from the image data management table 21, and the image data indicated by the selected pointer is acquired from the image data 22.
- Next, image display is performed (step ST36). That is, the processing unit 17 generates display data by combining the image data (guide map image) acquired in step ST35 with the image data (map image) generated based on the map data, and sends it to the display output unit 15.
- the map and the guide map are displayed on the screen of the display output unit 15.
- a map including a travel route and a current position mark can be displayed on the left half of the screen of the display output unit 15, and a guide map can be displayed on the right half.
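The loop of steps ST31-ST36 above can be sketched as follows. The 500 m threshold follows the example given for the guide point; everything else (function names, one-dimensional positions along the route) is a simplifying assumption for illustration.

```python
GUIDE_DISTANCE_M = 500  # guide point lies e.g. 500 m before the branch point

def needs_guide_map(current_pos_m: float, branch_pos_m: float) -> bool:
    """ST33: has the vehicle reached the guide point for this branch?"""
    return branch_pos_m - current_pos_m <= GUIDE_DISTANCE_M

def guidance_step(current_pos_m, branch_pos_m, situation, guide_maps):
    """One pass of ST32-ST35: return the guide image to show, or None."""
    if not needs_guide_map(current_pos_m, branch_pos_m):  # ST33: keep looping
        return None
    return guide_maps[situation]  # ST34-ST35: pick the image for the situation
```

In the device, the returned image would then be composited with the map image and sent to the display output unit 15 (ST36).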
- As described above, according to the navigation device of Embodiment 1 of the present invention, image data 22 for drawing a plurality of guide maps respectively corresponding to a plurality of situations is prepared, and when the current position arrives at a guide point indicated by the map data, a guide map according to the situation acquired by the situation acquisition unit 14 is displayed.
- Since this does not rely on preparing color palettes with different color tones, a full-color guide map can be displayed without being limited by the number of colors.
- The navigation device according to Embodiment 2 of the present invention is the same as that of Embodiment 1, except that the current time at the moment of passing the guide point is acquired from the situation acquisition unit 14, and a guide map corresponding to the time zone to which the acquired time belongs is displayed.
- the configuration of the navigation device according to the second embodiment is the same as that of the navigation device shown in FIG. 1 except for the function of the situation acquisition unit 14.
- In this embodiment, the situation acquisition unit 14 acquires time data representing the current time from, for example, a clock mechanism (not shown) as the situation data.
- FIG. 4 is a diagram showing a data structure of map data and guide map data used in the navigation device according to Embodiment 2 of the present invention.
- As shown in FIG. 4, the image data 22 is composed of image data for a morning image, a daytime image, an evening image, and a nighttime image.
- The image data management table 21 stores a morning image storage position indicating the storage position of the morning image, a daytime image storage position indicating the storage position of the daytime image, an evening image storage position indicating the storage position of the evening image, and a nighttime image storage position indicating the storage position of the nighttime image.
- First, it is checked whether the time is morning (step ST51). That is, the processing unit 17 extracts time data from the situation acquisition unit 14 and checks whether the time indicated by the extracted time data belongs to the morning time zone. If it is determined in step ST51 that the time is morning, a morning image is acquired (step ST52). That is, the processing unit 17 extracts the morning image storage position from the image data management table 21, and acquires the image data of the morning image from the image data 22 using the extracted morning image storage position as a pointer. Thereafter, the display target image acquisition process ends.
- If it is determined in step ST51 that the time is not morning, it is next checked whether the time is daytime (step ST53). That is, the processing unit 17 extracts time data from the situation acquisition unit 14 and checks whether the time indicated by the extracted time data belongs to the daytime time zone. If it is determined in step ST53 that the time is daytime, a daytime image is acquired (step ST54). That is, the processing unit 17 extracts the daytime image storage position from the image data management table 21, and acquires the daytime image data from the image data 22 using the extracted daytime image storage position as a pointer. Thereafter, the display target image acquisition process ends.
- If it is determined in step ST53 that the time is not daytime, it is next checked whether the time is evening (step ST55). That is, the processing unit 17 extracts time data from the situation acquisition unit 14 and checks whether the time indicated by the extracted time data belongs to the evening time zone. If it is determined in step ST55 that the time is evening, an evening image is acquired (step ST56). That is, the processing unit 17 extracts the evening image storage position from the image data management table 21, and acquires the image data of the evening image from the image data 22 using the extracted evening image storage position as a pointer. Thereafter, the display target image acquisition process ends.
- If it is determined in step ST55 that the time is not evening, the time is recognized as nighttime, and a nighttime image is acquired (step ST57). That is, the processing unit 17 extracts the nighttime image storage position from the image data management table 21, and acquires the image data of the nighttime image from the image data 22 using the extracted nighttime image storage position as a pointer. Thereafter, the display target image acquisition process ends.
- The determinations in steps ST51, ST53, and ST55 are based on the conditions of equations (1) to (4) below, which are stored in advance in, for example, the processing unit 17 or the storage unit 12. Note that the conditions expressed by equations (1) to (4) can also be input from the input unit 16.
- Morning image display time zone: T1 ≤ (time when passing the guide point) < T2 … (1)
- Daytime image display time zone: T2 ≤ (time when passing the guide point) < T3 … (2)
- Evening image display time zone: T3 ≤ (time when passing the guide point) < T4 … (3)
- Nighttime image display time zone: T4 ≤ (time when passing the guide point) < T1 … (4)
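The conditions above transcribe directly into a four-way branch. The boundary times T1-T4 below are illustrative placeholder hours only; the patent leaves the actual boundaries to be stored in advance or entered via the input unit 16.

```python
T1, T2, T3, T4 = 5, 10, 16, 19  # illustrative hour boundaries, not from the patent

def time_zone(hour: int) -> str:
    """Classify the time of passing the guide point per equations (1)-(4)."""
    if T1 <= hour < T2:
        return "morning"    # (1)
    if T2 <= hour < T3:
        return "daytime"    # (2)
    if T3 <= hour < T4:
        return "evening"    # (3)
    return "nighttime"      # (4): T4 <= hour, or hour < T1 after midnight
```

The final case needs no explicit comparison: the nighttime zone of equation (4) wraps around midnight, so it is simply everything the first three ranges exclude.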
- As described above, the navigation device according to Embodiment 2 of the present invention displays the guide map according to the time zone to which the time of passing the guide point belongs.
- Accordingly, the guide map displayed on the display output unit 15 is closer to the scenery actually seen, making it easy to select the direction of travel when traveling to the destination.
- The navigation device according to Embodiment 3 of the present invention is the same as that of Embodiment 1, except that the lighting/non-lighting state of the illumination is acquired from the situation acquisition unit 14 when passing the guide point, and a guide map corresponding to the acquired lighting/non-lighting state is displayed.
- The configuration of the navigation device according to Embodiment 3 is the same as that of the navigation device shown in FIG. 1 except for the function of the situation acquisition unit 14.
- In this embodiment, the situation acquisition unit 14 acquires, as the situation data, data representing the on/off state of an illumination switch (not shown) that turns on the night illumination.
- FIG. 6 is a diagram showing a data structure of map data and guide map data used in the navigation device according to Embodiment 3 of the present invention.
- the image data 22 is composed of a daytime image that is a display image when the illumination is not lit and a night image that is a display image when the illumination is lit.
- the image data management table 21 stores a daytime image storage position indicating a storage position of a daytime image and a nighttime image storage position indicating a storage position of a nighttime image.
- First, it is checked whether or not the illumination is lit (step ST71). That is, the processing unit 17 retrieves data representing the on/off state of the illumination switch from the situation acquisition unit 14, and checks whether the retrieved data indicates the on state. If it is determined in step ST71 that the illumination is lit, a nighttime image is acquired (step ST72). That is, the processing unit 17 extracts the nighttime image storage position from the image data management table 21, and acquires the image data of the nighttime image from the image data 22 using the extracted nighttime image storage position as a pointer. Thereafter, the display target image acquisition process ends.
- If it is determined in step ST71 that the illumination is not lit, a daytime image is acquired (step ST73). That is, the processing unit 17 extracts the daytime image storage position from the image data management table 21, and acquires the daytime image data from the image data 22 using the extracted daytime image storage position as a pointer. Thereafter, the display target image acquisition process ends.
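The two-way choice of steps ST71-ST73 reduces to a single conditional. The file names below are hypothetical stand-ins for the stored daytime and nighttime image data.

```python
def select_image(illumination_on: bool) -> str:
    """ST71-ST73: headlight switch on -> nighttime image, off -> daytime image."""
    return "night.jpg" if illumination_on else "day.jpg"
```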
- As described above, the navigation device according to Embodiment 3 of the present invention displays a guide map corresponding to the on/off state of the illumination switch that lights the night illumination of the automobile. Accordingly, the guide map displayed on the display output unit 15 is closer to the scenery actually seen, making it easy to select the direction of travel when traveling to the destination.
- The navigation device according to Embodiment 4 of the present invention is the same as that of Embodiment 1, except that the current illuminance value is acquired from the situation acquisition unit 14 when passing the guide point, and a guide map corresponding to the acquired illuminance value is displayed.
- the configuration of the navigation device according to the fourth embodiment is the same as that of the navigation device shown in FIG. 1 except for the function of the situation acquisition unit 14.
- In this embodiment, the situation acquisition unit 14 acquires, as the situation data, the illuminance value measured by an illuminometer (not shown).
- The data structure of the map data and guide map data used in the navigation device according to Embodiment 4 of the present invention is the same as that of Embodiment 2 shown in FIG. 4. However, the morning image, daytime image, evening image, and nighttime image are classified according to the illuminance value.
- First, it is checked whether or not the illuminance value indicates morning (step ST81). That is, the processing unit 17 extracts the current illuminance value from the situation acquisition unit 14, and checks whether the extracted illuminance value corresponds to the morning time zone. If it is determined in step ST81 that the illuminance value indicates morning, a morning image is acquired (step ST82). The processing in step ST82 is the same as the processing in step ST52 shown in FIG. 5. Thereafter, the display target image acquisition process ends.
- If it is determined in step ST81 that the illuminance value does not indicate morning, it is next checked whether the illuminance value indicates daytime (step ST83). That is, the processing unit 17 extracts the current illuminance value from the situation acquisition unit 14, and checks whether the extracted illuminance value corresponds to the daytime time zone. If it is determined in step ST83 that the illuminance value indicates daytime, a daytime image is acquired (step ST84). The processing in step ST84 is the same as the processing in step ST54 shown in FIG. 5. Thereafter, the display target image acquisition process ends.
- If it is determined in step ST83 that the illuminance value does not indicate daytime, it is next checked whether the illuminance value indicates evening (step ST85). That is, the processing unit 17 extracts the current illuminance value from the situation acquisition unit 14, and checks whether the extracted illuminance value corresponds to the evening time zone. If it is determined in step ST85 that the illuminance value indicates evening, an evening image is acquired (step ST86). The processing in step ST86 is the same as the processing in step ST56 shown in FIG. 5. Thereafter, the display target image acquisition process ends.
- If it is determined in step ST85 that the illuminance value does not indicate evening, it is recognized that the illuminance value indicates nighttime, and a nighttime image is acquired (step ST87).
- The processing in step ST87 is the same as the processing in step ST57 shown in FIG. 5. Thereafter, the display target image acquisition process ends.
- The determinations in steps ST81, ST83, and ST85 are based on the conditions of equations (5) to (8) below, which are stored in advance in, for example, the processing unit 17 or the storage unit 12. Note that the conditions expressed by equations (5) to (8) can also be input from the input unit 16.
- Morning image display illuminance: B1 ≤ (illuminance when passing the guide point) < B2 … (5)
- Daytime image display illuminance: B2 ≤ (illuminance when passing the guide point) < B3 … (6)
- Evening image display illuminance: B3 ≤ (illuminance when passing the guide point) < B4 … (7)
- Nighttime image display illuminance: B4 ≤ (illuminance when passing the guide point) < B1 … (8)
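Embodiment 4 repeats the four-way branch of Embodiment 2 with illuminance thresholds in place of time boundaries, so a sketch can share one generic helper. The band values below are placeholder lux figures chosen for illustration, not taken from the patent.

```python
def classify(value, bands, fallback="nighttime"):
    """bands: list of (label, low, high); first match wins, cf. (5)-(7)."""
    for label, low, high in bands:
        if low <= value < high:
            return label
    return fallback  # cf. (8): anything outside the listed bands

ILLUMINANCE_BANDS = [  # placeholder ranges in lux, ordered as in (5)-(7)
    ("morning", 1_000, 10_000),
    ("daytime", 10_000, 100_000),
    ("evening", 100, 1_000),
]
```

The same helper could classify hours against the T1-T4 boundaries of Embodiment 2, which is why the two embodiments share a flowchart structure.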
- As described above, the navigation device according to Embodiment 4 of the present invention displays the guide map according to the surrounding brightness.
- Accordingly, the guide map displayed on the display output unit 15 is closer to the scenery actually seen, making it easier to select the direction of travel when traveling to the destination.
- The navigation device according to Embodiment 5 of the present invention is the same as that of Embodiment 1, except that data representing the current season is acquired from the situation acquisition unit 14, and a guide map corresponding to the season indicated by this data is displayed.
- the configuration of the navigation device according to the fifth embodiment is the same as that of the navigation device shown in FIG. 1 except for the function of the situation acquisition unit 14.
- In this embodiment, the situation acquisition unit 14 acquires current date data from a clock mechanism (not shown) as the situation data.
- FIG. 9 is a diagram showing a data structure of map data and guide map data used in the navigation device according to Embodiment 5 of the present invention.
- The image data 22 includes image data for a spring image, a summer image, an autumn image, and a winter image.
- The image data management table 21 stores a spring image storage position indicating the storage location of the spring image, a summer image storage position indicating the storage location of the summer image, an autumn image storage position indicating the storage location of the autumn image, and a winter image storage position indicating the storage location of the winter image.
- For example, the spring image may be a scenery image including flowers, and the autumn image may be a scenery image including autumn leaves.
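The storage-position fields of the image data management table 21 act as pointers into the image data 22. A minimal sketch of that lookup, assuming byte offsets and a fixed image size (both hypothetical):

```python
# Hypothetical byte offsets standing in for the storage positions
# held in the image data management table 21 (FIG. 9).
IMAGE_STORAGE_POSITION = {
    "spring": 0x0000,
    "summer": 0x4000,
    "autumn": 0x8000,
    "winter": 0xC000,
}

IMAGE_SIZE = 0x4000  # hypothetical fixed size of one seasonal image


def acquire_seasonal_image(image_data: bytes, season: str) -> bytes:
    # The stored position is used as a pointer into the image data 22.
    offset = IMAGE_STORAGE_POSITION[season]
    return image_data[offset:offset + IMAGE_SIZE]
```

The same table-and-pointer scheme is what the acquisition steps ST102, ST104, ST106, and ST107 rely on.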
- First, it is checked whether or not the season is spring (step ST101). That is, the processing unit 17 extracts date data representing the current date from the situation acquisition unit 14, and checks whether the season indicated by the extracted date data belongs to spring. If it is determined in step ST101 that the season is spring, a spring image is acquired (step ST102). That is, the processing unit 17 extracts the spring image storage position from the image data management table 21, and acquires the image data of the spring image from the image data 22 using the extracted spring image storage position as a pointer. Thereafter, the display target image acquisition process ends.
- If it is determined in step ST101 that the season is not spring, it is then checked whether the season is summer (step ST103). That is, the processing unit 17 extracts date data from the situation acquisition unit 14, and checks whether the season indicated by the extracted date data belongs to summer. If it is determined in step ST103 that the season is summer, a summer image is acquired (step ST104). That is, the processing unit 17 extracts the summer image storage position from the image data management table 21, and acquires the image data of the summer image from the image data 22 using the extracted summer image storage position as a pointer. Thereafter, the display target image acquisition process ends.
- If it is determined in step ST103 that the season is not summer, it is then checked whether the season is autumn (step ST105). That is, the processing unit 17 extracts date data from the situation acquisition unit 14, and checks whether the season indicated by the extracted date data belongs to autumn. If it is determined in step ST105 that the season is autumn, an autumn image is acquired (step ST106). That is, the processing unit 17 retrieves the autumn image storage position from the image data management table 21, and acquires the image data of the autumn image from the image data 22 using the retrieved autumn image storage position as a pointer. Thereafter, the display target image acquisition process ends.
- If it is determined in step ST105 that the season is not autumn, the season is recognized as winter, and a winter image is acquired (step ST107). That is, the processing unit 17 retrieves the winter image storage position from the image data management table 21, and acquires the image data of the winter image from the image data 22 using the retrieved winter image storage position as a pointer. Thereafter, the display target image acquisition process ends.
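The flow of steps ST101 to ST107 amounts to mapping the current date to a season and then acquiring the matching image. A sketch of that determination, using Northern-Hemisphere month boundaries as hypothetical stand-ins for the configurable date ranges D1 to D4:

```python
from datetime import date


def season_of(d: date) -> str:
    # Month boundaries are hypothetical stand-ins for the date
    # ranges D1 to D4 checked in steps ST101, ST103, and ST105.
    if 3 <= d.month <= 5:
        return "spring"   # ST101 -> ST102
    if 6 <= d.month <= 8:
        return "summer"   # ST103 -> ST104
    if 9 <= d.month <= 11:
        return "autumn"   # ST105 -> ST106
    return "winter"       # ST107
```

The season returned here would then be used to look up the corresponding storage position in the image data management table 21.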
- The determination conditions used in steps ST101, ST103, and ST105 are based on equations (9) to (12) and are stored in advance in, for example, the processing unit 17 or the storage unit 12. Note that the conditions shown in equations (9) to (12) can also be input from the input unit 16.
- The date ranges D1 to D4 may be prepared for each region. Based on data representing the current region, the navigation device can be configured to use the D1 to D4 corresponding to that region.
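Region-dependent date ranges can be held in a per-region table and selected by the region indicated in the acquired data. The region names and month ranges below are hypothetical; ranges may wrap the year end.

```python
# Hypothetical per-region date ranges standing in for D1 to D4;
# each season is a (start_month, end_month) pair, which may wrap
# the year end (e.g. winter spanning December to February).
REGION_SEASONS = {
    "northern": {"spring": (3, 5), "summer": (6, 8),
                 "autumn": (9, 11), "winter": (12, 2)},
    "southern": {"spring": (9, 11), "summer": (12, 2),
                 "autumn": (3, 5), "winter": (6, 8)},
}


def season_for(region: str, month: int) -> str:
    for season, (start, end) in REGION_SEASONS[region].items():
        if start <= end and start <= month <= end:
            return season
        if start > end and (month >= start or month <= end):
            return season  # range wraps past December
    raise ValueError(f"month {month} not covered for region {region}")
```

With such a table, the same guide point would yield a summer image in January for a Southern-Hemisphere region and a winter image for a Northern-Hemisphere one.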
- As described above, the navigation device according to the fifth embodiment displays different guide maps depending on the season at the time of passing the guide point. The guide map shown on the display output unit 15 therefore more closely resembles the scenery that can actually be seen, which makes it easier to select the direction of travel when traveling to the destination.
- As described above, the navigation device according to the present invention displays a plan view corresponding to the situation at the vehicle position, and can thereby provide a high-quality guide map close to the scenery the driver actually sees. This configuration improves visibility and is suitable for use in in-vehicle navigation devices.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Theoretical Computer Science (AREA)
- Mathematical Physics (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Navigation (AREA)
- Instructional Devices (AREA)
Abstract
Description
Claims
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112007000809T DE112007000809T5 (de) | 2006-04-26 | 2007-01-09 | Navigationsgerät |
JP2008513088A JPWO2007125658A1 (ja) | 2006-04-26 | 2007-01-09 | ナビゲーション装置 |
US12/087,268 US8918283B2 (en) | 2006-04-26 | 2007-01-09 | Navigation apparatus |
CN200780006594.5A CN101389926B (zh) | 2006-04-26 | 2007-01-09 | 导航装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006-122367 | 2006-04-26 | ||
JP2006122367 | 2006-04-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2007125658A1 true WO2007125658A1 (ja) | 2007-11-08 |
Family
ID=38655198
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2007/050094 WO2007125658A1 (ja) | 2006-04-26 | 2007-01-09 | ナビゲーション装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US8918283B2 (ja) |
JP (2) | JPWO2007125658A1 (ja) |
CN (1) | CN101389926B (ja) |
DE (1) | DE112007000809T5 (ja) |
WO (1) | WO2007125658A1 (ja) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100852615B1 (ko) | 2006-04-27 | 2008-08-18 | 팅크웨어(주) | 계절 및 지형 변화에 따른 지도 표현 방법 및 시스템 |
US8157853B2 (en) * | 2008-01-24 | 2012-04-17 | Medtronic, Inc. | Delivery systems and methods of implantation for prosthetic heart valves |
US9218682B2 (en) * | 2008-10-15 | 2015-12-22 | Nokia Technologies Oy | Method and apparatus for generating an image |
JP5556485B2 (ja) * | 2010-08-05 | 2014-07-23 | アイシン・エィ・ダブリュ株式会社 | 情報提供装置、情報提供方法、及び情報提供プログラム |
US9544869B2 (en) * | 2013-05-16 | 2017-01-10 | Qualcomm Incorporated | Method for adapting to venue positioning capabilities |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07191612A (ja) * | 1993-12-27 | 1995-07-28 | Aisin Aw Co Ltd | 車両用情報表示装置 |
JPH09311623A (ja) * | 1996-05-21 | 1997-12-02 | Alpine Electron Inc | 画像表示方法 |
JPH10148534A (ja) * | 1996-11-18 | 1998-06-02 | Nissan Motor Co Ltd | 車載情報表示装置 |
JPH10267668A (ja) * | 1997-03-27 | 1998-10-09 | Fujitsu Ten Ltd | ナビゲーション装置 |
JPH10339649A (ja) * | 1997-06-10 | 1998-12-22 | Matsushita Electric Ind Co Ltd | 経路案内システム |
JP2000283784A (ja) * | 1999-03-31 | 2000-10-13 | Matsushita Electric Ind Co Ltd | 走行位置表示装置 |
JP2000292198A (ja) * | 1999-04-08 | 2000-10-20 | Kenwood Corp | 車載用ナビゲーション装置 |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2801798B2 (ja) | 1991-07-10 | 1998-09-21 | パイオニア株式会社 | ナビゲーションシステム |
DE69426011T2 (de) * | 1993-12-27 | 2001-02-15 | Aisin Aw Co | Fahrzeuginformationsanzeigesystem |
JP3351650B2 (ja) * | 1995-03-28 | 2002-12-03 | アルパイン株式会社 | ナビゲーション装置のデモンストレーション方法 |
JPH09113291A (ja) | 1995-10-19 | 1997-05-02 | Toshiba Corp | 地図表示処理装置 |
JPH1138872A (ja) * | 1997-07-17 | 1999-02-12 | Toyota Motor Corp | 地図データ配信システム、およびそのシステムに適する地図データ取得装置 |
US20040204848A1 (en) * | 2002-06-20 | 2004-10-14 | Shigeru Matsuo | Navigation apparatus for receiving delivered information |
JP2003185453A (ja) * | 2001-12-20 | 2003-07-03 | Mitsubishi Electric Corp | ナビゲーション装置および経路探索方法 |
US6937936B2 (en) * | 2002-04-25 | 2005-08-30 | Aisin Aw Co., Ltd. | Navigation system |
ES2425555T3 (es) * | 2002-04-30 | 2013-10-16 | Telmap Ltd. | Sistema de navegación que utiliza mapas de corredores |
CN100449266C (zh) | 2002-08-09 | 2009-01-07 | 爱信艾达株式会社 | 地图显示装置 |
JP3981040B2 (ja) * | 2003-04-18 | 2007-09-26 | アルパイン株式会社 | ナビゲーション装置およびその装置における地図データのアクセス方法 |
US7342516B2 (en) * | 2003-10-08 | 2008-03-11 | Hitachi, Ltd. | Method and apparatus for communicating map and route guidance information for vehicle navigation |
JP4189297B2 (ja) * | 2003-10-29 | 2008-12-03 | 株式会社ナビタイムジャパン | 経路案内システム、携帯端末、サーバ、プログラム、記録媒体 |
JP2005214693A (ja) | 2004-01-28 | 2005-08-11 | Alpine Electronics Inc | 車載用ナビゲーション装置及びその画面表示方法 |
JP2006154538A (ja) * | 2004-11-30 | 2006-06-15 | Aisin Aw Co Ltd | 地図描画方法、地図描画システム、ナビゲーション装置及び入出力装置 |
US7698061B2 (en) * | 2005-09-23 | 2010-04-13 | Scenera Technologies, Llc | System and method for selecting and presenting a route to a user |
JP4745045B2 (ja) * | 2005-12-15 | 2011-08-10 | アルパイン株式会社 | ナビゲーション装置 |
JP4257661B2 (ja) * | 2006-06-30 | 2009-04-22 | アイシン・エィ・ダブリュ株式会社 | ナビゲーション装置 |
JP4809900B2 (ja) * | 2006-11-17 | 2011-11-09 | パイオニア株式会社 | ナビゲーション装置、地図表示方法及び地図表示プログラム |
CN101553709A (zh) * | 2006-12-11 | 2009-10-07 | 三菱电机株式会社 | 导航装置 |
WO2008096485A1 (ja) * | 2007-02-05 | 2008-08-14 | Mitsubishi Electric Corporation | ナビゲーション装置 |
US8311736B2 (en) * | 2008-06-04 | 2012-11-13 | Hitachi, Ltd. | Navigation device, navigation method and navigation system |
-
2007
- 2007-01-09 US US12/087,268 patent/US8918283B2/en not_active Expired - Fee Related
- 2007-01-09 CN CN200780006594.5A patent/CN101389926B/zh active Active
- 2007-01-09 JP JP2008513088A patent/JPWO2007125658A1/ja active Pending
- 2007-01-09 WO PCT/JP2007/050094 patent/WO2007125658A1/ja active Application Filing
- 2007-01-09 DE DE112007000809T patent/DE112007000809T5/de not_active Ceased
-
2010
- 2010-05-17 JP JP2010112983A patent/JP2010185880A/ja active Pending
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07191612A (ja) * | 1993-12-27 | 1995-07-28 | Aisin Aw Co Ltd | 車両用情報表示装置 |
JPH09311623A (ja) * | 1996-05-21 | 1997-12-02 | Alpine Electron Inc | 画像表示方法 |
JPH10148534A (ja) * | 1996-11-18 | 1998-06-02 | Nissan Motor Co Ltd | 車載情報表示装置 |
JPH10267668A (ja) * | 1997-03-27 | 1998-10-09 | Fujitsu Ten Ltd | ナビゲーション装置 |
JPH10339649A (ja) * | 1997-06-10 | 1998-12-22 | Matsushita Electric Ind Co Ltd | 経路案内システム |
JP2000283784A (ja) * | 1999-03-31 | 2000-10-13 | Matsushita Electric Ind Co Ltd | 走行位置表示装置 |
JP2000292198A (ja) * | 1999-04-08 | 2000-10-20 | Kenwood Corp | 車載用ナビゲーション装置 |
Also Published As
Publication number | Publication date |
---|---|
CN101389926B (zh) | 2011-11-09 |
JP2010185880A (ja) | 2010-08-26 |
US20090005976A1 (en) | 2009-01-01 |
US8918283B2 (en) | 2014-12-23 |
DE112007000809T5 (de) | 2009-04-30 |
CN101389926A (zh) | 2009-03-18 |
JPWO2007125658A1 (ja) | 2009-09-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20080215234A1 (en) | Portable navigation device | |
JP2003344062A (ja) | ナビゲーション装置 | |
JPH09292262A (ja) | 周辺施設検索表示方法及び誘導経路の目的地設定方法 | |
WO2007125658A1 (ja) | ナビゲーション装置 | |
JP2009280142A (ja) | 車載インパネ画像表示装置 | |
JP4219474B2 (ja) | 走行位置表示装置 | |
JPH1116091A (ja) | 車両用表示装置 | |
JP2002090167A (ja) | 車載用ナビゲーション装置のルート案内方法 | |
JP3732935B2 (ja) | ナビゲーション装置 | |
JPH09127862A (ja) | 車両用地図表示装置 | |
JP2017032436A (ja) | 移動案内システム、移動案内方法及びコンピュータプログラム | |
JP2017083278A (ja) | 情報提供システム、情報提供方法及びコンピュータプログラム | |
JP2004317222A (ja) | ナビゲーション装置およびナビゲーション装置におけるランドマークの表示方法 | |
JP3556868B2 (ja) | ナビゲーション装置 | |
JP3938825B2 (ja) | ナビゲーション装置 | |
JP2000003497A (ja) | 走行位置表示装置 | |
JP2007263666A (ja) | 経路探索方法、ナビゲーション装置及びコンピュータプログラム | |
JPH08105752A (ja) | 車載用ナビゲーション装置 | |
JP4183570B2 (ja) | シミュレーション走行時の地図表示方法およびそれを用いたナビゲーション装置 | |
JP2004177209A (ja) | ナビゲーション装置 | |
JP2002267474A (ja) | ナビゲーション装置 | |
JP2002267464A (ja) | ナビゲーション装置 | |
JP2007285907A (ja) | 地図表示装置、地図表示方法、及び地図表示プログラム | |
JPH08254433A (ja) | 車載用ナビゲーション装置 | |
JP4969392B2 (ja) | ナビゲーション装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 07706441 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 2008513088 Country of ref document: JP |
WWE | Wipo information: entry into national phase |
Ref document number: 12087268 Country of ref document: US |
WWE | Wipo information: entry into national phase |
Ref document number: 200780006594.5 Country of ref document: CN |
WWE | Wipo information: entry into national phase |
Ref document number: 1120070008091 Country of ref document: DE |
RET | De translation (de og part 6b) |
Ref document number: 112007000809 Country of ref document: DE Date of ref document: 20090430 Kind code of ref document: P |
122 | Ep: pct application non-entry in european phase |
Ref document number: 07706441 Country of ref document: EP Kind code of ref document: A1 |
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8607 |