US20010007090A1 - Navigation device - Google Patents

Navigation device

Info

Publication number
US20010007090A1
US20010007090A1 (application US09/768,460)
Authority
US
United States
Prior art keywords
voice
links
nodes
guide
route
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US09/768,460
Other versions
US6366852B2 (en)
Inventor
Takashi Irie
Masatsugu Norimoto
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Assigned to MITSUBISHI DENKI KABUSHIKI KAISHA reassignment MITSUBISHI DENKI KABUSHIKI KAISHA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: IRIE, TAKASHI, NORIMOTO, MASATSUGU
Publication of US20010007090A1 publication Critical patent/US20010007090A1/en
Application granted granted Critical
Publication of US6366852B2 publication Critical patent/US6366852B2/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current


Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/09 - Arrangements for giving variable traffic instructions
    • G08G1/0962 - Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968 - Systems involving transmission of navigation instructions to the vehicle
    • G08G1/096805 - Systems involving transmission of navigation instructions to the vehicle where the transmitted instructions are used to compute a route
    • G08G1/096827 - Systems involving transmission of navigation instructions to the vehicle where the transmitted instructions are used to compute a route where the route is computed onboard
    • G08G1/096833 - Systems involving transmission of navigation instructions to the vehicle where different aspects are considered when computing the route
    • G08G1/096838 - Systems involving transmission of navigation instructions to the vehicle where different aspects are considered when computing the route where the user preferences are taken into account or the user selects one route out of a plurality
    • G08G1/096855 - Systems involving transmission of navigation instructions to the vehicle where the output is provided in a suitable form to the driver
    • G08G1/096861 - Systems involving transmission of navigation instructions to the vehicle where the output is provided in a suitable form to the driver where the immediate route instructions are output to the driver, e.g. arrow signs for next turn
    • G08G1/096866 - Systems involving transmission of navigation instructions to the vehicle where the output is provided in a suitable form to the driver where the complete route is shown to the driver
    • G08G1/096872 - Systems involving transmission of navigation instructions to the vehicle where the output is provided in a suitable form to the driver where instructions are given per voice
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/36 - Input/output arrangements for on-board computers
    • G01C21/3626 - Details of the output of route guidance instructions
    • G01C21/3629 - Guidance using speech or audio output, e.g. text-to-speech

Definitions

  • the present invention relates to a navigation device which is mounted in a moving body and which guides a route from a present position to a destination.
  • Navigation devices which guide a route from a current position of an automobile to a destination while driving have been widely applied.
  • In this type of navigation device, when a destination is set by a user, the device searches for a preferred route from the current position to the destination.
  • When the searched route is presented to the user, it is common to present the route in an “entire route format”, in which the route from the geographical position at which the destination was set to the destination is displayed on a single screen by switching the map scale.
  • Alternatively, a method is employed in which more detailed information about the route is displayed by switching to a detailed map display with a large map scale ratio and scrolling the map display automatically or manually along the route.
  • A method of sequentially displaying guide maps of intersections on the route ahead of the current position, and a method of displaying the route schematically by its main branching points on the display device, are also known in the art. Furthermore, it is possible to execute the guiding operation by use of voice commands.
  • FIG. 1 is a block diagram showing a first conventional navigation device as disclosed in JP-A-5-297800.
  • reference numeral 1 denotes a touch switch for inputting a destination or the like
  • 2 is a vehicle speed sensor for detecting a vehicle speed
  • 3 is a bearing sensor for detecting a bearing
  • 4 is an external storage device which stores information indicating the classes of roads connecting branching points, information indicating the classes of branching points of roads, and pre-stored map information data.
  • 5 is a display device for displaying information regarding main points such as branching points which are on a route from the present position to the destination.
  • 6 is a control device which searches a travel route of the vehicle based on input signals from the bearing sensor 3 and the vehicle speed sensor 2 and an input signal from the touch switch 1, and which displays the main branching points on the travel route in summary form on the display device 5 .
  • route searching is executed.
  • the route searching is executed according to a Dijkstra method.
  • the device searches a route passing along main roads in which there are few right or left turns and in which the names of intersections at which turns are made are known.
  • the search is made on the basis of the map information data stored in the external storage device 4 .
  • the display process is executed to display the searched route.
  • information value is a product of a turning coefficient, a name presence/absence coefficient and a node category coefficient.
  • the node category coefficient is a fixed coefficient corresponding to categories of nodes such as expressway entrance/exit, tollway entrance/exit or national road intersection.
  • the node name presence/absence coefficient is a fixed coefficient which corresponds to the presence or absence of a name of the node.
  • the turning coefficient is a fixed coefficient which corresponds to the presence or absence of left or right turns.
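  • For illustration, the information value described above can be sketched as the product of the three fixed coefficients. The concrete numeric values used in the sketch below are hypothetical; the text only states that each coefficient is a fixed value.

```python
# Minimal sketch of the "information value" of the first conventional device:
# the product of a turning coefficient, a name presence/absence coefficient and
# a node category coefficient.  The concrete numeric values used here are
# hypothetical; the patent only states that each coefficient is fixed.

# Hypothetical fixed coefficients per node category.
NODE_CATEGORY_COEF = {
    "expressway_entrance_exit": 3.0,
    "tollway_entrance_exit": 2.0,
    "national_road_intersection": 1.5,
    "other": 1.0,
}

def information_value(category: str, has_name: bool, is_turn: bool) -> float:
    """Information value = turning coef x name presence/absence coef x node category coef."""
    turning_coef = 2.0 if is_turn else 1.0   # hypothetical fixed values
    name_coef = 1.5 if has_name else 1.0     # hypothetical fixed values
    category_coef = NODE_CATEGORY_COEF.get(category, 1.0)
    return turning_coef * name_coef * category_coef

# Example: a named national-road intersection where the route turns.
print(information_value("national_road_intersection", has_name=True, is_turn=True))  # 4.5
```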
  • The names of the destination and the present position, the names of the main branching points on the route from the present position to the destination, the distances between the branching points, representative place names which indicate the direction of travel, and the names of the roads which should be taken at branching points are displayed on the display device 5 .
  • FIG. 2 is a block diagram of a second conventional navigation device as disclosed in International Publication WO98/51995.
  • reference numeral 10 denotes a control means which performs control of the overall device and each type of calculation in the navigation device.
  • 11 is a map information storage means which stores digitized map information data such as intersection data, road data and the like.
  • 12 is a present position detection means which detects a present position of the moving body in which the navigation device is mounted.
  • 13 is a route setting means which sets a route between two points on a map on the basis of map information data stored in the map information storage means 11 .
  • 14 is a guide object intersection detection means which detects a guide object intersection to be guided on the route set by the route setting means 13 .
  • 15 is a quantizing calculation means which quantizes the route of the moving body onto a schematic map displaying the characteristic features of the route.
  • 16 is a display means which displays a route quantized by the quantizing calculation means with respect to a guide object intersection detected by the guide object intersection detection means 14 .
  • 17 is a voice guide message generation means having a voice information storage means (not shown) which stores words and phrases necessary for guide messages as voice waveform data.
  • The voice guide message generation means selects voice waveform data such as words and phrases for guide messages and combines them into guide messages when a quantized route of the moving body is displayed on the display means 16 .
  • 18 is a voice output means which notifies a user by voice commands of guide messages generated by the voice guide message generation means.
  • FIG. 3 is a flowchart explaining the operation of the second conventional navigation device shown in FIG. 2.
  • the route setting means 13 sets two points on the map on the basis of latitude and longitude from the map information data read from the map information storage means 11 and then sets a route between the two points using a general search algorithm on a network such as a Dijkstra method or the like.
  • the present position detection means 12 detects a present position (C 1 ) of the moving body.
  • the flags FL 1 , FL 2 , FL 3 are respectively initialized to 0.
  • the guide object intersection detection means 14 extracts an intersection with, for example, more than three roads being connected to the intersection as a forward guide object intersection (C 2 ).
  • the intersection is an intersection on the route set by the route setting means 13 and, of the two geographical points set by the route setting means, the intersection is further forward than present position (C 1 ) detected by the present position detection means 12 .
  • In a step ST5, the detection of the present position (C1) of the moving body is performed again by the present position detection means 12 , and in a step ST6, a road distance (L1) between the present position (C1) of the moving body and the forward guide object intersection (C2) is calculated on the basis of the map information data read from the map information storage means 11 .
  • When the flag FL1 has a value of 0 in a step ST8, a guide output A which is related to the forward guide object intersection (C2) is executed.
  • The guide output A comprises having the quantizing calculation means 15 extract, from the roads on the route on the map, only the section up to the forward guide object intersection (C2). The result is quantized to a simple arrow shape and a display map related to the forward guide object intersection (C2) is displayed on the display means 16 .
  • a guide voice message related to the forward guide object intersection (C 2 ) is generated by the voice guide message generation means 17 and the message is reported by voice commands from the voice output means 18 .
  • the flag FL 1 is varied to a value of 1 and the fact that the guide output A in relation to the forward guide object intersection (C 2 ) has been executed is stored.
  • In a step ST11, it is determined whether or not the process of setting the route by the route setting means 13 is completed. When it is completed, the guide process is terminated. When it is not completed, the routine returns to the step ST5 and executes the steps of the routine after step ST6.
  • When the flag FL1 does not have a value of 0 in the step ST8, since the guide output A has already been executed, the routine returns to the step ST5.
  • In a step ST7, when the distance (L1) is less than or equal to the reference value (L2) and greater than a predetermined reference value (L3) (for example, 300 meters), the routine progresses to a step ST12, and it is determined whether or not the flag FL2 has a value of 0.
  • When the flag FL2 has a value of 0, a guide output B related to the forward guide object intersection (C2) is executed.
  • The guide output B comprises having the quantizing calculation means 15 extract, from the roads on the map, only the route section connecting to the forward guide object intersection (C2).
  • the result is quantized to a simple arrow shape and a display map related to the forward guide object intersection (C 2 ) is displayed on the display means 16 .
  • a guide voice message related to the forward guide object intersection (C 2 ) is generated by the voice guide message generation means 17 and the message is reported by voice commands from the voice output means 18 .
  • the flag FL 2 is varied to a value of 1 and the fact that the guide output B in relation to the forward guide object intersection (C 2 ) has been executed is stored.
  • In the step ST11, it is determined whether or not the process of setting the route by the route setting means 13 is completed. When it is completed, the guide process is terminated. When it is not completed, the routine returns to the step ST5 and executes the steps of the routine after step ST6.
  • When the flag FL2 does not have a value of 0 in the step ST12, since the guide output B has already been executed, the routine returns to the step ST5.
  • When the distance (L1) is less than or equal to the reference value (L3), the routine progresses to a step ST15, and it is determined whether or not the flag FL3 has a value of 0.
  • When the flag FL3 has a value of 0, the guide output C related to the forward guide object intersection (C2) is executed.
  • The guide output C comprises having the quantizing calculation means 15 extract, on the map, the present position of the moving body, roads other than the roads on the route, the route roads connected to the forward guide object intersection (C2), and the forward guide object intersection (C2) itself.
  • the result is quantized to a simple arrow shape and a display map related to the forward guide object intersection (C 2 ) is displayed on the display means 16 .
  • a guide voice message related to the forward guide object intersection (C 2 ) is generated by the voice guide message generation means 17 and the message is reported by voice commands from the voice output means 18 .
  • the flag FL 3 is varied to a value of 1 and the fact that the guide output C in relation to the forward guide object intersection (C 2 ) has been executed is stored.
  • In the step ST11, it is determined whether or not the process of setting the route by the route setting means 13 is completed. When it is completed, the guide process is terminated. When it is not completed, the routine returns to the step ST5 and executes the steps of the routine after step ST6.
  • When the flag FL3 does not have a value of 0 in the step ST15, since the guide output C has already been executed, the routine returns to the step ST3, and the flags FL1, FL2, FL3 are initialized to 0. In a step ST4, the forward guide object intersection (C2) is extracted.
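  • The guide outputs A, B and C of the second conventional device can be sketched as the following loop. The flag handling follows the flowchart description above; the value of the reference value L2 and the assignment of guide output A to the farthest distance range are assumptions, since only the 300-meter example for L3 is given in the text.

```python
# Sketch of the guide-output loop of the second conventional device (FIG. 3),
# reconstructed from the flowchart description above.  The reference value L2
# (and the assignment of guide output A to the farthest distance range) is an
# assumption; only the 300 m example for L3 appears in the text.
L2 = 700.0   # metres, hypothetical threshold between guide outputs A and B
L3 = 300.0   # metres, example value given for the reference value L3

def guide_loop(route_completed, detect_position, next_guide_intersection,
               road_distance, guide_output_a, guide_output_b, guide_output_c):
    fl1 = fl2 = fl3 = 0                          # ST3: initialise the flags
    c2 = next_guide_intersection()               # ST4: forward guide object intersection
    while not route_completed():                 # ST11: repeat until guidance is finished
        c1 = detect_position()                   # ST5: present position of the moving body
        l1 = road_distance(c1, c2)               # ST6: road distance to the intersection
        if l1 > L2:                              # assumed range for guide output A
            if fl1 == 0:                         # ST8: output A not yet given
                guide_output_a(c2)               # display + voice for the intersection
                fl1 = 1                          # ST10: remember that A was executed
        elif l1 > L3:                            # ST7: L3 < L1 <= L2
            if fl2 == 0:                         # ST12
                guide_output_b(c2)
                fl2 = 1
        else:                                    # L1 <= L3
            if fl3 == 0:                         # ST15
                guide_output_c(c2)
                fl3 = 1
            else:                                # C already given: move on to the next intersection
                fl1 = fl2 = fl3 = 0              # ST3
                c2 = next_guide_intersection()   # ST4
```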
  • Since the conventional navigation device is constructed as above, the problem has arisen that safe operation of the vehicle can be affected, as it is necessary for a user to monitor the guide display even when display nodes are displayed in summary form by a summarizing process. Furthermore, the number of guiding nodes provided to guide the entire route by voice commands increases, and the problem has arisen that the guidance cannot cover the entire route appropriately in a short time.
  • the present invention is proposed to solve the above problems and has the object of providing a navigation device adapted to store the level of importance of nodes and links and select nodes and links on the searched route based on the level of importance or time for playing voice information.
  • a voice guide message is generated corresponding to the selected nodes and links and guiding of the route is executed by the voice guide message.
  • The present invention has the further object of providing a navigation device in which, when a plurality of nodes and links with the same importance exists and the number of such nodes and links is not equal to a predetermined reference number, nodes and links in proximity to the present position, in a number corresponding to the predetermined reference number, are selected from the nodes and links with the same importance, thereby making the number of guide nodes and the like correspond accurately with the predetermined reference number.
  • a navigation device of the present invention is adapted to store a level of importance of each node and link as a part of map information in a map information storage means, to select the nodes and links on the searched route based on the level of importance and to generate a voice guide message corresponding to the selected links and nodes. In this way, it is possible to guide an entire route appropriately in a short period of time by voice commands.
  • the navigation device of the present invention may be adapted to select the nodes and links on the searched route in such a manner that the level of importance of the selected nodes and links is less than or equal to a predetermined reference level and the number of the selected nodes and links is less than or equal to a predetermined reference number, and to generate a voice guide message corresponding to the selected nodes and links.
  • In this way, voice guiding with respect to the nodes and links having a low level of importance is not executed. Thus, it is possible to guide the entire route appropriately.
  • the navigation device of the present invention may be adapted to delete nodes and links, which are located in proximity to the guide point, from the nodes and links having the same level of importance and to make the number of the selected nodes and links equal to a predetermined reference number, when there exists a plurality of nodes and links with the same level of importance and the number of the selected nodes and links is not equal to the predetermined reference number. In this way, it is possible to make the number of nodes and links to be guided correspond accurately with the predetermined reference number.
  • the navigation device of the present invention may be provided with a reference value setting means for setting a predetermined reference value and a reference number setting means for setting a predetermined reference number. In this way, it is possible to provide a voice guide with the desired amount and level of importance.
  • The navigation device of the present invention may be adapted to store information about voice playing times relating to the names of each link and node as a part of the map information in the map information storage means, to select the nodes and links on the searched route in order of highest importance in such a manner that the voice playing time for the voice guide message is less than or equal to a predetermined reference value, and to generate the voice guide message corresponding to the selected nodes and links. In such a way, it is possible to accurately keep the time taken for voice guiding under the predetermined reference value.
  • FIG. 1 is a block diagram showing a first conventional navigation device.
  • FIG. 2 is a block diagram showing a second conventional navigation device.
  • FIG. 3 is a flowchart explaining the operation of a second conventional navigation device.
  • FIG. 4 is a block diagram showing a construction of a navigation device according to a first embodiment of the present invention.
  • FIG. 5 is a block diagram showing a construction of the hardware in the navigation device shown in FIG. 4.
  • FIG. 6 shows an example of map information data stored in a map information storage means.
  • FIG. 7 shows an example of a menu and a map displayed on a display means.
  • FIG. 8 shows an example of a route determined by a route searching means.
  • FIG. 9 shows an example of a menu for each setting category of route voice guides according to a first embodiment.
  • FIG. 10 is a flowchart of the operation of each section in route voice guide processing.
  • FIG. 11 is a flowchart showing the details of the process of extracting links and nodes as well as the proximate facilities to links and nodes in step ST 105 of FIG. 10.
  • FIG. 12 shows the relative relationship of guide number A and distance X from a present position to a guide point obtained by this formula.
  • FIG. 13 is a flowchart showing the details of the process of generating a voice guide message for extracted links and nodes as well as the proximate facilities to links and nodes in step ST 106 of FIG. 10.
  • FIG. 14 is an example of a set of supplementary voice data.
  • FIG. 15 shows data related to extracted links and nodes as well as the proximate facilities to links and nodes in the process shown in FIG. 11 with respect to the route shown in FIG. 8.
  • FIG. 16 shows a voice guide message generated based on the data shown in FIG. 15.
  • FIG. 17 shows a display example of a menu for each category of setting of route voice guides according to a second embodiment.
  • FIG. 18 is a flowchart showing the details of the process of extracting links and nodes as well as the proximate facilities to links and nodes in a second embodiment.
  • FIG. 4 is a block diagram showing a construction of a navigation device according to a first embodiment of the present invention.
  • FIG. 5 is a block diagram showing a construction of the hardware in the navigation device shown in FIG. 4.
  • reference numeral 21 denotes a control means which executes each type of calculation in the navigation device and controls other constitutive elements.
  • 22 is a map information storage means which pre-stores digitized map information data such as node data and link data displaying intersection points and roads.
  • 23 is a present position detection means which detects a present position of a moving body in which the navigation device is mounted.
  • 24 is a route search means which reads map information data stored in the map information storage means 22 , which searches a route between two geographic points in a map on the basis of map information data for example on the basis of a Dijkstra method and which determines a single route.
  • 25 is a route storage means which stores a route determined by the route search means 24 .
  • 26 is a display means which displays a route and the like stored in the route storage means 25 and a map based on map information data stored in the map information storage means 22 .
  • 27 is a voice guide message generation means which has a voice information storage means 31 which pre-stores voice waveform data such as words and phrases required for voice guide messages. Voice waveform data such as words and phrases constituting voice guide messages are selected when performing voice guiding, and voice guide messages are generated by combining the selected voice waveform data. 28 is a voice output means which outputs voice corresponding to voice guide messages generated by the voice guide message generation means 27 and which reports the guide message to a user.
  • 29 is an operation means which is operated when commands are input into the navigation device by a user and which supplies the input user commands to the control means 21 .
  • 30 is a voice guide information extraction means which extracts main guide information from guide information on a route stored in the route storage means 25 .
  • 51 is a CD-ROM storing digitized map information and a read-out device thereof which correspond to the map information storage means 22 shown in FIG. 4.
  • 52 is a GPS receiver which receives electromagnetic waves from artificial satellites of the global positioning system (GPS) and which outputs a present position of the moving body in which the navigation device is mounted.
  • 53 is a bearing sensor which detects a bearing in which the moving body is directed.
  • 54 is a distance sensor which detects a movement distance of the moving body.
  • 55 is a display device which has, for example, a liquid crystal display and which displays map information, maps based on map information data, determined routes and the like.
  • the display device corresponds to the display means 26 shown in FIG. 4.
  • 56 is a voice output means which outputs voice guide messages. It corresponds to the voice output means shown in FIG. 4.
  • 57 is an input device which has a switch operated when commands are input into the navigation device by a user and which supplies input user commands to a control unit 58 .
  • the input device 57 corresponds to the operation means shown in FIG. 4.
  • 58 is a control unit provided with a central processing unit (CPU) 61 , a read only memory (ROM) 62 , a random access memory (RAM) 63 , a display control section 64 and an input/output control section 65 .
  • The control unit 58 executes each type of calculation in the navigation device and controls the other constitutive components.
  • the control unit corresponds to the control means 21 , the route search means 24 , the route storage means 25 , the voice guide message generation means 27 and the voice guide information extraction means 30 shown in FIG. 4.
  • 61 is a CPU which executes processing of route searching and guide point extraction.
  • 62 is a ROM which pre-stores data, programs and the like used by the CPU 61 .
  • 63 is a RAM into which map information data and programs used by the CPU 61 are loaded and which stores the calculation results of the CPU 61 .
  • 64 is a display control section which controls the display device 55 and which displays each type of image on the display device 55 .
  • 65 is an input/output control section which executes transfer of signals and each type of data by acting as an interface between the control unit 58 and each type of external device (CD-ROM and read-out device 51 to input device 57 ).
  • FIG. 6 shows an example of map information data stored in the map information storage means 22 .
  • the map information data comprises a node data group 110 being the set of data related to nodes and a link data group 130 being the set of data related to links.
  • The node data group 110 is composed of node data records 120 , each of which comprises each type of data related to one node.
  • Each node data record 120 has a node number 121 which shows a distinguishing number which is uniquely assigned to a node which corresponds to the node data record 120 , a node coordinate 122 which shows latitude and longitude of a position of a node on the map, a connecting link number 123 which shows the number of links connecting the node, a link number 124 of each link connected to the node, a node name 125 which is the name of the node, and a proximate facility data record 150 which is the set of data related to the proximate facilities which exist in the periphery of the node.
  • a node voice guide level 126 which shows a level of importance of the referred node when performing voice guiding of a route containing the node and a node voice guide time 127 which shows a voice playing time required for voice guiding of the node are also provided.
  • the proximate facility data record 150 has a facility name 151 which shows the name of facilities in the environs of each node, a facility number 152 which displays a distinguishing number which is uniquely assigned to each facility, and a facility position 153 which shows the position of the facility which corresponds to the node.
  • a facility voice guide level 154 which shows a level of importance of the referred facility when performing voice guiding of a route containing a node and a facility voice guide time 155 which shows a voice data playing time required for voice guiding of the facility are also provided.
  • The link data group 130 is composed of link data records 140 , each of which comprises each type of data related to one link. Furthermore, each link data record 140 comprises a link number 141 which shows a distinguishing number uniquely assigned to the link which corresponds to the link data record 140 , a start node number 142 which shows the node connected to the start side of the link, a finish node number 143 which shows the node connected to the finish side of the link, a link length 144 which shows the length of the link, and a link attribute data record 160 which is the set of each type of data related to the link attributes.
  • The link attribute data record 160 has a link category 161 , flow regulation information 162 which shows the flow regulation of the road which corresponds to the link, a link name 163 which shows the name of the link, a link name number 164 which shows a distinguishing number uniquely assigned to the name of the link, a link voice guide level 165 which shows the importance of the link referred to when performing voice guiding of a route containing the link, and a link voice guide time 166 which shows the voice data playing time required for voice guiding of the link. Furthermore, a link proximate facility data record 170 is provided which is a set of data related to facilities in the proximity of the link.
  • the link proximate facility data record 170 comprises a facility name 171 which shows a facility name, a facility number 172 which shows a distinguishing number which is uniquely assigned to the facility, a facility position 173 which shows a position of facilities with respect to the link, and a facility voice guide level 174 which shows the importance of a facility referred to when performing voice guiding of a route containing a link, and a facility voice guide time 175 which shows a voice data playing time required for voice guiding of the facility.
  • the level of importance shown by the facility voice guide level 174 can be determined by consideration of the level of reputation of the facility in the general community or with reference to a standard determined nationally. Alternatively they may be determined by any other standard.
  • As described above, a node voice guide level 126 which shows a level of importance of each individual node and a corresponding node voice guide time 127 which shows the time required for voice guiding of the node, as well as a facility voice guide level 154 which shows a level of importance of each individual facility and a facility voice guide time 155 which shows the time required for voice guiding of the facility, are contained in the map information data.
  • Likewise, a link voice guide level 165 which shows a level of importance of each individual link and a link voice guide time 166 which shows the time required for voice guiding of the link, as well as a facility voice guide level 174 which shows a level of importance of each individual facility and a facility voice guide time 175 which shows the time required for voice guiding of the facility, are contained in the map information data.
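  • For illustration, the record layout of FIG. 6 described above can be sketched as follows. Field names follow the reference numerals in the text; the types and the assumption that guide times are measured in seconds are illustrative.

```python
# Minimal sketch of the map information records of FIG. 6 described above.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ProximateFacility:                 # proximate facility data record 150 / 170
    name: str                            # facility name 151 / 171
    number: int                          # facility number 152 / 172
    position: Tuple[float, float]        # facility position 153 / 173
    voice_guide_level: int               # facility voice guide level 154 / 174 (importance)
    voice_guide_time: float              # facility voice guide time 155 / 175

@dataclass
class NodeRecord:                        # node data record 120
    number: int                          # node number 121
    coordinate: Tuple[float, float]      # node coordinate 122 (latitude, longitude)
    link_numbers: List[int]              # connecting link number 123 / link numbers 124
    name: str                            # node name 125
    voice_guide_level: int               # node voice guide level 126
    voice_guide_time: float              # node voice guide time 127
    proximate_facilities: List[ProximateFacility] = field(default_factory=list)

@dataclass
class LinkRecord:                        # link data record 140 with link attribute data record 160
    number: int                          # link number 141
    start_node: int                      # start node number 142
    finish_node: int                     # finish node number 143
    length: float                        # link length 144
    category: str                        # link category 161
    flow_regulation: str                 # flow regulation information 162
    name: str                            # link name 163
    name_number: int                     # link name number 164
    voice_guide_level: int               # link voice guide level 165
    voice_guide_time: float              # link voice guide time 166
    proximate_facilities: List[ProximateFacility] = field(default_factory=list)  # record 170
```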
  • the control means 21 displays a map corresponding to the map information data in the display means 26 in response to an operation of a user.
  • a menu or the like is displayed for selecting each function.
  • FIG. 7 shows an example of a menu and a map displayed on the display means 26 .
  • a moving body mark 211 which shows the present position of the moving body
  • a bearing mark 212 which shows the direction of the map
  • a route line 213 which shows a route determined by the route search process to be discussed below
  • a menu 214 for selecting each function such as setting a destination are displayed on the map.
  • the route search process is performed.
  • the present position detected by the present position detection means 23 and the destination input to the operation means 29 by a user are supplied to the route search means 24 by the control means 21 .
  • Map information data is read from the map information storage means 22 by the route search means 24 , routes between the present position and the destination are searched by a Dijkstra method for example and a single route is determined.
  • Information relating to nodes and links which comprise the route are stored in the route storage means 25 .
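  • The route search described above can be sketched as a generic Dijkstra search over the node/link network. The undirected graph construction (one-way flow regulation is ignored) and the returned node/link number lists are assumptions; the text only states that a Dijkstra-type method determines a single route whose nodes and links are stored in the route storage means 25.

```python
# Generic Dijkstra sketch in the spirit of the route search means 24.
import heapq

def dijkstra_route(links, start_node, goal_node):
    """links: iterable of LinkRecord-like objects with number, start_node, finish_node, length."""
    graph = {}                                    # node -> [(neighbour, length, link number), ...]
    for lk in links:
        graph.setdefault(lk.start_node, []).append((lk.finish_node, lk.length, lk.number))
        graph.setdefault(lk.finish_node, []).append((lk.start_node, lk.length, lk.number))

    dist = {start_node: 0.0}
    prev = {}                                     # node -> (previous node, link number used)
    queue = [(0.0, start_node)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal_node:
            break
        if d > dist.get(node, float("inf")):
            continue                              # stale queue entry
        for neighbour, length, link_no in graph.get(node, []):
            nd = d + length
            if nd < dist.get(neighbour, float("inf")):
                dist[neighbour] = nd
                prev[neighbour] = (node, link_no)
                heapq.heappush(queue, (nd, neighbour))

    # Reconstruct the single determined route as node numbers and link numbers.
    nodes, route_links = [goal_node], []
    while nodes[-1] != start_node:
        parent, link_no = prev[nodes[-1]]
        route_links.append(link_no)
        nodes.append(parent)
    return list(reversed(nodes)), list(reversed(route_links))
```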
  • FIG. 8 is an example of a route determined by the route search means 24 .
  • Nodes N001-N00F lie between a present position and a destination (the figures are hexadecimal numbers).
  • Links L000-L00D (the figures are hexadecimal numbers) connect the nodes.
  • Route voice guiding becomes possible when a route is determined by the route search process.
  • At that time, the option (route outline) corresponding to the route voice guide function on the menu shown in FIG. 7 becomes selectable by an operation of the user. That is to say, before execution of the route search process or when no route is found, it is not possible for the user to make this selection, and the option “route outline” is covered (i.e. displayed in gray).
  • Each category of setting the route voice guide is executed before route voice guide processing.
  • First the control means 21 displays a menu for each setting type of the route voice guide on the display means 26 .
  • FIG. 9 shows an example of a menu display for each type of setting of the route voice guide according to a first embodiment.
  • a guide point selection term 201 which sets the execution of the route voice guide from a present position to any geographical point
  • a guide number selection term 202 which sets the total number (guide number) of links, nodes and proximate facilities of links and nodes for which voice guiding of the route from a present position to a destination is executed
  • a guide level selection term 203 which sets the level of detail of the voice guide.
  • There are the options “destination” and “detour” in the guide point selection term 201 .
  • the selection of the option is executed by a user operating the operation means 29 .
  • When “destination” is selected, route voice guiding from a present position to the destination is performed.
  • When “detour” is selected, route voice guiding from a present position to a predetermined detour point is performed.
  • The guide point selection term 201 may also be adapted to provide a plurality of detours as options or to add an option “selectable geographical point” with which the user selects a final geographical point for voice guiding on the displayed map.
  • the guide number selection term has the options “5”, “10”, “20” and “automatic”. The selection of these options is performed by the operation of the operational means 29 by the user.
  • When “5”, “10” or “20” is selected, the respectively corresponding number (5, 10, 20) is set as the total number of links, nodes and proximate facilities of links and nodes for which the voice guide is executed.
  • When “automatic” is selected, a guide number (discussed below) which is calculated in response to the distance from a present position to a guide point is set as the total number of links, nodes and proximate facilities of links and nodes for which the voice guide is executed.
  • the device is adapted to allow a user to directly set a value as a guide number.
  • the guide level selection term 203 has the options “high”, “medium” and “low”. The selection of these options is executed by a user operating the operation means 29 .
  • When “high” is selected, voice guiding is performed even for nodes, links and facilities proximate to nodes and links of low importance, as far as allowed by the guide number above.
  • When “low” is selected, voice guiding is performed only for nodes, links and facilities proximate to nodes and links of high importance, as far as allowed by the guide number above.
  • When “medium” is selected, voice guiding is performed only for nodes, links and facilities proximate to nodes and links of medium or higher importance, as far as allowed by the guide number above.
  • The setting of the reference value for the level of importance used when selecting guided nodes may, apart from selecting one of the predetermined levels above, be performed by the user directly setting a level of importance as a value.
  • A reference number setting means and a reference value setting means, which set the guide number and the guide level, are constituted by the operation means 29 and the display means 26 on which the menu is displayed.
  • FIG. 10 is a flowchart of the operation of each section in route voice guide processing. Firstly, as a result of the route search process being executed, the routine progresses from step ST 101 to step ST 102 when the route is determined. Thus, the gray cover on the menu option “route outline” is withdrawn allowing this option to be selected. When the option “route outline” is selected, route voice guide processing is performed (step ST 103 ).
  • a present position of the moving body is detected by the present position detection means 23 .
  • the voice guide information extraction means 30 reads data relating to nodes and links which comprise the determined route and extracts a number of nodes, links and proximate facilities to nodes and links of high importance corresponding to the guide number above based on voice guide information extraction conditions set by the user in the menu (FIG. 9).
  • In a step ST106, the extracted nodes and links and the proximate facilities to nodes and links are supplied to the voice guide message generation means 27 through the control means 21 .
  • the voice guide message generation means 27 generates voice guide messages relating to the extracted nodes and links and proximate facilities to nodes and links.
  • When a voice guide message is generated, it is supplied to the voice output means 28 by the control means 21 .
  • the voice guide message is output by the voice output means 28 and an outline of the route from a present position to a guide point is reported to a user.
  • FIG. 11 is a flowchart of the details of the process of extracting nodes and links and proximate facilities to nodes and links in step ST 105 in FIG. 10.
  • The voice guide information extraction means 30 reads information relating to the guide level term, the guide number term and the guide point term set by the user from the control means 21 , stores the node number of the selected geographical point based on the information about the guide point term, and stores an extracted guide number A based on the information about the guide number term.
  • When “automatic” is selected as the guide number term, the guide number A is stored as 0.
  • the voice guide information extraction means 30 stores a value 100 as a reference value GL when the guide level “high” is selected based on the information of the guide level term.
  • When the selected guide level is “medium”, the value 10 is stored as the reference value GL, and when the selected guide level is “low”, the value 5 is stored as the reference value GL.
  • the voice guide information extraction means 30 sets the initial level of the extraction level L of the importance of extracted nodes, links and proximate facilities to 0. (In FIG. 6 these are shown as the facility voice guide level 154 , 174 , the link voice guide level 165 , and the node voice guide level 126 ).
  • When the extraction level L is 0, nodes, links and proximate facilities whose level of importance is less than or equal to 0 are extracted and, as discussed below, the value of the extraction level L is incremented sequentially in steps of 1. Thus, the smaller the value of the extraction level L, the more important the information that is extracted.
  • the voice guide information extraction means 30 determines whether or not the guide number A is 0. That is to say, it is determined whether or not “automatic” has been selected as a guide number term.
  • When the guide number A is 0, the voice guide information extraction means 30 reads, in a step ST123, link information for the links comprising the determined route from the map information storage means 22 .
  • In a step ST124, a route distance X (in kilometers) from the present position to the guide point (the destination or the detour) is calculated on the basis of the link information.
  • the guide number A is calculated on the basis of the following formula.
  • FIG. 12 shows the corresponding relationship of a guide number A and the distance X from a present position to a guide point obtained by the formula. As shown in FIG. 12, the increment of the guide number A decreases as the distance X increases.
  • the guide number A which is set to “automatic” is calculated by the above formula. However, the calculation may be performed on the basis of another formula with distance or other element as a variable. When the guide number A is not equal to 0, the processes in steps ST 123 and ST 124 are not performed.
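  • The exact formula is not reproduced in this text; it is only characterized by FIG. 12, where the increment of the guide number A decreases as the distance X increases. The sketch below therefore uses a hypothetical concave function (a square root with an arbitrary factor) purely to illustrate that behavior.

```python
# Hypothetical stand-in for the automatic guide-number formula (see FIG. 12).
import math

def automatic_guide_number(distance_km: float) -> int:
    """Guide number A whose growth flattens as the distance X (km) increases."""
    return max(1, round(3.0 * math.sqrt(distance_km)))   # factor 3.0 is arbitrary

for x in (1, 4, 16, 64):
    print(x, automatic_guide_number(x))   # 1 -> 3, 4 -> 6, 16 -> 12, 64 -> 24
```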
  • the voice guide information extraction means 30 sets the extraction number SS (L) which shows the total of proximate facilities and nodes and links with an importance of less than the extraction level L, to an initial value of 0.
  • In a step ST126, the voice guide information extraction means 30 refers to the node voice guide level 126 , the link voice guide level 165 and the facility voice guide levels 154 , 174 shown in FIG. 6 for the nodes, links and proximate facilities from the present position to the guide point, based on the information regarding the nodes and links comprising the route stored in the route storage means 25 . That is to say, nodes, links and proximate facilities which have the same level of importance as the extraction level L are selected and extracted.
  • In a step ST127, the voice guide information extraction means 30 selects links from the extracted links and makes two adjacent links with the same link name number into one link.
  • the link number, the start intersection number and the link attribute of the link after variation are assigned from that link of the two original links which is nearer the present position.
  • the finish intersection number is assigned from that link of the two original links which is near the guide point.
  • the link length of the link after variation is equal to the sum of the respective lengths of the two original links.
  • data relating to proximate facility links of links after variation contains data relating to proximate facility links of the two original links.
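  • The link-merging step ST127 described above can be sketched as follows; the field names follow the record sketch given earlier, and the merge rules are those stated in the text.

```python
# Sketch of step ST127: adjacent extracted links with the same link name number
# are combined into one link.
def merge_adjacent_links(extracted_links):
    """extracted_links: LinkRecord-like objects ordered from the present position."""
    merged = []
    for link in extracted_links:
        if merged and merged[-1].name_number == link.name_number \
                and merged[-1].finish_node == link.start_node:
            prev = merged[-1]
            # Link number, start node and attributes stay those of the link nearer
            # the present position (prev); the finish node comes from the link
            # nearer the guide point, and the lengths and facilities are combined.
            prev.finish_node = link.finish_node
            prev.length += link.length
            prev.proximate_facilities += link.proximate_facilities
        else:
            merged.append(link)
    return merged
```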
  • Next, the voice guide information extraction means 30 updates the extraction number SS(L) to the sum of the number S(L) of nodes, links and related proximate facilities extracted in the current step ST126 and the extraction number SS(L-1) obtained when the extraction level was smaller by a value of 1.
  • In a step ST129, the voice guide information extraction means 30 determines whether or not the extraction number SS(L) is greater than or equal to the guide number A above. When the extraction number SS(L) is not greater than or equal to the guide number A, in a step ST130, the voice guide information extraction means 30 determines whether the extraction level L is smaller than the reference value GL of the guide level above. When the extraction level L is smaller than the reference value GL, in a step ST131, the value of the extraction level L is increased by 1 and the routine returns to the step ST126.
  • When the extraction level L is not smaller than the reference value GL in the step ST130, it is determined that all the nodes, links and related proximate facilities with a level of importance up to the set reference value GL have been extracted, and the process of step ST105 is completed.
  • When the extraction number SS(L) is greater than or equal to the guide number A in the step ST129, the voice guide information extraction means 30 determines in a step ST132 whether or not the extraction number SS(L) is the same as the guide number A. When the two are the same, it is determined that a number of nodes, links and proximate facilities equal to the set guide number A has been extracted, and the processing of step ST105 is completed.
  • When the extraction number SS(L) is not equal to the guide number A, the voice guide information extraction means 30 deletes nodes, links or related proximate facilities by the following process until the extraction number SS(L) equals the guide number A.
  • The voice guide information extraction means 30 determines whether the number of proximate facilities with an extracted level of importance of L is greater than the difference (SS(L)-A) between the extraction number SS(L) and the guide number A, that is, whether it is possible to delete related proximate facilities from the environs of the guide point until the extraction number SS(L) and the guide number A have the same value.
  • When it is determined that this is possible, the proximate facilities are deleted in a step ST134 and the extraction number SS(L) is made equal to the guide number A.
  • When it is not possible, the voice guide information extraction means 30 deletes all the proximate facilities from the extracted nodes, links and related proximate facilities and updates the extraction number SS(L) by subtracting the number of deleted proximate facilities.
  • the voice guide information extraction means 30 determines whether the number of nodes from among extracted nodes with an importance of L to which extracted links with an importance of L are not connected is greater than the difference (SS(L)-A) of the extraction number SS(L) and the guide number A. The voice guide information extraction means 30 also determines whether it is possible to delete such nodes from the environs of the guide point until the extraction number SS(L) and the guide number A have the same value.
  • When it is determined that this is possible, the nodes are deleted in a step ST137 and the extraction number SS(L) is made equal to the guide number A.
  • When it is not possible, the voice guide information extraction means 30 deletes all nodes to which extracted links with an importance of L are not connected from the extracted nodes, links and related proximate facilities and updates the extraction number SS(L) by subtracting the number of deleted nodes.
  • the voice guide information extraction means 30 determines whether the number of remaining extracted nodes with an importance of L is greater than the difference (SS(L)-A) of the extraction number SS(L) and the guide number A. The voice guide information extraction means 30 also determines whether it is possible to delete such nodes from the environs of the guide point until the extraction number SS(L) and the guide number A have the same value. When it is determined that it is possible to delete nodes from the environs of the guide point until the extraction number SS(L) and the guide number A have the same value, the nodes are deleted in a step ST 140 and the extraction number SS(L) is made equal to the guide number A.
  • When it is not possible, the voice guide information extraction means 30 deletes all remaining extracted nodes with an importance of L from the extracted nodes, links and related proximate facilities and updates the extraction number SS(L) by subtracting the number of deleted nodes.
  • the voice guide information extraction means 30 determines whether the number of links to which extracted nodes with an importance of L are not connected is greater than the difference (SS(L)-A) of the extraction number SS(L) and the guide number A.
  • the voice guide information extraction means 30 also determines whether it is possible to delete such links from the environs of the guide point until the extraction number SS(L) and the guide number A have the same value.
  • When it is determined that this is possible, the links are deleted in a step ST143 and the extraction number SS(L) is made equal to the guide number A.
  • When it is not possible, the voice guide information extraction means 30 deletes all such links from the extracted nodes, links and related proximate facilities and updates the extraction number SS(L) by subtracting the number of deleted links.
  • the voice guide information extraction means 30 deletes links near to the guide point from the remaining extracted links which have an importance of L and thus makes the extraction number SS(L) equal to the guide number A.
  • In such a way, nodes and the like are extracted in order of importance, and when the extraction number SS(L) is greater than the guide number A, nodes and the like are deleted from near the guide point until a number of nodes and the like equal to the guide number A remains near the present position. After the extraction number SS(L) and the guide number A are made equal, the process of step ST105 is completed.
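  • A condensed sketch of the extraction process of step ST105 follows. It gathers items level by level and trims surplus items of the last level from the guide-point side; the single trimming preference used below (facilities, then nodes, then links) is a simplification of steps ST133 to ST144, which distinguish further sub-cases.

```python
# Condensed sketch of the extraction process of step ST105 (FIG. 11).
def extract_guide_items(route_items, guide_number_a, reference_gl):
    """route_items: list of (kind, level, item) ordered from the present position,
    where kind is 'node', 'link' or 'facility' and level is the voice guide level."""
    selected = []
    level = 0
    while len(selected) < guide_number_a and level <= reference_gl:
        selected += [entry for entry in route_items if entry[1] == level]
        level += 1

    surplus = len(selected) - guide_number_a
    if surplus <= 0:
        return [item for (_, _, item) in selected]

    last_level = level - 1
    # Delete surplus items of the last extracted level, nearest the guide point first.
    for kind in ("facility", "node", "link"):
        for i in range(len(selected) - 1, -1, -1):
            if surplus == 0:
                break
            if selected[i][0] == kind and selected[i][1] == last_level:
                del selected[i]
                surplus -= 1
    return [item for (_, _, item) in selected]
```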
  • FIG. 13 is a flowchart showing the details of the process of generating a voice guide message for extracted links and nodes as well as the proximate facilities of links and nodes in step ST 106 of FIG. 10.
  • After the extracted links and nodes as well as the proximate facilities to links and nodes are supplied to the voice guide message generation means 27 through the control means 21 , in a step ST151 the voice guide message generation means 27 firstly arranges the data relating to the extracted links and nodes as well as the proximate facilities to links and nodes in order of proximity to the present position. At this time, the data relating to a node proximate facility is placed as if the facility were closer to the present position than the node itself, and the data relating to a link proximate facility is placed as if the facility were further from the present position than the link itself.
  • the voice guide message generation means 27 adds a group flag GF to data related to nodes, links and related proximate facilities comprising the route. Thereafter a value of 1 is assigned to the group flag GF of the mutually connected nodes and links, to nodes and related proximate facilities to that node, or to nodes, links and proximate facilities related to links and proximate facilities to that link. However, the group flag GF of data which is closest to the guide point in the group is set to a value of 0.
  • The voice guide message generation means 27 extracts voice waveform data corresponding to the extracted nodes, links and proximate facilities related to nodes and links from the voice information storage means 31 .
  • voice waveform data consists of the names of nodes, names of links, or the names of proximate facilities related to nodes and links.
  • the voice guide message generation means 27 takes out supplementary voice data corresponding to predicates from the voice information storage means 31 and generates a voice guide message containing the previously taken out voice waveform data for each name.
  • FIG. 14 shows an example of a set of supplementary voice data.
  • a voice guide message is generated by one of the following procedures by adding supplementary voice data to each name of nodes or the like in the stated order of step ST 151 and then stating the data sequentially.
  • When the route turns to the right at a node, supplementary voice data (make right turn.) of distinguishing number 301 is added after the voice waveform data of the node name when the group flag GF has a value of 0.
  • When the group flag GF has a value of 1, supplementary voice data (make right turn and . . . ) of distinguishing number 401 is added.
  • When the route turns to the left at a node, supplementary voice data (make left turn.) of distinguishing number 302 is added after the voice waveform data of the node name when the group flag GF has a value of 0.
  • When the group flag GF has a value of 1, supplementary voice data (make left turn and . . . ) of distinguishing number 402 is added.
  • When the route makes a U-turn at a node, supplementary voice data (make a U-turn.) of distinguishing number 304 is added after the voice waveform data of the node name when the group flag GF has a value of 0.
  • When the group flag GF has a value of 1, supplementary voice data (make a U-turn and . . . ) is added.
  • When the route crosses a link such as a bridge, supplementary voice data ( . . . cross.) of distinguishing number 305 is added after the voice waveform data of the link name when the group flag GF has a value of 0.
  • When the group flag GF has a value of 1, supplementary voice data ( . . . cross and . . . ) of distinguishing number 405 is added.
  • When the node is an entrance of a road such as an expressway, supplementary voice data (enter . . . ) of distinguishing number 306 is added after the voice waveform data of the node name when the group flag GF has a value of 0.
  • When the group flag GF has a value of 1, supplementary voice data (enter . . . and . . . ) of distinguishing number 406 is added.
  • When the node is an exit of a road such as an expressway, supplementary voice data (exit . . . ) of distinguishing number 307 is added after the voice waveform data of the node name when the group flag GF has a value of 0.
  • When the group flag GF has a value of 1, supplementary voice data (exit . . . and . . . ) of distinguishing number 407 is added.
  • When the route turns to the left at a node which has a node proximate facility, supplementary voice data (turn left at the mark.) of distinguishing number 310 is added after the voice waveform data of the node proximate facility name.
  • When the group flag GF has a value of 1, supplementary voice data ( . . . at the mark . . . ) of distinguishing number 409 is added.
  • When the route continues in the same direction at a node which has a node proximate facility, supplementary voice data (continue straight at the mark.) of distinguishing number 311 is added after the voice waveform data of the node proximate facility name.
  • When the group flag GF has a value of 1, supplementary voice data (continue straight at the mark and . . . ) is added.
  • the route search means 24 predetermines whether the route turns left or right or continues straight at each node or whether a link is a bridge.
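  • The message assembly of steps ST151 to ST153 can be sketched as follows. The phrase table mirrors a few of the FIG. 14 entries quoted above; the item names, the action labels and the exact joined wording in the example are illustrative assumptions.

```python
# Sketch of the voice guide message assembly: each extracted name is combined
# with a supplementary phrase chosen by the action at that item and by the group
# flag GF (1 joins the item to the next one, 0 closes the group).
SUPPLEMENTARY = {                                  # (action, group flag GF) -> phrase
    ("right_turn", 0): "make right turn.",         # distinguishing number 301
    ("right_turn", 1): "make right turn and ...",  # distinguishing number 401
    ("left_turn", 0): "make left turn.",           # distinguishing number 302
    ("left_turn", 1): "make left turn and ...",    # distinguishing number 402
    ("cross", 0): "... cross.",                    # distinguishing number 305
    ("cross", 1): "... cross and ...",             # distinguishing number 405
    ("enter", 0): "enter ...",                     # distinguishing number 306
    ("enter", 1): "enter ... and ...",             # distinguishing number 406
}

def build_voice_guide_message(items):
    """items: list of (name, action, group_flag) ordered from the present position."""
    parts = []
    for name, action, group_flag in items:
        template = SUPPLEMENTARY[(action, group_flag)]
        if "..." in template:
            parts.append(template.replace("...", name, 1))   # name fills the "..." slot
        else:
            parts.append(name + " " + template)              # phrase follows the name
    return " ".join(parts)

# Hypothetical example in the spirit of FIG. 15 and FIG. 16.
print(build_voice_guide_message([
    ("Route 1", "cross", 1),                 # link with GF = 1: joined to the next item
    ("Chuo intersection", "right_turn", 0),  # node closing the group
]))
```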
  • FIG. 15 shows data related to extracted nodes, links and proximate facilities of nodes and links obtained through the process shown in FIG. 11 with respect to the route shown in FIG. 8.
  • FIG. 16 shows a voice guide message generated based on data shown in FIG. 15.
  • the link L 001 shown in FIG. 15 is a single link made up of links L 001 , L 002 , L 003 in FIG. 8 by the process of step ST 127 shown in FIG. 11.
  • the link L 005 shown in FIG. 15 is a single link made up of links L 005 , L 006 , L 007 in FIG. 8.
  • The link proximate facility S 251 (not shown) belongs to the link L 002 in FIG. 8; however, after the conversion performed in step ST 127 it belongs to the link L 001.
  • In step ST 151, the link L 001, the link proximate facility S 251 and the node N 005 are made into a group 500, and the node N 006, the link L 005 and the node N 009 are made into a group 501.
  • The voice guide message shown in FIG. 16 is generated by steps ST 152, ST 153 in FIG. 13 with respect to the extracted nodes, links and proximate facilities of nodes and links.
  • As described above, according to the first embodiment, nodes and links are selected from amongst the nodes and links on a searched route based on their level of importance. Voice guide messages are generated with respect to the selected nodes and links, and these messages are used to guide the route by voice. Thus it is possible to guide an entire route appropriately in a short time by voice.
  • In a second embodiment, the navigation device summarizes the nodes, links and related proximate facilities on the voice-guided route based on a guide time pre-set by a user, instead of summarizing on the basis of the level of importance and a predetermined guide number of nodes, links and related proximate facilities. That is to say, this is a variation on the process (FIG. 11) of step ST 105 in FIG. 10 of the navigation device according to the first embodiment.
  • In other respects the present embodiment is the same as the first embodiment, and such description will be omitted.
  • the control means 21 displays a menu on the display means 26 for all types of settings for route voice guiding.
  • FIG. 17 shows a display example of a menu for displaying each type of setting for route voice guiding according to embodiment 2.
  • the menu shown in FIG. 17 comprises a guide point selection term 601 which sets the execution of route voice guiding from a present position to a given geographical point and a guide time selection term 602 which sets a guide time for voice guiding of the route from a present position to a guide point.
  • the guide point selection term 601 contains the options “destination” and “detour”. The selection of the options is executed by a user operating the operational means 29 . When “destination” is selected, route voice guiding from a present position to a destination is executed. When “detour” is set, route voice guiding from a present position to a predetermined detour point is executed.
  • the guide point selection term 601 in FIG. 17 has one option “detour”. However, a plurality of detours may be selected as options or the user may add “selectable geographic points” as options to select a final geographic point for voice guiding.
  • the guide time selection term 602 has the options “short”, “middle” and “long”. The selection of these options is executed by a user operating the operational means 29 .
  • When "short" is selected, voice guiding is performed for approximately 15 seconds. When "middle" is selected, voice guiding is performed for approximately 30 seconds. When "long" is selected, voice guiding is performed for approximately 1 minute. The user may also directly specify a time limit for voice guiding as a numerical value.
  • FIG. 18 is a flowchart showing the details of the process of extracting links and nodes as well as the proximate facilities of links and nodes in a second embodiment.
  • The voice guiding information extraction means 30 reads information relating to the guide time term and the guide point term set by the user from the control means 21, stores the node number of the selected geographical point based on the information relating to the guide point term, and sets the selected guide time as a reference value B for the guide time based on the information relating to the guide time term.
  • The voice guiding information extraction means 30 sets an extraction level L, which shows the importance of extracted nodes, links and related proximate facilities (in FIG. 6, the node voice guiding level 126, the link voice guiding level 165 and the facility voice guiding levels 154, 174), to a value of 0.
  • When the extraction level L is 0, the nodes, links and related facilities with an extraction level equal to or less than 0 are extracted. As described below, the value of the extraction level L is sequentially incremented by values of 1. Thus, the lower the value of the extraction level L, the more important the extracted information.
  • The voice guiding information extraction means 30 sets an extraction number SS(L), which shows the total number of nodes, links and related facilities with an importance equal to or less than the extraction level L, to an initial value of 0. An initial value of 0 is also set for a total guide time ST(L), which is the time required to voice guide the nodes, links and related facilities with a level of importance less than or equal to the extraction level L.
  • In a step ST 203, the voice guiding information extraction means 30 selects and extracts, based on the node and link information comprising the route stored in the route storage means 25, those nodes, links and related proximate facilities from the present position to the guide point whose level of importance (the node voice guiding level 126, link voice guiding level 165 and facility voice guiding levels 154, 174 shown in FIG. 6) is equal to the extraction level L.
  • The voice guiding information extraction means 30 then converts two adjacent links with the same link name number into one link. The link number, the start intersection number and the link attribute of the link after conversion are assigned from that one of the two original links which is nearer to the present position; the finish intersection number is assigned from that one which is nearer to the guide point; and the link length of the link after conversion is the sum of the link lengths of the two original links. A minimal sketch of this conversion is given below.
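  • The merging rule just described might look roughly as follows; this is a hedged sketch with assumed field names, not the patent's data layout.

    # Minimal sketch of merging two adjacent links with the same link name number.
    # `first` is the original link nearer the present position, `second` the one
    # nearer the guide point; field names are illustrative.
    from dataclasses import dataclass, replace

    @dataclass
    class Link:
        link_number: int
        link_name_number: int
        start_intersection: int
        finish_intersection: int
        link_length: float
        link_attribute: str

    def merge_adjacent(first: Link, second: Link) -> Link:
        assert first.link_name_number == second.link_name_number
        return replace(
            first,  # keeps the number, start intersection and attribute of `first`
            finish_intersection=second.finish_intersection,       # from `second`
            link_length=first.link_length + second.link_length,   # lengths summed
        )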
  • In step ST 205, the voice guiding information extraction means 30 updates the extraction number SS(L) by the sum of the number S(L) of nodes, links and related proximate facilities extracted in the current step ST 203 and the extraction number SS(L-1) for the extraction level smaller by a value of 1. Further, it updates the total guiding time ST(L) by the sum of the guiding time T(L) required for the nodes, links and related proximate facilities extracted in the current step ST 203 and the total guiding time ST(L-1) for the extraction level smaller by a value of 1.
  • The guiding time T(L) required for the extracted nodes, links and related proximate facilities is the total of the node voice guiding time 127, the facility voice guiding time 155, the link voice guiding time 166 and the facility voice guiding time 175 in the map information data shown in FIG. 6.
  • The voice guiding information extraction means 30 then determines whether or not the sum of the total guiding time ST(L) and twice the extraction number SS(L), that is, (ST(L)+SS(L) x 2), is less than or equal to the reference value B above.
  • The quantity (ST(L)+SS(L) x 2) is compared with the reference value B for the following reason. The total guide time ST(L) is the total of the voice playing times for the names of the extracted nodes and the like, and the voice playing time required for the supplementary voice data referred to above, which is added to each node or the like, is on average two seconds. The playing time of the voice guide message up to the extraction level L is therefore estimated as (ST(L)+SS(L) x 2).
  • More generally, (ST(L)+SS(L) x Ts) may be compared with the reference value B, where Ts is an average playing time for the supplementary voice data chosen in response to the length of the supplementary voice data. A minimal sketch of this comparison is given below.
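  • The check can be summarized by the short sketch below; the function names and the adjustable average supplementary-data time Ts are assumptions made for illustration.

    # Estimated playing time of the voice guide message up to extraction level L:
    # the name playing times ST(L) plus one supplementary phrase (about Ts seconds
    # on average) per extracted item SS(L).
    def estimated_playing_time(st_l: float, ss_l: int, ts: float = 2.0) -> float:
        return st_l + ss_l * ts

    def fits_guide_time(st_l: float, ss_l: int, reference_b: float,
                        ts: float = 2.0) -> bool:
        return estimated_playing_time(st_l, ss_l, ts) <= reference_b

    # e.g. ST(L) = 18 s of names and SS(L) = 6 items gives 18 + 6*2 = 30 s,
    # which just fits a "middle" guide time of about 30 seconds.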
  • The voice guiding information extraction means 30 then determines whether or not the voice guide message playing time up to the extraction level L is the same as the reference value B. When both are the same, it is determined that the nodes, links and related proximate facilities fitting the set guide time have been extracted, and the routine is completed.
  • When the voice guide message playing time up to the extraction level L exceeds the reference value B, the voice guiding information extraction means 30 deletes, by the following process, nodes, links and related facilities with a level of importance of L until the voice guide message playing time up to the extraction level L, (ST(L)+SS(L) x 2), is less than or equal to the reference value B.
  • First, the voice guiding information extraction means 30 determines whether or not the sum of the facility voice guiding times for the proximate facilities with an extracted importance of L is greater than the difference between the voice guide message playing time up to the extraction level L, (ST(L)+SS(L) x 2), and the reference value B, that is, whether or not it is possible to bring the voice guide message playing time to less than or equal to the reference value B by deleting only proximate facilities, starting from those near the guide point.
  • When it is determined that this is possible, in a step ST 210 such proximate facilities are deleted, starting from those nearest the guide point, until the voice guide message playing time is less than or equal to the reference value B.
  • When it is not possible, the voice guiding information extraction means 30 deletes all proximate facilities from the extracted nodes, links and related proximate facilities, reduces the total guide time ST(L) by the sum of the facility voice guide times for those proximate facilities and updates the value. The value of the extraction number SS(L) is likewise updated by reducing it by the total number of the proximate facilities.
  • Next, the voice guiding information extraction means 30 determines whether or not the sum of the node voice guiding times for nodes with an extracted importance of L to which links are not connected is greater than the difference between the voice guide message playing time up to the extraction level L and the reference value B, that is, whether or not it is possible to bring the voice guide message playing time to less than or equal to the reference value B by deleting such nodes, starting from those near the guide point.
  • When it is determined that this is possible, in a step ST 213 such nodes are deleted, starting from those nearest the guide point, until the voice guide message playing time is less than or equal to the reference value B.
  • When it is not possible, the voice guiding information extraction means 30 deletes all such nodes from the extracted nodes, links and related proximate facilities, reduces the total guide time ST(L) by the sum of the node voice guide times for such nodes and updates the value. The value of the extraction number SS(L) is likewise updated by reducing it by the total number of such nodes.
  • Next, the voice guiding information extraction means 30 determines whether or not the sum of the node voice guiding times for the remaining nodes with an extracted importance of L is greater than or equal to the difference between the voice guide message playing time up to the extraction level L and the reference value B, that is, whether or not it is possible to bring the voice guide message playing time to less than or equal to the reference value B by deleting such nodes, starting from those near the guide point. When it is determined that this is possible, in a step ST 216 such nodes are deleted, starting from those nearest the guide point, until the voice guide message playing time is less than or equal to the reference value B.
  • When it is not possible, the voice guiding information extraction means 30 deletes all remaining nodes with an extracted importance of L from the extracted nodes, links and related proximate facilities. The value of the total guide time ST(L) is reduced by the sum of the node voice guiding times for such nodes and updated, and the value of the extraction number SS(L) is updated by reducing it by the total number of such nodes.
  • Next, the voice guiding information extraction means 30 determines whether or not, of the links with an extracted importance of L, the sum of the link voice guiding times for links to which nodes are not connected is greater than the difference between the voice guide message playing time up to the extraction level L and the reference value B, that is, whether or not it is possible to bring the voice guide message playing time to less than or equal to the reference value B by deleting such links, starting from those near the guide point. When it is determined that this is possible, in a step ST 219 such links are deleted, starting from those nearest the guide point, until the voice guide message playing time is less than or equal to the reference value B.
  • When it is not possible, the voice guiding information extraction means 30 deletes all such links from the extracted nodes, links and related proximate facilities. The value of the total guide time ST(L) is reduced by the sum of the link voice guiding times for such links and updated, and the value of the extraction number SS(L) is updated by reducing it by the total number of such links.
  • Finally, the voice guiding information extraction means 30 deletes links near the guide point from the remaining links with an extracted importance of L until the voice guide message playing time is less than or equal to the reference value B. A minimal sketch of this deletion order is given below.
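  • The pruning order described above can be sketched as follows; the item categories, field names and distance measure are assumptions made for illustration, and the budget test reuses the (ST(L)+SS(L) x Ts) estimate above.

    # Prune extracted items of importance L, category by category and nearest the
    # guide point first, until the estimated playing time fits the reference B.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class GuideItem:
        name: str
        category: str               # "facility", "isolated_node", "node",
                                    # "isolated_link" or "link"
        voice_time_s: float         # name playing time from the map data
        dist_to_guide_pt_m: float   # used to delete near-guide-point items first
        importance: int

    PRUNE_ORDER = ["facility", "isolated_node", "node", "isolated_link", "link"]

    def prune_to_budget(items: List[GuideItem], level: int,
                        reference_b: float, ts: float = 2.0) -> List[GuideItem]:
        kept = list(items)

        def playing_time(xs):
            return sum(i.voice_time_s for i in xs) + len(xs) * ts

        for category in PRUNE_ORDER:
            candidates = sorted(
                (i for i in kept
                 if i.importance == level and i.category == category),
                key=lambda i: i.dist_to_guide_pt_m)
            for item in candidates:
                if playing_time(kept) <= reference_b:
                    return kept
                kept.remove(item)
        return kept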
  • In this way, nodes, links and the like are selected from amongst the nodes and links with the same level of importance, preferring those near the present position, so that the playing time of the voice guide message is less than or equal to a predetermined reference value.
  • The present invention is not limited to embodiments 1 and 2 above and may be implemented in other embodiments.
  • For example, the method of extracting guide points may vary the extraction conditions for nodes or links.
  • The voice guide message generation means 27 may generate messages which guide the position of a facility, a time or a distance, apart from the examples discussed above. Furthermore, the invention may be adapted to generate a voice guide message by combining simple phrases through the insertion of conjunctions between the phrases.
  • Furthermore, when the route from a present position to a guide point is displayed on the same screen, the positions of those nodes, links and related proximate facilities which correspond to the voice-guided information may be displayed in a different color from other parts or may be displayed blinking.
  • the present invention is adapted for use in a navigation device in which the level of importance of each node and link is stored as a part of map information in a map information storage means. Nodes and links from amongst the nodes and links on the searched route are selected on the basis of the level of importance and a voice guide message is generated which corresponds to the selected nodes and links. Thus, it is possible to guide an entire route appropriately in a short time by voice and it is possible for a user to easily arrive at a destination.

Abstract

A navigation device includes a map information storage means for storing map information such as nodes, links and the like, a present position detection means for detecting a present position of a moving body, a route searching means for searching a route from a present position to a guide point based on the map information, a voice guide message generation means for generating a voice guide message corresponding to the searched route, and a voice output means for outputting the voice guide message. The navigation device is adapted to store a level of importance of nodes and links as a part of said map information, and the voice guide message generation means selects links and nodes on the searched route based on the level of importance and generates a voice guide message corresponding to the selected nodes and links.

Description

    CROSS-REFERENCE TO THE RELATED APPLICATION
  • This application is a continuation of international Application No. PCT/JP99/02748, whose international filing date is May 25, 1999, the disclosures of which Application are incorporated by reference herein. [0001]
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0002]
  • The present invention relates to a navigation device which is mounted in a moving body and which guides a route from a present position to a destination. [0003]
  • 2. Description of Related Art [0004]
  • Navigation devices which guide a route from a current position of an automobile to a destination while driving have been widely applied. In this type of navigation device, when a destination is set by a user, the device searches a preferred route from the current position to the destination. When the searched route is presented to a user, it is common to present the route in an “entire route format” in which a route from a geographical position, at which a destination is set, to the destination is displayed on the same screen by switching a map scale. Furthermore, a method is employed of displaying more detailed information about the route by scrolling the map display automatically or manually along the route by switching to a detailed map display with a large map scale ratio. In addition, a method of sequential display of guiding maps of intersections on the route forward of a current position, or a method of displaying the route schematically by main branching points on the display device are also known in the art. Furthermore, it is possible to execute the guiding operation by use of voice commands. [0005]
  • FIG. 1 is a block diagram showing a first conventional navigation device as disclosed in JP-A-5-297800. In the figure, [0006] reference numeral 1 denotes a touch switch for inputting a destination or the like, 2 is a vehicle speed sensor for detecting a vehicle speed, 3 is a bearing sensor for detecting a bearing, 4 is an external storage device for storing information displaying classes of roads connecting branching points or information displaying classes of branching points of roads and pre-stored map information data. 5 is a display device for displaying information regarding main points such as branching points which are on a route from the present position to the destination. 6 is a control device which searches a travel route of the vehicle based on an input signal from the bearing sensor 3 and the vehicle speed sensor 2 and an input signal from the switch 1 and which displays main branching points on the travel route in summary form on the display device 5.
  • The operation of the first conventional navigation device will be described below. [0007]
  • Firstly, route searching is executed. The route searching is executed according to a Dijkstra method. Of the routes connecting the present position and the destination, the device searches a route passing along main roads in which there are few right or left turns and in which the names of intersections at which turns are made are known. The search is made on the basis of the map information data stored in the [0008] external storage device 4.
  • After the route is searched, the number of nodes contained on the route is counted and it is determined whether the number of nodes is less than or equal to [0009] 10. When the number of nodes is less than or equal to 10, the display process is executed to display the searched route.
  • On the other hand, when there are more than 10 nodes, a summarizing process is applied to the nodes contained on the route in which they are summarized based on the informational value of each node. The summarizing process entails deleting those nodes of low information value until the number of nodes contained on the route is less than or equal to 10. Thereafter, display processing is executed. Herein information value is a product of a turning coefficient, a name presence/absence coefficient and a node category coefficient. The node category coefficient is a fixed coefficient corresponding to categories of nodes such as expressway entrance/exit, tollway entrance/exit or national road intersection. The node name presence/absence coefficient is a fixed coefficient which corresponds to the presence or absence of a name of the node. The turning coefficient is a fixed coefficient which corresponds to the presence or absence of left or right turns. A simple sketch of this summarizing process is given below. [0010]
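  • The following is a hedged Python sketch of this conventional summarizing process; the coefficient values are illustrative placeholders, since the text only states that they are fixed coefficients.

    # Informational value of a node as the product of a turning coefficient, a
    # name presence/absence coefficient and a node category coefficient, and a
    # summarizing step that deletes low-value nodes until at most 10 remain.
    from typing import Dict, List

    def informational_value(turns: bool, has_name: bool, category_coeff: float) -> float:
        turning_coeff = 2.0 if turns else 1.0        # illustrative values
        name_coeff = 1.5 if has_name else 1.0
        return turning_coeff * name_coeff * category_coeff

    def summarize(nodes: List[Dict], max_nodes: int = 10) -> List[Dict]:
        """Delete nodes of low informational value, keeping route order."""
        nodes = list(nodes)
        while len(nodes) > max_nodes:
            nodes.remove(min(nodes, key=lambda n: n["value"]))
        return nodes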
  • In the display process, the name of the destination and present position, the name of main branching points on the route from the present position to the destination, the distance between each branching point, representative place names which indicate the direction of travel, and the names of roads which should be taken at branching points are displayed on the [0011] display device 5.
  • FIG. 2 is a block diagram of a second conventional navigation device as disclosed in International Publication WO98/51995. In the figure, [0012] reference numeral 10 denotes a control means which performs control of the overall device and each type of calculation in the navigation device. 11 is a map information storage means which stores digitized map information data such as intersection data, road data and the like. 12 is a present position detection means which detects a present position of the moving body in which the navigation device is mounted.
  • [0013] 13 is a route setting means which sets a route between two points on a map on the basis of map information data stored in the map information storage means 11. 14 is a guide object intersection detection means which detects a guide object intersection to be guided on the route set by the route setting means 13. 15 is a quantizing calculation means which quantizes the route of the moving body onto a schematic map displaying the characteristic features of the route.
  • [0014] 16 is a display means which displays a route quantized by the quantizing calculation means with respect to a guide object intersection detected by the guide object intersection detection means 14. 17 is a voice guide message generation means having a voice information storage means (not shown) which stores necessary words or phrases for guide messages as voice wave form data. The voice guide message generation means selects voice wave form data such as words or phrases for guide messages and generates such combinations as guide messages when a quantized route of the moving body is displayed on the display means 16. 18 is a voice output means which notifies a user by voice commands of guide messages generated by the voice guide message generation means.
  • The operation of the second conventional navigation device will be described below. [0015]
  • FIG. 3 is a flowchart explaining the operation of the second conventional navigation device shown in FIG. 2. [0016]
  • Firstly, in a step ST[0017] 1, the route setting means 13 sets two points on the map on the basis of latitude and longitude from the map information data read from the map information storage means 11 and then sets a route between the two points using a general search algorithm on a network such as a Dijkstra method or the like.
  • Then, in a step ST[0018] 2, the present position detection means 12 detects a present position (C1) of the moving body. In a step ST3, the flags FL1, FL2, FL3 are respectively initialized to 0.
  • Then, in a step ST[0019] 4, the guide object intersection detection means 14 extracts an intersection with, for example, more than three roads being connected to the intersection as a forward guide object intersection (C2). The intersection is an intersection on the route set by the route setting means 13 and, of the two geographical points set by the route setting means, the intersection is further forward than present position (C1) detected by the present position detection means 12.
  • In the step ST[0020] 5, the detection of the present position (C1) of the moving body is performed again by the present position detection means 12 and in a step ST6, a road distance (L1) between the present position (C1) of the moving body and the forward guide object intersection (C2) is calculated on the basis of map information data read from the map information storage means 11.
  • Then, in a step ST[0021] 7, further processing operations are selected in response to this distance (L1).
  • When the distance (L[0022] 1) is greater than a predetermined reference distance (L2) (for example 1000 meters), the routine progresses to step ST8 and it is determined whether the flag FL1 has a value of 0 or not. When the value of the flag is 0, in a step ST9, a guide output A which is related to the forward guide object intersection (C2) is executed. The guide output A comprises extracting only the section to the forward guiding object intersection (C2) extracted by the quantizing calculation means 15 of the road on the route on the map. Then, the result is quantized to a simple arrow shape and a display map related to the forward guide object intersection (C2) is displayed on the display means 16. A guide voice message related to the forward guide object intersection (C2) is generated by the voice guide message generation means 17 and the message is reported by voice commands from the voice output means 18. After the execution of the guide output A, in a step ST10, the flag FL1 is varied to a value of 1 and the fact that the guide output A in relation to the forward guide object intersection (C2) has been executed is stored.
  • Thereafter, in a step ST[0023] 11, it is determined whether or not the process of setting the route by the route setting means 13 is completed. When it is completed, the guide process is terminated. When it is not completed, the routine returns to a step ST5 and executes the steps of the routine after step ST6.
  • When the flag FL[0024] 1 does not have a value of 0 in step ST8, since the guide output A has already been executed, the routine returns to a step ST5.
  • In a step ST[0025] 7, when the distance (L1) is less than or equal to the reference value (L2) and greater than the predetermined reference value (L3) (for example 300 meters), the routine progresses to a step ST12, and it is determined whether or not the flag FL2 has a value of 0. When the flag FL2 has a value of 0, in the step ST13, a guide output B related to the forward guide object intersection (C2) is executed. The guide output B comprises extracting only the route section connecting the forward guiding object intersection (C2) of the road on the map extracted by the quantizing calculation means 15. Then, the result is quantized to a simple arrow shape and a display map related to the forward guide object intersection (C2) is displayed on the display means 16. A guide voice message related to the forward guide object intersection (C2) is generated by the voice guide message generation means 17 and the message is reported by voice commands from the voice output means 18. After the execution of the guide output B, in a step ST14, the flag FL2 is varied to a value of 1 and the fact that the guide output B in relation to the forward guide object intersection (C2) has been executed is stored.
  • Thereafter, in a step ST[0026] 11, it is determined whether or not the process of setting the route by the route setting means 13 is completed. When it is completed, the guide process is terminated. When it is not completed, the routine returns to a step ST5 and executes the steps of the routine after step ST6.
  • When the flag FL[0027] 2 does not have a value of 0 in step ST12, since the guide output has already been executed, the routine returns to a step ST5.
  • When, in a step ST[0028] 7, the distance (L1) has a value less than the reference value (L3), the routine progresses to a step ST15 and it is determined whether or not the flag FL3 has a value of 0 or not. When the value of the flag is 0, in a step ST16, the guide output C related to the forward guide object intersection (C2) is executed. The guide output C comprises extracting the present position of the moving body, roads other than roads on the route, route roads connected to the forward guiding object intersection (C2) and the forward guiding object intersection (C2) extracted by the quantizing calculation means 15 on the map. Then, the result is quantized to a simple arrow shape and a display map related to the forward guide object intersection (C2) is displayed on the display means 16. A guide voice message related to the forward guide object intersection (C2) is generated by the voice guide message generation means 17 and the message is reported by voice commands from the voice output means 18. After the execution of the guide output C, in a step ST17, the flag FL3 is varied to a value of 1 and the fact that the guide output C in relation to the forward guide object intersection (C2) has been executed is stored.
  • Thereafter, in a step ST[0029] 11, it is determined whether or not the process of setting the route by the route setting means 13 is completed. When it is completed, the guide process is terminated. When it is not completed, the routine returns to a step ST5 and executes the steps of the routine after step ST6.
  • When the flag FL[0030] 3 does not have a value of 0 in step ST15, since the guide output C has already been executed, the routine returns to a step ST3, and the flags FL1, FL2, FL3 are initialized to 0. In a step ST4, the forward guide object intersection (C2) is extracted.
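  • The distance-dependent choice between the guide outputs A, B and C described above can be summarized by the following sketch; the function and flag names are illustrative, and the thresholds are the example values (1000 meters and 300 meters) given in the text.

    # Choose between guide outputs A, B and C from the distance L1 to the forward
    # guide object intersection (C2); each output is issued only once per
    # intersection, tracked by the flags FL1-FL3.
    from typing import Dict, Optional

    L2_M = 1000.0   # reference distance L2
    L3_M = 300.0    # reference distance L3

    def select_guide_output(l1_m: float, flags: Dict[str, bool]) -> Optional[str]:
        if l1_m > L2_M and not flags["FL1"]:
            flags["FL1"] = True
            return "A"                       # far from the intersection
        if L3_M < l1_m <= L2_M and not flags["FL2"]:
            flags["FL2"] = True
            return "B"                       # intermediate range
        if l1_m < L3_M and not flags["FL3"]:
            flags["FL3"] = True
            return "C"                       # close to the intersection
        return None

    # flags = {"FL1": False, "FL2": False, "FL3": False}, updated repeatedly as the
    # present position (C1) is re-detected while approaching the intersection.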
  • Since the conventional navigation device is constructed as above, the problem has arisen that safe operation of the vehicle can be affected, as it is necessary for a user to monitor the guide display even when display nodes are displayed in summary form by a summarizing process. Furthermore, the number of guiding nodes provided to guide the entire route by voice commands increases, and the problem has arisen that the device cannot be adapted to guide the entire route appropriately in a short time. [0031]
  • Furthermore, when the number of guide nodes is reduced to a predetermined number of summarized nodes in a conventional navigation device, the problem has arisen that it is difficult to make the guide nodes correspond to a predetermined number of nodes when a plurality of nodes with the same informational value exists. [0032]
  • SUMMARY OF THE INVENTION
  • The present invention is proposed to solve the above problems and has the object of providing a navigation device adapted to store the level of importance of nodes and links and select nodes and links on the searched route based on the level of importance or time for playing voice information. In the navigation device, a voice guide message is generated corresponding to the selected nodes and links and guiding of the route is executed by the voice guide message. Thus, it is possible to guide an entire route appropriately in a short time by voice commands. [0033]
  • The present invention has the further object of providing a navigation device in which when a plurality of nodes and links with the same importance exists and the number of such nodes and links is not equal to a predetermined reference number, nodes and links in proximity to the present position, the number of which corresponds with the predetermined reference number, is selected from nodes and links with the same importance, thereby to make the number of guide nodes and the like accurately correspond with the predetermined reference number. [0034]
  • A navigation device of the present invention is adapted to store a level of importance of each node and link as a part of map information in a map information storage means, to select the nodes and links on the searched route based on the level of importance and to generate a voice guide message corresponding to the selected links and nodes. In this way, it is possible to guide an entire route appropriately in a short period of time by voice commands. [0035]
  • The navigation device of the present invention may be adapted to select the nodes and links on the searched route in such a manner that the level of importance of the selected nodes and links is less than or equal to a predetermined reference level and the number of the selected nodes and links is less than or equal to a predetermined reference number, and to generate a voice guide message corresponding to the selected nodes and links. In such a way, even when the number of nodes and links with a high level of importance is less than a predetermined reference number, the voice guiding with respect to the nodes and links having low level of importance is not executed. Thus, it is possible to guide the entire route appropriately. [0036]
  • The navigation device of the present invention may be adapted to delete nodes and links, which are located in proximity to the guide point, from the nodes and links having the same level of importance and to make the number of the selected nodes and links equal to a predetermined reference number, when there exists a plurality of nodes and links with the same level of importance and the number of the selected nodes and links is not equal to the predetermined reference number. In this way, it is possible to make the number of nodes and links to be guided correspond accurately with the predetermined reference number. [0037]
  • The navigation device of the present invention may be provided with a reference value setting means for setting a predetermined reference value and a reference number setting means for setting a predetermined reference number. In this way, it is possible to provide a voice guide with the desired amount and level of importance. [0038]
  • The navigation device of the present invention may be adapted to store information about voice playing times relating to names of each link and node as a part of map information in the map information storage means, to select the nodes and links on the searched route in order of highest importance in such a manner that the voice playing time for the voice guide message is less than or equal to a predetermined reference value, and to generate the voice guide message corresponding to the selected nodes and links. In such a way, it is possible to accurately keep the time taken for voice guiding under a predetermined reference value. [0039]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram showing a first conventional navigation device. [0040]
  • FIG. 2 is a block diagram showing a second conventional navigation device. [0041]
  • FIG. 3 is a flowchart explaining the operation of a second conventional navigation device. [0042]
  • FIG. 4 is a block diagram showing a construction of a navigation device according to a first embodiment of the present invention. [0043]
  • FIG. 5 is a block diagram showing a construction of the hardware in the navigation device shown in FIG. 4. [0044]
  • FIG. 6 shows an example of map information data stored in a map information storage means. [0045]
  • FIG. 7 shows an example of a menu and a map displayed on a display means. [0046]
  • FIG. 8 shows an example of a route determined by a route searching means. [0047]
  • FIG. 9 shows an example of a menu for each setting category of route voice guides according to a first embodiment. [0048]
  • FIG. 10 is a flowchart of the operation of each section in route voice guide processing. [0049]
  • FIG. 11 is a flowchart showing the details of the process of extracting links and nodes as well as the proximate facilities to links and nodes in step ST[0050] 105 of FIG. 10.
  • FIG. 12 shows the relative relationship of guide number A and distance X from a present position to a guide point obtained by this formula. [0051]
  • FIG. 13 is a flowchart showing the details of the process of generating a voice guide message for extracted links and nodes as well as the proximate facilities to links and nodes in step ST[0052] 106 of FIG. 10.
  • FIG. 14 is an example of a set of supplementary voice data. [0053]
  • FIG. 15 shows data related to extracted links and nodes as well as the proximate facilities to links and nodes in the process shown in FIG. 11 with respect to the route shown in FIG. 8. [0054]
  • FIG. 16 shows a voice guide message generated based on the data shown in FIG. 15. [0055]
  • FIG. 17 shows a display example of a menu for each category of setting of route voice guides according to a second embodiment. [0056]
  • FIG. 18 is a flowchart showing the details of the process of extracting links and nodes as well as the proximate facilities to links and nodes in a second embodiment. [0057]
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • In order to describe the invention in greater detail, the preferred embodiments will be outlined below with reference to the accompanying figures. [0058]
  • [0059] Embodiment 1
  • FIG. 4 is a block diagram showing a construction of a navigation device according to a first embodiment of the present invention. FIG. 5 is a block diagram showing a construction of the hardware in the navigation device shown in FIG. 4. [0060]
  • In FIG. 4, [0061] reference numeral 21 denotes a control means which executes each type of calculation in the navigation device and controls other constitutive elements. 22 is a map information storage means which pre-stores digitized map information data such as node data and link data displaying intersection points and roads. 23 is a present position detection means which detects a present position of a moving body in which the navigation device is mounted.
  • [0062] 24 is a route search means which reads map information data stored in the map information storage means 22, which searches a route between two geographic points in a map on the basis of map information data for example on the basis of a Dijkstra method and which determines a single route. 25 is a route storage means which stores a route determined by the route search means 24.
  • [0063] 26 is a display means which displays a route and the like stored in the route storage means 25 and a map based on map information data stored in the map information storage means 22.
  • [0064] 27 is a voice guide message generation means which has a voice information storage means 31 which pre-stores voice waveform data such as words and phrases required for voice guide messages. Voice waveform data such as words and phrases constituting voice guide messages are selected when performing voice guiding and voice guide messages are generated by combining selected voice waveform data. 28 is a voice output means which outputs voice corresponding to voice guide messages generated by the voice guide message generation means 27 and which reports the guide message to a user.
  • [0065] 29 is an operation means which is operated when commands are input into the navigation device by a user and which supplies input user commands to the control means 21. 30 is a voice guide information extraction means which extracts main guide information from guide information on a route stored in the route storage means 25.
  • In FIG. 5, 51 is a CD-ROM storing digitized map information and a read-out device thereof which correspond to the map information storage means [0066] 22 shown in FIG. 4.
  • [0067] 52 is a GPS receiver which receives electromagnetic waves from an artificial satellite using a geo-positioning system (GPS) and which outputs a present position of the moving body in which a navigation device is mounted. 53 is a bearing sensor which detects a bearing in which the moving body is directed. 54 is a distance sensor which detects a movement distance of the moving body. These components correspond to the present position detection means 23 shown in FIG. 4.
  • [0068] 55 is a display device which has for example a liquid crystal display and which displays map information, maps based on map information data, determined routes and the like. The display device corresponds to the display means 26 shown in FIG. 4. 56 is a voice output means which outputs voice guide messages. It corresponds to the voice output means shown in FIG. 4. 57 is an input device which has a switch operated when commands are input into the navigation device by a user and which supplies input user commands to a control unit 58. The input device 57 corresponds to the operation means shown in FIG. 4.
  • [0069] 58 is a control unit provided with a central processing unit (CPU) 61, a read only memory (ROM) 62, a random access memory (RAM) 63, a display control section 64 and an input/output control section 65. The control unit 58 calculates each type of calculation in the navigation device and executes control of other constitutive components. The control unit corresponds to the control means 21, the route search means 24, the route storage means 25, the voice guide message generation means 27 and the voice guide information extraction means 30 shown in FIG. 4.
  • In the [0070] control unit 58, 61 is a CPU which executes processing of route searching and guide point extraction. 62 is a ROM which pre-stores data, programs and the like used by the CPU 61. 63 is a RAM into which map information data and programs used by the CPU 61 are loaded and which stores the calculation results of the CPU 61. 64 is a display control section which controls the display device 55 and which displays each type of image on the display device 55. 65 is an input/output control section which executes transfer of signals and each type of data by acting as an interface between the control unit 58 and each type of external device (CD-ROM and read-out device 51 to input device 57).
  • FIG. 6 shows an example of map information data stored in the map information storage means [0071] 22.
  • The map information data comprises a [0072] node data group 110 being the set of data related to nodes and a link data group 130 being the set of data related to links.
  • The [0073] node data group 110 is comprised by a node data record 120 which comprises each type of data related to each node. Each node data record 120 has a node number 121 which shows a distinguishing number which is uniquely assigned to a node which corresponds to the node data record 120, a node coordinate 122 which shows latitude and longitude of a position of a node on the map, a connecting link number 123 which shows the number of links connecting the node, a link number 124 of each link connected to the node, a node name 125 which is the name of the node, and a proximate facility data record 150 which is the set of data related to the proximate facilities which exist in the periphery of the node. A node voice guide level 126 which shows a level of importance of the referred node when performing voice guiding of a route containing the node and a node voice guide time 127 which shows a voice playing time required for voice guiding of the node are also provided.
  • The proximate [0074] facility data record 150 has a facility name 151 which shows the name of facilities in the environs of each node, a facility number 152 which displays a distinguishing number which is uniquely assigned to each facility, and a facility position 153 which shows the position of the facility which corresponds to the node. A facility voice guide level 154 which shows a level of importance of the referred facility when performing voice guiding of a route containing a node and a facility voice guide time 155 which shows a voice data playing time required for voice guiding of the facility are also provided.
  • The [0075] link data group 130 is comprised by link data record 140 which comprises each type of data related to each link. Furthermore each link data record 140 comprises a link number 141 which shows a distinguishing number which is uniquely assigned to a link which corresponds to link data records 140, a start node number 142 which shows a node connected to a start side of a link, a finish node number 143 which shows a node connected to a finish side of a link, a link length 144 which shows the length of a link, and a link attribute data record 160 which is the set of each type of data related to a link attribute.
  • The link [0076] attribute data record 160 has a link category 161, a flow regulation information 162 which shows flow regulation of a road which corresponds to the link, a link name 163 which shows a name of a link, a link name number 164 which shows a distinguishing number which is uniquely assigned to the name of a link, a link voice guide level 165 which shows the importance of a link referred to when performing voice guiding of a route containing the link, and a link voice guide time 166 which shows a voice data playing time required for voice guiding of the link. Furthermore, a link proximate facility data record 170 is provided which is a set of data related to facilities in the proximity of the link.
  • The link proximate [0077] facility data record 170 comprises a facility name 171 which shows a facility name, a facility number 172 which shows a distinguishing number which is uniquely assigned to the facility, a facility position 173 which shows a position of facilities with respect to the link, and a facility voice guide level 174 which shows the importance of a facility referred to when performing voice guiding of a route containing a link, and a facility voice guide time 175 which shows a voice data playing time required for voice guiding of the facility.
  • The level of importance shown by the facility [0078] voice guide level 174 can be determined by consideration of the level of reputation of the facility in the general community or with reference to a standard determined nationally. Alternatively they may be determined by any other standard.
  • In this way, a node [0079] voice guide level 126 which shows a level of importance of each individual node and a corresponding node voice guide time 127 which shows the time required for voice guiding the node, as well as a facility voice guide level 154 which shows a level of importance of each individual facility and a facility voice guide time 155 which shows the time required for voice guiding the facility, are contained in the map information data. Furthermore, a link voice guide level 165 which shows a level of importance of each individual link and a link voice guide time 166 which shows the time required for voice guiding the link, as well as a facility voice guide level 174 which shows a level of importance of each individual facility and a facility voice guide time 175 which shows the time required for voice guiding the facility, are contained in the map information data. A minimal sketch of these data records is given below.
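  • The records of FIG. 6 described above might be represented roughly as the following Python dataclasses; the field names are illustrative renderings of the items 121 to 175, and the actual storage layout is not specified here.

    # Minimal sketch of the map information data records described above (FIG. 6).
    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class ProximateFacility:                    # records 150 / 170
        facility_name: str                      # 151 / 171
        facility_number: int                    # 152 / 172
        facility_position: Tuple[float, float]  # 153 / 173
        facility_voice_guide_level: int         # 154 / 174 (importance)
        facility_voice_guide_time: float        # 155 / 175 (playing time, s)

    @dataclass
    class NodeRecord:                           # record 120
        node_number: int                        # 121
        node_coordinate: Tuple[float, float]    # 122 (latitude, longitude)
        connecting_link_count: int              # 123
        link_numbers: List[int]                 # 124
        node_name: str                          # 125
        node_voice_guide_level: int             # 126
        node_voice_guide_time: float            # 127
        proximate_facilities: List[ProximateFacility] = field(default_factory=list)

    @dataclass
    class LinkRecord:                           # record 140
        link_number: int                        # 141
        start_node_number: int                  # 142
        finish_node_number: int                 # 143
        link_length: float                      # 144
        link_category: str                      # 161
        flow_regulation: str                    # 162
        link_name: str                          # 163
        link_name_number: int                   # 164
        link_voice_guide_level: int             # 165
        link_voice_guide_time: float            # 166
        proximate_facilities: List[ProximateFacility] = field(default_factory=list)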
  • The operation of the present invention will be described below. [0080]
  • The control means [0081] 21 displays a map corresponding to the map information data in the display means 26 in response to an operation of a user. A menu or the like is displayed for selecting each function. FIG. 7 shows an example of a menu and a map displayed on the display means 26. In the figure, a moving body mark 211 which shows the present position of the moving body, a bearing mark 212 which shows the direction of the map, a route line 213 which shows a route determined by the route search process to be discussed below and a menu 214 for selecting each function such as setting a destination are displayed on the map. When making a selection on the menu, a cursor is moved in response to an operation of a user on the operation means 29 and a selection is made.
  • When a destination is selected, firstly the route search process is performed. At this time, the present position detected by the present position detection means [0082] 23 and the destination input to the operation means 29 by a user are supplied to the route search means 24 by the control means 21. Map information data is read from the map information storage means 22 by the route search means 24, routes between the present position and the destination are searched by a Dijkstra method for example and a single route is determined. Information relating to nodes and links which comprise the route are stored in the route storage means 25.
  • FIG. 8 is an example of a route determined by the route search means [0083] 24. On the route shown in FIG. 8, there are 15 nodes N001-N00F between a present position and a destination (the figures are hexadecimal numbers). There are 14 (=15-1) links L000-L00D (the figures are hexadecimal numbers) connecting each node.
  • Since route voice guiding becomes possible when a route is discovered by the route search process, options (route outline) corresponding to the function of route voice guide on the menu as shown in FIG. 7 are varied to be selectable by an operation of the user. That is to say, before execution of the route search process or when no route is found, it is not possible for the user to make a selection and the option “route outline” is covered (i.e. displayed gray). [0084]
  • Each category of setting the route voice guide is executed before route voice guide processing. First the control means [0085] 21 displays a menu for each setting type of the route voice guide on the display means 26. FIG. 9 shows an example of a menu display for each type of setting of the route voice guide according to a first embodiment. In the menu in FIG. 9, there is a guide point selection term 201 which sets the execution of the route voice guide from a present position to any geographical point, a guide number selection term 202 which sets the total (guide number) of proximate facilities of a link or node as well as links and nodes which execute the voice guiding of the routes from a present position to a destination and a guide level selection term 203 which sets the level of detail of the voice guide.
  • There are the options “destination” and “detour” in the guide geographical [0086] point selection term 201. The selection of the option is executed by a user operating the operation means 29. When “destination” is selected, route voice guide from a present position to a destination is performed. When “detour” is selected, route voice guide from a present position to a predetermined detour point is performed. Although there is one option “detour” in the guide geographical point selection term 201 in FIG. 9, the option may be adapted to provide a plurality of detours as options or to add an option “selectable geographical point” for the user to select a final geographical point for voice guiding in the displayed map.
  • The guide number selection term has the options “5”, “10”, “20” and “automatic”. The selection of these options is performed by the operation of the operational means [0087] 29 by the user. When any of “5”, “10” or “20” are selected, a respectively corresponding number (5, 10, 20) is set as the total number of proximate facilities for a link or node or as a link or node which executes the voice guide. When “automatic” is set, a guide number (discussed below) which is calculated in response to the distance from a present position to a guide point is set as the total number of proximate facilities for a link or node or as a link or node which executes the voice guide. When setting the guide number of a guided node or the like, apart from selecting a predetermined number as above, the device is adapted to allow a user to directly set a value as a guide number.
  • The guide [0088] level selection term 203 has the options “high”, “medium” and “low”. The selection of these options is executed by a user operating the operation means 29. When “high” is selected, voice guiding is performed even to nodes, links and facilities proximate to nodes and links of low importance, as far as allowed by the guide number above. When “low” is selected, voice guiding is performed only to nodes, links and facilities proximate to nodes and links of high importance, as far as allowed by the guide number above. When “medium” is selected, voice guiding is performed only to nodes, links and facilities proximate to nodes and links of medium or higher importance, as far as allowed by the guide number above. The setting of the reference value for the level of importance when selecting a guided node, apart from selecting a predetermined level as above, may be performed by the user directly setting a level of importance with a value.
  • Thus, a reference number setting means and a reference value setting means which set a guide level and guide number are comprised by an [0089] operational means 29 and a display means 26 on which a menu is displayed.
  • Next, route voice guide processing is performed. FIG. 10 is a flowchart of the operation of each section in route voice guide processing. Firstly, as a result of the route search process being executed, the routine progresses from step ST[0090] 101 to step ST102 when the route is determined. Thus, the gray cover on the menu option “route outline” is withdrawn allowing this option to be selected. When the option “route outline” is selected, route voice guide processing is performed (step ST103).
  • Firstly, in a step ST[0091] 104, a present position of the moving body is detected by the present position detection means 23. Then, in a step ST105, the voice guide information extraction means 30 reads data relating to nodes and links which comprise the determined route and extracts a number of nodes, links and proximate facilities to nodes and links of high importance corresponding to the guide number above based on voice guide information extraction conditions set by the user in the menu (FIG. 9).
  • Then, in a step ST[0092] 106, the extracted nodes and links and proximate facilities to nodes and links are supplied to the voice guide message generation means through the control means 21. The voice guide message generation means 27 generates voice guide messages relating to the extracted nodes and links and proximate facilities to nodes and links. When a voice guide message is generated, it is supplied to the voice output means 28 by the control means 21. In a step ST107, the voice guide message is output by the voice output means 28 and an outline of the route from a present position to a guide point is reported to a user.
  • By adapting the output of voice guide messages in this way, an outline of a searched route may be guided. [0093]
  • Next, in the above step ST[0094] 105, the details of the process of extracting nodes and links and proximate facilities to nodes and links will be described. FIG. 11 is a flowchart of the details of the process of extracting nodes and links and proximate facilities to nodes and links in step ST105 in FIG. 10.
  • Firstly, in a step ST[0095] 121, the voice guide information extraction means 30 reads information relating to guide level terms, guide number terms and guide point terms set by the user from the control means 21, stores the node number of selected geographical points based on information about the guide point term and stores an extracted guide number A based on information about the guide number term. When “automatic” is selected at this time, the guide number A is stored as 0. The voice guide information extraction means 30 stores a value 100 as a reference value GL when the guide level “high” is selected based on the information of the guide level term. When the selected guide level is “medium”, the value 10 is stored as the reference value GL, and when the selected guide level is “low”, the value 5 is stored as the reference value GL. A minimal sketch of this setting step is given below.
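  • The mapping just described (guide levels “high”, “medium” and “low” giving GL values of 100, 10 and 5, and “automatic” giving a guide number A of 0) might be written as the following sketch; the function name is an assumption.

    # Map the menu selections to the stored guide number A and reference value GL.
    GUIDE_LEVEL_GL = {"high": 100, "medium": 10, "low": 5}

    def read_guide_settings(guide_number_term: str, guide_level_term: str):
        a = 0 if guide_number_term == "automatic" else int(guide_number_term)
        gl = GUIDE_LEVEL_GL[guide_level_term]
        return a, gl

    # e.g. read_guide_settings("10", "medium")      -> (10, 10)
    #      read_guide_settings("automatic", "high") -> (0, 100)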
  • The voice guide information extraction means [0096] 30 sets the initial level of the extraction level L of the importance of extracted nodes, links and proximate facilities to 0. (In FIG. 6 these are shown as the facility voice guide level 154, 174, the link voice guide level 165, and the node voice guide level 126). When the extraction level L is 0, proximate facilities and nodes and links which are less than or equal to an extracted level of 0 are extracted and as discussed below, the value of the extraction level L is incremented sequentially by values of 1. Thus, only more important information is extracted as the value of the extraction level L reduces.
  • In the next step ST[0097] 122, the voice guide information extraction means 30 determines whether or not the guide number A is 0. That is to say, it is determined whether or not “automatic” has been selected as a guide number term.
  • The voice guide information extraction means [0098] 30 reads, in a step ST123, link information comprising the determined route from the map information storage means 22 when the guide number A is 0. A route distance X (kilometers) from the present position to the guide point (the destination or detour) is calculated on the basis of the link information. In step ST124, the guide number A is calculated on the basis of the following formula.
  • A = INT((Log(X+1))^0.7 × 6 + 0.5)
  • [0099] The term INT(y) is a function which converts the real number y into an integer by rounding off the decimal part. FIG. 12 shows the corresponding relationship between the guide number A and the distance X from the present position to the guide point obtained by the formula. As shown in FIG. 12, the increment of the guide number A decreases as the distance X increases. In embodiment 1, the guide number A which is set to “automatic” is calculated by the above formula. However, the calculation may be performed on the basis of another formula with distance or another element as a variable. When the guide number A is not equal to 0, the processes in steps ST123 and ST124 are not performed.
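As a minimal sketch of the “automatic” guide number calculation above (Python is used here purely for illustration; the patent prescribes no implementation language), the following assumes a base-10 logarithm and that the 0.7 exponent applies to the logarithm term, both of which are reconstructions rather than certainties:

```python
import math

def guide_number(distance_km: float) -> int:
    """Sketch of the 'automatic' guide number A of embodiment 1.

    Assumes A = INT((log10(X + 1))^0.7 * 6 + 0.5); adding 0.5 before
    truncation rounds the value to the nearest integer.
    """
    x = max(distance_km, 0.0)
    return int(math.log10(x + 1) ** 0.7 * 6 + 0.5)

# The increment of A shrinks as the distance X grows, as described for FIG. 12.
for x in (1, 10, 50, 100, 500):
    print(x, guide_number(x))
```

Under these assumptions a route of about 10 km yields a guide number of 6 and a route of about 100 km yields 10, matching the qualitative behaviour described for FIG. 12 (the increment of A decreases as X increases).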
  • [0100] In the step ST125, the voice guide information extraction means 30 sets the extraction number SS(L), which shows the total number of nodes, links and proximate facilities with an importance less than or equal to the extraction level L, to an initial value of 0.
  • [0101] Then, in a step ST126, the voice guide information extraction means 30 selects and extracts the node voice guide level 126, the link voice guide level 165 and the facility voice guide level 154, 174 as shown in FIG. 6 from the nodes, links and proximate facilities from the present position to the guide point, based on the information regarding the nodes and links which comprise the route stored in the route storage memory 25. That is to say, nodes, links and proximate facilities are selected and extracted which have the same level of importance as the extraction level L.
  • [0102] After the nodes, links and proximate facilities which have the same level of importance as the extraction level L are extracted, in a step ST127, the voice guide information extraction means 30 selects links from the extracted links and merges two adjacent links with the same link name number into one link. When two links are merged into one link in this fashion, the link number, the start intersection number and the link attribute of the merged link are taken from that one of the two original links which is nearer the present position. The finish intersection number is taken from that one of the two original links which is nearer the guide point. The link length of the merged link is equal to the sum of the respective lengths of the two original links. Furthermore, the data relating to proximate facilities of the merged link contains the data relating to the proximate facilities of the two original links; the facility position of each proximate facility is recalculated and set to a position corresponding to the merged link.
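By way of illustration only, the merging of step ST127 might look as follows in Python; the dictionary field names are assumptions standing in for the link record of FIG. 6 rather than the patent's actual data layout:

```python
def merge_adjacent_links(first: dict, second: dict) -> dict:
    """Merge two adjacent links sharing the same link name number (step ST127).

    `first` is the original link nearer the present position, `second` the one
    nearer the guide point; field names are illustrative only.
    """
    assert first["name_number"] == second["name_number"]
    return {
        # taken from the link nearer the present position
        "link_number": first["link_number"],
        "name_number": first["name_number"],
        "start_intersection": first["start_intersection"],
        "link_attribute": first["link_attribute"],
        # taken from the link nearer the guide point
        "finish_intersection": second["finish_intersection"],
        # lengths are summed; the proximate facilities of both links are kept
        # (their positions would be recalculated against the merged link)
        "link_length": first["link_length"] + second["link_length"],
        "proximate_facilities": (first["proximate_facilities"]
                                 + second["proximate_facilities"]),
    }
```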
  • [0103] In a step ST128, the voice guide information extraction means 30 updates the extraction number SS(L) to the sum of the number S(L) of nodes, links and related proximate facilities extracted in the current step ST126 and the extraction number SS(L-1) for the extraction level which is smaller by a value of 1.
  • [0104] After the extraction number SS(L) at the current extraction level L is calculated, in a step ST129, the voice guide information extraction means 30 determines whether or not the extraction number SS(L) is greater than or equal to the guide number A above. When the extraction number SS(L) is not greater than or equal to the guide number A, in a step ST130, the voice guide information extraction means 30 determines whether the extraction level L is smaller than the reference value GL of the guide level above. When the extraction level L is smaller than the reference value GL, in a step ST131, the value of the extraction level L is increased by 1 and the routine returns to the step ST126, and the extraction of the nodes, links and related proximate facilities corresponding to this extraction level L is executed in the same way. On the other hand, when it is determined in the step ST130 that the extraction level L is not smaller than the reference value GL, it is determined that all the nodes, links and related proximate facilities with a level of importance up to the set reference value GL have been extracted, and the process of step ST105 is completed.
  • [0105] In the step ST129, when the extraction number SS(L) is greater than or equal to the guide number A, in a step ST132, the voice guide information extraction means 30 determines whether or not the extraction number SS(L) is the same as the guide number A. When the two are the same, it is determined that a number of nodes, links and proximate facilities equal to the set guide number A has been extracted and the processing of step ST105 is completed.
  • [0106] In the step ST132, when the extraction number SS(L) is not the same as the guide number A, that is to say, when the extraction number SS(L) is greater than the guide number A, the voice guide information extraction means 30 deletes nodes, links and related proximate facilities by the following process until the extraction number SS(L) equals the guide number A.
  • [0107] Firstly, in a step ST133, the voice guide information extraction means 30 determines whether the number of proximate facilities with an extracted importance of L is greater than the difference (SS(L)-A) of the extraction number SS(L) and the guide number A, that is, whether it is possible to delete related proximate facilities from the environs of the guide point until the extraction number SS(L) and the guide number A have the same value. When it is determined that it is possible to delete related proximate facilities from the environs of the guide point until the extraction number SS(L) and the guide number A have the same value, the proximate facilities are deleted in a step ST134 and the extraction number SS(L) is made equal to the guide number A.
  • [0108] When the number of proximate facilities with an extracted importance of L is less than the difference (SS(L)-A) of the extraction number SS(L) and the guide number A, in a step ST135, the voice guide information extraction means 30 deletes all the proximate facilities from the extracted nodes, links and related proximate facilities and updates the extraction number SS(L) by subtracting that number of proximate facilities.
  • [0109] Thereafter, in a step ST136, the voice guide information extraction means 30 determines whether the number of nodes from among the extracted nodes with an importance of L to which extracted links with an importance of L are not connected is greater than the difference (SS(L)-A) of the extraction number SS(L) and the guide number A. The voice guide information extraction means 30 also determines whether it is possible to delete such nodes from the environs of the guide point until the extraction number SS(L) and the guide number A have the same value. When it is determined that it is possible to delete nodes from the environs of the guide point until the extraction number SS(L) and the guide number A have the same value, the nodes are deleted in a step ST137 and the extraction number SS(L) is made equal to the guide number A.
  • [0110] When the number of nodes to which extracted links with an importance of L are not connected is less than the difference (SS(L)-A) of the extraction number SS(L) and the guide number A, in a step ST138, the voice guide information extraction means 30 deletes all nodes to which extracted links with an importance of L are not connected from the extracted nodes, links and related proximate facilities and updates the extraction number SS(L) by subtracting that number of nodes.
  • [0111] Thereafter, in a step ST139, the voice guide information extraction means 30 determines whether the number of remaining extracted nodes with an importance of L is greater than the difference (SS(L)-A) of the extraction number SS(L) and the guide number A. The voice guide information extraction means 30 also determines whether it is possible to delete such nodes from the environs of the guide point until the extraction number SS(L) and the guide number A have the same value. When it is determined that it is possible to delete nodes from the environs of the guide point until the extraction number SS(L) and the guide number A have the same value, the nodes are deleted in a step ST140 and the extraction number SS(L) is made equal to the guide number A.
  • [0112] When the number of remaining extracted nodes with an importance of L is less than the difference (SS(L)-A) of the extraction number SS(L) and the guide number A, in a step ST141, the voice guide information extraction means 30 deletes all remaining extracted nodes with an importance of L from the extracted nodes, links and related proximate facilities and updates the extraction number SS(L) by subtracting that number of nodes.
  • [0113] Thereafter, in a step ST142, the voice guide information extraction means 30 determines whether the number of links to which extracted nodes with an importance of L are not connected is greater than the difference (SS(L)-A) of the extraction number SS(L) and the guide number A. The voice guide information extraction means 30 also determines whether it is possible to delete such links from the environs of the guide point until the extraction number SS(L) and the guide number A have the same value. When it is determined that it is possible to delete such links from the environs of the guide point until the extraction number SS(L) and the guide number A have the same value, the links are deleted in a step ST143 and the extraction number SS(L) is made equal to the guide number A.
  • [0114] When the number of links to which extracted nodes with an importance of L are not connected is less than the difference (SS(L)-A) of the extraction number SS(L) and the guide number A, in a step ST144, the voice guide information extraction means 30 deletes all such links from the extracted nodes, links and related proximate facilities and updates the extraction number SS(L) by subtracting that number of links.
  • [0115] Thereafter, in a step ST145, the voice guide information extraction means 30 deletes links near to the guide point from the remaining extracted links which have an importance of L and thus makes the extraction number SS(L) equal to the guide number A.
  • [0116] In such a way, nodes and the like are extracted in order of importance, and when the extraction number SS(L) is greater than the guide number A, nodes and the like are deleted from near the guide point until a number of nodes and the like equal to the guide number A remains selected near the present position. After the extraction number SS(L) and the guide number A are made equal, the process of step ST105 is completed.
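Purely as an illustrative sketch (and not the patent's implementation), the extraction of FIG. 11 can be condensed as follows in Python. The item representation is an assumption: each node, link or proximate facility is a dict with a 'kind' and an integer 'importance', ordered from the present position towards the guide point, and the staged deletion of steps ST133-ST145 is simplified to deleting items of the current level from the guide-point side, one kind at a time:

```python
def extract_guide_items(route_items, guide_number, guide_level):
    """Condensed sketch of steps ST121-ST145 of embodiment 1.

    route_items: list of dicts ordered from the present position towards the
    guide point, each with 'kind' in {'facility', 'node', 'link'} and an
    integer 'importance' (the voice guide level of FIG. 6).
    guide_number: the guide number A; guide_level: the reference value GL.
    """
    selected, level = [], 0
    while True:
        # steps ST126-ST128: add items whose importance equals the current level
        selected.extend(i for i in route_items if i["importance"] == level)
        if len(selected) >= guide_number:          # step ST129
            break
        if level >= guide_level:                   # step ST130: reference value GL reached
            return selected
        level += 1                                 # step ST131
    # steps ST132-ST145 (simplified): delete surplus items of the current
    # level, starting from the side nearest the guide point
    surplus = len(selected) - guide_number
    for kind in ("facility", "node", "link"):
        candidates = [i for i in selected
                      if i["importance"] == level and i["kind"] == kind]
        for item in reversed(candidates):
            if surplus == 0:
                break
            selected.remove(item)
            surplus -= 1
    return selected
```

The real flowchart additionally distinguishes, among nodes of the current level, those not connected to extracted links and deletes them before the remaining nodes (steps ST136-ST141); that refinement is omitted here for brevity.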
  • [0117] The process of generating a voice guide message regarding extracted links, nodes and proximate facilities of links or nodes in a step ST106 will be described below. FIG. 13 is a flowchart showing the details of the process of generating a voice guide message for extracted links and nodes as well as the proximate facilities of links and nodes in step ST106 of FIG. 10.
  • [0118] After the extracted links and nodes as well as the proximate facilities to links and nodes are supplied to the voice guide message generation means 27 through the control means 21, in a step ST151, the voice guide message generation means 27 firstly arranges the data relating to the extracted links and nodes as well as the proximate facilities to links and nodes in order of proximity to the present position. At this time, the data relating to a node proximate facility is ordered as if the node proximate facility were closer to the present position than its node, and the data relating to a link proximate facility is ordered as if the link proximate facility were further from the present position than its link.
  • [0119] The voice guide message generation means 27 adds a group flag GF to the data relating to the nodes, links and related proximate facilities comprising the route. A value of 1 is assigned to the group flag GF of mutually connected nodes and links, of a node and the proximate facilities related to that node, and of a link and the proximate facilities related to that link. However, the group flag GF of the data which is closest to the guide point in each such group is set to a value of 0.
  • [0120] Next, in a step ST152, the voice guide message generation means 27 extracts voice waveform data corresponding to the extracted nodes, links and proximate facilities related to nodes and links from the voice information storage means 31. Such voice waveform data consists of the names of nodes, the names of links, or the names of proximate facilities related to nodes and links.
  • [0121] In a step ST153, the voice guide message generation means 27 takes out supplementary voice data corresponding to predicates from the voice information storage means 31 and generates a voice guide message containing the previously taken out voice waveform data for each name.
  • [0122] FIG. 14 shows an example of a set of supplementary voice data. When the supplementary voice data shown in FIG. 14 is pre-stored in the voice information storage means 31, a voice guide message is generated by adding supplementary voice data to the name of each node or the like according to one of the following procedures, in the order established in step ST151, and then reading out the data sequentially.
  • [0123] Procedure 1.
  • [0124] When a right turn is made on the route at a node, supplementary voice data (make right turn.) of distinguishing number 301 is added after the voice waveform data of the node name when the group flag GF has a value of 0. When the group flag GF has a value of 1, supplementary voice data (make right turn and . . . ) of distinguishing number 401 is added.
  • [0125] Procedure 2.
  • [0126] When a left turn is made on the route at a node, supplementary voice data (make left turn.) of distinguishing number 302 is added after the voice waveform data of the node name when the group flag GF has a value of 0. When the group flag GF has a value of 1, supplementary voice data (make left turn and . . . ) of distinguishing number 402 is added.
  • [0127] Procedure 3.
  • [0128] When a vehicle continues traveling in the same direction on the route at a node, supplementary voice data (continue straight.) of distinguishing number 303 is added after the voice waveform data of the node name when the group flag GF has a value of 0. When the group flag GF has a value of 1, supplementary voice data (continue straight and . . . ) of distinguishing number 403 is added.
  • [0129] Procedure 4.
  • [0130] When a vehicle makes a U-turn on the route at a node, supplementary voice data (make a U-turn.) of distinguishing number 304 is added after the voice waveform data of the node name when the group flag GF has a value of 0. When the group flag GF has a value of 1, supplementary voice data (make a U-turn and . . . ) of distinguishing number 404 is added.
  • [0131] Procedure 5.
  • [0132] When a vehicle continues traveling in the same direction on the route at a link, supplementary voice data (continue straight.) of distinguishing number 303 is added after the voice waveform data of the link name when the group flag GF has a value of 0. When the group flag GF has a value of 1, supplementary voice data (continue straight and . . . ) of distinguishing number 403 is added.
  • [0133] Procedure 6.
  • [0134] When a link is a bridge, supplementary voice data ( . . . cross.) of distinguishing number 305 is added after the voice waveform data of the link name when the group flag GF has a value of 0. When the group flag GF has a value of 1, supplementary voice data ( . . . cross and . . . ) of distinguishing number 405 is added.
  • Procedure 7. [0135]
  • [0136] When a node is an entrance to an expressway, supplementary voice data (enter . . . ) of distinguishing number 306 is added after the voice waveform data of the node name when the group flag GF has a value of 0. When the group flag GF has a value of 1, supplementary voice data (enter and . . . ) of distinguishing number 406 is added.
  • Procedure 8. [0137]
  • [0138] When a node is an exit of an expressway, supplementary voice data (exit . . . ) of distinguishing number 307 is added after the voice waveform data of the node name when the group flag GF has a value of 0. When the group flag GF has a value of 1, supplementary voice data (exit . . . and . . . ) of distinguishing number 407 is added.
  • Procedure 9. [0139]
  • [0140] When the group flag GF has a value of 0 at a link proximate facility, supplementary voice data (pass on the . . . side) of distinguishing number 308 is added after the voice waveform data of the link proximate facility name. When the group flag GF has a value of 1, supplementary voice data (pass on the . . . side and . . . ) of distinguishing number 408 is added.
  • [0141] Procedure 10.
  • [0142] When the group flag GF has a value of 0 and the route turns to the right at a node to which a node proximate facility relates, supplementary voice data (turn right at the mark) of distinguishing number 309 is added after the voice waveform data of the node proximate facility name. When the group flag GF has a value of 1, supplementary voice data (at the mark . . . ) of distinguishing number 409 is added.
  • Procedure 11. [0143]
  • [0144] When the group flag GF has a value of 0 and the route turns to the left at a node to which a node proximate facility relates, supplementary voice data (turn left at the mark) of distinguishing number 310 is added after the voice waveform data of the node proximate facility name. When the group flag GF has a value of 1, supplementary voice data (at the mark . . . ) of distinguishing number 409 is added.
  • [0145] Procedure 12.
  • [0146] When the group flag GF has a value of 0 and the route continues in the same direction at a node to which a node proximate facility relates, supplementary voice data (continue straight at the mark) of distinguishing number 311 is added after the voice waveform data of the node proximate facility name. When the group flag GF has a value of 1, supplementary voice data (at the mark . . . ) of distinguishing number 409 is added.
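As an illustrative sketch only of how steps ST151-ST153 and procedures 1 to 12 combine the stored names with supplementary phrases, the following Python fragment stands plain strings in for the stored voice waveform data and covers only a handful of the phrases quoted above; the dictionary keys, the example actions and the printed sentence are assumptions and do not reproduce the actual message of FIG. 16:

```python
# A subset of the supplementary phrases quoted in procedures 1-12
# (FIG. 14 itself is not reproduced here).
TERMINAL = {            # used when the group flag GF is 0
    "right": "make right turn.",
    "straight": "continue straight.",
    "pass": "pass on the ... side",
}
CONTINUING = {          # used when the group flag GF is 1
    "right": "make right turn and ...",
    "straight": "continue straight and ...",
    "pass": "pass on the ... side and ...",
}

def build_message(items):
    """Sketch of steps ST151-ST153: `items` is assumed to be a list of dicts
    ordered from the present position, each carrying a 'name' (standing in
    for the voice waveform data), an 'action' key for the tables above and a
    group flag 'gf' (1 inside a group, 0 for the member of the group closest
    to the guide point)."""
    parts = []
    for item in items:
        phrases = CONTINUING if item["gf"] == 1 else TERMINAL
        parts.append(item["name"] + " " + phrases[item["action"]])
    return " ".join(parts)

# Illustrative use of the grouping of FIG. 15 (link L001, its proximate
# facility S251 and node N005 form one group); the actions here are invented
# for the example and do not reproduce FIG. 16.
print(build_message([
    {"name": "L001", "action": "straight", "gf": 1},
    {"name": "S251", "action": "pass", "gf": 1},
    {"name": "N005", "action": "right", "gf": 0},
]))
```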
  • [0147] The route search means 24 determines in advance whether the route turns left or right or continues straight at each node and whether a link is a bridge.
  • [0148] FIG. 15 shows the data relating to the nodes, links and proximate facilities of nodes and links extracted through the process shown in FIG. 11 with respect to the route shown in FIG. 8. FIG. 16 shows a voice guide message generated based on the data shown in FIG. 15. The link L001 shown in FIG. 15 is a single link made up of the links L001, L002, L003 in FIG. 8 by the process of step ST127 shown in FIG. 11. In the same way, the link L005 shown in FIG. 15 is a single link made up of the links L005, L006, L007 in FIG. 8. The link proximate facility S251 (not shown) belongs to the link L002 in FIG. 8; however, it is made to belong to the merged link L001 by the process of step ST127.
  • [0149] In the step ST151 as shown in FIG. 13, the link L001, the link proximate facility S251 and the node N005 are made into a group 500, and the node N006, the link L005 and the node N009 are made into a group 501. A voice guide message shown in FIG. 16 is generated by steps ST152, ST153 in FIG. 13 with respect to the extracted nodes, links and proximate facilities to nodes and links.
  • [0150] In such a way, a voice guide message is generated and output to the voice output means 28.
  • [0151] As shown above, according to embodiment 1, the importance of nodes and links is stored. Nodes and links are selected from amongst the nodes and links on a searched route based on their importance. Voice guide messages are generated with respect to the selected nodes and links and such messages are used to guide the route by voice. Thus, it is possible to guide an entire route appropriately in a short time by voice.
  • Furthermore, when a plurality of nodes and links which have equal importance exist and their number does not equal a predetermined reference number, a number of nodes and the like which equals the predetermined reference number is selected from those of equal importance near to the present position. Thus, it is possible to make the number of guided nodes or the like correspond accurately to the predetermined reference number. [0152]
  • [0153] Embodiment 2
  • [0154] The navigation device according to embodiment 2 of the present invention summarizes the nodes, links and related proximate facilities on the voice-guided route based on a guide time pre-set by a user, instead of summarizing them on the basis of the level of importance and a predetermined guide number of nodes, links and related proximate facilities on the voice-guided route. That is to say, this is a variation on the process (FIG. 11) of step ST105 in FIG. 10 of the navigation device according to the first embodiment.
  • [0155] Thus, in embodiment 2, only the process of summarizing the nodes, links and related proximate facilities on the voice-guided route differs. In other respects, the present embodiment is the same as the first embodiment and such description will be omitted.
  • [0156] Before the route voice guide process, all types of settings related to the route voice guiding are executed. Firstly, the control means 21 displays a menu on the display means 26 for all types of settings for route voice guiding. FIG. 17 shows a display example of a menu for displaying each type of setting for route voice guiding according to embodiment 2.
  • [0157] The menu shown in FIG. 17 comprises a guide point selection term 601 which sets the execution of route voice guiding from a present position to a given geographical point and a guide time selection term 602 which sets a guide time for voice guiding of the route from a present position to a guide point.
  • [0158] The guide point selection term 601 contains the options “destination” and “detour”. The selection of the options is executed by a user operating the operational means 29. When “destination” is selected, route voice guiding from a present position to a destination is executed. When “detour” is set, route voice guiding from a present position to a predetermined detour point is executed. The guide point selection term 601 in FIG. 17 has one option “detour”. However, a plurality of detours may be selected as options or the user may add “selectable geographic points” as options to select a final geographic point for voice guiding.
  • [0159] The guide time selection term 602 has the options “short”, “middle” and “long”. The selection of these options is executed by a user operating the operational means 29. When “short” is selected by a user with respect to nodes, links and related proximate facilities on a route, voice guiding is performed for approximately 15 seconds. When “middle” is selected, voice guiding is performed for approximately 30 seconds. When “long” is selected, voice guiding is performed for approximately 1 minute. The user may also directly specify a time limit for voice guiding as a figure.
  • [0160] Now the extraction process of nodes, links and related proximate facilities according to embodiment 2 will be described. FIG. 18 is a flowchart showing the details of the process of extracting links and nodes as well as the proximate facilities of links and nodes in the second embodiment.
  • [0161] Firstly, in a step ST201, the voice guiding information extraction means 30 reads information relating to the guide time term and guide point term set by a user from the control means 21. The node number of the selected geographical point is stored based on the information relating to the guide point term. The selected guide time is set as a reference value B for the guide time based on the information relating to the guide time term. The voice guide information extraction means 30 sets the extraction level L, which shows the importance of extracted nodes, links and related proximate facilities (in FIG. 6, the node voice guiding level 126, the link voice guiding level 165 and the facility voice guiding level 154, 174), to a value of 0. When the extraction level L is 0, the nodes, links and related facilities with an importance level equal to or less than 0 are extracted. As described below, the value of the extraction level L is sequentially incremented by 1. Thus, the smaller the value of the extraction level L, the more important the information that is extracted.
  • [0162] In a step ST202, the voice guiding information extraction means 30 sets the extraction number SS(L), which shows the total number of nodes, links and related facilities with an importance equal to or less than the extraction level L, to an initial value of 0. The total guide time ST(L), which is the time required to voice guide the nodes, links and related facilities with a level of importance less than or equal to the extraction level L, is also set to an initial value of 0.
  • [0163] Then, in the step ST203, the voice guiding information extraction means 30 selects and extracts the node voice guiding level 126, the link voice guiding level 165 and the facility voice guiding level 154, 174 shown in FIG. 6, that is to say, the nodes, links and related facilities which have an importance equal to the extraction level L, from amongst the nodes, links and related proximate facilities from the present position to the guide point, based on the node and link information comprising the route stored in the route storage means 25.
  • [0164] After selecting and extracting the nodes, links and related facilities which have an importance equal to the extraction level L, in a step ST204, the voice guiding information extraction means 30 merges, from among the extracted links, two adjacent links with the same link name number into one link. When two such links are merged into one link, the link number, the start intersection number and the link attribute of the merged link are taken from the link of the two original links which is nearer to the present position. The finish intersection number is taken from the link of the two original links which is nearer to the guide point. The link length of the merged link is the sum of the link lengths of the two original links. With respect to the data about the link proximate facilities of the merged link, the data relating to the proximate facilities of the two original links is stored, and the facility position for each proximate facility is calculated and set to a position which corresponds to the merged link.
  • [0165] In step ST205, the voice guiding information extraction means 30 updates the extraction number SS(L) to the sum of the number S(L) of nodes, links and related proximate facilities extracted in the current step ST203 and the extraction number SS(L-1) for the extraction level which is smaller by a value of 1. Further, it updates the total guiding time ST(L) to the sum of the guiding time T(L) required for the nodes, links and related proximate facilities extracted in the current step ST203 and the total guiding time ST(L-1) for the extraction level which is smaller by a value of 1. The guiding time T(L) required for the extracted nodes, links and related proximate facilities is the total of the node voice guiding time 127, the facility voice guiding time 155, the link voice guiding time 166 and the facility voice guiding time 175 in the map information data shown in FIG. 6.
  • [0166] After the calculation of the extraction number SS(L) and the total guiding time ST(L) at the extraction level L at that point in time, in a step ST206, the voice guiding information extraction means 30 determines whether or not the sum of the total guiding time ST(L) and twice the extraction number SS(L), that is (ST(L)+SS(L) x 2), is greater than or equal to the reference value B above.
  • [0167] The comparison of the sum of the total guiding time ST(L) and twice the extraction number SS(L), (ST(L)+SS(L) x 2), with the reference value B above is performed for the following reason. The total guide time ST(L) is the total of the voice playing times for the names of the extracted nodes and the like. In addition, the voice playing time required for the supplementary voice data referred to above which is added to each item is on average two seconds. The playing time for the voice guide messages up to an extraction level L thus becomes (ST(L)+SS(L) x 2). When other supplementary voice data is used, (ST(L)+SS(L) x Ts) is compared with the reference value B, where Ts is the average playing time of the supplementary voice data.
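As a small illustration (not part of the patent, with Python chosen arbitrarily and the parameter names assumed), the playing-time estimate used in step ST206 amounts to a one-line function:

```python
def estimated_playing_time(total_name_time_s: float, item_count: int,
                           supplementary_avg_s: float = 2.0) -> float:
    """Estimate of the voice guide message playing time used in step ST206:
    ST(L) (the summed playing times of the extracted names) plus one
    supplementary phrase per extracted item, averaging Ts seconds
    (two seconds in the description above)."""
    return total_name_time_s + item_count * supplementary_avg_s

# e.g. 12 seconds of name audio and 6 extracted items give an estimate of
# 24 seconds, to be compared with the reference value B of the guide time.
print(estimated_playing_time(12.0, 6))
```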
  • [0168] When the voice guide message playing time to an extraction level L (ST(L)+SS(L) x 2) is not greater than or equal to the reference value B, after the value of the extraction level L is increased by a value of 1 in a step ST207, the routine returns to step ST203 and the extraction of the nodes, links and related proximate facilities which correspond to the extraction level L is executed in the same way.
  • [0169] On the other hand, when the voice guide message playing time to an extraction level L is greater than or equal to the reference value B, in a step ST208, the voice guiding information extraction means 30 determines whether or not the voice guide message playing time to an extraction level L is the same as the reference value B. When both are the same, it is determined that the nodes, links and related proximate facilities that can be voice guided within the set guide time have been extracted, and the routine is completed.
  • [0170] When the voice guide message playing time to an extraction level L is not the same as the reference value B, that is to say, when the voice guide message playing time to an extraction level L (ST(L)+SS(L) x 2) is greater than the reference value B, the voice guiding information extraction means 30 deletes, by the following process, nodes, links and related facilities with a level of importance L until the voice guide message playing time to an extraction level L (ST(L)+SS(L) x 2) is less than or equal to the reference value B.
  • [0171] Firstly, in a step ST209, the voice guiding information extraction means 30 determines whether the sum of the facility voice guiding times for the proximate facilities with an extracted importance of L is greater than the difference between the voice guide message playing time to an extraction level L (ST(L)+SS(L) x 2) and the reference value B, that is, whether or not it is possible to delete from those proximate facilities which are near to the guide point until the voice guide message playing time is less than or equal to the reference value B. When it is determined that it is possible to delete from those proximate facilities which are near to the guide point until the voice guide message playing time is less than or equal to the reference value B, in a step ST210 such proximate facilities are deleted until the voice guide message playing time is less than or equal to the reference value B.
  • [0172] On the other hand, when it is determined that the sum of the facility voice guiding times for the proximate facilities with an extracted importance of L is smaller than the difference between the voice guide message playing time to an extraction level L and the reference value B, in a step ST211, the voice guiding information extraction means 30 deletes all proximate facilities from the extracted nodes, links and related proximate facilities, reduces the total guide time ST(L) by the sum of the facility voice guide times for those proximate facilities and updates the value. The value of the extraction number SS(L) is also updated by reducing it by the total number of proximate facilities.
  • [0173] Thereafter, in a step ST212, the voice guiding information extraction means 30 determines whether, of the nodes which have an extracted importance of L, the sum of the node voice guiding times for those nodes to which links are not connected is greater than the difference between the voice guide message playing time to an extraction level L and the reference value B, that is, whether or not it is possible to delete from those nodes which are near to the guide point until the voice guide message playing time is less than or equal to the reference value B. When it is determined that it is possible to delete from those nodes which are near to the guide point until the voice guide message playing time is less than or equal to the reference value B, in a step ST213 such nodes are deleted until the voice guide message playing time is less than or equal to the reference value B.
  • [0174] On the other hand, when it is determined that the sum of the node voice guiding times for the nodes with an extracted importance of L not connected to links is smaller than the difference between the voice guide message playing time to an extraction level L and the reference value B, in a step ST214, the voice guiding information extraction means 30 deletes all nodes which have an extracted importance of L and are not connected to links from the extracted nodes, links and related proximate facilities, reduces the total guide time ST(L) by the sum of the node voice guide times for such nodes and updates the value. The value of the extraction number SS(L) is also updated by reducing it by the total number of such nodes.
  • [0175] Thereafter, in a step ST215, the voice guiding information extraction means 30 determines whether the sum of the node voice guiding times for the remaining nodes which have an extracted importance of L is greater than or equal to the difference between the voice guide message playing time to an extraction level L and the reference value B, that is, whether or not it is possible to delete from those nodes which are near to the guide point until the voice guide message playing time is less than or equal to the reference value B. When it is determined that it is possible to delete from those nodes which are near to the guide point until the voice guide message playing time is less than or equal to the reference value B, in a step ST216 such nodes are deleted until the voice guide message playing time is less than or equal to the reference value B.
  • [0176] On the other hand, when the sum of the node voice guiding times for the remaining nodes which have an extracted importance of L is smaller than the difference between the voice guide message playing time to an extraction level L and the reference value B, the voice guiding information extraction means 30, in a step ST217, deletes all remaining nodes which have an extracted importance of L from the extracted nodes, links and related proximate facilities. The value for the total guide time ST(L) is reduced by the sum of the node voice guiding times for such nodes and the value is updated. Also, the value for the extraction number SS(L) is updated by being reduced by the total number of such nodes.
  • [0177] Thereafter, in a step ST218, the voice guiding information extraction means 30 determines whether, of the links with an extracted importance of L, the sum of the link voice guiding times for links not connected to nodes is greater than the difference between the voice guide message playing time to an extraction level L and the reference value B, that is, whether or not it is possible to delete from those links which are near to the guide point until the voice guide message playing time is less than or equal to the reference value B. When it is determined that it is possible to delete links which are near to the guide point until the voice guide message playing time is less than or equal to the reference value B, in a step ST219 such links are deleted until the voice guide message playing time is less than or equal to the reference value B.
  • [0178] On the other hand, when the sum of the link voice guiding times for the links with an extracted importance of L not connected to nodes is smaller than the difference between the voice guide message playing time to an extraction level L and the reference value B, the voice guiding information extraction means 30, in a step ST220, deletes all such links from the extracted nodes, links and related proximate facilities. The value for the total guide time ST(L) is reduced by the sum of the link voice guiding times for such links and the value is updated. Also, the value for the extraction number SS(L) is updated by being reduced by the total number of such links.
  • [0179] Thereafter, in a step ST221, the voice guiding information extraction means 30 deletes links which are near to the guide point from the remaining links with an extracted importance of L until the voice guide message playing time is less than or equal to the reference value B.
  • In such a way, nodes and the like of high importance are extracted sequentially, and when the voice guide message playing time (ST(L)+SS(L) x 2) to an extraction level L is greater than or equal to the reference value B of the guide time, nodes and the like are deleted in order from those near the guide point so that the guide time stays within the range of the reference value. Thus, after the playing time of the voice guide message becomes less than or equal to the reference value, the routine is completed. [0180]
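Purely as a condensed, non-authoritative sketch of the time-based extraction of FIG. 18 (steps ST201-ST221), the following Python fragment assumes each node, link or proximate facility is a dict carrying an 'importance' (its voice guiding level) and a 'voice_time' (the playing time of its name in seconds), ordered from the present position towards the guide point; the connected/unconnected distinction of steps ST212-ST220 is omitted and surplus items of the last level are simply trimmed from the guide-point side:

```python
GUIDE_TIME_OPTIONS = {"short": 15.0, "middle": 30.0, "long": 60.0}  # seconds, per FIG. 17

def extract_by_guide_time(route_items, guide_time_option, supplementary_avg_s=2.0):
    """Condensed sketch of the time-based extraction of embodiment 2."""
    b = GUIDE_TIME_OPTIONS[guide_time_option]          # reference value B
    selected, level, total_name_time = [], 0, 0.0
    max_level = max(item["importance"] for item in route_items)
    while True:
        new_items = [i for i in route_items if i["importance"] == level]
        selected.extend(new_items)                     # steps ST203-ST205
        total_name_time += sum(i["voice_time"] for i in new_items)
        playing_time = total_name_time + len(selected) * supplementary_avg_s  # ST(L)+SS(L) x Ts
        if playing_time >= b or level >= max_level:    # step ST206
            break
        level += 1                                     # step ST207
    # steps ST208-ST221 (simplified): trim items of the current level,
    # nearest the guide point first, until the estimate no longer exceeds B
    for item in reversed([i for i in selected if i["importance"] == level]):
        if total_name_time + len(selected) * supplementary_avg_s <= b:
            break
        selected.remove(item)
        total_name_time -= item["voice_time"]
    return selected
```

The actual flowchart prefers deleting proximate facilities first, then nodes not connected to extracted links, then the remaining nodes, then links, exactly as described for steps ST209-ST221 above; that ordering is collapsed here for brevity.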
  • [0181] As shown above, according to embodiment 2, when there are a plurality of links and nodes with the same level of importance and the playing time of the voice guide message is greater than the predetermined reference value, nodes and the like are selected, from among the nodes and links with the same level of importance, from those near to the present position so that the playing time of the voice guide message is less than or equal to the predetermined reference value. Thus, it is possible to keep the time for voice guiding accurately within the predetermined reference value.
  • [0182] The present invention is not limited to embodiments 1 and 2 above and may be applied through other embodiments. For example, the extraction method of guiding points may vary the node or link extraction conditions.
  • [0183] Furthermore, the voice guide message generation means 27 may generate messages which guide the position of a facility, a time or a distance, apart from the examples discussed above. Furthermore, the invention may be adapted to generate a voice guide message by combining simple phrases with conjunctions inserted between the phrases.
  • Furthermore, during voice guiding, the route from the present position to the guide point is displayed on the screen at the same time. The positions of those nodes, links and related proximate facilities which correspond to the voice-guided information may be displayed in a different color from the other parts or may be displayed blinking. [0184]
  • As shown above, the present invention is adapted for use in a navigation device in which the level of importance of each node and link is stored as a part of map information in a map information storage means. Nodes and links from amongst the nodes and links on the searched route are selected on the basis of the level of importance and a voice guide message is generated which corresponds to the selected nodes and links. Thus, it is possible to guide an entire route appropriately in a short time by voice and it is possible for a user to easily arrive at a destination. [0185]

Claims (5)

What is claimed is:
1. A navigation device comprising a map information storage means for storing map information including node information, link information and related information thereof, a present position detection means for detecting a present position of a moving body, a route searching means for searching a route from the present position to a guide point based on said map information, a voice guide message generation means for generating a voice guide message corresponding to the route searched by said route searching means, and a voice output means for outputting the voice guide message, wherein:
said map information storage means stores a level of importance of each node and link as a part of said map information; and
said voice guide message generation means selects the nodes and links on the searched route based on said level of importance and generates the voice guide message corresponding to the selected nodes and links.
2. A navigation device according to claim 1, wherein said voice guide message generation means selects said nodes and links on the searched route in such a manner that a level of importance of the selected nodes and links is less than or equal to a predetermined reference value and a number of the selected nodes and links is less than or equal to a predetermined reference number, and generates the voice guide message corresponding to the selected nodes and links.
3. A navigation device according to claim 2, wherein when a plurality of nodes and links with the same level of importance exists and the number of the selected nodes and links is not equal to the predetermined reference number, said voice guide message generation means deletes nodes and links, which are located near the guide point, from said plural nodes and links with the same level of importance in such a manner that the number of the selected nodes and links is equal to the predetermined reference number.
4. A navigation device according to claim 2, further comprising a reference value setting means for setting the predetermined reference value and a reference number setting means for setting the predetermined reference number.
5. A navigation device according to claim 1, wherein said map information storage means stores information about voice playing times relating to names of each node and link as a part of said map information, and wherein said voice guide message generation means selects said nodes and links on the searched route in order of high importance in such a manner that the voice playing time for the voice guide message is less than or equal to a predetermined reference value and generates the voice guide message corresponding to said selected nodes and links.
US09/768,460 1999-05-25 2001-01-25 Navigation device Expired - Lifetime US6366852B2 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP1999/002748 WO2000071975A1 (en) 1999-05-25 1999-05-25 Navigation device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP1999/002748 Continuation WO2000071975A1 (en) 1999-05-25 1999-05-25 Navigation device

Publications (2)

Publication Number Publication Date
US20010007090A1 true US20010007090A1 (en) 2001-07-05
US6366852B2 US6366852B2 (en) 2002-04-02

Family

ID=14235786

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/768,460 Expired - Lifetime US6366852B2 (en) 1999-05-25 2001-01-25 Navigation device

Country Status (5)

Country Link
US (1) US6366852B2 (en)
EP (1) EP1099932B1 (en)
JP (1) JP4169934B2 (en)
DE (1) DE69938927D1 (en)
WO (1) WO2000071975A1 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020169544A1 (en) * 2001-05-08 2002-11-14 Pioneer Corporation Navigation apparatus
US20030083851A1 (en) * 2001-09-10 2003-05-01 Pioneer Corporation, Increment P Corporation Communication navigation system and server apparatus and terminal apparatus therefor, communication navigation method and communication navigation program
US20030125872A1 (en) * 2001-12-21 2003-07-03 Kimmo Kauvo Providing guiding service by means of a wireless terminal
US6778901B2 (en) 2001-08-01 2004-08-17 Pioneer Corporation Communication navigation system, communication navigation method, terminal unit, and route guidance information transmitting apparatus
US6801851B2 (en) 2001-08-27 2004-10-05 Pioneer Corporation Communication navigation system and method, communication center apparatus for providing map information, communication navigation terminal, program storage device and computer data signal embodied in carrier wave
US6842693B2 (en) 2001-07-31 2005-01-11 Pioneer Corporation Communication navigation system, communication navigation method, route guidance information transmitting device, and terminal unit
US20050049779A1 (en) * 2003-08-28 2005-03-03 Denso Corporation Navigation apparatus for vehicle
US6892132B2 (en) * 2001-07-31 2005-05-10 Pioneer Corporation Communication navigation system, communication navigation method, map data transmitting device, and terminal unit
US20060031009A1 (en) * 2003-02-24 2006-02-09 Christian Brulle-Drews Navigation system with acoustic route information
US20060106534A1 (en) * 2002-10-22 2006-05-18 Yukihiro Kawamata Map data delivering method for communication-type navigation system
US20080189035A1 (en) * 2007-02-01 2008-08-07 Denso Corporation Map display apparatus for vehicle
US20090234572A1 (en) * 2005-09-28 2009-09-17 Aisin Aw Co., Ltd Surrounding Search Data Generating System, Surrounding Search System, Surrounding Search Data Generating Method, Surrounding Search Method, and Navigation Apparatus
US20110153195A1 (en) * 2009-12-18 2011-06-23 Mitac International Corporation Navigation device and alerting method thereof

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1241447A1 (en) * 2001-03-13 2002-09-18 Matsushita Electric Industrial Co., Ltd. Information terminal and cartographic information providing system
JP2003057059A (en) * 2001-08-10 2003-02-26 Aisin Aw Co Ltd Navigation apparatus and program
DE10213150A1 (en) * 2002-03-23 2003-10-02 Philips Intellectual Property Arrangement for navigation
JP2004212295A (en) * 2003-01-07 2004-07-29 Mitsubishi Electric Corp Navigation system
JP4575066B2 (en) * 2004-07-29 2010-11-04 クラリオン株式会社 Map display device, in-vehicle navigation device
JP2006119120A (en) * 2004-09-27 2006-05-11 Denso Corp Car navigation device
KR100832940B1 (en) 2005-03-11 2008-05-27 하만 베커 오토모티브 시스템즈 게엠베하 Navigation system with acoustic route information
JP5771889B2 (en) * 2009-03-04 2015-09-02 日産自動車株式会社 Route guidance device and route guidance method
US20150134240A1 (en) * 2012-06-19 2015-05-14 Mitsubishi Electric Corporation Imitation sound generation system and map database

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0566131A (en) * 1991-09-09 1993-03-19 Sumitomo Electric Ind Ltd Voiced guide apparatus
JP3196308B2 (en) 1992-04-20 2001-08-06 株式会社デンソー Route guidance device
JP3183569B2 (en) * 1992-09-16 2001-07-09 マツダ株式会社 Car travel route guidance device
JPH0727568A (en) * 1993-07-09 1995-01-27 Zanabui Informatics:Kk Path guiding device and path searching method
JP3488969B2 (en) * 1994-03-09 2004-01-19 本田技研工業株式会社 Vehicle guidance device
US5638280A (en) * 1994-03-30 1997-06-10 Sumitomo Electric Industries, Ltd. Vehicle navigation apparatus and method
JP3168819B2 (en) * 1994-04-28 2001-05-21 トヨタ自動車株式会社 Driving support device
JP3414873B2 (en) * 1995-01-20 2003-06-09 三菱電機株式会社 Car navigation system
JPH08254436A (en) * 1995-01-20 1996-10-01 Mitsubishi Electric Corp Navigation system
JP3688751B2 (en) * 1995-04-13 2005-08-31 シャープ株式会社 Information display device
JP3737853B2 (en) * 1995-11-08 2006-01-25 トヨタ自動車株式会社 Vehicle travel information providing device
JP3173983B2 (en) * 1995-12-28 2001-06-04 松下電器産業株式会社 Route selection method and system
JP3223782B2 (en) * 1996-02-08 2001-10-29 三菱電機株式会社 Vehicle route calculation device
JPH109884A (en) * 1996-06-24 1998-01-16 Mitsubishi Electric Corp Path guidance apparatus and path finding method for vehicle
KR100278972B1 (en) * 1996-08-21 2001-01-15 모리 하루오 Navigation device
US5910177A (en) * 1996-12-09 1999-06-08 Visteon Technologies, Llc Navigating close proximity routes with a vehicle navigation system
KR100313055B1 (en) 1997-05-15 2001-12-28 다니구찌 이찌로오, 기타오카 다카시 Navigation device

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6856895B2 (en) * 2001-05-08 2005-02-15 Pioneer Corporation Navigation apparatus
US20020169544A1 (en) * 2001-05-08 2002-11-14 Pioneer Corporation Navigation apparatus
US6842693B2 (en) 2001-07-31 2005-01-11 Pioneer Corporation Communication navigation system, communication navigation method, route guidance information transmitting device, and terminal unit
US6892132B2 (en) * 2001-07-31 2005-05-10 Pioneer Corporation Communication navigation system, communication navigation method, map data transmitting device, and terminal unit
US6778901B2 (en) 2001-08-01 2004-08-17 Pioneer Corporation Communication navigation system, communication navigation method, terminal unit, and route guidance information transmitting apparatus
US6801851B2 (en) 2001-08-27 2004-10-05 Pioneer Corporation Communication navigation system and method, communication center apparatus for providing map information, communication navigation terminal, program storage device and computer data signal embodied in carrier wave
US20030083851A1 (en) * 2001-09-10 2003-05-01 Pioneer Corporation, Increment P Corporation Communication navigation system and server apparatus and terminal apparatus therefor, communication navigation method and communication navigation program
US6954694B2 (en) * 2001-09-10 2005-10-11 Pioneer Corporation Communication navigation system and server apparatus and terminal apparatus therefor, communication navigation method and communication navigation program
US20030125872A1 (en) * 2001-12-21 2003-07-03 Kimmo Kauvo Providing guiding service by means of a wireless terminal
US7155338B2 (en) * 2001-12-21 2006-12-26 Nokia Corporation Providing guiding service by means of a wireless terminal
US20060106534A1 (en) * 2002-10-22 2006-05-18 Yukihiro Kawamata Map data delivering method for communication-type navigation system
US20060031009A1 (en) * 2003-02-24 2006-02-09 Christian Brulle-Drews Navigation system with acoustic route information
US7463975B2 (en) 2003-02-24 2008-12-09 Harman International Industries, Incorporated Navigation system with acoustic route information
US20050049779A1 (en) * 2003-08-28 2005-03-03 Denso Corporation Navigation apparatus for vehicle
US7457704B2 (en) * 2003-08-28 2008-11-25 Denso Corporation Navigation apparatus for vehicle
US20090234572A1 (en) * 2005-09-28 2009-09-17 Aisin Aw Co., Ltd Surrounding Search Data Generating System, Surrounding Search System, Surrounding Search Data Generating Method, Surrounding Search Method, and Navigation Apparatus
US20080189035A1 (en) * 2007-02-01 2008-08-07 Denso Corporation Map display apparatus for vehicle
US8024117B2 (en) * 2007-02-01 2011-09-20 Denso Corporation Map display apparatus for vehicle
US20110153195A1 (en) * 2009-12-18 2011-06-23 Mitac International Corporation Navigation device and alerting method thereof
US8340900B2 (en) * 2009-12-18 2012-12-25 Mitac International Corporation Navigation device and alerting method thereof
TWI420075B (en) * 2009-12-18 2013-12-21 Mitac Int Corp Navigation device and alerting method thereof

Also Published As

Publication number Publication date
JP4169934B2 (en) 2008-10-22
US6366852B2 (en) 2002-04-02
EP1099932A4 (en) 2002-07-10
DE69938927D1 (en) 2008-07-31
EP1099932A1 (en) 2001-05-16
EP1099932B1 (en) 2008-06-18
WO2000071975A1 (en) 2000-11-30

Similar Documents

Publication Publication Date Title
US6366852B2 (en) Navigation device
EP1614994B1 (en) Navigation apparatus and method
EP0588082B1 (en) Navigation system for vehicle
US6175800B1 (en) Route searching device
JPH11311533A (en) Routing device
JPH05127596A (en) Navigation device
WO2004031690A1 (en) Geographic data transmitting method, information delivering apparatus and information terminal
US6038507A (en) Driving simulation apparatus capable of arbitrarily setting start position and method thereof
US20060085123A1 (en) Route display device and route display method
JP2014044182A (en) Route search system, route search device, route search method and computer program
US6820003B2 (en) Navigation apparatus
US5777875A (en) Driving simulation apparatus capable of scrolling at optimum speed
JP4736590B2 (en) Candidate route creation device, method, program, traffic simulation device, method and program, route search device, method, and program
JP4365359B2 (en) Navigation system, navigation device, and operation mode control method in navigation device
JPH06249672A (en) Navigation device for moving body
JP3039226B2 (en) Route calculation method and device
JPH06119562A (en) Route guiding device for vehicle
JP2780206B2 (en) Vehicle navigation system
JP3525032B2 (en) Route search device
JPH1089986A (en) Navigation apparatus for vehicle
JPH05297800A (en) Route guiding device
US6389357B1 (en) Vehicle navigation system
JPH0737199A (en) Navigation system
JP2002310703A (en) Return route announcing device
JPH11344352A (en) Navigator

Legal Events

Date Code Title Description
AS Assignment

Owner name: MITSUBISHI DENKI KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:IRIE, TAKASHI;NORIMOTO, MASATSUGU;REEL/FRAME:011482/0110

Effective date: 20001218

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12