WO2013069172A1 - Navigation device and method - Google Patents
Navigation device and method
- Publication number
- WO2013069172A1 (application PCT/JP2012/003678, JP2012003678W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- route
- unit
- route setting
- expression
- point
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3664—Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3605—Destination input or retrieval
- G01C21/3608—Destination input or retrieval using speech input, e.g. using speech recognition
Definitions
- the present invention relates to a navigation apparatus and method capable of recognizing a user's utterance content and performing navigation.
- An in-vehicle navigation device, for example, gives the driver guidance by voice output or graphic display when a predetermined point (for example, an intersection at which the traveling direction is to be changed) is approaching while traveling on a set route. A navigation device is also known in which the user's utterance content is recorded and, when a voice recognition button is pressed, the recorded utterance content going back a predetermined time is voice-recognized to extract point information (a point name), and that point information is set as a destination (see, for example, Patent Document 1).
- the present invention has been made to solve the above-described problems.
- It is an object of the present invention to provide a navigation apparatus and method that specify point information and a route setting method from the recognition result obtained by always recognizing the content of the user's utterances, and that can set a route in real time based on the point information and the route setting method.
- The present invention provides a navigation device that includes a position acquisition unit that acquires the position of a moving body and that performs route guidance along a route based on the position of the moving body acquired by the position acquisition unit and map data. The navigation device includes:
- a voice acquisition unit that detects and acquires input voice;
- a voice recognition unit that, whenever the navigation device is activated, recognizes the voice data acquired by the voice acquisition unit;
- a point name storage unit that stores names of places and facilities as point names;
- a route setting expression storage unit that stores route setting expressions used when the user performs route setting;
- a keyword extraction unit that extracts a point name and a route setting expression from the recognition result of the voice recognition unit with reference to the point name storage unit and the route setting expression storage unit;
- a route setting operation storage unit that stores route setting operations in association with the route setting expressions; and
- a route setting operation acquisition unit that, with reference to the route setting operation storage unit, acquires the route setting operation corresponding to the route setting expression extracted by the keyword extraction unit, the route being searched for and set based on the acquired route setting operation, the point specified from the extracted point name, and the position of the moving body.
- the point information and the route setting method are specified from the recognition result obtained by always recognizing the content of the user's utterance, and the route is set based on the point information and the route setting method.
- the route can be set as expected in real time and without the need for manual operation by the user.
- FIG. 1 is a block diagram illustrating an example of a navigation device according to Embodiment 1.
- FIG. 2 is a diagram illustrating an example of the point name storage unit 3.
- FIG. 3 is a diagram illustrating an example of the route setting expression storage unit 4.
- FIG. 4 is a diagram illustrating an example of the route setting operation storage unit 6.
- FIG. 5 is a flowchart showing the operation of the navigation device according to Embodiment 1.
- FIG. 6 is a diagram showing an example of the route set by the route determination unit 10.
- FIG. 7 is a block diagram illustrating an example of a navigation device according to Embodiment 2.
- FIG. 8 is a flowchart showing the operation of the navigation device according to Embodiment 2.
- FIG. 9 is a block diagram illustrating an example of a navigation device according to Embodiment 3.
- FIG. 10 is a flowchart showing the operation of the navigation device according to Embodiment 3.
- FIG. 11 is a block diagram illustrating an example of a navigation device according to Embodiment 4.
- FIG. 12 is a flowchart showing the operation of the navigation device according to Embodiment 4.
- FIG. 13 is a flowchart showing the operation of the navigation device according to Embodiment 5.
- FIG. 14 is a block diagram illustrating an example of a navigation device according to Embodiment 6.
- FIG. 15 is a flowchart showing the operation of the navigation device according to Embodiment 6.
- FIG. 16 is a flowchart showing the operation of the navigation device according to Embodiment 7.
- FIG. 17 is a diagram illustrating an example of the route setting expression storage unit 4 according to Embodiment 8.
- FIG. 18 is a flowchart showing the operation of the navigation device according to Embodiment 8.
- FIG. 19 is a block diagram illustrating an example of a navigation device according to Embodiment 9.
- FIG. 20 is a flowchart showing the operation of the navigation device according to Embodiment 9.
- FIG. 21 is a diagram showing an example of a screen displaying a dialog for confirming whether a route search is necessary, in Embodiment 9.
- FIG. 22 is a flowchart showing the operation of the navigation device according to Embodiment 10.
- The remaining figures illustrate an example of a presentation screen displaying a plurality of searched routes in Embodiment 10; an example of the route setting expression storage unit 4 according to Embodiment 10; another example of a presentation screen displaying a plurality of searched routes in Embodiment 10; a block diagram of an example of a navigation device according to Embodiment 12; an example of the presentation method storage unit 16; a flowchart showing the operation of the navigation device according to Embodiment 12; an example of a route presentation screen in Embodiment 12; an example of the route setting operation storage unit 6 according to Embodiment 13; flowcharts showing operations of the navigation device according to Embodiment 13; an example of screen transitions when a dialog asking the user to confirm whether the route may be set is displayed; and an example of a screen displaying a plurality of searched facilities.
- The present invention includes a position acquisition unit that acquires the position of the host vehicle (moving body), and provides route guidance based on the position of the host vehicle (moving body) acquired by the position acquisition unit and map data.
- In this navigation device, whenever the device is activated, the user's utterance content is always recognized, point information and a route setting method are identified from the recognition result, and a route is automatically set based on the point information and the route setting method.
- a case where the navigation device of the present invention is applied to a car navigation system mounted on a moving body such as a vehicle will be described as an example.
- FIG. 1 is a block diagram showing an example of a navigation apparatus according to Embodiment 1 of the present invention.
- The navigation device includes a voice acquisition unit 1, a voice recognition unit 2, a point name storage unit 3, a route setting expression storage unit 4, a keyword extraction unit 5, a route setting operation storage unit 6, a route setting operation acquisition unit 7, a map data storage unit 8, an own vehicle position acquisition unit (position acquisition unit) 9, a route determination unit 10, a presentation control unit 21, a display unit 22, and a voice output unit 23.
- the presentation control unit 21, the display unit 22, and the voice output unit 23 constitute a presentation control output unit 20.
- the navigation device also includes a key input unit 12 that acquires an input signal from a key, a touch panel, or the like, and a time acquisition unit 14 that acquires a time.
- the voice acquisition unit 1 performs A / D conversion on a user utterance collected by a microphone or the like, that is, an input voice, and acquires the voice, for example, in a PCM (Pulse Code Modulation) format.
- The voice recognition unit 2 has a recognition dictionary (not shown), detects a speech section corresponding to content spoken by a user such as a passenger from the voice data acquired by the voice acquisition unit 1, extracts a feature amount, and performs voice recognition processing using the recognition dictionary based on the feature amount.
- a general method such as Hidden Markov Model may be used.
- the voice recognition unit 2 may use a voice recognition server on the network.
- A button or the like for instructing the start of voice recognition (hereinafter referred to as a "voice recognition start instruction unit") is displayed on the touch panel or installed on the steering wheel, and the uttered voice is recognized after the user presses the voice recognition start instruction unit. That is, when the voice recognition start instruction unit outputs a voice recognition start signal and the voice recognition unit receives that signal, the voice recognition unit detects the speech section corresponding to the content of the user's utterance from the voice data acquired by the voice acquisition unit after the signal is received, and performs the recognition processing described above.
- In contrast, the voice recognition unit 2 always recognizes the content of the user's utterances even without such a voice recognition start instruction by the user. That is, without receiving a voice recognition start signal, the voice recognition unit 2 detects the speech section corresponding to the content of the user's utterance from the voice data acquired by the voice acquisition unit 1, extracts the feature amount of the voice data of that speech section, performs the recognition process using the recognition dictionary based on the feature amount, and repeats the process of outputting the character string of the voice recognition result. The same applies to the following embodiments.
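- The always-on behavior described above can be pictured as a loop that keeps pulling audio from the voice acquisition unit, detects speech sections, and emits recognition result strings without waiting for a start trigger. The following is a minimal Python sketch of that loop; `audio_source`, `recognizer.detect_segment`, and `recognizer.transcribe` are hypothetical stand-ins, since the patent does not define a concrete programming interface.

```python
import queue
import threading

def continuous_recognition(audio_source, recognizer, results: queue.Queue, stop_event: threading.Event):
    """Always-on recognition loop of the voice recognition unit 2.

    audio_source.read(), recognizer.detect_segment() and recognizer.transcribe()
    are hypothetical stand-ins for the voice acquisition unit 1 and the speech
    section detection / recognition processing; no start trigger is awaited.
    """
    buffered_pcm = []
    while not stop_event.is_set():
        buffered_pcm.extend(audio_source.read())            # keep acquiring PCM voice data
        segment = recognizer.detect_segment(buffered_pcm)   # detect a speech section, if any
        if segment is None:
            continue
        results.put(recognizer.transcribe(segment))         # output the recognition result string
        buffered_pcm.clear()                                # repeat for the next utterance
```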
- the spot name storage unit 3 stores names of places and facilities that are expected as spot names.
- FIG. 2 is a diagram illustrating an example of the point name storage unit 3.
- the point name storage unit 3 stores, for example, names of facilities such as “Kiyomizu Temple”, “Kinkakuji”, and “Kyoto Station”, and place names such as “Sanjo Kawaramachi” and “Shijo Kawaramachi” as point names.
- The route setting expression storage unit 4 stores, as route setting expressions, those expressions among the words usually uttered by the user that relate to route setting operations.
- FIG. 3 is a diagram illustrating an example of the route setting expression storage unit 4.
- The route setting expression storage unit 4 stores, as route setting expressions, operation expressions such as "go", "want to go", "let's go", "stop by", "take a break", and "stop".
- the keyword extraction unit 5 performs morphological analysis with reference to the point name storage unit 3 and the route setting expression storage unit 4, and extracts the point name and the route setting expression from the character string of the voice recognition result of the voice recognition unit 2.
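- As a rough illustration of what the keyword extraction unit 5 does, the sketch below pairs each stored point name found in a recognition result with the nearest stored route setting expression. This is a simplification: the patent performs morphological analysis over Japanese text with reference to units 3 and 4, whereas the sketch uses plain substring matching and English stand-ins for the stored vocabulary.

```python
# English stand-ins for the Japanese vocabulary stored in units 3 and 4.
POINT_NAMES = {"Kiyomizu-dera", "Kinkakuji", "Kyoto Station", "Sanjo Kawaramachi", "Shijo Kawaramachi"}
ROUTE_SETTING_EXPRESSIONS = {"go", "want to go", "let's go", "stop by", "take a break", "stop"}

def extract_keywords(recognition_result: str):
    """Pair each stored point name found in the recognition result with the nearest
    stored route setting expression (longer expressions win ties). The real unit 5
    uses morphological analysis; substring matching is a simplification."""
    def occurrences(vocabulary):
        found = []
        for word in vocabulary:
            index = recognition_result.find(word)   # first occurrence only, for brevity
            if index >= 0:
                found.append((index, word))
        return sorted(found)

    points = occurrences(POINT_NAMES)
    expressions = occurrences(ROUTE_SETTING_EXPRESSIONS)
    pairs = []
    for point_index, point_name in points:
        if not expressions:
            break
        _, expression = min(expressions,
                            key=lambda e: (abs(e[0] - point_index), -len(e[1])))
        pairs.append((point_name, expression))
    return pairs

print(extract_keywords("stop by Kiyomizu-dera, then let's go to Kyoto Station"))
# [('Kiyomizu-dera', 'stop by'), ('Kyoto Station', 'go')]
```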
- The route setting operation storage unit 6 stores route setting expressions in association with the route setting operations corresponding to those expressions.
- FIG. 4 is a diagram illustrating an example of the route setting operation storage unit 6. As shown in this figure, route setting expressions such as "go", "want to go", and "let's go", which are uttered about a final destination, are stored in association with the route setting operation "set as destination". Route setting expressions such as "stop by" and "take a break", which are uttered about an intermediate waypoint, are stored in association with the route setting operation "set as waypoint". Further, a route setting expression such as "stop", which is uttered when the route is to be cancelled, is stored in association with the route setting operation "delete route".
- the route setting operation acquisition unit 7 searches the route setting operation storage unit 6 using the route setting expression extracted by the keyword extraction unit 5 as a search key, and acquires a route setting operation corresponding to the route setting expression that matches the search key. To do.
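- The route setting operation storage unit 6 of FIG. 4 is essentially a lookup table from route setting expressions to route setting operations, and the route setting operation acquisition unit 7 is a key search over it. A minimal sketch, again with English stand-ins for the stored Japanese expressions:

```python
# Correspondence between route setting expressions and route setting operations
# (English stand-ins for the entries of FIG. 4).
ROUTE_SETTING_OPERATIONS = {
    "go": "set as destination",
    "want to go": "set as destination",
    "let's go": "set as destination",
    "stop by": "set as waypoint",
    "take a break": "set as waypoint",
    "stop": "delete route",
}

def acquire_route_setting_operation(route_setting_expression: str):
    """Route setting operation acquisition unit 7: look up the operation for an
    extracted expression; None means no matching expression is stored."""
    return ROUTE_SETTING_OPERATIONS.get(route_setting_expression)

assert acquire_route_setting_operation("stop by") == "set as waypoint"
assert acquire_route_setting_operation("let's go") == "set as destination"
```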
- the map data storage unit 8 stores map data such as road data, intersection data, and facility data.
- the map data storage unit 8 may be, for example, a storage medium such as a DVD-ROM, a hard disk, or an SD card.
- The map data storage unit 8 may also exist on a network, in which case information such as road data is acquired via a communication network (a map data acquisition unit).
- the own vehicle position acquisition unit (position acquisition unit) 9 acquires the current position (latitude and longitude) of the own vehicle (moving body) using information acquired from a GPS receiver, a gyroscope, or the like.
- The route determination unit 10 acquires, with reference to the map data stored in the map data storage unit 8, the position information (latitude and longitude) of the point specified from the point name extracted by the keyword extraction unit 5, and searches for and sets a route to that point based on the acquired position information, the position (latitude and longitude) of the own vehicle (moving body) acquired by the own vehicle position acquisition unit (position acquisition unit) 9, and the route setting operation acquired by the route setting operation acquisition unit 7.
- FIG. 5 is a flowchart showing the operation of the navigation device according to the first embodiment.
- the voice acquisition unit 1 acquires the input voice, performs A / D conversion, and acquires it as, for example, PCM format voice data (step ST01).
- the voice recognition unit 2 recognizes the voice data acquired by the voice acquisition unit 1 (step ST02).
- The keyword extraction unit 5 extracts the point name and the route setting expression for that point name from the recognition result of the voice recognition unit 2 with reference to the point name storage unit 3 and the route setting expression storage unit 4 (step ST03).
- If a route setting expression has been extracted (YES in step ST04), the route setting operation acquisition unit 7 searches the route setting operation storage unit 6 using the route setting expression extracted by the keyword extraction unit 5 as a search key, finds the route setting expression that matches the search key, and acquires the corresponding route setting operation (step ST05).
- Then, the route determination unit 10 acquires, with reference to the map data storage unit 8, the position of the point name extracted by the keyword extraction unit 5, searches for a route based on the acquired position information, the route setting operation acquired in step ST05, and the current position of the own vehicle (moving body) acquired by the own vehicle position acquisition unit (position acquisition unit) 9, and sets the searched route (step ST06). On the other hand, if no route setting expression has been extracted (NO in step ST04), the process ends.
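- Putting the steps of FIG. 5 together, one pass of the Embodiment 1 flow might look like the following sketch. It reuses `extract_keywords` and `acquire_route_setting_operation` from the sketches above; `recognizer`, `geocode`, and `search_route` are hypothetical stand-ins for the voice recognition unit, the map data lookup, and the route search of the route determination unit 10.

```python
def process_utterance(pcm_data, recognizer, geocode, search_route, vehicle_position):
    """One pass of the Embodiment 1 flow (steps ST01 to ST06 of FIG. 5).

    recognizer, geocode and search_route are hypothetical stand-ins for the voice
    recognition unit 2, the map data lookup of unit 8, and the route search of the
    route determination unit 10. Route deletion ("delete route") is omitted.
    """
    text = recognizer.transcribe(pcm_data)                         # ST02: recognize the acquired voice data
    pairs = extract_keywords(text)                                 # ST03: point names + route setting expressions
    if not pairs:                                                  # ST04: nothing extracted -> end
        return None

    waypoints, destination = [], None
    for point_name, expression in pairs:
        operation = acquire_route_setting_operation(expression)    # ST05: look up the operation
        position = geocode(point_name)                             # position (latitude, longitude) from map data
        if operation == "set as waypoint":
            waypoints.append(position)
        elif operation == "set as destination":
            destination = position

    if destination is None:
        return None
    return search_route(vehicle_position, waypoints, destination)  # ST06: search for and set the route
```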
- For example, assume that the user utters "Let's stop by Kiyomizu-dera and then go to Kyoto Station". The voice acquisition unit 1 acquires the voice data (step ST01), and the voice recognition unit 2 obtains the recognition result "Let's stop by Kiyomizu-dera and then go to Kyoto Station" (step ST02).
- The keyword extraction unit 5 then refers to the point name storage unit 3 as shown in FIG. 2 and the route setting expression storage unit 4 as shown in FIG. 3, and extracts "Kiyomizu-dera" as a point name with "stop by" as the corresponding route setting expression, and "Kyoto Station" as a point name with "let's go" as the corresponding route setting expression (step ST03).
- Although the point names are extracted here with reference to the point name storage unit 3 shown in FIG. 2, the point names may instead be extracted by using facility and address information from the map data stored in the map data storage unit 8, or by using a known morphological analysis method or a known meaning understanding method.
- Next, the route setting operation acquisition unit 7 searches the route setting operation storage unit 6 as shown in FIG. 4 using the route setting expressions "stop by" and "let's go" as search keys, and acquires the route setting operation "set as waypoint" corresponding to "stop by" and the route setting operation "set as destination" corresponding to "let's go" (step ST05).
- Then, the route determination unit 10 refers to the map data storage unit 8 to acquire the positions of the point names "Kiyomizu-dera" and "Kyoto Station", and, based on the position information of "Kiyomizu-dera" together with the route setting operation "set as waypoint" obtained from the corresponding expression "stop by", the position information of "Kyoto Station" together with the route setting operation "set as destination" obtained from the corresponding expression "let's go", and the position of the own vehicle (moving body), searches for a route to "Kyoto Station" with "Kiyomizu-dera" as a waypoint and sets the searched route (step ST06).
- FIG. 6 shows the information set by the route determination unit 10 through the above processing: a route in which the point name "Kiyomizu-dera" is set as a waypoint and the point name "Kyoto Station" is set as the destination.
- As described above, according to Embodiment 1, the point information and the route setting method are specified from the recognition result obtained by always recognizing the content of the user's utterances, and the route is automatically set based on the point information and the route setting method. The route can therefore be set in real time, and a situation in which the desired route cannot be set because the utterance content has since been updated can be prevented. In addition, the route can be set as expected without requiring a manual route setting operation by the user, which improves usability. Furthermore, since voice acquisition and voice recognition are always performed while the navigation device is active, even when the user is not conscious of them, no manual operation or explicit input by the user is required to start voice acquisition or voice recognition.
- Embodiment 2. FIG. 7 is a block diagram showing an example of a navigation device according to Embodiment 2 of the present invention. The same components as those described in Embodiment 1 are denoted by the same reference numerals, and redundant description is omitted.
- In Embodiment 2 described below, a point specifying unit 11 is further provided compared with Embodiment 1. When there are a plurality of points specified by one point name spoken by the user (that is, a plurality of points with the same name but different positions), the point closest to the current location is selected and a route to the selected point is set.
- When there are a plurality of points specified by the point name extracted by the keyword extraction unit 5, the point specifying unit 11 specifies the point closest to the current position of the own vehicle (moving body) and outputs its position information.
- The route determination unit 10 acquires, with reference to the map data stored in the map data storage unit 8, the position information (latitude and longitude) of the point specified by the point specifying unit 11, and, based on the acquired position information, the position (latitude and longitude) of the own vehicle (moving body) acquired by the own vehicle position acquisition unit (position acquisition unit) 9, and the route setting operation acquired by the route setting operation acquisition unit 7, searches for a route to the point specified from the point name extracted by the keyword extraction unit 5 and sets the searched route.
- FIG. 8 is a flowchart showing the operation of the navigation device according to the second embodiment.
- The processing from steps ST11 to ST15 is the same as steps ST01 to ST05 in the flowchart of FIG. 5 for Embodiment 1.
- Next, the point specifying unit 11 refers to the map data stored in the map data storage unit 8 and determines whether there are a plurality of points specified by each point name extracted by the keyword extraction unit 5 (step ST16). If there are a plurality of such points (YES in step ST16), the point closest to the current position of the own vehicle (moving body) is specified and its position information is output (step ST17). If there is only one such point (NO in step ST16), that point is specified and its position information is output (step ST18).
- The route determination unit 10 then searches for a route based on the position information output in step ST17 or step ST18, the route setting operation acquired in step ST15, and the current position of the own vehicle (moving body) acquired by the own vehicle position acquisition unit (position acquisition unit) 9, and sets the searched route (step ST19).
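- The point specifying unit 11 of this embodiment reduces to choosing, among same-name candidates, the one nearest to the current vehicle position. A minimal sketch follows; the patent does not specify the distance metric, so straight-line (great-circle) distance is assumed, and the coordinates in the usage example are illustrative only.

```python
import math

def haversine_km(a, b):
    """Approximate great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def specify_point_by_distance(candidates, vehicle_position):
    """Steps ST16/ST17: among same-name candidates, pick the one nearest to the
    current vehicle position. candidates is a list of (lat, lon) points."""
    return min(candidates, key=lambda pos: haversine_km(pos, vehicle_position))

# Two temples named "Kiyomizu-dera" (coordinates illustrative only); with the
# vehicle in Kyoto city, the Kyoto candidate is chosen.
kiyomizu_kyoto, kiyomizu_hyogo = (34.995, 135.785), (35.000, 135.000)
print(specify_point_by_distance([kiyomizu_kyoto, kiyomizu_hyogo], vehicle_position=(35.01, 135.76)))
```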
- For example, when the user utters "Let's stop by Kiyomizu-dera and then go to Kyoto Station", the voice acquisition unit 1 acquires the voice data (step ST11), and the voice recognition unit 2 obtains the recognition result "Let's stop by Kiyomizu-dera and then go to Kyoto Station" (step ST12).
- The keyword extraction unit 5 then refers to the point name storage unit 3 as shown in FIG. 2 and the route setting expression storage unit 4 as shown in FIG. 3, and extracts "Kiyomizu-dera" as a point name with "stop by" as the corresponding route setting expression, and "Kyoto Station" as a point name with "let's go" as the corresponding route setting expression (step ST13).
- Next, the route setting operation acquisition unit 7 searches the route setting operation storage unit 6 as shown in FIG. 4 using the route setting expressions "stop by" and "let's go" as search keys, and acquires the route setting operation "set as waypoint" corresponding to "stop by" and the route setting operation "set as destination" corresponding to "let's go" (step ST15).
- the point specifying unit 11 determines whether or not there are a plurality of points specified by the one point name for each of the point names “Kiyomizu-dera” and “Kyoto Station” (step ST16). First, referring to the map data for the location name “Kiyomizu-dera”, it can be seen that there are temples with the name “Kiyomizu-dera” in Kyoto and Hyogo Prefectures.
- Since there are thus two points specified by the single point name "Kiyomizu-dera" (YES in step ST16), the distance from the current location to Kiyomizu-dera in Kyoto Prefecture and the distance from the current location to Kiyomizu-dera in Hyogo Prefecture are compared in order to specify one of them. For example, if the current location is in Kyoto city, Kiyomizu-dera in Kyoto Prefecture is the closer of the two, so it is specified and its position information is output (step ST17). For the point name "Kyoto Station", on the other hand, the map data shows only one point specified by this name (NO in step ST16), so that point is specified and its position information is output (step ST18).
- Then, the route determination unit 10 searches for a route to "Kyoto Station" with Kiyomizu-dera in Kyoto Prefecture as a waypoint, based on the position information of "Kiyomizu-dera" output in step ST17 and the route setting operation "set as waypoint", the position information of "Kyoto Station" output in step ST18 and the route setting operation "set as destination", and the position of the own vehicle (moving body), and sets the searched route (step ST19).
- the user may be able to set whether or not to use the point specifying function in the second embodiment.
- As described above, according to Embodiment 2, in addition to the effects of Embodiment 1, even when there are a plurality of points specified from one point name spoken by the user, the point most likely to be the intended destination or waypoint is selected and the route is set, so that a route deviating from the user's intention can be prevented from being set.
- FIG. 9 is a block diagram showing an example of a navigation apparatus according to Embodiment 3 of the present invention. Note that the same components as those described in the first and second embodiments are denoted by the same reference numerals, and redundant description is omitted.
- In Embodiment 3 described below, the key input unit 12, which was not shown in Embodiments 1 and 2, is shown in addition to the configuration of Embodiment 2. When there are a plurality of points specified by one point name spoken by the user (a plurality of points with the same name but different positions), the plurality of points are presented to the user, and when the user selects one of them, a route to the selected point is set.
- the key input unit 12 is an input unit such as a keyboard, a button, a mouse, and a touch panel that can be manually operated by a user. When there are a plurality of points specified by the point names extracted by the keyword extraction unit 5, the user can select which of the plurality of point information is to be adopted by the key input unit 12.
- FIG. 10 is a flowchart showing the operation of the navigation device according to the third embodiment.
- The processing from steps ST21 to ST26 is the same as steps ST11 to ST16 in the flowchart of FIG. 8 for Embodiment 2.
- Next, the point specifying unit 11 refers to the map data stored in the map data storage unit 8 and determines whether there are a plurality of points specified by each extracted point name (step ST26). If there are a plurality of such points (YES in step ST26), the plurality of pieces of point information are presented to the user, for example as a list or on a map display (step ST27). When the user selects one of the presented points using the key input unit 12, the position information of the selected point is output (step ST28). If there is only one such point (NO in step ST26), the position information of the point specified from the point name is acquired from, for example, the map data and output (step ST29).
- The route determination unit 10 then searches for a route based on the position information output in step ST28 or step ST29, the route setting operation acquired in step ST25, and the current position of the own vehicle (moving body) acquired by the own vehicle position acquisition unit (position acquisition unit) 9, and sets the searched route (step ST30).
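- In contrast to Embodiment 2, the choice among same-name candidates is deferred to the user here. The sketch below stands in for steps ST27 and ST28 with a console prompt; in the actual device the candidates would be presented through the presentation control output unit 20 and the selection made through the key input unit 12 or by voice.

```python
def specify_point_by_user_choice(candidates):
    """Steps ST27/ST28: present same-name candidates and let the user pick one.

    candidates is a list of (label, (lat, lon)) tuples. The console print/input
    below stands in for the presentation control output unit 20 and the key input
    unit 12 (or voice input) of the actual device.
    """
    for number, (label, _) in enumerate(candidates, start=1):
        print(f"{number}: {label}")
    choice = int(input("Select a point number: "))
    return candidates[choice - 1][1]   # output the position information of the selected point
```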
- For example, when the user utters "Let's stop by Kiyomizu-dera and then go to Kyoto Station", the voice acquisition unit 1 acquires the voice data (step ST21), and the voice recognition unit 2 obtains the recognition result (step ST22).
- The keyword extraction unit 5 then refers to the point name storage unit 3 as shown in FIG. 2 and the route setting expression storage unit 4 as shown in FIG. 3, and extracts "Kiyomizu-dera" as a point name with "stop by" as the corresponding route setting expression, and "Kyoto Station" as a point name with "let's go" as the corresponding route setting expression (step ST23).
- Next, the route setting operation acquisition unit 7 searches the route setting operation storage unit 6 as shown in FIG. 4 using the route setting expressions "stop by" and "let's go" as search keys, and acquires the route setting operation "set as waypoint" corresponding to "stop by" and the route setting operation "set as destination" corresponding to "let's go" (step ST25).
- the point specifying unit 11 determines whether or not there are a plurality of points specified by the one point name for each of the point names “Kiyomizu-dera” and “Kyoto Station” (step ST26).
- Referring to the map data for the point name "Kiyomizu-dera", there are temples with this name in Kyoto and Hyogo Prefectures, so there are two points specified by this single point name (YES in step ST26). Therefore, for example, "Kiyomizu-dera (Kyoto Prefecture)" and "Kiyomizu-dera (Hyogo Prefecture)" are displayed as a list, or a map showing the location of each is displayed (step ST27).
- When the user selects one of the presented candidates, the point specifying unit 11 specifies the selected point and outputs its position information (step ST28).
- For the point name "Kyoto Station", since there is only one point specified by this name (NO in step ST26), the point specifying unit 11 specifies that point and outputs its position information (step ST29).
- Then, the route determination unit 10 searches for a route to "Kyoto Station" with Kiyomizu-dera in Kyoto Prefecture as a waypoint, based on the position information of "Kiyomizu-dera" output in step ST28 and the route setting operation "set as waypoint", the position information of "Kyoto Station" output in step ST29 and the route setting operation "set as destination", and the position of the own vehicle (moving body), and sets the searched route (step ST30).
- In this example the plurality of points are presented on a display or the like, but any presentation method may be used, for example voice output. Needless to say, the selection method is also not limited to the key input unit 12; for example, when the points are presented by voice output, the user may likewise make the selection by voice input.
- the user may be able to set whether or not to use the point specifying function in the third embodiment.
- As described above, according to Embodiment 3, in addition to the effects of Embodiment 1, even when there are a plurality of points specified from one point name spoken by the user, the route is set using the point selected by the user as the destination or waypoint, so that a route deviating from the user's intention can be prevented from being set.
- FIG. 11 is a block diagram showing an example of a navigation device according to Embodiment 4 of the present invention. Note that the same components as those described in the first to third embodiments are denoted by the same reference numerals, and redundant description is omitted.
- In Embodiment 4 described below, a name recognition degree storage unit 13 is further provided compared with Embodiment 1. When there are a plurality of points specified by a single point name spoken by the user (a plurality of points with the same name but different positions), the point with the highest degree of name recognition is selected and a route to the selected point is set.
- The name recognition degree storage unit 13 stores, for example, position information of facilities such as shrines and parks together with their degree of name recognition. The degree of name recognition is determined from data acquired by some method, for example the number of annual visitors or questionnaire results.
- FIG. 12 is a flowchart showing the operation of the navigation device according to Embodiment 4.
- The processing from steps ST31 to ST36 is the same as steps ST11 to ST16 in the flowchart of FIG. 8 for Embodiment 2.
- If there are a plurality of points specified by the extracted point name (YES in step ST36), the point specifying unit 11 further refers to the name recognition degree storage unit 13 to obtain the degree of name recognition of each of the plurality of points, specifies the point with the highest degree of name recognition, and outputs its position information (step ST37).
- If there is only one such point (NO in step ST36), the position information of the point specified from the point name is acquired from, for example, the map data and output (step ST38).
- The route determination unit 10 then searches for a route based on the position information output in step ST37 or step ST38, the route setting operation acquired in step ST35, and the current position of the own vehicle (moving body) acquired by the own vehicle position acquisition unit (position acquisition unit) 9, and sets the searched route (step ST39).
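- The selection rule of this embodiment can be sketched as a lookup of a stored degree of name recognition followed by taking the maximum. The values and the keying by (name, prefecture) below are illustrative assumptions; the patent only states that the degree is derived from data such as annual visitor counts or questionnaire results.

```python
# Degree of name recognition per point, e.g. derived from annual visitor counts
# or questionnaire results; the keys and values below are purely illustrative.
NAME_RECOGNITION = {
    ("Kiyomizu-dera", "Kyoto"): 0.95,
    ("Kiyomizu-dera", "Hyogo"): 0.30,
}

def specify_point_by_recognition(candidates):
    """Step ST37: pick the same-name candidate with the highest degree of name
    recognition. candidates is a list of (name, prefecture, (lat, lon)) tuples;
    points that are not stored default to a degree of 0."""
    best = max(candidates, key=lambda c: NAME_RECOGNITION.get((c[0], c[1]), 0.0))
    return best[2]   # output the position information of the specified point

print(specify_point_by_recognition([
    ("Kiyomizu-dera", "Kyoto", (34.995, 135.785)),
    ("Kiyomizu-dera", "Hyogo", (35.000, 135.000)),
]))   # -> the Kyoto candidate
```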
- For example, when the user utters "Let's stop by Kiyomizu-dera and then go to Kyoto Station", the voice acquisition unit 1 acquires the voice data (step ST31), and the voice recognition unit 2 obtains the recognition result (step ST32).
- The keyword extraction unit 5 then refers to the point name storage unit 3 as shown in FIG. 2 and the route setting expression storage unit 4 as shown in FIG. 3, and extracts "Kiyomizu-dera" as a point name with "stop by" as the corresponding route setting expression, and "Kyoto Station" as a point name with "let's go" as the corresponding route setting expression (step ST33).
- Next, the route setting operation acquisition unit 7 searches the route setting operation storage unit 6 as shown in FIG. 4 using the route setting expressions "stop by" and "let's go" as search keys, and acquires the route setting operation "set as waypoint" corresponding to "stop by" and the route setting operation "set as destination" corresponding to "let's go" (step ST35).
- the point specifying unit 11 determines whether or not there are a plurality of points specified by the one point name for each of the point names “Kiyomizu-dera” and “Kyoto Station” (step ST36).
- Referring to the map data for the point name "Kiyomizu-dera", there are temples with this name in Kyoto and Hyogo Prefectures. That is, since there are two points specified by the single point name "Kiyomizu-dera" (YES in step ST36), the point specifying unit 11 refers to the name recognition degree storage unit 13 and obtains the degree of name recognition of each point in order to specify one of them.
- For example, Kiyomizu-dera in Kyoto Prefecture is specified as the better-known of the two, and its position information is output (step ST37). Further, referring to the map data for the point name "Kyoto Station", since only one point is specified by this name (NO in step ST36), that point is specified and its position information is output (step ST38).
- Then, the route determination unit 10 searches for a route to "Kyoto Station" with Kiyomizu-dera in Kyoto Prefecture as a waypoint, based on the position information of "Kiyomizu-dera" output in step ST37 and the route setting operation "set as waypoint", the position information of "Kyoto Station" output in step ST38 and the route setting operation "set as destination", and the position of the own vehicle (moving body), and sets the searched route (step ST39).
- the user may be able to set whether or not to use the point specifying function in the fourth embodiment.
- As described above, according to Embodiment 4, in addition to the effects of Embodiment 1, even when there are a plurality of points specified from one point name spoken by the user, the point most likely to be the intended destination or waypoint is selected on the basis of its degree of name recognition and the route is set, so that a route deviating from the user's intention can be prevented from being set.
- Embodiment 5. Since the block diagram showing an example of the navigation device according to Embodiment 5 of the present invention is the same as the block diagram shown in FIG. 1 for Embodiment 1, its illustration and description are omitted. In Embodiment 5 described below, compared with Embodiment 1, when the route determination unit 10 searches for a route, the searched route is not set if the total distance of the route exceeds a predetermined distance.
- FIG. 13 is a flowchart showing the operation of the navigation device according to the fifth embodiment.
- The processing from steps ST41 to ST45 is the same as steps ST01 to ST05 in the flowchart of FIG. 5 for Embodiment 1.
- Next, the route determination unit 10 acquires, with reference to the map data storage unit 8, the position of the point name extracted by the keyword extraction unit 5, searches for a route based on the acquired position information, the route setting operation acquired in step ST45, and the current position of the own vehicle (moving body) acquired by the own vehicle position acquisition unit (position acquisition unit) 9 (step ST46), and then determines the total distance of the searched route.
- For the route searched in step ST46, it is determined whether the total distance of the route is equal to or less than a predetermined threshold (step ST47). If it is equal to or less than the threshold (YES in step ST47), the route is set (step ST48). On the other hand, if the total distance of the route is larger than the threshold (NO in step ST47), the process ends without setting the route.
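- The check in steps ST47 and ST48 amounts to a simple comparison against a configurable maximum distance, as in the following sketch; `total_distance_km` and `set_route` are assumed names, and the 5 km value merely echoes the example in the text.

```python
MAX_ROUTE_DISTANCE_KM = 5.0   # predetermined threshold; 5 km echoes the example in the text

def set_route_if_short_enough(searched_route, set_route) -> bool:
    """Steps ST47/ST48: set the searched route only when its total distance is
    within the threshold. searched_route.total_distance_km and set_route are
    assumed names standing in for the route determination unit's internals."""
    if searched_route.total_distance_km <= MAX_ROUTE_DISTANCE_KM:
        set_route(searched_route)        # ST48: set the route
        return True
    return False                         # NO in ST47: total distance too large, do not set
```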
- For example, when the user utters "Let's stop by Kiyomizu-dera and then go to Kyoto Station", the voice acquisition unit 1 acquires the voice data (step ST41), and the voice recognition unit 2 obtains the recognition result (step ST42).
- The keyword extraction unit 5 then refers to the point name storage unit 3 as shown in FIG. 2 and the route setting expression storage unit 4 as shown in FIG. 3, and extracts "Kiyomizu-dera" as a point name with "stop by" as the corresponding route setting expression, and "Kyoto Station" as a point name with "let's go" as the corresponding route setting expression (step ST43).
- Next, the route setting operation acquisition unit 7 searches the route setting operation storage unit 6 as shown in FIG. 4 using the route setting expressions "stop by" and "let's go" as search keys, and acquires the route setting operation "set as waypoint" corresponding to "stop by" and the route setting operation "set as destination" corresponding to "let's go" (step ST45).
- Then, the route determination unit 10 refers to the map data storage unit 8 to acquire the positions of the point names "Kiyomizu-dera" and "Kyoto Station", and searches for a route to "Kyoto Station" as the destination with "Kiyomizu-dera" as a waypoint, based on the position information of "Kiyomizu-dera" and the route setting operation "set as waypoint", the position information of "Kyoto Station" and the route setting operation "set as destination", and the position of the own vehicle (moving body) (step ST46). Further, the route determination unit 10 calculates the total distance of the route based on the map data and the current location.
- For example, if the calculated total distance is 7 km and the predetermined threshold (maximum route distance) is 5 km, this route is not set (NO in step ST47). Conversely, if the predetermined threshold is 10 km, the route searched in step ST46 (the route with a total distance of 7 km) is set (step ST48).
- the user may be allowed to set whether or not to use the function of comparing the total distances of the routes in the fifth embodiment.
- Although Embodiment 5 has been described based on Embodiment 1, as in Embodiments 2 to 4, the searched route may likewise be left unset depending on its total distance.
- FIG. 14 is a block diagram showing an example of a navigation apparatus according to Embodiment 6 of the present invention. Note that the same components as those described in the first to fifth embodiments are denoted by the same reference numerals, and redundant description is omitted.
- the time acquisition unit 14 not shown in the first embodiment is shown.
- In Embodiment 6, when the route determination unit 10 searches for a route, the searched route is not set if the scheduled time of arrival at the destination by that route (estimated arrival time) is later than a predetermined time, or if the time required to arrive at the destination by that route exceeds a predetermined time.
- The time acquisition unit 14 acquires the current time by known means.
- FIG. 15 is a flowchart showing the operation of the navigation device according to the sixth embodiment.
- The processing from steps ST51 to ST56 is the same as steps ST41 to ST46 in the flowchart of FIG. 13 for Embodiment 5.
- For the route searched in step ST56, it is determined whether the scheduled time of arrival at the destination by the route (estimated arrival time) is before a predetermined threshold, or whether the time required to arrive at the destination by the route is equal to or less than a predetermined threshold (step ST57). If so (YES in step ST57), the route is set (step ST58). On the other hand, if the estimated arrival time is after the predetermined threshold, or the required time to the destination is greater than the predetermined threshold (NO in step ST57), the process ends without setting the route.
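- The decision in step ST57 can be sketched as a comparison of either the estimated arrival time (current time plus required time) against a deadline or the required time against a limit, whichever criterion is configured. The threshold values below are illustrative.

```python
from datetime import datetime, timedelta

ARRIVAL_DEADLINE = datetime(2012, 6, 1, 18, 0)   # predetermined threshold (illustrative date/time)
MAX_REQUIRED_TIME = timedelta(hours=1)           # alternative threshold on the required time

def should_set_route(required_time: timedelta, now: datetime, use_arrival_time: bool = True) -> bool:
    """Step ST57: allow the route only if the estimated arrival time (now + required
    time) is not later than the deadline, or, alternatively, if the required time
    does not exceed the limit."""
    if use_arrival_time:
        return now + required_time <= ARRIVAL_DEADLINE
    return required_time <= MAX_REQUIRED_TIME

# The text's example: arrival at 18:10 against an 18:00 deadline -> the route is not set.
print(should_set_route(timedelta(minutes=40), datetime(2012, 6, 1, 17, 30)))   # False
```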
- For example, when the user utters "Let's stop by Kiyomizu-dera and then go to Kyoto Station", the voice acquisition unit 1 acquires the voice data (step ST51), and the voice recognition unit 2 obtains the recognition result (step ST52).
- The keyword extraction unit 5 then refers to the point name storage unit 3 as shown in FIG. 2 and the route setting expression storage unit 4 as shown in FIG. 3, and extracts "Kiyomizu-dera" as a point name with "stop by" as the corresponding route setting expression, and "Kyoto Station" as a point name with "let's go" as the corresponding route setting expression (step ST53).
- Next, the route setting operation acquisition unit 7 searches the route setting operation storage unit 6 as shown in FIG. 4 using the route setting expressions "stop by" and "let's go" as search keys, and acquires the route setting operation "set as waypoint" corresponding to "stop by" and the route setting operation "set as destination" corresponding to "let's go" (step ST55).
- Then, the route determination unit 10 refers to the map data storage unit 8 to acquire the positions of the point names "Kiyomizu-dera" and "Kyoto Station", and searches for a route to "Kyoto Station" as the destination with "Kiyomizu-dera" as a waypoint, based on the position information of "Kiyomizu-dera" and the route setting operation "set as waypoint", the position information of "Kyoto Station" and the route setting operation "set as destination", and the position of the own vehicle (moving body) (step ST56). Further, the route determination unit 10 calculates the scheduled time of arrival (estimated arrival time) at the destination, Kyoto Station, by that route, based on the time acquired by the time acquisition unit 14, the map data, the current location, and the like.
- For example, if the calculated estimated arrival time at Kyoto Station, the destination, via Kiyomizu-dera in Kyoto Prefecture is 18:10 and the predetermined threshold is 18:00, this route is not set (NO in step ST57). On the other hand, if the predetermined threshold is 18:30, the route searched in step ST56 is set (step ST58).
- the time to be compared in step ST57 may be the time required to reach the destination, not the estimated arrival time.
- In that case, the route determination unit 10 searches for a route with "Kyoto Station" as the destination via "Kiyomizu-dera", and then calculates the time required to arrive at the destination, Kyoto Station, by that route based on the map data and the current location. For example, if the calculated required time from the current location to arrival at Kyoto Station via Kiyomizu-dera in Kyoto Prefecture is 1 hour 30 minutes and the predetermined threshold is 1 hour, this route is not set (NO in step ST57). Conversely, if the predetermined threshold is 2 hours, the route searched in step ST56 is set (step ST58).
- the user may be allowed to set whether or not to use the function of comparing the estimated arrival time or required time to the destination of the route in the sixth embodiment.
- Although Embodiment 6 has been described based on Embodiment 1, as in Embodiments 2 to 4, after a route is searched, the route may likewise be left unset depending on the estimated arrival time at the destination or the required time of the searched route.
- Embodiment 7. The block diagram showing an example of the navigation device according to Embodiment 7 of the present invention is the same as the block diagram shown in FIG. 1 for Embodiment 1, so its illustration and description are omitted. In Embodiment 7 described below, compared with Embodiment 1, when a route has already been set, the point specified by the point name is added as a waypoint regardless of the content of the route setting operation, and the route is searched again.
- FIG. 16 is a flowchart showing the operation of the navigation device according to the seventh embodiment.
- The processing from steps ST61 to ST65 is the same as steps ST01 to ST05 in the flowchart of FIG. 5 for Embodiment 1.
- the route determination unit 10 determines whether or not a route has already been set (step ST66).
- If a route has already been set (YES in step ST66), the position of the point name extracted by the keyword extraction unit 5 is acquired with reference to, for example, the map data, the acquired position is added as a waypoint regardless of the route setting operation, and the route is searched again and set (step ST67).
- On the other hand, if no route has been set yet (NO in step ST66), as in Embodiment 1, the route is searched based on the acquired position information, the route setting operation acquired in step ST65, and the position of the own vehicle (moving body), and the searched route is set (step ST68).
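- The branch in steps ST66 to ST68 can be sketched as follows; `current_route`, its `waypoints` and `destination` attributes, and `search_route` are assumed stand-ins for the route already held by the route determination unit 10 and its route search.

```python
def handle_extracted_point(current_route, point_position, operation,
                           vehicle_position, search_route):
    """Embodiment 7 branch (steps ST66 to ST68).

    current_route (with .waypoints and .destination) and search_route are assumed
    stand-ins for the route held by the route determination unit 10 and its search.
    """
    if current_route is not None:
        # ST66 YES / ST67: a route is already set, so add the point as a waypoint
        # regardless of the acquired route setting operation and search again.
        waypoints = list(current_route.waypoints) + [point_position]
        return search_route(vehicle_position, waypoints, current_route.destination)
    # ST66 NO / ST68: no route yet, apply the operation as in Embodiment 1.
    if operation == "set as destination":
        return search_route(vehicle_position, [], point_position)
    return current_route
```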
- For example, assume that a route with Kyoto Station as the destination via Kiyomizu-dera in Kyoto Prefecture has already been set, and that the user then utters "Go to Yasaka Shrine". The voice acquisition unit 1 acquires the voice data (step ST61), and the voice recognition unit 2 obtains the recognition result "Go to Yasaka Shrine" (step ST62).
- The keyword extraction unit 5 then refers to the point name storage unit 3 as shown in FIG. 2 and the route setting expression storage unit 4 as shown in FIG. 3, and extracts "Yasaka Shrine" as a point name and "go" as the corresponding route setting expression (step ST63).
- Next, the route setting operation acquisition unit 7 searches the route setting operation storage unit 6 as shown in FIG. 4 using the route setting expression "go" as a search key, and acquires the route setting operation "set as destination" corresponding to "go" (step ST65). Thereafter, the route determination unit 10 determines whether or not a route has already been set (step ST66).
- Since a route has already been set (YES in step ST66), the route determination unit 10 refers to the map data storage unit 8 to acquire the position of the point name "Yasaka Shrine", and, regardless of the route setting operation "set as destination" acquired in step ST65, adds the position of Yasaka Shrine as a waypoint, searches for the route again, and sets it (step ST67).
- That is, Yasaka Shrine is not set as the destination but is added as a waypoint, and a route to Kyoto Station via Kiyomizu-dera in Kyoto Prefecture and Yasaka Shrine is searched again and set.
- In this example, the point of the point name extracted by the keyword extraction unit 5 is always added as a waypoint, but a condition may instead be applied, for example adding the point as a waypoint only when its position is near the current position of the own vehicle (moving body) (for example, within a radius of 500 m).
- Note that the user may be allowed to set whether or not to use the waypoint addition function of Embodiment 7.
- Although Embodiment 7 has been described based on Embodiment 1, as in Embodiments 2 to 6, when a route has already been set, the position of the acquired point name may likewise be added as a waypoint regardless of the acquired route setting operation.
- As described above, according to Embodiment 7, in addition to the effects of Embodiments 1 to 6, the already-set destination can be prevented from being changed unnecessarily, and a place where the user wants to drop in can be efficiently added as a waypoint. In addition, no manual operation is needed to add the waypoint.
- Embodiment 8 Since the block diagram showing an example of the navigation device according to the eighth embodiment of the present invention is the same as the block diagram shown in FIG. 1 in the first embodiment, its illustration and description are omitted.
- In Embodiment 8 described below, compared with Embodiment 1, the route setting expression storage unit 4 also stores expressions representing time, and the keyword extraction unit 5 also extracts an expression representing time from the content spoken by the user. If the extracted expression is not one representing "today", that is, if it represents a time further in the future than a predetermined extent, no route is set.
- FIG. 17 is a diagram illustrating an example of the route setting expression storage unit 4 according to the eighth embodiment.
- As shown in FIG. 17, the route setting expression storage unit 4 stores operation expressions such as "want to go", "stop by", and "take a break" as route setting expressions related to route setting operations, as in FIG. 3, and also stores route setting expressions representing time, such as "today", "tomorrow", and "next time".
- FIG. 18 is a flowchart showing the operation of the navigation device according to the eighth embodiment.
- The processing from steps ST71 to ST75 is the same as steps ST01 to ST05 in the flowchart of FIG. 5 for Embodiment 1.
- Next, the route determination unit 10 determines whether the expressions extracted by the keyword extraction unit 5 in step ST73 include a route setting expression representing time that is not an expression representing "today", that is, whether an expression representing a time further in the future than the predetermined extent is included (step ST76).
- If no expression other than an expression representing "today" is included (YES in step ST76), then, as in step ST06 of the flowchart of FIG. 5 for Embodiment 1, the position of the point name extracted by the keyword extraction unit 5 is acquired with reference to the map data, the route is searched based on the acquired position information, the route setting operation acquired in step ST75, and the current position of the own vehicle (moving body) acquired by the own vehicle position acquisition unit (position acquisition unit) 9, and the searched route is set (step ST77).
- On the other hand, when an expression other than an expression representing "today" is included, such as "tomorrow", "next time", or "next month", that is, an expression representing a time further in the future than the predetermined extent (NO in step ST76), it is determined that the user is talking about a trip further in the future, and the process ends without setting a route.
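- The time-expression check of step ST76 reduces to asking whether any extracted time expression is something other than a "today"-type expression. A minimal sketch, with an illustrative English vocabulary:

```python
# "today"-type expressions; anything else (e.g. "tomorrow", "next time",
# "next month") is treated as referring to a future trip.
TODAY_EXPRESSIONS = {"today", "from now"}

def route_setting_allowed(extracted_time_expressions) -> bool:
    """Step ST76: set a route only when every extracted time expression is a
    "today"-type expression (an utterance with no time expression at all is
    treated as referring to the present)."""
    return all(expr in TODAY_EXPRESSIONS for expr in extracted_time_expressions)

assert route_setting_allowed([]) is True
assert route_setting_allowed(["today"]) is True
assert route_setting_allowed(["tomorrow"]) is False   # "I will go to Kyoto Station tomorrow"
```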
- For example, when the user utters "I will go to Kyoto Station tomorrow", the voice acquisition unit 1 acquires the voice data (step ST71), and the voice recognition unit 2 obtains the recognition result "I will go to Kyoto Station tomorrow" (step ST72).
- The keyword extraction unit 5 then refers to the point name storage unit 3 as shown in FIG. 2 and the route setting expression storage unit 4 as shown in FIG. 17, and extracts "Kyoto Station" as a point name and "tomorrow" and "go" as route setting expressions (step ST73).
- Next, the route setting operation acquisition unit 7 searches the route setting operation storage unit 6 as shown in FIG. 4 using the route setting expression "go" as a search key, and acquires the route setting operation "set as destination" corresponding to "go" (step ST75). Thereafter, the route determination unit 10 determines whether a route setting expression representing time other than an expression representing "today" is included (step ST76).
- In this case, the route setting expression representing time is "tomorrow", which is not an expression representing "today", that is, an expression representing a time further in the future than the predetermined extent (NO in step ST76), so it is determined that the user is talking about a future trip and the process ends without setting a route.
- In this embodiment the route setting expression storage unit 4 has been described as also storing the expressions representing time, but a time expression storage unit may instead be provided separately from the route setting expression storage unit 4.
- In this example the determination is based on whether an expression other than one representing "today" is included, but it may instead be based on whether an expression representing "today", such as "today" or "from now", is included.
- Note that the user may be allowed to set whether or not to use this function of determining whether an expression other than one representing "today" is included among the expressions representing time. Although Embodiment 8 has been described based on Embodiment 1, as in Embodiments 2 to 7, the route may likewise be left unset when the expression representing time is something other than an expression representing "today", that is, an expression representing a time further in the future than the predetermined extent.
- As described above, according to Embodiment 8, in addition to the effects of Embodiment 1, it is possible to prevent a route that is not desired at the present time from being set, for example when the user is merely talking about a future trip.
- Embodiment 9. FIG. 19 is a block diagram showing an example of a navigation device according to Embodiment 9 of the present invention. The same components as those described in Embodiments 1 to 8 are denoted by the same reference numerals, and redundant description is omitted.
- In Embodiment 9 described below, a route search necessity selection unit 15 is further provided compared with Embodiment 1, and the key input unit 12, which was not shown in Embodiment 1, is shown. The user can thus select whether or not to search for a route.
- The route search necessity selection unit 15 presents to the user, for example by a screen display or voice output, whether or not to search for a route based on the point name extracted by the keyword extraction unit 5 and the route setting operation for that point, and lets the user select whether or not the route search is necessary.
- FIG. 20 is a flowchart showing the operation of the navigation device according to the ninth embodiment.
- The processing in steps ST81 to ST85 is the same as steps ST01 to ST05 in the flowchart of FIG. 5 for Embodiment 1.
- The route search necessity selection unit 15 then confirms with the user whether or not to search for a route for the point name extracted by the keyword extraction unit 5 and the corresponding route setting operation (step ST86).
- As a confirmation method, for example, a table associating the point name and the route setting operation as shown in FIG. 6 may be displayed together with buttons for selecting whether the route search is necessary, or the confirmation may be presented to the user by voice.
- If the user selects "Yes" (YES in step ST87), then, as in step ST06 of the flowchart of FIG. 5 for Embodiment 1, the position of the point name extracted by the keyword extraction unit 5 is acquired with reference to the map data, the route is searched based on the acquired position information, the route setting operation acquired in step ST85, and the current position of the own vehicle (moving body) acquired by the own vehicle position acquisition unit (position acquisition unit) 9, and the searched route is set (step ST88).
- On the other hand, if the user selects "No" (NO in step ST87), the process ends as it is.
- For example, when the user utters "Let's stop by Kiyomizu-dera and then go to Kyoto Station", the voice acquisition unit 1 acquires the voice data (step ST81), and the voice recognition unit 2 obtains the recognition result (step ST82).
- The keyword extraction unit 5 then refers to the point name storage unit 3 as shown in FIG. 2 and the route setting expression storage unit 4 as shown in FIG. 3, and extracts "Kiyomizu-dera" as a point name with "stop by" as the corresponding route setting expression, and "Kyoto Station" as a point name with "let's go" as the corresponding route setting expression (step ST83).
- Next, the route setting operation acquisition unit 7 searches the route setting operation storage unit 6 as shown in FIG. 4 using the route setting expressions "stop by" and "let's go" as search keys, and acquires the route setting operation "set as waypoint" corresponding to "stop by" and the route setting operation "set as destination" corresponding to "let's go" (step ST85).
- Next, the route search necessity selection unit 15 displays the extracted point names and route setting operations, for example as shown in FIG. 21, presents a dialog for letting the user select whether or not the route search is necessary, and confirms with the user whether the route search is required (step ST86).
- If the user selects "Yes" (YES in step ST87), the route determination unit 10 refers to the map data storage unit 8, acquires the positions of the point names "Kiyomizu-dera" and "Kyoto Station", and, based on the position information of "Kiyomizu-dera" and the route setting operation "set as waypoint", the position information of "Kyoto Station" and the route setting operation "set as destination", and the position of the vehicle (moving body), searches for a route to "Kyoto Station" with "Kiyomizu-dera" as a waypoint and sets the searched route (step ST88). On the other hand, if the user selects "No" (NO in step ST87), the process ends without searching for and setting the route.
- In step ST86, the confirmation may instead be presented to the user by voice output, for example "Would you like to search for a route to Kyoto Station via Kiyomizu-dera?". When presented by voice output in this way, the route may be searched and set if the user utters "Yes", and the process may be terminated if the user utters "No".
- Note that the user may be allowed to set whether or not the route search necessity confirmation function of Embodiment 9 is used. Also, although Embodiment 9 has been described based on Embodiment 1, the necessity of the route search may similarly be confirmed in Embodiments 2 to 8.
- Embodiment 10. Since the block diagram showing an example of the navigation device according to Embodiment 10 of the present invention is the same as the block diagram shown in FIG. 1 in Embodiment 1, its illustration and description are omitted.
- In Embodiment 10, compared with Embodiment 1, when a plurality of routes are searched, the route determination unit 10 presents those routes to the user.
- The route determination unit 10 in Embodiment 10 determines whether or not there are a plurality of routes to be searched; when there are, it searches for the plurality of routes and presents them to the user.
- FIG. 22 is a flowchart showing the operation of the navigation device according to the tenth embodiment.
- The processing from steps ST91 to ST95 is the same as steps ST01 to ST05 in the flowchart of FIG. 5 in Embodiment 1, so its description is omitted.
- Then, in Embodiment 10, the route determination unit 10 determines whether there are a plurality of routes to be searched (step ST96). If there are a plurality of routes (YES in step ST96), the positions of the point names extracted by the keyword extraction unit 5 are acquired with reference to the map data, and all routes are searched based on the acquired position information, the route setting operation acquired in step ST95, and the current position of the vehicle (moving body) acquired by the vehicle position acquisition unit (position acquisition unit) 9 (step ST97). The presentation control unit 21 is then instructed to present the plurality of searched routes to the user, and they are presented via the display unit 22 or the voice output unit 23 (step ST98). Then, the route selected by the user is set (step ST99).
- On the other hand, if it is determined that there are not a plurality of routes to be searched (NO in step ST96), then, as in step ST06 in the flowchart of FIG. 5 in Embodiment 1, the position of the point name extracted by the keyword extraction unit 5 is acquired with reference to the map data, and the route is searched and set based on the acquired position information, the route setting operation acquired in step ST95, and the current position of the vehicle (moving body) acquired by the vehicle position acquisition unit (position acquisition unit) 9 (step ST100).
- As a specific example, suppose a plurality of users speak, with user A saying "I want to go to Kyoto Station via Kiyomizu-dera" and user B saying "Eh, I want to go to Kyoto Station via Kiyomizu-dera and Sanjusangendo".
- In this case, the voice acquisition unit 1 acquires the voice data (step ST91), and the voice recognition unit 2 obtains the recognition results A "I want to go to Kyoto Station via Kiyomizu-dera" and B "Eh, I want to go to Kyoto Station via Kiyomizu-dera and Sanjusangendo" (step ST92).
- The keyword extraction unit 5 refers to the point name storage unit 3 as shown in FIG. 2 and the route setting expression storage unit 4. From the recognition result of A, it extracts "Kiyomizu-dera" as a point name with "via" as the corresponding route setting expression, and "Kyoto Station" as a point name with "I want to go" as the corresponding route setting expression; from the recognition result of B, it extracts "Kiyomizu-dera" and "Sanjusangendo" as point names with "via" as the corresponding route setting expression, and "Kyoto Station" as a point name with "I want to go" as the corresponding route setting expression (step ST93).
- Since the route setting expressions "via" and "I want to go" are extracted (YES in step ST94), the route setting operation acquisition unit 7 searches the route setting operation storage unit 6 as shown in FIG. 4 using "via" and "I want to go" as search keys, and acquires the route setting operation "set as waypoint" corresponding to "via" and the route setting operation "set as destination" corresponding to "I want to go" (step ST95).
- Since there are two (a plurality of) routes to be searched (YES in step ST96), the route determination unit 10 refers to the map data storage unit 8, acquires the positions of the point names "Kiyomizu-dera" and "Kyoto Station" for the recognition result of A and of "Kiyomizu-dera", "Sanjusangendo", and "Kyoto Station" for the recognition result of B, and searches both routes (step ST97). The searched routes are presented to the user (step ST98), and the route determination unit 10 sets the route selected by the user (step ST99).
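- The flow of steps ST96 to ST99 for this example can be sketched as follows. This is only a hypothetical illustration: the candidate-route records and the placeholder search_route function stand in for the actual route search performed by the route determination unit 10.

```python
# Illustrative sketch of steps ST96-ST99: several utterances yield several
# candidate routes; all are searched, presented, and the selected one is set.

candidates = [
    {"number": 1, "waypoints": ["Kiyomizu-dera"], "destination": "Kyoto Station"},
    {"number": 2, "waypoints": ["Kiyomizu-dera", "Sanjusangendo"], "destination": "Kyoto Station"},
]

def search_route(candidate, vehicle_position="current position"):
    # A real device would run the map-data route search here; we just list the legs.
    legs = [vehicle_position] + candidate["waypoints"] + [candidate["destination"]]
    return " -> ".join(legs)

def present_and_select(candidates, selected_number):
    routes = {c["number"]: search_route(c) for c in candidates}   # step ST97
    for number, route in routes.items():                          # step ST98
        print(f"Route {number}: {route}")
    return routes[selected_number]                                # step ST99

chosen = present_and_select(candidates, selected_number=2)
```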
- FIG. 24 is a diagram illustrating an example of the route setting expression storage unit 4 according to the tenth embodiment.
- As shown in this figure, the route setting expression storage unit 4 stores, as in FIG. 17, operation expressions such as "want to go", "stop by", and "take a break" as route setting expressions related to route setting operations; it also stores route setting expressions representing times, such as "today", "tomorrow", and "next time", and correction/cancellation expressions such as "not ... but", "stop it", "quit", "let's quit", and "delete" as route setting expressions.
- As another specific example, suppose user A says "I want to go to Kyoto Station via Kiyomizu-dera" and user B says "Eh, not Kiyomizu-dera, I want to go to Kyoto Station via Sanjusangendo".
- In this case, the voice acquisition unit 1 acquires the voice data (step ST91), and the voice recognition unit 2 obtains the recognition results A "I want to go to Kyoto Station via Kiyomizu-dera" and B "Eh, not Kiyomizu-dera, I want to go to Kyoto Station via Sanjusangendo" (step ST92).
- The keyword extraction unit 5 refers to the point name storage unit 3 as shown in FIG. 2 and the route setting expression storage unit 4 as shown in FIG. 24. From the recognition result of B, "Sanjusangendo", which follows the correction expression "not ... but", is extracted as a point name with "via" as the corresponding route setting expression, and "Kyoto Station" is extracted as a point name with "I want to go" as the corresponding route setting expression (step ST93).
- Since the route setting expressions "via" and "I want to go" are extracted (YES in step ST94), the route setting operation acquisition unit 7 searches the route setting operation storage unit 6 as shown in FIG. 4 using "via" and "I want to go" as search keys, and acquires the route setting operation "set as waypoint" corresponding to "via" and the route setting operation "set as destination" corresponding to "I want to go" (step ST95).
- The route determination unit 10 then refers to the map data storage unit 8 and acquires the positions of the point names: "Kiyomizu-dera" and "Kyoto Station" for the recognition result of A, and "Sanjusangendo" and "Kyoto Station" for the recognition result of B, which reflects the correction. The corresponding routes are searched (step ST97) and presented to the user (step ST98), and the route determination unit 10 sets the route selected by the user (step ST99).
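- The effect of a correction expression such as "not Kiyomizu-dera but ... via Sanjusangendo" can be sketched as a simple waypoint replacement applied before the candidate routes are built. The function below is an assumption based on this example, not the patented correction logic.

```python
# Illustrative handling of a correction expression: the corrected point
# replaces the rejected waypoint while the rest of the route is kept.

def apply_correction(waypoints, rejected_point, corrected_point):
    """Replace the rejected waypoint with the corrected one, keeping order."""
    return [corrected_point if w == rejected_point else w for w in waypoints]

route_a = {"waypoints": ["Kiyomizu-dera"], "destination": "Kyoto Station"}
route_b = {
    "waypoints": apply_correction(route_a["waypoints"], "Kiyomizu-dera", "Sanjusangendo"),
    "destination": "Kyoto Station",
}
print(route_b["waypoints"])   # ['Sanjusangendo']
```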
- Although Embodiment 10 has been described based on Embodiment 1, in Embodiments 2 to 9 as well, when a plurality of routes are searched, they may similarly be presented.
- Embodiment 11. The block diagram showing an example of the navigation apparatus according to Embodiment 11 of the present invention is the same as the block diagram shown in FIG. 1 in Embodiment 1, so its illustration and description are omitted. Embodiment 11, like Embodiment 10, assumes that a plurality of routes are searched. Compared with Embodiment 10, when a cancellation expression that deletes a point set in a route is extracted, the route determination unit 10 searches for and sets a new route from which that point has been deleted.
- The route setting expression storage unit 4 according to Embodiment 11, like the one shown in FIG. 24, includes correction/cancellation expressions in addition to expressions representing route setting operations and expressions representing times, and the keyword extraction unit 5 also extracts these correction/cancellation expressions.
- FIG. 26 is a flowchart showing the operation of the navigation device according to the eleventh embodiment. Since this flowchart is the same as the flowchart of FIG. 4 in the first embodiment, description of each step will be omitted, and a specific example will be described.
- As a specific example, suppose user A says "I want to go to Kyoto Station via Kiyomizu-dera", user B says "Eh, I want to go to Kyoto Station via Kiyomizu-dera and Sanjusangendo", and user C then says "After all, let's quit Sanjusangendo". The voice acquisition unit 1 acquires the voice data (step ST101), and the voice recognition unit 2 obtains these recognition results (step ST102).
- The keyword extraction unit 5 refers to the point name storage unit 3 as shown in FIG. 2 and the route setting expression storage unit 4 as shown in FIG. 24. From the recognition result of A, "Kiyomizu-dera" is extracted as a point name with "via" as the corresponding route setting expression, and "Kyoto Station" as a point name with "I want to go" as the corresponding route setting expression. From the recognition result of B, "Kiyomizu-dera" and "Sanjusangendo" are extracted as point names with "via" as the corresponding route setting expression, and "Kyoto Station" as a point name with "I want to go" as the corresponding route setting expression. From the recognition result of C, "Sanjusangendo" is extracted as a point name with "let's quit" as the corresponding route setting expression (step ST103).
- Since the route setting expressions "via", "I want to go", and "let's quit" are extracted (YES in step ST104), the route setting operation acquisition unit 7 searches the route setting operation storage unit 6 as shown in FIG. 4 using them as search keys and, by finding the route setting expressions that match the search keys, acquires the route setting operation "set as waypoint" corresponding to "via", "set as destination" corresponding to "I want to go", and "delete route" corresponding to "let's quit" (step ST105).
- Here, based on the recognition results of A and B, two candidate routes have already been presented to the user, for example by displaying a table as shown in FIG. 23 on the display screen or by drawing the routes on the displayed map.
- In this state, when user C utters "After all, let's quit Sanjusangendo", the voice acquisition unit 1 acquires the voice data (step ST101), and the voice recognition unit 2 obtains the recognition result "After all, let's quit Sanjusangendo" (step ST102). The keyword extraction unit 5 refers to the point name storage unit 3 as shown in FIG. 2 and the route setting expression storage unit 4 as shown in FIG. 24, and extracts "Sanjusangendo" as the point name and "let's quit" as the corresponding route setting expression (step ST103). The route setting operation acquisition unit 7 then searches the route setting operation storage unit 6 as shown in FIG. 4 using "let's quit" as a search key and acquires the corresponding route setting operation "delete route" (step ST105). As a result, the position of the point name "Sanjusangendo" is identified, and the candidate route that includes that point is deleted. That is, route number 2 shown in FIG. 23, "to Kyoto Station via Kiyomizu-dera and Sanjusangendo", is deleted and is no longer presented, and a route is searched for and set for the remaining route number 1, "to Kyoto Station via Kiyomizu-dera" (step ST106).
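- The deletion triggered by the cancellation expression can be sketched as a filter over the presented candidate routes. The data layout and the filter below are illustrative assumptions, not the patented implementation.

```python
# Illustrative sketch of Embodiment 11: a cancellation expression such as
# "let's quit Sanjusangendo" removes every presented candidate route that
# contains the cancelled point; the remaining route is then searched and set.

presented_routes = [
    {"number": 1, "waypoints": ["Kiyomizu-dera"], "destination": "Kyoto Station"},
    {"number": 2, "waypoints": ["Kiyomizu-dera", "Sanjusangendo"], "destination": "Kyoto Station"},
]

def delete_routes_containing(routes, cancelled_point):
    """Drop candidate routes whose waypoints include the cancelled point."""
    return [r for r in routes if cancelled_point not in r["waypoints"]]

remaining = delete_routes_containing(presented_routes, "Sanjusangendo")
print([r["number"] for r in remaining])   # [1] -> this route is searched and set
```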
- Embodiment 12. FIG. 27 is a block diagram showing an example of a navigation apparatus according to Embodiment 12 of the present invention. The same components as those described in Embodiments 1 to 11 are denoted by the same reference numerals, and redundant description is omitted. Embodiment 12, like Embodiment 10, assumes that a plurality of routes are searched. Compared with Embodiment 10 (whose block diagram is the same as FIG. 1 of Embodiment 1), a presentation method storage unit 16 and a presentation method determination unit 17 are further provided, and the method of presenting the plurality of routes is changed according to the importance of each route.
- The presentation method storage unit 16 stores the importance of each route to be presented and the corresponding presentation method. FIG. 28 is a diagram illustrating an example of the presentation method storage unit 16. As shown in this figure, the presentation method storage unit 16 stores, for each importance level, a presentation method such as presentation method 1, which changes the color of the route, or presentation method 2, which changes the thickness of the route. In the example shown in FIG. 28, for importance 1, presentation method 1 is "red" and presentation method 2 is "very thick"; for importance 2, presentation method 1 is "yellow" and presentation method 2 is "thick"; and for importance 3, presentation method 1 is "blue" and presentation method 2 is "thin".
- In Embodiment 12, it is assumed that presentation method 1 of the presentation method storage unit 16 shown in FIG. 28 is set in advance unless there are special circumstances or conditions. The presentation method determination unit 17 determines whether or not there are a plurality of routes to be searched; when it determines that there are, it refers to the presentation method storage unit 16 as illustrated in FIG. 28, determines the presentation method according to the importance of each of the plurality of routes, and outputs the result to the route determination unit 10.
- Note that the user may be allowed to select which presentation method is adopted via the key input unit 12 (see the block diagram of FIG. 27). The route determination unit 10 in Embodiment 12 searches for the plurality of routes and presents each route to the user by the presentation method determined by the presentation method determination unit 17.
- FIG. 29 is a flowchart showing the operation of the navigation device according to the twelfth embodiment.
- The processing from steps ST111 to ST115 is the same as steps ST91 to ST95 in the flowchart of FIG. 22 in Embodiment 10, so its description is omitted.
- Then, in Embodiment 12, the presentation method determination unit 17 determines whether or not there are a plurality of routes to be searched (step ST116). If there are a plurality of routes (YES in step ST116), it refers to the presentation method storage unit 16, determines the presentation method, and outputs it to the route determination unit 10 (step ST117). The route determination unit 10 then acquires the positions of the point names extracted by the keyword extraction unit 5 with reference to the map data, and searches all routes based on the acquired position information, the route setting operation acquired in step ST115, and the current position of the vehicle (moving body) acquired by the vehicle position acquisition unit (position acquisition unit) 9 (step ST118). The presentation control unit 21 is then instructed to present the plurality of searched routes to the user by the presentation method determined in step ST117 according to the importance of each route, and they are presented via the display unit 22 or the voice output unit 23 (step ST119). Then, the route selected by the user is set (step ST120).
- On the other hand, if the presentation method determination unit 17 determines that there is only one route to be searched (NO in step ST116), then, as in step ST06 in the flowchart of FIG. 5 in Embodiment 1, the position of the point name extracted by the keyword extraction unit 5 is acquired with reference to the map data, and the route is searched and set based on the acquired position information, the route setting operation acquired in step ST115, and the current position of the vehicle (moving body) acquired by the vehicle position acquisition unit (position acquisition unit) 9 (step ST121).
- As a specific example, suppose user A says "I want to go to Kyoto Station via Kiyomizu-dera" and user B says "Eh, I want to go to Kyoto Station via Kiyomizu-dera and Sanjusangendo".
- The voice acquisition unit 1 acquires the voice data (step ST111), and the voice recognition unit 2 obtains the recognition results A "I want to go to Kyoto Station via Kiyomizu-dera" and B "Eh, I want to go to Kyoto Station via Kiyomizu-dera and Sanjusangendo" (step ST112).
- The keyword extraction unit 5 refers to the point name storage unit 3 as shown in FIG. 2 and the route setting expression storage unit 4. From the recognition result of A, it extracts "Kiyomizu-dera" as a point name with "via" as the corresponding route setting expression, and "Kyoto Station" as a point name with "I want to go" as the corresponding route setting expression. From the recognition result of B, it extracts "Kiyomizu-dera" and "Sanjusangendo" as point names with "via" as the corresponding route setting expression, and "Kyoto Station" as a point name with "I want to go" as the corresponding route setting expression (step ST113).
- Since the route setting expressions "via" and "I want to go" are extracted (YES in step ST114), the route setting operation acquisition unit 7 searches the route setting operation storage unit 6 as shown in FIG. 4 using "via" and "I want to go" as search keys, and acquires the route setting operation "set as waypoint" corresponding to "via" and the route setting operation "set as destination" corresponding to "I want to go" (step ST115).
- Next, the presentation method determination unit 17 determines that there are two (a plurality of) routes to be searched (YES in step ST116). The presentation method determination unit 17 then refers to the presentation method storage unit 16, determines presentation method 1 set in Embodiment 12 (red if the importance is 1, yellow if it is 2, blue if it is 3) as the presentation method, and outputs it to the route determination unit 10 (step ST117).
- Then, the route determination unit 10 refers to the map data storage unit 8. For the recognition result of A, it acquires the positions of the point names "Kiyomizu-dera" and "Kyoto Station", and searches for a route with "Kyoto Station" as the destination via "Kiyomizu-dera", based on the position information of "Kiyomizu-dera" and the route setting operation "set as waypoint", the position information of "Kyoto Station" and the route setting operation "set as destination", and the position of the vehicle (moving body). For the recognition result of B, it likewise acquires the positions of the point names "Kiyomizu-dera", "Sanjusangendo", and "Kyoto Station", and searches for a route with "Kyoto Station" as the destination via "Kiyomizu-dera" and "Sanjusangendo", based on the position information of "Kiyomizu-dera" and the route setting operation "set as waypoint", the position information of "Sanjusangendo" and the route setting operation "set as waypoint", the position information of "Kyoto Station" and the route setting operation "set as destination", and the position of the vehicle (moving body) (step ST118).
- Here, as one example of how importance is determined, a route that appears later in the conversation may be given higher importance. FIG. 30 is a table in which importance and a presentation method have been added to the route acquired from the recognition result of A (number 1) and the route acquired from the recognition result of B (number 2). Regarding importance, the later utterance information B is given importance 1, and the earlier utterance information A is given importance 2. According to the presentation method determined in step ST117, the route of importance 1 (number 2) is therefore presented in red and the route of importance 2 (number 1) in yellow.
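- The importance assignment and the lookup into the presentation method storage unit 16 can be sketched as follows. The dictionaries mirror FIG. 28 and FIG. 30 as described in the text, while the ranking code itself is an assumed illustration and only covers importance levels 1 to 3, as in FIG. 28.

```python
# Illustrative sketch of Embodiment 12: the route from the most recent
# utterance receives importance 1, and importance is mapped to a display style.

PRESENTATION_METHOD_1 = {1: "red", 2: "yellow", 3: "blue"}          # colour per importance
PRESENTATION_METHOD_2 = {1: "very thick", 2: "thick", 3: "thin"}    # line width per importance

def assign_importance(routes_in_utterance_order):
    """Rank routes so that the latest utterance gets importance 1."""
    ranked = []
    for rank, route in enumerate(reversed(routes_in_utterance_order), start=1):
        ranked.append({**route, "importance": rank,
                       "colour": PRESENTATION_METHOD_1[rank],
                       "width": PRESENTATION_METHOD_2[rank]})
    return ranked

routes = [
    {"number": 1, "label": "Kyoto Station via Kiyomizu-dera"},                    # utterance A
    {"number": 2, "label": "Kyoto Station via Kiyomizu-dera and Sanjusangendo"},  # utterance B
]
for r in assign_importance(routes):
    print(r["number"], r["importance"], r["colour"], r["width"])
# 2 1 red very thick  /  1 2 yellow thick
```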
- FIG. 31 is a diagram illustrating an example of a route presentation screen.
- FIG. 31(a) shows the presentation method determined in the above specific example, in which the color of each route is changed according to its importance; FIG. 31(b) shows the route thickness changed according to the importance; and FIG. 31(c) shows the importance value attached to each route. Here, when the user selects the desired route on the screen presented as shown in FIG. 31(a), for example by touching it, the route determination unit 10 sets the selected route (step ST120).
- Note that the user may be allowed to set whether or not the presentation method determination function of Embodiment 12 is used.
- Embodiment 13. The block diagram showing an example of the navigation apparatus according to Embodiment 13 of the present invention is the same as the block diagram shown in FIG. 14 in Embodiment 6, so its illustration and description are omitted. In Embodiment 13, when only a route setting expression is extracted and no point name is extracted, a facility corresponding to the route setting expression is searched for based on the time, and that facility is presented or a route passing through it is searched for and set.
- In Embodiment 13, the route setting expression storage unit 4 includes time expressions, and in particular includes time expressions of the form "at ... o'clock".
- The route setting operation storage unit 6 also stores route setting operations for the case where no point name corresponding to a route setting expression is extracted, as shown for example in FIG. 32. Here, the route setting operation "display service areas or restaurants" is associated with the route setting expression "take a break" for the case where no point name is extracted.
- FIG. 33 is a flowchart showing the operation of the navigation device according to the thirteenth embodiment.
- The processing from steps ST121 to ST125 is the same as steps ST01 to ST05 in the flowchart of FIG. 5 in Embodiment 1, so its description is omitted.
- Then, in Embodiment 13, when a route setting expression has been extracted but no corresponding point name has been extracted in step ST123 (YES in step ST126), it is further determined whether or not a time expression has been extracted (step ST127).
- If a time expression, for example "12 o'clock", has been extracted (YES in step ST127), the route determination unit 10 refers to the map data and, based on the position of the vehicle (moving body) and the current time, searches around the route for a corresponding facility (a service area or restaurant, as set in the route setting operation "display service areas or restaurants" of FIG. 32) that the vehicle is expected to reach at about "12 o'clock" (step ST128).
- On the other hand, if no time expression has been extracted (NO in step ST127), it is determined that the user wants to take a break immediately, and the corresponding facility (a service area or restaurant) is searched for in the vicinity of the current position (step ST129). The found facility is then displayed (step ST130).
- In the case of step ST128, the facility that the vehicle is predicted to reach at the time closest to the time expression extracted in step ST123 (in this example, "12 o'clock") is searched for and displayed.
- Then, based on the position information of the facility that has been searched for and displayed, a route with that facility as a waypoint is searched for and set (step ST131).
- On the other hand, if a point name has been extracted (NO in step ST126), then, as in step ST06 of the flowchart of FIG. 5 in Embodiment 1, the route determination unit 10 acquires the position of the point name extracted by the keyword extraction unit 5 with reference to the map data storage unit 8, and searches for and sets the route based on the acquired position information, the route setting operation acquired in step ST125, and the current position of the vehicle (moving body) acquired by the vehicle position acquisition unit (position acquisition unit) 9 (step ST132).
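- The selection of a facility in step ST128 can be sketched as a nearest-arrival-time search over facilities along the route. The facility list and the predicted arrival times below are hypothetical; a real device would derive them from the map data and the vehicle position.

```python
# Illustrative sketch of step ST128 in Embodiment 13: pick the facility whose
# predicted arrival time is closest to the spoken time expression ("12 o'clock").

from datetime import datetime

facilities_along_route = [
    {"name": "Service Area XX", "arrival": datetime(2012, 6, 5, 11, 40)},
    {"name": "Restaurant XX",   "arrival": datetime(2012, 6, 5, 12, 5)},
    {"name": "Cafe XX",         "arrival": datetime(2012, 6, 5, 12, 40)},
]

def facility_closest_to(facilities, target_time):
    """Return the facility reached at the time closest to the spoken time."""
    return min(facilities, key=lambda f: abs(f["arrival"] - target_time))

target = datetime(2012, 6, 5, 12, 0)   # extracted time expression "12 o'clock"
print(facility_closest_to(facilities_along_route, target)["name"])   # Restaurant XX
```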
- In steps ST128 to ST131 of the flowchart in FIG. 33, the corresponding facility that the vehicle reaches at the time closest to the extracted time expression (in this example, "12 o'clock") is searched for and displayed, and a route with that facility as a waypoint is automatically searched for and set.
- However, whether or not the displayed facility is set as a waypoint may instead be selected by the user by voice or via the key input unit 12 (not shown in the block diagram of FIG. 14). If a plurality of facilities are found, the plurality of facilities may be displayed so that the user can select one of them as the waypoint and set it.
- FIG. 34 is a flowchart showing the operation when the user can select whether or not to set the searched facility as a transit point.
- In FIG. 34, steps ST121 to ST130 and ST132 are the same as those in FIG. 33, so their description is omitted.
- Then, when the facility found in step ST128 or ST129 is displayed (step ST130), the user is further asked to confirm whether or not to set a route with that facility as a waypoint (step ST141).
- FIG. 35(a) is an example of a screen on which, when the searched facility is displayed, a dialog for confirming whether or not to set a route with that facility as a waypoint is also displayed. On the navigation screen 31, a triangular vehicle mark 32 indicating the current position of the vehicle (moving body) and the currently traveled set route 33 are displayed. In addition, "Restaurant XX" is displayed as the facility predicted to be reached around the extracted time expression "12 o'clock", together with a dialog 34 asking whether to set a route with that facility (Restaurant XX) as a waypoint, for example "Do you want to set a route with this as a waypoint?". At this time, as shown in FIG. 35(a), buttons "Yes" and "No" that the user can select may be displayed in the dialog 34, or a voice output such as "Do you want to set a route with the searched facility (Restaurant XX) as a waypoint?" may be produced so that the user can answer "Yes" or "No".
- If "Yes" (set) is selected by the user (YES in step ST142), the position information of the facility is acquired from the map data, and a route with that facility as a waypoint is searched for and set (step ST131). FIG. 35(b) shows the state after "Yes" has been selected in the dialog 34 of FIG. 35(a): a route with the searched facility (Restaurant XX) as a waypoint has been set, and the newly set route 33' is displayed.
- On the other hand, if "No" is selected (that is, "set" is not selected) by the user in step ST142 (NO in step ST142), the process ends as it is.
- Also, a plurality of corresponding facilities may be searched for. For example, all corresponding facilities (service areas or restaurants) predicted to be reached within 10 minutes before or after "12 o'clock", that is, between 11:50 and 12:10, may be displayed, and the user may select and set the desired facility from among them.
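- The plus-or-minus 10 minute window described above can be sketched as a simple time filter; the data values below are assumptions made only for illustration.

```python
# Illustrative sketch of the variant above: list every facility predicted to be
# reached within 10 minutes of "12 o'clock" and let the user pick a waypoint.

from datetime import datetime, timedelta

def facilities_in_window(facilities, target_time, margin=timedelta(minutes=10)):
    """Facilities whose predicted arrival falls inside target_time +/- margin."""
    return [f for f in facilities
            if target_time - margin <= f["arrival"] <= target_time + margin]

facilities = [
    {"name": "Cafe XX",         "arrival": datetime(2012, 6, 5, 11, 52)},
    {"name": "Restaurant XX",   "arrival": datetime(2012, 6, 5, 12, 5)},
    {"name": "Service Area XX", "arrival": datetime(2012, 6, 5, 12, 30)},
]
shown = facilities_in_window(facilities, datetime(2012, 6, 5, 12, 0))
print([f["name"] for f in shown])   # ['Cafe XX', 'Restaurant XX'] -> user selects one
```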
- FIG. 36 is an example of a screen on which a plurality of searched facilities are displayed. In FIG. 36, on the navigation screen 31 on which the triangular vehicle mark 32 indicating the current position of the vehicle (moving body) and the currently set route 33 are displayed, facilities predicted to be reached around the extracted time expression "12 o'clock" are shown. A facility may be selected by the user touching the facility to be set as a waypoint, or each facility may be displayed with a number so that it can be selected by entering or speaking that number.
- When a facility is selected, a route with that facility as a waypoint may be automatically searched for and set; alternatively, as with the dialog 34 shown in FIG. 35(a), a dialog such as "Do you want to set a route with this as a waypoint?" may be displayed or output as voice, and the route may be set when the user selects or speaks "Yes".
- For example, when "Restaurant XX" is selected by the user, the position information of that facility is acquired from the map data, and a route with that position as a waypoint is searched for and set. As a result, a route with the selected facility (Restaurant XX) as a waypoint is set, and the newly set route 33' is displayed.
- Note that the user may be allowed to set whether or not this function of searching for facilities and setting a waypoint is used. As described above, according to Embodiment 13, even if the user does not know a specific place or name, a spot where the user can take a break can be presented simply by speaking the time at which the user wants to take a break, and a route with that spot as a waypoint can be set, so convenience is improved.
- In the above embodiments, a navigation apparatus for a vehicle has been described, but the navigation apparatus of the present invention is not limited to vehicles and may be a navigation apparatus for moving bodies including persons, vehicles, railways, ships, airplanes, and the like. In particular, the present invention relates to a navigation device suitable for being brought into or mounted on a vehicle, and it can be applied to any type of device, such as a portable navigation device, as long as the device can perform navigation through voice interaction between the user and the device.
- the navigation device of the present invention can be applied to an in-vehicle navigation device or a portable navigation device capable of performing navigation by voice dialogue between a user and the device.
Landscapes
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Acoustics & Sound (AREA)
- General Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Artificial Intelligence (AREA)
- Health & Medical Sciences (AREA)
- Navigation (AREA)
- Instructional Devices (AREA)
Abstract
Description
In such a situation, a navigation device is known in which the content of the user's utterances is recorded and, when a voice recognition button is pressed, the utterance content recorded going back a predetermined time is voice-recognized to extract point information (a point name), and that point information (point name) is set as the destination (see, for example, Patent Document 1).
The present invention provides a navigation device that includes a position acquisition unit for acquiring the position of the own vehicle (moving body) and performs route guidance based on the position of the own vehicle (moving body) acquired by the position acquisition unit and map data, in which the content of the user's utterances is recognized at all times while the navigation device is active, point information and a route setting method are identified from the recognition result, and a route is set automatically based on the point information and the route setting method. In the following embodiments, the navigation device of the present invention is described taking as an example the case where it is applied to a car navigation system mounted on a moving body such as a vehicle.
FIG. 1 is a block diagram showing an example of a navigation device according to Embodiment 1 of the present invention. This navigation device comprises a voice acquisition unit 1, a voice recognition unit 2, a point name storage unit 3, a route setting expression storage unit 4, a keyword extraction unit 5, a route setting operation storage unit 6, a route setting operation acquisition unit 7, a map data storage unit 8, an own-vehicle position acquisition unit (position acquisition unit) 9, a route determination unit 10, a presentation control unit 21, a display unit 22, and a voice output unit 23. The presentation control unit 21, the display unit 22, and the voice output unit 23 constitute a presentation control output unit 20. Although not illustrated, the navigation device also includes a key input unit 12 that acquires input signals from keys, a touch panel, and the like, and a time acquisition unit 14 that acquires the time.
The voice recognition unit 2 has a recognition dictionary (not shown), detects, from the voice data acquired by the voice acquisition unit 1, a voice section corresponding to content uttered by a user such as a passenger, extracts feature quantities, and performs voice recognition processing based on those feature quantities using the recognition dictionary. The recognition processing may be performed by a common method such as a Hidden Markov Model. The voice recognition unit 2 may also use a voice recognition server on a network.
The route setting expression storage unit 4 stores, as route setting expressions, expressions related to route setting operations among the words that users ordinarily utter. FIG. 3 is a diagram showing an example of the route setting expression storage unit 4. As shown in this figure, the route setting expression storage unit 4 stores, as route setting expressions related to route setting operations, operation expressions such as "go", "want to go", "let's go", "stop by", "drop in", "take a break", and "quit".
The keyword extraction unit 5 performs morphological analysis while referring to the point name storage unit 3 and the route setting expression storage unit 4, and extracts point names and route setting expressions from the character string of the voice recognition result of the voice recognition unit 2.
The map data storage unit 8 stores map data such as road data, intersection data, and facility data. The map data storage unit 8 may be a storage medium such as a DVD-ROM, a hard disk, or an SD card, or it may be a configuration (map data acquisition unit) that exists on a network and can acquire information such as road data via a communication network.
The route determination unit 10 acquires the position information (latitude and longitude) of the point identified from the point name extracted by the keyword extraction unit 5 with reference to the map data stored in the map data storage unit 8, searches for a route to the point identified from the point name based on the acquired position information, the position (latitude and longitude) of the own vehicle (moving body) acquired by the own-vehicle position acquisition unit (position acquisition unit) 9, and the route setting operation acquired by the route setting operation acquisition unit 7, and sets the searched route.
First, when there is some utterance input, the voice acquisition unit 1 acquires the input voice, A/D-converts it, and obtains it as voice data in, for example, PCM format (step ST01). Next, the voice recognition unit 2 recognizes the voice data acquired by the voice acquisition unit 1 (step ST02). Then, the keyword extraction unit 5 extracts a point name and the route setting expression for that point name from the recognition result of the voice recognition unit 2 while referring to the point name storage unit 3 and the route setting expression storage unit 4 (step ST03). Here, if a route setting expression has been extracted (YES in step ST04), the route setting operation acquisition unit 7 searches the route setting operation storage unit 6 using the route setting expression extracted by the keyword extraction unit 5 as a search key and, by finding the route setting expression that matches the search key, acquires the route setting operation corresponding to that route setting expression (step ST05).
Although the point name is extracted here by referring to the point name storage unit 3 as shown in FIG. 2, the point name may instead be extracted from information on facilities, addresses, and the like, taking into account the map data stored in the map data storage unit 8, using a known morphological analysis method or a known meaning-understanding method.
FIG. 6 is a diagram showing the information set by the route determination unit 10 through the above processing; it shows a route in which the point name "Kiyomizu-dera" is "set as waypoint" and the point name "Kyoto Station" is "set as destination".
FIG. 7 is a block diagram showing an example of a navigation device according to Embodiment 2 of the present invention. The same components as those described in Embodiment 1 are denoted by the same reference numerals, and redundant description is omitted. Embodiment 2 described below further includes a point identification unit 11 compared with Embodiment 1. When there are a plurality of points identified by one point name uttered by the user (a plurality of points with the same name at different locations), the point nearest to the current position is selected, and a route to the selected point is set.
The route determination unit 10 acquires the position information (latitude and longitude) of the point identified by the point identification unit 11 with reference to the map data stored in the map data storage unit 8, searches for a route to the point identified from the point name extracted by the keyword extraction unit 5 based on the acquired position information, the position (latitude and longitude) of the own vehicle (moving body) acquired by the own-vehicle position acquisition unit (position acquisition unit) 9, and the route setting operation acquired by the route setting operation acquisition unit 7, and sets the searched route.
The processing in steps ST11 to ST15 is the same as steps ST01 to ST05 in the flowchart of FIG. 5 in Embodiment 1, so its description is omitted. In Embodiment 2, the point identification unit 11 refers, for example, to the map data stored in the map data storage unit 8 and determines whether there are a plurality of points identified from the point name extracted in step ST13 (step ST16). If there are a plurality of points (YES in step ST16), it further refers to the map data, calculates the distance from the current position to each point, identifies the point with the shortest distance, and outputs its position information (step ST17). On the other hand, if there are not a plurality of points (NO in step ST16), the position information of the point identified from the point name is acquired, for example, from the map data and output (step ST18).
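The distance-based disambiguation of step ST17 can be sketched as follows. The candidate coordinates below are approximate or fictitious, and the haversine formula is used only as a stand-in for the map-data-based distance calculation described in the text.

```python
# Illustrative sketch of Embodiment 2, step ST17: when one spoken name matches
# several points, pick the one nearest to the current position.

import math

def distance_km(a, b):
    """Approximate great-circle distance between two (lat, lon) pairs in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def nearest_point(current_position, candidates):
    return min(candidates, key=lambda c: distance_km(current_position, c["position"]))

candidates = [
    {"name": "Kiyomizu-dera (Kyoto)", "position": (34.9949, 135.7850)},   # approximate
    {"name": "Kiyomizu-dera (other)", "position": (35.6000, 139.7000)},   # fictitious
]
print(nearest_point((35.0116, 135.7681), candidates)["name"])   # Kiyomizu-dera (Kyoto)
```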
FIG. 9 is a block diagram showing an example of a navigation device according to Embodiment 3 of the present invention. The same components as those described in Embodiments 1 and 2 are denoted by the same reference numerals, and redundant description is omitted. Embodiment 3 described below shows the key input unit 12 that was not illustrated in Embodiments 1 and 2, compared with Embodiment 2. When there are a plurality of points identified by one point name uttered by the user (a plurality of points with the same name at different locations), the plurality of points are presented to the user, and when the user selects one of them, a route to the selected point is set.
The key input unit 12 is an input unit, such as a keyboard, buttons, a mouse, or a touch panel, through which the user can perform manual operation input. When there are a plurality of points identified by the point name extracted by the keyword extraction unit 5, the user can select which of the plurality of point information items to adopt via the key input unit 12.
The processing in steps ST21 to ST26 is the same as steps ST11 to ST16 in the flowchart of FIG. 8 in Embodiment 2, so its description is omitted. In Embodiment 3, when the point identification unit 11 refers, for example, to the map data stored in the map data storage unit 8 in step ST26 and determines that there are a plurality of points identified from the point name extracted in step ST23 (YES in step ST26), the plurality of point information items are presented to the user, for example by being displayed as a list or marked on the map (step ST27). When the user selects via the key input unit 12 which of the presented point information items to adopt, one point is identified, and its position information is output (step ST28). On the other hand, if there are not a plurality of points (NO in step ST26), the position information of the point identified from the point name is acquired, for example, from the map data and output (step ST29).
FIG. 11 is a block diagram showing an example of a navigation device according to Embodiment 4 of the present invention. The same components as those described in Embodiments 1 to 3 are denoted by the same reference numerals, and redundant description is omitted. Embodiment 4 described below further includes a famousness storage unit 13 compared with Embodiment 2. When there are a plurality of points identified by one point name uttered by the user (a plurality of points with the same name at different locations), the point with the highest famousness is selected, and a route to the selected point is set.
The famousness storage unit 13 stores the position information of facilities such as shrines and parks together with their famousness. The famousness is determined based on data obtained by some method, such as the number of annual visitors or questionnaire results.
The processing in steps ST31 to ST36 is the same as steps ST11 to ST16 in the flowchart of FIG. 8 in Embodiment 2, so its description is omitted. In Embodiment 4, when the point identification unit 11 refers, for example, to the map data stored in the map data storage unit 8 in step ST36 and determines that there are a plurality of points identified from the point name extracted in step ST33 (YES in step ST36), it further refers to the famousness storage unit 13, acquires the famousness of the plurality of points, identifies the point with the highest famousness, and outputs its position information (step ST37). On the other hand, if there are not a plurality of points (NO in step ST36), the position information of the point identified from the point name is acquired, for example, from the map data and output (step ST38).
The block diagram showing an example of a navigation device according to Embodiment 5 of the present invention has the same configuration as the block diagram shown in FIG. 1 in Embodiment 1, so its illustration and description are omitted. In Embodiment 5 described below, compared with Embodiment 1, the route determination unit 10 does not set the searched route when, as a result of searching for the route, the total distance of the route exceeds a predetermined distance.
The processing in steps ST41 to ST45 is the same as steps ST01 to ST05 in the flowchart of FIG. 5 in Embodiment 1, so its description is omitted. In Embodiment 5, the route determination unit 10 acquires the position of the point name extracted by the keyword extraction unit 5 with reference to the map data storage unit 8, searches for a route based on the acquired position information, the route setting operation acquired in step ST45, and the current position of the own vehicle (moving body) acquired by the own-vehicle position acquisition unit (position acquisition unit) 9 (step ST46), and then makes a determination about the total distance of the searched route.
Although Embodiment 5 has been described based on Embodiment 1, in Embodiments 2 to 4 as well, the route may similarly not be set, depending on the total distance of the searched route, after the route has been searched.
FIG. 14 is a block diagram showing an example of a navigation device according to Embodiment 6 of the present invention. The same components as those described in Embodiments 1 to 5 are denoted by the same reference numerals, and redundant description is omitted. Embodiment 6 described below shows the time acquisition unit 14 that was not illustrated in Embodiment 1, compared with Embodiment 1. The route determination unit 10 does not set the searched route when, as a result of searching for the route, the estimated time of arrival at the destination via that route is later than a predetermined time, or when the time required to reach the destination via that route exceeds a predetermined time.
The time acquisition unit 14 acquires the time using known information.
The processing in steps ST51 to ST56 is the same as steps ST41 to ST46 in the flowchart of FIG. 12 in Embodiment 5, so its description is omitted. For the route searched in step ST56, it is determined whether the estimated time of arrival at the destination via that route is at or before a predetermined threshold, or whether the time required to reach the destination via that route is at or below a predetermined threshold (step ST57). If the estimated arrival time is at or before the threshold, or the required time is at or below the threshold (YES in step ST57), the route is set (step ST58). On the other hand, if the determination in step ST57 finds that the estimated arrival time is later than the threshold, or the required time to the destination is greater than the threshold (NO in step ST57), the processing ends without setting the route.
Although Embodiment 6 has been described based on Embodiment 1, in Embodiments 2 to 4 as well, the route may similarly not be set, depending on the estimated arrival time at or the required time to the destination of the searched route, after the route has been searched.
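The checks of Embodiments 5 and 6 can be sketched as a single gate applied after the route search; the threshold values and the route record below are illustrative assumptions.

```python
# Illustrative sketch of the route-setting gate of Embodiments 5 and 6:
# a searched route is only set if its total distance, required time, and
# estimated arrival time stay within predetermined thresholds.

from datetime import datetime, timedelta

def should_set_route(route, max_distance_km, max_duration, latest_arrival, now):
    if route["distance_km"] > max_distance_km:     # Embodiment 5: total distance check
        return False
    if route["duration"] > max_duration:           # Embodiment 6: required-time check
        return False
    if now + route["duration"] > latest_arrival:   # Embodiment 6: arrival-time check
        return False
    return True

route = {"distance_km": 12.0, "duration": timedelta(minutes=35)}
print(should_set_route(route,
                       max_distance_km=100,
                       max_duration=timedelta(hours=2),
                       latest_arrival=datetime(2012, 6, 5, 18, 0),
                       now=datetime(2012, 6, 5, 10, 30)))   # True -> set the route
```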
The block diagram showing an example of a navigation device according to Embodiment 7 of the present invention has the same configuration as the block diagram shown in FIG. 1 in Embodiment 1, so its illustration and description are omitted. In Embodiment 7 described below, compared with Embodiment 1, when a route has already been set, the point identified from the point name is added as a "waypoint" regardless of the content of the route setting operation, and the route is searched again.
The processing in steps ST61 to ST65 is the same as steps ST01 to ST05 in the flowchart of FIG. 5 in Embodiment 1, so its description is omitted. In Embodiment 7, after the route setting operation is acquired in step ST65, the route determination unit 10 determines whether a route has already been set (step ST66). If a route has already been set (YES in step ST66), the position of the point name extracted by the keyword extraction unit 5 is acquired, for example with reference to the map data, the acquired position information is "added as a waypoint" regardless of the content of the route setting operation acquired by the route setting operation acquisition unit 7 in step ST65, the route is searched again, and that route is set (step ST67). On the other hand, if no route has been set (NO in step ST66), then, as in step ST06 of the flowchart of FIG. 5 in Embodiment 1, the position of the point name extracted by the keyword extraction unit 5 is acquired with reference to the map data, the route is searched based on the acquired position information, the route setting operation acquired in step ST65, and the current position of the own vehicle (moving body) acquired by the own-vehicle position acquisition unit (position acquisition unit) 9, and the searched route is set (step ST68).
Although Embodiment 7 has been described based on Embodiment 1, in Embodiments 2 to 6 as well, when a route has already been set, the position of the acquired point name may be added as a waypoint regardless of the acquired route setting operation.
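The behaviour of Embodiment 7 can be sketched as follows: if a route already exists, the newly spoken point is appended as a waypoint regardless of the acquired route setting operation. The route record and helper below are illustrative assumptions; an actual device would re-run the route search afterwards.

```python
# Illustrative sketch of Embodiment 7 (steps ST66-ST68): incorporate a newly
# spoken point into an already-set route as a waypoint.

def incorporate_point(current_route, point_name, route_setting_operation):
    if current_route is not None:                       # a route is already set (YES in ST66)
        current_route["waypoints"].append(point_name)   # added as waypoint, operation ignored
        return current_route
    # no route yet (NO in ST66): honour the acquired route setting operation
    if route_setting_operation == "set as destination":
        return {"waypoints": [], "destination": point_name}
    return {"waypoints": [point_name], "destination": None}

route = {"waypoints": [], "destination": "Kyoto Station"}
print(incorporate_point(route, "Kiyomizu-dera", "set as destination"))
# {'waypoints': ['Kiyomizu-dera'], 'destination': 'Kyoto Station'}
```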
The block diagram showing an example of a navigation device according to Embodiment 8 of the present invention has the same configuration as the block diagram shown in FIG. 1 in Embodiment 1, so its illustration and description are omitted. In Embodiment 8 described below, compared with Embodiment 1, the route setting expression storage unit 4 also includes expressions representing times, the keyword extraction unit 5 also extracts expressions representing a time from the content uttered by the user, and when the extracted expression is not an expression representing "today", that is, when it is an expression representing a future time beyond a predetermined point, the route is not set.
The processing in steps ST71 to ST75 is the same as steps ST01 to ST05 in the flowchart of FIG. 5 in Embodiment 1, so its description is omitted. In Embodiment 8, after the route setting operation is acquired in step ST75, the route determination unit 10 determines whether the expressions extracted by the keyword extraction unit 5 in step ST73 include, among the route setting expressions representing times, any expression other than one representing "today", that is, whether they include an expression representing a future time beyond a predetermined point (step ST76).
On the other hand, if the determination in step ST76 finds an expression other than one representing "today", such as "tomorrow", "next time", or "next month", that is, an expression representing a future time beyond a predetermined point (NO in step ST76), it is judged that the user is talking about something in the future beyond the predetermined point, and the processing ends as it is without setting a route.
In ordinary conversation, when talking about a place one is about to go to, it is considered rare to explicitly say "today"; therefore, the determination in step ST76 is made as to whether an expression other than one representing "today" is included, but the determination may instead be made as to whether an expression representing "today", such as "today" or "from now", is included.
Although Embodiment 8 has been described based on Embodiment 1, in Embodiments 2 to 7 as well, the route may not be set when the expression representing a time is other than an expression representing "today", that is, when it is an expression representing a future time beyond a predetermined point.
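The time-expression gate of step ST76 can be sketched as follows; the expression lists follow the examples in the text, and treating any non-"today" time expression as a future trip is the assumption described above.

```python
# Illustrative sketch of step ST76 in Embodiment 8: if the utterance contains a
# time expression that refers to the future, no route is set.
# An alternative, also described in the text, would instead require an explicit
# "today"-type expression ("today", "from now") before setting the route.

FUTURE_EXPRESSIONS = {"tomorrow", "next time", "next month"}

def should_set_route(extracted_time_expressions):
    """Return False when the utterance refers to a future trip."""
    return not any(expr in FUTURE_EXPRESSIONS for expr in extracted_time_expressions)

print(should_set_route([]))            # True  - no time expression, set the route
print(should_set_route(["tomorrow"]))  # False - future trip, do not set the route
```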
FIG. 19 is a block diagram showing an example of a navigation device according to Embodiment 9 of the present invention. The same components as those described in Embodiments 1 to 8 are denoted by the same reference numerals, and redundant description is omitted. Embodiment 9 described below further includes a route search necessity selection unit 15 compared with Embodiment 1, and shows the key input unit 12 that was not illustrated in Embodiment 1. The user can thereby select whether or not a route search is performed.
The route search necessity selection unit 15 presents to the user, for example by on-screen display or voice output, whether or not to search for a route based on the point name extracted by the keyword extraction unit 5 and the route setting operation for that point, and lets the user select whether the route search is necessary.
The processing in steps ST81 to ST85 is the same as steps ST01 to ST05 in the flowchart of FIG. 5 in Embodiment 1, so its description is omitted. In Embodiment 9, the route search necessity selection unit 15 confirms whether or not to search for a route for the point name extracted by the keyword extraction unit 5 and the route setting operation (step ST86). As the confirmation method, a table associating point names and route setting operations as shown in FIG. 6 may be displayed together with buttons or the like for the user to select whether the route search is necessary, or the confirmation may be presented to the user by voice.
If the user selects "do not search" (NO in step ST87), the processing ends as it is. Likewise, in the specific example, if the user selects "No" (NO in step ST87), the processing ends without searching for and setting a route.
Although Embodiment 9 has been described based on Embodiment 1, the necessity of the route search may similarly be confirmed in Embodiments 2 to 8.
The block diagram showing an example of a navigation device according to Embodiment 10 of the present invention has the same configuration as the block diagram shown in FIG. 1 in Embodiment 1, so its illustration and description are omitted. In Embodiment 10 described below, compared with Embodiment 1, when a plurality of routes are searched, the route determination unit 10 presents those routes to the user.
The route determination unit 10 in Embodiment 10 determines whether there are a plurality of routes to be searched and, when there are, searches for the plurality of routes and presents them to the user.
The processing in steps ST91 to ST95 is the same as steps ST01 to ST05 in the flowchart of FIG. 5 in Embodiment 1, so its description is omitted. In Embodiment 10, the route determination unit 10 determines whether there are a plurality of routes to be searched (step ST96); if there are (YES in step ST96), it acquires the positions of the point names extracted by the keyword extraction unit 5 with reference to the map data, and searches all routes based on the acquired position information, the route setting operation acquired in step ST95, and the current position of the own vehicle (moving body) acquired by the own-vehicle position acquisition unit (position acquisition unit) 9 (step ST97). The presentation control unit 21 is then instructed to present the plurality of searched routes to the user, and they are presented via the display unit 22 or the voice output unit 23 (step ST98). Then, the route selected by the user is set (step ST99).
FIG. 24 is a diagram showing an example of the route setting expression storage unit 4 in Embodiment 10. As shown in this figure, the route setting expression storage unit 4, as in FIG. 17, stores operation expressions such as "want to go", "stop by", and "take a break" as route setting expressions related to route setting operations; it also stores route setting expressions representing times such as "today", "tomorrow", and "next time", and further stores correction/cancellation expressions such as "not ... but", "stop it", "quit", "let's quit", and "delete" as route setting expressions.
The block diagram showing an example of a navigation device according to Embodiment 11 of the present invention has the same configuration as the block diagram shown in FIG. 1 in Embodiment 1, so its illustration and description are omitted. Embodiment 11 described below, like Embodiment 10, assumes that a plurality of routes are searched; compared with Embodiment 10, when a cancellation expression that deletes a point set in a route is extracted, the route determination unit 10 searches for and sets a new route from which that point has been deleted.
The route setting expression storage unit 4 in Embodiment 11, like the one shown in FIG. 24, includes correction/cancellation expressions in addition to expressions representing route setting operations and expressions representing times. The keyword extraction unit 5 also extracts these correction/cancellation expressions.
FIG. 27 is a block diagram showing an example of a navigation device according to Embodiment 12 of the present invention. The same components as those described in Embodiments 1 to 11 are denoted by the same reference numerals, and redundant description is omitted. Embodiment 12 described below, like Embodiment 10, assumes that a plurality of routes are searched; compared with Embodiment 10 (whose block diagram is the same as FIG. 1 of Embodiment 1), it further includes a presentation method storage unit 16 and a presentation method determination unit 17, and the method of presenting the plurality of routes is changed according to the importance of each route.
The route determination unit 10 in Embodiment 12 searches for the plurality of routes and presents each route to the user by the presentation method determined by the presentation method determination unit 17.
The processing in steps ST111 to ST115 is the same as steps ST91 to ST95 in the flowchart of FIG. 22 in Embodiment 10, so its description is omitted. In Embodiment 12, the presentation method determination unit 17 determines whether there are a plurality of routes to be searched (step ST116); if there are (YES in step ST116), it refers to the presentation method storage unit 16, determines the presentation method, and outputs it to the route determination unit 10 (step ST117). The route determination unit 10 then acquires the positions of the point names extracted by the keyword extraction unit 5 with reference to the map data, and searches all routes based on the acquired position information, the route setting operation acquired in step ST115, and the current position of the own vehicle (moving body) acquired by the own-vehicle position acquisition unit (position acquisition unit) 9 (step ST118). The presentation control unit 21 is then instructed to present the plurality of searched routes to the user by the presentation method determined in step ST117 according to the importance of each route, and they are presented via the display unit 22 or the voice output unit 23 (step ST119). Then, the route selected by the user is set (step ST120).
Here, the method of determining the importance will be described. As one example of the determination method, a method of giving higher importance to a route that appears later in the conversation is conceivable. FIG. 30 is a table in which importance and a presentation method have been added to the route acquired from the recognition result of A (number 1) and the route acquired from the recognition result of B (number 2). Regarding the importance, the later utterance information B is given importance 1 and the earlier utterance information A is given importance 2. It is also shown that, by the presentation method determined in step ST117, the route of importance 1 (number 2) is presented in red and the route of importance 2 (number 1) in yellow.
Here, when the user selects the desired route, for example by touching it, on the screen presented to the user as shown in FIG. 31(a), the route determination unit 10 sets the selected route (step ST120).
The block diagram showing an example of a navigation device according to Embodiment 13 of the present invention has the same configuration as the block diagram shown in FIG. 14 in Embodiment 6, so its illustration and description are omitted. In Embodiment 13 described below, when only a route setting expression is extracted and no point name is extracted, a facility corresponding to the route setting expression is searched for based on the time, and that facility is presented, or a route passing through that facility is searched for and set.
In Embodiment 13, the route setting expression storage unit 4 includes time expressions, as shown for example in FIG. 17 or FIG. 24, and in particular includes time expressions of the form "at ... o'clock". The route setting operation storage unit 6 also has route setting operations set for the case where no point name corresponding to a route setting expression is extracted, as shown for example in FIG. 32. Here, for the route setting expression "take a break", the route setting operation "display service areas or restaurants" is associated with the case where no point name is extracted.
The processing in steps ST121 to ST125 is the same as steps ST01 to ST05 in the flowchart of FIG. 5 in Embodiment 1, so its description is omitted. In Embodiment 13, when a route setting expression has been extracted but no corresponding point name has been extracted in step ST123 (YES in step ST126), it is further determined whether a time expression has been extracted (step ST127).
In FIG. 34, steps ST121 to ST130 and ST132 are the same as those in FIG. 33, so their description is omitted. When the facility found in step ST128 or ST129 is displayed (step ST130), the user is further asked to confirm whether or not to set a route with that facility as a waypoint (step ST141).
FIG. 35(b) shows the state in which, when "Yes" is selected by the user in the dialog 34 displayed in FIG. 35(a), a route is set with the searched facility (Restaurant XX) as a waypoint, and the newly set route 33' is displayed.
On the other hand, if "No" is selected by the user in step ST142 (that is, "set" is not selected) (NO in step ST142), the processing ends as it is.
FIG. 36 is an example of a screen on which a plurality of searched facilities are displayed. In FIG. 36, on the screen on which the navigation screen 31 shows the triangular own-vehicle mark 32 indicating the current position of the own vehicle (moving body) and the currently traveled set route 33, five facilities predicted to be reached around the extracted time expression "12 o'clock" are displayed: "Cafe XX", "Japanese Restaurant XX", "Restaurant XX", "Family Restaurant XX", and "Service Area XX".
When the user selects, for example, "Restaurant XX", the position information of that facility is acquired from the map data, and a route with that position set as a waypoint is searched for and set. As a result, as in FIG. 35(b), a route is set with the selected facility (Restaurant XX) as a waypoint, and the newly set route 33' is displayed.
Claims (16)
1. A navigation device comprising a position acquisition unit that acquires the position of a moving body and performing route guidance along a route based on the position of the moving body acquired by the position acquisition unit and map data, the navigation device comprising: a voice acquisition unit that detects and acquires input voice; a voice recognition unit that recognizes the voice data acquired by the voice acquisition unit at all times while the navigation device is active; a point name storage unit that stores place names and facility names as point names; a route setting expression storage unit that stores route setting expressions used when a user sets a route; a keyword extraction unit that refers to the point name storage unit and the route setting expression storage unit and extracts a point name and a route setting expression from the recognition result of the voice recognition unit; a route setting operation storage unit that stores route setting operations corresponding to the route setting expressions in association with the route setting expressions; a route setting operation acquisition unit that refers to the route setting operation storage unit and acquires the corresponding route setting operation based on the route setting expression extracted by the keyword extraction unit; and a route determination unit that searches for a route to the point identified from the point name extracted by the keyword extraction unit based on the route setting operation acquired by the route setting operation acquisition unit, and sets the searched route.
2. The navigation device according to claim 1, further comprising a point identification unit that, when there are a plurality of points identified by one point name extracted by the keyword extraction unit, identifies one point from the plurality of points, wherein the route determination unit searches for a route to the point identified by the point identification unit based on the route setting operation acquired by the route setting operation acquisition unit, and sets the searched route.
3. The navigation device according to claim 2, wherein the point identification unit identifies one point from the plurality of points based on the position of the moving body acquired by the position acquisition unit and the positions of the plurality of points.
4. The navigation device according to claim 2, further comprising a famousness storage unit that stores the famousness of points, wherein the point identification unit identifies one point from the plurality of points based on the famousness stored in the famousness storage unit.
5. The navigation device according to claim 1, wherein the route determination unit calculates the length of the searched route and does not set the route when the calculated length of the route is greater than a predetermined threshold.
6. The navigation device according to claim 1, wherein the route determination unit calculates the time required to reach the destination of the searched route and does not set the route when the calculated required time is greater than a predetermined threshold.
7. The navigation device according to claim 1, further comprising a time acquisition unit that acquires the current time, wherein the route determination unit calculates the estimated time of arrival at the destination of the searched route based on the current time acquired by the time acquisition unit, and does not set the route when the calculated estimated arrival time is later than a predetermined time.
8. The navigation device according to claim 1, wherein the route determination unit determines whether a route has already been set, and when it is determined that a route has already been set, searches for a route to which the point identified from the point name extracted by the keyword extraction unit is added as a waypoint, regardless of the route setting operation acquired by the route setting operation acquisition unit, and sets the searched route.
9. The navigation device according to claim 8, wherein, when adding the point identified from the point name extracted by the keyword extraction unit as the waypoint, the route determination unit determines whether the identified point is within a predetermined range from the position of the moving body acquired by the position acquisition unit or from the destination of the already set route; when it is determined that the point is within the predetermined range, the route determination unit searches for a route to which the identified point is added as the waypoint and sets the searched route, and when it is determined that the identified point is within the predetermined range of neither the position of the moving body nor the destination of the route, the route determination unit does not set a route to which the identified point is added as a waypoint.
10. The navigation device according to claim 1, wherein the route setting expression storage unit includes route setting expressions representing times, the keyword extraction unit refers to the route setting expression storage unit and also extracts a route setting expression representing a time from the recognition result of the voice recognition unit, and the route determination unit does not set the route when the route setting expression representing a time extracted by the keyword extraction unit is an expression representing a future time beyond a predetermined point.
11. The navigation device according to claim 1, further comprising a route setting necessity selection unit that lets the user select whether a route search by the route determination unit is necessary, wherein the route determination unit searches for the route when "search" is selected via the route setting necessity selection unit, and does not search for the route when "do not search" is selected.
12. The navigation device according to claim 1, wherein, when a plurality of routes are found as a result of searching for the route, the route determination unit presents the plurality of routes and selects and sets one route from among the plurality of routes.
13. The navigation device according to claim 12, wherein the route setting expression storage unit includes cancellation/correction expressions used when cancelling or correcting, the keyword extraction unit refers to the route setting expression storage unit and also extracts a cancellation/correction expression from the recognition result of the voice recognition unit, and the route determination unit changes the route setting based on the cancellation/correction expression extracted by the keyword extraction unit.
14. The navigation device according to claim 12, further comprising: a presentation method storage unit that stores the importance of the routes to be presented and presentation methods corresponding to the importance; and a presentation method determination unit that refers to the presentation method storage unit and determines a presentation method according to the importance of each of the plurality of routes, wherein the route determination unit presents the plurality of routes based on the presentation method determined by the presentation method determination unit.
15. The navigation device according to claim 1, wherein the route setting operation storage unit stores, in addition to route setting operations for the case where a point name and a route setting expression are extracted by the keyword extraction unit, route setting operations for the case where only a route setting expression is extracted by the keyword extraction unit and no point name is extracted, in association with the route setting expression together with the facility corresponding to the route setting expression, and the route determination unit, when only a route setting expression is extracted by the keyword extraction unit and no point name is extracted, searches for and presents the facility corresponding to the extracted route setting expression.
16. A navigation method having a step in which a position acquisition unit acquires the position of a moving body and a step of performing route guidance along a route based on the position of the moving body acquired by the position acquisition unit and map data, the navigation method comprising: a step in which a voice acquisition unit detects and acquires input voice; a step in which a voice recognition unit recognizes the voice data acquired by the voice acquisition unit at all times while the navigation device is active; a step in which a point name storage unit stores place names and facility names as point names; a step in which a route setting expression storage unit stores route setting expressions used when a user sets a route; a step in which a keyword extraction unit refers to the point name storage unit and the route setting expression storage unit and extracts a point name and a route setting expression from the recognition result of the voice recognition unit; a step in which a route setting operation storage unit stores route setting operations corresponding to the route setting expressions in association with the route setting expressions; a step in which a route setting operation acquisition unit refers to the route setting operation storage unit and acquires the corresponding route setting operation based on the route setting expression extracted by the keyword extraction unit; and a step in which a route setting unit searches for a route to the point identified from the point name extracted by the keyword extraction unit based on the route setting operation acquired by the route setting operation acquisition unit, and sets the searched route.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013542802A JP5762554B2 (ja) | 2011-11-10 | 2012-06-05 | ナビゲーション装置および方法 |
DE112012004711.7T DE112012004711T5 (de) | 2011-11-10 | 2012-06-05 | Navigationsvorrichtung und Verfahren |
US14/131,658 US8965697B2 (en) | 2011-11-10 | 2012-06-05 | Navigation device and method |
CN201280055224.1A CN103917847B (zh) | 2011-11-10 | 2012-06-05 | 导航装置及方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011006293 | 2011-11-10 | ||
JPPCT/JP2011/006293 | 2011-11-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013069172A1 true WO2013069172A1 (ja) | 2013-05-16 |
Family
ID=48288907
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/003678 WO2013069172A1 (ja) | 2011-11-10 | 2012-06-05 | ナビゲーション装置および方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US8965697B2 (ja) |
CN (1) | CN103917847B (ja) |
DE (1) | DE112012004711T5 (ja) |
WO (1) | WO2013069172A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016017752A (ja) * | 2014-07-04 | 2016-02-01 | 本田技研工業株式会社 | 情報処理システムおよび情報処理方法 |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103608804B (zh) * | 2011-05-24 | 2016-11-16 | 三菱电机株式会社 | 字符输入装置及包括该字符输入装置的车载导航装置 |
CN104344830A (zh) * | 2014-10-31 | 2015-02-11 | 成都众易通科技有限公司 | 一种车载导航系统 |
CN104535074A (zh) * | 2014-12-05 | 2015-04-22 | 惠州Tcl移动通信有限公司 | 基于蓝牙耳机的语音导航方法、系统和终端 |
DE112014007288T5 (de) * | 2014-12-26 | 2017-09-07 | Mitsubishi Electric Corporation | Spracherkennungssystem |
US9628415B2 (en) * | 2015-01-07 | 2017-04-18 | International Business Machines Corporation | Destination-configured topic information updates |
CN107532909B (zh) * | 2015-07-17 | 2022-01-04 | 松下电器(美国)知识产权公司 | 飞行路线生成方法、飞行路线显示装置以及记录介质 |
EP3156765A1 (de) * | 2015-10-16 | 2017-04-19 | Inventio AG | Wegeleitung von personen mit akustischen signalen |
WO2017157078A1 (en) | 2016-03-16 | 2017-09-21 | Beijing Didi Infinity Technology And Development Co., Ltd. | System and method for determining location |
US10365887B1 (en) * | 2016-03-25 | 2019-07-30 | Amazon Technologies, Inc. | Generating commands based on location and wakeword |
US10139243B2 (en) * | 2016-04-30 | 2018-11-27 | Toyota Motor Engineering & Manufacturing North America, Inc. | High level instruction for navigational routing systems |
US10708313B2 (en) * | 2016-12-30 | 2020-07-07 | Google Llc | Multimodal transmission of packetized data |
CN108286974B (zh) * | 2017-01-09 | 2020-10-30 | 北京四维图新科技股份有限公司 | 外业采集数据的智能处理方法和装置、以及混合导航系统 |
CN108540677A (zh) * | 2017-03-05 | 2018-09-14 | 北京智驾互联信息服务有限公司 | 语音处理方法及系统 |
CN109101475B (zh) * | 2017-06-20 | 2021-07-27 | 北京嘀嘀无限科技发展有限公司 | 出行语音识别方法、系统和计算机设备 |
CN110770819B (zh) | 2017-06-15 | 2023-05-12 | 北京嘀嘀无限科技发展有限公司 | 语音识别系统和方法 |
CN107220022A (zh) * | 2017-07-07 | 2017-09-29 | 上海思依暄机器人科技股份有限公司 | 一种控制开启导航功能的方法和装置 |
JP7002823B2 (ja) * | 2018-12-06 | 2022-01-20 | アルパイン株式会社 | 案内音声出力制御システムおよび案内音声出力制御方法 |
IL275789B1 (en) | 2020-07-01 | 2024-10-01 | Strauss Water Ltd | Installation for heating and pouring water |
TWI825468B (zh) * | 2021-08-25 | 2023-12-11 | 財團法人資訊工業策進會 | 導航裝置及方法 |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000181485A (ja) * | 1998-12-14 | 2000-06-30 | Toyota Motor Corp | 音声認識装置及び方法 |
JP2002048572A (ja) * | 2000-08-01 | 2002-02-15 | Alpine Electronics Inc | 位置検出システムおよびナビゲーションシステム |
JP2002221430A (ja) * | 2001-01-29 | 2002-08-09 | Sony Corp | ナビゲーション装置、ナビゲーション方法及びナビゲーション装置のプログラム |
JP2005283239A (ja) * | 2004-03-29 | 2005-10-13 | Xanavi Informatics Corp | ナビゲーション装置 |
JP2006024194A (ja) * | 2004-06-08 | 2006-01-26 | Matsushita Electric Ind Co Ltd | 待ち合わせ場所決定装置およびその方法 |
JP2010277575A (ja) * | 2009-04-27 | 2010-12-09 | Buaru Kenkyusho:Kk | 経路探索システム、経路探索方法及びコンピュータプログラム |
JP2011169622A (ja) * | 2010-02-16 | 2011-09-01 | Sanyo Electric Co Ltd | 移動体ナビゲーション装置 |
JP2011185601A (ja) * | 2010-03-04 | 2011-09-22 | Alpine Electronics Inc | ナビゲーション装置および経路探索方法 |
Family Cites Families (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4036966B2 (ja) | 1998-05-29 | 2008-01-23 | クラリオン株式会社 | ナビゲーションシステム及び方法並びにナビゲーション用ソフトウェアを記録した記録媒体 |
FR2803927B1 (fr) * | 2000-01-14 | 2002-02-22 | Renault | Procede et dispositif de commande d'equipements embarques sur un vehicule utilisant la reconnaissance vocale |
JP3567864B2 (ja) * | 2000-07-21 | 2004-09-22 | 株式会社デンソー | 音声認識装置及び記録媒体 |
US20020111810A1 (en) * | 2001-02-15 | 2002-08-15 | Khan M. Salahuddin | Spatially built word list for automatic speech recognition program and method for formation thereof |
JP3963698B2 (ja) | 2001-10-23 | 2007-08-22 | 富士通テン株式会社 | 音声対話システム |
JP2005316022A (ja) | 2004-04-27 | 2005-11-10 | Aisin Aw Co Ltd | ナビゲーション装置及びプログラム |
JP2006145331A (ja) | 2004-11-18 | 2006-06-08 | Kenwood Corp | ナビゲーション装置、ナビゲーション方法、およびナビゲーション用プログラム |
CN1959628A (zh) * | 2005-10-31 | 2007-05-09 | 西门子(中国)有限公司 | 一种人机交互导航系统 |
CN100375006C (zh) * | 2006-01-19 | 2008-03-12 | 吉林大学 | 车辆导航装置语音控制系统 |
JP4725731B2 (ja) | 2006-03-17 | 2011-07-13 | 株式会社デンソー | 車載用ナビゲーション装置 |
KR100819234B1 (ko) * | 2006-05-25 | 2008-04-02 | 삼성전자주식회사 | 네비게이션 단말의 목적지 설정 방법 및 장치 |
CN101162153A (zh) * | 2006-10-11 | 2008-04-16 | 丁玉国 | 一种语音控制的车载gps导航系统及其实现方法 |
CN101158584B (zh) * | 2007-11-15 | 2011-01-26 | 熊猫电子集团有限公司 | 车载gps的语音目的地导航实现方法 |
CN201266093Y (zh) * | 2008-09-27 | 2009-07-01 | 东莞美城电子电器有限公司 | 具有语音识别功能的导航系统 |
DE102010033189A1 (de) * | 2009-08-26 | 2011-03-17 | Navigon Ag | Verfahren zum Betrieb eines Navigationssystems |
US20110099507A1 (en) * | 2009-10-28 | 2011-04-28 | Google Inc. | Displaying a collection of interactive elements that trigger actions directed to an item |
DE102009051882A1 (de) * | 2009-11-04 | 2011-05-05 | Volkswagen Ag | Verfahren und Vorrichtung zur Spracheingabe für ein Fahrzeug |
EP2362186A1 (de) * | 2010-02-26 | 2011-08-31 | Deutsche Telekom AG | Bedieneinrichtung für elektronische Gerätefunktionen in einem Kraftfahrzeug |
JP5771902B2 (ja) | 2010-04-14 | 2015-09-02 | ソニー株式会社 | 経路案内装置、経路案内方法及びコンピュータプログラム |
CN101872362B (zh) * | 2010-06-25 | 2016-05-04 | 大陆汽车投资(上海)有限公司 | 动态语音标签信息查询系统及其信息查询方法 |
CN102063901A (zh) * | 2010-12-02 | 2011-05-18 | 深圳市凯立德欣软件技术有限公司 | 位置服务设备的语音识别方法及位置服务设备 |
US9230556B2 (en) * | 2012-06-05 | 2016-01-05 | Apple Inc. | Voice instructions during navigation |
US9182243B2 (en) * | 2012-06-05 | 2015-11-10 | Apple Inc. | Navigation application |
-
2012
- 2012-06-05 DE DE112012004711.7T patent/DE112012004711T5/de not_active Withdrawn
- 2012-06-05 US US14/131,658 patent/US8965697B2/en not_active Expired - Fee Related
- 2012-06-05 CN CN201280055224.1A patent/CN103917847B/zh not_active Expired - Fee Related
- 2012-06-05 WO PCT/JP2012/003678 patent/WO2013069172A1/ja active Application Filing
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000181485A (ja) * | 1998-12-14 | 2000-06-30 | Toyota Motor Corp | 音声認識装置及び方法 |
JP2002048572A (ja) * | 2000-08-01 | 2002-02-15 | Alpine Electronics Inc | 位置検出システムおよびナビゲーションシステム |
JP2002221430A (ja) * | 2001-01-29 | 2002-08-09 | Sony Corp | ナビゲーション装置、ナビゲーション方法及びナビゲーション装置のプログラム |
JP2005283239A (ja) * | 2004-03-29 | 2005-10-13 | Xanavi Informatics Corp | ナビゲーション装置 |
JP2006024194A (ja) * | 2004-06-08 | 2006-01-26 | Matsushita Electric Ind Co Ltd | 待ち合わせ場所決定装置およびその方法 |
JP2010277575A (ja) * | 2009-04-27 | 2010-12-09 | Buaru Kenkyusho:Kk | 経路探索システム、経路探索方法及びコンピュータプログラム |
JP2011169622A (ja) * | 2010-02-16 | 2011-09-01 | Sanyo Electric Co Ltd | 移動体ナビゲーション装置 |
JP2011185601A (ja) * | 2010-03-04 | 2011-09-22 | Alpine Electronics Inc | ナビゲーション装置および経路探索方法 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016017752A (ja) * | 2014-07-04 | 2016-02-01 | 本田技研工業株式会社 | 情報処理システムおよび情報処理方法 |
Also Published As
Publication number | Publication date |
---|---|
US8965697B2 (en) | 2015-02-24 |
US20140136109A1 (en) | 2014-05-15 |
DE112012004711T5 (de) | 2014-08-21 |
CN103917847B (zh) | 2017-03-01 |
CN103917847A (zh) | 2014-07-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2013069172A1 (ja) | ナビゲーション装置および方法 | |
JP6316524B2 (ja) | 施設情報案内装置、サーバ装置及び施設情報案内方法 | |
US7487038B2 (en) | Navigation device | |
JP5972372B2 (ja) | 車載情報装置 | |
JP2006195637A (ja) | 車両用音声対話システム | |
JP4466379B2 (ja) | 車載音声認識装置 | |
WO2013069060A1 (ja) | ナビゲーション装置および方法 | |
JP3948441B2 (ja) | 音声認識方法及び、車載装置 | |
JP5414951B2 (ja) | ナビゲーション装置、方法およびプログラム | |
EP1273887A2 (en) | Navigation system | |
JP4951934B2 (ja) | ナビゲーション装置、ナビゲーション方法、ナビゲーションシステム及びプログラム | |
JP4914632B2 (ja) | ナビゲーション装置 | |
JP5181533B2 (ja) | 音声対話装置 | |
JP4580230B2 (ja) | ナビゲーション装置 | |
JP5762554B2 (ja) | ナビゲーション装置および方法 | |
JP2000338993A (ja) | 音声認識装置、その装置を用いたナビゲーションシステム | |
JP4637793B2 (ja) | 施設検索装置 | |
JP2007025076A (ja) | 車載用音声認識装置 | |
WO2006028171A1 (ja) | データ提示装置、データ提示方法、データ提示プログラムおよびそのプログラムを記録した記録媒体 | |
WO2019124142A1 (ja) | ナビゲーション装置およびナビゲーション方法、ならびにコンピュータプログラム | |
JP2001083983A (ja) | 音声認識装置、音声認識のためのデータを記録した記録媒体、および、音声認識ナビゲーション装置 | |
JPWO2013069060A1 (ja) | ナビゲーション装置、方法およびプログラム | |
JP2005316022A (ja) | ナビゲーション装置及びプログラム | |
WO2013051072A1 (ja) | ナビゲーション装置、方法およびプログラム | |
JP4645708B2 (ja) | コード認識装置および経路探索装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12847164 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2013542802 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14131658 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1120120047117 Country of ref document: DE Ref document number: 112012004711 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12847164 Country of ref document: EP Kind code of ref document: A1 |