WO2013051072A1 - Navigation device, method, and program


Info

Publication number
WO2013051072A1
Authority
WO
WIPO (PCT)
Prior art keywords
guidance information
voice
route
unit
expression
Prior art date
Application number
PCT/JP2011/005659
Other languages
English (en)
Japanese (ja)
Inventor
武弘 重田
友紀 古本
悠希 住吉
尚嘉 竹裏
渡邉 圭輔
Original Assignee
三菱電機株式会社
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to PCT/JP2011/005659
Publication of WO2013051072A1

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00–G01C19/00
    • G01C21/26 — specially adapted for navigation in a road network
    • G01C21/34 — Route searching; Route guidance
    • G01C21/36 — Input/output arrangements for on-board computers
    • G01C21/3626 — Details of the output of route guidance instructions
    • G01C21/3629 — Guidance using speech or audio output, e.g. text-to-speech

Definitions

  • The present invention relates to a navigation apparatus, method, and program capable of recognizing the content of a user's utterance and performing navigation.
  • In an in-vehicle navigation device, for example, when a predetermined point (for example, an intersection at which the traveling direction is to be changed) is approached while traveling on a set route, road guidance is provided to the driver by voice output or graphic display.
  • However, although the navigation device can provide road guidance at predetermined points set in advance, it could not present, as its own guidance content, the road guidance that a passenger gives to the driver while driving.
  • A navigation device and navigation method are described in which guidance information that the passenger has given the driver is stored, and the most frequently used guidance utterances among the accumulated guidance are selected and presented to the user.
  • However, the presentation method is limited to “sound output by synthesized speech” and “screen output to the display of the navigation device”, so there is a problem that the information that can be presented to the user is limited.
  • There is also a problem that, because the same guidance is output whenever the driving-situation feature values satisfy the conditions, even at positions other than (or outside a predetermined range of) the position where the guidance was recorded, guidance suited to each individual road guidance point could not be presented.
  • The present invention has been made to solve the above-described problems, and an object thereof is to provide a navigation device, method, and program capable of a more flexible and user-friendly presentation, similar to the case where a passenger guides the driver.
  • A navigation device of the present invention includes: a voice acquisition unit that acquires input voice; a voice recognition unit that performs voice recognition processing on the voice data acquired by the voice acquisition unit; a voice recording unit that records the voice data acquired by the voice acquisition unit into a voice file; a route guidance expression storage unit that stores route guidance expressions; a route guidance expression extraction unit that extracts route guidance expressions; a travel route acquisition unit that acquires a travel route; an own vehicle position acquisition unit that acquires the position of the own vehicle; a route guidance information control storage unit that records, as route guidance information, the route guidance expression extracted by the route guidance expression extraction unit, the voice file recorded by the voice recording unit, and the travel route acquired by the travel route acquisition unit, in association with the own vehicle position acquired by the own vehicle position acquisition unit at the time the voice was acquired; and a road guidance information presentation unit that, when it is detected that the vehicle position is within a predetermined range of a vehicle position recorded in the road guidance information, outputs the road guidance information corresponding to that recorded vehicle position.
  • According to the navigation device of the present invention, the guidance information that the passenger has given the driver is stored, and the next time the vehicle travels on the same route, when the vehicle position comes near the position where the guidance information was stored, the guidance corresponding to that position is output accurately, so that a flexible and user-friendly presentation can be performed.
  • FIG. 1 is a block diagram illustrating an example of a navigation device according to Embodiment 1.
  • FIG. 2 is a diagram showing an example of the route guidance expression storage unit 4.
  • FIG. 3 is a diagram showing an example of the route guidance information table 11.
  • FIG. 4 is a flowchart showing the operation when registering route guidance information according to Embodiment 1.
  • FIG. 5 is a flowchart showing the operation when using route guidance information according to Embodiment 1.
  • FIG. 6 is a presentation example of route guidance information in Embodiment 1.
  • FIG. 7 is a block diagram illustrating an example of a navigation device according to Embodiment 2.
  • FIG. 8 is a flowchart illustrating the operation when using route guidance information according to Embodiment 2.
  • FIG. 9 is a block diagram illustrating an example of a navigation device according to Embodiment 3.
  • FIG. 10 is a diagram showing another example of the route guidance information table 11, to which the vehicle speed is added.
  • FIG. 11 is a flowchart illustrating the operation when using route guidance information according to Embodiment 3.
  • FIG. 12 is a block diagram illustrating an example of a navigation device according to Embodiment 4.
  • FIG. 13 is a flowchart illustrating the operation when using route guidance information according to Embodiment 4.
  • Further figures show: for Embodiment 5, a block diagram of the navigation device, an example of the ambiguous expression table, and a flowchart of the operation when using route guidance information; for Embodiment 6, a block diagram, an example of the route guidance expression conversion table, a flowchart of the operation when using route guidance information, and a presentation example in which the utterance content is converted into a graphic expression, or into a graphic expression and a character expression; for Embodiment 7, a block diagram, a flowchart of the operation when registering route guidance information, and a diagram illustrating an example of the unnecessary expression storage unit 19; and for Embodiment 8, a flowchart of the operation when using route guidance information and an explanatory drawing schematically showing the method of calculating the longitude and latitude of the output position of guidance.
  • FIG. 1 is a block diagram showing an example of a navigation apparatus according to Embodiment 1 of the present invention.
  • This navigation device comprises a voice acquisition unit 1, a voice recognition unit 2, a voice recording unit 3, a route guidance expression storage unit 4, a route guidance expression extraction unit 5, a map data storage unit 6, a time acquisition unit 7, a travel route acquisition unit 8, an own vehicle position acquisition unit 9, a road guidance information control storage unit 10, a road guidance information table 11, a road guidance information presentation control unit 12, and a road guidance information output unit 13.
  • In addition, a key input unit that acquires an input signal from keys, a touch panel, or the like is also provided.
  • The voice acquisition unit 1 performs A/D conversion on a user utterance collected by a microphone or the like, that is, on the input voice, and acquires it in, for example, PCM (Pulse Code Modulation) format.
  • The voice recognition unit 2 has a recognition dictionary (not shown), detects, from the voice data acquired by the voice acquisition unit 1, the voice section corresponding to the content spoken by a speaker such as a passenger, extracts its feature amount, and performs speech recognition processing using the recognition dictionary based on that feature amount. Note that the voice recognition unit 2 may use a speech recognition server on the network.
  • the voice recording unit 3 records the voice data acquired by the voice acquisition unit 1 as a voice file such as a WAV file.
  • The route guidance expression storage unit 4 stores expressions that a person would normally be assumed to use when guiding a route.
  • FIG. 2 is a diagram illustrating an example of the route guidance expression storage unit 4.
  • The route guidance expression storage unit 4 stores, for example, expressions indicating the timing and place of route guidance, such as “immediately”, “after a while”, “next”, “100 m ahead”, “200 m ahead”, “300 m ahead”, “first”, “second”, and “third”, and landmark expressions such as “intersection”, “three-way junction”, “signal”, “corner”, “end of the road”, “convenience store”, “bookstore”, “Japanese restaurant”, “family restaurant”, “gas station”, and “hospital”.
  • The route guidance expression extraction unit 5 performs morphological analysis with reference to the route guidance expression storage unit 4, and extracts route guidance expressions from the character string of the speech recognition result produced by the voice recognition unit 2.
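  The extraction step above can be sketched in Python. This is a minimal, hypothetical keyword-matching illustration: the actual unit 5 performs morphological analysis (the patent targets Japanese utterances), and the expression list below is an abbreviated stand-in for the contents of the route guidance expression storage unit 4.

```python
# Abbreviated stand-in for the route guidance expression storage unit 4.
GUIDANCE_EXPRESSIONS = [
    "immediately", "after a while", "next",
    "100 m ahead", "200 m ahead", "300 m ahead",
    "intersection", "signal", "corner", "convenience store", "bookstore",
]

def extract_guidance_expressions(recognized_text: str) -> list[str]:
    """Return the guidance expressions found in a speech-recognition
    result string, in order of appearance."""
    found = []
    for expr in GUIDANCE_EXPRESSIONS:
        if expr in recognized_text:
            found.append((recognized_text.index(expr), expr))
    return [expr for _, expr in sorted(found)]
```

  For instance, `extract_guidance_expressions("turn right at the signal 100 m ahead")` picks out the landmark and distance expressions in utterance order.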
  • the map data storage unit 6 stores map data such as road data, intersection data, facility data, and the like.
  • the time acquisition unit 7 acquires time using known information.
  • When a destination is set, the travel route acquisition unit 8 determines a travel route from the current location to the destination, and assigns an individual ID to the determined travel route.
  • the own vehicle position acquisition unit 9 acquires the current own vehicle position (longitude and latitude) using information acquired from a GPS receiver, a gyroscope, or the like.
  • The route guidance information control storage unit 10 stores, in the road guidance information table 11, the route guidance expression extracted by the route guidance expression extraction unit 5, the voice file recorded by the voice recording unit 3, the map data stored in the map data storage unit 6, the time acquired by the time acquisition unit 7, the travel route acquired by the travel route acquisition unit 8, and the own vehicle position acquired by the own vehicle position acquisition unit 9, in association with one another. Further, the route guidance information control storage unit 10 obtains the position (longitude and latitude) of the instruction point based on the utterance content, the utterance position (the own vehicle position at the time of utterance), and the map data, also calculates the distance from the utterance position to the instruction point, and stores this information in the route guidance information table 11 as well.
  • The route guidance information table 11 stores, for example, a travel route ID, the utterance content, the recorded voice file, the utterance time, the utterance position (longitude/latitude), the instruction point position (longitude/latitude), the direction of travel at the instruction point, and the distance to the instruction point.
  • In addition, the ID of the person who performed the road guidance may be recorded.
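  As an illustration, one row of the table described above might be modeled as follows; the field names and types are assumptions for this sketch, not the patent's schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GuidanceRecord:
    """One row of the route guidance information table 11 (field names assumed)."""
    travel_route_id: int
    utterance_text: str                    # speech-recognized utterance content
    voice_file: str                        # path to the recorded voice file (e.g. WAV)
    utterance_time: str                    # time of the utterance
    utterance_pos: tuple                   # (latitude, longitude) at the utterance
    instruction_pos: tuple                 # (latitude, longitude) of the instruction point
    direction: str                         # traveling direction at the instruction point
    distance_to_point_m: float             # distance from utterance position to instruction point
    guide_id: Optional[int] = None         # optional: ID of the person who gave the guidance
```
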
  • The route guidance information presentation control unit 12 edits and updates the data recorded in the route guidance information table 11, and performs control such as instructing the route guidance information output unit 13 which guidance (recorded voice file) to present to the user at which position (at which timing).
  • The route guidance information output unit 13 outputs the instructed route guidance information (guidance) to the user at the position (timing) indicated by the route guidance information presentation control unit 12, by screen output to a display or voice output to a speaker.
  • The route guidance information presentation control unit 12 and the route guidance information output unit 13 together constitute the route guidance information presentation unit 21 of the present invention.
  • FIG. 4 is a flowchart showing an operation at the time of registering route guidance information according to the first embodiment.
  • First, the own vehicle position acquisition unit 9 acquires the current own vehicle position (longitude and latitude), and the travel route acquisition unit 8 refers to the map data to determine a travel route from the current location to the destination (step ST01).
  • It is then determined whether the determined travel route has already been registered; if it is the same as a registered travel route, the same travel route ID is assigned to it (step ST03); otherwise, a new travel route ID is assigned (step ST04).
  • Next, the voice recognition unit 2 recognizes the voice acquired by the voice acquisition unit 1, and the voice recording unit 3 records it (step ST05).
  • The time acquisition unit 7 acquires the time at that moment (step ST06), and the own vehicle position acquisition unit 9 acquires the position (latitude and longitude) at that moment (step ST07).
  • Then, the route guidance expression extraction unit 5 extracts route guidance expressions, with reference to the route guidance expression storage unit 4, from the voice recognition result recognized by the voice recognition unit 2 in step ST05 (step ST08).
  • If no route guidance expression is extracted (NO in step ST09), the process returns to step ST05 and recording continues; if a route guidance expression is extracted (YES in step ST09), the voice recorded in step ST05 is stored as a voice file (step ST10).
  • Then, the route guidance information control storage unit 10, referring to the map data and the own vehicle position, identifies from the route guidance expression extracted in step ST08 the point indicated by the uttered guidance (the corner actually to be turned, the landmark actually to be passed; hereinafter referred to as the “instruction point”) (step ST11), acquires the traveling direction at the instruction point (step ST12), and calculates the distance between the own vehicle position and the instruction point (step ST13).
  • The various pieces of information acquired, stored, and calculated up to step ST13, that is, the speech-recognized utterance content, the recorded voice file, the own vehicle position, the instruction point, the direction of travel at the instruction point, the distance to the instruction point, the time, and the travel route ID, are stored together in the route guidance information table 11 (step ST14). Then, the processing of steps ST05 to ST14 is repeated until the destination is reached or a vocabulary representing the end or interruption of the route guidance, such as “arrival”, “end”, “interrupt”, or “stop”, is obtained (NO in step ST15).
  • Normally, the instruction points and other data are registered in this way while the passenger guides the driver along a route being traveled for the first time.
  • From the second trip onward, by storing data under the same travel route ID in the route guidance information table, the data for that travel route is further enriched, and a more detailed and easier-to-understand route guidance information table can be created.
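  The registration loop of steps ST05–ST15 can be sketched as follows. This is a simplified, hypothetical rendering: the utterance stream is simulated as plain tuples, expression extraction is reduced to keyword matching, and the instruction-point and distance computations of steps ST11–ST13 are omitted.

```python
END_WORDS = ["arrival", "end", "interrupt", "stop"]
GUIDANCE_WORDS = ["intersection", "signal", "corner", "ahead"]

def register_guidance(route_id, utterances):
    """Sketch of steps ST05-ST15. `utterances` simulates the stream of
    recognized passenger utterances as (text, (lat, lon), time) tuples;
    in the real device these come from units 1, 7, and 9. Returns the
    route guidance information table rows built for this trip."""
    table = []
    for text, pos, time in utterances:                    # ST05-ST07
        exprs = [w for w in GUIDANCE_WORDS if w in text]  # ST08 (keyword stand-in)
        if exprs:                                         # ST09 YES
            table.append({                                # ST10-ST14 (instruction
                "route_id": route_id,                     # point/distance omitted)
                "utterance": text,
                "expressions": exprs,
                "position": pos,
                "time": time,
            })
        if any(w in text for w in END_WORDS):             # ST15: end of guidance
            break
    return table
```

  Utterances that contain no guidance expression are simply skipped, mirroring the NO branch of step ST09.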
  • FIG. 5 is a flowchart showing an operation when using the route guidance information according to the first embodiment.
  • First, the own vehicle position acquisition unit 9 acquires the current own vehicle position (longitude and latitude), and the travel route acquisition unit 8 refers to the map data to determine a travel route from the current location to the destination (step ST21).
  • Then, the route guidance information presentation control unit 12 determines whether the same travel route as the determined one exists in the route guidance information table 11. If the already stored route guidance information table 11 contains no travel route identical to the determined one (NO in step ST22), there is no route guidance information that can be presented, and the processing ends. On the other hand, if the same travel route exists in the already stored route guidance information table 11 (YES in step ST22), the current own vehicle position (latitude and longitude) is acquired (step ST23).
  • Next, the own vehicle position (longitude and latitude) at the time of utterance on that travel route is acquired from the route guidance information table 11 (step ST24).
  • If the current position is within a predetermined range of the stored utterance position (YES in step ST25), the utterance content and the recorded voice are acquired from the route guidance information table 11 (step ST26) and presented to the user (step ST27).
  • The utterance content is presented by displaying character information, graphic information, or the like on the navigation screen, the windshield, or the like, and the recorded voice is presented by outputting the recorded voice file as it is.
  • The processing of steps ST23 to ST27 is repeated until the destination is reached or a vocabulary representing the end or interruption of route guidance, such as “arrival”, “end”, “interrupt”, or “stop”, is obtained (NO in step ST28).
  • In step ST25, it is determined whether the current own vehicle position (latitude and longitude) is “within a predetermined range” of the own vehicle position (latitude and longitude) at the time of utterance. This is because the longitude and latitude measured for the current own vehicle position at the time of this determination are not always exactly the same as the longitude and latitude stored in the table; if the current position is within the predetermined range, it is determined that the vehicle has almost reached the same position, and the guidance is presented.
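  The “within a predetermined range” test of step ST25 can be sketched with a great-circle distance check; the 30 m threshold below is an assumed value, since the patent only speaks of a predetermined range.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def distance_m(pos_a, pos_b):
    """Great-circle distance between two (latitude, longitude) pairs,
    in metres, using the haversine formula."""
    lat1, lon1 = map(radians, pos_a)
    lat2, lon2 = map(radians, pos_b)
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(h))

def within_range(current_pos, stored_pos, threshold_m=30.0):
    """Step ST25: treat the stored utterance position as reached when the
    current fix lies within the threshold (30 m is an assumed value)."""
    return distance_m(current_pos, stored_pos) <= threshold_m
```

  Comparing positions by distance rather than exact equality absorbs the GPS measurement jitter the paragraph above describes.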
  • FIG. 6 is an example showing how the route guidance information is presented to the user in step ST27.
  • The utterance content of the route guidance information is displayed as character information (a telop) 32, and the recorded voice file is output as voice.
  • The route guidance information output as voice is indicated by a balloon 33.
  • As described above, according to the first embodiment, the guidance information that the passenger has given the driver is stored, and the next time the vehicle travels on the same route, when the vehicle comes near the position where the guidance information was stored, the guidance corresponding to that position is output accurately, so that a flexible and user-friendly presentation can be performed.
  • FIG. 7 is a block diagram showing an example of a navigation apparatus according to Embodiment 2 of the present invention. Components identical to those described in Embodiment 1 are denoted by the same reference numerals, and redundant description is omitted.
  • Compared with Embodiment 1, Embodiment 2 described below further includes a speech synthesis unit 14.
  • When the “distance to the instruction point” stored in the route guidance information table 11 is short (for example, when the distance to the instruction point is only 50 m, as in data No. 3 of the route guidance information table 11 shown in FIG. 3), presenting the guidance at the own vehicle position (latitude and longitude) stored in the route guidance information table 11 would leave the user little time to act; therefore, the presentation position is changed, and the output content is also changed accordingly.
  • FIG. 8 is a flowchart showing an operation when using the route guidance information according to the second embodiment.
  • First, the own vehicle position acquisition unit 9 acquires the current own vehicle position (longitude and latitude), and the travel route acquisition unit 8 refers to the map data to determine a travel route from the current location to the destination (step ST21).
  • Then, the route guidance information presentation control unit 12 determines whether the same travel route as the determined one exists in the route guidance information table 11. If the already stored route guidance information table 11 contains no travel route identical to the determined one (NO in step ST22), there is no route guidance information that can be presented, and the processing ends.
  • On the other hand, if the same travel route exists (YES in step ST22), it is checked whether the route guidance information table 11 contains data whose “distance to the instruction point” is less than a predetermined value (in this second embodiment, less than 100 m) (step ST31).
  • The threshold of this determination condition (“less than the predetermined value”) is set to 100 m here, but may be set as appropriate, for example to 50 m.
  • If there is no data with a short distance to the instruction point (NO in step ST31), the same processing as steps ST23 to ST27 in the flowchart shown in FIG. 5 of the first embodiment is performed. On the other hand, if there is data with a short distance to the instruction point (YES in step ST31), the route guidance information presentation control unit 12 sets the distance to the instruction point of that data to a value equal to or greater than the predetermined value (here, 100 m), calculates the longitude and latitude corresponding to that distance, and overwrites and saves each of them (step ST32).
  • The distance after the change (a value equal to or greater than the predetermined value) is 100 m here, but may be set as appropriate to any distance equal to or greater than the predetermined value, such as 200 m or 300 m.
  • In addition, the changed distance and longitude/latitude are overwritten and updated in the route guidance information table 11 here, but they may instead be added as post-change data.
  • When the distance to the instruction point is changed in this way, the content of the utterance to be presented to the user is also changed (step ST33), and a synthesized speech is created accordingly and overwritten and saved; for example, a synthesized voice such as “100 m ahead” is created (step ST34).
  • For the speech synthesis, a known method may be used, such as storing in advance the voices of route guidance terms that are likely to be used.
  • If the determination in step ST25 is YES, the utterance content is acquired from the route guidance information table 11, the synthesized speech created in step ST34 is acquired (step ST35), and it is presented to the user (step ST36). Then, the process from step ST31 of the flowchart onward is repeated until the destination is reached or a vocabulary indicating the end or interruption of the route guidance, such as “arrival”, “end”, “interrupt”, or “stop”, is obtained (NO in step ST37).
  • Although the utterance content is updated by overwriting the route guidance information table 11 in this second embodiment, it may instead be added as post-change data.
  • Likewise, although the synthesized voice is not recorded in the route guidance information table 11 here, the table may be updated by overwriting the recorded voice with the synthesized voice, or the synthesized voice may be added as post-change data.
  • Furthermore, the processes of steps ST31 to ST34 are performed here when the route guidance information is used (during travel); however, they may instead be performed in advance at the time of registering the route guidance information shown in FIG. 4 of the first embodiment, updating the data in the route guidance information table 11.
  • As described above, according to the second embodiment, even when the route guidance information table 11 stored at the time of registration contains data whose distance to the instruction point is so short that it would be difficult for the user to act after perceiving the guidance, the position and content of the route guidance output can be changed to a position and content suitable for the user to perceive the guidance and act on it. As a result, a more flexible and user-friendly presentation can be performed.
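  Steps ST31–ST34 can be sketched as follows. The helper `route_point_at` is hypothetical, standing in for the map-data lookup that converts a distance before the instruction point into longitude and latitude; the rewritten utterance prefix stands in for the synthesized “100 m ahead” speech.

```python
MIN_PRESENT_DISTANCE_M = 100.0  # the predetermined value of Embodiment 2

def adjust_short_guidance(row, route_point_at):
    """Sketch of steps ST31-ST34: if the stored distance to the instruction
    point is under the predetermined value, move the presentation point back
    along the route to that distance before the instruction point and rewrite
    the utterance for speech synthesis. `route_point_at(instruction_pos, d)`
    is a hypothetical map-data helper returning the (lat, lon) on the route
    d metres before the instruction point."""
    if row["distance_to_point_m"] >= MIN_PRESENT_DISTANCE_M:
        return row                                            # ST31 NO: keep as-is
    row = dict(row)                                           # keep the original row intact
    row["distance_to_point_m"] = MIN_PRESENT_DISTANCE_M       # ST32: new distance
    row["position"] = route_point_at(row["instruction_pos"],
                                     MIN_PRESENT_DISTANCE_M)  # ST32: new lat/lon
    row["utterance"] = "100 m ahead, " + row["utterance"]     # ST33-ST34: new content
    return row
```

  Returning a copy instead of mutating corresponds to the variant in the text where the changed values are added as post-change data rather than overwritten.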
  • FIG. 9 is a block diagram showing an example of a navigation apparatus according to Embodiment 3 of the present invention. Components identical to those described in Embodiments 1 and 2 are denoted by the same reference numerals, and redundant description is omitted.
  • Compared with the preceding embodiments, Embodiment 3 further includes a vehicle speed acquisition unit 15, and changes the guidance position in consideration of the vehicle speed when the current vehicle speed is too high compared with that at the time the route guidance information was stored in the route guidance information table 11, that is, when it is considered difficult for the user to act according to the guidance even if it is presented at the position stored in the route guidance information table 11.
  • FIG. 10 is a diagram showing another example of the route guidance information table 11, in which the vehicle speed is stored in addition to the contents of the route guidance information table 11 shown in FIG. 3.
  • The vehicle speed may be acquired at the timing of steps ST06 and ST07 in the flowchart of FIG. 4 of the first embodiment; an illustration of this is omitted.
  • FIG. 11 is a flowchart showing an operation when using the route guidance information according to the third embodiment.
  • First, the own vehicle position acquisition unit 9 acquires the current own vehicle position (longitude and latitude), and the travel route acquisition unit 8 refers to the map data to determine a travel route from the current location to the destination (step ST21).
  • Then, the route guidance information presentation control unit 12 determines whether the same travel route as the determined one exists in the route guidance information table 11. If the already stored route guidance information table 11 contains no travel route identical to the determined one (NO in step ST22), there is no route guidance information that can be presented, and the processing ends. On the other hand, if the same travel route exists (YES in step ST22), the current vehicle speed is acquired (step ST41).
  • Next, the vehicle speed and the distance to the instruction point at the time the travel route was uttered are acquired from the route guidance information table 11 (step ST42). Then, based on the vehicle speed and the distance to the instruction point acquired from the route guidance information table 11, the appropriate distance to the instruction point at the vehicle speed currently being traveled is calculated, and it is updated and saved (overwritten) (step ST43).
  • This is because, for example, when the current vehicle speed is too high compared with the vehicle speed stored in the route guidance information table 11, presenting the guidance at the position stored in the route guidance information table 11 would leave the user no margin to act; if, for instance, the vehicle is traveling at 60 km/h, the distance to the instruction point is reset to 400 m or the like so that the guidance is presented to the user appropriately.
  • The relationship between the vehicle speed and the distance used for this resetting may be determined in advance and the resetting performed automatically by the navigation device.
  • Here, the reset distance to the instruction point and the corresponding position are updated by overwriting the route guidance information table 11, but they may instead be added as post-change data.
  • Furthermore, the processing of steps ST41 to ST43 is performed here when the route guidance information is used (during travel); however, it may instead be performed in advance at the time of registering the route guidance information shown in FIG. 4 of the first embodiment. In that case, it suffices to calculate the distance to the instruction point for several vehicle speeds different from the vehicle speed at the time of registration, instead of for the current vehicle speed.
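  The recalculation of step ST43 can be sketched as a proportional rescaling of the stored distance by the speed ratio, which preserves the time margin before the instruction point. Proportional scaling is an assumption, but it is consistent with the example of resetting the distance to 400 m when traveling at 60 km/h (double a stored 30 km/h with a stored 200 m distance).

```python
def rescale_distance(stored_distance_m, stored_speed_kmh, current_speed_kmh):
    """Step ST43 sketch: scale the presentation distance by the speed ratio
    so the user gets the same time margin before the instruction point.
    Falls back to the stored distance when the stored speed is unusable."""
    if stored_speed_kmh <= 0:
        return stored_distance_m
    return stored_distance_m * current_speed_kmh / stored_speed_kmh
```

  At equal speeds the distance is unchanged; at double the stored speed it doubles, matching the 200 m → 400 m example.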
  • In the above description, the position where the route guidance information is presented is changed. Alternatively, the content to be presented may be changed according to the vehicle speed: for example, when data No. 2 stored in the route guidance information table 11 is presented and the current vehicle speed is 60 km/h (twice the stored speed), the content may be changed to something like “go straight here”.
  • In that case, the utterance content may be updated and a synthesized speech may be generated and output.
  • As described above, according to the third embodiment, the vehicle speed acquisition unit 15 is further provided, so that even when the vehicle travels at a vehicle speed different from that stored in the route guidance information table 11 at the time of registration, making it difficult for the user to act after perceiving the guidance, the position at which the route guidance information is output can be changed to one suitable for the user to perceive the guidance and act on it; therefore, a more flexible and user-friendly presentation can be performed.
  • FIG. 12 is a block diagram showing an example of a navigation device according to Embodiment 4 of the present invention. Note that the same components as those described in Embodiments 1 to 3 are denoted by the same reference numerals, and redundant description is omitted.
  • Compared with the preceding embodiments, Embodiment 4 further includes a surrounding situation acquisition unit 16, and changes the guidance content when the current surroundings differ from those at the time the route guidance information was stored in the route guidance information table 11, for example when it is now night.
  • FIG. 13 is a flowchart showing an operation when using the route guidance information according to the fourth embodiment.
  • First, the own vehicle position acquisition unit 9 acquires the current own vehicle position (longitude and latitude), and the travel route acquisition unit 8 refers to the map data to determine a travel route from the current location to the destination (step ST21).
  • Then, the route guidance information presentation control unit 12 determines whether the same travel route as the determined one exists in the route guidance information table 11. If the already stored route guidance information table 11 contains no travel route identical to the determined one (NO in step ST22), there is no route guidance information that can be presented, and the processing ends. On the other hand, if the same travel route exists in the already stored route guidance information table 11 (YES in step ST22), the current own vehicle position (latitude and longitude) is acquired (step ST23).
  • the vehicle position (longitude and latitude) at the time of utterance of the travel route is acquired from the road guidance information table 11 (step ST24). If the vehicle position acquired in step ST23 is within a predetermined range from the vehicle position acquired in step ST24 (YES in step ST25), the current time is acquired (step ST51). Then, the time at the time of utterance (time when the route guidance information was recorded) is acquired from the route guidance information table 11 (step ST52), and if the current time is within a predetermined range from the time at the time of utterance (in step ST53). In the case of YES, the utterance content (2) and the recorded voice (3) are acquired from the route guidance information table 11 (step ST26) and presented to the user (step ST27).
  • Here, the "predetermined range" used when comparing times may be set appropriately by the user, for example to 3 hours. When the current time and the time of utterance differ by only a few hours, the situation is judged to be almost the same as at the time of registration, and the same processing as in the first embodiment may be performed.
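The position check in step ST25 and the time-window check in step ST53 can be sketched as follows. This is a minimal illustration, not taken from the specification: the function names, the 50 m radius, the 3-hour window, and the planar distance approximation are all assumptions.

```python
import math
from datetime import datetime, timedelta

def within_range(lat1, lon1, lat2, lon2, radius_m=50.0):
    """Step ST25 sketch: approximate planar distance check for nearby points."""
    # Rough metres-per-degree conversion, adequate over tens of metres.
    dy = (lat1 - lat2) * 111_320.0
    dx = (lon1 - lon2) * 111_320.0 * math.cos(math.radians(lat1))
    return math.hypot(dx, dy) <= radius_m

def same_time_band(now, recorded, window=timedelta(hours=3)):
    """Step ST53 sketch: is the current time within the user-set window of
    the time at which the route guidance information was recorded?"""
    return abs(now - recorded) <= window
```

Only when both checks pass would the recorded utterance and voice be presented as-is; otherwise the surrounding-situation path (steps ST54 onward) is taken.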
  • On the other hand, if the current time is not within the predetermined range of the time of utterance (NO in step ST53), the surrounding state acquisition unit 16 acquires the current surrounding state (step ST54).
  • The surrounding state acquisition unit 16 that acquires the current surrounding state may be, for example, a VICS (Vehicle Information and Communication System) receiver that acquires road conditions such as traffic jams and traffic restrictions, or a camera that photographs the traveling direction to check visibility; anything that can acquire the situation of the surroundings can be used. In the fourth embodiment, a camera that photographs the surroundings is used.
  • Then, in step ST55, the route guidance information presentation control unit 12 creates guidance content according to the acquired result.
  • For example, when the landmark ("Japanese restaurant" in data No. 6 of the route guidance information table 11 shown in FIG.) is outside its business hours and the surroundings are dark, the driver may not be able to recognize the "Japanese restaurant" even if "Turn left at the Japanese restaurant over there" is presented. Therefore, guidance content such as "Turn left at the second signal" is created and presented to the user (step ST56).
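The fallback in steps ST55–ST56 can be sketched as a simple choice between a landmark-based phrase and a landmark-free phrase. The function name, the use of business hours as a proxy for the landmark being recognizable, and the example phrases are illustrative assumptions, not the claimed method.

```python
def choose_guidance(now_hour, open_hour, close_hour, landmark_text, fallback_text):
    """Use the landmark-based phrase only while the landmark is open (and
    presumably lit and recognizable); otherwise fall back to guidance that
    does not depend on it."""
    if open_hour <= now_hour < close_hour:
        return landmark_text
    return fallback_text
```

A real implementation would combine several signals (time of day, camera-based visibility, VICS data) rather than business hours alone.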
  • Thereafter, the process from step ST23 onward in the flowchart is repeated.
  • In the above description, the surrounding situation acquisition unit 16 is used only when the current time differs from the time of utterance stored in the route guidance information table 11 (the time when the route guidance information was recorded) by more than the predetermined range. However, there are also cases where, for example, a large truck is parked so that the landmark cannot be confirmed while driving, or construction is being carried out only on that day, so the surrounding situation acquisition unit 16 may always be used.
  • Further, when the current time differs from the time of utterance (the time when the route guidance information was recorded) by more than the predetermined range, clear guidance content that can be understood even at night and regardless of the surrounding situation or time of day, such as "Turn left at the second signal", may be created and presented to the user.
  • As described above, according to the fourth embodiment, the surrounding situation acquisition unit 16 is further provided, and the guidance content is re-created and presented in accordance with at least the time when the route guidance information was recorded or the current surrounding situation. Therefore, even when the surroundings are dark at night and the landmark is difficult to see, that is, even when the situation differs from that at the time the route guidance information was registered, guidance content that the user can recognize can be presented, so a more flexible and user-friendly presentation can be made.
  • FIG. 14 is a block diagram showing an example of a navigation apparatus according to Embodiment 5 of the present invention. Note that the same components as those described in the first to fourth embodiments are denoted by the same reference numerals, and redundant description is omitted.
  • In the fifth embodiment, an ambiguous expression table 17 is further provided, and when the user's utterance content contains an ambiguous, difficult-to-understand expression, that expression is changed to an easy-to-understand specific expression and presented to the user.
  • The navigation apparatus also includes the speech synthesizer 14 described in the second embodiment; when utterance content that has been changed to an easy-to-understand specific expression is output as voice, that is, as an utterance not recorded by the voice recording unit 3, the voice synthesized by the speech synthesizer 14 is output.
  • FIG. 15 is a diagram illustrating an example of the ambiguous expression table 17.
  • FIG. 16 is a flowchart showing an operation when using the route guidance information according to the fifth embodiment.
  • the processing from step ST21 to ST26 is the same as that in the flowchart of FIG.
  • Next, the utterance content acquired from the route guidance information table 11 in step ST26 is compared with the expressions stored in the ambiguous expression table 17. If the utterance content does not include an ambiguous, difficult-to-understand expression (NO in step ST61), the utterance content and recorded voice acquired in step ST26 are presented (step ST27), as in the first embodiment.
  • On the other hand, if the utterance content contains an ambiguous, difficult-to-understand expression (YES in step ST61), a specific expression, such as the official name of the intersection or how many meters ahead it is, is obtained from the indicated point and the map data (step ST62). For example, if the utterance content is "Turn left at the intersection over there", the official name "XX intersection" is acquired from the map data, or a specific expression indicating distance, such as "the intersection 300 m ahead", is obtained based on the distance to the indicated point.
  • Then, the ambiguous, difficult-to-understand expression in the utterance content of the route guidance information table 11 is changed to the specific expression, such as the official name or a specific distance, and the utterance content is updated and saved (overwritten) (step ST63).
  • the changed utterance content is overwritten and updated in the route guidance information table 11, but may be added as changed data.
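The replacement performed in steps ST62–ST63 can be sketched as a table lookup followed by string substitution. The table entries, the function name, and the fallback wording are hypothetical; the specification leaves the matching method open.

```python
# Illustrative entries standing in for the ambiguous expression table 17.
AMBIGUOUS = ("the intersection over there", "over there", "around there")

def concretize(utterance, official_name=None, distance_m=None):
    """Replace an ambiguous place reference with the official intersection
    name from the map data, or with a concrete distance expression."""
    for term in AMBIGUOUS:
        if term in utterance:
            if official_name is not None:
                return utterance.replace(term, official_name)
            if distance_m is not None:
                return utterance.replace(term, f"the intersection {distance_m} m ahead")
    return utterance
```

The concretized string would then overwrite (or be added alongside) the stored utterance content, as described above.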
  • Thereafter, synthesized speech corresponding to the utterance content after the expression change is created by the speech synthesizer 14 (step ST64), and the changed utterance content is displayed as the telop 32 shown in FIG. and output as voice (step ST65).
  • As for the method of creating the synthesized speech, a known method may be used, as in the second embodiment.
  • When the synthesized voice is created, the recorded voice recorded by the voice recording unit 3 may also be used; that is, the synthesized voice may be created by processing and editing the voice waveform contained in the already recorded voice.
  • In the above description, the voice to be presented is not the recorded voice recorded at the time of utterance; instead, a synthesized voice corresponding to the changed utterance content is created and output. However, it is also possible to change only the displayed utterance content and use the recorded voice as the voice output.
  • In that case, the output voice still contains the ambiguous expression, but because the character information (telop) has been changed to a specific expression, the user can interpret the guidance as specific guidance information by understanding the displayed character information together with the voice.
  • As described above, according to the fifth embodiment, the ambiguous expression table 17 is further provided, and when the utterance content in the route guidance information table 11 contains an ambiguous, difficult-to-understand expression, it is changed to a specific expression before being presented. Therefore, even for guidance information containing ambiguous expressions that would make it difficult for the user to accurately grasp the indicated point, guidance with specific expressions can be presented, so a more flexible and user-friendly presentation can be made.
  • Further, since the speech synthesizer 14 is provided and synthesized speech is generated and output in accordance with the changed utterance content, guidance with specific expressions can also be presented by voice, so an even more flexible and user-friendly presentation can be made.
  • Furthermore, when the synthesized voice is created and output by editing the voice waveform contained in the already recorded voice, the specific guidance is presented in the voice of the person who gave the directions, so an even more flexible and user-friendly presentation can be made.
  • FIG. 17 is a block diagram showing an example of a navigation apparatus according to Embodiment 6 of the present invention. Note that the same components as those described in the first to fifth embodiments are denoted by the same reference numerals, and redundant description is omitted.
  • In the sixth embodiment, a route guidance expression conversion table 18 is further provided, and the user's utterance content is not displayed as character information as it is, as shown in FIG., but is converted into visual information that is easy to recognize visually, such as graphics, and displayed.
  • FIG. 18 is a diagram illustrating an example of the route guidance expression conversion table 18.
  • FIG. 19 is a flowchart showing an operation when using the route guidance information according to the sixth embodiment.
  • the processing from step ST21 to ST26 is the same as that in the flowchart of FIG.
  • Next, an expression stored in the route guidance expression conversion table 18 is detected from the utterance content acquired from the route guidance information table 11 in step ST26, and it is converted into visual information, such as the graphic expression defined in the route guidance expression conversion table 18, and displayed (step ST71).
  • FIG. 20 is a screen example of guidance information presented by converting the utterance content into visual information (a graphic expression, or a graphic expression together with a character expression) based on the route guidance expression conversion table 18.
  • FIG. 20A is a diagram showing the displayed navigation screen 31 and indicating that the utterance content spoken at that time is "turn right 200 meters ahead" (balloon 33). In this case, the same content as that described in the balloon 33 is presented as the audio output, but based on the route guidance expression conversion table 18 shown in FIG. 18, the visual information is presented as shown in FIG. 20B: the right-arrow graphic data (graphic expression) 33 is displayed above the intersection 200 m ahead, or the character expression 34 "turn right" is attached above the intersection 200 m ahead together with the right-arrow graphic data (graphic expression) 33. Then, until arriving at the destination or obtaining a vocabulary indicating the end or interruption of route guidance, such as "arrival", "end", "interrupt", or "stop" (NO in step ST73), the process from step ST23 onward in the flowchart is repeated.
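The conversion-table lookup of step ST71 can be sketched as a mapping from guidance phrases to graphic expressions, with the distance parsed out of the utterance. The glyphs, the table entries, and the function name are illustrative assumptions, not the actual contents of the route guidance expression conversion table 18.

```python
import re

# Illustrative rows standing in for the route guidance expression conversion table 18.
GUIDANCE_GLYPHS = {"turn right": "→", "turn left": "←", "go straight": "↑"}

def to_visual(utterance):
    """Detect a known guidance phrase in the utterance and return the graphic
    expression to draw, together with the distance it applies to (if any)."""
    m = re.search(r"(\d+)\s*m(?:eters)?", utterance)
    distance = int(m.group(1)) if m else None
    for phrase, glyph in GUIDANCE_GLYPHS.items():
        if phrase in utterance.lower():
            return {"glyph": glyph, "label": phrase, "distance_m": distance}
    return None
```

The returned glyph and distance would then position the arrow graphic on the map, as in the "turn right 200 meters ahead" example of FIG. 20.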
  • As described above, according to the sixth embodiment, the route guidance expression conversion table 18 is further provided, and expressions stored in the route guidance expression conversion table 18 are detected from the utterance content of the route guidance information table 11 and converted into visual information such as graphic expressions. Therefore, information that can be heard by voice can also be confirmed visually, preventing mistakes due to mishearing and eliminating the trouble of reading text, so a more flexible and user-friendly presentation can be made.
  • FIG. 21 is a block diagram showing an example of a navigation device according to Embodiment 7 of the present invention. Note that the same components as those described in the first to sixth embodiments are denoted by the same reference numerals, and redundant description is omitted.
  • In the seventh embodiment, an unnecessary expression storage unit 19 and an unnecessary expression extraction/deletion unit 20 are further provided, and when the route guidance expression extracted by the route guidance expression extraction unit 5 contains unnecessary expressions, such as cancellations, corrections, and non-reproducible expressions, such information is not stored in the route guidance information table 11.
  • FIG. 22 is a flowchart showing an operation at the time of registering route guidance information according to the seventh embodiment.
  • the processing from step ST01 to ST08 is the same as that in the flowchart of FIG.
  • FIG. 23 is a diagram illustrating an example of the unnecessary expression storage unit 19. It stores, for example, expressions that cancel or correct the content of the directions, such as "different", "wrong", and "not ...", as in "There, right. No, wrong, left."; phrases not directly related to the directions, such as "break" in "Let's take a break there. Go right."; and other non-reproducible phrases such as "follow me".
  • In this case, the route guidance information table 11 stores only the appropriate route guidance expression "there, left".
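The behaviour of the unnecessary expression extraction/deletion unit 20 can be sketched as a clause filter: filler clauses are dropped outright, and a correction marker retracts the directive that preceded it. This is a simplified sketch; the marker lists and the clause segmentation are assumptions.

```python
CORRECTION_MARKERS = ("wrong", "different", "not that")  # cancel/correct the previous directive
FILLER_MARKERS = ("take a break", "follow me")           # not reproducible as guidance

def filter_unnecessary(clauses):
    """Drop filler clauses; when a correction marker appears, also retract
    the directive it cancels (the most recently kept clause)."""
    kept = []
    for clause in clauses:
        low = clause.lower()
        if any(f in low for f in FILLER_MARKERS):
            continue
        if any(c in low for c in CORRECTION_MARKERS):
            if kept:
                kept.pop()  # the previous directive was corrected away
            continue
        kept.append(clause)
    return kept
```

Only the surviving clauses would be passed on for storage in the route guidance information table 11.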
  • As described above, according to the seventh embodiment, the unnecessary expression storage unit 19 and the unnecessary expression extraction/deletion unit 20 are further provided, and when unnecessary expressions not directly related to the route guidance, such as cancellations, corrections, and non-reproducible expressions, are included, such information is not stored in the route guidance information table 11. Therefore, unnecessary information is not stored, and a more flexible and user-friendly presentation can be provided when the route guidance is used.
  • Embodiment 8.
  • the block diagram showing an example of the navigation device according to the eighth embodiment of the present invention has the same configuration as the block diagram shown in FIG. 7 (an example of the navigation device according to the second embodiment), and therefore illustration and description thereof are omitted.
  • In the eighth embodiment described below, even when the vehicle travels in the direction opposite to the travel route at the time the route guidance information was registered, guidance information suitable for the reverse direction is created from the registered route guidance information and presented to the user.
  • FIG. 24 is a flowchart showing an operation when using the route guidance information according to the eighth embodiment.
  • First, the own vehicle position acquisition unit 9 acquires the current own vehicle position (longitude and latitude), and the travel route acquisition unit 8 refers to the map data to determine a travel route from the current location to the destination (step ST21).
  • Next, the route guidance information presentation control unit 12 determines whether or not the same travel route as the determined travel route exists in the route guidance information table 11. When no travel route identical to the determined route exists in the already stored route guidance information table 11 (NO in step ST22), there is no route guidance information that can be presented, and the process ends.
  • On the other hand, when the same travel route exists (YES in step ST22), it is determined whether or not the current travel route is in the direction opposite to the registered travel route (step ST91). For example, when there is an intersection at an instruction point where "right" was guided at the time of registering the route guidance, the vehicle turns right at that intersection if it is traveling in the same direction as the registered travel route (forward direction), but turns left at the same instruction point if it is traveling in the reverse direction.
  • Here, the direction is determined to be the forward direction if the current traveling direction at the points on the travel route stored in the route guidance information table 11 is the same as the stored traveling direction, and the reverse direction if it is opposite.
  • If, in step ST91, the travel route is in the same direction (forward direction) as the travel route at the time of registering the route guidance information (NO in step ST91), the route guidance information is presented to the user in the same manner as in steps ST23 to ST27 shown in FIG. 5 in the first embodiment.
  • On the other hand, when the travel route is in the direction opposite to that at the time of registration of the route guidance information (YES in step ST91), the current vehicle position (latitude and longitude) is acquired (step ST92), and the distance to the instruction point is acquired from the route guidance information table 11 (step ST93). Then, the instruction point corresponding to that distance is matched against the reverse-direction travel route (the current travel route), and the longitude and latitude of the output position (presentation position) are calculated (step ST94).
  • FIG. 25 is an explanatory diagram schematically showing a method of calculating the longitude and latitude of the output position.
  • In FIG. 25, the route entering from the upper right is the forward direction, and the route entering from the upper left is the reverse direction. The triangles in the figure indicate vehicle positions, and the black circle X on the T-junction indicates the instruction point. The route guidance information table 11 stores the longitude and latitude of the instruction point X, together with 300 m as the distance from the longitude and latitude at the time of utterance to the instruction point X. Point B, located 300 m before the instruction point X on the reverse-direction side, is then calculated from the position of the instruction point X and the distance (300 m) to the instruction point X, and is set as the current guidance information output position.
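The calculation of point B in step ST94 can be sketched as stepping back from the instruction point X along the reverse-direction approach bearing. The planar approximation and the function signature are assumptions; any geodetic method yielding a point 300 m before X on the current route would serve.

```python
import math

def output_position(instr_lat, instr_lon, approach_bearing_deg, distance_m):
    """Place the announcement point `distance_m` before the instruction
    point X along the reverse-direction approach bearing.
    Planar approximation; adequate over a few hundred metres."""
    lat_m = 111_320.0                                       # metres per degree latitude
    lon_m = 111_320.0 * math.cos(math.radians(instr_lat))   # metres per degree longitude
    # Step backwards (minus sign) along the approach bearing.
    d_lat = -distance_m * math.cos(math.radians(approach_bearing_deg)) / lat_m
    d_lon = -distance_m * math.sin(math.radians(approach_bearing_deg)) / lon_m
    return instr_lat + d_lat, instr_lon + d_lon
```

In practice the point would be snapped to the road network of the current travel route rather than computed on a straight bearing.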
  • Next, it is determined whether or not the current vehicle position is within a predetermined range of the output position (point B) calculated in step ST94. If the current vehicle position is within the predetermined range (YES in step ST95), the utterance content is acquired from the route guidance information table 11 (step ST96), and the acquired utterance content is converted in accordance with the current route (step ST97). For example, if the utterance registered on the forward travel route was "Turn right at the next intersection", it is converted to "Turn left at the next intersection" in accordance with the reverse travel route. Note that since the current vehicle position in FIG. 25 (the upper-left triangle in the figure) is not within the predetermined range of point B, the process returns to step ST92 to acquire the vehicle position again.
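The conversion of step ST97 can be sketched as a left/right swap over the words of the utterance. Real utterances would need more careful parsing, so this word-level swap is an illustrative simplification, not the claimed method.

```python
def mirror_directions(utterance):
    """Swap "left" and "right" when replaying guidance that was registered
    on the forward route while traveling the route in reverse."""
    swap = {"right": "left", "left": "right"}
    return " ".join(swap.get(word, word) for word in utterance.split())
```

For example, the forward-route phrase "next intersection to the right" becomes its reverse-route counterpart.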
  • Thereafter, synthesized speech corresponding to the utterance content converted in step ST97 is created by the speech synthesizer 14 (step ST98), and the converted utterance content and the created synthesized speech are presented (step ST99).
  • As for the method of synthesizing the voice, a known method may be used, as described in the second and fifth embodiments, or the voice recorded by the voice recording unit 3 may be utilized, as described in the fifth embodiment.
  • In the above description, the utterance content and the synthesized voice for the reverse direction are created and presented each time; however, route guidance information for the reverse-direction travel route may instead be created and registered in advance.
  • As described above, according to the eighth embodiment, reverse-direction route guidance information can be presented based on route guidance received only once. That is, when guidance is received while driving on the outbound trip, the guidance information can be presented on the return trip so that the user can drive back on his or her own, so a more flexible and user-friendly presentation can be made.
  • The navigation device of the present invention is not limited to a vehicle-mounted device; it can be applied to any device, such as a portable navigation device, that can perform navigation through voice interaction between the user and the device. Within the scope of the invention, any combination of the embodiments, any modification of any component of each embodiment, and omission of any component in each embodiment are possible.
  • the navigation device of the present invention can be applied to an in-vehicle navigation device or a portable navigation device capable of performing navigation by voice dialogue between a user and the device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

The present invention relates to a navigation device, method, or program that allows guidance information representing directions given by a fellow passenger to a driver to be stored and, when it is detected on a subsequent trip along the same route that the vehicle is near a position for which guidance information is stored, outputs the guidance corresponding to that position appropriately, thereby offering a more flexible and user-friendly presentation identical to that given by the fellow passenger to the driver.
PCT/JP2011/005659 2011-10-07 2011-10-07 Dispositif, procédé et programme de navigation WO2013051072A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2011/005659 WO2013051072A1 (fr) 2011-10-07 2011-10-07 Dispositif, procédé et programme de navigation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2011/005659 WO2013051072A1 (fr) 2011-10-07 2011-10-07 Dispositif, procédé et programme de navigation

Publications (1)

Publication Number Publication Date
WO2013051072A1 true WO2013051072A1 (fr) 2013-04-11

Family

ID=48043265

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/005659 WO2013051072A1 (fr) 2011-10-07 2011-10-07 Dispositif, procédé et programme de navigation

Country Status (1)

Country Link
WO (1) WO2013051072A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014235053A (ja) * 2013-05-31 2014-12-15 Pioneer Corp Display device, display method, and display program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11143485A (ja) * 1997-11-14 1999-05-28 Oki Electric Ind Co Ltd Speech recognition method and speech recognition device
JP2002215186A (ja) * 2001-01-12 2002-07-31 Auto Network Gijutsu Kenkyusho:Kk Speech recognition system
JP2004271249A (ja) * 2003-03-06 2004-09-30 Toyota Motor Corp Voice guidance device
JP2006317573A (ja) * 2005-05-11 2006-11-24 Xanavi Informatics Corp Information terminal
JP2010190773A (ja) * 2009-02-19 2010-09-02 Mitsubishi Electric Corp Navigation device and navigation method


Similar Documents

Publication Publication Date Title
EP0768638B1 (fr) Speech recognition, map display, and navigation device and method
US7386437B2 (en) System for providing translated information to a driver of a vehicle
US6067521A (en) Interrupt correction of speech recognition for a navigation device
JP4935145B2 (ja) Car navigation device
WO2013069060A1 (fr) Navigation device and method
JP2907079B2 (ja) Navigation device, navigation method, and automobile
US20140136109A1 (en) Navigation device and method
JP5414951B2 (ja) Navigation device, method, and program
JP2008026653A (ja) In-vehicle navigation device
KR101063607B1 (ko) Navigation system having a name search function using voice recognition, and method therefor
US7295923B2 (en) Navigation device and address input method thereof
JP4262837B2 (ja) Navigation method using voice recognition function
WO2013051072A1 (fr) Navigation device, method, and program
JP5546149B2 (ja) Guidance information generation device
JP2007265203A (ja) Conversion dictionary generation device and kanji conversion device
WO2019124142A1 (fr) Navigation device, navigation method, and computer program
JPH09114487A (ja) Speech recognition device, speech recognition method, navigation device, navigation method, and automobile
WO2006028171A1 (fr) Data presentation device, data presentation method, data presentation program, and recording medium containing the program
JPWO2013069060A1 (ja) Navigation device, method, and program
JP3955482B2 (ja) In-vehicle navigation system
JP2766402B2 (ja) In-vehicle navigator
JP2002107167A (ja) Navigation device
JP2877045B2 (ja) Speech recognition device, speech recognition method, navigation device, navigation method, and automobile
JPH1047986A (ja) Car navigation device and system
KR100455108B1 (ko) Speech recognition device, speech recognition method, map display device, navigation device, navigation method, and automobile equipped with navigation function

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11873560

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11873560

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP