JP2003130671A - Navigation system - Google Patents

Navigation system

Info

Publication number
JP2003130671A
Authority
JP
Japan
Prior art keywords
map information
controller
search target
unit
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2001323221A
Other languages
Japanese (ja)
Inventor
Kenta Kawahara
Atsushi Kono
Tsutomu Matsubara
Tatsuya Mitsugi
Kenichi Ogawa
Tadashi Suzuki
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp

Links

Abstract

PROBLEM TO BE SOLVED: In the conventional technology, it is difficult for the user to utter a direction and a distance with exact expressions, so the map may be scrolled in a direction and by a distance that the user did not intend. SOLUTION: The navigation system is provided with a map information storage unit 12 in which map information is stored; a voice input unit 15 that accepts designation of a search target by voice input; a voice recognition unit 16 that recognizes the search target accepted by the voice input unit 15; a controller 20 containing an azimuth sensor 22; a search unit 17 that, based on the recognized search target and the azimuth of the tip of the controller 20, searches the map information stored in the map information storage unit 12 for a search target existing in that azimuth; and a display control unit 18 that displays the search target found by the search unit 17 superimposed on the map information.

Description

Detailed Description of the Invention

[0001]

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a navigation device that can be installed in a vehicle such as an automobile or carried by a user, displays map information on a display, and guides a route to a destination.

[0002]

2. Description of the Related Art Conventionally, as a technique in this field, the one described in Japanese Patent Application Laid-Open No. 11-285083 is known. This conventional technique uses voice recognition to easily scroll a map displayed on a display. Specifically, when the user utters a "command for scrolling the map", a "scroll direction", and a "scroll distance", the scroll command detection unit recognizes the utterance. The map is then scrolled while the direction control unit controls the scroll direction and the distance control unit controls the scroll distance in accordance with the recognized phrases.

[0003]

However, in the conventional technique the user has to utter the "scroll direction" and the "scroll distance", and it is difficult to utter a direction and a distance with appropriate expressions, so the map may be scrolled in a direction and by a distance that the user did not intend.
Moreover, since three kinds of utterance, namely the "command for scrolling the map", the "scroll direction", and the "scroll distance", must be recognized by the scroll command detection unit and the like, the breaks between the utterances cannot always be determined, and the utterances could be misrecognized.

The present invention solves these problems, and its object is to provide a navigation device that allows the user to easily input information such as an intended direction and distance, and that can accurately recognize the content spoken by the user.

[0005]

A navigation device of the present invention is a navigation device for displaying a guide route from the present location to a destination, comprising: a map information storage unit in which map information is stored; a voice input unit that accepts designation of a search target by voice input; a voice recognition unit that recognizes the search target accepted by the voice input unit; a controller with a built-in azimuth sensor; a search unit that, based on the search target recognized by the voice recognition unit and the azimuth of the controller tip identified by the azimuth sensor, searches the map information stored in the map information storage unit for a search target existing in the azimuth of the controller tip; and a display control unit that superimposes the search target found by the search unit on the map information and displays it on a display.

Further, the navigation device of the present invention is a navigation device for displaying a guide route from the present location to a destination, comprising: a map information storage unit in which map information is stored; a voice input unit that accepts designation of a search target by voice input; a voice recognition unit that recognizes the search target accepted by the voice input unit; a controller with a built-in acceleration sensor; a search unit that, based on the search target recognized by the voice recognition unit and the angular velocity vector of the controller tip identified by the acceleration sensor, searches the map information stored in the map information storage unit for a search target existing at a position away from the current position by a distance proportional to the magnitude of the angular velocity vector; and a display control unit that superimposes the search target found by the search unit on the map information and displays it on the display.

Further, the navigation device of the present invention is a navigation device for displaying a guide route from the current location to a destination, comprising: a map information storage unit that stores map information; a voice input unit that accepts designation of a search target by voice input; a voice recognition unit that recognizes the search target accepted by the voice input unit; a controller with a built-in azimuth sensor and acceleration sensor; a search unit that, based on the search target recognized by the voice recognition unit, the azimuth of the controller tip identified by the azimuth sensor, and the angular velocity vector of the controller tip identified by the acceleration sensor, searches the map information stored in the map information storage unit for a search target existing in the azimuth of the controller tip at a position away from the current position by a distance proportional to the magnitude of the angular velocity vector; and a display control unit that superimposes the search target found by the search unit on the map information and displays it on the display.

Here, when a plurality of angular velocity vectors are detected by the acceleration sensor, the search unit searches for the search target based on the combined vector of these vectors.

Further, the navigation device of the present invention is a navigation device for displaying a guide route from the present location to a destination, comprising: a three-dimensional map information storage unit in which map information for a three-dimensional map is stored; a controller with a built-in inclination sensor; and a display control unit that reads out the map information stored in the three-dimensional map information storage unit, displays the three-dimensional map on the display, and changes the display angle of the three-dimensional map displayed on the display in accordance with the tilt angle of the controller tip identified by the inclination sensor.

[0010]

BEST MODE FOR CARRYING OUT THE INVENTION Preferred embodiments of a navigation device according to the present invention will be described below with reference to the accompanying drawings.

Embodiment 1. FIG. 1 is a block diagram showing the configuration of the navigation device according to the first embodiment.
As shown in the figure, 10 is the navigation device body, and 20 is a controller for operating the navigation device body 10. Further, 11 is a receiving unit that receives signals from the controller 20, 12 is a map information storage unit that stores map information, 13 is a display that displays the map information stored in the map information storage unit 12, 14 is a position detection unit that detects the current position, and 15 is a voice input unit that accepts designation of a search target by voice input. Note that the map information stored in the map information storage unit 12 includes an electronic map, road information, facility information, area information, and the like.

Further, 16 is a voice recognition unit that recognizes the search target accepted by the voice input unit 15; 17 is a search unit that, based on the search target recognized by the voice recognition unit 16 and the azimuth of the tip of the controller 20, searches the map information stored in the map information storage unit 12 for a search target existing in the azimuth of the tip of the controller 20; 18 is a display control unit that superimposes the search target found by the search unit 17 on the map information and displays it on the display 13; and 19 is a main body control unit that controls the units 11 to 18 of the navigation device body 10.

Reference numeral 21 is a transmission unit that transmits signals to the navigation device body 10, 22 is an azimuth sensor that detects which direction the tip of the controller 20 is facing, 23 is a push-button switch that turns voice input to the voice input unit 15 on and off, and 24 is a controller control unit that controls the units 21 to 23 of the controller 20.

Next, the operation of the navigation device according to the first embodiment will be described. First, when the position detection unit 14 detects the current position, the current position coordinate data is transmitted to the display control unit 18. The display control unit 18 reads the map information centered on the current position coordinate data from the map information storage unit 12 and displays it on the display 13. When the user looking at the map displayed on the display 13 wants to search for a target object in a given direction from the current position, as shown in FIG. 2, the user points the tip of the controller 20 in that direction, presses the switch 23, and utters the target object to start the search process.

Specifically, when the user presses the switch 23 with the tip of the controller 20 pointed in the desired direction, the azimuth sensor 22 detects which direction the tip of the controller 20 is facing. The ON signal of the switch 23 is transmitted from the transmission unit 21 under the control of the controller control unit 24 and is received by the receiving unit 11 of the navigation device body 10. This ON signal is given to the voice input unit 15 under the control of the main body control unit 19, and the voice input unit 15 becomes ready for voice input. Here, when the user utters, for example, "convenience store" as the target object, this utterance is accepted by the voice input unit 15 and recognized by the voice recognition unit 16. The target object data recognized by the voice recognition unit 16 is given to the search unit 17, and the azimuth data detected by the azimuth sensor 22 is likewise given to the search unit 17 via the transmission unit 21 and the receiving unit 11.

The search unit 17 searches the facility information included in the map information, stored in the map information storage unit 12 and centered on the current position detected by the position detection unit 14, for target object data existing in the direction indicated by the azimuth data. The number of search results to return can be set in advance. For example, when it is set to 5, the top 5 results are displayed on the display 13 in ascending order of distance from the current position. If fewer than the target number of 5 results are actually found, the search area is expanded and the search is performed again.

The search area is a fan-shaped area centered on the current position, extending x degrees to the left and right of the direction indicated by the azimuth data. As shown in FIG. 3A, x = 0 degrees is set in the first search, and only target object data lying on the straight line is searched. If the target number of results is not reached by this search, then, as shown in FIG. 3B, x is set to 10 degrees, for example, and target data existing in the fan-shaped search area extending 10 degrees to the left and right is searched. After that, the value of x is gradually increased and the search process is repeated; when the number of results reaches the target number, the search for the target object data ends.
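The widening fan search described here can be sketched as follows. The facility representation (names and local x/y coordinates in km), the step size, and the maximum fan width are illustrative assumptions, not values from the patent; only the "start at x = 0 and widen until enough results are found" behavior comes from the text.

```python
import math

def fan_search(facilities, bearing_deg, target_count=5, step_deg=10, max_half_width=90):
    """Widen a fan around `bearing_deg` until `target_count` facilities are found.

    `facilities` is a list of (name, x_km, y_km) tuples in a local frame
    centered on the current position, with bearing measured clockwise from +y.
    """
    hits = []
    half_width = 0  # first pass: a straight line (x = 0 degrees)
    while half_width <= max_half_width:
        hits = []
        for name, x, y in facilities:
            az = math.degrees(math.atan2(x, y)) % 360   # bearing of the facility
            diff = abs((az - bearing_deg + 180) % 360 - 180)  # angular offset
            if diff <= half_width:
                hits.append((math.hypot(x, y), name))
        if len(hits) >= target_count:
            # return the closest results, in ascending order of distance
            return [name for _, name in sorted(hits)[:target_count]]
        half_width += step_deg  # expand the fan and search again
    return [name for _, name in sorted(hits)]  # fewer than target_count exist
```

Sorting the `(distance, name)` pairs reproduces the "top results in ascending order of distance from the current position" behavior described above.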

After that, when the user designates a desired target from the plurality of targets displayed on the display 13, the guide route from the current position to the desired target is displayed on the display 13 under the control of the display control unit 18.

As described in detail above, according to the present embodiment, a target object existing in the direction intended by the user can be searched for easily with an extremely simple interface: the user simply utters the target object while pointing in the direction with the controller 20. In particular, it is very difficult to express an intended direction in words (for example, "north-northeast", "slightly south", "the direction of Mt. Fuji", and so on), but in the present embodiment the user does not need to speak the direction and only needs to point the controller 20, so the intended direction can be specified easily and the input work efficiency is significantly improved.

Further, since the user only needs to utter the target object while the switch 23 is on, the voice recognition unit 16 does not have to separate and recognize utterances with multiple contents. Therefore, the situation in which the breaks between utterances cannot be determined does not occur, and the recognition efficiency of the voice recognition unit 16 is significantly improved.

Embodiment 2. Next, the navigation device according to the second embodiment will be described. FIG. 4 is a block diagram showing the navigation device of the second embodiment. The second embodiment differs from the first embodiment shown in FIG. 1 in that an acceleration sensor 25 is additionally provided. The other components are the same as or equivalent to those in the first embodiment; they are given the same reference numerals, and their description is omitted.

Next, the operation of the navigation device according to the second embodiment will be described. First, when the position detection unit 14 detects the current position, the current position coordinate data is transmitted to the display control unit 18. The display control unit 18 reads the map information centered on the current position coordinate data from the map information storage unit 12 and displays it on the display 13. When the user looking at the map displayed on the display 13 wants to search for a target object in a given direction and at a given distance from the current position, as shown in FIG. 5, the user points the tip of the controller 20 in that direction and, while pressing the switch 23, swings the tip of the controller 20 down from above as if casting a lure with a fishing rod, then utters the target object to start the search process.

Specifically, when the user presses the switch 23 with the tip of the controller 20 pointed in the desired direction, the azimuth sensor 22 detects which direction the tip of the controller 20 is facing. When the user swings the tip of the controller 20 from top to bottom, the angular velocity vector of the tip of the controller 20 is detected by the acceleration sensor 25.

The ON signal of the switch 23 is transmitted from the transmission unit 21 under the control of the controller control unit 24 and is received by the receiving unit 11 of the navigation device body 10. This ON signal is given to the voice input unit 15 under the control of the main body control unit 19, and the voice input unit 15 becomes ready for voice input. Here, when the user utters, for example, "a hot spring somewhere" as the target object, this utterance is accepted by the voice input unit 15 and recognized by the voice recognition unit 16. The target object data recognized by the voice recognition unit 16 is given to the search unit 17, and the azimuth data detected by the azimuth sensor 22 and the angular velocity vector data detected by the acceleration sensor 25 are likewise given to the search unit 17 via the transmission unit 21 and the receiving unit 11.

The search unit 17 searches the facility information included in the map information, stored in the map information storage unit 12 and centered on the current position detected by the position detection unit 14, for target object data existing in the direction indicated by the azimuth data and at the distance specified by the angular velocity vector data. Here, the distance is specified from the angular velocity vector data based on, for example, a correspondence table between the magnitude of the angular velocity vector and the distance (the larger the angular velocity vector, the longer the distance).
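A minimal sketch of such a correspondence table follows. The thresholds (in degrees per second) and the distances (in km) are assumed illustrative values; the text only specifies the monotonic rule that a larger angular velocity vector maps to a longer distance.

```python
import bisect

# Assumed correspondence table: swing-speed thresholds in deg/s mapped to
# search distances in km. One more distance entry than thresholds, so every
# magnitude falls into some bucket.
SPEED_STEPS_DEG_S = [60, 120, 180, 240]
DISTANCE_KM = [5, 10, 20, 40, 80]

def swing_to_distance(angular_speed_deg_s):
    """Map the magnitude of the detected swing to a search distance in km."""
    i = bisect.bisect_right(SPEED_STEPS_DEG_S, angular_speed_deg_s)
    return DISTANCE_KM[i]
```

Any monotonically increasing mapping (a lookup table, a linear scale, or a clamped curve) would satisfy the behavior described; the table form matches the "correspondence table" wording above.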

When a target can be retrieved by this processing, an image in which the target is superimposed on the electronic map is displayed on the display 13. Further, by using a vibrator (not shown) built into the controller 20 to vibrate the controller 20 briefly once for each target object found, the user can grasp how many results were found. After that, when the user designates a desired target object from the plurality of target objects displayed on the display 13, the guide route from the current position to the desired target object is displayed on the display 13 under the control of the display control unit 18.

In the above processing, when no target object can be retrieved, the controller 20 may be vibrated for a long time to prompt the user to start again. In addition, when the point separated from the current position by the distance obtained by swinging the controller 20 from top to bottom lies on the sea or on a lake, the controller 20 may likewise be vibrated for a long time to prompt the user to start again.

Further, when no target object can be retrieved by the above processing, the user can shorten the distance to the search point by swinging the tip of the controller 20 from bottom to top, as if reeling in the line of a fishing rod. That is, when the user swings the tip of the controller 20 upward from below, the acceleration sensor 25 detects the angular velocity vector of the tip of the controller 20. The angular velocity vector data detected by the acceleration sensor 25 is given to the search unit 17 via the transmission unit 21 and the receiving unit 11, and the search unit 17 calculates the resultant vector of the previously given angular velocity vector data and the newly given angular velocity vector data, and retrieves target object data existing at the distance specified by that resultant vector. In this way, swinging the controller 20 down from above extends the distance to the search point, and swinging it up from below shortens it, so fine adjustment to bring the distance closer to the user's intention becomes easy.
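The cast-and-pull adjustment can be sketched by treating each swing as a signed contribution along the swing axis (downward positive, upward negative) and taking the distance from the running resultant. The linear scale factor is an assumed value for illustration; the patent only requires that the distance be proportional to the resultant's magnitude.

```python
# Assumed scale: km of search distance per deg/s of resultant swing speed.
KM_PER_DEG_S = 0.25

def resultant_distance(swings_deg_s):
    """Combine successive signed swing components and return the distance.

    Downward casts are positive, upward pulls negative, so a pull after a
    cast shortens the distance, mirroring the fishing-rod metaphor.
    """
    resultant = sum(swings_deg_s)          # signed sum along the swing axis
    return max(0.0, resultant) * KM_PER_DEG_S
```

For example, a 120 deg/s cast followed by a 40 deg/s pull yields a shorter distance than the cast alone, which is exactly the fine adjustment described above.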

As described in detail above, according to the present embodiment, a target object existing in the direction and at the distance intended by the user can be searched for easily with an extremely simple interface: the user points the controller 20, swings its tip down from above, and utters the target object. In particular, it is very difficult to express an intended distance in words (for example, "30 km ahead", "somewhere I can reach in an hour", and so on), but in the present embodiment the user does not need to speak the distance and only needs to swing the tip of the controller 20 up and down, so the intended distance can be specified easily and the input work efficiency is significantly improved.

Further, the operation of swinging the tip of the controller 20 up and down is game-like, so the user does not get bored. Furthermore, even when the swinging operation fails and the search result differs from the user's intention, the result may turn up unexpected information, which can also entertain the user.

Further, since the user only needs to utter the target object while the switch 23 is on, the voice recognition unit 16 does not have to separate and recognize utterances with multiple contents. Therefore, the situation in which the breaks between utterances cannot be determined does not occur, and the recognition efficiency of the voice recognition unit 16 is significantly improved.

Embodiment 3. Next, the navigation device according to the third embodiment will be described. FIG. 6 is a block diagram showing the navigation device of the third embodiment. The third embodiment differs from the first embodiment shown in FIG. 1 in that an inclination sensor 26 is provided instead of the azimuth sensor 22, a three-dimensional map information storage unit 30 is provided instead of the map information storage unit 12, and a three-dimensional map generation unit 31 that generates a three-dimensional map image is additionally provided. The other components are the same as or equivalent to those in the first embodiment; they are given the same reference numerals, and their description is omitted.

Next, the operation of the navigation device according to the third embodiment will be described. First, when the position detection unit 14 detects the current position, the current position coordinate data is transmitted to the three-dimensional map generation unit 31. The three-dimensional map generation unit 31 reads the three-dimensional map information centered on the current position coordinate data from the three-dimensional map information storage unit 30 and generates a three-dimensional map image. The depression angle of the viewpoint at this stage (the angle in the vertical plane between the horizontal plane and the line of sight) is a preset angle (for example, 30 degrees). The three-dimensional map image generated by the three-dimensional map generation unit 31 is given to the display control unit 18, which displays it on the display 13.
Next, when the user viewing the three-dimensional map displayed on the display 13 wants to change the display angle of the three-dimensional map, the user tilts the tip of the controller 20 vertically, presses the switch 23, and utters a display angle change command to start the display change process.

Specifically, when the user tilts the tip of the controller 20 by a given angle and presses the switch 23, the inclination of the tip of the controller 20 is detected by the inclination sensor 26. The ON signal of the switch 23 is transmitted from the transmission unit 21 under the control of the controller control unit 24 and is received by the receiving unit 11 of the navigation device body 10. This ON signal is given to the voice input unit 15 under the control of the main body control unit 19, and the voice input unit 15 becomes ready for voice input. Here, when the user utters, for example, "change the display angle" as the display angle change command, this utterance is accepted by the voice input unit 15 and recognized by the voice recognition unit 16. The display angle change command recognized by the voice recognition unit 16 is given to the three-dimensional map generation unit 31, and the tilt angle data detected by the inclination sensor 26 is likewise given to the three-dimensional map generation unit 31 via the transmission unit 21 and the receiving unit 11.

The three-dimensional map generation unit 31 confirms that the given command is a display angle change command and, based on the map information read from the three-dimensional map information storage unit 30, generates a three-dimensional map image whose viewpoint depression angle is the given tilt angle. The generated three-dimensional map image is given to the display control unit 18, which displays it on the display 13. In this way, the depression angle of the three-dimensional map image displayed on the display 13 can be adjusted in a manner that matches the movement in real space, and operability is greatly improved.
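The tilt-to-depression mapping can be sketched as follows. The 30-degree default comes from the text; the clamping range is an assumption added so that an extreme tilt cannot push the viewpoint into an unusable angle.

```python
# Default viewpoint depression angle (from the text); clamp limits are assumed.
DEFAULT_DEPRESSION_DEG = 30.0

def depression_from_tilt(tilt_deg, lo=10.0, hi=80.0):
    """Clamp the controller tilt angle into a usable viewpoint depression angle."""
    return min(hi, max(lo, tilt_deg))
```

A renderer would then regenerate the 3D map image with `depression_from_tilt(tilt)` as the camera's depression angle each time new tilt data arrives from the controller.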

As described in detail above, in the present embodiment, the depression angle of the three-dimensional map image displayed on the display 13 can be changed easily with an extremely simple interface: the user utters the display angle change command while tilting the tip of the controller 20. In particular, it is very difficult to express an intended depression angle in words (for example, "display from a little higher up", "tilt by 5 degrees", and so on), but in the present embodiment the user does not need to speak the angle and only needs to tilt the tip of the controller 20, so the intended depression angle can be specified easily and the input work efficiency is significantly improved.

Further, since the user only needs to utter the display angle change command while the switch 23 is on, the voice recognition unit 16 does not have to separate utterances with multiple contents. Therefore, the situation in which the breaks between utterances cannot be determined does not occur, and the recognition efficiency of the voice recognition unit 16 is significantly improved.

[0038]

With the navigation device according to the present invention, a target object existing in the direction intended by the user can be searched for easily with an extremely simple interface: the user simply utters the target object while pointing in the direction with the controller.

In particular, it is very difficult to express an intended direction in words (for example, "north-northeast", "slightly south", "the direction of Mt. Fuji", and so on), but with the present invention the user does not need to speak the direction and only needs to point the controller, so the intended direction can be specified easily and the input work efficiency is significantly improved.

[Brief description of drawings]

FIG. 1 is a block diagram showing a configuration of a navigation device according to a first embodiment.

FIG. 2 is a diagram showing an operation example of the navigation device according to the first embodiment.

3A and 3B are diagrams showing display examples on the display of the navigation device according to the first embodiment.

FIG. 4 is a block diagram showing a configuration of a navigation device according to a second embodiment.

FIG. 5 is a diagram showing an operation example of the navigation device according to the second embodiment.

FIG. 6 is a block diagram showing a configuration of a navigation device according to a third embodiment.

[Explanation of symbols]

10 ... Navigation device main body, 11 ... Receiving unit, 12 ... Map information storage unit, 13 ... Display, 14 ... Position detection unit, 15 ... Voice input unit, 16 ... Voice recognition unit, 17 ... Search unit, 18 ... Display control unit, 19 ... Main body control unit, 20 ... Controller, 21 ... Transmission unit, 22 ... Azimuth sensor, 23 ... Switch, 24 ... Controller control unit, 25 ... Acceleration sensor, 26 ... Inclination sensor, 30 ... 3D map information storage unit, 31 ... 3D map generation unit.


Claims (5)

[Claims]
1. A navigation device for displaying a guide route from a current location to a destination, a map information storage section storing map information, a voice input section for accepting designation of a search target by voice input, and the voice input section. On the basis of the voice recognition unit that recognizes the search target received by the controller, the controller that includes the azimuth sensor, the search target that is recognized by the voice recognition unit, and the azimuth of the controller tip specified by the azimuth sensor. A search unit for searching the map information stored in the map information storage unit for the search target existing in the orientation of the controller tip; and displaying the search target searched by the search unit on the map information. And a display control unit for displaying the navigation device on the display device.
2. A navigation device for displaying a guide route from a current location to a destination, a map information storage section in which map information is stored, a voice input section for accepting designation of a search target by voice input, and the voice input section. Based on the voice recognition unit that recognizes the search target received in, a controller that incorporates the acceleration sensor, the search target that is recognized by the voice recognition unit, and the angular velocity vector of the controller tip identified by the acceleration sensor. A search unit that searches the map information stored in the map information storage unit for the search target existing at a position away from the current position by a distance proportional to the magnitude of the angular velocity vector; And a display control unit that superimposes the search target on the map information and displays it on a display. Navigation device.
3. A navigation device for displaying a guide route from a current location to a destination, a map information storage section storing map information, a voice input section for accepting designation of a search target by voice input, and the voice input section. The voice recognition unit that recognizes the search target that is received by the controller, the controller that includes the azimuth sensor and the acceleration sensor, the search target that is recognized by the voice recognition unit, and the azimuth of the controller tip that is specified by the azimuth sensor. , The search target existing in the azimuth of the controller tip based on the angular velocity vector of the controller tip specified by the acceleration sensor and at a position away from the current position by a distance proportional to the magnitude of the angular velocity vector A search unit for searching the map information stored in the map information storage unit, And a display control unit for superimposing the search target searched for in (1) on the map information and displaying it on a display.
4. The navigation device described above, wherein, when a plurality of angular velocity vectors are detected by the acceleration sensor, the search unit searches for the search target based on a resultant vector obtained by combining those vectors.
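The search described in the claims above can be illustrated with a minimal sketch. Everything here is hypothetical: the patent does not disclose an implementation, so the function names (`combine_vectors`, `target_point`), the metre-based coordinate frame, and the proportionality constant `scale` are all assumptions made for illustration only.

```python
import math

def combine_vectors(vectors):
    """When several gesture vectors are detected, combine them
    component-wise into a single resultant before searching."""
    return (sum(v[0] for v in vectors), sum(v[1] for v in vectors))

def target_point(current, azimuth_deg, magnitude, scale=100.0):
    """Project a point from `current` (x east, y north, in metres)
    along `azimuth_deg` (clockwise from north) at a distance
    proportional to the gesture-vector magnitude."""
    distance = scale * magnitude
    rad = math.radians(azimuth_deg)
    return (current[0] + distance * math.sin(rad),
            current[1] + distance * math.cos(rad))
```

A search unit could then query the map information store for targets of the recognized category nearest to the point returned by `target_point`; the nearest-match policy itself is not specified in the claims.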
5. A navigation device for displaying a guide route from a current location to a destination, comprising: a three-dimensional map information storage unit in which map information for a three-dimensional map is stored; a controller incorporating an inclination sensor; and a display control unit for reading the map information stored in the three-dimensional map information storage unit to display the three-dimensional map on a display, and for changing the display angle of the three-dimensional map shown on the display according to the tilt angle of the controller tip specified by the inclination sensor.
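The tilt-driven display-angle change in the last claim can likewise be sketched. The class name `MapView3D`, the pitch-angle interpretation, and the clamping range are assumptions for illustration; the patent only states that the display angle follows the tilt angle of the controller tip.

```python
class MapView3D:
    """Hypothetical display-control state: the tilt angle reported by
    the controller's inclination sensor drives the pitch at which the
    three-dimensional map is rendered."""

    def __init__(self, pitch_deg=30.0, min_pitch=0.0, max_pitch=80.0):
        self.pitch_deg = pitch_deg
        self.min_pitch = min_pitch
        self.max_pitch = max_pitch

    def on_tilt(self, tilt_deg):
        # Clamp so the virtual camera stays between a top-down view
        # and a near-horizontal view of the 3-D map.
        self.pitch_deg = max(self.min_pitch,
                             min(self.max_pitch, tilt_deg))
        return self.pitch_deg
```

Clamping is a design choice added here so that extreme controller tilts cannot push the camera past the renderer's usable range.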
JP2001323221A 2001-10-22 2001-10-22 Navigation system Pending JP2003130671A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2001323221A JP2003130671A (en) 2001-10-22 2001-10-22 Navigation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2001323221A JP2003130671A (en) 2001-10-22 2001-10-22 Navigation system

Publications (1)

Publication Number Publication Date
JP2003130671A true JP2003130671A (en) 2003-05-08

Family

ID=19140145

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2001323221A Pending JP2003130671A (en) 2001-10-22 2001-10-22 Navigation system

Country Status (1)

Country Link
JP (1) JP2003130671A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011059826A (en) * 2009-09-07 2011-03-24 Sony Ericsson Mobile Communications Japan Inc Mobile terminal device, method and program for processing information
JP2013513294A (en) * 2009-12-03 2013-04-18 Osocado Remote Limited Liability Company Method, apparatus, and computer program for performing location-specific information retrieval using a gesture-controlled handheld mobile device

Similar Documents

Publication Publication Date Title
EP2862125B1 (en) Depth based context identification
JP5709980B2 (en) Voice recognition device and navigation device
US8903651B2 (en) Information terminal, server device, searching system, and searching method thereof
US7103475B2 (en) Vehicle navigation system and route guidance method
JP4380541B2 (en) Vehicle agent device
JP4678534B2 (en) Navigation device and map scroll processing method
CN101578501B (en) Navigation device and method
US6604073B2 (en) Voice recognition apparatus
EP1254349B1 (en) A navigation system with unique audio tones for maneuver notification
DE69837064T2 (en) Message processing system and method for processing messages
JP4551961B2 (en) Voice input support device, its method, its program, recording medium recording the program, and navigation device
US7676370B2 (en) Command-inputting device having display panel
US8484033B2 (en) Speech recognizer control system, speech recognizer control method, and speech recognizer control program
KR100696801B1 (en) Navigation system and interesting location seaching method thereof
US9292093B2 (en) Interface method and apparatus for inputting information with air finger gesture
EP1118838B1 (en) Navigation device
US8036875B2 (en) Audio guidance system having ability to update language interface based on location
US20110106426A1 (en) Navigation apparatus and method of detection that a parking facility is sought
US20120283946A1 (en) Dynamic destination map display for navigation system
US20130297200A1 (en) Systems and Methods for Off-Board Voice-Automated Vehicle Navigation
US7392194B2 (en) Voice-controlled navigation device requiring voice or manual user affirmation of recognized destination setting before execution
JP4519515B2 (en) Peripheral facility search device
US7062380B2 (en) Navigation system, and program and storage medium for use in the same
US8532871B2 (en) Multi-modal vehicle operating device
US7102497B2 (en) In-car device and processing method for use with the in-car device

Legal Events

Date Code Title Description
RD01 Notification of change of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7421

Effective date: 20040705