JP2005037181A - Navigation device, server, navigation system, and navigation method - Google Patents

Navigation device, server, navigation system, and navigation method

Info

Publication number
JP2005037181A
Authority
JP
Japan
Prior art keywords
navigation
information
user
device
means
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
JP2003198453A
Other languages
Japanese (ja)
Inventor
Naoki Harasawa
Wakako Ichinose
Hajime Matsushita
Hajime Miyasato
Tomoko Ota
Tatsuya Sugimoto
Daisuke Suma
Hajime Tamura
Junichi Tanaka
Koji Tanaka
Original Assignee
Pioneer Design Kk
Pioneer Electronic Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Design Kk and Pioneer Electronic Corp
Priority to JP2003198453A
Publication of JP2005037181A
Application status: Abandoned

Abstract

PROBLEM TO BE SOLVED: To provide a portable navigation device that enables the user to obtain and display desired information quickly, without complicated input operations.

SOLUTION: The navigation device comprises a spectacle-type display device 2 (display means) that displays navigation information superimposed on the real view of a user 5 (device user); a GPS receiver 132 (position detection means) that detects the position of the user 5; a sight-line detection sensor 25 (recognition means) that recognizes an object at a predetermined position on the user's line of sight; and information transmission means of the spectacle-type display device 2 (such as an organic EL display 21 and an earphone 23) that conveys to the user 5 additional information related to the position detected by the GPS receiver 132 and the object recognized by the sight-line detection sensor 25.

COPYRIGHT: (C)2005,JPO&NCIPI

Description

[0001]
BACKGROUND OF THE INVENTION
The present invention relates to a navigation device, a server device, a navigation system, and a navigation method.
[0002]
[Prior art]
Conventional navigation devices fall into two types: those mounted in an automobile and those carried by the device user (hereinafter, "user") while walking. Either type provides a route guidance function that guides the user along a route from the current location to the destination.
[0003]
Among automobile-mounted types, a communication-type navigation device that can efficiently update map data when searching for a route has also been disclosed (see, for example, Patent Document 1).
[0004]
The technical background that made such navigation apparatuses possible is GPS (Global Positioning System). The positional accuracy of GPS has varied over time for military reasons, but in recent years systems no longer depend on satellites alone: position correction functions using terrestrial broadcast networks have been deployed in various places, and there is a trend toward technology that improves positional accuracy to the order of several centimeters.
[0005]
A navigation device mainly serves to guide the user along a route from the current point to the destination, and the guidance and explanations displayed along the way are currently used chiefly to confirm the user's position and direction of travel.
In other words, relying on landmarks such as intersections, traffic lights, and specific facilities shown on the display device, users (or passengers) find these landmarks in the actual scenery and confirm their heading.
[0006]
On the other hand, the carried type includes portable terminals such as small notebook PCs or personal digital assistants incorporating a GPS receiver and navigation software.
In recent years, eyeglass-type displays have also been developed as display devices for presenting navigation information to the user (see, for example, Patent Document 2).
[0007]
[Patent Document 1]
Japanese Patent Laid-Open No. 2003-77095
[Patent Document 2]
Japanese Patent Laid-Open No. 7-84519
[0008]
[Problems to be solved by the invention]
However, since a conventional navigation device of the type carried while walking is assumed to be stowed in a bag or case, it is difficult to walk while looking at its display.
To walk while viewing the display, the above-mentioned glasses-type display (see Patent Document 2) would be effective, but glasses-type displays with a size, weight, and design suitable for everyday wear have not yet been available.
[0009]
Moreover, in a conventional portable navigation device, even if a glasses-type display is used as the display unit, whenever the user wants information beyond the navigation information displayed sequentially, the user must specify it at that moment through input buttons or a touch panel on the main body of the apparatus.
This requires complicated input operations, so the desired information cannot be acquired instantaneously.
[0010]
Furthermore, because a portable navigation device is miniaturized for carrying, input is awkward, and stopping to perform such complicated input operations in a busy city area or the like is both difficult and dangerous.
[0011]
One problem to be solved by the present invention is therefore to enable the user of a portable navigation device to obtain and display desired information instantaneously, without performing complicated input operations.
[0012]
[Means for Solving the Problems]
The invention described in claim 1 is a navigation device comprising: display means for displaying navigation information superimposed on the actual scene that is the device user's field of view; position detection means for detecting the position of the device user; recognition means for recognizing an object at a predetermined position on the device user's line of sight; and information transmission means for transmitting to the device user additional information related to the position detected by the position detection means and the object recognized by the recognition means.
[0013]
The server device described in claim 14 provides information to the navigation device according to any one of claims 1 to 13.
[0014]
The navigation system described in claim 15 includes as constituent elements at least the navigation device according to any one of claims 1 to 13 and the server device according to claim 14.
[0015]
The invention described in claim 16 is a navigation method using display means for displaying navigation information superimposed on the actual scene that is the device user's field of view, the method comprising: a position detection step of detecting the position of the device user; a recognition step of recognizing an object at a predetermined position on the device user's line of sight; and a step of transmitting to the device user additional information related to the detected position and the recognized object.
[0016]
DETAILED DESCRIPTION OF THE INVENTION
Hereinafter, embodiments of a navigation device of the present invention will be described in detail with reference to the drawings.
The description below focuses on the navigation device; the embodiments of the server device, the navigation system, and the navigation method are covered within it, as components of the navigation device or of an external system.
[0017]
FIG. 1 is a configuration diagram showing the overall configuration of a navigation device according to an embodiment of the present invention.
As shown in FIG. 1, the navigation apparatus of the present embodiment comprises a device main body 1, which is portable by the user (device user) and connects to a server 3 (described later) over a wireless communication network, and a glasses-type display device 2 (display means), which is connected to the device main body 1 and worn when the apparatus is used (details of the device main body 1 are given later).
[0018]
The glasses-type display device 2 includes an organic EL display 21, a half mirror 22 (the lens portion of the glasses), an earphone 23, a frame 24, a line-of-sight detection sensor 25 (line-of-sight detection means), and a voice microphone 26. The device main body 1 can also be connected through a mobile phone on a mobile phone network.
The organic EL display 21 is preferably film-based: flexible, thin, and lightweight, it can be attached relatively easily to the half mirror 22 (the spectacle lens portion). This makes it easy to build a glasses-type display device 2 with a size, weight, and design that can be worn comfortably in daily life.
[0019]
Note that the display unit of the glasses-type display device 2 need not be the organic EL display 21; a liquid crystal display or another display device may be used, as long as it is transparent or translucent and lets the actual scene that is the user's field of view show through.
[0020]
The glasses-type display device 2 can also be integrated into a helmet or the like for use while riding a bicycle or motorcycle. In that case, a speaker built into the helmet can be used instead of the earphone 23.
[0021]
The line-of-sight detection sensor 25 may be provided with imaging means (a still camera, a video camera, etc.) for imaging whatever lies at a predetermined position on the detected line of sight (a station entrance, building, ticket vending machine, station route map, etc.).
[0022]
FIG. 2 is a block diagram showing the configuration of the main body of the navigation device according to the embodiment of the present invention.
As shown in FIG. 2, the device main body 1 of the navigation device of the present embodiment includes an external I/F 11 that interfaces with external systems, a CPU 12 that performs overall control and the information processing necessary for navigation, an I/O 13 that converts and controls data exchanged with peripheral devices, a display control unit 14 that controls the displayed image, a voice synthesis unit 15 that synthesizes speech such as spoken instructions, and a battery 16 that supplies power.
[0023]
In addition, the device main body 1 includes a ROM 121 that stores the programs and data used by the CPU 12, a RAM 122 that provides the working memory for the CPU 12's information processing, and the like.
[0024]
The device main body 1 further includes a wireless communication device 131 for transmitting to and receiving from the server 3 that, together with this device, constitutes the navigation system; a GPS receiver 132 (position detection means) for receiving GPS radio waves; a line-of-sight detection sensor processing unit 133 that processes the detection signal from the line-of-sight detection sensor 25 of the glasses-type display device 2; a voice recognition processing unit 134 that processes the voice signal from the voice microphone 26 of the glasses-type display device 2; and operation keys 135 for key input. The external I/F 11, CPU 12, I/O 13, display control unit 14, and voice synthesis unit 15 are connected to the bus 10.
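To make the data flow among these blocks concrete, the following is a minimal Python sketch of how the peripherals feed the CPU 12 over the I/O 13 and how output reaches the display. The patent describes hardware blocks, not software, so every class and method name here is a hypothetical illustration.

```python
# Hypothetical software model of the device main body 1; names such as
# DeviceBody and on_gps_fix are illustrative, not from the patent.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class Position:
    lat: float
    lon: float


@dataclass
class DeviceBody:
    """Models main unit 1: peripherals feed the CPU 12 via the I/O 13."""
    position: Position = field(default_factory=lambda: Position(0.0, 0.0))
    gaze_object: Optional[str] = None  # type reported by processing unit 133

    def on_gps_fix(self, lat: float, lon: float) -> None:
        # GPS receiver 132 -> I/O 13 -> CPU 12
        self.position = Position(lat, lon)

    def on_gaze_detection(self, object_type: str) -> None:
        # line-of-sight sensor 25 -> processing unit 133 -> I/O 13 -> CPU 12
        self.gaze_object = object_type

    def render(self, text: str) -> None:
        # CPU 12 -> display control unit 14 -> glasses-type display 2
        print(f"[HUD] {text}")


body = DeviceBody()
body.on_gps_fix(35.633, 139.716)            # roughly JR Meguro Station
body.on_gaze_detection("station_entrance")
body.render("Subway Meguro Station ahead")
```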
[0025]
Hereinafter, functions of main components of the navigation device according to the present embodiment will be described.
The external I/F 11 is an interface unit for connecting external devices; for example, a mobile phone, a personal digital assistant, or a wireless LAN terminal can be connected.
[0026]
The wireless communication device 131 connects over a wireless line to the server 3, which together with the navigation device of the present embodiment constitutes the navigation system, and passes information received from the server 3 to the CPU 12 via the I/O 13.
The server 3 stores and manages various types of information in its database 31, such as information on regions, buildings, facilities, and predetermined locations, together with additional information related to objects at those locations, and transmits this information wirelessly in response to requests from the device main body 1 of the navigation device.
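As a rough illustration of this request/response pattern, the sketch below models the database 31 as an in-memory dictionary keyed by location and object type. The keys, placeholder values, and function name are invented for illustration; the patent does not specify a storage schema.

```python
# Hypothetical model of server 3 and database 31: additional information
# keyed by (location, object type); the entries are placeholders.
from typing import Optional

ADDITIONAL_INFO_DB: dict[tuple[str, str], str] = {
    ("Meguro", "station_entrance"):
        "Subway Meguro Station: Meguro Line, Namboku Line, Mita Line",
    ("Meguro", "ticket_vending_machine"):
        "Route and fare guidance to the destination station",
}


def handle_request(location: str, object_type: str) -> Optional[str]:
    """Answer a wireless request from the device main body 1."""
    return ADDITIONAL_INFO_DB.get((location, object_type))


print(handle_request("Meguro", "station_entrance"))
```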
[0027]
Here, the additional information can be set in advance by the operation manager of the navigation system or the administrator of the server 3, and can also be set by the user of the apparatus.
[0028]
The wireless communication device 131 can also receive guidance information transmitted from facilities or equipment at the destination and send it to the CPU 12 via the I/O 13.
The GPS receiver 132 detects the user's current position, generates current position information, and sends it to the CPU 12 via the I/O 13.
The line-of-sight detection sensor processing unit 133 forwards the detection signal from the line-of-sight detection sensor 25 of the glasses-type display device 2 to the CPU 12 via the I/O 13. When the line-of-sight detection sensor 25 has imaging means, it also sends the CPU 12, via the I/O 13, the imaging data of the object the user is viewing (a station entrance, building, ticket vending machine, station route map, etc.).
[0029]
The voice recognition processing unit 134 amplifies the voice signal (a signal carrying the user's instruction or request) from the voice microphone 26 of the glasses-type display device 2, performs voice recognition, encodes the result, and sends it to the CPU 12 via the I/O 13.
The operation keys 135 are used for user data input, and the entered data is sent to the CPU 12 via the I/O 13.
As a further means of data input, the glasses-type display device 2 may be provided with a touch switch or the like, serving as an alternative or auxiliary input means to the operation keys 135.
[0030]
The CPU 12 receives the data and signals described above and, via the display control unit 14, causes the glasses-type display device 2 to display, or output as sound, the additional information related to the user's current position in real time together with the normal navigation information.
The additional information can include navigation support information such as road construction information, the boarding station, the boarding route name, fares, and timetables; information entered in advance by the user, such as a schedule; and various information matching the user's preferences.
[0031]
The normal navigation information and additional information are not limited to image display; they also include information that is converted by the voice synthesis unit 15 and output as audio through the earphone 23 or the like worn by the user.
[0032]
This navigation information and additional information are displayed superimposed on the actual scenery that is the user's field of view.
[0033]
As this normal navigation information and additional information, content related to the current position and to the object the user is looking at is displayed as appropriate.
For example, the imaging means of the line-of-sight detection sensor 25 captures the object the user is viewing; the CPU 12 recognizes the type of the object from the imaging data (acting as the recognition means), searches for additional information matching the recognized object type (a station entrance, building, ticket vending machine, station route map, etc.) against the user's current position, and displays, or outputs as audio, the normal navigation information and the additional information extracted by the search on the glasses-type display device 2 via the display control unit 14. A reduced version of the captured image of the recognized object may also be shown on the glasses-type display device 2.
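A compact sketch of this recognize-then-search flow follows. Here `classify_image` stands in for whatever recognition the CPU 12 applies to the imaging data, and `lookup_additional_info` stands in for the wireless query to the server 3; all names and data are invented for illustration.

```python
# Hypothetical recognize-then-search flow for a single gaze event.
from typing import Optional

REGISTERED_TYPES = {"station_entrance", "building",
                    "ticket_vending_machine", "station_route_map"}


def classify_image(image_data: bytes) -> str:
    """Stand-in for the recognition step; returns a stub object type."""
    return "station_entrance"


def lookup_additional_info(location: str, object_type: str) -> Optional[str]:
    """Stand-in for the wireless query to server 3 (illustrative data)."""
    db = {("Meguro", "station_entrance"):
          "Subway Meguro Station: Meguro, Namboku, Mita lines"}
    return db.get((location, object_type))


def on_gaze_event(image_data: bytes, location: str) -> Optional[str]:
    object_type = classify_image(image_data)   # recognize the object type
    if object_type not in REGISTERED_TYPES:    # ignore unregistered objects
        return None
    # cross-reference the recognized type with the current position
    return lookup_additional_info(location, object_type)


print(on_gaze_event(b"", "Meguro"))
```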
[0034]
The user can also enter schedule information in advance, such as the reason for going to the destination and the planned activities there, through the operation keys 135 or the voice microphone 26; the CPU 12 can then display or output that information on the glasses-type display device 2, via the display control unit 14, when a predetermined position is reached or a predetermined time arrives.
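The trigger condition described here (a position is reached or a predetermined time comes) could be tested as in the following sketch; the schedule record layout is an assumption made purely for illustration.

```python
# Hypothetical schedule trigger: an entry fires when its place is reached
# or its time has arrived. The dict layout is illustrative only.
import datetime


def due_items(schedule: list[dict], now: datetime.datetime,
              place: str) -> list[dict]:
    """Return the entries whose trigger place or time has been reached."""
    return [
        item for item in schedule
        if item.get("place") == place
        or (item.get("time") is not None and now >= item["time"])
    ]


schedule = [{"text": "Meeting at Kokubunji", "place": "JR Meguro Station"}]
print(due_items(schedule, datetime.datetime.now(), "JR Meguro Station"))
```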
[0035]
The CPU 12 can also display or output the additional information to the glasses-type display device 2, via the display control unit 14, through an agent implemented as a computer program. As the word implies, this agent performs various tasks on the user's behalf, and the CPU 12 can use an agent installed on the server 3 or on an external device.
[0036]
The CPU 12 can also present information not directly related to navigation but matching the user's preferences (recommended shops near the current location, restaurants, weather, entertainment information, and other information the user is interested in), displaying it or outputting it by voice on the glasses-type display device 2, via the display control unit 14, when a predetermined position (such as the destination station) is reached or at a predetermined time (such as mealtime).
For this preference-based information, details about the user are entered in advance, and the agent selects and displays or outputs the relevant information.
Such preference-based information can be displayed or output by voice even when no navigation operation is in progress.
Moreover, the agent can learn the user's preferences as the device is used.
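The patent leaves the learning method open. As one minimal interpretation, the agent could simply count which information categories the user engages with and favor the most frequent ones, as in this illustrative sketch.

```python
# Illustrative preference learning: a frequency counter over categories
# the user dwells on or requests; not a method specified by the patent.
from collections import Counter


class PreferenceAgent:
    def __init__(self) -> None:
        self.weights: Counter[str] = Counter()

    def record_interest(self, category: str) -> None:
        """Called whenever the user requests or dwells on a category."""
        self.weights[category] += 1

    def top_categories(self, n: int = 3) -> list[str]:
        return [category for category, _ in self.weights.most_common(n)]


agent = PreferenceAgent()
for category in ["restaurants", "weather", "restaurants"]:
    agent.record_interest(category)
print(agent.top_categories())  # "restaurants" ranked first
```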
[0037]
When the CPU 12 detects that the object the user is looking at is far away (for example, a particular skyscraper, Tokyo Tower, or Mount Fuji), it can display on the glasses-type display device 2, or output by voice, the way to the place where that object stands (transportation, its timetable, required travel time, and so on), and can further designate that place as a destination and navigate to it.
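A sketch of this branch: if the recognized object is flagged as distant, the device offers route guidance toward it and may adopt its location as the destination. The landmark table, guidance strings, and function names are invented for illustration.

```python
# Hypothetical handling of a recognized distant object.
from typing import Optional

DISTANT_LANDMARKS: dict[str, str] = {  # placeholder guidance strings
    "Tokyo Tower": "transportation, timetable, travel time to Tokyo Tower",
    "Mount Fuji": "transportation, timetable, travel time to Mount Fuji",
}


def handle_distant_object(name: str, navigate_to) -> Optional[str]:
    """Offer the way to a distant landmark and optionally navigate there."""
    guidance = DISTANT_LANDMARKS.get(name)
    if guidance is not None:
        navigate_to(name)  # designate the landmark's place as destination
    return guidance


print(handle_distant_object("Tokyo Tower", lambda place: None))
```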
[0038]
The display control unit 14 receives information and signals from the CPU 12, converts them into signals that the glasses-type display device 2 can display or output as voice, and sends them to the glasses-type display device 2.
The voice synthesis unit 15 converts data passed from the display control unit 14, such as instructions to be spoken, into a voice signal by speech synthesis and returns it to the display control unit 14.
[0039]
The navigation information, additional information, user-entered information such as the schedule, and the various preference-based information may also be displayed or output at a moment directly commanded by the user through the voice microphone 26 or the operation keys 135.
Each type of information above is stored in the database 31 of the server 3 and transmitted wirelessly in response to requests from the device main body 1 of the navigation device; some information specific to the individual user (for example, schedule information) may instead be stored in a storage unit in the device main body 1, such as the RAM 122.
[0040]
Next, a usage example of the navigation device according to the present embodiment will be given and described below with reference to the drawings.
FIG. 3 illustrates the user wearing the glasses-type display device 2 of the navigation device shown in FIG. 1. The user 5 wears the glasses-type display device 2 like ordinary glasses and is walking toward the destination while receiving navigation. Although not illustrated, the user 5 also carries the device main body 1, which is electrically connected to the glasses-type display device 2 by wire or wirelessly.
[0041]
The glasses-type display device 2 shown in FIG. 3 receives the normal navigation information, additional information, schedule information, preference-based information, and so on from the device main body 1 as displayable or playable signals, and presents them as images on the organic EL display 21 (see FIG. 1) or as sound from the earphone 23 (see FIG. 1).
[0042]
Next, display examples showing, from the viewpoint of the user 5, the information presented as the user 5 travels toward the destination will be described with reference to FIGS. 4 to 10. In these examples, the user 5 is assumed to be heading for the destination, Kokubunji, and currently walking near JR (registered trademark) Meguro Station.
[0043]
First, FIG. 4 shows a display screen in a state where the user 5 is approaching a road intersection near JR Meguro Station and is crossing the pedestrian crossing at this intersection.
As shown in FIG. 4, the display unit of the glasses-type display device 2 shows a display frame 32, a line-of-sight center 33a indicating the center of the user 5's gaze, and the horizontal line 33h and vertical line 33v of a crosshair passing through the line-of-sight center 33a. As navigation information, the text "straight ahead" is displayed at the upper left of the screen, with an arrow 34 indicating the direction.
[0044]
Next, FIG. 5 shows a display screen in a state where the user 5 faces the entrance 35 of the subway Meguro Station.
As in FIG. 4, FIG. 5 shows the display frame 32, the line-of-sight center 33a, and the crosshair lines 33h and 33v. When the line-of-sight center 33a lands on the entrance 35 of the subway Meguro Station (that is, when the user turns their gaze to it), additional information appears in the upper left of the screen: "Subway Meguro Station, Meguro Line (Tokyu), Namboku Line (Eidan), Mita Line (Toei)", the names of the railway lines that can be boarded from this station. In addition, the entrance 35, shown hatched in the figure, is highlighted on the display screen by brightening that portion.
[0045]
In this example, since the subway Meguro Station is not on the route to the destination, the above railway line names are not displayed unless the user 5 looks at the entrance 35.
Additional information such as "Caution: under construction" can also be displayed as necessary.
[0046]
Next, FIG. 6 shows a display screen in a state where the user 5 has come just before JR Meguro Station.
As before, FIG. 6 shows the display frame 32, the line-of-sight center 33a, and the crosshair lines 33h and 33v. In the situation of FIG. 6, as the user 5 is about to pass through the premises of JR Meguro Station, a waypoint on the route to the destination, or a point around it, the hatched portion indicated by reference numeral 36 flashes brightly, and the station name and the railway line to board, such as "JR Meguro Station, Yamanote Line (JR)", are displayed at the upper left of the screen as navigation information.
[0047]
This display indicates that the station was recognized as part of the route to the destination, and it appears regardless of where the user 5 is looking. That is, the display is triggered because the position detected by the GPS receiver 132 matches a waypoint on the route to the destination.
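The position match described here amounts to a proximity test between the GPS fix and the next waypoint. A minimal sketch follows, assuming a haversine distance and an arbitrary 30 m threshold; the patent specifies neither.

```python
# Hypothetical waypoint test: haversine distance between the GPS fix and
# the waypoint, compared against an assumed 30 m threshold.
import math


def within_waypoint(pos: tuple[float, float],
                    waypoint: tuple[float, float],
                    threshold_m: float = 30.0) -> bool:
    """True when the detected position is within threshold of the waypoint."""
    lat1, lon1 = map(math.radians, pos)
    lat2, lon2 = map(math.radians, waypoint)
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2)
         * math.sin((lon2 - lon1) / 2) ** 2)
    distance_m = 6_371_000 * 2 * math.asin(math.sqrt(a))  # earth radius in m
    return distance_m <= threshold_m


# e.g. a fix a few metres from the waypoint (illustrative coordinates)
print(within_waypoint((35.6334, 139.7157), (35.6335, 139.7157)))
```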
[0048]
Next, FIG. 7 shows a display screen in a state where the user 5 enters the premises of JR Meguro Station.
As before, FIG. 7 shows the display frame 32, the line-of-sight center 33a, and the crosshair lines 33h and 33v. FIG. 7 shows the schedule entered in advance by the user 5 being displayed when the user 5 arrives at JR Meguro Station, a predetermined position specified beforehand (or when a predetermined time specified beforehand is reached).
[0049]
Next, FIG. 8 shows a display screen in a state where the user 5 is inside the premises of JR Meguro Station.
As before, FIG. 8 shows the display frame 32, the line-of-sight center 33a, and the crosshair lines 33h and 33v. In FIG. 8, when the user 5 faces the station route map and turns their gaze to it, the route map under the line-of-sight center 33a is recognized, and text and graphics display the boarding route from the current location: from JR Meguro Station toward JR Shinjuku Station, then transferring to reach the destination, JR Nishikokubunji Station.
[0050]
Next, FIG. 9 shows a display screen in a state where the user 5 faces the ticket vending machine.
As before, FIG. 9 shows the display frame 32, the line-of-sight center 33a, and the crosshair lines 33h and 33v. In FIG. 9, when the user 5 turns their gaze to a ticket vending machine on the station premises, the machine under the line-of-sight center 33a is recognized, and the hatched portion indicated by reference numeral 37 flashes over the ticket vending machine.
[0051]
Next, FIG. 10 shows a display screen in a state where the user 5 is buying a ticket at a ticket vending machine.
As before, FIG. 10 shows the display frame 32, the line-of-sight center 33a, and the crosshair lines 33h and 33v. In FIG. 10, after the apparatus has recognized the ticket vending machine (as in FIG. 9), the display supports the user 5 in buying a ticket to the destination: it shows the names of the lines from "JR Meguro Station", where this ticket vending machine is installed, to the target station "JR Nishikokubunji Station", the transfer station, and the total fare.
[0052]
Reading the displayed information, the user 5 inserts the required amount into the coin or bill slot of the ticket vending machine and presses the button corresponding to the fare to the destination station, thereby buying a ticket to that station.
[0053]
When the user 5 is unfamiliar with operating the ticket vending machine, the device can also be set to indicate, step by step, the positions of the buttons, when to press them, and so on.
[0054]
The navigation device, server device, navigation system, and navigation method of the present invention are not limited to the above-described embodiments, and appropriate modifications and improvements can be made.
[0055]
In the usage example above, the pre-registered object types included station entrances, buildings, ticket vending machines, and station route maps, and the description focused on navigation in and around stations; navigation elsewhere is, of course, also possible. For example, using registered additional information, when eating out or shopping in town, the device can image a restaurant (its signboard, etc.), detect the user's presence there, and display the restaurant's menu; or it can image a store (its signboard, etc.) and display store information (for example, products handled and bargain information). Furthermore, by entering a store and imaging a product or its tag, the device can display information about the product and about the stores that handle it.
[0056]
As described above, the navigation device according to the present embodiment includes the glasses-type display device 2 (display means) that displays navigation information superimposed on the actual scene that is the field of view of the user 5 (device user); the GPS receiver 132 (position detection means) that detects the position of the user 5; the line-of-sight detection sensor 25 (recognition means) that recognizes an object at a predetermined position on the user 5's line of sight; and the information transmission means of the glasses-type display device 2 (the organic EL display 21 and the earphone 23) that conveys to the user 5 additional information related to the detected position and the recognized object. As a result, the user of this portable navigation device can obtain and display desired information instantly without performing complicated input operations.
[Brief description of the drawings]
FIG. 1 is a configuration diagram showing an overall configuration of a navigation device according to an embodiment of the present invention.
FIG. 2 is a block diagram showing a configuration of a device main body of the navigation device according to the embodiment of the present invention.
FIG. 3 is a diagram showing an example of use of the navigation device according to the present embodiment, and shows a state in which a user wears a glasses-type display device.
FIG. 4 is a display example (No. 1), viewed from the user, of the information displayed on the glasses-type display device as the user of FIG. 3 travels toward the destination.
FIG. 5 is a display example (No. 2), viewed from the user, of the information displayed on the glasses-type display device as the user of FIG. 3 travels toward the destination.
FIG. 6 is a display example (No. 3), viewed from the user, of the information displayed on the glasses-type display device as the user of FIG. 3 travels toward the destination.
FIG. 7 is a display example (No. 4), viewed from the user, of the information displayed on the glasses-type display device as the user of FIG. 3 travels toward the destination.
FIG. 8 is a display example (No. 5), viewed from the user, of the information displayed on the glasses-type display device as the user of FIG. 3 travels toward the destination.
FIG. 9 is a display example (No. 6), viewed from the user, of the information displayed on the glasses-type display device as the user of FIG. 3 travels toward the destination.
FIG. 10 is a display example (No. 7), viewed from the user, of the information displayed on the glasses-type display device as the user of FIG. 3 travels toward the destination.
[Explanation of symbols]
1 Device main body
2 Glasses-type display device
3 Server
5 User
12 CPU
14 Display control unit
15 Voice synthesis unit
21 Organic EL display
23 Earphone
25 Line-of-sight detection sensor
26 Voice microphone
131 Wireless communication device
132 GPS receiver
133 Line-of-sight detection sensor processing unit
134 Voice recognition processing unit
135 Operation keys

Claims (16)

  1. A navigation device comprising display means for displaying navigation information superimposed on an actual scene that is a field of view of the device user, the navigation device further comprising:
    position detection means for detecting the position of the device user;
    recognition means for recognizing an object at a predetermined position on the line of sight of the device user; and
    information transmission means for transmitting to the device user additional information related to the position detected by the position detection means and the object recognized by the recognition means.
  2. The navigation device according to claim 1, wherein the recognition means comprises:
    line-of-sight detection means for detecting the line of sight of the device user; and
    imaging means for imaging a predetermined position on the line of sight detected by the line-of-sight detection means,
    wherein the object imaged by the imaging means is collated with pre-registered object types to identify its type, and the position detected by the position detection means is associated with the object type to recognize the object.
  3. The navigation device according to claim 2, wherein the pre-registered object types include at least one of a station entrance, a building, a ticket vending machine, and a station route map.
  4. The navigation device according to claim 2, wherein the pre-registered object types include at least one of a store, a signboard, a product, and a product tag.
  5. The navigation device according to claim 1, wherein the information transmission means displays the additional information on the display means.
  6. The navigation device according to claim 1, wherein the display means is a glasses-type display device.
  7. The navigation device according to claim 1, wherein the information transmission means transmits the additional information to the device user as voice information.
  8. The navigation apparatus according to claim 1, wherein the additional information is navigation support information provided on a route to a predetermined destination.
  9. The navigation device according to claim 8, wherein the navigation support information includes at least one of road construction information, boarding station, boarding route name, fare, and timetable.
  10. The navigation device according to any one of claims 1 to 9, wherein preference information of the device user is registered in advance and the additional information is information selected according to the registered preference information.
  11. The navigation device according to claim 10, wherein the device learns the preferences of the device user as it is used, and information selected according to the learned preference information is used as the additional information.
  12. The navigation device according to claim 1, wherein, when the recognition means detects that the place where the viewed object exists is far away, route information to that place is transmitted to the device user together with the additional information related to the object.
  13. The navigation device according to any one of claims 1 to 12, wherein information input in advance by the device user is transmitted to the device user upon reaching a predetermined position or upon a predetermined time being reached.
  14. A server device that provides information to the navigation device according to any one of claims 1 to 13.
  15. A navigation system comprising at least the navigation device according to any one of claims 1 to 13 and the server device according to claim 14 as constituent elements.
  16. A navigation method using display means for displaying navigation information superimposed on an actual scene that is a field of view of a device user, the method comprising:
    a position detection step of detecting the position of the device user;
    a recognition step of recognizing an object at a predetermined position on the line of sight of the device user; and
    a step of transmitting to the device user additional information related to the position detected in the position detection step and the object recognized in the recognition step.
JP2003198453A 2003-07-17 2003-07-17 Navigation device, server, navigation system, and navigation method Abandoned JP2005037181A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2003198453A JP2005037181A (en) 2003-07-17 2003-07-17 Navigation device, server, navigation system, and navigation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2003198453A JP2005037181A (en) 2003-07-17 2003-07-17 Navigation device, server, navigation system, and navigation method

Publications (1)

Publication Number Publication Date
JP2005037181A (en) 2005-02-10

Family

ID=34208233

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2003198453A Abandoned JP2005037181A (en) 2003-07-17 2003-07-17 Navigation device, server, navigation system, and navigation method

Country Status (1)

Country Link
JP (1) JP2005037181A (en)

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006080344A1 (en) * 2005-01-26 2006-08-03 Matsushita Electric Industrial Co., Ltd. Guiding device and guiding method
US7664598B2 (en) 2005-01-26 2010-02-16 Panasonic Corporation Guiding device and guiding method
US9587946B2 (en) 2010-06-16 2017-03-07 Navitime Japan Co., Ltd. Navigation system, terminal apparatus, navigation server, navigation apparatus, navigation method, and computer program product
JP2012008746A (en) * 2010-06-23 2012-01-12 Softbank Mobile Corp User terminal device and shopping system
JP2012078224A (en) * 2010-10-01 2012-04-19 Olympus Corp Image generation system, program, and information storage medium
JP2012216135A (en) * 2011-04-01 2012-11-08 Olympus Corp Image generation system, program, and information storage medium
WO2014014145A1 (en) * 2012-07-19 2014-01-23 엘지전자 주식회사 Head mount display device using a brainwave signal and method for controlling same
EP2728846A1 (en) 2012-11-06 2014-05-07 Konica Minolta, Inc. Guidance information display device
JP2014093036A (en) * 2012-11-06 2014-05-19 Konica Minolta Inc Guide information display device
US9760168B2 (en) 2012-11-06 2017-09-12 Konica Minolta, Inc. Guidance information display device
JP2014115838A (en) * 2012-12-10 2014-06-26 Eneres Corp Schedule management system and program
JP2014127968A (en) * 2012-12-27 2014-07-07 Seiko Epson Corp Display device and control method for display device
JP2016522415A (en) * 2013-06-13 2016-07-28 モービルアイ ビジョン テクノロジーズ リミテッド Visually enhanced navigation
US10533869B2 (en) 2013-06-13 2020-01-14 Mobileye Vision Technologies Ltd. Vision augmented navigation
WO2015046669A1 (en) * 2013-09-26 2015-04-02 Lg Electronics Inc. Head mounted display device and method of controlling the same
JP2015069362A (en) * 2013-09-27 2015-04-13 株式会社トヨタマップマスター Head-mounted display, control method thereof, computer program for controlling head-mounted display, and recording medium with computer program recorded thereon
US9628931B2 (en) 2014-03-31 2017-04-18 Kabushiki Kaisha Toshiba Apparatus and method for locating an acoustic signal along a direction not overlapped with an arriving direction of an information sound
JP2015225025A (en) * 2014-05-29 2015-12-14 株式会社日立システムズ Spectacle type wearable terminal and indoor destination guiding system using wearable terminal
JP2016017886A (en) * 2014-07-09 2016-02-01 Necエンジニアリング株式会社 Visually handicapped person guide system
JP2016048192A (en) * 2014-08-27 2016-04-07 株式会社ゼンリンデータコム Information processing system, information processing method, and program
JP2015057659A (en) * 2014-10-24 2015-03-26 ソニー株式会社 Display device, display method, and program
US9638534B2 (en) 2014-12-16 2017-05-02 Hyundai Motor Company Arrival time notification system using smart glasses and method thereof
KR20160072985A (en) * 2014-12-16 2016-06-24 현대자동차주식회사 Time of arrival notice system and there of method using smart glasses
KR101655820B1 (en) * 2014-12-16 2016-09-22 현대자동차주식회사 Time of arrival notice system and there of method using smart glasses
JP2016138852A (en) * 2015-01-29 2016-08-04 株式会社ゼンリンデータコム Navigation system, navigation device, glass type device, and device cooperation method
EP3062297A1 (en) * 2015-02-25 2016-08-31 BAE Systems PLC Emergency guidance system and method
WO2016135448A1 (en) * 2015-02-25 2016-09-01 Bae Systems Plc Emergency guidance system and method
CN106101470A (en) * 2015-04-28 2016-11-09 京瓷办公信息系统株式会社 Information processor and the operation instruction method to image processing apparatus
CN106101470B (en) * 2015-04-28 2019-01-18 京瓷办公信息系统株式会社 Information processing unit and operation instruction method to image processing apparatus
JP2015214330A (en) * 2015-07-02 2015-12-03 株式会社ナビタイムジャパン Navigation system
WO2018008210A1 (en) * 2016-07-04 2018-01-11 ソニー株式会社 Information processing device, information processing method, and program

Similar Documents

Publication Title
US6427115B1 (en) Portable terminal and on-vehicle information processing device
US9262915B2 (en) Intelligent urban communications portal and methods
JP4201758B2 (en) GPS search device
ES2543337T3 (en) Navigation device that displays dynamic travel information
JP4935145B2 (en) Car navigation system
DE60318430T2 (en) Computer-aided system and method for outputting information to a driver of a vehicle
US20060190168A1 (en) Pedestrian navigation device, pedestrian navigation system, pedestrian navigation method and program
US20040260458A1 (en) Navigation system using wireless communication network and route guidance method thereof
US9539164B2 (en) System for indoor guidance with mobility assistance
DE60028691T2 (en) Apparatus and method for displaying a map
KR101602221B1 (en) Mobile terminal system and control method thereof
KR100373666B1 (en) Information processing apparatus and pedestrian navigation system using the same
JP2006170872A (en) Guiding information system and portable device
CN101769747B (en) Intelligent tour conducting system and method for scenery spots
EP2395495A2 (en) Hand-held navigation aid for individuals with visual impairment
JP4591353B2 (en) Character recognition device, mobile communication system, mobile terminal device, fixed station device, character recognition method, and character recognition program
KR101649643B1 (en) Information display apparatus and method thereof
WO2012101720A1 (en) Information processing device, alarm method, and program
US20090096875A1 (en) Camera-fitted information retrieval device
US7466992B1 (en) Communication device
JP2004212295A (en) Navigation system
CN101578501B (en) Navigation device and method
US20140176348A1 (en) Location based parking management system
US20020004704A1 (en) Portable GPS receiving device, navigation device and navigation system
TWI436035B (en) Emergency guiding system and server

Legal Events

A621 Written request for application examination (effective date: 2006-06-13; free format text: JAPANESE INTERMEDIATE CODE: A621)

A762 Written abandonment of application (effective date: 2007-08-03; free format text: JAPANESE INTERMEDIATE CODE: A762)