WO2005104063A1 - Facility display device (施設表示装置) - Google Patents
Facility display device
- Publication number
- WO2005104063A1 (PCT/JP2004/005726)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- facility
- image
- display
- display device
- specified
- Prior art date
Links
Classifications
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
Definitions
- the present invention relates to a facility display device that displays, for example, a facility at a destination (the facility includes not only man-made objects such as buildings, but also natural objects such as trees and natural parks).
- Conventional facility display devices are equipped with voice recognition means for recognizing the user's voice.
- when the user utters the name of a facility, the voice recognition means specifies that facility.
- the facility display device reads out the three-dimensional data of the facility from the database, and displays a three-dimensional still image of the facility according to the three-dimensional data (for example, see Patent Document 1).
- since the conventional facility display device is configured as described above, a facility to be displayed can be specified and a still image of the facility can be displayed three-dimensionally.
- however, the user cannot freely set the display method of the facility and cannot display a moving image of the facility, so there has been a problem that the entire image of the facility cannot be easily grasped.
- the present invention has been made to solve the above problem, and has as its object to provide a facility display device with which the entire image of a facility can be easily grasped.
- Disclosure of the Invention
- the facility display device displays a moving image or a still image of a facility specified by the facility specifying means on a map according to a display method set by the setting means.
- FIG. 1 is a configuration diagram showing a facility display device according to Embodiment 1 of the present invention.
- FIG. 2 is a flowchart showing the processing contents of the facility display device according to Embodiment 1 of the present invention.
- FIG. 3 is an explanatory diagram showing an image of a facility displayed on a wide area map.
- FIG. 4 is an explanatory diagram showing an image of a facility in the middle of being zoomed up.
- FIG. 5 is an explanatory diagram showing an image of a facility that is being zoomed up.
- FIG. 6 is an explanatory view showing an image of the facility after the zoom-up is completed.
- FIG. 7 is a flowchart showing a setting process of a facility display method.
- FIG. 8 is an explanatory diagram showing a list of facilities.
- FIG. 9 is a flowchart showing the download of map data.
- FIG. 10 is a flowchart showing the processing contents of the facility display device according to Embodiment 2 of the present invention.
- FIG. 11 is a block diagram showing a facility display device according to Embodiment 3 of the present invention.
- FIG. 12 is an explanatory view showing the detailed display contents of the facility.
- FIG. 13 is a configuration diagram showing a facility display device according to Embodiment 6 of the present invention.
- FIG. 14 is an explanatory diagram showing the direction in which the facility is viewed from the current location.
- FIG. 15 is an explanatory diagram showing an image of the facility as viewed from the direction A.
- FIG. 16 is an explanatory diagram showing the direction in which facilities are viewed from the current location.
- FIG. 17 is an explanatory diagram showing an image of the facility as viewed from the direction B.
- FIG. 18 is an explanatory diagram showing the display contents of the display target facility when there is a peripheral facility that blocks the display target facility.
- FIG. 19 is a configuration diagram showing a facility display device according to Embodiment 9 of the present invention.
- FIG. 20 is a configuration diagram showing a facility display device according to Embodiment 10 of the present invention.
- FIG. 1 is a configuration diagram showing a facility display device according to Embodiment 1 of the present invention.
- the setting receiving unit 1 is composed of, for example, an external interface that receives an operation signal such as an infrared signal transmitted from a remote controller, or a man-machine interface such as a key switch that accepts user operations.
- the setting accepting unit 1 is used when the user performs various settings such as setting of destinations and setting of display method of facilities.
- the setting receiving unit 1 constitutes the setting means.
- the microphone 2 inputs the user's voice
- the voice recognition unit 3 performs a process of recognizing the user's voice input by the microphone 2.
- the map data storage unit 4 is a memory that stores map data for three-dimensional display and map data for two-dimensional display; the facility data storage unit 5 stores image data (for example, polygon data) and position data of various facilities in association with text data, and also stores the default display method.
- the facility identification unit 6 searches the facility data stored in the facility data storage unit 5, using the text data that is the recognition result of the voice recognition unit 3 as a key, for the image data and position data of the facility.
- the microphone 2, the voice recognition unit 3, the facility data storage unit 5, and the facility identification unit 6 constitute the facility specifying means.
- the display control unit 7 controls a display device 8 such as a liquid crystal display, and displays a moving image or a still image of the facility specified by the facility identification unit 6 on a map according to the display method set by the setting receiving unit 1.
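The specification flow just described (a recognized facility name used as a key into the facility data storage unit, steps ST1-ST3 of FIG. 2) can be sketched as follows. This is an illustrative sketch only; the patent does not specify an implementation, and the store layout, names, and coordinate values below are assumptions:

```python
# Illustrative sketch of the facility lookup in Embodiment 1.
# The facility data store associates text keys with image data
# (e.g. polygon data) and position data; all values are placeholders.
FACILITY_STORE = {
    "rainbow bridge": {
        "image": "polygon-data-for-rainbow-bridge",  # stand-in for polygon data
        "position": (35.6364, 139.7634),             # (latitude, longitude), illustrative
    },
}

def identify_facility(recognized_text):
    """Return (image_data, position) for the recognized facility name,
    or None if the facility is not stored (the step-ST7 message case)."""
    entry = FACILITY_STORE.get(recognized_text.strip().lower())
    if entry is None:
        return None  # the device would display a "not stored" message
    return entry["image"], entry["position"]
```

A miss simply returns `None`, which corresponds to the device displaying the "image data not stored" message rather than raising an error.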
- FIG. 2 is a flowchart showing the processing contents of the facility display device according to Embodiment 1 of the present invention.
- the facility display device in FIG. 1 can divide the display area of the display device 8 into two, display a three-dimensional map on one display area, and display a two-dimensional map on the other display area.
- the user can arbitrarily set the display method of the facilities displayed on the 3D map or the 2D map (the facilities include, in addition to man-made structures such as buildings, natural objects such as trees and natural parks).
- an input screen for setting the display method of the facility is displayed on the display device 8 (step ST11).
- when the user operates the setting receiving unit 1 to set the display method (step ST12), the setting receiving unit 1 stores the setting contents in the facility data storage unit 5 (step ST13).
- the settings for the facility display method include, for example, whether to display the facility image as a moving image or a still image, the motion pattern when displaying it as a moving image, and the scale of the facility when displaying it as a still image.
- here, it is assumed that a setting has been made to display the image of the facility as a moving image: the image of the facility is first displayed three-dimensionally on a wide-area map, the image is then zoomed up, and after the zoom-up is completed the image of the facility is rotated.
- the method of zooming up the image of the facility is not particularly limited; for example, the zoom-up may be performed by interpolating and decimating the data using computer graphics technology.
- when the display method of the facility is set, an image of a sample facility is displayed on the display device 8 according to that display method, so that the user can confirm the display method.
- the display method is stored in the facility data storage unit 5.
- the facility can then be displayed: when the user speaks the name of the facility toward the microphone 2 (step ST1), the microphone 2 inputs the user's voice, and the voice recognition unit 3 performs a process of recognizing the voice (step ST2).
- for example, when the user utters "Rainbow Bridge" as the name of a facility, the voice recognition unit 3 recognizes the user's voice and outputs text data indicating "Rainbow Bridge" to the facility identification unit 6.
- when the facility identification unit 6 receives the text data indicating the speech recognition result from the voice recognition unit 3, it searches the image data and position data stored in the facility data storage unit 5, using the text data as a key, for the image data and position data of the facility to be displayed (step ST3).
- the display control unit 7 refers to the position data and acquires from the map data storage unit 4 the map data for three-dimensional display of the area where the facility exists (step ST5).
- the map data acquired from the map data storage unit 4 is wide-area map data from which it can be confirmed that the facility exists.
- the scale of the map data can be arbitrarily set by the user by operating the setting receiving unit 1.
- when the display control unit 7 acquires the map data from the map data storage unit 4, it displays the image of the facility specified by the facility identification unit 6 on the map in accordance with the display method set by the setting receiving unit 1 (step ST6).
- that is, the display control unit 7 displays a wide-area map on the display device 8 according to the map data acquired from the map data storage unit 4, and, according to the image data of the facility searched by the facility identification unit 6, displays the image of the facility three-dimensionally on the wide-area map.
- the display control unit 7 then gradually zooms up the facility image in predetermined increments (see FIG. 4, FIG. 5, and FIG. 6 in that order).
- the facility image is zoomed up as much as possible within a range in which it does not depart from the display area of the display device 8 (see FIG. 6).
- after the zoom-up is completed, the display control unit 7 rotates the facility image in a horizontal plane (by a preset number of rotations) so that the entire facility can be grasped, displaying the image of the facility from all directions.
- the facility image may be rotated while the facility image is zoomed up.
- the rotation of the image need not be limited to the horizontal plane, and may be rotated in a vertical plane or in an oblique plane. Further, these may be combined.
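The stepwise zoom-up and the horizontal-plane rotation described above can be sketched as two simple generators. The increment, limit, and angular step values are illustrative assumptions, not values stated in the patent:

```python
def zoom_steps(initial_scale, max_scale, increment):
    """Yield successive zoom scales in predetermined increments, stopping
    at the largest scale at which the whole facility still fits within
    the display area (max_scale)."""
    scale = initial_scale
    while scale < max_scale:
        scale = min(scale + increment, max_scale)
        yield scale

def rotation_angles(turns, step_deg=30):
    """Yield heading angles for a preset number of horizontal-plane
    rotations, so the facility image is shown from all directions."""
    total = int(360 * turns)
    for angle in range(0, total + 1, step_deg):
        yield angle % 360
```

Interleaving the two generators would correspond to the variant in which the image is rotated while it is being zoomed up; running them in sequence corresponds to rotating after the zoom-up completes.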
- if the image data and position data corresponding to the user's voice are not stored in the facility data storage unit 5, the display control unit 7 causes the display device 8 to display a message indicating that the image data and the like of the facility corresponding to the user's voice are not stored (step ST7).
- as is apparent from the above, according to the first embodiment, the moving image or still image of the facility specified by the facility identification unit 6 is displayed on the map according to the display method set by the setting receiving unit 1, so that the entire image of the facility can be easily grasped.
- in addition, since the user's voice is recognized to specify the facility to be displayed, the image of the desired facility can be displayed without any troublesome operation by the user.
- further, since the image of the facility identified by the facility identification unit 6 is displayed three-dimensionally on the wide-area map and then zoomed up, the entire image of the facility can be grasped after its position has been roughly confirmed.
- since the image of the facility is rotated at the same time as the zoom-up or after the zoom-up is completed, the entire image of the facility can be easily grasped.
- since the image of the facility is zoomed up within a range in which the entire facility remains within the display area of the display device 8, the entire facility is displayed as large as possible without any part being cut off.
- in the first embodiment, the voice recognition unit 3 recognizes the user's voice to specify the facility to be displayed; however, the user may instead specify the facility to be displayed by operating the setting receiving unit 1.
- for example, a list of displayable facilities may be displayed on the display device 8 (see FIG. 8), and the facility to be displayed may be selected from the list.
- in the first embodiment, the image of the facility identified by the facility identification unit 6 is displayed three-dimensionally on the map; however, the image of the facility may be displayed two-dimensionally on the map.
- in the first embodiment, the map data storage unit 4 stores the map data; however, only the map data of some regions may be stored.
- in that case, the map data storage unit 4 may not hold the map data of the area where the facility to be displayed exists; in such a case, the map data of that area is downloaded from an external server and stored in the map data storage unit 4.
- that is, the display control unit 7 receives GPS data from GPS satellites to detect the current position (step ST21), and determines, based on the current position, whether the map data of the area including the route to the destination (the facility to be displayed) is stored in the map data storage unit 4 (step ST22).
- if the map data of the area is not stored, the display control unit 7 downloads the map data of the area from an external server (step ST23) and stores it in the map data storage unit 4 (step ST24).
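The check-and-download flow of steps ST21-ST24 can be sketched as a small caching function. The patent does not specify the server interface, so `download` below is a stand-in, and all names are assumptions:

```python
def ensure_map_data(store, area_id, download):
    """Sketch of the download flow (steps ST21-ST24): if the map data
    of the area containing the route is not in the local map data store,
    fetch it from the external server and cache it for later use."""
    if area_id not in store:
        store[area_id] = download(area_id)  # stand-in for the server request
    return store[area_id]
```

On a second request for the same area the cached copy is returned, so the external server is contacted only once per area.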
- in the first embodiment, the display control unit 7 displays the image of the facility three-dimensionally on the wide-area map according to the image data of the facility searched by the facility identification unit 6, and then gradually zooms up the image of the facility.
- the display control unit 7 may, however, further zoom up the already zoomed-up image of the facility when a request for enlarged display is received from the user.
- alternatively, a specific object in the facility (for example, a product) may be enlarged and displayed.
- when the user operates the setting receiving unit 1 to input a request for enlarged display of the facility, or when the voice recognition unit 3 recognizes a corresponding utterance by the user (steps ST31 and ST32 in FIG. 10), and the request is for enlarged display of the facility (step ST33), the display control unit 7 further zooms up the zoomed-up image of the facility and rotates and displays it (step ST34).
- when the request is for a specific object in the facility (step ST35), the display control unit 7 further zooms up the zoomed-up image of the facility and enlarges and displays the specific object (for example, a product) in the facility (step ST36).
- FIG. 11 is a configuration diagram showing a facility display device according to Embodiment 3 of the present invention.
- the setting receiving unit 9 has, in addition to the functions of the setting receiving unit 1 in FIG. 1 (accepting the setting of the destination and the setting of the display method of the facility), functions of accepting the setting of the zoom-up speed and the rotation speed of the image of the facility.
- in the first embodiment, the display control unit 7 zooms up the facility image and rotates it; the display control unit 7 may, however, zoom up the facility image at the zoom-up speed set by the setting receiving unit 9 and rotate it at the rotation speed set by the setting receiving unit 9.
- in this case, since the display control unit 7 zooms up the facility image at the zoom-up speed set by the setting receiving unit 9 and rotates the facility image at the rotation speed set by the setting receiving unit 9, the facility image can be zoomed up and rotated at the speeds desired by the user.
- in the first embodiment, the display control unit 7 zooms up the facility image and rotates it; thereafter, the facility image may be displayed three-dimensionally on the wide-area map again, and the image may then be zoomed up and rotated once more.
- that is, when the display control unit 7 receives a request from the user to redisplay the facility after zooming up or rotating the image of the facility, the display control unit 7, as in the first embodiment, again displays the facility image three-dimensionally on the wide-area map according to the image data searched by the facility identification unit 6, then gradually zooms up the image, and then rotates it.
- the facility redisplay request may be input by the user operating the setting receiving unit 1, or by the voice recognition unit 3 recognizing an utterance by the user meaning "redisplay".
- after the facility image has been zoomed up and rotated, it can thus be replayed and viewed again, so that the entire image of the facility can be reliably grasped.
- in the first embodiment, the display control unit 7 zooms up and rotates the facility image; in addition, an image of the destination within the facility may be displayed.
- for example, if the destination is a tenant in a high-rise building, then in the first embodiment, when the user utters the name of the high-rise building as the facility name, an image of the high-rise building is displayed, but no image identifying the tenant is displayed.
- therefore, when the display control unit 7 receives a request for detailed display of the facility from the user after displaying the image of the high-rise building, the display control unit 7, as shown in FIG. 12 for example, displays the high-rise building transparently, clearly indicates the floor on which the tenant is located, and displays the position of the tenant within that floor.
- the facility detail display request may be input by the user operating the setting receiving unit 1, or by the voice recognition unit 3 recognizing the user's voice (for example, "shousai hyouji", meaning "detailed display").
- further, a specific object (for example, a product) displayed in the tenant may be enlarged and displayed.
- FIG. 13 is a configuration diagram showing a facility display device according to Embodiment 6 of the present invention.
- in the figure, the same reference numerals as in FIG. 1 denote the same or corresponding parts, and a description thereof is omitted.
- the current location detection unit 11 is composed of, for example, a GPS receiver that receives GPS signals transmitted from GPS satellites, and detects the current position of the vehicle and outputs position data using existing positioning technologies such as DGPS (differential GPS), real-time kinematics, VRS, and PAS.
- the direction specifying unit 12 inputs the position data of the current position and the position data of the facility to be displayed, and specifies the direction in which the facility is viewed from the current position.
- the display control unit 13 has the same processing function as the display control unit 7 in FIG. 1 and also has a function of displaying a facility image three-dimensionally in accordance with the direction specified by the direction specifying unit 12.
- the display means is composed of the current position detection unit 11, the direction identification unit 12, the display control unit 13, and the display device 8.
- since the present embodiment is the same as the first embodiment except that the current position detection unit 11 and the direction specifying unit 12 are added and the function of the display control unit 13 is extended, only the differences will be described.
- the current position detection unit 11 receives a GPS signal transmitted from a GPS satellite, detects the current position of the vehicle from the GPS signal using a known positioning technology, and outputs position data.
- the direction specifying unit 12 inputs the position data of the current position from the current position detection unit 11 and the position data of the facility to be displayed from the facility identification unit 6, and, based on these position data, specifies the direction in which the facility is viewed from the current position.
- in the example of FIG. 14, the direction in which the facility is viewed from the current location is specified as direction A; in the example of FIG. 16, it is specified as direction B.
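The direction specifying unit's computation can be sketched as a bearing calculation from the two position data. The patent does not state the method; the flat-earth approximation below is an assumption that is adequate at city scale:

```python
import math

def viewing_direction(current, facility):
    """Bearing (degrees clockwise from north) from the current location
    to the facility, on a flat-earth approximation. Positions are
    (latitude, longitude) pairs in degrees."""
    lat1, lon1 = current
    lat2, lon2 = facility
    # scale the longitude difference by cos(mean latitude) so east-west
    # and north-south distances are comparable
    dx = (lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    dy = lat2 - lat1
    return math.degrees(math.atan2(dx, dy)) % 360
```

The returned angle (0° = north, 90° = east) would select which face of the facility's 3D model to render toward the viewer, as in directions A and B of FIG. 14 and FIG. 16.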
- next, as in the first embodiment, the display control unit 13 refers to the position data and acquires from the map data storage unit 4 the three-dimensional map data of the area where the facility is located; upon acquiring the map data, the display control unit 13 displays a wide-area map on the display device 8 according to the map data and, according to the image data of the facility searched by the facility identification unit 6, displays the image of the facility three-dimensionally on the wide-area map.
- at this time, the display control unit 13 uses computer graphics technology to display the facility image three-dimensionally in accordance with the direction specified by the direction specifying unit 12.
- for example, when direction B is specified, the image of the facility as viewed from direction B is displayed three-dimensionally on the wide-area map, and the image of the facility is zoomed up (see FIG. 17).
- after the zoom-up is completed, the display control unit 13 rotates the facility image, as in the first embodiment, so that the entire facility can be grasped, displaying the facility image from all directions.
- the facility image is rotated after the facility image has been zoomed up.
- the facility image may be rotated while the facility image is zoomed up.
- as is apparent from the above, according to this embodiment, the direction in which the facility identified by the facility identification unit 6 is viewed from the current location is specified, and the image of the facility is displayed three-dimensionally according to that direction; therefore, if the facility is visible from the current location, the actually visible facility can easily be compared with the image displayed on the display device 8, making it easy to confirm the actual facility.
- in the first embodiment, the display control unit 7 displays a three-dimensional image of the facility on the map.
- however, when there is a large building, a mountain, or the like around the facility (hereinafter referred to as a peripheral facility), the facility to be displayed may be obstructed by the peripheral facility and may not be clearly displayed.
- therefore, when there is a peripheral facility that blocks the facility identified by the facility identification unit 6, the display control unit 7 lowers the display tone of the peripheral facility and highlights the facility identified by the facility identification unit 6; specifically, as shown in FIG. 18A, the display control unit 7 displays the image of the peripheral facility translucently and highlights the display target facility specified by the facility identification unit 6. FIG. 18B shows a state in which the facility of FIG. 18A is rotated.
- alternatively, the image of the peripheral facility may be displayed in monochrome while the display target facility specified by the facility identification unit 6 is displayed in full color.
- the display target facility identified by the facility identification unit 6 may also be enlarged while the images of the peripheral facilities are reduced or displayed at the same size.
- the display target facility identified by the facility identification unit 6 may also be blinked.
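The first variant above (translucent peripheral facilities, highlighted target) can be sketched as a per-facility style assignment. The alpha value and the style fields are illustrative assumptions, not values from the patent:

```python
def render_styles(facilities, target_name):
    """Assign a display style per facility: the display target is drawn
    at full opacity and highlighted, while peripheral facilities that
    could block it are drawn translucent (lowered display tone)."""
    styles = {}
    for name in facilities:
        if name == target_name:
            styles[name] = {"alpha": 1.0, "highlight": True}
        else:
            styles[name] = {"alpha": 0.4, "highlight": False}  # translucent
    return styles
```

The monochrome, resizing, and blinking variants would differ only in which style fields are set, so the same per-facility mapping structure applies.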
- as is apparent from the above, according to the seventh embodiment, even when there is a peripheral facility that blocks the facility to be displayed, the facility to be displayed can be clearly displayed.
- Embodiment 8
- the display control unit 7 displays the image of the facility three-dimensionally on a map.
- the display color of the specified facility may be switched according to the current time zone or the weather.
- for example, if the current time zone is daytime, the display control unit 7 displays an image of the facility in sunshine, and if the current time zone is nighttime, the display control unit 7 displays an image of the facility with its lighting turned on.
- if the weather is sunny, the display control unit 7 increases the brightness of the facility image; if the weather is cloudy or rainy, it lowers the brightness.
- the weather information may be obtained, for example, from the Internet, or by detecting the movement of the wipers of a vehicle running near the facility to be displayed.
- the display control unit 7 may switch the display color around the facility specified by the facility specifying unit 6 according to the current season.
- for example, if street trees or the like are planted around the facility to be displayed and the current season is spring, the street trees are displayed in light green; if the current season is autumn, they are displayed in red.
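The condition-dependent display choices above (sunlit vs. lit-up image, brightness by weather, tree color by season) can be sketched as one selection function. The threshold hour and brightness values are illustrative assumptions:

```python
def facility_appearance(hour, weather, season):
    """Pick rendering hints from the current conditions: a sunlit image
    in the daytime and a lit-up image at night, reduced brightness in
    cloudy or rainy weather, and season-dependent street-tree color."""
    daytime = 6 <= hour < 18  # assumed day/night boundary
    return {
        "facility_image": "sunlit" if daytime else "lit-up",
        "brightness": 1.0 if weather == "sunny" else 0.6,
        "tree_color": {"spring": "light green", "autumn": "red"}.get(season, "green"),
    }
```

The display control unit would consult such a mapping each time the facility image is redrawn, so the displayed scene tracks the actual current conditions.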
- this has the effect of making the image around the facility closer to the actual current situation.
- Embodiment 9
- the display control unit 7 displays the images of the facilities three-dimensionally on a map.
- guidance for various facilities (for example, the characteristics of the facilities) may be stored in the facility data storage unit 5, and when the facility identification unit 6 specifies a facility to be displayed, the display control unit 7 may obtain the guidance for that facility and display it on the display device 8.
- further, a voice synthesis unit 14 may be mounted as voice output means, and the voice synthesis unit 14 may output the guidance of the facility specified by the facility identification unit 6 as voice.
- Embodiment 10. FIG. 20 is a block diagram showing a facility display device according to Embodiment 10 of the present invention; in the figure, the same reference numerals as in the above-described figures denote the same or corresponding parts, and a description thereof is omitted.
- the route search unit 15 searches for a route from the current location to the destination, and also searches for a parking lot related to the destination.
- the display control unit 16 has the same processing functions as the display control unit 13, and also has a function of displaying, on the display device 8, the route from the current position to the destination searched by the route search unit 15 and a parking lot related to the destination.
- a display means is composed of the current position detection unit 11, the direction identification unit 12, the route search unit 15, the display control unit 16 and the display device 8.
- the facility display device has the route search function of a typical navigation device: when the user operates the setting receiving unit 1 to set a destination, or the voice recognition unit 3 recognizes the user's voice to set the destination, the current position detection unit 11 detects the current position of the vehicle and outputs the position data, and the route search unit 15 refers to the map data stored in the map data storage unit 4 to search for a route from the current position to the destination and for a parking lot associated with the destination.
- when the route search unit 15 has searched for the route and the parking lot, the display control unit 16 displays the route from the current location to the destination and the related parking lot of the destination on the display device 8.
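The route search unit's task (finding a route from the current position to the destination over the stored map data) can be sketched with a standard shortest-path search. The patent does not state which algorithm the navigation device uses; Dijkstra's algorithm below is a stand-in, and the graph format is an assumption:

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra search over a road graph {node: [(neighbor, cost), ...]},
    standing in for the route search unit 15. Returns (total_cost, path)
    or None when the destination is unreachable."""
    queue = [(0, start, [start])]  # (accumulated cost, node, path so far)
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, edge_cost in graph.get(node, []):
            if neighbor not in seen:
                heapq.heappush(queue, (cost + edge_cost, neighbor, path + [neighbor]))
    return None
```

A parking-lot search would run the same way with the destination's associated parking lots as goal nodes; the resulting path is what the display control unit 16 draws on the display device 8.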
- as described above, the facility display device can be mounted on a navigation device, such as a car navigation device, that has a function of displaying an image of the facility when a destination facility is set.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Mathematical Physics (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- General Physics & Mathematics (AREA)
- Instructional Devices (AREA)
- Navigation (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
Claims
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/589,345 US7623045B2 (en) | 2004-04-21 | 2004-04-21 | Facility display unit |
PCT/JP2004/005726 WO2005104063A1 (ja) | 2004-04-21 | 2004-04-21 | 施設表示装置 |
EP04728690A EP1739643A4 (en) | 2004-04-21 | 2004-04-21 | SETUP DISPLAY DEVICE |
JP2006519128A JPWO2005104063A1 (ja) | 2004-04-21 | 2004-04-21 | 施設表示装置 |
CNB2004800428057A CN100555362C (zh) | 2004-04-21 | 2004-04-21 | 设施显示装置 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2004/005726 WO2005104063A1 (ja) | 2004-04-21 | 2004-04-21 | 施設表示装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005104063A1 true WO2005104063A1 (ja) | 2005-11-03 |
Family
ID=35197212
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/005726 WO2005104063A1 (ja) | 2004-04-21 | 2004-04-21 | 施設表示装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US7623045B2 (ja) |
EP (1) | EP1739643A4 (ja) |
JP (1) | JPWO2005104063A1 (ja) |
CN (1) | CN100555362C (ja) |
WO (1) | WO2005104063A1 (ja) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100852615B1 (ko) * | 2006-04-27 | 2008-08-18 | 팅크웨어(주) | 계절 및 지형 변화에 따른 지도 표현 방법 및 시스템 |
US8453060B2 (en) * | 2006-08-25 | 2013-05-28 | Microsoft Corporation | Panoramic ring user interface |
JP4706751B2 (ja) * | 2008-11-28 | 2011-06-22 | 株式会社デンソー | 時刻表示制御装置、ナビゲーション装置、及びプログラム |
US20120326893A1 (en) * | 2011-06-23 | 2012-12-27 | Abraham Glezerman | Method and system for coordinating permitted use of a parking space |
EP2543964B1 (en) | 2011-07-06 | 2015-09-02 | Harman Becker Automotive Systems GmbH | Road Surface of a three-dimensional Landmark |
CN112815958A (zh) * | 2021-01-07 | 2021-05-18 | 腾讯科技(深圳)有限公司 | 一种导航对象显示方法、装置、设备及存储介质 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0660298A (ja) * | 1992-08-12 | 1994-03-04 | Daikei:Kk | ナビゲーションシステム |
JP2001050757A (ja) * | 1999-08-05 | 2001-02-23 | Matsushita Electric Ind Co Ltd | ナビゲーション装置 |
JP2001338037A (ja) * | 2000-05-29 | 2001-12-07 | Wall:Kk | 三次元コンピュータグラフィック画像生成用サーバ |
JP2002140731A (ja) * | 2000-11-01 | 2002-05-17 | Esuroku:Kk | 画像処理装置および方法、画像処理システム、記録媒体 |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2907193B2 (ja) | 1992-07-14 | 1999-06-21 | 住友電気工業株式会社 | 車載ナビゲーション装置 |
US5528735A (en) * | 1993-03-23 | 1996-06-18 | Silicon Graphics Inc. | Method and apparatus for displaying data within a three-dimensional information landscape |
JP3476751B2 (ja) | 1995-10-17 | 2003-12-10 | Matsushita Electric Industrial Co., Ltd. | In-vehicle navigation device |
JP3501390B2 (ja) | 1995-12-19 | 2004-03-02 | Honda Motor Co., Ltd. | In-vehicle navigation device |
US7069232B1 (en) * | 1996-01-18 | 2006-06-27 | Planalytics, Inc. | System, method and computer program product for short-range weather adapted, business forecasting |
US6148261A (en) * | 1997-06-20 | 2000-11-14 | American Calcar, Inc. | Personal communication system to send and receive voice data positioning information |
JP3547947B2 (ja) * | 1997-08-11 | 2004-07-28 | Alpine Electronics, Inc. | Method for displaying the floor number of a location in a navigation device |
US6611753B1 (en) * | 1998-04-17 | 2003-08-26 | Magellan Dis, Inc. | 3-dimensional intersection display for vehicle navigation system |
JP3596805B2 (ja) * | 1999-07-29 | 2004-12-02 | Matsushita Electric Industrial Co., Ltd. | Information terminal device and route guidance method |
JP3717045B2 (ja) * | 2000-01-19 | 2005-11-16 | Matsushita Electric Industrial Co., Ltd. | Navigation device |
KR20020084148A (ko) * | 2000-03-10 | 2002-11-04 | 리츠에프엑스 리미티드 | User interface for a virtual-reality shopping system |
US6977630B1 (en) * | 2000-07-18 | 2005-12-20 | University Of Minnesota | Mobility assist device |
EP1311803B8 (de) * | 2000-08-24 | 2008-05-07 | VDO Automotive AG | Method and navigation device for querying destination information and for navigating in a map view |
US6516268B2 (en) * | 2001-02-16 | 2003-02-04 | Wizeguides.Com Inc. | Bundled map guide |
US6816627B2 (en) * | 2001-04-12 | 2004-11-09 | Lockheed Martin Corporation | System for morphological image fusion and change detection |
JP2005502936A (ja) * | 2001-04-30 | 2005-01-27 | Activemap LLC | Interactive electronically presented map |
JP4169949B2 (ja) | 2001-05-14 | 2008-10-22 | Alpine Electronics, Inc. | In-vehicle navigation device |
JP2002357444A (ja) | 2001-05-31 | 2002-12-13 | Nec Corp | Navigation system using a mobile terminal |
JP2003125457A (ja) * | 2001-10-16 | 2003-04-25 | Toshiba Corp | Wireless communication terminal device and wireless communication method |
US6907345B2 (en) * | 2002-03-22 | 2005-06-14 | Maptech, Inc. | Multi-scale view navigation system, method and medium embodying the same |
US7216034B2 (en) * | 2003-02-27 | 2007-05-08 | Nokia Corporation | System and method for an intelligent multi-modal user interface for route drawing |
JP4228745B2 (ja) * | 2003-03-28 | 2009-02-25 | Hitachi, Ltd. | Multispectral captured-image analysis device |
EP1611415A4 (en) * | 2003-04-02 | 2007-09-05 | Wong Lai Wan | DISPLAYING A DIGITAL CARD |
JP4138574B2 (ja) * | 2003-05-21 | 2008-08-27 | Hitachi, Ltd. | Car navigation device |
US6999875B2 (en) * | 2004-02-06 | 2006-02-14 | Alpine Electronics, Inc | Display method and apparatus for navigation system |
2004
- 2004-04-21 EP EP04728690A patent/EP1739643A4/en not_active Ceased
- 2004-04-21 CN CNB2004800428057A patent/CN100555362C/zh not_active Expired - Fee Related
- 2004-04-21 US US10/589,345 patent/US7623045B2/en not_active Expired - Fee Related
- 2004-04-21 WO PCT/JP2004/005726 patent/WO2005104063A1/ja not_active Application Discontinuation
- 2004-04-21 JP JP2006519128A patent/JPWO2005104063A1/ja active Pending
Non-Patent Citations (1)
Title |
---|
See also references of EP1739643A4 * |
Also Published As
Publication number | Publication date |
---|---|
US20070176795A1 (en) | 2007-08-02 |
EP1739643A1 (en) | 2007-01-03 |
JPWO2005104063A1 (ja) | 2007-08-30 |
US7623045B2 (en) | 2009-11-24 |
CN1942913A (zh) | 2007-04-04 |
CN100555362C (zh) | 2009-10-28 |
EP1739643A4 (en) | 2010-05-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8635019B2 (en) | Navigation device and method for altering map information related to audible information | |
EP2728313B1 (en) | Method of displaying objects on a navigation map | |
US8775071B2 (en) | Navigation device and method for displaying map information | |
US9746340B2 (en) | Map storage for navigation systems | |
US9739632B2 (en) | Methods and systems of providing information using a navigation apparatus | |
US20080208447A1 (en) | Navigation device and method for providing points of interest | |
US20080228393A1 (en) | Navigation device and method | |
US10612935B2 (en) | Methods and systems of providing information using a navigation apparatus | |
US20080168398A1 (en) | Navigation device and method for displaying a rich content document | |
US9874456B2 (en) | Method, apparatus and computer program product for providing a destination preview | |
US9864577B2 (en) | Voice recognition device and display method | |
US9528848B2 (en) | Method of displaying point on navigation map | |
JP4339178B2 (ja) | Parking lot vacant space guidance device and parking lot vacant space guidance method | |
EP2726819A1 (en) | Methods and systems for obtaining navigation instructions | |
US20110319099A1 (en) | Navigation or mapping system and method | |
WO2005104063A1 (ja) | Facility display device | |
JP2004233333A (ja) | Three-dimensional display method for navigation and navigation device | |
GB2494649A (en) | Selecting a destination on a navigation apparatus | |
JP2004037125A (ja) | Surrounding information presentation device and method for navigation, and presentation program | |
JP2004117294A (ja) | Navigation device, method, and program | |
EP2488964A1 (en) | Navigation system and method with improved destination searching | |
JP2004108937A (ja) | Information transmission device, navigation device, system, method, and program | |
JP2003329461A (ja) | Navigation device, map display device, map display method, and program | |
JP4082969B2 (ja) | Navigation device, method, and program | |
KR20090018378A (ko) | Navigation terminal applying motion recognition and control method therefor | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | WWE | Wipo information: entry into national phase | Ref document number: 200480042805.7; Country of ref document: CN |
 | AK | Designated states | Kind code of ref document: A1; Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
 | AL | Designated countries for regional patents | Kind code of ref document: A1; Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
 | WWE | Wipo information: entry into national phase | Ref document number: 2006519128; Country of ref document: JP |
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
 | WWE | Wipo information: entry into national phase | Ref document number: 10589345; Country of ref document: US; Ref document number: 2007176795; Country of ref document: US |
 | WWE | Wipo information: entry into national phase | Ref document number: 2004728690; Country of ref document: EP |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | WWW | Wipo information: withdrawn in national office | Ref document number: DE |
 | WWP | Wipo information: published in national office | Ref document number: 2004728690; Country of ref document: EP |
 | WWP | Wipo information: published in national office | Ref document number: 10589345; Country of ref document: US |