WO2011154050A1 - Navigation device and method having enhanced instruction including a panoramic image of a scene - Google Patents

Publication number
WO2011154050A1
Authority
WIPO (PCT)
Prior art keywords
navigation device
image
scene
instruction
display
Application number
PCT/EP2010/058235
Other languages
French (fr)
Inventor
Erik Thomassen
Original Assignee
Tomtom International B.V.
Application filed by Tomtom International B.V. filed Critical Tomtom International B.V.
Priority to PCT/EP2010/058235 priority Critical patent/WO2011154050A1/en
Publication of WO2011154050A1 publication Critical patent/WO2011154050A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3647 Guidance involving output of stored or live camera images or video streams

Definitions

  • This disclosure relates to navigation devices and to methods for determining a route of travel from a first location to a second location.
  • Other embodiments relate, more generally, to any type of processing device that is configured to execute navigation software so as to provide route planning and navigation functionality.
  • Navigation devices that include GPS (Global Positioning System) signal reception and processing functionality are well known and are widely employed as in-car or other vehicle navigation systems.
  • Such devices are of great utility when the user is not familiar with the route to the destination to which they are navigating. However, the user may be unfamiliar with surrounding landmarks, the user's orientation and/or the user's vehicle orientation in comparison to the on-screen navigation instruction provided by the navigation device.
  • the navigation device guesses (e.g., based on a last known direction of travel) in which direction the vehicle is pointing and then provides the instruction based on this directional guess. The user may then need to correct the device's guess, because the navigation device may have the orientation wrong or the instruction could be completely unclear. As a result, for example, the user is often confused and starts the journey in the wrong direction. Although the navigation device may inform the user to reverse course, setting off in the wrong direction may inconvenience the user and/or significantly add time to the user's travel.
  • a second example may be that, while travelling on a highway at highway speeds, a user is told to exit the highway where there are several sequential off-ramps, only one of which is the correct ramp. Because on-screen map data is always an abstraction of reality, discerning the correct ramp to take may be difficult. As a result, the user is often confused and may choose the wrong off-ramp.

Summary
  • a method is directed to displaying enhanced navigation instructions, including a panoramic image of a scene at street level (of, for example, a user in an automobile or a user on a street). The method includes determining map information for display on an integrated input and display device of a navigation device, based upon a determined route of travel of the navigation device.
  • the method further includes determining a current location of the navigation device, displaying an image of the current location on the integrated input and display device of the navigation device, and displaying an instruction, based on the determined route of travel, in relation to the image and the location determined by the navigation device.
  • the navigation device displays enhanced navigation instructions including a panoramic image of a scene at street level of, for example a user in an automobile or a user on a street.
  • the navigation device includes an input device to receive at least one input indicating a desired destination.
  • the navigation device includes a processor operably coupled to the input device.
  • the processor is configured to calculate a planned route between a current location and the desired destination, to determine the current location (and direction of travel), and to obtain an image of the current location.
  • the image is a street-level view of the location, and the processor is further configured to overlay a first instruction on the image.
  • the navigation device includes a display device controllable by the processor and configured to display the planned route and the image including the first instruction.
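The overlay of the first instruction on the street-level image is described above only functionally. As an illustrative sketch (the function, its parameters and the pixel mapping are hypothetical, not taken from the patent), the device could place the instruction at the panorama column that corresponds to the bearing of the next manoeuvre relative to the heading the panorama faces:

```python
# Hypothetical sketch: placing a turn instruction at the correct
# horizontal position in a 360-degree street-level panorama.
# All names and parameters are invented for illustration.

def bearing_to_pixel_x(bearing_deg: float, heading_deg: float,
                       panorama_width: int) -> int:
    """Map a manoeuvre bearing to a pixel column in a panorama.

    bearing_deg    -- compass bearing of the next manoeuvre (0-360)
    heading_deg    -- compass bearing at the panorama's centre column
    panorama_width -- width of the panorama image in pixels
    """
    # Angle of the manoeuvre relative to the panorama centre, in (-180, 180]
    relative = (bearing_deg - heading_deg + 180.0) % 360.0 - 180.0
    # The centre of the image corresponds to a relative angle of 0
    return int(round((relative / 360.0 + 0.5) * panorama_width)) % panorama_width

# Example: device heading 90 deg (east), next turn bears 135 deg (south-east)
x = bearing_to_pixel_x(135.0, 90.0, 2048)
```

Anchoring the instruction arrow to that column would keep it attached to the correct road in the scene as the heading or the displayed panorama slice changes.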
  • FIG. 1 is a schematic illustration of a Global Positioning System (GPS);
  • FIG. 2 is a schematic illustration of electronic components arranged to provide a navigation device;
  • FIG. 3 is a schematic illustration of the manner in which the navigation device of Fig. 2 may receive information over a wireless communication channel;
  • FIGs. 4a and 4b are illustrative perspective views of the navigation device of Fig. 2;
  • FIGs. 5a to 5i are illustrative screenshots from a navigation device for a destination input process;
  • FIG. 6 is an illustrative screenshot from a navigation device depicting a start location for an illustrative calculated route;
  • FIG. 7 is an illustrative flow diagram depicting steps of an example embodiment;
  • FIG. 8 is an illustrative flow diagram depicting steps of an example embodiment; and
  • FIG. 9 is an illustrative image depicting an image together with an instruction according to an example embodiment.
  • spatially relative terms, e.g., "beneath", "below", "lower", "above", "upper" and the like, may be used herein for ease of description to describe one element or feature's relationship to another element or feature as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as "below" or "beneath" other elements or features would then be oriented "above" the other elements or features. Thus, for example, the term "below" can encompass both an orientation of above and one of below. The device may be otherwise oriented (rotated 90 degrees or viewed or referenced at other orientations) and the spatially relative descriptors used herein should be interpreted accordingly.
  • the software implemented aspects of example embodiments are typically encoded on some form of computer readable medium or implemented over some type of transmission medium.
  • the computer readable medium may be magnetic (e.g., a floppy disk or a hard drive) or optical (e.g., a compact disk read only memory, or "CD ROM"), and may be read only or random access.
  • the transmission medium may be twisted wire pairs, coaxial cable, optical fiber, or some other suitable transmission medium known to the art. Example embodiments are not limited by these aspects of any given implementation.
  • a navigation device is intended to include (without limitation) any type of route planning and/or navigation device, irrespective of whether that device is embodied as a PND, a navigation device built into a vehicle, or indeed a computing resource (such as a desktop or portable personal computer (PC), mobile telephone or portable digital assistant (PDA) executing route planning and navigation software).
  • FIG. 1 illustrates an example view of a Global Positioning System (GPS) usable by navigation devices.
  • the GPS, known as NAVSTAR, incorporates a plurality of satellites which orbit the earth in extremely precise orbits. Based on these precise orbits, GPS satellites can relay their location to any number of receiving units.
  • the GPS system is implemented when a device, specially equipped to receive GPS data, begins scanning radio frequencies for GPS satellite signals.
  • upon receiving a radio signal from a GPS satellite, the device determines the precise location of that satellite via one of a plurality of different conventional methods. The device will continue scanning, in most instances, until it has acquired at least three different satellite signals (noting that position is not normally, but can be, determined with only two signals using other triangulation techniques). Implementing geometric triangulation, the device utilizes the three known positions to determine its own two-dimensional position relative to the satellites. This can be done in a known manner. Additionally, acquiring a fourth satellite signal allows the device to calculate its three-dimensional position by the same geometric calculation in a known manner. The position and velocity data can be updated in real time on a continuous basis by an unlimited number of users.
  • the GPS system is denoted generally by reference numeral 100.
  • a plurality of satellites 120 are in orbit about the earth 124.
  • the orbit of each satellite 120 is not necessarily synchronous with the orbits of other satellites 120 and, in fact, is likely asynchronous.
  • a GPS receiver 140 is shown receiving spread spectrum GPS satellite signals 160 from the various satellites 120.
  • the spread spectrum signals 160 continuously transmitted from each satellite 120, utilize a highly accurate frequency standard accomplished with an extremely accurate atomic clock.
  • Each satellite 120 as part of its data signal transmission 160, transmits a data stream indicative of that particular satellite 120.
  • the GPS receiver 140 generally acquires spread spectrum GPS satellite signals 160 from at least three satellites 120 for the GPS receiver device 140 to calculate its two- dimensional position by triangulation. Acquisition of an additional signal, resulting in signals 160 from a total of four satellites 120, permits the GPS receiver 140 to calculate its three-dimensional position in a known manner.
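The two-dimensional triangulation described above can be sketched as textbook trilateration: subtracting the range (circle) equations pairwise removes the quadratic terms and leaves a 2x2 linear system. This is an idealised illustration (exact ranges, non-collinear reference points), not the receiver's actual implementation:

```python
# Illustrative 2-D trilateration: given three known positions and
# measured ranges, solve for the receiver position.

def trilaterate_2d(p1, p2, p3, r1, r2, r3):
    """Return (x, y) consistent with ranges r1, r2, r3 from p1, p2, p3."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting circle equations pairwise gives A @ [x, y] = b
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # zero if the three points are collinear
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# Receiver at (3, 4); ranges computed from three non-collinear points
pos = trilaterate_2d((0, 0), (10, 0), (0, 10), 5.0, 65**0.5, 45**0.5)
```

A real GPS receiver additionally solves for its clock bias, which is why a fourth satellite is needed in practice for a three-dimensional fix.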
  • Figure 2 is an illustrative representation of electronic components of a navigation device 200 according to an example embodiment of the present disclosure, in block component format. It should be noted that the block diagram of the navigation device 200 is not inclusive of all components of the navigation device, but is only representative of many example components.
  • the navigation device 200 is located within a housing (not shown).
  • the housing includes a processor 210 connected to an input device 220 and a display screen 240.
  • the input device 220 can include a keyboard device, voice input device, touch panel and/ or any other known input device utilised to input information; and the display screen 240 can include any type of display screen such as an LCD display, for example.
  • the input device 220 and display screen 240 are integrated into an integrated input and display device 240, including a touchpad or touch screen input so that a user need only touch a portion of the display screen 240 to select one of a plurality of display choices or to activate one of a plurality of virtual buttons.
  • the navigation device may include an output device 260, for example, an audible output device (e.g., a loudspeaker).
  • while output device 260 can produce audible information for a user of the navigation device 200, it should equally be understood that input device 220 can include a microphone and software for receiving input voice commands as well.
  • processor 210 is operatively connected to and set to receive input information from input device 220 via a connection 225, and operatively connected to at least one of display screen 240 and output device 260, via output connections 245, to output information thereto. Further, the processor 210 is operably coupled to a memory resource 230 via connection 235 and is further adapted to receive/send information from/to input/output (I/O) ports 270 via connection 275, wherein the I/O port 270 is connectible to an I/O device 280 external to the navigation device 200.
  • the memory resource 230 comprises, for example, a volatile memory, such as a Random Access Memory (RAM) and a non-volatile memory, for example a digital memory, such as a flash memory.
  • the external I/O device 280 may include, but is not limited to, an external listening device such as an earpiece.
  • the connection to I/O device 280 may further be a wired or wireless connection to any other external device, such as a car stereo unit for hands-free operation and/or for voice-activated operation, for connection to an earpiece or headphones, and/or for connection to a mobile phone, wherein the mobile phone connection may be used to establish a data connection between the navigation device 200 and the internet or any other network, and/or to establish a connection to a server via the internet or some other network.
  • FIG. 2 further illustrates an operative connection between the processor 210 and an antenna/ receiver 250 via connection 255.
  • the antenna/ receiver 250 can be a GPS antenna/ receiver, for example. It will be understood that the antenna and receiver designated by reference numeral 250 are combined schematically for illustration, but that the antenna and receiver may be separately located components, and that the antenna may be a GPS patch antenna or helical antenna, for example.
  • the electronic components shown in FIG. 2 are powered by power sources (not shown) in a conventional manner.
  • different configurations of the components shown in FIG. 2 are considered to be within the scope of the present application.
  • the components shown in FIG. 2 may be in communication with one another via wired and/ or wireless connections and the like.
  • the scope of the navigation device 200 of the present application includes a portable or handheld navigation device 200.
  • the portable or handheld navigation device 200 of FIG. 2 can be connected or "docked" in a known manner to a vehicle such as a bicycle, a motorbike, a car or a boat, for example. Such a navigation device 200 is then removable from the docked location for portable or handheld navigation use.
  • the navigation device 200 may establish a "mobile" or telecommunications network connection with a server 302 via a mobile device (not shown) (such as a mobile phone, PDA, and/or any device with mobile phone technology) establishing a digital connection (e.g., via known Bluetooth technology or a WiFi connection). Thereafter, through its network service provider, the mobile device can establish a network connection (through the internet, for example) with the server 302. As such, a "mobile" network connection is established between the navigation device 200 (which can be, and oftentimes is, mobile as it travels alone and/or in a vehicle) and the server 302 to provide a "real-time", or at least very up-to-date, gateway for information.
  • the establishing of the network connection between the mobile device (via a service provider) and another device such as the server 302, using the Internet (e.g., the World Wide Web) for example, can be done in a known manner. This can include use of TCP/IP layered protocol.
  • the mobile device can utilize any number of communication standards such as CDMA, GSM, WAN, etc.
  • an internet connection may be utilised which is achieved via data connection, via a mobile phone or mobile phone technology within the navigation device 200, for example.
  • an internet connection between the server 302 and the navigation device 200 is established. This can be done, for example, through a mobile phone or other mobile device and a GPRS (General Packet Radio Service) connection (a GPRS connection is a high-speed data connection for mobile devices provided by telecom operators; GPRS is a method to connect to the internet).
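The data connection between the navigation device and the server is described only at the level of transport technologies (Bluetooth, GPRS, TCP/IP). As a hedged illustration of the request/response exchange such a connection carries, the sketch below frames each message with a length prefix and simulates the device and server with an in-process socket pair; the request path and message format are invented for the example:

```python
import socket
import struct

def send_request(sock: socket.socket, payload: bytes) -> None:
    # 4-byte big-endian length prefix so the receiver knows where a
    # message ends, regardless of how TCP splits the byte stream.
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_request(sock: socket.socket) -> bytes:
    (length,) = struct.unpack(">I", sock.recv(4))
    chunks, received = [], 0
    while received < length:
        chunk = sock.recv(length - received)
        chunks.append(chunk)
        received += len(chunk)
    return b"".join(chunks)

# Simulate device <-> server with an in-process socket pair
device, server = socket.socketpair()
send_request(device, b"GET /mapdata?tile=52.37,4.89")
request = recv_request(server)
```

Over a real GPRS or Bluetooth link the same framing logic would sit on top of an ordinary TCP socket to the server.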
  • the navigation device 200 can further complete a data connection with the mobile device, and eventually with the internet and the server 302, via existing Bluetooth technology, for example, in a known manner, wherein the data protocol can utilize any number of standards, such as the Data Protocol Standard for the GSM standard, for example.
  • the navigation device 200 may include its own mobile phone technology within the navigation device 200 (including an antenna, for example, or optionally using the internal antenna of the navigation device 200).
  • the mobile phone technology within the navigation device 200 can include internal components as specified above, and/or can include an insertable card (e.g., Subscriber Identity Module or SIM card), complete with necessary mobile phone technology and/or an antenna, for example.
  • mobile phone technology within the navigation device 200 can similarly establish a network connection between the navigation device 200 and the server 302, via the internet for example, in a manner similar to that of any mobile device.
  • a Bluetooth-enabled navigation device may be configured to work correctly with the ever-changing spectrum of mobile phone models and manufacturers.
  • model/ manufacturer specific settings may be stored on the navigation device 200, for example.
  • the data stored for this information can be updated.
  • the navigation device 200 is depicted as being in communication with the server 302 via a generic communications channel 318 that can be implemented by any of a number of different arrangements.
  • the server 302 and navigation device 200 can communicate when a connection via communications channel 318 is established between the server 302 and the navigation device 200 (noting that such a connection can be a data connection via mobile device, a direct connection via personal computer and /or via the internet).
  • the server 302 includes, in addition to other components which may not be illustrated, a processor 304 operatively connected to a memory 306 and further operatively connected, via a wired or wireless connection 314, to a mass data storage device 312.
  • the processor 304 is further operatively connected to a transmitter 308 and a receiver 310, to transmit and send information to and from navigation device 200 via communications channel 318.
  • the signals sent and received may include data, communication, and/ or other propagated signals.
  • the transmitter 308 and receiver 310 may be selected or designed according to the communications requirement and communication technology used in the communication design for the navigation device 200. Further, it should be noted that the functions of the transmitter 308 and receiver 310 may be combined into a single transceiver.
  • Server 302 is further connected to (or includes) the mass storage device 312, noting that the mass storage device 312 may be coupled to the server 302 via communication link 314.
  • the mass storage device 312 contains a store of navigation data and map information, and can again be a separate device from the server 302 or can be incorporated into the server 302.
  • the navigation device 200 is adapted to communicate with the server 302 through communications channel 318, and includes at least similar elements as previously described with regard to FIG. 2, as well as transmitter 320 and receiver 322 to send and receive signals and/or data through the communications channel 318, noting that these devices can further be used to communicate with devices other than server 302. Further, the transmitter 320 and receiver 322 are selected or designed according to communication requirements and communication technology used in the communication design for the navigation device 200 and the functions of the transmitter 320 and receiver 322 may be combined into a single transceiver.
  • Software stored in server memory 306 provides instructions for the processor 304 and allows the server 302 to provide services to the navigation device 200.
  • One service provided by the server 302 involves processing requests from the navigation device 200 and transmitting navigation data from the mass data storage 312 to the navigation device 200.
  • Another service provided by the server 302 includes processing the navigation data using various algorithms for a desired application and sending the results of these calculations to the navigation device 200.
  • the communications channel 318 generically represents the propagating medium or path that connects the navigation device 200 and the server 302.
  • Both the server 302 and navigation device 200 include transmitters 308, 320 for transmitting data through the communication channel and receivers 310, 322 for receiving data that has been transmitted through the communications channel 318.
  • the communication channel 318 is not limited to a particular communication technology. Additionally, the communications channel 318 is not limited to a single communication technology; that is, the communications channel 318 may include several communication links that use a variety of technologies. For example, the communications channel 318 can be adapted to provide a path for electrical, optical, and/or electromagnetic communications, etc. As such, the communications channel 318 includes, but is not limited to, one or a combination of the following: electric circuits, electrical conductors such as wires and coaxial cables, fibre optic cables, converters, radio-frequency (RF) waves, the atmosphere, empty space, etc. Furthermore, the communications channel 318 can include intermediate devices such as routers, repeaters, buffers, transmitters, and receivers, for example.
  • the communications channel 318 includes telephone and computer networks. Furthermore, the communications channel 318 may be capable of accommodating wireless communication such as radio frequency, microwave frequency and/ or infrared communication. Additionally, the communications channel 318 can accommodate satellite communication.
  • the communication signals transmitted through the communication channel 318 include, but are not limited to, signals as may be required or desired for given communication technology.
  • the signals may be adapted to be used in cellular communication technology such as Time Division Multiple Access (TDMA), Frequency Division Multiple Access (FDMA), Code Division Multiple Access (CDMA) and /or Global System for Mobile Communications (GSM). Both digital and analogue signals can be transmitted through the communications channel 318. These signals may be modulated, encrypted and/ or compressed signals as may be desirable for the communication technology.
  • the server 302 is a remote server accessible by the navigation device 200 via a wireless channel.
  • the server 302 may include a network server located on a local area network (LAN), wide area network (WAN), virtual private network (VPN), etc.
  • the server 302 may include a personal computer such as a desktop or laptop computer, and the communications channel 318 may be a cable connected between the personal computer and the navigation device 200.
  • a personal computer may be connected between the navigation device 200 and the server 302 to establish an internet connection between the server 302 and the navigation device 200.
  • a mobile telephone or other handheld device may establish a wireless connection to the internet, for connecting the navigation device 200 to the server 302 via the internet.
  • the navigation device 200 may be provided with information from the server 302 via information downloads, which may be updated periodically (automatically or upon the user connecting the navigation device 200 to the server 302), or the provision may be more dynamic when a more constant or frequent connection is made between the server 302 and the navigation device 200 via a wireless mobile connection device and a TCP/IP connection, for example.
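The periodic-download behaviour described above amounts to a staleness check on cached data. A minimal sketch follows; the seven-day threshold is an arbitrary illustrative value, not one taken from the text:

```python
from datetime import datetime, timedelta

def needs_refresh(last_download: datetime, now: datetime,
                  max_age: timedelta = timedelta(days=7)) -> bool:
    """Return True when cached navigation data should be re-fetched.

    The threshold (max_age) is an illustrative default; a real device
    would tune it per data type (map tiles, traffic, points of interest).
    """
    return now - last_download >= max_age

stale = needs_refresh(datetime(2011, 6, 1), datetime(2011, 6, 9))
```

On a device with a constant connection the same predicate could simply be evaluated with a much smaller `max_age`, giving the "more dynamic" behaviour the text mentions.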
  • the processor 304 in the server 302 may be used to handle the bulk of the processing; however, the processor 210 of the navigation device 200 can also handle much processing and calculation, oftentimes independent of a connection to the server 302.
  • a navigation device 200 includes the processor 210, the input device 220, and the display screen 240.
  • the input device 220 and display screen 240 may be integrated into an integrated input and display device 240 to enable both input of information (via direct input, menu selection, etc.) and display of information through a touch panel screen, for example.
  • Such a screen may be a touch input LCD screen, for example, as is well known to those of ordinary skill in the art.
  • the navigation device 200 can also include any additional input device 220 and/or any additional output device 260, such as audio input/ output devices for example.
  • FIGs. 4a and 4b are perspective views of the navigation device 200.
  • the navigation device 200 may be a unit that includes an integrated input and display device 290 (a touch panel screen, for example) and the other components of FIG. 2 (including but not limited to internal GPS receiver 250, microprocessor 210, a power supply, memory systems 230, etc.).
  • the navigation device 200 may sit on an arm 292, which itself may be secured to a vehicle dashboard/ window/ etc. using a suction cup 294.
  • This arm 292 is one example of a docking station to which the navigation device 200 can be docked.
  • the navigation device 200 can be docked or otherwise connected to an arm 292 of the docking station by snap connecting the navigation device 200 to the arm 292, for example.
  • the navigation device 200 may then be rotatable on the arm 292, as shown by the arrow of FIG. 4b.
  • a button on the navigation device 200 may be pressed, for example.
  • Other equally suitable arrangements for coupling and decoupling the navigation device to a docking station are well known to persons of ordinary skill in the art.
  • FIGs. 5a to 5i there is depicted a series of screenshots from the navigation device 200.
  • the navigation device 200 has a touch screen interface for displaying information to a user and for accepting input to the device from the user.
  • the screenshots show an example embodiment of a destination location input process for a user whose home location has been set to the European Patent Office's offices in The Hague, and who wishes to navigate to a street address in Amsterdam, The Netherlands, for which they know the street name and building number.
  • the device acquires a GPS fix and calculates (in a known manner) the current location of the navigation device 200.
  • the user is then presented, as shown in FIG. 5a, with a display 340 showing, in pseudo three-dimensions, the local environment 342 in which the navigation device 200 is determined to be located. In a region 344 of the display 340 below the local environment, a series of control and status messages is displayed.
  • the navigation device 200 switches to display (as shown in FIG. 5b) a series of virtual buttons 346 for a user to, inter alia, input a destination that they wish to navigate to.
  • the navigation device 200 switches to display (as shown in FIG. 5c) a plurality of virtual buttons that are each associated with a different category of selectable destinations.
  • the display shows a "home" button that, if pressed, would set the destination to the stored home location.
  • the "favourite" button, if pressed, reveals a list of destinations that the user has previously stored in the navigation device 200 and, if one of these destinations is then selected, the destination for the route to be calculated is set to the selected previously stored destination.
  • the "recent destination" button, if pressed, reveals a list of selectable destinations held in the memory of the navigation device 200 and to which the user has recently navigated.
  • Selection of one of the destinations populating this list would set the destination location for this route to the selected (previously visited) location.
  • the "point of interest" button, if pressed, reveals a number of options by which a user can opt to navigate to any of a plurality of locations, such as cash machines, petrol stations or tourist attractions, for example, that have been pre-stored in the device as locations that a user of the navigation device 200 might want to navigate to.
  • the "arrow"-shaped virtual button opens a new menu of additional options, and the "address" button 350 commences a process by which the user can input the street address of the destination that they wish to navigate to.
  • the user knows the street address and house number of the destination and hence selects the "street and house number" virtual button 352, whereupon the user is then presented, as shown in FIG. 5e, with a prompt 354 to enter the name of the city that they wish to navigate to, a flag button 356 by which the user can select the country in which the desired city is located, and a virtual keyboard 358 that may be operated by the user, if necessary, to input the name of the destination city.
  • the user has previously navigated to locations in Rijswijk and Amsterdam, and the navigation device 200 therefore additionally provides the user with a list 360 of selectable cities.
  • the navigation device 200 displays, as shown in FIG. 5f, a virtual keyboard 362 for the user to input street names, a prompt 364 for entry of a street name and, in this instance, as the user has previously navigated to a street in Amsterdam, a list 366 of selectable streets in Amsterdam.
  • the user wishes to return to the street, Rembrandtplein, that the user has previously visited and, so, selects Rembrandtplein from the displayed list 366.
  • the navigation device 200 displays a smaller virtual keypad 368 and prompts the user, via prompt 370, to enter the number of the house in the selected street and city that they wish to navigate to. If the user has previously navigated to a house number in this street, then that number (as shown in FIG. 5g) is initially shown. If, as in this instance, the user wishes to navigate to No. 35, Rembrandtplein once again, then the user need only touch a "done" virtual button 372 displayed at the bottom right hand corner of the display. If the user should wish to navigate to a different house number in Rembrandtplein, then all they need do is operate the keypad 368 to input the appropriate house number.
  • the user is asked, in FIG. 5h, whether they wish to arrive at a particular time. If the user should push the "yes" button, then functionality is invoked that estimates the time required to travel to the destination and advises the user when the user should leave (or, if the user is running late, should have left) the user's current location in order to arrive at the destination on time. In this instance the user is not concerned about arriving at a particular time and hence selects the "no" virtual button.
  • Selecting the "no" button 374 causes the navigation device 200 to calculate a route between the current location and the selected destination and to display that route 376, as shown in FIG. 5i, on a relatively low magnification map that shows the entire route.
  • the user is provided with a "done" virtual button 378 which they can press to indicate that they are happy with the calculated route, a "find alternative" button 380 that the user can press to cause the navigation device 200 to calculate another route to the selected destination, and a "details" button 382 that a user can press to reveal selectable options for the display of more detailed information concerning the currently displayed route 376.
  • Assuming that the user is happy with the displayed route, and the "done" button 378 has been pressed, the user is presented, as shown in FIG. 6, with a pseudo three-dimensional view of the current (e.g., start) location for the navigation device 200.
  • the display depicted in FIG. 6 is similar to that shown in FIG. 5a except that the displayed local environment 342 now includes a start location flag 384 and a waypoint indicator 386 indicating the next manoeuvre (in this instance, a left hand turn).
  • the lower part of the display has also changed and now displays the name of the street in which the navigation device 200 is currently located, an icon 388 indicating the distance to and type of the next manoeuvre (from the current location of the navigation device 200), and a dynamic display 390 of the distance and time to the selected destination.
  • the user may commence the journey.
  • the navigation device 200 guides the user, in a known manner, by updating the map in accordance with determined changes in the location of navigation device 200.
  • the navigation device 200 may provide the user with visual and, optionally, audible navigation instructions.
  • a method is directed to displaying enhanced navigation instructions including a panoramic image of a scene at street level of, for example, a user in an automobile or a user on a street, including determining map information for display on an integrated input and display device of a navigation device, based upon a determined route of travel of the navigation device.
  • the method further includes determining a current location of the navigation device, displaying an image of the current location on the integrated input and display device of the navigation device, and displaying an instruction, based on the determined route of travel, in relation to the image and the location determined by the navigation device.
  • a method for displaying enhanced instructions for a determined route of travel in a navigation device 200 includes determining map information for display on an integrated input and display device 240 of a navigation device 200, based upon a determined route of travel of the navigation device 200.
  • the method includes determining a current location of the navigation device 200.
  • the method includes displaying an image of the current location on the integrated input and display device 240 of the navigation device 200, and displaying an instruction, based on the determined route of travel, in relation to the image.
  • This process is accomplished without direct interaction between the user and the navigation device 200.
  • the user does not interact directly with navigation device 200, during the course of displaying the map information, for navigation device 200 to display the enhanced instructions.
  • the user may interact with the navigation device 200 during the course of displaying the map information for other purposes. For example, the user may change display settings or voice settings. The user may even change enable/ disable settings associated with displaying enhanced instructions.
  • the navigation device 200 receives in step S705, for example, through an input device 220, a desired destination to which a user intends to travel. Once the desired destination is received by the navigation device 200, in step S710, the navigation device 200 determines a current location of the user and/or navigation device 200. Once the current location is determined, in step S715, the navigation device 200 calculates, using a method known to those skilled in the art, a planned route between the current location and the desired destination.
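The sequence of steps S705-S715 can be sketched as follows. This is a minimal sketch, not the device's implementation: `gps_fix` and `shortest_path` are hypothetical callables standing in for the positioning and routing subsystems the text leaves unspecified.

```python
from dataclasses import dataclass

@dataclass
class Location:
    lat: float
    lon: float

def plan_route(destination, gps_fix, shortest_path):
    """Sketch of steps S705-S715: a destination has been received
    (step S705), the current location is determined (step S710), and a
    planned route between the two is calculated (step S715)."""
    current = gps_fix()                         # step S710
    return shortest_path(current, destination)  # step S715

# Toy usage with stub subsystems: the "route" is just its two endpoints.
route = plan_route(
    Location(52.366, 4.896),                    # e.g., Rembrandtplein, Amsterdam
    gps_fix=lambda: Location(52.370, 4.890),
    shortest_path=lambda a, b: [a, b],
)
```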
  • the navigation device 200 determines a first instruction.
  • the first instruction may be the first instructional step in the planned route calculated in step S715.
  • the first instruction may be directions to the exit of the shopping mall parking lot.
  • the first instruction is often more difficult to comprehend than other types of instructions, as the rotation of the vehicle often does not match the direction of view of the navigation device 200.
  • the user may have made a turn of the vehicle that the navigation device 200 did not register during the last event for the navigation device 200 (e.g., parking the vehicle).
  • step S722 the navigation device 200 determines if the current location is associated with the first instruction. This analysis may be performed by performing a detailed location analysis of the first instruction. For example, the navigation device 200 may develop a list of, for example, street names, crossing street names and/or address numbers associated with the first instruction. The navigation device 200 may also analyze the current location to formulate a similar list including the same data. The two lists may then be compared. If the navigation device 200 determines the current location is not associated with the first instruction, control moves on to step S728 and no enhanced instruction is displayed. For example, if the list associated with the first instruction does not match the list associated with the current location, no enhanced instruction is displayed. At step S728, the planned route information is displayed.
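The step S722 comparison above can be sketched as follows. The text does not define what "match" means, so this reads it, as one minimal interpretation, as the two feature lists sharing at least one entry; the street names used are illustrative only.

```python
def location_matches_instruction(instruction_features, location_features):
    """Sketch of the step S722 association check: one list of street
    names, crossing street names and address numbers is built for the
    first instruction, a similar list for the current location, and the
    two are compared. Here they "match" if they share any feature."""
    return bool(set(instruction_features) & set(location_features))

# The first instruction mentions Oak Street and the GPS fix resolves to
# the Oak Street / Elm Street crossing: the enhanced instruction is shown.
shown = location_matches_instruction(["Oak Street", "Elm Street"],
                                     ["Oak Street", "No. 12"])
# No shared feature: control moves to step S728, plain route display.
skipped = location_matches_instruction(["Main Street"], ["Oak Street"])
```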
  • step S725 the navigation device 200 retrieves a scenic image, for example, a panoramic image of a scene at street level of, for example, a user in an automobile or a user on a street, of the current location of the navigation device 200 from memory 230.
  • a panoramic image of a scene at street level may be an image showing a location at street level.
  • a panoramic image of a scene at street level may be a GOOGLE STREET VIEW™ image.
  • the panoramic image of a scene at street level may also be, for example, a satellite image zoomed in on a specific location such that street level detail is shown.
  • the panoramic image of a scene at street level may also be a three dimensional image.
  • the panoramic image of a scene at street level may include photographic, realistic details of the environment.
  • the panoramic image of a scene at street level may be corrected for the time of season.
  • the panoramic image of a scene at street level may be a current photograph taken by a photographic device recently in the location.
  • the navigation device 200 may take a photograph (or several photographs) in the direction of travel just before the navigation device 200 reaches a destination and while the user drives along the street. In this way, if the navigation device 200 needs a panoramic image of a scene at street level to display on the navigation device 200, the navigation device 200 may already have the panoramic image of a scene at street level on the navigation device 200.
  • the other navigation devices take pictures. There is a possibility that at least one of the other navigation devices takes a picture of the vehicle having navigation device 200 in the vehicle.
  • the other devices may know navigation device's 200 coordinates (e.g., via WiFi or via server 302).
  • the other navigation devices may send the images (e.g., via server 302 or directly via WiFi) to navigation device 200 before navigation device 200 needs the images. In this way, the user sees the user's own vehicle in the image and the image is also corrected for time of year and day (e.g., showing snow or seasonal conditions).
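Choosing among such crowd-sourced pictures might be sketched as below. This is an assumption-laden sketch: the record layout `(lat, lon, taken_at, payload)` and the one-hour freshness window are invented for illustration, not taken from the text.

```python
import math
import time

def pick_peer_image(images, lat, lon, max_age_s=3600.0):
    """Sketch of selecting a street-level picture received from other
    navigation devices (via server 302 or directly over WiFi): discard
    stale shots so the image reflects the current time of year and day,
    then return the one taken closest to the requested coordinates."""
    now = time.time()
    fresh = [im for im in images if now - im[2] <= max_age_s]
    if not fresh:
        return None
    # Nearest by simple planar distance; adequate for a local sketch.
    return min(fresh, key=lambda im: math.hypot(im[0] - lat, im[1] - lon))
```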
  • Memory 230 may obtain and store street level images.
  • the navigation device 200 may be connected to a remote server 302 as discussed above with regard to FIG. 3. While connected to the server 302, the navigation device 200 may download and store panoramic images of a scene at street level.
  • memory 230 would be of a fairly significant size. Example embodiments are directed to a memory 230 of reasonable size.
  • memory 230 may store enough panoramic images of a scene at street level to encompass, for example, a region (e.g., northeast U.S.A.), a country (e.g., England), or a city (e.g., Amsterdam).
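A memory-230-sized store limited to one such coverage area can be sketched as follows. Modelling the region as a lat/lon bounding box, and the specific Amsterdam coordinates, are assumptions made for illustration.

```python
class RegionalImageStore:
    """Sketch of a reasonably sized image store: only street-level
    images falling inside a configured region (city, country, or
    region, here a lat/lon bounding box) are cached, keeping memory
    requirements bounded."""

    def __init__(self, lat_min, lat_max, lon_min, lon_max):
        self.box = (lat_min, lat_max, lon_min, lon_max)
        self.images = {}

    def covers(self, lat, lon):
        lat_min, lat_max, lon_min, lon_max = self.box
        return lat_min <= lat <= lat_max and lon_min <= lon <= lon_max

    def put(self, lat, lon, image):
        if not self.covers(lat, lon):
            return False  # outside the covered area: not cached
        self.images[(lat, lon)] = image
        return True

# A store covering roughly the Amsterdam area (assumed bounds):
store = RegionalImageStore(52.28, 52.43, 4.73, 5.03)
```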
  • step S730 the navigation device 200 displays the panoramic image of a scene at street level retrieved in step S725 together with the first instruction determined in step S720.
  • the panoramic image of a scene at street level may be an image of the shopping mall parking lot including the location of the navigation device 200 and the location of the appropriate exit.
  • the first instruction may be an arrow indicating the direction to take from location of the navigation device 200 (e.g., a parking spot) to the exit of the shopping mall parking lot.
  • the user may be merging onto the displayed road.
  • the navigation device 200 may not know whether the user is approaching the road from the left or the right, therefore indicating the initial direction of travel may be imprecise.
  • the arrow (e.g., the first instruction) may also be above, below, to the right or to the left of the panoramic image of a scene at street level.
  • the first instruction may also be a voice command. Again referring to FIG. 9, the voice command may be, for example, "head out onto Oak Street; head in the direction such that the white, two story house is on your left and the bank of trees is on your right.”
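The placement options for the arrow can be sketched as a simple lookup. The pixel geometry and the margin value are assumptions; the text only says the arrow may be on the image or above, below, left, or right of it.

```python
def instruction_anchor(img_w, img_h, placement="overlay", margin=24):
    """Sketch of positioning the arrow (the first instruction) relative
    to the panoramic image of a scene at street level. Returns the
    (x, y) pixel centre at which to draw the arrow; coordinates outside
    [0, img_w) x [0, img_h) mean the arrow sits beside the image."""
    centres = {
        "overlay": (img_w // 2, img_h // 2),   # drawn on the image itself
        "above":   (img_w // 2, -margin),
        "below":   (img_w // 2, img_h + margin),
        "left":    (-margin, img_h // 2),
        "right":   (img_w + margin, img_h // 2),
    }
    return centres[placement]
```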
  • step S735 the navigation device 200 detects whether the navigation device 200 is beyond the area of the panoramic image of a scene at street level. If the navigation device 200 is beyond the area of the panoramic image of a scene at street level, the navigation device 200 displays the remainder of the route as is known in the art, in step S740.
  • the navigation device 200 may detect that the navigation device 200 has reached a transition point.
  • a transition point may be any deviation from the current direction of travel.
  • the user may be travelling in a northerly direction and need to travel east at the next intersection.
  • the user may be on a highway and need to take the next off-ramp.
  • Still another example may be that the user is approaching a traffic circle, or rotary, travelling in a northerly direction and needs to continue travelling in a northerly direction.
  • the detection of the transition point does not require interaction of the user with the navigation device.
  • the navigation device 200, together with the GPS system 100 aiding identification of the location of the navigation device 200 and with the planned route, determines if the navigation device 200 has reached a transition point.
  • the details of this detection will be readily understood by one skilled in the art.
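One piece of that detection, a heading-deviation test, can be sketched as below. Both the heading comparison and the 15-degree tolerance are assumptions; note a heading-only test cannot capture the traffic-circle example above, where the overall direction of travel is preserved, so a real implementation would also consult map topology.

```python
def is_transition_point(current_heading_deg, next_heading_deg, tol_deg=15.0):
    """Sketch of a transition-point test: a transition point is any
    deviation from the current direction of travel (e.g., a turn east
    while heading north, or an off-ramp). Returns True when the planned
    route's next heading deviates from the current heading."""
    diff = abs(current_heading_deg - next_heading_deg) % 360.0
    diff = min(diff, 360.0 - diff)  # smallest angle between the headings
    return diff > tol_deg

# Heading north (0 deg) with an eastward turn (90 deg) coming up:
turning_east = is_transition_point(0.0, 90.0)
```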
  • step S750 the navigation device 200 determines the transition instruction.
  • the transition instruction may be the next instructional step in the planned route calculated in step S715. For example, if the navigation device 200 is approaching a traffic circle, the instruction may be the correct entrance and exit relative to the traffic circle.
  • step S755 the navigation device 200 retrieves a panoramic image of a scene at street level of the transition location from memory 230.
  • step S760 the navigation device 200 displays the panoramic image of a scene at street level retrieved in step S755 and the transition instruction determined in step S750.
  • the panoramic image of a scene at street level may be the exit direction off of the traffic circle. The user may be unsure of the appropriate exit if there are several entrances and exits off of the traffic circle. By displaying the image together with the instruction, the user may have more confidence in the correct exit to take.
  • the voice command may be, for example, "take the traffic circle to the right; exit onto Oak Street such that the white, two story house is on your left and the bank of trees is on your right.”
  • step S765 the navigation device 200 determines if the navigation device 200 has reached the desired destination received in step S705, using a method known to those skilled in the art. If the navigation device 200 is not at the desired destination, the method loops through steps S735 to S765. If in step S765, the navigation device 200 determines that the navigation device 200 has reached the desired destination, control moves to step S770 and the method ends.
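The loop through steps S735-S765 can be sketched as follows. `device` is a hypothetical object bundling the checks described in the text; the toy implementation below exists only so the loop can be exercised.

```python
def guidance_loop(device):
    """Sketch of the steps S735-S765 loop: until the destination is
    reached (step S765), either fall back to conventional route display
    once the device has left the image area (steps S735/S740) or, at a
    transition point, fetch and show the street-level image together
    with the transition instruction (steps S750-S760)."""
    while not device.at_destination():                 # step S765
        if device.beyond_image_area():                 # step S735
            device.show_plain_route()                  # step S740
        elif device.at_transition_point():             # transition detected
            instruction = device.next_instruction()    # step S750
            image = device.fetch_street_image(instruction)  # step S755
            device.show(image, instruction)            # step S760
    # step S770: the method ends

class ToyDevice:
    """Toy stand-in: two transition points, then the destination."""
    def __init__(self):
        self.remaining, self.shown = 2, []
    def at_destination(self):
        return self.remaining == 0
    def beyond_image_area(self):
        return False
    def at_transition_point(self):
        self.remaining -= 1
        return True
    def next_instruction(self):
        return "take the second exit"
    def fetch_street_image(self, instruction):
        return "panorama"
    def show(self, image, instruction):
        self.shown.append((image, instruction))
```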
  • a method is directed to displaying enhanced navigation instructions including a panoramic image of a scene at street level of, for example, a user in an automobile or a user on a street, including determining map information for display on an integrated input and display device of a navigation device, based upon a determined route of travel of the navigation device.
  • the method further includes determining a current location of the navigation device, displaying an image of the current location on the integrated input and display device of the navigation device, and displaying an instruction, based on the determined route of travel, in relation to the image and the location determined by the navigation device.
  • a method for displaying enhanced instructions for a determined route of travel in a navigation device 200 includes determining map information for display on an integrated input and display device 240 of a navigation device 200, based upon a determined route of travel of the navigation device 200.
  • the method includes determining a current location of the navigation device 200.
  • the method further includes displaying an image of the current location on the integrated input and display device 240 of the navigation device 200, and displaying an instruction, based on the determined route of travel, in relation to the image.
  • the navigation device 200 receives in step S805, for example through an input device 220, a desired destination to which a user intends to travel. Once the desired destination is received by the navigation device 200, in step S810, the navigation device 200 determines a current location of the user and/or navigation device 200. Once the current location is determined, in step S815, the navigation device 200 calculates a planned route between the current location and the desired destination.
  • the first instruction is determined.
  • the first instruction may be the first instructional step in the planned route calculated in step S815.
  • the first instruction may be directions to the exit of the shopping mall parking lot.
  • the first instruction is often more difficult to comprehend than other types of instructions, as the rotation of the vehicle often does not match the direction of view of the navigation device 200.
  • the user may have made a turn of the vehicle that the navigation device 200 did not register during the last event for the navigation device 200 (e.g., parking the vehicle).
  • step S822 the navigation device 200 determines if the current location is associated with the first instruction. This analysis may be performed by performing a detailed location analysis of the first instruction. For example, the navigation device 200 may develop a list of, for example, street names, crossing street names and/or address numbers associated with the first instruction. The navigation device 200 may also analyze the current location to formulate a similar list including the same data. The two lists may then be compared. If the navigation device 200 determines the current location is not associated with the first instruction, control moves on to step S823 and no enhanced instruction is displayed. For example, if the list associated with the first instruction does not match the list associated with the current location, no enhanced instruction is displayed. At step S823, the planned route information is displayed.
  • step S825 the navigation device 200 downloads a scenic image, for example, a panoramic image of a scene at street level, of the current location of the navigation device 200 from server 302.
  • a navigation device 200 may communicate with the network and retrieve a panoramic image of a scene at street level as described above with regard to FIG. 3.
  • the network may be one of several different types of networks.
  • a network may be one of a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), a virtual private network (VPN) etc.
  • a network may be one of several different sizes depending on the implementation of a navigation device 200.
  • the network may be a private network, such that its accessibility is limited to a group of users.
  • a service provider of a navigation device 200 may set up a private network for its subscribers.
  • Each of the subscribers may be able to access panoramic images of a scene at street level stored on one or more servers hosted by the service provider, such as the mass storage device 312.
  • the navigation device 200 may be able to access panoramic images of a scene at street level via the connection to the hosted server(s).
  • the network may be connected through an internet service provider (ISP) using a TCP/IP connection.
  • the panoramic images of a scene at street level may be stored in mass data storage 312.
  • the server 302 may also be configured to retrieve the panoramic image of a scene at street level from another server. For example, if a panoramic image of a scene at street level is a GOOGLE STREET VIEW™ image, server 302 may retrieve the image from a GOOGLE™ server. In the alternative, the navigation device 200 may retrieve a GOOGLE STREET VIEW™ image directly from a GOOGLE™ server.
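The retrieval paths described so far, device memory, the device's own server, or a third-party imagery server directly, can be sketched as a fallback chain. Each source is a hypothetical callable returning the image or `None`; the names are illustrative, not real APIs.

```python
def fetch_panorama(location, local_cache, own_server, third_party):
    """Sketch of the retrieval options: the image may already sit in
    memory 230, may come from server 302 (which can itself pull it from
    a third-party imagery server), or may be fetched from the
    third-party server directly. The first source with a hit wins."""
    for source in (local_cache, own_server, third_party):
        image = source(location)
        if image is not None:
            return image
    return None  # no source could supply a street-level image

# Cache miss, then server 302 supplies the image:
img = fetch_panorama("Rembrandtplein",
                     local_cache=lambda loc: None,
                     own_server=lambda loc: "street-view-" + loc,
                     third_party=lambda loc: "unused")
```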
  • step S825 the navigation device 200 downloads a panoramic image of a scene at street level.
  • step S830 the navigation device 200 generates an image including the panoramic image of a scene at street level and the first instruction.
  • step S835 the server 302 receives and incorporates the first instruction into the panoramic image of a scene at street level.
  • the navigation device 200 downloads an image from server 302. The image includes both the panoramic image of a scene at street level and the first instruction.
  • step S840 the navigation device 200 displays the panoramic image of a scene at street level generated in step S830 (or in S835) together with the first instruction determined in step S820 on the display of the navigation device 200.
  • step S845 the navigation device 200 detects whether the navigation device 200 is beyond the area of the panoramic image of a scene at street level. If the navigation device 200 is beyond the area of the panoramic image of a scene at street level, in step S860 the navigation device 200 displays the remainder of the route as is known in the art.
  • step S855 the navigation device 200 may detect that the navigation device 200 has reached a transition point.
  • step S860 the navigation device 200 determines the transition instruction.
  • the transition instruction may be the next instructional step in the planned route calculated in step S815.
  • step S865 the navigation device 200 downloads a panoramic image of a scene at street level of the transition location of the navigation device 200 from server 302.
  • step S870 the navigation device 200 generates an image including the panoramic image of a scene at street level and the transition point instruction.
  • step S875 the server 302 receives and incorporates the transition point instruction into the panoramic image of a scene at street level.
  • step S875 an image is downloaded from server 302. The image includes both the panoramic image of a scene at street level and the transition point instruction.
  • step S880 the navigation device 200 displays the panoramic image of a scene at street level generated in step S870 (or in S875) together with the transition instruction determined in step S860.
  • step S885 the navigation device 200 determines, using a method known to those skilled in the art, if the navigation device 200 has reached the desired destination received in step S805. If the navigation device 200 is not at the desired destination, the method loops through steps S845 to S885. If in step S885, the navigation device 200 determines that the navigation device 200 has reached the desired destination, control moves to step S890 and the method ends.
  • the navigation device displays enhanced navigation instructions including a panoramic image of a scene at street level of, for example, a user in an automobile or a user on a street.
  • the navigation device includes an input device to receive at least one input indicating a desired destination.
  • the navigation device includes a processor operably coupled to the input device.
  • the processor is configured to calculate a planned route between a current location and the desired destination, to determine the current location (and direction of travel), and to obtain an image of the current location.
  • the image is a street level view of a location, and the processor is further configured to overlay a first instruction on the image.
  • the navigation device includes a display device controllable by the processor and configured to display the planned route and the image including the first instruction.
  • a navigation device 200 including a memory resource 230 and a display device 240 for displaying the map information, wherein the display device 240 may be part of an integrated input and display device 240.
  • example embodiments are not limited to such a navigation device 200.
  • Other devices, e.g., a navigation device built into a vehicle, or a computing resource (such as a desktop or portable personal computer (PC), mobile telephone or portable digital assistant (PDA) executing route planning and navigation software), may be used to perform various aspects of the method described above with regard to FIG. 7 and FIG. 8, as would be understood by one of ordinary skill in the art. Further explanation is, thus, omitted for the sake of brevity.
  • the computer software includes one or more software modules operable, when executed in an execution environment, to cause a processor to determine map information for display on an integrated input and display device 240 of a navigation device 200, based upon a determined route of travel of the navigation device 200.
  • the processor determines a current location of the navigation device 200 and displays, without additional user input, an image of a scene of the current location on the integrated input and display device 240 of the navigation device 200.
  • the processor displays an instruction, based on the determined route of travel, in relation to the image of the scene.
  • At least some embodiments can be implemented as a computer program product for use with a computer system, the computer program product being, for example, a series of computer instructions or program segments stored on a tangible data recording medium (computer readable medium), such as a diskette, CD-ROM, ROM, or fixed disk, or embodied in a computer data signal, the signal being transmitted over a tangible medium or a wireless medium, for example, microwave or infrared.
  • the series of computer instructions or program segments can constitute all or part of the functionality of the method of embodiments described above, and can also be stored in any memory device, volatile or non-volatile, such as semiconductor, magnetic, optical or other memory device.
  • the navigation device 200 may utilise any kind of position sensing technology as an alternative to (or in addition to) GPS.
  • the navigation device 200 may utilise other global navigation satellite systems such as the European Galileo system.
  • the navigation device 200 is not limited to satellite-based systems but could readily function using ground-based beacons or any other kind of system that enables the device to determine its geographic location.

Abstract

Disclosed is a method and apparatus directed to displaying enhanced navigation instructions including a panoramic image of a scene at street level, including determining map information for display on an integrated input and display device of a navigation device, based upon a determined route of travel of the navigation device. The method further includes determining a current location of the navigation device, displaying an image of the current location on the integrated input and display device of the navigation device, and displaying an instruction, based on the determined route of travel, in relation to the image.

Description

NAVIGATION DEVICE AND METHOD HAVING ENHANCED INSTRUCTION INCLUDING A PANORAMIC IMAGE OF A SCENE
Field
[0001] This disclosure relates to navigation devices and to methods for determining a route of travel from a first location to a second location. At least some illustrative embodiments relate to portable navigation devices (so-called PNDs); in particular, PNDs that include Global Positioning System (GPS) signal reception and processing functionality. Other embodiments relate, more generally, to any type of processing device that is configured to execute navigation software so as to provide route planning, and navigation functionality.
Background
[0002] Navigation devices that include GPS (Global Positioning System) signal reception and processing functionality are well known and are widely employed as in-car or other vehicle navigation systems.
[0003] Such devices are of great utility when the user is not familiar with the route to the destination to which they are navigating. However, the user may be unfamiliar with surrounding landmarks, the user's orientation and/or the user's vehicle orientation in comparison to the on-screen navigation instruction provided by the navigation device.
[0004] For example, at the beginning of a route, establishing a direction of travel can be difficult because there is only a minimal amount of orientation aid provided by navigation devices during a first instruction. Because on-screen map data is always an abstraction of reality, causing opposite travel directions to look similar, the navigation device guesses (e.g., a guess may be based on a last known direction of travel) in which direction the vehicle is pointing and then provides the instruction based on this directional guess. The user may then need to correct the device's guess of where to go, because the navigation device may have the orientation wrong or the instruction could be completely unclear. As a result, the user is often confused and starts their journey in the wrong direction. Although the navigation device may inform the user to reverse course, setting off in the wrong direction may inconvenience the user and/or significantly add time to the user's travel.
[0005] A second example may be that, while travelling on a highway at highway speeds, a user may be told to exit the highway where there may be several sequential off-ramps with only one being the correct ramp. Because on-screen map data is always an abstraction of reality, discerning the correct ramp to take may be difficult. As a result, the user is often confused and may choose the wrong off-ramp.
Summary
[0006] According to one example embodiment, a method is directed to displaying enhanced navigation instructions including a panoramic image of a scene at street level of, for example, a user in an automobile or a user on a street, including determining map information for display on an integrated input and display device of a navigation device, based upon a determined route of travel of the navigation device. The method further includes determining a current location of the navigation device, displaying an image of the current location on the integrated input and display device of the navigation device, and displaying an instruction, based on the determined route of travel, in relation to the image and the location determined by the navigation device.
[0007] Another example embodiment of the present disclosure is directed to a navigation device. The navigation device displays enhanced navigation instructions including a panoramic image of a scene at street level of, for example, a user in an automobile or a user on a street. The navigation device includes an input device to receive at least one input indicating a desired destination. The navigation device includes a processor operably coupled to the input device. The processor is configured to calculate a planned route between a current location and the desired destination, to determine the current location (and direction of travel), and to obtain an image of the current location. The image is a street level view of a location, and the processor is further configured to overlay a first instruction on the image. The navigation device includes a display device controllable by the processor and configured to display the planned route and the image including the first instruction.
[0008] Advantages of these embodiments are set out hereafter, and further details and features of each of these embodiments are defined in the accompanying dependent claims and elsewhere in the following detailed description.
Brief Description of the Drawings
[0009] Various aspects of the teachings of the present disclosure, and arrangements embodying those teachings, will hereafter be described by way of illustrative examples with reference to the accompanying drawings, in which:
[0010] Fig. 1 is a schematic illustration of a Global Positioning System (GPS);
[0011] Fig. 2 is a schematic illustration of electronic components arranged to provide a navigation device;
[0012] Fig. 3 is a schematic illustration of the manner in which the navigation device of Fig. 2 may receive information over a wireless communication channel;
[0013] Figs. 4a and 4b are illustrative perspective views of the navigation device of Fig. 2;
[0014] Figs. 5a to 5i are illustrative screenshots from a navigation device for a destination input process;
[0015] Fig. 6 is an illustrative screenshot from a navigation device depicting a start location for an illustrative calculated route;
[0016] Fig. 7 is an illustrative flow diagram depicting steps of an example embodiment;
[0017] Fig. 8 is an illustrative flow diagram depicting steps of an example embodiment; and
[0018] Fig. 9 is an illustrative image depicting an image together with an instruction according to an example embodiment.
Detailed Description of Example Embodiments

[0004] Various example embodiments will now be described more fully with reference to the accompanying drawings, in which some example embodiments are illustrated.
[0005] Accordingly, while example embodiments are capable of various modifications and alternative forms, embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that there is no intent to limit example embodiments to the particular forms disclosed, but on the contrary, example embodiments are to cover all modifications, equivalents, and alternatives falling within the scope of example embodiments. Like numbers refer to like elements throughout the description of the figures.
[0006] It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of example embodiments. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
[0007] It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., "between" versus "directly between," "adjacent" versus "directly adjacent," etc.).
[0008] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms "a," "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used herein, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.
[0009] Spatially relative terms, e.g., "beneath," "below," "lower," "above," "upper" and the like, may be used herein for ease of description to describe one element or a relationship between a feature and another element or feature as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as "below" or "beneath" other elements or features would then be oriented "above" the other elements or features. Thus, for example, the term "below" can encompass both an orientation which is above as well as below. The device may be otherwise oriented (rotated 90 degrees or viewed or referenced at other orientations) and the spatially relative descriptors used herein should be interpreted accordingly.
[0010] It should also be noted that in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two figures shown in succession may in fact be executed substantially concurrently or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
[0011] Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example embodiments belong. It will be further understood that terms, e.g., those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
[0012] Portions of example embodiments and corresponding detailed description are presented in terms of software, or algorithms and symbolic representations of operation on data bits within a computer memory. These descriptions and representations are the ones by which those of ordinary skill in the art effectively convey the substance of their work to others of ordinary skill in the art. An algorithm, as the term is used here, and as it is used generally, is conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of optical, electrical, or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It is convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
[0013] In the following description, illustrative embodiments will be described with reference to acts and symbolic representations of operations (e.g., in the form of flowcharts) that may be implemented as program modules or functional processes, including routines, programs, objects, components and data structures, that perform particular tasks or implement particular abstract data types, and that may be implemented using existing hardware at existing network elements or control nodes (e.g., a database). Such existing hardware may include one or more Central Processing Units (CPUs), digital signal processors (DSPs), application-specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), computers or the like.
[0014] It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise, or as is apparent from the discussion, terms such as "processing" or "computing" or "calculating" or "determining" or "displaying" or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical, electronic quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
[0019] Note also that the software implemented aspects of example embodiments are typically encoded on some form of computer readable medium or implemented over some type of transmission medium. The computer readable medium may be magnetic (e.g., a floppy disk or a hard drive) or optical (e.g., a compact disk read only memory, or "CD ROM"), and may be read only or random access. Similarly, the transmission medium may be twisted wire pairs, coaxial cable, optical fiber, or some other suitable transmission medium known to the art. Example embodiments are not limited by these aspects of any given implementation.
[0020] Example embodiments will now be described with particular reference to a portable navigation device (PND). It should be understood that the teachings of the present disclosure are not limited to PNDs and are universally applicable to any type of processing device that is configured to execute navigation software so as to provide route planning and navigation functionality. It follows therefore that in the context of the present application, a navigation device is intended to include (without limitation) any type of route planning and/or navigation device, irrespective of whether that device is embodied as a PND, a navigation device built into a vehicle, or indeed a computing resource (such as a desktop or portable personal computer (PC), mobile telephone or portable digital assistant (PDA) executing route planning and navigation software).
[0021] It will also be apparent from the following that the teachings of the present disclosure even have utility in circumstances where a user is not seeking instructions on how to navigate from one point to another, but merely wishes to be provided with a view of a given location. In such circumstances the "destination" location selected by the user need not have a corresponding start location from which the user wishes to start navigating, and as a consequence references herein to the "destination" location or indeed to a "destination" view should not be interpreted to mean that the generation of a route is essential, that travelling to the "destination" must occur, or indeed that the presence of a destination requires the designation of a corresponding start location.
[0022] With the above provisos in mind, FIG. 1 illustrates an example view of a Global Positioning System (GPS) usable by navigation devices. Such systems are known and are used for a variety of purposes. In general, GPS is a satellite-radio-based navigation system capable of determining continuous position, velocity, time, and in some instances direction information for an unlimited number of users. Formerly known as NAVSTAR, the GPS incorporates a plurality of satellites which orbit the earth in extremely precise orbits. Based on these precise orbits, GPS satellites can relay their location to any number of receiving units.

[0023] The GPS system is implemented when a device, specially equipped to receive GPS data, begins scanning radio frequencies for GPS satellite signals. Upon receiving a radio signal from a GPS satellite, the device determines the precise location of that satellite via one of a plurality of different conventional methods. The device will continue scanning, in most instances, for signals until it has acquired at least three different satellite signals (noting that position is not normally, but can be, determined with only two signals using other triangulation techniques). Implementing geometric triangulation, the device utilizes the three known positions to determine its own two-dimensional position relative to the satellites. This can be done in a known manner. Additionally, acquiring a fourth satellite signal allows the device to calculate its three-dimensional position by the same geometrical calculation in a known manner. The position and velocity data can be updated in real time on a continuous basis by an unlimited number of users.
[0024] As shown in Figure 1 , the GPS system is denoted generally by reference numeral 100. A plurality of satellites 120 are in orbit about the earth 124. The orbit of each satellite 120 is not necessarily synchronous with the orbits of other satellites 120 and, in fact, is likely asynchronous. A GPS receiver 140 is shown receiving spread spectrum GPS satellite signals 160 from the various satellites 120.
[0025] The spread spectrum signals 160, continuously transmitted from each satellite 120, utilize a highly accurate frequency standard accomplished with an extremely accurate atomic clock. Each satellite 120, as part of its data signal transmission 160, transmits a data stream indicative of that particular satellite 120. It is appreciated by those skilled in the relevant art that the GPS receiver 140 generally acquires spread spectrum GPS satellite signals 160 from at least three satellites 120 for the GPS receiver device 140 to calculate its two-dimensional position by triangulation. Acquisition of an additional signal, resulting in signals 160 from a total of four satellites 120, permits the GPS receiver 140 to calculate its three-dimensional position in a known manner.
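The triangulation principle described in the two paragraphs above can be illustrated with a minimal two-dimensional sketch. Real GPS receivers solve in three dimensions and also estimate a receiver clock bias; the function name, coordinates and distances below are invented purely for illustration:

```python
# Minimal 2-D trilateration sketch of the geometric principle described
# above. Not the disclosed implementation: names and data are illustrative.

def trilaterate_2d(p1, d1, p2, d2, p3, d3):
    """Return the (x, y) position at distances d1, d2, d3 from the
    known points p1, p2, p3 (assumed not collinear)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the first circle equation from the other two turns
    # the quadratic system into two linear equations in (x, y).
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1 ** 2 - d2 ** 2 + x2 ** 2 - x1 ** 2 + y2 ** 2 - y1 ** 2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1 ** 2 - d3 ** 2 + x3 ** 2 - x1 ** 2 + y3 ** 2 - y1 ** 2
    det = a1 * b2 - a2 * b1  # non-zero when the three points are not collinear
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det
```

With the satellites projected to (0, 0), (10, 0) and (0, 10) and distances measured from the point (3, 4), the function recovers (3, 4), mirroring the two-dimensional fix described above; a fourth measurement extends the same calculation to three dimensions.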
[0026] Figure 2 is an illustrative representation of electronic components of a navigation device 200 according to an example embodiment of the present disclosure, in block component format. It should be noted that the block diagram of the navigation device 200 is not inclusive of all components of the navigation device, but is only representative of many example components.
[0027] The navigation device 200 is located within a housing (not shown). The housing includes a processor 210 connected to an input device 220 and a display screen 240. The input device 220 can include a keyboard device, voice input device, touch panel and/or any other known input device utilised to input information; and the display screen 240 can include any type of display screen such as an LCD display, for example. In an example arrangement, the input device 220 and display screen 240 are integrated into an integrated input and display device 240, including a touchpad or touch screen input so that a user need only touch a portion of the display screen 240 to select one of a plurality of display choices or to activate one of a plurality of virtual buttons.
[0028] The navigation device may include an output device 260, for example, an audible output device (e.g., a loudspeaker). As output device 260 can produce audible information for a user of the navigation device 200, it should equally be understood that input device 220 can include a microphone and software for receiving input voice commands as well.
[0029] In the navigation device 200, processor 210 is operatively connected to and set to receive input information from input device 220 via a connection 225, and operatively connected to at least one of display screen 240 and output device 260, via output connections 245, to output information thereto. Further, the processor 210 is operably coupled to a memory resource 230 via connection 235 and is further adapted to receive/send information from/to input/output (I/O) ports 270 via connection 275, wherein the I/O port 270 is connectible to an I/O device 280 external to the navigation device 200. The memory resource 230 comprises, for example, a volatile memory, such as a Random Access Memory (RAM) and a non-volatile memory, for example a digital memory, such as a flash memory.
[0030] The external I/O device 280 may include, but is not limited to, an external listening device such as an earpiece. For example, the connection to I/O device 280 may further be a wired or wireless connection to any other external device such as a car stereo unit for hands-free operation and/or for voice activated operation, for connection to an earpiece or headphones, and/or for connection to a mobile phone, wherein the mobile phone connection may be used to establish a data connection between the navigation device 200 and the internet or any other network, and/or to establish a connection to a server via the internet or some other network.
[0031] FIG. 2 further illustrates an operative connection between the processor 210 and an antenna/receiver 250 via connection 255. The antenna/receiver 250 can be a GPS antenna/receiver, for example. It will be understood that the antenna and receiver designated by reference numeral 250 are combined schematically for illustration, but that the antenna and receiver may be separately located components, and that the antenna may be a GPS patch antenna or helical antenna, for example.
[0032] Further, it will be understood by one of ordinary skill in the art that the electronic components shown in FIG. 2 are powered by power sources (not shown) in a conventional manner. As will be understood by one of ordinary skill in the art, different configurations of the components shown in FIG. 2 are considered to be within the scope of the present application. For example, the components shown in FIG. 2 may be in communication with one another via wired and/or wireless connections and the like. Thus, the scope of the navigation device 200 of the present application includes a portable or handheld navigation device 200.
[0033] In addition, the portable or handheld navigation device 200 of FIG. 2 can be connected or "docked" in a known manner to a vehicle such as a bicycle, a motorbike, a car or a boat, for example. Such a navigation device 200 is then removable from the docked location for portable or handheld navigation use.
[0034] Referring now to FIG. 3, the navigation device 200 may establish a "mobile" or telecommunications network connection with a server 302 via a mobile device (not shown) (such as a mobile phone, PDA, and/or any device with mobile phone technology) establishing a digital connection (e.g., a digital connection via known Bluetooth technology or a WiFi connection). Thereafter, through its network service provider, the mobile device can establish a network connection (through the internet, for example) with the server 302. As such, a "mobile" network connection is established between the navigation device 200 (which can be, and oftentimes is, mobile as it travels alone and/or in a vehicle) and the server 302 to provide a "real-time" or at least very "up to date" gateway for information.

[0035] The establishing of the network connection between the mobile device (via a service provider) and another device such as the server 302, using the internet (e.g., the World Wide Web) for example, can be done in a known manner. This can include use of the TCP/IP layered protocol. The mobile device can utilize any number of communication standards such as CDMA, GSM, WAN, etc.
[0036] As such, an internet connection may be utilised which is achieved via data connection, via a mobile phone or mobile phone technology within the navigation device 200, for example. For this connection, an internet connection between the server 302 and the navigation device 200 is established. This can be done, for example, through a mobile phone or other mobile device and a GPRS (General Packet Radio Service) connection (GPRS is a high-speed data connection for mobile devices provided by telecom operators, and is a method to connect to the internet).
[0037] The navigation device 200 can further complete a data connection with the mobile device, and eventually with the internet and server 302, via existing Bluetooth technology, for example, in a known manner, wherein the data protocol can utilize any number of standards, such as the GSRM, the Data Protocol Standard for the GSM standard, for example.
[0038] The navigation device 200 may include its own mobile phone technology within the navigation device 200 (including an antenna, for example, or optionally using the internal antenna of the navigation device 200). The mobile phone technology within the navigation device 200 can include internal components as specified above, and/or can include an insertable card (e.g., Subscriber Identity Module or SIM card), complete with necessary mobile phone technology and/or an antenna, for example. As such, mobile phone technology within the navigation device 200 can similarly establish a network connection between the navigation device 200 and the server 302, via the internet for example, in a manner similar to that of any mobile device.
[0039] For GPRS phone settings, a Bluetooth enabled navigation device may be used to correctly work with the ever changing spectrum of mobile phone models and manufacturers. In addition, model/manufacturer specific settings may be stored on the navigation device 200, for example. The data stored for this information can be updated.

[0040] In FIG. 3, the navigation device 200 is depicted as being in communication with the server 302 via a generic communications channel 318 that can be implemented by any of a number of different arrangements. The server 302 and navigation device 200 can communicate when a connection via communications channel 318 is established between the server 302 and the navigation device 200 (noting that such a connection can be a data connection via a mobile device, a direct connection via a personal computer and/or via the internet).
[0041] The server 302 includes, in addition to other components which may not be illustrated, a processor 304 operatively connected to a memory 306 and further operatively connected, via a wired or wireless connection 314, to a mass data storage device 312. The processor 304 is further operatively connected to a transmitter 308 and a receiver 310, to transmit and receive information to and from the navigation device 200 via communications channel 318. The signals sent and received may include data, communication, and/or other propagated signals. The transmitter 308 and receiver 310 may be selected or designed according to a communications requirement and communication technology used in the communication design for the navigation system 200. Further, it should be noted that the functions of transmitter 308 and receiver 310 may be combined into a single transceiver.
[0042] Server 302 is further connected to (or includes) the mass storage device 312, noting that the mass storage device 312 may be coupled to the server 302 via communication link 314. The mass storage device 312 contains a store of navigation data and map information, and can again be a separate device from the server 302 or can be incorporated into the server 302.
[0043] The navigation device 200 is adapted to communicate with the server 302 through communications channel 318, and includes at least similar elements as previously described with regard to FIG. 2, as well as transmitter 320 and receiver 322 to send and receive signals and/or data through the communications channel 318, noting that these devices can further be used to communicate with devices other than server 302. Further, the transmitter 320 and receiver 322 are selected or designed according to communication requirements and communication technology used in the communication design for the navigation device 200 and the functions of the transmitter 320 and receiver 322 may be combined into a single transceiver.
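The combination of a transmitter and receiver into a single transceiver, noted for both the server (308/310) and the navigation device (320/322), can be sketched as a minimal class. The class and the list-based channel below are purely illustrative assumptions; no real propagating medium or radio protocol is modelled:

```python
# Illustrative "single transceiver" sketch: combines the transmit and
# receive roles over a shared communications channel, modelled as a list.

class Transceiver:
    def __init__(self, channel):
        self.channel = channel  # stands in for communications channel 318

    def send(self, data):
        # Transmit data through the shared channel.
        self.channel.append(data)

    def receive(self):
        # Receive the oldest untaken transmission, or None if the
        # channel is currently empty.
        return self.channel.pop(0) if self.channel else None
```

Two such transceivers attached to the same channel then exchange data in the manner described for the navigation device 200 and server 302.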
[0044] Software stored in server memory 306 provides instructions for the processor 304 and allows the server 302 to provide services to the navigation device 200. One service provided by the server 302 involves processing requests from the navigation device 200 and transmitting navigation data from the mass data storage 312 to the navigation device 200. Another service provided by the server 302 includes processing the navigation data using various algorithms for a desired application and sending the results of these calculations to the navigation device 200.
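The two server-side services described in this paragraph — transmitting navigation data from mass storage, and processing that data with an algorithm before returning the result — might be dispatched as sketched below. The request format, region key and statistics are invented for illustration; the disclosure does not specify them:

```python
# Hypothetical sketch of the two services provided by server 302.
# MAP_STORE stands in for the mass data storage device 312.

MAP_STORE = {
    "amsterdam": {"roads": 14210, "nodes": 60412},
}

def handle_request(request):
    if request["type"] == "fetch_map":
        # Service 1: transmit navigation data from mass storage.
        return MAP_STORE.get(request["region"])
    if request["type"] == "route_stats":
        # Service 2: process the navigation data with an algorithm
        # and send the result of the calculation instead.
        data = MAP_STORE.get(request["region"])
        return {"density": data["roads"] / data["nodes"]}
    return {"error": "unknown request"}
```

Returning a small computed result rather than the raw map data reflects the division of labour described later, where the server handles the bulk of the processing for dynamic calculations.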
[0045] The communications channel 318 generically represents the propagating medium or path that connects the navigation device 200 and the server 302. Both the server 302 and navigation device 200 include transmitters 308, 320 for transmitting data through the communication channel and receivers 310, 322 for receiving data that has been transmitted through the communications channel 318.
[0046] The communication channel 318 is not limited to a particular communication technology. Additionally, the communications channel 318 is not limited to a single communication technology; that is, communications channel 318 may include several communication links that use a variety of technology. For example, the communications channel 318 can be adapted to provide a path for electrical, optical, and/or electromagnetic communications, etc. As such, the communications channel 318 includes, but is not limited to, one or a combination of the following: electric circuits, electrical conductors such as wires and coaxial cables, fibre optic cables, converters, radio-frequency (RF) waves, the atmosphere, empty space, etc. Furthermore, the communications channel 318 can include intermediate devices such as routers, repeaters, buffers, transmitters, and receivers, for example.
[0047] In one illustrative arrangement, the communications channel 318 includes telephone and computer networks. Furthermore, the communications channel 318 may be capable of accommodating wireless communication such as radio frequency, microwave frequency and/or infrared communication. Additionally, the communications channel 318 can accommodate satellite communication.

[0048] The communication signals transmitted through the communication channel 318 include, but are not limited to, signals as may be required or desired for a given communication technology. For example, the signals may be adapted to be used in cellular communication technology such as Time Division Multiple Access (TDMA), Frequency Division Multiple Access (FDMA), Code Division Multiple Access (CDMA) and/or Global System for Mobile Communications (GSM). Both digital and analogue signals can be transmitted through the communications channel 318. These signals may be modulated, encrypted and/or compressed signals as may be desirable for the communication technology.
[0049] The server 302 is a remote server accessible by the navigation device 200 via a wireless channel. The server 302 may include a network server located on a local area network (LAN), wide area network (WAN), virtual private network (VPN), etc.
[0050] The server 302 may include a personal computer such as a desktop or laptop computer, and the communications channel 318 may be a cable connected between the personal computer and the navigation device 200. Alternatively, a personal computer may be connected between the navigation device 200 and the server 302 to establish an internet connection between the server 302 and the navigation device 200. Alternatively, a mobile telephone or other handheld device may establish a wireless connection to the internet, for connecting the navigation device 200 to the server 302 via the internet.
[0051] The navigation device 200 may be provided with information from the server 302 via information downloads, which may be updated automatically periodically or upon a user connecting the navigation device 200 to the server 302, and/or may be more dynamic upon a more constant or frequent connection being made between the server 302 and navigation device 200 via a wireless mobile connection device and TCP/IP connection, for example. For many dynamic calculations, the processor 304 in the server 302 may be used to handle the bulk of the processing; however, the processor 210 of the navigation device 200 can also handle much processing and calculation, oftentimes independently of a connection to the server 302.
[0052] As indicated above in FIG. 2, a navigation device 200 includes the processor 210, the input device 220, and the display screen 240. The input device 220 and display screen 240 may be integrated into an integrated input and display device 240 to enable both input of information (via direct input, menu selection, etc.) and display of information through a touch panel screen, for example. Such a screen may be a touch input LCD screen, for example, as is well known to those of ordinary skill in the art. Further, the navigation device 200 can also include any additional input device 220 and/or any additional output device 260, such as audio input/output devices, for example.
[0053] FIGs. 4a and 4b are perspective views of the navigation device 200. As shown in FIG. 4a, the navigation device 200 may be a unit that includes an integrated input and display device 290 (a touch panel screen, for example) and the other components of FIG. 2 (including but not limited to internal GPS receiver 250, microprocessor 210, a power supply, memory systems 230, etc.).
[0054] The navigation device 200 may sit on an arm 292, which itself may be secured to a vehicle dashboard/window/etc. using a suction cup 294. This arm 292 is one example of a docking station to which the navigation device 200 can be docked.
[0055] As shown in FIG. 4b, the navigation device 200 can be docked or otherwise connected to an arm 292 of the docking station by snap connecting the navigation device 200 to the arm 292, for example. The navigation device 200 may then be rotatable on the arm 292, as shown by the arrow of FIG. 4b. To release the connection between the navigation device 200 and the docking station, a button on the navigation device 200 may be pressed, for example. Other equally suitable arrangements for coupling and decoupling the navigation device to a docking station are well known to persons of ordinary skill in the art.
[0056] Referring now to FIGs. 5a to 5i there is depicted a series of screenshots from the navigation device 200. The navigation device 200 has a touch screen interface for displaying information to a user and for accepting input to the device from the user. The screenshots show an example embodiment of a destination location input process for a user whose home location has been set to the offices in The Hague of the European Patent Office, and who wishes to navigate to a street address in Amsterdam, The Netherlands for which they know the street name and building number.

[0057] When this user switches on their navigation device 200, the device acquires a GPS fix and calculates (in a known manner) the current location of the navigation device 200. The user is then presented, as shown in FIG. 5a, with a display 340 showing, in pseudo three-dimensions, the local environment 342 in which the navigation device 200 is determined to be located. In a region 344 of the display 340 below the local environment, a series of control and status messages is displayed.
[0058] By touching the display of the local environment 342, the navigation device 200 switches to display (as shown in FIG. 5b) a series of virtual buttons 346 for a user to, inter alia, input a destination that they wish to navigate to.
[0059] By touching the "navigate to" virtual button 348, the navigation device 200 switches to display (as shown in FIG. 5c) a plurality of virtual buttons that are each associated with a different category of selectable destinations. In this instance, the display shows a "home" button that if pressed would set the destination to the stored home location. However, in this instance as the user is already at their home location (namely the EPO's offices in The Hague), selecting this option would not cause a route to be generated. The "favourite" button, if pressed, reveals a list of destinations that the user has previously stored in the navigation device 200 and, if one of these destinations is then selected, the destination for the route to be calculated is set to the selected previously stored destination. The "recent destination" button, if pressed, reveals a list of selectable destinations held in the memory of the navigation device 200 and to which the user has recently navigated.
[0060] Selection of one of the destinations populating this list would set the destination location for this route to the selected (previously visited) location. The "point of interest" button, if pressed, reveals a number of options by which a user can opt to navigate to any of a plurality of locations, such as cash machines, petrol stations or tourist attractions, for example, that have been pre-stored in the device as locations that a user of the navigation device 200 might want to navigate to. The "arrow" shaped virtual button opens a new menu of additional options, and the "address" button 350 commences a process by which the user can input the street address of the destination that they wish to navigate to.

[0061] Since the user, in this example, knows the street address of the destination that they wish to navigate to, it is assumed that the "address" button 350 is operated (by touching the button displayed on the touch screen), whereupon (as shown in FIG. 5d) the user is presented with a series of address input options - in particular for address input by "city centre", by "postcode", by "crossing or intersection" (for example a junction of two roads) and by "street and house number".
[0062] In this example the user knows the street address and house number of the destination and hence selects the "street and house number" virtual button 352, whereupon the user is then presented, as shown in FIG. 5e, with a prompt 354 to enter the name of the city that they wish to navigate to, a flag button 356 by which the user can select the country in which the desired city is located, and a virtual keyboard 358 that may be operated by the user, if necessary, to input the name of the destination city. In this instance the user has previously navigated to locations in Rijswijk and Amsterdam, and the navigation device 200 therefore additionally provides the user with a list 360 of selectable cities.
[0063] The user in this instance wishes to navigate to Amsterdam, and on selection of Amsterdam from the list 360, the navigation device 200 displays, as shown in FIG. 5f, a virtual keyboard 362 for the user to input street names, a prompt 364 for entry of a street name and, in this instance, as the user has previously navigated to a street in Amsterdam, a list 366 of selectable streets in Amsterdam.
[0064] In this example, the user wishes to return to the street, Rembrandtplein, that the user has previously visited and so selects Rembrandtplein from the displayed list 366.
[0065] Once a street has been selected, the navigation device 200 then displays a smaller virtual keypad 368 and prompts the user, via prompt 370, to enter the number of the house in the selected street and city that they wish to navigate to. If the user has previously navigated to a house number in this street, then that number (as shown in FIG. 5g) is initially shown. If, as in this instance, the user wishes to navigate to No. 35, Rembrandtplein once again, then the user need only touch a "done" virtual button 372 displayed at the bottom right hand corner of the display. If the user should wish to navigate to a different house number in Rembrandtplein, then all they need do is operate the keypad 368 to input the appropriate house number.
[0066] Once the house number has been input, the user is asked, as shown in FIG. 5h, whether they wish to arrive at a particular time. If the user should push the "yes" button, then functionality is invoked that estimates the time required to travel to the destination and advises the user when the user should leave (or, if the user is running late, should have left) the user's current location in order to arrive at the destination on time. In this instance the user is not concerned about arriving at a particular time and hence selects the "no" virtual button.
[0067] Selecting the "no" button 374 causes the navigation device 200 to calculate a route between the current location and the selected destination and to display that route 376, as shown in FIG. 5i, on a relatively low magnification map that shows the entire route. The user is provided with a "done" virtual button 378 which they can press to indicate that they are happy with the calculated route, a "find alternative" button 380 that the user can press to cause the navigation device 200 to calculate another route to the selected destination, and a "details" button 382 that a user can press to reveal selectable options for the display of more detailed information concerning the currently displayed route 376.
[0068] Assuming that the user is happy with the displayed route, and the "done" button 378 has been pressed, the user is presented, as shown in FIG. 6, with a pseudo three-dimensional view of the current (e.g., start) location for the navigation device 200. The display depicted in FIG. 6 is similar to that shown in FIG. 5a except that the displayed local environment 342 now includes a start location flag 384 and a waypoint indicator 386 indicating the next manoeuvre (in this instance, a left hand turn). The lower part of the display has also changed and now displays the name of the street in which the navigation device 200 is currently located, an icon 388 indicating the distance to and type of the next manoeuvre (from the current location of the navigation device 200), and a dynamic display 390 of the distance and time to the selected destination.
[0069] The user may commence the journey. The navigation device 200 guides the user, in a known manner, by updating the map in accordance with determined changes in the location of navigation device 200. In addition, the navigation device 200 may provide the user with visual and, optionally, audible navigation instructions.
[0070] According to an example embodiment, a method is directed to displaying enhanced navigation instructions including a panoramic image of a scene at street level (for example, as seen by a user in an automobile or a user on a street), including determining map information for display on an integrated input and display device of a navigation device, based upon a determined route of travel of the navigation device. The method further includes determining a current location of the navigation device, displaying an image of the current location on the integrated input and display device of the navigation device, and displaying an instruction, based on the determined route of travel, in relation to the image and the location determined by the navigation device.
[0071] According to various example embodiments, and as illustrated in FIG. 7, a method for displaying enhanced instructions for a determined route of travel in a navigation device 200 is disclosed. The method includes determining map information for display on an integrated input and display device 240 of a navigation device 200, based upon a determined route of travel of the navigation device 200. The method includes determining a current location of the navigation device 200. The method includes displaying an image of the current location on the integrated input and display device 240 of the navigation device 200, and displaying an instruction, based on the determined route of travel, in relation to the image.
[0072] This process is accomplished without direct interaction between the user and the navigation device 200. In other words, the user does not interact directly with navigation device 200, during the course of displaying the map information, for navigation device 200 to display the enhanced instructions. The user may interact with the navigation device 200 during the course of displaying the map information for other purposes. For example, the user may change display settings or voice settings. The user may even change enable/disable settings associated with displaying enhanced instructions.
[0073] Referring to FIG. 7, the navigation device 200 receives in step S705, for example, through an input device 220, a desired destination to which a user intends to travel. Once the desired destination is received by the navigation device 200, in step S710, the navigation device 200 determines a current location of the user and/or navigation device 200. Once the current location is determined, in step S715, the navigation device 200 calculates, using a method known to those skilled in the art, a planned route between the current location and the desired destination.
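The flow of steps S705 to S715 can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the `Location` type and `plan_route` helper are hypothetical names, and the trivial two-point "route" merely stands in for a real shortest-path search over a map database.

```python
# Sketch of steps S705–S715: receive a destination, determine the
# current position, and plan a route between the two. All names here
# are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class Location:
    lat: float
    lon: float

def plan_route(current, destination):
    """Placeholder for the routing step (S715); a real device would
    run a shortest-path search over its stored map data."""
    return [current, destination]  # trivial "route": start and end only

current = Location(52.37, 4.89)        # e.g. near Amsterdam city centre
destination = Location(52.366, 4.896)  # e.g. near Rembrandtplein
route = plan_route(current, destination)
print(len(route))  # a planned route contains at least its endpoints
```

In a real device the returned route would be a sequence of manoeuvres rather than bare coordinates, but the control flow — destination in, position fix, route out — is the same.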
[0074] In step S720, the navigation device 200 determines a first instruction. The first instruction may be the first instructional step in the planned route calculated in step S715. For example, if the navigation device 200 is located in the parking lot of a shopping mall, the first instruction may be directions to the exit of the shopping mall parking lot. The first instruction is often more difficult to comprehend than other types of instructions as the rotation of the vehicle often does not match the direction of view of the navigation device 200. For example, the user may have made a turn of the vehicle that the navigation device 200 did not register during the last event for the navigation device 200 (e.g., parking the vehicle).
[0075] In step S722, the navigation device 200 determines if the current location is associated with the first instruction. This determination may be made by performing a detailed location analysis of the first instruction. For example, the navigation device 200 may develop a list of, for example, street names, crossing street names and/or address numbers associated with the first instruction. The navigation device 200 may also analyze the current location to formulate a similar list including the same data. The two lists may then be compared. If the navigation device 200 determines the current location is not associated with the first instruction, control moves on to step S728 and no enhanced instruction is displayed. For example, if the list associated with the first instruction does not match the list associated with the current location, no enhanced instruction is displayed. At step S728, the planned route information is displayed.
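The list comparison of step S722 can be illustrated with a short sketch. The attribute categories (street names, crossing streets, house numbers) come from the paragraph above; the concrete values and the set-based comparison are assumptions made only for illustration.

```python
# Sketch of the association check in step S722: build an attribute
# list for the first instruction and for the current location, then
# compare the two. The example street names are hypothetical.

def location_attributes(streets, crossings, numbers):
    # Collapse the three attribute lists into one comparable set.
    return frozenset(streets) | frozenset(crossings) | frozenset(numbers)

instruction_attrs = location_attributes(["Oak Street"], ["Elm Street"], ["12"])
current_attrs = location_attributes(["Oak Street"], ["Elm Street"], ["12"])

# An enhanced instruction is shown only when the lists match
# (S722 → S725); otherwise plain route display follows (S728).
show_enhanced = instruction_attrs == current_attrs
print(show_enhanced)
```

A production implementation would likely use fuzzy matching and a distance tolerance rather than exact equality, but the branch structure is the same.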
[0076] If, in step S722, the navigation device 200 determines the current location is associated with the first instruction, in step S725, the navigation device 200 retrieves a scenic image of the current location of the navigation device 200 from memory 230, for example a panoramic image of a scene at street level (e.g., as seen by a user in an automobile or a user on a street). A panoramic image of a scene at street level may be an image showing a location at street level. For example, a panoramic image of a scene at street level may be a GOOGLE STREET VIEW™ image. The panoramic image of a scene at street level may also be, for example, a satellite image zoomed in on a specific location such that street level detail is shown.
[0077] The panoramic image of a scene at street level may also be a three dimensional image. The panoramic image of a scene at street level may include photographic, realistic details of the environment. The panoramic image of a scene at street level may be corrected for the season. The panoramic image of a scene at street level may be a recent photograph taken by a photographic device at the location.
[0078] As another example, assuming the navigation device 200 had a photo camera (not shown) focussed in a direction out of a window, the navigation device may take a photograph (or several photographs) in the direction of travel just before the navigation device 200 reaches a destination and while the user drives along the street. In this way, if the navigation device 200 needs a panoramic image of a scene at street level to display on the navigation device 200, the navigation device 200 may already have the panoramic image of a scene at street level on the navigation device 200.
[0079] As still another example, assuming other navigation devices have a photo camera, as the other navigation devices drive through the location where a vehicle is parked, the other navigation devices take pictures. There is a possibility that at least one of the other navigation devices takes a picture of the vehicle having navigation device 200 in the vehicle. The other devices may know the coordinates of navigation device 200 (e.g., via WiFi or via server 302). The other navigation devices may send the images (e.g., via server 302 or directly via WiFi) to navigation device 200 before navigation device 200 needs the images. In this way, the user sees the user's own vehicle in the image and the image is also corrected for time of year and day (e.g., showing snow or seasonal conditions).
[0080] Memory 230 may obtain and store street level images. The navigation device 200 may be connected to a remote server 302 as discussed above with regard to FIG. 3. While connected to the server 302, the navigation device 200 may download and store panoramic images of a scene at street level. As is known to one skilled in the art, to store all panoramic images of a scene at street level that may be stored on the server 302 in mass data storage unit 312, memory 230 would need to be of a fairly significant size. Example embodiments are directed to a memory 230 of reasonable size. As such, memory 230 may store enough panoramic images of a scene at street level to encompass, for example, a region (e.g., northeast U.S.A.), or a country (e.g., England), or a city (e.g., Amsterdam).
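The storage trade-off described above — caching only one region's images rather than mirroring the server's full set — can be sketched as a small keyed cache. The class, the coordinate-rounding key scheme, and the example coordinates are all assumptions for illustration, not details taken from the patent.

```python
# Illustrative regional cache standing in for memory 230: it holds
# street-level images only for a chosen region, keyed by rounded
# coordinates. The rounding precision is an arbitrary assumption.

class RegionalImageCache:
    def __init__(self, region):
        self.region = region
        self._images = {}

    def _key(self, lat, lon):
        # Round so nearby position fixes hit the same stored image.
        return (round(lat, 4), round(lon, 4))

    def store(self, lat, lon, image):
        self._images[self._key(lat, lon)] = image

    def lookup(self, lat, lon):
        # Returns None when the location lies outside the stored set,
        # in which case a download from server 302 would be needed.
        return self._images.get(self._key(lat, lon))

cache = RegionalImageCache("Amsterdam")
cache.store(52.3662, 4.8952, b"<jpeg bytes>")
print(cache.lookup(52.3662, 4.8952) is not None)  # held locally
print(cache.lookup(51.92, 4.48) is not None)      # outside the region
```

The `None` result is the point at which a device of reasonable memory size would fall back to the server-based retrieval described with regard to FIG. 8.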
[0081] In step S730, the navigation device 200 displays the panoramic image of a scene at street level retrieved in step S725 together with the first instruction determined in step S720. For example, the panoramic image of a scene at street level may be an image of the shopping mall parking lot including the location of the navigation device 200 and the location of the appropriate exit. The first instruction may be an arrow indicating the direction to take from location of the navigation device 200 (e.g., a parking spot) to the exit of the shopping mall parking lot.
[0082] In another example, referring to FIG. 9, the user (and the navigation device 200) may be merging onto the displayed road. The navigation device 200 may not know whether the user is approaching the road from the left or the right, therefore indicating the initial direction of travel may be imprecise. On the display illustrated in FIG. 9 is an arrow (e.g., the first instruction) indicating the correct direction of travel.
[0083] The arrow (e.g., the first instruction) may also be above, below, to the right or to the left of the panoramic image of a scene at street level. The first instruction may also be a voice command. Again referring to FIG. 9, the voice command may be, for example, "head out onto Oak Street; head in the direction such that the white, two story house is on your left and the bank of trees is on your right."
[0084] In step S735, the navigation device 200 detects whether the navigation device 200 is beyond the area of the panoramic image of a scene at street level. If the navigation device 200 is beyond the area of the panoramic image of a scene at street level, the navigation device 200 displays the remainder of the route as is known in the art, in step S740.
[0085] In step S745, the navigation device 200 may detect that the navigation device 200 has reached a transition point. A transition point may be any deviation from the current direction of travel. For example, the user may be travelling in a northerly direction and need to travel east at the next intersection. As another example, the user may be on a highway and need to take the next off-ramp. Still another example is a user approaching a traffic circle, or rotary, travelling in a northerly direction who needs to continue travelling in a northerly direction.
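A heading-based check is one plausible way to detect the transition points just described. The sketch below is an assumption-laden illustration, not the patented method: the 20-degree threshold is arbitrary, and note that a pure heading test would miss the traffic-circle case above, where the direction of travel is ultimately unchanged — a real device would also consult the planned route's manoeuvre list.

```python
# Hedged sketch of transition-point detection (paragraph [0085]):
# flag any point where the upcoming route heading deviates from the
# current heading by more than a threshold.

def bearing_change(prev_heading_deg, next_heading_deg):
    """Smallest absolute angular difference between two headings."""
    diff = (next_heading_deg - prev_heading_deg) % 360.0
    return min(diff, 360.0 - diff)

def is_transition_point(prev_heading_deg, next_heading_deg,
                        threshold_deg=20.0):  # threshold is an assumption
    return bearing_change(prev_heading_deg, next_heading_deg) > threshold_deg

print(is_transition_point(0.0, 90.0))  # north → east: a transition
print(is_transition_point(0.0, 2.0))   # continuing north: not a transition
```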
[0086] The detection of the transition point does not require interaction of the user with the navigation device. The navigation device 200, aided by the GPS system 100 identifying the location of the navigation device 200 and by the planned route, determines if the navigation device 200 has reached a transition point. The details of this detection will be readily understood by one skilled in the art.
[0087] In step S750, the navigation device 200 determines the transition instruction. The transition instruction may be the next instructional step in the planned route calculated in step S715. For example, if the navigation device 200 is approaching a traffic circle, the instruction may be the correct entrance and exit relative to the traffic circle.
[0088] In step S755, the navigation device 200 retrieves a panoramic image of a scene at street level of the transition location from memory 230. In step S760, the navigation device 200 displays the panoramic image of a scene at street level retrieved in step S755 and the transition instruction determined in step S750. Continuing with the traffic circle example described above and referring to FIG. 9, the panoramic image of a scene at street level may be the exit direction off of the traffic circle. The user may be unsure of the appropriate exit if there are several entrances and exits off of the traffic circle. By displaying the image together with the instruction, the user may have more confidence in the correct exit to take.
[0089] Again referring to FIG. 9, the voice command may be, for example, "take the traffic circle to the right; exit onto Oak Street such that the white, two story house is on your left and the bank of trees is on your right."
[0090] Upon the completion of step S760, control moves to step S765. In step S765, the navigation device 200 determines if the navigation device 200 has reached the desired destination received in step S705, using a method known to those skilled in the art. If the navigation device 200 is not at the desired destination, the method loops through steps S735 to S765. If in step S765, the navigation device 200 determines that the navigation device 200 has reached the desired destination, control moves to step S770 and the method ends.
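The loop through steps S735 to S765 can be summarised in a compact sketch. Every function name below is a placeholder for the device's real subsystems (position fixing, image retrieval, display), and the string-based "instructions" and "images" are stand-ins for illustration only.

```python
# Minimal sketch of the guidance loop (steps S735–S765): until the
# destination is reached, pair each transition instruction with a
# street-level image and "display" the pair.

def guidance_loop(transitions, destination, get_position, get_image):
    shown = []
    for transition in transitions:            # S745: next transition point
        if get_position() == destination:     # S765: destination reached?
            break
        instruction = f"turn at {transition}"  # S750: transition instruction
        image = get_image(transition)          # S755: street-level image
        shown.append((instruction, image))     # S760: display together
    return shown

shown = guidance_loop(
    ["Oak Street", "Elm Street"],
    destination="Elm Street",
    get_position=lambda: "en route",                # never "arrives" here
    get_image=lambda loc: f"<image of {loc}>",
)
print(len(shown))  # both transitions produced an enhanced instruction
```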
[0091] According to one example embodiment, a method is directed to displaying enhanced navigation instructions including a panoramic image of a scene at street level (for example, as seen by a user in an automobile or a user on a street), including determining map information for display on an integrated input and display device of a navigation device, based upon a determined route of travel of the navigation device. The method further includes determining a current location of the navigation device, displaying an image of the current location on the integrated input and display device of the navigation device, and displaying an instruction, based on the determined route of travel, in relation to the image and the location determined by the navigation device.
[0092] According to various embodiments of the present application, and as illustrated in FIG. 8, a method for displaying enhanced instructions for a determined route of travel in a navigation device 200 is disclosed. The method includes determining map information for display on an integrated input and display device 240 of a navigation device 200, based upon a determined route of travel of the navigation device 200. The method includes determining a current location of the navigation device 200. The method further includes displaying an image of the current location on the integrated input and display device 240 of the navigation device 200, and displaying an instruction, based on the determined route of travel, in relation to the image.
[0093] Referring to FIG. 8, the navigation device 200 receives in step S805, for example through an input device 220, a desired destination to which a user intends to travel. Once the desired destination is received by the navigation device 200, in step S810, the navigation device 200 determines a current location of the user and/or navigation device 200. Once the current location is determined, in step S815, the navigation device 200 calculates a planned route between the current location and the desired destination.
[0094] In step S820, the first instruction is determined. The first instruction may be the first instructional step in the planned route calculated in step S815. For example, if the navigation device 200 is located in the parking lot of a shopping mall, the first instruction may be directions to the exit of the shopping mall parking lot. The first instruction is often more difficult to comprehend than other types of instructions as the rotation of the vehicle often does not match the direction of view of the navigation device 200. For example, the user may have made a turn of the vehicle that the navigation device 200 did not register during the last event for the navigation device 200 (e.g., parking the vehicle).
[0095] In step S822, the navigation device 200 determines if the current location is associated with the first instruction. This determination may be made by performing a detailed location analysis of the first instruction. For example, the navigation device 200 may develop a list of, for example, street names, crossing street names and/or address numbers associated with the first instruction. The navigation device 200 may also analyze the current location to formulate a similar list including the same data. The two lists may then be compared. If the navigation device 200 determines the current location is not associated with the first instruction, control moves on to step S823 and no enhanced instruction is displayed. For example, if the list associated with the first instruction does not match the list associated with the current location, no enhanced instruction is displayed. At step S823, the planned route information is displayed.
[0096] If, in step S822, the navigation device 200 determines the current location is associated with the first instruction, in step S825, the navigation device 200 downloads a scenic image, for example a panoramic image of a scene at street level, of the current location of the navigation device 200 from server 302.
[0097] A navigation device 200 may communicate with the network and retrieve a panoramic image of a scene at street level as described above with regard to FIG. 3. The network may be one of several different types of networks. For example, a network may be one of a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), a virtual private network (VPN), etc. A network may be one of several different sizes depending on the implementation of a navigation device 200. The network may be a private network, such that its accessibility is limited to a group of users. As an example, a service provider of a navigation device 200 may set up a private network for its subscribers. [0098] Each of the subscribers (a navigation device 200 user) may be able to access panoramic images of a scene at street level stored on one or more servers hosted by the service provider, such as the mass storage device 312. The navigation device 200 may be able to access panoramic images of a scene at street level via the connection to the hosted server(s). Alternatively, the network may be connected through an internet service provider (ISP) using a TCP/IP connection.
[0099] The panoramic images of a scene at street level may be stored in mass data storage 312. The server 302 may also be configured to retrieve the panoramic image of a scene at street level from another server. For example, if a panoramic image of a scene at street level is a GOOGLE STREET VIEW™ image, server 302 may retrieve the image from a GOOGLE™ server. In the alternative, the navigation device 200 may retrieve a GOOGLE STREET VIEW™ image directly from a GOOGLE™ server.
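The retrieval chain in paragraph [0099] — try the service provider's server first, then relay to a third-party image server — can be sketched as a simple fallback lookup. Both "servers" below are plain dictionaries standing in for network calls; the keys and stored values are hypothetical.

```python
# Sketch of image retrieval with fallback (paragraph [0099]): server
# 302's mass data storage 312 is tried first; when the image is not
# held there, the request is relayed to a third-party street-view
# provider. Dictionaries stand in for the actual network requests.

provider_server = {"Oak Street": b"<provider image>"}
third_party_server = {"Oak Street": b"<street view image>",
                      "Elm Street": b"<street view image>"}

def fetch_street_level_image(location):
    image = provider_server.get(location)       # storage 312 lookup
    if image is None:
        image = third_party_server.get(location)  # relayed request
    return image

print(fetch_street_level_image("Elm Street") is not None)  # via fallback
print(fetch_street_level_image("Oak Street") is not None)  # via provider
```

As the paragraph notes, the device could equally query the third-party server directly, which would simply remove the first lookup from this chain.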
[00100] As described above, in step S825, the navigation device 200 downloads a panoramic image of a scene at street level. In step S830, the navigation device 200 generates an image including the panoramic image of a scene at street level and the first instruction.
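Step S830's client-side composition of image and instruction can be shown in outline. This is a deliberately abstract sketch: the combined frame is represented as a dictionary, whereas a real device would rasterise an arrow or symbol onto the bitmap before display.

```python
# Hypothetical sketch of step S830: combine the downloaded panoramic
# image with the first instruction into one displayable frame. The
# dict-based "frame" is a stand-in for actual image compositing.

def compose_enhanced_instruction(panorama, instruction):
    return {"panorama": panorama, "overlay": instruction}

frame = compose_enhanced_instruction(b"<jpeg>", "turn left onto Oak Street")
print(frame["overlay"])
```

In the server-side alternative of step S835, the same composition would happen on server 302, and the device would download the already-combined frame.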
[00101] In an alternative to the combination of steps S825 and S830 above, the server 302 receives and incorporates the first instruction into the panoramic image of a scene at street level. In step S835, the navigation device 200 downloads an image from server 302. The image includes both the panoramic image of a scene at street level and the first instruction.
[0100] In step S840, the navigation device 200 displays the panoramic image of a scene at street level generated in step S830 (or in S835) together with the first instruction determined in step S820 on the display of the navigation device 200.
[0101] In step S845, the navigation device 200 detects whether the navigation device 200 is beyond the area of the panoramic image of a scene at street level. If the navigation device 200 is beyond the area of the panoramic image of a scene at street level, in step S860 the navigation device 200 displays the remainder of the route as is known in the art.
[0102] In step S855, the navigation device 200 may detect that the navigation device 200 has reached a transition point. In step S860, the navigation device 200 determines the transition instruction. The transition instruction may be the next instructional step in the planned route calculated in step S815.
[0103] In step S865, the navigation device 200 downloads a panoramic image of a scene at street level of the transition location of the navigation device 200 from server 302.
[0104] In step S870, the navigation device 200 generates an image including the panoramic image of a scene at street level and the transition point instruction.
[0105] In an alternative to the combination of steps S865 and S870 above, the server 302 receives and incorporates the transition point instruction into the panoramic image of a scene at street level. In step S875, the navigation device 200 downloads an image from server 302. The image includes both the panoramic image of a scene at street level and the transition point instruction.
[0106] In step S880, the navigation device 200 displays the panoramic image of a scene at street level generated in step S870 (or in S875) together with the transition instruction determined in step S860.
[0107] Upon the completion of step S880, control moves to step S885. In step S885, the navigation device 200 determines, using a method known to those skilled in the art, if the navigation device 200 has reached the desired destination received in step S805. If the navigation device 200 is not at the desired destination, the method loops through steps S845 to S885. If in step S885, the navigation device 200 determines that the navigation device 200 has reached the desired destination, control moves to step S890 and the method ends.
[0108] Another example embodiment of the present disclosure is directed to a navigation device. The navigation device displays enhanced navigation instructions including a panoramic image of a scene at street level (for example, as seen by a user in an automobile or a user on a street). The navigation device includes an input device to receive at least one input indicating a desired destination. The navigation device includes a processor operably coupled to the input device.
[0109] The processor is configured to calculate a planned route between a current location and the desired destination, to determine the current location (and direction of travel), and to obtain an image of the current location. The image is a street level view of a location, and the processor is further configured to overlay a first instruction on the image. The navigation device includes a display device controllable by the processor and configured to display the planned route and the image including the first instruction.
[0110] It should be noted that some aspects of the present application have been described with regard to at least one method of the present application. However, at least one embodiment of the present application is directed to a navigation device 200 including a memory resource 230 and a display device 240 for displaying the map information, wherein the display device 240 may be part of an integrated input and display device 240. However, example embodiments are not limited to such a navigation device 200. Other devices, e.g., a navigation device built into a vehicle, or a computing resource (such as a desktop or portable personal computer (PC), mobile telephone or portable digital assistant (PDA) executing route planning and navigation software) may be used to perform various aspects of the method described above with regard to FIG. 7 and FIG. 8, as would be understood by one of ordinary skill in the art. Further explanation is, thus, omitted for the sake of brevity.
[0111] Another example embodiment of the present disclosure is directed to computer software. The computer software includes one or more software modules operable, when executed in an execution environment, to cause a processor to determine map information for display on an integrated input and display device 240 of a navigation device 200, based upon a determined route of travel of the navigation device 200. The processor determines a current location of the navigation device 200 and displays, without additional user input, an image of a scene of the current location on the integrated input and display device 240 of the navigation device 200. The processor displays an instruction, based on the determined route of travel, in relation to the image of the scene.
[0112] At least some embodiments can be implemented as a computer program product for use with a computer system, the computer program product being, for example, a series of computer instructions or program segments stored on a tangible data recording medium (computer readable medium), such as a diskette, CD-ROM, ROM, or fixed disk, or embodied in a computer data signal, the signal being transmitted over a tangible medium or a wireless medium, for example, microwave or infrared. The series of computer instructions or program segments can constitute all or part of the functionality of the method of embodiments described above, and can also be stored in any memory device, volatile or non-volatile, such as semiconductor, magnetic, optical or other memory device.
[0113] It will also be appreciated that various aspects and embodiments of the present disclosure have heretofore been described. The scope of the present disclosure is not limited to the particular arrangements set out herein and instead extends to encompass all arrangements, and modifications and alterations thereto, which fall within the scope of the appended claims.
[0114] For example, while embodiments described in the foregoing detailed description refer to GPS, it should be noted that the navigation device 200 may utilise any kind of position sensing technology as an alternative to (or in addition to) GPS. For example, the navigation device 200 may utilise other global navigation satellite systems such as the European Galileo system. Equally, the navigation device 200 is not limited to satellite-based systems but could readily function using ground-based beacons or any other kind of system that enables the device to determine its geographic location.
[0115] It will also be well understood by persons of ordinary skill in the art that the example embodiments implement certain functionality by means of software. That functionality could equally be implemented solely in hardware (for example by way of one or more ASICs (application specific integrated circuits)) or indeed by a mix of hardware and software. As such, the scope of the present disclosure should not be interpreted as being limited only to being implemented in software.
[0116] Lastly, it should also be noted that while the accompanying claims set out particular combinations of features described herein, the scope of the present disclosure is not limited to the particular combinations hereafter claimed, but instead extends to encompass any combination of features or embodiments herein disclosed irrespective of whether or not that particular combination has been specifically enumerated in the accompanying claims at this time.

Claims

1. A method, comprising:
determining map information for display on an integrated input and display device (240) of a navigation device (200), based upon a determined route of travel of the navigation device (200);
determining a current location of the navigation device (200);
displaying, without additional user input, an image of a scene associated with the current location on the integrated input and display device (240) of the navigation device (200); and
displaying an instruction, based on the determined route of travel, in relation to the image.
2. The method of claim 1, wherein the image of the scene is a street level view of a location.
3. The method as claimed in any of claims 1 or 2, wherein the image of the scene is a three dimensional image.
4. The method as claimed in any of claims 1, 2 or 3, wherein
the instruction is a first instruction, the first instruction being the first step in the determined route,
the instruction informs a user of a correct direction of travel along the route, and
the instruction is at least one of a voice command instruction, a symbol on the integrated input and display device (240) of the navigation device (200) and an arrow on the integrated input and display device (240) of the navigation device (200).
5. The method of claim 4, wherein at least one of the symbol and the arrow is displayed at least one of on the image of the scene, above the image of the scene, to the side of the image of the scene and below the image of the scene.
6. The method as claimed in any of claims 1, 2 or 3, wherein the image of the scene is obtained by the navigation device (200) and stored in a memory (230) of the navigation device (200).
7. The method as claimed in any of claims 1, 2 or 3, wherein the image of the scene is downloaded by the navigation device (200) from a server (302) including a storage device (312).
8. The method as claimed in any of claims 1-7, wherein the image of the scene includes the instruction when the image is downloaded by the navigation device (200).
9. The method as claimed in any of claims 1-3, 6 or 7, wherein the navigation device (200) adds the instruction to the image of the scene.
10. The method as claimed in any of claims 1-9, wherein the image of the scene is a Google Street View image.
11. The method as claimed in any of claims 1-10, further comprising displaying a remainder of the determined route after the navigation device (200) travels beyond an area corresponding to the image of the scene.
12. The method as claimed in any of claims 1-11, further comprising:
monitoring travel of the navigation device (200) along the determined route, and
displaying on the integrated input and display device (240) of the navigation device (200) a new image of the scene and a new instruction for any transition point, the transition point being any point along the determined route where a direction is changed.
13. The method as claimed in any of claims 1-12, further comprising displaying, on the image of the scene, a visual indicator of the current position of the navigation device (200).
14. A computer readable medium (230) including program segments for, when executed on a processor (201) of a navigation device (200), causing the navigation device (200) to implement the method of any of claims 1 to 13.
15. The method as claimed in any of claims 1-14, wherein the navigation device (200) is any one of a personal navigation device (200), a mobile phone, a personal digital assistant and an in vehicle device.
16. Computer software comprising one or more software modules operable, when executed in an execution environment, to cause a processor (201) to:
determine map information for display on an integrated input and display device (240) of a navigation device (200), based upon a determined route of travel of the navigation device (200);
determine a current location of the navigation device (200);
display, without additional user input, an image of a scene of the current location on the integrated input and display device (240) of the navigation device (200); and
display an instruction, based on the determined route of travel, in relation to the image of the scene.
17. The computer software of claim 16, wherein
the image of the scene is at least one of a street level view of a location and a three dimensional image.
18. The computer software as claimed in any of claims 16 or 17, wherein
the instruction is a first instruction, the first instruction being the first step in the determined route,
the instruction informs a user of a correct direction of travel along the route, and
the instruction is at least one of a voice command instruction, a symbol on the integrated input and display device (240) of the navigation device (200) and an arrow on the integrated input and display device (240) of the navigation device (200).
19. The computer software of claim 18, wherein at least one of the symbol and the arrow is displayed at least one of on the image of the scene, above the image of the scene, to the side of the image of the scene and below the image of the scene.
20. The computer software as claimed in any of claims 16-19, wherein the image of the scene is downloaded by the navigation device (200) from a server (302) including a storage device (312).
21. The computer software as claimed in claim 20, wherein the image of the scene includes the instruction when the image is downloaded by the navigation device (200).
22. The computer software as claimed in any of claims 16-21, wherein the image of the scene is a Google Street View image.
23. The computer software as claimed in any of claims 16-22, further comprising displaying a remainder of the determined route after the navigation device (200) travels beyond an area corresponding to the image of the scene.
24. The computer software as claimed in any of claims 16-23, further comprising:
monitoring travel of the navigation device (200) along the determined route, and
displaying on the integrated input and display device (240) of the navigation device (200) a new image of the scene and a new instruction for any transition point, the transition point being any point along the determined route where a direction is changed.
25. The computer software as claimed in any of claims 16-24, further comprising displaying, on the image of the scene, a visual indicator of the current position of the navigation device (200).
26. A navigation device (200) comprising:
an input device (220) to receive at least one input indicating a desired destination;
a processor (201) operably coupled to the input device (220) configured to calculate a planned route between a current location and the desired destination, the processor (201) configured to determine the current location, the processor (201) configured to obtain, without additional user input, an image of a scene of the current location, the image of the scene being a street level view of a location, and the processor (201) configured to overlay a first instruction on the image of the scene; and
a display device (240) controllable by the processor (201) and configured to display the planned route and the image of the scene including the first instruction.
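By way of illustration only (this sketch is not part of the filed disclosure), the guidance behaviour recited in claims 12, 13, 24 and 25, in which a new scene image and a new instruction are displayed whenever the device reaches a transition point along the determined route, might be modelled in Python as follows. The names TransitionPoint, guidance_update and fetch_scene_image, and the fixed proximity radius, are hypothetical choices made for this sketch:

```python
from dataclasses import dataclass

@dataclass
class TransitionPoint:
    """A point along the determined route where the direction of travel changes."""
    location: tuple   # (latitude, longitude)
    instruction: str  # e.g. "Turn left onto Damrak"

def guidance_update(current_location, transitions, fetch_scene_image, radius=0.0005):
    """If the device is within `radius` degrees of a transition point, return a
    (scene_image, instruction) pair for display; otherwise return None so the
    ordinary map view remains on screen."""
    lat, lon = current_location
    for t in transitions:
        if abs(lat - t.location[0]) <= radius and abs(lon - t.location[1]) <= radius:
            # The image may come from local memory (claim 6) or be
            # downloaded from a server (claim 7); the callback hides which.
            return fetch_scene_image(t.location), t.instruction
    return None

# Usage with a stubbed image source standing in for a street-level imagery service:
route = [
    TransitionPoint((52.3702, 4.8952), "Turn left onto Damrak"),
    TransitionPoint((52.3731, 4.8926), "Turn right onto Prins Hendrikkade"),
]
result = guidance_update((52.3702, 4.8952), route,
                         fetch_scene_image=lambda loc: f"street-view@{loc}")
```

In this sketch the overlay of the instruction onto the image (claim 9) is left to the display layer; the function only pairs the two, mirroring the claimed separation between obtaining the scene image and displaying the instruction in relation to it.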
PCT/EP2010/058235 2010-06-11 2010-06-11 Navigation device and method having enhanced instruction including a panoramic image of a scene WO2011154050A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2010/058235 WO2011154050A1 (en) 2010-06-11 2010-06-11 Navigation device and method having enhanced instruction including a panoramic image of a scene

Publications (1)

Publication Number Publication Date
WO2011154050A1 (en) 2011-12-15

Family

ID=43495070

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2010/058235 WO2011154050A1 (en) 2010-06-11 2010-06-11 Navigation device and method having enhanced instruction including a panoramic image of a scene

Country Status (1)

Country Link
WO (1) WO2011154050A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09292830A (en) * 1996-04-25 1997-11-11 Hitachi Ltd Method and device for electronic map display
JPH11337360A (en) * 1998-05-27 1999-12-10 Fujitsu Ten Ltd Route guide apparatus
WO2006015892A1 (en) * 2004-08-05 2006-02-16 Robert Bosch Gmbh Method for the representation of navigation tips and associated navigation device
WO2007042846A1 (en) * 2005-10-11 2007-04-19 Kovacs Zoltan Navigation method and device to implement the method

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104180814A (en) * 2013-05-22 2014-12-03 北京百度网讯科技有限公司 Navigation method in live-action function on mobile terminal, and electronic map client
CN105466436A (en) * 2014-09-11 2016-04-06 苗码信息科技(上海)股份有限公司 Vehicle system for on-site automatic navigation and driving through nature foreign language text
US9417087B1 (en) 2015-02-06 2016-08-16 Volkswagen Ag Interactive 3D navigation system
US9803993B2 (en) 2015-02-06 2017-10-31 Volkswagen Ag Interactive 3D navigation system
US9702722B2 (en) 2015-09-26 2017-07-11 Volkswagen Ag Interactive 3D navigation system with 3D helicopter view at destination
EP3572772A4 (en) * 2017-01-19 2021-01-20 Clarion Co., Ltd. Navigation system, computer program product, and onboard device

Similar Documents

Publication Publication Date Title
US9739633B2 (en) Navigation device and method
US8244454B2 (en) Navigation device and method
US8706403B2 (en) Systems and methods for detecting bifurcations
US20080228393A1 (en) Navigation device and method
US20110125398A1 (en) Navigation apparatus, server apparatus and method of providing point of interest data
EP2646781B1 (en) Navigation methods and systems
US20160054137A1 (en) Navigation device with enhanced widgets and applications
EP2223045B1 (en) Navigation device and corresponding method
US8606502B2 (en) Navigation device and method
US8886455B2 (en) Navigation apparatus, audible instruction generation system and method of generating audible instructions
WO2011154050A1 (en) Navigation device and method having enhanced instruction including a panoramic image of a scene
WO2013037852A2 (en) Navigation method and apparatus for selecting a destination
US20110098913A1 (en) Navigation device and method for determining a route of travel
WO2010081538A2 (en) Navigation device & method
WO2009132679A1 (en) Navigation device & method
TW201027035A (en) Personal navigation system

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 10726471

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 10726471

Country of ref document: EP

Kind code of ref document: A1