EP2946172A1 - User interface for displaying predicted destinations - Google Patents

User interface for displaying predicted destinations

Info

Publication number
EP2946172A1
Authority
EP
European Patent Office
Prior art keywords
vehicle
user
mobile device
map
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP14725931.1A
Other languages
German (de)
French (fr)
Inventor
Emanuele Vulcano
Bradford A. MOORE
Jorge Fino
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US14/081,911 (US10655979B2)
Application filed by Apple Inc
Publication of EP2946172A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01C21/36: Input/output arrangements for on-board computers
    • G01C21/3605: Destination input or retrieval
    • G01C21/3608: Destination input or retrieval using speech input, e.g. using speech recognition
    • G01C21/3617: Destination input or retrieval using user history, behaviour, conditions or preferences, e.g. predicted or inferred from previous use or current movement
    • G01C21/362: Destination input or retrieval received from an external device or application, e.g. PDA, mobile phone or calendar application
    • G01C21/3626: Details of the output of route guidance instructions
    • G01C21/3661: Guidance output on an external device, e.g. car radio
    • G01C21/3688: Systems comprising multiple parts or multiple output devices (not client-server), e.g. detachable faceplates, key fobs or multiple output screens

Definitions

  • Portable media devices such as smartphones have the capability to run advanced mapping and navigation applications (e.g., Apple Maps®, which operates on the iPhone®, iPad®, and iPad Mini®).
  • mapping and navigation applications include turn-by-turn navigation features, which can be helpful while driving.
  • interacting with the mapping and navigation application while driving may be difficult due to the small size of many mobile devices, and therefore the small size of the various controls on the mobile devices.
  • in-car navigation systems operate independently of any of the driver's other devices, and offer a larger and conveniently positioned screen.
  • these in-car navigation systems generally provide a more limited experience than the more robust mapping applications of the mobile device due to the inherent limitations of the vehicle.
  • Some embodiments of the invention provide an application that generates multiple user interfaces for display on multiple devices at the same time.
  • the application is an integrated mapping and navigation application that runs on a mobile device (e.g., a smart phone, tablet computer, media player, etc.) and generates both (i) a user interface for display on the mobile device and (ii) a user interface for display on a screen of a vehicle to which the mobile device connects.
  • the integrated mapping and navigation application (referred to below as a mapping application) generates both user interfaces simultaneously for simultaneous output and display.
  • mapping application of some embodiments generates different user interfaces for display on the screens of different types of vehicles. Some embodiments generate different user interfaces for each different individual vehicle. On the other hand, some embodiments generate different user interfaces for categories of vehicle screens, such as high-quality touchscreens, low-quality touchscreens, and non-touch screens (with which a user interacts via separate controls built into the vehicle).
  • the mapping application of some embodiments when connected to a vehicle, identifies the type of display screen built into the vehicle, and automatically outputs the correct user interface for the vehicle.
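For illustration only, the following minimal Swift sketch shows one way an application could map the screen categories described above to dedicated user interface generators. The type and function names are hypothetical and are not taken from the patent.

```swift
import Foundation

// Hypothetical screen categories corresponding to those described above.
enum VehicleScreenCategory {
    case highQualityTouch   // fine-grained touch, swipe/drag gestures supported
    case lowQualityTouch    // tap-only or low-resolution touch input
    case nonTouch           // interaction via knobs, buttons, joystick, etc.
}

// Each category gets its own user interface generator.
protocol VehicleUIGenerator {
    func makeUserInterface() -> String   // placeholder for a rendered UI description
}

struct HighQualityTouchUI: VehicleUIGenerator {
    func makeUserInterface() -> String { "UI with swipe scrolling and dense controls" }
}
struct LowQualityTouchUI: VehicleUIGenerator {
    func makeUserInterface() -> String { "UI with tap-selectable scroll arrows" }
}
struct NonTouchUI: VehicleUIGenerator {
    func makeUserInterface() -> String { "UI driven by built-in knobs and buttons" }
}

// Selects the generator that matches the screen category the vehicle reports.
func uiGenerator(for category: VehicleScreenCategory) -> VehicleUIGenerator {
    switch category {
    case .highQualityTouch: return HighQualityTouchUI()
    case .lowQualityTouch:  return LowQualityTouchUI()
    case .nonTouch:         return NonTouchUI()
    }
}
```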
  • a user of the mapping application may interact with the application via the vehicle interface as well as the mobile device interface (a touchscreen interface in some embodiments). Because of the different capabilities of the different interfaces, as well as the differences in likely user behavior for interactions with the different interfaces, the same operation or type of operation may be implemented differently between the mobile device interface and the vehicle interface. For instance, the mobile device may have the capability to interpret multi-touch gestures (e.g., a pinch gesture to zoom in or out), whereas the vehicle interface may not have multi-touch capability (or any touch capability), and therefore requires different user interaction to zoom (e.g., selection of zoom in and zoom out buttons, either on the touchscreen or the vehicle interface).
  • a user may interact differently with the application user interfaces displayed on high-quality touchscreens, low-quality touchscreens, and non-touchscreens.
  • the interaction for scrolling through a map on a vehicle touchscreen may involve a swiping gesture similar to that used for scrolling through the map on a mobile device.
  • a low-quality touchscreen may not have the ability to interpret such gestural input, and therefore the user interface for low-quality touchscreens includes selectable (e.g., via a tap input) arrows for scrolling in different directions.
  • the non-touchscreen vehicle interface will require input through other controls (e.g., a joystick, buttons, etc.) that are built into the vehicle.
  • the vehicle interface output by the mapping application provides additional features in some embodiments.
  • the vehicle screen interface for the mapping application is geared towards identifying a destination for a user and entering a navigation mode for a route to that destination, with as little user interaction as possible (because the user is often the driver).
  • a user (e.g., the driver of the vehicle, a passenger of the vehicle, etc.) may search for a specific address, a specific place name, a generic type of place name, etc.
  • the user searches through the vehicle interface via voice interaction (i.e., dictating a search into a microphone of either the mobile device or the vehicle).
  • the user can scroll through these results in the vehicle interface (through touchscreen interactions, built-in vehicle control interactions, etc.), and choose to enter a navigation mode with a search result as a destination.
  • the mapping application of some embodiments offers a predictive routing feature accessible through the vehicle user interface. While driving, the user can select an option to enter the predictive routing mode, in which the mapping application presents various likely routes to the user for navigation.
  • the mapping application may generate the likely routes based on a variety of factors, including upcoming appointments or events on a calendar or other scheduling application that runs on the mobile device, or analysis of routes taken in the past by the mobile device (e.g., a route from a user's home to work).
  • the predictive routing feature may additionally factor in traffic to identify potential difficulties in a usual route or in reaching a location on time.
  • the mapping application of some embodiments also offers a recent locations feature that provides a user with recent destinations, results of recent searches, etc. Some embodiments provide search results exclusively from recent searches entered or destinations navigated to through the vehicle interface. On the other hand, some embodiments additionally include search results from recent searches made through the device, even before the connection of the device to the vehicle interface. Thus, if a user searches for a particular destination on her mobile device while walking to her car, then enters the car and connects her device to the car interface, the particular destination will appear as a recent and easily selectable search, without requiring the user to re-enter the search.
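As an illustrative sketch (with hypothetical types rather than the application's actual code), a recents list of this kind could merge entries originating on the mobile device with entries made through the vehicle interface and order them by recency:

```swift
import Foundation

// Hypothetical record of a recent search or destination.
struct RecentDestination {
    enum Source { case mobileDevice, vehicleInterface }
    let name: String
    let source: Source
    let timestamp: Date
}

// Returns recents for the vehicle UI. Depending on the embodiment, the list may be
// limited to vehicle-originated entries or may also include searches made on the
// device before it was connected to the vehicle.
func recentsForVehicleUI(_ all: [RecentDestination],
                         includeDeviceSearches: Bool) -> [RecentDestination] {
    all.filter { includeDeviceSearches || $0.source == .vehicleInterface }
       .sorted { $0.timestamp > $1.timestamp }
}
```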
  • the mapping application enters a turn-by-turn navigation mode in some embodiments.
  • some embodiments output different displays to the vehicle display and the mobile device display.
  • the vehicle display in some embodiments, displays the user's location and the upcoming route, in either a two dimensional mode or a three dimensional mode.
  • the application of some embodiments generates this display from a perspective rendering position within a three dimensional navigation scene, though the view may be shown from directly above the scene so as to render a two dimensional view.
  • the user can interact with the vehicle user interface to, e.g., view a list of maneuvers to make for the route (e.g., a right turn onto a particular street), change between two and three dimensions, and other interactions.
  • a portion of the vehicle screen displays a representation for the maneuver (e.g., an intersection with an arrow that represents the vehicle's path through the intersection, as well as text directions for the maneuver).
  • Figure 1 conceptually illustrates a mapping and navigation application that generates multiple user interfaces simultaneously.
  • Figure 2 illustrates an example of a mobile device connected to the interface of a vehicle system.
  • Figure 3 conceptually illustrates a simplified software architecture for a mapping and navigation application of some embodiments
  • Figure 4 conceptually illustrates a process of some embodiments performed by the mapping application to output a vehicle user interface.
  • Figure 5 illustrates a first example of a high-quality touchscreen user interface, a second example of a low-quality touchscreen user interface, and a third example of a non-touch user interface.
  • Figure 6 illustrates an additional difference between the high-quality touchscreen interface and the low-quality touchscreen interface of the mapping application in some embodiments.
  • Figure 7 illustrates the high-quality touchscreen vehicle user interface of some embodiments in which the user navigates through various features of the interface, including the recent destinations feature, the bookmarked destinations feature, and the predictive routing feature.
  • Figure 8 illustrates the use of the search function of some embodiments to identify a destination and then enter a navigation mode.
  • Figure 9 illustrates the use of zoom and scroll functionality as well as the position control to re-center the displayed map region on the device's current location in the high-quality touchscreen interface of some embodiments.
  • Figure 10 illustrates a user exploring a map in both the mapping application interface displayed on a mobile device as well as the high-quality vehicle touchscreen interface displayed on the screen of a vehicle to which the mobile device connects.
  • Figure 11 illustrates various map exploration functions, including zooming, scrolling, etc. in the low-quality touchscreen interface of some embodiments.
  • Figure 12 conceptually illustrates a region representing the map view area of the low-quality touchscreen user interface.
  • Figure 13 conceptually illustrates a process performed by the mapping application of some embodiments in order to translate a selection input into a scroll of the map display for a low-quality touchscreen vehicle interface.
  • Figure 14 illustrates an alternative low-quality touchscreen interface of some embodiments.
  • Figure 15 illustrates a feature of the scroll arrows of the low-quality touchscreen interface of some embodiments, in which, as a user holds down a touch input for an extended period of time over one of the scroll arrows, the map scrolling accelerates.
  • Figure 16 illustrates the use of scroll arrows to navigate through a list of recents in the low-quality touchscreen interface of some embodiments.
  • Figure 17 illustrates the operation of such a non-touchscreen vehicle user interface according to some embodiments.
  • Figure 18 illustrates an example of map exploration in a non-touchscreen vehicle user interface of some embodiments.
  • Figures 19 and 20 illustrate the search feature of some embodiments in different cases of ambiguous search terms.
  • Figure 21 illustrates an example of such a search that produces multiple results.
  • Figure 22 illustrates the operation of the mapping application search feature for the non-touchscreen vehicle user interface of some embodiments.
  • Figure 23 conceptually illustrates a process performed by the mapping application of some embodiments in order to retrieve search results for a search dictated through the vehicle user interface.
  • Figure 24 illustrates an example of a user performing a search within the mapping application mobile device interface while the mobile device is connected to a vehicle interface.
  • Figure 25 illustrates the mapping application of some embodiments in which a user performs a search on the mobile device before connecting the device to the vehicle system, then accesses the recent destinations on the vehicle interface of the mapping application which now include the previously-searched destination.
  • Figure 26 illustrates the mapping application of some embodiments in which a user performs a search on the mobile device before connecting the device to the vehicle system, then opens the mapping application on the vehicle interface.
  • Figure 27 illustrates three stages of a mapping application in which a user adds a location to the mapping application from a web browser on the mobile device, which adds the location to the mapping application table of recent destinations.
  • Figure 28 illustrates a recent destinations page of some embodiments as the vehicle displaying the recent destinations page travels over a stretch of road.
  • Figure 29 illustrates a user utilizing the recents feature on a mobile device of some embodiments in order to select a destination.
  • Figure 30 illustrates a user utilizing the recents feature on the vehicle interface of some embodiments in order to select a destination.
  • Figure 31 illustrates the use of this predictive routing interface on the high-quality vehicle touchscreen display of some embodiments in which the user activates the interface and views several different predicted destinations.
  • Figure 32 illustrates the same scenario as Figure 31, displayed in the low-quality touchscreen interface of some embodiments.
  • Figure 33 illustrates an example of the predictive routing engine of some embodiments reordering the likelihood of various destinations, and the subsequent effects in the predictive routing interface.
  • Figure 34 illustrates another example of changes made by the predictive routing engine to the order of likely destinations.
  • Figure 35 illustrates the animation of sliding a conceptual filmstrip with different predicted routes through the display screen.
  • Figure 36 illustrates the animation between two predicted route displays when the displays are arranged conceptually as stacked cards.
  • Figure 37 illustrates a feature of the predictive routing interface of some embodiments for entering the turn-by-turn navigation mode of the mapping application.
  • Figure 38 conceptually illustrates a simplified software architecture of a mapping and navigation application of some embodiments that operates on a mobile device and performs predictive routing.
  • Figure 39 illustrates the vehicle display screen over four stages in which a user activates a messaging function and dictates a message to a recipient.
  • Figure 40 illustrates the operation of the turn-by-turn navigation user interface of some embodiments for the vehicle display.
  • Figure 41 illustrates the animation of the direction list in the vehicle interface of some embodiments.
  • Figure 42 illustrates a user accessing an overview mode in the vehicle user interface.
  • Figure 43 illustrates the display of the mobile device during navigation according to some embodiments.
  • Figures 44 and 45 illustrate both the mobile device and vehicle UIs of some embodiments in which the vehicle approaches a first maneuver with a second maneuver following in rapid succession.
  • Figure 46 illustrates a use of the device interface to request a route from the device's current location to a destination location, then the subsequent selection to enter navigation.
  • Figure 47 illustrates the display and use of controls on the mobile device during navigation to enter a route overview display.
  • Figure 48 illustrates a device in power-saving mode as well as the vehicle display screen to which the device is connected during turn-by-turn navigation.
  • Figure 49 illustrates the home pages of the mobile device and vehicle displays of some embodiments while turn-by-turn navigation mode is operational and the vehicle approaches a maneuver.
  • Figure 50 illustrates the display outside the mapping application with turn-by-turn navigation operating according to some embodiments.
  • Figure 51 illustrates a situation in which the mapping application of some embodiments in turn-by-turn navigation mode is open on the vehicle display while the mobile device displays its home page.
  • Figure 52 illustrates an example of an architecture of a mobile computing device.
  • Figure 53 conceptually illustrates an example of an electronic system with which some embodiments of the invention are implemented.
  • Figure 54 illustrates a map service operating environment according to some embodiments.
  • Some embodiments of the invention provide an application that generates multiple user interfaces for display on multiple devices at the same time.
  • the application is an integrated mapping and navigation application that runs on a mobile device (e.g., a smart phone, tablet computer, media player, etc.) and generates both (i) a user interface for display on the mobile device and (ii) a user interface for display on a screen of a vehicle to which the mobile device connects.
  • the integrated mapping and navigation application (referred to below as a mapping application) generates both user interfaces simultaneously for simultaneous output and display.
  • Figure 1 conceptually illustrates such a mapping and navigation application 100 that generates multiple user interfaces simultaneously.
  • the mapping and navigation application 100 operates on a mobile device such as a smart phone or tablet device (e.g., an iPhone®, iPad®, etc.).
  • the mapping and navigation application 100 outputs a first user interface 105 to a local display module 110 of the mobile device and a second user interface 115 to a communication interface 120 with a vehicle.
  • the local display module 110 of some embodiments handles display of the first user interface 105 on a display screen of the mobile device on which the mapping application 100 operates.
  • a user of the mapping application may interact with the mobile device user interface 105 (e.g., through touchscreen interactions).
  • the communication interface 120 to the vehicle is an interface for a wired (e.g., USB, etc.) or wireless (e.g., Bluetooth®, Wi-Fi, etc.) connection with a vehicle electronic system.
  • the vehicle may be a car, truck, boat, or other motor vehicle with a display screen and ability to connect to the mobile device.
  • the vehicle electronic system of some embodiments includes a display screen through which the second user interface 115 is displayed.
  • the user may also interact with the vehicle user interface 115 through various vehicle controls. These controls may include a touchscreen and/or various hardware controls (e.g., buttons, knobs, joysticks) built into the vehicle.
  • in some embodiments, the connection to the vehicle involves the mobile device generating a graphical user interface (or user interface) for the vehicle display screen.
  • the graphical display may be sent as a bitmap in some embodiments, a different format that requires translation into a displayed image by the vehicle system, or a combination of bitmaps and different formats.
  • This display screen, which may be a touchscreen, makes up the vehicle user interface, possibly along with other controls (e.g., joystick, knob, built-in buttons, trackpad, trackball, etc.). These various controls receive inputs and send signals to the mobile device indicating the inputs performed by the user.
  • the mapping application on the mobile device (or the operating system, which in some embodiments translates the signals from the vehicle) is configured to recognize these inputs and perform various actions to modify the mapping application graphical display on the vehicle screen, the mobile device screen, or both.
  • the interface of some embodiments for a mobile device to communicate with the vehicle in this way is described in further detail in U.S. Patent Publications 2009/0284476, 2010/0293462, 2011/0145863, 2011/0246891, and 2011/0265003, which are incorporated herein by reference.
  • this interface of some embodiments is described in further detail in the following applications: U.S. Provisional Patent Application 61/832,841, entitled "Device and Method for Generating User Interfaces from a Template"; U.S. Provisional Patent Application 61/832,842, entitled "Device, Method, and Graphical User Interface for Synchronizing Two or More Displays"; and U.S. Patent Application 13/913,428, entitled "Application Gateway for Providing Different User Interfaces for Limited Distraction and Non-Limited Distraction Contexts".
  • U.S. Applications 61/832,841, 61/832,842, and 13/913,428 are incorporated herein by reference.
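The sketch below illustrates, under assumed names and deliberately simplified types, how an application of this kind might output a rendered interface both to the local display and to the vehicle communication interface while receiving input signals back from the vehicle. It is not the implementation described in the incorporated references.

```swift
import Foundation

// Hypothetical sinks for the two simultaneously generated user interfaces.
protocol DisplaySink {
    func present(frame: [UInt8])   // e.g., a rendered bitmap or encoded frame
}

struct LocalDisplay: DisplaySink {
    func present(frame: [UInt8]) { /* draw on the device's own screen */ }
}

struct VehicleLink: DisplaySink {
    // Wired (e.g., USB) or wireless (e.g., Bluetooth, Wi-Fi) channel to the vehicle.
    func present(frame: [UInt8]) { /* send the frame to the vehicle head unit */ }
}

// Inputs reported back by the vehicle system.
enum VehicleInput {
    case touch(x: Double, y: Double)
    case button(id: String)
    case knob(delta: Int)
}

final class MappingApplication {
    let local: DisplaySink = LocalDisplay()
    let vehicle: DisplaySink = VehicleLink()

    // Both interfaces are generated and output each frame.
    func renderFrame() {
        local.present(frame: renderMobileUI())
        vehicle.present(frame: renderVehicleUI())
    }

    // Vehicle inputs are translated into actions on the shared map state.
    func handle(_ input: VehicleInput) {
        switch input {
        case .touch(let x, let y): print("tap at (\(x), \(y))")
        case .button(let id):      print("button \(id) pressed")
        case .knob(let delta):     print("knob turned by \(delta)")
        }
    }

    private func renderMobileUI() -> [UInt8] { [] }
    private func renderVehicleUI() -> [UInt8] { [] }
}
```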
  • Figure 2 illustrates an example of a mobile device 200 connected to the interface of a vehicle system.
  • a mapping application operates on the mobile device 200, and outputs both a first user interface 205 for the mobile device display screen and a second user interface 210 for the vehicle dashboard display screen 215.
  • the figure illustrates the interior of a vehicle 250, in which the mobile device 200 connects via a wired connection to the vehicle, and outputs a user interface for display on a dashboard screen 215 of the vehicle.
  • while this example, as well as others to follow, illustrates a single display screen in the vehicle, some vehicles include multiple screens (e.g., a center dashboard console screen as well as a screen directly in front of the driver).
  • Some embodiments output only a single user interface to the center dashboard screen in this case, while other embodiments output the same user interface to multiple screens, and yet other embodiments output different interfaces to the different screens.
  • the figure also illustrates a blown-up view of the mobile device 200 and the dashboard screen 215. As shown, both of these display a map of the same location, but within the context of different user interfaces.
  • the mobile device interface 205 of some embodiments includes several user interface (UI) controls arranged in a top bar 220 and as floating controls (although other embodiments do not include floating controls and instead include these controls within either the top bar or a second control bar).
  • the floating controls of some embodiments include a position control 225 and a 3D control 230, while the top bar 220 includes a direction control 235, a search field 240, and a bookmark control 245.
  • the direction control 235 opens a page through which a user can request the mapping application to identify a route (e.g., by accessing a mapping and routing server) between a starting location and an ending location.
  • upon receiving a selection (e.g., through a tap gesture) of the direction control 235, the mapping application presents a page allowing the user to select a starting location (e.g., the user's current location or a different location) and an ending location for a route.
  • when a route has been identified that starts at the current location of the device (e.g., through the direction control 235 or through a different aspect of the UI 205), some embodiments enable the user to enter a turn-by-turn navigation mode of the mapping application by selecting a UI control.
  • the mapping application presents a view along the selected route, while also providing instructions for upcoming maneuvers to perform (e.g., as road signs with both text and graphical instructions). In some embodiments, the mapping application generates this view from a perspective rendering position within a three dimensional navigation scene that the device renders.
  • the user can initiate a search by tapping in the search field 240.
  • This directs the application to present an animation that (1) presents an on-screen keyboard and (2) opens a search table with various completions for the user's search entry.
  • when the search field is empty (e.g., because the user has not yet entered any search terms), the table contains a list of "recents", which in some embodiments are recent searches and/or route directions that the user (or a different user of the mapping application on the mobile device 200) has previously requested.
  • the recents list may also include addresses searched for or added from other applications on the mobile device (e.g., a web browser), received via various different forms of messaging (e.g., SMS messages, e-mail, etc.), recently accessed contacts, or other mechanisms.
  • the mapping application fills the table with search completions both from local sources (e.g., bookmarks, contacts, recent searches, recent route directions, etc.) and remote servers.
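Purely as an illustration (the source names and structure are assumptions, not the application's actual API), search completions of the kind described could be assembled from local sources and a remote server as follows:

```swift
import Foundation

// Hypothetical completion sources; the names are illustrative only.
protocol CompletionSource {
    func completions(for query: String) -> [String]
}

struct LocalCompletions: CompletionSource {
    let bookmarks: [String]
    let contacts: [String]
    let recentSearches: [String]

    func completions(for query: String) -> [String] {
        (bookmarks + contacts + recentSearches)
            .filter { $0.localizedCaseInsensitiveContains(query) }
    }
}

struct RemoteCompletions: CompletionSource {
    // Stand-in for suggestions fetched from a remote mapping server.
    func completions(for query: String) -> [String] { [] }
}

// The search table shows recents for an empty query, otherwise merged completions.
func searchTableEntries(query: String,
                        recents: [String],
                        sources: [CompletionSource]) -> [String] {
    guard !query.isEmpty else { return recents }
    return sources.flatMap { $0.completions(for: query) }
}
```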
  • users can activate a voice recognition feature of the mobile device in order to perform searches within the mapping application.
  • the bookmark control 245 allows locations and routes to be bookmarked by the application.
  • a user can add a location or a route as a bookmark. For instance, a user can select a location (e.g., a place of business) on the map to cause the application to present an information screen for the location that, along with various information, includes a selectable control for adding the location as a bookmark.
  • the position control 225 allows the current position of the device to be specifically noted on the map, as is the case in this figure. Once the position control is selected in some embodiments, the application maintains the current position of the device in the center of the map as the device moves. In some embodiments, the position control can also be used to direct the application to identify the direction to which the device is currently oriented.
  • the mapping application of some embodiments identifies the location of the device using coordinates (e.g., longitudinal, latitudinal, and altitudinal coordinates) in a GPS signal that the device receives. Alternatively or conjunctively, the mapping application (or a separate application or operating system of the device) uses other methods (e.g., cell tower triangulation) to compute the current location.
  • the 3D control 230 is a control for viewing the map or inspecting a route in three dimensions.
  • the mapping application provides the 3D control as a quick mechanism for entering or exiting a 3D mode.
  • This control also serves as (1) an indicator that the current map view is a 3D view and (2) an indicator that a 3D perspective is available for a given map view (e.g., a map view that is zoomed out might not have a 3D view available).
  • Some embodiments additionally include a page curl control 255 that allows the application to minimize the number of on-screen controls, by placing certain less frequently used actions in a secondary UI page, which is accessible through the "page curl" control displayed on the map.
  • the page curl indicates the location of another set of controls that are conceptually located "behind" the map view.
  • These controls include selectable controls to drop a pin, show traffic on the map, print a map, or select different map types (e.g., satellite, standard, or a hybrid map type).
  • this second UI provides a subset of the functionalities of the first mobile device UI, and is designed to focus the user towards identifying a navigation destination with minimal touch interaction, as the user will often be the driver of the vehicle.
  • the vehicle user interface 210 includes, in addition to the map, several UI controls arranged in a top bar 260, and a home button 265 located within a lower information bar 270.
  • the vehicle user interface 210 is dependent, in some embodiments, on the type of user interface provided by the vehicle. Some embodiments generate different user interfaces for each different individual vehicle make or model. On the other hand, some embodiments generate different user interfaces for categories of vehicle screens, such as high-quality touchscreens, low-quality touchscreens, and non-touch screens (with which a user interacts via separate controls built into the vehicle).
  • the mapping application of some embodiments when connected to a vehicle, identifies the type of display screen built into the vehicle, and automatically outputs the correct user interface for the vehicle. In the case of Figure 2, the high-quality touchscreen interface of some embodiments is displayed.
  • the lower information bar 270 is designed to implement certain features of the mobile device user interface outside of the mapping application. In fact, in some embodiments, the lower information bar 270 is generated by the device operating system for the vehicle display rather than by the mapping application. As shown, the lower information bar 270 includes features also included in an upper information bar 275 of the mobile device user interface, including a clock 280, a battery charge indicator 285, and a connection indicator 290.
  • the clock 280 indicates a current time
  • the battery charge indicator 285 indicates an amount of battery charge remaining for the mobile device
  • the connection indicator 290 indicates a network connection of the mobile device (e.g., LTE, Wi-Fi, etc.).
  • the home button 265 of some embodiments is designed to replicate the functionality of the physical home button 295 of the mobile device, which enables the user of the mobile device to exit an application and return to a home page of the device. Similarly, in some embodiments, the user can touch the home button 265 to return to a home page of the vehicle user interface, which in some embodiments is also generated by the mobile device 200 (e.g., by the operating system of the mobile device).
  • the home page of the vehicle user interface displays a set of controls for opening a set of applications available through the vehicle interface (e.g., phone, messages, music, and other applications that run on the mobile device and are available through the vehicle interface).
  • the mobile device displays these controls (e.g., above the mapping application interface) without exiting the mapping application.
  • the user can then select one of the application controls or dismiss the set of controls by either waiting a short period of time or selecting a location within the mapping application.
  • the mobile device 200 displays the same application on both the vehicle display 215 and its own display in most situations. For example, when the device is connected to the vehicle system and the user enters the mapping application via the mobile device, the device 200 also outputs the mapping application UI 210 to the display 215. When a user selects the home button 295 on the mobile device, the UI output to the display screen 215 exits the mapping application as well.
  • the top bar 260 of some embodiments is translucent and includes a search control 261, a recents control 262, a bookmarks control 263, a predictive routing control 264, a position control 266, and zoom controls 267 and 268.
  • different user interfaces for different types of vehicles may include different controls (e.g., different zoom controls), or a completely different arrangement of controls (e.g., for non-touch interfaces).
  • the search control 261 of some embodiments directs the application towards a search functionality. Unlike the mobile device search control 240, the search functionality accessed through the vehicle interface 210 does not include a typing function. Instead, searching is done through voice recognition.
  • upon selection of the search control 261, the application presents an interface that prompts the user to speak into a microphone of either the mobile device or the vehicle.
  • based on the search term(s) dictated by the user, the mapping application either (1) presents the user with several selectable options for a search or (2) identifies a requested destination or destinations and presents the destination(s) on the map. For instance, if a user dictates a unique address or business name, the application automatically displays the requested location on the map display.
  • when the dictated search terms are ambiguous, the user interface provides two or more options for the user to select.
  • the mapping application of some embodiments may present multiple destinations on the map, when several destinations that match the generic request are located nearby.
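A simplified sketch of this decision logic, using hypothetical types, might look like the following; it only distinguishes an unambiguous dictation from one with several plausible interpretations:

```swift
import Foundation

// Hypothetical result of interpreting a dictated search.
struct Place { let name: String; let latitude: Double; let longitude: Double }

enum DictatedSearchOutcome {
    case showOnMap([Place])        // one destination, or several nearby generic matches
    case disambiguate([String])    // ambiguous dictation: let the user pick a term
}

// Simplified decision logic matching the behavior described above.
func resolveDictation(interpretations: [String],
                      lookup: (String) -> [Place]) -> DictatedSearchOutcome {
    // Several plausible transcriptions: ask the user to choose between them.
    if interpretations.count > 1 {
        return .disambiguate(interpretations)
    }
    // A single interpretation: show its match(es) directly on the map.
    let places = lookup(interpretations.first ?? "")
    return .showOnMap(places)
}
```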
  • the recents control 262 of some embodiments directs the application to present a list of recent destinations, results of recent searches, etc.
  • the recents list of some embodiments for the vehicle interface does not provide routes from one location to another, because the vehicle interface is primarily designed for identifying a destination to which to navigate from the current location of the vehicle and device.
  • Some embodiments provide search results exclusively from recent searches entered or destinations navigated to through the vehicle interface.
  • some embodiments additionally include search results from recent searches made through the device, even before the connection of the device to the vehicle interface.
  • thus, if a user searches for a particular destination on the mobile device before connecting it to the vehicle, the particular destination will appear as a recent and easily selectable search, without requiring the user to re-enter the search through the search control 261.
  • the bookmarks control 263 of some embodiments is similar to the bookmarks control 245 of the device interface, in that it enables access to a list of bookmarked locations. Again, as with the recents list, the bookmarks list of some embodiments only provides bookmarked destinations, and does not provide routes. In some embodiments, the bookmarked destinations are those bookmarked on the mapping application running on the mobile device. In other embodiments, the mapping application UI displayed on the vehicle screen includes the ability to add a destination to the bookmarks.
  • the predictive routing control 264 enables access to a predictive routing display of the mapping application, in which the mapping application presents various likely routes to the user for navigation.
  • the mapping application may generate the likely routes based on a variety of factors, including upcoming appointments or events on a calendar or other scheduling application that runs on the mobile device, or analysis of routes taken in the past by the mobile device (e.g., a route from a user's home to work).
  • the predictive routing feature may additionally factor in traffic to identify potential difficulties in a usual route or in reaching a location on time.
  • the application presents the likely routes with a navigation view on one portion of the display and information about the route (including estimated arrival time, metadata about the source of the route, etc.) on a second portion of the display, with the ability to scroll through the routes.
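As a rough illustration of how such predictions could be ranked (the weights and type names below are invented for this example and are not taken from the patent), a scoring heuristic might combine calendar events, travel history, and current traffic:

```swift
import Foundation

// Hypothetical inputs to a destination-prediction heuristic.
struct CandidateDestination {
    let name: String
    let fromCalendarEvent: Bool     // upcoming appointment or event at this location
    let historicalTripCount: Int    // how often this route has been driven before
    let trafficDelayMinutes: Double // current delay on the usual route to it
}

// Illustrative weights only: calendar events dominate, frequent past trips add
// weight, and heavy traffic surfaces the destination so the delay can be shown.
func likelihoodScore(_ c: CandidateDestination) -> Double {
    var score = 0.0
    if c.fromCalendarEvent { score += 100 }
    score += Double(c.historicalTripCount) * 2
    score += min(c.trafficDelayMinutes, 30)   // cap the traffic contribution
    return score
}

func rankedPredictions(_ candidates: [CandidateDestination]) -> [CandidateDestination] {
    candidates.sorted { likelihoodScore($0) > likelihoodScore($1) }
}
```

Any real predictive routing engine would weigh these and other signals quite differently; the sketch only shows several independent sources feeding a single ranking.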
  • the position control 266 functions similar to the position control 225 of some embodiments, in that its selection will cause the mapping application to center the map view on the current location of the device.
  • the position control does not include an additional feature of enabling the user to identify the orientation of the vehicle. Instead, some embodiments automatically orient the map based on the direction of travel of the vehicle, or provide a setting through which the user enables this automatic orientation of the map. Other embodiments automatically orient the map with north facing up irrespective of the vehicle travel direction.
  • the map view is centered on the position indicator 269 that indicates the current location of the device.
  • Some embodiments disable the position control when the map view is centered on the position indicator.
  • the position control 266 is displayed in grayed-out fashion, to indicate its current disabled state.
  • Figure 9 below illustrates the enabling of the position control as the user moves the map region so that the map is no longer centered on the position indicator.
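The enabling and disabling of the position control could be modeled roughly as in the following sketch; the names and structure are assumptions made for illustration only:

```swift
import Foundation

struct Coordinate: Equatable { var latitude: Double; var longitude: Double }

// Hypothetical model of the vehicle map view and its position control (item 266).
struct VehicleMapState {
    var mapCenter: Coordinate
    var deviceLocation: Coordinate
    var orientToTravelDirection: Bool   // some embodiments auto-orient; others stay north-up

    // The control is grayed out whenever the map is already centered on the device.
    var positionControlEnabled: Bool { mapCenter != deviceLocation }

    // Selecting the control re-centers the map on the current location.
    mutating func selectPositionControl() {
        guard positionControlEnabled else { return }
        mapCenter = deviceLocation
    }
}
```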
  • the zoom controls 267 and 268 enable the user to zoom in (with control 267) and out (with control 268) of the map view in some embodiments.
  • users may also scroll through the map region.
  • the scrolling and zooming features differ for different types of vehicle interfaces because of the different capabilities of the different types of display screens.
  • UI 210 is a high-quality touchscreen interface, and the user scrolls the map view via swipe, or drag, gestures.
  • a low-quality touchscreen may not have the ability to interpret such gestural (i.e., touch movement) input, and therefore the user interface for low-quality touchscreens includes selectable arrows for scrolling in different directions.
  • rather than the zoom controls 267 and 268, some embodiments present a different control that the user selects to activate the selectable arrows as well as zoom controls similar to those for the high-quality touch interface 210.
  • the non-touchscreen vehicle user interface requires input through other controls in order to zoom and scroll. These other controls may include a joystick, trackball, various buttons, etc. that are built into the vehicle.
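For illustration, the sketch below shows how the same map operations might be triggered by different inputs in each interface category; the enum names and scroll step sizes are placeholders rather than values from the patent:

```swift
import Foundation

// Map operations are shared; only the inputs that trigger them differ per interface.
enum MapOperation { case zoomIn, zoomOut, scroll(dx: Double, dy: Double) }

// Illustrative input vocabularies for the three interface categories.
enum HighQualityTouchInput { case dragGesture(dx: Double, dy: Double), zoomInButton, zoomOutButton }
enum LowQualityTouchInput  { case arrowUp, arrowDown, arrowLeft, arrowRight, zoomInButton, zoomOutButton }
enum NonTouchInput         { case joystick(dx: Double, dy: Double), knobClockwise, knobCounterclockwise }

// Translation for the low-quality touchscreen case; step sizes are arbitrary.
func operation(for input: LowQualityTouchInput) -> MapOperation {
    switch input {
    case .arrowUp:       return .scroll(dx: 0, dy: -50)
    case .arrowDown:     return .scroll(dx: 0, dy: 50)
    case .arrowLeft:     return .scroll(dx: -50, dy: 0)
    case .arrowRight:    return .scroll(dx: 50, dy: 0)
    case .zoomInButton:  return .zoomIn
    case .zoomOutButton: return .zoomOut
    }
}
```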
  • the mapping application enters a turn-by-turn navigation mode in some embodiments. In this mode, some embodiments output different displays to the vehicle display and the mobile device display.
  • the vehicle display in some embodiments, displays the user's location and the upcoming route, in either a two dimensional mode or a three dimensional mode.
  • the mapping application generates this view from a perspective rendering position within a three dimensional navigation scene that the device renders.
  • This perspective rendering position is adjustable in some embodiments and can be viewed as a virtual camera that can capture the three dimensional navigation scene from a variety of different perspectives (e.g., from a variety of different positions and orientations).
  • the turn-by-turn navigation presents an animated rendering of a navigated route rendered from the vantage point of a virtual camera that traverses along the direction of the route based on the traversal direction and speed of the vehicle to which the device is connected.
  • this direction and speed is captured by data (e.g., GPS data, triangulated cell tower data, etc.) associated with the device.
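A simplified sketch of such a virtual camera, using assumed names and arbitrary distances, places the camera behind the current position along the direction of travel for the perspective view, or directly overhead for a two dimensional view:

```swift
import Foundation

// Hypothetical virtual-camera placement for the navigation view: the camera sits
// behind and above the device's position and looks along the direction of travel.
struct Camera { var x: Double; var y: Double; var altitude: Double; var headingRadians: Double }

func navigationCamera(positionX: Double, positionY: Double,
                      headingRadians: Double,
                      distanceBehind: Double = 80,   // illustrative values only
                      altitude: Double = 60) -> Camera {
    Camera(x: positionX - distanceBehind * cos(headingRadians),
           y: positionY - distanceBehind * sin(headingRadians),
           altitude: altitude,
           headingRadians: headingRadians)
}

// A top-down (two dimensional) view is the same scene rendered from directly above.
func overheadCamera(positionX: Double, positionY: Double, altitude: Double = 400) -> Camera {
    Camera(x: positionX, y: positionY, altitude: altitude, headingRadians: 0)
}
```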
  • the user can interact with the vehicle user interface to, e.g., view a list of maneuvers to make for the route (e.g., a right turn onto a particular street), change between two and three dimensions, and other interactions.
  • a portion of the vehicle screen displays a representation for the maneuver (e.g., an intersection with an arrow that represents the vehicle's path through the intersection, as well as text directions for the maneuver). Once the vehicle has passed through the intersection, the representation of the maneuver disappears from the display screen of the vehicle.
  • Figures 1 and 2 above illustrate examples of a mapping application operating on a mobile device that outputs both a first UI for the mobile device and a second UI for a vehicle to which the mobile device connects.
  • Section I describes the different types of vehicle interfaces generated by the mapping application of some embodiments, while Section II describes map exploration features of these different vehicle interfaces. Section III then describes the search feature accessible through the vehicle user interfaces of the mapping application of some embodiments.
  • Section IV describes a feature of the vehicle user interfaces of some embodiments for accessing recent destinations, while Section V describes a predictive routing interface of some embodiments.
  • Section VI describes a turn-by-turn navigation mode of the mapping application of some embodiments.
  • Section VII then describes electronic devices that employ the mapping application of some embodiments.
  • Section VIII describes location services used by some embodiments of the invention.
  • I. DIFFERENT TYPES OF VEHICLE USER INTERFACES
  • the mapping application of some embodiments in addition to generating a first user interface for the device on which it operates, also generates different user interfaces for different types of vehicles to which the device may connect.
  • the application categorizes vehicle interfaces into high-quality touchscreen interfaces, low-quality touchscreen interfaces, and non-touchscreen interfaces. While this description primarily focuses on these three types of interfaces, one of ordinary skill in the art will recognize that many different types of interfaces may be generated by some embodiments. For instance, some embodiments generate a UI for vehicles that include a touchscreen interface along with additional hardware controls.
  • the mapping application of some embodiments generates different UIs for vehicles with different non-touchscreen controls. Some embodiments, in fact, generate different UIs for each different make or model of vehicle.
  • Figure 3 conceptually illustrates a simplified software architecture for a mapping and navigation application 300 of some embodiments.
  • the application 300 operates on a mobile device that can be connected to a vehicle.
  • the vehicle connection requires a wired connection that additionally enables charging of the electronic device (e.g., a USB connection).
  • the mobile device can connect wirelessly to the vehicle display interface.
  • the mapping application 300 includes various mapping and navigation functions 305, a local device user interface generator 310, a high-quality vehicle touchscreen user interface generator 315, a low-quality vehicle touchscreen user interface generator 320, a non-touchscreen vehicle user interface generator 325, and a vehicle user interface selector 340 that uses a vehicle display lookup table 345.
  • Figure 3 also displays a local display module 330 and a communication interface 335 for connecting to a vehicle electronic system.
  • the local display module 330 and the communication interface 335 are also part of the mobile device on which the mapping application 300 operates, though not part of the mapping application 300 itself.
  • the mapping and navigation functions 305 perform a wide variety of mapping and/or navigation processes for the mapping application. These functions may include identifying map tiles from which to generate map displays and retrieving the map tiles from a remote server, matching a device location to a location on a map, storing and retrieving bookmarked or recent search results, retrieving route information from a server, generating predicted routes based on analyzing data travel history, data from other applications, etc., generating text and/or graphical navigation instructions for a route based on the retrieved route information, and other functions.
  • the local device user interface generator 310 of some embodiments generates the mapping application UI for display on the mobile device on which the mapping application 300 operates.
  • the UI generator 310 receives a rendered map from the mapping and navigation functions and adds additional user interface controls (e.g., the floating controls 225 and 230, the controls in the top bar 220, etc.).
  • the UI generator 310 receives a description of the map to be rendered, adds the overlaid controls, and sends the entire UI to a rendering engine for rendering.
  • the UI generator 310 sends the UI to the local display module 330 for display on the local display device.
  • the local display module 330 of some embodiments handles display of the mobile device user interface of the mapping application.
  • the mobile device display is a touchscreen which, in addition to displaying a user interface, accepts touchscreen interactions.
  • Either the local display module 330 or a separate touchscreen interaction reader receives these touchscreen interactions and translates the interactions into multi-touch gestures (e.g., one-finger tap, one-finger double tap, one-finger swipe, two-finger swipe, etc.) at particular locations (e.g., coordinates) of the touchscreen.
  • These gestures are sent from the operating system to the mapping application 300 (e.g., to the local device user interface generator 310 or a separate module that translates the gesture and location into an interaction with the displayed mapping application constructs).
  • the high-quality vehicle touchscreen user interface generator 315, the low-quality vehicle touchscreen user interface generator 320, and the non-touchscreen vehicle user interface generator 325 generate three different user interfaces for display on the screens of different vehicles (e.g., screens located in the dashboards of the vehicles).
  • the mapping application 300 of some embodiments generates a first user interface for high-quality touchscreens, a second user interface for low-quality touchscreens, and a third user interface for non-touchscreen displays.
  • the high-quality touchscreens may not have the ability to interpret multi-touch gestures, but have a high enough touch resolution to interpret swipe/drag gestures, whereas the low-quality touchscreens may not be able to provide reliable swipe/drag gesture data.
  • Some low-quality touchscreens can detect swipes or other gestures, but without adequate resolution for some interactions.
  • the low-quality touchscreen interface may allow gestural inputs for interactions that do not require location data of the gesture (e.g., swiping between screens), but not for interactions where location of the gesture is important (e.g., precise positioning of a map).
  • the high-quality touchscreens may have a greater ability than the low-quality touchscreens to resolve the position of selection (e.g., tap or double-tap) gestures, and therefore the UI for high-quality touchscreens can have selectable items located more closely together.
  • Each of the vehicle UI generators 315-325 sends its UI to the communication interface 335 that connects to the vehicle system. While the figure shows all three modules communicating with the interface, in some embodiments only one of the UIs will be generated at a given time, depending on the vehicle to which the device is connected.
  • this communication interface is a Universal Serial Bus (USB) interface through which the device communicates with the vehicle system, as well as through which the device is charged.
  • the communication interface may be one of several different interfaces (e.g., USB, Bluetooth, etc.) depending on how the user chooses to connect the device to the vehicle system.
  • the mapping application 300 also receives interactions with the vehicle system (i.e., touchscreen and/or non-touchscreen interactions) to control the mapping and navigation functions through the communication interface 335.
  • the vehicle system identifies the location and type of touchscreen gestures (e.g., single or double tap centered at a particular set of coordinates) and sends this information to the mapping application 300 for its interpretation with respect to the displayed user interface.
  • the vehicle system sends the raw touchscreen data (e.g., pressure data) to the mobile device for translation into gestures at particular locations.
  • the interactions received through the vehicle interface may also include non-touchscreen interactions, such as the pushing of specific buttons, joystick motion, etc.
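The two reporting styles described above (classified gestures versus raw touch data) could be modeled as in the following sketch; the classification rule shown is a deliberately simplified placeholder:

```swift
import Foundation

// Two reporting styles: the vehicle may send an already-classified gesture with
// coordinates, or raw touch samples that the mobile device must translate itself.
enum VehicleTouchReport {
    case gesture(kind: GestureKind, x: Double, y: Double)
    case rawSamples([TouchSample])
}

enum GestureKind { case tap, doubleTap }
struct TouchSample { let x: Double; let y: Double; let timestamp: TimeInterval }

// Illustrative translation of raw samples into a gesture (a tap if the touch was
// brief and did not move far); real logic would be considerably more involved.
func classify(_ samples: [TouchSample]) -> (GestureKind, x: Double, y: Double)? {
    guard let first = samples.first, let last = samples.last else { return nil }
    let duration = last.timestamp - first.timestamp
    let moved = abs(last.x - first.x) + abs(last.y - first.y)
    if duration < 0.3 && moved < 10 { return (.tap, first.x, first.y) }
    return nil
}
```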
  • the vehicle user interface selector 340 identifies which of the three types of vehicle user interfaces should be generated and communicates this information to both the mapping and navigation functions 305 and the different user interface generators 315-325.
  • the vehicle user interface selector 340 is part of the operating system of the mobile device rather than part of the mapping application.
  • the vehicle user interface selector 340 receives information about the vehicle display screen through the communication interface 335 (e.g., through a specific communication channel of the interface). This information, in some embodiments, includes one or more characteristics of the vehicle display screen.
  • the vehicle user interface selector 340 uses the vehicle display lookup table 345 to select one of the three types, or categories, of interfaces for the mapping application to generate.
  • the mapping application generates numerous different interfaces that cater to the specifics of the vehicle displays with greater granularity. For instance, some embodiments generate multi-touch capable vehicle user interfaces, interfaces for vehicles that include both touchscreen and knob- type inputs, etc.
  • the mapping application 300 only generates one vehicle user interface, in addition to the local device user interface for display on the mobile device on which the mapping application operates.
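As an illustration of the lookup described for the vehicle user interface selector 340 and lookup table 345, a minimal sketch keyed on assumed display characteristics might look like this:

```swift
import Foundation

// Hypothetical display characteristics a vehicle might report upon connection.
struct VehicleDisplayInfo: Hashable {
    let hasTouch: Bool
    let supportsDragGestures: Bool
}

enum VehicleUIType { case highQualityTouch, lowQualityTouch, nonTouch }

// A simple stand-in for the lookup table 345: characteristics map to a UI category.
func selectUIType(for info: VehicleDisplayInfo) -> VehicleUIType {
    switch (info.hasTouch, info.supportsDragGestures) {
    case (false, _):     return .nonTouch
    case (true, false):  return .lowQualityTouch
    case (true, true):   return .highQualityTouch
    }
}
```

In practice the table could instead be keyed on the vehicle make and model or on finer-grained characteristics such as touch latency and resolution, as described below.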
  • Figure 4 conceptually illustrates a process 400 of some embodiments performed by the mapping application to output a vehicle user interface.
  • Figure 4 will be described by reference to Figure 5, which illustrates three different mapping application user interfaces for display on different types of vehicle display screens.
  • Figure 5 illustrates a first example 505 of a high-quality touchscreen user interface, a second example 510 of a low-quality touchscreen user interface, and a third example 515 of a non-touch user interface.
  • the process 400 begins by detecting (at 405) a connection to a vehicle interface.
  • the process is started by the mapping application when the mobile device on which the application operates connects to the vehicle interface (e.g., by a user connecting the device with a USB connector) and the mapping application is running on the device (e.g., running in the background or foreground of the device).
  • the process determines (at 410) a vehicle interface type.
  • the vehicle does not identify itself as one of the three categories (high-quality touch, low-quality touch, and non-touch), but instead identifies itself based on the make and model of the vehicle, or the particular built-in system make and model number.
  • the mapping application (or the device operating system) stores a lookup table that identifies, for different vehicle systems, whether the vehicle has a high-quality touchscreen interface, a low-quality touchscreen interface, or a non-touchscreen interface.
  • the vehicle system identifies properties of its display device (e.g., touch latency, touch resolution, display resolution, additional inputs (e.g., knobs, etc.). Based on the identified properties, the mapping application (or the device operating system) identifies a type of user interface (e.g., high-quality touch, low-quality touch, non-touch) to generate from a stored matrix or lookup table.
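To make the classification step above concrete, the following Swift sketch maps reported display properties (or, failing that, a make/model lookup) to one of the three interface categories. The type names, property names, and thresholds are assumptions for illustration only; the document does not specify the contents of the lookup table.

```swift
// Hypothetical sketch of the interface-type selection described above.
// Names and thresholds are illustrative assumptions, not values from the document.
enum VehicleUIType {
    case highQualityTouch, lowQualityTouch, nonTouch
}

struct VehicleDisplayProperties {
    let hasTouchscreen: Bool
    let touchLatencyMs: Double
    let touchResolutionDPI: Double
}

// Classify a vehicle display into one of the three UI categories, falling back
// to a make/model lookup table when the vehicle identifies itself only by model.
func selectUIType(properties: VehicleDisplayProperties?,
                  modelIdentifier: String?,
                  lookupTable: [String: VehicleUIType]) -> VehicleUIType {
    if let props = properties {
        guard props.hasTouchscreen else { return .nonTouch }
        // Assumed thresholds: low latency and fine touch resolution -> high quality.
        let lowLatency = props.touchLatencyMs < 100
        let fineResolution = props.touchResolutionDPI > 80
        return (lowLatency && fineResolution) ? .highQualityTouch : .lowQualityTouch
    }
    if let model = modelIdentifier, let type = lookupTable[model] {
        return type
    }
    // Fall back to the most conservative interface when nothing is known.
    return .nonTouch
}
```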
  • the process determines (at 415) whether the interface has a high-quality touchscreen.
  • a high-quality touchscreen may not have the ability to interpret multi-touch gestures, but will have a better touch input resolution than the low-quality touchscreens.
  • the process generates and outputs (at 420) a high-quality touchscreen vehicle user interface. The process then ends.
  • the first example 505 of Figure 5 illustrates such an interface output to a first vehicle by the mapping application.
  • This first example illustrates two stages 520 and 525 of a high-quality touchscreen vehicle display.
  • the first stage 520 illustrates a home page user interface 550 of some embodiments output by the mobile device operating system when the mobile device is connected to a vehicle system with a high-quality touchscreen interface.
  • the home page 550 includes a clock 555 and a menu bar 560 that includes several controls.
  • the menu bar 560 includes a phone control 561, a music control 562, a map control 563, a messaging control 564, a car control 566, and a now playing control 567.
  • this user interface includes controls for accessing various applications or functions that operate on the mobile device that a user is most likely to access while driving.
  • the phone control 561 enables the user to access the phone application in order to place a phone call.
  • selection of the phone control causes the application to provide an interface for identifying a phone number (e.g., a phone number of a stored contact, of a place of business, etc.) to call.
  • the interface prompts the user to dictate search terms (e.g., a contact name) in order to identify a phone number.
  • Other embodiments provide a touchscreen interface for navigating through a contacts list, entering a phone number, etc.
  • the music control 562 enables the user to access a particular media storage, access, and playback application (e.g., iTunes®) in order to access media available through the application.
  • the media may be stored on the device or accessible through the application via a connection to an associated cloud storage service (e.g., iCloud®).
  • the music UI may involve touchscreen interaction, voice interaction, or a combination thereof.
  • the messaging control 564 enables the user to access the messaging application of the mobile device in order to send a short messaging service (SMS), or text, message.
  • the UI for the messaging application is designed to quickly identify a contact to which to send a text message, via touchscreen interactions, voice interactions, or a combination thereof. Once one or more recipients have been identified for the message, in some embodiments the user composes the message by dictation.
  • the car control 566 enables the user to access a UI provided by the vehicle manufacturer, or by the in-vehicle system manufacturer. This is the UI displayed by default when the mobile device is not connected to the vehicle system, and varies from one vehicle to the next.
  • the UI provided with the vehicle may include options such as a built-in navigation system, access to satellite radio, AM radio, FM radio, etc., and other functions.
  • the now playing control 567 enables the user to access any media applications that may be operating on the user's mobile device (e.g., streaming online radio applications, streaming video applications, etc.).
  • the mobile device may output different user interfaces for different applications.
  • the media applications may not have specific UIs for the vehicle display, and the device simply outputs the media while requiring interaction with the mobile device UI.
  • the map control 563 enables the user to access the mapping application described herein.
  • the first stage 520 of this example illustrates a user selecting the map control 563 with a touch selection.
  • the second stage 525 illustrates the resultant mapping application user interface 565 for high-quality touchscreens.
  • This mapping application UI 565 is the same as the UI 210 described above by reference to Figure 2, and includes the lower information bar in addition to a search control 571, a recents control 572, a bookmarks control 573, a predictive routing control 574, a position control 576, and zoom controls 577 and 578. The functions of these controls are described above by reference to Figure 2.
  • the process determines (at 425) whether the vehicle interface has a low-quality touchscreen.
  • Low-quality touchscreens, in some embodiments, do not have the ability to recognize gestural input (e.g., drag/swipe gestures, etc.). This may be due to the vehicle system not being programmed to recognize such gestures or due to the touchscreen sensors having too low a resolution to identify such gestures with a reasonable level of accuracy.
  • when the vehicle interface has a low-quality touchscreen, the process generates and outputs (at 430) a low-quality touchscreen vehicle user interface. The process then ends.
  • the second example 510 of Figure 5 illustrates such an interface output to a second vehicle by the mapping application.
  • This second example illustrates two stages 530 and 535 of a low-quality touchscreen vehicle display.
  • the first stage 530 illustrates the same home page user interface 550 as the high-quality touchscreen example 505, output by the mobile device operating system when the mobile device is connected to a vehicle system with a low-quality touchscreen interface.
  • Some embodiments include the same home page for the vehicle UI irrespective of the quality of the touchscreen.
  • the controls for this home page are described above by reference to the first example 505.
  • the user selects the map control 563 with a touch selection.
  • the second stage 535 illustrates the resultant mapping application user interface 570 for low-quality touchscreens.
  • This mapping application UI 570 is similar to the high-quality touchscreen interface shown above, with some differences.
  • the UI 570 includes the same lower information bar with a home button, as well as the top bar with a search control 581, a recents control 582, a bookmarks control 583, a predictive routing control 584, and a position control 586.
  • the low-quality touchscreen interface 570 includes a zoom and scroll control 587.
  • the search control 581, recents control 582, bookmarks control 583, predictive routing control 584, and position control 586 perform the same functions as in the high-quality touchscreen interface 565, as described above by reference to Figure 2.
  • these various functions include search by dictation, recent destinations, bookmarked destinations, predictive routing, and current location indication.
  • the high- quality touchscreen UI for the predictive routing feature allows the user to swipe between suggested routes, while the low-quality touchscreen UI includes other constructs to enable the user to change the route shown.
  • the zoom and scroll control 587 enables the user to access various controls for exploring the map on the vehicle screen. Because of the unreliability or unavailability of swipe gesture input, the low-quality touchscreen UI 570 uses different controls for scrolling that are selectable with tap inputs rather than gestural swipe inputs.
  • the selection of the zoom and scroll control 587 causes the application to overlay selectable arrows pointing up, down, left, and right over the map view, which the user can select in order to scroll the map in the selected direction.
  • the application displays zoom in and zoom out controls over the map display that are similar to the controls 577 and 578 of the high-quality touchscreen interface 565.
  • when the vehicle interface does not have a touchscreen (i.e., does not have a low-quality or high-quality touchscreen), the vehicle interface will have a non-touch display screen. As such, the process generates and outputs (at 435) a non-touchscreen vehicle user interface. The process then ends.
  • the third example 515 of Figure 5 illustrates such an interface output to a third vehicle by the mapping application.
  • This third example illustrates two stages 540 and 545 of the non- touchscreen vehicle display.
  • the first stage 540 illustrates a home menu user interface 575 displayed on the vehicle screen, as well as vehicle hardware controls 580. While this figure illustrates the vehicle hardware controls 580 located directly below the display screen, different vehicles may have their controls located in different places. For instance, some embodiments include controls in the center console between the driver and passenger seats, on the steering wheel, etc.
  • the vehicle hardware controls 580 include a keypad 591, a joystick 592, as well as several additional buttons located above and below the joystick. These additional buttons enable the user to access various functions of the vehicle system (e.g., music, phone calls with a connected phone), as well as navigate a set of menus provided by the vehicle system (e.g., to access an in-car navigation system provided with the vehicle).
  • the mobile device operating system and/or mapping application leverage these buttons as well as the joystick for interactions with the user interface provided by the mobile device.
  • the joystick 592 is shown with an "OK" button in the center and four arrows pointing up, down, left, and right. These are illustrated to represent various interactions that a user can perform with a joystick, trackball, or other directional hardware control. For instance, with a trackball (or joystick), in some embodiments a user can press downwards on the trackball to make a selection, or shift the trackball in one of the four directions to signal a move (e.g., of a selector) in that direction.
  • the non-touch user interface home menu 575 output by the mobile device illustrates the same options as in the touchscreen user interface 550, but arranged differently.
  • the vehicle display initially displays only the clock, and the user accesses the home menu UI 575 by selecting a menu button 593.
  • the home menu 575 includes a phone option, music option, map option, messaging option, car option, and now playing option, which correspond to the controls 560-567 of the touchscreen UI 550.
  • the user can use the joystick 592 (or a trackball, arrow buttons, turnable knob, etc.) to scroll between the different menu options.
  • the first stage 540 illustrates that, with the map menu option 594 selected, the user performs a selection input by pressing downwards on the joystick 592. This selects the mapping application, and causes the display of the mapping application non-touch user interface 595 at the second stage 545. Because the display screen is a non-touch screen, the user interface does not display any controls, and only has the map view and the lower information bar. The lower information bar, unlike those for the touchscreen UIs 565 and 570, does not include a home button. Instead, in some embodiments, the user can select a hardware control (e.g., the back or exit button) to perform the function of the home button.
  • the user selects the menu button 593 to access a list of menu options in order to perform functions similar to those provided by the various controls 571-576 of the high- quality touchscreen UI 565 (e.g., search, recents, bookmarks, predictive routing, current location, etc.).
  • Figure 6 illustrates an additional difference between the high-quality touchscreen interface 565 and the low-quality touchscreen interface 570 of the mapping application in some embodiments.
  • This figure illustrates expanded views of the predictive routing control 574 of the high-quality touchscreen interface and the predictive routing control 584 of the low-quality touchscreen interface.
  • the expanded views illustrate the difference in the controls between the two interfaces.
  • the high-quality touchscreen interface predictive routing control 574 includes an icon 605 and a buffer region 610 around the icon.
  • the low-quality touchscreen interface predictive routing control 584 includes an icon 615 and a buffer region 620 around the icon.
  • the two icons 605 and 615 from the two interfaces are the same; however, the buffer regions around the icons are not.
  • the buffer 620 in the low-quality interface is larger than the buffer 610 in the high-quality interface. This allows additional leeway in the touchscreen's interpretation of a user's touch selection for the low-quality case.
  • the predictive routing control is used as an example, one of ordinary skill in the art will recognize that the larger touch buffer in the low-quality touchscreen interface may apply to other touch selectable controls (sometimes called affordances) as well, such as the other controls in the top bar (e.g., search, bookmarks, recents, etc.), the home button, arrows for stepping through search results shown below, etc.
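As a rough illustration of the larger touch buffer, the sketch below hit-tests a control by expanding its icon frame by a per-interface buffer. The frame geometry and buffer sizes are assumptions; the document only states that the low-quality interface uses a larger buffer around the same icon.

```swift
import CoreGraphics

// Illustrative hit testing with a per-interface touch buffer (assumed sizes).
struct TouchControl {
    let iconFrame: CGRect
    let bufferRadius: CGFloat   // extra selectable margin around the icon

    func hitTest(_ point: CGPoint) -> Bool {
        // A negative inset grows the rectangle, enlarging the selectable area.
        return iconFrame.insetBy(dx: -bufferRadius, dy: -bufferRadius).contains(point)
    }
}

let icon = CGRect(x: 200, y: 20, width: 44, height: 44)
let highQualityControl = TouchControl(iconFrame: icon, bufferRadius: 8)    // smaller buffer
let lowQualityControl  = TouchControl(iconFrame: icon, bufferRadius: 20)   // larger buffer

// A slightly-off tap is accepted only by the low-quality interface's control.
let tap = CGPoint(x: 186, y: 42)
print(highQualityControl.hitTest(tap), lowQualityControl.hitTest(tap))   // false true
```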
  • the operation of the mapping application vehicle UIs output to vehicle displays, including some of the differences between the UIs for the different types of vehicle displays, will be described in this section.
  • Subsection A describes the operation of the high-quality touchscreen UI, while subsections B and C describe differences in the operation of the low-quality touchscreen and non-touchscreen UIs, focusing on the zoom and scroll operations.
  • Figure 7 illustrates six stages 705-730 of the high-quality touchscreen vehicle user interface 565 in which the user navigates through various features of the interface, including the recent destinations feature, the bookmarked destinations feature, and the predictive routing feature.
  • the first stage 705 illustrates the user interface 565 as shown in the previous Figure 5.
  • a user selects the recents control 572.
  • the second stage 710 illustrates a resulting recent destinations page 735.
  • the recent destinations page 735 includes a list of recently searched places of business (Hubcap BBQ Palace), recently searched geographic locations (San Francisco, Los Angeles), recently performed searches (Coffee), recent destinations of routes, etc.
  • the various destinations may have been searched for individually within the mapping application via either the vehicle UI or the mobile device UI, selected on the map view within the mapping application, etc.
  • the items in the list are individually selectable in order to display a map centered on the selected destination in some embodiments.
  • the recent destinations page 735 includes a cancel button 740, which the user is shown selecting (with a tap selection) in stage 710. The user selects the cancel button 740 in order to exit the recent destinations page 735 and return to the map view UI 565, as shown at the third stage 715.
  • the user selects the bookmarks control 573.
  • the bookmarked destinations page 745 includes a cancel button 750, which the user is shown selecting (with a tap selection) in stage 720. The user selects the cancel button 750 in order to exit the bookmarked destinations page 745 and return to the map view UI 565, as shown at the fifth stage 725.
  • the predictive routing UI 755 displays, in order of likelihood as computed by a predictive routing engine of the mapping application, one or more likely routes for the user to follow.
  • the predictive routing engine of some embodiments uses travel history, contact information, calendar or other appointment-keeping information, traffic information, etc. to identify likely destinations and optimal routes to the destination.
  • the predictive routing UI 755 of some embodiments displays, for each route, a first panel 760 with information about the route and a second panel 765 with a map view of the route.
  • the information panel 760 displays a name for the route as well as metadata about the route.
  • the route name may be based on a destination name identified from contact information, a calendar appointment name, a business name, an address, a primary road used by the route, etc.
  • the metadata shown may include an estimated arrival time (ETA), the source of the route for the predictive routing engine (e.g., calendar, passbook, travel history, etc.), whether the destination of the route is a frequent or less traveled destination for the device, and other information.
  • the map view 765 displays a map to the destination with the selected route.
  • the routing engine factors in traffic information to select an optimal route to the destination, and may warn a user if a typically- traveled route has worse traffic than usual.
  • Figure 8 illustrates the use of the search function to identify a destination and then enter a navigation mode over six stages 805-830.
  • the first stage 805 illustrates the user interface 565 as shown in Figure 5.
  • a user selects the search control 571.
  • the second stage 810 illustrates a resulting search function page 835.
  • the search function available through the vehicle display is a dictation-only (i.e., no typing) interface that avoids requiring the driver to type in a desired destination. If a passenger in the vehicle wants to type in a destination, the passenger can do so through the mapping application interface displayed on the mobile device in some embodiments.
  • the search function page 835 displays a search bar 840 with a cancel button, and a results region 845 that initially includes a set of example searches that the user could dictate and a prompt to the user to verbally recite the desired destination.
  • the results region 845 now illustrates a waveform to indicate that the interface is receiving a voice prompt and a dictation completion control 850 for the user to select upon completion of the dictation.
  • the application identifies the end of the user's dictation based either on the selection of the completion control 850 or on the identification of a gap of a threshold duration in the user's speaking. Some embodiments only use the identification that the user has stopped speaking to end dictation, and do not include the touchscreen completion control.
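The end-of-dictation rule could be reduced to a simple predicate like the sketch below; the 1.5-second silence threshold is an assumed value, since the text only refers to a gap of a threshold duration.

```swift
import Foundation

// Dictation ends either when the completion control is tapped or when the
// silence gap exceeds a threshold (threshold value is an assumption).
func dictationShouldEnd(completionControlTapped: Bool,
                        silenceDuration: TimeInterval,
                        silenceThreshold: TimeInterval = 1.5) -> Bool {
    return completionControlTapped || silenceDuration >= silenceThreshold
}
```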
  • the mapping application uses a microphone in the vehicle to record the audio, but performs the audio analysis for voice recognition at the mobile device (e.g., using a voice recognition function of the mobile device or of a server with which the mobile device communicates).
  • the mobile device additionally uses its own microphone to record the audio for analysis.
  • the search result map view page 852 shows a new map centered on the searched destination, for which the application displays a pin 857, and selects this pin.
  • the application continues to display the position indicator 860.
  • the mapping application also displays a top information bar 865 and a lower information bar 870.
  • the search function in some embodiments, is biased around the current location of the device. That is, the search function will prefer close destinations (e.g., in the same city as the current location, within a particular distance of the current location, etc.) rather than those in other parts of the world. In addition, some embodiments use past destination history to influence search results, expanding the search range for destinations previously selected by a user.
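One way to picture this biasing is as a proximity filter whose acceptable radius widens for destinations the user has selected before; the radii and type names below are illustrative assumptions.

```swift
// Sketch of location-biased filtering of search candidates (assumed radii).
struct Candidate {
    let name: String
    let distanceKm: Double
    let previouslySelected: Bool
}

func filterByProximity(_ candidates: [Candidate],
                       baseRadiusKm: Double = 50,
                       expandedRadiusKm: Double = 500) -> [Candidate] {
    return candidates.filter { candidate in
        // Previously selected destinations are accepted at a much larger range.
        let radius = candidate.previouslySelected ? expandedRadiusKm : baseRadiusKm
        return candidate.distanceKm <= radius
    }
}
```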
  • the search function may also use information such as the contacts information stored on the device.
  • a search term may have multiple possible results.
  • a search for a chain restaurant may yield multiple locations for different franchises of the chain.
  • the user may also search for a generic type of location (e.g., bars, restaurants, coffee shops, etc.), which will often result in multiple possible destinations.
  • a specific search may result in ambiguity if the user searches for, e.g., "work", but works multiple jobs.
  • the mapping application displays a list of possible search terms from which the user can choose. In other embodiments, however, the mapping application always identifies one set of search terms using the voice recognition function of the device.
  • the top information bar 865 includes two selectable items. In case the user does not like the search results, the user can either cancel the search to return to the default map view UI 565 or select a search again control 875 to perform another search. Furthermore, the top information bar 865 displays the search term or terms spoken by the user and interpreted by the device (in this case, "ABC Deli").
  • the lower information bar 870 includes information about the specific destination currently selected. In some embodiments, this information, as shown, includes the name and/or address of the destination and the estimated arrival time of an optimally calculated route to the destination (3 minutes, in this case, due to the proximity of the destination to the user's current location). Furthermore, the lower information bar 870 includes a selectable button 880 for entering a turn-by-turn navigation mode directed towards the destination.
  • the fifth stage 825 displays the user selecting the button 880 in order to enter navigation to the selected destination (ABC Delicatessen).
  • the sixth stage 830 illustrates a navigation mode user interface 885 of some embodiments.
  • the navigation UI 885 includes the lower information bar from the initial map UI 565 (as provided by the operating system, in some embodiments), a navigation information bar 888, a navigation map 890, a 3D control 895, and a direction list control 897.
  • the navigation information bar 888 displays instructions for a next maneuver to perform along the navigated route, as well as estimated arrival time information (i.e., travel time, travel distance, and arrival time).
  • the 3D control 895 enables a user to toggle the navigation map 890 between 3D and 2D animation.
  • the 3D control 895 is shown as currently selected at stage 830, and the application displays the navigation map 890 in three dimensions. Some embodiments automatically select the 3D option upon entering the turn-by-turn navigation mode.
  • the list control 897 enables the user to access a list of maneuvers for the route. The navigation mode will be described in greater detail below in Section VI.
  • Figure 9 illustrates various user interface actions available in some embodiments to enable such exploration of the map region. Specifically, over six stages 905-930 of the user interface 565, Figure 9 illustrates the use of zoom and scroll functionality as well as the position control 576 to re-center the displayed map region on the device's current location.
  • the first stage 905 illustrates the high-quality touchscreen vehicle user interface 565, currently tracking the device's current location.
  • the mapping application of some embodiments tracks the current location of the mobile device and by default centers the map region displayed for the vehicle UI on the position indicator 860.
  • the mapping application currently is displaying the position control 576 in a grayed-out manner to indicate that it has been disabled.
  • the user selects (e.g., with a single tap input) the zoom out control 578.
  • the second stage 910 illustrates the map after the mapping application has performed the zoom out operation on the displayed map region in the vehicle user interface 565.
  • the map view now shows a larger area of the map because the mapping application displays it at a different zoom level, from a conceptual point of reference higher above the map.
  • the zoom function on the vehicle display modifies the zoom level by a particular amount (e.g., a particular conceptual distance from the ground, a particular multiplier to the map scale, etc.).
  • some embodiments use an inertial stop animation to slow down and stop the animation of the map changing zoom level. For example, to transition from the first stage 905 to the second stage 910, the mapping application zooms out on the map at a particular defined rate, then gradually slows the zoom animation down to a stop, rather than suddenly stopping the animation.
  • the map changes zoom levels by an amount proportional to the duration for which the user holds the touchscreen input over the zoom in or out control, while using an inertial stop animation after the user releases the press-and-hold input.
  • Some such embodiments differentiate between a tap input and a press-and-hold input, so that a tap input causes the application to change the zoom level of the map a greater amount than would be the case for a press-and-hold having the short duration of the tap gesture.
  • the mapping application zooms the map region about the center of the display, which also coincides with the position indicator 860. Some embodiments always center the zoom about the center of the map, while other embodiments always center the zoom about the current location of the device. In some embodiments, the mapping application centers the zoom about the current location when the position indicator is within the displayed map region and about the center of the displayed map when the position indicator is not within the displayed map region.
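The zoom behavior just described might be sketched as follows: a fixed zoom step per tap, anchored on the device's current location when its indicator is visible and otherwise on the center of the displayed region. The step factor and the simplified CoreGraphics types are assumptions.

```swift
import CoreGraphics

// Choose the zoom anchor per the rule described above.
func zoomAnchor(currentLocation: CGPoint, visibleMapRect: CGRect) -> CGPoint {
    if visibleMapRect.contains(currentLocation) {
        return currentLocation
    }
    return CGPoint(x: visibleMapRect.midX, y: visibleMapRect.midY)
}

// Change the map scale by a fixed step per tap (step factor is an assumption).
func nextScale(currentScale: Double, zoomIn: Bool, stepFactor: Double = 2.0) -> Double {
    return zoomIn ? currentScale * stepFactor : currentScale / stepFactor
}
```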
  • the second stage 910 also illustrates that the user is performing a touchscreen swipe input to scroll the map to the left (thereby enabling viewing of the map region currently off of the screen to the right).
  • the third stage 915 illustrates the result of the swipe/drag gesture, with the mapping application having moved the viewed region of the map rightward enough that the current location indicator is now off the display screen (to the left).
  • the high-quality touchscreen user interface of some embodiments resolves all such swipe gestures over the map view to either up, down, left, or right, while other embodiments calculate the angle of the swipe gesture and move the map in the calculated direction.
  • only high-quality touchscreens allow for this direct manipulation of the map region, allowing drag inputs in which the movement of the touch input from a first touchscreen location to a second touchscreen location causes the map location displayed in the first touchscreen location to move to the second touchscreen location, as though the user was dragging the map itself.
  • the position indicator 860 is no longer centered on the map.
  • the mapping application enables the position control 576, displaying the control more vividly to indicate that the user can now select it.
  • the fourth stage 920 illustrates the user selecting the position control 576.
  • selection of the position control of some embodiments causes the mapping application to center the map region about the current location of the device, irrespective of the map region previously displayed on the map.
  • the fifth stage 925 illustrates that the position indicator 860 is again displayed in the center of the map region in the UI 565.
  • the position control 576 of some embodiments does not cause the mapping application to change the zoom level at which the map is displayed.
  • the sixth stage 930 illustrates the UI 565 after the user has not interacted with the display, but has driven a short distance along the current road.
  • the clock has rolled forward to 2:15, and the map display has shifted downwards so that the device's current location remains in the center.
  • the mapping application only automatically moves the map region along with the current location when the map view is already centered on the position indicator 860 (i.e., when the user has selected the position control 576 without subsequently scrolling the map via a swipe gesture).
  • the mapping application allows independent map exploration on the two user interfaces, only synchronizing the interfaces once the user performs an action to identify a possible destination (e.g., performing a search, selecting a recent destination, etc.).
  • Figure 10 illustrates a user exploring a map in both the mapping application interface 205 displayed on a mobile device 1000 as well as the high-quality vehicle touchscreen interface 565 displayed on the screen of a vehicle to which the mobile device 1000 connects over three stages 1005-1015.
  • the mapping application is open on both the mobile device and the vehicle touchscreen.
  • the map region currently displays a region surrounding the current location of the device, with the current location centered and tracked.
  • the user performs a leftward swipe gesture over the map region in the high-quality vehicle touchscreen user interface 565 in order to pan the map to the right.
  • the second stage 1010 illustrates that the map displayed in the vehicle screen is no longer centered on the current location of the device, with a different portion of the map region displayed instead. However, the map region displayed on the mobile device 1000 has not changed.
  • the user performs a pinch multi-touch gesture over the map region displayed on the mobile device in order to zoom in on the map region.
  • the third stage 1015 illustrates that the mobile device user interface 205 displays the map region at a closer zoom level, while the vehicle user interface 565 does not change from the second to the third stage.
  • the bookmarks and recents lists require gestural input only for scrolling through a list of possible choices, when the list is too long to fit entirely on the screen. Some embodiments therefore place scroll arrows next to the list for non-gestural (e.g., tap) selection.
  • the search feature does not involve any gestural input (except in rare instances when the search term is ambiguous, with numerous possibilities that require scrolling) or high touch sensitivity resolution and therefore this feature is the same for the low-quality touchscreen interface as the high-quality touchscreen interface.
  • Figure 11 illustrates various map exploration functions, including zooming, scrolling, etc., over seven stages 1105-1135 of the low-quality touchscreen interface of some embodiments.
  • the first stage 1105 illustrates the low-quality touchscreen map view user interface 570 of some embodiments, with six controls 581-587 in the upper section.
  • this interface 570 is similar to the high-quality touchscreen interface 565 with the difference being that the upper bar of controls includes a single zoom and scroll control 587 instead of the zoom controls 577 and 578, and the touchscreen does not recognize swipe gesture inputs to scroll the map. This may be due to the screen not recognizing movement of touch inputs or because the latency of the touchscreen is so high as to make direct manipulation of the map view (e.g., touchscreen inputs that give the impression of directly dragging the map) too difficult.
  • the position indicator 1170 that indicates the current location of the device is shown at the center of the displayed map region.
  • the first stage 1105 also illustrates that the user selects the zoom and scroll control 587, with a tap selection input. As shown at the second stage 1110, this input results in the display of additional controls within the map view user interface 570. These additional controls include four scroll arrow controls 1140-1155, position control 586, and zoom controls 1160 and 1165.
  • the position control 586 is the same position control as in the first stage 1105, but has been relocated to a different position within the top control bar. As before, the position control 586 enables the user to cause the mapping application to re-center the map view on the device's current location.
  • the zoom controls 1160 and 1165 have the same function as zoom controls 577 and 578 of the high-quality touchscreen interface 565. That is, the control 1160 causes the mapping application to zoom in on the map region while the control 1165 causes the mapping application to zoom out of the map region.
  • the scroll arrows 1140-1155 enable the user to scroll through the map region incrementally in some embodiments.
  • the displayed map region moves in the direction opposite the arrow in order to reveal a region of the map in the direction of the arrow.
  • the third stage 1115 illustrates a user selecting the left scroll arrow control 1155.
  • the user may select the scroll arrows 1140-1155 with a tap input in some embodiments, or a press and hold input in other embodiments.
  • the third stage 1115 also illustrates (by showing the two right- facing arrows, which are not displayed within the user interface and are shown only for explanatory purposes) that the map region shifts (scrolls) to the right as a result of the selection of the scroll arrow 1155.
  • selecting an arrow facing a particular direction could instead result in the map scrolling in that direction rather than the opposite direction; however, selecting a left arrow to shift (i.e., pan) the map rightwards, thereby revealing the portion of the map to the left, provides a more intuitive interaction.
  • the fourth stage 1120 illustrates the result of the user interaction to select the left scroll arrow 1155.
  • the position indicator 1170 has been moved rightwards to near the edge of the screen.
  • some embodiments scroll the map a fixed percentage of the display screen (e.g., one half the display screen height), which will cause the actual map distance to vary based on the zoom level.
  • some embodiments use an inertial stop animation to slow down and stop the animation of the map panning. For example, to transition from the third stage 1115 to the fourth stage 1120, the mapping application pans the map to the right at a particular defined speed, then gradually slows the panning animation down to a stop, rather than suddenly stopping the animation.
  • the map scrolls by a distance proportional to the duration for which the user holds the touchscreen input over the scroll arrow control, while using an inertial stop animation after the user releases the press-and-hold input.
  • Some such embodiments differentiate between a tap input and a press-and-hold input, so that a tap input causes the application to pan the map further than would be the case for a press-and-hold having the short duration of the tap gesture.
  • the mapping application accelerates the rate at which it scrolls through the map region.
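The tap-versus-hold panning just described might look like the sketch below; the one-half fraction per tap follows the example in the text, while the per-second fraction for a press-and-hold is an assumption.

```swift
// Pan distance in map units for a tap or a press-and-hold on a scroll arrow.
func panOffset(visibleSpan: Double,        // width or height of the visible map region, in map units
               holdDuration: Double?,      // nil for a plain tap
               fractionPerTap: Double = 0.5,
               fractionPerSecond: Double = 0.5) -> Double {
    if let duration = holdDuration {
        // Press-and-hold: distance proportional to how long the input is held.
        return visibleSpan * fractionPerSecond * duration
    }
    // Plain tap: a fixed fraction of the visible region, so the distance in
    // map units depends on the current zoom level.
    return visibleSpan * fractionPerTap
}
```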
  • the fourth stage 1120 additionally illustrates the user selecting a position between the two scroll arrow controls 1140 and 1145, at an angle θ above the horizontal.
  • This stage also illustrates, by showing the two down-facing arrows (which are not displayed in the user interface, and only shown for explanatory purposes), that the map region shifts (scrolls) downwards as a result of the user input.
  • the angle θ is greater than 45° above the horizontal, and therefore resolves to a selection of the scroll arrow 1140.
  • Figure 12 conceptually illustrates a region representing the map view area 1200 of the low-quality touchscreen user interface, with the scroll arrows 1140-1155 displayed in the map view area.
  • this figure shows two dashed lines at right angles that divide the map view area 1200 into four quadrants 1240-1255.
  • each of these four quadrants is associated with one of the scroll arrows 1140-1155. That is, a touch input anywhere within the quadrant 1255 will result in a selection of the scroll arrow 1155 and cause the map to scroll to the right, showing the map region previously off of the screen to the left.
  • Different embodiments may divide the display differently (e.g., the far right and far left portions may not be associated with scroll arrows 1145 and 1155).
  • some embodiments do not limit the scrolling to only up/down and left/right, but instead also allow for diagonal scrolling.
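The quadrant resolution of Figure 12 can be pictured as comparing the horizontal and vertical offsets of the tap from the center of the scroll controls; the function below is an illustrative sketch under that reading, not the patent's implementation.

```swift
import CoreGraphics

enum ScrollDirection { case up, down, left, right }

// Resolve a tap in the map view area to the scroll arrow whose quadrant
// (bounded by the diagonals through the controls' center) contains it.
func resolveQuadrant(tap: CGPoint, controlsCenter: CGPoint) -> ScrollDirection {
    let dx = tap.x - controlsCenter.x
    let dy = tap.y - controlsCenter.y        // screen coordinates: +y points down
    if abs(dx) > abs(dy) {
        return dx > 0 ? .right : .left
    } else {
        return dy > 0 ? .down : .up
    }
}
```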
  • the fifth stage 1125 illustrates the result of the user selection of the upwards scroll arrow 1140, which causes the map display to shift downwards, revealing a map portion previously off the top of the display screen, and pushing the position indicator 1170 off of the bottom of the screen. Furthermore, at this stage, the user selects the zoom out control 1165.
  • the selection of this control operates in a similar manner as the selection of the zoom out control 578 shown in Figure 9 for the high-quality touchscreen interface. That is, the mapping application displays the map at a new zoom level from a conceptual distance higher above the map, thereby revealing more of the map and with a fixed portion of the display representing a larger distance.
  • the sixth stage 1130 illustrates the map in such a zoomed out state, with enough of the map now shown to reveal the position indicator 1170 in the lower right portion of the display.
  • the user selects a "Done" control 1175, to indicate that she has completed her use of the zoom and scroll controls to explore the map region.
  • the seventh stage 1135 illustrates the user interface 570 with the same controls as in the first stage, and the displayed map region from the previous stage.
  • the low-quality touchscreen interface has the capability to scroll the displayed map diagonally as well as up, down, left, and right.
  • Figure 13 conceptually illustrates a process 1300 performed by the mapping application of some embodiments in order to translate a selection input into a scroll of the map display for a low- quality touchscreen vehicle interface.
  • the process 1300 will be described by reference to Figure 14, which illustrates three stages 1405-1415 of a low-quality touchscreen interface 1400.
  • the user interface 1400 is the same as the previously described interface 570, except for the treatment of diagonal scrolling input when the zoom and scroll control is activated.
  • the process 1300 begins by displaying (at 1305) scroll controls for the low-quality touchscreen interface. This occurs when the mapping application receives input to select the zoom and scroll control of the low-quality touchscreen interface.
  • the first stage 1405 of Figure 14 illustrates a low-quality touchscreen interface 1400.
  • a user selects the zoom and scroll control 1420.
  • the second stage 1410 illustrates the zoom and scroll controls, including scroll arrows 1425-1440.
  • the process 1300 receives (at 1310) a selection of a location on the vehicle touchscreen within the map view. That is, the selection is not a selection of the zoom controls, position indicator, done button, etc., but rather is within the map where the mapping application displays the scroll controls.
  • the second stage 1410 of Figure 14 illustrates that the user makes a selection at a specific touchscreen location between the up arrow 1425 and the right arrow 1430.
  • the process 1300 then calculates (at 1315) the angle of a line from the center of the scroll controls to the center of the touch input.
  • the center of the scroll controls does not exactly coincide with the center of the vehicle display screen because the top bar with the zoom and position controls takes up a portion of the display.
  • the vehicle touchscreen system translates its touch input into a selection at a particular location, which the vehicle system sends to the mobile device (via the mobile device's connection to the vehicle system). Using this information, the mapping application can calculate the angle.
  • the second stage of Figure 14 illustrates that in this example, the angle between the user's selection location and the horizontal is again θ.
  • the process determines (at 1320) whether the angle is within a threshold of one of the primary scroll directions.
  • This threshold, in some embodiments, is a particular angular range (e.g., 5°, 10°, etc.) from the horizontal or vertical. In some embodiments, the range is defined based on the display of the scroll arrows: if the selection input is within the minimum range that includes the entirety of the displayed scroll control, then the selection is within a threshold of that primary scroll direction.
  • when the angle is within the threshold of a primary scroll direction, the process scrolls (at 1330) the map according to the selection of the closest primary direction scroll control. Thus, if the selection is near the right arrow 1430, the map will shift leftwards to display the portion previously off the screen to the right.
  • otherwise, the process scrolls (at 1325) the map at the calculated angle. That is, the map pans in the direction opposite the calculated angle, as if the user had selected a scroll arrow at the calculated angle.
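A sketch of the angle computation and snapping step of process 1300 as described above; the 10° threshold is one of the example values given in the text, and the function shape is otherwise an assumption.

```swift
import Foundation
import CoreGraphics

// Compute the scroll angle from the center of the scroll controls to the touch
// location, snapping to a primary direction (right/up/left/down) when the
// angle falls within the threshold; otherwise scroll at the raw angle.
func scrollAngle(touch: CGPoint, controlsCenter: CGPoint,
                 snapThresholdDegrees: Double = 10) -> Double {
    let dx = Double(touch.x - controlsCenter.x)
    let dy = Double(controlsCenter.y - touch.y)        // flip y so "up" is positive
    var angle = atan2(dy, dx) * 180 / .pi              // -180...180, with 0° = right
    for primary in [-180.0, -90.0, 0.0, 90.0, 180.0] {
        if abs(angle - primary) <= snapThresholdDegrees {
            angle = (primary == -180) ? 180 : primary  // snap to the primary direction
            break
        }
    }
    return angle
}
```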
  • the second stage 1410 shows two arrows (not actually displayed as part of the user interface) pointing down and leftwards to indicate that the map shifts in that direction.
  • the third stage 1415 of Figure 14 illustrates the result of the map having been scrolled down and left according to the user input.
  • the position control 1445 has been activated by the mapping application, in case the user wants to return to having the map follow the device location.
  • Figure 15 illustrates another feature of the scroll arrows of the low-quality touchscreen interface of some embodiments. Specifically, in some embodiments as a user holds down a touch input for an extended period of time over one of the scroll arrows, the map scrolling accelerates. Figure 15 illustrates this acceleration of the map scrolling over four stages 1505-1520 of the user interface 570.
  • the first stage 1505 illustrates the user interface, as a user selects (via a tap input on the touchscreen) the zoom and scroll control 587.
  • the zoom and scroll controls have appeared in the user interface 570.
  • the second stage 1510 also illustrates that the user has begun to press down on the left scroll arrow 1135.
  • the third stage 1515 illustrates (by showing the two right-facing arrows that are not actually displayed in the UI) that the map scrolls to the right by a particular distance as the user holds down the scroll arrow 1135.
  • the third stage also illustrates that the user continues to hold down the scroll arrow 1135.
  • the user continues holding down the scroll control arrow 1135.
  • the map scrolls to the right at twice the speed of the previous stage.
  • Different embodiments treat the acceleration of the scrolling differently. For instance, some embodiments continue speeding up as a user continues to hold down the same scroll arrow. Other embodiments have a fixed number of different scrolling speeds (as percentages of the map region displayed, rather than physical distances represented by the map). For instance, if the user continued to hold down the scroll arrow 1135, some embodiments would accelerate the scroll speed to three times the original speed.
  • the threshold time after which the application begins to accelerate the scrolling may vary. For instance, some embodiments change the speed of the scrolling after every second that a user has continuously held down the scroll control.
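The hold-to-accelerate behavior could be expressed as a stepped speed multiplier like the sketch below; the one-second step and the cap at three times the original speed come from the examples in the text, and the rest is an assumption.

```swift
import Foundation

// Scroll-speed multiplier as a function of how long the arrow has been held:
// 1x for the first second, 2x for the next, capped at maxSteps (assumed cap of 3).
func scrollSpeedMultiplier(holdDuration: TimeInterval, maxSteps: Int = 3) -> Double {
    let step = min(Int(holdDuration), maxSteps - 1)
    return Double(step + 1)
}
```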
  • bookmarks and recents lists may include scroll arrows for scrolling through a list of bookmarks or recents (e.g., recent searches, destinations, etc.).
  • Figure 16 illustrates the use of scroll arrows to navigate through a list of recents over four stages 1605- 1620 of the low-quality touchscreen interface of some embodiments.
  • the first stage 1605 illustrates the low-quality vehicle touchscreen mapping application interface 570.
  • the user selects the recents control 582 in order to open a recents list.
  • the second stage 1610 illustrates a recents page 1600, which is similar to the recents page 735 shown above for the high-quality touchscreen interface.
  • the low-quality recents page 1600 includes scroll arrows 1625 and 1630 for navigating through the list of recents.
  • the third stage 1615 illustrates the user selecting the scroll down arrow 1630 (with a tap selection).
  • the fourth stage 1620 illustrates that the recents list has shifted upwards by one item, so that the application no longer displays the recent location of San Francisco at the top of the list, and instead displays a recent search for restaurants in Los Angeles at the bottom of the list.
  • Figure 17 illustrates the operation of such a non-touchscreen vehicle user interface 1700 according to some embodiments over five stages 1705-1725.
  • the first stage 1705 illustrates a default non-touchscreen home page 1700. Whereas the home page for the touchscreen vehicle user interfaces includes various selectable controls, the non-touchscreen home page of some embodiments only shows the time.
  • the first stage 1705 additionally illustrates the vehicle hardware controls 580, including the keypad 591, joystick 592, and additional buttons. As mentioned, various different embodiments include different types of controls. While this figure shows a joystick 592, Figure 18, described below, illustrates a trackball. Additional vehicles may use a knob, various types of buttons, etc.
  • the user presses the menu button 593.
  • the second stage illustrates the result of this selection, with the home menu user interface 575 displayed on the non-touchscreen display screen.
  • the home menu interface 575 is described by reference to Figure 5.
  • the user accesses this menu in a variety of different ways. For instance, in some embodiments simply scrolling a trackball, turning a knob, or moving a joystick while in the home page 1700 will activate the menu 575.
  • the user selects the map menu option 594 by navigating to that option with the joystick 592 and pressing down on the joystick to perform a selection.
  • a user might rotate a knob and then press the knob inwards to select one of the options, or move a trackball to choose an option and then press inwards on the trackball to select one of the options.
  • the third stage 1715 illustrates the result of the selection of the map menu option 594.
  • the device outputs the mapping application non-touchscreen vehicle interface 595 for display on the vehicle screen. Furthermore, at this stage 1715, the user again selects the menu button 593.
  • the fourth stage 1720 illustrates the result of this selection.
  • the mapping application user interface now displays a menu 1730 with four options 1735-1750. These four options include a search option 1735, a recents option 1740, a bookmarks option 1745, and a predictive routing option 1750. These options correspond to the selectable controls 571-574 of the touchscreen user interface 565, and enable the user to access the various different features of the mapping application on the non-touch display screen.
  • a fifth menu option for tracking the device's current location in the map is available only when the current location is not presently centered on the map. By default, the first menu option 1735 is presently selected. Some embodiments always select the search option (or a different option when the menu options have a different order) upon opening the menu. Other embodiments, however, select the most recently selected option upon activation of the menu.
  • the fourth stage 1720 additionally illustrates the user moving the joystick downwards.
  • the fifth stage 1725 illustrates that the selection marker in the menu has moved down to the recents option 1740, such that a selection input (e.g., pressing down on the joystick) will select the recents option.
  • Some embodiments use up and down to scroll through the menu and use right and left to either select an option (a rightward movement) or go back to a previous page (a leftward movement).
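A sketch of the non-touch menu navigation just described, with up and down moving the selection, right or a press selecting the highlighted option, and left returning to the previous page; the type names are assumptions.

```swift
enum JoystickInput { case up, down, left, right, press }
enum MenuAction { case none, back, selected(String) }

struct Menu {
    var options: [String]
    var selectedIndex = 0

    mutating func handle(_ input: JoystickInput) -> MenuAction {
        switch input {
        case .up:
            selectedIndex = max(0, selectedIndex - 1)
            return .none
        case .down:
            selectedIndex = min(options.count - 1, selectedIndex + 1)
            return .none
        case .right, .press:
            return .selected(options[selectedIndex])   // choose the highlighted option
        case .left:
            return .back                               // return to the previous page
        }
    }
}

var mapMenu = Menu(options: ["Search", "Recents", "Bookmarks", "Predictive Routing"])
_ = mapMenu.handle(.down)             // highlight "Recents"
print(mapMenu.handle(.press))         // selected("Recents")
```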
  • the above figure illustrates menu navigation with the non-touchscreen user interface.
  • Map exploration presents a unique challenge, however, because the map must be scrollable and zoomable, actions that have more natural inputs via a touchscreen with direct manipulation type actions available (e.g., drag gestures, pinch gestures, or touchscreen controls that do not need to conform to those provided by a vehicle system).
  • Figure 18 illustrates one example of such map exploration in a non-touchscreen vehicle user interface 1800 over four stages 1805-1820.
  • the user interface output by the mapping application is the same as in the previous example, but the vehicle controls are different.
  • the vehicle controls include a trackball 1825.
  • the trackball 1825 is a ball that sits in a well, and moves within the well in various directions in response to user input.
  • a user (e.g., a driver or passenger) can rotate the trackball about the vertical axis through the center of the trackball.
  • the user can press down as a separate input and/or "shift" the trackball forwards, backwards, leftwards, or rightwards.
  • the user rolls the trackball 1825 to the right.
  • the rightward scroll of the trackball results in the map shifting leftwards, revealing a portion of the map previously off of the screen to the right.
  • the second stage 1810 illustrates this shift of the map, as the map region (and the position indicator 1830 that shows the device's current location) have moved to the left in the display screen.
  • the second stage 1810 also illustrates that the user is rotating the trackball 1825 clockwise about its center axis.
  • this rotation action of the trackball causes the application to zoom either in or out (e.g., zooming in for clockwise rotation and zooming out for counterclockwise rotation).
  • the mapping application changes the zoom level to zoom in on the map, as illustrated by the third stage 1815.
  • the user selects the menu button 1835.
  • this selection causes the mapping application to display a menu 1840, similar to the menu 1730.
  • the menu includes the same four initial options (search, recents, bookmarks, and predictive routing), but also includes a fifth option 1845 for the current location (because the map is no longer centered on the current location of the device).
  • the selection of the current location option 1845 is similar to the selection of the position control in the touchscreen interfaces, and results in the mapping application centering the map on the current location of the device and subsequently tracking that location.
  • Vehicles with different types of inputs may perform these scroll and zoom actions differently.
  • a device with only a knob for input may initially scroll the map left and right in response to rotations of the knob (e.g., clockwise to scroll the map rightwards and counterclockwise to scroll the map leftwards).
  • subsequent rotation will scroll the map up and down.
  • additional rotation of the knob will zoom in or out of the map.
  • shifting the joystick left, right, up or down causes the map to move in the shifted direction (or opposite the shifted direction) in some embodiments.
  • Some embodiments interpret rotation of the joystick as zoom input, while in other embodiments the joystick can be pushed inwards or pulled outwards to change the zoom level.
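The knob-only interaction could be modeled as a small state machine that cycles between horizontal scroll, vertical scroll, and zoom, with rotation driving whichever mode is active. The mode-switch trigger and the step sizes below are assumptions; the text only says that subsequent or additional rotation changes what the knob controls.

```swift
import Foundation

enum KnobMode: CaseIterable { case scrollHorizontal, scrollVertical, zoom }

struct KnobMapState {
    var mode: KnobMode = .scrollHorizontal
    var offsetX = 0.0, offsetY = 0.0, zoomScale = 1.0

    // Advance to the next mode (the trigger for this is an assumption).
    mutating func nextMode() {
        let all = KnobMode.allCases
        mode = all[(all.firstIndex(of: mode)! + 1) % all.count]
    }

    // Apply a rotation of the knob, measured in detent ticks, to the active mode.
    mutating func rotate(by ticks: Double) {
        switch mode {
        case .scrollHorizontal: offsetX += ticks * 10          // illustrative step size
        case .scrollVertical:   offsetY += ticks * 10
        case .zoom:             zoomScale *= pow(1.1, ticks)   // illustrative zoom step
        }
    }
}
```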
  • the vehicle interface only permits search input via dictation, rather than via a keyboard or other typing mechanism. This is to dissuade the user from attempting to type in locations while driving, though a passenger still has the ability to enter search terms through the mobile device interface via typing.
  • the mobile device that runs the mapping application also has voice recognition functions to perform searches or answer other questions.
  • while the vehicle interface search is a pure dictation search, the voice recognition functions on the device are more interactive. That is, on the mobile device, the user can dictate successive requests, in which the voice recognition functions use context of previous requests to analyze the meaning of subsequent requests.
  • because the vehicle maps interface is optimized towards quickly and efficiently identifying a destination for navigation, each dictated request is treated as a separate search. The following section describes various aspects of the vehicle search interface in greater detail.
  • Figure 8 illustrated the search feature of some embodiments in the touchscreen vehicle user interface.
  • when the search term is unambiguous, the mapping application automatically transitions from the search function page to a map view showing search results for the search term.
  • when the search term is ambiguous, however, the mapping application presents the user with a set of choices for the search.
  • Figure 19 illustrates the search feature in the case of an ambiguous search term over five stages 1905-1925 of the high-quality touchscreen user interface.
  • the search feature in some embodiments, operates the same in both the high-quality and low-quality touchscreen interfaces, and similarly in the non-touchscreen interface, with differences that will be explained in detail below.
  • the first three stages 1905-1915 are the same as the first three stages 805-815 of Figure 8, with the exception of the phrase spoken by the user in the third stage 815.
  • the first stage 1905 illustrates the user selecting the search control 571 in the high-quality vehicle touchscreen interface 565.
  • the second stage 1910 illustrates the search function page 835 that prompts the user to speak her destination and the third stage 1915 illustrates the page as the user speaks. In this case, the user dictates "Home" into the vehicle microphone, which is then processed by the mobile device voice recognition system.
  • the voice recognition processing performs pure dictation, rather than natural language processing. That is, the voice recognition does not attempt to parse out the meaning of phrases or expressions within the user's search request.
  • thus, if the user dictates "go home", the mapping application would look for search terms that match this phrase, rather than identifying "go" as a desire for the application to navigate towards a particular location.
  • some embodiments perform natural language processing for the mapping application search requests received through the vehicle interface. In such embodiments, if the user stated "go home", the application would recognize the user's intention and perform the search shown in Figure 19.
  • the mapping application cannot determine which of two search terms a user has requested through dictation. Because the search term is ambiguous, the search function page 835 displays the terms as searched ("Home") in the search bar 840, with a list of possible search results in the search results region 845. In this case, the user has two locations stored as "Home", one in Los Angeles and one in San Francisco.
  • the search feature initially searches a set of databases that store information relevant to the user of the device.
  • These databases include the user's stored contact information (e.g., address book records), recent searches performed within maps (i.e., results available through the recents feature of the interface), stored bookmark information (i.e., results available through the bookmarks feature of the interface), as well as harvested address and location information.
  • the mobile device running the mapping application is one of several devices connected via a user account. These devices share information via a content-sharing service, and this content-sharing service (with permissions from the user) identifies addresses and/or possible destinations from sources such as the user's e- mail, messages, web browsing, etc.
  • the search function uses a more general search engine to identify locations within the user's vicinity.
  • the databases find multiple results for "Home", as the user has two separate addresses stored for their home.
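The local-databases-first flow described above might look roughly like the sketch below, where user-specific sources (contacts, recents, bookmarks, harvested addresses) are consulted before a general search engine; the protocol and function names are assumptions.

```swift
// Query local, user-specific destination sources first; fall back to a general
// search engine only when those databases return nothing.
protocol DestinationSource {
    func matches(for query: String) -> [String]
}

func searchDestinations(query: String,
                        localSources: [DestinationSource],
                        searchEngine: DestinationSource) -> [String] {
    let localMatches = localSources.flatMap { $0.matches(for: query) }
    if !localMatches.isEmpty {
        // May contain several entries, e.g. two stored "Home" addresses.
        return localMatches
    }
    return searchEngine.matches(for: query)
}
```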
  • at the fourth stage 1920, the user selects the Los Angeles home address.
  • the fifth stage 1925 displays the search result map view page 852, with a pin 1930 displayed at the location of the user's Los Angeles home.
  • the mapping application centers the map about this dropped pin 1930, and the lower information bar 870 displays information about the location, including the name ("Home"), the address ("1919 Foothill Drive"), the estimated travel time to the location, and a selectable control 880 for entering the turn-by-turn navigation mode of the mapping application with the selected location as the destination.
  • the upper information bar 865 displays the search term as translated by the voice recognition functions ("Home"), along with a cancel control 1935 (to return to the standard map exploration interface) and a search again control 875 to return to the search function page 835 displayed in the second stage 1910 of this figure.
  • Figure 19 illustrates ambiguous results identified within the databases searched by the mapping application search function (specifically, the address book information). In some embodiments, ambiguities may also arise from search engine results (e.g., when a user dictates a street address that may resolve to addresses in multiple different cities). Figure 20 illustrates such a case of ambiguous results returned by the search engine.
  • Figure 20 illustrates the search feature of the mapping application identifying ambiguous results for an address search over five stages 2005-2025 of the high-quality vehicle touchscreen interface.
  • the first three stages 2005-2015 are the same as the first three stages of Figures 8 and 19, with the user selecting the search control 571, the search function page 835 appearing, and the user dictating a search query.
  • the user dictates "1200 Main St.”.
  • the user is driving in the Orange County area of Southern California.
  • the mapping application cannot determine whether the user is searching for the 1200 Main St. located in Huntington Beach or the 1200 Main St. located in Irvine.
  • the fourth stage 2020 illustrates that the search function page 835 displays these two possible results in the search results region 845.
  • the user selects the address located in Huntington Beach, and the fifth stage 2025 illustrates that the mapping application displays a map centered at this location, with a pin 2030 displayed at the requested address.
  • Figure 21 illustrates an example of such a search that produces multiple results over eight stages 2105-2140. As with the previous examples, the first three stages 2105-2115 are the same, with the exception of the user's dictated search query. As shown at the third stage 2115, the user dictates the search "Coffee".
  • the search is not ambiguous.
  • the mapping application does not find any results in the local databases (e.g., recents, address book, etc.), and therefore performs a search for coffee in the region surrounding the device location. While the search finds multiple results, the search term itself ("coffee") is not ambiguous. Instead, the mapping application search feature recognizes this as a more generic search looking for one or more results for a particular type of location (coffee shops).
  • the application automatically transitions to the search results map view page 852, with the search term "Coffee" appearing in the upper information bar 865.
  • This search term produces six search results within the region searched (e.g., within a particular distance of the device, or drivable within a particular duration from the location of the device).
  • the map region displays a map with a pin for each of the six search results, and one of these pins displayed as selected.
  • Some embodiments automatically initially select the search result closest to the device's current location. In some embodiments, other factors may affect the ordering (i.e., ranking) of the search results returned by a search engine (e.g., a search server).
  • the search engine may use popularity (by review quality, or by number of people reviewing a location) and relevance in addition to distance when determining the ordering of search results. For example, when the user searches for coffee, the closest result in terms of proximity might be a restaurant that happens to serve coffee, but the search engine may return a coffee shop located slightly further from the current location instead, based on a combination of relevance and popularity.
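  • A minimal Swift sketch of how such an ordering might combine distance, popularity, and relevance (the type names, properties, and weights below are assumptions, not the actual search engine's method):

      struct SearchResult {
          let name: String
          let distanceMiles: Double   // distance from the device's current location
          let popularity: Double      // e.g., normalized review score, 0.0 - 1.0
          let relevance: Double       // e.g., match quality for the search term, 0.0 - 1.0
      }

      // Lower scores rank earlier; nearby, popular, relevant results come first.
      func rankedResults(_ results: [SearchResult]) -> [SearchResult] {
          func score(_ r: SearchResult) -> Double {
              let distancePenalty = r.distanceMiles
              let popularityBonus = 2.0 * r.popularity
              let relevanceBonus = 3.0 * r.relevance
              return distancePenalty - popularityBonus - relevanceBonus
          }
          return results.sorted { score($0) < score($1) }
      }

  • Under this kind of scoring, a slightly farther but highly relevant and well-reviewed coffee shop can outrank a closer restaurant that merely serves coffee, as in the example above.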
  • the lower information bar 870 displays the information for the selected destination (in this case, Curt's Coffee, which is three minutes away), and the selectable control for entering navigation.
  • the application displays a left arrow control 2145 and a right arrow control 2150, with the text "1 of 6" located between the arrows.
  • the vehicle interface includes a mechanism for stepping between the search results. As shown in subsequent stages of this figure, selection of the arrow controls 2145 and 2150 causes the application to step through the search results.
  • the user makes a particular type of selection (in this case, a double tap) of the currently selected pin 2155 for Curt's Coffee.
  • the fifth stage 2125 illustrates that this double-tap selection of the pin 2155 results in the mapping application centering the selected pin in the map region and zooming in on the pin as well.
  • the fifth stage 2125 illustrates the same user interface except for the map region now centered on the pin and at a closer zoom level such that less of the surrounding region is displayed.
  • After such a selection (e.g., a double tap within the vicinity of a pin) zooms in on the pin, a second such selection causes the application to zoom back out to show all of the results.
  • the user cannot zoom in and out in this way when only one search result is found.
  • the user selects (via a tap input) the right arrow control 2150.
  • the seventh stage 2135 illustrates that the mapping application has selected the second closest result to the user's location, Killer Coffee, and shifted the map region so that the pin 2160 for this location is now in the displayed portion of the map region.
  • some embodiments center each particular pin upon stepping through the search results to the particular pin and selecting the particular pin.
  • Other embodiments modify the map display so that the pin is within a particular threshold of the center and the center of the displayed map region is equidistant from the selected pin and the current location.
  • the seventh stage also illustrates the user again selecting the right arrow control 2150 (again with a tap input).
  • the mapping application selects the third closest result to the user's location, Joe's Drowsy Cure, and shifts the map region so that the pin 2165 for this location is now visible within the map display. Furthermore, in order to include both the current location of the device and the selected pin 2165, the mapping application zooms out from the map while shifting the map region.
  • the above examples illustrate the mapping application search feature for the touchscreen vehicle user interfaces.
  • the only difference between the high-quality touchscreen interface search feature and the low-quality touchscreen interface search feature is the size of the buffers for touch selectability around the various controls (e.g., the pins, the right and left arrow controls for stepping through the search results, etc.).
  • the differences in buffer size of some embodiments are described above by reference to Figure 6.
  • Figure 22 illustrates the operation of the mapping application search feature for the non-touchscreen vehicle user interface of some embodiments, over six stages 2205-2230.
  • the first stage 2205 and second stage 2210 are the same as the third and fourth stages 1715 and 1720 of Figure 17, described above.
  • the user performs a selection input (e.g., pushing the joystick 592 inwards, pushing a knob inwards, pressing a particular hardware button, performing a specific input on a trackball, etc.) with the search menu option 1735 selected.
  • this selection activates the search feature, causing the display of a search function page 2235.
  • the search function page 2235 is analogous to the touchscreen search function page 835, and prompts the user to dictate a destination along with examples. However, the page does not have a cancel option, because there is no touchscreen. Instead, in some embodiments the user can cancel the search by pressing the physical back button 2240 or exit button 2245.
  • the search function page has the same waveform animation as in the touchscreen interface, but does not include a touchscreen button to indicate completion of the dictation.
  • the mapping application can recognize when the user has completed speaking in some embodiments.
  • the search function page 2235 displays instructions to the user to press the menu button upon completion of the dictation.
  • the user can perform different inputs via the built-in vehicle controls in order to end the search (e.g., pressing inwards or upwards on the joystick, pushing a knob, etc.).
  • the fifth stage 2225 illustrates that the search algorithms function in the same way irrespective of the interface.
  • the same search for coffee with the device in the same location gives the same six results as in the touchscreen interface, in the same order.
  • the fifth stage 2225 illustrates the map region displayed in the non-touchscreen interface, with six pins representing the six search results, and the closest location to the device's current location selected.
  • the non-touchscreen interface search results map view 2250, unlike the search results map view 852 of the touchscreen interface, does not include touchscreen buttons such as the cancel, search again, or enter navigation controls. Instead, the mapping application relies on user interaction with the built-in vehicle controls to perform these actions (e.g., pressing inwards on a joystick to enter navigation, using the back or exit buttons to respectively search again or cancel the search function).
  • the lower information bar in the search results map view 2250 is similar to that in the touchscreen interface.
  • the user interface additionally does not have selectable arrows on either side of the text "1 of 6", which indicates the number of search results; the results can instead be stepped through via non-touchscreen input. In this case, the user rotates the joystick 592 clockwise.
  • the mapping application selects the second search result (as ordered by the search engine, using various factors, such as proximity to the current location of the device, popularity, relevance, etc.).
  • the mapping application search function initially searches various local databases, then uses a search engine to perform a local search when unable to find a result via the local databases.
  • Figure 23 conceptually illustrates a process 2300 performed by the mapping application of some embodiments in order to retrieve and display search results for a search dictated through the vehicle user interface.
  • the process 2300 begins by receiving (at 2305) voice dictation audio captured by a microphone in the vehicle.
  • the user first activates a search function through the vehicle interface of the mapping application, which prompts the application to receive the audio recorded by the vehicle microphone.
  • Using the vehicle microphone (which, in some embodiments, must have the ability to record audio of at least a threshold quality) enables the user to avoid the need to pick up the mobile device and speak into the device microphone.
  • other embodiments use a microphone in the mobile device to record the voice dictation.
  • the process 2300 processes (at 2310) the dictated audio to determine the search terms.
  • the mobile device includes voice recognition software leveraged by the application in order to identify the search terms spoken by the user.
  • the mapping application connects to a voice recognition server that performs the voice analysis.
  • some embodiments perform dictation translation that does not account for natural language phrasing.
  • some embodiments parse phrases to determine the user's desired search. For instance, when the user says "go home”, the voice recognition process identifies that the user is searching for "home”, and can ignore the "go”.
  • when a user dictates "Find the best coffee", some embodiments identify "coffee" as the search term, and "best" as a modifier indicating that quality ratings should be used for identified locations in the area.
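  • A minimal Swift sketch of this kind of phrase handling (the filler-word list and names are assumptions; a real system would rely on the device's voice recognition and natural language services rather than simple word filtering):

      struct ParsedSearch {
          var terms: String
          var preferHighQuality: Bool   // set when the user asks for the "best" results
      }

      // Strips navigation phrasing (e.g., "go", "find the") and detects a quality modifier.
      func parseDictatedSearch(_ dictation: String) -> ParsedSearch {
          let fillerWords: Set<String> = ["go", "to", "find", "the", "a", "some"]
          var preferHighQuality = false
          var kept: [String] = []
          for word in dictation.lowercased().split(separator: " ") {
              let token = String(word)
              if token == "best" {
                  preferHighQuality = true
              } else if !fillerWords.contains(token) {
                  kept.append(token)
              }
          }
          return ParsedSearch(terms: kept.joined(separator: " "),
                              preferHighQuality: preferHighQuality)
      }

      // parseDictatedSearch("Find the best coffee") -> terms: "coffee", preferHighQuality: true
      // parseDictatedSearch("go home")              -> terms: "home",   preferHighQuality: false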
  • these local databases include the user's address book (e.g., information about the user's contacts) and addresses and/or destinations harvested from other applications. These harvested destinations may include addresses or destinations in e-mails, SMS messages, etc. either sent or received by the user.
  • the harvested destinations include in some embodiments addresses either specifically sent to the mapping application from a web browser or other application (such as Yelp®) on the mobile device or on web pages or locations navigated to with the web browser or other application.
  • these addresses are not restricted to those identified by the mobile device connected to the vehicle interface.
  • the mobile device may connect to an information- and/or content-sharing service (e.g., iCloud®) that enables the sharing of content and information between multiple devices connected by an account.
  • destinations may be harvested from all of the user's devices connected to the account, and distributed to the mobile device on which the mapping application operates for use by the mapping application search feature.
  • some embodiments also include recent search results from previous searches by the mapping application in the initial search of local databases.
  • the process 2300 determines (at 2320) whether at least one result is found in the local databases. As described below, when no results are found via this search, the process proceeds to 2340 to perform a search using a search engine. When at least one result is found by the local database search, the process 2300 determines (at 2325) whether multiple results are found within the local database search. In some embodiments, the search function treats multiple results identified in the local databases as ambiguous results.
  • the process displays (at 2330) the set of results as selectable options in a list.
  • Figure 19 is an example of such a search result, in which two results for the search "home" are found.
  • Other searches that might turn up ambiguous results in the local databases include "work” when the user works multiple jobs or has multiple work locations, or "Bobby's house", when the address book has information for multiple people named Bobby.
  • the process receives (at 2332) a selection of one of the displayed options. While this is shown as a linear process, one of ordinary skill in the art will recognize that the application may receive a selection of a cancel option, or may wait an indeterminate amount of time between operation 2330 and operation 2332 until the user makes a selection.
  • the process displays (at 2334) the selected option on the map.
  • when only one result is found in the local databases, the process instead displays (at 2335) the singular result on the map view, without displaying a list of results.
  • the mapping application of some embodiments displays a pin at the identified location, and provides the user with a control for entering a turn-by-turn navigation mode with the identified location as the destination.
  • the process performs (at 2340) a search for the dictated search terms using a search engine.
  • In some embodiments, this is a search engine also available to a web browser on the mobile device, while in other embodiments the mapping application uses a different search engine.
  • the mapping application constrains the search by location so that only results within a particular proximity of the device's current location will be found. For example, the application may constrain the search to a particular distance (e.g., a 5-mile radius, 10-mile radius, etc.) or a particular travel time (e.g., only locations that the user could reach by car within 15 minutes, 30 minutes, etc.). Some embodiments initially use a small distance or time constraint, then start increasing the constraint until the distance or travel time is large enough for at least one result to be identified.
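  • A minimal Swift sketch of the expanding constraint (the specific radii and the search callback are assumptions; a travel-time constraint could be expanded in the same manner):

      // Repeatedly widens the search radius until the search engine returns at least one result.
      func searchWithExpandingConstraint(
          query: String,
          radiiInMiles: [Double] = [5, 10, 25, 50],
          search: (String, Double) -> [String]
      ) -> [String] {
          for radius in radiiInMiles {
              let results = search(query, radius)
              if !results.isEmpty {
                  return results
              }
          }
          return []
      }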
  • the process determines (at 2345) whether the search results are ambiguous. In the case of the search engine results, this does not determine simply whether one or more result was returned. Instead, search results are ambiguous if the application cannot determine with a threshold confidence what result the user desires.
  • the process displays (at 2350) the set of results as selectable options in a list, as was the case for multiple results identified within the local databases. For instance, in the example of Figure 20, the user searches for a particular address without specifying a city. As this happens to be an address in two different cities nearby, the results are ambiguous because the user is clearly searching for a single location. Similarly, if the user were to dictate "Curt's Coffee", but two different unrelated coffee shops nearby had the names "Curt's Coffee" and "Kurt's Koffee", some embodiments would display these results in a list as ambiguous, allowing the user to select one.
  • the process receives (at 2352) a selection of one of the displayed options. While this is shown as a linear process, one of ordinary skill in the art will recognize that the application may receive a selection of a cancel option, or may wait an indeterminate amount of time between operation 2350 and operation 2352 until the user makes a selection. Upon receiving the selection, the process displays (at 2354) the selected option on the map.
  • the process determines (at 2355) whether the search engine has yielded more than one result for an unambiguous search. When only one result is returned, the process displays (at 2360) the singular result on the map view, without displaying a list of results. As shown in Figure 8, the mapping application of some embodiments displays a pin at the identified location, and provides the user with a control for entering a turn-by-turn navigation mode with the identified location as the destination.
  • when more than one result is returned, the process displays (at 2365) the returned set of locations on the map along with controls for stepping through the locations.
  • Figures 21 and 22 are examples of such searches for different interfaces, with the controls displayed differently for the different vehicle interfaces.
  • the mapping application displays pins at each of the locations, and the user can select the different pins with the stepping controls. With any of the pins selected, the user can select an option to enter turn-by-turn navigation with the selected location as the destination.
  • the process ends. In the cases in which the process displays a set of selectable options in a list, the application proceeds to the map view display after the user selects one of the possible results.
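  • The branching of process 2300 can be summarized by the following Swift sketch (the function names, callback signatures, and result types are assumptions standing in for the local database search, the search engine, and the display operations described above):

      struct Location {
          let name: String
          let latitude: Double
          let longitude: Double
      }

      enum SearchOutcome {
          case showOnMap([Location])        // one result, or multiple unambiguous results with stepping controls
          case chooseFromList([Location])   // ambiguous results presented as a selectable list
      }

      // Mirrors operations 2315-2365: local databases first, then the search engine as a fallback.
      func resolveDictatedSearch(
          terms: String,
          searchLocalDatabases: (String) -> [Location],
          searchEngine: (String) -> (results: [Location], isAmbiguous: Bool)
      ) -> SearchOutcome {
          let localResults = searchLocalDatabases(terms)
          if !localResults.isEmpty {
              // Multiple local matches (e.g., two "Home" addresses) are treated as ambiguous.
              return localResults.count > 1 ? .chooseFromList(localResults) : .showOnMap(localResults)
          }
          let engineSearch = searchEngine(terms)
          return engineSearch.isAmbiguous ? .chooseFromList(engineSearch.results)
                                          : .showOnMap(engineSearch.results)
      }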
  • Figure 24 illustrates an example of a user performing a search within the mapping application mobile device interface while the mobile device is connected to a vehicle interface. Specifically, the figure illustrates three stages 2405-2415 of both interfaces in which the user initiates a search function, performs a search, and receives search results displayed on both interfaces.
  • the first stage 2405 illustrates a mobile device 2400 displaying the mapping application user interface 205 of some embodiments, as described above by reference to Figure 2, as well as the high-quality vehicle user interface 565. Though not shown, the mobile device is currently connected to a vehicle that includes a touchscreen interface for displaying the vehicle UI 565. In this first stage 2405, the user selects the search field 240 to initiate a search operation on the mobile device.
  • the user has utilized a touchscreen keyboard 2420 (not available in the vehicle user interfaces of some embodiments) to type a search for "Pizza" into the search field 240.
  • the search field 240 displays various autocompleted options for selection by the user.
  • the search field displays a selectable option of "Pizza", which the user selects via a touch input.
  • the mapping application does not modify the vehicle user interface 565.
  • the third stage 2415 displays four dropped pins in the mobile device interface 205, corresponding to the search results for a generic search for pizza.
  • the mapping application changes the zoom level of the map view in the mobile device UI 205 in order to display all of the search results.
  • the mapping application also selects the first search result (for a particular pizza location), and displays an expandable banner 2425 for the result.
  • the mapping application also directs the vehicle user interface 565 to display the search results, as if the search had been performed using the vehicle interface.
  • the mapping application allows separate map exploration in the two interfaces, and separate access of various features (e.g., the search interface, recents or bookmarks, predicted routes, etc.).
  • However, when interaction with one of the interfaces produces a result that corresponds to a possible destination (e.g., a selection of a recent location, a selection of a bookmarked location, a search result, etc.), the application synchronizes the interfaces to display the same possible destinations in the two interfaces.
  • the control for accessing the recent destinations feature was discussed.
  • the touchscreen interfaces include a recents control selectable from the primary map view, and the non-touch interface includes a recents option in the primary mapping application menu.
  • the following will discuss additional aspects of the recent destinations feature of some embodiments.
  • Figure 7 above illustrated that when the user selects the recents control in a touchscreen interface (or selects the recents menu option in a non-touchscreen interface), the mapping application displays in that interface a list of destinations recently searched in the mapping application.
  • these recent destinations may include destinations that have appeared as search results in the mapping application when searched through the mobile device interface of the mapping application.
  • Figure 25 illustrates three stages 2505-2515 of the mapping application of some embodiments in which a user performs a search on the mobile device before connecting the device to the vehicle system, then accesses the recent destinations on the vehicle interface of the mapping application which now include the previously-searched destination.
  • the first stage 2505 illustrates a person walking towards a vehicle 2500 while carrying a mobile device 2520 on which the mapping application of some embodiments operates.
  • In the mapping application mobile device UI 205, the user has searched for "Cool Coffee", and the mapping application displays a search result pin for the identified location in the map view.
  • the user has entered the vehicle 2500 and begun driving.
  • the user has connected her mobile device 2520 to the vehicle display system.
  • the display screen of the vehicle 2500 displays the mapping application UI 565.
  • when the device is connected while the mapping application is operating, the device automatically sends the mapping application vehicle user interface to the vehicle display screen.
  • In other embodiments, upon connection to the vehicle system, the device operating system automatically sends the device and vehicle user interfaces to their respective home pages.
  • the third stage 2515 illustrates the recent destinations page 735 of some embodiments.
  • the recent destinations page 735 includes the Cool Coffee Shop, which was the search result identified by the mapping application based on a search through the mobile device interface 205 before the connection of the mobile device to the vehicle system.
  • Figure 26 illustrates three stages 2605-2615 of the mapping application of some embodiments in which a user performs a search on the mobile device before connecting the device to the vehicle system, then opens the mapping application on the vehicle interface.
  • the first stage 2605 illustrates a person walking towards a vehicle 2600 while carrying a mobile device 2620 on which the mapping application of some embodiments operates.
  • In the mapping application mobile device UI 205, the user has searched for "Cool Coffee", and the mapping application displays a search result pin for the identified location in the map view.
  • the user has entered the vehicle 2600 and begun driving.
  • the user has connected her mobile device 2620 to the vehicle display system.
  • the display screen of the vehicle 2600 displays the home page user interface 550 for vehicle touchscreens.
  • the device automatically returns to its home page as well upon connection to the vehicle system.
  • the user selects the map control 563 to enter the mapping application, which also causes the mobile device display to display the mapping application (not shown).
  • the third stage 2615 illustrates the result of entering the mapping application, in that the application automatically displays a search results map view page (showing the one result for the Cool Coffee Shop) in the vehicle display for the recently performed search.
  • the recent destinations page displays destinations from sources other than mapping application searches. For example, some embodiments display recently harvested locations from e-mails, SMS messages, etc. (as described above in relation to the search feature) as recent destinations. In addition, some embodiments enable the user to add destinations and addresses to the mapping application.
  • Figure 27 illustrates three stages 2705-2715 of a mapping application in which a user adds a location to the mapping application from a web browser on the mobile device, which adds the location to the mapping application table of recent destinations.
  • the first stage 2705 illustrates a mobile device 2700 on which a web browsing application interface 2720 is displayed.
  • the web browsing application interface 2720 currently displays a web page for the Tee-Cake Bakery.
  • the first stage (and subsequent stages) illustrates a recents table 2725 stored by the mapping application.
  • This table indicates what the mapping application will display when the user selects a recents control in the mapping application user interface.
  • the recents table 2725 includes two entries, for Hamburger Palace and Pot Sticker's Restaurant.
  • the user selects a contact information tab 2740.
  • the web browsing application interface 2720 displays various contact information for the Tee-Cake Bakery, including a physical location address.
  • the user presses and holds over the address in order to select the text that forms the address.
  • the recents table 2725 is unchanged from the first stage 2705.
  • the web browsing application interface displays a menu 2730 with several options relating to the selected text.
  • the menu includes a copy option, as well as several options based on recognition by the device (e.g., a function of the device operating system or web browsing application) that the selected text is an address.
  • These menu options include an option to add the address to contacts as well as an option 2735 to add the address to the mapping application operating on the mobile device. The user selects this last option 2735 in the third stage 2715.
  • the mapping application recents table 2725 now includes a third entry, for the Tee-Cake Bakery.
  • the entry for the bakery has been added to the top of the table, which represents the order in which the recent destinations will be displayed when a user selects the recents control in the vehicle user interface.
  • when the added destination is not already in the table, the application adds it at the top of the table.
  • when the destination is already present in the table, the application resets the access time for the destination, placing it at the top of the table.
  • While the recent destinations page of some embodiments displays recent destinations in order from most recently added to earliest added by default, various factors can cause the application to rearrange the display order of the recent destinations. For instance, some embodiments identify when a first location is a more likely destination than a second location, and place the first location above the second location in the recents list.
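  • A minimal Swift sketch of such a recents table (the names are assumptions): new entries are added at the top, re-added entries have their access time reset, and the display order can optionally be adjusted by a likelihood value supplied by a predictive engine:

      import Foundation

      struct RecentDestination {
          let name: String
          var lastAccessed: Date
          var likelihood: Double = 0   // optionally set by a predictive engine
      }

      struct RecentsTable {
          private(set) var entries: [RecentDestination] = []

          // New destinations go to the top; existing ones have their access time reset.
          mutating func add(_ name: String) {
              if let index = entries.firstIndex(where: { $0.name == name }) {
                  entries[index].lastAccessed = Date()
              } else {
                  entries.insert(RecentDestination(name: name, lastAccessed: Date()), at: 0)
              }
              entries.sort { $0.lastAccessed > $1.lastAccessed }
          }

          // Display order: most likely first, falling back to most recently accessed.
          func displayOrder() -> [RecentDestination] {
              entries.sorted {
                  $0.likelihood != $1.likelihood
                      ? $0.likelihood > $1.likelihood
                      : $0.lastAccessed > $1.lastAccessed
              }
          }
      }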
  • Figure 28 illustrates a recent destinations page 735 of some embodiments over three stages 2805-2815 in which the vehicle displaying the recent destinations page 735 travels over a stretch of road.
  • Each stage illustrates, in addition to the recent destinations page 735, a conceptual map 2800 on which the vehicle location (and therefore the location of the device connected to the vehicle display system) is shown.
  • This map 2800 does not represent a user interface display, but rather indicates the location of the vehicle on the shown streets.
  • the first stage 2805 illustrates the current location 2820 of the mobile device and vehicle to which it connects in the map 2800.
  • the recent destinations page 735 includes two recently searched destinations (Hamburger Palace and Pot Sticker's Restaurant), both located in Los Angeles.
  • the map 2800 additionally includes a pin drawn to represent a location of the first of these listed destinations (Hamburger Palace). Further drawn on the map are a solid arrow representing the current direction of travel of the vehicle (and device) as well as a dashed arrow representing an optimal route to the most likely of the recent destinations. Because Hamburger Palace is the closest of the recent destinations, and the direction of travel projected forward overlaps with a portion of the optimal route to Hamburger Palace, the application lists this destination first in the recent destinations page 735.
  • the vehicle approaches the intersection at which the route to Hamburger Palace diverges from the current travel direction (i.e., the intersection at which the vehicle would turn right to travel to Hamburger Palace).
  • the displayed recent destinations page 735 remains the same.
  • the third stage 2815 illustrates the map 2800 and the recent destinations page 735 after the user has passed the turn towards Hamburger Palace.
  • the map now shows a route to Pot Sticker's Restaurant, which overlaps with the projection of the current travel direction.
  • Hamburger Palace, on the other hand, is located behind the current location of the vehicle in its current direction of travel.
  • the mapping application determines that Pot Sticker's Restaurant is a more likely destination for the user, and moves this destination ahead of Hamburger Palace in the recent destinations page 735.
  • a predictive routing engine of the mapping application performs the determination of which destination is more likely based on various factors, including direction of travel, routes to the different possible recent destinations, frequency of previous travel to the destination, etc.
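  • A minimal Swift sketch of the kind of scoring such a determination might use (the specific weights for heading overlap, visit frequency, and distance are assumptions):

      struct CandidateDestination {
          let name: String
          let routeOverlapsCurrentHeading: Bool   // does the projected travel direction lie on the route?
          let distanceMiles: Double
          let previousVisits: Int
      }

      // Higher scores are listed first in the recent destinations page.
      func likelihoodScore(for candidate: CandidateDestination) -> Double {
          var score = 0.0
          if candidate.routeOverlapsCurrentHeading { score += 10 }   // traveling toward it
          score += Double(candidate.previousVisits)                  // frequently traveled destination
          score -= candidate.distanceMiles * 0.5                     // prefer closer destinations
          return score
      }

      func orderedDestinations(_ candidates: [CandidateDestination]) -> [CandidateDestination] {
          candidates.sorted { likelihoodScore(for: $0) > likelihoodScore(for: $1) }
      }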
  • the mapping application of some embodiments only synchronizes the two interfaces (mobile device and vehicle display) once the user behavior results in a selected destination.
  • Figure 29 illustrates a user utilizing the recents feature on a mobile device in order to select a destination over three stages 2905-2915 of both the mobile device mapping application UI 205 and the high-quality touchscreen interface 565.
  • both of the mapping application interfaces 205 and 565 display their respective standard map displays.
  • the user selects the search field 240 to initiate a search by the mapping application in the mobile device interface 205.
  • the second stage 2910 illustrates that the mapping application vehicle interface 565 remains static while the mobile device interface 205 displays a recents list.
  • the application displays a list of recent destinations, routes, etc. until the user begins typing in a search query.
  • the list of recents includes a previously requested route from a particular coffee shop to a mini-golf course as well as a previously searched address (rather than a route).
  • the user selects the destination 123 Opal Cove from the recents list.
  • the third stage 2915 illustrates the result of this selection in both of the mapping application interfaces.
  • the mapping application displays a pin at the selected address as well as an expandable banner 2920 for the result.
  • This banner 2920 includes a selectable item 2925 that enables the user to command the application to enter turn-by-turn navigation with 123 Opal Cove as the destination.
  • the mapping application displays the search results map view with a pin 2930 for 123 Opal Cove as a selected destination. In this case, the application displays the selected destination in the center of the map view in both UIs.
  • Figure 30 illustrates the corresponding action taken within the vehicle user interface 565 over three stages 3005-3015.
  • the first stage 3005 displays the same mobile device UI 205 and vehicle user interface 565 for the mapping application as in the first stage of Figure 29. In this case, however, the user selects the recents control 572 in the vehicle user interface 565.
  • the mapping application outputs the recent destinations page 735 to the vehicle display.
  • whereas the mobile device interface displayed both a location (123 Opal Cove) and a route (Cool Coffee Shop to Putt-Putt Golf Course), the vehicle user interface recent destinations page 735 only displays locations (i.e., possible destinations for navigation).
  • the mapping application does not display recent routes in the vehicle user interface, instead only displaying locations within the recent destinations page.
  • when a recent result is a route, some embodiments display the destination of the route in the recents list as a possible destination. Some embodiments display these destinations as options irrespective of whether the searched route start point was the user's current location or a different location.
  • the user selects the destination 123 Opal Cove from the recents destinations page 735.
  • the third stage 3015 illustrates that the result of this selection in both of the mapping application interfaces is the same as when the destination was selected from the recents list on the mobile device.
  • the vehicle interface displays the search results map view with 123 Opal Cove as the selected destination, while at the same time the mobile device interface displays a pin at the selected destination as well as the expandable banner 2920.
  • the mapping application of some embodiments uses a predictive routing engine, which may be a part of the mapping application or operating separately from the mapping application on the mobile device.
  • the predictive routing engine can (1) formulate predictions about current or future destinations and/or routes to such destinations and (2) relay information to the user of the device about the predictions.
  • the predictive routing engine includes a machine-learning engine that facilitates the formulation of predicted future destinations and/or routes based on stored, user-specific data.
  • the user-specific data may include any combination of the following in various embodiments: (1) previous destinations to which the user has traveled with the device, (2) previous routes taken by the user with the device, (3) locations and times of events stored in scheduling applications (e.g., electronic calendar, electronic ticket repository, etc.), and (4) addresses parsed from e-mails and/or messages sent to or from the user.
  • the mapping application can output a series of predicted destinations and routes to those destinations when requested by the user. Through this interface, the user can view the different destinations and routes, and enter navigation towards one of the destinations if desired.
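  • The user-specific data listed above might conceptually be gathered into a structure such as the following Swift sketch (the types and names are assumptions, and the machine-learning engine itself is not represented here):

      import Foundation

      struct ScheduledEvent {
          let title: String
          let location: String
          let startTime: Date
      }

      struct PredictionInputs {
          let previousDestinations: [String]       // destinations to which the user has traveled with the device
          let previousRoutes: [[String]]           // ordered location names for routes previously taken
          let scheduledEvents: [ScheduledEvent]    // from calendars, ticket repositories, etc.
          let harvestedAddresses: [String]         // parsed from e-mails and/or messages
          let currentTime: Date
          let currentLocation: (latitude: Double, longitude: Double)
      }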
  • Figure 31 illustrates the use of this predictive routing interface on the vehicle display over four stages 3105-3120 in which the user activates the interface and views several different predicted destinations.
  • the first stage 3105 illustrates the high-quality vehicle touchscreen interface 565 of some embodiments.
  • the user selects the predictive routing control 574 to activate a predictive routing feature.
  • the second stage 3110 illustrates the predictive routing user interface 755 of some embodiments.
  • the predictive routing UI 755 displays, in order of likelihood as computed by the predictive routing engine, one or more likely destinations and routes to those destinations. As described above by reference to Figure 7, the predictive routing UI 755 displays, for each predicted destination, a first panel 760 with information about the destination and a second panel 765 with a map view of a route to the destination.
  • the information panel 760 displays a name for a current destination/route as well as metadata about the destination or route. This name may be based on a destination name identified from contact information, a calendar appointment name, a business name, an address, a primary road used by the route, etc.
  • the displayed metadata may include an estimated arrival time, the source of the route for the predictive routing engine (i.e., how the predictive routing engine determined the destination as a likely end point for the current trip), whether the destination is a frequent or less traveled destination for the device (and therefore the user of the device), etc.
  • the map view 765 displays a map with the current location of the device, the predicted destination, and a route generated from the current location to the predicted destination.
  • the mapping application factors in traffic information to select an optimal route to the destination, and may warn a user if a typically-traveled route has worse traffic than usual, or if the user will be late to an appointment due to traffic.
  • the predictive routing engine has determined that the user's most likely destination is her Aunt Sally's house.
  • the information panel 760 for this destination includes a name for the destination (Aunt Sally's), an estimated time of arrival (2:30), and metadata indicating that this is a frequent destination for the user. Therefore, because the user regularly travels to her Aunt Sally's home, and is on a path regularly taken for that destination, the predictive routing engine concludes that this is a likely destination for the user's current travel.
  • the information panel 760 displays a selectable control 3130 for canceling the predictive routing feature and returning to the standard map exploration view. In some embodiments, selecting the control 3130 removes the currently predicted route from the set of routes, causing a route for a different predicted destination to appear in the display. In this case, each predicted route has its own selectable cancellation control.
  • the map view 765 displays a route from the current user location to Aunt Sally's house.
  • the predictive routing user interface 755 includes a predictive routing information bar 3125.
  • the predictive routing information bar 3125 includes a set of circles that indicate (1) the number of predicted likely destinations and (2) which of these is currently displayed in the interface 755.
  • the application displays three circles 3135 to indicate that there are three available predicted destinations.
  • the first of these circles is filled in to indicate that the application currently is displaying information about the first (and therefore most likely according to the predictive routing engine) of these destinations. Because additional predicted destinations are available, the application displays a portion of the next destination at the right edge of the display screen. In some embodiments, when only one destination is predicted by the predictive routing feature, the application does not display the circles 3135, and in some such embodiments, eliminates the information bar 3125 altogether.
  • the application displays the predicted routes in a filmstrip view, with two conceptual frames for each route.
  • the first frame, or panel, for a route displays the metadata (e.g., name of destination, other data) and the second frame displays the map view.
  • the metadata panel for the second route comes after the map panel for the first route.
  • the second stage 3110 additionally shows a condensed view 3140 of such a filmstrip, with the primary display of this stage illustrating the actual display of a portion of that filmstrip within the display screen.
  • the condensed view illustrates three pairs of frames, each pair including a route metadata panel and a route map panel.
  • the user performs a left swipe gesture on the vehicle touchscreen display.
  • the user can utilize such gestures (or similar touch gestures) in order to select different predicted destinations.
  • the third stage 3115 displays the second of the predicted destinations, Uncle Harry's, and a route to this predicted destination.
  • the application animates this transition by sliding the first destination information off to the left while concurrently sliding the information for the second destination onto the display. This simulates the sliding of the filmstrip shown in condensed view 3140 leftwards in order to show the second pair of panels in the display.
  • this new destination is the user's Uncle Harry's house, for which the estimated arrival time is 2:40.
  • the metadata also indicates that the destination is a rare destination for the user, but is based on a calendar event at the location.
  • the second of the three circles, rather than the first, is now filled in. Furthermore, the cancellation control 3130 remains within the information panel 760.
  • the user again performs a leftward swipe gesture. As shown at the fourth stage 3120, this again causes the application to slide the information for the second destination to the left and indicate a third predicted destination.
  • This third destination has a name of Dentist, an ETA of 2:43, and is a rare destination for the user.
  • the metadata indicates that the user has an appointment at the dentist today. As an example, the user might have scheduled a dentist appointment for 4:00, and also added a calendar event to see her uncle at 3:00.
  • the third of the circles 3135 is now filled in.
  • the previous figure illustrated the predictive routing interface for the high-quality touchscreen user interface of some embodiments.
  • the low-quality touchscreen interface and the non-touchscreen interface have similar predictive routing features.
  • Figure 32 illustrates the same scenario as the previous Figure 31, displayed in the low-quality touchscreen interface of some embodiments, over four stages 3205-3220.
  • the first stage 3205 illustrates the low-quality vehicle touchscreen interface 570, with the user selecting the predictive routing control 584 to open the predictive routing feature.
  • the second stage 3210 illustrates the low-quality touchscreen predictive routing interface 3200.
  • In most respects, the low-quality touchscreen interface 3200 is the same as the high-quality touchscreen interface 755.
  • However, the interface includes additional right and left arrow controls 3225 and 3230 within the information bar 3240, at the far ends of the bar.
  • the circles 3235 representing the different predicted destinations are located in this information bar 3240, as in the high-quality touchscreen example. Because low-quality touchscreens often have high touch latency, they may have trouble detecting the swipe gestures used to select between different predicted destinations.
  • the low-quality touchscreen interface of some embodiments includes the arrow controls 3225 and 3230 for selecting between different predicted destinations.
  • the information bar 3240 is larger in the low-quality touchscreen situation so that the touch buffer around the arrows 3225 and 3230 can be made larger.
  • the low-quality predictive routing interface 3200 of some embodiments still accepts swipe gestures to change the displayed predicted destination.
  • the second stage 3210 illustrates a user performing a leftward swipe gesture.
  • the third stage 3215 illustrates that the interface now displays the second of the predicted likely destinations (as in the third stage 3115 of Figure 31).
  • the user selects the right arrow control 3230.
  • the selection of the right arrow causes the same action in the interface as a leftward swipe gesture, as the mapping application now displays the third predicted destination in the interface 3200.
  • the predictive routing feature is also available in the non-touchscreen vehicle interface of the mapping application.
  • the display for non-touchscreens appears the same as the high-quality touchscreen predictive routing interface, with the exception of the cancel button 3130.
  • In order to select between predicted destinations, the user shifts a joystick left or right, turns a knob, rotates a trackball, etc. Because the cancel button is not available, the user can exit the predictive routing feature by using a back button or similar built-in vehicle control in order to return to the standard map view interface.
  • The above figures illustrate the predictive routing interface of some embodiments.
  • In these examples, three predicted destinations are shown in an order of decreasing likelihood of being the user's actual destination.
  • the predictive routing engine regularly updates the likelihood of various possible destinations, and can add destinations to the display, remove destinations from the display, or modify the order of the destinations.
  • Figure 33 illustrates an example of the predictive routing engine reordering the likelihood of various destinations, and the subsequent effects in the predictive routing interface 755 over four stages 3305-3320.
  • the first stage 3305 is the same as the second stage 3110 of Figure 31, with the predictive routing interface 755 displaying Aunt Sally's as the most likely destination based on various factors, and two additional destinations available for the user to view.
  • At the second stage 3310, the user has traveled towards an intersection at which she would turn right to travel to Aunt Sally's. At this point, the user interface is the same.
  • when the predictive routing engine updates its likelihoods, it determines that Uncle Harry's is now a more likely destination than Aunt Sally's based on all of its collected information. Specifically, the predictive routing engine determines that Aunt Sally's is no longer a relevant destination, because the user is now driving away from that destination. Thus, as shown at this stage, the view for Aunt Sally's as a destination begins sliding to the left off of the screen.
  • the fourth stage 3320 illustrates that the predictive routing interface 755 now displays the view for Uncle Harry's as the destination, because this is the most likely destination as calculated by the predictive routing engine.
  • Because the application has determined that Aunt Sally's is no longer a relevant destination, the application only has two possible destinations, and modifies the circles 3325 accordingly.
  • In some embodiments, the mapping application would not change the currently displayed destination even if a different destination (e.g., the dentist) became more likely. However, because Aunt Sally's was no longer a relevant route after the user turned left at the intersection (a bifurcation point), the application removed the route from the display. In some embodiments, if only one route was originally predicted, and the predictive routing engine then determines the route to not be relevant, the application would return to the map exploration user interface.
  • the mapping application only changes the display within the context of the normal display of the interface. That is, the application does not suddenly switch the display order of the three destinations to match the newly calculated order, but instead slides the first destination off of the screen and slides the more likely destination into the view.
  • If the predictive routing engine determined that, rather than the dentist as a third most likely option, a different destination was more likely, then some embodiments would modify the user interface such that when the user performed a swipe gesture to view the second destination (Uncle Harry's), the new third destination would appear on the right side of the display (because this change could be performed off-screen and therefore would not be visually jarring).
  • When the change affects a route partially shown on the display screen (e.g., Uncle Harry's in the first stage 3305), some embodiments will change the route in this panel, even though it may be a jarring change to the display, so as to avoid displaying information for which the user is not likely to have a use.
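  • A minimal Swift sketch of this display-update policy (the names are assumptions): routes that are no longer relevant are dropped, the route currently on screen is otherwise kept in place, and off-screen routes are reordered by their updated likelihoods:

      struct PredictedRoute {
          let destinationName: String
          var isStillRelevant: Bool
          var likelihood: Double
      }

      // The first element is the route currently shown on screen.
      func updateDisplayedRoutes(_ routes: [PredictedRoute]) -> [PredictedRoute] {
          // Drop routes that are no longer relevant (e.g., the user drove past the turn toward them).
          let remaining = routes.filter { $0.isStillRelevant }
          guard let shown = remaining.first else { return [] }

          // Keep the on-screen route in place to avoid a jarring change,
          // but reorder the off-screen routes by their updated likelihoods.
          let offscreen = remaining.dropFirst().sorted { $0.likelihood > $1.likelihood }
          return [shown] + offscreen
      }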
  • Figure 34 illustrates another example of changes made by the predictive routing engine to the order of likely destinations over six stages 3405-3430 in which the user, while traveling, exits the predictive routing interface 755 and then returns at a later time to the predictive routing interface.
  • the first stage 3405 illustrates the predictive routing interface 755 in the same state as the second stage 3115 of Figure 31, with Aunt Sally's as the displayed most likely destination and Uncle Harry's as the second most likely destination.
  • the user selects the cancelation control 3130 in order to leave the predictive routing interface.
  • the second stage 3410 illustrates the high-quality vehicle touchscreen interface 565 of some embodiments.
  • the third stage 3415 illustrates this interface six minutes later as the user approaches an intersection. This intersection represents a decision point for the predictive routing engine, because a right turn takes the user towards Aunt Sally's house while a left turn takes the user towards Uncle Harry's house. As shown at the fourth stage 3420, five minutes later, the user has taken a right turn at the intersection.
  • the user selects the predictive routing control 574 to reenter the predictive routing interface 755, shown at the fifth stage 3425.
  • the calculated most likely destination is Aunt Sally's.
  • Uncle Harry's is no longer the second predicted destination.
  • the mapping application only receives two destinations to display from the predictive routing engine. In some embodiments, only destinations that meet a minimum threshold for likeliness are displayed in the predictive routing interface.
  • the sixth stage 3430 illustrates the second predicted route, to the dentist.
  • the different predicted destinations and associated routes are arranged in a filmstrip view.
  • the application animates the sliding of the conceptual filmstrip through the display screen.
  • Figure 35 illustrates this animation over three stages 3505-3515.
  • the first stage 3505 is the same as the second stage 3110 of Figure 31, and the third stage 3515 is the same as the third stage 3115 of that figure (without the additional swipe gesture shown in the third stage 3115 of Figure 31).
  • the second stage 3510 illustrates the animation of some embodiments to transition between these stages. Specifically, at this stage, the information panel and part of the map view for the first predicted route has slid off of the display, while the entirety of the information panel and part of the map view for the second predicted route has slid onto the display.
  • a different arrangement and animation of the predicted routes is used. Rather than arranging the displays for the predicted routes one after another in a filmstrip, some embodiments conceptually treat the displays for different routes as cards stacked on top of each other.
  • Figure 36 illustrates the animation between two predicted route displays for this type of arrangement.
  • the application does not display a portion of the information for the second predicted route while showing the view for the first route. While this is shown for the stacked-card arrangement, one of ordinary skill will recognize that some embodiments that use the filmstrip arrangement will also not display any portion of the information panel for the next predicted route.
  • the user performs a swipe gesture to cause the application to display the second predicted route.
  • the second stage 3610 illustrates that, rather than animating the display for the second predicted route sliding off of the screen as the first predicted route display slides onto the screen, in this example the application reveals the second predicted route display underneath as the first predicted route display slides off.
  • the animation has completed and the second predicted route display occupies the entire screen.
  • the predictive routing feature enables the user to quickly and easily find a desired destination and enter a turn-by-turn navigation mode towards that destination. While the illustrated interface for the predictive routing feature does not include a particular control for entering navigation, in some embodiments when the mapping application receives a tap gesture within the display, the application enters the turn-by-turn navigation mode towards the currently shown destination. In different embodiments, this tap input may be located within the map view for a destination, within the information panel for the destination, or within either of these areas.
  • Figure 37 illustrates this feature of the predictive routing interface of some embodiments for entering the turn-by-turn navigation mode of the mapping application over two stages 3705- 3710.
  • the first stage 3705 illustrates the predictive routing interface 755 of some embodiments, as shown in the previous figures.
  • a user performs a tap gesture over the map view panel 765 with Aunt Sally's as the currently displayed destination.
  • the mapping application has entered its turn-by-turn navigation mode for a route heading to Aunt Sally's.
  • In other embodiments, the application displays a specific selectable option for entering turn-by-turn navigation, and a tap gesture over the display instead returns the display to the map exploration user interface.
  • the turn-by-turn navigation mode of some embodiments will be described in further detail in the following Section VI.
  • the predictive routing user interface displays a series of predicted destinations, as well as routes to those destinations.
  • the application displays on the vehicle screen traffic information in the vicinity of the vehicle. For instance, if the vehicle is traveling on a freeway, some embodiments indicate whether the traffic is normal, lighter than usual, or heavier than usual on the freeway. Some embodiments display such a vicinity-based information panel first, then the user can swipe to view the above-described destination-based panels.
  • Figure 38 conceptually illustrates a simplified software architecture of a mapping and navigation application 3800 of some embodiments that operates on a mobile device and performs the predictive routing features discussed above.
  • the mapping application 3800 includes search functions 3805, a destination generator 3810, a route generator 3815, and additional mapping and navigation functions 3820.
  • the mapping application stores a mapping application recents database 3825, and the mobile device stores harvested system data 3830.
  • the mapping application recents database 3825 stores, in some embodiments, data about recent searches performed by the mapping application (e.g., by the search functions 3805). For example, in the above examples, when a search result yields a specific destination, the application stores that destination in the recents database 3825. When the search is performed over a particular category (e.g., coffee shops), some embodiments store the category in the mapping application recents database 3825 as well.
  • the harvested system data 3830 is not stored specifically by the mapping application. Instead, an address harvester that runs separately on the mobile device (e.g., within the operating system) stores various data within the harvested system data 3830. In some embodiments, the address harvester stores phone numbers, e-mail addresses, and physical addresses in the harvested system data 3830. In some embodiments, the phone numbers are harvested from phone calls and/or text messages, and the e-mail addresses are harvested from e-mails and/or text messages. In addition, the data may include groups of phone numbers and/or groups of e-mail addresses that were used to send messages or e-mails to several recipients at once.
  • the harvested physical addresses may come from numerous different sources in some embodiments.
  • e-mails and text messages may include physical addresses that a harvester recognizes by their formatting and stores in the database 3830.
• Scheduled events (e.g., from an electronic calendar or an electronic ticket repository) may also include locations, and the system data 3830 stores this information (as well as times associated with the events, in some embodiments).
  • a web browser running on the mobile device allows a user to select addresses (e.g., of a place of business that the user looks up in the web browser) and add those addresses to the data store 3830.
  • the harvested system data 3830 includes contacts information from the user's address book, which may include phone numbers, e-mail addresses, and physical addresses.
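• For illustration, the two data stores described above can be modeled as simple record types. The following Swift sketch is not taken from the patent; the type and field names (HarvestedRecord, RecentsEntry, and so on) are assumptions.

```swift
import Foundation

// Minimal sketch of the harvested system data (3830) and the mapping
// application recents database (3825). All names are hypothetical.

/// One entry in the harvested system data store, collected by an address
/// harvester running outside the mapping application (e.g., in the OS).
struct HarvestedRecord {
    enum Source {
        case email, textMessage, phoneCall, calendarEvent, webBrowser, addressBook
    }
    enum Payload {
        case phoneNumber(String)
        case emailAddress(String)
        case physicalAddress(String)
    }
    let source: Source
    let payload: Payload
    let harvestedAt: Date
    /// For scheduled events, the time associated with the event (if any).
    let eventTime: Date?
}

/// One entry in the mapping application recents database.
enum RecentsEntry {
    /// A specific destination returned by a search.
    case destination(name: String, address: String)
    /// A category the user searched over (e.g., "coffee shops").
    case category(String)
}
```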
  • the mapping and navigation functions 3820 perform a wide variety of mapping and/or navigation processes for the mapping application. These functions may include identifying map tiles from which to generate map displays and retrieving the map tiles from a remote server, matching a device location to a location on a map, retrieving route information from a server, generating text and/or graphical navigation instructions for a route based on the retrieved route information, generating one or more user interfaces for the application, and other functions.
  • the search functions 3805 perform searches as described above by reference to Figure 23.
  • the search functions 3805 initially receive a search request (i.e., a set of search terms) and attempt to find one or more matching results (i.e., locations) within the physical addresses stored in the harvested system data.
• when the harvested system data 3830 contains a more specific match (e.g., a full address on a searched street), the search functions 3805 will return that more specific address by using the harvested system data 3830.
• if the search functions 3805 cannot find a result in the harvested system data 3830, then the search functions 3805 use a search engine to perform the search in some embodiments.
  • the search functions 3805 store the search results to the mapping application recents database 3825 in some embodiments. Thus, when the user requests recent locations, the application uses the recents database 3825 to populate the fields in the requesting user interface. In addition, the search results are sent to the mapping and navigation functions 3820 for output to one or more user interfaces.
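• A minimal sketch of this search flow, assuming hypothetical Location, SearchEngine, and SearchFunctions types (none of these names come from the patent), might look as follows.

```swift
import Foundation

// Sketch of the search flow described above: try the harvested system data
// first, fall back to a search engine, then record the results so the recents
// UI can be populated later. Names and matching logic are illustrative only.

struct Location { let name: String; let address: String }

protocol SearchEngine { func search(terms: String) -> [Location] }

final class SearchFunctions {
    private let harvestedAddresses: [Location]   // from harvested system data 3830
    private var recents: [Location] = []         // mapping application recents 3825
    private let engine: SearchEngine

    init(harvestedAddresses: [Location], engine: SearchEngine) {
        self.harvestedAddresses = harvestedAddresses
        self.engine = engine
    }

    func search(terms: String) -> [Location] {
        // 1. Prefer matches found in the harvested system data.
        var results = harvestedAddresses.filter {
            $0.address.localizedCaseInsensitiveContains(terms) ||
            $0.name.localizedCaseInsensitiveContains(terms)
        }
        // 2. Fall back to a search engine when nothing was harvested.
        if results.isEmpty {
            results = engine.search(terms: terms)
        }
        // 3. Store the results for the recents-based user interfaces.
        recents.append(contentsOf: results)
        return results
    }
}
```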
  • the destination generator 3810 of some embodiments uses travel history of the device, harvested physical locations from the harvested data 3830 (e.g., contact information, addresses received in messages or e-mails within a particular recent time period, locations of scheduled events, etc.), and/or locations from the mapping application recents database 3825, along with additional inputs such as the system clock (to retrieve the current time) and a location identification engine (to retrieve a current location).
  • the destination generator 3810 uses this data to identify predicted destinations to which it expects the device (and thus the user) to travel.
  • the mapping application can then retrieve routes to the predicted destinations from the current location (e.g., from a directions server that generates routes from one location to another), and use the predicted destinations and associated routes to populate various user interface features. For instance, the predictive routing interface for the vehicle display described in this section uses these predicted destinations and routes to the destinations in some embodiments.
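• The destination generator's inputs and output can be sketched as below. The scoring weights are purely illustrative assumptions; the patent does not specify how the sources are combined.

```swift
import Foundation

// Sketch of the destination generator (3810): combine travel history,
// harvested locations, and recent searches into a ranked list of predicted
// destinations. Weights and names are hypothetical.

struct Coordinate { let latitude: Double; let longitude: Double }

struct PredictedDestination {
    let address: String
    let score: Double    // higher = more likely under this sketch's heuristic
}

struct DestinationGenerator {
    let travelHistory: [String]        // addresses the device has visited
    let harvestedAddresses: [String]   // contacts, messages, scheduled events, etc.
    let recentSearches: [String]       // mapping application recents database

    func predictedDestinations(now: Date, currentLocation: Coordinate) -> [PredictedDestination] {
        var scores: [String: Double] = [:]
        for address in travelHistory { scores[address, default: 0] += 3 }
        for address in recentSearches { scores[address, default: 0] += 2 }
        for address in harvestedAddresses { scores[address, default: 0] += 1 }
        // A fuller implementation would also weight by the current time and by
        // proximity to currentLocation; those inputs are accepted but unused here.
        _ = (now, currentLocation)
        return scores
            .map { PredictedDestination(address: $0.key, score: $0.value) }
            .sorted { $0.score > $1.score }
    }
}
```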
  • the route generator 3815 creates sets of associated destinations, with each set specified as a possible route for traveling.
  • each set of associated destinations includes start and end locations, a number of locations in between the start and end locations, and a number of motion records specifying rate of travel (e.g., between the locations).
  • the predicted route generator uses a machine-learning function to stitch together previously unrelated destination, location, and motion records into contiguous sets that specify potential routes.
  • the routes generated by the route generator 3815 may be used by the mapping application of some embodiments to output useful prediction information (e.g., to identify that a user is likely to be following a particular route soon and warn the user about traffic along that route).
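• The sets of associated destinations produced by the route generator might be modeled as follows; the field names are assumptions, and the prefix-matching helper stands in for the machine-learned stitching described above.

```swift
import Foundation

// Sketch of one predicted-route record: start and end locations, the locations
// in between, and motion records specifying rate of travel.

struct MotionRecord {
    let speedMetersPerSecond: Double
    let recordedAt: Date
}

struct PredictedRoute {
    let start: String
    let end: String
    let intermediateLocations: [String]
    let motionRecords: [MotionRecord]

    /// True when the device's recent trail of locations matches the beginning
    /// of this route, i.e. the user appears to be following it (supporting the
    /// traffic-warning use case mentioned above).
    func matches(recentTrail: [String]) -> Bool {
        let ordered = [start] + intermediateLocations
        return ordered.starts(with: recentTrail)
    }
}
```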
  • the harvested system data may be used for other features of the mobile device.
  • the harvested system data 3830 can be used by the e-mail or messaging functions to quickly identify a recipient for a message. This may be especially useful for the vehicle interface, which in some embodiments performs messaging through dictation, without any typing.
  • Figure 39 illustrates the vehicle display screen over four stages 3905-3920 in which a user activates a messaging function and dictates a message to a recipient.
  • the first stage 3905 illustrates the touchscreen home page 550 of some embodiments.
  • the user selects the messaging control 564, in order to activate a messaging (e.g., SMS) feature of the mobile device.
• the second stage 3910 illustrates a messaging interface 3900 of some embodiments.
  • the messaging interface 3900 includes a prompt 3925 (which may also be output as audio via the vehicle speakers), a cancel control 3930 for closing the messaging interface without sending a message, and a done control 3935 for indicating to the messaging application that dictation has completed.
  • the messaging application uses a more interactive voice recognition feature, in which the answers to a first prompt may be used to inform recognition of answers to subsequent prompts.
  • the messaging feature requests a recipient for a message, and the user dictates "Jerome".
• the third stage 3915 illustrates that the messaging feature uses the harvested data of some embodiments to identify a telecommunications address (e.g., phone number, e-mail) to which to send the message.
  • the messaging interface 3900 now requests that the user dictate a message to Jerome Smith.
  • the recipient is a selectable item in some embodiments, which the user can select in order to change recipients for the message.
• the user dictates a message, "Be there in ten".
  • the fourth stage 3920 illustrates the resulting message, with a control 3945 for allowing the user to send the message.
  • the mapping application vehicle user interface of some embodiments is designed for the user to quickly and easily identify a destination for entering turn-by-turn navigation.
  • the mapping application outputs different views for navigation to the two different interfaces. Specifically, the mapping application outputs a map view from above and/or behind the vehicle to the vehicle display screen, while outputting only navigation instructions to the mobile device display screen.
  • Figure 40 illustrates the operation of the turn-by-turn navigation user interface of some embodiments for the vehicle display over six stages 4005-4030 in which the vehicle approaches an intersection and a user follows navigation instructions to perform a maneuver at the intersection.
• the first stage 4005 illustrates the search result map view page 852, with a particular search result and possible destination selected.
  • the user selects the control 880 to enter turn-by-turn navigation.
  • the second stage 4010 illustrates that the mapping application has entered turn-by-turn navigation mode, and now displays the navigation mode user interface 885 of some embodiments on the vehicle display screen.
  • the navigation user interface 885 includes the lower information bar, a navigation map 890, a 3D control 895, and a direction list control 897.
  • the navigation user interface 885 additionally includes a navigation information bar 4050. In some embodiments, this bar is displayed in such a way as to appear similar to a street sign.
  • the 3D control 895 enables a user to toggle the navigation map 890 between 3D and 2D animation, while the direction list control 897 enables the user to view a list of instructions for upcoming maneuvers along the route being traveled.
  • some embodiments automatically begin navigation with the 3D control 895 selected and the navigation map in three dimensions. Other embodiments begin navigation in two dimensions by default rather than the three dimensional view.
  • whether to begin in 2D or 3D is a user-selectable option within the mapping application.
  • some embodiments animate the transition from the map view shown in the first stage 4005 to the navigation view shown in the second stage 4010.
  • the navigation information bar 4050 includes information about the next maneuver along the navigated route as well as route summary information.
  • the route summary information of some embodiments includes the estimated arrival time (2:18), distance remaining in the route (1.2 miles), and time remaining for the route (4 minutes).
  • the summary of the next maneuver along the route includes a graphical representation of the maneuver and intersection, a distance to the next maneuver that counts down, and text instructions describing the maneuver. This maneuver summary information will be described additionally below by reference to the navigation instruction panel 4035, shown in subsequent stages of this figure.
  • the application of some embodiments slides a navigation instruction panel into the navigation user interface 885 in order to alert the driver of the upcoming maneuver to perform.
  • the third stage 4015 illustrates that as the vehicle approaches the intersection where the route turns right, a navigation instruction panel 4035 begins to slide onto the display, as represented by the right-facing arrows (which are not actually displayed within the user interface).
  • the mapping application of some embodiments animates the introduction of this panel once the vehicle approaches within a threshold distance (e.g., 250 ft., 200 ft., 100 m, etc.) of the intersection at which the maneuver will be performed.
• the navigation instruction panels always appear on the driver side of the display. Therefore, in countries such as the United States that drive on the right side of the road (with the driver seat on the left side of the car), the application places the panel on the left side of the display. On the other hand, in countries such as the United Kingdom that drive on the left side of the road (with the driver seat on the right side of the car), the application places the navigation instruction panel on the right side of the display.
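• The two presentation decisions just described (when to animate the instruction panel in, and which side to place it on) can be sketched as below; the threshold value and the country list are placeholders, not values from the patent.

```swift
import Foundation

enum DisplaySide { case left, right }

/// Countries that drive on the left (driver seat on the right). Illustrative,
/// not exhaustive.
let leftHandTrafficCountries: Set<String> = ["GB", "IE", "AU", "NZ", "JP", "IN", "ZA"]

/// The navigation instruction panel always appears on the driver side.
func instructionPanelSide(countryCode: String) -> DisplaySide {
    return leftHandTrafficCountries.contains(countryCode) ? .right : .left
}

/// Animate the panel onto the display once the vehicle is within a threshold
/// distance of the maneuver (e.g., 250 ft., 200 ft., 100 m).
func shouldShowInstructionPanel(distanceToManeuverMeters: Double,
                                thresholdMeters: Double = 100) -> Bool {
    return distanceToManeuverMeters <= thresholdMeters
}
```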
  • the third stage 4015 also illustrates that the application no longer displays the navigation information bar 4050 in the UI.
  • the application either removes the information bar or animates out the information bar (e.g., by sliding the bar off of the display, fading out the bar, etc.) before sliding the information panel 4035 onto the screen (or otherwise animating its appearance). Because the next maneuver information is displayed in the information panel and the route summary information is not needed for the short period approaching a maneuver, the application can remove the bar for this period of time.
  • the information panel 4035 slides over the information bar as the information bar animates off of the display, or the information bar shrinks so as to fit within the width of the display reserved for the map view 890.
  • the fourth stage 4020 illustrates the navigation user interface 885 after the animation to slide the navigation instruction onto the display has completed.
  • the 3D map view at this stage has been compressed into a smaller region of the display.
  • the mapping application smoothly animates the compression of the navigation map 890, while sliding the 3D control 895 with the map.
  • the navigation instruction panel 4035 of some embodiments is designed to have the appearance of a road sign, such as one might see along the side of a freeway. Some embodiments use different colors, textures, or other aspects of the appearance of the navigation instruction panel to match the appearance of road signs in the location (e.g., country, state, etc.) of the vehicle.
  • the application displays (1) a graphical instruction of the upcoming maneuver to perform and (2) a text instruction describing the maneuver.
  • the mapping application generates, for each maneuver along a route, at least two different stylized graphical indicators. These include a complex graphical indicator that shows all roads entering and leaving the intersection as well as an arrow indicating the maneuver through the intersection, and a simpler graphical indicator that shows only an arrow indicating the maneuver (e.g., a right turn). In this case, the application uses the more complex graphical indicator in the navigation panel, as well as in the information bar 4050.
  • the graphical maneuver indicators of some embodiments are described in greater detail in the U.S. Application 13/632,117, filed September 30, 2012, which is incorporated herein by reference.
  • the mapping application of some embodiments additionally generates several different sets of text instructions for each maneuver along a route based on information derived from the route.
  • the application synthesizes various instruction elements based on route data, then combines these instruction elements in different ways to generate sets of text instructions.
  • the application then uses one of the sets of text instructions based on the amount of space available, whether information is already displayed in a different form, and other factors.
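• One way to pick among pre-generated instruction variants, assuming a simple character budget as the measure of available space (an assumption made only for this sketch), is shown below.

```swift
import Foundation

// Sketch of choosing a text instruction set based on available space.

struct InstructionVariants {
    /// Variants ordered from most to least detailed, e.g.
    /// ["Turn right onto State St.", "Right onto State St.", "Right"].
    let orderedByDetail: [String]

    /// The most detailed variant that fits the budget, or the shortest one
    /// when nothing fits.
    func bestFitting(maxCharacters: Int) -> String? {
        return orderedByDetail.first { $0.count <= maxCharacters }
            ?? orderedByDetail.last
    }
}
```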
• the text instructions of some embodiments are described in greater detail in the U.S. Application 13/632,117, incorporated by reference above.
  • the fifth stage 4025 illustrates the navigation UI 885 after the vehicle (and therefore the device) has proceeded through the intersection along the route, as specified by the navigation instruction panel 4035.
  • the panel 4035 begins sliding back off of the display (as indicated by the leftward facing arrows, which are not displayed in the UI).
  • the map view increases in size and the 3D control moves back to the left, with the opposite animation as that used for sliding the instruction panel into the display.
  • the navigation instruction panel counts down the distance to the intersection at which the vehicle will perform the displayed maneuver.
  • the instruction panel displays 200 feet as the distance to the next intersection.
  • the application changes this displayed number at various threshold distances from the intersection (e.g., 150 ft., 100 ft., 50 ft., 40 ft., 30 ft., 20 ft., and 10 ft., or a different combination of distances).
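• The countdown described above amounts to snapping the true distance to a fixed set of thresholds; the list below mirrors the example values in the text but is otherwise an assumption, and distances beyond the largest threshold would be handled separately.

```swift
import Foundation

// Displayed distance is only updated at fixed thresholds, not continuously.
let countdownThresholdsFeet: [Double] = [200, 150, 100, 50, 40, 30, 20, 10]

/// The value to display for the current true distance: the largest threshold
/// not exceeding the actual distance, or nil once the maneuver is reached.
func displayedCountdown(actualDistanceFeet: Double) -> Double? {
    return countdownThresholdsFeet.first { actualDistanceFeet >= $0 }
}
```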
  • the sixth stage 4030 illustrates the navigation UI 885 after the application has completed the removal of the instruction panel from the display.
  • the navigation information bar 4050 has returned, and displays a summary of the new next maneuver.
  • the navigation map view will follow along behind the indicator of the device location 4040 until the next intersection is approached, or the user selects one of the controls in the user interface (e.g., the 3D control or the list control).
  • the mapping application of some embodiments animates a presentation of a list of maneuvers to perform for the currently presented route over the navigation view.
  • Figure 41 illustrates the animation of the direction list of some embodiments over four stages 4105-4120.
  • the first stage 4105 illustrates the navigation user interface 885, as described above by reference to Figure 40, with a 3D control 895 and direction list control 897, and navigation information bar 4050.
  • the user selects the direction list control 897 (with a tap input) to cause the application to display the direction list rather than the navigation map presentation.
• the second stage 4110 illustrates that, as a result of the user selection, the application begins animating a direction list 4125 onto the vehicle display. As shown, in some embodiments, the application slides the direction list upwards over the map view from the bottom of the display screen. In other embodiments, the direction list may slide in from the left or right, from the top, or may appear on the screen via a different animation (e.g., an enlargement of the direction list control to the size of the full screen).
  • the third stage 4115 illustrates the display after the animation to display the direction list 4125 has fully completed.
• the direction list begins with a starting location (1018 Chico Ave.).
• when a name for the location is available (e.g., based on contact information indicating a particular person's house, map data indicating a place of business, etc.), some embodiments additionally display this name as part of the starting address.
  • the application displays a pin to indicate that the entry in the direction list is a location, rather than a maneuver.
  • the direction list includes an entry for each maneuver along the route.
  • the mapping application receives route data from a mapping server as a series of junctions through which the route travels.
  • the mapping application identifies the junctions at which maneuvers will be performed, and generates graphical indicators and sets of instructions for these maneuvers (as described above).
  • some embodiments use the simplified graphical indicator for each maneuver (e.g., an arrow showing the maneuver that does not include the other roads at the intersection).
  • each entry for a maneuver along the route includes a distance from the previous maneuver and a set of text instructions generated as described above.
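• A sketch of turning the junction series into direction list entries follows; the Junction and DirectionListEntry types are assumptions made for illustration.

```swift
import Foundation

struct Junction {
    let distanceFromPreviousMeters: Double
    let maneuver: String?    // nil when the route passes straight through
}

struct DirectionListEntry {
    let instruction: String                         // text instructions
    let distanceFromPreviousManeuverMeters: Double  // shown next to the entry
}

/// Walk the junctions, accumulating distance, and emit one entry per junction
/// at which a maneuver is actually performed. Each entry would be paired with
/// a simplified graphical indicator (arrow only) in the vehicle UI.
func directionList(for junctions: [Junction]) -> [DirectionListEntry] {
    var entries: [DirectionListEntry] = []
    var accumulated: Double = 0
    for junction in junctions {
        accumulated += junction.distanceFromPreviousMeters
        if let maneuver = junction.maneuver {
            entries.append(DirectionListEntry(instruction: maneuver,
                                              distanceFromPreviousManeuverMeters: accumulated))
            accumulated = 0
        }
    }
    return entries
}
```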
  • the entire direction list may not fit on the display screen.
  • the user can scroll through the list in some embodiments.
  • the third stage 4115 illustrates a user performing an upward swipe gesture over the direction list 4125.
  • the fourth stage 4120 illustrates that the application has scrolled the direction list to its end, such that the destination (and thus the end of the route) is now displayed.
  • Some embodiments include two entries to indicate the destination.
  • the second to last entry in the list includes an indicator of the end of the route, with information about the distance from the previous maneuver to the destination and information informing the user regarding on which side of the street the destination is located.
  • the final entry includes name and address information for the destination, similar to the starting address information.
  • some embodiments include additional controls for scrolling up and down in the low-quality touchscreen interface. For instance, some embodiments display up and down arrows along the right or left side of the direction list that the user can select with a tap input in order to scroll the list by a particular amount (e.g., one entry, three entries, etc.).
  • the direction list 4125 additionally includes a control 4130 for returning to the navigation presentation 885.
  • the application displays the reverse animation of that used to display the direction list. For instance, in this example, selecting the control 4130 would cause the application to slide the direction list downwards on the display, revealing the navigation map view underneath.
  • the direction list 4125 slides up and over the navigation information bar 4050.
• in other embodiments, the direction list 4125 only covers the map view portion of the navigation display, and not the navigation information bar 4050. This enables the user to view the direction list while still being able to see the summary information (e.g., the estimated arrival time, next maneuver), which may be useful to the user.
  • the user can view the navigation map in overview mode, rather than the animated view from above and/or behind the representation of the moving vehicle.
  • Figure 42 illustrates a user accessing this overview mode in the vehicle user interface over three stages 4205-4215.
• at the first stage 4205, the user performs a specific type of touchscreen input (in this case, a tap, though in different embodiments different inputs may be used) over the animated map view 890 of the navigation user interface 885.
  • the second stage 4210 illustrates that this action causes the application to modify the navigation information bar 4050.
  • the upper portion of the vehicle display now includes an end navigation control 4220 and an overview control 4225.
  • the end navigation control 4220 allows the user to end the current turn-by-turn navigation mode and return to the map exploration interface.
• the overview control 4225 allows the user to view an overview of the route being navigated.
  • the route and maneuver information in the navigation information bar is compressed to fit within a smaller area.
  • This bar continues to display the same route summary information (arrival time, remaining distance, remaining time), and also displays the graphical representation and distance to the next maneuver in the same manner.
• the text instructions, however, use a shorter version in order to fit within the smaller area. Some embodiments remove the text instructions altogether.
  • the user selects the overview control 4225 at the second stage 4210 (via a tap input).
  • the mapping application displays a route overview map 4230 at the third stage 4215.
  • the overview map of some embodiments displays a map at a zoom level such that the entirety of the remaining route, from the current location to the ending location, fits within the display.
  • the map shows the route line for the remainder of the route.
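• Fitting the remaining route on screen reduces to computing a bounding region around the remaining route points and padding it slightly; the region type and padding factor below are assumptions made for this sketch.

```swift
import Foundation

struct Coordinate2D { let latitude: Double; let longitude: Double }

struct MapRegion {
    let center: Coordinate2D
    let latitudeSpan: Double
    let longitudeSpan: Double
}

/// A region just large enough that the remaining route, from the current
/// location to the destination, fits within the display.
func overviewRegion(remainingRoute: [Coordinate2D], padding: Double = 1.2) -> MapRegion? {
    guard let first = remainingRoute.first else { return nil }
    var minLat = first.latitude, maxLat = first.latitude
    var minLon = first.longitude, maxLon = first.longitude
    for point in remainingRoute.dropFirst() {
        minLat = min(minLat, point.latitude);  maxLat = max(maxLat, point.latitude)
        minLon = min(minLon, point.longitude); maxLon = max(maxLon, point.longitude)
    }
    return MapRegion(center: Coordinate2D(latitude: (minLat + maxLat) / 2,
                                          longitude: (minLon + maxLon) / 2),
                     latitudeSpan: (maxLat - minLat) * padding,
                     longitudeSpan: (maxLon - minLon) * padding)
}
```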
  • the display also includes a resume navigation control 4235 in place of the overview control 4225. Selection of the resume navigation control causes the application to resume the animated map view of navigation shown in the first stage 4205.
• This figure illustrates turn-by-turn navigation within the touchscreen user interfaces of some embodiments.
  • the non-touchscreen interfaces also have the capability to enter turn-by-turn navigation mode.
  • Some embodiments limit the functionality for non-touchscreen displays (e.g., always displaying in either 2D or 3D with no control to switch back and forth, not having a direction list control, etc.), while other embodiments use various types of inputs from the built-in vehicle controls to toggle between 2D and 3D presentations or to cause the application to animate the direction list onto or off of the display.
• Some embodiments, in fact, modify the mobile device interface during navigation (described below) to include the 3D control and direction list control. Selection of these controls on the mobile device then affects the non-touchscreen vehicle interface.
  • the above examples illustrate the vehicle display screen when the mapping application enters turn-by-turn navigation mode.
• when the application enters the turn-by-turn mode, it locks the mobile device interface into a corresponding display.
• the application displays, on the mobile device on which it operates, a large set of instructions for the upcoming maneuver along the route being navigated in some embodiments.
  • Figure 43 illustrates the display of the mobile device during the turn-by-turn navigation mode according to some embodiments. Specifically, this figure illustrates the display of both the mobile device and the vehicle interface over three stages 4305-4315 as the user approaches and turns through an intersection.
  • the first stage 4305 illustrates both a mobile device 4300 and the vehicle display screen after the mapping application has entered turn-by-turn navigation mode. As shown, the vehicle display screen displays the navigation mode user interface 885 of some embodiments.
  • the mobile device 4300 on which the mapping application operates displays a blank screen with only navigation instructions. Specifically, the mobile device displays a larger version of the navigation instruction panel for the next maneuver, irrespective of the distance to that next maneuver.
  • This display includes the maneuver information also displayed in the navigation information bar 4050, without the route summary (e.g., ETA) information in some embodiments.
• when the navigation information bar 4050 is replaced with a navigation instruction panel, the panel's distance to the maneuver, graphical maneuver representation, and text instructions will be the same as those on the mobile device display.
  • the mobile device display matches this instruction panel. While shown with a white background in these figures, some embodiments use a black background with white text and instructions rather than the other way around.
  • the vehicle and device are beyond the threshold for displaying the navigation instruction panel within the vehicle display (400 ft. away from the next maneuver).
  • the mobile device displays (1) a complex graphical indicator 4325 for the upcoming maneuver and (2) text instructions 4330 describing the maneuver.
  • the complex graphical indicator includes a thick arrow showing the maneuver through the intersection as well as lighter (e.g., dimmer, thinner, etc.) lines marking the other roads that enter and/or exit the intersection at which the maneuver will be performed.
  • the text instructions indicate a regularly changing distance to the next maneuver, as well as information about the maneuver to perform (e.g., a particular type of turn onto a particular road). While this example shows a right turn, other instructions could include left turns, U-turns, merges, freeway exits, etc.
  • the second stage 4310 illustrates the two user interfaces as the vehicle and device approach the intersection at which the route makes a right turn onto State St.
  • a navigation instruction panel 4320 has been animated onto the display, and the navigation information bar removed, because the vehicle and device are now closer to the intersection.
  • the navigation instruction panel 4320 and the mobile device 4300 display the same information in some embodiments.
  • the mobile device always displays the next instruction, enabling the driver or passenger to quickly view the next maneuver to perform irrespective of the distance to that next maneuver, without the need to view the entire direction list or even interact with either interface.
• at the third stage 4315, the vehicle has maneuvered through the intersection and onto State St.
  • the application has removed the navigation instruction panel 4320 from the display while enlarging the size of the navigation map presentation and returning the navigation information bar 4050.
  • the mobile device 4300 now displays a new graphical indicator 4335 and text instructions 4340 for the new upcoming maneuver (another right turn, but at a four-way intersection rather than a three-way intersection), in both the mobile device display and the navigation information bar.
• a navigated route will sometimes have two maneuvers in quick succession.
  • the application modifies both the device and vehicle UIs to account for such maneuvers.
  • Figure 44 illustrates the display of both the mobile device and vehicle UIs over three stages 4405-4415 in which the vehicle approaches a first maneuver with a second maneuver following the first in rapid succession.
  • the first stage 4405 illustrates the two user interfaces when the vehicle (and device) are 400 feet from the first maneuver.
• the navigation map UI 885 on the vehicle display does not yet display the navigation instruction panel, instead displaying the navigation information bar 4050.
  • the subsequent maneuver will occur only a short distance (e.g., 50 ft.) after the next maneuver, and therefore the application displays a second navigation bar 4420 (or sign) below the information bar 4050.
  • This second navigation bar 4420 only illustrates maneuver information, as including route summary information would be redundant.
  • the navigation bar 4420 includes a stylized graphical representation of the second maneuver and text instructions for the maneuver. Because the second maneuver comes so quickly after the first, no distance from the first to second maneuver is presented.
  • the mobile device 4400 displays a split screen to show the two maneuvers.
  • the two maneuvers are given equal portions of the split screen, with the lower maneuver again not including a distance.
  • other embodiments display the second maneuver in a smaller portion of the display screen.
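• The decision of when to show the second maneuver can be sketched as follows; the 50 ft. threshold echoes the example above, and the types are hypothetical.

```swift
import Foundation

struct Maneuver {
    let instruction: String
    let distanceFromPreviousManeuverFeet: Double
}

enum ManeuverPresentation {
    case single(Maneuver)
    case split(first: Maneuver, second: Maneuver)  // second shown without a distance
}

/// Show both maneuvers (split device screen, doubled sign on the vehicle
/// display) when the following maneuver comes within a short distance of the
/// next one; otherwise show only the next maneuver.
func presentation(next: Maneuver, following: Maneuver?,
                  quickSuccessionThresholdFeet: Double = 50) -> ManeuverPresentation {
    if let following = following,
       following.distanceFromPreviousManeuverFeet <= quickSuccessionThresholdFeet {
        return .split(first: next, second: following)
    }
    return .single(next)
}
```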
  • the second stage 4410 illustrates the two displays after the vehicle has approached within a threshold distance of the first maneuver (e.g., within 200 feet).
  • the application has slid a navigation instructions panel 4425 onto the left side of the display.
  • this instructions panel mirrors the display shown on the mobile device.
  • the instructions panel is split into an upper half displaying the summary information for the first maneuver and a lower half displaying the summary information for the subsequent maneuver.
• the third stage 4415 illustrates the displays after the vehicle has performed the first maneuver. Because the maneuver following the turn onto Hill St. is more than a threshold distance (e.g., 200 ft.) from its previous maneuver, the application no longer displays a split screen for the device and split panel for the navigation instructions panel in the vehicle display. As shown, some embodiments display the distance to the maneuver once only one maneuver is shown, to give the same appearance as when the application had counted down to the maneuver from a greater distance.
  • Figure 45 illustrates the same situation, except that the mobile device is held in landscape orientation rather than portrait orientation. In this case, the progression of the vehicle display is unchanged from the previous figure.
  • the application splits the display horizontally rather than vertically for the two sets of maneuver instructions, with the instructions for the first maneuver appearing on the left side.
  • Figure 46 illustrates a use of the device interface to request a route from the device's current location to a destination location, then the subsequent selection to enter navigation, over three stages 4605-4615.
  • the first stage 4605 illustrates that the mapping application is open on a mobile device 4600 while connected to a vehicle display.
  • the vehicle touchscreen displays the high-quality touchscreen map exploration interface 565, while the user has previously selected a search field of the mobile device map interface to cause the application to display a list of recent routes and locations. The user selects a route from the device's current location to Putt-Putt Golf Course from this recents list.
• the mobile device 4600 in the second stage 4610 displays a route display interface 4620.
  • the route display interface 4620 shows possible routes between the selected start and end locations.
  • the route display interface 4620 includes three selectable routes between the two locations.
  • the three labels that indicate the estimated travel time for each of the routes are selectable to select the corresponding routes.
  • a first route with a travel time of 10 minutes is selected.
  • the route display interface also includes an information bar 4625, or banner, that includes information about the currently selected route.
  • the information bar 4625 also includes an end route display control 4630 for returning to the standard map view interface 205 and a start navigation control 4635 for entering navigation along the selected route. Meanwhile, the vehicle display has not changed from the first stage to the second stage while the user views the route display interface 4620 on the mobile device.
  • the user selects the start navigation control 4635 on the mobile device display.
  • the vehicle display screen now displays the navigation mode user interface 885, while the mobile device display shows a customized graphical indicator and set of text instructions describing the next maneuver along the selected route.
  • the mapping application mobile device interface during navigation is not interactive. While the user can press the home button to exit the mapping application, some embodiments do not offer controls within the mobile device display of the upcoming maneuver instructions. Other embodiments, however, include a set of available controls on the mobile device.
  • Figure 47 illustrates the display and use of such controls on the mobile device during navigation. Specifically, this figure illustrates the activation of a set of controls and the selection of an overview control over three stages 4705-4715.
  • the first stage 4705 illustrates both a mobile device 4700 and the vehicle display screen to which the mobile device is connected after the mapping application running on the mobile device has entered turn-by-turn navigation mode. As shown, the vehicle display screen displays the navigation mode UI 885 while the mobile device displays instructions for the next maneuver along the route being navigated.
  • the user taps the display screen of the mobile device 4700.
  • the application displays a set of controls 4720 at the bottom of the display.
  • the set of controls 4720 includes an end navigation control 4725 for ending the turn-by-turn navigation mode and returning to the standard map exploration mode of the mapping application, as well as an overview control 4730 for displaying an overview of the route being navigated.
  • the user selects the overview control 4730 at the second stage 4710 (via a tap input).
  • the mapping application displays a route overview map 4735 at the third stage 4715.
  • the overview map of some embodiments displays a map at a zoom level such that the entirety of the remaining route, from the current location to the ending location, fits within the display.
  • the map shows the route line for the remainder of the route.
  • the overview display includes a banner 4740 with an end navigation control 4745, similar to the control 4725 in the previous stage 4710, as well as a control 4750 for returning to the standard navigation view on the mobile device (i.e., the display of graphics and text for the upcoming maneuver).
  • the vehicle display screen does not change during the stages 4705-4715.
  • Figure 48 illustrates, over three stages 4805-4815, a device 4800 in power-saving mode as well as the vehicle display screen to which the device is connected during turn-by-turn navigation.
  • the mobile device 4800 has already entered power-saving mode.
• the displayed words "Power Save Mode" do not actually appear on the device's display, but are shown here for explanatory purposes.
  • the device may enter power save mode when the user has not interacted with the device for a particular period of time, or because a user has pressed a particular hardware button or touchscreen control to do so (e.g., button 4820 of the mobile device 4800).
  • the application displays the navigation map view 885 on the vehicle display screen.
  • the mobile device still performs the various navigation operations, including rendering the animated 3D map view, required to output the display to the vehicle.
  • the application has animated a navigation instruction panel 4825 onto the vehicle display as a maneuver approaches.
  • the vehicle has maneuvered through the intersection causing the application to remove the navigation instruction panel.
  • the mobile device 4800 remains in power-saving mode.
  • the mapping application displays (and may use voice prompts as well) navigation instructions on the mobile device as intersections approach even when the device has been placed in power-saving mode.
  • some embodiments do not display corresponding indications on the device when in power-saving mode.
  • Figure 49 illustrates the home pages of the mobile device and vehicle displays of some embodiments while turn-by-turn navigation mode is operational and the vehicle approaches a maneuver. This figure illustrates, for each of three stages 4905-4915, both of the displayed interfaces as well as a map 4900 that indicates the location of the vehicle and mobile device at the stage.
  • the first stage 4905 illustrates a mobile device 4920 and a vehicle display screen to which the mobile device is connected.
  • the mapping application operating on the mobile device is in turn-by-turn navigation mode, and therefore the mobile device 4920 displays information about the next maneuver while vehicle display screen displays the navigation mode UI 885.
  • the user selects a home button 4925 on the mobile device to place the mapping application in the background and return to the home page of the mobile device. In some embodiments, selecting the home button on the vehicle touchscreen has the same effect.
  • both the mobile device 4920 and the vehicle display screen display their respective home pages.
  • the mobile device home page 4930 displays various selectable items (also called affordances) for various applications that operate on the mobile device.
  • the vehicle touchscreen interface 550 of some embodiments is described above by reference to Figure 5.
  • both interfaces also display a selectable banner for returning to the turn-by-turn navigation.
  • the mobile device home page 4930 includes a banner 4935 at the top of the display with the text "Touch to return to Navigation".
  • the vehicle display screen displays a selectable banner 4936 above the set of controls 561-567 with the same text. In some embodiments, selecting either of these banners returns both of the displays to their respective turn-by-turn navigation displays, as shown in the first stage 4905.
• while the banners are shown in particular locations in the two different interfaces in this figure, one of ordinary skill in the art will recognize that different embodiments may display similar banners in different locations (e.g., the banner 4936 displayed above the clock at the top of the vehicle display). Furthermore, for non-touchscreen interfaces, the banner displayed on the vehicle screen may indicate a particular action to take using the various vehicle controls in order to return to navigation (e.g., pressing a particular button).
  • the map 4900 illustrates that the vehicle is approaching an intersection at which the route makes a right turn onto a particular street.
  • both of the display screens now display indicators to make a right turn onto the particular street.
  • the text in the mobile device banner 4935 has been replaced by a right turn graphical indicator with text instructions to turn right.
  • the same graphical indicator and text are displayed in a newly added banner 4940 on the vehicle touchscreen above the clock, while the selectable banner 4936 for returning to navigation remains below the clock.
• once the maneuver has been performed, some embodiments return the displays to the state shown in the second stage 4910, by modifying the content of the banner 4935 and removing the banner 4940.
  • the banner 4936 is not displayed in the home page of the vehicle screen. Instead, the return to navigation banner is displayed as part of the operating system information bar that includes the simulated home button, which is not shown in the vehicle screen home page of some embodiments.
  • Figure 50 illustrates the display outside the mapping application with turn-by-turn navigation operating according to some embodiments, over three stages 5005-5015.
  • the first stage 5005 is the same as the first stage 4905 of Figure 49, with the user selecting a home button 5025 on the mobile device 5020.
  • the display of the mobile device is the same as in Figure 49, with the banner 5035 displayed at the top of the device screen.
• the vehicle display 5050 does not display a banner in this situation. Because the user is likely to have exited to this screen only in order to select another application, the absence of a return-to-navigation control is not a significant drawback.
  • the user can select the mapping application control 5063 if need be, in order to return to the mapping application and therefore re-enter navigation.
  • the user selects the music control 5062 to enter a music application.
  • the vehicle has not yet approached the upcoming intersection. Thus, no directions are yet given for this maneuver.
  • the lower information bar 5040 that includes the simulated home button includes text stating "Return to Navigation" and has changed color. This indicates that the user can select anywhere on this bar 5040 (other than the home button) to return to the turn-by-turn navigation view.
• the above-described embodiments require the mobile device to output interfaces for the same application to both the mobile device display and the vehicle display. That is, when the user selects the mapping application in one display, the application opens in both. Similarly, when the user exits the mapping application (e.g., by pressing the home button) in one display, the home pages of both are displayed. In some embodiments, however, the user can have different applications open on the two different displays.
  • Figure 51 illustrates the situation in which the mapping application in turn-by-turn navigation mode is open on the vehicle display while the mobile device displays its home page over three stages 5105-5115 in which the vehicle approaches and passes through an intersection along the route being navigated.
  • the first stage 5105 illustrates a mobile device 5100 that displays a mobile device home page 5120 of some embodiments, with several selectable items for opening various applications (including the mapping application).
• the vehicle touchscreen displays a turn-by-turn navigation mode UI 5125.
  • the application displays a navigation instruction panel 5130 for the maneuver on the left side of the vehicle display, as shown in previous figures.
  • the device displays the maneuver instructions (with both a graphical indicator and text instructions).
  • both of these sets of instructions on the two displays count down the distance to the maneuver as the vehicle approaches the intersection.
• the third stage 5115 illustrates that after the vehicle passes through the intersection, the display on the mobile device 5100 returns to the home screen, while the navigation instruction panel 5130 is removed from the vehicle display.
• upon connection of the device to the vehicle system, the mapping application begins outputting the two user interfaces as shown in Figure 43. That is, the application sends the navigation mode map view that follows the vehicle location to the vehicle display, and outputs the maneuver instructions to the mobile device display. When the user disconnects the device from the vehicle system while in navigation mode, the application does the reverse: the mobile device display transitions from the maneuver instructions to the standard turn-by-turn navigation mode when not connected.
  • the device recognizes when the user has left her vehicle (e.g., based on the type of movement of the device consistent with walking) and transitions automatically from a driving navigation mode to a walking navigation mode.
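• One plausible (and purely illustrative) way to infer that transition is a speed heuristic over recent location samples; an actual implementation would more likely rely on the device's motion and activity sensing, which this sketch does not model.

```swift
import Foundation

enum NavigationMode { case driving, walking }

/// Infer whether the user is now on foot from recent speeds. The 2.5 m/s
/// cutoff is an assumption chosen for this sketch.
func inferredMode(recentSpeedsMetersPerSecond: [Double],
                  currentMode: NavigationMode) -> NavigationMode {
    guard !recentSpeedsMetersPerSecond.isEmpty else { return currentMode }
    let average = recentSpeedsMetersPerSecond.reduce(0, +)
                / Double(recentSpeedsMetersPerSecond.count)
    return average < 2.5 ? .walking : .driving
}
```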
• the software processes described in this document are specified, in some embodiments, as sets of instructions recorded on a computer readable storage medium (also referred to as computer readable medium).
• when these instructions are executed by one or more computational or processing unit(s) (e.g., one or more processors, cores of processors, or other processing units), they cause the processing unit(s) to perform the actions indicated in the instructions.
  • Examples of computer readable media include, but are not limited to, CD-ROMs, flash drives, random access memory (RAM) chips, hard drives, erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), etc.
  • the computer readable media does not include carrier waves and electronic signals passing wirelessly or over wired connections.
• the term "software" is meant to include firmware residing in read-only memory or applications stored in magnetic storage which can be read into memory for processing by a processor.
  • multiple software inventions can be implemented as sub-parts of a larger program while remaining distinct software inventions.
  • multiple software inventions can also be implemented as separate programs.
  • any combination of separate programs that together implement a software invention described here is within the scope of the invention.
  • the software programs when installed to operate on one or more electronic systems, define one or more specific machine implementations that execute and perform the operations of the software programs.
  • the mapping and navigation applications of some embodiments operate on mobile devices, such as smart phones (e.g., iPhones®) and tablets (e.g., iPads®).
  • Figure 52 is an example of an architecture 5200 of such a mobile computing device. Examples of mobile computing devices include smartphones, tablets, laptops, etc.
  • the mobile computing device 5200 includes one or more processing units 5205, a memory interface 5210 and a peripherals interface 5215.
  • the peripherals interface 5215 is coupled to various sensors and subsystems, including a camera subsystem 5220, a wired communication subsystem(s) 5223, a wireless communication subsystem(s) 5225, an audio subsystem 5230, an I/O subsystem 5235, etc.
  • the peripherals interface 5215 enables communication between the processing units 5205 and various peripherals.
• an orientation sensor 5245 (e.g., a gyroscope) and an acceleration sensor 5250 (e.g., an accelerometer) are also coupled to the peripherals interface 5215 to facilitate orientation and acceleration functions.
  • the camera subsystem 5220 is coupled to one or more optical sensors 5240 (e.g., a charged coupled device (CCD) optical sensor, a complementary metal-oxide-semiconductor (CMOS) optical sensor, etc.).
  • the camera subsystem 5220 coupled with the optical sensors 5240 facilitates camera functions, such as image and/or video data capturing.
  • the wired communication subsystem 5223 and wireless communication subsystem 5225 serve to facilitate communication functions.
  • the wired communication system includes a USB connector for connecting the mobile device 5200 to a vehicle electronic system.
  • the interface of some embodiments for communicating with a vehicle electronic system is described in further detail in U.S. Patent Publications 2009/0284476, 2010/0293462, 2011/0145863, 2011/0246891, and 2011/0265003, which are incorporated by reference above.
  • the wireless communication subsystem 5225 includes radio frequency receivers and transmitters, and optical receivers and transmitters (not shown in Figure 52). These receivers and transmitters of some embodiments are implemented to operate over one or more communication networks such as a GSM network, a Wi-Fi network, a Bluetooth network, etc.
  • the audio subsystem 5230 is coupled to a speaker to output audio (e.g., to output voice navigation instructions). Additionally, the audio subsystem 5230 is coupled to a microphone to facilitate voice-enabled functions, such as voice recognition (e.g., for searching), digital recording, etc.
  • the I/O subsystem 5235 involves the transfer between input/output peripheral devices, such as a display, a touch screen, etc., and the data bus of the processing units 5205 through the peripherals interface 5215.
  • the I/O subsystem 5235 includes a touch-screen controller 5255 and other input controllers 5260 to facilitate the transfer between input/output peripheral devices and the data bus of the processing units 5205.
  • the touch-screen controller 5255 is coupled to a touch screen 5265.
  • the touch-screen controller 5255 detects contact and movement on the touch screen 5265 using any of multiple touch sensitivity technologies.
  • the other input controllers 5260 are coupled to other input/control devices, such as one or more buttons. Some embodiments include a near-touch sensitive screen and a corresponding controller that can detect near-touch interactions instead of or in addition to touch interactions.
  • the memory interface 5210 is coupled to memory 5270.
• the memory 5270 includes volatile memory (e.g., high-speed random access memory), non-volatile memory (e.g., flash memory), a combination of volatile and non-volatile memory, and/or any other type of memory.
  • the memory 5270 stores an operating system (OS) 5272.
  • the OS 5272 includes instructions for handling basic system services and for performing hardware dependent tasks.
  • the memory 5270 also includes communication instructions 5274 to facilitate communicating with one or more additional devices; graphical user interface instructions 5276 to facilitate graphic user interface processing; image processing instructions 5278 to facilitate image-related processing and functions; input processing instructions 5280 to facilitate input-related (e.g., touch input) processes and functions; audio processing instructions 5282 to facilitate audio-related processes and functions; and camera instructions 5284 to facilitate camera-related processes and functions.
  • the instructions described above are merely exemplary and the memory 5270 includes additional and/or other instructions in some embodiments.
  • the memory for a smartphone may include phone instructions to facilitate phone-related processes and functions.
  • the memory may include instructions for a mapping and navigation application as well as other applications.
  • the above-identified instructions need not be implemented as separate software programs or modules.
  • Various functions of the mobile computing device can be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits.
  • FIG. 53 conceptually illustrates an example of a vehicle electronic system 5300 with which some embodiments of the invention are implemented.
  • Such an electronic system includes various types of computer readable media and interfaces for various other types of computer readable media.
• Electronic system 5300 includes a bus 5305, processing unit(s) 5310, a graphics processing unit (GPU) 5315, a system memory 5320, communication subsystem(s) 5325, a read-only memory 5330, a permanent storage device 5335, input controls 5340, and output devices 5345.
  • the bus 5305 collectively represents all system, peripheral, and chipset buses that communicatively connect the numerous internal devices of the electronic system 5300.
  • the bus 5305 communicatively connects the processing unit(s) 5310 with the read-only memory 5330, the GPU 5315, the system memory 5320, and the permanent storage device 5335.
  • the processing unit(s) 5310 retrieves instructions to execute and data to process in order to execute the processes of the invention performed by the vehicle electronic system.
  • the processing unit(s) may be a single processor or a multi-core processor in different embodiments. Some instructions are passed to and executed by the GPU 5315.
  • the GPU 5315 can offload various computations or complement the image processing provided by the processing unit(s) 5310.
  • the vehicle system uses a system-on-a-chip that includes one or more CPU cores as well as one or more GPU cores.
  • the read-only-memory (ROM) 5330 stores static data and instructions that are needed by the processing unit(s) 5310 and other modules of the vehicle electronic system.
• the permanent storage device 5335 is a read-and-write memory device. This device is a non-volatile memory unit that stores instructions and data even when the electronic system 5300 is off. Some embodiments of the invention use a mass-storage device (such as a magnetic or optical disk and its corresponding disk drive, integrated flash memory) as the permanent storage device 5335.
• the system memory 5320 is a read-and-write memory device. However, unlike storage device 5335, the system memory 5320 is a volatile read-and-write memory, such as random access memory.
  • the system memory 5320 stores some of the instructions and data that the processor needs at runtime.
  • the invention's processes are stored in the system memory 5320, the permanent storage device 5335, and/or the read-only memory 5330.
  • the various memory units include instructions for processing multimedia clips in accordance with some embodiments. From these various memory units, the processing unit(s) 5310 retrieves instructions to execute and data to process in order to execute the processes of some embodiments.
  • the bus 5305 also connects to the input controls 5340 and output devices 5345.
  • the input devices 5340 enable the user to communicate information and select commands to the electronic system.
  • the input controls 5340 include various different inputs in different embodiments, including touchscreens, other hardware controls such as buttons, knobs, joysticks, trackballs, etc., microphones for receiving audio, and others.
  • the output devices 5345 include touchscreen and non-touchscreen displays, as well as speakers for producing audio.
  • bus 5305 also couples the vehicle electronic system 5300 to one or more communication subsystems 5325 through which the vehicle system can communicate with a mobile device, such as that shown above in Figure 52.
  • These communication subsystems may include both wireless (e.g., Bluetooth®, Wi-Fi, etc.) communication subsystems and wired (e.g., USB, Firewire®, etc.) communication subsystems. Any or all components of the vehicle electronic system 5300 may be used in conjunction with the invention.
  • Some embodiments include electronic components, such as microprocessors, storage and memory that store computer program instructions in a machine-readable or computer-readable medium (alternatively referred to as computer-readable storage media, machine-readable media, or machine-readable storage media).
• computer-readable media include RAM, ROM, read-only compact discs (CD-ROM), recordable compact discs (CD-R), rewritable compact discs (CD-RW), read-only digital versatile discs (e.g., DVD-ROM, dual-layer DVD-ROM), a variety of recordable/rewritable DVDs (e.g., DVD-RAM, DVD-RW, DVD+RW, etc.), flash memory (e.g., SD cards, mini-SD cards, micro-SD cards, etc.), magnetic and/or solid state hard drives, read-only and recordable Blu-Ray® discs, ultra density optical discs, any other optical or magnetic media, and floppy disks.
  • the computer-readable media may store a computer program that is executable by at least one processing unit and includes sets of instructions for performing various operations.
• Examples of computer programs or computer code include machine code, such as is produced by a compiler, and files including higher-level code that are executed by a computer, an electronic component, or a microprocessor using an interpreter.
• some embodiments are performed by one or more integrated circuits, such as application specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs), and some embodiments execute software stored in programmable logic devices (PLDs), read only memory (ROM), or random access memory (RAM) devices.
• the terms "computer", "server", "processor", and "memory" all refer to electronic or other technological devices. These terms exclude people or groups of people.
• the terms "display" or "displaying" mean displaying on an electronic device.
• the terms "computer readable medium," "computer readable media," and "machine readable medium" are entirely restricted to tangible, physical objects that store information in a form that is readable by a computer. These terms exclude any wireless signals, wired download signals, and any other ephemeral signals.
  • FIG. 54 illustrates a map service operating environment, according to some embodiments.
  • a map service 5430 (also referred to as mapping service) may provide map services for one or more client devices 5402a-5402c in communication with the map service 5430 through various communication methods and protocols.
  • a map service 5430 in some embodiments provides map information and other map-related data, such as two-dimensional map image data (e.g., aerial view of roads utilizing satellite imagery), three-dimensional map image data (e.g., traversable map with three-dimensional features, such as buildings), route and direction calculation (e.g., ferry route calculations or directions between two points for a pedestrian), real-time navigation data (e.g., turn-by-turn visual navigation data in two or three dimensions), location data (e.g., where is the client device currently located), and other geographic data (e.g., wireless network coverage, weather, traffic information, or nearby points-of-interest).
  • the map service data may include localized labels for different countries or regions; localized labels may be utilized to present map labels (e.g., street names, city names, points of interest) in different languages on client devices.
  • Client devices 5402a-5402c may utilize these map services by obtaining map service data.
  • Client devices 5402a-5402c may implement various techniques to process map service data.
  • Client devices 5402a-5402c may then provide map services to various entities, including, but not limited to, users, internal software or hardware modules, and/or other systems or devices external to the client devices 5402a-5402c.
  • a map service is implemented by one or more nodes in a distributed computing system. Each node may be assigned one or more services or components of a map service. Some nodes may be assigned the same map service or component of a map service.
  • a load balancing node in some embodiments distributes access or requests to other nodes within a map service.
  • a map service is implemented as a single system, such as a single server. Different modules or hardware devices within a server may implement one or more of the various services provided by a map service.
  • a map service in some embodiments provides map services by generating map service data in various formats.
  • one format of map service data is map image data.
  • Map image data provides image data to a client device so that the client device may process the image data (e.g., rendering and/or displaying the image data as a two-dimensional or three-dimensional map).
  • Map image data may specify one or more map tiles.
  • a map tile may be a portion of a larger map image. Assembling together the map tiles of a map produces the original map. Tiles may be generated from map image data, routing or navigation data, or any other map service data.
  • map tiles are raster-based map tiles, with tile sizes that may be larger or smaller than the commonly-used 256 pixel by 256 pixel tile.
  • Raster-based map tiles may be encoded in any number of standard digital image representations including, but not limited to, Bitmap (.bmp), Graphics Interchange Format (.gif), Joint Photographic Experts Group (.jpg, .jpeg, etc.), Portable Network Graphics (.png), or Tagged Image File Format (.tiff).
  • map tiles are vector-based map tiles, encoded using vector graphics, including, but not limited to, Scalable Vector Graphics (.svg) or a Drawing File (.drw). Some embodiments also include tiles with a combination of vector and raster data. Metadata or other information pertaining to the map tile may also be included within or along with a map tile, providing further map service data to a client device.
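
As a concrete illustration of the tile formats just described, the following Swift sketch models a client-side tile that may carry raster data, vector data, or both, along with accompanying metadata. The type and field names are illustrative assumptions, not the actual data structures of the described map service.

```swift
import Foundation

// Hypothetical client-side representation of a map tile as described above.
// A tile may carry raster image bytes, encoded vector geometry, or both,
// plus optional metadata (e.g., style identifiers) delivered with the tile.
struct MapTile {
    enum Payload {
        case raster(imageData: Data)           // e.g., PNG or JPEG bytes
        case vector(encodedGeometry: Data)     // e.g., SVG or another vector encoding
        case hybrid(imageData: Data, encodedGeometry: Data)
    }

    let x: Int                        // column index within the tile grid
    let y: Int                        // row index within the tile grid
    let zoom: Int                     // zoom level the tile was generated for
    let payload: Payload
    let metadata: [String: String]    // e.g., ["style": "night", "locale": "en-US"]
}
```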
  • a map tile is encoded for transport utilizing various standards and/or protocols.
  • map tiles may be constructed from image data of different resolutions depending on zoom level. For instance, for low zoom level (e.g., world or globe view), the resolution of map or image data need not be as high relative to the resolution at a high zoom level (e.g., city or street level). For example, when in a globe view, there may be no need to render street level artifacts as such objects would be so small as to be negligible in many cases.
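
The relationship between zoom level and tile resolution can be made concrete with the widely used Web Mercator ("slippy map") tiling math. This is a hedged sketch of one common scheme; the described map service is not stated to use this exact encoding.

```swift
import Foundation

// Convert a latitude/longitude to tile indices at a given zoom level using the
// common Web Mercator scheme: each zoom level doubles the number of tiles per
// axis, so a globe view needs far fewer, lower-resolution tiles than a
// street-level view.
func tileCoordinate(latitude: Double, longitude: Double, zoom: Int) -> (x: Int, y: Int) {
    let n = pow(2.0, Double(zoom))   // tiles per axis at this zoom level
    let x = Int((longitude + 180.0) / 360.0 * n)
    let latRad = latitude * .pi / 180.0
    let y = Int((1.0 - log(tan(latRad) + 1.0 / cos(latRad)) / .pi) / 2.0 * n)
    return (x, y)
}

// Example: the same location falls in a single coarse tile at zoom 1 (globe
// view) but maps to a specific street-level tile at zoom 17.
let globe = tileCoordinate(latitude: 37.33, longitude: -122.03, zoom: 1)
let street = tileCoordinate(latitude: 37.33, longitude: -122.03, zoom: 17)
print(globe, street)
```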
  • a map service in some embodiments performs various techniques to analyze a map tile before encoding the tile for transport. This analysis may optimize map service performance for both client devices and a map service.
  • map tiles are analyzed for complexity, according to vector-based graphic techniques, and constructed utilizing complex and non-complex layers. Map tiles may also be analyzed for common image data or patterns that may be rendered as image textures and constructed by relying on image masks.
  • raster-based image data in a map tile contains certain mask values, which are associated with one or more textures.
  • Some embodiments also analyze map tiles for specified features that may be associated with certain map styles that contain style identifiers.
  • map services generate map service data relying upon various data formats separate from a map tile in some embodiments.
  • map services that provide location data may utilize data formats conforming to location service protocols, such as, but not limited to, Radio Resource Location services Protocol (RRLP), TIA 801 for Code Division Multiple Access (CDMA), Radio Resource Control (RRC) position protocol, or LTE Positioning Protocol (LPP).
  • Embodiments may also receive or request data from client devices identifying device capabilities or attributes (e.g., hardware specifications or operating system version) or communication capabilities (e.g., device communication bandwidth as determined by wireless signal strength or wire or wireless network type).
  • a map service may obtain map service data from internal or external sources.
  • satellite imagery used in map image data may be obtained from external services, or internal systems, storage devices, or nodes.
  • Other examples may include, but are not limited to, GPS assistance servers, wireless network coverage databases, business or personal directories, weather data, government information (e.g., construction updates or road name changes), or traffic reports.
  • Some embodiments of a map service may update map service data (e.g., wireless network coverage) for analyzing future requests from client devices.
  • a map service may respond to client device requests for map services. These requests may be a request for a specific map or portion of a map. Some embodiments format requests for a map as requests for certain map tiles. In some embodiments, requests also supply the map service with starting locations (or current locations) and destination locations for a route calculation. A client device may also request map service rendering information, such as map textures or style sheets. In at least some embodiments, requests are also one of a series of requests implementing turn-by-turn navigation. Requests for other geographic data may include, but are not limited to, current location, wireless network coverage, weather, traffic information, or nearby points-of-interest.
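
To make the request types above concrete, here is a minimal Swift sketch of how a client might model them. The case and field names are assumptions for illustration only, not the service's actual wire protocol.

```swift
import Foundation

// Hypothetical model of the client-to-map-service requests described above:
// specific tiles, a route between two locations, rendering resources such as
// style sheets, and other geographic data.
enum GeoDataKind {
    case traffic, weather, wirelessCoverage, pointsOfInterest
}

enum MapServiceRequest {
    case tiles(coordinates: [(x: Int, y: Int, zoom: Int)])
    case route(start: (lat: Double, lon: Double),
               destination: (lat: Double, lon: Double),
               turnByTurn: Bool)
    case renderingInfo(styleSheetName: String)
    case geoData(kind: GeoDataKind, near: (lat: Double, lon: Double))
}
```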
  • a map service analyzes client device requests to optimize a device or map service operation. For instance, a map service may recognize that the location of a client device is in an area of poor communications (e.g., weak wireless signal) and send more map service data to supply a client device in the event of loss in communication or send instructions to utilize different client hardware (e.g., orientation sensors) or software (e.g., utilize wireless location services or Wi-Fi positioning instead of GPS-based services).
  • a map service may analyze a client device request for vector-based map image data and determine that raster-based map data better optimizes the map image data according to the image's complexity. Embodiments of other map services may perform similar analysis on client device requests and as such the above examples are not intended to be limiting.
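
One way to picture the request analysis described above is a small heuristic that widens tile prefetching when a client reports weak connectivity. The thresholds and names below are assumptions, not the service's actual logic.

```swift
// Sketch of the adaptation described above: when a client reports weak
// connectivity, the service sends a larger surrounding region of tiles so the
// device can keep rendering if the connection drops.
enum SignalQuality { case strong, moderate, weak }

func tileCountToSend(requestedTileCount: Int, signal: SignalQuality) -> Int {
    switch signal {
    case .strong:   return requestedTileCount        // send only the requested area
    case .moderate: return requestedTileCount * 2    // modest buffer around it
    case .weak:     return requestedTileCount * 4    // aggressive buffer for likely dropouts
    }
}
```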
  • client devices 5402a-5402c utilize map service 5430 through various communication methods and protocols.
  • client devices 5402a-5402c obtain map service data from map service 5430.
  • Client devices 5402a-5402c request or receive map service data.
  • Client devices 5402a-5402c then process map service data (e.g., render and/or display the data) and may send the data to another software or hardware module on the device or to an external device or system.
  • a client device implements techniques to render and/or display maps. These maps may be requested or received in various formats, such as map tiles described above.
  • a client device may render a map in two-dimensional or three-dimensional views.
  • Some embodiments of a client device display a rendered map and allow a user, system, or device providing input to manipulate a virtual camera in the map, changing the map display according to the virtual camera's position, orientation, and field-of-view.
  • Various forms of input devices are implemented to manipulate a virtual camera. In some embodiments, touch input, through certain single or combination gestures (e.g., touch-and-hold or a swipe), manipulates the virtual camera. Other embodiments allow manipulation of the device's physical location to manipulate a virtual camera.
  • a client device may be tilted up from its current position to manipulate the virtual camera to rotate up.
  • a client device may be tilted forward from its current position to move the virtual camera forward.
  • Other input devices to the client device may be implemented including, but not limited to, auditory input (e.g., spoken words), a physical keyboard, mouse, and/or a joystick.
  • Some embodiments provide various visual feedback to virtual camera manipulations, such as displaying an animation of possible virtual camera manipulations when transitioning from two-dimensional map views to three-dimensional map views. Some embodiments also allow input to select a map feature or object (e.g., a building) and highlight the object, producing a blur effect that maintains the virtual camera's perception of three-dimensional space.
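
The gesture- and tilt-driven camera manipulation described above can be sketched as simple state updates on a virtual camera. The structure and clamping values below are illustrative assumptions.

```swift
// Minimal sketch of a virtual camera whose position, orientation, and pitch can
// be driven by touch gestures or by physical device tilt, as described above.
struct VirtualCamera {
    var latitude: Double
    var longitude: Double
    var heading: Double      // degrees, 0 = north
    var pitch: Double        // degrees above straight-down; 0 = 2D top-down view
    var altitude: Double     // meters above the map surface

    // A swipe/pan gesture translates the camera across the map.
    mutating func pan(deltaLatitude: Double, deltaLongitude: Double) {
        latitude += deltaLatitude
        longitude += deltaLongitude
    }

    // Tilting the device up rotates the camera up (increases pitch toward a 3D view).
    mutating func applyDeviceTilt(degrees: Double) {
        pitch = min(max(pitch + degrees, 0), 75)   // clamp to a plausible range
    }

    // A pinch gesture zooms by changing the camera altitude.
    mutating func zoom(byFactor factor: Double) {
        altitude = max(altitude / factor, 50)
    }
}
```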
  • a client device implements a navigation system (e.g., turn-by-turn navigation).
  • a navigation system provides directions or route information, which may be displayed to a user.
  • Some embodiments of a client device request directions or a route calculation from a map service.
  • a client device may receive map image data and route data from a map service.
  • a client device implements a turn-by-turn navigation system, which provides real-time route and direction information based upon location information and route information received from a map service and/or other location system, such as Global Positioning Satellite (GPS).
  • a client device may display map image data that reflects the current location of the client device and update the map image data in real-time.
  • a navigation system may provide auditory or visual directions to follow a certain route.
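
As a rough sketch of the real-time updating described above, the following Swift code feeds CoreLocation updates into a hypothetical navigation display, recentering the map and cueing an upcoming maneuver once within an assumed distance threshold.

```swift
import CoreLocation

// NavigationDisplay is a hypothetical stand-in for the rendering layer; the
// 200-meter threshold is an assumption for illustration.
protocol NavigationDisplay {
    func recenter(on coordinate: CLLocationCoordinate2D, course: CLLocationDirection)
    func announceUpcomingManeuver()
}

final class TurnByTurnController: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let display: NavigationDisplay
    private let nextManeuverLocation: CLLocation
    private let maneuverThreshold: CLLocationDistance = 200   // meters

    init(display: NavigationDisplay, nextManeuverLocation: CLLocation) {
        self.display = display
        self.nextManeuverLocation = nextManeuverLocation
        super.init()
        // Location authorization handling is omitted for brevity.
        locationManager.delegate = self
        locationManager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
        locationManager.startUpdatingLocation()
    }

    // Each GPS fix recenters the map on the vehicle and, when the next maneuver
    // is close, triggers the auditory/visual cue described above.
    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let latest = locations.last else { return }
        display.recenter(on: latest.coordinate, course: latest.course)
        if latest.distance(from: nextManeuverLocation) < maneuverThreshold {
            display.announceUpcomingManeuver()
        }
    }
}
```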
  • a virtual camera is implemented to manipulate navigation map data according to some embodiments.
  • Some embodiments of client devices allow the device to adjust the virtual camera display orientation to bias toward the route destination. Some embodiments also allow the virtual camera to navigate turns by simulating the inertial motion of the virtual camera.
  • Client devices implement various techniques to utilize map service data from the map service. Some embodiments implement techniques to optimize rendering of two-dimensional and three-dimensional map image data.
  • a client device locally stores rendering information. For instance, a client stores a style sheet which provides rendering directions for image data containing style identifiers.
  • common image textures may be stored to decrease the amount of map image data transferred from a map service.
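
A minimal sketch of the local caching described above might look like the following; the class and key names are assumptions, and a production cache would also handle eviction and persistence.

```swift
import Foundation

// Sketch of the client-side caching described above: rendering directions
// (style sheets) and common image textures are stored locally, keyed by the
// identifiers that arrive with map image data, so they need not be re-downloaded.
final class RenderResourceCache {
    private var styleSheets: [String: Data] = [:]   // style identifier -> style sheet
    private var textures: [String: Data] = [:]      // texture/mask identifier -> image bytes

    func styleSheet(for identifier: String) -> Data? { styleSheets[identifier] }
    func texture(for identifier: String) -> Data? { textures[identifier] }

    func store(styleSheet: Data, for identifier: String) { styleSheets[identifier] = styleSheet }
    func store(texture: Data, for identifier: String) { textures[identifier] = texture }
}
```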
  • Client devices in different embodiments implement various modeling techniques to render two-dimensional and three-dimensional map image data, examples of which include, but are not limited to: generating three-dimensional buildings out of two-dimensional building footprint data; modeling two-dimensional and three-dimensional map objects to determine the client device communication environment; generating models to determine whether map labels are seen from a certain virtual camera position; and generating models to smooth transitions between map image data.
  • Some embodiments of client devices also order or prioritize map service data using certain techniques. For instance, a client device detects the motion or velocity of a virtual camera; if this exceeds certain threshold values, lower-detail image data is loaded and rendered for certain areas.
  • Other examples include: rendering vector-based curves as a series of points, preloading map image data for areas of poor communication with a map service, adapting textures based on display zoom level, or rendering map image data according to complexity.
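
The velocity-based prioritization mentioned above can be illustrated with a simple level-of-detail selector; the speed thresholds are arbitrary assumptions for the sketch.

```swift
// Sketch of the prioritization described above: when the virtual camera moves
// faster than certain thresholds, the client loads lower-detail tiles for the
// areas it is sweeping past; full detail returns once the camera slows down.
enum DetailLevel { case full, reduced, minimal }

func detailLevel(forCameraSpeed metersPerSecond: Double) -> DetailLevel {
    switch metersPerSecond {
    case ..<30:   return .full      // slow pan: render everything
    case ..<120:  return .reduced   // brisk pan: drop fine labels and textures
    default:      return .minimal   // fast fling: coarse geometry only
    }
}
```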
  • client devices communicate utilizing various data formats separate from a map tile.
  • client devices implement Assisted Global Positioning Satellites (A-GPS) and communicate with location services that utilize data formats conforming to location service protocols, such as, but not limited to, Radio Resource Location services Protocol (RRLP), TIA 801 for Code Division Multiple Access (CDMA), Radio Resource Control (RRC) position protocol, or LTE Positioning Protocol (LPP).
  • Client devices may also receive GPS signals directly.
  • Embodiments may also send data, with or without solicitation from a map service, identifying the client device's capabilities or attributes (e.g., hardware specifications or operating system version) or communication capabilities (e.g., device communication bandwidth as determined by wireless signal strength or wire or wireless network type).
  • Figure 54 illustrates one possible embodiment of an operating environment 5400 for a map service 5430 and client devices 5402a-5402c.
  • devices 5402a, 5402b, and 5402c communicate over one or more wire or wireless networks 5410.
  • For example, wireless network 5410 (e.g., a cellular network) can communicate with a wide area network (WAN) 5420 by use of gateway 5414.
  • a gateway 5414 in some embodiments provides a packet oriented mobile data service, such as General Packet Radio Service (GPRS), or other mobile data service allowing wireless networks to transmit data to other networks, such as wide area network 5420.
  • Likewise, an access device 5412 (e.g., an IEEE 802.11g wireless access device) provides communication access to WAN 5420.
  • Devices 5402a and 5402b can be any portable electronic or computing device capable of communicating with a map service.
  • Device 5402c can be any non-portable electronic or computing device capable of communicating with a map service.
  • both voice and data communications are established over wireless network 5410 and access device 5412.
  • device 5402a can place and receive phone calls (e.g., using voice over Internet Protocol (VoIP) protocols), send and receive e-mail messages (e.g., using Simple Mail Transfer Protocol (SMTP) or Post Office Protocol 3 (POP3)), and retrieve electronic documents and/or streams, such as web pages, photographs, and videos, over wireless network 5410, gateway 5414, and WAN 5420 (e.g., using Transmission Control Protocol/Internet Protocol (TCP/IP) or User Datagram Protocol (UDP)).
  • devices 5402b and 5402c can place and receive phone calls, send and receive e-mail messages, and retrieve electronic documents over access device 5412 and WAN 5420.
  • any of the illustrated client devices may communicate with map service 5430 and/or other service(s) 5450 using a persistent connection established in accordance with one or more security protocols, such as the Secure Sockets Layer (SSL) protocol or the Transport Layer Security (TLS) protocol.
  • Devices 5402a and 5402b can also establish communications by other means.
  • wireless device 5402a can communicate with other wireless devices (e.g., other devices 5402b, cell phones, etc.) over the wireless network 5410.
  • devices 5402a and 5402b can establish peer-to-peer communications 5440 (e.g., a personal area network) by use of one or more communication subsystems, such as Bluetooth® communication from Bluetooth Special Interest Group, Inc. of Kirkland, Washington.
  • Device 5402c can also establish peer-to-peer communications with devices 5402a or 5402b (not shown). Other communication protocols and topologies can also be implemented.
  • Devices 5402a and 5402b may also receive Global Positioning Satellite (GPS) signals from GPS satellites 5460.
  • Devices 5402a, 5402b, and 5402c can communicate with map service 5430 over the one or more wire and/or wireless networks 5410 or 5412.
  • map service 5430 can provide map service data to rendering devices 5402a, 5402b, and 5402c.
  • Map service 5430 may also communicate with other services 5450 to obtain data to implement map services.
  • Map service 5430 and other services 5450 may also receive GPS signals from GPS satellites 5460.
  • map service 5430 and/or other service(s) 5450 are configured to process search requests from any of client devices.
  • Search requests may include but are not limited to queries for businesses, addresses, residential locations, points of interest, or some combination thereof.
  • Map service 5430 and/or other service(s) 5450 may be configured to return results related to a variety of parameters including but not limited to a location entered into an address bar or other text entry field (including abbreviations and/or other shorthand notation), a current map view (e.g., user may be viewing one location on the multifunction device while residing in another location), current location of the user (e.g., in cases where the current map view did not include search results), and the current route (if any).
  • these parameters may affect the composition of the search results (and/or the ordering of the search results) based on different priority weightings.
  • the search results that are returned may be a subset of results selected based on specific criteria including but not limited to a quantity of times the search result (e.g., a particular point of interest) has been requested, a measure of quality associated with the search result (e.g., highest user or editorial review rating), and/or the volume of reviews for the search result (e.g., the number of times the search result has been reviewed or rated).
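
As an illustration of these selection criteria, the following sketch ranks candidate results by request count, rating, and review volume. The scoring weights are assumptions, not the weighting actually used by the map service.

```swift
import Foundation

// Rank candidate search results by how often they are requested, their review
// rating, and the number of reviews, as described above. Weights are illustrative.
struct SearchResult {
    let name: String
    let timesRequested: Int
    let averageRating: Double    // e.g., 0.0 - 5.0
    let reviewCount: Int
}

func rank(_ results: [SearchResult], limit: Int) -> [SearchResult] {
    func score(_ r: SearchResult) -> Double {
        let popularity = log(Double(r.timesRequested) + 1)
        let volume = log(Double(r.reviewCount) + 1)
        return 0.5 * popularity + 0.3 * r.averageRating + 0.2 * volume
    }
    return Array(results.sorted { score($0) > score($1) }.prefix(limit))
}
```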
  • map service 5430 and/or other service(s) 5450 are configured to provide auto-complete search results that are displayed on the client device, such as within the mapping application. For instance, auto-complete search results may populate a portion of the screen as the user enters one or more search keywords on the multifunction device. In some cases, this feature may save the user time as the desired search result may be displayed before the user enters the full search query.
  • the auto-complete search results may be search results found by the client on the client device (e.g., bookmarks or contacts), search results found elsewhere (e.g., from the Internet) by map service 5430 and/or other service(s) 5450, and/or some combination thereof.
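
A minimal sketch of this merging of local and remote auto-complete sources is shown below; the types and the local-first ordering are illustrative assumptions.

```swift
// As the user types, suggestions found locally (bookmarks, contacts) are merged
// with suggestions returned by the map service, with local matches listed first.
struct Suggestion {
    enum Source { case local, remote }
    let title: String
    let source: Source
}

func autoComplete(prefix: String,
                  localEntries: [String],
                  remoteSuggestions: [String]) -> [Suggestion] {
    let local = localEntries
        .filter { $0.lowercased().hasPrefix(prefix.lowercased()) }
        .map { Suggestion(title: $0, source: .local) }
    let remote = remoteSuggestions
        .map { Suggestion(title: $0, source: .remote) }
        .filter { candidate in !local.contains { $0.title == candidate.title } }
    return local + remote
}
```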
  • any of the search queries may be entered by the user via voice or through typing.
  • the multifunction device may be configured to display search results graphically within any of the map displays described herein. For instance, a pin or other graphical indicator may specify locations of search results as points of interest.
  • responsive to a user selection of one of these points of interest (e.g., a touch selection, such as a tap), the multifunction device is configured to display additional information about the selected point of interest including but not limited to ratings, reviews or review snippets, hours of operation, store status (e.g., open for business, permanently closed, etc.), and/or images of a storefront for the point of interest.
  • any of this information may be displayed on a graphical information card that is displayed in response to the user's selection of the point of interest.
  • map service 5430 and/or other service(s) 5450 provide one or more feedback mechanisms to receive feedback from client devices 5402a-5402c.
  • client devices may provide feedback on search results to map service 5430 and/or other service(s) 5450 (e.g., feedback specifying ratings, reviews, temporary or permanent business closures, errors etc.); this feedback may be used to update information about points of interest in order to provide more accurate or more up-to-date search results in the future.
  • map service 5430 and/or other service(s) 5450 may provide testing information to the client device (e.g., an A/B test) to determine which search results are best.
  • the client device may receive and present two search results to a user and allow the user to indicate the best result.
  • the client device may report the test results to map service 5430 and/or other service(s) 5450 to improve future search results based on the chosen testing technique, such as an A/B test technique in which a baseline control sample is compared to a variety of single-variable test samples in order to improve results.

Abstract

Some embodiments provide a method for an application that operates on a mobile device. The method predicts several likely destinations for a vehicle to which the mobile device is connected based on data from several different sources. The method generates, for a display screen of the vehicle, a display that includes the several likely destinations. In some embodiments, the method receives a first type of input through a control of the vehicle to select one of the likely destinations, and enters a turn-by-turn navigation mode to the selected destination in response to the received input. In some embodiments, the display is for a first destination of the several likely destinations. The method receives a second type of input through a control of the vehicle to step through the set of likely destinations, and generates a display for a second destination in response to the input.

Description

USER INTERFACE FOR DISPLAYING PREDICTED DESTINATIONS
BACKGROUND
Portable media devices, such as smartphones, have the capability to run advanced mapping and navigation applications (e.g., Apple Maps®, which operates on the iPhone®, iPad®, and iPad Mini®). Some of these mapping and navigation applications include turn-by-turn navigation features, which can be helpful while driving. However, interacting with the mapping and navigation application while driving may be difficult due to the small size of many mobile devices, and therefore the small size of the various controls on the mobile devices.
In addition, many vehicles include in-car navigation systems. These in-car navigation systems operate independently of any of the driver's other devices, and offer a larger and conveniently positioned screen. However, these in-car navigation systems generally provide a more limited experience than the more robust mapping applications of the mobile device due to the inherent limitations of the vehicle.
BRIEF SUMMARY
Some embodiments of the invention provide an application that generates multiple user interfaces for display on multiple devices at the same time. In some embodiments, the application is an integrated mapping and navigation application that runs on a mobile device (e.g., a smart phone, tablet computer, media player, etc.) and generates both (i) a user interface for display on the mobile device and (ii) a user interface for display on a screen of a vehicle to which the mobile device connects. The integrated mapping and navigation application (referred to below as a mapping application) generates both user interfaces simultaneously for simultaneous output and display.
In addition, the mapping application of some embodiments generates different user interfaces for display on the screens of different types of vehicles. Some embodiments generate different user interfaces for each different individual vehicle. On the other hand, some embodiments generate different user interfaces for categories of vehicle screens, such as high-quality touchscreens, low-quality touchscreens, and non-touch screens (with which a user interacts via separate controls built into the vehicle). The mapping application of some embodiments, when connected to a vehicle, identifies the type of display screen built into the vehicle, and automatically outputs the correct user interface for the vehicle.
A user of the mapping application may interact with the application via the vehicle interface as well as the mobile device interface (a touchscreen interface in some embodiments). Because of the different capabilities of the different interfaces, as well as the differences in likely user behavior for interactions with the different interfaces, the same operation or type of operation may be implemented differently between the mobile device interface and the vehicle interface. For instance, the mobile device may have the capability to interpret multi-touch gestures (e.g., a pinch gesture to zoom in or out), whereas the vehicle interface may not have multi-touch capability (or any touch capability), and therefore requires different user interaction to zoom (e.g., selection of zoom in and zoom out buttons, either on the touchscreen or the vehicle interface).
Furthermore, because of the different capabilities of the different types of display screens, a user may interact differently with the application user interfaces displayed on high-quality touchscreens, low-quality touchscreens, and non-touchscreens. For instance, the interaction for scrolling through a map on a vehicle touchscreen may involve a swiping gesture similar to that used for scrolling through the map on a mobile device. However, a low-quality touchscreen may not have the ability to interpret such gestural input, and therefore the user interface for low-quality touchscreens includes selectable (e.g., via a tap input) arrows for scrolling in different directions. The non-touchscreen vehicle interface, of course, will require input through other controls (e.g., a joystick, buttons, etc.) that are built into the vehicle.
Beyond simply exploring a map (e.g., by scrolling and zooming), the vehicle interface output by the mapping application provides additional features in some embodiments. In some embodiments, the vehicle screen interface for the mapping application is geared towards identifying a destination for a user and entering a navigation mode for a route to that destination, with as little user interaction as possible (because the user is often the driver). For example, through the vehicle interface, a user (e.g., the driver of the vehicle, a passenger of the vehicle, etc.) may search for destinations on the map. The user may search for a specific address, a specific place name, a generic type of place name, etc. In some embodiments, the user searches through the vehicle interface via voice interaction (i.e., dictating a search into a microphone of either the mobile device or the vehicle). The user can scroll through these results in the vehicle interface (through touchscreen interactions, built-in vehicle control interactions, etc.), and choose to enter a navigation mode with a search result as a destination.
In addition, the mapping application of some embodiments offers a predictive routing feature accessible through the vehicle user interface. While driving, the user can select an option to enter the predictive routing mode, in which the mapping application presents various likely routes to the user for navigation. The mapping application may generate the likely routes based on a variety of factors, including upcoming appointments or events on a calendar or other scheduling application that runs on the mobile device, and analysis of routes taken in the past by the mobile device (e.g., a route from a user's home to work). The predictive routing feature may additionally factor in traffic to identify potential difficulties in a usual route or in reaching a location on time.
The mapping application of some embodiments also offers a recent locations feature that provides a user with recent destinations, results of recent searches, etc. Some embodiments provide search results exclusively from recent searches entered or destinations navigated to through the vehicle interface. On the other hand, some embodiments additionally include search results from recent searches made through the device, even before the connection of the device to the vehicle interface. Thus, if a user searches for a particular destination on her mobile device while walking to her car, then enters the car and connects her device to the car interface, the particular destination will appear as a recent and easily selectable search, without requiring the user to re-enter the search.
Once the user selects a destination, the mapping application enters a turn-by-turn navigation mode in some embodiments. In this mode, some embodiments output different displays to the vehicle display and the mobile device display. The vehicle display, in some embodiments, displays the user's location and the upcoming route, in either a two dimensional mode or a three dimensional mode. The application of some embodiments generates this display from a perspective rendering position within a three dimensional navigation scene, though the view may be shown from directly above the scene so as to render a two dimensional view. The user can interact with the vehicle user interface to, e.g., view a list of maneuvers to make for the route (e.g., a right turn onto a particular street), change between two and three dimensions, and other interactions. Furthermore, in some embodiments, when the vehicle reaches a location within a particular threshold of the next maneuver, a portion of the vehicle screen displays a representation for the maneuver (e.g., an intersection with an arrow that represents the vehicle's path through the intersection, as well as text directions for the maneuver). Once the vehicle has passed through the intersection, the representation of the maneuver disappears from the display screen of the vehicle. While the vehicle display shows the upcoming route on a map, the mobile device display of some embodiments subsequently shows a representation for the upcoming maneuver, irrespective of the distance for the vehicle to travel before making the maneuver.
The preceding Summary is intended to serve as a brief introduction to some embodiments of the invention. It is not meant to be an introduction or overview of all inventive subject matter disclosed in this document. The Detailed Description that follows and the Drawings that are referred to in the Detailed Description will further describe the embodiments described in the Summary as well as other embodiments. Accordingly, to understand all the embodiments described by this document, a full review of the Summary, Detailed Description and the Drawings is needed. Moreover, the claimed subject matters are not to be limited by the illustrative details in the Summary, Detailed Description and the Drawings, but rather are to be defined by the appended claims, because the claimed subject matters can be embodied in other specific forms without departing from the spirit of the subject matters.
BRIEF DESCRIPTION OF THE FIGURES
Figure 1 conceptually illustrates a mapping and navigation application that generates multiple user interfaces simultaneously.
Figure 2 illustrates an example of a mobile device connected to the interface of a vehicle system.
Figure 3 conceptually illustrates a simplified software architecture for a mapping and navigation application of some embodiments.
Figure 4 conceptually illustrates a process of some embodiments performed by the mapping application to output a vehicle user interface.
Figure 5 illustrates a first example of a high-quality touchscreen user interface, a second example of a low-quality touchscreen user interface, and a third example of a non-touch user interface.
Figure 6 illustrates an additional difference between the high-quality touchscreen interface and the low-quality touchscreen interface of the mapping application in some embodiments.
Figure 7 illustrates the high-quality touchscreen vehicle user interface of some embodiments in which the user navigates through various features of the interface, including the recent destinations feature, the bookmarked destinations feature, and the predictive routing feature.
Figure 8 illustrates the use of the search function of some embodiments to identify a destination and then enter a navigation mode.
Figure 9 illustrates the use of zoom and scroll functionality as well as the position control to re-center the displayed map region on the device's current location in the high-quality touchscreen interface of some embodiments.
Figure 10 illustrates a user exploring a map in both the mapping application interface displayed on a mobile device as well as the high-quality vehicle touchscreen interface displayed on the screen of a vehicle to which the mobile device connects.
Figure 11 illustrates various map exploration functions, including zooming, scrolling, etc. in the low-quality touchscreen interface of some embodiments.
Figure 12 conceptually illustrates a region representing the map view area of the low-quality touchscreen user interface.
Figure 13 conceptually illustrates a process performed by the mapping application of some embodiments in order to translate a selection input into a scroll of the map display for a low-quality touchscreen vehicle interface.
Figure 14 illustrates an alternative low-quality touchscreen interface of some embodiments.
Figure 15 illustrates a feature of the scroll arrows of the low-quality touchscreen interface of some embodiments, in which, as a user holds down a touch input for an extended period of time over one of the scroll arrows, the map scrolling accelerates.
Figure 16 illustrates the use of scroll arrows to navigate through a list of recents in the low-quality touchscreen interface of some embodiments.
Figure 17 illustrates the operation of such a non-touchscreen vehicle user interface according to some embodiments.
Figure 18 illustrates an example of map exploration in a non-touchscreen vehicle user interface of some embodiments.
Figures 19 and 20 illustrate the search feature of some embodiments in different cases of ambiguous search terms.
Figure 21 illustrates an example of such a search that produces multiple results.
Figure 22 illustrates the operation of the mapping application search feature for the non-touchscreen vehicle user interface of some embodiments.
Figure 23 conceptually illustrates a process performed by the mapping application of some embodiments in order to retrieve search results for a search dictated through the vehicle user interface.
Figure 24 illustrates an example of a user performing a search within the mapping application mobile device interface while the mobile device is connected to a vehicle interface.
Figure 25 illustrates the mapping application of some embodiments in which a user performs a search on the mobile device before connecting the device to the vehicle system, then accesses the recent destinations on the vehicle interface of the mapping application which now include the previously-searched destination.
Figure 26 illustrates the mapping application of some embodiments in which a user performs a search on the mobile device before connecting the device to the vehicle system, then opens the mapping application on the vehicle interface.
Figure 27 illustrates three stages of a mapping application in which a user adds a location to the mapping application from a web browser on the mobile device, which adds the location to the mapping application table of recent destinations.
Figure 28 illustrates a recent destinations page of some embodiments as the vehicle displaying the recent destinations page travels over a stretch of road.
Figure 29 illustrates a user utilizing the recents feature on a mobile device of some embodiments in order to select a destination.
Figure 30 illustrates a user utilizing the recents feature on the vehicle interface of some embodiments in order to select a destination.
Figure 31 illustrates the use of this predictive routing interface on the high-quality vehicle touchscreen display of some embodiments in which the user activates the interface and views several different predicted destinations.
Figure 32 illustrates the same scenario as Figure 31, displayed in the low-quality touchscreen interface of some embodiments.
Figure 33 illustrates an example of the predictive routing engine of some embodiments reordering the likelihood of various destinations, and the subsequent effects in the predictive routing interface.
Figure 34 illustrates another example of changes made by the predictive routing engine to the order of likely destinations.
Figure 35 illustrates the animation of sliding a conceptual filmstrip with different predicted routes through the display screen.
Figure 36 illustrates the animation between two predicted route displays when the displays are arranged conceptually as stacked cards.
Figure 37 illustrates a feature of the predictive routing interface of some embodiments for entering the turn-by-turn navigation mode of the mapping application.
Figure 38 conceptually illustrates a simplified software architecture of a mapping and navigation application of some embodiments that operates on a mobile device and performs predictive routing.
Figure 39 illustrates the vehicle display screen over four stages in which a user activates a messaging function and dictates a message to a recipient.
Figure 40 illustrates the operation of the turn-by-turn navigation user interface of some embodiments for the vehicle display.
Figure 41 illustrates the animation of the direction list in the vehicle interface of some embodiments.
Figure 42 illustrates a user accessing an overview mode in the vehicle user interface.
Figure 43 illustrates the display of the mobile device during navigation according to some embodiments.
Figures 44 and 45 illustrate the displays of both the mobile device and vehicle UIs of some embodiments in which the vehicle approaches a first maneuver with a second maneuver following the first in rapid succession.
Figure 46 illustrates a use of the device interface to request a route from the device's current location to a destination location, then the subsequent selection to enter navigation.
Figure 47 illustrates the display and use of controls on the mobile device during navigation to enter a route overview display.
Figure 48 illustrates a device in power-saving mode as well as the vehicle display screen to which the device is connected during turn-by-turn navigation.
Figure 49 illustrates the home pages of the mobile device and vehicle displays of some embodiments while turn-by-turn navigation mode is operational and the vehicle approaches a maneuver.
Figure 50 illustrates the display outside the mapping application with turn-by-turn navigation operating according to some embodiments.
Figure 51 illustrates a situation in which the mapping application of some embodiments in turn-by-turn navigation mode is open on the vehicle display while the mobile device displays its home page.
Figure 52 illustrates an example of an architecture of a mobile computing device.
Figure 53 conceptually illustrates an example of an electronic system with which some embodiments of the invention are implemented.
Figure 54 illustrates a map service operating environment according to some embodiments.
DETAILED DESCRIPTION
In the following detailed description of the invention, numerous details, examples, and embodiments of the invention are set forth and described. However, it will be clear and apparent to one skilled in the art that the invention is not limited to the embodiments set forth and that the invention may be practiced without some of the specific details and examples discussed.
Some embodiments of the invention provide an application that generates multiple user interfaces for display on multiple devices at the same time. In some embodiments, the application is an integrated mapping and navigation application that runs on a mobile device (e.g., a smart phone, tablet computer, media player, etc.) and generates both (i) a user interface for display on the mobile device and (ii) a user interface for display on a screen of a vehicle to which the mobile device connects. The integrated mapping and navigation application (referred to below as a mapping application) generates both user interfaces simultaneously for simultaneous output and display.
Figure 1 conceptually illustrates such a mapping and navigation application 100 that generates multiple user interfaces simultaneously. In some embodiments, the mapping and navigation application 100 operates on a mobile device such as a smart phone or tablet device (e.g., an iPhone®, iPad®, etc.). As shown in this figure, the mapping and navigation application 100 outputs a first user interface 105 to a local display module 110 of the mobile device and a second user interface 115 to a communication interface 120 with a vehicle.
The local display module 110 of some embodiments handles display of the first user interface 105 on a display screen of the mobile device on which the mapping application 100 operates. In some embodiments, a user of the mapping application may interact with the mobile device user interface 105 (e.g., through touchscreen interactions).
The communication interface 120 to the vehicle is an interface for a wired (e.g., USB, etc.) or wireless (e.g., Bluetooth®, Wi-Fi, etc.) connection with a vehicle electronic system. In some embodiments, the vehicle may be a car, truck, boat, or other motor vehicle with a display screen and ability to connect to the mobile device. The vehicle electronic system of some embodiments includes a display screen through which the second user interface 115 is displayed. In addition to interacting with the user interface 105 displayed on the mobile device, the user may also interact with the vehicle user interface 115 through various vehicle controls. These controls may include a touchscreen and/or various hardware controls (e.g., buttons, knobs, joysticks) built into the vehicle.
Many portions of this application describe the mobile device generating a graphical user interface (or user interface) for the vehicle display screen. In some embodiments, this means that the mobile device generates a graphical user interface display (i.e., an image or series of images) which is sent through the connection to the vehicle-resident system for display on the display screen of the vehicle. The graphical display may be sent as a bitmap in some embodiments, a different format that requires translation into a displayed image by the vehicle system, or a combination of bitmaps and different formats.
This display screen, which may be a touchscreen, makes up the vehicle user interface, possibly along with other controls (e.g., joystick, knob, built-in buttons, trackpad, trackball, etc.). These various controls receive inputs and send signals to the mobile device indicating the inputs performed by the user. The mapping application on the mobile device (or the operating system, which in some embodiments translates the signals from the vehicle) is configured to recognize these inputs and perform various actions to modify the mapping application graphical display on the vehicle screen, the mobile device screen, or both. The interface of some embodiments for a mobile device to communicate with the vehicle in this way is described in further detail in U.S. Patent Publications 2009/0284476, 2010/0293462, 2011/0145863, 2011/0246891, and 2011/0265003, which are incorporated herein by reference. In addition, this interface of some embodiments is described in further detail in the following applications: U.S. Provisional Patent Application 61/832,841, entitled "Device and Method for Generating User Interfaces from a Template"; U.S. Provisional Patent Application 61/832,842, entitled "Device, Method, and Graphical User Interface for Synchronizing Two or More Displays"; and U.S. Patent Application 13/913,428, entitled "Application Gateway for Providing Different User Interfaces for Limited Distraction and Non-Limited Distraction Contexts". U.S. Applications 61/832,841, 61/832,842, and 13/913,428 are incorporated herein by reference.
Figure 2 illustrates an example of a mobile device 200 connected to the interface of a vehicle system. A mapping application operates on the mobile device 200, and outputs both a first user interface 205 for the mobile device display screen and a second user interface 210 for the vehicle dashboard display screen 215. The figure illustrates the interior of a vehicle 250, in which the mobile device 200 connects via a wired connection to the vehicle, and outputs a user interface for display on a dashboard screen 215 of the vehicle. While this example, as well as others to follow, illustrates a single display screen in the vehicle, some vehicles include multiple screens (e.g., a center dashboard console screen as well as a screen directly in front of the driver). Some embodiments output only a single user interface to the center dashboard screen in this case, while other embodiments output the same user interface to multiple screens, and yet other embodiments output different interfaces to the different screens.
The figure also illustrates a blown-up view of the mobile device 200 and the dashboard screen 215. As shown, both of these display a map of the same location, but within the context of different user interfaces. In addition to the map, the mobile device interface 205 of some embodiments includes several user interface (UI) controls arranged in a top bar 220 and as floating controls (although other embodiments do not include floating controls and instead include these controls within either the top bar or a second control bar). The floating controls of some embodiments include a position control 225 and a 3D control 230, while the top bar 220 includes a direction control 235, a search field 240, and a bookmark control 245.
The direction control 235 opens a page through which a user can request the mapping application to identify a route (e.g., by accessing a mapping and routing server) between a starting location and an ending location. Upon receiving a selection (e.g., through a tap gesture) of the direction control 235, the mapping application presents a page allowing the user to select a starting location (e.g., the user's current location or a different location) and an ending location for a route. Once a route has been identified that starts at the current location of the device (e.g., through the direction control 235 or through a different aspect of the UI 205), some embodiments enable the user to enter a turn-by-turn navigation mode of the mapping application by selecting a UI control. In the turn-by-turn navigation mode of some embodiments, the mapping application presents a view along the selected route, while also providing instructions for upcoming maneuvers to perform (e.g., as road signs with both text and graphical instructions). In some embodiments, the mapping application generates this view from a perspective rendering position within a three dimensional navigation scene that the device renders.
In some embodiments, the user can initiate a search by tapping in the search field 240. This directs the application to present an animation that (1) presents an on-screen keyboard and (2) opens a search table with various completions for the user's search entry. When the search field is empty (e.g., because the user has not yet entered any search terms), the table contains a list of "recents", which in some embodiments are recent searches and/or route directions that the user (or a different user of the mapping application on the mobile device 200) has previously requested. In some embodiments, the recents list may also include addresses searched for or added from other applications on the mobile device (e.g., a web browser), received via various different forms of messaging (e.g., SMS messages, e-mail, etc.), recently accessed contacts, or other mechanisms. After any edit in the search field 240, the mapping application fills the table with search completions both from local sources (e.g., bookmarks, contacts, recent searches, recent route directions, etc.) and remote servers. In addition to searching through the search field 240, in some embodiments users can activate a voice recognition feature of the mobile device in order to perform searches within the mapping application.
The bookmark control 245 allows locations and routes to be bookmarked by the application. Through various aspects of the mapping application user interface, a user can add a location or a route as a bookmark. For instance, a user can select a location (e.g., a place of business) on the map to cause the application to present an information screen for the location that, along with various information, includes a selectable control for adding the location as a bookmark.
The position control 225 allows the current position of the device to be specifically noted on the map, as is the case in this figure. Once the position control is selected in some embodiments, the application maintains the current position of the device in the center of the map as the device moves. In some embodiments, the position control can also be used to direct the application to identify the direction to which the device is currently oriented. The mapping application of some embodiments identifies the location of the device using coordinates (e.g., longitudinal, latitudinal, and altitudinal coordinates) in a GPS signal that the device receives. Alternatively or conjunctively, the mapping application (or a separate application or operating system of the device) uses other methods (e.g., cell tower triangulation) to compute the current location.
The 3D control 230 is a control for viewing the map or inspecting a route in three dimensions. The mapping application provides the 3D control as a quick mechanism for entering or exiting a 3D mode. This control also serves as (1) an indicator that the current map view is a 3D view and (2) an indicator that a 3D perspective is available for a given map view (e.g., a map view that is zoomed out might not have a 3D view available).
Some embodiments additionally include a page curl control 255 that allows the application to minimize the number of on-screen controls, by placing certain less frequently used actions in a secondary UI page, which is accessible through the "page curl" control displayed on the map. The page curl indicates the location of another set of controls that are conceptually located "behind" the map view. These controls, in some embodiments, include selectable controls to drop a pin, show traffic on the map, print a map, or select different map types (e.g., satellite, standard, or a hybrid map type).
The above described the UI output by the mapping application for display on the mobile device 200. In addition, the mapping application outputs (i.e., generates for display) a UI for display on the vehicle dashboard screen 215. In some embodiments, this second UI provides a subset of the functionalities of the first mobile device UI, and is designed to focus the user towards identifying a navigation destination with minimal touch interaction, as the user will often be the driver of the vehicle.
The vehicle user interface 210 includes, in addition to the map, several UI controls arranged in a top bar 260, and a home button 265 located within a lower information bar 270. As will be described below, the vehicle user interface 210 is dependent, in some embodiments, on the type of user interface provided by the vehicle. Some embodiments generate different user interfaces for each different individual vehicle make or model. On the other hand, some embodiments generate different user interfaces for categories of vehicle screens, such as high-quality touchscreens, low-quality touchscreens, and non-touch screens (with which a user interacts via separate controls built into the vehicle). The mapping application of some embodiments, when connected to a vehicle, identifies the type of display screen built into the vehicle, and automatically outputs the correct user interface for the vehicle. In the case of Figure 2, the high-quality touchscreen interface of some embodiments is displayed.
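As an illustration of this interface selection, the following Swift sketch maps reported screen capabilities to one of the interface categories. The capability fields and thresholds are assumptions; the actual negotiation between the device and the vehicle is not specified here.

```swift
// Hypothetical capability report from the vehicle, used to pick which of the
// interface categories described above the mapping application should output.
struct VehicleScreenCapabilities {
    let isTouchscreen: Bool
    let supportsMultiTouch: Bool
    let reportsHighTouchLatency: Bool
}

enum VehicleInterfaceCategory {
    case highQualityTouch, lowQualityTouch, nonTouch
}

func interfaceCategory(for screen: VehicleScreenCapabilities) -> VehicleInterfaceCategory {
    guard screen.isTouchscreen else { return .nonTouch }
    if screen.supportsMultiTouch && !screen.reportsHighTouchLatency {
        return .highQualityTouch
    }
    return .lowQualityTouch
}
```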
The lower information bar 270 is designed to implement certain features of the mobile device user interface outside of the mapping application. In fact, in some embodiments, the lower information bar 270 is generated by the device operating system for the vehicle display rather than by the mapping application. As shown, the lower information bar 270 includes features also included in an upper information bar 275 of the mobile device user interface, including a clock 280, a battery charge indicator 285, and a connection indicator 290. The clock 280 indicates a current time, the battery charge indicator 285 indicates an amount of battery charge remaining for the mobile device, and the connection indicator 290 indicates a network connection of the mobile device (e.g., LTE, Wi-Fi, etc.).
The home button 265 of some embodiments is designed to replicate the functionality of the physical home button 295 of some embodiments, which enables the user of the mobile device to exit an application and return to a home page of the device. Similarly, in some embodiments, the user can touch the home button 265 to return to a home page of the vehicle user interface, which in some embodiments is also generated by the mobile device 200 (e.g., by the operating system of the mobile device). The home page of the vehicle user interface, as will be described below, displays a set of controls for opening a set of applications available through the vehicle interface (e.g., phone, messages, music, and other applications that run on the mobile device and are available through the vehicle interface). In some embodiments, when the user selects the home button 265, the mobile device displays these controls (e.g., above the mapping application interface) without exiting the mapping application. The user can then select one of the application controls or dismiss the set of controls by either waiting a short period of time or selecting a location within the mapping application.
In some embodiments, the mobile device 200 displays the same application on both the vehicle display 215 and its own display in most situations. For example, when the device is connected to the vehicle system and the user enters the mapping application via the mobile device, the device 200 also outputs the mapping application UI 210 to the display 215. When a user selects the home button 295 on the mobile device, the UI output to the display screen 215 exits the mapping application as well.
The top bar 260 of some embodiments is translucent and includes a search control 261, a recents control 262, a bookmarks control 263, a predictive routing control 264, a position control 266, and zoom controls 267 and 268. In addition, different user interfaces for different types of vehicles may include different controls (e.g., different zoom controls), or a completely different arrangement of controls (e.g., for non-touch interfaces).
The search control 261 of some embodiments directs the application towards a search functionality. Unlike the mobile device search control 240, the search functionality accessed through the vehicle interface 210 does not include a typing function. Instead, searching is done through voice recognition. When the user selects search control 261, the application presents an interface that prompts the user to speak into a microphone of either the mobile device or the vehicle. Based on the search term(s) dictated by the user, the mapping application either (1) presents the user with several selectable options for a search or (2) identifies a requested destination or destinations and presents the destination(s) on the map. For instance, if a user dictates a unique address or business name, the application automatically displays the requested location on the map display. On the other hand, when the user dictates an ambiguous destination (e.g., "Bobby's House", where there are multiple contacts named Bobby), the user interface provides two or more options for the user to select. In addition, when the user dictates a generic request (e.g., "hotel"), the mapping application of some embodiments may present multiple destinations on the map, when several destinations that match the generic request are located nearby.
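The three outcomes described above (unique match, ambiguous term, generic term) can be sketched as follows; the types and the decision rule are illustrative assumptions.

```swift
// Sketch of dictated-search handling: a unique match is shown directly, an
// ambiguous term yields a short disambiguation list, and a generic term yields
// several nearby matches on the map.
struct Place { let name: String; let latitude: Double; let longitude: Double }

enum DictatedSearchOutcome {
    case showOnMap([Place])          // one result, or several generic matches
    case askUserToChoose([Place])    // ambiguous term (e.g., two contacts named "Bobby")
}

func handleDictatedSearch(results: [Place], termWasGeneric: Bool) -> DictatedSearchOutcome {
    switch (results.count, termWasGeneric) {
    case (0, _):    return .showOnMap([])            // nothing found
    case (1, _):    return .showOnMap(results)       // unique match
    case (_, true): return .showOnMap(results)       // generic term: plot all nearby matches
    default:        return .askUserToChoose(results) // ambiguous specific term
    }
}
```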
The recents control 262 of some embodiments directs the application to present a list of recent destinations, results of recent searches, etc. Unlike the mobile device UI, the recents list of some embodiments for the vehicle interface does not provide routes from one location to another, because the vehicle interface is primarily designed for identifying a destination to which to navigate from the current location of the vehicle and device. Some embodiments provide search results exclusively from recent searches entered or destinations navigated to through the vehicle interface. On the other hand, some embodiments additionally include search results from recent searches made through the device, even before the connection of the device to the vehicle interface. Thus, if a user searches for a particular destination on her mobile device while walking to her car, then enters the car and connects her device to the car interface, the particular destination will appear as a recent and easily selectable search, without requiring the user to reenter the search through the search control 261.
The bookmarks control 263 of some embodiments is similar to the bookmarks control 245 of the device interface, in that it enables access to a list of bookmarked locations. Again, as with the recents list, the bookmarks list of some embodiments only provides bookmarked destinations, and does not provide routes. In some embodiments, the bookmarked destinations are those bookmarked on the mapping application running on the mobile device. In other embodiments, the mapping application UI displayed on the vehicle screen includes the ability to add a destination to the bookmarks.
The predictive routing control 264 enables access to a predictive routing display of the mapping application, in which the mapping application presents various likely routes to the user for navigation. The mapping application may generate the likely routes based on a variety of factors, including upcoming appointments or events on a calendar or other scheduling application that runs on the mobile device, or analysis of routes taken in the past by the mobile device (e.g., a route from a user's home to work). The predictive routing feature may additionally factor in traffic to identify potential difficulties in a usual route or in reaching a location on time. In some embodiments, the application presents the likely routes with a navigation view on one portion of the display and information about the route (including estimated arrival time, metadata about the source of the route, etc.) on a second portion of the display, with the ability to scroll through the routes.
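The following Swift sketch illustrates, under stated assumptions, how a predictive routing engine might rank candidate destinations drawn from a calendar, travel history, and traffic conditions. The RouteCandidate type, the weighting constants, and the rankedCandidates function are hypothetical and are not taken from the figures; a real engine could weigh these factors very differently.

```swift
// Hypothetical candidate destination for the predictive routing display.
struct RouteCandidate {
    enum Source { case calendarEvent, travelHistory, contactAddress }
    let destinationName: String
    let source: Source
    let minutesUntilRelevant: Double   // e.g., time until a calendar event
    let historicalFrequency: Double    // 0...1, how often this trip is made
    let trafficDelayMinutes: Double    // current delay versus typical
}

// Score candidates so that imminent appointments and frequently made trips
// rank highest; unusual traffic raises a route's urgency so the user can be
// warned earlier. All weights are illustrative placeholders.
func rankedCandidates(_ candidates: [RouteCandidate]) -> [RouteCandidate] {
    func score(_ c: RouteCandidate) -> Double {
        let urgency = max(0, 120 - c.minutesUntilRelevant) / 120   // sooner = higher
        let habit = c.historicalFrequency
        let trafficBoost = min(c.trafficDelayMinutes, 30) / 30 * 0.2
        let sourceWeight = (c.source == .calendarEvent) ? 0.3 : 0.0
        return urgency + habit + trafficBoost + sourceWeight
    }
    return candidates.sorted { score($0) > score($1) }
}
```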
The position control 266 functions similarly to the position control 225 of some embodiments, in that its selection will cause the mapping application to center the map view on the current location of the device. However, in some embodiments, the position control does not include an additional feature of enabling the user to identify the orientation of the vehicle. Instead, some embodiments automatically orient the map based on the direction of travel of the vehicle, or provide a setting through which the user enables this automatic orientation of the map. Other embodiments automatically orient the map with north facing up irrespective of the vehicle travel direction.
In this case, the map view is centered on the position indicator 269 that indicates the current location of the device. Some embodiments disable the position control when the map view is centered on the position indicator. As such, the position control 266 is displayed in grayed-out fashion, to indicate its current disabled state. Figure 9 below illustrates the enabling of the position control as the user moves the map region so that the position indicator is no longer centered.
The zoom controls 267 and 268 enable the user to zoom in (with control 267) and out (with control 268) of the map view in some embodiments. In addition to zooming to explore the map view, users may also scroll through the map region. In some embodiments of the mapping application, the scrolling and zooming features differ for different types of vehicle interfaces because of the different capabilities of the different types of display screens. For instance, UI 210 is a high-quality touchscreen interface, and the user scrolls the map view via swipe, or drag, gestures. However, a low-quality touchscreen may not have the ability to interpret such gestural (i.e., touch movement) input, and therefore the user interface for low-quality touchscreens includes selectable arrows for scrolling in different directions. Rather than zoom controls 267 and 268, some embodiments present a different control that the user selects to activate the selectable arrows as well as zoom controls similar to those for the high-quality touch interface 210. The non-touchscreen vehicle user interface, of course, requires input through other controls in order to zoom and scroll. These other controls may include a joystick, trackball, various buttons, etc. that are built into the vehicle.
As indicated, many of the features of the vehicle UI 210 are designed to enable the user to select a destination for navigation. Once the user selects a destination, the mapping application enters a turn-by-turn navigation mode in some embodiments. In this mode, some embodiments output different displays to the vehicle display and the mobile device display. The vehicle display, in some embodiments, displays the user's location and the upcoming route, in either a two dimensional mode or a three dimensional mode. In some embodiments, the mapping application generates this view from a perspective rendering position within a three dimensional navigation scene that the device renders. This perspective rendering position is adjustable in some embodiments and can be viewed as a virtual camera that can capture the three dimensional navigation scene from a variety of different perspectives (e.g., from a variety of different positions and orientations). Accordingly, in some embodiments, the turn-by-turn navigation presents an animated rendering of a navigated route rendered from the vantage point of a virtual camera that traverses along the direction of the route based on the traversal direction and speed of the vehicle to which the device is connected. In some embodiments, this direction and speed is captured by data (e.g., GPS data, triangulated cell tower data, etc.) associated with the device.
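Purely as an illustration of this camera model, the following Swift sketch places a virtual camera behind and above a vehicle's position, looking ahead along its direction of travel. The Vec3 and VirtualCamera types, the distances, and the speed scaling are hypothetical values chosen for explanation, not parameters disclosed in the figures.

```swift
// Minimal 3D point for the illustrative navigation scene.
struct Vec3 { var x: Double; var y: Double; var z: Double }

struct VirtualCamera {
    var position: Vec3
    var lookAt: Vec3
}

// Place the camera behind and above the vehicle along its direction of
// travel, looking ahead down the route. headingX/headingY form a unit vector
// in the travel direction; faster travel pulls the camera back so more of
// the upcoming route is visible. All distances are illustrative.
func cameraFollowing(vehicleAt p: Vec3, headingX: Double, headingY: Double,
                     speedMetersPerSecond v: Double) -> VirtualCamera {
    let back = 50.0 + v * 2.0
    let height = 35.0 + v * 1.0
    let position = Vec3(x: p.x - back * headingX,
                        y: p.y - back * headingY,
                        z: p.z + height)
    let lookAt = Vec3(x: p.x + 80.0 * headingX,
                      y: p.y + 80.0 * headingY,
                      z: p.z)
    return VirtualCamera(position: position, lookAt: lookAt)
}
```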
The user can interact with the vehicle user interface to, e.g., view a list of maneuvers to make for the route (e.g., a right turn onto a particular street), change between two and three dimensions, and perform other interactions. Furthermore, in some embodiments, when the vehicle reaches a location within a particular threshold of the next maneuver, a portion of the vehicle screen displays a representation of the maneuver (e.g., an intersection with an arrow that represents the vehicle's path through the intersection, as well as text directions for the maneuver). Once the vehicle has passed through the intersection, the representation of the maneuver disappears from the display screen of the vehicle. While the vehicle display shows the upcoming route on a map, the mobile device display of some embodiments instead shows a representation of the upcoming maneuver, irrespective of the distance the vehicle must travel before making the maneuver.

Figures 1 and 2 illustrate examples of a mapping application operating on a mobile device that outputs both a first UI for the mobile device and a second UI for a vehicle to which the mobile device connects. Several more detailed examples are described below. Section I describes the different types of vehicle interfaces generated by the mapping application of some embodiments, while Section II describes map exploration features of these different vehicle interfaces. Section III then describes the search feature accessible through the vehicle user interfaces of the mapping application of some embodiments. Next, Section IV describes a feature of the vehicle user interfaces of some embodiments for accessing recent destinations, while Section V describes a predictive routing interface of some embodiments. Section VI describes a turn-by-turn navigation mode of the mapping application of some embodiments. Section VII then describes electronic devices that employ the mapping application of some embodiments. Finally, Section VIII describes location services used by some embodiments of the invention.

I. DIFFERENT TYPES OF VEHICLE USER INTERFACES
As described above, the mapping application of some embodiments, in addition to generating a first user interface for the device on which it operates, also generates different user interfaces for different types of vehicles to which the device may connect. In some embodiments, the application categorizes vehicle interfaces into high-quality touchscreen interfaces, low-quality touchscreen interfaces, and non-touchscreen interfaces. While this description primarily focuses on these three types of interfaces, one of ordinary skill in the art will recognize that many different types of interfaces may be generated by some embodiments. For instance, some embodiments generate a UI for vehicles that include a touchscreen interface along with additional hardware controls. In addition, the mapping application of some embodiments generates different UIs for vehicles with different non-touchscreen controls. Some embodiments, in fact, generate different UIs for each different make or model of vehicle.
Figure 3 conceptually illustrates a simplified software architecture for a mapping and navigation application 300 of some embodiments. As with the mapping application 100 described above, the application 300 operates on a mobile device that can be connected to a vehicle. In some embodiments, the vehicle connection requires a wired connection that additionally enables charging of the electronic device (e.g., a USB connection). In other embodiments, the mobile device can connect wirelessly to the vehicle display interface.
As shown, the mapping application 300 includes various mapping and navigation functions 305, a local device user interface generator 310, a high-quality vehicle touchscreen user interface generator 315, a low-quality vehicle touchscreen user interface generator 320, a non-touchscreen vehicle user interface generator 325, and a vehicle user interface selector 340 that uses a vehicle display lookup table 345. Figure 3 also displays a local display module 330 and a communication interface 335 for connecting to a vehicle electronic system. The local display module 330 and the communication interface 335 are also part of the mobile device on which the mapping application 300 operates, though not part of the mapping application 300 itself.
The mapping and navigation functions 305 perform a wide variety of mapping and/or navigation processes for the mapping application. These functions may include identifying map tiles from which to generate map displays and retrieving the map tiles from a remote server, matching a device location to a location on a map, storing and retrieving bookmarked or recent search results, retrieving route information from a server, generating predicted routes based on analysis of travel history, data from other applications, etc., generating text and/or graphical navigation instructions for a route based on the retrieved route information, and other functions.
The local device user interface generator 310 of some embodiments generates the mapping application UI for display on the mobile device on which the mapping application 300 operates. In some embodiments, the UI generator 310 receives a rendered map from the mapping and navigation functions and adds additional user interface controls (e.g., the floating controls 225 and 230, the controls in the top bar 220, etc.). In other embodiments, the UI generator 310 receives a description of the map to be rendered, adds the overlaid controls, and sends the entire UI to a rendering engine for rendering. In addition, the UI generator 310 sends the UI to the local display module 330 for display on the local display device.
The local display module 330 of some embodiments handles display of the mobile device user interface of the mapping application. In some embodiments, the mobile device display is a touchscreen which, in addition to displaying a user interface, accepts touchscreen interactions. Either the local display module 330 or a separate touchscreen interaction reader receives these touchscreen interactions and translates the interactions into multi-touch gestures (e.g., one-finger tap, one-finger double tap, one-finger swipe, two-finger swipe, etc.) at particular locations (e.g., coordinates) of the touchscreen. These gestures are sent from the operating system to the mapping application 300 (e.g., to the local device user interface generator 310 or a separate module that translates the gesture and location into an interaction with the displayed mapping application constructs).
The high-quality vehicle touchscreen user interface generator 315, the low-quality vehicle touchscreen user interface generator 320, and the non-touchscreen vehicle user interface generator 325 generate three different user interfaces for display on the screens of different vehicles (e.g., screens located in the dashboards of the vehicles). As indicated by the names assigned here to the modules, the mapping application 300 of some embodiments generates a first user interface for high-quality touchscreens, a second user interface for low-quality touchscreens, and a third user interface for non-touchscreen displays.
Various differences between the different user interfaces will be described in the sections to follow, as well as by reference to Figure 5 below. The high-quality touchscreens may not have the ability to interpret multi-touch gestures, but have a high enough touch resolution to interpret swipe/drag gestures, whereas the low-quality touchscreens may not be able to provide reliable swipe/drag gesture data. Some low-quality touchscreens can detect swipes or other gestures, but without adequate resolution for some interactions. Thus, the low-quality touchscreen interface may allow gestural inputs for interactions that do not require location data of the gesture (e.g., swiping between screens), but not for interactions where location of the gesture is important (e.g., precise positioning of a map). Furthermore, the high-quality touchscreens may have a greater ability than the low-quality touchscreens to resolve the position of selection (e.g., tap or double-tap) gestures, and therefore the UI for high-quality touchscreens can have selectable items located more closely together.
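A minimal Swift sketch of this split is shown below: gestures whose location does not matter remain available on a low-quality screen, while location-dependent gestures fall back to tap-only controls. The ScreenCategory and Gesture names and the isSupported function are hypothetical constructs introduced for illustration only.

```swift
// Hypothetical categorization of vehicle screens and incoming gestures.
enum ScreenCategory { case highQualityTouch, lowQualityTouch, nonTouch }

enum Gesture {
    case tap                      // location matters, but low resolution suffices
    case dragMap                  // location matters throughout the gesture
    case swipeBetweenScreens      // only the direction matters
}

// Decide whether a gesture should be handled for a given screen category.
func isSupported(_ gesture: Gesture, on screen: ScreenCategory) -> Bool {
    switch (screen, gesture) {
    case (.nonTouch, _):                           return false
    case (.highQualityTouch, _):                   return true
    case (.lowQualityTouch, .tap):                 return true
    case (.lowQualityTouch, .swipeBetweenScreens): return true
    case (.lowQualityTouch, .dragMap):             return false
    }
}
```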
Each of the vehicle UI generators 315-325 sends their UIs to the communication interface 335 that connects to the vehicle system. While the figure shows all three modules communicating with the interface, in some embodiments only one of the UIs will be generated at a given time, depending on the vehicle to which the device is connected. In some embodiments, this communication interface is a Universal Serial Bus (USB) interface through which the device communicates with the vehicle system, as well as through which the device is charged. In other embodiments, the communication interface may be one of several different interfaces (e.g., USB, Bluetooth, etc.) depending on how the user chooses to connect the device to the vehicle system.
The mapping application 300 also receives interactions with the vehicle system (i.e., touchscreen and/or non-touchscreen interactions) to control the mapping and navigation functions through the communication interface 335. In some embodiments, the vehicle system identifies the location and type of touchscreen gestures (e.g., single or double tap centered at a particular set of coordinates) and sends this information to the mapping application 300 for its interpretation with respect to the displayed user interface. In other embodiments, the vehicle system sends the raw touchscreen data (e.g., pressure data) to the mobile device for translation into gestures at particular locations. The interactions received through the vehicle interface may also include non-touchscreen interactions, such as the pushing of specific buttons, joystick motion, etc.
The vehicle user interface selector 340 identifies which of the three types of vehicle user interfaces should be generated and communicates this information to both the mapping and navigation functions 305 and the different user interface generators 315-325. In some embodiments, the vehicle user interface selector 340 is part of the operating system of the mobile device rather than part of the mapping application. The vehicle user interface selector 340 receives information about the vehicle display screen through the communication interface 335 (e.g., through a specific communication channel of the interface). This information, in some embodiments, includes one or more characteristics of the vehicle display screen. These characteristics, in some embodiments, include whether or not a touchscreen is available, the latency of the touchscreen (i.e., how quickly the screen is able to process inputs), the touch and display resolutions of the touchscreen, whether or not the touchscreen is multi-touch capable, and what type of additional inputs the vehicle includes (e.g., knob, joystick, trackball, number pad, additional buttons, etc.). Based on the received information, the vehicle user interface selector 340 uses the vehicle display lookup table 345 to select one of the three types, or categories, of interfaces for the mapping application to generate. In some embodiments, the mapping application generates numerous different interfaces that cater to the specifics of the vehicle displays with greater granularity. For instance, some embodiments generate multi-touch capable vehicle user interfaces, interfaces for vehicles that include both touchscreen and knob-type inputs, etc.
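The following Swift sketch shows one way such a selection could map reported display characteristics onto the three interface categories. The VehicleDisplayInfo and VehicleUIType names, the 100 ms latency threshold, and the selectUIType function are all hypothetical; a real lookup table could also key off make/model identifiers or consider multi-touch and knob inputs with finer granularity.

```swift
// Hypothetical description of a vehicle display, as reported when the
// device connects to the vehicle system.
struct VehicleDisplayInfo {
    let hasTouch: Bool
    let supportsMultiTouch: Bool        // a finer-grained table might use these
    let hasKnobOrJoystick: Bool
    let touchLatencyMilliseconds: Double
    let recognizesDragGestures: Bool
}

enum VehicleUIType { case highQualityTouch, lowQualityTouch, nonTouch }

// Map reported characteristics onto one of the three interface categories.
// The latency threshold is an arbitrary placeholder, not a disclosed value.
func selectUIType(for info: VehicleDisplayInfo) -> VehicleUIType {
    guard info.hasTouch else { return .nonTouch }
    let responsive = info.touchLatencyMilliseconds < 100
    if info.recognizesDragGestures && responsive {
        return .highQualityTouch
    }
    return .lowQualityTouch
}
```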
As mentioned, in some embodiments, the mapping application 300 only generates one vehicle user interface, in addition to the local device user interface for display on the mobile device on which the mapping application operates. Figure 4 conceptually illustrates a process 400 of some embodiments performed by the mapping application to output a vehicle user interface. Figure 4 will be described by reference to Figure 5, which illustrates three different mapping application user interfaces for display on different types of vehicle display screens. Specifically, Figure 5 illustrates a first example 505 of a high-quality touchscreen user interface, a second example 510 of a low-quality touchscreen user interface, and a third example 515 of a non-touch user interface.
As shown, the process 400 begins by detecting (at 405) a connection to a vehicle interface. In some embodiments, the process is started by the mapping application when the mobile device on which the application operates connects to the vehicle interface (e.g., by a user connecting the device with a USB connector) and the mapping application is running on the device (e.g., running in the background or foreground of the device).
The process then determines (at 410) a vehicle interface type. In some embodiments, the vehicle does not identify itself as one of three categories (high-quality touch, low-quality touch, and non-touch), but instead identifies itself based on the make and model of the vehicle, or the particular built-in system make and model number. In some embodiments, the mapping application (or the device operating system) stores a lookup table that identifies, for different vehicle systems, whether the vehicle has a high-quality touchscreen interface, a low-quality touchscreen interface, or a non-touchscreen interface.
In other embodiments, the vehicle system identifies properties of its display device (e.g., touch latency, touch resolution, display resolution, additional inputs such as knobs, etc.). Based on the identified properties, the mapping application (or the device operating system) identifies a type of user interface (e.g., high-quality touch, low-quality touch, non-touch) to generate from a stored matrix or lookup table.
The process then determines (at 415) whether the interface has a high-quality touchscreen. As mentioned, in some embodiments, a high-quality touchscreen may not have the ability to interpret multi-touch gestures, but will have a better touch input resolution than the low-quality touchscreens. When the vehicle interface has a high-quality touchscreen, the process generates and outputs (at 420) a high-quality touchscreen vehicle user interface. The process then ends.
The first example 505 of Figure 5 illustrates such an interface output to a first vehicle by the mapping application. This first example illustrates two stages 520 and 525 of a high-quality touchscreen vehicle display. The first stage 520 illustrates a home page user interface 550 of some embodiments output by the mobile device operating system when the mobile device is connected to a vehicle system with a high-quality touchscreen interface. The home page 550 includes a clock 555 and a menu bar 560 that includes several controls. The menu bar 560 includes a phone control 561, a music control 562, a map control 563, a messaging control 564, a car control 566, and a now playing control 567. In some embodiments, this user interface includes controls for accessing various applications or functions that operate on the mobile device that a user is most likely to access while driving.
The phone control 561 enables the user to access the phone application in order to place a phone call. In some embodiments, selection of the phone control causes the application to provide an interface for identifying a phone number (e.g., a phone number of a stored contact, of a place of business, etc.) to call. In some embodiments, the interface prompts the user to dictate search terms (e.g., a contact name) in order to identify a phone number. Other embodiments provide a touchscreen interface for navigating through a contacts list, entering a phone number, etc.
The music control 562 enables the user to access a particular media storage, access, and playback application (e.g., iTunes®) in order to access media available through the application. The media may be stored on the device or accessible through the application via a connection to an associated cloud storage service (e.g., iCloud®). As with the phone UI, the music UI may involve touchscreen interaction, voice interaction, or a combination thereof.
The messaging control 564 enables the user to access the messaging application of the mobile device in order to send a short messaging service (SMS), or text, message. In some embodiments, much like for the phone application, the UI for the messaging application is designed to quickly identify a contact to which to send a text message, via touchscreen interactions, voice interactions, or a combination thereof. Once one or more recipients have been identified for the message, in some embodiments the user composes the message by dictation.
The car control 566 enables the user to access a UI provided by the vehicle manufacturer, or by the in-vehicle system manufacturer. This is the UI displayed by default when the mobile device is not connected to the vehicle system, and varies from one vehicle to the next. The UI provided with the vehicle may include options such as a built-in navigation system, access to satellite radio, AM radio, FM radio, etc., and other functions.
The now playing control 567 enables the user to access any media applications that may be operating on the user's mobile device (e.g., streaming online radio applications, streaming video applications, etc.). Depending on the application, the mobile device may output a different user interface for different applications. In some embodiments, the media applications may not have specific UIs for the vehicle display, and the device simply outputs the media while requiring interaction with the mobile device UI.
The map control 563 enables the user to access the mapping application described herein. The first stage 520 of this example illustrates a user selecting the map control 563 with a touch selection. The second stage 525 illustrates the resultant mapping application user interface 565 for high-quality touchscreens. This mapping application UI 565 is the same as the UI 210 described above by reference to Figure 2, and includes the lower information bar in addition to a search control 571, a recents control 572, a bookmarks control 573, a predictive routing control 574, a position control 576, and zoom controls 577 and 578. The functions of these controls are described above by reference to Figure 2.
Returning to Figure 4, when the vehicle interface does not have a high-quality touchscreen, the process determines (at 425) whether the vehicle interface has a low-quality touchscreen. Low-quality touchscreens, in some embodiments, do not have the ability to recognize gestural input (e.g., drag/swipe gestures, etc.). This may be due to the vehicle system not being programmed to recognize such gestures or due to the touchscreen sensors having too low a resolution to identify such gestures with a reasonable level of accuracy. When the vehicle interface has a low-quality touchscreen, the process generates and outputs (at 430) a low-quality touchscreen vehicle user interface. The process then ends.
The second example 510 of Figure 5 illustrates such an interface output to a second vehicle by the mapping application. This second example illustrates two stages 530 and 535 of a low-quality touchscreen vehicle display. The first stage 530 illustrates the same home page user interface 550 as the high-quality touchscreen example 505, output by the mobile device operating system when the mobile device is connected to a vehicle system with a low-quality touchscreen interface. Some embodiments include the same home page for the vehicle UI irrespective of the quality of the touchscreen. The controls for this home page are described above by reference to the first example 505. As in that first example, at the first stage 530 the user selects the map control 563 with a touch selection. The second stage 535 illustrates the resultant mapping application user interface 570 for low-quality touchscreens. This mapping application UI 570 is similar to the high-quality touchscreen interface shown above, with some differences. The UI 570 includes the same lower information bar with a home button, as well as the top bar with a search control 581, a recents control 582, a bookmarks control 583, a predictive routing control 584, and a position control 586. As opposed to the zoom controls 577 and 578 of the high-quality touchscreen interface 565, the low-quality touchscreen interface 570 includes a zoom and scroll control 587.
In some embodiments, the search control 581, recents control 582, bookmarks control 583, predictive routing control 584, and position control 586 perform the same functions as in the high-quality touchscreen interface 565, as described above by reference to Figure 2. These various functions (search by dictation, recent destinations, bookmarked destinations, predictive routing, current location indication) work in the same manner in both the high-quality touchscreen user interface and the low-quality touchscreen user interface in some embodiments, though in other embodiments differences exist between the features. For instance, the high-quality touchscreen UI for the predictive routing feature allows the user to swipe between suggested routes, while the low-quality touchscreen UI includes other constructs to enable the user to change the route shown.
The zoom and scroll control 587, as will be described in greater detail in the following section, enables the user to access various controls for exploring the map on the vehicle screen. Because of the unreliability or unavailability of swipe gesture input, the low-quality touchscreen UI 570 uses different controls for scrolling that are selectable with tap inputs rather than gestural swipe inputs. In some embodiments, the selection of the zoom and scroll control 587 causes the application to overlay selectable arrows pointing up, down, left, and right over the map view, which the user can select in order to scroll the map in the selected direction. In addition, the application displays zoom in and zoom out controls over the map display that are similar to the controls 577 and 578 of the high-quality touchscreen interface 565.
Returning again to Figure 4, when the vehicle interface does not have a touchscreen (i.e., does not have a low-quality or high-quality touchscreen), then the vehicle interface will have a non-touch display screen. As such, the process generates and outputs (at 435) a non-touchscreen vehicle user interface. The process then ends.
The third example 515 of Figure 5 illustrates such an interface output to a third vehicle by the mapping application. This third example illustrates two stages 540 and 545 of the non-touchscreen vehicle display. The first stage 540 illustrates a home menu user interface 575 displayed on the vehicle screen, as well as vehicle hardware controls 580. While this figure illustrates the vehicle hardware controls 580 located directly below the display screen, different vehicles may have their controls located in different places. For instance, some embodiments include controls in the center console between the driver and passenger seats, on the steering wheel, etc.
In this example, the vehicle hardware controls 580 include a keypad 591, a joystick 592, as well as several additional buttons located above and below the joystick. These additional buttons enable the user to access various functions of the vehicle system (e.g., music, phone calls with a connected phone), as well as navigate a set of menus provided by the vehicle system (e.g., to access an in-car navigation system provided with the vehicle). In some embodiments, the mobile device operating system and/or mapping application leverage these buttons as well as the joystick for interactions with the user interface provided by the mobile device.
In this and other figures, the joystick 592 is shown with an "OK" button in the center and four arrows pointing up, down, left, and right. These are illustrated to represent various interactions that a user can perform with a joystick, trackball, or other directional hardware control. For instance, with a trackball (or joystick), in some embodiments a user can press downwards on the trackball to make a selection, or shift the trackball in one of the four directions to signal a move (e.g., of a selector) in that direction.
The non-touch user interface home menu 575 output by the mobile device illustrates the same options as in the touchscreen user interface 550, but arranged differently. In some embodiments, the vehicle display initially displays only the clock, and the user accesses the home menu UI 575 by selecting a menu button 593. The home menu 575 includes a phone option, music option, map option, messaging option, car option, and now playing option, which correspond to the controls 561-567 of the touchscreen UI 550. In some embodiments, the user can use the joystick 592 (or a trackball, arrow buttons, turnable knob, etc.) to scroll between the different menu options.
The first stage 540 illustrates that, with the map menu option 594 selected, the user performs a selection input by pressing downwards on the joystick 592. This selects the mapping application, and causes the display of the mapping application non-touch user interface 595 at the second stage 545. Because the display screen is a non-touch screen, the user interface does not display any controls, and only has the map view and the lower information bar. The lower information bar, unlike those for the touchscreen UIs 565 and 570, does not include a home button. Instead, in some embodiments, the user can select a hardware control (e.g., the back or exit button) to perform the function of the home button. In addition, as will be shown below, in some embodiments the user selects the menu button 593 to access a list of menu options in order to perform functions similar to those provided by the various controls 571-576 of the high-quality touchscreen UI 565 (e.g., search, recents, bookmarks, predictive routing, current location, etc.).
Figure 6 illustrates an additional difference between the high-quality touchscreen interface 565 and the low-quality touchscreen interface 570 of the mapping application in some embodiments. This figure illustrates expanded views of the predictive routing control 574 of the high-quality touchscreen interface and the predictive routing control 584 of the low-quality touchscreen interface. The expanded views illustrate the difference in the controls between the two interfaces. The high-quality touchscreen interface predictive routing control 574 includes an icon 605 and a buffer region 610 around the icon. Similarly, the low-quality touchscreen interface predictive routing control 584 includes an icon 615 and a buffer region 620 around the icon.
The two icons 605 and 615 from the two interfaces are the same; however, the buffer regions around the icons are not. Specifically, the buffer 620 in the low-quality interface is larger than the buffer 610 in the high-quality interface. This allows additional leeway in the touchscreen's interpretation of a user's touch selection for the low-quality case. While the predictive routing control is used as an example, one of ordinary skill in the art will recognize that the larger touch buffer in the low-quality touchscreen interface may apply to other touch selectable controls (sometimes called affordances) as well, such as the other controls in the top bar (e.g., search, bookmarks, recents, etc.), the home button, arrows for stepping through search results shown below, etc.
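The Swift sketch below illustrates this kind of buffered hit-testing: a tap counts as a selection if it lands within the icon rectangle or its surrounding buffer, and the low-quality interface simply constructs its targets with a larger buffer. The TouchTarget type, the hitTest function, and the specific buffer sizes are hypothetical values for illustration only.

```swift
// Hypothetical on-screen control with an icon rectangle and a touch buffer
// that extends the selectable area beyond the visible icon.
struct TouchTarget {
    let iconX: Double, iconY: Double            // top-left corner of the icon
    let iconWidth: Double, iconHeight: Double
    let bufferPoints: Double                    // extra margin on every side
}

// A tap counts as a hit if it lands within the icon or its buffer region.
func hitTest(target: TouchTarget, tapX: Double, tapY: Double) -> Bool {
    let minX = target.iconX - target.bufferPoints
    let minY = target.iconY - target.bufferPoints
    let maxX = target.iconX + target.iconWidth + target.bufferPoints
    let maxY = target.iconY + target.iconHeight + target.bufferPoints
    return tapX >= minX && tapX <= maxX && tapY >= minY && tapY <= maxY
}

// Example buffers (arbitrary values) for the two touchscreen categories;
// the low-quality interface gives the screen more leeway.
let highQualityBuffer = 8.0
let lowQualityBuffer = 20.0
```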
II. MAP EXPLORATION IN DIFFERENT USER INTERFACES
The above section introduced the different mapping application user interfaces output to vehicle displays. The operations of the mapping application vehicle UIs, including some of the differences between the UIs for different types of vehicle displays, will be described in this section. Subsection A describes the operation of the high-quality touchscreen UI, while subsections B and C describe differences in the operation of the low-quality touchscreen and non-touchscreen UIs, focusing on the zoom and scroll operations.
A. High-Quality Touchscreen Interface
Figure 7 illustrates six stages 705-730 of the high-quality touchscreen vehicle user interface 565 in which the user navigates through various features of the interface, including the recent destinations feature, the bookmarked destinations feature, and the predictive routing feature. The first stage 705 illustrates the user interface 565 as shown in the previous Figure 5. In addition, at this stage, a user selects the recents control 572.
The second stage 710 illustrates a resulting recent destinations page 735. The recent destinations page 735 includes a list of recently searched places of business (Hubcap BBQ Palace), recently searched geographic locations (San Francisco, Los Angeles), recently performed searches (Coffee), recent destinations of routes, etc. The various destinations may have been searched for individually within the mapping application via either the vehicle UI or the mobile device UI, selected on the map view within the mapping application, etc. The items in the list are individually selectable in order to display a map centered on the selected destination in some embodiments. In addition, the recent destinations page 735 includes a cancel button 740, which the user is shown selecting (with a tap selection) in stage 710. The user selects the cancel button 740 in order to exit the recent destinations page 735 and return to the map view UI 565, as shown at the third stage 715.
Also at the third stage 715, the user selects the bookmarks control 573. The fourth stage 720 illustrates a resulting bookmarked destinations page 745. These destinations, unlike those on the recents page 735, do not change based on recent user activity within the mapping application (unless that activity specifically adds a destination as a bookmark or removes a destination from the bookmarks). The bookmarks, in some embodiments, appear as names chosen by the user (e.g., "Home"), rather than addresses or proper business names, unless those are the names chosen. As with the recent destinations page 735, the bookmarked destinations page 745 includes a cancel button 750, which the user is shown selecting (with a tap selection) in stage 720. The user selects the cancel button 750 in order to exit the bookmarked destinations page 745 and return to the map view UI 565, as shown at the fifth stage 725.
Also at the fifth stage 725, the user selects the predictive routing control 574. The sixth stage 730 illustrates a predictive routing user interface 755, which will be described in greater detail in Section V below. The predictive routing UI 755 displays, in order of likelihood as computed by a predictive routing engine of the mapping application, one or more likely routes for the user to follow. The predictive routing engine of some embodiments uses travel history, contact information, calendar or other appointment-keeping information, traffic information, etc. to identify likely destinations and optimal routes to the destination.
The predictive routing UI 755 of some embodiments displays, for each route, a first panel 760 with information about the route and a second panel 765 with a map view of the route. The information panel 760, in some embodiments, displays a name for the route as well as metadata about the route. The route name may be based on a destination name identified from contact information, a calendar appointment name, a business name, an address, a primary road used by the route, etc. In addition, the metadata shown may include an estimated arrival time (ETA), the source of the route for the predictive routing engine (e.g., calendar, passbook, travel history, etc.), whether the destination of the route is a frequent or less traveled destination for the device, and other information. The map view 765, in some embodiments, displays a map to the destination with the selected route. In some embodiments, the routing engine factors in traffic information to select an optimal route to the destination, and may warn a user if a typically traveled route has worse traffic than usual.
Each of the above features (recents, bookmarks, predictive routing) is designed to quickly identify a destination for navigation. In addition, users can use the search control to identify a destination when the desired result is not available through the more directed options (e.g., because the destination is a new destination). Figure 8 illustrates the use of the search function to identify a destination and then enter a navigation mode over six stages 805-830. The first stage 805 illustrates the user interface 565 as shown in Figure 5. In addition, at this stage, a user selects the search control 571.
The second stage 810 illustrates a resulting search function page 835. In some embodiments, the search function available through the vehicle display is a dictation-only (i.e., no typing) interface that avoids requiring the driver to type in a desired destination. If a passenger in the vehicle wants to type in a destination, the passenger can do so through the mapping application interface displayed on the mobile device in some embodiments.
As shown, the search function page 835 displays a search bar 840 with a cancel button, and a results region 845 that initially includes a set of example searches that the user could dictate and a prompt to the user to verbally recite the desired destination. In the third stage 815, as the user begins speaking, the results region 845 now illustrates a waveform to indicate that the interface is receiving a voice prompt and a dictation completion control 850 for the user to select upon completion of the dictation. In some embodiments, the application will identify the end of the user's dictation based on either the selection of the completion control 850 or the identification of a gap of a threshold duration in the user's speaking. Some embodiments only use the identification that the user has stopped speaking to end dictation, and do not include the touchscreen completion control.
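A minimal Swift sketch of this end-of-dictation decision is shown below, assuming a hypothetical DictationState type and a 1.5-second silence threshold; neither the names nor the threshold value come from the figures.

```swift
import Foundation

// Hypothetical running state of a dictation session.
struct DictationState {
    var completionControlTapped = false
    var lastSpeechTime: Date = Date()
}

// Dictation is considered finished either because the user tapped the
// completion control or because the audio has been quiet for longer than
// the threshold. Embodiments without the control would rely on silence only.
func dictationEnded(_ state: DictationState,
                    now: Date = Date(),
                    silenceThreshold: TimeInterval = 1.5) -> Bool {
    if state.completionControlTapped { return true }
    return now.timeIntervalSince(state.lastSpeechTime) >= silenceThreshold
}
```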
In this case, the user has spoken "ABC Deli", as represented by the megaphone 855 conceptually illustrated next to the display. In some embodiments, the mapping application uses a microphone in the vehicle to record the audio, but performs the audio analysis for voice recognition at the mobile device (e.g., using a voice recognition function of the mobile device or of a server with which the mobile device communicates). In other embodiments, the mobile device additionally uses its own microphone to record the audio for analysis.
As shown by the automatic (with no touch interaction) transition to the fourth stage 820 in Figure 8, when the search term is unambiguous, some embodiments automatically display the result in a search result map view page 852. The handling of ambiguous search terms, as well as additional aspects of the search function, will be described below in Section III. The search result map view page 852 shows a new map centered on the searched destination, for which the application displays a pin 857, and selects this pin. When the device's current location is within the displayed map region, the application continues to display the position indicator 860. The mapping application also displays a top information bar 865 and a lower information bar 870.
The search function, in some embodiments, is biased around the current location of the device. That is, the search function will prefer close destinations (e.g., in the same city as the current location, within a particular distance of the current location, etc.) rather than those in other parts of the world. In addition, some embodiments use past destination history to influence search results, expanding the search range for destinations previously selected by a user. The search function may also use information such as the contacts information stored on the device.
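As a hedged illustration of such location biasing, the following Swift sketch ranks search candidates so that nearby places win by default while previously visited destinations and contact addresses may rank well even when farther away. The Candidate type, the distance multipliers, and the rankBiasedByLocation function are hypothetical and purely illustrative.

```swift
// Hypothetical search candidate with a straight-line distance from the
// vehicle and flags for history and contacts matches.
struct Candidate {
    let name: String
    let distanceKilometers: Double
    let previouslyVisited: Bool
    let matchesContact: Bool
}

// Rank candidates by an "effective" distance: past destinations and contact
// addresses effectively expand the search range. Weights are placeholders.
func rankBiasedByLocation(_ candidates: [Candidate]) -> [Candidate] {
    func score(_ c: Candidate) -> Double {
        var effectiveDistance = c.distanceKilometers
        if c.previouslyVisited { effectiveDistance *= 0.25 }  // expand the range
        if c.matchesContact { effectiveDistance *= 0.5 }
        return -effectiveDistance   // smaller effective distance = higher rank
    }
    return candidates.sorted { score($0) > score($1) }
}
```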
In this case, only one search result is identified for the search parameters, ABC Delicatessen. In other cases, the search term may have multiple possible results. For example, a search for a chain restaurant may yield multiple locations for different franchises of the chain. The user may also search for a generic type of location (e.g., bars, restaurants, coffee shops, etc.), which will often result in multiple possible destinations.
In addition, a specific search may result in ambiguity if the user searches for, e.g., "work", but works multiple jobs. In some embodiments, when the mapping application cannot identify the specific search term dictated, or another ambiguity arises (such as the two different "work" locations), the application displays a list of possible search terms from which the user can choose. In other embodiments, however, the mapping application always identifies one set of search terms using the voice recognition function of the device.
The top information bar 865 includes two selectable items. In case the user does not like the search results, the user can either cancel the search to return to the default map view UI 565 or select a search again control 875 to perform another search. Furthermore, the top information bar 865 displays the search term or terms spoken by the user and interpreted by the device (in this case, "ABC Deli").
The lower information bar 870 includes information about the specific destination currently selected. In some embodiments, this information, as shown, includes the name and/or address of the destination and the estimated arrival time of an optimally calculated route to the destination (3 minutes, in this case, due to the proximity of the destination to the user's current location). Furthermore, the lower information bar 870 includes a selectable button 880 for entering a turn-by-turn navigation mode directed towards the destination.
The fifth stage 825 displays the user selecting the button 880 in order to enter navigation to the selected destination (ABC Delicatessen). The sixth stage 830 illustrates a navigation mode user interface 885 of some embodiments. The navigation UI 885 includes the lower information bar from the initial map UI 565 (as provided by the operating system, in some embodiments), a navigation information bar 888, a navigation map 890, a 3D control 895, and a direction list control 897. The navigation information bar 888 displays instructions for a next maneuver to perform along the navigated route, as well as estimated arrival time information (i.e., travel time, travel distance, and arrival time). The 3D control 895 enables a user to toggle the navigation map 890 between 3D and 2D animation. The 3D control 895 is shown as currently selected at stage 830, and the application displays the navigation map 890 in three dimensions. Some embodiments automatically select the 3D option upon entering the turn-by-turn navigation mode. The list control 897 enables the user to access a list of maneuvers for the route. The navigation mode will be described in greater detail below in Section VI.
In addition to using the various features that facilitate the identification of a destination for navigation, users may also wish to explore the map region via the vehicle display screen. Figure 9 illustrates various user interface actions available in some embodiments to enable such exploration of the map region. Specifically, over six stages 905-930 of the user interface 565, Figure 9 illustrates the use of zoom and scroll functionality as well as the position control 576 to re-center the displayed map region on the device's current location.
The first stage 905 illustrates the high-quality touchscreen vehicle user interface 565, currently tracking the device's current location. As mentioned, the mapping application of some embodiments tracks the current location of the mobile device and by default centers the map region displayed for the vehicle UI on the position indicator 860. As a result of the display of the position indicator 860 in the center of the map, the mapping application currently is displaying the position control 576 in a grayed-out manner to indicate that it has been disabled. In addition, at this stage, the user selects (e.g., with a single tap input) the zoom out control 578.
The second stage 910 illustrates the map after the mapping application has performed the zoom out operation on the displayed map region in the vehicle user interface 565. The map view now shows a larger area of the map because the mapping application displays it at a different zoom level, from a conceptual point of reference higher above the map. In some embodiments, the zoom function on the vehicle display modifies the zoom level by a particular amount (e.g., a particular conceptual distance from the ground, a particular multiplier to the map scale, etc.). In addition, some embodiments use an inertial stop animation to slow down and stop the animation of the map changing zoom level. For example, to transition from the first stage 905 to the second stage 910, the mapping application zooms out on the map at a particular defined rate, then gradually slows the zoom animation down to a stop, rather than suddenly stopping the animation.
In some embodiments, the map changes zoom levels by an amount proportional to the duration for which the user holds the touchscreen input over the zoom in or out control, while using an inertial stop animation after the user releases the press-and-hold input. Some such embodiments differentiate between a tap input and a press-and-hold input, so that a tap input causes the application to change the zoom level of the map a greater amount than would be the case for a press-and-hold having the short duration of the tap gesture.
In this case, the mapping application zooms the map region about the center of the display, which also coincides with the position indicator 860. Some embodiments always center the zoom about the center of the map, while other embodiments always center the zoom about the current location of the device. In some embodiments, the mapping application centers the zoom about the current location when the position indicator is within the displayed map region and about the center of the displayed map when the position indicator is not within the displayed map region.
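The Swift sketch below illustrates, under stated assumptions, both the choice of zoom anchor point and the tap versus press-and-hold step sizes described above. The MapPoint type, the functions, and all step values are hypothetical; in particular, the inertial ease-out that would follow a press-and-hold is only noted in a comment.

```swift
// Choose the point about which to zoom: the device's current location when
// the position indicator is on screen, otherwise the center of the view.
struct MapPoint { var x: Double; var y: Double }

func zoomAnchor(currentLocation: MapPoint?, isLocationVisible: Bool,
                viewCenter: MapPoint) -> MapPoint {
    if let location = currentLocation, isLocationVisible { return location }
    return viewCenter
}

// Compute a zoom factor from how long the zoom control was held. A quick tap
// applies a full fixed step; a press-and-hold scales with its duration, and
// the caller would finish with an inertial ease-out rather than a hard stop.
// The step sizes are illustrative only.
func zoomFactor(holdDurationSeconds: Double, isTap: Bool) -> Double {
    let fullStep = 2.0                               // e.g., double the map scale
    if isTap { return fullStep }
    return 1.0 + min(holdDurationSeconds, 3.0) * 0.75
}
```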
The second stage 910 also illustrates that the user is performing a touchscreen swipe input to scroll the map to the left (thereby enabling viewing of the map region currently off of the screen to the right). The third stage 915 illustrates the result of the swipe/drag gesture, with the mapping application having moved the viewed region of the map rightward enough that the current location indicator is now off the display screen (to the left). The high-quality touchscreen user interface of some embodiments resolves all such swipe gestures over the map view to either up, down, left, or right, while other embodiments calculate the angle of the swipe gesture and move the map in the calculated direction. In some embodiments, only high-quality touchscreens allow for this direct manipulation of the map region, allowing drag inputs in which the movement of the touch input from a first touchscreen location to a second touchscreen location causes the map location displayed in the first touchscreen location to move to the second touchscreen location, as though the user was dragging the map itself.
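A minimal sketch of this direct-manipulation behavior, assuming hypothetical MapOffset coordinates in screen points, is shown below: the map offset is shifted by exactly the finger's displacement, so the map point that was under the finger at touch-down stays under it as the finger moves.

```swift
// Direct manipulation of the map on a high-quality touchscreen: pan the map
// by the finger's displacement so the touched map point follows the finger.
// Coordinates are in screen points; all names are illustrative.
struct MapOffset { var x: Double; var y: Double }

func pannedOffset(startOffset: MapOffset,
                  touchStartX: Double, touchStartY: Double,
                  touchNowX: Double, touchNowY: Double) -> MapOffset {
    return MapOffset(x: startOffset.x + (touchNowX - touchStartX),
                     y: startOffset.y + (touchNowY - touchStartY))
}
```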
As a result of the manual scrolling of the map, the position indicator 860 is no longer centered on the map. Thus, the mapping application enables the position control 576, displaying the control more vividly to indicate that the user can now select it.
The fourth stage 920 illustrates the user selecting the position control 576. As described above, selection of the position control of some embodiments causes the mapping application to center the map region about the current location of the device, irrespective of the map region previously displayed on the map. The fifth stage 925 illustrates that the position indicator 860 is again displayed in the center of the map region in the UI 565. In addition, the position control 576 of some embodiments does not cause the mapping application to change the zoom level at which the map is displayed.
As the user drives and the device moves, some embodiments automatically scroll the map so that the current location stays centered in the map view. The sixth stage 930 illustrates the UI 565 after the user has not interacted with the display, but has driven a short distance along the current road. The clock has rolled forward to 2:15, and the map display has shifted downwards so that the device's current location remains in the center. In some embodiments, the mapping application only automatically moves the map region along with the current location when the map view is already centered on the position indicator 860 (i.e., when the user has selected the position control 576 without subsequently scrolling the map via a swipe gesture).
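The following Swift sketch shows one way this tracking behavior could be kept in a simple flag: manual scrolling breaks tracking, selecting the position control restores it, and location updates re-center the map only while tracking is active. The MapViewState type and function names are hypothetical.

```swift
// Hypothetical view state: the map follows the vehicle only while tracking.
struct MapViewState {
    var isTrackingCurrentLocation = true
    var centerX = 0.0, centerY = 0.0
}

// A manual swipe/scroll breaks tracking.
func userScrolledMap(_ state: inout MapViewState) {
    state.isTrackingCurrentLocation = false
}

// Selecting the position control re-centers the map and resumes tracking.
func positionControlSelected(_ state: inout MapViewState, locX: Double, locY: Double) {
    state.isTrackingCurrentLocation = true
    state.centerX = locX
    state.centerY = locY
}

// Location updates move the map only while tracking is active.
func locationUpdated(_ state: inout MapViewState, locX: Double, locY: Double) {
    guard state.isTrackingCurrentLocation else { return }
    state.centerX = locX
    state.centerY = locY
}
```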
The previous example illustrates the high-quality touchscreen user interface of some embodiments, but does not indicate the behavior of the mapping application user interface displayed on the mobile device on which the mapping application operates. In general, some embodiments of the mapping application allow independent map exploration on the two user interfaces, only synchronizing the interfaces once the user performs an action to identify a possible destination (e.g., performing a search, selecting a recent destination, etc.).
Figure 10 illustrates a user exploring a map in both the mapping application interface 205 displayed on a mobile device 1000 and the high-quality vehicle touchscreen interface 565 displayed on the screen of a vehicle to which the mobile device 1000 connects, over three stages 1005-1015. As shown in the first stage 1005, the mapping application is open on both the mobile device and the touchscreen. In both the mobile device user interface 205 and the vehicle touchscreen user interface 565, the map region currently displays a region surrounding the current location of the device, with the current location centered and tracked.
In the first stage 1005, the user performs a leftward swipe gesture over the map region in the high-quality vehicle touchscreen user interface 565 in order to pan the map to the right. The second stage 1010 illustrates that the map displayed in the vehicle screen is no longer centered on the current location of the device, with a different portion of the map region displayed instead. However, the map region displayed on the mobile device 1000 has not changed. At this stage, the user performs a pinch multi-touch gesture over the map region displayed on the mobile device in order to zoom in on the map region. Thus, the third stage 1015 illustrates that the mobile device user interface 205 displays the map region at a closer zoom level, while the vehicle user interface 565 does not change from the second to the third stage.
B. Low-Quality Touchscreen Interface
The above section described interaction with the high-quality touchscreen user interface of some embodiments. Many of these interactions are the same or similar for the low-quality touchscreen interface in some embodiments. For example, the bookmarks and recents lists only require gestural input for scrolling through a list of possible choices, when the list is too long to fit entirely on the screen. Some embodiments therefore place scroll arrows next to the list for non-gestural (e.g., tap) selection. The search feature does not involve any gestural input (except in rare instances when the search term is ambiguous, with numerous possibilities that require scrolling) or high touch sensitivity resolution, and therefore this feature is the same for the low-quality touchscreen interface as for the high-quality touchscreen interface.
Scrolling through the map in the high-quality touchscreen interface, however, uses swipe gestures that some touchscreens may not be able to process accurately. As such, as shown in Figure 5, the low-quality touchscreen interface of some embodiments uses different interactions for the map exploration features. Figure 11 illustrates various map exploration functions, including zooming, scrolling, etc. over seven stages 1105-1135 of the low-quality touchscreen interface of some embodiments.
The first stage 1105 illustrates the low-quality touchscreen map view user interface 570 of some embodiments, with six controls 581-587 in the upper section. As described above, this interface 570 is similar to the high-quality touchscreen interface 565 with the difference being that the upper bar of controls includes a single zoom and scroll control 587 instead of the zoom controls 577 and 578, and the touchscreen does not recognize swipe gesture inputs to scroll the map. This may be due to the screen not recognizing movement of touch inputs or because the latency of the touchscreen is so high as to make direct manipulation of the map view (e.g., touchscreen inputs that give the impression of directly dragging the map) too difficult. In addition, at the first stage, the position indicator 1170 that indicates the current location of the device is shown at the center of the displayed map region.
The first stage 1105 also illustrates that the user selects the zoom and scroll control 587, with a tap selection input. As shown at the second stage 1110, this input results in the display of additional controls within the map view user interface 570. These additional controls include four scroll arrow controls 1140-1155, position control 586, and zoom controls 1160 and 1165. The position control 586 is the same position control as in the first stage 1105, but has been relocated to a different position within the top control bar. As before, the position control 586 enables the user to cause the mapping application to re-center the map view on the device's current location. The zoom controls 1160 and 1165 have the same function as zoom controls 577 and 578 of the high-quality touchscreen interface 565. That is, the control 1160 causes the mapping application to zoom in on the map region while the control 1165 causes the mapping application to zoom out of the map region.
The scroll arrows 1140-1155 enable the user to scroll through the map region incrementally in some embodiments. When a user selects one of the arrows 1140-1155, the displayed map region moves in the direction opposite the arrow in order to reveal a region of the map in the direction of the arrow. The third stage 1115 illustrates a user selecting the left scroll arrow control 1155. The user may select the scroll arrows 1140-1155 with a tap input in some embodiments, or a press and hold input in other embodiments.
The third stage 1115 also illustrates (by showing the two right-facing arrows, which are not displayed within the user interface and are shown only for explanatory purposes) that the map region shifts (scrolls) to the right as a result of the selection of the scroll arrow 1155. In other embodiments, selecting an arrow facing a particular direction results in the map scrolling in that direction, rather than the opposite direction; however, selecting a left arrow to cause the map to shift (i.e., pan) rightwards and thereby reveal the portion of the map to the left provides a more intuitive interaction.
The fourth stage 1120 illustrates the result of the user interaction to select the left scroll arrow 1155. The position indicator 1170 has been moved rightwards to near the edge of the screen. When a user selects one of the scroll arrow controls, some embodiments scroll the map a fixed percentage of the display screen (e.g., one half the display screen height), which will cause the actual map distance to vary based on the zoom level. In addition, some embodiments use an inertial stop animation to slow down and stop the animation of the map panning. For example, to transition from the third stage 1115 to the fourth stage 1120, the mapping application pans the map to the right at a particular defined speed, then gradually slows the panning animation down to a stop, rather than suddenly stopping the animation.
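To make this scrolling behavior concrete, the following sketch shows one way a fixed-percentage pan with an inertial stop could be computed; the function names, the default half-screen fraction, and the decay constant are illustrative assumptions rather than details of the described embodiments.

```swift
import Foundation

// Illustrative sketch: pan the map by a fixed fraction of the visible screen and
// ease the pan animation to a stop rather than halting abruptly. Because the pan
// is defined in screen terms, the map distance covered varies with the zoom level.
func panDistanceInPoints(screenHeightPoints: Double, fraction: Double = 0.5) -> Double {
    return screenHeightPoints * fraction
}

// Produce per-frame offsets (in screen points) that move at a defined speed and
// then decay toward zero near the target, approximating the inertial stop.
func inertialOffsets(totalPoints: Double, speed: Double = 40, decay: Double = 0.85) -> [Double] {
    // Approximate distance covered while decelerating from `speed` with this decay.
    let brakingDistance = speed * decay / (1 - decay)
    var offsets: [Double] = []
    var remaining = totalPoints
    var step = speed
    while remaining > 0.5 {
        if remaining <= brakingDistance { step *= decay }   // begin slowing near the target
        let delta = min(step, remaining)
        offsets.append(delta)
        remaining -= delta
        if step < 0.5 { break }                             // close enough; stop the animation
    }
    return offsets
}
```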
In some embodiments, the map scrolls by a distance proportional to the duration for which the user holds the touchscreen input over the scroll arrow control, while using an inertial stop animation after the user releases the press-and-hold input. Some such embodiments differentiate between a tap input and a press-and-hold input, so that a tap input causes the application to pan the map further than would be the case for a press-and-hold having the short duration of the tap gesture. As shown below, when a user presses and holds the scroll arrows in other embodiments, the mapping application accelerates the rate at which it scrolls through the map region.
The fourth stage 1120 additionally illustrates the user selecting a position between the two scroll arrow controls 1140 and 1145, at an angle Θ above the horizontal. This stage also illustrates, by showing the two down-facing arrows (which are not displayed in the user interface, and only shown for explanatory purposes), that the map region shifts (scrolls) downwards as a result of the user input. Some embodiments, as shown in this figure, only scroll in the orthogonal directions (up, down, left, and right), and therefore resolve all inputs within the map view while showing the scroll arrows 1140-1155 to one of the four scroll directions. In this case, the angle Θ is greater than 45° above the horizontal, and therefore resolves to a selection of the scroll arrow 1140.
Figure 12 conceptually illustrates a region representing the map view area 1200 of the low-quality touchscreen user interface, with the scroll arrows 1140-1155 displayed in the map view area. In addition, this figure shows two dashed lines at right angles that divide the map view area 1200 into four quadrants 1240-1255. In some embodiments, each of these four quadrants is associated with one of the scroll arrows 1140-1155. That is, a touch input anywhere within the quadrant 1255 will result in a selection of the scroll arrow 1155 and cause the map to scroll to the right, showing the map region previously off of the screen to the left. Different embodiments may divide the display differently (e.g., the far right and far left portions may not be associated with scroll arrows 1145 and 1155). Furthermore, as described below, some embodiments do not limit the scrolling to only up/down and left/right, but instead also allow for diagonal scrolling.
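One possible implementation of this quadrant-based resolution is sketched below; the function and type names are hypothetical, and the screen-coordinate convention (y increasing downward) is an assumption.

```swift
import Foundation

// Illustrative sketch: resolve any touch within the map view area to one of the
// four scroll arrows by determining which diagonal quadrant contains the touch.
enum ScrollArrow { case up, down, left, right }

func resolveQuadrant(touchX: Double, touchY: Double,
                     centerX: Double, centerY: Double) -> ScrollArrow {
    let dx = touchX - centerX
    let dy = centerY - touchY      // screen y grows downward; flip so up is positive
    if abs(dy) >= abs(dx) {
        return dy >= 0 ? .up : .down      // touch falls in the top or bottom quadrant
    } else {
        return dx >= 0 ? .right : .left   // touch falls in the right or left quadrant
    }
}
```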
Returning to Figure 11, the fifth stage 1125 illustrates the result of the user selection of the upwards scroll arrow 1140, which causes the map display to shift downwards, revealing a map portion previously off the top of the display screen, and pushing the position indicator 1170 off of the bottom of the screen. Furthermore, at this stage, the user selects the zoom out control 1165. The selection of this control operates in a similar manner as the selection of the zoom out control 578 shown in Figure 9 for the high-quality touchscreen interface. That is, the mapping application displays the map at a new zoom level from a conceptual distance higher above the map, thereby revealing more of the map and with a fixed portion of the display representing a larger distance.
The sixth stage 1130 illustrates the map in such a zoomed out state, with enough of the map now shown to reveal the position indicator 1170 in the lower right portion of the display. Finally, at this stage 1130, the user selects a "Done" control 1175, to indicate that she has completed her use of the zoom and scroll controls to explore the map region. The seventh stage 1135 illustrates the user interface 570 with the same controls as in the first stage, and with the map region as displayed at the end of the previous stage.
As mentioned, in some embodiments, the low-quality touchscreen interface has the capability to scroll the displayed map diagonally as well as up, down, left, and right. Figure 13 conceptually illustrates a process 1300 performed by the mapping application of some embodiments in order to translate a selection input into a scroll of the map display for a low-quality touchscreen vehicle interface. The process 1300 will be described by reference to Figure 14, which illustrates three stages 1405-1415 of a low-quality touchscreen interface 1400. The user interface 1400 is the same as the previously described interface 570, except for the treatment of diagonal scrolling input when the zoom and scroll control is activated.
The process 1300 begins by displaying (at 1305) scroll controls for the low-quality touchscreen interface. This occurs when the mapping application receives input to select the zoom and scroll control of the low-quality touchscreen interface. The first stage 1405 of Figure 14 illustrates a low-quality touchscreen interface 1400. At this stage, a user selects the zoom and scroll control 1420. As a result, the second stage 1410 illustrates the zoom and scroll controls, including scroll arrows 1425-1440.
Next, the process 1300 receives (at 1310) a selection of a location on the vehicle touchscreen within the map view. That is, the selection is not a selection of the zoom controls, position indicator, done button, etc., but rather is within the map where the mapping application displays the scroll controls. The second stage 1410 of Figure 14 illustrates that the user makes a selection at a specific touchscreen location between the up arrow 1425 and the right arrow 1430.
The process 1300 then calculates (at 1315) the angle of a line from the center of the scroll controls to the center of the touch input. In some embodiments, as shown in the figures, the center of the scroll controls does not exactly coincide with the center of the vehicle display screen because the top bar with the zoom and position controls takes up a portion of the display. To determine the center of the touch input, in some embodiments the vehicle touchscreen system translates its touch input into a selection at a particular location, which the vehicle system sends to the mobile device (via the mobile device's connection to the vehicle system). Using this information, the mapping application can calculate the angle. The second stage of Figure 14 illustrates that in this example, the angle between the user's selection location and the horizontal is again Θ.
After calculating the angle, the process determines (at 1320) whether the angle is within a threshold of one of the primary scroll directions. This threshold, in some embodiments, is a particular angular range (e.g., 5°, 10°, etc.) from the horizontal or vertical. In some embodiments, the range is defined based on the display of the scroll arrows: if the selection input is within the minimum range that includes the entirety of the displayed scroll control, then the selection is within a threshold of that primary scroll direction. When the angle is within the threshold of one of the primary scroll directions, the process scrolls (at 1330) the map according to the selection of the closest primary direction scroll control. Thus, if the selection is near the right arrow 1430, the map will shift leftwards to display the portion previously off the screen to the right.
On the other hand, when the angle is not within the threshold of any of the primary scroll directions, the process scrolls (at 1325) the map at the calculated angle. That is, the map pans in the direction opposite the calculated angle, as if the user had selected a scroll arrow at the calculated angle. Thus, the second stage 1410 shows two arrows (not actually displayed as part of the user interface) pointing down and leftwards to indicate that the map shifts in that direction. The third stage 1415 of Figure 14 illustrates the result of the map having been scrolled down and left according to the user input. In addition to showing a different portion of the map region, the position control 1445 has been activated by the mapping application, in case the user wants to return to having the map follow the device location.
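The angle-based branch of process 1300 might be implemented along the following lines; the snap threshold of 10° and all names are illustrative assumptions, since the described embodiments leave the exact threshold open (e.g., 5°, 10°, or a range derived from the drawn arrow controls).

```swift
import Foundation

// Illustrative sketch of the decision in process 1300: snap to a primary scroll
// direction when the touch is within a threshold of it, otherwise scroll at the
// computed angle.
struct ScrollCommand {
    let angleDegrees: Double   // direction in which new map content is revealed
}

func scrollCommand(touchX: Double, touchY: Double,
                   controlCenterX: Double, controlCenterY: Double,
                   snapThresholdDegrees: Double = 10) -> ScrollCommand {
    let dx = touchX - controlCenterX
    let dy = controlCenterY - touchY                     // flip y so up is positive
    var angle = atan2(dy, dx) * 180 / .pi                // -180..180, with 0 = right
    if angle < 0 { angle += 360 }
    // Snap to the nearest primary direction (0, 90, 180, 270) if close enough.
    let primaries: [Double] = [0, 90, 180, 270, 360]
    for primary in primaries where abs(angle - primary) <= snapThresholdDegrees {
        return ScrollCommand(angleDegrees: primary.truncatingRemainder(dividingBy: 360))
    }
    return ScrollCommand(angleDegrees: angle)            // diagonal scroll at the computed angle
}
```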
Figure 15 illustrates another feature of the scroll arrows of the low-quality touchscreen interface of some embodiments. Specifically, in some embodiments as a user holds down a touch input for an extended period of time over one of the scroll arrows, the map scrolling accelerates. Figure 15 illustrates this acceleration of the map scrolling over four stages 1505-1520 of the user interface 570. The first stage 1505 illustrates the user interface, as a user selects (via a tap input on the touchscreen) the zoom and scroll control 587. In the second stage 1510, the zoom and scroll controls have appeared in the user interface 570.
The second stage 1510 also illustrates that the user has begun to press down on the left scroll arrow 1155. The third stage 1515 illustrates (by showing the two right-facing arrows that are not actually displayed in the UI) that the map scrolls to the right by a particular distance as the user holds down the scroll arrow 1155. The third stage also illustrates that the user continues to hold down the scroll arrow 1155.
At the fourth stage 1520, the user continues holding down the scroll arrow 1155. As shown by the two sets of arrows (also not actually displayed within the UI), the map scrolls to the right at twice the speed of the previous stage. Different embodiments treat the acceleration of the scrolling differently. For instance, some embodiments continue speeding up as a user continues to hold down the same scroll arrow. Other embodiments have a fixed number of different scrolling speeds (defined as percentages of the map region displayed, rather than physical distances represented by the map). For instance, if the user continued to hold down the scroll arrow 1155, some embodiments would accelerate the scroll speed to three times the original speed. In addition, the threshold time after which the application begins to accelerate the scrolling may vary. For instance, some embodiments change the speed of the scrolling after every second that a user has continuously held down the scroll control.
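A minimal sketch of the fixed-speed-table variant of this acceleration is shown below; the one-second step and the 1x/2x/3x multipliers are assumptions used only for illustration.

```swift
import Foundation

// Illustrative sketch: map how long the user has held a scroll arrow to a speed
// multiplier, using a fixed set of scrolling speeds as one of the described variants.
func scrollSpeedMultiplier(heldFor seconds: TimeInterval) -> Double {
    let multipliers: [Double] = [1, 2, 3]                       // fixed number of speeds
    let index = max(0, min(Int(seconds), multipliers.count - 1)) // speed up after each full second
    return multipliers[index]
}

// Example: after 0.4 s the map pans at 1x, after 1.2 s at 2x, after 2.5 s at 3x.
```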
As mentioned, the bookmarks and recents lists may include scroll arrows for scrolling through a list of bookmarks or recents (e.g., recent searches, destinations, etc.). Figure 16 illustrates the use of scroll arrows to navigate through a list of recents over four stages 1605-1620 of the low-quality touchscreen interface of some embodiments. The first stage 1605 illustrates the low-quality vehicle touchscreen mapping application interface 570. At this stage, the user selects the recents control 582 in order to open a recents list.
The second stage 1610 illustrates a recents page 1600, which is similar to the recents page 735 shown above for the high-quality touchscreen interface. However, the low-quality recents page 1600 includes scroll arrows 1625 and 1630 for navigating through the list of recents. The third stage 1615 illustrates the user selecting the scroll down arrow 1630 (with a tap selection). As a result, the fourth stage 1620 illustrates that the recents list has shifted upwards by one item, so that the application no longer displays the recent location of San Francisco at the top of the list, and instead displays a recent search for restaurants in Los Angeles at the bottom of the list.
C. Non-Touch User Interface
The above subsections described map exploration (and introduced other features) for different types of touchscreen interfaces for vehicle display screens. Some vehicles, however, have center console display screens that do not accept touch input. In this case, different vehicles may have different types of controls, such as a knob, a trackball, a joystick, non-touchscreen buttons, etc. Before describing the details of the search and zoom features, the operation of the user interface generally will be described.
Figure 17 illustrates the operation of such a non-touchscreen vehicle user interface 1700 according to some embodiments over five stages 1705-1725. The first stage 1705 illustrates a default non-touchscreen home page 1700. Whereas the home page for the touchscreen vehicle user interfaces includes various selectable controls, the non-touchscreen home page of some embodiments only shows the time. The first stage 1705 additionally illustrates the vehicle hardware controls 580, including the keypad 591, joystick 592, and additional buttons. As mentioned, various different embodiments include different types of controls. While this figure shows a joystick 592, Figure 18, described below, illustrates a trackball. Additional vehicles may use a knob, various types of buttons, etc.
As shown at the first stage 1705, the user presses the menu button 593. The second stage 1710 illustrates the result of this selection, with the home menu user interface 575 displayed on the non-touchscreen display screen. The home menu interface 575 is described above by reference to Figure 5. In some embodiments that do not include such a menu button, the user accesses this menu in a variety of different ways. For instance, in some embodiments simply scrolling a trackball, turning a knob, or moving a joystick while in the home page 1700 will activate the menu 575. As in Figure 5, the user selects the map menu option 594 by navigating to that option with the joystick 592 and pressing down on the joystick to perform a selection. With different sets of controls, a user might instead rotate a knob and then press the knob inwards to select one of the options, or move a trackball to highlight an option and then press inwards on the trackball to select it.
The third stage 1715 illustrates the result of the selection of the map menu option 594. As in the previous Figure 5, the device outputs the mapping application non-touchscreen vehicle interface 595 for display on the vehicle screen. Furthermore, at this stage 1715, the user again presses the menu button 593.
The fourth stage 1720 illustrates the result of this selection. The mapping application user interface now displays a menu 1730 with four options 1735-1750. These four options include a search option 1735, a recents option 1740, a bookmarks option 1745, and a predictive routing option 1750. These options correspond to the selectable controls 571-574 of the touchscreen user interface 565, and enable the user to access the various different features of the mapping application on the non-touch display screen. In some embodiments, a fifth menu option for tracking the device's current location in the map is available only when the current location is not presently centered on the map. The first menu option 1735 is initially selected by default. Some embodiments always select the search option (or a different option when the menu options have a different order) upon opening the menu. Other embodiments, however, select the most recently selected option upon activation of the menu.
The fourth stage 1720 additionally illustrates the user moving the joystick downwards. As a result, the fifth stage 1725 illustrates that the selection marker in the menu has moved down to the recents option 1740, such that a selection input (e.g., pressing down on the joystick) will select the recents option. Some embodiments use up and down movements to scroll through the menu and use right and left movements to either select an option (a rightward movement) or go back to a previous page (a leftward movement).
The above figure illustrates menu navigation with the non-touchscreen user interface. Map exploration presents a unique challenge, however, because the map must be scrollable and zoomable, actions that are more naturally performed via a touchscreen, where direct-manipulation inputs are available (e.g., drag gestures, pinch gestures, or touchscreen controls that do not need to conform to those provided by a vehicle system).
Figure 18 illustrates one example of such map exploration in a non-touchscreen vehicle user interface 1800 over four stages 1805-1820. The user interface output by the mapping application is the same as in the previous example, but the vehicle controls are different. Instead of a joystick, the vehicle controls include a trackball 1825. The trackball 1825 is a ball that sits in a well, and moves within the well in various directions in response to user input. A user (e.g., a driver or passenger) can roll the trackball up, down, left, or right (i.e., rotating about one of the two horizontal axes of the trackball). In addition, the user can rotate the trackball about the vertical axis through the center of the trackball. Furthermore, with some trackball inputs, the user can press down as a separate input and/or "shift" the trackball forwards, backwards, leftwards, or rightwards.
In the first stage 1805, the user rolls the trackball 1825 to the right. As shown by the arrows illustrated facing the left (which are not actually displayed in the user interface), the rightward scroll of the trackball results in the map shifting leftwards, revealing a portion of the map previously off of the screen to the right. The second stage 1810 illustrates this shift of the map, as the map region (and the position indicator 1830 that shows the device's current location) have moved to the left in the display screen.
The second stage 1810 also illustrates that the user is rotating the trackball 1825 clockwise about its center axis. In some embodiments, this rotation action of the trackball causes the application to zoom either in or out (e.g., zooming in for clockwise rotation and zooming out for counterclockwise rotation). In this case, because the rotation is clockwise, the mapping application changes the zoom level to zoom in on the map, as illustrated by the third stage 1815.
Additionally, at the third stage 1815, the user selects the menu button 1835. As in the previous Figure 17, this selection causes the mapping application to display a menu 1840, similar to the menu 1730. In this case, the menu includes the same four initial options (search, recents, bookmarks, and predictive routing), but also includes a fifth option 1845 for the current location (because the map is no longer centered on the current location of the device). The selection of the current location option 1845 is similar to the selection of the position control in the touchscreen interfaces, and results in the mapping application centering the map on the current location of the device and subsequently tracking that location.
Vehicles with different types of inputs (other than trackballs) may perform these scroll and zoom actions differently. For instance, a device with only a knob for input may initially scroll the map left and right in response to rotations of the knob (e.g., clockwise to scroll the map rightwards and counterclockwise to scroll the map leftwards). Upon receiving a push of the knob, subsequent rotation will scroll the map up and down. After a second push of the knob, additional rotation of the knob will zoom in or out of the map. With a joystick, shifting the joystick left, right, up or down causes the map to move in the shifted direction (or opposite the shifted direction) in some embodiments. Some embodiments interpret rotation of the joystick as zoom input, while in other embodiments the joystick can be pushed inwards or pulled outwards to change the zoom level.
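The knob-only variant described above can be thought of as a small mode-cycling state machine, sketched below with hypothetical names; the cycling order and the wrap-around back to horizontal scrolling are assumptions.

```swift
import Foundation

// Illustrative sketch of a knob-only input scheme: each push of the knob cycles
// what rotation controls (horizontal pan, then vertical pan, then zoom).
enum KnobMode {
    case horizontalScroll, verticalScroll, zoom

    mutating func advance() {                       // called when the knob is pushed
        switch self {
        case .horizontalScroll: self = .verticalScroll
        case .verticalScroll:   self = .zoom
        case .zoom:             self = .horizontalScroll
        }
    }
}

enum MapAction { case panRight, panLeft, panUp, panDown, zoomIn, zoomOut }

// Translate a knob rotation into a map action given the current mode.
func action(for mode: KnobMode, clockwise: Bool) -> MapAction {
    switch mode {
    case .horizontalScroll: return clockwise ? .panRight : .panLeft
    case .verticalScroll:   return clockwise ? .panDown  : .panUp
    case .zoom:             return clockwise ? .zoomIn   : .zoomOut
    }
}
```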
III. SEARCH FEATURE IN VEHICLE USER INTERFACE
The previous section introduced the search feature available through the vehicle interface of some embodiments. As shown, in some embodiments the vehicle interface only permits search input via dictation, rather than via a keyboard or other typing mechanism. This is to dissuade the user from attempting to type in locations while driving, though a passenger still has the ability to enter search terms through the mobile device interface via typing.
In some embodiments, the mobile device that runs the mapping application also has voice recognition functions to perform searches or answer other questions. In some embodiments, whereas the vehicle interface search is a pure dictation search, the voice recognition functions on the device are more interactive. That is, on the mobile device, the user can dictate successive requests, in which the voice recognition functions use context of previous requests to analyze the meaning of subsequent requests. On the other hand, because the vehicle maps interface is optimized towards quickly and efficiently identifying a destination for navigation, each dictated request is treated as a separate search. The following section describes various aspects of the vehicle search interface in greater detail.
Above, Figure 8 illustrated the search feature of some embodiments in the touchscreen vehicle user interface. As shown in that figure, when the search term is recognized without ambiguity, the mapping application automatically transitions from the search function page to a map view showing search results for the unambiguous search term. On the other hand, when the mapping application cannot unambiguously identify the search term, the application presents the user with a set of choices for the search.
Figure 19 illustrates the search feature in the case of an ambiguous search term over five stages 1905-1925 of the high-quality touchscreen user interface. The search feature, in some embodiments, operates the same in both the high-quality and low-quality touchscreen interfaces, and similarly in the non-touchscreen interface, with differences that will be explained in detail below.
The first three stages 1905-1915 are the same as the first three stages 805-815 of Figure 8, with the exception of the phrase spoken by the user in the third stage 815. The first stage 1905 illustrates the user selecting the search control 571 in the high-quality vehicle touchscreen interface 565. The second stage 1910 illustrates the search function page 835 that prompts the user to speak her destination and the third stage 1915 illustrates the page as the user speaks. In this case, the user dictates "Home" into the vehicle microphone, which is then processed by the mobile device voice recognition system. In some embodiments, for the mapping application search function, the voice recognition processing performs pure dictation, rather than natural language processing. That is, the voice recognition does not attempt to parse out the meaning of phrases or expressions within the user's search request. Thus, if the user states "go home", the mapping application would look for search terms that match this phrase, rather than identifying "go" as a desire for the application to navigate towards a particular location. On the other hand, some embodiments perform natural language processing for the mapping application search requests received through the vehicle interface. In such embodiments, if the user stated "go home", the application would recognize the user's intention and perform the search shown in Figure 19.
As shown at the fourth stage 1920, the mapping application cannot determine which of two search terms a user has requested through dictation. Because the search term is ambiguous, the search function page 835 displays the terms as searched ("Home") in the search bar 840, with a list of possible search results in the search results region 845. In this case, the user has two locations stored as "Home", one in Los Angeles and one in San Francisco.
The search feature, in some embodiments, initially searches a set of databases that store information relevant to the user of the device. These databases, in some embodiments, include the user's stored contact information (e.g., address book records), recent searches performed within maps (i.e., results available through the recents feature of the interface), stored bookmark information (i.e., results available through the bookmarks feature of the interface), as well as harvested address and location information. In some embodiments the mobile device running the mapping application is one of several devices connected via a user account. These devices share information via a content-sharing service, and this content-sharing service (with permissions from the user) identifies addresses and/or possible destinations from sources such as the user's e-mail, messages, web browsing, etc. This harvested address and location information is described in detail in U.S. Provisional Application 61/832,850, filed 6/8/2013, as well as in U.S. Patent Application 14/081,850, filed 11/15/2013 and entitled "Harvesting Addresses". U.S. Provisional Application 61/832,850 and U.S. Patent Application 14/081,850 are both incorporated herein by reference.
In some embodiments, only when the various databases do not provide any search results does the search function use a more general search engine to identify locations within the user's vicinity. In this case, the databases find multiple results for "Home", as the user has two separate addresses stored for her home. As shown at the fourth stage 1920, the user selects the Los Angeles home address. As a result, the fifth stage 1925 displays the search result map view page 852, with a pin 1930 displayed at the location of the user's Los Angeles home. The mapping application centers the map about this dropped pin 1930, and the lower information bar 870 displays information about the location, including the name ("Home"), the address ("1919 Foothill Drive"), the estimated travel time to the location, and a selectable control 880 for entering the turn-by-turn navigation mode of the mapping application with the selected location as the destination. In addition, the upper information bar 865 displays the search term as translated by the voice recognition functions ("Home"), along with a cancel control 1935 (to return to the standard map exploration interface) and a search again control 875 to return to the search function page 835 displayed in the second stage 1910 of this figure.
Figure 19 illustrates ambiguous results identified within the databases searched by the mapping application search function (specifically, the address book information). In some embodiments, ambiguities may also arise from search engine results (e.g., when a user dictates a street address that may resolve to addresses in multiple different cities). Figure 20 illustrates such a case of ambiguous results returned by the search engine.
Specifically, Figure 20 illustrates the search feature of the mapping application identifying ambiguous results for an address search over five stages 2005-2025 of the high-quality vehicle touchscreen interface. In this case, the first three stages 2005-2015 are the same as the first three stages of Figures 8 and 19, with the user selecting the search control 571, the search function page 835 appearing, and the user dictating a search query. At the third stage, the user dictates "1200 Main St.". Though not shown in the figure, for reference in this case the user is driving in the Orange County area of Southern California.
Without the search terms specifying a city, the mapping application cannot determine whether the user is searching for the 1200 Main St. located in Huntington Beach or the 1200 Main St. located in Irvine. As such, the fourth stage 2020 illustrates that the search function page 835 displays these two possible results in the search results region 845. The user selects the address located in Huntington Beach, and the fifth stage 2025 illustrates that the mapping application displays a map centered at this location, with a pin 2030 displayed at the requested address.
The previous examples illustrate the case in which the dictated search term produces a single result or ambiguous results displayed in a list from which the user selects the desired search result. Some search queries will not be ambiguous, but produce multiple matching results. Figure 21 illustrates an example of such a search that produces multiple results over eight stages 2105-2140. As with the previous examples, the first three stages 2105-2115 are the same, with the exception of the user's dictated search query. As shown at the third stage 2115, the user dictates the search "Coffee".
In this case, the search is not ambiguous. The mapping application does not find any results in the local databases (e.g., recents, address book, etc.), and therefore performs a search for coffee in the region surrounding the device location. While the search finds multiple results, the search term itself ("coffee") is not ambiguous. Instead, the mapping application search feature recognizes this as a more generic search looking for one or more results for a particular type of location (coffee shops).
Thus, as shown at the fourth stage 2120, the application automatically transitions to the search results map view page 852, with the search term "Coffee" appearing in the upper information bar 865. This search term produces six search results within the region searched (e.g., within a particular distance of the device, or drivable within a particular duration from the location of the device). The map region displays a map with a pin for each of the six search results, and one of these pins displayed as selected. Some embodiments automatically initially select the search result closest to the device's current location. In some embodiments, other factors may affect the ordering (i.e., ranking) of the search results returned by a search engine (e.g., a search server). The search engine may use popularity (by review quality, or by number of people reviewing a location) and relevance in addition to distance when determining the ordering of search results. For example, when the user searches for coffee, the closest result in terms of proximity might be a restaurant that happens to serve coffee, but the search engine may return a coffee shop located slightly further from the current location instead, based on a combination of relevance and popularity.
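One way such a combined ordering could be expressed is sketched below; the weights, score ranges, and type names are assumptions, as the actual ranking is produced by the search engine.

```swift
import Foundation

// Illustrative sketch of ordering search results by a weighted combination of
// proximity, relevance, and popularity.
struct SearchResult {
    let name: String
    let distanceKm: Double      // distance from the device's current location
    let relevance: Double       // 0...1, how well the place matches the query
    let popularity: Double      // 0...1, derived from review quality and volume
}

func rank(_ results: [SearchResult]) -> [SearchResult] {
    func score(_ r: SearchResult) -> Double {
        let proximity = 1.0 / (1.0 + r.distanceKm)      // closer places score higher
        return 0.4 * proximity + 0.4 * r.relevance + 0.2 * r.popularity
    }
    return results.sorted { score($0) > score($1) }
}
```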
The lower information bar 870 displays the information for the selected destination (in this case, Curt's Coffee, which is three minutes away), and the selectable control for entering navigation. In addition, because the search term resulted in more than one search result, the application displays a left arrow control 2145 and a right arrow control 2150, with the text "1 of 6" located between the arrows. In some embodiments, when the search returns multiple results displayed together on the map, the vehicle interface includes a mechanism for stepping between the search results. As shown in subsequent stages of this figure, selection of the arrow controls 2145 and 2150 causes the application to step through the search results.
At this fourth stage 2120, however, the user makes a particular type of selection (in this case, a double tap) of the currently selected pin 2155 for Curt's Coffee. The fifth stage 2125 illustrates that this double-tap selection of the pin 2155 results in the mapping application centering the selected pin in the map region and zooming in on the pin as well. The fifth stage 2125 illustrates the same user interface except for the map region now centered on the pin and at a closer zoom level such that less of the surrounding region is displayed. In some embodiments, when a search gives more than one result, such a selection (e.g., a double-tap within the vicinity of a pin) causes the application to zoom in on the pin, and a second such selection causes the application to zoom back out to show all of the results. In some embodiments, the user cannot zoom in and out in this way when only one search result is found.
At the sixth stage 2130, with the map region still centered on the pin 2155 for Curt's Coffee, the user selects (via a tap input) the right arrow control 2150. The seventh stage 2135 illustrates that the mapping application has selected the second closest result to the user's location, Killer Coffee, and shifted the map region so that the pin 2160 for this location is now in the displayed portion of the map region. Though not the case in this example, some embodiments center the map on each particular pin as the user steps through the search results to that pin and selects it. Other embodiments, as shown in this figure, modify the map display so that the pin is within a particular threshold of the center and the center of the displayed map region is equidistant from the selected pin and the current location.
The seventh stage also illustrates the user again selecting the right arrow control 2150 (again with a tap input). In the eighth stage 2140, the mapping application selects the third closest result to the user's location, Joe's Drowsy Cure, and shifts the map region so that the pin 2165 for this location is now visible within the map display. Furthermore, in order to include both the current location of the device and the selected pin 2165, the mapping application zooms out from the map while shifting the map region.
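A simple way to compute such a framing region is sketched below; the padding factor and the flat (non-projected) coordinate treatment are illustrative assumptions.

```swift
import Foundation

// Illustrative sketch: when stepping to a pin that lies outside the current view,
// recenter and zoom so both the device location and the selected pin fit, with
// some padding around them.
struct Region {
    var centerX: Double, centerY: Double
    var width: Double, height: Double
}

func regionShowing(currentX: Double, currentY: Double,
                   pinX: Double, pinY: Double,
                   padding: Double = 1.4) -> Region {
    let centerX = (currentX + pinX) / 2          // center equidistant from both points
    let centerY = (currentY + pinY) / 2
    let width  = max(abs(currentX - pinX), 0.001) * padding
    let height = max(abs(currentY - pinY), 0.001) * padding
    return Region(centerX: centerX, centerY: centerY, width: width, height: height)
}
```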
The above examples illustrate the mapping application search feature for the touchscreen vehicle user interfaces. In some embodiments, the only difference between the high-quality touchscreen interface search feature and the low-quality touchscreen interface search feature is the size of the buffers for touch selectability around the various controls (e.g., the pins, the right and left arrow controls for stepping through the search results, etc.). The differences in buffer size of some embodiments are described above by reference to Figure 6.
Figure 22 illustrates the operation of the mapping application search feature for the non-touchscreen vehicle user interface of some embodiments, over six stages 2205-2230. The first stage 2205 and second stage 2210 are the same as the third and fourth stages 1715 and 1720 of Figure 17, described above. In this case, however, at the second stage, the user performs a selection input (e.g., pushing the joystick 592 inwards, pushing a knob inwards, pressing a particular hardware button, performing a specific input on a trackball, etc.) with the search menu option 1735 selected.
As shown at the third stage 2215, this selection activates the search feature, causing the display of a search function page 2235. The search function page 2235 is analogous to the touchscreen search function page 835, and prompts the user to dictate a destination along with examples. However, the page does not have a cancel option, because there is no touchscreen. Instead, in some embodiments the user can cancel the search by pressing the physical back button 2240 or exit button 2245.
As shown at the fourth stage 2220, rather than canceling, the user dictates the search "coffee", the same search term used above in Figure 21. At this stage, the search function page has the same waveform animation as in the touchscreen interface, but does not include a touchscreen button to indicate completion of the dictation. As with the touchscreen interface, the mapping application can recognize when the user has completed speaking in some embodiments. In addition, the search function page 2235 displays instructions to the user to press the menu button upon completion of the dictation. In various embodiments, the user can perform different inputs via the built-in vehicle controls in order to end the search (e.g., pressing inwards or upwards on the joystick, pushing a knob, etc.).
The fifth stage 2225 illustrates that the search algorithms function in the same way irrespective of the interface. The same search for coffee with the device in the same location gives the same six results as in the touchscreen interface, in the same order. Thus, the fifth stage 2225 illustrates the map region displayed in the non-touchscreen interface, with six pins representing the six search results, and the closest location to the device's current location selected. The non-touchscreen interface search results map view 2250, unlike the search results map view 852 of the touchscreen interface, does not include touchscreen buttons such as the cancel, search again, or enter navigation controls. Instead, the mapping application relies on user interaction with the built-in vehicle controls to perform these actions (e.g., pressing inwards on a joystick to enter navigation, using the back or exit buttons to respectively search again or cancel the search function).
Other than the control to enter navigation, the lower information bar in the search results map view 2250 is similar to that in the touchscreen interface. In addition to lacking this control, the user interface does not have selectable arrows on either side of the text "1 of 6" that indicates the number of search results, which can instead be stepped through via non-touchscreen input. In this case, the user rotates the joystick 592 clockwise. As a result, at the sixth stage 2230, the mapping application selects the second search result (as ordered by the search engine, using various factors, such as proximity to the current location of the device, popularity, relevance, etc.).
As described above, in some embodiments the mapping application search function initially searches various local databases, then uses a search engine to perform a local search when unable to find a result via the local databases. Figure 23 conceptually illustrates a process 2300 performed by the mapping application of some embodiments in order to retrieve and display search results for a search dictated through the vehicle user interface.
As shown, the process 2300 begins by receiving (at 2305) voice dictation audio captured by a microphone in the vehicle. In some embodiments, the user first activates a search function through the vehicle interface of the mapping application, which prompts the application to receive the audio recorded by the vehicle microphone. Using the vehicle microphone (which, in some embodiments, must have the ability to record audio of at least a threshold quality) enables the user to avoid the need to pick up the mobile device and speak into the device microphone. However, other embodiments use a microphone in the mobile device to record the voice dictation.
Next, the process 2300 processes (at 2310) the dictated audio to determine the search terms. In some embodiments, the mobile device includes voice recognition software leveraged by the application in order to identify the search terms spoken by the user. In other embodiments, the mapping application connects to a voice recognition server that performs the voice analysis. As mentioned above, some embodiments perform dictation translation that does not account for natural language phrasing. On the other hand, in addition to simply translating the audio into words, some embodiments parse phrases to determine the user's desired search. For instance, when the user says "go home", the voice recognition process identifies that the user is searching for "home", and can ignore the "go". Similarly, when a user dictates "Find the best coffee", some embodiments identify coffee as the search, and "best" as a modifier to use quality ratings for identified locations in the area.
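The light-weight phrase parsing described for these embodiments might look roughly like the following sketch; the filler-word list and the handling of the "best" modifier are assumptions for illustration only.

```swift
import Foundation

// Illustrative sketch: strip leading command words and note a quality modifier,
// leaving the search terms to pass to the search feature.
struct ParsedQuery {
    let searchTerms: String
    let preferHighlyRated: Bool
}

func parseDictation(_ transcript: String) -> ParsedQuery {
    let fillerPrefixes = ["go to", "go", "find the", "find", "take me to", "navigate to"]
    var text = transcript.lowercased().trimmingCharacters(in: .whitespaces)
    for prefix in fillerPrefixes where text.hasPrefix(prefix + " ") {
        text = String(text.dropFirst(prefix.count + 1))   // drop the command word and the space
        break
    }
    let preferBest = text.hasPrefix("best ")
    if preferBest { text = String(text.dropFirst("best ".count)) }
    return ParsedQuery(searchTerms: text, preferHighlyRated: preferBest)
}

// parseDictation("Find the best coffee") -> searchTerms "coffee", preferHighlyRated true
```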
With the search terms identified, the process performs (at 2315) a search for the identified terms using local databases. In some embodiments, these local databases include the user's address book (e.g., information about the user's contacts) and addresses and/or destinations harvested from other applications. These harvested destinations may include addresses or destinations in e-mails, SMS messages, etc. either sent or received by the user. In addition, in some embodiments the harvested destinations include addresses either specifically sent to the mapping application from a web browser or other application (such as Yelp®) on the mobile device, or found on web pages or locations navigated to with the web browser or other application. Furthermore, in some embodiments, these addresses are not restricted to those identified by the mobile device connected to the vehicle interface. The mobile device may connect to an information- and/or content-sharing service (e.g., iCloud®) that enables the sharing of content and information between multiple devices connected by an account. In this case, destinations may be harvested from all of the user's devices connected to the account and distributed to the mobile device on which the mapping application operates, for use by the mapping application search feature. In addition to these databases, some embodiments also include recent search results from previous searches by the mapping application in the initial search of local databases.
The process 2300 then determines (at 2320) whether at least one result is found in the local databases. As described below, when no results are found via this search, the process proceeds to 2340 to perform a search using a search engine. When at least one result is found by the local database search, the process 2300 determines (at 2325) whether multiple results are found within the local database search. In some embodiments, the search function treats multiple results identified in the local databases as ambiguous results.
Thus, when multiple results are found, the process displays (at 2330) the set of results as selectable options in a list. Figure 19 is an example of such a search result, in which two results for the search "home" are found. Other searches that might turn up ambiguous results in the local databases include "work" when the user works multiple jobs or has multiple work locations, or "Bobby's house", when the address book has information for multiple people named Bobby. After some period of time, the process receives (at 2332) a selection of one of the displayed options. While this is shown as a linear process, one of ordinary skill in the art will recognize that the application may receive a selection of a cancel option, or may wait an indeterminate amount of time between operation 2330 and operation 2332 until the user makes a selection. Upon receiving the selection, the process displays (at 2334) the selected option on the map.
When only one result is found, the process instead displays (at 2335) the singular result on the map view, without displaying a list of results. As shown in Figure 8 (which is not an example in which the search result is necessarily found in the local databases), the mapping application of some embodiments displays a pin at the identified location, and provides the user with a control for entering a turn-by-turn navigation mode with the identified location as the destination.
On the other hand, when the search of the local databases does not turn up any results, the process performs (at 2340) a search for the dictated search terms using a search engine. In some embodiments, this is a search engine also available to a web browser on the mobile device, while in other embodiments the mapping application uses a different search engine. In addition, the mapping application constrains the search by location so that only results within a particular proximity of the device's current location will be found. For example, the application may constrain the search to a particular distance (e.g., a 5-mile radius, 10-mile radius, etc.) or a particular travel time (e.g., only locations that the user could reach by car within 15 minutes, 30 minutes, etc.). Some embodiments initially use a small distance or time constraint, then increase the constraint until the distance or travel time is large enough for at least one result to be identified.
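The expanding-constraint behavior could be sketched as follows; the radius schedule and the searchNearby callback are hypothetical placeholders, not part of the described search engine.

```swift
import Foundation

// Illustrative sketch: start with a small search radius and widen it until the
// location-constrained search returns at least one result.
func searchWithExpandingRadius(query: String,
                               searchNearby: (_ query: String, _ radiusKm: Double) -> [String]) -> [String] {
    let radiusScheduleKm: [Double] = [2, 5, 10, 25, 50]
    for radius in radiusScheduleKm {
        let results = searchNearby(query, radius)   // constrained to the device's vicinity
        if !results.isEmpty { return results }
    }
    return []                                       // nothing found even at the widest radius
}
```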
The process then determines (at 2345) whether the search results are ambiguous. In the case of the search engine results, this does not determine simply whether one or more result was returned. Instead, search results are ambiguous if the application cannot determine with a threshold confidence what result the user desires.
When the search results are ambiguous, the process displays (at 2350) the set of results as selectable options in a list, as was the case for multiple results identified within the local databases. For instance, in the example of Figure 20, the user searches for a particular address without specifying a city. As this happens to be an address in two different cities nearby, the results are ambiguous because the user is clearly searching for a single location. Similarly, if the user were to dictate "Curt's Coffee", but two different unrelated coffee shops nearby had the names "Curt's Coffee" and "Kurt's Koffee", some embodiments would display these results in a list as ambiguous, allowing the user to select one. After some period of time, the process receives (at 2352) a selection of one of the displayed options. While this is shown as a linear process, one of ordinary skill in the art will recognize that the application may receive a selection of a cancel option, or may wait an indeterminate amount of time between operation 2350 and operation 2352 until the user makes a selection. Upon receiving the selection, the process displays (at 2354) the selected option on the map.
However, when the search results are not ambiguous, the process determines (at 2355) whether the search engine has yielded more than one result for an unambiguous search. When only one result is returned, the process displays (at 2360) the singular result on the map view, without displaying a list of results. As shown in Figure 8, the mapping application of some embodiments displays a pin at the identified location, and provides the user with a control for entering a turn-by-turn navigation mode with the identified location as the destination.
When multiple results are returned, the process displays (at 2365) the returned set of locations on the map along with controls for stepping through the locations. Figures 21 and 22 are examples of such searches for different interfaces, with the controls displayed differently for the different vehicle interfaces. The mapping application displays pins at each of the locations, and the user can select the different pins with the stepping controls. With any of the pins selected, the user can select an option to enter turn-by-turn navigation with the selected location as the destination. After displaying the results at any of the operations 2334, 2335, 2354, 2360, or 2365, the process ends. In the cases in which the process displays a set of selectable options in a list, the application proceeds to the map view display after the user selects one of the possible results.
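Pulling the branches of process 2300 together, the overall decision flow might be sketched as follows; all type and function names here are hypothetical, and the local-database and search-engine calls are stand-ins for the components described above.

```swift
import Foundation

// Illustrative sketch of the overall flow of process 2300: try local databases
// first, fall back to a location-constrained search engine, and choose a display
// mode based on how many results come back and whether they are ambiguous.
enum SearchOutcome {
    case showOnMap([String])          // one result, or multiple unambiguous results
    case chooseFromList([String])     // ambiguous results presented as a list
}

func resolveSearch(terms: String,
                   searchLocalDatabases: (String) -> [String],
                   searchEngine: (String) -> (results: [String], ambiguous: Bool)) -> SearchOutcome {
    let localResults = searchLocalDatabases(terms)      // contacts, bookmarks, recents, harvested addresses
    if localResults.count > 1 {
        return .chooseFromList(localResults)            // multiple local hits are treated as ambiguous
    }
    if localResults.count == 1 {
        return .showOnMap(localResults)                 // single local hit goes straight to the map
    }
    let engine = searchEngine(terms)                    // constrained to the device's vicinity
    if engine.ambiguous {
        return .chooseFromList(engine.results)
    }
    return .showOnMap(engine.results)                   // one pin, or several pins with stepping controls
}
```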
The above process and examples illustrate the vehicle interface in cases in which the search is initiated through that interface. However, when the mobile device on which the mapping application operates is connected to the vehicle system, a user (e.g., a passenger) can continue to use the mapping application on the mobile device, and perform searches within the user interface displayed on the device. Figure 24 illustrates an example of a user performing a search within the mapping application mobile device interface while the mobile device is connected to a vehicle interface. Specifically, the figure illustrates three stages 2405-2415 of both interfaces in which the user initiates a search function, performs a search, and receives search results displayed on both interfaces.
The first stage 2405 illustrates a mobile device 2400 displaying the mapping application user interface 205 of some embodiments, as described above by reference to Figure 2, as well as the high-quality vehicle user interface 565. Though not shown, the mobile device is currently connected to a vehicle that includes a touchscreen interface for displaying the vehicle UI 565. In this first stage 2405, the user selects the search field 240 to initiate a search operation on the mobile device.
At the second stage 2410, the user has utilized a touchscreen keyboard 2420 (not available in the vehicle user interfaces of some embodiments) to type a search for "Pizza" into the search field 240. As the user types, the search field 240 of some embodiments displays various autocompleted options for selection by the user. Thus, at this point the search field displays a selectable option of "Pizza", which the user selects via a touch input. As the user enters the search field and types a search on the mobile device 2400, the mapping application does not modify the vehicle user interface 565.
The third stage 2415 displays four dropped pins in the mobile device interface 205, corresponding to the search results for a generic search for pizza. In some embodiments, the mapping application changes the zoom level of the map view in the mobile device UI 205 in order to display all of the search results. The mapping application also selects the first search result (for a particular pizza location), and displays an expandable banner 2425 for the result. At this point, the mapping application also directs the vehicle user interface 565 to display the search results, as if the search had been performed using the vehicle interface.
Similarly, though not shown in the previous figures, once the user inputs a search term or terms into the mapping application via the vehicle user interface, the search results are displayed in the mobile device interface in some embodiments. In general, the mapping application allows separate map exploration in the two interfaces, and separate access of various features (e.g., the search interface, recents or bookmarks, predicted routes, etc.). However, once the mapping application reaches a result that corresponds to a possible destination (e.g., a selection of a recent location, a selection of a bookmarked location, a search result, etc.), the application synchronizes the interfaces to display the same possible destinations in the two interfaces.
IV. ACCESSING RECENT DESTINATIONS
In describing the mapping application interface for vehicle displays above in Section II, the control for accessing the recent destinations feature was discussed. As shown, the touchscreen interfaces include a recents control selectable from the primary map view, and the non-touch interface includes a recents option in the primary mapping application menu. The following will discuss additional aspects of the recent destinations feature of some embodiments.
Figure 7 above illustrated that when the user selects the recents control in a touchscreen interface (or selects the recents menu option in a non-touchscreen interface), the mapping application displays in that interface a list of destinations recently searched in the mapping application. In some embodiments, these recent destinations may include destinations that have appeared as search results in the mapping application when searched through the mobile device interface of the mapping application.
Figure 25 illustrates three stages 2505-2515 of the mapping application of some embodiments in which a user performs a search on the mobile device before connecting the device to the vehicle system, then accesses the recent destinations on the vehicle interface of the mapping application which now include the previously-searched destination. The first stage 2505 illustrates a person walking towards a vehicle 2500 while carrying a mobile device 2520 on which the mapping application of some embodiments operates. As shown in the mapping application mobile device UI 205, the user has searched for "Cool Coffee", and the mapping application displays a search result pin for the identified location in the map view.
At the second stage 2510, the user has entered the vehicle 2500 and begun driving. In addition, the user has connected her mobile device 2520 to the vehicle display system. As shown, at this point the display screen of the vehicle 2500 displays the mapping application UI 565. In some embodiments, when the device is connected while the mapping application is operating, the device automatically sends the mapping application vehicle user interface to the vehicle display screen. In other embodiments, upon connection to the vehicle system, the device operating system automatically sends the device and vehicle user interfaces to their respective home pages.
In the second stage 2510, the user selects the recents control 572. The third stage 2515 illustrates the recent destinations page 735 of some embodiments. In addition to various results from searches recently performed on the device (ABC Delicatessen, Los Angeles, and San Francisco), the recent destinations page 735 includes the Cool Coffee Shop, which was the search result identified by the mapping application based on a search through the mobile device interface 205 before the connection of the mobile device to the vehicle system.
In the situation illustrated in Figure 25, some embodiments automatically display the search result on the vehicle display screen once the mapping application is opened on the device after connection to the vehicle system. Figure 26 illustrates three stages 2605-2615 of the mapping application of some embodiments in which a user performs a search on the mobile device before connecting the device to the vehicle system, then opens the mapping application on the vehicle interface. The first stage 2605 illustrates a person walking towards a vehicle 2600 while carrying a mobile device 2620 on which the mapping application of some embodiments operates. As shown in the mapping application mobile device UI 205, the user has searched for "Cool Coffee", and the mapping application displays a search result pin for the identified location in the map view.
At the second stage 2610, the user has entered the vehicle 2600 and begun driving. In addition, the user has connected her mobile device 2620 to the vehicle display system. As shown, at this point the display screen of the vehicle 2600 displays the home page user interface 550 for vehicle touchscreens. In the embodiment shown in this figure, the device automatically returns to its home page as well upon connection to the vehicle system.
In the second stage 2610, the user selects the map control 563 to enter the mapping application, which also causes the mobile device display to display the mapping application (not shown). The third stage 2615 illustrates the result of entering the mapping application, in that the application automatically displays a search results map view page (showing the one result for the Cool Coffee Shop) in the vehicle display for the recently performed search.
In some embodiments, the recent destinations page displays destinations from sources other than mapping application searches. For example, some embodiments display recently harvested locations from e-mails, SMS messages, etc. (as described above in relation to the search feature) as recent destinations. In addition, some embodiments enable the user to add destinations and addresses to the mapping application.
Figure 27 illustrates three stages 2705-2715 of a mapping application in which a user adds a location to the mapping application from a web browser on the mobile device, which adds the location to the mapping application table of recent destinations. As shown, the first stage 2705 illustrates a mobile device 2700 on which a web browsing application interface 2720 is displayed. The web browsing application interface 2720 currently displays a web page for the Tee-Cake Bakery. In addition, the first stage (and subsequent stages) illustrates a recents table 2725 stored by the mapping application. This table, in some embodiments, indicates what the mapping application will display when the user selects a recents control in the mapping application user interface. The recents table 2725 includes two entries, for Hamburger Palace and Pot Sticker's Restaurant.
In the first stage 2705, the user selects a contact information tab 2740. The second stage 2710 illustrates that, as a result, the web browsing application interface 2720 displays various contact information for the Tee-Cake Bakery, including a physical location address. In addition, at the second stage 2710, the user presses and holds over the address in order to select the text that forms the address. At this point, the recents table 2725 is unchanged from the first stage 2705.
In the third stage 2715, as a result of selecting the address, the web browsing application interface displays a menu 2730 with several options relating to the selected text. The menu includes a copy option, as well as several options based on recognition by the mobile device that the selected text is an address. The device (e.g., a function of the device operating system or web browsing application) may detect addresses based on their formatting and/or matching them to a recognized address in map data. These menu options include an option to add the address to contacts as well as an option 2735 to add the address to the mapping application operating on the mobile device. The user selects this last option 2735 in the third stage 2715.
As a result of this selection, the mapping application recents table 2725 now includes a third entry, for the Tee-Cake Bakery. The entry for the bakery has been added to the top of the table, which represents the order in which the recent destinations will be displayed when a user selects the recents control in the vehicle user interface. Thus, because the bakery is the most recent destination, the application adds it at the top of the table. In some embodiments, when a user selects a destination from the recents (or searches again for the destination), the application resets the access time for the destination, placing it at the top of the table.
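The ordering behavior described above can be pictured with a small sketch. The following Swift code is an illustrative approximation only, not the application's actual implementation; the RecentDestination and RecentsTable types and the sample addresses are assumptions introduced to show how re-accessing a destination resets its access time and moves it to the top of the table.

```swift
import Foundation

// Illustrative sketch of a recents table whose most recently added or
// re-accessed destination is always listed first. Type names are assumed.
struct RecentDestination {
    let name: String
    let address: String
    var lastAccessed: Date
}

struct RecentsTable {
    var entries: [RecentDestination] = []

    // Adding (or re-selecting) a destination resets its access time and
    // moves it to the top of the table.
    mutating func recordAccess(name: String, address: String) {
        entries.removeAll { $0.address == address }
        entries.insert(RecentDestination(name: name,
                                         address: address,
                                         lastAccessed: Date()),
                       at: 0)
    }
}

var recents = RecentsTable()
recents.recordAccess(name: "Hamburger Palace", address: "1 Palace Way")
recents.recordAccess(name: "Pot Sticker's Restaurant", address: "2 Dumpling St")
recents.recordAccess(name: "Tee-Cake Bakery", address: "3 Flour Ave")
// The bakery, added last, is now listed first.
print(recents.entries.map { $0.name })
```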
While the recent destinations page of some embodiments displays recent destinations in order from most recently added to earliest added by default, various factors can cause the application to rearrange the display order of the recent destinations. For instance, some embodiments identify when a first location is a more likely destination than a second location, and place the first location above the second location in the recents list.
Figure 28 illustrates a recent destinations page 735 of some embodiments over three stages 2805-2815 in which the vehicle displaying the recent destinations page 735 travels over a stretch of road. Each stage illustrates, in addition to the recent destinations page 735, a conceptual map 2800 on which the vehicle location (and therefore the location of the device connected to the vehicle display system) is shown. This map 2800 does not represent a user interface display, but rather indicates the vehicle's location on the streets shown.
The first stage 2805 illustrates the current location 2820 of the mobile device and vehicle to which it connects in the map 2800. In addition, the recent destinations page 735 includes two recently searched destinations (Hamburger Palace and Pot Sticker's Restaurant), both located in Los Angeles. The map 2800 additionally includes a pin drawn to represent a location of the first of these listed destinations (Hamburger Palace). Further drawn on the map are a solid arrow representing the current direction of travel of the vehicle (and device) as well as a dashed arrow representing an optimal route to the most likely of the recent destinations. Because Hamburger Palace is the closest of the recent destinations, and the direction of travel projected forward overlaps with a portion of the optimal route to Hamburger Palace, the application lists this destination first in the recent destinations page 735.
At the second stage 2810, the vehicle approaches the intersection at which the route to Hamburger Palace diverges from the current travel direction (i.e., the intersection at which the vehicle would turn right to travel to Hamburger Palace). At this point, the displayed recent destinations page 735 remains the same.
The third stage 2815 illustrates the map 2800 and the recent destinations page 735 after the user has passed the turn towards Hamburger Palace. The map now shows a route to Pot Sticker's Restaurant, which overlaps with the projection of the current travel direction. Hamburger Palace, on the other hand, is located behind the current location of the vehicle in its current direction of travel. As a result, the mapping application of some embodiments determines that Pot Sticker's Restaurant is a more likely destination for the user, and moves this destination ahead of Hamburger Palace in the recent destinations page 735. In some embodiments, a predictive routing engine of the mapping application performs the determination of which destination is more likely based on various factors, including direction of travel, routes to the different possible recent destinations, frequency of previous travel to the destination, etc. The predictive routing engine of some embodiments is described in further detail in U.S. Provisional Application 61/832,928, filed 06/09/2013, as well as in U.S. Patent Application 14/081,899, filed 11/15/2013 and entitled "Warning for Frequently Traveled Trips Based on Traffic". U.S. Provisional Application 61/832,928 and U.S. Patent Application 14/081,899 are both incorporated herein by reference.
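One way to picture how such an engine might rank candidate destinations is sketched below in Swift. The scoring function, the 0.7/0.3 weights, and the sample bearings are illustrative assumptions; the predictive routing engine described in the applications cited above uses its own, more sophisticated factors.

```swift
// Illustrative ranking of recent destinations by likelihood, combining
// heading alignment with visit frequency. Weights and values are assumed.
struct CandidateDestination {
    let name: String
    let bearingToDestination: Double // degrees clockwise from north, 0-360
    let visitCount: Int
}

// A smaller angular difference between the vehicle heading and the bearing
// to a destination means the driver is currently headed roughly toward it.
func likelihoodScore(_ candidate: CandidateDestination,
                     vehicleHeading: Double) -> Double {
    let rawDiff = abs(vehicleHeading - candidate.bearingToDestination)
    let angleDiff = min(rawDiff, 360 - rawDiff)       // account for wrap-around
    let headingScore = 1.0 - angleDiff / 180.0        // 1 = dead ahead, 0 = behind
    let frequencyScore = Double(candidate.visitCount) / 10.0
    return 0.7 * headingScore + 0.3 * frequencyScore  // assumed weights
}

let candidates = [
    CandidateDestination(name: "Hamburger Palace", bearingToDestination: 170, visitCount: 4),
    CandidateDestination(name: "Pot Sticker's Restaurant", bearingToDestination: 10, visitCount: 3),
]
let heading = 5.0 // the vehicle has driven past the turn toward Hamburger Palace
let ranked = candidates.sorted {
    likelihoodScore($0, vehicleHeading: heading) > likelihoodScore($1, vehicleHeading: heading)
}
print(ranked.map { $0.name }) // Pot Sticker's Restaurant is now listed first
```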
Much like in the example in the previous section for the search feature, the mapping application of some embodiments only synchronizes the two interfaces (mobile device and vehicle display) once the user behavior results in a selected destination. Figure 29 illustrates a user utilizing the recents feature on a mobile device in order to select a destination over three stages 2905-2915 of both the mobile device mapping application UI 205 and the high-quality touchscreen interface 565.
In the first stage 2905, both of the mapping application interfaces 205 and 565 display their respective standard map displays. In addition, in the first stage 2905, the user selects the search field 240 to initiate a search by the mapping application in the mobile device interface 205. The second stage 2910 illustrates that the mapping application vehicle interface 565 remains static while the mobile device interface 205 displays a recents list. As indicated above, in some embodiments when the user selects the search field of the mobile device interface, the application displays a list of recent destinations, routes, etc. until the user begins typing in a search query. The list of recents includes a previously requested route from a particular coffee shop to a mini-golf course as well as a previously searched address (rather than a route).
At the second stage 2910, the user selects the destination 123 Opal Cove from the recents list. The third stage 2915 illustrates the result of this selection in both of the mapping application interfaces. In the mobile device interface 205, the mapping application displays a pin at the selected address as well as an expandable banner 2920 for the result. This banner 2920 includes a selectable item 2925 that enables the user to command the application to enter turn-by-turn navigation with 123 Opal Cove as the destination. At the same time, in the vehicle interface 565, the mapping application displays the search results map view with a pin 2930 for 123 Opal Cove as a selected destination. In this case, the application displays the selected destination in the center of the map view in both UIs.
Figure 30 illustrates the corresponding action taken within the vehicle user interface 565 over three stages 3005-3015. The first stage 3005 displays the same mobile device UI 205 and vehicle user interface 565 for the mapping application as in the first stage of Figure 29. In this case, however, the user selects the recents control 572 in the vehicle user interface 565. As a result, at the second stage 3010, the mapping application outputs the recent destinations page 735 to the vehicle display. Whereas the mobile device interface displayed both a location (123 Opal Cove) and a route (Cool Coffee Shop to Putt-Putt Golf Course), the vehicle user interface recent destinations page 735 only displays locations (i.e., possible destinations for navigation). In some embodiments, the mapping application does not display recent routes in the vehicle user interface, instead only displaying locations within the recent destinations page. However, when the user has recently searched for route directions on the mobile device, some embodiments display the destination of the route in the recents list as a possible destination. Some embodiments display these destinations as options irrespective of whether the searched route start point was the user's current location or a different location.
At the second stage 3010, the user selects the destination 123 Opal Cove from the recent destinations page 735. The third stage 3015 illustrates that the result of this selection in both of the mapping application interfaces is the same as when the destination was selected from the recents list on the mobile device. Specifically, the vehicle interface displays the search results map view with 123 Opal Cove as the selected destination, while at the same time the mobile device interface displays a pin at the selected destination as well as the expandable banner 2920.
V. PREDICTIVE ROUTING INTERFACE
As mentioned in the previous section, the mapping application of some embodiments uses a predictive routing engine, which may be a part of the mapping application or operating separately from the mapping application on the mobile device. In some embodiments, the predictive routing engine can (1) formulate predictions about current or future destinations and/or routes to such destinations and (2) relay information to the user of the device about the predictions. In some embodiments, the predictive routing engine includes a machine-learning engine that facilitates the formulation of predicted future destinations and/or routes based on stored, user-specific data. The user-specific data may include any combination of the following in various embodiments: (1) previous destinations to which the user has traveled with the device, (2) previous routes taken by the user with the device, (3) locations and times of events stored in scheduling applications (e.g., electronic calendar, electronic ticket repository, etc.), and (4) addresses parsed from e-mails and/or messages sent to or from the user.
In some embodiments, when the mobile device is connected to a vehicle display system, the mapping application can output a series of predicted destinations and routes to those destinations when requested by the user. Through this interface, the user can view the different destinations and routes, and enter navigation towards one of the destinations if desired.
Figure 31 illustrates the use of this predictive routing interface on the vehicle display over four stages 3105-3120 in which the user activates the interface and views several different predicted destinations. The first stage 3105 illustrates the high-quality vehicle touchscreen interface 565 of some embodiments. At this stage, the user selects the predictive routing control 574 to activate a predictive routing feature.
The second stage 3110 illustrates the predictive routing user interface 755 of some embodiments. The predictive routing UI 755 displays, in order of likelihood as computed by the predictive routing engine, one or more likely destinations and routes to those destinations. As described above by reference to Figure 7, the predictive routing UI 755 displays, for each predicted destination, a first panel 760 with information about the destination and a second panel 765 with a map view of a route to the destination.
In some embodiments, the information panel 760 displays a name for a current destination/route as well as metadata about the destination or route. This name may be based on a destination name identified from contact information, a calendar appointment name, a business name, an address, a primary road used by the route, etc. In addition, the displayed metadata may include an estimated arrival time, the source of the route for the predictive routing engine (i.e., how the predictive routing engine determined the destination as a likely end point for the current trip), whether the destination is a frequent or less traveled destination for the device (and therefore the user of the device), etc.
The map view 765, in some embodiments, displays a map with the current location of the device, the predicted destination, and a route generated from the current location to the predicted destination. In some embodiments, the mapping application factors in traffic information to select an optimal route to the destination, and may warn a user if a typically-traveled route has worse traffic than usual, or if the user will be late to an appointment due to traffic.
As shown in the second stage 3110, the predictive routing engine has determined that the user's most likely destination is her Aunt Sally's house. The information panel 760 for this destination includes a name for the destination (Aunt Sally's), an estimated time of arrival (2:30), and metadata indicating that this is a frequent destination for the user. Therefore, because the user regularly travels to her Aunt Sally's home, and is on a path regularly taken for that destination, the predictive routing engine concludes that this is a likely destination for the user's current travel. As shown, the information panel 760 displays a selectable control 3130 for canceling the predictive routing feature and returning to the standard map exploration view. In some embodiments, selecting the control 3130 removes the currently predicted route from the set of routes, causing a route for a different predicted destination to appear in the display. In this case, each predicted route has its own selectable cancellation control.
The map view 765 displays a route from the current user location to Aunt Sally's house. In addition, the predictive routing user interface 755 includes a predictive routing information bar 3125. The predictive routing information bar 3125 includes a set of circles that indicate (1) the number of predicted likely destinations and (2) which of these is currently displayed in the interface 755. Currently, the application displays three circles 3135 to indicate that there are three available predicted destinations. Furthermore, the first of these circles is filled in to indicate that the application currently is displaying information about the first (and therefore most likely according to the predictive routing engine) of these destinations. Because additional predicted destinations are available, the application displays a portion of the next destination at the right edge of the display screen. In some embodiments, when only one destination is predicted by the predictive routing feature, the application does not display the circles 3135, and in some such embodiments, eliminates the information bar 3125 altogether.
In some embodiments, the application displays the predicted routes in a filmstrip view, with two conceptual frames for each route. The first frame, or panel, for a route displays the metadata (e.g., name of destination, other data) and the second frame displays the map view. In the filmstrip view, the metadata panel for the second route comes after the map panel for the first route. The second stage 3110 additionally shows a condensed view 3140 of such a filmstrip, with the primary display of this stage illustrating the actual display of a portion of that filmstrip within the display screen. As shown, the condensed view illustrates three pairs of frames, each pair including a route metadata panel and a route map panel.
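The filmstrip arrangement can be modeled as a flat sequence of frames, two per predicted route. The following Swift sketch only illustrates that structure; the Frame and PredictedRoute types are assumptions and do not reflect the application's actual classes.

```swift
// Sketch of the filmstrip arrangement: each predicted route contributes a
// metadata frame and a map frame. Types are illustrative assumptions.
enum Frame {
    case metadata(destination: String, eta: String, note: String)
    case map(destination: String)
}

struct PredictedRoute {
    let destination: String
    let eta: String
    let note: String
}

func filmstrip(for routes: [PredictedRoute]) -> [Frame] {
    routes.flatMap { route in
        [Frame.metadata(destination: route.destination, eta: route.eta, note: route.note),
         Frame.map(destination: route.destination)]
    }
}

let routes = [
    PredictedRoute(destination: "Aunt Sally's", eta: "2:30", note: "Frequent destination"),
    PredictedRoute(destination: "Uncle Harry's", eta: "2:40", note: "Calendar event"),
    PredictedRoute(destination: "Dentist", eta: "2:43", note: "Appointment today"),
]
// Six frames total; the metadata frame for the second route follows the map
// frame for the first, as in the condensed view 3140.
print(filmstrip(for: routes).count)
```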
In addition, at this second stage 3110, the user performs a left swipe gesture on the vehicle touchscreen display. In some embodiments, the user can utilize such gestures (or similar touch gestures) in order to select different predicted destinations. Thus, as a result, the third stage 3115 displays the second of the predicted destinations, Uncle Harry's, and a route to this predicted destination. In some embodiments, the application animates this transition by sliding the first destination information off to the left while concurrently sliding the information for the second destination onto the display. This simulates the sliding of the filmstrip shown in condensed view 3140 leftwards in order to show the second pair of panels in the display.
As shown in the information panel 760, this new destination is the user's Uncle Harry's house, for which the estimated arrival time is 2:40. The metadata also indicates that the destination is a rare destination for the user, but is based on a calendar event at the location. As shown in the information bar 3125, the second of the three circles, rather than the first, is now filled in. Furthermore, the cancellation control 3130 remains within the information panel 760.
Furthermore, at the third stage 3115 the user again performs a leftward swipe gesture. As shown at the fourth stage 3120, this again causes the application to slide the information for the second destination to the left and indicate a third predicted destination. This third destination has a name of Dentist, an ETA of 2:43, and is a rare destination for the user. In addition, the metadata indicates that the user has an appointment at the dentist today. As an example, the user might have scheduled a dentist appointment for 4:00, and also added a calendar event to see her uncle at 3:00. In addition, the third of the circles 3135 is now filled in.
The previous figure illustrated the predictive routing interface for the high-quality touchscreen user interface of some embodiments. In some embodiments, the low-quality touchscreen interface and the non-touchscreen interface have similar predictive routing features. Figure 32 illustrates the same scenario as the previous Figure 31, displayed in the low-quality touchscreen interface of some embodiments, over four stages 3205-3220. The first stage 3205 illustrates the low-quality vehicle touchscreen interface 570, with the user selecting the predictive routing control 584 to open the predictive routing feature.
The second stage 3210 illustrates the low-quality touchscreen predictive routing interface 3200. For the most part, the low-quality touchscreen interface 3200 is the same as the high-quality touchscreen interface 755. However, the interface includes additional right and left arrow controls 3225 and 3230 within the information bar 3240, at the far ends of the bar. The circles 3235 representing the different predicted destinations are located in this information bar 3240, as in the high-quality touchscreen example. Because low-quality touchscreens often have high touch latency, they may have trouble detecting the swipe gestures used to select between different predicted destinations. Thus, the low-quality touchscreen interface of some embodiments includes the arrow controls 3225 and 3230 for selecting between different predicted destinations. In addition, the information bar 3240 is larger in the low-quality touchscreen situation so that the touch buffer around the arrows 3225 and 3230 can be made larger.
However, as shown in this figure, the low-quality predictive routing interface 3200 of some embodiments still accepts swipe gestures to change the displayed predicted destination. Thus, the second stage 3210 illustrates a user performing a leftward swipe gesture, and the third stage 3215 illustrates that the interface now displays the second of the predicted likely destinations (as in the third stage 3115 of Figure 31). At this stage, rather than swiping, the user selects the right arrow control 3230. As shown at the fourth stage 3220, the selection of the right arrow causes the same action in the interface as a leftward swipe gesture, as the mapping application now displays the third predicted destination in the interface 3200.
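A hedged sketch of how such a decision might be expressed is shown below in Swift. The DisplayProfile type, the 100 ms latency cutoff, and the bar heights are assumptions chosen only to illustrate the idea of exposing arrow controls (while still accepting swipes) on high-latency screens.

```swift
// Sketch of choosing paging controls based on the connected display.
// DisplayProfile, the latency cutoff, and the bar heights are assumptions.
struct DisplayProfile {
    let isTouchscreen: Bool
    let touchLatencyMs: Int
}

struct PagingControls {
    let acceptsSwipe: Bool
    let showsArrowButtons: Bool
    let informationBarHeight: Int // a taller bar leaves a bigger touch buffer
}

func pagingControls(for display: DisplayProfile) -> PagingControls {
    guard display.isTouchscreen else {
        // Non-touch displays page with knobs, joysticks, trackballs, etc.
        return PagingControls(acceptsSwipe: false, showsArrowButtons: false,
                              informationBarHeight: 40)
    }
    if display.touchLatencyMs > 100 { // assumed cutoff for a "low-quality" screen
        // High-latency screens keep swipe support but also show arrow buttons.
        return PagingControls(acceptsSwipe: true, showsArrowButtons: true,
                              informationBarHeight: 60)
    }
    return PagingControls(acceptsSwipe: true, showsArrowButtons: false,
                          informationBarHeight: 40)
}

print(pagingControls(for: DisplayProfile(isTouchscreen: true, touchLatencyMs: 150)))
```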
The predictive routing feature is also available in the non-touchscreen vehicle interface of the mapping application. In some embodiments, the display for non-touchscreens appears the same as the high-quality touchscreen predictive routing interface, with the exception of the cancel button 3130. In order to select between predicted destinations, the user shifts a joystick left or right, turns a knob, rotates a trackball, etc. Because the cancel button is not available, the user can exit the predictive routing feature by using a back button or similar built-in vehicle control in order to return to the standard map view interface.
The above figures illustrate the predictive routing interface of some embodiments. In the example, three predicted destinations are shown in an order of decreasing likelihood of being the user's actual destination. In some embodiments, the predictive routing engine regularly updates the likelihoods of the various possible destinations, and can add destinations to the display, remove destinations from the display, or modify the order of the destinations.
Figure 33 illustrates an example of the predictive routing engine reordering the likelihood of various destinations, and the subsequent effects in the predictive routing interface 755 over four stages 3305-3320. The first stage 3305 is the same as the second stage 3110 of Figure 31, with the predictive routing interface 755 displaying Aunt Sally's as the most likely destination based on various factors, and two additional destinations available for the user to view. In the second stage 3310, the user has traveled towards an intersection at which she would turn right to travel to Aunt Sally's. At this point, the user interface is the same.
At the third stage 3315, the user has reached the previously mentioned intersection and turned left, away from the route to Aunt Sally's. Thus, when the predictive routing engine updates its likelihood calculations, it determines that Uncle Harry's is now a more likely destination than Aunt Sally's based on all of its collected information. Specifically, the predictive routing engine determines that Aunt Sally's is no longer a relevant destination, because the user is now driving away from that destination. Thus, as shown at this stage, the view for Aunt Sally's as a destination begins sliding to the left off of the screen. The fourth stage 3320 illustrates that the predictive routing interface 755 now displays the view for Uncle Harry's as the destination, because this is the most likely destination as calculated by the predictive routing engine. In addition, because the application has determined that Aunt Sally's is no longer a relevant destination, the application only has two possible destinations, and modifies the circles 3325 accordingly.
In some embodiments, if Aunt Sally's still remained a viable, relevant destination, the mapping application would not change the displayed destination even if a different destination (e.g., the dentist) became more likely. However, because Aunt Sally's was no longer a relevant route after the user turned left at the intersection (a bifurcation point), the application removed the route from the display. In some embodiments, if only one route was originally predicted, and the predictive routing engine then determines the route to not be relevant, the application would return to the map exploration user interface.
In some embodiments, the mapping application only changes the display within the context of the normal display of the interface. That is, the application does not suddenly switch the display order of the three destinations to match the newly calculated order, but instead slides the first destination off of the screen and slides the more likely destination into the view. However, in such embodiments, if the predictive routing engine determined that, rather than the dentist as a third most likely option, a different destination was more likely, then some embodiments would modify the user interface such that when the user performed a swipe gesture to view the second destination (Uncle Harry's), the new third destination would appear on the right side of the display (because this change could be performed off-screen and therefore would not be visually jarring). However, when the application determines that a route partially shown on the display screen (e.g., Uncle Harry's in the first stage 3305) is no longer relevant at all, some embodiments will change the route in this panel, even though it may be a jarring change to the display, so as to avoid displaying information for which the user is not likely to have a use.
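The update rule described in the preceding two paragraphs can be approximated as follows. This Swift sketch is an interpretation of the text rather than the application's actual logic: the on-screen prediction is preserved unless it has become irrelevant, while off-screen predictions take the newly calculated order.

```swift
// Sketch of the display-update rule: keep the on-screen prediction unless it
// has become irrelevant; otherwise adopt the recalculated order. This is an
// interpretation of the text, not the application's actual logic.
struct Prediction {
    let name: String
    var isRelevant: Bool
}

func updateDisplayedPredictions(current: [Prediction],
                                recalculated: [Prediction],
                                displayedIndex: Int) -> [Prediction] {
    let displayed = current[displayedIndex]
    if displayed.isRelevant,
       recalculated.contains(where: { $0.name == displayed.name }) {
        // Keep the on-screen prediction in place; everything else quietly
        // takes the new order off-screen.
        var result = recalculated.filter { $0.name != displayed.name }
        result.insert(displayed, at: min(displayedIndex, result.count))
        return result
    }
    // The displayed route is no longer relevant (e.g., the driver turned away
    // at a bifurcation point), so the new most likely route slides in.
    return recalculated
}

let before = [Prediction(name: "Aunt Sally's", isRelevant: false),
              Prediction(name: "Uncle Harry's", isRelevant: true),
              Prediction(name: "Dentist", isRelevant: true)]
let after = [Prediction(name: "Uncle Harry's", isRelevant: true),
             Prediction(name: "Dentist", isRelevant: true)]
// Aunt Sally's is no longer relevant, so the updated display leads with
// Uncle Harry's: ["Uncle Harry's", "Dentist"].
print(updateDisplayedPredictions(current: before, recalculated: after,
                                 displayedIndex: 0).map { $0.name })
```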
Figure 34 illustrates another example of changes made by the predictive routing engine to the order of likely destinations over six stages 3405-3430 in which the user, while traveling, exits the predictive routing interface 755 and then returns at a later time to the predictive routing interface. The first stage 3405 illustrates the predictive routing interface 755 in the same state as the second stage 3110 of Figure 31, with Aunt Sally's as the displayed most likely destination and Uncle Harry's as the second most likely destination. At this stage, the user selects the cancellation control 3130 in order to leave the predictive routing interface.
As a result, the second stage 3410 illustrates the high-quality vehicle touchscreen interface 565 of some embodiments. The third stage 3415 illustrates this interface six minutes later as the user approaches an intersection. This intersection represents a decision point for the predictive routing engine, because a right turn takes the user towards Aunt Sally's house while a left turn takes the user towards Uncle Harry's house. As shown at the fourth stage 3420, five minutes later, the user has taken a right turn at the intersection.
Also at this fourth stage 3420, the user selects the predictive routing control 574 to reenter the predictive routing interface 755, shown at the fifth stage 3425. At this stage, because the user turned right and remains along the optimal route towards Aunt Sally's, the calculated most likely destination (and therefore the first destination displayed in the predictive routing interface) is Aunt Sally's. However, as a result of turning to the right at the intersection, Uncle Harry's is no longer the second predicted destination. In fact, as shown by the circular indicators 3435, the mapping application only receives two destinations to display from the predictive routing engine. In some embodiments, only destinations that meet a minimum threshold for likeliness are displayed in the predictive routing interface. In the fifth stage 3425, the user also performs a leftward swipe gesture. As a result, the sixth stage 3430 illustrates the second predicted route, to the dentist.
In the above examples, the different predicted destinations and associated routes are arranged in a filmstrip view. As the user performs a swipe gesture (or selects a scroll arrow, rotates a knob or other non-touchscreen control, etc.), the application animates the sliding of the conceptual filmstrip through the display screen. Figure 35 illustrates this animation over three stages 3505-3515. The first stage 3505 is the same as the second stage 3110 of Figure 31, and the third stage 3515 is the same as the third stage 3115 of that figure (without the additional swipe gesture shown in the third stage 3115 of Figure 31). The second stage 3510 illustrates the animation of some embodiments to transition between these stages. Specifically, at this stage, the information panel and part of the map view for the first predicted route have slid off of the display, while the entirety of the information panel and part of the map view for the second predicted route have slid onto the display.
In some embodiments, a different arrangement and animation of the predicted routes is used. Rather than arranging the displays for the predicted routes one after another in a filmstrip, some embodiments conceptually treat the displays for different routes as cards stacked on top of each other. Figure 36 illustrates the animation between two predicted route displays for this type of arrangement. As shown at the first stage 3605, in some embodiments, the application does not display a portion of the information for the second predicted route while showing the view for the first route. While this is shown for the stacked-card arrangement, one of ordinary skill will recognize that some embodiments that use the filmstrip arrangement will also not display any portion of the information panel for the next predicted route. As in the previous figure, at this stage the user performs a swipe gesture to cause the application to display the second predicted route.
The second stage 3610 illustrates that, rather than animating the display for the second predicted route sliding onto the screen as the first predicted route display slides off of the screen, in this example the application reveals the second predicted route display underneath as the first predicted route display slides off. In the third stage 3615, the animation has completed and the second predicted route display occupies the entire screen.
As with many of the other features described above, the predictive routing feature enables the user to quickly and easily find a desired destination and enter a turn-by-turn navigation mode towards that destination. While the illustrated interface for the predictive routing feature does not include a particular control for entering navigation, in some embodiments when the mapping application receives a tap gesture within the display, the application enters the turn-by-turn navigation mode towards the currently shown destination. In different embodiments, this tap input may be located within the map view for a destination, within the information panel for the destination, or within either of these areas.
Figure 37 illustrates this feature of the predictive routing interface of some embodiments for entering the turn-by-turn navigation mode of the mapping application over two stages 3705-3710. The first stage 3705 illustrates the predictive routing interface 755 of some embodiments, as shown in the previous figures. In this stage, a user performs a tap gesture over the map view panel 765 with Aunt Sally's as the currently displayed destination. As shown at the second stage 3710, based on this gesture, the mapping application has entered its turn-by-turn navigation mode for a route heading to Aunt Sally's. In other embodiments, the application displays a specific selectable option for entering turn-by-turn navigation, and a tap gesture over the display instead returns the display to the map exploration user interface. The turn-by-turn navigation mode of some embodiments will be described in further detail in the following Section VI.
In the above examples, the predictive routing user interface displays a series of predicted destinations, as well as routes to those destinations. In some embodiments, when the user selects the predictive routing control, the application displays on the vehicle screen traffic information in the vicinity of the vehicle. For instance, if the vehicle is traveling on a freeway, some embodiments indicate whether the traffic is normal, lighter than usual, or heavier than usual on the freeway. Some embodiments display such a vicinity-based information panel first, then the user can swipe to view the above-described destination-based panels.
Figure 38 conceptually illustrates a simplified software architecture of a mapping and navigation application 3800 of some embodiments that operates on a mobile device and performs the predictive routing features discussed above. As shown, the mapping application 3800 includes search functions 3805, a destination generator 3810, a route generator 3815, and additional mapping and navigation functions 3820. In addition, the mapping application stores a mapping application recents database 3825, and the mobile device stores harvested system data 3830.
The mapping application recents database 3825 stores, in some embodiments, data about recent searches performed by the mapping application (e.g., by the search functions 3805). For example, in the above examples, when a search result yields a specific destination, the application stores that destination in the recents database 3825. When the search is performed over a particular category (e.g., coffee shops), some embodiments store the category in the mapping application recents database 3825 as well.
The harvested system data 3830, in some embodiments, is not stored specifically by the mapping application. Instead, an address harvester that runs separately on the mobile device (e.g., within the operating system) stores various data within the harvested system data 3830. In some embodiments, the address harvester stores phone numbers, e-mail addresses, and physical addresses in the harvested system data 3830. In some embodiments, the phone numbers are harvested from phone calls and/or text messages, and the e-mail addresses are harvested from e-mails and/or text messages. In addition, the data may include groups of phone numbers and/or groups of e-mail addresses that were used to send messages or e-mails to several recipients at once.
The harvested physical addresses may come from numerous different sources in some embodiments. For instance, e-mails and text messages may include physical addresses that a harvester recognizes by their formatting and stores in the database 3830. Scheduled events (e.g., from an electronic calendar or an electronic ticket repository) may include a location and the system data 3830 stores this information (as well as times associated with the events, in some embodiments). In some embodiments, a web browser running on the mobile device allows a user to select addresses (e.g., of a place of business that the user looks up in the web browser) and add those addresses to the data store 3830. In some embodiments, the harvested system data 3830 includes contacts information from the user's address book, which may include phone numbers, e-mail addresses, and physical addresses.
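The kinds of records such a harvester might keep are sketched below. The Swift types, enumeration cases, and sample entries are illustrative assumptions rather than the device's actual data schema.

```swift
import Foundation

// Sketch of the kinds of records a system-level address harvester might
// store. Types, cases, and sample entries are illustrative assumptions.
enum Source {
    case emailMessage, textMessage, calendarEvent(date: Date), webBrowser, contacts
}

enum HarvestedAddress {
    case phoneNumber(String)
    case email(String)
    case physical(street: String, source: Source)
}

struct HarvestedSystemData {
    var records: [HarvestedAddress] = []
    mutating func add(_ record: HarvestedAddress) { records.append(record) }

    // Physical addresses are the records the mapping application's search and
    // prediction features draw on.
    var physicalAddresses: [String] {
        records.compactMap { record -> String? in
            if case let .physical(street, _) = record { return street }
            return nil
        }
    }
}

var store = HarvestedSystemData()
store.add(.physical(street: "1018 Chico Ave", source: .textMessage))
store.add(.email("jerome.smith@example.com"))
print(store.physicalAddresses)
```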
The harvesting of both telecommunications addresses (e.g., e-mail addresses and phone numbers) and physical addresses is described in further detail in U.S. Provisional Application 61/832,850 and U.S. Patent Application 14/081,850, entitled "Harvesting Addresses". U.S. Provisional Application 61/832,850 and U.S. Patent Application 14/081,850 are both incorporated by reference above.
The mapping and navigation functions 3820 perform a wide variety of mapping and/or navigation processes for the mapping application. These functions may include identifying map tiles from which to generate map displays and retrieving the map tiles from a remote server, matching a device location to a location on a map, retrieving route information from a server, generating text and/or graphical navigation instructions for a route based on the retrieved route information, generating one or more user interfaces for the application, and other functions.
The search functions 3805, in some embodiments, perform searches as described above by reference to Figure 23. The search functions 3805 initially receive a search request (i.e., a set of search terms) and attempt to find one or more matching results (i.e., locations) within the physical addresses stored in the harvested system data. For example, if a user has received an e-mail with an address on a particular street, the user can search for the street name on its own and the search functions 3805 will return the more specific address on that street by using the harvested system data 3830. When the search functions 3805 cannot find a result in the harvested system data 3830, the search functions 3805 use a search engine to perform the search in some embodiments. Once search results are retrieved from either source, the search functions 3805 store the search results to the mapping application recents database 3825 in some embodiments. Thus, when the user requests recent locations, the application uses the recents database 3825 to populate the fields in the requesting user interface. In addition, the search results are sent to the mapping and navigation functions 3820 for output to one or more user interfaces.
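The fallback behavior of the search functions can be pictured with the following Swift sketch. The RemoteSearchEngine protocol, the StubEngine type, the sample address, and the substring-matching rule are assumptions used only to show the order of operations: harvested addresses first, a search engine second, with results recorded to the recents store.

```swift
// Sketch of the search fallback: harvested addresses first, then a remote
// search engine, with results recorded to the recents store. The protocol,
// the stub engine, and the substring match are illustrative assumptions.
protocol RemoteSearchEngine {
    func search(_ query: String) -> [String]
}

struct StubEngine: RemoteSearchEngine {
    func search(_ query: String) -> [String] { ["Result from server for \(query)"] }
}

struct SearchFunctions {
    let harvestedAddresses: [String]
    let engine: RemoteSearchEngine
    var recents: [String] = []

    mutating func search(_ query: String) -> [String] {
        // A partial street name can match a fuller harvested address.
        let local = harvestedAddresses.filter {
            $0.lowercased().contains(query.lowercased())
        }
        let results = local.isEmpty ? engine.search(query) : local
        recents.insert(contentsOf: results, at: 0) // newest results listed first
        return results
    }
}

var functions = SearchFunctions(harvestedAddresses: ["123 Opal Cove, Emerald City"],
                                engine: StubEngine())
print(functions.search("Opal Cove")) // resolved locally from harvested data
print(functions.search("coffee"))    // falls back to the search engine
```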
The destination generator 3810 of some embodiments uses travel history of the device, harvested physical locations from the harvested data 3830 (e.g., contact information, addresses received in messages or e-mails within a particular recent time period, locations of scheduled events, etc.), and/or locations from the mapping application recents database 3825, along with additional inputs such as the system clock (to retrieve the current time) and a location identification engine (to retrieve a current location). The destination generator 3810 uses this data to identify predicted destinations to which it expects the device (and thus the user) to travel. The mapping application can then retrieve routes to the predicted destinations from the current location (e.g., from a directions server that generates routes from one location to another), and use the predicted destinations and associated routes to populate various user interface features. For instance, the predictive routing interface for the vehicle display described in this section uses these predicted destinations and routes to the destinations in some embodiments.
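As one simplified illustration of this data flow, the Swift sketch below predicts destinations from visit history around the current hour and appends harvested and recent locations. The Visit type, the one-hour window, and the ordering rule are assumptions; the actual destination generator relies on machine learning rather than this simple heuristic.

```swift
// Sketch of inputs to the destination generator and a deliberately simple
// prediction rule (destinations visited most often around the current hour).
struct Visit {
    let destination: String
    let hourOfDay: Int
}

func predictDestinations(history: [Visit],
                         harvestedLocations: [String],
                         recents: [String],
                         currentHour: Int) -> [String] {
    // Count past visits that happened around the current time of day.
    var counts: [String: Int] = [:]
    for visit in history where abs(visit.hourOfDay - currentHour) <= 1 {
        counts[visit.destination, default: 0] += 1
    }
    let fromHistory = counts.sorted { $0.value > $1.value }.map { $0.key }
    // Harvested locations (e.g., calendar events) and recent searches fill out
    // the list behind the history-based predictions.
    return fromHistory + harvestedLocations + recents
}

let history = [Visit(destination: "Aunt Sally's", hourOfDay: 14),
               Visit(destination: "Aunt Sally's", hourOfDay: 15),
               Visit(destination: "Gym", hourOfDay: 7)]
print(predictDestinations(history: history,
                          harvestedLocations: ["Uncle Harry's"],
                          recents: ["Dentist"],
                          currentHour: 14))
```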
In some embodiments, the route generator 3815 creates sets of associated destinations, with each set specified as a possible route for traveling. In some embodiments, each set of associated destinations includes start and end locations, a number of locations in between the start and end locations, and a number of motion records specifying rate of travel (e.g., between the locations). In some embodiments, the predicted route generator uses a machine-learning function to stitch together previously unrelated destination, location, and motion records into contiguous sets that specify potential routes. As with the output of the destination generator 3810, the routes generated by the route generator 3815 may be used by the mapping application of some embodiments to output useful prediction information (e.g., to identify that a user is likely to be following a particular route soon and warn the user about traffic along that route). The destination and route prediction functions of some embodiments are described in further detail in U.S. Provisional Application 61/832,928 and U.S. Patent Application 14/081,899, entitled "Warning for Frequently Traveled Trips Based on Traffic", which are both incorporated by reference above.
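The stitching step can be approximated with a simple time-gap rule, sketched below in Swift. The LocationRecord type, the fifteen-minute gap threshold, and the sample records are assumptions for illustration; the route generator described above uses a machine-learning function rather than this rule.

```swift
// Sketch of stitching location records into contiguous candidate routes with
// a simple time-gap rule. Types, the gap threshold, and sample records are
// assumptions; the actual route generator uses machine learning.
struct LocationRecord {
    let place: String
    let timestamp: Double // seconds from an arbitrary reference point
}

// Records separated by less than the gap threshold are treated as part of the
// same trip; a larger gap starts a new candidate route.
func stitchRoutes(from records: [LocationRecord],
                  maxGapSeconds: Double = 15 * 60) -> [[LocationRecord]] {
    var routes: [[LocationRecord]] = []
    var current: [LocationRecord] = []
    for record in records.sorted(by: { $0.timestamp < $1.timestamp }) {
        if let last = current.last, record.timestamp - last.timestamp > maxGapSeconds {
            routes.append(current)
            current = []
        }
        current.append(record)
    }
    if !current.isEmpty { routes.append(current) }
    return routes
}

let records = [
    LocationRecord(place: "Home", timestamp: 0),
    LocationRecord(place: "Coffee Shop", timestamp: 600),
    LocationRecord(place: "Office", timestamp: 1500),
    LocationRecord(place: "Office", timestamp: 30000), // hours later: a new trip
    LocationRecord(place: "Home", timestamp: 30600),
]
// Two candidate routes: Home -> Coffee Shop -> Office, and Office -> Home.
print(stitchRoutes(from: records).map { $0.map { $0.place } })
```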
In addition to the mapping application functions described above, the harvested system data may be used for other features of the mobile device. For instance, in some embodiments the harvested system data 3830 can be used by the e-mail or messaging functions to quickly identify a recipient for a message. This may be especially useful for the vehicle interface, which in some embodiments performs messaging through dictation, without any typing. Figure 39 illustrates the vehicle display screen over four stages 3905-3920 in which a user activates a messaging function and dictates a message to a recipient. The first stage 3905 illustrates the touchscreen home page 550 of some embodiments. In this case, the user selects the messaging control 564, in order to activate a messaging (e.g., SMS) feature of the mobile device.
The second stage 3910 illustrates a messaging interface 3900 of some embodiments. The messaging interface 3900 includes a prompt 3925 (which may also be output as audio via the vehicle speakers), a cancel control 3930 for closing the messaging interface without sending a message, and a done control 3935 for indicating to the messaging application that dictation has completed. In some embodiments, unlike the search function of the mapping application, the messaging application uses a more interactive voice recognition feature, in which the answers to a first prompt may be used to inform recognition of answers to subsequent prompts. In the second stage 3910, the messaging feature requests a recipient for a message, and the user dictates "Jerome".
The third stage 3915 illustrates that the messaging feature uses the harvested data of some embodiments to identify a telecommunications address (e.g., phone number, e-mail) to which to send the message. In this case, although the user does not have any contacts named Jerome, a text message had previously been received correlating the name Jerome Smith to a phone number. Thus, the messaging interface 3900 now requests that the user dictate a message to Jerome Smith. As shown by the highlighting 3940, the recipient is a selectable item in some embodiments, which the user can select in order to change recipients for the message. Also, at this stage, the user dictates a message, "Be there in ten". The fourth stage 3920 illustrates the resulting message, with a control 3945 for allowing the user to send the message.
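A rough Swift sketch of resolving a dictated name against harvested data, in the spirit of the Jerome example above, follows. The HarvestedPerson type, the name-matching rule, and the sample phone number are illustrative assumptions.

```swift
// Sketch of resolving a dictated recipient against harvested data when no
// matching contact exists, as in the Jerome example above.
struct HarvestedPerson {
    let fullName: String
    let phoneNumber: String
}

func resolveRecipient(dictated: String,
                      contacts: [HarvestedPerson],
                      harvested: [HarvestedPerson]) -> HarvestedPerson? {
    let query = dictated.lowercased()
    let firstMatch: ([HarvestedPerson]) -> HarvestedPerson? = { people in
        people.first { $0.fullName.lowercased().contains(query) }
    }
    // Prefer the address book, then fall back to names harvested from messages.
    return firstMatch(contacts) ?? firstMatch(harvested)
}

let harvested = [HarvestedPerson(fullName: "Jerome Smith", phoneNumber: "555-0142")]
if let recipient = resolveRecipient(dictated: "Jerome", contacts: [], harvested: harvested) {
    print("Sending message to \(recipient.fullName) at \(recipient.phoneNumber)")
}
```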
VI. NAVIGATION MODE
As described above, the mapping application vehicle user interface of some embodiments is designed for the user to quickly and easily identify a destination for entering turn-by-turn navigation. When the mobile device on which the mapping application operates is connected to the vehicle display system, in some embodiments the mapping application outputs different views for navigation to the two different interfaces. Specifically, the mapping application outputs a map view from above and/or behind the vehicle to the vehicle display screen, while outputting only navigation instructions to the mobile device display screen.
Figure 40 illustrates the operation of the turn-by-turn navigation user interface of some embodiments for the vehicle display over six stages 4005-4030 in which the vehicle approaches an intersection and a user follows navigation instructions to perform a maneuver at the intersection. The first stage 4005 illustrates the search results map view page 852, with a particular search result and possible destination selected. In addition, at this stage, the user selects the control 880 to enter turn-by-turn navigation.
The second stage 4010 illustrates that the mapping application has entered turn-by-turn navigation mode, and now displays the navigation mode user interface 885 of some embodiments on the vehicle display screen. As described above, the navigation user interface 885 includes the lower information bar, a navigation map 890, a 3D control 895, and a direction list control 897. The navigation user interface 885 additionally includes a navigation information bar 4050. In some embodiments, this bar is displayed in such a way as to appear similar to a street sign.
The 3D control 895 enables a user to toggle the navigation map 890 between 3D and 2D animation, while the direction list control 897 enables the user to view a list of instructions for upcoming maneuvers along the route being traveled. As shown, some embodiments automatically begin navigation with the 3D control 895 selected and the navigation map in three dimensions. Other embodiments begin navigation in two dimensions by default rather than the three dimensional view. In some embodiments, whether to begin in 2D or 3D is a user-selectable option within the mapping application. In addition, some embodiments animate the transition from the map view shown in the first stage 4005 to the navigation view shown in the second stage 4010.
The navigation information bar 4050 includes information about the next maneuver along the navigated route as well as route summary information. Specifically, the route summary information of some embodiments includes the estimated arrival time (2:18), distance remaining in the route (1.2 miles), and time remaining for the route (4 minutes). The summary of the next maneuver along the route includes a graphical representation of the maneuver and intersection, a distance to the next maneuver that counts down, and text instructions describing the maneuver. This maneuver summary information will be described additionally below by reference to the navigation instruction panel 4035, shown in subsequent stages of this figure.
As the vehicle to which the mobile device running the mapping application is connected approaches a maneuver, the application of some embodiments slides a navigation instruction panel into the navigation user interface 885 in order to alert the driver of the upcoming maneuver to perform. The third stage 4015 illustrates that as the vehicle approaches the intersection where the route turns right, a navigation instruction panel 4035 begins to slide onto the display, as represented by the right-facing arrows (which are not actually displayed within the user interface). The mapping application of some embodiments animates the introduction of this panel once the vehicle approaches within a threshold distance (e.g., 250 ft., 200 ft., 100 m, etc.) of the intersection at which the maneuver will be performed.
In some embodiments, the navigation instruction panels always appear on the driver's side of the display. Therefore, in countries that drive on the right side of the road, such as the United States (with the driver's seat on the left side of the car), the application places the panel on the left side of the display. On the other hand, in countries that drive on the left side of the road, such as the United Kingdom (with the driver's seat on the right side of the car), the application places the navigation instruction panel on the right side of the display.
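The placement and timing rules described in the two preceding paragraphs might be expressed as in the Swift sketch below; the enumerations, the 250-foot default threshold, and the function names are assumptions chosen to mirror the text rather than the application's actual code.

```swift
// Sketch of the two rules above: the instruction panel appears once the
// vehicle is within a threshold distance of the maneuver, and it is placed on
// the driver's side of the display.
enum PanelSide { case left, right }
enum DrivingSide { case rightHandTraffic, leftHandTraffic } // e.g., US vs. UK

func instructionPanelSide(for drivingSide: DrivingSide) -> PanelSide {
    switch drivingSide {
    case .rightHandTraffic: return .left  // driver sits on the left of the car
    case .leftHandTraffic:  return .right // driver sits on the right of the car
    }
}

func shouldShowInstructionPanel(distanceToManeuverFeet: Double,
                                thresholdFeet: Double = 250) -> Bool {
    return distanceToManeuverFeet <= thresholdFeet
}

print(instructionPanelSide(for: .rightHandTraffic))            // left, as in the US example
print(shouldShowInstructionPanel(distanceToManeuverFeet: 400)) // false
print(shouldShowInstructionPanel(distanceToManeuverFeet: 200)) // true
```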
The third stage 4015 also illustrates that the application no longer displays the navigation information bar 4050 in the UI. In some embodiments, the application either removes the information bar or animates out the information bar (e.g., by sliding the bar off of the display, fading out the bar, etc.) before sliding the information panel 4035 onto the screen (or otherwise animating its appearance). Because the next maneuver information is displayed in the information panel and the route summary information is not needed for the short period approaching a maneuver, the application can remove the bar for this period of time. In various other embodiments, the information panel 4035 slides over the information bar as the information bar animates off of the display, or the information bar shrinks so as to fit within the width of the display reserved for the map view 890.
The fourth stage 4020 illustrates the navigation user interface 885 after the animation to slide the navigation instruction onto the display has completed. The 3D map view at this stage has been compressed into a smaller region of the display. In some embodiments, the mapping application smoothly animates the compression of the navigation map 890, while sliding the 3D control 895 with the map.
The navigation instruction panel 4035 of some embodiments is designed to have the appearance of a road sign, such as one might see along the side of a freeway. Some embodiments use different colors, textures, or other aspects of the appearance of the navigation instruction panel to match the appearance of road signs in the location (e.g., country, state, etc.) of the vehicle. Within the panel, the application displays (1) a graphical instruction of the upcoming maneuver to perform and (2) a text instruction describing the maneuver.
In some embodiments, the mapping application generates, for each maneuver along a route, at least two different stylized graphical indicators. These include a complex graphical indicator that shows all roads entering and leaving the intersection as well as an arrow indicating the maneuver through the intersection, and a simpler graphical indicator that shows only an arrow indicating the maneuver (e.g., a right turn). In this case, the application uses the more complex graphical indicator in the navigation panel, as well as in the information bar 4050. The graphical maneuver indicators of some embodiments are described in greater detail in the U.S. Application 13/632,117, filed September 30, 2012, which is incorporated herein by reference.
The mapping application of some embodiments additionally generates several different sets of text instructions for each maneuver along a route based on information derived from the route. The application synthesizes various instruction elements based on route data, then combines these instruction elements in different ways to generate sets of text instructions. The application then uses one of the sets of text instructions based on the amount of space available, whether information is already displayed in a different form, and other factors. The text instructions of some embodiments are described in greater detail in the U.S. Application 13/632,117, incorporated by reference above.
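A minimal Swift sketch of selecting among pre-generated instruction variants based on available space follows. The ManeuverInstructions type, the character-count heuristic, and the sample strings are assumptions; the cited application describes the actual synthesis of instruction elements.

```swift
// Sketch of picking an instruction variant based on available space.
struct ManeuverInstructions {
    let fullText: String         // e.g., "Turn right onto First Street"
    let shortText: String        // e.g., "Right onto 1st St"
    let complexIndicator: String // shows all roads entering/leaving the intersection
    let simpleIndicator: String  // arrow only
}

func textInstruction(for maneuver: ManeuverInstructions,
                     availableCharacters: Int) -> String {
    return maneuver.fullText.count <= availableCharacters
        ? maneuver.fullText
        : maneuver.shortText
}

let turn = ManeuverInstructions(fullText: "Turn right onto First Street",
                                shortText: "Right onto 1st St",
                                complexIndicator: "right turn, cross streets shown",
                                simpleIndicator: "right-turn arrow")
print(textInstruction(for: turn, availableCharacters: 40)) // full text fits
print(textInstruction(for: turn, availableCharacters: 20)) // falls back to the short form
```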
The fifth stage 4025 illustrates the navigation UI 885 after the vehicle (and therefore the device) has proceeded through the intersection along the route, as specified by the navigation instruction panel 4035. At this point, the panel 4035 begins sliding back off of the display (as indicated by the leftward facing arrows, which are not displayed in the UI). As the instruction panel slides off of the display, the map view increases in size and the 3D control moves back to the left, with the opposite animation as that used for sliding the instruction panel into the display.
While not shown in this figure, in some embodiments the navigation instruction panel counts down the distance to the intersection at which the vehicle will perform the displayed maneuver. In the fourth stage 4020, the instruction panel displays 200 feet as the distance to the next intersection. Before removing the sign in the fifth stage 4025, the application changes this displayed number at various threshold distances from the intersection (e.g., 150 ft., 100 ft., 50 ft., 40 ft., 30 ft., 20 ft., and 10 ft., or a different combination of distances).
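The countdown behavior might be sketched as follows in Swift, with the displayed value snapping to the nearest threshold at or below the remaining distance. The threshold list mirrors the example values given above; the helper function itself is an illustrative assumption.

```swift
// Sketch of the countdown: the displayed distance snaps to the nearest
// threshold at or below the true remaining distance.
let countdownThresholdsFeet: [Double] = [200, 150, 100, 50, 40, 30, 20, 10]

func displayedDistance(forRemainingFeet remaining: Double) -> Double? {
    return countdownThresholdsFeet.first { $0 <= remaining }
}

print(displayedDistance(forRemainingFeet: 180) ?? 0) // 150.0
print(displayedDistance(forRemainingFeet: 45) ?? 0)  // 40.0
```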
The sixth stage 4030 illustrates the navigation UI 885 after the application has completed the removal of the instruction panel from the display. The navigation information bar 4050 has returned, and displays a summary of the new next maneuver. At this point, the navigation map view will follow along behind the indicator of the device location 4040 until the next intersection is approached, or the user selects one of the controls in the user interface (e.g., the 3D control or the list control).
When the user does select the direction list control during navigation, the mapping application of some embodiments animates a presentation of a list of maneuvers to perform for the currently presented route over the navigation view. Figure 41 illustrates the animation of the direction list of some embodiments over four stages 4105-4120. The first stage 4105 illustrates the navigation user interface 885, as described above by reference to Figure 40, with the 3D control 895, the direction list control 897, and the navigation information bar 4050. In this first stage, the user selects the direction list control 897 (with a tap input) to cause the application to display the direction list rather than the navigation map presentation.
The second stage 4110 illustrates that, as a result of the user selection, the application begins animating a direction list 4125 onto the vehicle display. As shown, in some embodiments, the application slides the direction list upwards over the map view from the bottom of the display screen. In other embodiments, the direction list may slide in from the left or right, from the top, or may appear on the screen via a different animation (e.g., an enlargement of the direction list control to the size of the full screen).
The third stage 4115 illustrates the display after the animation to display the direction list 4125 has fully completed. As shown, in some embodiments, the direction list begins with a starting location (1018 Chico Ave ...). When a name for the location is available (e.g., based on contact information indicating a particular person's house, map data indicating a place of business, etc.), some embodiments additionally display this name as part of the starting address. Next to the address, the application displays a pin to indicate that the entry in the direction list is a location, rather than a maneuver.
Below the starting location, the direction list includes an entry for each maneuver along the route. In some embodiments, the mapping application receives route data from a mapping server as a series of junctions through which the route travels. The mapping application identifies the junctions at which maneuvers will be performed, and generates graphical indicators and sets of instructions for these maneuvers (as described above). Within an entry in the direction list, some embodiments use the simplified graphical indicator for each maneuver (e.g., an arrow showing the maneuver that does not include the other roads at the intersection). In addition, each entry for a maneuver along the route includes a distance from the previous maneuver and a set of text instructions generated as described above.
In some cases, the entire direction list may not fit on the display screen. In order to view additional directions, the user can scroll through the list in some embodiments. The third stage 4115 illustrates a user performing an upward swipe gesture over the direction list 4125. As a result, the fourth stage 4120 illustrates that the application has scrolled the direction list to its end, such that the destination (and thus the end of the route) is now displayed. Some embodiments include two entries to indicate the destination. The second to last entry in the list includes an indicator of the end of the route, with information about the distance from the previous maneuver to the destination and information informing the user regarding on which side of the street the destination is located. The final entry includes name and address information for the destination, similar to the starting address information.
Because the user explores the direction list by using swipe gestures to scroll through the list, some embodiments include additional controls for scrolling up and down in the low-quality touchscreen interface. For instance, some embodiments display up and down arrows along the right or left side of the direction list that the user can select with a tap input in order to scroll the list by a particular amount (e.g., one entry, three entries, etc.).
The direction list 4125 additionally includes a control 4130 for returning to the navigation presentation 885. When a user selects this control, the application displays the reverse animation of that used to display the direction list. For instance, in this example, selecting the control 4130 would cause the application to slide the direction list downwards on the display, revealing the navigation map view underneath.
In this example, the direction list 4125 slides up and over the navigation information bar 4050. In other embodiments, however, the direction list 4125 only covers the map view of the navigation display, and not the navigation information bar 4050. This enables the user to view the direction list while still seeing the summary information (e.g., the estimated arrival time and next maneuver), which may be useful to the user.
In some embodiments, the user can view the navigation map in overview mode, rather than the animated view from above and/or behind the representation of the moving vehicle. Figure 42 illustrates a user accessing this overview mode in the vehicle user interface over three stages 4205-4215. In the first stage 4205, the user performs a specific type of touchscreen input (in this case, a tap, though in different embodiments different inputs may be used) over the animated map view 890 of the navigation user interface 885.
The second stage 4210 illustrates that this action causes the application to modify the navigation information bar 4050. The upper portion of the vehicle display now includes an end navigation control 4220 and an overview control 4225. The end navigation control 4220 allows the user to end the current turn-by-turn navigation mode and return to the map exploration interface. The overview control 4225 allows the user to view an overview of the route being navigated. In addition, the route and maneuver information in the navigation information bar is compressed to fit within a smaller area. This bar continues to display the same route summary information (arrival time, remaining distance, remaining time), and also displays the graphical representation and distance to the next maneuver in the same manner. The text instructions, however, use a shorter version in order to fit within the smaller area. Some embodiments remove the text instructions altogether.
The user selects the overview control 4225 at the second stage 4210 (via a tap input). As a result, the mapping application displays a route overview map 4230 at the third stage 4215. The overview map of some embodiments displays a map at a zoom level such that the entirety of the remaining route, from the current location to the ending location, fits within the display. In addition, the map shows the route line for the remainder of the route. Some embodiments, rather than showing the remaining route, display the entirety of the route from the starting location to the destination. The display also includes a resume navigation control 4235 in place of the overview control 4225. Selection of the resume navigation control causes the application to resume the animated map view of navigation shown in the first stage 4205.
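As a rough illustration of the overview fitting described above, the following Swift sketch computes a padded bounding region around the remaining route coordinates; the Coordinate and OverviewRegion types and the padding factor are assumptions introduced for the example and are not elements of the interfaces shown in the figures.

```swift
// Sketch: choosing an overview region that fits the remaining route on screen.
struct Coordinate { var latitude: Double; var longitude: Double }

struct OverviewRegion {
    var center: Coordinate
    var latitudeSpan: Double
    var longitudeSpan: Double
}

func overviewRegion(forRemainingRoute route: [Coordinate],
                    paddingFactor: Double = 1.2) -> OverviewRegion? {
    guard let first = route.first else { return nil }
    var minLat = first.latitude, maxLat = first.latitude
    var minLon = first.longitude, maxLon = first.longitude
    for point in route {
        minLat = min(minLat, point.latitude);  maxLat = max(maxLat, point.latitude)
        minLon = min(minLon, point.longitude); maxLon = max(maxLon, point.longitude)
    }
    // Center the region on the route and pad it slightly so the route line
    // does not touch the edges of the vehicle display.
    return OverviewRegion(
        center: Coordinate(latitude: (minLat + maxLat) / 2,
                           longitude: (minLon + maxLon) / 2),
        latitudeSpan: (maxLat - minLat) * paddingFactor,
        longitudeSpan: (maxLon - minLon) * paddingFactor)
}
```

A renderer would then pick the zoom level at which this region fits the display, so the entire remaining route (or, in other embodiments, the entire route) is visible at once.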
This figure, like the other figures in this section, illustrates turn-by-turn navigation within the touchscreen user interfaces of some embodiments. However, the non-touchscreen interfaces also have the capability to enter turn-by-turn navigation mode. Some embodiments limit the functionality for non-touchscreen displays (e.g., always displaying in either 2D or 3D with no control to switch back and forth, not having a direction list control, etc.), while other embodiments use various types of inputs from the built-in vehicle controls to toggle between 2D and 3D presentations or to cause the application to animate the direction list onto or off of the display. Some embodiments, in fact, modify the mobile device interface during navigation (described below) to include the 3D control and direction list control. Selection of these controls on the mobile device then affects the non-touchscreen vehicle interface.
The above examples illustrate the vehicle display screen when the mapping application enters turn-by-turn navigation mode. In some embodiments, when the application enters the turn-by-turn mode, it locks the mobile device interface into a corresponding display. Specifically, in some embodiments the application displays, on the mobile device on which it operates, a large set of instructions for the upcoming maneuver along the route being navigated.
Figure 43 illustrates the display of the mobile device during the turn-by-turn navigation mode according to some embodiments. Specifically, this figure illustrates the display of both the mobile device and the vehicle interface over three stages 4305-4315 as the user approaches and turns through an intersection. The first stage 4305 illustrates both a mobile device 4300 and the vehicle display screen after the mapping application has entered turn-by-turn navigation mode. As shown, the vehicle display screen displays the navigation mode user interface 885 of some embodiments.
The mobile device 4300 on which the mapping application operates displays a blank screen with only navigation instructions. Specifically, the mobile device displays a larger version of the navigation instruction panel for the next maneuver, irrespective of the distance to that next maneuver. This display includes the maneuver information also displayed in the navigation information bar 4050, without the route summary (e.g., ETA) information in some embodiments. Thus, until the navigation information bar 4050 is replaced with a navigation instruction panel, its distance to maneuver, graphical maneuver representation, and text instructions will match those shown on the mobile device display. Once the navigation instruction panel is displayed on the vehicle screen, the mobile device display matches this instruction panel. While shown with a white background in these figures, some embodiments use a black background with white text and instructions rather than the other way around. At the first stage, the vehicle and device are farther from the next maneuver than the threshold for displaying the navigation instruction panel within the vehicle display (400 ft. away from the next maneuver). Nevertheless, the mobile device displays (1) a complex graphical indicator 4325 for the upcoming maneuver and (2) text instructions 4330 describing the maneuver.
As described above with respect to its display in the navigation instruction panel, the complex graphical indicator includes a thick arrow showing the maneuver through the intersection as well as lighter (e.g., dimmer, thinner, etc.) lines marking the other roads that enter and/or exit the intersection at which the maneuver will be performed. The text instructions indicate a regularly changing distance to the next maneuver, as well as information about the maneuver to perform (e.g., a particular type of turn onto a particular road). While this example shows a right turn, other instructions could include left turns, U-turns, merges, freeway exits, etc.
The second stage 4310 illustrates the two user interfaces as the vehicle and device approach the intersection at which the route makes a right turn onto State St. At this point, a navigation instruction panel 4320 has been animated onto the display, and the navigation information bar removed, because the vehicle and device are now closer to the intersection. Once present, the navigation instruction panel 4320 and the mobile device 4300 display the same information in some embodiments. However, the mobile device always displays the next instruction, enabling the driver or passenger to quickly view the next maneuver to perform irrespective of the distance to that next maneuver, without the need to view the entire direction list or even interact with either interface.
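The behavior just described can be summarized as a simple policy: the vehicle screen swaps the information bar for the instruction panel only near the maneuver, while the device always shows the next maneuver. The Swift sketch below illustrates this under assumed names and an assumed 200 ft. threshold (the figures mention both 400 ft. and 200 ft. in different contexts).

```swift
// Sketch: which element each display shows as a maneuver approaches.
enum VehicleTopDisplay { case informationBar, instructionPanel }

struct NavigationDisplayPolicy {
    // Example threshold; treat as configurable, not a claimed value.
    var panelThresholdFeet: Double = 200

    // The vehicle screen swaps the information bar for the larger
    // instruction panel only when the maneuver is close.
    func vehicleDisplay(distanceToManeuverFeet: Double) -> VehicleTopDisplay {
        distanceToManeuverFeet <= panelThresholdFeet ? .instructionPanel : .informationBar
    }

    // The mobile device shows the next maneuver's indicator and text
    // instructions at all times, regardless of distance.
    let deviceAlwaysShowsNextManeuver = true
}
```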
In the third stage 4315, the vehicle has maneuvered through the intersection and onto State St. Thus, the application has removed the navigation instruction panel 4320 from the display while enlarging the size of the navigation map presentation and returning the navigation information bar 4050. In addition, because the user has completed the maneuver, the mobile device 4300 now displays a new graphical indicator 4335 and text instructions 4340 for the new upcoming maneuver (another right turn, but at a four-way intersection rather than a three-way intersection), in both the mobile device display and the navigation information bar.
In some cases, a navigated route will have two maneuvers in quick succession. In some embodiments, the application modifies both the device and vehicle UIs to account for such maneuvers. Figure 44 illustrates the display of both the mobile device and vehicle UIs over three stages 4405-4415 in which the vehicle approaches a first maneuver with a second maneuver following the first in rapid succession.
The first stage 4405 illustrates the two user interfaces when the vehicle (and device) are 400 feet from the first maneuver. At this stage, the navigation map UI 885 of the device does not yet display the navigation information panel, instead displaying the navigation information bar 4050. However, the subsequent maneuver will occur only a short distance (e.g., 50 ft.) after the next maneuver, and therefore the application displays a second navigation bar 4420 (or sign) below the information bar 4050. This second navigation bar 4420 only illustrates maneuver information, as including route summary information would be redundant. Thus, the navigation bar 4420 includes a stylized graphical representation of the second maneuver and text instructions for the maneuver. Because the second maneuver comes so quickly after the first, no distance from the first to second maneuver is presented. In addition, the mobile device 4400 displays a split screen to show the two maneuvers. In this example, the two maneuvers are given equal portions of the split screen, with the lower maneuver again not including a distance. However, other embodiments display the second maneuver in a smaller portion of the display screen.
The second stage 4410 illustrates the two displays after the vehicle has approached within a threshold distance of the first maneuver (e.g., within 200 feet). As a result, the application has slid a navigation instructions panel 4425 onto the left side of the display. As in the previous examples, this instructions panel mirrors the display shown on the mobile device. Thus, the instructions panel is split into an upper half displaying the summary information for the first maneuver and a lower half displaying the summary information for the subsequent maneuver. The third stage 4415 illustrates the displays after the vehicle has performed the first maneuver. Because the maneuver after the turn onto Hill St. is more than a threshold distance (e.g., 200 ft.) from its previous maneuver, the application no longer displays a split screen on the device or a split panel within the navigation instructions panel on the vehicle display. As shown, some embodiments display the distance to the maneuver once only one maneuver is shown, to give the same appearance as when the application counts down to the maneuver from a greater distance.
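The decision to stack a second maneuver banner (and split the device screen) reduces to checking whether the gap between consecutive maneuvers falls below a threshold. The following Swift sketch shows one way to express that rule; the Maneuver type and the default 200 ft. gap are assumptions drawn from the example values in the text.

```swift
// Sketch: show a second maneuver banner only when the maneuvers come in
// rapid succession.
struct Maneuver {
    var instruction: String
    var distanceFromPreviousManeuverFeet: Double
}

func shouldShowSecondManeuver(next: Maneuver,
                              following: Maneuver?,
                              gapThresholdFeet: Double = 200) -> Bool {
    guard let following = following else { return false }
    // Stack both maneuvers only when the second follows the first so
    // quickly that the driver needs advance warning of it.
    return following.distanceFromPreviousManeuverFeet < gapThresholdFeet
}
```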
Figure 45 illustrates the same situation, except that the mobile device is held in landscape orientation rather than portrait orientation. In this case, the progression of the vehicle display is unchanged from the previous figure. For the device in landscape orientation, the application splits the display horizontally rather than vertically for the two sets of maneuver instructions, with the instructions for the first maneuver appearing on the left side.
In addition to entering navigation and arriving at the displays shown in Figures 43-45 by selecting a destination in the vehicle interface (as shown in Figure 40), some embodiments also enable the user to select a destination and enter navigation via the mobile device interface. Figure 46 illustrates a use of the device interface to request a route from the device's current location to a destination location, then the subsequent selection to enter navigation, over three stages 4605-4615.
The first stage 4605 illustrates that the mapping application is open on a mobile device 4600 while connected to a vehicle display. The vehicle touchscreen displays the high-quality touchscreen map exploration interface 565, while the user has previously selected a search field of the mobile device map interface to cause the application to display a list of recent routes and locations. The user selects a route from the device's current location to Putt-Putt Golf Course from this recents list.
As a result, the mobile device 4600 in the second stage displays a route display interface 4620 showing possible routes between the selected start and end locations. In addition to various map controls (e.g., a position control, 3D control, and a direction list control), the route display interface 4620 includes three selectable routes between the two locations. In some embodiments, the three labels that indicate the estimated travel time for each of the routes are selectable to select the corresponding routes. At the moment, a first route with a travel time of 10 minutes is selected. The route display interface also includes an information bar 4625, or banner, that includes information about the currently selected route. The information bar 4625 also includes an end route display control 4630 for returning to the standard map view interface 205 and a start navigation control 4635 for entering navigation along the selected route. Meanwhile, the vehicle display has not changed from the first stage to the second stage while the user views the route display interface 4620 on the mobile device.
In the second stage 4610, the user selects the start navigation control 4635 on the mobile device display. As a result, at the third stage 4615 the vehicle display screen now displays the navigation mode user interface 885, while the mobile device display shows a customized graphical indicator and set of text instructions describing the next maneuver along the selected route.
In some embodiments, the mapping application mobile device interface during navigation is not interactive. While the user can press the home button to exit the mapping application, some embodiments do not offer controls within the mobile device display of the upcoming maneuver instructions. Other embodiments, however, include a set of available controls on the mobile device. Figure 47 illustrates the display and use of such controls on the mobile device during navigation. Specifically, this figure illustrates the activation of a set of controls and the selection of an overview control over three stages 4705-4715.
The first stage 4705 illustrates both a mobile device 4700 and the vehicle display screen to which the mobile device is connected after the mapping application running on the mobile device has entered turn-by-turn navigation mode. As shown, the vehicle display screen displays the navigation mode UI 885 while the mobile device displays instructions for the next maneuver along the route being navigated.
At this first stage 4705, the user taps the display screen of the mobile device 4700. As a result of this input, at the second stage 4710 the application displays a set of controls 4720 at the bottom of the display. The set of controls 4720 includes an end navigation control 4725 for ending the turn-by-turn navigation mode and returning to the standard map exploration mode of the mapping application, as well as an overview control 4730 for displaying an overview of the route being navigated.
The user selects the overview control 4730 at the second stage 4710 (via a tap input). As a result, the mapping application displays a route overview map 4735 at the third stage 4715. The overview map of some embodiments displays a map at a zoom level such that the entirety of the remaining route, from the current location to the ending location, fits within the display. In addition, the map shows the route line for the remainder of the route. Some embodiments, rather than showing the remaining route, display the entirety of the route from the starting location to the destination.
In addition to the overview map 4735, the overview display includes a banner 4740 with an end navigation control 4745, similar to the control 4725 in the previous stage 4710, as well as a control 4750 for returning to the standard navigation view on the mobile device (i.e., the display of graphics and text for the upcoming maneuver).
While the user views the overview map on the mobile device, some embodiments do not change the navigation view on the vehicle display screen (except to animate the movement along the route). Thus, as the user is not moving (e.g., is stopped at a traffic light) during this example, the vehicle display screen does not change during the stages 4705-4715.
In some cases, while the vehicle interface displays the turn-by-turn navigation map view, the user may not be interacting with the mobile device on which the mapping application operates. In some cases, the mobile device may enter a power-saving mode during navigation. Figure 48 illustrates, over three stages 4805-4815, a device 4800 in power-saving mode as well as the vehicle display screen to which the device is connected during turn-by-turn navigation.
In the first stage 4805, the mobile device 4800 has already entered power-saving mode. In some embodiments, the displayed words "Power Save Mode" do not actually appear on the device's display, but are shown here for explanatory purposes. The device may enter power save mode when the user has not interacted with the device for a particular period of time, or because a user has pressed a particular hardware button or touchscreen control to do so (e.g., button 4820 of the mobile device 4800). In addition, at the first stage, the application displays the navigation map view 885 on the vehicle display screen. Thus, while in power-save mode, the mobile device still performs the various navigation operations, including rendering the animated 3D map view, required to output the display to the vehicle.
In the second stage 4810, the application has animated a navigation instruction panel 4825 onto the vehicle display as a maneuver approaches. In the third stage 4815, the vehicle has maneuvered through the intersection, causing the application to remove the navigation instruction panel. During these stages, even as the vehicle and device approach a maneuver, the mobile device 4800 remains in power-saving mode. As explained in detail in U.S. Application 13/802,654, filed March 13, 2013, which is incorporated herein by reference, when not connected to the vehicle display system, the mapping application displays (and may use voice prompts as well) navigation instructions on the mobile device as intersections approach, even when the device has been placed in power-saving mode. However, because the user receives indications from the vehicle interface of the upcoming maneuver, some embodiments do not display corresponding indications on the device when in power-saving mode.
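As a hedged illustration of this policy, the Swift sketch below models when the device continues rendering for the vehicle and when it wakes itself to show a maneuver prompt; the type and property names are assumptions introduced for the example.

```swift
// Sketch: alert policy while navigating in power-saving mode.
struct NavigationAlertPolicy {
    var connectedToVehicleDisplay: Bool
    var deviceInPowerSaveMode: Bool

    // The vehicle display keeps receiving the rendered navigation view
    // regardless of the device's power state.
    var shouldRenderForVehicle: Bool { connectedToVehicleDisplay }

    // When the vehicle display already presents the upcoming maneuver,
    // the device stays dark; when disconnected, it wakes to show (and
    // possibly speak) the instruction even in power-saving mode.
    var shouldWakeDeviceForManeuver: Bool {
        !connectedToVehicleDisplay && deviceInPowerSaveMode
    }
}
```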
In some embodiments, while turn-by-turn navigation is running, the user may exit out of the mapping application via either the mobile device or the vehicle display screen, thereby returning both interfaces to their respective home pages of the mobile device operating system. The home page, in some embodiments, will then display a banner for the user to select in order to return to the navigation view. Figure 49 illustrates the home pages of the mobile device and vehicle displays of some embodiments while turn-by-turn navigation mode is operational and the vehicle approaches a maneuver. This figure illustrates, for each of three stages 4905-4915, both of the displayed interfaces as well as a map 4900 that indicates the location of the vehicle and mobile device at the stage.
The first stage 4905 illustrates a mobile device 4920 and a vehicle display screen to which the mobile device is connected. The mapping application operating on the mobile device is in turn-by-turn navigation mode, and therefore the mobile device 4920 displays information about the next maneuver while the vehicle display screen displays the navigation mode UI 885. At this stage, the user selects a home button 4925 on the mobile device to place the mapping application in the background and return to the home page of the mobile device. In some embodiments, selecting the home button on the vehicle touchscreen has the same effect.
Thus, at the second stage, both the mobile device 4920 and the vehicle display screen display their respective home pages. The mobile device home page 4930 displays various selectable items (also called affordances) for various applications that operate on the mobile device. The vehicle touchscreen interface 550 of some embodiments is described above by reference to Figure 5. In addition to the standard selectable items in the home pages, both interfaces also display a selectable banner for returning to the turn-by-turn navigation. Specifically, the mobile device home page 4930 includes a banner 4935 at the top of the display with the text "Touch to return to Navigation". Similarly, the vehicle display screen displays a selectable banner 4936 above the set of controls 561-567 with the same text. In some embodiments, selecting either of these banners returns both of the displays to their respective turn-by-turn navigation displays, as shown in the first stage 4905.
While these banners are shown in particular locations in the two different interfaces in this figure, one of ordinary skill in the art will recognize that different embodiments may display similar banners in different locations (e.g., the banner 4936 displayed above the clock at the top of the vehicle display). Furthermore, for non-touchscreen interfaces, the banner displayed on the vehicle screen may indicate a particular action to take using the various vehicle controls in order to return to navigation (e.g., pressing a particular button).
In the third stage, the map 4900 illustrates that the vehicle is approaching an intersection at which the route makes a right turn onto a particular street. As a result, both of the display screens now display indicators to make a right turn onto the particular street. Specifically, the text in the mobile device banner 4935 has been replaced by a right turn graphical indicator with text instructions to turn right. The same graphical indicator and text are displayed in a newly added banner 4940 on the vehicle touchscreen above the clock, while the selectable banner 4936 for returning to navigation remains below the clock. After the vehicle completes the maneuver, some embodiments return the displays to the state shown in the second stage 4910, by modifying the content of the banner 4935 and removing the banner 4940.
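The banner behavior across these stages can be summarized as a small state decision: show a "return to navigation" banner by default, and replace or augment it with the maneuver instruction as an intersection approaches. The Swift sketch below is illustrative only; the 400 ft. announcement threshold and the names are assumptions.

```swift
// Sketch: what the home-page banner shows while navigation runs in the
// background.
enum HomeBannerContent {
    case returnToNavigation                 // "Touch to return to Navigation"
    case upcomingManeuver(instruction: String)
}

func homeBanner(distanceToManeuverFeet: Double?,
                maneuverInstruction: String?,
                announceThresholdFeet: Double = 400) -> HomeBannerContent {
    if let distance = distanceToManeuverFeet,
       let instruction = maneuverInstruction,
       distance <= announceThresholdFeet {
        // Near an intersection, the banner presents the maneuver itself.
        return .upcomingManeuver(instruction: instruction)
    }
    // Otherwise the banner simply offers a way back into navigation.
    return .returnToNavigation
}
```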
In some embodiments, unlike those shown in Figure 49, the banner 4936 is not displayed in the home page of the vehicle screen. Instead, the return to navigation banner is displayed as part of the operating system information bar that includes the simulated home button, which is not shown in the vehicle screen home page of some embodiments. Figure 50 illustrates the display outside the mapping application with turn-by-turn navigation operating according to some embodiments, over three stages 5005-5015. The first stage 5005 is the same as the first stage 4905 of Figure 49, with the user selecting a home button 5025 on the mobile device 5020.
In the second stage 5010, the display of the mobile device is the same as in Figure 49, with the banner 5035 displayed at the top of the device screen. However, the vehicle display 5050 does not display a banner in this situation. Because the user is likely to have exited to this page only in order to select another application, the absence of a return to navigation control is not problematic. Furthermore, the user can select the mapping application control 5063 if need be, in order to return to the mapping application and therefore re-enter navigation. At this stage, the user selects the music control 5062 to enter a music application. At the third stage 5015, the vehicle has not yet approached the upcoming intersection. Thus, no directions are yet given for this maneuver. However, within the music application on the vehicle screen, the lower information bar 5040 that includes the simulated home button includes text stating "Return to Navigation" and has changed color. This indicates that the user can select anywhere on this bar 5040 (other than the home button) to return to the turn-by-turn navigation view.
The embodiments described above require the mobile device to output interfaces for the same application to both the mobile device display and the vehicle display. That is, when the user selects the mapping application in one display, the application opens in both. Similarly, when the user exits the mapping application (e.g., by pressing the home button) in one display, the home pages of both are displayed. In some embodiments, however, the user can have different applications open on the two different displays. Figure 51 illustrates the situation in which the mapping application in turn-by-turn navigation mode is open on the vehicle display while the mobile device displays its home page, over three stages 5105-5115 in which the vehicle approaches and passes through an intersection along the route being navigated.
The first stage 5105 illustrates a mobile device 5100 that displays a mobile device home page 5120 of some embodiments, with several selectable items for opening various applications (including the mapping application). At the same time, the vehicle touchscreen displays a turn-by-turn navigation mode UI 5125. In the second stage 5110, the vehicle has approached within a threshold distance of the intersection at which the next maneuver will be performed along the route, and therefore the application displays a navigation instruction panel 5130 for the maneuver on the left side of the vehicle display, as shown in previous figures. At the same time, despite having exited out of the mapping application interface on the mobile device 5100, the device displays the maneuver instructions (with both a graphical indicator and text instructions). In some embodiments, both of these sets of instructions on the two displays count down the distance to the maneuver as the vehicle approaches the intersection. The third stage illustrates that after the vehicle passes through the intersection, the display on the mobile device 5100 returns to the home screen, while the navigation instruction panel 5130 is removed from the vehicle display.
The above examples illustrate situations in which the user performs various navigation actions (e.g., entering navigation, exiting navigation) while the mobile device is already connected to the vehicle electronic system. In addition, the user can enter the turn-by-turn navigation mode of the mapping application on the mobile device and then subsequently connect the mobile device to the vehicle system. In some embodiments, upon connection of the device to the system, the mapping application begins outputting the two user interfaces as shown in Figure 43. That is, the application sends the navigation mode map view that follows the vehicle location to the vehicle display, and outputs the maneuver instructions to the mobile device display. When the user disconnects the device from the vehicle system while in navigation mode, the application does the reverse. The mobile device display transitions from the maneuver instructions to the standard turn-by-turn navigation mode when not connected. In addition, in some embodiments the device recognizes when the user has left her vehicle (e.g., based on the type of movement of the device consistent with walking) and transitions automatically from a driving navigation mode to a walking navigation mode.
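These connection and disconnection transitions can be thought of as selecting among a few presentation states. The Swift sketch below names such states for illustration; the mode names and the walking-detection flag are assumptions, not claimed terminology.

```swift
// Sketch: retargeting the navigation presentation on connect/disconnect,
// with a fallback to walking navigation when motion suggests the user
// has left the vehicle.
enum NavigationPresentation {
    case vehicleMapWithDeviceInstructions   // connected: map to vehicle, instructions on device
    case deviceDrivingNavigation            // disconnected, still driving
    case deviceWalkingNavigation            // disconnected, motion consistent with walking
}

func presentation(connectedToVehicle: Bool,
                  motionSuggestsWalking: Bool) -> NavigationPresentation {
    if connectedToVehicle { return .vehicleMapWithDeviceInstructions }
    return motionSuggestsWalking ? .deviceWalkingNavigation : .deviceDrivingNavigation
}
```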
VII. ELECTRONIC SYSTEM
Many of the above-described features and applications are implemented as software processes that are specified as a set of instructions recorded on a computer readable storage medium (also referred to as computer readable medium). When these instructions are executed by one or more computational or processing unit(s) (e.g., one or more processors, cores of processors, or other processing units), they cause the processing unit(s) to perform the actions indicated in the instructions. Examples of computer readable media include, but are not limited to, CD-ROMs, flash drives, random access memory (RAM) chips, hard drives, erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), etc. The computer readable media does not include carrier waves and electronic signals passing wirelessly or over wired connections.
In this specification, the term "software" is meant to include firmware residing in read-only memory or applications stored in magnetic storage which can be read into memory for processing by a processor. Also, in some embodiments, multiple software inventions can be implemented as sub-parts of a larger program while remaining distinct software inventions. In some embodiments, multiple software inventions can also be implemented as separate programs. Finally, any combination of separate programs that together implement a software invention described here is within the scope of the invention. In some embodiments, the software programs, when installed to operate on one or more electronic systems, define one or more specific machine implementations that execute and perform the operations of the software programs.
A. Mobile Device
The mapping and navigation applications of some embodiments operate on mobile devices, such as smart phones (e.g., iPhones®) and tablets (e.g., iPads®). Figure 52 is an example of an architecture 5200 of such a mobile computing device. Examples of mobile computing devices include smartphones, tablets, laptops, etc. As shown, the mobile computing device 5200 includes one or more processing units 5205, a memory interface 5210 and a peripherals interface 5215.
The peripherals interface 5215 is coupled to various sensors and subsystems, including a camera subsystem 5220, a wired communication subsystem(s) 5223, a wireless communication subsystem(s) 5225, an audio subsystem 5230, an I/O subsystem 5235, etc. The peripherals interface 5215 enables communication between the processing units 5205 and various peripherals. For example, an orientation sensor 5245 (e.g., a gyroscope) and an acceleration sensor 5250 (e.g., an accelerometer) are coupled to the peripherals interface 5215 to facilitate orientation and acceleration functions.
The camera subsystem 5220 is coupled to one or more optical sensors 5240 (e.g., a charge-coupled device (CCD) optical sensor, a complementary metal-oxide-semiconductor (CMOS) optical sensor, etc.). The camera subsystem 5220 coupled with the optical sensors 5240 facilitates camera functions, such as image and/or video data capturing. The wired communication subsystem 5223 and wireless communication subsystem 5225 serve to facilitate communication functions. In some embodiments, the wired communication system includes a USB connector for connecting the mobile device 5200 to a vehicle electronic system. The interface of some embodiments for communicating with a vehicle electronic system is described in further detail in U.S. Patent Publications 2009/0284476, 2010/0293462, 2011/0145863, 2011/0246891, and 2011/0265003, which are incorporated by reference above.
In some embodiments, the wireless communication subsystem 5225 includes radio frequency receivers and transmitters, and optical receivers and transmitters (not shown in Figure 52). These receivers and transmitters of some embodiments are implemented to operate over one or more communication networks such as a GSM network, a Wi-Fi network, a Bluetooth network, etc. The audio subsystem 5230 is coupled to a speaker to output audio (e.g., to output voice navigation instructions). Additionally, the audio subsystem 5230 is coupled to a microphone to facilitate voice-enabled functions, such as voice recognition (e.g., for searching), digital recording, etc.
The I/O subsystem 5235 involves the transfer between input/output peripheral devices, such as a display, a touch screen, etc., and the data bus of the processing units 5205 through the peripherals interface 5215. The I/O subsystem 5235 includes a touch-screen controller 5255 and other input controllers 5260 to facilitate the transfer between input/output peripheral devices and the data bus of the processing units 5205. As shown, the touch-screen controller 5255 is coupled to a touch screen 5265. The touch-screen controller 5255 detects contact and movement on the touch screen 5265 using any of multiple touch sensitivity technologies. The other input controllers 5260 are coupled to other input/control devices, such as one or more buttons. Some embodiments include a near-touch sensitive screen and a corresponding controller that can detect near-touch interactions instead of or in addition to touch interactions.
The memory interface 5210 is coupled to memory 5270. In some embodiments, the memory 5270 includes volatile memory (e.g., high-speed random access memory), non-volatile memory (e.g., flash memory), a combination of volatile and non-volatile memory, and/or any other type of memory. As illustrated in Figure 52, the memory 5270 stores an operating system (OS) 5272. The OS 5272 includes instructions for handling basic system services and for performing hardware dependent tasks.
The memory 5270 also includes communication instructions 5274 to facilitate communicating with one or more additional devices; graphical user interface instructions 5276 to facilitate graphic user interface processing; image processing instructions 5278 to facilitate image-related processing and functions; input processing instructions 5280 to facilitate input-related (e.g., touch input) processes and functions; audio processing instructions 5282 to facilitate audio-related processes and functions; and camera instructions 5284 to facilitate camera-related processes and functions. The instructions described above are merely exemplary and the memory 5270 includes additional and/or other instructions in some embodiments. For instance, the memory for a smartphone may include phone instructions to facilitate phone-related processes and functions. Additionally, the memory may include instructions for a mapping and navigation application as well as other applications. The above-identified instructions need not be implemented as separate software programs or modules. Various functions of the mobile computing device can be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits.
While the components illustrated in Figure 52 are shown as separate components, one of ordinary skill in the art will recognize that two or more components may be integrated into one or more integrated circuits. In addition, two or more components may be coupled together by one or more communication buses or signal lines. Also, while many of the functions have been described as being performed by one component, one of ordinary skill in the art will realize that the functions described with respect to Figure 52 may be split into two or more integrated circuits.
B. Computer System
Figure 53 conceptually illustrates an example of a vehicle electronic system 5300 with which some embodiments of the invention are implemented. Such an electronic system includes various types of computer readable media and interfaces for various other types of computer readable media. Electronic system 5300 includes a bus 5305, processing unit(s) 5310, a graphics processing unit (GPU) 5315, a system memory 5320, communication subsystem(s) 5325, a read-only memory 5330, a permanent storage device 5335, input controls 5340, and output devices 5345.
The bus 5305 collectively represents all system, peripheral, and chipset buses that communicatively connect the numerous internal devices of the electronic system 5300. For instance, the bus 5305 communicatively connects the processing unit(s) 5310 with the read-only memory 5330, the GPU 5315, the system memory 5320, and the permanent storage device 5335.
From these various storage units, the processing unit(s) 5310 retrieves instructions to execute and data to process in order to execute the processes of the invention performed by the vehicle electronic system. The processing unit(s) may be a single processor or a multi-core processor in different embodiments. Some instructions are passed to and executed by the GPU 5315. The GPU 5315 can offload various computations or complement the image processing provided by the processing unit(s) 5310. In some embodiments, the vehicle system uses a system-on-a-chip that includes one or more CPU cores as well as one or more GPU cores.
The read-only memory (ROM) 5330 stores static data and instructions that are needed by the processing unit(s) 5310 and other modules of the vehicle electronic system. The permanent storage device 5335, on the other hand, is a read-and-write memory device. This device is a non-volatile memory unit that stores instructions and data even when the electronic system 5300 is off. Some embodiments of the invention use a mass-storage device (such as a magnetic or optical disk and its corresponding disk drive, or integrated flash memory) as the permanent storage device 5335.
Other embodiments use a removable storage device (such as a floppy disk, flash memory device, etc., and its corresponding drive) as the permanent storage device. Like the permanent storage device 5335, the system memory 5320 is a read-and-write memory device. However, unlike storage device 5335, the system memory 5320 is a volatile read-and-write memory, such as a random access memory. The system memory 5320 stores some of the instructions and data that the processor needs at runtime. In some embodiments, the invention's processes are stored in the system memory 5320, the permanent storage device 5335, and/or the read-only memory 5330. For example, the various memory units include instructions for processing multimedia clips in accordance with some embodiments. From these various memory units, the processing unit(s) 5310 retrieves instructions to execute and data to process in order to execute the processes of some embodiments.
The bus 5305 also connects to the input controls 5340 and output devices 5345. The input controls 5340 enable the user to communicate information and select commands to the electronic system. The input controls 5340 include various different inputs in different embodiments, including touchscreens, other hardware controls such as buttons, knobs, joysticks, trackballs, etc., microphones for receiving audio, and others. The output devices 5345 include touchscreen and non-touchscreen displays, as well as speakers for producing audio.
Finally, as shown in Figure 53, bus 5305 also couples the vehicle electronic system 5300 to one or more communication subsystems 5325 through which the vehicle system can communicate with a mobile device, such as that shown above in Figure 52. These communication subsystems may include both wireless (e.g., Bluetooth®, Wi-Fi, etc.) communication subsystems and wired (e.g., USB, Firewire®, etc.) communication subsystems. Any or all components of the vehicle electronic system 5300 may be used in conjunction with the invention.
Some embodiments include electronic components, such as microprocessors, storage and memory that store computer program instructions in a machine-readable or computer-readable medium (alternatively referred to as computer-readable storage media, machine-readable media, or machine-readable storage media). Some examples of such computer-readable media include RAM, ROM, read-only compact discs (CD-ROM), recordable compact discs (CD-R), rewritable compact discs (CD-RW), read-only digital versatile discs (e.g., DVD-ROM, dual-layer DVD-ROM), a variety of recordable/rewritable DVDs (e.g., DVD-RAM, DVD-RW, DVD+RW, etc.), flash memory (e.g., SD cards, mini-SD cards, micro-SD cards, etc.), magnetic and/or solid state hard drives, read-only and recordable Blu-Ray® discs, ultra density optical discs, any other optical or magnetic media, and floppy disks. The computer-readable media may store a computer program that is executable by at least one processing unit and includes sets of instructions for performing various operations. Examples of computer programs or computer code include machine code, such as is produced by a compiler, and files including higher-level code that are executed by a computer, an electronic component, or a microprocessor using an interpreter.
While the above discussion primarily refers to microprocessor or multi-core processors that execute software, some embodiments are performed by one or more integrated circuits, such as application specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs). In some embodiments, such integrated circuits execute instructions that are stored on the circuit itself. In addition, some embodiments execute software stored in programmable logic devices (PLDs), ROM, or RAM devices.
As used in this specification and any claims of this application, the terms "computer", "server", "processor", and "memory" all refer to electronic or other technological devices. These terms exclude people or groups of people. For the purposes of the specification, the terms display or displaying means displaying on an electronic device. As used in this specification and any claims of this application, the terms "computer readable medium," "computer readable media," and "machine readable medium" are entirely restricted to tangible, physical objects that store information in a form that is readable by a computer. These terms exclude any wireless signals, wired download signals, and any other ephemeral signals.
VIII. MAP SERVICE ENVIRONMENT
Various embodiments may operate within a map service operating environment. Figure 54 illustrates a map service operating environment, according to some embodiments. A map service 5430 (also referred to as mapping service) may provide map services for one or more client devices 5402a-5402c in communication with the map service 5430 through various communication methods and protocols. A map service 5430 in some embodiments provides map information and other map-related data, such as two-dimensional map image data (e.g., aerial view of roads utilizing satellite imagery), three-dimensional map image data (e.g., traversable map with three-dimensional features, such as buildings), route and direction calculation (e.g., ferry route calculations or directions between two points for a pedestrian), real-time navigation data (e.g., turn-by-turn visual navigation data in two or three dimensions), location data (e.g., where is the client device currently located), and other geographic data (e.g., wireless network coverage, weather, traffic information, or nearby points-of-interest). In various embodiments, the map service data may include localized labels for different countries or regions; localized labels may be utilized to present map labels (e.g., street names, city names, points of interest) in different languages on client devices. Client devices 5402a-5402c may utilize these map services by obtaining map service data. Client devices 5402a-5402c may implement various techniques to process map service data. Client devices 5402a-5402c may then provide map services to various entities, including, but not limited to, users, internal software or hardware modules, and/or other systems or devices external to the client devices 5402a-5402c.
In some embodiments, a map service is implemented by one or more nodes in a distributed computing system. Each node may be assigned one or more services or components of a map service. Some nodes may be assigned the same map service or component of a map service. A load balancing node in some embodiments distributes access or requests to other nodes within a map service. In some embodiments a map service is implemented as a single system, such as a single server. Different modules or hardware devices within a server may implement one or more of the various services provided by a map service.
A map service in some embodiments provides map services by generating map service data in various formats. In some embodiments, one format of map service data is map image data. Map image data provides image data to a client device so that the client device may process the image data (e.g., rendering and/or displaying the image data as a two-dimensional or three-dimensional map). Map image data, whether in two or three dimensions, may specify one or more map tiles. A map tile may be a portion of a larger map image. Assembling together the map tiles of a map produces the original map. Tiles may be generated from map image data, routing or navigation data, or any other map service data. In some embodiments map tiles are raster-based map tiles, with tile sizes both larger and smaller than the commonly-used 256 pixel by 256 pixel tile. Raster-based map tiles may be encoded in any number of standard digital image representations including, but not limited to, Bitmap (.bmp), Graphics Interchange Format (.gif), Joint Photographic Experts Group (.jpg, .jpeg, etc.), Portable Network Graphics (.png), or Tagged Image File Format (.tiff). In some embodiments, map tiles are vector-based map tiles, encoded using vector graphics, including, but not limited to, Scalable Vector Graphics (.svg) or a Drawing File (.drw). Some embodiments also include tiles with a combination of vector and raster data. Metadata or other information pertaining to the map tile may also be included within or along with a map tile, providing further map service data to a client device. In various embodiments, a map tile is encoded for transport utilizing various standards and/or protocols.
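To make the tile description concrete, the Swift sketch below defines a minimal tile identifier and the arithmetic for square tiling at a given zoom level; the type names and the tiling scheme are assumptions chosen for illustration, not a specification of the map service's actual format.

```swift
// Sketch: a minimal map-tile representation for a square tiling scheme.
struct MapTileKey: Hashable {
    var x: Int
    var y: Int
    var zoom: Int
}

enum TileEncoding { case raster, vector, hybrid }

struct MapTile {
    var key: MapTileKey
    var encoding: TileEncoding
    var data: [UInt8]          // e.g., PNG/JPEG bytes or encoded vector data
}

// Number of tiles along one axis at a given zoom level (2^zoom), so the
// full set of tiles at that zoom assembles into the complete map.
func tilesPerSide(atZoom zoom: Int) -> Int { 1 << zoom }
```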
In various embodiments, map tiles may be constructed from image data of different resolutions depending on zoom level. For instance, for low zoom level (e.g., world or globe view), the resolution of map or image data need not be as high relative to the resolution at a high zoom level (e.g., city or street level). For example, when in a globe view, there may be no need to render street level artifacts as such objects would be so small as to be negligible in many cases.
A map service in some embodiments performs various techniques to analyze a map tile before encoding the tile for transport. This analysis may optimize map service performance for both client devices and a map service. In some embodiments map tiles are analyzed for complexity, according to vector-based graphic techniques, and constructed utilizing complex and non-complex layers. Map tiles may also be analyzed for common image data or patterns that may be rendered as image textures and constructed by relying on image masks. In some embodiments, raster-based image data in a map tile contains certain mask values, which are associated with one or more textures. Some embodiments also analyze map tiles for specified features that may be associated with certain map styles that contain style identifiers.
Other map services generate map service data relying upon various data formats separate from a map tile in some embodiments. For instance, map services that provide location data may utilize data formats conforming to location service protocols, such as, but not limited to, Radio Resource Location services Protocol (RRLP), TIA 801 for Code Division Multiple Access (CDMA), Radio Resource Control (RRC) position protocol, or LTE Positioning Protocol (LPP). Embodiments may also receive or request data from client devices identifying device capabilities or attributes (e.g., hardware specifications or operating system version) or communication capabilities (e.g., device communication bandwidth as determined by wireless signal strength or wire or wireless network type).
A map service may obtain map service data from internal or external sources. For example, satellite imagery used in map image data may be obtained from external services, or internal systems, storage devices, or nodes. Other examples may include, but are not limited to, GPS assistance servers, wireless network coverage databases, business or personal directories, weather data, government information (e.g., construction updates or road name changes), or traffic reports. Some embodiments of a map service may update map service data (e.g., wireless network coverage) for analyzing future requests from client devices.
Various embodiments of a map service may respond to client device requests for map services. These requests may be a request for a specific map or portion of a map. Some embodiments format requests for a map as requests for certain map tiles. In some embodiments, requests also supply the map service with starting locations (or current locations) and destination locations for a route calculation. A client device may also request map service rendering information, such as map textures or style sheets. In at least some embodiments, requests are also one of a series of requests implementing turn-by-turn navigation. Requests for other geographic data may include, but are not limited to, current location, wireless network coverage, weather, traffic information, or nearby points-of-interest.
A map service, in some embodiments, analyzes client device requests to optimize a device or map service operation. For instance, a map service may recognize that the location of a client device is in an area of poor communications (e.g., weak wireless signal) and send more map service data to supply a client device in the event of loss in communication or send instructions to utilize different client hardware (e.g., orientation sensors) or software (e.g., utilize wireless location services or Wi-Fi positioning instead of GPS-based services). In another example, a map service may analyze a client device request for vector-based map image data and determine that raster-based map data better optimizes the map image data according to the image's complexity. Embodiments of other map services may perform similar analysis on client device requests and as such the above examples are not intended to be limiting.
Various embodiments of client devices (e.g., client devices 5402a-5402c) are implemented on different portable-multifunction device types. Client devices 5402a-5402c utilize map service 5430 through various communication methods and protocols. In some embodiments, client devices 5402a-5402c obtain map service data from map service 5430. Client devices 5402a-5402c request or receive map service data. Client devices 5402a-5402c then process map service data (e.g., render and/or display the data) and may send the data to another software or hardware module on the device or to an external device or system.
A client device, according to some embodiments, implements techniques to render and/or display maps. These maps may be requested or received in various formats, such as map tiles described above. A client device may render a map in two-dimensional or three-dimensional views. Some embodiments of a client device display a rendered map and allow a user, system, or device providing input to manipulate a virtual camera in the map, changing the map display according to the virtual camera's position, orientation, and field-of-view. Various forms and input devices are implemented to manipulate a virtual camera. In some embodiments, touch input, through certain single or combination gestures (e.g., touch-and-hold or a swipe) manipulate the virtual camera. Other embodiments allow manipulation of the device's physical location to manipulate a virtual camera. For instance, a client device may be tilted up from its current position to manipulate the virtual camera to rotate up. In another example, a client device may be tilted forward from its current position to move the virtual camera forward. Other input devices to the client device may be implemented including, but not limited to, auditory input (e.g., spoken words), a physical keyboard, mouse, and/or a joystick.
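One of the examples above, tilting the device to rotate the virtual camera, can be illustrated with a short sketch. The camera model, sensitivity, and pitch clamp below are assumptions introduced for the example.

```swift
// Sketch: mapping a change in device pitch to the virtual camera.
struct VirtualCamera {
    var pitchDegrees: Double      // 0 = looking straight down, ~90 = toward the horizon
    var headingDegrees: Double
}

func applyDeviceTilt(to camera: VirtualCamera,
                     pitchDeltaDegrees: Double,
                     sensitivity: Double = 1.0) -> VirtualCamera {
    var updated = camera
    // Tilting the device up rotates the camera up, clamped to a usable range.
    updated.pitchDegrees = min(85, max(0, camera.pitchDegrees + pitchDeltaDegrees * sensitivity))
    return updated
}
```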
Some embodiments provide various visual feedback to virtual camera manipulations, such as displaying an animation of possible virtual camera manipulations when transitioning from two-dimensional map views to three-dimensional map views. Some embodiments also allow input to select a map feature or object (e.g., a building) and highlight the object, producing a blur effect that maintains the virtual camera's perception of three-dimensional space.
In some embodiments, a client device implements a navigation system (e.g., turn-by-turn navigation). A navigation system provides directions or route information, which may be displayed to a user. Some embodiments of a client device request directions or a route calculation from a map service. A client device may receive map image data and route data from a map service. In some embodiments, a client device implements a turn-by-turn navigation system, which provides real-time route and direction information based upon location information and route information received from a map service and/or other location system, such as Global Positioning Satellite (GPS). A client device may display map image data that reflects the current location of the client device and update the map image data in real-time. A navigation system may provide auditory or visual directions to follow a certain route.
A virtual camera is implemented to manipulate navigation map data according to some embodiments. Some embodiments of client devices allow the device to adjust the virtual camera display orientation to bias toward the route destination. Some embodiments also allow the virtual camera to navigate through turns, simulating the inertial motion of the virtual camera.
Client devices implement various techniques to utilize map service data from the map service. Some embodiments implement some techniques to optimize rendering of two-dimensional and three-dimensional map image data. In some embodiments, a client device locally stores rendering information. For instance, a client stores a style sheet which provides rendering directions for image data containing style identifiers. In another example, common image textures may be stored to decrease the amount of map image data transferred from a map service. Client devices in different embodiments implement various modeling techniques to render two-dimensional and three-dimensional map image data, examples of which include, but are not limited to: generating three-dimensional buildings out of two-dimensional building footprint data; modeling two-dimensional and three-dimensional map objects to determine the client device communication environment; generating models to determine whether map labels are seen from a certain virtual camera position; and generating models to smooth transitions between map image data. Some embodiments of client devices also order or prioritize map service data using certain techniques. For instance, a client device detects the motion or velocity of a virtual camera; if the motion or velocity exceeds certain threshold values, lower-detail image data is loaded and rendered for certain areas. Other examples include: rendering vector-based curves as a series of points, preloading map image data for areas of poor communication with a map service, adapting textures based on display zoom level, or rendering map image data according to complexity.
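The velocity-based prioritization mentioned above can be sketched as a simple level-of-detail choice. The thresholds and names below are illustrative assumptions only.

```swift
// Sketch: lower the rendering detail when the virtual camera moves quickly.
enum DetailLevel { case high, medium, low }

func detailLevel(forCameraSpeed metersPerSecond: Double) -> DetailLevel {
    switch metersPerSecond {
    case ..<20:   return .high     // slow pans: full-detail tiles
    case ..<100:  return .medium
    default:      return .low      // fast flyovers: coarse tiles only
    }
}
```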
In some embodiments, client devices communicate utilizing various data formats separate from a map tile. For instance, some client devices implement Assisted Global Positioning Satellites (A-GPS) and communicate with location services that utilize data formats conforming to location service protocols, such as, but not limited to, Radio Resource Location services Protocol (RRLP), TIA 801 for Code Division Multiple Access (CDMA), Radio Resource Control (RRC) position protocol, or LTE Positioning Protocol (LPP). Client devices may also receive GPS signals directly. Embodiments may also send data, with or without solicitation from a map service, identifying the client device's capabilities or attributes (e.g., hardware specifications or operating system version) or communication capabilities (e.g., device communication bandwidth as determined by wireless signal strength or wire or wireless network type).
Figure 54 illustrates one possible embodiment of an operating environment 5400 for a map service 5430 and client devices 5402a-5402c. In some embodiments, devices 5402a, 5402b, and 5402c communicate over one or more wire or wireless networks 5410. For example, wireless network 5410, such as a cellular network, can communicate with a wide area network (WAN) 5420, such as the Internet, by use of gateway 5414. A gateway 5414 in some embodiments provides a packet oriented mobile data service, such as General Packet Radio Service (GPRS), or other mobile data service allowing wireless networks to transmit data to other networks, such as wide area network 5420. Likewise, access device 5412 (e.g., IEEE 802.11g wireless access device) provides communication access to WAN 5420. Devices 5402a and 5402b can be any portable electronic or computing device capable of communicating with a map service. Device 5402c can be any non-portable electronic or computing device capable of communicating with a map service.
In some embodiments, both voice and data communications are established over wireless network 5410 and access device 5412. For instance, device 5402a can place and receive phone calls (e.g., using voice over Internet Protocol (VoIP) protocols), send and receive e-mail messages (e.g., using Simple Mail Transfer Protocol (SMTP) or Post Office Protocol 3 (POP3)), and retrieve electronic documents and/or streams, such as web pages, photographs, and videos, over wireless network 5410, gateway 5414, and WAN 5420 (e.g., using Transmission Control Protocol/Internet Protocol (TCP/IP) or User Datagram Protocol (UDP)). Likewise, in some implementations, devices 5402b and 5402c can place and receive phone calls, send and receive e-mail messages, and retrieve electronic documents over access device 5412 and WAN 5420. In various embodiments, any of the illustrated client devices may communicate with map service 5430 and/or other service(s) 5450 using a persistent connection established in accordance with one or more security protocols, such as the Secure Sockets Layer (SSL) protocol or the Transport Layer Security (TLS) protocol.
Devices 5402a and 5402b can also establish communications by other means. For example, wireless device 5402a can communicate with other wireless devices (e.g., other devices 5402b, cell phones, etc.) over the wireless network 5410. Likewise devices 5402a and 5402b can establish peer-to-peer communications 5440 (e.g., a personal area network) by use of one or more communication subsystems, such as Bluetooth® communication from Bluetooth Special Interest Group, Inc. of Kirkland, Washington. Device 5402c can also establish peer to peer communications with devices 5402a or 5402b (not shown). Other communication protocols and topologies can also be implemented. Devices 5402a and 5402b may also receive Global Positioning Satellite (GPS) signals from GPS satellites 5460.
Devices 5402a, 5402b, and 5402c can communicate with map service 5430 over the one or more wired and/or wireless networks 5410, or via access device 5412. For instance, map service 5430 can provide map service data to rendering devices 5402a, 5402b, and 5402c. Map service 5430 may also communicate with other services 5450 to obtain data to implement map services. Map service 5430 and other services 5450 may also receive GPS signals from GPS satellites 5460.
In various embodiments, map service 5430 and/or other service(s) 5450 are configured to process search requests from any of the client devices. Search requests may include but are not limited to queries for businesses, addresses, residential locations, points of interest, or some combination thereof. Map service 5430 and/or other service(s) 5450 may be configured to return results related to a variety of parameters including but not limited to a location entered into an address bar or other text entry field (including abbreviations and/or other shorthand notation), the current map view (e.g., the user may be viewing one location on the multifunction device while residing in another location), the current location of the user (e.g., in cases where the current map view does not include search results), and the current route (if any). In various embodiments, these parameters may affect the composition of the search results (and/or the ordering of the search results) based on different priority weightings. In various embodiments, the search results that are returned may be a subset of results selected based on specific criteria including but not limited to the quantity of times the search result (e.g., a particular point of interest) has been requested, a measure of quality associated with the search result (e.g., highest user or editorial review rating), and/or the volume of reviews for the search result (e.g., the number of times the search result has been reviewed or rated).
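The priority weighting described above can be illustrated with a small ranking sketch in Python. The particular weights, the normalization of ratings, and the data fields are assumptions chosen for clarity, not values specified by the embodiments.

from dataclasses import dataclass

@dataclass
class SearchResult:
    name: str
    request_count: int   # how often this result has been requested
    avg_rating: float    # user or editorial rating, 0.0-5.0
    review_count: int    # number of reviews or ratings

def rank_results(results, w_requests=0.5, w_rating=0.3, w_reviews=0.2):
    """Order search results by a weighted combination of the criteria
    mentioned above; weights are illustrative assumptions."""
    def score(r: SearchResult) -> float:
        return (w_requests * r.request_count
                + w_rating * (r.avg_rating / 5.0) * 100
                + w_reviews * r.review_count)
    return sorted(results, key=score, reverse=True)

if __name__ == "__main__":
    candidates = [
        SearchResult("Coffee Shop A", request_count=120, avg_rating=4.4, review_count=85),
        SearchResult("Coffee Shop B", request_count=40, avg_rating=4.9, review_count=300),
    ]
    for r in rank_results(candidates):
        print(r.name)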
In various embodiments, map service 5430 and/or other service(s) 5450 are configured to provide auto-complete search results that are displayed on the client device, such as within the mapping application. For instance, auto-complete search results may populate a portion of the screen as the user enters one or more search keywords on the multifunction device. In some cases, this feature may save the user time as the desired search result may be displayed before the user enters the full search query. In various embodiments, the auto complete search results may be search results found by the client on the client device (e.g., bookmarks or contacts), search results found elsewhere (e.g., from the Internet) by map service 5430 and/or other service(s) 5450, and/or some combination thereof. As is the case with commands, any of the search queries may be entered by the user via voice or through typing. The multifunction device may be configured to display search results graphically within any of the map display described herein. For instance, a pin or other graphical indicator may specify locations of search results as points of interest. In various embodiments, responsive to a user selection of one of these points of interest (e.g., a touch selection, such as a tap), the multifunction device is configured to display additional information about the selected point of interest including but not limited to ratings, reviews or review snippets, hours of operation, store status (e.g., open for business, permanently closed, etc.), and/or images of a storefront for the point of interest. In various embodiments, any of this information may be displayed on a graphical information card that is displayed in response to the user's selection of the point of interest.
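A minimal Python sketch of combining on-device and remote auto-complete results follows, under the assumptions that local matches are listed first and duplicates are removed. The helper remote_lookup is hypothetical and stands in for a call to map service 5430 and/or other service(s) 5450.

def autocomplete(prefix, local_items, remote_lookup, limit=8):
    """Combine on-device matches (e.g., bookmarks, contacts) with
    suggestions returned by a remote service, local results first,
    without duplicates."""
    prefix_lower = prefix.lower()
    local = [item for item in local_items if item.lower().startswith(prefix_lower)]
    remote = remote_lookup(prefix)
    merged, seen = [], set()
    for item in local + remote:
        key = item.lower()
        if key not in seen:
            seen.add(key)
            merged.append(item)
        if len(merged) == limit:
            break
    return merged

# Example with a stubbed remote service:
# print(autocomplete("cof", ["Coffee Shop A", "Colin (contact)"],
#                    lambda p: ["Coffee Shop B", "Coffee Roasters"]))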
In various embodiments, map service 5430 and/or other service(s) 5450 provide one or more feedback mechanisms to receive feedback from client devices 5402a-5402c. For instance, client devices may provide feedback on search results to map service 5430 and/or other service(s) 5450 (e.g., feedback specifying ratings, reviews, temporary or permanent business closures, errors, etc.); this feedback may be used to update information about points of interest in order to provide more accurate or more up-to-date search results in the future. In some embodiments, map service 5430 and/or other service(s) 5450 may provide testing information to the client device (e.g., an A/B test) to determine which search results are best. For instance, at random intervals, the client device may receive and present two search results to a user and allow the user to indicate the best result. The client device may report the test results to map service 5430 and/or other service(s) 5450 to improve future search results based on the chosen testing technique, such as an A/B test technique in which a baseline control sample is compared against a variety of single-variable test samples.
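The A/B feedback flow can be sketched as follows in Python. The payload fields, the randomized presentation order, and the choose callback are assumptions for illustration only.

import json
import random

def present_ab_test(result_a: str, result_b: str, choose) -> str:
    """Show two candidate search results in a random order, record which
    one the user picks, and return a feedback payload that a client could
    send back to the map service. Field names are illustrative."""
    first, second = random.sample([result_a, result_b], k=2)
    chosen = choose(first, second)   # e.g., a UI callback returning the tapped result
    return json.dumps({
        "test_type": "A/B",
        "control": result_a,
        "variant": result_b,
        "chosen": chosen,
    })

if __name__ == "__main__":
    # Stub "user" that always picks the first option shown.
    print(present_ab_test("Result A", "Result B", lambda a, b: a))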
While the invention has been described with reference to numerous specific details, one of ordinary skill in the art will recognize that the invention can be embodied in other specific forms without departing from the spirit of the invention. For instance, many of the figures illustrate various touch gestures (e.g., taps, double taps, swipe gestures, press and hold gestures, etc.). However, many of the illustrated operations could be performed via different touch gestures (e.g., a swipe instead of a tap, etc.) or by non-touch input (e.g., using a cursor controller, a keyboard, a touchpad/trackpad, a near-touch sensitive screen, etc.). In addition, a number of the figures conceptually illustrate processes. The specific operations of these processes may not be performed in the exact order shown and described. The specific operations may not be performed in one continuous series of operations, and different specific operations may be performed in different embodiments. Furthermore, the process could be implemented using several sub-processes, or as part of a larger macro process. Thus, one of ordinary skill in the art would understand that the invention is not to be limited by the foregoing illustrative details, but rather is to be defined by the appended claims.

Claims

We claim:
1. For an application that operates on a mobile device, a method comprising:
predicting a plurality of likely destinations for a vehicle to which the mobile device is connected based on data from a plurality of different sources; and
generating, for a display screen of the vehicle, a display comprising the plurality of likely destinations.
2. The method of claim 1 further comprising:
receiving a particular type of input through a control of the vehicle to select one of the likely destinations; and
entering a turn-by-turn navigation mode to the selected destination in response to the received input.
3. The method of claim 2, wherein the display screen is a touchscreen, wherein the particular type of input comprises tap input on the display screen of the vehicle.
4. The method of claim 1, wherein generating the display comprises generating a filmstrip view of the plurality of destinations, wherein the plurality of destinations are for viewing one at a time on the vehicle display screen.
5. The method of claim 1, wherein the generated display comprises, for each destination, (i) a route on a map from a current location of the mobile device to the destination and (ii) data regarding the destination.
6. The method of claim 5, wherein the data regarding a particular destination comprises at least one of a name for the particular destination, a source of the particular destination, an estimated travel time to the particular destination, and frequency information indicating how often the mobile device travels to the particular destination.
7. The method of claim 1, wherein generating the display comprises generating a set of stackable cards for the plurality of destinations.
8. For a mobile device, a method comprising:
predicting a set of likely destinations for a vehicle to which the mobile device is connected based on data from a plurality of different sources;
generating, for a display screen of the vehicle, a display for a first destination of the set of likely destinations;
receiving a particular type of input through a control of the vehicle to step through the set of likely destinations; and
in response to the input, generating, for the display screen of the vehicle, a display for a second destination of the set of likely destinations.
9. The method of claim 8, wherein the first destination is calculated to be a more likely destination than the second destination according to a set of heuristics.
10. The method of claim 8, wherein the set of likely destinations comprises at least one of a destination frequently traveled to by the mobile device at a current time of day, an event stored by a scheduling application of the mobile device, and an address received in an electronic message.
11. The method of claim 8, wherein the display screen of the vehicle is a touchscreen and the particular type of input comprises a gestural swipe input.
12. The method of claim 8, wherein the particular type of input comprises a rotation of a built-in control of the vehicle.
13. The method of claim 8, wherein the set of likely destinations has a first order, wherein the method further comprises:
updating the predicted set of likely destinations for the vehicle;
after updating the predicted set of likely destinations, receiving the particular type of input through the vehicle control while the first destination is displayed; and
in response to the input, displaying a third destination on the display screen of the vehicle.
14. The method of claim 8, wherein predicting the set of likely destinations comprises calculating probabilities for each of the destinations, wherein the first destination is the destination with the highest calculated probability.
15. A machine readable medium storing a program for execution by at least one processing unit of a mobile device, the program comprising sets of instructions for:
determining, for each of a set of different locations identified from a database on the mobile device, a likelihood value that the location is a next destination for a user of the device;
identifying a plurality of the different locations for which the likelihood value is above a threshold value; and
generating a display of the identified plurality of locations as selectable destinations.
16. The machine readable medium of claim 15, wherein the display is for output to a display screen of a vehicle's interactive communication system to which the mobile device connects.
17. The machine readable medium of claim 15, wherein the program comprises a set of instructions for displaying a portion of the generated display comprising information regarding a first location with a highest likelihood value, wherein the generated display further comprises a set of indicators to indicate a number of locations with likelihood values above the threshold value.
18. The machine readable medium of claim 17, wherein the portion of the generated display comprises (i) a map view with a route to the first location and (ii) a set of metadata regarding the first location.
19. A machine readable medium storing a program for execution by at least one processing unit of a mobile device, the program comprising:
a prediction engine for using a database of stored locations to identify a plurality of likely destinations for a vehicle to which the mobile device connects; and
a user interface generator for generating, for a display screen of the vehicle, a user interface display that comprises the plurality of identified likely destinations as selectable destinations.
20. The machine readable medium of claim 19, wherein the program further comprises an input processor for receiving input from controls of the vehicle to which the mobile device connects.
21. The machine readable medium of claim 20, wherein the user interface display is for displaying the identified likely destinations one at a time, wherein the input is for removing a first identified likely destination from the vehicle display screen and causing a second identified likely destination to be displayed on the vehicle display screen.
22. The machine readable medium of claim 19, wherein the program further comprises a second user interface generator for generating and outputting a second user interface display for the mobile device.
23. A machine readable medium storing an application for execution by at least one processing unit of a mobile device, the application comprising sets of instructions for:
identifying a particular location in response to a search request received from a user through a graphical user interface of the mobile device prior to connection of the mobile device to an interactive vehicle communication system;
detecting a subsequent connection of the mobile device to an interactive vehicle communication system and a selection of the application for display on a vehicle display screen;
receiving a selection through a control of the vehicle to view a set of recent destinations; and
generating for the vehicle display screen a display that comprises the particular location as one of the set of recent destinations.
24. The machine readable medium of claim 23, wherein the set of recent destinations are ordered in the display based on a likelihood for each of the recent destinations being a next destination for the vehicle.
25. The machine readable medium of claim 23, wherein the set of recent destinations are displayed on the vehicle display screen, wherein the application further comprises sets of instructions for:
receiving a second selection through a control of the vehicle to select one of the recent destinations; and
upon receiving the second selection, generating for the vehicle display screen a map view centered on the selected destination.
26. The machine readable medium of claim 25, wherein the application further comprises a set of instructions for, upon receiving the second selection, displaying on the mobile device a second map view centered on the selected destination.
EP14725931.1A 2013-03-15 2014-03-12 User interface for displaying predicted destinations Ceased EP2946172A1 (en)

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
US201361800908P 2013-03-15 2013-03-15
US201361832853P 2013-06-08 2013-06-08
US201361832818P 2013-06-08 2013-06-08
US201361832928P 2013-06-09 2013-06-09
US201361875753P 2013-09-10 2013-09-10
US14/081,911 US10655979B2 (en) 2013-06-08 2013-11-15 User interface for displaying predicted destinations
PCT/US2014/025121 WO2014151155A1 (en) 2013-03-15 2014-03-12 User interface for displaying predicted destinations

Publications (1)

Publication Number Publication Date
EP2946172A1 true EP2946172A1 (en) 2015-11-25

Family

ID=52707448

Family Applications (4)

Application Number Title Priority Date Filing Date
EP16179680.0A Active EP3101392B1 (en) 2013-03-15 2014-03-06 Mapping application with turn-by-turn navigation mode for output to vehicle display
EP14725931.1A Ceased EP2946172A1 (en) 2013-03-15 2014-03-12 User interface for displaying predicted destinations
EP21174530.2A Pending EP3889546A1 (en) 2013-03-15 2014-03-12 Mapping application search function
EP14722422.4A Active EP2946171B1 (en) 2013-03-15 2014-03-12 Mapping application search function

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP16179680.0A Active EP3101392B1 (en) 2013-03-15 2014-03-06 Mapping application with turn-by-turn navigation mode for output to vehicle display

Family Applications After (2)

Application Number Title Priority Date Filing Date
EP21174530.2A Pending EP3889546A1 (en) 2013-03-15 2014-03-12 Mapping application search function
EP14722422.4A Active EP2946171B1 (en) 2013-03-15 2014-03-12 Mapping application search function

Country Status (5)

Country Link
EP (4) EP3101392B1 (en)
CN (8) CN110068352B (en)
AU (5) AU2014235246B2 (en)
TW (2) TWI572849B (en)
WO (2) WO2014151155A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9857193B2 (en) 2013-06-08 2018-01-02 Apple Inc. Mapping application with turn-by-turn navigation mode for output to vehicle display
US10371526B2 (en) 2013-03-15 2019-08-06 Apple Inc. Warning for frequently traveled trips based on traffic
US10579939B2 (en) 2013-03-15 2020-03-03 Apple Inc. Mobile device with predictive routing engine
US10769217B2 (en) 2013-06-08 2020-09-08 Apple Inc. Harvesting addresses

Families Citing this family (108)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9318108B2 (en) 2010-01-18 2016-04-19 Apple Inc. Intelligent automated assistant
US8977255B2 (en) 2007-04-03 2015-03-10 Apple Inc. Method and system for operating a multi-function portable electronic device using voice-activation
US8676904B2 (en) 2008-10-02 2014-03-18 Apple Inc. Electronic devices with voice command and contextual data processing capabilities
US10255566B2 (en) 2011-06-03 2019-04-09 Apple Inc. Generating and processing task items that represent tasks to perform
US10422654B2 (en) 2011-06-03 2019-09-24 Apple Inc. Devices and methods for comparing and selecting alternative navigation routes
US10417037B2 (en) 2012-05-15 2019-09-17 Apple Inc. Systems and methods for integrating third party services with a digital assistant
EP2954514B1 (en) 2013-02-07 2021-03-31 Apple Inc. Voice trigger for a digital assistant
US10652394B2 (en) 2013-03-14 2020-05-12 Apple Inc. System and method for processing voicemail
US10748529B1 (en) 2013-03-15 2020-08-18 Apple Inc. Voice activated device for use with a voice-based digital assistant
US10176167B2 (en) 2013-06-09 2019-01-08 Apple Inc. System and method for inferring user intent from speech inputs
US9715875B2 (en) 2014-05-30 2017-07-25 Apple Inc. Reducing the need for manual start/end-pointing and trigger phrases
US10170123B2 (en) 2014-05-30 2019-01-01 Apple Inc. Intelligent assistant for home automation
US9966065B2 (en) 2014-05-30 2018-05-08 Apple Inc. Multi-command single utterance input method
US9338493B2 (en) 2014-06-30 2016-05-10 Apple Inc. Intelligent automated assistant for TV user interactions
US9958278B2 (en) * 2015-01-14 2018-05-01 Telenav, Inc. Navigation system with scalable display mechanism and method of operation thereof
US9390452B1 (en) 2015-01-28 2016-07-12 Allstate Insurance Company Risk unit based policies
US10846799B2 (en) 2015-01-28 2020-11-24 Arity International Limited Interactive dashboard display
US9361599B1 (en) 2015-01-28 2016-06-07 Allstate Insurance Company Risk unit based policies
US10817950B1 (en) 2015-01-28 2020-10-27 Arity International Limited Usage-based policies
US9886953B2 (en) 2015-03-08 2018-02-06 Apple Inc. Virtual assistant activation
US10097973B2 (en) 2015-05-27 2018-10-09 Apple Inc. Systems and methods for proactively identifying and surfacing relevant content on a touch-sensitive device
US10200824B2 (en) 2015-05-27 2019-02-05 Apple Inc. Systems and methods for proactively identifying and surfacing relevant content on a touch-sensitive device
US20160378747A1 (en) 2015-06-29 2016-12-29 Apple Inc. Virtual assistant for media playback
US10747498B2 (en) 2015-09-08 2020-08-18 Apple Inc. Zero latency digital assistant
US10671428B2 (en) 2015-09-08 2020-06-02 Apple Inc. Distributed personal assistant
US10740384B2 (en) 2015-09-08 2020-08-11 Apple Inc. Intelligent automated assistant for media search and playback
US10331312B2 (en) 2015-09-08 2019-06-25 Apple Inc. Intelligent automated assistant in a media environment
US10691473B2 (en) 2015-11-06 2020-06-23 Apple Inc. Intelligent automated assistant in a messaging environment
US10956666B2 (en) 2015-11-09 2021-03-23 Apple Inc. Unconventional virtual assistant interactions
CN106855417A (en) * 2015-12-08 2017-06-16 腾讯科技(深圳)有限公司 The processing method of digital map navigation information, location equipment and alignment system
US11857842B2 (en) 2015-12-21 2024-01-02 Suunto Oy Apparatus and exercising device
US11838990B2 (en) 2015-12-21 2023-12-05 Suunto Oy Communicating sensor data in wireless communication systems
US10223066B2 (en) 2015-12-23 2019-03-05 Apple Inc. Proactive assistance based on dialog communication between devices
CN105677749B (en) * 2015-12-31 2020-02-07 北京京东尚科信息技术有限公司 Page display method and device
EP3362920A4 (en) * 2016-01-14 2018-12-05 Samsung Electronics Co., Ltd. Method and system for automatically managing content in an electronic device
CN105955459A (en) * 2016-04-21 2016-09-21 深圳市绿地蓝海科技有限公司 Method for controlling vehicle electronic device, and device
US10586535B2 (en) 2016-06-10 2020-03-10 Apple Inc. Intelligent digital assistant in a multi-tasking environment
DK201670540A1 (en) 2016-06-11 2018-01-08 Apple Inc Application integration with a digital assistant
DK179415B1 (en) 2016-06-11 2018-06-14 Apple Inc Intelligent device arbitration and control
US11216857B2 (en) * 2016-06-23 2022-01-04 Stubhub, Inc. Weather enhanced graphical preview for an online ticket marketplace
CN107543555B (en) * 2016-06-27 2020-12-04 阿里巴巴(中国)有限公司 Route planning method and device
CN107943439B (en) * 2016-10-13 2021-09-07 斑马智行网络(香港)有限公司 Interface moving method and device, intelligent terminal, server and operating system
US11900309B2 (en) * 2016-11-21 2024-02-13 Ford Global Technologies, Llc Item delivery to an unattended vehicle
US10417492B2 (en) * 2016-12-22 2019-09-17 Microsoft Technology Licensing, Llc Conversion of static images into interactive maps
TWI615776B (en) * 2016-12-23 2018-02-21 李雨暹 Method and system for creating virtual message onto a moving object and searching the same
US11249616B2 (en) * 2017-01-17 2022-02-15 Google Llc Assistive screenshots
US10429188B2 (en) * 2017-03-30 2019-10-01 Crown Equipment Corporation Warehouse mapping tools
CN106982094A (en) * 2017-04-09 2017-07-25 深圳市先通网信息科技有限公司 Visible light communication method and system
US10726832B2 (en) 2017-05-11 2020-07-28 Apple Inc. Maintaining privacy of personal information
DK180048B1 (en) 2017-05-11 2020-02-04 Apple Inc. MAINTAINING THE DATA PROTECTION OF PERSONAL INFORMATION
DK179496B1 (en) 2017-05-12 2019-01-15 Apple Inc. USER-SPECIFIC Acoustic Models
DK201770428A1 (en) 2017-05-12 2019-02-18 Apple Inc. Low-latency intelligent automated assistant
DK179745B1 (en) 2017-05-12 2019-05-01 Apple Inc. SYNCHRONIZATION AND TASK DELEGATION OF A DIGITAL ASSISTANT
US20180336892A1 (en) 2017-05-16 2018-11-22 Apple Inc. Detecting a trigger of a digital assistant
US10303715B2 (en) 2017-05-16 2019-05-28 Apple Inc. Intelligent automated assistant for media exploration
WO2018222510A2 (en) 2017-06-02 2018-12-06 Apple Inc. Venues map application and system
CN109213541A (en) 2017-06-29 2019-01-15 阿里巴巴集团控股有限公司 APP application shows the method, apparatus and electronic equipment at interface
JP6785871B2 (en) * 2017-08-04 2020-11-18 グーグル エルエルシー Providing navigation instructions
JP6780611B2 (en) * 2017-08-25 2020-11-04 トヨタ自動車株式会社 Autonomous driving device
CN112985448A (en) * 2017-08-31 2021-06-18 北京嘀嘀无限科技发展有限公司 Map route display method, map route display system and computer device
JP6922582B2 (en) * 2017-09-15 2021-08-18 トヨタ自動車株式会社 Travel control device, travel control system, and travel control method
CN108073687B (en) * 2017-11-17 2020-09-08 阿里巴巴集团控股有限公司 Random walk, random walk method based on cluster, random walk device and equipment
TWI647674B (en) * 2018-02-09 2019-01-11 光陽工業股份有限公司 Navigation method and system for presenting different navigation pictures
US20190286126A1 (en) * 2018-03-14 2019-09-19 GM Global Technology Operations LLC Remote end-point drop-off navigation guidance
US10818288B2 (en) 2018-03-26 2020-10-27 Apple Inc. Natural assistant interaction
US11145294B2 (en) 2018-05-07 2021-10-12 Apple Inc. Intelligent automated assistant for delivering content from user experiences
US10928918B2 (en) 2018-05-07 2021-02-23 Apple Inc. Raise to speak
CN110553657B (en) * 2018-06-01 2023-10-27 江苏瑞焕激光科技有限公司 Navigation method and system based on chat robot
US10892996B2 (en) 2018-06-01 2021-01-12 Apple Inc. Variable latency device coordination
DK180639B1 (en) 2018-06-01 2021-11-04 Apple Inc DISABILITY OF ATTENTION-ATTENTIVE VIRTUAL ASSISTANT
DK179822B1 (en) 2018-06-01 2019-07-12 Apple Inc. Voice interaction at a primary device to access call functionality of a companion device
CN111630413A (en) * 2018-06-05 2020-09-04 谷歌有限责任公司 Application-specific user interaction based on confidence
CN109269521A (en) * 2018-09-26 2019-01-25 维沃移动通信有限公司 A kind of air navigation aid and terminal device
US11462215B2 (en) 2018-09-28 2022-10-04 Apple Inc. Multi-modal inputs for voice commands
EP3788321A1 (en) * 2018-11-07 2021-03-10 Google LLC Providing navigation instructions to one device in view of another device
US11380436B2 (en) 2018-11-21 2022-07-05 GE Precision Healthcare LLC Workflow predictive analytics engine
US11309076B2 (en) 2018-11-21 2022-04-19 GE Precision Healthcare LLC Workflow predictive analytics engine
CN109788438A (en) * 2019-02-11 2019-05-21 胡金星 A kind of intelligent navigation car networking system based on big data
US11348573B2 (en) 2019-03-18 2022-05-31 Apple Inc. Multimodality in digital assistant systems
US20220177067A1 (en) * 2019-03-27 2022-06-09 Tvs Motor Company Limited Smart connect instrument cluster
CN110081901B (en) * 2019-04-08 2021-04-06 百度在线网络技术(北京)有限公司 Intersection display method, device, equipment and computer readable storage medium
CN110779541B (en) * 2019-04-10 2021-11-23 北京嘀嘀无限科技发展有限公司 Display method and system of steering arrow
US11307752B2 (en) 2019-05-06 2022-04-19 Apple Inc. User configurable task triggers
DK201970509A1 (en) 2019-05-06 2021-01-15 Apple Inc Spoken notifications
CN110099368A (en) * 2019-05-15 2019-08-06 安徽三联学院 A kind of vehicle realtime communication system and method based on car networking
US11140099B2 (en) 2019-05-21 2021-10-05 Apple Inc. Providing message response suggestions
DK201970511A1 (en) 2019-05-31 2021-02-15 Apple Inc Voice identification in digital assistant systems
DK180129B1 (en) 2019-05-31 2020-06-02 Apple Inc. User activity shortcut suggestions
CN111854789B (en) * 2019-05-31 2022-06-03 北京嘀嘀无限科技发展有限公司 Navigation display method and system
US11468890B2 (en) 2019-06-01 2022-10-11 Apple Inc. Methods and user interfaces for voice-based control of electronic devices
CN110231049B (en) * 2019-07-15 2023-03-21 腾讯科技(深圳)有限公司 Navigation route display method, device, terminal and storage medium
JP2021026188A (en) * 2019-08-08 2021-02-22 本田技研工業株式会社 Communication control system, notification control method, and communication control program
DE102019127078A1 (en) * 2019-10-09 2021-04-15 Bayerische Motoren Werke Aktiengesellschaft Method for preparing a motor vehicle for a trip
US11756418B2 (en) * 2019-10-18 2023-09-12 Honda Motor Co., Ltd. Device, method, and storage medium
TW202142999A (en) * 2019-12-31 2021-11-16 芬蘭商亞瑪芬體育數字服務公司 Apparatus and method for presenting thematic maps
CN113543078A (en) * 2020-04-14 2021-10-22 大众汽车股份公司 Vehicle machine, mobile terminal and method for controlling interaction between mobile terminal and vehicle machine
US11043220B1 (en) 2020-05-11 2021-06-22 Apple Inc. Digital assistant hardware abstraction
US11061543B1 (en) 2020-05-11 2021-07-13 Apple Inc. Providing relevant data items based on context
US11755276B2 (en) 2020-05-12 2023-09-12 Apple Inc. Reducing description length based on confidence
US11490204B2 (en) 2020-07-20 2022-11-01 Apple Inc. Multi-device audio adjustment coordination
US11438683B2 (en) 2020-07-21 2022-09-06 Apple Inc. User identification using headphones
KR20220049407A (en) 2020-10-14 2022-04-21 삼성전자주식회사 Display apparatus and control method thereof
CN112764703A (en) * 2021-01-22 2021-05-07 领悦数字信息技术有限公司 Display method and device for vehicle and storage medium
US11573676B2 (en) 2021-03-30 2023-02-07 Honda Motor Co., Ltd. Method and system for managing contextual views within a user interface
CN116007642A (en) * 2021-10-22 2023-04-25 华为终端有限公司 Destination navigation method and device
CN114001748B (en) * 2021-10-28 2024-03-22 维沃移动通信有限公司 Navigation route display method, device, equipment and medium
CN114217754A (en) * 2021-11-05 2022-03-22 维沃移动通信有限公司 Screen projection control method and device, electronic equipment and storage medium
CN114267194B (en) * 2021-12-16 2023-01-06 青岛创智融信数字科技集团有限公司 Parking space intelligent management method and system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011084156A2 (en) * 2010-01-06 2011-07-14 Apple Inc. Device, method, and graphical user interface for mapping directions between search results

Family Cites Families (70)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5289572A (en) * 1989-10-24 1994-02-22 Mitsubishi Denki Kabushiki Kaisha Electronic map combined with user service information
JPH10281790A (en) * 1997-04-08 1998-10-23 Aisin Aw Co Ltd Route search device, navigation apparatus and medium on which computer program for navigation processing is stored
JP3896728B2 (en) * 1999-06-23 2007-03-22 トヨタ自動車株式会社 Portable terminal device and in-vehicle information processing device
DE19956112A1 (en) * 1999-11-22 2001-05-23 Mannesmann Vdo Ag Navigation system with improved object selection
JP3743312B2 (en) * 2001-06-07 2006-02-08 株式会社デンソー Map display device, program
JP3860475B2 (en) * 2002-01-11 2006-12-20 株式会社ナビタイムジャパン In-vehicle / portable map display device
US20040160342A1 (en) * 2003-02-13 2004-08-19 Curley Maeve Sarah Remote display system and methods for navigational applications
US7272497B2 (en) * 2003-03-24 2007-09-18 Fuji Jukogyo Kabushiki Kaisha Vehicle navigation system with multi-use display
JP2005004527A (en) * 2003-06-12 2005-01-06 Toyota Motor Corp Method for discriminating screen format, screen format discrimination system and vehicle
US20060155431A1 (en) * 2004-03-04 2006-07-13 Berg Frederick J Vehicle information system with remote communicator
CA2559726C (en) * 2004-03-24 2015-10-20 A9.Com, Inc. System and method for displaying images in an online directory
EP1871075B1 (en) * 2004-12-24 2015-03-04 IZUTSU, Masahiro Mobile information communication apparatus, connection unit for mobile information communication apparatus, and external input/output unit for mobile information communication apparatus
WO2007056450A2 (en) * 2005-11-07 2007-05-18 Google Inc. Local search and mapping for mobile devices
US7634463B1 (en) * 2005-12-29 2009-12-15 Google Inc. Automatically generating and maintaining an address book
US7506275B2 (en) * 2006-02-28 2009-03-17 Microsoft Corporation User interface navigation
GB0619065D0 (en) * 2006-09-27 2006-11-08 Tomtom Int Bv Portable navigation device with wireless interface
WO2008037472A2 (en) * 2006-09-27 2008-04-03 Tom Tom International B.V. Portable navigation device
US9430945B2 (en) * 2006-12-20 2016-08-30 Johnson Controls Technology Company System and method for providing route calculation and information to a vehicle
JP5676108B2 (en) * 2006-12-20 2015-02-25 ジョンソン コントロールズ テクノロジー カンパニーJohnson Controls Technology Company Remote display reproduction system
JP5029874B2 (en) * 2006-12-28 2012-09-19 富士通株式会社 Information processing apparatus, information processing method, and information processing program
CN101573590A (en) * 2007-01-10 2009-11-04 通腾科技股份有限公司 Navigation device and method for displaying navigation information
US9360337B2 (en) * 2007-06-20 2016-06-07 Golba Llc Navigation system and methods for route navigation
US8385946B2 (en) * 2007-06-28 2013-02-26 Apple Inc. Disfavored route progressions or locations
WO2009073806A2 (en) * 2007-12-05 2009-06-11 Johnson Controls Technology Company Vehicle user interface systems and methods
JP4390006B2 (en) * 2008-04-01 2009-12-24 トヨタ自動車株式会社 Car navigation device, portable information terminal, and car navigation system
US9870130B2 (en) 2008-05-13 2018-01-16 Apple Inc. Pushing a user interface to a remote device
US20100293462A1 (en) 2008-05-13 2010-11-18 Apple Inc. Pushing a user interface to a remote device
US20090284476A1 (en) * 2008-05-13 2009-11-19 Apple Inc. Pushing a user interface to a remote device
US8970647B2 (en) 2008-05-13 2015-03-03 Apple Inc. Pushing a graphical user interface to a remote device with display rules provided by the remote device
US9311115B2 (en) 2008-05-13 2016-04-12 Apple Inc. Pushing a graphical user interface to a remote device with display rules provided by the remote device
CA2720198A1 (en) * 2008-05-29 2009-12-03 Tomtom International B.V. Portable navigation device, portable electronic communications apparatus, and method of generating radio data system information therefor
KR101039312B1 (en) * 2008-07-11 2011-06-08 삼성전자주식회사 Navigation system and method for providing navigation system using mobile terminal
US8078397B1 (en) * 2008-08-22 2011-12-13 Boadin Technology, LLC System, method, and computer program product for social networking utilizing a vehicular assembly
RU2011122657A (en) * 2008-11-06 2012-12-20 Томтом Интернэшнл Б.В. DATA COLLECTION DEVICE, DATA COLLECTION SYSTEM AND DATA COLLECTION METHOD
US8386163B2 (en) * 2008-11-21 2013-02-26 Gary Severson GPS navigation code system
EP2380096B1 (en) * 2008-11-25 2018-12-26 Apple Inc. Computer-implemented method for providing location related content to a mobile device
US20100204914A1 (en) * 2009-02-11 2010-08-12 Telmap Ltd Active widgets for mobile navigation systems
CN101858749A (en) * 2009-04-08 2010-10-13 阿尔派株式会社 Navigation device
JP5358277B2 (en) * 2009-05-07 2013-12-04 アルパイン株式会社 Navigation system and mobile telephone
KR101602221B1 (en) * 2009-05-19 2016-03-10 엘지전자 주식회사 Mobile terminal system and control method thereof
US8700329B2 (en) * 2009-05-29 2014-04-15 Mitsubishi Electric Corporation Navigation apparatus
US8681106B2 (en) * 2009-06-07 2014-03-25 Apple Inc. Devices, methods, and graphical user interfaces for accessibility using a touch-sensitive surface
US20110001819A1 (en) * 2009-07-02 2011-01-06 Sanyo Electric Co., Ltd. Image Processing Apparatus
CN201555592U (en) * 2009-08-19 2010-08-18 深圳市金研微科技有限公司 Navigating instrument capable of displaying mobile phone webpage information
US20110082620A1 (en) * 2009-10-05 2011-04-07 Tesla Motors, Inc. Adaptive Vehicle User Interface
CN102667403B (en) * 2009-12-02 2015-01-28 三菱电机株式会社 Navigation device
US8396888B2 (en) * 2009-12-04 2013-03-12 Google Inc. Location-based searching using a search area that corresponds to a geographical location of a computing device
US20110185390A1 (en) * 2010-01-27 2011-07-28 Robert Bosch Gmbh Mobile phone integration into driver information systems
US20110210922A1 (en) * 2010-02-26 2011-09-01 Research In Motion Limited Dual-screen mobile device
US10134011B2 (en) * 2010-03-02 2018-11-20 Nokia Technologies Oy Methods and apparatuses for facilitating location selection
US9267813B2 (en) * 2010-03-09 2016-02-23 Honda Motor Co., Ltd. On-board system working a mobile device
US8886763B2 (en) * 2010-04-30 2014-11-11 Nokia Corporation Method and apparatus for providing interoperability between devices
US8398182B2 (en) * 2010-06-07 2013-03-19 Gse Technologies, Llc Lightweight wear ring
US8749499B2 (en) * 2010-06-08 2014-06-10 Sap Ag Touch screen for bridging multi and/or single touch points to applications
US8473289B2 (en) * 2010-08-06 2013-06-25 Google Inc. Disambiguating input based on context
KR101219933B1 (en) * 2010-09-13 2013-01-08 현대자동차주식회사 System for controlling device in vehicle using augmented reality and thereof method
JP5859969B2 (en) * 2010-09-17 2016-02-16 クラリオン株式会社 In-vehicle information system, in-vehicle device, information terminal
TWI426769B (en) * 2010-10-07 2014-02-11 Mitac Int Corp Handheld mobile communication device providing dual-screen navigation
US9080890B2 (en) * 2010-11-05 2015-07-14 Modena Enterprises, Llc Method and system for obtaining destination information from proximate devices based on time and heading information
KR101034426B1 (en) * 2010-11-11 2011-05-12 김문태 The mobile phone cooperating system where the complex function is had
JP2012122776A (en) * 2010-12-06 2012-06-28 Fujitsu Ten Ltd In-vehicle device
US9086297B2 (en) * 2011-01-20 2015-07-21 Telenav, Inc. Navigation system having maneuver attempt training mechanism and method of operation thereof
JP5348164B2 (en) * 2011-03-23 2013-11-20 株式会社デンソー Vehicle device and external device screen display system
CN103493464B (en) * 2011-04-15 2016-05-11 歌乐株式会社 Information terminal, inter-vehicle information system, car-mounted device
US9163952B2 (en) * 2011-04-15 2015-10-20 Microsoft Technology Licensing, Llc Suggestive mapping
WO2012164333A1 (en) * 2011-05-31 2012-12-06 Mobile Devices Ingenierie System and method to search, collect and present various geolocated information
CN102622233A (en) * 2012-03-07 2012-08-01 山东大学 System and method for automatically generating user interface applicable to certain interactive terminal equipment
CN102663842B (en) * 2012-04-09 2015-03-11 李凯 Method for mobile device to control multiple external devices
CN102759359A (en) * 2012-07-18 2012-10-31 深圳市凯立德科技股份有限公司 Positioning navigation apparatus interacted with mobile terminal and interaction method of apparatus
CN102826047B (en) * 2012-08-28 2015-04-29 广东翼卡车联网服务有限公司 Method and system for operating smart mobile phone through automobile touch screen

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011084156A2 (en) * 2010-01-06 2011-07-14 Apple Inc. Device, method, and graphical user interface for mapping directions between search results

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
DIEWALD S ET AL: "Mobile Device Integration and Interaction in the Automotive Domain", 29 November 2011 (2011-11-29), pages 1 - 4, XP002732023, Retrieved from the Internet <URL:http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.347.7130> [retrieved on 20141104] *
See also references of WO2014151155A1 *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10371526B2 (en) 2013-03-15 2019-08-06 Apple Inc. Warning for frequently traveled trips based on traffic
US10579939B2 (en) 2013-03-15 2020-03-03 Apple Inc. Mobile device with predictive routing engine
US11506497B2 (en) 2013-03-15 2022-11-22 Apple Inc. Warning for frequently traveled trips based on traffic
US11934961B2 (en) 2013-03-15 2024-03-19 Apple Inc. Mobile device with predictive routing engine
US9857193B2 (en) 2013-06-08 2018-01-02 Apple Inc. Mapping application with turn-by-turn navigation mode for output to vehicle display
US10655979B2 (en) 2013-06-08 2020-05-19 Apple Inc. User interface for displaying predicted destinations
US10677606B2 (en) 2013-06-08 2020-06-09 Apple Inc. Mapping application with turn-by-turn navigation mode for output to vehicle display
US10718627B2 (en) 2013-06-08 2020-07-21 Apple Inc. Mapping application search function
US10769217B2 (en) 2013-06-08 2020-09-08 Apple Inc. Harvesting addresses
US11874128B2 (en) 2013-06-08 2024-01-16 Apple Inc. Mapping application with turn-by-turn navigation mode for output to vehicle display

Also Published As

Publication number Publication date
CN105143828A (en) 2015-12-09
AU2014235245A1 (en) 2015-09-24
WO2014151155A1 (en) 2014-09-25
CN111337048B (en) 2023-03-28
EP3889546A1 (en) 2021-10-06
CN105051494B (en) 2018-01-26
CN110068352B (en) 2023-06-09
WO2014151153A3 (en) 2014-11-27
CN111289010A (en) 2020-06-16
WO2014151153A4 (en) 2015-01-22
EP3101392B1 (en) 2021-12-15
EP2946171B1 (en) 2021-08-04
WO2014151153A2 (en) 2014-09-25
TWI532976B (en) 2016-05-11
CN109631935A (en) 2019-04-16
CN105051496A (en) 2015-11-11
CN105143828B (en) 2019-09-13
AU2014235244A1 (en) 2015-09-24
CN105051494A (en) 2015-11-11
CN110068352A (en) 2019-07-30
EP2946171A2 (en) 2015-11-25
EP3101392A1 (en) 2016-12-07
CN105051496B (en) 2020-01-17
AU2014235244B2 (en) 2017-04-13
WO2014151155A4 (en) 2014-12-04
AU2017261594A1 (en) 2017-12-07
AU2014235245B2 (en) 2017-03-02
AU2014235246B2 (en) 2017-08-17
CN105191387B (en) 2019-03-26
AU2014235248B2 (en) 2017-05-11
AU2014235248A1 (en) 2015-09-24
CN105191387A (en) 2015-12-23
TWI572849B (en) 2017-03-01
TW201447233A (en) 2014-12-16
CN111337048A (en) 2020-06-26
TW201447234A (en) 2014-12-16
AU2017261594B2 (en) 2019-08-15
AU2014235246A1 (en) 2015-09-24

Similar Documents

Publication Publication Date Title
US11874128B2 (en) Mapping application with turn-by-turn navigation mode for output to vehicle display
AU2017261594B2 (en) Mapping application search function
EP2778615B1 (en) Mapping Application with Several User Interfaces
EP2778614B1 (en) Mapping application with turn-by-turn navigation mode for output to vehicle display
EP3309516A1 (en) Voice instructions during navigation

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20150820

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20161104

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: APPLE INC.

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20180515