TW201727597A - Method, machine-readable medium and electronic device for presenting a map - Google Patents

Method, machine-readable medium and electronic device for presenting a map

Info

Publication number
TW201727597A
TW201727597A (Application TW106113163A)
Authority
TW
Taiwan
Prior art keywords
map
user
mapping application
search
application
Application number
TW106113163A
Other languages
Chinese (zh)
Other versions
TWI625706B (en)
Inventor
福斯塔爾史考特
A 摩爾布萊德福
歐斯馬塞 凡
布魯門伯格克里斯多福
沃凱諾伊曼紐
A 勞爾布瑞迪
S 皮蒙特派翠克
B 包爾馬修
Original Assignee
蘋果公司 (Apple Inc.)
Priority claimed from US 13/632,102 (US9886794B2)
Priority claimed from US 13/632,124 (US9367959B2)
Application filed by Apple Inc.
Publication of TW201727597A
Application granted
Publication of TWI625706B


Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00: Reducing energy consumption in communication networks
    • Y02D30/70: Reducing energy consumption in communication networks in wireless communication networks

Abstract

A device that includes at least one processing unit and stores a multi-mode mapping program for execution by the at least one processing unit is described. The program includes a user interface (UI). The UI includes a display area for displaying a two-dimensional (2D) presentation of a map or a three-dimensional (3D) presentation of the map. The UI also includes a selectable 3D control for directing the program to transition between the 2D and 3D presentations; the control has a first appearance when the display area displays the 2D presentation of the map and a second appearance when the display area displays the 3D presentation of the map.

Description

Method, machine-readable medium, and electronic device for presenting a map

The present invention relates generally to mapping applications.

With the proliferation of mobile devices such as smartphones, users are enjoying numerous applications of numerous kinds that can run on their devices. One popular type of such application is the mapping and navigation application, which allows a user to browse a map and obtain route directions. Despite their popularity, these mapping and navigation applications have shortcomings in their user interfaces and features that inconvenience users.

Some embodiments of the present invention provide an integrated mapping application that includes a number of useful modalities, including
location browsing, map searching, route identification, and route navigation operations. These operations are complex tasks that, while complementary, each have very different user interface (UI) requirements. In some embodiments, the mapping application has a novel UI design that addresses the difficult challenge of integrating the controls needed for each of its different modalities into a seamless and cohesive application user interface. The novel UI design and the application are defined to be executed by a device that has a touch-sensitive screen that displays the application's output. In some embodiments, this device has a multi-touch interface that allows a user to interact with the application by providing touch and gestural input through the screen.

In some embodiments, one goal of the mapping application's UI is to keep on-screen controls to a minimum in order to display as much of the interactive map as possible. One element of this design is a cluster of buttons that floats on top of the content itself rather than occupying the full width of the screen, as is typical of phone UI elements. In addition, this cluster adapts to the task at hand, adjusting its contents in an animated fashion as a user moves between different modalities (e.g., between browsing, searching, routing, and navigating). This common element with an adaptive nature enables the mapping application to optimize for different tasks while maintaining a consistent look and interaction model as it transitions between those tasks.

In some embodiments, one example of an adaptive floating control is the list control. When there is an opportunity to display a list of items (be it a list of instructions in a route, or a list of search results when multiple results are found for a given query), the mapping application of some embodiments displays a list button as one of the floating controls. In some embodiments, tapping the list button brings up a modal list view.
Having a modal list view keeps the mapping application simple and keeps the map front and center. In some embodiments, the list view itself is adapted and optimized for the type of list being displayed: search results are displayed with star ratings (when available), while route steps include instruction arrows.

Another floating control is the control for viewing the map or inspecting a route in three dimensions (3D). The mapping application provides the 3D control as a quick mechanism for entering and exiting 3D. This control also serves as (1) an indicator that the current view is a 3D view, and (2) an indicator that a 3D perspective is available for a given map view (e.g., a zoomed-out map view might not have a 3D view available).

In addition to the 3D control, the mapping application of some embodiments allows a user to transition a map view from a two-dimensional (2D) presentation to a 3D presentation through gestural input to the device's multi-touch interface. For instance, through a two-finger gestural input, the user can be made to experience "pushing down" a 2D map view into a 3D map view, or "pulling up" a 3D map view into a 2D map view. This can also be thought of as pulling a virtual camera down from a 2D (directly overhead) view into a 3D (side-angle) view through the two-finger gesture. As further described below, in some embodiments the 3D view of the map is generated by rendering the map view from a particular position that can conceptually be thought of as the position of a virtual camera capturing the map view.

In some embodiments, through gestural inputs, the mapping application also allows a user to rotate a 2D or 3D map. In some embodiments, the mapping application is a vector mapping application that allows direct manipulation of the map (such as rotation and 2D/3D manipulation) while browsing it. However, some of the effects on the map can be disorienting.
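The "push down"/"pull up" two-finger gesture described above amounts to mapping a drag distance onto a virtual-camera tilt angle between the straight-down 2D view and the side-angle 3D view. The following is a minimal sketch of that mapping; the function name, thresholds, and linear interpolation are illustrative assumptions, not details from the patent:

```python
def camera_tilt_for_drag(drag_dy: float, max_drag: float = 200.0,
                         max_tilt_deg: float = 45.0) -> float:
    """Map a two-finger vertical drag distance (in pixels) to a camera
    tilt angle: 0 degrees is the straight-down 2D view, max_tilt_deg is
    the full 3D side-angle perspective. Dragging past max_drag saturates,
    and negative drags (pulling up) clamp back to the 2D view."""
    fraction = max(0.0, min(1.0, drag_dy / max_drag))  # clamp to [0, 1]
    return fraction * max_tilt_deg
```

Pushing down halfway (`drag_dy=100`) would tilt the camera to 22.5 degrees under these assumed constants; a real implementation would also animate the transition and respect whether a 3D view is available at the current zoom level.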
Without an easy way to get back to a north-up orientation (i.e., an orientation in which north is aligned with the top of the device), some users may have difficulty interacting with the map view. To address this, the mapping application of some embodiments provides an unobtrusive floating compass control on the map that serves both as an indicator pointing north and as a button to restore the north-up orientation. To further minimize clutter on the map, the application shows this button only in a limited number of situations (e.g., when the map has been rotated).

To minimize the number of on-screen controls, certain less frequently used actions are placed in a secondary UI screen behind a "page curl" that is displayed on a map view provided by the application. In some embodiments, the page curl is permanently displayed on at least some of the map views that the application provides. For instance, in some embodiments, the application permanently displays the page curl on the initial map view that it provides to allow a user to browse or search for a location or to identify a route. In some embodiments, the page curl is a controllable UI item that has different appearances in different embodiments, such as a button, a curled-up corner of a page, a highlighted corner of the map, etc. The page curl (1) indicates the location of another set of controls that are conceptually "behind" the current view, and (2) when selected, directs the application to present an animation that "peels off" the current view to display another view that shows that other set of controls. The mapping application allows the user to control the page curl using a number of different gestures (e.g., selecting, dragging, tapping, sliding, rotating, etc.).
In some embodiments, as the user provides different gestural inputs, the mapping application displays an animation of the page being folded, lifted, and/or curled in different combinations of angle and rotation, as if the user were manipulating a sheet of paper by grabbing one of its corners.

The use of the page curl allows the application to display more of the map while providing an unobtrusive way to access the additional functionality that the other set of controls provides. Additionally, in some embodiments, the application does not use the page curl in map views where the additional functionality is deemed inappropriate for the task at hand. For instance, in some embodiments, the application does not display this page curl while presenting the map view used during navigation.

In some embodiments, the application displays the page curl for every map view that it provides. In other embodiments, the application does not display the page curl with every map view; for instance, the application of these embodiments does not display the page curl when a map view is used during navigation. However, in some embodiments, the page curl returns when the application is showing an overview of the route during navigation.

In some embodiments, the mapping application's search field is another UI tool that the application uses to make the transitions between the different modalities seamless. In some embodiments, a user can initiate a search by tapping in the search field. This tap directs the application to present an animation that (1) presents an on-screen keyboard and (2) opens a search table full of valuable completions. This table has some important subtleties. When the search field is tapped and before any terms are edited, or when the search field is empty, the table contains a list of "recents," which in some embodiments are recent searches and route directions that the user has requested.
This makes it very easy to quickly bring up recently accessed results.

After any edit in the search field, the table is filled with search completions from both local sources (e.g., bookmarks, contacts, recent searches, recent route directions, etc.) and a remote server. However, when the user has not yet typed any text into the search field, some embodiments include only recent route directions. Once text is typed, the mapping application removes the recent route directions from the search completions table. Incorporating the user's contact card into the search interface adds additional flexibility to the design. When showing recents, some embodiments always offer a route from the current location to the user's home, while other embodiments offer that route only in contexts deemed "appropriate." Also, when the search term matches at least part of an address label (e.g., "ork" for "Work"), some embodiments present the user's labeled address as a completion in the search table. Together, these behaviors make the search UI a very powerful way to get results onto the map from a variety of sources. Beyond allowing a user to initiate a search, the presence of the text field in the primary map view in some embodiments also allows the user to see the query corresponding to the search results on the map and to remove those search results by clearing the query.

Another way in which the mapping application tightly integrates the search and route identification experiences is by providing several different ways to get directions. As noted above, the search table provides quick access to recently used routes.
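The search-table behavior described above (recents when the field is empty; merged local and remote completions, with the recent routes dropped, once any text is typed) can be sketched as a single population function. This is an illustrative reconstruction; the function and parameter names are hypothetical:

```python
def search_table_entries(query, recent_searches, recent_routes,
                         local_completions, remote_completions):
    """Populate the search table: an empty field shows recents
    (searches and route directions); once text is typed, recent routes
    are removed and matching local + remote completions are shown."""
    if not query:
        return recent_searches + recent_routes
    q = query.lower()
    local_hits = [c for c in local_completions if q in c.lower()]
    remote_hits = [c for c in remote_completions if q in c.lower()]
    # De-duplicate while keeping local sources ahead of server results.
    seen, out = set(), []
    for c in local_hits + remote_hits:
        if c not in seen:
            seen.add(c)
            out.append(c)
    return out
```

Note how a partial match such as "ork" would surface a labeled "Work" address, mirroring the address-label matching described in the text.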
For any location selected in the map view, the mapping application of some embodiments also presents an informational display banner (e.g., a window) that shows a quick-route navigation UI control (e.g., a button) for fetching a route (e.g., a driving route) from the current location to that pin without leaving the map view. In addition, the mapping application provides a selectable directions UI control (e.g., a button) on the primary map view (e.g., in the top left corner), which, when selected, presents a modal directions editing interface that enables the user to request more customized routes, such as routes that do not begin at the current location or a walking route instead of just driving routes. In some embodiments, the mapping application provides several different selectable routes based on a route query that it receives through the directions UI control. In these embodiments, the user can then select one of the routes. In some embodiments, one of the routes is presented as a default selected route, and the user can change the selected route to one of the other presented routes. It should be noted that although neither the route-history entries in the search field nor the quick-route navigation control performs any action that could not be achieved with the directions item, the two serve as important accelerators that make it much easier to obtain the most commonly desired routes.

Once route directions have been obtained, they remain present until explicitly cleared. This enables the mapping application to enter a mode that is optimized for navigation. The navigation mode has many novel features. One novel feature is that at any time while navigating, the user can move between a full-screen mode that presents a display view optimized for turn-by-turn directions and an overview mode that presents a display view of the remaining route better suited to browsing.
Some embodiments also allow a search to be performed while navigating in the overview mode. For instance, some embodiments provide a pull-down handle that allows the search field to be pulled into the overview display. Alternatively or conjunctively, some embodiments allow searches to be performed during navigation through the voice-recognition input of the device of some embodiments.

Continuity between the overview mode and the full-screen mode is achieved with in-place transitions in the map and a constant set of controls. To enter the full-screen mode, the application of some embodiments (1) automatically hides the floating controls and a bar along the top that contains UI controls, and (2) fully expands the map. During the full-screen mode, the application restricts touch interaction with the map. In some embodiments, a tap is required to access the automatically hidden controls, and even then the controls are adapted to a full-screen navigation look, with the ETA highlighted in the bar along the top.

The mapping application of some embodiments allows the user to stop navigation, in both the overview mode and the full-screen mode, at any time while navigating by selecting a control to end navigation. The mapping application of some embodiments also allows the user to modify the turn-by-turn navigation view at any time while navigating, to see alternative three-dimensional (3D) views or to present a two-dimensional (2D) view. In some embodiments, the 3D turn-by-turn navigation is an animated rendering of the navigated route, rendered from the vantage point of a virtual camera that traverses along the direction of the route based on the user's direction and speed of travel, which in some embodiments are captured by directional data (e.g., GPS data, triangulated cell-tower data, etc.) associated with the device.
While navigating, the mapping application of some embodiments allows a user to change the position of the virtual camera (i.e., the position from which the navigated route is rendered) through gestural input on the device's screen. Movement of the virtual camera (i.e., movement of the position from which the route is rendered) allows the mapping application to present alternative 3D views. Some embodiments even use the virtual camera to render a top-down 2D view for the turn-by-turn navigation, while other embodiments render the top-down 2D view by zooming in and out of a 2D map. In some embodiments, the mapping application presents a 3D button that serves both as a 3D indicator and as a 3D initiator/toggle.

Different embodiments provide different gestural inputs to adjust the 3D/2D view during turn-by-turn navigation. In some embodiments, the gestural input is a two-finger pinch/spread operation that adjusts the zoom level. This adjustment of the zoom level inherently adjusts the position and rotation of the camera with respect to the route, and thereby changes the 3D perspective view of the route. Alternatively, instead of or in addition to the zoom operation, other embodiments provide other gestural inputs (e.g., a finger-drag operation) that change the position of the camera. In yet other embodiments, a gestural input (e.g., a finger-drag operation) momentarily changes the viewing direction of the camera to allow a user to momentarily glance to a side of the navigated route. In these embodiments, the application returns the camera to its previous view along the route after a short period of time.

The mapping application of some embodiments provides realistic-looking road signs that are used during navigation and during the browsing of an identified route. In some embodiments, the signs are textured images that bear a strong resemblance to actual highway signs, and they include instructional arrows, text, shields, and distances.
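The pinch-to-zoom adjustment described above, in which the zoom level inherently adjusts both the camera's position and its rotation, can be sketched as a linear interpolation between two camera poses. All constants and names below are illustrative assumptions rather than values from the patent:

```python
def camera_pose_for_zoom(zoom: float, z_min: float = 10.0, z_max: float = 20.0,
                         h_max: float = 5000.0, h_min: float = 100.0,
                         pitch_far: float = -75.0, pitch_near: float = -40.0):
    """Interpolate the virtual camera's height above the map and its
    downward pitch (degrees below the horizon) from the zoom level:
    zooming in lowers the camera and shallows the pitch, changing the
    3D perspective of the route as the text describes."""
    t = max(0.0, min(1.0, (zoom - z_min) / (z_max - z_min)))
    height = h_max + t * (h_min - h_max)            # closer at high zoom
    pitch = pitch_far + t * (pitch_near - pitch_far)
    return height, pitch
```

Coupling both quantities to one gesture parameter is the key point: a single pinch simultaneously repositions and re-aims the camera, rather than only scaling the map.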
The mapping application of some embodiments presents a large number of sign variants in a large number of different contexts. For maneuvers that follow closely on one another, a secondary sign is presented hanging just beneath the primary sign. In some embodiments, signs are presented in different colors according to regional norms. Also, as a maneuver is passed during navigation, the mapping application animates the sign away with a motion that mimics a sign passing overhead on a highway. When a maneuver is being approached, the mapping application draws attention to the sign with a subtle animation (e.g., a shimmer across the entire sign).

As mentioned above, the mapping application of some embodiments uses the realistic-looking road signs to provide a novel method of browsing a route that it has identified. For instance, in some embodiments, the mapping application allows a user to select and scroll through the signs of the junctions along an identified route while it is presenting the identified route to the user. As the user scrolls through each sign, the portion of the route associated with the sign currently in focus is presented or highlighted (e.g., through color highlighting, or through another geometry, such as a circle or other marker, that marks that portion). Alternatively or conjunctively, the user can scroll through each sign by selecting different junctions of the route in order to view the sign associated with that junction. Some of these embodiments provide this interaction only for routes that are not defined between the user's current location and a destination. In other words, these embodiments do not provide this browsing experience when presenting a route that connects the user's current location to a destination. Other embodiments, however, provide the route browsing experience through the road signs in other or all other contexts in which a route is displayed.
The above-described features, as well as some other features, of the mapping application of some embodiments are further described below. In the description above and below, many of these features are described as part of an integrated mapping application that provides novel location browsing, location searching, route identification, and route navigation operations. However, one of ordinary skill in the art will realize that, in other embodiments, these novel operations are performed by applications that do not perform all of these operations, or that perform other operations in addition to them.

The preceding Summary is intended to serve as a brief introduction to some embodiments of the invention. It is not meant to be an introduction or overview of all of the inventive subject matter disclosed in this document. The Detailed Description that follows, and the Drawings that are referred to in the Detailed Description, will further describe the embodiments described in the Summary as well as other embodiments. Accordingly, a full review of the Summary, the Detailed Description, and the Drawings is needed to understand all of the embodiments described by this document. Moreover, the claimed subject matter is not to be limited by the illustrative details in the Summary, the Detailed Description, and the Drawings, but rather is to be defined by the appended claims, because the claimed subject matter can be embodied in other specific forms without departing from the spirit of the subject matter.

In the following detailed description of the invention, numerous details, examples, and embodiments of the invention are set forth and described. However, it will be clear and apparent to one skilled in the art that the invention is not limited to the embodiments set forth, and that the invention may be practiced without some of the specific details and examples discussed.

Some embodiments of the invention provide an integrated mapping application that includes several useful modalities, including location browsing, map searching, route identification, and route navigation operations. In some embodiments, the application is defined to be executed by a device that has a touch-sensitive screen that displays the application's output. In some embodiments, this device has a multi-touch interface that allows a user to interact with the application by providing touch and gestural input through the screen. Examples of such devices are smartphones (e.g., the iPhone® sold by Apple Inc., phones operating the Android® operating system, phones operating the Windows 8® operating system, etc.).

Several detailed embodiments of the invention are described below. Section I describes the UI controls and the map browsing experience that the mapping application of some embodiments provides. Section II then describes the features of the mapping application's novel search field. Section III then describes novel UIs for presenting different types of detailed information about a location. Next, Section IV describes several different ways for a user to get route directions from the mapping application. Section V then describes the different operational modes of the integrated mapping application of some embodiments. Section VI describes example electronic systems on which some embodiments of the invention are implemented. Finally, Section VII describes a map service operating environment.

I. Map Browsing

A. General Controls

Figure 1 illustrates an example of a device 100 that executes the integrated mapping application of some embodiments of the invention. This application has a novel user interface (UI) design that seamlessly and cohesively integrates the controls for each of the application's different modalities by using a minimal set of on-screen controls that float on top of the content in order to display as much of the content as possible. In addition, this cluster adapts to the task at hand, adjusting its contents in an animated fashion as a user moves between different modalities (e.g., between browsing, searching, routing, and navigating). This common element with an adaptive nature enables the mapping application to optimize for different tasks while maintaining a consistent look and interaction model as it moves between those tasks.

Figure 1 shows three stages 105, 110, and 115 of interaction with the mapping application. The first stage 105 shows the device's UI 120, which includes several icons of several applications in a dock area 125 and on a page of the UI. One of the icons on this page is the icon for the mapping application 130. The first stage shows the user selecting the mapping application through touch contact with the screen at the location of this application on the device's screen.

The second stage 110 shows the device after the mapping application has opened. As shown in this stage, the mapping application's UI has a starting page that, in some embodiments, (1) displays a map of the current location of the device, and (2) several UI controls arranged in a top bar 140 and as floating controls. As shown in Figure 1, the floating controls include a position control 145, a 3D control 150, and a page curl control 155, while the top bar 140 includes a directions control 160, a search field 165, and a bookmark control 170.

The directions control 160 opens a page through which a user can request a route to be identified between a starting location and an ending location. As further described below, this control is one of three mechanisms through which the mapping application can be directed to identify and display a route between two locations; the two other mechanisms are (1) a control in an information banner that is displayed for a selected item in the map, and (2) recent routes identified by the device that are displayed in the search field 165. Accordingly, the information banner control and the search field 165 are two UI tools that the application employs to make the transition between the different modalities seamless.

In some embodiments, a user can initiate a search by tapping in the search field 165. This tap directs the application to present an animation that (1) presents an on-screen keyboard and (2) opens a search table full of valuable completions. This table has some important subtleties. When the search field is tapped and before any terms are edited, or when the search field is empty, the table contains a list of "recents," which in some embodiments are recent searches and route directions that the user has requested. This makes it very easy to quickly bring up recently accessed results.
After any edit in the search field, the table is filled with search completions from both local sources (e.g., bookmarks, contacts, recent searches, recent route directions, etc.) and a remote server. However, when the user has not yet typed any text into the search field, some embodiments include only recent route directions. Once text is typed, the mapping application removes the recent route directions from the search completions table. Incorporating the user's contact card into the search interface adds additional flexibility to the design. When showing recents, some embodiments always offer a route from the current location to the user's home, while other embodiments offer that route only in contexts deemed "appropriate." Also, when the search term matches at least part of an address label (e.g., "ork" for "Work"), some embodiments present the user's labeled address as a completion in the search table. Together, these behaviors make the search UI a very powerful way to get results onto the map from a variety of sources. Beyond allowing the user to initiate a search, the presence of the text field in the primary map view in some embodiments also allows the user to see the query corresponding to the search results on the map and to remove those search results by clearing the query.

The bookmark control 170 (e.g., a button) allows the application to bookmark locations and routes. The position control 145 allows the current position of the device to be specifically marked on the map. In some embodiments, once this position control is selected, the application maintains the current position of the device at the center of the map as the device moves. In some embodiments, the control can also identify the direction in which the device is currently pointed. The mapping application of some embodiments identifies the location of the device by using the coordinates (e.g., longitude, altitude, and latitude coordinates) in the GPS signals that the device receives at its location. Alternatively or conjunctively, the mapping application uses other methods (e.g., cell-tower triangulation) to compute the current location.

The 3D control 150 is a control for viewing the map or inspecting a route in three dimensions (3D). The mapping application provides the 3D control as a quick mechanism for entering and exiting 3D. This control also serves as (1) an indicator that the current view is a 3D view, and (2) an indicator that a 3D perspective is available for a given map view (e.g., a zoomed-out map view might not have a 3D view available). In some embodiments, the 3D control 150 provides at least three different appearances corresponding to some of these indications. For instance, the 3D control appears gray when the 3D view of the map is unavailable, black when the 3D view is available but the map is in the 2D view, and blue when the map is in the 3D view. In some embodiments, the 3D control has a fourth appearance (e.g., a button showing a building image or shape) when an immersive 3D map presentation is available at a given zoom level. Immersive and non-immersive 3D presentations are further described in U.S. Patent Application 13/632,035, entitled "Rendering Maps," filed September 30, 2012, which is incorporated herein by reference.

The page curl control 155 is a control that allows the application to minimize the number of on-screen controls by placing certain less frequently used actions in a secondary UI screen that is accessible through the "page curl" control displayed on the map. In some embodiments, the page curl is permanently displayed on at least some of the map views that the application provides. For instance, in some embodiments, the application permanently displays the page curl on the starting page (shown in the second stage 110) that it provides to allow a user to browse or search for a location or to identify a route.

The page curl indicates the location of another set of controls that are conceptually "behind" the current view. When the page curl control 155 is selected, the application presents an animation that "peels off" the current view to display another view that shows that other set of controls. The third stage 115 illustrates an example of this animation. As shown in this stage, the peeling of the starting page reveals several controls, which in this example are the drop pin, print, show traffic, list, standard, satellite, and hybrid controls. In some embodiments, these controls perform the same operations as similar controls do in currently available smartphones (such as iPhones operating iOS®).

The use of the page curl allows the application to display more of the map while providing an unobtrusive way to access the additional functionality provided by the other set of controls. Additionally, in some embodiments, the application does not use the page curl in map views where the additional functionality is deemed inappropriate for the task at hand. For instance, in some embodiments, the application does not display this page curl while presenting the map view during navigation.

Also, in some embodiments, the third stage 115 illustrates the user dragging a corner or an edge of the page to peel the page. In other embodiments, however, the animation that peels the page is displayed by simply tapping the page curl control 155 without dragging a corner or an edge.

B.
Adaptive Button Cluster

As mentioned above, the mapping application of some embodiments adaptively adds controls to, and removes controls from, the floating control cluster set in order to adapt this cluster to the different tasks while maintaining a consistent look and interaction model between those tasks. Figure 2 illustrates an example of the application adaptively modifying the floating control cluster to add and remove a list view control 235. This example is provided in the context of using the directions indicator 160 to obtain a route between two locations.

This example is also provided in terms of six stages 205-230 of interaction with the mapping application. The first stage 205 illustrates the selection of the directions indicator 160. The second stage 210 next illustrates the selection of a route generation control 240 after the user has entered the starting and ending locations of the route in the start field 245 and the end field 250. The second stage 210 also shows that the mapping application displays several recently used route generation requests below the fields for entering the starting and ending locations.

The third stage 215 shows two routes 260 and 261 that the mapping application has identified for the provided starting and ending locations. In some embodiments, the mapping application highlights one of the routes to indicate that the highlighted route is the default route that the mapping application recommends. This stage also illustrates the start of an animation that shows the list view control 235 sliding out from under the 3D icon 150. When there is an opportunity to display a list of items (be it a list of instructions in a route, or a list of search results when multiple results are found for a given query), the mapping application of some embodiments displays a list control as one of the floating controls. In some embodiments, tapping the list control brings up a modal list view. Having a modal list view keeps the mapping application simple and keeps the map front and center. In some embodiments, the list view itself is adapted and optimized for the type of list being displayed: search results are displayed with star ratings (when available), while route steps include instruction arrows.

The fourth stage 220 shows the selection of a clear control 255 for clearing the identified routes 260 and 261 from the illustrated map. In response to this selection, the routes 260 and 261 are removed from the map, and an animation starts to show the list control 235 sliding back under the 3D control 150, as illustrated in the fifth stage 225. The sixth stage 230 shows the application UI after the animation has ended and the list control has been removed from the floating control set.

Another floating control that the mapping application of some embodiments uses is a compass. Figure 3 illustrates an example of the application adaptively modifying the floating control cluster to add and remove a compass 300. This example is provided in the context of using the position control 145 to view the current position and orientation of the device on the map presented by the device. In some embodiments, the position control 145 can cause the mapping application to operate in three different states. For instance, when the position control 145 is not selected, the mapping application displays a map view. Upon receiving a first selection of the position control 145, the mapping application shifts the map to display a map region that includes the current location of the device at the center of that region. From then on, the mapping application of some embodiments tracks the current location of the device as the device moves.

In some embodiments, the mapping application maintains the current location indicator at the center of the display area and shifts the map from one region to another as the device moves from one region to another. When the mapping application receives a second selection of the position control 145 while it is maintaining the current location of the device at the center of the displayed region, the mapping application displays a simulated projection of light in the map, from the identified current location, in the direction that the device is currently facing. When the position control 145 is selected again while the mapping application is displaying the simulated light projection, the mapping application returns to the state it was in before receiving the first selection. That is, the projection disappears and the current position of the device is no longer tracked.

The example illustrated in this figure is provided in terms of five stages 305-325 of interaction with the mapping application. The first stage 305 illustrates that the mapping application is displaying a map region that happens not to include the current location of the device (i.e., the current location indicator is not displayed in the map region).

The second stage 310 illustrates the position control 145 being selected once. As mentioned above, the first selection of the position control 145 results in shifting the map to display a map region that has the current location indicator 326 at its center. The third stage 315 shows the result of selecting the position control 145. Some embodiments identify the current position of the device by using the current location indicator 326. The current location indicator 326 has different appearances in different embodiments. For instance, the current location indicator 326 of some embodiments has the appearance of a colored dot (e.g., a blue dot) on the map. Identification of the current position is useful when the user has explored the displayed map (e.g., through gestural swipe operations) such that the device is not currently showing the user's current location on the map.

The fourth stage 320 illustrates the position control 145 being selected again. In some embodiments, the second selection of the position control 145 causes the application to display a simulated projection of light 345 in the map, from the identified current location 326, in the direction that the device is currently facing. This projection helps the user identify the direction that the device faces at any time. In some embodiments, this projection always points toward the top of the device (i.e., the location along which the search field 165 is positioned while the device is held in the portrait orientation).
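The compass control introduced above both indicates north and, when selected, restores the north-up orientation. That restore operation reduces to computing the shortest rotation that brings the map heading back to zero. A small sketch of that computation; the function name and sign convention are illustrative assumptions:

```python
def north_up_rotation(map_heading_deg: float) -> float:
    """Return the shortest signed rotation (in degrees; the sign picks
    the rotation direction) that restores a rotated map to the north-up
    orientation, i.e. brings its heading back to 0."""
    rotation = (-map_heading_deg) % 360.0
    return rotation - 360.0 if rotation > 180.0 else rotation
```

Taking the shorter of the two possible rotations is what makes the animated snap back to north-up feel natural regardless of how far the user has spun the map.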
在第五階段325中說明此投射345。此階段亦展示,在此模式下,地圖繪製應用程式呈現浮動羅盤300。此羅盤充當使用者可用來識別至北極之方向之一指示符。在一些實施例中,此羅盤呈於底部毗鄰之兩個等腰三角形之形狀,該等三角形中之一者指向北(在遠離毗鄰底部之方向上)且具有使其與另一三角形區分之顏色(例如,橙色)。如下文進一步描述,羅盤亦可用以在使用者已旋轉地圖之2D或3D視圖之後恢復北向上定向。在一些實施例中,在地圖繪製應用程式接收到對位置控制項145之另一選擇之後,羅盤可保持在地圖視圖中。在一些實施例中,直至地圖繪製應用程式接收到用以移除羅盤之一使用者輸入(例如,選擇羅盤)之後,羅盤才會消失。 第五階段325亦展示地圖已旋轉以維持投射之方向朝向器件之頂部。此係因為器件已面對不同於先前階段320中的朝向器件之頂部之方向的一方向。當器件之方向移動時,羅盤300之方向亦將相對於器件之頂部移動。羅盤已移動以指示器件正面對西北方向。 在一些實施例中,地圖繪製應用程式在第一選擇之後改變位置控制項145之外觀一次且在第二選擇之後再次改變位置控制項145之外觀。第五階段325展示第二選擇之後的位置控制項145之外觀,其不同於第一選擇之後的位置控制項145之外觀。C.     2D 及 3D 1.      3D 按鈕 在一些實施例中,地圖繪製應用程式可在2D模式抑或3D模式下顯示地圖中之一方位。此允許使用者在2D模式抑或3D模式下瀏覽地圖中之一方位。如上所述,浮動控制項中之一者為3D控制項150,其允許一使用者三維(3D)地檢視地圖或檢查一路線。此控制項亦充當(1)當前視圖為一3D視圖之指示符,及(2)一3D透視圖可用於給定地圖視圖之指示符(例如,經縮小之地圖視圖可能不具有可用之3D視圖)。 圖4說明一些實施例之地圖繪製應用程式提供3D控制項150以作為進入3D模式以用於三維地檢視地圖方位之快速機構的方式。此圖以四個階段405至420說明此操作。第一階段405說明使用者在檢視使用者之當前方位425附近之區域之二維呈現時選擇3D控制項150。為了描述簡單起見,此圖中未描繪頂端列、浮動控制項及頁面捲曲。 第二階段410展示使用者之當前方位在地圖上之三維呈現。如上所述,在一些實施例中,地圖繪製應用程式藉由自三維場景中之可被概念性地視為捕獲地圖視圖之虛擬攝影機之位置的一特定位置顯現地圖視圖來產生地圖之3D視圖。將在下文藉由參看圖5來進一步描述此顯現。 第三階段415展示使用者藉由執行一撥動操作(例如,藉由拖曳手指跨越器件之觸敏式螢幕)來圍繞當前方位進行瀏覽。此撥動操作改變呈現於器件上之3D地圖視圖以顯示3D地圖上之一新方位。在第四階段420中說明此新方位。 在一些實施例中,當地圖繪製應用程式正在導航模式下操作時(亦即,當地圖繪製應用程式正在呈現轉向提示導航視圖時),地圖繪製應用程式呈現地圖之3D視圖。為了提供導航期間的地圖之3D視圖與地圖瀏覽期間的地圖之3D視圖之間的視覺差別,一些實施例之地圖繪製應用程式使用以不同方式界定所顯現圖形之不同樣式表。舉例而言,一些實施例之地圖繪製應用程式在地圖瀏覽期間的地圖之3D視圖中使用定義用於建築物之灰色、用於道路之白色及用於街區之圓角的樣式表。某一實施例之地圖繪製應用程式在導航期間的地圖之3D視圖中使用定義用於建築物之白色、用於道路之灰色及用於街區之尖角的樣式表。在一些實施例中,地圖繪製應用程式將此等樣式表應用於地圖之一給定區之相同地圖底圖。在其他實施例中,地圖繪製應用程式將此等樣式表應用於該給定區之不同地圖底圖(例如,地圖底圖、導航底圖等)。使用樣式表來顯現地圖係在上文併入之美國專利申請案13/632,035中進一步描述。2. 
虛擬攝影機 圖5呈現一簡化實例以說明虛擬攝影機505之概念。當顯現3D地圖時,虛擬攝影機為3D地圖場景中之位置(器件自該位置顯現場景以產生地圖之3D視圖)之概念化。圖5說明包括四個物件之3D地圖場景535中之一方位,該等物件為兩個建築物及兩條相交道路。為了說明虛擬攝影機概念,此圖說明三個情境,其中之每一者對應於一不同虛擬攝影機方位(亦即,一不同顯現位置)及顯示於器件上之一不同所得視圖。 第一階段510展示在第一透視位置處以第一角度(例如,-30°)向下指向3D場景的虛擬攝影機。在此位置中,攝影機指向一方位,該方位可為器件或正被探索之方位之靜止位置,或在器件之移動方位前方的一移動位置(在地圖用於導航之情況下)。在一些實施例中,攝影機之預設位置將相對於當前方位成一特定定向,但此定向可在使用者旋轉地圖時加以修改。自第一角度顯現3D場景導致3D地圖視圖525。 第二階段515展示在一不同第二透視位置處以較大第二角度(例如,-45°)向下指向場景的虛擬攝影機。自此角度顯現3D場景導致3D地圖視圖530,其中建築物及道路小於其在第一地圖視圖525中之圖示說明。 第三階段520展示由上而下視圖下的虛擬攝影機,其向下面向2D地圖545上之對應於用以顯現3D視圖525及530之3D地圖場景535中之方位的一方位。自該透視角度顯現之場景為2D地圖視圖540。不同於第一及第二階段之在一些實施例中為透視3D顯現操作之3D顯現操作,第三階段中之顯現操作相對簡單,因為該操作僅需要裁剪應用程式或使用者所指定之縮放層級所識別的2D地圖之一部分。因此,此情形下的虛擬攝影機表徵稍微不必要地使對應用程式之操作之描述複雜化,此係因為裁剪2D地圖之一部分並非一透視顯現操作。 如在第三階段520中,在一些實施例中,當攝影機自3D透視圖切換至2D由上而下視圖時,地圖繪製應用程式自從一特定透視方向顯現3D場景切換至裁剪2D場景。此係因為,在此等實施例中,應用程式經設計以使用較容易且不產生不必要之透視假影之一簡化顯現操作。然而,在其他實施例中,地圖繪製應用程式使用一透視顯現操作而自一由上而下之虛擬攝影機位置顯現一3D場景。在此等實施例中,所產生之2D地圖視圖稍微不同於第三階段520中所說明之地圖視圖540,此係因為遠離視圖之中心的任何物件皆發生失真,其中物件距視圖之中心之距離越遠,失真越大。 在不同實施例中,虛擬攝影機505沿著不同軌跡或圓弧移動。在圖5中說明兩個此等軌跡550及555。在此等兩個軌跡中,攝影機在圓弧中移動且在攝影機在圓弧上向上移動時更多地向下旋轉。軌跡555與軌跡550之不同之處在於,在軌跡555中,攝影機在其沿圓弧上升時自當前方位向後移動。 當沿著圓弧中之一者移動時,攝影機旋轉以將地圖上之所要方位維持在攝影機之焦點處。在一些情況下,所要方位係器件之一靜止方位或使用者正在地圖上瀏覽之一靜止方位。在其他情況下,所要方位係使用者隨器件移動時在器件之移動方位前方的一移動方位。 除了用導航應用程式控制攝影機(例如,在繞過拐角時自3D轉至2D)之外(或並非用導航應用程式控制攝影機),一些實施例亦允許使用者調整攝影機之位置。一些實施例允許使用者用兩根手指作出命令示意動作來調整攝影機之距離(高度)及角度。一些實施例甚至允許用多種類型示意動作來控制攝影機。 圖6概念性地說明由一些實施例之地圖繪製應用程式提供之透視調整特徵。具體言之,圖6說明處於三個不同階段605至615之虛擬攝影機600,該等階段展示回應於透視調整而調整虛擬攝影機600之位置。如所展示,圖6說明包括四個物件之3D地圖635中之一方位,該等物件為兩個建築物及兩條相交道路。 第一階段605展示在第一透視位置處以相對於地平線之第一角度(例如,45度)向下指向3D地圖635的虛擬攝影機600。在此位置中,攝影機600指向一方位,該方位可為器件或正被探索之方位之靜止位置,或在器件之移動方位前方的一移動位置(在地圖用於導航之情況下)。在一些實施例中,攝影機600之預設位置將相對於當前方位成一特定定向,但此定向可在使用者旋轉地圖時加以修改。基於虛擬攝影機600之位置顯現3D地圖視圖導致3D地圖視圖625。 
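上述虛擬攝影機沿圓弧定位並保持焦點之概念,可用如下極簡草圖示意。此草圖僅在所述假設下說明幾何關係;函式名稱、座標慣例與參數皆為此處為說明而假設者,並非實際實作:

```python
import math

def camera_position(focus, distance, pitch_deg):
    """依相對於地平線之俯角 pitch_deg,將虛擬攝影機置於以 focus 為焦點之圓弧上。

    回傳 (x, y, z):攝影機在水平面上自焦點後退、並升至對應高度,
    使其與焦點保持固定距離且始終指向焦點(z 為高度)。
    """
    fx, fy = focus
    rad = math.radians(pitch_deg)
    horizontal = distance * math.cos(rad)  # 水平後退量
    height = distance * math.sin(rad)      # 攝影機高度
    return (fx - horizontal, fy, height)
```

俯角 90 度即第三階段所述之由上而下視圖;俯角越小,攝影機越低、越接近地平線,對應較「低」之3D透視圖。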
第二階段610展示在一不同第二透視位置處以相對於地平線之較小第二角度(例如,30度)以一較低透視角度指向3D地圖635的虛擬攝影機600。階段610亦展示,使用者已藉由用兩根手指觸碰螢幕及在向上方向上拖曳該兩根手指(例如,一撥動示意動作)來提供用以調整3D地圖635之視圖之透視角度之輸入。藉由虛擬攝影機600降低並減小相對於地平線之視角來完成場景升高。使用以此角度定位之虛擬攝影機600來顯現一3D地圖視圖導致3D地圖視圖630,其中建築物及道路比其在第一地圖視圖625中的圖示說明高。.如虛擬攝影機600之虛線型式所展示,虛擬攝影機600在更向上傾斜(例如,俯仰)的同時沿著圓弧650向下移動得更遠。 第三階段615展示在一不同第三透視位置處以相對於地平線之一較大第三角度(例如,80°)以一較高透視角度指向3D地圖635上之一方位(例如,虛擬攝影機600之焦點)的虛擬攝影機600。階段615亦展示使用者已藉由用兩根手指觸碰螢幕及在向下方向上拖曳該兩根手指(例如,一撥動示意動作)來提供用以調整3D地圖635之視圖之透視角度之輸入。藉由虛擬攝影機600升高及增加其相對於地平線之角度來完成場景降低或壓平。如此階段615處所展示,在一些實施例中,當虛擬攝影機600定位於由上而下或接近由上而下之位置中,使得使用虛擬攝影機600顯現之3D地圖視圖表現為2D時,地圖繪製應用程式壓平3D地圖635中之建築物(亦即,將多邊形之z軸分量減小至地面高度)。在第三階段615中使用以此角度定位之虛擬攝影機600來顯現一3D地圖視圖導致3D地圖視圖640,其中與第二地圖視圖630中之圖示說明相比,建築物顯得較小、更扁平且道路顯得較小。如虛擬攝影機600之虛線型式所展示,虛擬攝影機600在更向下傾斜(例如,俯仰)的同時沿著圓弧650向上移動得更遠。 在一些實施例中,當地圖繪製應用程式接收到用於調整用於檢視3D地圖635之透視角度的輸入時,可使虛擬攝影機600以此方式移動。在此等實施例中之一些中,當縮放層級達到一特定縮小層級時,地圖繪製應用程式切換至產生2D地圖視圖之一由上而下模式(其中顯現位置筆直面向下)。 當沿著一圓弧移動時,虛擬攝影機旋轉以將地圖上之所要方位維持在攝影機之焦點處。在一些情況下,所要方位係器件之一靜止方位或使用者正在地圖上瀏覽之一靜止方位。在其他情況下,所要方位係使用者隨器件移動時在器件之移動方位前方的一移動方位。3. 
用以進入或退出 3D 之示意動作 除了3D控制項之外,一些實施例之地圖繪製應用程式亦允許一使用者經由對器件之多點觸碰介面之示意動作輸入而將一地圖視圖自一二維(2D)呈現轉變至一3D呈現。舉例而言,經由兩指示意動作輸入,可使該使用者體驗將一2D地圖視圖「下推」成一3D地圖視圖,或將一3D地圖視圖「上拉」成一2D地圖視圖。此亦可被視為經由兩指示意動作而將一虛擬攝影機自一2D(自正上方)視圖下拉成一3D (側角)視圖。 不同實施例使用不同的兩指示意動作操作來將2D地圖視圖下推成3D地圖視圖,或將3D地圖視圖上拉成2D地圖視圖。圖7說明用於將2D地圖下推成3D地圖之兩指示意動作的一個實例。此圖依據地圖繪製應用程式之UI之操作之四個階段來呈現此實例。第一階段705展示應用程式UI呈現在器件之當前方位725周圍之2D地圖視圖。 第二階段710接著展示用以下推2D視圖740直至呈現3D視圖的兩指示意動作操作之開始。在一些實施例中,應用程式在其偵測到兩個接觸係水平地或近似水平地置放於2D地圖上且一起向上移動時識別對2D地圖之下推。一些實施例要求該移動超過一特定量以便強加反作用於2D地圖至3D地圖之推動之慣性,且藉此防止此轉變意外發生。 其他實施例使用用以經由示意動作輸入自2D地圖轉變至3D地圖之其他方案。舉例而言,當一使用者將兩根手指相對於彼此垂直地置放且在上部手指上施加較大力以便觸發器件之感測器(例如,迴轉儀等)中之一者或觸發手指之旋轉時,一些實施例之應用程式執行此轉變。另外其他實施例要求執行一相反操作以自2D地圖視圖轉變至3D地圖視圖。舉例而言,一些實施例要求兩根水平對準之手指一致地在2D地圖上向下移動以便將2D視圖下推成3D視圖。 第三階段715展示在使用者之兩根手指已在器件之螢幕上向上移動一特定量之後的該等手指。該階段亦展示2D地圖740已被3D地圖745替換。第四階段720展示在兩指示意動作移動結束時的3D地圖745。在此階段中,3D控制項150以醒目提示方式出現以指示當前地圖視圖為3D地圖視圖。 在一些實施例中,可藉由執行相反的兩指操作而將3D地圖視圖上拉成2D地圖視圖。具體言之,在此等實施例中,當地圖繪製應用程式偵測到3D地圖上之一致地向下移動大於一臨限量之量的兩個水平或近似水平之接觸時,地圖繪製應用程式在3D地圖與2D地圖之間轉變。D. 進入及退出 3D 時之動畫 當自2D地圖視圖轉變至3D地圖視圖時,一些實施例提供展示在2D地圖視圖中表現為扁平的物件在3D地圖視圖中升高並變得較大之動畫。產生展示物件升高/下降且變得較大/較小的此動畫係在2012年9月30日申請之題為「Displaying 3D Objects in a 3D Map Presentation」的美國專利申請案13/632,027中進一步描述。美國專利申請案13/632,027係以引用方式併入本文中。圖8以三個階段說明此動畫。第一階段805展示使用者在檢視2D地圖視圖時選擇3D控制項150。第二階段810及第三階段815展示在地圖繪製應用程式開始提供3D地圖視圖之後地圖繪製應用程式提供的後續視圖(儘管未必為連續視圖)。由於縮放層級在第二階段與第三階段之間增加,故地圖視圖中之建築物之高度增加以提供動畫,該動畫傳遞視圖正自2D視圖移動至3D場景。 當自3D視圖轉變至2D視圖時,一些實施例之地圖繪製應用程式提供一相反動畫,其展示場景中之物件收縮,直至該等物件陷縮成2D地圖中之扁平物件。 在一些實施例中,當地圖繪製應用程式操作於導航模式下或路線檢查模式下時,地圖繪製應用程式提供2D至3D或3D至2D之轉變。在下文進一步描述地圖繪製應用程式之此等兩個操作模式。 圖9以六個不同階段905至930說明一些實施例之地圖繪製應用程式改變3D控制項之外觀以指示地圖視圖之不同2D及3D狀態。第一階段905說明:地圖繪製應用程式正顯示一地圖及包括3D控制項150之浮動控制項。地圖繪製應用程式正以如所展示之特定低縮放層級(地圖尚未放大得很多)以2D顯示地圖。3D控制項150係使用一第一外觀(例如,灰色字母「3D」)顯示以指示3D地圖資料在此特定縮放層級下不可用。第一階段905亦展示地圖繪製應用程式正在接收用以放大地圖(亦即,使縮放層級增大)的使用者示意動作輸入。 第二階段910展示地圖繪製應用程式正以比其在先前階段905中顯示地圖所用之縮放層級高的一縮放層級來顯示地圖。然而,3D控制項150正維持該第一外觀,此係因為3D地圖資料即使在此特定較高縮放層級下仍不可用。第二階段910亦展示地圖繪製應用程式正在接收用以進一步放大地圖之另一示意動作輸入。 
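上文所述「兩個接觸近似水平地置放且一起向上移動超過一特定量」之下推偵測,可概略示意如下。臨限值、座標慣例(螢幕 y 軸向下遞增)與函式名稱皆為說明而假設之簡化:

```python
def is_push_to_3d(p1_start, p2_start, p1_end, p2_end,
                  max_tilt=0.3, min_travel=40.0):
    """偵測將 2D 視圖「下推」成 3D 視圖之兩指示意動作:
    兩個接觸近似水平地置放,且一致地向上移動超過 min_travel(像素),
    以強加慣性並防止轉變意外發生。"""
    dx = p2_start[0] - p1_start[0]
    dy = p2_start[1] - p1_start[1]
    if dx == 0 or abs(dy / dx) > max_tilt:  # 兩指並非近似水平置放
        return False
    # 螢幕座標 y 向下遞增:向上移動表示 y 值減小
    travel1 = p1_start[1] - p1_end[1]
    travel2 = p2_start[1] - p2_end[1]
    return travel1 >= min_travel and travel2 >= min_travel
```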
第三階段915展示地圖繪製應用程式正以比其在先前階段910中顯示地圖所用之縮放層級高的一縮放層級來顯示地圖。地圖繪製應用程式已將3D控制項150之外觀變為一第二外觀(例如,黑色字母之「3D」)以指示3D地圖資料在此縮放層級下可用。當地圖繪製應用程式接收到對3D控制項150之選擇時,一些實施例之地圖繪製應用程式會將3D控制項150之外觀變為一第三外觀(例如,藍色字母之「3D」)且3D地顯示地圖(例如,藉由自2D之筆直向下視圖變成透視圖)。該第三外觀因此可指示地圖係以3D方式顯示。第三階段915展示地圖繪製應用程式正在接收用以將地圖更進一步放大至一較高縮放層級的另一示意動作輸入。第三階段915展示一些實施例之地圖繪製應用程式正將地圖中之建築物顯示為灰色方塊。 第四階段920展示,地圖繪製應用程式正以比其在先前階段915中顯示地圖所用之縮放層級高的一縮放層級來顯示地圖。地圖繪製應用程式已將3D控制項150之外觀變為一第四外觀(例如,如所展示的第一顏色之建築物圖示)以便指示用於顯現沈浸式3D地圖視圖之3D沈浸式地圖資料在此縮放層級下可用。第四階段920亦展示地圖繪製應用程式正在接收對3D控制項150之選擇。 第五階段925及第六階段930展示在地圖繪製應用程式開始提供3D沈浸式地圖視圖之後地圖繪製應用程式提供的後續視圖(儘管未必為連續視圖)。在一些實施例中,縮放層級在第五階段與第六階段之間未改變,但地圖視圖中之建築物之高度增加以提供一動畫,該動畫傳遞視圖正自2D視圖移動成3D沈浸式視圖。又,自階段920至925,地圖繪製應用程式已將3D控制項之外觀變為第五外觀(例如,如所展示的第二顏色之建築物圖示)以便指示地圖係以3D沈浸式視圖來顯示。E. 瀏覽 1. 撥動 在一些實施例中,地圖繪製應用程式允許使用者經由多種機構來圍繞展示於地圖中之一方位進行探索。舉例而言,如上所述,一些實施例之地圖繪製應用程式允許一使用者藉由在器件之觸敏式螢幕上執行一或多個撥動操作(例如,藉由拖曳手指)來圍繞一方位進行瀏覽。此等操作將由應用程式呈現之視圖移動至地圖上之新方位。在上文參看圖4描述了在3D地圖視圖中之撥動操作之一個實例。2. 旋轉 在一些實施例中,地圖繪製應用程式亦允許一使用者經由示意動作輸入來旋轉2D或3D地圖。在一些實施例中,地圖繪製應用程式為一向量地圖繪製應用程式,其允許在瀏覽地圖時對地圖之直接操縱(諸如旋轉及2D/3D操縱)。然而,對地圖之影響中之一些可為令人迷惑(disorienting)。在沒有重新回到北向上(north-up)定向(亦即,北方向與器件之頂部對準之定向)之容易方式的情況下,一些使用者可能在與地圖視圖互動方面有困難。為了解決此問題,一些實施例之地圖繪製應用程式在地圖上提供浮動羅盤控制項。如所提及,此羅盤既充當指向北之指示符,又充當用以恢復北向上定向之按鈕。為了將地圖上之雜亂進一步減至最少,地圖繪製應用程式僅在地圖被旋轉時才展示此按鈕。 圖10說明在本發明之一些實施例中的旋轉2D地圖及使用羅盤來調直經旋轉地圖之一實例。此圖以四個階段說明此實例。第一階段1005說明2D地圖視圖1025。第二階段1010說明經由兩指示意動作對此地圖視圖之旋轉。在此實例中,使用者藉由將兩根手指置放於地圖視圖上且在上推一根手指之同時下拉一根手指來執行該示意動作。手指之此旋轉運動使應用程式將地圖旋轉成已旋轉地圖視圖1030。在一些實施例中,地圖繪製應用程式計算兩根手指之間的中點且使用該中點作為旋轉之錨點。在一些此類實施例中,若兩根手指中之一者不移動,則地圖繪製應用程式使用不移動手指之位置作為錨點。在一些實施例中,當位置控制項326存在於視圖中(例如,藉由選擇位置控制項145)時,不管手指方位如何,地圖繪製應用程式皆使用位置控制項之方位作為旋轉之錨點。 第二階段1010亦展示,回應於地圖之旋轉,應用程式已在地圖上呈現羅盤300以指示已旋轉地圖上之北方向。第三階段1015接著展示使用者選擇羅盤300。第四階段1020接著展示,在選擇羅盤之後,應用程式將地圖旋轉回至地圖視圖1025 (亦即,北向上定向)。 圖11說明在本發明之一些實施例中的旋轉地圖之另一實例。此圖以四個階段1105至1120說明此實例。在此實例中,地圖為一3D地圖。因此,第一階段1105說明3D地圖視圖1105。第二階段1110說明經由兩指示意動作使此地圖視圖旋轉。如前所述,在此實例中,使用者藉由將兩根手指置放於地圖視圖上且在上推一根手指之同時下拉一根手指來執行該示意動作。手指之此旋轉運動使應用程式將地圖旋轉成已旋轉地圖視圖1130。在此實例中,旋轉係圍繞器件之當前位置,此係因為,如上所述,當前方位指示符326存在於地圖視圖1125中。 

第二階段1110亦展示:回應於地圖之旋轉,應用程式已在地圖上呈現羅盤300以指示已旋轉地圖上之北方向。第三階段1115接著展示回應於使用者之另一兩指示意動作的地圖之進一步旋轉。羅盤300仍指示北方向,但已與已旋轉地圖一起旋轉。第四階段1120接著展示地圖及羅盤300之更進一步旋轉。 在一些實施例中,地圖繪製應用程式不允許使用者在某些縮放層級下旋轉2D或3D地圖。舉例而言,當縮小地圖(至一低縮放層級)時,地圖繪製應用程式在接收到使用者之用以旋轉地圖之示意動作輸入(例如,兩指旋轉操作)時不使地圖旋轉。在一些實施例中,地圖繪製應用程式之負責移動虛擬攝影機之模組檢查當前縮放層級且在地圖在當前縮放層級下不應旋轉之情況下決定忽略此等指令。 在一些其他實施例中,應用程式在使用者提供用於旋轉地圖之示意動作輸入時使地圖旋轉一特定距離,但在使用者釋放或停止該示意動作輸入時使地圖旋轉回至預設定向(例如,北)。 在一些實施例中,地圖繪製應用程式為地圖之旋轉提供一慣性效應。當一使用者提供一特定類型示意動作輸入(例如,以大於臨限速率之角速率或平移速率終止之輸入)以旋轉地圖時,地圖繪製應用程式產生使地圖繼續旋轉且減速至停止之慣性效應。在一些實施例中,慣性效應為使用者提供了模擬真實世界中之行為的與地圖之更逼真互動。 圖12以三個不同階段1205至1215說明與用於旋轉操作之慣性效應一起之旋轉操作。為了說明簡單起見,在此圖中依據2D地圖視圖來展示慣性效應。然而,一些實施例之地圖繪製應用程式在於3D模式下檢視地圖時提供慣性效應。第一階段1205展示2D地圖之2D地圖視圖1220。在一些實施例中,地圖繪製應用程式執行用以執行旋轉操作的在下文藉由參看圖13所描述之程序1300。 如所展示,2D地圖視圖1220包括在平行或垂直方向上延伸之若干街道。第一階段1205亦展示使用者正在提供用以旋轉2D地圖視圖1220之輸入。具體言之,使用者正藉由將兩根手指觸碰於觸控螢幕上之兩個方位且在順時針方向上旋轉該兩根手指(如圖中所描繪之兩個箭頭所指示)來執行用以旋轉2D地圖視圖1220之一示意動作。在此實例中,出於解釋目的,說明了指尖之醒目提示。在一些實施例中,地圖繪製應用程式實際上不在指尖周圍顯示醒目提示。 第二階段1210展示緊接在使用者已完成用以旋轉2D地圖之輸入之後的2D地圖。對於此實例,使用者藉由將兩根手指抬離器件之觸控螢幕來完成該輸入,如不再展示的指尖周圍之醒目提示所指示。另外,第二階段1210展示由地圖繪製應用程式顯現之2D地圖之2D地圖視圖1225。如所展示,地圖繪製應用程式已在順時針方向上使2D地圖自2D地圖視圖1220旋轉至2D地圖視圖1225。第一階段1205中所展示之街道已在順時針方向上旋轉近似45度。 不同實施例之地圖繪製應用程式利用不同方法來實施旋轉操作之慣性效應。舉例而言,在一些實施例中,地圖繪製應用程式基於手指中之一者或該等手指兩者之平均值來判定在使用者停止手指之移動或將手指抬離觸控螢幕所在之瞬間或接近該瞬間時的使用者之輸入之角(或平移)速率。當使用者重複地在不抬起手指的情況下停止手指且開始再次移動手指時,一些實施例之地圖繪製應用程式將每一停止視為輸入之結束,而在其他實施例中,地圖繪製應用程式將該操作視為一個輸入,直至使用者將手指抬離螢幕為止。 地圖繪製應用程式使用角速率來判定用於慣性效應之角度量(例如,度),且判定用以檢視2D地圖之虛擬攝影機藉以使角速率減速(例如,以常數、指數、對數等方式)以旋轉該經判定之角度量的方式。在一些實施例中,地圖繪製應用程式顯現並顯示慣性效應之一動畫(例如,自2D地圖視圖1225的2D地圖之減速旋轉,其使該2D地圖旋轉經判定之角度量)。 在一些實施例中,地圖繪製應用程式本身並不分析使用者之示意動作輸入。舉例而言,此等實施例之地圖繪製應用程式不判定使用者之輸入之角速率。實情為,此等實施例之地圖繪製應用程式接收由執行地圖繪製應用程式之器件之作業系統判定之角速率。器件之作業系統具有用以接收使用者之示意動作輸入之一介面。作業系統分析所接收之輸入且將分析提供至地圖繪製應用程式。地圖繪製應用程式將基於對輸入之分析來判定要應用之慣性效應。 第三階段1215說明在地圖繪製應用程式已顯現並顯示慣性效應之動畫之後的2D地圖。如所展示,顯示由地圖繪製應用程式顯現之2D地圖之2D地圖視圖1230。在使用者完成第二階段1210中之輸入之後,在第三階段1215中,地圖繪製應用程式已使2D地圖順時針旋轉得更多。如所展示,第三階段1215中之2D地圖視圖1230展示了相對於2D地圖視圖1225中所展示之街道順時針旋轉得更多之街道。 
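上文自兩點觸碰識別旋轉分量,以及以釋放時之角速率判定慣性旋轉量之做法,可概略示意如下。此處採用恆定減速度為例;所有函式名稱、臨限值與減速常數皆為說明而假設,並非實際實作:

```python
import math

def rotation_component(p1_a, p2_a, p1_b, p2_b):
    """自兩個觸碰點之初始方位 (p1_a, p2_a) 與第二方位 (p1_b, p2_b)
    識別示意動作之旋轉分量(度);正負號表示旋轉方向。"""
    v1 = (p2_a[0] - p1_a[0], p2_a[1] - p1_a[1])  # 初始方位間之第一向量
    v2 = (p2_b[0] - p1_b[0], p2_b[1] - p1_b[1])  # 第二方位間之第二向量
    delta = math.degrees(math.atan2(v2[1], v2[0]) - math.atan2(v1[1], v1[0]))
    # 正規化至 (-180, 180],以便判定最短之旋轉方向
    while delta > 180.0:
        delta -= 360.0
    while delta <= -180.0:
        delta += 360.0
    return delta

def inertia_extra_rotation(release_rate_dps, threshold_dps=30.0,
                           deceleration_dps2=180.0):
    """釋放時之角速率超過臨限速率才產生慣性;以恆定減速度 a 減速至停止時,
    額外旋轉之角度量為 v^2 / (2a),並保留旋轉方向之正負號。"""
    if abs(release_rate_dps) < threshold_dps:
        return 0.0
    sign = 1.0 if release_rate_dps > 0 else -1.0
    return sign * release_rate_dps ** 2 / (2.0 * deceleration_dps2)
```

如內文所述,減速方式亦可改為指數或對數;此處之等減速公式僅為其中一種可能。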
在一些實施例中,地圖繪製應用程式亦提供用於除旋轉地圖外之操作(諸如,移動瀏覽(pan)地圖或進入或退出3D操作(例如,移動瀏覽、旋轉、自2D進入3D))的慣性效應。用於此等其他操作之慣性效應係進一步描述於上文併入之美國專利申請案13/632,035中。圖13概念性地說明用於基於示意動作輸入使地圖視圖旋轉的一些實施例之程序1300。在一些實施例中,當地圖繪製應用程式處於地圖檢視模式(例如,方位瀏覽模式、導航模式、2D檢視模式、3D檢視模式等)下且地圖繪製應用程式經由地圖繪製應用程式執行於之器件之觸控螢幕接收一示意動作時,地圖繪製應用程式執行程序1300。 程序1300藉由接收(在1310處)用於旋轉地圖視圖之一示意動作而開始。在一些實施例中,用於旋轉地圖視圖之一示意動作包括經由觸控螢幕接收之多點觸碰示意動作(例如,用多根手指同時觸碰觸控螢幕)。在此實例中,程序1300接收兩點觸碰旋轉示意動作。 接下來,程序1300識別(在1320處)所接收之示意動作之旋轉分量。一些實施例之程序1300藉由識別示意動作之觸碰點之旋轉量來識別示意動作之旋轉分量。舉例而言,在一些此等實施例中,程序1300藉由以下操作來識別示意動作之觸碰點之旋轉量:(1)判定自一個觸碰點之初始方位至另一觸碰點之初始方位之第一向量;(2)判定自該一個觸碰點之第二方位至該另一觸碰點之第二方位之第二向量;及(3)基於該等觸碰點之初始方位及該等觸碰點之第二方位來判定旋轉方向。 程序1300接著判定(在1330處)旋轉量是否在一臨限量內。當程序1300判定旋轉量不在該臨限量內時,程序1300結束。否則,程序1300基於示意動作判定(在1340處)一旋轉軸線。在一些實施例中,程序1300藉由以下操作來判定旋轉軸線:(1)識別沿著自一個觸碰點之初始方位至另一觸碰點之初始方位之一向量之一點;及(2)判定地圖視圖上之對應於沿著該向量之該點之一點(例如,地圖上之與沿著該向量之該點重合之點)。程序1300使用地圖視圖上之經判定點作為地圖視圖旋轉所圍繞之軸線(例如,z軸)之方位。 接下來,程序1300基於旋轉軸線及旋轉量來調整(在1350處)地圖視圖。在一些實施例中,程序1300藉由使地圖視圖圍繞經判定之旋轉軸線在經判定之旋轉方向上旋轉經判定之旋轉量來調整地圖視圖。不同實施例將不同座標空間用於地圖。舉例而言,一些實施例之地圖使用麥卡托(Mercator)單位座標空間。在此等實施例中,程序1300調整虛擬攝影機相對於地圖之位置,以便調整地圖視圖。作為另一實例,在一些實施例中,地圖使用全球大地座標系統(World Geodetic System)(例如,WGS 84)作為地圖之座標空間。在一些此等實施例中,程序1300相對於虛擬攝影機之位置調整地圖以便調整地圖視圖。 最後,程序1300顯現(在1360處)經調整之地圖視圖以供在器件上顯示。在一些實施例中,所顯現之地圖視圖為表示經調整地圖視圖之一影像。接著,程序1300結束。 在一些實施例中,3D地圖可在所界定範圍及/或集合之縮放層級下旋轉。舉例而言,在一些實施例中,地圖繪製應用程式允許3D地圖在經定義數目個最高縮放層級(例如,縮放層級10至20)下旋轉,且防止3D地圖在剩餘較低縮放層級(例如,縮放層級1至10)下旋轉。在一些此等實施例中,當地圖繪製應用程式接收到用以在經定義不允許旋轉操作之一縮放層級下旋轉3D地圖之輸入時,地圖繪製應用程式不產生用以旋轉3D地圖之指令。在其他此等實施例中,當地圖繪製應用程式接收到用以在經定義不允許旋轉操作之一縮放層級下旋轉3D地圖之輸入時,地圖繪製應用程式產生用以旋轉3D地圖之指令,但地圖繪製應用程式僅僅忽略該等指令。一般熟習此項技術者將認識到,在不同實施例中,可能以許多不同方式來定義允許在3D地圖上執行旋轉操作之縮放層級。3. 
圖例及名稱旋轉 一些實施例之地圖繪製應用程式使用新穎技術當地圖視圖旋轉時調整出現在地圖視圖中之文字及/或符號或保持文字及/或符號未經調整。圖14依據UI操作之四個階段1405至1420說明此新穎方法之一個實例。在此實例中,名稱Apple Inc.出現在一方位處,該方位係1 Infinite Loop, Cupertino California。 在第一階段1405中,名稱Apple Inc.在一特定地圖視圖中係直立的。在第二階段1410及第三階段1415中,地圖視圖回應於使用者之兩指旋轉操作而旋轉。在此等兩階段中,名稱Apple Inc.經展示為以比地圖之旋轉角度小得多的角度輕微旋轉。名稱Apple Inc.之行為就好像此名稱係在其中心或頂部處釘紮至地圖,但此名稱之重心指向下。因此,每當地圖旋轉時,名稱亦輕微旋轉,但名稱之重心使其旋轉較少且最終使名稱回到其直立位置。在第四階段1420中展示名稱Apple Inc.之此直立位置。此階段展示在旋轉操作已完成之後的地圖視圖。 然而,在旋轉之地圖中維持所有文字及/或符號之恆定直立位置在地圖具有許多文字字元或符號且其中之許多字元或符號正對抗旋轉以保持筆直向上的情況下可能有一點令人分心。因此,對於該等字元及/或符號中之一些,一些實施例之地圖繪製應用程式使用一替代機構來調整該等字元及/或符號在旋轉期間之定向。 圖15依據UI操作之四個階段1505至1520說明一個此替代實例。在此實例中,街道之名稱係在地圖視圖已旋轉一臨限量之後旋轉之字元。 在第一階段1505中,街道名稱係與街道上之向上且向右之行進方向對準。在第二階段1510及第三階段1515中,地圖視圖回應於使用者之兩指旋轉操作而旋轉。在此等兩階段中,因為地圖尚未旋轉必要之臨限量,故尚無街道名稱旋轉。然而,到地圖經旋轉達到其在第四階段1520中的定向時,地圖已充分旋轉而通過一臨限量,此將要求街道名稱中之一些(街道1至4)必須旋轉以維持該等名稱與向上行進方向之對準。4. 縮放及彈跳 圖16說明使用者經由兩指示意動作操作而自3D地圖視圖轉變至2D地圖視圖之一實例。此圖以四個階段1605至1620說明此轉變。在前三個階段1605至1615中,使用者執行一捏合操作,其使應用程式在連續步驟中縮小第一階段中所呈現之3D視圖,直至該視圖變成階段四1620中所說明之2D視圖為止。 替代性地或與上文藉由參看圖6所描述之透視調整特徵相結合,一些實施例之地圖繪製應用程式允許使用者放大及縮小3D地圖之視圖(例如,藉由用兩根手指提供示意動作輸入)。圖17說明由一些實施例之地圖繪製應用程式提供之縮放調整特徵。詳言之,圖17說明處於三個不同階段1701至1703之虛擬攝影機1712,該等階段展示虛擬攝影機1712回應於縮放調整之移動。如所展示,圖17展示3D地圖1710中之一方位,其含有兩個建築物及形成T型匯接點之兩條道路。 第一階段1701展示3D地圖1710,其中在一特定位置處的虛擬攝影機1712指向3D地圖1710。在此位置中,攝影機1712指向一方位,該方位可為器件或正被探索之方位之靜止位置,或在器件之移動方位前方的一移動位置(在地圖用於導航之情況下)。基於虛擬攝影機1712之位置顯現3D地圖視圖導致3D地圖視圖1714。 第二階段1702展示處於一不同縮放層級位置之指向3D地圖1710之虛擬攝影機1712。階段1702展示一使用者已藉由以下操作提供用以增加3D地圖1710之視圖之縮放層級的輸入:使兩根手指彼此靠近地觸碰器件之螢幕且在手指觸碰螢幕的同時使指尖移動分開(例如,一擴張示意動作)。 藉由虛擬攝影機1712沿著線1750移動接近3D地圖1710來完成放大調整。在一些實施例中,地圖繪製應用程式使用以移動虛擬攝影機1712所沿之線1750為由虛擬攝影機1712之前端及虛擬攝影機1712之焦點形成之線。一些實施例之地圖繪製應用程式基於使用者之輸入而使虛擬攝影機1712沿著由虛擬攝影機1712之前端及3D地圖1710中之一方位形成之一線移動以放大3D地圖1710之視圖。 使用此位置處之虛擬攝影機1712顯現一3D地圖視圖導致3D地圖視圖1724,其中建築物及道路與3D地圖視圖1714中所展示之位置相比顯得更近。如虛擬攝影機1712之虛線型式所指示,虛擬攝影機1712沿著線1750移動接近3D地圖1710。 第三階段1703展示處於一不同縮放層級位置之指向3D地圖1710之虛擬攝影機1712。在此階段1703中,使用者已藉由以下操作提供用以減小3D地圖1710之縮放層級的輸入:使兩根手指遠遠分開地觸碰器件之螢幕且在手指觸碰螢幕的同時使指尖移動靠近到一起(例如,一捏合示意動作)。 
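上文圖14與圖15所述之兩種名稱旋轉行為,可用如下假設性草圖示意。「輕微偏轉比例」與「翻轉 180 度以重新對準」皆為此處為說明而假設之簡化,實際行為可能更複雜:

```python
def poi_label_angle(map_rotation_deg, stiffness=0.15):
    """名稱如同在其中心釘紮至地圖且重心朝下:地圖旋轉時僅以小比例輕微偏轉,
    旋轉結束(map_rotation_deg 回到 0)後名稱回到直立位置。"""
    return map_rotation_deg * stiffness

def street_label_angle(street_dir_deg, map_rotation_deg, threshold_deg=90.0):
    """街道名稱與向上之行進方向對準;僅當地圖累計旋轉通過臨限量時,
    才將名稱翻轉 180 度以重新維持與向上行進方向之對準。"""
    if abs(map_rotation_deg) < threshold_deg:
        return street_dir_deg
    return (street_dir_deg + 180.0) % 360.0
```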
藉由使虛擬攝影機1712沿著線1755移動遠離3D地圖1710來完成縮小調整。在一些實施例中,地圖繪製應用程式使用以移動虛擬攝影機1712所沿之線1755為由虛擬攝影機1712之前端及虛擬攝影機1712之焦點形成之線。一些實施例之地圖繪製應用程式基於使用者之輸入而使虛擬攝影機1712沿著由虛擬攝影機1712之前端及3D地圖1710中之一方位形成之一線移動以放大3D地圖1710之視圖。 使用此位置處之虛擬攝影機1712顯現一3D地圖視圖導致3D地圖視圖1734,其中建築物及道路與3D地圖視圖1724中所說明之位置相比顯得更遠。如虛擬攝影機1712之虛線型式所展示,虛擬攝影機1712沿著線1755移動遠離3D地圖1710。 如上所述,圖17說明若干實例縮放調整操作及虛擬攝影機在3D地圖中之用以顯現3D地圖之3D地圖視圖的對應移動。一般熟習此項技術者將認識到,許多不同縮放調整係可能的。另外,一些實施例之地圖繪製應用程式回應於額外及/或不同類型輸入(例如,觸按螢幕、兩次觸按螢幕等)而執行縮放調整操作。 圖18概念性地說明由一些實施例之地圖繪製應用程式提供之用於將虛擬攝影機之位置維持在沿著圓弧之所界定範圍內之特徵。詳言之,圖18說明處於三個不同階段1805至1815之虛擬攝影機1800,該等階段展示維持在圓弧1850之所界定範圍內的虛擬攝影機1800之位置。如圖18中所展示,3D地圖1835中之一方位包括兩個建築物及形成T型匯接點之兩條道路。 第一階段1805展示處於沿著圓弧1850之一特定位置之虛擬攝影機1800。如所展示,圓弧1850表示一所界定範圍(例如,角範圍),虛擬攝影機1800可在該範圍內移動。第一階段1805亦展示沿著圓弧1850之三個位置1855至1865 (例如,透視視角)。在此實例中,地圖繪製應用程式以類似於上文參看圖5所描述之方式的方式使虛擬攝影機1800沿著圓弧1850在圓弧1850之高透視端(例如,當虛擬攝影機1800向下傾斜最大時,沿著圓弧1850之位置)與位置1855之間移動。在第一階段1805中基於虛擬攝影機1800之位置顯現一3D地圖視圖導致3D地圖視圖1825。 當虛擬攝影機1800在向圓弧1850之低透視端移動時通過位置1855時,不管使用者所提供之輸入如何,地圖繪製應用程式使虛擬攝影機1800向圓弧1850之低透視端移動之速度減小(例如,減速)。在一些實施例中,地圖繪製應用程式使虛擬攝影機1800之速度以恆定速率減小,而在一些實施例中,地圖繪製應用程式使虛擬攝影機1800的速度以指數速率減小。在一些實施例中使用用於減小虛擬攝影機1800之速度之額外及/或不同方法。 第二階段1810展示虛擬攝影機1800已移動至沿著圓弧1850之處於或靠近圓弧1850之低透視端處之一位置。如所展示,使用者正藉由用兩根手指觸碰螢幕及在向上方向上拖曳該兩根手指(例如,一撥動示意動作)來提供用以調整3D地圖1835之視圖之透視角度之輸入。回應於該輸入,地圖繪製應用程式在使虛擬攝影機1850向上傾斜的同時使虛擬攝影機1800向圓弧1850之低透視端移動。當虛擬攝影機沿著圓弧1850到達位置1865時,地圖繪製應用程式防止虛擬攝影機1800移動得更低及超出位置1865,即使使用者繼續提供用以減小3D地圖1835之視圖之透視角度之輸入(例如,使用者繼續在螢幕上向上拖曳兩根手指)亦然。 在一些實施例中,當使用者停止提供用以減小3D地圖1835之視圖之透視角度之輸入(例如,使用者將兩根手指抬離觸控螢幕)時,地圖繪製應用程式使虛擬攝影機1800之位置自位置1865沿著圓弧1850向上「彈跳」或「快速移動」至位置1860。當地圖繪製應用程式在彈跳或快速移動運動期間正基於虛擬攝影機1800 之視圖產生或顯現3D地圖1835之3D地圖視圖時,產生之3D地圖視圖提供顯示3D地圖視圖簡短地向下彈跳或快速移動之一彈跳動畫,以便向使用者指示地圖視圖之透視角度不能減小更多。使用以此角度定位之虛擬攝影機1800顯現一3D地圖視圖導致3D地圖視圖1830,其中建築物及道路與地圖視圖1825相比較高。 
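上文「沿由攝影機前端及焦點形成之線移動虛擬攝影機」之縮放調整,可概略示意如下(函式名稱與縮放比例之詮釋為說明而假設):

```python
def zoom_camera(cam, focus, pinch_scale):
    """沿由虛擬攝影機前端與其焦點形成之線移動攝影機:
    pinch_scale > 1(兩指分開之擴張示意動作)拉近 3D 地圖,
    pinch_scale < 1(捏合示意動作)拉遠。"""
    t = 1.0 - 1.0 / pinch_scale  # t > 0 朝焦點前進;t < 0 沿同一線遠離焦點
    return tuple(c + (f - c) * t for c, f in zip(cam, focus))
```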
第三階段1815展示地圖繪製應用程式回應於使用者停止提供輸入而已使虛擬攝影機1800之位置彈跳或快速移動至位置1860之後的虛擬攝影機1800。不同實施例使用不同技術來實施虛擬攝影機1800之彈跳或快速移動。舉例而言,一些實施例之地圖繪製應用程式開始使虛擬攝影機1800沿著圓弧1850在一經定義距離中快速地加速,或直至虛擬攝影機1800達到一經定義速度。接著,地圖繪製應用程式使虛擬攝影機1800沿著圓弧1850在距位置1860之剩餘距離中減速。在一些實施例中使用用以實施彈跳或快速移動效應之其他方式。在第三階段1815中使用定位於沿著圓弧1850之位置1860處之虛擬攝影機1800顯現一3D地圖視圖導致3D地圖視圖1840,與地圖視圖1830相比,在地圖視圖1840中,建築物顯得稍小且較扁平,且道路顯得稍小。 如上所述,圖18說明用於防止虛擬攝影機移動超出圓弧之低透視端之一技術。替代防止虛擬攝影機移動超出圓弧之低透視端或與防止虛擬攝影機移動超出圓弧之低透視端相結合,一些實施例之地圖繪製應用程式利用用於防止虛擬攝影機移動超出圓弧之高透視端之一類似技術。另外,圖18展示以下位置之一實例:減慢虛擬攝影機的沿著一圓弧之一位置;用以防止虛擬攝影機移動通過的沿著該圓弧之一位置;及虛擬攝影機快速移動或彈跳回至的沿著該圓弧之一位置。不同實施例以許多不同方式定義該等位置。舉例而言,在一些實施例中,減慢虛擬攝影機的沿著該圓弧之位置與虛擬攝影機快速移動或彈跳回至的沿著圓弧之位置相同或相近。5. 顯現模組 圖19概念性地說明由一些實施例之地圖繪製應用程式執行以便顯現地圖以供在用戶端器件處(例如在用戶端器件之顯示器上)顯示之處理或地圖顯現管線1900。在一些實施例中,地圖顯現管線1900可被共同稱為地圖顯現模組。此處理管線之更詳細型式係描述於上文併入之美國專利申請案13/632,035中。如所說明,處理管線1900包括底圖擷取器1905、一組網格建立器1915、一組網格建立處理器1910、控制器1975、底圖提供器1920、虛擬攝影機1930及地圖顯現引擎1925。 在一些實施例中,該等底圖擷取器1905根據來自網格建立器1915的對地圖底圖之請求而執行用以擷取地圖底圖之各種程序。如下文所描述,該等網格建立器1915識別建立地圖底圖之各別網格所需之現有地圖底圖(係儲存於一地圖繪製服務伺服器上或執行處理管線1900之器件上之一快取記憶體中)。該等底圖擷取器1905接收對地圖底圖之請求,判定在何處擷取該等地圖底圖最佳(例如,自地圖繪製服務、自器件上之一快取記憶體等),且在需要時解壓縮該等地圖底圖。 藉由底圖提供器1920執行個體化一些實施例之該等網格建立器1915 (亦被稱為底圖來源),以便建立視圖底圖之不同層。視地圖繪製應用程式正顯示之地圖之類型而定,底圖提供器1920可執行個體化不同數目個及不同類型網格建立器1915。舉例而言,對於低空俯瞰(flyover) (或衛星)視圖地圖,底圖提供器1920可能僅執行個體化一個網格建立器1915,此係因為一些實施例之低空俯瞰地圖底圖不含有資料之多個層。事實上,在一些實施例中,低空俯瞰地圖底圖含有在地圖繪製服務處產生之已建立網格,其中將低空俯瞰影像(藉由衛星、飛機、直升機等拍攝)用作為網格之紋理。然而,在一些實施例中,可執行個體化額外網格建立器以用於在應用程式處於混合模式下時產生標籤以覆蓋於該等低空俯瞰影像上。對於2D或3D顯現之向量地圖(亦即,非衛星影像地圖),一些實施例執行個體化單獨網格建立器1915以針對以下各者建立網格以併入至地圖中:土地覆蓋物多邊形資料(例如,公園、水體等)、道路、名勝記號、點標籤(例如,公園之標籤等)、道路標籤、交通(若顯示交通)、建築物、點陣資料(針對特定縮放層級下之特定物件),以及資料之其他層。產生低空俯瞰視圖地圖係詳細描述於題為「3D Streets」之PCT申請案PCT/EP2011/054155中。PCT申請案PCT/EP2011/054155係以引用方式併入本文中。 一些實施例之網格建立器1915自底圖提供器1920接收「空」視圖底圖且將「已建立」視圖底圖傳回至底圖提供器1920。亦即,底圖提供器1920將一或多個視圖底圖(未圖示)發送至該等網格建立器1915中之每一者。視圖底圖中之每一者指示世界之一區域,針對該區域繪出網格。在接收到此視圖底圖時,網格建立器1915識別所需的來自地圖繪製服務之地圖底圖,且將其清單發送至該等底圖擷取器1905。 
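上文圖18所述「將攝影機位置維持在圓弧之所界定範圍內,釋放後先加速、後減速地彈跳回靜止位置」之行為,可概略示意如下。以 t 表示沿圓弧之參數位置;範圍端點、影格數與緩動函數皆為說明而假設:

```python
def clamp_on_arc(t, low_stop=0.1, high_stop=1.0):
    """將虛擬攝影機沿圓弧之參數位置 t 夾在所界定範圍內,
    防止其移動超出圓弧之低透視端(low_stop)或高透視端(high_stop)。"""
    return max(low_stop, min(high_stop, t))

def snap_back_frames(start_t, rest_t, frames=8):
    """釋放輸入後自極限位置「彈跳」或「快速移動」回靜止位置之簡化動畫:
    以 smoothstep 緩動先加速、後減速,回傳各影格沿圓弧之參數位置。"""
    out = []
    for i in range(1, frames + 1):
        u = i / frames
        s = u * u * (3.0 - 2.0 * u)  # 前半段加速,後半段減速
        out.append(start_t + (rest_t - start_t) * s)
    return out
```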
在自底圖擷取器1905接收回底圖時,網格建立器使用儲存於底圖中之向量資料來建立由視圖底圖描繪之區域之多邊形網格。在一些實施例中,網格建立器1915使用若干不同網格建立處理器1910來建立網格。此等功能可包括網格產生器、三角測量儀、陰影產生器及/或紋理解碼器。在一些實施例中,此等功能(及額外網格建立功能)可供每一網格建立器使用,其中不同網格建立器1915使用不同功能。每一網格建立器1915在建立其網格之後將其網格層經填充的視圖底圖傳回至底圖提供器1920。 底圖提供器1920自控制器1975接收表示待顯示之地圖視圖(亦即,自虛擬攝影機1930可見之體積)之一特定視圖(亦即,體積或視見平截頭體)。底圖提供器執行任何揀選(culling) (例如,識別將顯示於視圖底圖中之表面區域),接著將此等視圖底圖發送至網格建立器1915。 底圖提供器1920接著自網格建立器接收已建立的視圖底圖,且在一些實施例中,該底圖提供器使用來自虛擬攝影機1930之特定視圖對已建立網格執行揀選(例如,移除離得過遠之表面區域、移除完全在其他物件後面之物件等)。在一些實施例中,底圖提供器1920在不同時間自不同網格建立器接收已建立的視圖底圖(例如,歸因於用以完成複雜度較高及較低之網格的不同處理時間、在自該等底圖擷取器1905接收必需地圖底圖之前所經過之不同時間等)。一旦已傳回視圖底圖之層之全部,一些實施例之底圖提供器1920便將該等層置放在一起且將資料釋放至控制器1975以用於顯現。 虛擬攝影機1930產生供管線1900顯現之體積或表面,且將此資訊發送至控制器1975。基於將自其顯現地圖之特定方位及定向(亦即,使用者自其「檢視」地圖的3D空間中之點),虛擬攝影機識別一視場以實際發送至底圖提供器1920。在一些實施例中,當地圖繪製應用程式正顯現3D透視圖以用於導航時,根據一演算法判定虛擬攝影機之視場,該演算法基於使用者器件之移動而以規則間隔產生一新虛擬攝影機方位及定向。 在一些實施例中,控制器1975負責管理底圖提供器1920、虛擬攝影機1930及地圖顯現引擎1925。在一些實施例中,可實際執行個體化多個底圖提供器,且控制器將若干視圖底圖(例如,地圖底圖及建築物底圖)置放在一起以產生交遞至地圖顯現引擎1925之一場景。 地圖顯現引擎1925負責基於發送自虛擬攝影機之網格底圖(未圖示)而產生用以輸出至一顯示器件之一圖式。一些實施例之地圖顯現引擎1925具有若干子程序。在一些實施例中,藉由一不同子程序來顯現每一不同類型地圖元素,其中顯現引擎1925處置對不同物件層之遮蓋(例如,將標籤置放於不同建築物上方或後面、在土地覆蓋物之上產生道路等)。此等顯現程序之實例包括道路顯現程序、建築物顯現程序,及標籤顯現程序、植被顯現程序、點陣交通顯現程序、點陣道路顯現程序、衛星顯現程序、多邊形顯現程序、背景點陣顯現程序等。 現將描述一些實施例中之顯現管線1900之操作。基於用以在一特定縮放層級下檢視一特定地圖區之使用者輸入,虛擬攝影機1930指定自其檢視地圖區之一方位及定向,且將此視見平截頭體或體積發送至控制器1975。控制器1975執行個體化一或多個底圖提供器。雖然在此圖中展示一個底圖提供器1920,但一些實施例允許同時執行個體化多個底圖提供器。舉例而言,一些實施例執行個體化用於建築物底圖及用於地圖底圖之單獨底圖提供器。 底圖提供器1920執行對於產生空視圖底圖(其識別地圖區之需要建立網格之區)所必需之任何揀選,且將空視圖底圖發送至網格建立器1915,針對繪出地圖之不同層(例如,道路、土地覆蓋物、POI標籤等)執行個體化該等網格建立器。該等網格建立器1915使用自地圖繪製服務接收之一資訊清單,其識別在地圖繪製服務伺服器上可用之不同底圖(亦即,作為四叉樹之節點)。網格建立器1915向底圖擷取器1905請求特定地圖底圖,該等底圖擷取器1905將請求之地圖底圖傳回至網格建立器1915。 一旦一特定網格建立器1915已接收到其地圖底圖,該網格建立器便開始使用儲存於地圖底圖中之向量資料來建立用於發送自底圖提供器1920之視圖底圖之網格。在建立用於其地圖層之網格之後,網格建立器1915將已建立視圖底圖發送回至底圖提供器1920。底圖提供器1920等待,直至其已自各種網格建立器1915接收到所有視圖底圖,接著將此等視圖底圖層化在一起且將完成之視圖底圖發送至控制器1975。控制器將來自其所有底圖提供器之所傳回底圖(例如,地圖視圖底圖及建築物視圖底圖)拼接在一起,且將此場景發送至顯現引擎1925。地圖顯現引擎1925使用地圖底圖中之資訊來繪出場景以供顯示。II. 方位搜尋 A. 搜尋欄位行為 1. 
搜尋欄位功能及外觀 在一些實施例中,地圖繪製應用程式之搜尋欄位係另一UI工具,應用程式使用該工具來使不同模態之間的轉變順暢。在一些實施例中,一使用者可藉由在搜尋欄位中觸按來起始一搜尋。此觸按指引應用程式呈現一動畫,該動畫(1)呈現一螢幕小鍵盤且(2)開啟充滿有價值完成之一搜尋表。此表具有一些重要之微小區別(subtleties)。當觸按搜尋欄位時且在編輯字詞之前,或當搜尋欄位為空時,該表含有「新近使用」之一清單,在一些實施例中,「新近使用」係使用者已請求之新近搜尋及路線指引。此使快速地引出新近存取之結果非常容易。 在搜尋欄位中進行任何編輯之後,由來自本端來源(例如,書籤、聯絡人、新近搜尋、新近路線指引等)及遠端伺服器建議兩者之搜尋完成來填充該表。將使用者之聯絡人卡片併入至搜尋介面中為該設計增添了額外靈活性。當展示新近使用時,在一些實施例中,始終提供自當前方位至使用者家之一路線,而在其他實施例中,在被視為「適當」之內容脈絡中才提供該路線。又,當搜尋字詞匹配一地址標籤之至少部分(例如,「Work」之「Wo」)時,在一些實施例中,應用程式呈現作為搜尋表中之一完成的使用者之加標籤地址。此等行為合在一起使搜尋UI成為自多種來源在地圖上獲得結果之極有力方式。除了允許一使用者起始搜尋之外,在一些實施例中,主要地圖視圖中存在搜尋欄位亦允許使用者看到對應於地圖上之搜尋結果之查詢及藉由清除查詢來移除彼等搜尋結果。a) 檢視新近使用 如上所述,當最初觸按搜尋欄位時且在提供或編輯任何搜尋字詞之前,或當搜尋欄位為空時,搜尋表顯示新近所搜尋字詞及所搜尋路線指引之一清單。圖20說明為了顯示具有使用者之新近搜尋及新近路線指引之清單之搜尋表的使用者與在使用者之器件上執行之應用程式之互動的四個階段2005至2020。 第一階段2005展示在地圖繪製應用程式已開啟之後的器件。如上所述,地圖繪製應用程式之UI具有一開始頁面,在一些實施例中,該開始頁面(1)顯示器件之當前方位之地圖,及(2)配置於頂端列140中及作為浮動控制項之若干UI控制項。在第一階段2005中,使用者觸按當前為空之搜尋欄位165。頂端列140包括指引控制項160及書籤控制項170。 第二階段2010說明在接收到使用者對搜尋欄位之觸按之後應用程式顯示搜尋表2040。不管使用者是否在搜尋欄位中提供任何搜尋字詞,均顯示此搜尋表。搜尋表2040提供所建議搜尋完成之一清單,該等所建議搜尋完成包括新近所搜尋之字詞及路線指引。詳言之,搜尋表指示使用者新近搜尋了「John Smith」及「Pizzeria」。搜尋表中所列出之該等搜尋完成中之每一者亦指示某些其他有用資訊。舉例而言,在「John Smith」旁顯示之圖示2045指示此人包括於使用者之器件上之一聯絡人清單中,且如書籤圖示2050所指示,「Pizzeria」當前被儲存為書籤。 搜尋表亦列出使用者之新近路線指引,該等新近路線指引包括在搜尋表2040之底部處所說明的至「Royal Burgers」之指引。又,搜尋表2040列出用以獲得自使用者之當前方位至使用者之住家地址之指引之一選項,其經說明為搜尋表2040之第一項目。在一些實施例中,當展示新近使用時,始終提供自當前方位至使用者家之一路線。此外,一些實施例之地圖繪製應用程式僅在搜尋欄位為空時顯示新近路線指引。亦即,一旦使用者開始鍵入一搜尋查詢,地圖繪製應用程式就不將新近路線指引包括於所建議搜尋完成之清單中。 第二階段2010亦說明一些實施例之地圖繪製應用程式自頂端列140移除指引控制項160及書籤控制項170。地圖繪製應用程式插入取消控制項2055,其用於取消搜尋表2040且回到先前階段2005中所展示之地圖視圖。 第三階段2015說明使用者選擇作為搜尋表中之第一項目列出的至「Home」之指引選項。藉由在搜尋表之頂部提供最頻繁請求之使用者搜尋及指引請求中之一些(包括至家之指引之選項),應用程式為使用者提供不必大量地導覽應用程式以接收此等結果而快速地獲得用於使用者之最一般請求之資訊的能力。 第四階段2020說明顯示對應於自使用者之當前方位至使用者家之指引之一路線的地圖繪製應用程式。一些實施例之地圖繪製應用程式亦自頂端列140移除搜尋欄位165及取消控制項2055且置放清除控制項255及開始控制項2060。在一些實施例中,開始控制項2060係用於開始根據所選擇路線之導航。在一些實施例中,地圖繪製應用程式使當前方位指示符在顯示區域中居中,使得自顯示區域之中心顯示自當前方位起之路線。 
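上文搜尋表之基本行為(欄位為空時顯示新近使用;一旦鍵入文字,改為顯示匹配之完成,且不再包括新近路線指引)可概略示意如下。匹配方式採最簡單之子字串比對,僅為說明而假設:

```python
def search_table_entries(query, recent_searches, recent_routes, completions):
    """搜尋欄位為空時,搜尋表顯示「新近使用」(新近搜尋與新近路線指引);
    一旦使用者鍵入文字,改為顯示匹配之搜尋完成,並移除新近路線指引。"""
    if query == "":
        return recent_searches + recent_routes
    q = query.lower()
    return [c for c in completions if q in c.lower()]
```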
使用者亦可在搜尋欄位中提供搜尋查詢。當使用者在搜尋欄位中鍵入完整搜尋字詞時,一些實施例之地圖繪製應用程式提供匹配或包括迄今已在搜尋欄位中鍵入之搜尋字詞之項目之一清單。對於每一特定搜尋,使用者具有用以自搜尋表中所顯示之項目之清單進行選擇之選項,或使用者可在搜尋字詞與地圖及使用者之當前方位有關時選擇小鍵盤上之「搜尋」按鈕來執行對搜尋字詞之完整搜尋。b) 完整搜尋字詞 圖21以四個階段2105至2120說明使用者在搜尋欄位中鍵入搜尋查詢及執行搜尋之一實例。在第一階段2105中,使用者觸按搜尋欄位165。在此實例中,假設使用者在鍵入此搜尋查詢之前尚未進行一搜尋,或搜尋歷史已由使用者或地圖繪製應用程式清除。因此,地圖繪製應用程式不提供填有新近所搜尋之字詞及路線指引之搜尋表。 下一階段2110展示在使用者已鍵入搜尋字詞「Pavilion」之後的器件。搜尋表2155說明針對使用者鍵入之查詢字詞「Pavilion」的搜尋完成之一清單。列出之搜尋完成包括針對「Pavilion Market」、「Pavilion Theaters」及自使用者之當前方位至「Bowling Pavilion」之指引的所建議搜尋。然而,在一些實施例中,針對路線之搜尋完成不展示文字「當前方位」。實情為,如所展示,地圖繪製應用程式顯示目的地之地址。此係因為一些實施例之地圖繪製應用程式假設:使用者鍵入字母就指示使用者意欲達到匹配搜尋查詢之目的地,且因此,與路線係自當前方位至目的地之指示相比,目的地地址為對使用者更有用的資訊。 此外,搜尋表中的所建議搜尋完成之清單未顯示任何書籤或聯絡人,此係因為在此實例中不存在本端地儲存於使用者之器件上之匹配「Pavilion」之匹配項目。此外,因為使用者尚未執行針對「Pavilion」之任何新近搜尋或指引,所以在一些實施例中,已自遠端伺服器獲得搜尋表中所列出的所有所建議搜尋完成。在下文進一步描述自遠端伺服器獲得資料。 此外,所列出之搜尋完成可包括匹配當地地理(例如,街道、鄰近地區、城市、州或國家)之搜尋完成。舉例而言,「Pavilion Street」或「City of Pavilion」可在此街道或城市存在時出現在搜尋完成清單中。又,當使用者鍵入一地址之部分(例如,「220 Pavilion」)時,若此地址存在(例如,「220 Promenade Drive, Skycity, CA」),則遠端伺服器可擷取針對此地址之最有意義的完成。 第三階段2115說明使用者忽略搜尋表2155中所列出的所建議搜尋完成中之任一者而改為選擇小鍵盤上之「搜尋」按鈕。第四階段2120說明具有用於「Pavilion Theaters」之圖釘2190之一地圖。在一些實施例中,地圖繪製應用程式調整該地圖之縮放層級,如下文進一步描述。在此等實施例中之一些中,當一方位在當前縮放層級下不在該地圖中時,地圖繪製應用程式不顯示用於該方位之圖釘。因而,該地圖中未展示用於「Bowling Pavilion」之圖釘。地圖繪製應用程式亦已自頂端列140移除取消控制項2055且在頂端列140中恢復指引控制項160及書籤控制項170。c) 部分搜尋字詞及自動完成 在搜尋欄位中之任何編輯之後,一些實施例之地圖繪製應用程式立即用自動搜尋完成來填充搜尋表。亦即,當使用者在搜尋欄位中鍵入一搜尋字詞時,地圖繪製應用程式基於在特定時刻已鍵入之字元來提供所建議搜尋完成之一清單。一些實施例之地圖繪製應用程式自本端來源(例如,書籤、聯絡人、新近搜尋、新近路線指引等)且亦自遠端伺服器獲得此等所建議搜尋完成。 圖22說明使用者起始搜尋查詢及瞬間顯示具有推薦之搜尋完成之清單之搜尋表的四個階段2205至2220。在第一階段2205中,使用者正觸按搜尋欄位165以起始一搜尋。第二階段2210說明地圖繪製應用程式呈現螢幕小鍵盤2250及使用者使用小鍵盤2250將字母「P」鍵入至搜尋欄位中。在接收到此第一字母之後,應用程式立即呈現具有自各種來源(例如,本端器件及遠端伺服器建議)收集的所建議搜尋完成之一清單之搜尋表。 搜尋表在其於搜尋欄位中接收到更多使用者輸入(亦即,更多文數字字元及符號)及查詢字詞時將繼續調整並改進(refine)搜尋表中之所建議搜尋完成之清單。在一些實施例中,地圖繪製應用程式在使用者提供更多輸入時調整並改進清單,即使使用者拼錯正在鍵入之字詞亦然。舉例而言,當使用者鍵入「Piza」時,地圖繪製應用程式將展示含有正確拼寫之詞「Pizza」的搜尋完成。一些實施例之地圖繪製應用程式使用一拼字檢查及校正機構以及其他資料(例如,搜尋歷史等)來找到類似拼寫之詞以產生搜尋完成之建議清單。 
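上述容忍拼字錯誤之完成建議(例如「Piza」仍匹配「Pizza」),其中一種可能之做法是對查詢與候選詞前綴計算編輯距離;下方草圖即以 Levenshtein 距離為例,臨限值與比對策略皆為說明而假設,並非實際之拼字檢查機構:

```python
def edit_distance(a, b):
    """兩字串之 Levenshtein 編輯距離(插入、刪除、取代各計 1)。"""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1,
                           prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def fuzzy_completions(query, candidates, max_typos=1):
    """容忍拼字錯誤之完成建議:候選詞之某一前綴與查詢之編輯距離
    不超過 max_typos 時即視為匹配(例如 "Piza" 匹配 "Pizza")。"""
    q = query.lower()
    out = []
    for c in candidates:
        prefix = c.lower()[:len(q) + max_typos]
        # 在與查詢長度相近之各前綴中取最小編輯距離
        best = min(edit_distance(q, prefix[:k])
                   for k in range(max(0, len(q) - max_typos),
                                  len(prefix) + 1))
        if best <= max_typos:
            out.append(c)
    return out
```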
每一搜尋完成可源於多種來源,既可本端地在使用者之器件上,又可源於遠端來源及伺服器。一些實施例之地圖繪製應用程式先於來自遠端來源及伺服器之搜尋完成而列出來自本端來源之搜尋完成。舉例而言,第二階段2210中所說明之搜尋表自清單之頂部至底部以「Paul」、「Pizza」、至「Police Station」之指引及至「Promenade」之指引的次序列出包括該等項目的若干搜尋完成。「Paul」係源於使用者之器件上之一聯絡人卡片;「Pizza」係源於儲存於使用者之器件上之一搜索歷史檔案中之一先前使用者搜尋;且至「Police Station」至指引係源於一新近所搜尋之路線指引。如上所述,在一些實施例中,針對路線之搜尋完成未展示文字「當前方位」。實情為,此等實施例中之一些之地圖繪製應用程式顯示目的地之地址。在一些情況下,地圖繪製應用程式不指示路線係自當前方位開始,也不顯示目的地之地址。舉例而言,至「Police Station」之指引未另外展示地址,因為搜尋完成本身包括警察局之地址。 然而,「Promenade」係自遠端伺服器獲得之搜尋完成。遠端地圖伺服器可基於地圖伺服器之其他使用者自器件之當前方位已使用之搜尋查詢建議此完成。因此,「Promenade」係在地圖繪製應用程式本端地獲得的三個所建議完成之後列出於搜尋表2255之底部。如下文進一步描述,一些實施例之地圖繪製應用程式對本端獲得之搜尋完成進行排序。 在一些實施例中,地圖繪製應用程式之所建議完成及搜尋結果係基於器件之當前方位。亦即,在位於自器件之當前方位起之一範圍內的地圖之區內的所建議完成及搜尋結果。替代性地或相結合地,當前顯示於顯示區域中的地圖之區係一些實施例之地圖繪製應用程式的建議及搜尋結果所基於之區域。在此等實施例中,地圖繪製應用程式偏愛在地圖之當前所顯示區內之搜尋完成及搜尋結果。 此外,地圖繪製應用程式在定義及調整所建議搜尋完成之清單時考慮其他因素。在一些實施例中,地圖繪製應用程式考慮時間因素。舉例而言,地圖繪製應用程式將搜尋歷史(亦即,先前所使用之搜尋完成之清單)分割成一天之不同時段(例如,深夜、早晨等)、一星期之不同時段、一月之不同時段及/或一年之不同時段,且偏愛進行當前搜尋之時間所屬的特定時段集合的搜尋完成及搜尋結果。 第三階段2215說明使用者自搜尋表中所顯示之搜尋完成之清單選擇「Pizza」。第四階段2220說明地圖繪製應用程式現顯示一地圖,其具有經說明為地圖上之橫幅2290及圖釘的「Pizza Place」之方位。使用者可接著選擇顯示於該橫幅上之各種圖示來執行多種功能,該等功能包括獲得對餐館之評論、調用至餐館之導航或接收至餐館之指引,及如下文進一步描述之各種其他特徵。 橫幅2290包括路線提取控制項2295 (經描繪為展示汽車之圖示),該控制項用於提取自當前方位至彼圖釘之一路線(例如,一駕駛路線)而不用離開地圖視圖。該路線提取控制項亦係用於起始導航體驗。舉例而言,在接收到對該路線提取控制項之選擇時,一些實施例之地圖繪製應用程式提供自器件之當前方位至圖釘之方位的一或多條路線。當選擇一路線時,地圖繪製應用程式可開始以導航模式或以路線檢查模式操作。 圖23說明使用者起始搜尋查詢及瞬間顯示具有推薦之搜尋完成之清單之搜尋表的三個階段2305至2315。除地圖係在3D視圖中外,圖23之三個階段2305至2315類似於圖22之階段2205、2215及2220。在第一階段2305中,使用者正觸按搜尋欄位165以起始一搜尋。第二階段2310說明在接收搜尋欄位165中之一字母之後,應用程式立即呈現具有自各種來源(例如,本端器件及遠端伺服器建議)收集的所建議搜尋完成之一清單之搜尋表。 第二階段2310亦說明使用者自搜尋表中所顯示之搜尋完成之清單選擇「Pizza」。第三階段2315說明,地圖繪製應用程式現顯示一地圖,其具有在地圖上分別說明為相關聯的圖釘2390及2395的「Pizza PLC1」之方位及「Pizza PLC2」之方位(橫幅未圖示)。使用者可接著選擇顯示於該橫幅上之各種圖示來執行多種功能,該等功能包括獲得對餐館之評論、調用至餐館之導航或接收至餐館之指引,及如下文進一步描述之各種其他特徵。d) 偏愛本端結果 為了提供搜尋表內之某些搜尋完成,一些實施例之地圖繪製應用程式分析儲存於使用者之器件中之多種本端資訊。舉例而言,每一使用者之器件可含有一聯絡人清單,其含有若干聯絡人卡片。每一聯絡人卡片可含有每一聯絡人之多種資訊及標籤。舉例而言,每一聯絡人卡片可含有具有關於以下各者之資訊(若適用)的聯絡人標籤:聯絡人之名及姓、公司名稱、住家地址、工作地址、行動電話號碼、工作電話號碼、電子郵件地址、URL及各種其他資訊。 
同樣地,聯絡人清單可含有對應於地圖繪製應用程式之特定使用者之一特定聯絡人卡片,地圖繪製應用程式可將該特定聯絡人卡片指定為「ME卡片」。地圖繪製應用程式可頻繁地存取使用者之ME卡片以利用某些應用程式特徵所需之某些資訊,該等應用程式特徵包括獲得自使用者之當前方位至使用者之住家地址或工作地址之指引之特徵,該等指引係在地圖繪製應用程式內在眾多不同內容脈絡中提供。詳言之,一些實施例之地圖繪製應用程式在搜尋表之頂部列出地圖繪製應用程式自ME卡片獲得之搜尋完成。 圖24說明使用者鍵入部分地址及獲得至源於使用者之聯絡人或ME卡片的使用者之住家地址之指引的四個階段2405至2420。具體言之,圖24說明一些實施例之地圖繪製應用程式在搜尋表2455頂部列出使用者之住家地址。在第一階段2405中,使用者觸按搜尋欄位以開始鍵入使用者之搜尋查詢資訊之程序。 在第二階段2410期間,使用者已鍵入部分數字「12」,其可至少部分匹配一地址或一搜尋字詞。在一些實施例中,應用程式首先使使用者鍵入之搜尋查詢與儲存於使用者之器件上之使用者之ME卡及使用者之聯絡人清單中所含之資訊匹配。若應用程式偵測到搜尋查詢與使用者之ME卡之間的任何匹配聯絡人標籤,則一些實施例之應用程式將在經識別聯絡人中發現之資訊顯示作為搜尋表之頂部處所列出之所建議搜尋完成。在一些實施例中,地圖繪製應用程式僅在匹配資訊係聯絡人之地址時才將在經識別聯絡人中發現之資訊顯示作為一所建議搜尋完成。在此所建議搜尋完成之下,地圖繪製應用程式顯示文字(例如,「當前方位」)以指示路線係自當前方位至家。然而,如上所述,一些實施例之地圖繪製應用程式改為顯示目的地地址而非顯示此文字,或與顯示此文字相結合地顯示目的地地址,或不顯示該文字及目的地地址。應用程式將在源於ME卡片之搜尋完成下方顯示其他匹配聯絡人卡片。在一些實施例中,地圖繪製應用程式亦可呈現與ME卡片無關之搜尋完成。舉例而言,當使用者鍵入「12」時,地圖繪製應用程式將呈現來自本端先前搜尋完成之匹配搜尋完成(包括社交網站訊息)以及來自遠端伺服器之匹配搜尋完成。 第二階段2410說明應用程式自動地呈現源於使用者之ME卡片的使用者之加標籤住家地址作為搜尋表中之完成。應用程式偵測到使用者之所鍵入查詢「12」與含有「1234 A Street, Santa…」之住家地址標籤的使用者之ME卡片之間的匹配。因為此匹配係源於使用者之ME卡片,所以應用程式使此本端來源之資訊具有優先權且在搜尋表中靠近所建議搜尋完成之清單之頂部處顯示該資訊。搜尋表亦顯示其他搜尋完成,包括「Bob Smith」、「12 Hour Fitness」及「John Doe」,其全部源於各種本端來源及遠端來源。舉例而言,Bob Smith當前儲存於使用者器件上之聯絡人清單中。 第三階段2415說明使用者選擇至家之指引。第四階段2420說明應用程式顯示具有對應於自使用者之當前方位至使用者家之指引的路線之地圖。 應用程式亦可分析儲存於使用者之聯絡人清單或聯絡人卡片上之其他資訊或標籤。圖25以四個階段2505至2520說明使用者鍵入部分搜尋字詞及獲得至源於使用者之聯絡人卡片的使用者之工作地址之指引的一實例。具體言之,此圖說明一些實施例之地圖繪製應用程式在搜尋表之頂部或靠近搜尋表之頂部處列出至工作地址之指引。 在第一階段2505中,使用者觸按搜尋欄位165以開始鍵入使用者之搜尋查詢之程序。在第二階段2510期間,使用者已鍵入部分搜尋字詞「Wo」,應用程式將其偵測為儲存於使用者之聯絡人卡片(或ME卡片)之工作標籤欄位中之「Work」或「A's Work」之地址標籤的一部分。在一些實施例中,應用程式呈現使用者之加標籤工作地址以作為搜尋表中之一完成。如所展示,使用者之加標籤工作地址在搜尋表2555之頂部。在此所建議搜尋完成之下,地圖繪製應用程式顯示文字(例如,「當前方位」)以指示路線係自當前方位至工作。然而,如上所述,一些實施例之地圖繪製應用程式改為可顯示目的地地址而非顯示此文字,或與顯示此文字相結合地顯示目的地地址,或不顯示該文字及目的地地址。 如上所述,一些實施例之地圖繪製應用程式靠近搜尋表中之項目之清單之頂部,但在匹配且源於使用者之ME卡片之資訊下方顯示匹配且源於使用者之器件之聯絡人清單的任何資訊。舉例而言,搜尋表2555亦在接近搜尋表之頂部,但在使用者之加標籤工作地址下方處列出源於儲存於使用者之聯絡人清單中之另一聯絡人卡片的用於「Bob Woods」之聯絡人資訊。搜尋表接下來列出「World Market」以作為由一遠端伺服器提供之所建議搜尋完成。 
在搜尋表中列出每一所建議搜尋完成的次序可源於對搜尋查詢字詞與所建議搜尋完成之間的關係之強度進行排名的各種排名演算法及啟發學習法。在下文進一步描述一個此啟發學習法。在一些實施例中,源於本端來源(例如,一聯絡人清單)之搜尋完成通常擁有比源於遠端伺服器之資訊高的優先權。此等搜尋完成同樣係顯示於搜尋表之頂部處。 第三階段2515說明使用者自搜尋表選擇對應於至「Work」之指引的清單項目。第四階段2520說明應用程式顯示具有對應於自使用者之當前方位至使用者之工作之指引的路線之地圖。e) 書籤 在一些實施例中,亦可藉由存取本端地儲存於使用者之器件上之多種其他資訊來獲得搜尋表中所列出之搜尋完成。舉例而言,一些實施例可分析儲存於使用者之器件上之書籤資訊。每一書籤可含有使用者已指示為感興趣的地點之各種方位資訊。圖26說明使用者鍵入部分搜尋查詢及自搜尋表中的搜尋完成之清單選擇書籤之四個階段2605至2620。 在第一階段2605中,使用者觸按搜尋欄位以開始鍵入使用者之搜尋資訊之程序。在第二階段2610期間,使用者已在搜尋欄位中鍵入部分搜尋字詞「Bur」。應用程序使此部分查詢字詞匹配於各種本端及遠端建議。應用程序匹配包括「Burt Smith」、「Burger Palace」及至「Royal Burgers」之指引。針對此搜尋查詢,應用程式將使用者之加標籤書籤呈現作為搜尋表中之一所建議搜尋完成。詳言之,應用程序已使「Bur」匹配於「Burger Palace」,因為此餐館當前作為書籤儲存於使用者之器件中,如在「Burger Palace」旁之書籤圖示2630所指示。 在一些實施例中,與使用者之器件之書籤匹配之資訊可以一特定分類次序顯示於搜尋表內。一些實施例在源於使用者之聯絡人清單之搜尋完成下方顯示源於書籤之搜尋完成。然而,源於書籤之搜尋完成仍可顯示於遠端伺服器搜尋建議中之任一者上方。階段2610中之搜尋欄位說明聯絡人「Burt Smith」仍顯示於所建議搜尋完成之清單的頂部處,因為此搜尋完成係源於使用者之器件上之一聯絡人清單。同樣地,用於「Burger Palace」的使用者之書籤係顯示為搜尋表中之第二項目,且自當前方位至「Royal Burgers」之指引係顯示於清單之底部。 在一些實施例中,搜尋表可定義用於顯示源於器件上之本端來源之項目的不同優先次序。舉例而言,一些實施例可將搜尋歷史及使用者選擇不同的所建議搜尋完成之頻率納入考量因素以便判定藉以在搜尋表中呈現所建議搜尋完成之特定次序。舉例而言,若使用者頻繁地搜尋並選擇對應於器件上之書籤的「Burger Palace」,則應用程式可在清單之頂部顯示此所建議搜尋完成,且將對應於聯絡人卡片之「Burt Smith」顯示作為搜尋表中所顯示之清單中之第二項目。第三階段2615展示使用者選擇用於「Burger Palace」之書籤。第四階段2620展示用於「Burger Palace」之圖釘及橫幅。f) 對搜尋完成排序 如上所述,針對任何特定搜尋查詢,由來自1)本端來源(例如,書籤、聯絡人、新近搜尋、新近路線指引等)及2)遠端伺服器來源之搜尋完成來填充搜尋表。所建議搜尋完成顯示於搜尋表中之特定顯示次序係使用多種啟發學習法導出。在一些實施例中,一般而言,顯示次序使源於本端來源之搜尋完成優先於源於遠端來源之搜尋完成。 圖27概念性地說明一些實施例執行的程序2700,其用以判定在搜尋表中顯示來自不同來源之所建議搜尋完成之次序。在一些實施例中,程序2700係由地圖繪製應用程式執行。一些實施例之程序2700在使用者開始搜尋欄位中之搜尋查詢時開始。 程序2700藉由擷取或接收(在2705處)搜尋欄位中所鍵入之搜尋查詢來開始。程序2700接著自一或多個本端來源擷取(在2710處)匹配資訊。如上所述,在一些實施例中,本端來源包括使用者之聯絡人清單、書籤、搜尋歷史及新近指引。程序2700可將查詢字詞與儲存於聯絡人卡片中之若干資訊及標籤(包括地址標籤、電話號碼、名字或URL以及儲存於聯絡人卡片中之其他資訊)匹配。該程序亦可將搜尋查詢與其他本端資訊匹配,其他本端資訊包括書籤及使用者搜尋歷史資訊。 
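上述「本端來源優先於遠端來源、每一來源僅取最上部數個匹配、並濾除重複」之排序原則,可概略示意如下。來源之優先次序(ME 卡片、聯絡人、書籤、搜尋歷史)與每來源上限皆為依內文假設之簡化,實際排名演算法另涉及使用頻率與關聯性強度等準則:

```python
# 本端來源之假設優先次序:數值越小優先權越高
LOCAL_PRIORITY = {"me_card": 0, "contacts": 1, "bookmarks": 2, "history": 3}

def order_completions(local_matches, remote_matches, per_source_limit=3):
    """local_matches 為 (完成文字, 來源) 之清單;依來源優先次序排列,
    每一來源最多取 per_source_limit 個,再將遠端伺服器建議列於其後並去除重複。"""
    ordered, seen, counts = [], set(), {}
    for text, source in sorted(local_matches,
                               key=lambda m: LOCAL_PRIORITY[m[1]]):
        if text in seen or counts.get(source, 0) >= per_source_limit:
            continue
        seen.add(text)
        counts[source] = counts.get(source, 0) + 1
        ordered.append(text)
    for text in remote_matches:  # 遠端建議一律列於本端匹配之後
        if text not in seen:
            seen.add(text)
            ordered.append(text)
    return ordered
```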
接下來,程序2700判定(在2715處)來自本端來源之已擷取匹配之顯示次序。在一些實施例中,程序2700首先基於某些準則(例如,對完成之使用頻率、搜尋查詢與匹配之間的關聯性之強度等)對來自每一本端來源之匹配排序,且僅選取來自每一本端來源之某數目個(例如,三個)最上部匹配。在一些實施例中,程序2700基於本端來源對已擷取匹配排序。舉例而言,程序2700以ME卡片、聯絡人清單、書籤及搜尋歷史之次序顯示該等匹配。其他實施例可具有不同次序。程序2700接著根據所判定(在2715處)之顯示次序在搜尋表中顯示(在2720處)來自本端來源之已擷取匹配。 程序2700接著藉由將搜尋查詢發送至遠端來源而自遠端來源(例如,遠端地圖伺服器)接收或擷取(在2725處)搜尋建議。在一些實施例中,程序2700同時將搜尋查詢發送至遠端來源且在本端來源中查看以找到匹配資訊。在一些實施例中,伺服器可應用其自身之搜尋排名演算法以識別特定搜尋建議並對該等特定搜尋建議計分。伺服器可接著將特定數目個經識別搜尋結果發送至地圖繪製應用程式,程序2700可使用自身之啟發學習法(例如,對完成之使用頻率、搜尋查詢與匹配之間的關聯性之強度等)對該等經識別搜尋結果排序及過濾(在2730處)。舉例而言,在一些實施例中,程序2700可建議搜尋表中之所建議搜尋完成之清單中的最上部三個伺服器搜尋建議。接下來,程序2700在來自本端來源之匹配下方顯示(在2735處)來自遠端來源之匹配。程序2700接著結束。 一些實施例可立即呈現本端搜尋建議且調整此等建議以在自伺服器接收到遠端伺服器建議時包括該等遠端伺服器建議,此為搜尋程序提供了「準」即時感。舉例而言,若搜尋表提供足夠螢幕空間以列出十個個別搜尋完成或建議,則該程序最初可列出來自本端導出來源(例如,書籤、聯絡人、搜尋歷史)之所有十個此等搜尋建議。該程序可接著在經由網路獲得自伺服器接收之建議時用自伺服器接收之該等建議替換此等本端搜尋建議。程序2700可不斷地當自伺服器接收到可被認為比搜尋表中所列出之資訊更重要的資訊時更新搜尋建議之特定清單。g) 將搜尋結果顯示為清單 圖28說明在器件2800 (例如,平板器件,諸如Apple, Inc.所售之iPad®)上執行一些實施例之地圖繪製應用程式之一實例,與較小器件(例如,智慧型手機,諸如Apple Inc.所售之iPhone®)之顯示區域相比,器件2800具有相對較大之顯示區域。具體言之,圖28以四個不同階段2805至2820說明用以在具有相對較大之顯示區域之器件中顯示搜尋結果之清單的使用者與地圖繪製應用程式之互動。 第一階段2805展示,地圖繪製應用程式具有包括一組控制項之頂端列2806,該組控制項包括搜尋欄位2830及用於展示搜尋結果之一清單之清單檢視控制項2835。在一些實施例中,在基於如所展示之搜尋查詢之一搜尋完成時,地圖繪製應用程式在頂端列2806中顯示此清單檢視控制項2835。在其他實施例中,地圖繪製應用程式將清單檢視控制項2835置放於搜尋欄位2830中。另外在其他實施例中,地圖繪製應用程式藉由使清單檢視控制項235自3D圖示2890下滑出來置放清單檢視控制項2835。 在一些此等實施例中,當地圖繪製應用程式在地圖中顯示搜尋結果時,地圖繪製應用程式顯示清單檢視控制項2835。在此實例中,第一階段2805展示地圖繪製應用程式已執行使用「Pizza」作為搜尋查詢之搜尋。地圖繪製應用程式將搜尋結果顯示為地圖視圖中之兩個圖釘2840及2845。一些實施例之地圖繪製應用程式亦顯示用於該兩個圖釘中之一者的資訊橫幅2846,其指示圖釘所表示之興趣點(POI)係最被建議的結果。第一階段2805亦展示使用者正在選擇清單檢視控制項2835。 第二階段2810展示地圖繪製應用程式正在顯示地圖繪製應用程式已使用搜尋查詢發現之POI之一清單2850。在此實例中,該清單具有三個POI。前兩個POI分別對應於圖釘2845及2840。地圖繪製應用程式未展示對應於第三POI「Pizza Planet」之圖釘,因為第三POI不位於包括前兩個POI的地圖之區內。在一些實施例中,當選自該清單之一POI不位於當前所顯示的地圖之區內時,地圖繪製應用程式將地圖移位以顯示另一區。 第三階段2815展示,地圖繪製應用程式正在接收對第三輸入項之選擇。第四階段2815展示,地圖繪製應用程式已將地圖移位以顯示地圖之包括對應於第三POI之圖釘2855的另一區。在一些實施例中,地圖繪製應用程式在一持續時間中顯示一動畫以展示地圖正被移位至地圖之另一區。在一些實施例中,地圖繪製應用程式顯示用於第三POI之一資訊橫幅,因為用於該POI之圖釘係地圖區中之唯一圖釘。h) 軟體架構 
圖29說明基於搜尋查詢來提供所建議搜尋完成之清單的地圖繪製應用程式之一實例架構。在此實例中,一些實施例之地圖繪製應用程式2900在器件2905中執行。如所展示,地圖繪製應用程式2900包括搜尋完成管理器2910、本端來源管理器2915、遠端來源管理器2920、清單管理器2925、搜尋查詢剖析器2930、新近搜尋完成儲存庫2935及新近路線指引儲存庫2940。此圖亦說明搜尋及地圖伺服器2960。地圖繪製應用程式2900、器件2905以及搜尋及地圖伺服器2960可各自具有眾多其他模組,但為論述簡單起見,未在此圖中加以描繪。 搜尋查詢剖析器2930接收使用者經由器件2905之一輸入管理器(未圖示)而鍵入之搜尋查詢。查詢剖析器2930將經剖析查詢發送至搜尋完成管理器2910,使得搜尋完成管理器2910可產生對遠端來源管理器2920及本端來源管理器2915之搜尋請求。搜尋查詢剖析器2930亦接收在一搜尋欄位(例如,搜尋欄位165)上之一觸按輸入且通知搜尋完成管理器此輸入,使得搜尋完成管理器可自新近搜尋完成儲存庫2935及新近路線指引儲存庫2940擷取新近搜尋完成及新近路線指引。 當搜尋完成管理器2910自搜尋查詢剖析器接收到使用者已在搜尋欄位為空時觸按搜尋欄位之一通知時,搜尋完成管理器2910查找新近搜尋完成儲存庫2935及新近路線指引2940。在一些實施例中,搜尋完成管理器擷取在接收到該觸按輸入之前的某一時間段(例如,小時、天、週等)中所使用之搜尋完成及路線指引。 搜尋完成管理器2910亦指引遠端來源管理器2920及本端來源管理器2915基於經剖析搜尋查詢來尋找搜尋完成。搜尋完成管理器2910接著接收遠端來源管理器2920及本端來源管理器2915傳回之搜尋完成及路線指引。 搜尋完成管理器收集新近搜尋完成、新近路線指引、分別自遠端來源管理器2920及本端來源管理器2915接收之搜尋完成及路線指引,且濾除任何重複之完成及指引。搜尋完成管理器接著將此等完成及指引發送至清單管理器2925。 清單管理器2925基於某些準則來對搜尋完成及駕駛指引排序。如上所述,此等準則包括使用搜尋完成及路線指引之時間、完成及路線指引係來自本端來源抑或遠端來源等。清單管理器2925接著將經排序清單傳遞至器件2905之一顯示管理器(未圖示),使得可為使用者顯示該清單。 搜尋完成管理器亦轉送搜尋請求(亦即,選自搜尋完成清單之完整搜尋查詢或在使用者選擇「鍵入」或搜尋控制項時已鍵入搜尋欄位中之搜尋查詢)及所選擇路線指引,且將該等請求及該等指引傳遞至將使用該等搜尋請求進行搜尋或計算路線之一搜尋請求管理器(未圖示)。搜尋完成管理器2910將該等搜尋請求(亦即,實際用以進行搜尋之搜尋完成)及所選擇路線指引(亦即,開始及目的地方位之識別)分別儲存於新近搜尋完成儲存庫2935及新近路線指引儲存庫2940中。 新近完成儲存庫2935及新近路線指引儲存庫2940係用以儲存新近所使用之搜尋請求及新近已用以計算路線之指引的記憶體空間。在一些實施例中,該兩個儲存庫係用於快速存取之快取記憶體。 本端來源管理器2915查找聯絡人儲存庫2950及書籤儲存庫2955以找到至少部分地匹配自搜尋完成管理器2910接收之經剖析搜尋查詢之聯絡人(例如,ME卡片)及書籤。本端來源管理器2915接著基於匹配之聯絡人及書籤產生搜尋完成且將該等搜尋完成傳回至搜尋完成管理器2910。藉由在器件2905上執行之應用程式來產生、維護及/或存取儲存於儲存庫2950及2955中之聯絡人及書籤,且此等應用程式包括地圖繪製應用程式2900。 遠端來源管理器2920將自搜尋完成管理器2910接收的經剖析之搜尋查詢發送至包括搜尋及地圖伺服器2960之一或多個伺服器(未全部圖示)。遠端來源管理器2920接收回應於發送至搜尋及地圖伺服器2960之搜尋查詢而自伺服器2960傳回之搜尋完成及/或路線指引。遠端來源管理器2920接著將該等完成及路線指引發送至搜尋完成管理器2910。 如所展示,搜尋及地圖伺服器包括搜尋完成儲存庫2965及路線指引儲存庫2970。在一些實施例中,搜尋及地圖伺服器將搜尋請求及用以計算路線之路線指引儲存於儲存庫2965及2970中。搜尋及地圖伺服器2960自執行一些實施例之地圖繪製應用程式(諸如地圖繪製應用程式2900)之執行個體之器件(包括器件2905)接收此等搜尋請求及路線指引。搜尋及地圖伺服器接著產生所建議搜尋完成及路線指引至器件2905。在一些實施例中,搜尋及地圖伺服器包括分別供應地圖資料及產生路線之兩個伺服器。一些實施例之搜尋完成儲存庫2965及路線指引儲存庫2970係用於儲存搜尋請求及路線指引之資料儲存結構。2. 
Clearing Search Results via the Search Field

In addition to allowing a user to initiate a search, the presence of the search field in the primary map view, in some embodiments, also lets the user see the query corresponding to the search results on the map and remove those search results by clearing the query. Figure 30 illustrates, in three stages 3005-3015, a user clearing results from a map 3000.

The first stage 3005 illustrates the map displaying a pin 3025 for "Pizza Place". This pin 3025 may have been placed on the map 3000 through a variety of mechanisms. For instance, the user may have dropped a pin on the map and received a reverse lookup, or the user may have typed a search query for "Pizza Place 321".

The second stage 3010 illustrates the user selecting the "X" button 3030 within the search field 165, which clears any search query displayed in the search field 165. Furthermore, when the search query is cleared, all of the search results (pins) displayed on the map that are related to the displayed search query are also cleared from the map 3000. The third stage illustrates that, after the user selects the "X" button 3030, the search field 165 is now empty and the pin for "Pizza Place" is no longer displayed on the map.

Figure 31 illustrates four stages 3105-3120 of a user's interaction with the application running on the user's device to clear selected search results displayed on the map.

The first stage 3105 shows the device after the mapping application has opened. The second stage 3110 illustrates the application displaying a search table 3140 after receiving the user's tap on the search field. This search table is displayed regardless of whether the user has provided any search terms in the search field. The search table 3140 provides a list of suggested search completions, including recently searched terms and route directions. In particular, the search table indicates that the user recently searched for "John Smith" and "Pizzeria". The search table also lists the user's recent route directions, which include directions to "Royal Burgers" illustrated at the bottom of the search table 3140. Also, the search table 3140 lists an option, illustrated as the first item of the search table 3140, for obtaining directions from the user's current location to the user's home address. The top bar 140 includes the directions control 160 and the bookmarks control 170.

The second stage 3110 also illustrates the user selecting the directions to "Home". Upon this selection, the mapping application of some embodiments removes the directions control 160 and the bookmarks control 170 from the top bar 140 and inserts a cancel control 2055.

The third stage 3115 illustrates the mapping application displaying a route corresponding to the directions from the user's current location to the user's home. As shown, the route has two pins for the starting and ending points of the route. The mapping application of some embodiments also removes the cancel control 2055 from the top bar 140 and places a clear control 255 and a start control 2060. The third stage 3115 also illustrates a selection of the clear control 255. The fourth stage 3120 illustrates that the search field 165 is now empty and the pins for the starting and ending points of the route are no longer displayed on the map, because the mapping application of some embodiments removes these pins from the map upon receiving the selection of the clear control 255.

B.
Zoom Level Settings for Displaying Search Results on the Map

When a user is viewing the map in a particular view and runs a search query, some embodiments transition to a new map view that contains the search results for the user's query. Particular types of transitions may include continuously adjusting the map zoom level, and may include displaying an animation between the original map view and the new map view. The application considers a variety of factors in deciding on the particular type of transition and whether to provide an animation between the different map views. Some factors may include the distance between the different map views given a particular zoom level, the data available for providing an animation between the map views, the data bandwidth capabilities of the user's Internet connection, and various other factors.

Figure 32 illustrates a process 3200 that some embodiments perform to determine the particular type of transition to display between the user's current map view and a target map view that contains the search results for the user's executed search query. In some embodiments, the process 3200 is performed by the mapping application. In some embodiments, the process 3200 starts when the mapping application generates search results based on the user's search query.

The process 3200 begins by retrieving (at 3205) the search results. The process then defines (at 3210) the original region and the target region. In some embodiments, the process 3200 considers the map that is being displayed to the user. The process 3200 defines this map display as the current map view containing the original map region. The process 3200 then determines a proposed target map view with a target map region, which needs to be displayed to the user in order to provide the optimal map view that displays some or all of the search results.

In some embodiments, the process 3200 initially defines (at 3210) the original region and the target region at the same zoom level. In some such embodiments, the process 3200 initially keeps the zoom level of the original region and sets the zoom level of the target region to that of the original region. The process 3200 also sets the orientation of the target region to the orientation of the original region. Moreover, different embodiments position the target region differently. For instance, in some embodiments the process 3200 defines the target region to include at least one search result. Also, the process 3200 of some embodiments defines the target region by taking the average coordinates of the search results and setting the center of the target region to the average coordinate.

Next, the process 3200 determines (at 3215) whether the original region and the target region at least partially overlap. When the process 3200 determines (at 3215) that the two regions at least partially overlap, the process 3200 proceeds to 3225, which is described further below.

When the process 3200 determines (at 3215) that the original region and the target region do not overlap, the process 3200 determines (at 3220) whether the two regions are separated by more than a threshold distance. In some embodiments, the process 3200 dynamically computes this threshold distance based on the current zoom levels of the original and target regions. For instance, the computed threshold is inversely proportional to the zoom levels. That is, the more the regions are zoomed in, the shorter the computed threshold distance.

When the process 3200 determines (at 3220) that the two regions are separated by more than the threshold distance, the process 3200 displays (at 3230) the target region without displaying an animation from the original region to the target region. Otherwise, the process displays (at 3235) an animation from the original region to the target region. Different embodiments use different animation techniques. For instance, in some embodiments the process 3200 transitions from the original region to the target region by cross-fading the two regions. In some embodiments, the process 3200 may transition from the original region to the target region as if the viewpoint of a virtual camera overlooking the original region were moving to the target region.

When the process 3200 determines (at 3215) that the original region and the target region at least partially overlap, the process determines (at 3225) whether to modify the target region. The operation 3225 is described in more detail below by reference to Figure 33. The process 3200 then determines (at 3240) whether to display an animation to the target region. The operation 3240 is described in more detail below by reference to Figure 34.

Figure 33 illustrates a process 3300 that some embodiments perform to determine whether to modify the target region when the target region at least partially overlaps the original region initially defined by the mapping application. In some embodiments, the process 3300 is performed by the mapping application.

The process 3300 begins by determining (at 3305) (1) whether the original region includes any search results and (2) whether the zoom level of the original region is less than a threshold zoom level (i.e., the original region is not zoomed out beyond a threshold zoom level). When the process determines (at 3305) that the original region does not include any search results or that the zoom level of the original region is not less than the threshold zoom level, the process 3300 proceeds to 3315, which is described further below.

When the process determines (at 3305) that the original region includes at least one search result and the zoom level of the original region is less than the threshold zoom level, the process 3300 uses (at 3310) the original region as the target region. The process 3300 then proceeds to 3335, which is described further below.

When the process determines (at 3305) that the original region does not include any search results or that the zoom level of the original region is not less than the threshold zoom level, the process 3300 determines (at 3315) whether the original region includes any search results. When the process 3300 determines (at 3315) that the original region includes at least one search result, the process 3300 proceeds to 3335, which is described further below.
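The branching at the top of process 3200 can be sketched as follows. The region representation (center plus extent at a shared zoom level) and the constant relating the distance threshold to the zoom level are assumptions for illustration, not values from the application.

```python
def regions_overlap(a, b):
    """Axis-aligned overlap test; each region is (cx, cy, width, height)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return abs(ax - bx) * 2 < (aw + bw) and abs(ay - by) * 2 < (ah + bh)

def choose_transition(original, target, zoom_level, k=1000.0):
    """Return 'overlap' when the regions at least partially overlap (the
    process then decides whether to modify the target region and whether to
    animate); otherwise return 'jump' or 'animate' depending on a
    zoom-dependent distance threshold. k is an assumed constant."""
    if regions_overlap(original, target):
        return "overlap"
    # The threshold is inversely proportional to the zoom level: the more
    # zoomed in the regions are, the shorter the distance over which an
    # animation is still shown.
    threshold = k / zoom_level
    dx = original[0] - target[0]
    dy = original[1] - target[1]
    distance = (dx * dx + dy * dy) ** 0.5
    return "jump" if distance > threshold else "animate"
```

For a distant result at a detailed zoom level (e.g., Cupertino to Washington, DC), the distance exceeds the shrunken threshold and the sketch returns "jump", matching the no-animation behavior described below.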
When the process 3300 determines (at 3315) that the original region does not include search results, the process 3300 expands (at 3320) the original region to include at least one search result and uses the expanded original region as the target region. Different embodiments expand the original region differently. For instance, in some embodiments the process 3300 expands in all directions from the center of the original region in order to include at least one search result, while in other embodiments the process 3300 does not expand in all directions from the center of the original region. In some such embodiments, the process 3300 expands in a manner that includes the search result closest to the border of the original region.

Next, the process 3300 determines (at 3325) (1) whether the most important result is outside the target region and (2) whether all of the search results in the target region are significantly less important. Different embodiments assess the importance of a search result differently. For instance, some embodiments quantify the closeness between the search query and a search result and use the quantified closeness to determine importance. In particular, the process 3300 of some embodiments may deem the closest search result the most important search result. Other embodiments use other techniques to assess the importance of a search result. Moreover, the process 3300 of some embodiments will consider one search result significantly less important than another search result when the difference between the quantified closeness of the two search results is greater than a threshold difference.

When the process 3300 determines (at 3325) that the most important result is outside the target region and all of the search results in the target region are significantly less important, the process 3300 expands (at 3330) the target region by a certain size to include one or more search results. The process 3300 then loops back to 3325 to make another determination as to whether to expand the target region further.

When the process 3300 determines (at 3325) that the most important result is not outside the target region or that all of the search results in the target region are not significantly less important, the process 3300 further expands (at 3335) the target region if necessary to ensure that the target region can accommodate any UI associated with the search results (e.g., informational banners). The process 3300 then ends.

Figure 34 illustrates a process 3400 that some embodiments perform to determine whether to display an animation from the original region to the target region when the target region at least partially overlaps the original region initially defined by the mapping application, and after any modification of the target region has been considered. In some embodiments, the process 3400 is performed by the mapping application.

The process 3400 begins by determining (at 3405) whether the zoom levels of the original and target regions differ by more than a first threshold difference. In some embodiments, this first threshold difference represents the higher of the threshold differences between the zoom levels of the original and target regions. In this case, the zoom levels of the original and target regions are considered significantly different.

When the process 3400 determines (at 3405) that the zoom levels are significantly different, the process 3400 displays (at 3410) the target region without displaying an animation from the original region to the target region. When the process 3400 determines (at 3405) that the zoom levels are not significantly different, the process 3400 determines (at 3415) whether the zoom levels differ by more than a second threshold difference. In some embodiments, this second threshold difference represents the lower of the threshold differences between the zoom levels of the original and target regions. When the difference between the zoom levels falls below the higher threshold but above the lower threshold, the zoom levels of the original and target regions are considered moderately different.

When the process 3400 determines (at 3415) that the zoom levels of the original and target regions are moderately different, the process 3400 displays (at 3420) an animation from the original region to the target region. When the process 3400 determines (at 3415) that the zoom levels of the original and target regions are neither moderately nor significantly different, the process 3400 determines (at 3425) whether the original region includes all of the search results.

When the process 3400 determines (at 3425) that the original region includes all of the search results, the process ends. Otherwise, the process 3400 proceeds to 3430 to determine whether displaying the animation could result in making more search results visible. In some embodiments, the process 3400 examines the animation to see whether any search results would appear while the animation is being displayed.

When the process 3400 determines (at 3430) that displaying the animation could result in making more search results visible, the process 3400 displays (at 3420) the animation from the original region to the target region. Otherwise, the process 3400 ends.

Figure 35 illustrates, in four stages 3505-3520, a situation in which the application displays the target map region containing the corresponding search results without providing any animation between the current map view and the target map view. The first stage 3505 illustrates an original region of a map showing, for example, Cupertino, California. The map is at a particular zoom level that shows various freeways. As indicated by the outward movement of the user's thumb and index finger, the user is also adjusting the map (through gestural input) to zoom in to a more detailed view.

The second stage 3510 illustrates that the map is now at a more detailed zoom level (i.e., zoomed in), with several individual streets displayed, including "First Street", "Main Street", and "Second Street". The user also taps the search field 165 to initiate a search. The third stage 3515 illustrates the user typing the search query "Smithsonian" into the search field and selecting "Smithsonian Museum, Washington, DC" from the list of suggested search completions in the search table 3555.

Upon the selection of Washington, DC, the application immediately displays a map of Washington, DC without providing any animation. In this example, because Cupertino, CA and Washington, DC are separated by a considerable screen distance for the current map view and the particular zoom level, the application jumps immediately to the map of Washington, DC without providing any animation. For this given search, the application has determined that the screen distance between the map region displayed at stage 3510 and the target map region required to display Washington, DC is greater than a particular threshold, and that, therefore, providing an animation at the given zoom level is not reasonable or feasible.

In some embodiments, when the target map region is too far (e.g., beyond a threshold distance of several hundred or several thousand miles) from the currently displayed map region or from the user's current location, the mapping application displays a message (e.g., "Did you mean XYZ place in location A…?") to ask whether the user really intended to search for the faraway target region. Alternatively or conjunctively, the mapping application may present the message to the user audibly (e.g., by reading the message aloud). In some embodiments, the mapping application does not provide the search results until the user responds to the message. In some embodiments, the mapping application also provides alternative search results in addition to the message. For instance, the mapping application may provide a list of search results that can be found within or near the currently displayed region (e.g., "Smith's Onion, Cupertino, CA"), or it may provide the results of running a search with a similar search query related to a location closer to the currently displayed region. If the user selects an alternative result, the mapping application will display those search results on a region of the map closer to the currently displayed region.

Figure 36 illustrates, in four stages 3605-3620, a situation in which the application detects the search results within the original current map view and therefore does not need to zoom the map or display an animation to any new target map region. The first stage 3605 illustrates the user viewing a map 3630 of Cupertino, California. The map 3630 is at a particular zoom level that shows various freeways. As indicated by the outward movement of the user's thumb and index finger, the user is also adjusting the map to zoom in to a more detailed view.

The second stage 3610 illustrates that the map is now at a more detailed zoom level, with several individual streets displayed, including "First Street", "Main Street", and "Second Street". The user also taps the search field to initiate a search. The third stage 3615 illustrates the user typing the search query "Coffee Shop" into the search field and selecting the coffee shop located on "First Street" from the list of suggested search completions in the search table. Upon the selection of the coffee shop, the application displays the same current map view that the user was viewing before the search request, as shown at the fourth stage 3620. Since the search result for the coffee shop located on First Street is viewable within the user's current map view, the application does not need to adjust the zoom setting or provide any animation to display the target map region. The application has set the target map region containing the relevant search results to the current map region, and in this case a change between views of different regions of the map is avoided.

Figure 37 illustrates, in four different stages 3705-3720, the mapping application of some embodiments zooming out the view of the current map region in order to present several search results found based on a search query. The first stage 3705 illustrates the mapping application after the user has typed the search query "Tech Companies Cupertino". The search table 3755 displays a list of search results that includes a previous search query 3760. As shown, the user is selecting the search query 3760 from the search table 3755. The mapping application of some embodiments stores the search results of searches conducted using the mapping application.

The second stage 3710 shows a map of the current location at a detailed zoom level, with several individual streets displayed, including "First Street", "Main Street", and "Second Street". Since the search results, the tech companies in Cupertino, are not located within the original current map region of the current map view, the application of some embodiments expands the map view such that the target region includes all of the search results. The mapping application of some embodiments also determines that an animation of the map view from the current map region to the target region is needed, because the zoom levels of the current and target regions are significantly different.

The third stage 3715 illustrates the map at a different zoom level. The mapping application displays the map at this zoom level only briefly, as part of the animation that the mapping application displays in order to zoom the map out to the zoom level for displaying the target region, which is larger than the original region. In some embodiments, the mapping application displays a 2D/3D transition as part of the animation.

The fourth stage 3720 illustrates the map at the zoom level of the target region. That is, the mapping application has finished displaying the animation from the original region displayed at the second stage 3710 to the target region.

In some embodiments, the mapping application varies the duration of the animation for transitioning from the original region to the target region based on the amount of change involved in the transition. For instance, when the original and target regions are not far apart, or when they overlap, the mapping application of some embodiments animates the transition over a short duration. When the distance between the two regions is relatively large (e.g., several hundred miles), the mapping application displays a longer animation. In some such embodiments, when the distance between the two regions is enormous (e.g., several thousand miles), the mapping application may not display an animation at all.

III.
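The animation decision of process 3400 reduces to a pair of zoom-difference thresholds plus a visibility check, which can be sketched as follows; the threshold values here are invented for illustration and do not come from the application.

```python
# Assumed threshold values, not taken from the patent text.
HIGH_DIFF = 5.0   # first threshold: "significantly different" zoom gap
LOW_DIFF = 2.0    # second threshold: "moderately different" zoom gap

def should_animate(orig_zoom, target_zoom, orig_has_all_results,
                   animation_reveals_results):
    """Mirror of process 3400: jump when the zoom levels are significantly
    different, animate when moderately different, and otherwise animate only
    if the animation would make more search results visible."""
    diff = abs(orig_zoom - target_zoom)
    if diff > HIGH_DIFF:
        return False                      # 3410: show the target, no animation
    if diff > LOW_DIFF:
        return True                       # 3420: moderately different, animate
    if orig_has_all_results:
        return False                      # 3425: nothing new would be shown
    return animation_reveals_results      # 3430: animate only if results appear
```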
Controls for Previewing Items on the Map

Some embodiments of the invention provide a novel user interface for presenting different types of detailed information about a point of interest (POI). This user interface is referred to as a "stage" in the description above and below. In some embodiments, the stage includes a display area for displaying images of the POI and several tabs under which different types of information about the POI are grouped and presented to the user.

The mapping application of some embodiments provides several different ways to display the stage for a POI. As described above, the mapping application of some embodiments displays a banner above each of the pins that are displayed as search results. The user can select an item in a POI's banner to open the stage for that POI. The mapping application also allows the user to open the stage for a POI by selecting the POI from a list of POIs that the mapping application of some embodiments presents for the search results of a search query. The mapping application allows the user to open the stage after dropping a pin at a location. Moreover, the mapping application allows the user to open the stage for the current location.

Figure 38 conceptually illustrates a GUI 3800, which is the "stage" for a selected POI. Specifically, Figure 38 illustrates, in six different stages 3805-3830, the mapping application of some embodiments displaying a 3D animation of the POI in a media display area 3835 of the GUI 3800. This figure illustrates that the GUI 3800 includes the media display area 3835, tabs 3840, an information display area 3845, and a top bar 3850.

The media display area 3835 of some embodiments is for displaying different media of the POI. In some embodiments, when the GUI 3800 is launched, the mapping application initially displays a 3D animation of the POI. For instance, when the POI is a building, the mapping application shows an animated 3D view of the building and the building's surroundings. In some embodiments, the mapping application displays the building as if the building were viewed from a camera mounted on a helicopter hovering around the top of the building.

Different embodiments generate 3D animations (3D video presentations) differently. For instance, the 3D animation may be a video clip captured by a video capture device of an object orbiting the earth or of a manned or unmanned aircraft flying at lower altitudes (e.g., a satellite, a space shuttle, an airplane, a helicopter, etc.).

In some embodiments, the mapping application generates the 3D video presentation by performing a blending operation (e.g., a three-dimensional perspective blending operation) on several images captured for a particular location by flying objects (such as helicopters, airplanes, satellites, etc.). These images can be still images or images that are part of a video clip captured by such objects.

In some embodiments, the 3D rendering operation generates the video clip from the multiple images by transitioning through them over a set amount of time. In some embodiments, this transition results in multiple video frames being generated by capturing different subsets of the images from different perspective rendering positions in the 3D scene at different moments in time.

In some embodiments, the mapping application generates the 3D video presentation by moving a virtual camera above and around the POI (e.g., a building) and its surroundings in a 3D immersive map view or in a flyover view. For instance, the mapping application may move the virtual camera as if the virtual camera were shooting a video of the POI and its surroundings from a flying object hovering around the top of the building. The virtual camera and the 3D immersive map view are described in detail in U.S. Patent Application 13/632,035, incorporated above.

When the data for an animated 3D view of a POI is not available (e.g., the data cannot be obtained from the map server or other local storage), the mapping application of some embodiments finds the next available type of image to display in the display area 3835. For instance, the mapping application may display a satellite view of the POI.

When the data for an animated 3D view of the POI is available but it takes some time to obtain the data necessary to display the animated 3D view (e.g., due to the device's slow network connection to the source of the necessary data), the mapping application of some embodiments identifies the next available type of media for that POI and displays that media in the media display area 3835 first. For instance, the mapping application of some embodiments displays a satellite image of the POI in the media display area 3835. In some embodiments, to provide an animated effect, the mapping application rotates the satellite image of the POI (e.g., clockwise) rather than statically displaying the 2D satellite image.

When enough data for displaying the animated 3D view has been obtained, the mapping application of some embodiments switches from displaying the satellite image of the POI to displaying the animated 3D view of the POI. The mapping applications of different embodiments use different effects to make this switch. For instance, the mapping application of some embodiments cross-fades the 2D satellite image of the POI into the animated 3D view. In other embodiments, the mapping application may use the Ken Burns effect to display the 3D animated view from the satellite image.

In some embodiments, the mapping application determines the type of media of the POI to display initially in the media display area 3835 based on the type of the selected POI. For instance, when the POI is a restaurant, the mapping application of some embodiments initially displays images of culinary dishes served by the restaurant or images of the restaurant's interior. When displaying such images, the mapping applications of different embodiments use different effects to display different images. The different effects the mapping application may use include the Ken Burns effect, vignette effects, cross-fades, tilts, slideshows, etc.
The mapping application of some embodiments overlays informational text on the media displayed in the media display area 3835. In some embodiments, the informational text is displayed toward the left side of the media display area 3835, as shown at the third stage 3815. However, the informational text may be located anywhere (e.g., the center) in the media display area 3835. In some embodiments, the mapping application applies different effects to the portion of the media display area that is overlaid with the informational text so that the text remains legible. For instance, the mapping application may change the color of that portion or blur it. In other embodiments, the mapping application does not modify the image to make the informational text legible. Rather, the mapping application adjusts the informational text to keep it legible as the portion of the image overlaid by the text changes.

The mapping application of some embodiments can also switch from the type of media initially displayed when the GUI 3800 is launched to other types of media. For instance, when the user selects an entry from those displayed under a "media" tab (not shown), the mapping application displays the image associated with the selected entry in the media display area 3835, or plays back the video associated with the selected entry in the media display area 3835.

The tabs 3840 are tabs for displaying different sets of entries grouped by the different types of information associated with the different tabs. In some embodiments, as shown, the GUI 3800 initially includes an "Info" tab, a "Reviews" tab, and a "Photos" tab. When the info tab is selected, the mapping application displays entries related to general information about the POI in the information display area 3845. As shown, the general information about the POI includes a phone number, a URL for the POI, an address, etc. When the reviews tab is selected, the entries to display include all of the reviews gathered by information aggregation entities (e.g., Yelp, Facebook, Twitter, etc.) and supplied to the mapping application. Similarly, when the photos tab is selected, the entries to display include the photos gathered by the information aggregation entities. The tabs and the entries displayed in the information display area 3845 are described in more detail below.

The top bar 3850 of some embodiments includes a "back" button 3895 for going back to the state before the GUI 3800 was launched. When a map with the search results had been displayed before the GUI 3800 was displayed, the "back" button 3895 indicates that the mapping application will go back to displaying the map with the search results. When a list of POIs had been displayed before the GUI 3800 was displayed, the "back" button 3895 indicates that the mapping application will return to displaying the list.

The operation of the GUI 3800 is now described. At the first stage 3805, the mapping application displays a pin 3860 and a banner 3865 as a result of the search query "Famous Building" having been typed. The next stage 3810 shows the user selecting the arrow 3875 to launch the "stage" for this POI.

At the third stage 3815, the GUI 3800 has been launched. The mapping application displays the initial set of components of the GUI 3800, which includes the top bar 3850, the media display area 3835, the tabs 3840, and the information display area 3845. In the media display area 3835, the mapping application displays an animated 3D view of the famous building and other buildings near it.

The mapping application also displays informational text about the famous building on the left side of the media display area 3835. The displayed information includes the name of the building, the address, the star rating, and the number of reviews, as shown in the media display area 3835. Behind the text, the mapping application initially displays an image of the famous building. As shown, the building appears faded because the mapping application has faded the portion of the image that appears behind the informational text in order to make the text more prominent and legible. At this stage 3815, the information display area 3845 displays general information about this building (e.g., a phone number, the building's URL, an address, etc.), because the info tab is the default tab selection in some embodiments.

The next stage 3820 shows that the viewpoint of the virtual camera has changed such that the famous building now appears next to the displayed informational text. Some other neighboring buildings are displayed behind the informational text. The viewpoint of the virtual camera also begins to hover around the top of the famous building.

The next stage 3825 shows that the virtual camera has moved to another side of the famous building. As shown, the "FMS BLDG" lettering shown on one side of the building now appears in the southwest direction of the media display area 3835, showing that the virtual camera has moved counterclockwise relative to the top of the building. The "H" inside the circle on top of the building also appears rotated from the previous stage 3820. Moreover, because the viewpoint of the virtual camera has changed, different neighboring buildings are displayed behind the informational text.

The next stage 3830 shows that the virtual camera has moved to yet another side of the famous building. The virtual camera keeps moving and is now on a side of the building from which "FMS BLDG" is no longer visible. The "H" inside the circle on top of the building appears rotated further. The mapping application of some embodiments keeps showing this animated 3D view (i.e., keeps the virtual camera hovering around the top of the building) until the user provides other input (e.g., input to close the GUI 3800, a selection of a photo from the entries under the photos tab, etc.).
In some embodiments, the mapping application uses the 3D rendering operation to generate 3D presentations in its other modes of operation. For instance, in some embodiments the mapping application uses this operation to generate a 3D video presentation whenever the user performs a search for a particular location or specifies other search criteria, or whenever the user explores a location on the map. Figure 39 illustrates, in five different stages 3905-3925, an example of the mapping application using the 3D rendering operation to show a particular search result.

The first stage 3905 illustrates the mapping application displaying a search table 3940 after receiving the first letter of a search query that the user types in the search field 165. As shown, the search table 3940 includes several search completions, including "X Corp.", "X Station 555…", and "Promenade X". The first stage 3905 also illustrates that "X Corp." has been selected.

The second stage 3910 illustrates a 3D map view, as indicated by the highlighted 3D control 150. In some embodiments, the mapping application automatically shows a 3D animation of the POI when the user is viewing the POI in the 3D mode. The mapping application of some such embodiments still allows the user to rotate the map view (e.g., with a two-finger gesture). In some embodiments, the mapping application starts presenting the 3D animation of the POI after a certain amount of time (e.g., a few seconds) passes without input being received from the user. The second stage 3910 also illustrates that the mapping application begins to present a 3D animation of the building. As shown, the 3D animation is showing sides 1 and 2 of the X Corporation building, with side 2 bearing the name of X Corporation. In this example, the 3D animation is a 3D video presentation taken from a flying object hovering counterclockwise around the top of the building.

The third stage 3915 illustrates that the viewpoint has changed, and the building appears to rotate clockwise as the flying object hovers counterclockwise. The 3D presentation is showing sides 2 and 3 of the X Corporation building. As shown, the "H" sign on top of the building has also rotated.

The fourth stage 3920 and the fifth stage 3925 show further rotation of the building as the flying object hovers. The mapping application of some embodiments repeats the 3D presentation until the user provides an input to stop or change the animation (e.g., a two-finger gesture, an input to exit the 3D mode, etc.).

Figure 40 conceptually illustrates the GUI 3800. Specifically, Figure 40 illustrates, in eight different stages 4005-4040, the mapping application of some embodiments initially displaying an animation of a satellite image of a POI in the media display area 3835 of the GUI 3800 and switching to a 3D animation of the POI once enough data for displaying the 3D animation has been obtained.

At the first stage 4005, the mapping application displays a pin 4060 and a banner 4065 as a result of the search query "MyWork Building" having been typed. The user has also selected the lower right corner of the map to peel off the map and display the set of buttons described above. The next stage 4010 shows the user selecting the "list" button 4070 to cause the mapping application to display the search results as a list.

At the third stage 4015, the mapping application displays a list of POIs after the user has selected the list button 4070 at the previous stage 4010. In this example, the list happens to include only one POI, because the search query was targeted closely enough at a particular result. The next stage 4020 shows the user selecting the entry 4075 to launch the "stage" for this POI. As shown, the user has selected the "MyWork Building" POI.

At the fifth stage 4025, the GUI 3800 has been launched. However, in contrast to the stage 3815 described above by reference to Figure 38, the mapping application displays a satellite image of the actual building associated with the "MyWork Building" POI. The satellite image also shows other buildings near that building, rather than an animated 3D view of the building. The satellite image is a 2D image of the top of the building, taken at a considerable distance from it (i.e., from a satellite). As shown, the mapping application also fades the portion of the image that is overlaid with the informational text so that the text appears legible.

The next stage 4030 shows that the mapping application has rotated the satellite image clockwise to animate it. At the next stage 4035, the mapping application is cross-fading the satellite image with an animated 3D view of the building, because the mapping application has obtained enough data (e.g., from the map server or another source of the data) to display the animated 3D view of the building.

The next stage 4040 shows that the virtual camera for the 3D animated view has moved to another side of the "MyWork Building". As shown, the dark corner of the building now appears in the east direction of the media display area 3835, showing that the virtual camera has moved counterclockwise relative to the top of the building. Also, because the viewpoint of the virtual camera has changed, different neighboring buildings are displayed behind the informational text, as shown.

Figure 41 conceptually illustrates the GUI 3800. Specifically, Figure 41 illustrates, in six different stages 4105-4130, the mapping application of some embodiments initially displaying images of a POI, rather than animated images of a building, in the media display area 3835 of the GUI 3800 when such images are more meaningful and informative to the user. For instance, when the selected POI is a business (e.g., a restaurant or a coffee shop), the mapping application determines the appropriate images to display based on user expectations. For example, the mapping application may show images of the food and drinks that a restaurant offers, or images of the interior arrangement of a coffee shop, because such images may meet user expectations better than exterior images of the building in which the business is located.

The first stage 4105 and the second stage 4110 are similar to the stages 3805 and 3810, respectively, in that the mapping application displays a set of POIs from which the user can select one. The second stage 4110 shows the user selecting "Little Coffee Shop".

At the third stage 4115, the GUI 3800 has been launched. However, in contrast to the stages 3815 and 4025 described above by reference to Figures 38 and 40, the mapping application displays a set of images (e.g., images of coffee, donuts, etc.) rather than images of a building. As mentioned above, such images are gathered by information gathering entities such as Facebook, Twitter, etc., from which the mapping application obtains the information related to the selected POI.

As also mentioned above, the mapping applications of different embodiments use different techniques to display the images. In this example, the mapping application is using the Ken Burns effect to display the images sequentially. At this stage 4115, the mapping application displays an image of a cup of coffee and a muffin that the Little Coffee Shop offers its customers (assuming that the customer who took the image, or the shop's owner, uploaded the image to an information gathering entity such as Yelp).

The next stage 4120 shows that the mapping application has zoomed in on the image of the cup of coffee and the muffin as part of showing the image using the Ken Burns effect. The next stage 4125 shows that the mapping application is cross-fading the image of the coffee and the muffin with an image of the interior of the Little Coffee Shop. The next stage 4130 shows that the mapping application has fully zoomed in on the image of the interior.

Figure 42 conceptually illustrates a process 4200 that some embodiments perform to display different types of images when launching a "stage" for showing detailed information about a POI. Specifically, the process 4200 is performed for displaying media in the media display area of the stage (e.g., the media display area 3835 shown above by reference to Figures 38, 40, and 41). In some embodiments, the process 4200 is performed by the mapping application. The process starts when the mapping application displays the results of a search query.

The process 4200 begins by receiving (at 4205) a selection of a POI. The process 4200 receives the selection in one of several different ways. For instance, the process 4200 receives the selection when the user selects an arrow displayed in the banner shown above the pin for a POI about which the user wishes to find more information. The process 4200 may also receive the selection when the user selects one of the POIs from a list of POIs that the mapping application of some embodiments displays when the user selects a "list" button.

The process 4200 next determines (at 4210) a preferred order for displaying the different types of images. As mentioned above, the mapping application of some embodiments displays different types of images. In some embodiments, these different types of images include images for an animated 3D view, satellite images, plain map images, and business-related images (e.g., images of dishes served, images of the interior arrangement, images of the business's employees, etc.).

In some embodiments, the process 4200 determines the preferred order based on the type of the selected POI. For instance, when the POI is a business such as a restaurant, the process 4200 favors displaying images of the dishes that the restaurant may serve. When the POI is a famous landmark such as the Empire State Building, the process 4200 favors displaying images of the building and its surroundings. The process 4200 analyzes the data of the selected POI to determine the type of the POI when determining the preferred order.

There are also different levels of images for a particular type of image. For instance, for images of a landmark, the different levels of images include animated 3D images, satellite images, plain map images, etc. These different image levels contain different amounts of data, and retrieving the data over the network therefore takes different amounts of time. For instance, the animated 3D images will likely contain the largest amount of data, and downloading these images over the network from the source of the data thus takes the longest.

The process 4200 then determines (at 4215) whether the data for the current level of images is available from the source. In some embodiments, the process 4200 obtains the data from a map server, or another server, that serves the data upon receiving a request. However, not every POI has every level of images at the server. For instance, a building that is not well known or famous will likely not have image data for displaying an animated 3D view of the building. In some embodiments, the current level of images to display initially is by default set to the 3D animated image level.

When the process 4200 determines (at 4215) that the data for the current level of images is not available, the process 4200 selects (at 4220) the next level of images to display for the POI. For instance, when the images for an animated 3D view of the selected POI are not available, the process 4200 selects (at 4220) satellite images of the POI. The process 4200 then loops back to 4215 to determine whether the data for this next level of images is available.
When the process 4200 determines (at 4215) that the data for displaying the current level of images is available, the process 4200 determines (at 4225) whether all of the data has been received and the mapping application is ready to display the current level of images. In some cases, when the device on which the mapping application executes has a slow network connection for obtaining the image data from the remote server, there may not yet be enough image data to display the current level of images for the POI.

When the process 4200 determines (at 4225) that enough data for displaying the current level of images has been obtained, the process 4200 proceeds to 4230, which is described further below. When the process 4200 determines (at 4225) that the data for displaying the current level of images is not yet available, the process 4200 displays (at 4235) the next level of images that is ready for display. That is, the process 4200 displays (at 4235) one of the next levels of images for which enough data has been obtained. When none of the next levels of images has enough data to be displayed, the process 4200 displays a plain background image (e.g., a black image) in the media display area.

The process 4200 then determines (at 4240) whether the mapping application has obtained enough data for the current level of images. For instance, while the process 4200 displays a satellite image of the POI in the media display area, the process 4200 checks whether enough data for the animated 3D view has been obtained. When the process 4200 determines (at 4240) that the data obtained for the current level of images is insufficient, the process loops back to 4235 and keeps displaying the next level of images in the media display area while checking whether the data for the current level of images is ready.

When the process 4200 determines (at 4240) that enough data for the current level of images has been obtained, the process switches (at 4245) from the next level of images being displayed in the media display area to the current level of images. For instance, when the data for the 3D animated view becomes ready for display while a satellite image of the POI is being displayed, the process 4200 switches from the satellite image of the POI to the animated 3D view of the POI. Different embodiments use different techniques to make this switch. For instance, the process 4200 may apply the Ken Burns effect to make the transition from the next level of images to the current level of images. The process 4200 next displays (at 4230) the current level of images. The process 4200 then ends.

Figure 43 conceptually illustrates the GUI 3800. Specifically, Figure 43 illustrates, in four different stages 4305-4320, that the tabs 3840 are not scrolled off the "stage" (i.e., the GUI 3800) of the selected POI. In some embodiments, the GUI 3800 is scrollable. That is, the components of the GUI 3800, such as the media display area 3835, the tabs 3840, and the information display area 3845, move up and down as the user scrolls the GUI 3800 up or down.

As mentioned above, the tabs 3840 are tabs for displaying different sets of entries grouped by the different types of information associated with the different tabs. That is, the mapping application of some embodiments obtains data related to the POI from different data gathering entities (e.g., tweets, RSS feeds, and updates from Facebook, Yelp, Yahoo, Twitter, blogs, etc.) and classifies the data into different groups based on the type of information contained in the data. For instance, the mapping application would classify pieces of data carrying reviews about the POI into a "reviews" group and display those pieces of data as entries in the information display area 3845 when the user selects the "reviews" tab. Different embodiments use different techniques to recognize the type of information that a piece of data is carrying. For instance, the mapping application looks for a set of keywords in the pieces of data.

Examples of tabs that the user might create, or that the mapping application may additionally provide, include "activities" or "events" tabs which, when selected, would include any time-related information about the POI. For instance, if the selected POI is a Major League ballpark, the entries under this tab could include information about the scheduled games of the team associated with the ballpark. As another example, when the selected POI is a movie theater, a selection of an entry could cause the mapping application to launch a ticket purchase application (e.g., Fandango) that the user can use to buy tickets for the movies playing at the theater. One of ordinary skill in the art will recognize that many other possible tabs with various functionalities corresponding to the types of the tabs may be created.

In some embodiments, the mapping application allows the user to create a new group by defining several keywords that the mapping application uses to sift through the obtained pieces of data. In such embodiments, the mapping application adds a new tab for the new group and, when the new tab is selected, lists the pieces of data that include the keywords. When the mapping application includes a threshold number of tabs that can be displayed for the stage of a POI, the mapping application of some embodiments makes the tabs 3840 scrollable (e.g., horizontally) so that the user can scroll the tabs 3840 and select any desired tab. The mapping application of some embodiments also allows the user to remove unwanted tabs.

As mentioned above, the GUI 3800 is scrollable in the vertical direction. However, regardless of how the user scrolls the GUI 3800, the tabs 3840 will not be scrolled off the screen of the device. That is, for example, when the GUI 3800 is scrolled up such that the tabs 3840 would otherwise fall off the screen, the mapping application of some embodiments keeps the tabs 3840 within the screen display area and slides the information display area 3845 underneath the tabs 3840.

An example operation of the GUI 3800 when the user scrolls the GUI 3800 is described by reference to the four stages 4305-4320 of Figure 43. The first stage 4305 shows the GUI 3800 launched after the user has selected the "Little Coffee Shop" POI, either from the map displaying a pin and a banner for this POI or from a list of POIs.

The second stage 4310 shows the user making an upward swipe gesture from a location of the GUI 3800. The third stage 4315 shows that the media display area 3835, the tabs 3840, and the information display area 3845 have moved up within the screen. In particular, the media display area 3835 has been scrolled almost completely off the screen and appears to have slid under the top bar 3850. The mapping application has also expanded the information display area 3845 to display more entries for the selected info tab. The user is also making another upward swipe gesture (or continuing the initial upward swipe gesture, with or without a pause) to scroll the GUI 3800 further up.

The fourth stage 4320 shows that the tabs 3840 have moved up to just below the top bar 3850 and that the user is moving the GUI 3800 even further up. However, the tabs 3840 have neither fallen off the screen nor slid under the top bar 3850. The mapping application has also stopped expanding the information display area 3845. As the user scrolls the GUI 3800 further up, the mapping application slides the entries displayed in the information display area 3845 underneath the tabs 3840 and the top bar 3850 and causes more entries, previously not displayed in the information display area 3845, to appear from the bottom of the information display area 3845, as shown at the fourth stage 4320.

The fourth stage 4320 also illustrates that the mapping application of some embodiments displays a set of selectable UI items 4360-4380. The UI item 4360 is for showing directions from the current location to the POI. The UI item 4365 is for showing directions from the POI to the current location. The UI item 4370 is for adding this POI to the contact list of the mapping application. The UI item 4375 is for sharing this POI with other mapping applications on other devices of other users. The UI item 4380 is for adding this POI as a bookmark.

Figure 44 conceptually illustrates the GUI 3800. Specifically, Figure 44 illustrates four different stages 4405-4420 of the mapping application of some embodiments launching a third-party application when the user selects an entry displayed in the information display area 3845.

In some embodiments, when an entry is selected from the list of entries displayed upon the selection of a tab of the "stage" for a selected POI, the mapping application launches a third-party application, or opens the third party's website in a browser application that runs concurrently on the device on which the mapping application executes. The third-party application is typically the source of the selected entry (e.g., Yelp). For instance, when the user taps a review displayed in the information display area 3845, the mapping application of some embodiments launches the third-party application (e.g., a Yelp application) to show the full text of the selected review or to allow the user to add a new review. In such embodiments, the mapping application will also provide a means for the user to get back to the "stage" for the POI (e.g., the GUI 3800). For instance, the mapping application may display a "back" button in the top bar for the third-party application which, when selected, causes the "stage" for the POI to be redisplayed.

In other embodiments, the mapping application does not launch a third-party application when an entry is selected from the list of entries. The mapping application will instead display the information associated with the entry "in place". That is, for example, the mapping application may expand the entry in the information display area 3845 and display the full review, or provide a means to add a new review, without leaving the mapping application. The mapping application of such embodiments would utilize an application programming interface (API) to request and obtain the full review from the source. In some embodiments, the API is provided by the source (e.g., Yelp).

The first stage 4405 of Figure 44 illustrates the user selecting the reviews tab 4450. At the second stage 4410, the mapping application displays several reviews related to the Little Coffee Shop upon receiving the selection. As shown, the reviews originate from different sources, such as Twitter, Facebook, Yelp, etc. In some embodiments, each review entry includes a limited amount of information. The mapping application of some of these embodiments allows the user to view the full review by selecting a particular entry.

The second stage 4410 shows the user selecting a review 4455 from the list of reviews displayed in the information display area 3845. In this example, the review 4455 originates from Yelp, as shown. The third stage 4415 shows that the mapping application has launched a Yelp application. The Yelp application displays the full content of the review 4455. The top bar 4460 of this page includes a "back" button which, when selected, causes the mapping application to display the GUI 3800 again with the "reviews" tab selected. The third stage 4415 also shows the user selecting the "back" button. The fourth stage 4420 shows the GUI 3800 displayed again. This stage shows the same GUI 3800 as the GUI 3800 at the stage 4410.

Figure 45 conceptually illustrates the GUI 3800. Specifically, Figure 45 illustrates four different stages 4505-4520 of the mapping application of some embodiments launching a third-party application when the user selects a UI item for the third-party application.

The first stage 4505 illustrates the user selecting the reviews tab 4550 while the info tab 4555 is the currently selected tab. At the second stage 4510, the mapping application displays reviews related to the Little Coffee Shop in response to receiving the selection of the reviews tab. As shown, each review entry includes a thumbnail image of the person, the person's star rating, the date the review was uploaded, and the review (depicted as lines). The reviews originate from different sources, such as Twitter, Facebook, Yelp, etc. The second stage 4510 also illustrates the user scrolling the GUI 3800 up.

The third stage 4515 illustrates that, as the GUI 3800 is scrolled up, the mapping application of some embodiments displays a set of selectable UI items 4560-4575 near the bottom of the expanded information display area 4545. The UI item 4560 is for launching a third-party application (e.g., Yelp's mobile application) or a website (e.g., Yelp's website) to view more reviews about the POI. The UI item 4565 is for launching another third-party application (e.g., Facebook's mobile application) or a website (e.g., Facebook's website) to trigger a special feature provided by that third party (e.g., checking in from Facebook). The UI item 4570 is for adding a quick tip. In some embodiments, a quick tip is a type of review for which the mapping application does not have to rely on a third-party application (i.e., without going through the third-party application or website). The UI item 4575 is for adding a review. In some embodiments, a selection of the UI item 4575 causes the mapping application to launch a third-party application or a website for leaving a review. The third stage 4515 also shows a selection of the UI item 4560.

The fourth stage 4520 illustrates the third-party application (e.g., Yelp's mobile application) that allows the user to add a review about the POI.

In some embodiments, the information provided by a third-party application (e.g., Yelp) may differ between users based on each user's personal preference settings for the particular third-party application. To determine the preference settings of a particular user, the mapping application communicates with the third-party application to determine the user's preference settings and to retrieve information tailored to the user. In some embodiments, in order to display the user's personalized information from the third-party application, the user must have the third-party application installed on the same device that the user is using to run the mapping application. Moreover, in some embodiments, the user must also be currently logged into the third-party application. For instance, the user must have downloaded the Yelp application onto the same device that the user uses to run the mapping application, and the user must be logged into the Yelp application at the time the user accesses the Yelp features from the mapping application. If the user satisfies these conditions, the particular information about a POI that will be displayed for the user using the mapping application will be tailored according to the user's personal preference settings. Thus, different users will see different information about a POI based on their personal preference settings.

To apply the preference settings of a particular user, the mapping application and the third-party application must first verify certain information. The mapping application initially verifies with the third-party application that the user is currently logged into the third-party application. After verifying that the user is currently logged in, the mapping application forwards data related to the particular POI in which the user is interested (e.g., an identification of the particular POI) to the third-party application. The third-party application retrieves the information related to the POI for the particular user and returns that information to the mapping application. In some embodiments, the mapping application then displays the information to the user "in place" according to the user's preference settings (e.g., without going to the third-party application).

In some embodiments, the mapping application obtains authentication information (e.g., a token) from the third-party application. The mapping application uses this token to access a server that provides the information about POIs to the third-party application. For instance, the Yelp application has been authenticated and authorized to access the Yelp server. The mapping application can then be authorized to access the Yelp server by using the authentication information obtained from the Yelp application running on the same device. In such embodiments, the server may maintain the user's personal preference settings so that, when the mapping application asks for information about a POI (e.g., reviews), the server returns information that conforms to the user's preference settings. In other embodiments, the user is not required to have logged into the third-party application. In such embodiments, the mapping application may allow the user to log in directly to the server through the mapping application.

Figure 46 illustrates two instances of the mapping application running on two different devices of two different users 4601 and 4602 (the users are not shown in the figure), the two instances showing two different sets of information about the same POI, obtained from the particular third-party application running on the two different devices. Figure 46 illustrates each user's mapping application in three stages 4605-4615.
During the first stage 4605, the two users are viewing the map view of the mapping application. Both users have searched for "Little Coffee Shop" using the search bar. Both users also select an arrow (not shown, blocked by the fingers 4603 and 4604 of the users 4601 and 4602, respectively) to launch the stage for the POI. At the second stage 4610, the GUI 3800 has been launched. As shown, the mapping application displays various information about the POI, including the address, the phone number, the URL of the website, and user reviews. Both users 4601 and 4602 also select the reviews tab 4550 to obtain information about user reviews of the coffee shop.

At the third stage 4615, the GUI 3800 shows the reviews obtained from the third-party application (e.g., Yelp). However, the displayed user reviews are different for the two users 4601 and 4602, because the reviews are retrieved based on the users' personal preference settings. The first user sees three reviews made by Jane, Bo, and Sue, whereas the second user sees reviews from Jon, Brad, and Nat. This situation can occur when the users 4601 and 4602 have specified different personal preference settings for the particular third-party application. The first user 4601 may have indicated to the third-party application (i.e., Yelp in this example) that the user 4601 only wants to see reviews made by "food experts", and thus the reviews from the experts Jane, Bo, and Sue are displayed. The second user 4602 may have indicated that the user 4602 wants to see reviews from everyone, and thus the reviews from Jon, Brad, and Nat, who may or may not be experts, are displayed. Thus, each user obtains information about a particular POI from the third-party application that is customized and tailored according to the user's personal preference settings.

Figure 47 conceptually illustrates the GUI 3800. Specifically, Figure 47 illustrates, in three different stages 4705-4715, the mapping application of some embodiments displaying a tag on a tab in the stage for a selected POI and displaying a particular piece of information when the tag is selected.

In some embodiments, the mapping application displays a tag on a tab of the GUI 3800 to indicate that an entry that will be displayed upon a selection of the tab includes special information about the POI. For instance, a tag may be a "deal" tag showing that the POI has a deal, and the information about the deal may be included in one of the entries that would be displayed when the tag, or the tab on which the tag is placed, is selected. As such, the tag serves as an unread mark that indicates the existence of an entry that the user has not viewed since the entry was obtained from its respective source.

The first stage 4705 shows the GUI 3800. The reviews tab 4450 has a tag 4750 attached to it. In some embodiments, the tag 4750 appears on the tab 4450 when there is an entry that includes information about a deal for the POI. The mapping applications of different embodiments use tags of different appearances. For instance, in some embodiments, the tag resembles a physical tag that may be attached to a document. The tag also includes one or more dollar signs to indicate that the POI offers money-saving deals for its customers.

The next stage 4710 shows a selection of the tag 4750 (e.g., by tapping). The third stage 4715 shows that the tag has disappeared from the tab 4450 and the entry for the deal is displayed in the information display area 3845. In some embodiments, the mapping application instead launches a third-party application, or opens the third party's website in a browser application that runs on the device on which the mapping application executes. The third-party application or the web page would then show the details of the deal for the selected POI.

Figure 48 conceptually illustrates an example of the "stage" (e.g., the GUI 3800) for a POI displayed in the display area of a device 4820 that provides a relatively large display area. An example of such a device is a tablet device (e.g., the iPad® sold by Apple Inc.). Specifically, Figure 48 illustrates, in three different stages 4805-4815, the interaction of a user with the mapping application to display the "stage" for a particular POI. As illustrated, compared with a device that has a relatively small display area (e.g., a smartphone such as the iPhone® sold by Apple Inc.), the device 4820 provides a larger screen display area 4825 for viewing the mapping application. The larger display area allows the mapping application to utilize the screen space efficiently to display different GUIs within the same map view and to minimize switching between different GUI screens. In particular, when the user selects the arrow for a particular POI, the mapping application opens the "stage" for the POI directly overlaid on the map view of the mapping application without changing to a new GUI page.

At the first stage 4805, the mapping application displays a pin 4830 and a banner 4835. The banner may be the result of the user typing a search query for "Little Coffee Shop". The second stage 4810 illustrates the user selecting the arrow 3875 to launch the "stage" for this POI. At the third stage 4815, the mapping application has launched the GUI 3800 for the stage. The GUI 3800 is displayed in its entirety within the map view of the mapping application. In some embodiments, when the user is viewing the mapping application in a portrait orientation, the mapping application displays the "stage" for the particular POI in its entirety. In some embodiments, when the user views the mapping application on a tablet in a landscape orientation, the mapping application displays the "stage" for the POI but may require the user to scroll through the "stage" to view all of the information.

IV. Route Generation

As described above, the mapping application tightly integrates the search and route identification experience by providing several different ways to obtain directions. One such way is through a selectable directions UI control (e.g., a button) on the main map view (e.g., in the top left corner), which, when selected, presents a modal interface for editing directions and enables the user to request more customized routes, such as routes that do not begin from the current location, or walking routes instead of just driving routes.

In some embodiments, the mapping application allows the user to inspect these customized routes by sliding signs showing maneuver instructions in and out of the UI page showing a route. This mode of operation of the mapping application is referred to as the route inspection mode or the (manual) stepping mode, which is one of several modes of operation in which the mapping application of some embodiments is capable of operating. Examples of these modes of operation include a navigation mode, a map browsing mode, and the route inspection mode.

A junction is where two or more road segments meet. A route is a path between a starting location and a destination location on the map. A typical route has zero or many junctions along the path between the two locations. A maneuver instruction for a junction in a route identifies the direction of the road segment to take out of the junction. In some embodiments, the mapping application provides the user with maneuver instructions for only some of the junctions along the route, because the user may not need to perform a maneuver at every junction in the route in order to reach the destination location. For instance, the user carrying the device may recognize that the user only needs to go straight through several junctions until reaching a junction at which to make a turn to get to the destination location. In this patent application, when a junction has a maneuver instruction to display, that maneuver instruction is referred to as a "step".

In the navigation mode, the mapping application of some embodiments provides the user with a set of steps for a route between the device's current location and a destination location. Typically, the mapping application provides these steps to the user visually and audibly in the navigation mode. When the user carrying the device deviates from the route, the mapping application of some embodiments tracks the location of the device and recalculates a new route from the deviated location in order to redirect the user from the deviated location to the destination location. In other words, the mapping application of some embodiments operating in the navigation mode requires the device to be on a route at all times. Also, the mapping application of some embodiments operating in the navigation mode displays the steps by "popping up" the steps rather than sliding them in and out of the display area. Furthermore, in some embodiments, the information in the steps (i.e., the maneuver instructions) that the mapping application displays while operating in the navigation mode is dynamic. That is, information such as the estimated arrival time, the remaining time of the trip to the destination location, and the remaining distance from the current location of the device to the destination location, or to the next junction that has the next step, is updated by the mapping application as the device moves along the route.

In the route inspection mode, the mapping application of some embodiments allows the user to slide the steps in and out of the display area to inspect each step of the route. Alternatively, the mapping application allows the user to manipulate the map (e.g., by zooming in and out, or sliding the map in different directions) to display different junctions of the route. When a junction with a step is displayed in the display area as a result of the user's manipulation of the map, the mapping application displays the step by sliding the step in (and sliding in and out any intermediate steps between the previously displayed step and the currently displayed step). In this manner, the user can inspect the route by manually sliding the steps in and out of the display area, or by manipulating the map to display certain junctions of the route in the display area.

A.
Route Initiation and Search

Figure 49 illustrates an example of the interaction of a user with the mapping application to obtain routing directions, in terms of four stages 4905-4920. Specifically, this figure illustrates the mapping application starting to operate in the route inspection mode. The first stage 4905 illustrates the device after the user has selected the directions control 160 (not shown in this figure). The first stage 4905 also illustrates that the user has typed the starting and ending locations of the route in the starting field 245 and the ending field 250.

The second stage 4910 illustrates a selection of the route generation control 240. In some embodiments, when the route generation control is selected, the mapping application sends the starting and ending location information to a remote server to obtain the routes. The third stage 4915 shows two routes, route 1 and route 2, which the mapping application of some embodiments renders on the map based on the route information obtained from the remote server. The third stage 4915 also illustrates that the mapping application has selected route 1 by default. The user selects the start control 4950 to start navigation according to the selected route. The mapping application of some embodiments begins to operate in the route inspection mode upon receiving the selection of the start control 4950.

The fourth stage 4920 illustrates that the mapping application displays an instruction sign 4930, which in some embodiments is the first sign of a series of turn-by-turn instruction signs (not all of them shown in the figure) for browsing the selected route. The mapping application allows the user to browse the selected route by sliding these signs along a particular axis (e.g., horizontally). These scrollable instruction signs are described in more detail below. In some embodiments, the mapping application allows the user to browse the selected route when the starting location of the selected route is not the current location of the user. Also, the mapping application of some embodiments disables or does not display the page curl while the mapping application is in this mode for allowing the user to browse or inspect the selected route, as shown at this stage 4920.

In addition to typing the starting and ending locations of a route in the starting field 245 and the ending field 250, the mapping application of some embodiments also allows the user to select a route from a list of previously searched routes. Figure 50 illustrates an example of the interaction of a user with the mapping application to obtain routing directions, in terms of four stages 5005-5020. This example is provided in the context of using the directions control 160 to obtain a route between two locations.

The first stage 5005 illustrates the mapping application displaying a map of the street view of a city. The user is initiating a tap of the directions control 160 located at the top left corner of the display, next to the search field 165. The second stage 5010 next illustrates that the application presents a search table 5055 with a list of recent route directions that the user has previously searched. In this example, the user selects a route to a police station, as shown.

The third stage 5015 illustrates the display of a map with the selected route between the device's current location and the destination of the selected route. This stage 5015 also illustrates a selection of the list view control 235. The fourth stage 5020 illustrates that the mapping application presents a list of turn-by-turn instructions for getting to the destination. As shown, each instruction in the list includes a direction icon 5035 showing the direction of the particular turn associated with the instruction. In some embodiments, each instruction in the list looks identical to the corresponding instruction sign 4930 described above by reference to Figure 49.

Figure 51 conceptually illustrates an example of displaying routing directions in the relatively large display area of a device. An example of such a device is a tablet device (e.g., the iPad® sold by Apple Inc.). Specifically, Figure 51 illustrates, in three different stages 5105-5115, the interaction of a user with the mapping application to display a set of routing directions. As illustrated, the device provides a larger screen display area for viewing the mapping application when compared with the display area of a smaller device (e.g., a smartphone such as the iPhone® sold by Apple Inc.). The larger display area allows the mapping application to utilize the screen space efficiently to display different UI items within the map view of the mapping application and to minimize changes of UI screens. For instance, when the user selects to view a list of routing directions, the mapping application displays the list of routing directions directly overlaid on the map without changing to a different UI screen.

The first stage 5105 illustrates a tablet device 5120 executing the mapping application of some embodiments, which is displaying a map view of a particular route between two locations. In particular, the user has obtained a route between the user's current location and the location of the POI "Pizza Place". The user may have obtained this route through several avenues, including the search feature, placing pins on the map, and various other mechanisms. The mapping application is also displaying a set of floating controls, including the list view control 145.

The second stage 5110 shows that the user is selecting the list view control 145 to obtain a list of the routing directions. The third stage 5115 illustrates that the mapping application now displays the list of routing directions overlaid on a portion of the map view of the mapping application.

In some embodiments, when the user selects an individual routing direction from the list of routing directions, the mapping application displays the corresponding portion of the route associated with the selected routing direction on the map. If the corresponding portion of the route is not within the currently displayed map region, the mapping application shifts the map such that a region of the map that contains the corresponding portion is displayed.

B.
Route Display and Review

In some embodiments, when the mapping application presents identified routes to the user, the application allows the user to select and scroll through a set of selectable UI items representing the signs for the junctions along a selected route. As the user scrolls through each sign, the portion of the route associated with the sign currently in focus is presented or highlighted (e.g., through color highlighting or through another geometry, such as a circle or other mark, that marks the portion). This mode of operation of the mapping application of some embodiments is referred to as the route inspection mode. Operating in this mode, the mapping application allows the user to inspect the road by manipulating the UI items that represent the instructional signs for some of the junctions of the route. In some embodiments, the mapping application operates in the route inspection mode (1) when the route being inspected is between two locations, neither of which is the current location of the device on which the mapping application executes, and (2) when the route has been computed for walking (rather than driving) directions.

Figure 52 illustrates an example of the route inspection mode, in terms of four stages 5205-5220 of the user scrolling through a set of scrollable instruction signs for a particular route selected by the user. The first stage 5205 illustrates the user initiating a selected route between the starting and ending locations by tapping the start control in the top bar 140. The second stage 5210 illustrates the first scrollable sign 5225 (a selectable UI item) presented for the particular route, as indicated by the "1 of 3" text displayed at the center of the top bar 140, as described above. Also, the mapping application is displaying a current junction indicator 5290 on the junction represented by the displayed sign 5225. More details about the current junction indicator are described in U.S. Patent Application 13/632,002, entitled "Mapping Application with Automatic Stepping Capabilities", filed September 30, 2012. U.S. Patent Application 13/632,002 is incorporated herein by reference. The second stage 5210 also illustrates that the user initiates a leftward swipe gesture on the first scrollable sign. The swipe gesture shifts the sign to the left of the map. In some embodiments, the user can tap the sign to shift it to the left (or right) of the map and display the next sign.

The third stage 5215 illustrates that a portion of the first scrollable sign 5225 has been scrolled off the map display area and a new sign 5235 for the route has become partially visible. The user can see that this new sign displays a right-turn arrow. The mapping application has not yet moved the current junction indicator 5290, because the sign 5225 is still the current sign.

The fourth stage 5220 illustrates the display after the user has completed the swipe gesture to move the first sign off the map. The mapping application now displays the second sign 5235 of the list of routing directions, as indicated by the "2 of 3" text displayed in the top bar 140. This sign indicates that in 0.1 miles, the user needs to turn right onto 7th Street. Moreover, the application has zoomed in on a portion of the map display area and highlighted a new segment 5250 of the route corresponding to the sign currently in focus that is being presented. The application has also moved the current junction indicator 5290 to the junction indicated by the second sign 5235.

Alternatively or conjunctively, the user can scroll through each sign by selecting different junctions of the route (e.g., by tapping) or by navigating the map (through gestural input) in order to view a particular scrollable sign associated with a particular junction. Figure 53 illustrates, in terms of three stages 5305-5315, the user navigating the map in order to scroll through the different scrollable signs.

The first stage 5305 illustrates the map with an overlaid sign (corresponding to the "2 of 3" sign of the route) for a particular junction of the route between the starting and ending points. The application has also highlighted the corresponding portion of the route for this sign. The sign states that in 0.1 miles, the user needs to turn right onto 7th Street. The first stage 5305 also illustrates that the user initiates a swipe gesture to navigate the map (e.g., swiping the user's finger to the right) in order to view a different region of the map.

The second stage 5310 illustrates a new region of the map displayed after the swipe gesture, corresponding to a leftward shift. The third stage 5315 illustrates that, after the completed swipe gesture, a new scrollable sign is now overlaid on the map, corresponding to the portion of the route now displayed in this particular region of the map. This sign is the third sign of the route, as indicated by the "3 of 3" text displayed at the top center of the map. The sign indicates that in 350 feet, the user will arrive at the destination.

In order to navigate a set of routing directions, the user has the option of scrolling through the signs overlaid on the map or of navigating the map to scroll through the different signs. Also, when the user taps a particular section of the route, the mapping application scrolls through the different signs to display a sign that corresponds to that particular section of the route. In some embodiments, the mapping application displays the sign for a junction that is closest to the tapped portion of the route.

This scrolling feature of the mapping application allows the user to quickly ascertain all of the necessary maneuvers when traveling between two locations. This may be especially useful in driving situations that require a significant number of lane changes in anticipation of upcoming turns.
In some embodiments, the direction arrows shown in the scrollable instruction signs are simple arrows. In other embodiments, when there is sufficient space on a sign or presentation for the use of a larger sign, the mapping application of some embodiments operating in the navigation mode identifies a maneuver to be performed at a junction along the route by using a larger graphical directional indicator that includes (1) a prominent stylized arrow roughly representing the path of the vehicle and (2) a de-emphasized set of lines and curves corresponding to the other elements of the junction. In some embodiments that use this approach, a right turn at a T-junction is represented by a large arrow with a right angle joined with a smaller, dimmer segment that runs parallel to one of the segments of the large arrow. The smaller segment, in some embodiments, is also pushed off to the side so that the path taken by the vehicle dominates. More details about these arrows are described in U.S. Patent Application 13/632,121, entitled "Context-Aware Voice Guidance", filed September 30, 2012. U.S. Patent Application 13/632,121 is incorporated herein by reference. In some embodiments, the mapping application uses realistic-looking road traffic signs in place of the instructions with direction icons.

Figure 54 illustrates an example of displaying a set of scrollable instruction signs for a particular route selected by the user. The device 5400 on which the mapping application executes has a relatively large display area. An example of such a device is a tablet device (e.g., an iPad®). This figure illustrates, in two different stages 5405 and 5410, the interaction of a user with the mapping application to step through the instruction signs.

When the mapping application is running on a device with a larger display area, the mapping application displays more signs in the display area at any given moment. In some embodiments, the mapping application displays a row of signs in the top portion of the display area, with the current sign in the middle of the top portion. The number of signs the mapping application can display varies depending on the orientation of the device. That is, the mapping application can display more signs when the display area is in a landscape orientation than when the display area is in a portrait orientation.

The first stage 5405 shows the mapping application displaying three signs 5415-5425 and a portion of a fourth sign 5430. In this example, the signs 5415-5430 represent the first through fourth instructions of the selected route, which has a total of six steps. As the top bar 5435 indicates, the second instruction of the route is the current instruction, and the sign 5420 is highlighted and placed in the middle of the top portion of the display area to indicate that the sign 5420 represents the current instruction. The first stage 5405 also shows that the user is swiping the sign 5420 to the left.

The second stage 5410 shows that the mapping application is displaying the sign 5425, for the third instruction of the route, in the middle of the top portion of the display area. In this example, the mapping application has also highlighted the sign 5425, and the segment of the route corresponding to this sign is highlighted, as shown. The top bar indicates that the current instruction is the third of the route's six instructions. Also, the sign 5415 has now mostly slid out of the display area, and the sign 5430 is now fully displayed. The mapping application is also displaying a portion of a fifth sign 5445, which represents the fifth instruction of the route.

The mapping application of some embodiments allows the user to switch to an overview mode while reviewing a selected route by scrolling the instruction signs. In the overview mode, the mapping application of some embodiments adjusts the zoom level of the map such that the entire route can be rendered on the map. The mapping application also allows the user to go back to the mode in which the user can resume reviewing the directional instructions. Figure 55 illustrates, in terms of three stages 5505-5515, an example of the interaction of a user with the application to switch to the overview mode while reviewing a selected route.

The first stage 5505 is identical to the stage 5315 described above by reference to Figure 53. That is, the user has scrolled to the last instruction sign 5520. The next stage 5510 illustrates a selection of an overview control 5525.

The third stage 5515 illustrates the map in the overview mode. The mapping application of some embodiments shows the map in the overview mode in response to receiving the selection of the overview control 5525. The mapping application has zoomed out the map such that the entire route is displayed in the map. In some cases, when the current location of the device is very close to the destination location (e.g., within 100 meters), the mapping application displays only the portion of the route from the current location of the device to the destination location.

Also, the top bar 5530 shows that the destination, which in this example is the police station, is 7 minutes or 0.5 miles away from the current location of the device or from the starting location of this particular route. The top bar 5530 now includes a resume control 5540, which in some embodiments is for resuming the navigation or inspection of the selected route. The mapping application also displays the list view control 235 in the map.

The top bar 5530 also shows an end control 5570. When the mapping application receives a selection of the end control 5570 while the mapping application is showing the overview of the selected route, the mapping application of some embodiments stops the inspection of the selected route by going back to the map browsing mode. The mapping application of some embodiments goes back to the map browsing mode by removing the selected route from the map, putting back the page curl, and replacing the information and controls in the top bar with a set of other controls including the directions control, the search field, and the bookmarks control. The mapping application of some embodiments does not shift the map to another region when switching from the inspection mode to the map browsing mode. The mapping application of some embodiments leaves the pins for the starting and ending locations in the map when the mapping application enters the map browsing mode.
Figure 56 conceptually illustrates a process 5600 that some embodiments perform to allow a user to browse the signs for a set of instructions for the junctions in a route between a starting location and an ending location. In some embodiments, the process 5600 is performed by the mapping application. The process 5600 starts when the mapping application has computed one or more routes between the starting location and the ending location.

The process 5600 begins by receiving (at 5605) a selection of a route. As shown in Figure 49 above, the mapping application of some embodiments provides a recommended route when there are two or more generated routes between the starting and ending locations. When the user does not select another route, the mapping application takes the recommended route as the selected route upon receiving a selection of a start control (such as the start control 4950).

Next, the process 5600 receives (at 5610) a user input for starting a browsing mode. In some embodiments, the mapping application enters the browsing mode when the user selects a start control (such as the start control 4950). At 5615, the process 5600 of some embodiments then displays on the map a sign for the first junction of the route (i.e., the starting location) and the sign's corresponding junction (i.e., the first junction).

The process 5600 then receives (at 5620) a user input. In some embodiments, the user input includes any gestural interaction with the mapping application. For instance, the user can zoom or swipe the map by touching one or more locations of the map. The user can also tap, swipe, etc., the currently displayed sign.

The process 5600 then determines (at 5625) whether the user input is for moving the currently displayed sign. In some embodiments, the process 5600 determines that the user input is for moving the sign when the user taps the currently displayed sign or swipes the currently displayed sign in a certain direction. When the process 5600 determines that the user input is not for moving the currently displayed sign, the process 5600 proceeds to 5635, which is described further below.

When the process 5600 determines that the user input is for moving the currently displayed sign, the process 5600 displays (at 5630) a neighboring sign according to the user input, if possible. For instance, the process 5600 displays the next or the previous sign for the next or the previous junction in the route, according to the user input. The process 5600 also displays the corresponding junction of the route. In some embodiments, the process 5600 may zoom or shift to another region of the map in order to display the corresponding junction of the sign being displayed. The process 5600 then loops back to 5620 to receive another user input.

When the process 5600 determines (at 5625) that the user input is not for moving the currently displayed sign, the process 5600 determines (at 5635) whether the input is for displaying a junction other than the junction currently displayed. In some embodiments, the process determines that the input is for displaying a junction when the user manipulates the map (e.g., swipes, zooms, etc.) to display another region of the map, or when the user taps a portion of the route that is closer to another junction of the displayed route. When the process 5600 determines (at 5635) that the user input is not for displaying another junction, the process 5600 proceeds to 5645, which is described further below.

When the process 5600 determines (at 5635) that the user input is for displaying another junction, the process 5600 displays (at 5640) the other junction and the junction's corresponding sign. This sign may not be a neighboring sign of the currently displayed sign. The process 5600 then loops back to 5620 to receive another user input.

When the process 5600 determines (at 5635) that the user input is not for displaying another junction, the process 5600 determines (at 5645) whether the user input is for showing an overview of the route. In some embodiments, the process 5600 determines that the input is for displaying the overview of the route when the mapping application receives a selection of an overview control. When the process 5600 determines (at 5645) that the user input is not for showing the overview of the route, the process 5600 proceeds to 5670, which is described further below.

When the process 5600 determines (at 5645) that the user input is for displaying the overview of the route, the process 5600 of some embodiments displays (at 5650) the entire route in the map. The process also receives another user input while displaying the overview of the route. The process 5600 then determines (at 5655) whether the input is for ending the browsing mode. In some embodiments, the process 5600 determines that the input is for ending the browsing mode when the mapping application receives a selection of an end control (such as the end control 5570 described above by reference to Figure 55). When the process 5600 determines (at 5655) that the user input is for ending the browsing mode, the process 5600 ends.

When the process 5600 determines (at 5655) that the user input is not for ending the browsing mode, the process determines (at 5660) whether the input is for exiting the overview of the route. In some embodiments, the process 5600 determines that the input is for exiting the overview when the mapping application receives a selection of a resume control (such as the resume control 5540 described above by reference to Figure 55).

When the process 5600 determines (at 5660) that the input is not for exiting the overview mode, the process loops back to 5650 to display the route and receive another user input. When the process 5600 determines (at 5660) that the input is for exiting the overview mode, the process exits the overview of the route and displays (at 5665) the sign, and its corresponding junction, that was displayed before the overview was shown. The process then loops back to 5620 to receive another user input.
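The input handling of process 5600 amounts to a small event dispatcher over the browsing state, which can be sketched as follows; the event names are assumptions standing in for the gestures and control selections described above.

```python
def dispatch(state, event):
    """state: dict with 'index' (current sign), 'count' (number of signs),
    and 'overview' (whether the route overview is showing).
    Returns the updated state, or None when browsing ends."""
    if event == "end":
        return None                        # 5655/5670: end the browsing mode
    s = dict(state)                        # do not mutate the caller's state
    if s.get("overview"):
        if event == "resume":
            s["overview"] = False          # 5665: back to the sign shown before
        return s                           # other inputs keep the overview up
    if event == "next_sign" and s["index"] + 1 < s["count"]:
        s["index"] += 1                    # 5630: slide the next sign in
    elif event == "prev_sign" and s["index"] > 0:
        s["index"] -= 1                    # 5630: slide the previous sign in
    elif event == "overview":
        s["overview"] = True               # 5650: zoom out to the whole route
    return s
```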
When the process 5600 determines (at 5645) that the input received at 5620 is not an input for showing the overview of the route, the process 5600 determines (at 5670) whether the input is for ending the browsing mode. In some embodiments, the process 5600 determines that the input is for ending the browsing mode when the mapping application receives a selection of the end control. When the process 5600 determines (at 5670) that the user input is for ending the browsing mode, the process 5600 ends. Otherwise, the process loops back to 5620 to receive another user input.

C. Navigation Mode

Figure 57 illustrates an example of a device 5700 that executes the mapping application of some embodiments. This figure also illustrates an example of launching route navigation in this application. Figure 57 shows six stages 5705, 5710, 5715, 5717, 5719, and 5721 of interaction with the mapping application. The first stage 5705 shows a UI 5720, which includes several icons of several applications in a dock area 5725 and on a page of the UI. One of the icons on this page is the icon for the mapping application 5730. The first stage shows the user's selection of the mapping application through touch contact with the device's screen at the location of this application on the screen.

The second stage 5710 shows the device after the mapping application has opened. As shown at this stage, the UI of the mapping application has a starting page that, in some embodiments, (1) displays a map of the current location of the device and (2) several UI controls, arranged in a top bar 5740 and as floating controls.

The third stage 5715 of Figure 57 illustrates that the selection of the directions control 5760 opens a directions entry page 5780, which is shown at the fourth stage 5717. The directions control is one of three mechanisms through which the mapping application can be directed to identify and display a route between two locations; the two other mechanisms are (1) a control in an informational banner that is displayed for a selected item in the map and (2) recent routes identified by the device that are displayed in the search field 5765. Accordingly, the informational banner control and the search field 5765 are two UI tools that the application employs to make the transition between the different modalities seamless.

The fourth stage 5717 shows the user selecting one of the recent directions that has been auto-populated in a table 5782. The fifth stage 5719 then shows three routes on a 2D map view between the specified starting and ending locations specified through the page 5780. It also shows the selection of the second route and some information about this route in a bar at the top of the layout. This bar is shown to include start and end buttons. The start button is shown to be selected at the fifth stage.

As shown at the sixth stage 5721, the selection of the start button directs the application to enter a turn-by-turn navigation mode. In this example, the application has entered a 2D turn-by-turn navigation mode. In other embodiments, the application will by default enter a 3D turn-by-turn navigation mode. In this mode, the application displays a realistic sign 5784 that identifies the distance to the next junction maneuver in the navigated route and some other pertinent information. The application also displays a top bar that includes some information about the navigation, as well as end and overview buttons for ending the navigation and for obtaining an overview of the remaining portion of the navigated route, or the entire portion of the navigated route (in other embodiments), respectively.

The application further displays the floating 3D control 5750 and the floating list control, described above. It should be noted that the list control is adaptively added to the floating control cluster upon entry into the route inspection and route navigation modalities, while the position indicator is removed from the floating controls upon entry into the route navigation modality. Also, upon the transition from the route inspection mode to the route navigation mode, the application performs, in some embodiments, an animation that involves the page curl unfurling completely before the application transitions into the navigation presentation.

In some embodiments, this animated transition includes removing the top bar, its associated controls, and the floating controls from the navigation presentation, and moving the sign 5784 to the top edge of the presentation a short time period after the navigation presentation starts. As further described below, the application requires, in some embodiments, the user to tap on the navigated map to bring back the top bar, its controls, and the floating controls, and requires another tap to remove these controls from the map again. Other embodiments provide other mechanisms for viewing and removing these controls.

The navigation application of some embodiments can display navigation in either a 2D mode or a 3D mode. As mentioned above, one of the floating controls is the 3D control 5750, which allows a user to view a navigation presentation in three dimensions (3D). Figure 58 illustrates how the navigation application of some embodiments provides the 3D control 150 as a quick mechanism for entering a 3D navigation mode. This figure illustrates this operation in three stages 5805-5815. The first stage 5805 illustrates the user selecting the 3D control 150 while viewing a two-dimensional navigation presentation.

The second stage 5810 illustrates the navigation presentation in the midst of its transition into a 3D presentation. As shown in this figure, the 3D control appears highlighted at this stage to indicate that the navigation presentation has entered the 3D mode. As mentioned above, the navigation application in some embodiments generates the 3D view of the navigated map by rendering the map view from a particular position in the three-dimensional scene that can conceptually be thought of as the position of a virtual camera capturing the map view. This rendering is further described below by reference to Figure 59.

The third stage 5815 then illustrates the navigation presentation at the end of its transition into its 3D appearance. As shown by the difference between the heights of the buildings in the second and third stages, the transition from 2D to 3D navigation, in some embodiments, includes an animation showing the three-dimensional objects in the navigated map becoming larger.
一些實施例之導航應用程式能夠自多個透視角度顯示導航地圖。應用程式可三維(3D)地或二維(2D)地展示地圖。3D地圖係虛擬攝影機所見之虛擬場景的所產生之模擬。圖59呈現一簡化實例以說明虛擬攝影機5912之概念。當顯現3D導航地圖時,虛擬攝影機為3D地圖場景中之位置(器件自該位置顯現場景之3D視圖)之概念化。圖59說明3D導航地圖場景5910中之包括四個物件之方位,該四個物件包括兩個建築物及兩條相交道路。為了說明虛擬攝影機概念,此圖說明三個情境,其中之每一者對應於一不同虛擬攝影機方位(亦即,一不同顯現位置)及顯示於器件上之一不同所得視圖。 第一階段5901展示在第一位置處以一角度(例如,與地平線成30度之角)向下指向3D場景5910的虛擬攝影機5912。藉由自階段5901中所展示之位置及角度顯現3D場景,應用程式產生3D地圖視圖5918。自此位置,攝影機指向係器件前方的一移動位置之一方位。使虛擬攝影機5912保持在器件之當前方位後面。在此情況下,「在當前方位後面」意謂在與器件移動之當前方向相反之方向上沿著導航應用程式之經界定路徑向後。 導航地圖視圖5918看上去就好像其係由攝影機自器件之方位指示符5916後上方拍攝一樣。虛擬攝影機之方位及角度將方位指示符5916置於接近導航地圖視圖5918之底部處。此亦導致畫面之大部分填充有在器件之目前方位前面的街道及建築物。與之對比,在一些實施例中,方位指示符5916在畫面之中心,畫面之一半表示在器件前面的事物且另一半表示在器件後面的事物。 第二階段5902展示在不同位置處以較大第二角度(例如,-45°)向下指向場景5910的虛擬攝影機5912。應用程式自此角度顯現場景5910,從而導致3D導航地圖視圖5928。建築物及道路小於其在第一導航地圖視圖5918中之圖示說明。再一次,虛擬攝影機5912在場景5910中位於方位指示符5916之後上方。此再次導致方位識別符出現在3D地圖視圖5928之下部部分中。攝影機之方位及定向亦再次導致畫面之大部分顯示進行導航的人需要知道之在攜載器件之汽車前面的事物。 第三階段5903展示由上而下視圖下的虛擬攝影機5912,該虛擬攝影機5912向下面向2D地圖上之一方位,該方位對應於3D地圖場景5910中用以顯現3D視圖5918及5928之方位。自該透視角度顯現之場景為2D地圖視圖5938。不同於第一及第二階段之在一些實施例中為透視3D顯現操作之3D顯現操作,第三階段中之顯現操作相對簡單,此係因為該操作僅需要裁剪應用程式或使用者所指定之縮放層級所識別的2D地圖之一部分。因此,此情形下的虛擬攝影機表徵稍微不必要地使對應用程式之操作之描述複雜化,此係因為裁剪2D地圖之一部分並非一透視顯現操作。 在一些實施例中,如下文進一步描述,可藉由在地圖進入3D模式之後改變用於檢視地圖之縮放層級來使虛擬攝影機移動。在此等實施例中之一些中,應用程式切換至在縮放層級達到一特定縮小層級時產生2D視圖之一由上而下模式(其中顯現位置筆直面向下)。如在第三階段5903中,在一些實施例中,當攝影機自3D透視圖切換至2D由上而下視圖時,地圖繪製應用程式從自一特定透視方向顯現3D場景切換至裁剪2D場景。此係因為,在此等實施例中,應用程式經設計以使用較容易且不產生不必要之透視假影之簡化顯現操作。然而,在其他實施例中,地圖繪製應用程式使用透視顯現操作而自一由上而下之虛擬攝影機位置顯現一3D場景。在此等實施例中,所產生之2D地圖視圖稍微不同於第三階段5903中所說明之地圖視圖5938,因為遠離視圖之中心的任何物件皆發生失真,其中物件距視圖之中心之距離越遠,則失真越大。 在不同實施例中,虛擬攝影機5912沿著不同軌跡移動。在圖59中說明兩個此等軌跡5950及5955。在此等軌跡兩者中,攝影機在圓弧中移動且在攝影機在圓弧上向上移動時更多地向下旋轉。軌跡5955與軌跡5950之不同之處在於:在軌跡5955中,攝影機在其沿圓弧向上移動時自當前方位向後移動。 當沿著圓弧中之一者移動時,攝影機旋轉以將方位指示符前面的一點維持在攝影機之焦點處。在一些實施例中,使用者可關閉三維視圖且純粹用二維視圖進行導航。舉例而言,一些實施例之應用程式允許藉由使用3D按鈕5960來開啟及關閉三維模式。3D按鈕5960對轉向提示導航特徵係不可少的,該按鈕在轉向提示導航特徵中起指示符及雙態開關之作用。當關閉3D時,攝影機將維持2D導航體驗,但當開啟3D時,在3D檢視角度不適當時(例如,當繞過在3D模式下可能被阻擋之拐角時)仍可能存在一些由上而下之透視圖。 
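The relationship between the virtual camera's downward pitch and its position behind the tracked point can be sketched with basic trigonometry. This is an illustrative reconstruction, not code from the specification: at a shallow angle (e.g., 30°) the camera sits low and far behind the point, and at 90° (straight down) the horizontal setback vanishes, which corresponds to the top-down 2D view of the third stage:

```python
import math

def camera_offset(pitch_deg, distance):
    """Offset of a virtual camera that looks down at a tracked map point
    from the given pitch (degrees below the horizon) at the given distance.
    Returns (setback, height): how far behind the point and how high up."""
    pitch = math.radians(pitch_deg)
    setback = distance * math.cos(pitch)   # horizontal distance behind the point
    height = distance * math.sin(pitch)    # altitude above the map plane
    return setback, height
```

At `pitch_deg=90` the setback is zero and the camera is directly overhead, consistent with the text's observation that rendering then reduces to cropping a portion of the 2D map rather than a perspective rendering operation.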
作為允許使用者得到導航體驗之另一方式,一些實施例之地圖繪製應用程式在表示POI之圖釘旁出現的資訊橫幅中提供UI項目。圖60依據用以獲得路線選擇指引的使用者與地圖繪製應用程式之互動的三個階段6005至6015說明一實例。在使用汽車圖示6030之內容脈絡中提供此實例。 第一階段6005說明3D地圖視圖中之地圖。如所展示,3D控制項150表現為醒目提示的以指示地圖在3D地圖視圖中。第一階段6005亦說明針對由用所展示之搜尋查詢「Pizza」執行搜尋導致的搜尋之兩個圖釘之兩個資訊橫幅。使用者選擇汽車圖示6030。如上所述,汽車圖示6030係用於展示至由一圖釘表示之方位的一或多條路線,包括汽車圖示6030之橫幅與該圖釘相關聯。包括汽車圖示6030之橫幅6040亦展示對地方之簡要描述、星級評等及用於啟動用於POI之「詳細資訊畫面」之一箭頭。 第二階段6010說明一些實施例之地圖繪製應用程式回應於在先前階段6005中對汽車圖示6030之選擇而展示的兩條路線,路線1及路線2。使用者已選擇路線1,如醒目提示之橫幅6050所指示。使用者亦選擇開始控制項2060。如上所述,在一些實施例中,開始控制項4950係用於開始根據所選擇路線之導航。 第三階段6015說明地圖繪製應用程式顯示指令標誌6060,其係用於第一指令之標誌。地圖繪製應用程式已在頂端列140中用結束控制項5570及概觀控制項6075替換清除控制項255及開始控制項2060。結束控制項5570係用於結束路線之導航,且概觀控制項6075係用於藉由調整所顯示地圖之縮放層級(若調整縮放層級係展示完整路線必需的)而在地圖視圖中展示完整路線。在一些實施例中,地圖繪製應用程式在頂端列140中顯示估計到達時間、到達目的地所用的時間之量及距目的地之剩餘距離,如所展示。 當地圖繪製應用程式在地圖繪製應用程式正操作於路線檢查模式下時接收到對結束控制項5570之選擇時,一些實施例之地圖繪製應用程式藉由回到地圖瀏覽模式來停止對所選擇路線之檢查。一些實施例之地圖繪製應用程式藉由以下操作而回到地圖瀏覽模式:自地圖移除所選擇路線、將頁面捲曲放回原處、用包括指引控制項、搜尋欄位及書籤控制項的其他控制項之一集合替換頂端列中之資訊及控制項。亦即,地圖繪製應用程式使UI頁面之外觀回到類似於第一階段6005中所展示之UI頁面的UI頁面。一些實施例之地圖繪製應用程式在自檢查模式切換至地圖瀏覽模式時不使地圖移位至另一區。 應注意,儘管在搜尋欄位中之路線歷史鍵入及快速路線導航控制項皆不執行無法用可選擇指引項目(direction item)達成之動作,但該兩者充當使獲得最一般所要路線更加容易之重要加速器。 一些實施例使用自2D地圖視圖至3D地圖視圖之電影轉變或反之亦然。舉例而言,當地圖繪製應用程式在展示一路線之一開始方位時接收到對3D控制項150之選擇時,地圖繪製應用程式自2D地圖視圖開始,且自用於2D地圖之一第一虛擬攝影機視圖平滑地轉變至一新的虛擬攝影機3D視圖,該視圖被放得更大且指向路線之開始之方向。在這種情況下,虛擬攝影機視圖執行平移、縮放及旋轉操作之一組合以便到達用於導航的路線之開始。亦即,虛擬攝影機在圓弧上移動且在攝影機沿著圓弧向下移動時更多地向上旋轉。又,地圖繪製應用程式可使圓弧本身旋轉以將虛擬攝影機視點對準路線之初始道路區段。換言之,地圖繪製應用程式在電影轉變期間使地圖旋轉。 圖61說明器件6100,其隨著應用程式經過六個階段6105至6130自用於地圖瀏覽之非沈浸式地圖視圖轉變成用於導航之沈浸式地圖視圖而顯示地圖繪製應用程式。 第一階段6105說明使用者選擇方位「Pizza Place」之快速路線按鈕以便產生自使用者之當前方位(靠近器件6100之螢幕之中心)至所選擇方位之路線。第二階段6110說明地圖繪製應用程式顯示用以到達方位「Pizza Place」之路線6135。在第二階段6110,使用者選擇「開始」UI控制項6140。因此,應用程式開始進入導航。 如第三至第六階段6115至6130處所展示,一些實施例使用自2D (或3D)非沈浸式地圖視圖至3D沈浸式地圖視圖之電影轉變。應用程式顯示自其當前狀態(在6110處展示)開始且自第一虛擬攝影機視圖平滑地轉變至新虛擬攝影機視圖,該新虛擬攝影機視圖被放得更大且指向路線之開始之方向。在這種情況下,虛擬攝影機可執行平移、縮放及旋轉操作之一組合以便到達用於導航的路線之開始。如此等階段中所展示,虛擬攝影機移動並旋轉進入在第六階段6130中所展示之其最終方位,該方位在導航方位指示符(亦即,定位盤)後面。V. 
多模式地圖繪製應用程式 圖62A至圖62B概念性地說明狀態圖6200,該圖描述一些實施例之整合式地圖繪製、搜尋及導航應用程式(例如,以上章節中所描述之應用程式)之不同狀態及此等狀態之間的轉變。一般熟習此項技術者將認識到一些實施例之應用程式可具有與所有不同類型輸入事件有關之許多不同狀態,且狀態圖6200特別集中至此等事件之一子集。狀態圖6200描述且參考用於改變應用程式之狀態之各種示意動作互動(例如,多點觸碰示意動作)。一般熟習此項技術者將認識到各種其他互動(諸如,游標控制器示意動作及按鈕點選、鍵盤輸入、觸控板/軌跡墊輸入等)亦可用於類似選擇操作。 當使用者最初開啟地圖繪製應用程式時,該應用程式在狀態6205 (地圖瀏覽狀態)下。在此狀態6205下,應用程式將已產生並顯示一地圖視圖。為了產生並顯示此地圖視圖,一些實施例之應用程式識別用於一區之地圖底圖之一所需集合、請求該等地圖底圖(例如,向一地圖繪製服務伺服器)、根據一虛擬攝影機之一特定方位、定向及透視角度產生該等地圖底圖之一視圖,且將該地圖視圖顯現至一器件顯示器。當在狀態6205下時,地圖視圖係靜態的。藉由在狀態6205下的應用程式,使用者可執行眾多操作以修改地圖視圖、搜尋實體(例如,名勝、地址等)、擷取用於導航之路線等。 在一些實施例中,整合式應用程式係顯示於具有一整合式觸敏顯示器之器件上。在地圖上之各種示意動作互動可使應用程式執行對地圖視圖之不同修改(例如,移動瀏覽、旋轉、縮放、修改地圖透視角度等)。當整合式應用程式接收到在地圖顯示上之示意動作互動(而非對覆疊於地圖顯示上之各種浮動或非浮動控制項之觸碰輸入)時,應用程式轉變至狀態6210以執行示意動作輸入辨識。 示意動作輸入辨識狀態6210區分不同類型示意動作輸入,且將此等類型輸入轉譯成不同地圖視圖修改操作。在一些實施例中,地圖繪製應用程式接收如由具有整合式觸敏顯示器之器件之作業系統轉譯之示意動作輸入。作業系統將觸碰輸入轉譯成示意動作類型及方位(例如,座標(x,y)處之「觸按」、具有兩個不同方位處之單獨觸碰輸入之「捏合」操作等)。在狀態6210,一些實施例之整合式地圖繪製應用程式將此等輸入轉譯成不同地圖視圖修改操作。 當應用程式接收到一第一類型示意動作輸入(例如,在地圖視圖上以旋轉運動一起移動之兩個單獨觸碰輸入)時,應用程式轉變至狀態6215以使地圖旋轉。為了使地圖視圖旋轉,一些實施例修改虛擬攝影機之判定地圖之哪個部分經顯現以建立地圖視圖的方位及/或定向。舉例而言,當在3D模式下時,地圖繪製應用程式使虛擬攝影機繞一特定位置(例如,觸碰輸入之中心、顯示之中心、識別使用者之方位之方位指示符等)旋轉。隨著第一類型示意動作輸入繼續,地圖繪製應用程式保持在狀態6215下以繼續使地圖旋轉。 當使用者釋放第一類型示意動作輸入時,一些實施例之應用程式轉變至狀態6230以執行一慣性計算。在一些實施例中,在使用者釋放某些類型觸碰輸入之後,應用程式繼續在一特定量的時間及/或距離中執行相關聯地圖視圖修改。在此情況下,在使用者釋放旋轉輸入之後,應用程式轉變至慣性計算狀態6230以計算額外旋轉量及應執行此旋轉之時間。在一些實施例中,應用程式使旋轉自地圖之目前旋轉(角)速率減速,就好像一「摩擦」力被施加至地圖一般。因而,一些實施例之慣性計算係基於第一類型示意動作輸入之速度。自狀態6230,應用程式轉變回至應用程式先前所處之地圖修改狀態。亦即,當應用程式自狀態6215 (旋轉狀態)轉變至慣性計算狀態6230時,應用程式在執行慣性計算之後接著轉變回至狀態6215。在地圖之旋轉完成之後,應用程式轉變回至狀態6205。 當應用程式接收到一第二類型示意動作輸入(例如,在地圖視圖上移動之單一觸碰輸入)時,應用程式轉變至狀態6220以使地圖移動瀏覽。為了使地圖視圖移動瀏覽,一些實施例修改虛擬攝影機之判定地圖之哪個部分經顯現以建立地圖視圖的方位。此使地圖看上去像在源於第二類型示意動作輸入之方向的一方向上滑動。在一些實施例中,當地圖視圖在3D透視模式下時,移動瀏覽程序涉及執行觸碰輸入之方位與扁平地圖上之一方位之相關,以便避免地圖視圖中之突然的不當跳躍。隨著第二類型示意動作輸入繼續,地圖繪製應用程式保持在狀態6220下以繼續使地圖移動瀏覽。 
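State 6210's translation of raw touch input into map-modification operations can be sketched as a classifier over the start and end points of one or two touches. This is a hypothetical, heavily simplified stand-in for the OS-level gesture recognizer the text describes; the threshold and the geometric tests are invented for illustration:

```python
import math

def classify_gesture(t1_start, t1_end, t2_start=None, t2_end=None,
                     pinch_threshold=10.0):
    """Classify touch movement into the map operations of states 6215-6235."""
    if t2_start is None:
        return "pan"                          # single-touch drag -> state 6220
    d_start = math.dist(t1_start, t2_start)
    d_end = math.dist(t1_end, t2_end)
    if abs(d_end - d_start) > pinch_threshold:
        return "zoom"                         # pinch/spread -> state 6225
    v1 = (t1_end[0] - t1_start[0], t1_end[1] - t1_start[1])
    v2 = (t2_end[0] - t2_start[0], t2_end[1] - t2_start[1])
    if v1[1] * v2[1] > 0 and abs(v1[0]) < abs(v1[1]):
        return "perspective"                  # two fingers dragged up/down -> state 6235
    return "rotate"                           # fingers moving in rotation -> state 6215
```

A real recognizer would also track velocity (for the inertia calculations of state 6230) and handle gestures that change type mid-stream; this sketch only shows the dispatch idea.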
當使用者釋放第二類型示意動作輸入時,一些實施例之應用程式轉變至狀態6230以執行一慣性計算。在一些實施例中,在使用者釋放某些類型觸碰輸入之後,應用程式繼續在一特定量的時間及/或距離中執行相關聯地圖視圖修改。在此情況下,在使用者釋放移動瀏覽輸入之後,應用程式轉變至慣性計算狀態6230以計算移動地圖視圖(亦即,移動虛擬攝影機)之額外量及應執行此移動之時間。在一些實施例中,應用程式使該移動瀏覽運動自地圖之目前移動瀏覽速率減速,就好像一「摩擦」力被施加至地圖一般。因而,一些實施例之慣性計算係基於第二類型示意動作輸入之速度。自狀態6230,應用程式轉變回至應用程式先前所處之地圖修改狀態。亦即,當應用程式自狀態6220 (移動瀏覽狀態)轉變至慣性計算狀態6230時,應用程式在執行慣性計算之後接著轉變回至狀態6220。在地圖之移動瀏覽完成之後,應用程式轉變回至狀態6205。 當應用程式接收到一第三類型示意動作輸入(例如,移動靠攏或分開的兩個單獨觸碰輸入)時,應用程式轉變至狀態6225以放大或縮小地圖。為了改變地圖視圖之縮放層級,一些實施例修改虛擬攝影機之判定地圖之哪個部分經顯現以建立地圖視圖的方位(亦即,高度)。此使地圖視圖包括地圖之更多(若縮小)或更少(若放大)部分。在一些實施例中,隨著使用者放大或縮小,應用程式擷取不同地圖底圖(用於不同縮放層級)以產生並顯現新地圖視圖。隨著第三類型示意動作輸入繼續,地圖繪製應用程式保持在狀態6225下以繼續地圖之放大或縮小。 當使用者釋放第二類型示意動作輸入時,一些實施例之應用程式轉變至狀態6230以執行一慣性計算。在一些實施例中,在使用者釋放某些類型觸碰輸入之後,應用程式繼續在一特定量的時間及/或距離中執行相關聯地圖視圖修改(亦即,使虛擬攝影機移動至更高處或更低處)。在此情況下,在使用者釋放縮放輸入之後,應用程式轉變至慣性計算狀態6230以計算縮放地圖視圖(亦即,移動虛擬攝影機)之額外量及應執行此移動之時間。在一些實施例中,應用程式使縮放移動自地圖之目前放大或縮小速率(亦即,虛擬攝影機改變高度之速度)減速,就好像一「摩擦」力被施加至攝影機一般。因而,一些實施例之慣性計算係基於第三類型示意動作輸入之速度。自狀態6230,應用程式轉變回至應用程式先前所處之地圖修改狀態。亦即,當應用程式自狀態6225 (縮放狀態)轉變至慣性計算狀態6230時,應用程式在執行慣性計算之後接著轉變回至狀態6225。在地圖之縮放完成之後,應用程式轉變回至狀態6205。 為簡單起見,狀態圖6200說明使用同一慣性計算程序(狀態6230)之地圖移動瀏覽、縮放及旋轉程序。然而,在一些實施例中,此等不同地圖修改程序中之每一者實際上使用不同慣性計算來識別其特定類型移動的減速及停止。另外,一些實施例在接收到輸入時而非在使用者移除示意動作輸入時計算並修改慣性變數。 當應用程式接收到一第四類型示意動作輸入(例如,一致地在觸敏顯示器上向上或向下移動的兩個單獨觸碰輸入)時,應用程式轉變至狀態6235以修改地圖之透視圖。為了改變地圖之透視圖,一些實施例在地圖上沿著一圓弧移動虛擬攝影機,從而修改虛擬攝影機之方位及定向兩者(當攝影機將其視場之中心保持在地圖上之一特定方位處時)。在一些實施例中,不同縮放層級使用虛擬攝影機沿著其移動之不同圓弧。此等圓弧中之每一者具有一頂部點,虛擬攝影機在該點處筆直指向下,從而提供地圖之2D透視圖。另外,每一圓弧具有一底部點,其係虛擬攝影機在圓弧上可移動至之最低點。因此,在一些實施例中,第四類型示意動作輸入可使應用程式在2D地圖視圖與3D透視地圖視圖之間變化。隨著第四類型示意動作輸入繼續,地圖繪製應用程式保持在狀態6235下以繼續修改地圖之透視圖。 當使用者釋放第四類型示意動作輸入時,一些實施例之應用程式轉變至狀態6240以執行一慣性計算。在一些實施例中,在使用者釋放某些類型觸碰輸入之後,應用程式繼續在一特定量的時間及/或距離中執行相關聯地圖視圖修改(亦即,使虛擬攝影機移動至更高處或更低處)。在此情況下,在使用者釋放透視圖改變輸入之後,應用程式轉變至慣性計算狀態6240以計算修改地圖視圖之透視角度(亦即,沿著虛擬攝影機之圓弧移動虛擬攝影機)之額外量及應執行此移動之時間。在一些實施例中,應用程式使移動自地圖改變透視角度之速率(亦即,虛擬攝影機沿著其圓弧移動之速度)減速,就好像一「摩擦」力被施加至攝影機一樣。因而,一些實施例之慣性計算係基於執行第四類型示意動作輸入之速度。 
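The "friction" deceleration applied after a gesture is released (state 6230) can be sketched as a frame-by-frame decay of the release velocity; the extra pan/zoom movement is the sum of the per-frame steps. The constants are illustrative, not from the specification:

```python
def inertia_displacement(release_speed, friction, dt=1.0 / 60.0):
    """Extra map movement after the user lifts a finger: decelerate the
    release speed by a constant 'friction' each frame until it reaches zero,
    accumulating the distance traveled along the way."""
    distance, v = 0.0, release_speed
    while v > 0.0:
        distance += v * dt
        v -= friction * dt
    return distance
```

A faster flick yields a longer glide: with one-second steps, `inertia_displacement(20.0, 10.0, dt=1.0)` travels 30.0 units versus 10.0 for half the release speed, consistent with the text's statement that the inertia calculation is based on the velocity of the gestural input.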
另外,對於透視角度改變操作,一些實施例轉變至回彈計算狀態6245。如所說明,透視角度改變操作在一些實施例中具有所允許之最大及最小透視角度移位,其可視當前地圖視圖之縮放層級而定。因此,除了慣性計算之外,應用程式亦執行狀態6245處的回彈計算。回彈計算使用慣性計算來判定是否將到達沿著虛擬攝影機圓弧之最大點,且若到達該最大點,則判定虛擬攝影機在此點之速率。一些實施例允許虛擬攝影機稍微移動超過最大點以達到「回彈」點,在該點處,應用程式使虛擬攝影機在其圓弧上轉向,從而使攝影機往回朝向最大點移動。一些實施例僅在虛擬攝影機圓弧之一個末端(例如,圓弧之底部)包括彈回(bounce-back)功能性,而其他實施例在圓弧之兩端皆包括該功能性。自回彈計算狀態6245,應用程式轉變回至慣性計算狀態6240,接著轉變回至透視角度改變狀態6235以顯示地圖視圖移動。另外,當使用者在足夠長時間中執行第四類型觸碰輸入且透視角度達到其他最大點時,應用程式直接自狀態6235轉變至狀態6245以計算回彈資訊,且接著轉變回至狀態6235。在對地圖之透視圖之修改完成之後,應用程式轉變回至狀態6205。 以上狀態與地圖呈現上之各種多點觸碰示意動作有關,整合式地圖繪製、搜尋及導航應用程式將該等多點觸碰示意動作轉譯成對地圖呈現之不同修改。各種其他觸碰輸入亦可使應用程式改變狀態並執行各種功能。舉例而言,一些實施例將一3D可選擇項目覆疊在地圖視圖上(例如,作為浮動控制項),且選擇(例如,藉由觸按輸入)3D項目使應用程式轉變至6235以修改地圖視圖之透視角度。當地圖視圖在3D透視圖中開始時,應用程式將透視角度修改成2D視圖;當地圖視圖在2D視圖中開始時,應用程式將透視角度修改成3D視圖。在修改之後,應用程式返回狀態6205。 當使用者正在檢視處於狀態6205下之地圖時,應用程式呈現各種標籤以作為地圖視圖之部分。此等標籤中之一些指示名勝或其他方位。當使用者選擇某些標籤(例如,針對某些商業、公園等),應用程式轉變至狀態6250以顯示針對所選擇方位之橫幅(例如,一資訊顯示橫幅),接著返回地圖瀏覽狀態(其中橫幅顯示於地圖上)。在一些實施例中,此橫幅包括(1)一快速路線導航UI控制項(例如,一按鈕),其使應用程式擷取自器件之當前方位至所選擇方位之一路線(例如,一駕駛路線)而不離開地圖視圖,及(2)一資訊UI控制項(例如,按鈕),其使應用程式提供關於方位之額外資訊。 當使用者選擇UI控制項按鈕時,應用程式自狀態6205轉變至狀態6255以顯示所選擇方位之一預備區域。在一些實施例中,此預備區域顯示所選擇方位之一媒體呈現(例如,3D視訊呈現、所選擇方位之低空俯瞰視圖、針對方位所捕獲之一系列影像等),以及針對所選擇方位之各種資訊(聯絡人資訊、評論等)。隨著使用者執行各種操作以導覽預備區域並檢視預備區域內之資訊,應用程式保持在狀態6255下。當使用者選擇UI控制項以轉移回至地圖視圖時,應用程式轉變至狀態6205。 自地圖瀏覽視圖,使用者亦可容易地存取應用程式之搜尋功能。當選擇一特定UI控制項(例如,一搜尋列)時,應用程式轉變至搜尋鍵入建議狀態6260。在搜尋鍵入狀態,一些實施例顯示使用者可藉以鍵入搜尋字詞之一觸控螢幕小鍵盤。搜尋字詞可為商業名稱、地址、方位之類型(例如,咖啡店)等。當使用者鍵入字元時,應用程式保持在狀態6260下且基於新近搜尋、已鍵入之字母等而提供建議。一些實施例可使用基於字首之建議(例如,以已鍵入之字元開始之建議),以及其他建議(例如,做拼寫校正以在已鍵入之字串之開頭新增字元、調換資源等)。在一些實施例中,除了方位之外,選擇亦可包括新近鍵入之路線。若使用者在此階段選擇取消UI控制項,則應用程式不執行搜尋即轉移回至狀態6205。 當使用者選擇一搜尋字詞(建議之字詞抑或完全由使用者鍵入之字詞)時,應用程式轉變至狀態6265以在地圖視圖上顯示搜尋結果,且接著在顯示搜尋結果的同時轉變至狀態6205。一些實施例將搜尋結果顯示為地圖上之可選擇項目(例如,圖釘);對該等項目中之一者之選擇導致至狀態6250之轉變以顯示針對所選擇項目之橫幅。另外,一些實施例之應用程式自動地選擇搜尋結果中之一者(例如,一「最佳」結果)且作為狀態6265之部分而顯示此橫幅。 
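The arc-limit behavior of states 6240/6245 can be sketched as a clamp with a small permitted overshoot: the camera may pass the arc's end point up to a "bounce" point, after which it settles back to the limit. The overshoot margin here is an invented illustrative constant:

```python
def apply_arc_limit(angle, min_angle, max_angle, overshoot=5.0):
    """Return (displayed_angle, settle_angle) for a perspective change.
    Inside the arc both equal the input; past an end of the arc, the
    displayed angle may overshoot slightly and the camera then bounces
    back to the limit."""
    if angle > max_angle:
        return min(angle, max_angle + overshoot), max_angle
    if angle < min_angle:
        return max(angle, min_angle - overshoot), min_angle
    return angle, angle
```

Embodiments that bounce only at one end of the arc would apply the clamp on just one side; the symmetric version is shown for brevity.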
由於應用程式係緊密整合之地圖繪製、搜尋、路線選擇及導航應用程式,故使用者可容易地自地圖瀏覽狀態存取路線選擇功能。當選擇一特定UI控制項(例如,一路線鍵入按鈕)時,應用程式轉變至路線鍵入狀態6270。在路線鍵入狀態,一些實施例顯示一觸控螢幕小鍵盤,使用者可用該小鍵盤將方位(例如,地址、地方名稱、地方類型等)鍵入至「至」及「自」欄位中以便請求一路線。當使用者鍵入字元時,應用程式保持在狀態6270下且基於新近路線、新近搜尋、類似於針對搜尋鍵入所描述之自動完成之自動完成等來提供建議。若使用者在此階段選擇取消UI控制項,則應用程式不擷取路線即轉移回至狀態6205。 當使用者選擇一路線(例如,藉由鍵入「至」方位及「自」方位)時,應用程式轉變至路線顯示狀態6275。在此狀態,應用程式在地圖視圖上顯示自第一所選擇方位至第二所選擇方位之一或多條路線(例如,藉由在地圖視圖覆疊路線線)。一些實施例自動地選擇該等路線中之第一路線。在應用程式保持在狀態6275下(但修改路線線之顯示以指示對其他路線之選擇)之情況下,使用者可選擇其他路線中之任一者(例如,藉由在未選擇路線上觸按)。另外,當在狀態6275下時,一些實施例之應用程式顯示與路線選擇及導航有關之不同UI控制項,包括指引清單控制項、導航開始控制項及其他控制項。 又,在顯示有路線之地圖上之各種示意動作互動可使應用程式執行對地圖視圖之不同修改(例如,移動瀏覽、旋轉、縮放、修改地圖透視角度等)。在所有示意動作地圖修改操作(例如,作為狀態6215至6245之結果)為可用的情況下,當整合式應用程式在處於路線顯示狀態6275時於地圖顯示上接收到示意動作互動時,應用程式轉變至狀態6210以執行示意動作輸入辨識。亦即,應用程式將示意動作輸入轉譯成類似於上文針對狀態6215至6245所描述之操作的移動瀏覽、旋轉、縮放及/或透視角度改變操作,並具有針對虛擬攝影機移動之類似慣性及回彈特徵。儘管操作6215至6245返回至地圖瀏覽狀態6205,但自路線顯示狀態6275進入之結果操作返回至路線顯示狀態6275。 在一些實施例中,亦可自其他狀態進入路線顯示狀態6275。舉例而言,若使用者在處於狀態6205下時選擇橫幅上之快速路線UI控制項,則應用程式擷取自器件之當前方位至與橫幅相關聯之方位的一或多條路線。另外,一些實施例在狀態6260處之搜尋建議中顯示先前所請求之路線。當使用者選擇此等建議之路線中之一者時,應用程式自狀態6260直接轉變至狀態6275以在地圖上顯示一或多條路線。 自路線顯示狀態6275,應用程式可視使用者所選擇之不同控制項而轉變成各種不同模式。當使用者選擇用以清除路線之UI控制項時,應用程式轉變回至狀態6205以顯示不具有任何路線之地圖。另外,整合式應用程式可自路線顯示狀態6275進入一或多個導航模態。 當狀態6275處所顯示之所選擇路線自器件之當前方位開始且使用者選擇一導航開始控制項時,應用程式轉變至導航狀態6280。在一些實施例中,應用程式顯示自地圖視圖至用於導航之更加沈浸式的3D視圖之一電影轉變。在一些實施例之導航狀態6280內,虛擬攝影機跟隨使用者之沿著所選擇路線的方位以便呈現路線的即將來臨之部分。當路線完成(器件到達目的地方位)抑或使用者選擇用以結束導航之控制項時,應用程式轉變至狀態6205以呈現地圖瀏覽視圖6205。 在一些實施例中,在處於導航模式6280下時,在顯示有路線之地圖上之各種示意動作互動可使應用程式執行對地圖視圖之不同修改(例如,移動瀏覽、旋轉、縮放、修改地圖透視角度等)。在一些實施例中,在導航模式下僅所描述地圖修改操作中之一些操作為可用的。舉例而言,一些實施例允許使用者放大或縮小地圖,但不允許對地圖之任何其他修改。因此,當使用者提供示意動作輸入時,示意動作輸入辨識狀態6210濾除不與縮放操作相關聯之類型示意動作輸入(且隨後,應用程式返回至狀態6280)。當接收與縮放操作相關聯之類型示意動作輸入時,示意動作輸入辨識狀態辨識出此輸入,且應用程式轉變至類似於狀態6225之一狀態,以改變地圖之縮放層級(在一些實施例中,具有慣性計算)。 其他實施例可允許實現不同地圖修改操作。舉例而言,在一些實施例中,所有示意動作地圖修改操作(例如,狀態6215至6245之結果)在處於導航模式下時皆為可用的。一些實施例允許示意動作地圖修改操作之一子集,包括縮放及受限的移動瀏覽操作。在接收與移動瀏覽相關聯之類型示意動作輸入時,一些實施例之移動瀏覽操作將虛擬攝影機(在導航模式下時)向側向移動,接著使虛擬攝影機返回以沿著路線指向。儘管操作6215至6245返回至地圖瀏覽狀態6205,但自導航狀態6280進入之結果操作返回導航狀態6280。 
當狀態6275處所顯示之所選擇路線自不同於器件之當前方位的方位開始(或路線係步行路線)且使用者選擇導航開始控制項時,應用程式轉變至狀態6285處之步進模式或路線檢查模式。在一些實施例中,應用程式一次一個地顯示沿著路線執行之駕控(例如,作為導航標誌)。藉由提供對駕控的示意動作輸入(例如,撥動示意動作),使用者可在處於路線檢查模式下時檢視不同駕控。該等駕控覆疊於一地圖上,且路線之至少一部分顯示於該地圖中。 如在路線顯示模式下一樣,地圖上之各種示意動作互動可使應用程式執行對地圖視圖之不同修改(例如,移動瀏覽、旋轉、縮放、修改地圖透視角度等)。當整合式應用程式在處於步進模式6285下時接收到地圖顯示上之示意動作互動時,應用程式轉變至狀態6210以執行示意動作輸入辨識,同時所有示意動作地圖修改操作(例如,狀態6215至6245之結果)為可用的。亦即,應用程式將示意動作輸入轉譯成類似於上文針對狀態6215至6245所描述之操作的移動瀏覽、旋轉、縮放及/或透視角度改變操作,並具有針對虛擬攝影機移動之類似慣性及回彈特徵。儘管操作6215至6245返回地圖瀏覽狀態6205,但自步進模式6285進入之結果操作返回至步進模式6285。 此外,在一些實施例中,示意動作輸入辨識辨識出在所顯示駕控上的至少一類型示意動作輸入,以便在駕控之間切換。當在所顯示之駕控上(而非在地圖視圖上)接收到一特定類型示意動作輸入(例如,撥動示意動作)時,應用程式轉變至用於改變所顯示之駕控之狀態(未圖示),接著返回狀態6285。 當整合式應用程式在處於步進狀態6285下時在所顯示之地圖上接收到示意動作互動時,應用程式轉變至狀態6210以執行示意動作輸入辨識,同時所有示意動作地圖修改操作(例如,狀態6215至6245之結果)為可用的。當修改操作完成時,應用程式返回至狀態6285。當使用者選擇控制項以結束經由駕控之步進時,應用程式轉變至狀態6205以呈現地圖瀏覽視圖。 另外,在一些實施例中,應用程式可自步進模式6285轉變至自動步進狀態6290。當使用者在應用程式處於狀態6285下時選擇一方位追蹤控制項時,應用程式轉變至自動步進模式6290,其係一不同導航模態。當在一些實施例之自動步進模式下時,整合式地圖繪製、搜尋及導航應用程式顯示器件之方位最接近於的駕控(例如,如按執行駕控所在之接合點所量測)。當器件移動(例如,沿著路線)至較接近於一不同駕控之一方位時,自動步進模式自動地顯示該不同駕控。當使用者取消選擇方位追蹤控制項時,應用程式轉變回至步進模式6285。當使用者在處於自動步進狀態6290下時選擇用以結束導航之控制項時,應用程式轉變至狀態6205以呈現地圖瀏覽視圖。 如在步進模式6285下一樣,地圖上之各種示意動作互動可使應用程式執行對地圖視圖之不同修改(例如,移動瀏覽、旋轉、縮放、修改地圖透視角度等)。當整合式應用程式在處於自動步進模式6290下時接收到地圖顯示上之示意動作互動時,應用程式轉變至狀態6210以執行示意動作輸入辨識,同時所有示意動作地圖修改操作(例如,狀態6215至6245之結果)為可用的。亦即,應用程式將示意動作輸入轉譯成類似於上文針對狀態6215至6245所描述之操作的移動瀏覽、旋轉、縮放及/或透視角度改變操作,並具有針對虛擬攝影機移動之類似慣性及回彈特徵。儘管操作6215至6245返回至地圖瀏覽狀態6205,但自自動步進模式6290進入之結果操作返回至自動步進模式6290。另外,在使用者將地圖移動瀏覽一特定距離時,一些實施例自動地關閉方位追蹤控制項,在該情況下,應用程式返回至步進模式狀態6285而非自動步進狀態6290。VI. 
電子系統 上述特徵及應用程式中之許多者係實施為經指定為記錄於一電腦可讀儲存媒體(亦被稱為電腦可讀媒體)上之指令集的軟體程序。當此等指令由一或多個計算或處理單元(例如,一或多個處理器、處理器之核心或其他處理單元)執行時,該等指令使該(該等)處理單元執行該等指令中所指示之動作。電腦可讀媒體之實例包括(但不限於)CD-ROM、隨身碟、隨機存取記憶體(RAM)晶片、硬碟機、可抹除可程式化唯讀記憶體(EPROM)、電可抹除可程式化唯讀記憶體(EEPROM)等。電腦可讀媒體不包括以無線方式或經由有線連接傳遞之載波及電子信號。 在本說明書中,術語「軟體」意在包括可讀取至記憶體中以供處理器處理之駐留於唯讀記憶體中之韌體或儲存於磁性儲存器中的應用程式。又,在一些實施例中,多個軟體發明可實施為一較大程式之子部分,同時仍為相異軟體發明。在一些實施例中,多個軟體發明亦可實施為單獨程式。最後,一起實施此處所描述之軟體發明之單獨程式之任何組合在本發明之範疇內。在一些實施例中,軟體程式在經安裝以在一或多個電子系統上操作時定義執行且實行軟體程式之操作的一或多個特定機器實施。A. 行動器件 一些實施例之多模式地圖繪製應用程式在諸如智慧型手機(例如,iPhone®)及平板電腦(例如,iPad®)之行動器件上操作。圖63為此類行動計算器件之架構6300之一實例。行動計算器件之實例包括智慧型手機、平板電腦、膝上型電腦等。如所展示,行動計算器件6300包括一或多個處理單元6305、一記憶體介面6310及一周邊器件介面6315。 周邊器件介面6315係耦接至各種感測器及子系統,包括攝影機子系統6320、無線通信子系統6325、音訊子系統6330、I/O子系統6335等。周邊器件介面6315允許實現處理單元6305與各種周邊器件之間的通信。舉例而言,定向感測器6345 (例如,迴轉儀)及加速度感測器6350 (例如,加速度計)係耦接至周邊器件介面6315以促進定向及加速度功能。 攝影機子系統6320係耦接至一或多個光學感測器6340 (例如,電荷耦合器件(CCD)光學感測器、互補金屬氧化物半導體(CMOS)光學感測器等)。與光學感測器6340耦接之攝影機子系統6320促進攝影機功能,諸如影像及/或視訊資料捕獲。無線通信子系統6325用來促進通信功能。在一些實施例中,無線通信子系統6325包括射頻接收器及傳輸器,及光學接收器及傳輸器(圖63中未圖示)。一些實施例之此等接收器及傳輸器經實施以在諸如GSM網路、Wi-Fi網路、藍芽網路等之一或多個通信網路上操作。音訊子系統6330係耦接至一揚聲器以輸出音訊(例如,輸出語音導航指令)。另外,音訊子系統6330係耦接至一麥克風以促進語音啟用之功能,諸如語音辨識(例如,用於搜尋)、數位記錄等。 I/O子系統6335涉及在輸入/輸出周邊器件(諸如顯示器、觸控螢幕等)與處理單元6305之資料匯流排之間經由周邊器件介面6315的傳送。I/O子系統6335包括觸控螢幕控制器6355及其他輸入控制項6360以促進輸入/輸出周邊器件與處理單元6305之資料匯流排之間的傳送。如所展示,觸控螢幕控制器6355係耦接至觸控螢幕6365。觸控螢幕控制器6355使用多個觸敏技術中之任一者來偵測觸控螢幕6365上之接觸及移動。其他輸入控制器6360係耦接至其他輸入/控制器件,諸如一或多個按鈕。一些實施例包括可偵測接近觸碰互動而非觸碰互動或除觸碰互動外亦偵測接近觸碰互動之接近觸碰敏感螢幕及對應控制器。 記憶體介面6310係耦接至記憶體6370。在一些實施例中,記憶體6370包括揮發性記憶體(例如,高速隨機存取記憶體)、非揮發性記憶體(例如,快閃記憶體)、揮發性記憶體與非揮發性記憶體之一組合,及/或任何其他類型記憶體。如圖63中所說明,記憶體6370儲存作業系統(OS) 6372。OS 6372包括用於處置基本系統服務及用於執行硬體相依任務之指令。 記憶體6370亦包括:用以促進與一或多個額外器件之通信之通信指令6374;用以促進圖形使用者介面處理之圖形使用者介面指令6376;用以促進影像相關處理及功能之影像處理指令6378;用以促進輸入相關(例如,觸碰輸入)程序及功能之輸入處理指令6380;用以促進音訊相關程序及功能之音訊處理指令6382;及用以促進攝影機相關程序及功能之攝影機指令6384。上文所描述之指令僅為例示性的,且在一些實施例中,記憶體6370包括額外及/或其他指令。舉例而言,用於智慧型手機之記憶體可包括用以促進電話相關程序及功能之電話指令。另外,記憶體可包括用於多模式地圖繪製應用程式之指令。以上經識別指令不必實施為單獨軟體程式或模組。行動計算器件之各種功能可實施於硬體中及/或軟體中(包括一或多個信號處理及/或特殊應用積體電路中)。 
雖然將圖63中所說明之組件展示為單獨組件,但一般熟習此項技術者將認識到,兩個或兩個以上組件可整合至一或多個積體電路中。另外,兩個或兩個以上組件可由一或多個通信匯流排或信號線耦接在一起。又,雖然已將該等功能中之許多者描述為由一個組件執行,但一般熟習此項技術者將認識到,關於圖63所描述之該等功能可分成兩個或兩個以上積體電路。B. 電腦系統 圖64概念性地說明本發明之一些實施例實施於之電子系統6400之另一實例。電子系統6400可為電腦(例如,桌上型電腦、個人電腦、平板電腦等)、電話、PDA或任何其他種類之電子或計算器件。此電子系統包括各種類型電腦可讀媒體及用於各種其他類型電腦可讀媒體之介面。電子系統6400包括匯流排6405、(多個)處理單元6410、圖形處理單元(GPU) 6415、系統記憶體6420、網路6425、唯讀記憶體6430、永久儲存器件6435、輸入器件6440及輸出器件6445。 匯流排6405共同表示以通信方式連接電子系統6400之眾多內部器件之所有系統、周邊及晶片組匯流排。舉例而言,匯流排6405以通信方式將該(該等)處理單元6410與唯讀記憶體6430、GPU 6415、系統記憶體6420及永久儲存器件6435連接。 自此等各種記憶體單元,該(該等)處理單元6410擷取用以執行之指令及用以處理之資料以便執行本發明之程序。在不同實施例中,該(該等)處理單元可為單一處理器或多核心處理器。一些指令被傳遞至GPU 6415且由GPU 6415執行。GPU 6415可卸載各種計算或補充由該(該等)處理單元6410提供之影像處理。在一些實施例中,可使用CoreImage之核心遮蔽語言來提供此功能性。 唯讀記憶體(ROM) 6430儲存該(該等)處理單元6410及電子系統之其他模組所需之靜態資料及指令。另一方面,永久儲存器件6435係讀取及寫入記憶體器件。此器件係即使在電子系統6400關閉時仍儲存指令及資料之非揮發性記憶體單元。本發明之一些實施例使用大容量儲存器件(諸如,磁性或光學碟片及其對應碟機、整合式快閃記憶體)作為永久儲存器件6435。 其他實施例使用抽取式儲存器件(諸如軟磁碟、快閃記憶體器件等及其對應磁碟機)作為永久儲存器件。類似於永久儲存器件6435,系統記憶體6420係讀取及寫入記憶體器件。然而,不同於儲存器件6435,系統記憶體6420係揮發性讀取及寫入記憶體,諸如隨機存取記憶體。系統記憶體6420儲存處理器在執行階段需要之指令及資料中之一些。在一些實施例中,本發明之程序係儲存於系統記憶體6420、永久儲存器件6435及/或唯讀記憶體6430中。自此等各種記憶體單元,該(該等)處理單元6410擷取用以執行之指令及用以處理之資料以便執行一些實施例之程序。 匯流排6405亦連接至輸入器件6440及輸出器件6445。該等輸入器件6440使使用者能夠傳達資訊至電子系統及選擇給電子系統之命令。該等輸入器件6440包括文數字鍵盤及指標器件(亦被稱作「游標控制器件」)、攝影機(例如,網路攝影機)、用於接收語音命令之麥克風或類似器件等。該等輸出器件6445顯示由電子系統產生之影像或以其他方式輸出資料。該等輸出器件6445包括印表機及顯示器件(諸如陰極射線管(CRT)或液晶顯示器(LCD)),以及揚聲器或類似音訊輸出器件。一些實施例包括充當輸入器件及輸出器件兩者之器件,諸如觸控螢幕。 最後,如圖64中所展示,匯流排6405亦經由一網路配接器(未圖示)將電子系統6400耦接至網路6425。以此方式,電腦可為電腦之網路(諸如區域網路(「LAN」)、廣域網路(「WAN」)或企業內部網路)或網路之網路(諸如網際網路)之一部分。可結合本發明使用電子系統6400之任何或全部組件。 一些實施例包括電子組件,諸如微處理器、將電腦程式指令儲存於機器可讀或電腦可讀媒體中之儲存器及記憶體(被替代地稱為電腦可讀儲存媒體、機器可讀媒體或機器可讀儲存媒體)。此等電腦可讀媒體之一些實例包括RAM、ROM、唯讀光碟(CD-ROM)、可記錄光碟(CD-R)、可重寫光碟(CD-RW)、唯讀數位影音光碟(例如,DVD-ROM、雙層DVD-ROM)、多種可記錄/可重寫DVD (例如,DVD-RAM、DVD-RW、DVD+RW等)、快閃記憶體(例如,SD卡、迷你SD卡、微型SD卡等)、磁性及/或固態硬碟機、唯讀且可記錄Blu-Ray®光碟、超高密度光碟、任何其他光學或磁性媒體,及軟磁碟。電腦可讀媒體可儲存可由至少一處理單元執行且包括用於執行各種操作之指令集的電腦程式。電腦程式或電腦程式碼之實例包括機器碼(諸如編譯器所產生之機器碼),及包括由電腦、電子組件或微處理器使用解譯器執行之較高層級程式碼之檔案。 
雖然以上論述主要參考執行軟體之微處理器或多核心處理器,但一些實施例係藉由一或多個積體電路(諸如特殊應用積體電路(ASIC)或場可程式化閘陣列(FPGA))執行。在一些實施例中,此等積體電路執行儲存於電路本身上之指令。另外,一些實施例執行儲存於可程式化邏輯器件(PLD)、ROM或RAM器件中之軟體。 如本說明書及本申請案之任何請求項中所使用,術語「電腦」、「伺服器」、「處理器」及「記憶體」全部指代電子器件或其他技術器件。此等術語排除人員或人員之群組。對於本說明書之目的,術語顯示意謂顯示於電子器件上。如本說明書及本申請案之任何請求項中所使用,術語「電腦可讀媒體」、「多個電腦可讀媒體」及「機器可讀媒體」完全限制於以電腦可讀之形式儲存資訊之有形實體物件。此等術語排除任何無線信號、有線下載信號及任何其他暫時信號。VII. 地圖服務環境 各種實施例可在一地圖服務作業環境中操作。圖65根據一些實施例說明一地圖服務作業環境。地圖服務6530 (亦被稱為地圖繪製服務)可為經由各種通信方法及協定而與地圖服務6530通信之一或多個用戶端器件6502a至6502c提供地圖服務。在一些實施例中,地圖服務6530提供地圖資訊及其他地圖相關資料,諸如二維地圖影像資料(例如,利用衛星成像的道路之鳥瞰圖)、三維地圖影像資料(例如,具有諸如建築物之三維特徵之可遍歷地圖)、路線及指引計算(例如,渡口路線計算或針對行人之在兩個點之間的指引)、即時導航資料(例如,二維或三維之轉向提示視覺導航資料)、定位資料(例如,用戶端器件當前所在之處)及其他地理資料(例如,無線網路涵蓋範圍、天氣、交通資訊或附近興趣點)。在各種實施例中,地圖服務資料可包括用於不同國家或地區之區域化標籤。區域化標籤可用以在用戶端器件上以不同語言呈現地圖標籤(例如,街道名稱、城市名稱、興趣點)。用戶端器件6502a至6502c可藉由獲得地圖服務資料來利用此等地圖服務。用戶端器件6502a至6502c可實施用以處理地圖服務資料之各種技術。用戶端器件6502a至6502c可接著將地圖服務提供至各種實體,包括(但不限於)使用者、內部軟體或硬體模組及/或在用戶端器件6502a至6502c外之其他系統或器件。 在一些實施例中,地圖服務係由分散式計算系統中之一或多個節點實施。每一節點可被指派地圖服務之一或多個服務或組件。一些節點可被指派地圖服務之相同地圖服務或組件。在一些實施例中,負載平衡節點將存取或請求分散至地圖服務內之其他節點。在一些實施例中,地圖服務係實施為單一系統,諸如單一伺服器。伺服器內之不同模組或硬體器件可實施由地圖服務提供之各種服務中之一或多者。 在一些實施例中,地圖服務藉由產生各種格式之地圖服務資料來提供地圖服務。在一些實施例中,一種格式的地圖服務資料為地圖影像資料。地圖影像資料將影像資料提供至一用戶端器件,使得該用戶端器件可處理影像資料(例如,將影像資料顯現及/或顯示為二維或三維地圖)。地圖影像資料(二維抑或三維的)可指定一或多個地圖底圖。地圖底圖可為較大地圖影像之一部分。將一地圖之地圖底圖組裝在一起產生原始地圖。底圖可根據地圖影像資料、路線選擇或導航資料或任何其他地圖服務資料產生。在一些實施例中,地圖底圖係基於點陣之地圖底圖,其中底圖大小可為大於及小於通常所使用之256像素乘以256像素底圖之任何大小。基於點陣之地圖底圖可編碼為許多標準數位影像表示,該等表示包括(但不限於)點陣圖(.bmp)、圖形交換格式(.gif)、聯合照相專家群(.jpg、.jpeg等)、攜帶型網路圖形(.png)或帶標影像檔案格式(.tiff)。在一些實施例中,地圖底圖係基於向量之地圖底圖,使用向量圖形(包括(但不限於)可縮放向量圖形(.svg)或繪圖檔案(.drw))進行編碼。一些實施例亦包括具有向量及點陣資料之組合之底圖。與地圖底圖有關之後設資料或其他資訊亦可包括於地圖底圖內或與地圖底圖包括在一起,從而將更多地圖服務資料提供至用戶端器件。在各種實施例中,利用各種標準及/或協定對地圖底圖進行編碼以供傳輸,該等各種標準及/或協定中之一些描述於下文之實例中。 在各種實施例中,地圖底圖可視縮放層級而由具不同解析度之影像資料建構。舉例而言,對於低縮放層級(例如,世界或全球視圖),地圖或影像資料之解析度不必與高縮放層級(例如,城市或街道層級)下之解析度一樣高。舉例而言,當在全球視圖中時,可能不必顯現街道層級人為構造(artifact),因為此等物件將太小而可在許多情況下忽略。 
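The raster tiles described above (commonly 256×256 pixels, assembled into the full map) imply a tile-addressing scheme. The standard Web Mercator "slippy map" formula below is one widely used scheme, offered as illustrative context rather than as the patent's own method:

```python
import math

def lonlat_to_tile(lon_deg, lat_deg, zoom):
    """Indices of the 256x256 Web Mercator tile containing a coordinate
    at the given zoom level (zoom 0 = a single tile covering the world)."""
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat = math.radians(lat_deg)
    y = int((1.0 - math.asinh(math.tan(lat)) / math.pi) / 2.0 * n)
    return x, y
```

Note how lower zoom levels map large areas onto a single tile, which echoes the text's point that world-level views need far less resolution than street-level views.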
在一些實施例中,地圖服務在編碼底圖以供傳輸之前執行用以分析地圖底圖之各種技術。此分析可最佳化用於用戶端器件及地圖服務兩者之地圖服務效能。在一些實施例中,根據基於向量之圖形技術分析地圖底圖之複雜性,且利用複雜及不複雜之層來建構地圖底圖。亦可針對可顯現為影像紋理之一般影像資料或圖案來分析地圖底圖,且藉由依靠影像遮罩來建構地圖底圖。在一些實施例中,地圖底圖中之基於點陣之影像資料含有某些遮罩值,該等值與一或多個紋理相關聯。一些實施例亦針對可與含有樣式識別符之某些地圖樣式相關聯之經指定特徵而分析地圖底圖。 在一些實施例中,其他地圖服務產生與地圖底圖分離的依賴各種資料格式之地圖服務資料。舉例而言,提供定位資料之地圖服務可利用符合定位服務協定之資料格式,該等定位服務協定諸如(但不限於)無線電資源定位服務協定(RRLP)、用於分碼多重存取(CDMA)之TIA 801、無線電資源控制(RRC)位置協定或LTE定位協定(LPP)。實施例亦可自用戶端器件接收或請求資料,其識別器件能力或屬性(例如,硬體規格或作業系統版本)或通信能力(例如,如根據無線信號強度或有線或無線網路類型判定之器件通信頻寬)。 地圖服務可自內部或外部來源獲得地圖服務資料。舉例而言,地圖影像資料中所使用之衛星影像可自外部服務或內部系統、儲存器件或節點獲得。其他實例可包括(但不限於)GPS輔助伺服器、無線網路涵蓋範圍資料庫、商業或個人目錄、天氣資料、政府資訊(例如,建設工程更新或道路名稱變化)或交通報告。地圖服務之一些實施例可更新地圖服務資料(例如,無線網路涵蓋範圍)以用於分析來自用戶端器件之未來請求。 地圖服務之各種實施例回應於用戶端器件對地圖服務之請求。此等請求可為對特定地圖或地圖之部分之請求。一些實施例將對地圖之請求格式化為對特定地圖底圖之請求。在一些實施例中,請求亦為地圖服務供應開始方位(或當前方位)及目的地方位以用於路線計算。用戶端器件亦可請求地圖服務顯現資訊,諸如地圖紋理或樣式表。在至少一些實施例中,請求亦為實施轉向提示導航之一系列請求中之一者。對其他地理資料之請求可包括(但不限於)當前方位、無線網路涵蓋範圍、天氣、交通資訊或附近興趣點。 在一些實施例中,地圖服務分析用戶端器件請求以最佳化器件或地圖服務操作。舉例而言,地圖服務可辨識出用戶端器件之方位在通信不良(例如,弱無線信號)之一區域中,且在發生通信損失之情況下發送更多地圖服務資料以供應用戶端器件或發送指令以利用不同用戶端硬體(例如,定向感測器)或軟體(例如,利用無線定位服務或Wi-Fi定位而非基於GPS之服務)。在另一實例中,地圖服務可分析用戶端器件對基於向量之地圖影像資料之請求,且根據影像之複雜性判定基於點陣之地圖資料更好地最佳化地圖影像資料。其他地圖服務之實施例可對用戶端器件請求執行類似分析,且因而,以上實例不欲為限制性的。 用戶端器件之各種實施例(例如,用戶端器件6502a至6502c)係實施於不同攜帶型多功能器件類型上。用戶端器件6502a至6502c經由各種通信方法及協定來利用地圖服務6530。在一些實施例中,用戶端器件6502a至6502c自地圖服務6530獲得地圖服務資料。用戶端器件6502a至6502c請求或接收地圖服務資料。用戶端器件6502a至6502c接著處理地圖服務資料(例如,顯現及/或顯示該資料)且可將該資料發送至器件上之另一軟體或硬體模組或發送至一外部器件或系統。 根據一些實施例,用戶端器件實施用以顯現及/或顯示地圖之技術。可以各種格式(諸如上文所描述之地圖底圖)請求或接收此等地圖。一用戶端器件可在二維或三維視圖中顯現一地圖。一用戶端器件之一些實施例顯示一經顯現地圖,且允許提供輸入之使用者、系統或器件操縱地圖中之虛擬攝影機,從而根據虛擬攝影機之位置、定向及視場來改變該地圖顯示。實施各種形式及輸入器件以操縱虛擬攝影機。在一些實施例中,經由某些單一或組合示意動作(例如,觸碰並保持或撥動)之觸碰輸入操縱虛擬攝影機。其他實施例允許操縱器件之實體方位來操縱虛擬攝影機。舉例而言,可將一用戶端器件自其當前位置向上傾斜以操縱虛擬攝影機向上旋轉。在另一實例中,可將一用戶端器件自其當前位置向前傾斜以使虛擬攝影機向前移動。可實施至用戶端器件之其他輸入器件,包括(但不限於)聽覺輸入(例如,說出的詞語)、實體鍵盤、滑鼠及/或搖桿。 一些實施例提供對虛擬攝影機操縱之各種視覺回饋,諸如在自二維地圖視圖轉變至三維地圖視圖時顯示可能的虛擬攝影機操縱之動畫。一些實施例亦允許進行輸入以選擇一地圖特徵或物件(例如,一建築物)且醒目提示該物件,從而產生維持虛擬攝影機的三維空間感的模糊效應。 
一些實施例中,一用戶端器件實施一導航系統(例如,轉向提示導航)。導航系統提供可向使用者顯示之指引或路線資訊。用戶端器件之一些實施例向地圖服務請求指引或一路線計算。用戶端器件可自地圖服務接收地圖影像資料及路線資料。在一些實施例中,一用戶端器件實施一轉向提示導航系統,其基於自地圖服務及/或其他定位系統(諸如全球定位衛星(GPS)系統)接收之方位資訊及路線資訊而提供即時路線及指引資訊。用戶端器件可顯示反映用戶端器件之當前方位之地圖影像資料且即時地更新該地圖影像資料。一導航系統可提供聽覺或視覺指引以沿某一路線行進。 根據一些實施例,實施一虛擬攝影機以操縱導航地圖資料。用戶端器件之一些實施例允許器件將虛擬攝影機顯示定向調整為偏向路線目的地。一些實施例亦允許虛擬攝影機模擬虛擬攝影機之慣性運動地通過轉彎處。 用戶端器件實施各種技術來利用來自地圖服務之地圖服務資料。一些實施例實施用以最佳化二維及三維地圖影像資料之顯現的一些技術。在一些實施例中,用戶端器件於本端儲存顯現資訊。舉例而言,用戶端儲存含有樣式識別符之影像資料的樣式表,其提供顯現指引。在另一實例中,可儲存一般影像紋理以減少自地圖服務傳送之地圖影像資料之量。在不同實施例中,用戶端器件實施用以顯現二維及三維地圖影像資料之各種模型化技術,該等技術之實例包括(但不限於):根據二維建築物佔據面積資料產生三維建築物;模型化二維及三維地圖物件以判定用戶端器件通信環境;產生用以判定是否可自某一虛擬攝影機位置看到地圖標籤之模型;及產生用以使地圖影像資料之間的轉變平滑之模型。在一些實施例中,用戶端器件亦用某些技術對地圖服務資料進行排序或排定優先次序。舉例而言,用戶端器件偵測虛擬攝影機之運動或速率,若其超過某些臨限值,則針對某些區域載入並顯現細節較少的影像資料。其他實例包括:顯現基於向量之曲線以作為一系列點;預先載入與地圖服務通信不良之區域之地圖影像資料;基於顯示縮放層級來調適紋理;或根據複雜性顯現地圖影像資料。 在一些實施例中,用戶端器件利用與地圖底圖分離的各種資料格式進行通信。舉例而言,一些用戶端器件實施有輔助之全球定位衛星(A-GPS)且與利用符合定位服務協定之資料格式之定位服務通信,該等定位服務協定諸如(但不限於)無線電資源定位服務協定(RRLP)、用於分碼多重存取(CDMA)之TIA 801、無線電資源控制(RRC)位置協定或LTE定位協定(LPP)。用戶端器件亦可直接接收GPS信號。實施例亦可發送資料(在地圖服務請求或不請求的情況下),其識別用戶端器件之能力或屬性(例如,硬體規格或作業系統版本)或通信能力(例如,如根據無線信號強度或有線或無線網路類型判定之器件通信頻寬)。 圖65說明用於地圖服務6530及用戶端器件6502a至6502c之作業環境6500的一個可能實施例。在一些實施例中,器件6502a、6502b及6502c經由一或多個有線或無線網路6510進行通信。舉例而言,無線網路6510 (諸如蜂巢式網路)可藉由使用閘道器6514而與廣域網路(WAN) 6520 (諸如網際網路)通信。在一些實施例中,閘道器6514提供一封包導向式行動資料服務(諸如整合封包無線電服務(GPRS)),或允許無線網路將資料傳輸至其他網路(諸如廣域網路6520)之其他行動資料服務。同樣地,存取器件6512 (例如,IEEE 802.11g無線存取器件)提供對WAN 6520之通信存取。器件6502a及6502b可為能夠與地圖服務通信之任何攜帶型電子或計算器件。器件6502c可為能夠與地圖服務通信之任何非攜帶型電子或計算器件。 在一些實施例中,語音及資料通信均可經由無線網路6510及存取器件6512建立。舉例而言,器件6502a可經由無線網路6510、閘道器6514及WAN 6520 (例如,使用傳輸控制協定/網際網路協定(TCP/IP)或使用者資料報協定(UDP))來撥打及接聽電話(例如,使用網際網路語音通信協定(VoIP)協定),接收並發送電子郵件訊息(例如,使用簡易郵件傳送協定(SMTP)或郵局通訊協定3 (POP3)),且擷取電子文件及/或串流(諸如網頁、相片及視訊)。同樣地,在一些實施中,器件6502b及6502c可經由存取器件6512及WAN 6520來撥打及接聽電話,發送並接收電子郵件訊息,且擷取電子文件。在各種實施例中,所說明用戶端器件中之任一者可使用根據一或多個安全協定(諸如安全通訊端層(SSL)協定或傳輸層安全性(TLS)協定)建立之持續性連接來與地圖服務6530及/或其他服務6550通信。 
器件6502a及6502b亦可藉由其他手段來建立通信。舉例而言,無線器件6502a可經由無線網路6510來與其他無線器件(例如,其他器件6502b、行動電話等)通信。同樣地,器件6502a及6502b可藉由使用一或多個通信子系統(諸如來自Bluetooth Special Interest Group, Inc. (Kirkland, Washington)的Bluetooth®通信)來建立同級間通信6540 (例如,個人區域網路)。器件6502c亦可建立與器件6502a或6502b之同級間通信(未圖示)。亦可實施其他通信協定及拓撲。器件6502a及6502b亦可自GPS衛星6560接收全球定位衛星(GPS)信號。 器件6502a、6502b及6502c可經由一或多個有線及/或無線網路6510或6512而與地圖服務6530通信。舉例而言,地圖服務6530可將地圖服務資料提供至顯現器件6502a、6502b及6502c。地圖服務6530亦可與其他服務6550通信以獲得資料來實施地圖服務。地圖服務6530及其他服務6550亦可自GPS衛星6560接收GPS信號。 在各種實施例中,地圖服務6530及/或其他服務6550經組態以處理來自用戶端器件中之任一者之搜尋請求。搜尋請求可包括(但不限於)對商業、地址、住宅方位、興趣點或其某一組合之查詢。地圖服務6530及/或其他服務6550可經組態以傳回與多種參數有關之結果,該等參數包括(但不限於)鍵入至地址列或其他文字鍵入欄位中之方位(包括縮寫及/或其他速記記法)、當前地圖視圖(例如,使用者可在停留在一個方位中時在多功能器件上檢視另一方位)、使用者之當前方位(例如,在當前地圖視圖不包括搜尋結果之情況下)及當前路線(若存在)。在各種實施例中,此等參數可影響基於不同優先權加權的搜尋結果之組成(及/或搜尋結果之排序)。在各種實施例中,被傳回之搜尋結果可為基於特定準則而選擇之結果之子集,該等特定準則包括(但不限於)搜尋結果(例如,一特定興趣點)已被請求之次數、與搜尋結果相關聯之品質之度量(例如,最高的使用者或編輯評論評等)及/或針對搜尋結果之評論之數量(例如,搜尋結果已被評論或評等之次數)。 在各種實施例中,地圖服務6530及/或其他服務6550經組態以提供顯示於用戶端器件上(諸如在地圖繪製應用程式內)之自動完成搜尋結果。舉例而言,自動完成搜尋結果可在使用者於多功能器件上鍵入一或多個搜尋關鍵字時填入畫面之一部分。在一些情況下,此特徵可節約使用者時間,因為在使用者鍵入完整搜尋查詢之前就可顯示所要的搜尋結果。在各種實施例中,自動完成搜尋結果可為由用戶在用戶端器件上發現之搜尋結果(例如,書籤或聯絡人)、由地圖服務6530及/或其他服務6550在別處(例如,自網際網路)發現之搜尋結果及/或該等搜尋結果之某一組合。如在命令之情況下,搜尋查詢中之任一者可由使用者經由語音或經由鍵入而輸入。多功能器件可經組態以在本文中所描述之地圖顯示中之任一者內用圖形顯示搜尋結果。舉例而言,圖釘或其他圖形指示符可將搜尋結果之方位指定為興趣點。在各種實施例中,回應於使用者對此等興趣點中之一者之選擇(例如,觸碰選擇,諸如觸按),多功能器件經組態以顯示關於所選擇興趣點之額外資訊,包括(但不限於)評等、評論或評論片段、營業時間、商店狀態(例如,開門營業、永久關閉等)及/或興趣點之店面之影像。在各種實施例中,此資訊中之任一者可顯示於回應於使用者選擇興趣點而顯示的圖形資訊卡上。 在各種實施例中,地圖服務6530及/或其他服務6550提供用以自用戶端器件6502a至6502c接收回饋之一或多個回饋機構。舉例而言,用戶端器件可將對搜尋結果之回饋提供至地圖服務6530及/或其他服務6550 (例如,指定評等、評論、臨時或永久停業、錯誤等之回饋);此回饋可用以更新關於興趣點之資訊以便在未來提供更準確或更新近之搜尋結果。在一些實施例中,地圖服務6530及/或其他服務6550可將測試資訊提供至用戶端器件(例如,A/B測試)以判定哪些搜尋結果係最佳的。舉例而言,用戶端器件可以隨機間隔接收兩個搜尋結果並向使用者呈現該兩個搜尋結果,且允許使用者指示最佳結果。用戶端器件可將最佳結果報告至地圖服務6530及/或其他服務6550以基於所選擇的測試技術(諸如A/B測試技術,其中將基線對照樣本與多種單變數測試樣本進行比較以便改良結果)來改良未來搜尋結果。 
雖然已參考眾多特定細節描述本發明,但一般熟習此項技術者將認識到,在不脫離本發明之精神之情況下,可以其他特定形式來體現本發明。舉例而言,諸圖中之許多者說明各種觸碰示意動作(例如,觸按、觸按兩下、撥動示意動作、按住不放示意動作等)。然而,可經由不同觸碰示意動作(例如,撥動而非觸按等)或藉由非觸碰輸入(例如,使用游標控制器、鍵盤、觸控板/軌跡墊、接近觸碰敏感螢幕等)來執行所說明操作中之許多者。另外,若干圖概念性地說明了程序。可不以所展示及描述之精確次序來執行此等程序之特定操作。可不按一個連續操作系列來執行特定操作,且在不同實施例中,可執行不同特定操作。此外,程序可使用若干子程序來實施,或實施為較大巨集程序之部分。Numerous details, examples, and embodiments of the invention are set forth and described in the following detailed description. However, it will be apparent to those skilled in the art that the present invention is not limited to the illustrated embodiments and the invention may be practiced without the specific details and examples discussed. Some embodiments of the present invention provide an integrated mapping application that includes a number of useful modalities, including orientation browsing, map searching, route recognition, and route navigation operations. In some embodiments, the application is defined to be executed by a device having one of the touch-sensitive screens that displays the output of the application. In some embodiments, the device has a multi-touch interface for allowing a user to interact with the application by providing touch and gesture input via the screen. Examples of such devices are smart phones (eg, iPhone® sold by Apple Inc., mobile phones operating Android® operating systems, mobile phones operating Windows 8® operating systems, etc.). Several detailed embodiments of the invention are described below. Section I describes the UI controls and map browsing experience provided by the mapping application of some embodiments. Section II then describes the characteristics of the novel search field of the mapping application. Section III then describes a novel UI for presenting different types of detail information about the orientation. Next, Section IV describes several different ways for the user to get directions from the mapping application. 
Section V then describes the different modes of operation of the integrated mapping application of some embodiments. Section VI describes an example electronic system with which some embodiments of the invention are implemented. Finally, Section VII describes a map service operating environment.

I. Map Browsing

A. General Controls

Figure 1 illustrates an example of a device 100 that executes an integrated mapping application of some embodiments of the invention. This application has a novel user interface (UI) design that seamlessly and cohesively integrates the controls for each of its different modalities by using a minimum set of on-screen controls that float on top of the content in order to display as much of the content as possible. Additionally, this cluster adapts to the task at hand, adjusting its contents in an animated fashion when moving between the different modalities (e.g., between browsing, searching, routing, and navigating). This common element with an adaptive nature enables the mapping application to be optimized for different tasks while maintaining a consistent look and interaction model while moving between those tasks. Figure 1 shows three stages 105, 110, and 115 of interaction with the mapping application. The first stage 105 shows the device's UI 120, which includes several icons of several applications in a dock area 125 and on a page of the UI. One of the icons on this page is the icon for the mapping application 130. The first stage shows a user's selection of the mapping application through touch contact with the device's screen at the location of this application on that screen. The second stage 110 shows the device after the mapping application has opened. As shown in this stage, the UI of the mapping application has a start page that, in some embodiments, (1) displays a map of the current location of the device, and (2) has several UI controls arranged in a top bar 140 and as floating controls.
As shown in FIG. 1, the floating controls include a position control 145, a 3D control 150, and a page curl control 155, while the top bar 140 includes a direction control 160, a search field 165, and a bookmark control 170. The direction control 160 opens a page through which the user can request a route to be identified between a starting location and an ending location. As further described below, this control is one of three mechanisms through which the mapping application can be directed to identify and display a route between two locations; the two other mechanisms are (1) a control in an information banner that is displayed for a selected item in the map, and (2) recent routes identified by the device that are displayed in the search field 165. Accordingly, the information banner control and the search field 165 are two UI tools that the application employs to make the transition between the different modalities seamless. In some embodiments, a user can initiate a search by tapping in the search field 165. This directs the application to present an animation that (1) presents an on-screen keyboard and (2) opens a search table full of invaluable completions. This table has some important subtleties. When the search field is tapped and before any terms are edited, or when the search field is empty, the table contains a list of "recents," which in some embodiments are recent searches and route directions that the user has requested. This makes it very easy to quickly bring up recently accessed results. After any edit in the search field, the table is filled with search completions both from local sources (e.g., bookmarks, contacts, recent searches, recent route directions, etc.) and from remote servers. Some embodiments, however, include recent route directions only when the user has not yet typed any text into the search field; once text is typed, the mapping application removes the recent route directions from the search completion table.
Incorporating the user's contact card into the search interface adds additional flexibility to the design. When showing recents, in some embodiments a route from the current location to the user's home is always offered, while in other embodiments that route is offered in contexts that are deemed "appropriate." Also, when the search term matches at least part of an address label (e.g., "ork" for "Work"), in some embodiments the application presents the user's labeled address as one completion in the search table. Together, these behaviors make the search UI a very powerful way to get results onto a map from a variety of sources. In addition to allowing a user to initiate a search, the presence of the text field in the primary map view in some embodiments also allows the user to see the query corresponding to search results on the map and to remove those search results by clearing the query. The bookmark control 170 (e.g., a button) allows locations and routes to be bookmarked by the application. The position control 145 allows the current location of the device to be specifically noted on the map. In some embodiments, once this position control is selected, the application maintains the current location of the device in the center of the map as the device moves. In some embodiments, it can also identify the direction in which the device is currently pointing. The mapping application of some embodiments identifies the location of the device using coordinates (e.g., longitude, altitude, and latitude coordinates) in GPS signals that the device receives at its location. Alternatively or conjunctively, the mapping application uses other methods (e.g., cell tower triangulation) to compute the current location. The 3D control 150 is a control for viewing a map or inspecting a route in three dimensions (3D). The mapping application provides the 3D control as a quick mechanism for getting into and out of 3D.
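The completion behavior described above — recents when the field is empty, then matches drawn from local sources and remote servers once text is typed — can be sketched as follows. The source lists and the matching rule are illustrative assumptions (substring matching is used so that "ork" matches "Work", as in the example in the text):

```python
def search_completions(query, recents, local_sources, remote_results):
    """Fill the search table: recents for an empty query; otherwise
    matching entries from local sources first, then remote, de-duplicated."""
    if not query:
        return list(recents)   # recent searches and route directions
    q = query.lower()
    seen, table = set(), []
    for name in list(local_sources) + list(remote_results):
        key = name.lower()
        if q in key and key not in seen:   # 'ork' matches 'Work'
            seen.add(key)
            table.append(name)
    return table
```

Ranking local sources (bookmarks, contacts, recent searches) ahead of remote results is one plausible ordering; the specification does not fix a precise ranking.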
This control also serves as (1) an indicator that the current view is a 3D view, and (2) an indicator that a 3D perspective is available for a given map view (e.g., a map view that is zoomed out may not have a 3D view available). In some embodiments, the 3D control 150 provides at least three different appearances corresponding to some of these indications. For instance, the 3D control appears gray when the 3D view of the map is unavailable, black when the 3D view is available but the map is in the 2D view, and blue when the map is in the 3D view. In some embodiments, the 3D control has a fourth appearance (e.g., a button showing a building image or shape) when immersive 3D map rendering is available at the given zoom level. Immersive and non-immersive 3D renderings are further described in U.S. Patent Application Serial No. 13/632,035, filed on September 30, 2012, which is incorporated herein by reference.

The page curl control 155 is a control that allows the application to minimize the number of on-screen controls by placing certain less frequently used actions in a secondary UI screen that is accessible through the "page curl" control displayed on the map. In some embodiments, the page curl is permanently displayed on at least some of the map views that the application provides. For instance, in some embodiments, the application permanently displays the page curl on the starting page (shown in the second stage 110) that it provides for allowing a user to browse or search for a location or to identify a route. The page curl indicates the location of another set of controls that are conceptually "behind" the current view. When the page curl control 155 is selected, the application presents an animation that "peels" off the current view to display another view that shows the other set of controls. The third stage 115 illustrates an example of this animation.
As shown in this stage, peeling back the starting page reveals several controls, which in this example are the drop-pin, print, show-traffic, list, standard, satellite, and hybrid controls. In some embodiments, these controls perform the same operations as similar controls do in currently available smartphones, such as iPhones running iOS®. The use of the page curl allows the application to display more of the map while offering an unobtrusive way to access the additional functionality that is provided by the other set of controls. Additionally, in some embodiments the application does not use the page curl in map views where the additional functionality is deemed inappropriate for the task at hand. For instance, in some embodiments, the application does not display this page curl while presenting the map view that is used during navigation. Also, in some embodiments, the third stage 115 illustrates the user dragging a corner or an edge of the page to peel the page. However, in other embodiments, the animation of peeling the page is displayed by simply tapping the page curl control 155 without dragging a corner or an edge.

B. Adaptive Button Cluster

As noted above, the mapping application of some embodiments adaptively adds controls to, and removes controls from, the floating control cluster in order to adapt this cluster to different tasks while maintaining a consistent look and interaction model between those tasks. FIG. 2 illustrates an example of the application adaptively modifying the floating control cluster to add and remove a list view control 235. This example is provided in the context of using the direction control 160 to obtain a route between two locations. It is also provided in terms of six stages 205 to 230 of interaction with the mapping application. The first stage 205 illustrates the selection of the direction control 160.
The second stage 210 next illustrates the selection of a route generation control 240 after the user has entered a starting location and an ending location for the route in the starting field 245 and the ending field 250. The second stage 210 also shows the mapping application displaying several recently used route generation requests below the fields for entering the starting and ending locations. The third stage 215 shows two routes 260 and 261 that the mapping application has identified for the provided starting and ending locations. In some embodiments, the mapping application highlights one of the routes to indicate that it is the route recommended by the application. This stage also illustrates the start of an animation that shows the list view control 235 sliding out from under the 3D control 150. When there is an opportunity to display a list of items (be it a list of instructions in a route, or a list of search results when multiple results are found for a given query), the mapping application of some embodiments displays a list control as one of the floating controls. In some embodiments, tapping the list control brings up a modal list view. Having a modal list view keeps the mapping application simple and keeps the map front and center. In some embodiments, the list view itself is adapted and optimized for the type of list being displayed; for instance, search results are displayed with star ratings (when available), while route steps include instructional arrows. The fourth stage 220 shows the selection of a clear control 255 to clear the identified routes 260 and 261 from the illustrated map. In response to this selection, the routes 260 and 261 are removed from the map, and an animation starts that shows the list control 235 sliding back under the 3D control 150, as illustrated in the fifth stage 225.
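The adaptive add/remove behavior of the list control can be sketched as a tiny cluster model. This is an illustrative sketch only; the class and method names are invented, not the application's code:

```python
class FloatingControls:
    """Illustrative adaptive floating-control cluster (names invented)."""

    def __init__(self):
        self.controls = ["position", "3D", "page curl"]

    def show_list_control(self, has_list):
        # Per the description above: the list control slides out from
        # under the 3D control when a route or multi-result search
        # produces a list, and slides back when the results are cleared.
        if has_list and "list" not in self.controls:
            self.controls.insert(self.controls.index("3D"), "list")
        elif not has_list and "list" in self.controls:
            self.controls.remove("list")
```

The animation itself (sliding from under the 3D control) is of course a rendering concern; the sketch only captures when the control is present in the cluster.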
The sixth stage 230 shows the application UI after the animation has ended and the list control has been removed from the floating control set.

Another floating control that the mapping application of some embodiments uses is a compass. FIG. 3 illustrates an example of the application adaptively modifying the floating control cluster to add and remove a compass 300. This example is provided in the context of using the position control 145 to view the current position and orientation of the device on the map presented by the device. In some embodiments, the position control 145 can cause the mapping application to operate in three different states. For instance, when the position control 145 is not selected, the mapping application displays a map view. Upon receiving a first selection of the position control 145, the mapping application shifts the map to display a region of the map that includes the current location of the device at its center. From then on, the mapping application of some embodiments tracks the current location of the device as the device moves. In some embodiments, the mapping application keeps the current location indicator at the center of the display area and shifts the map as the device moves from one region to another. When the mapping application receives a second selection of the position control 145 while it is maintaining the current location of the device at the center of the displayed region, the mapping application displays a simulated light projection in the map from the identified current position, in the direction that the device is currently facing. When the position control 145 is selected again while the mapping application is displaying the simulated light projection, the mapping application returns to the state it was in before receiving the first selection; that is, the projection disappears and the current position of the device is no longer tracked.
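The three-state cycle of the position control described above can be summarized as a tiny state machine. This is an illustrative sketch of the described behavior, not the application's implementation; the state names are invented:

```python
# States of the position control, cycled by successive selections:
#   "off"      -> plain map view, device location not tracked
#   "tracking" -> map kept centered on the device's current location
#   "heading"  -> tracking, plus a simulated light projection showing
#                 the direction the device currently faces
_NEXT_STATE = {"off": "tracking", "tracking": "heading", "heading": "off"}

def select_position_control(state):
    """Return the new state after the position control is selected."""
    return _NEXT_STATE[state]
```

Three successive selections thus return the control to its initial state, matching the description of the third selection restoring the pre-first-selection state.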
The example illustrated in this figure is provided in terms of five stages 305 to 325 of interaction with the mapping application. The first stage 305 illustrates the mapping application displaying a map region that happens not to include the current location of the device (i.e., no current location indicator is displayed in the map region). The second stage 310 illustrates the position control 145 being selected once. As described above, the first selection of the position control 145 causes the map to shift so as to display a map region that has the current location indicator 326 at its center. The third stage 315 shows the result of selecting the position control 145. Some embodiments identify the current location of the device by using a current location indicator 326. The current location indicator 326 has different appearances in different embodiments. For instance, the current location indicator 326 of some embodiments appears as a colored dot (e.g., a blue dot) on the map. Identifying the current location is useful when the user has explored the displayed map (e.g., through gestural actions) such that the device is no longer displaying the user's current location on the map. The fourth stage 320 illustrates the position control 145 being selected again. In some embodiments, the second selection of the position control 145 causes the application to display a simulated light projection 345 in the map from the identified current position 326, in the direction that the device is currently facing. This projection helps the user identify the direction that the device faces at any time. In some embodiments, this projection always points toward the top of the device (i.e., toward the position along which the search field 165 is positioned while the device is held in portrait orientation). This projection 345 is illustrated in the fifth stage 325.
This stage also shows that, in this mode of operation, the mapping application presents a floating compass 300. This compass serves as an indicator that the user can use to identify the direction of the north pole. In some embodiments, the compass has the shape of two isosceles triangles that abut at their bases, with one of the triangles pointing north (in a direction away from the abutting bases) and having a color (e.g., orange) that differentiates it from the other triangle. As further described below, the compass can also be used to restore a north-up orientation after the user has rotated a 2D or 3D view of the map. In some embodiments, the compass may remain in the map view after the mapping application receives another selection of the position control 145. In some embodiments, the compass does not disappear until the mapping application receives a user input to remove it (e.g., a selection of the compass).

The fifth stage 325 also shows that the map has been rotated to maintain the direction of the projection toward the top of the device. This is because the device now faces a direction different from the one it faced in the previous stage 320. As the device's facing direction changes, the direction of the compass 300 also moves relative to the top of the device; here, the compass has moved to indicate that the device is facing northwest. In some embodiments, the mapping application changes the appearance of the position control 145 once after the first selection and again after the second selection. The fifth stage 325 shows the appearance of the position control 145 after the second selection, which is different from its appearance after the first selection.

C. 2D or 3D

1. 3D Button

In some embodiments, the mapping application can display a location in the map in either a 2D mode or a 3D mode.
This allows a user to view a location in the map in either 2D or 3D. As described above, one of the floating controls is the 3D control 150, which allows a user to view a map or inspect a route in three dimensions (3D). This control also serves as (1) an indicator that the current view is a 3D view, and (2) an indicator that a 3D perspective is available for a given map view (e.g., a map view that is zoomed out may not have a 3D view available). FIG. 4 illustrates how the mapping application of some embodiments provides the 3D control 150 as a quick mechanism for entering a 3D mode in order to view a map location in three dimensions. This figure illustrates this operation in four stages 405 to 420. The first stage 405 illustrates the user selecting the 3D control 150 while viewing a two-dimensional presentation of an area around the user's current location 425. The top bar, the floating controls, and the page curl are not depicted in this figure for simplicity of description. The second stage 410 shows a three-dimensional presentation of the user's current location on the map. As noted above, the mapping application of some embodiments generates the 3D view of the map by rendering the map view from a particular position in a three-dimensional scene that can conceptually be thought of as the position of a virtual camera that is capturing the map view. This rendering is further described below by reference to FIG. 5. The third stage 415 shows the user browsing around the current location by performing a swipe operation (e.g., by dragging a finger across the touch-sensitive screen of the device). This swipe operation changes the 3D map view presented on the device so as to display a new location on the 3D map. This new location is illustrated in the fourth stage 420.
In some embodiments, the mapping application presents a 3D view of the map while it is operating in a navigation mode (i.e., while it is presenting a turn-by-turn navigation view). To provide a visual distinction between the 3D view of the map during navigation and the 3D view of the map during map browsing, the mapping application of some embodiments uses different style sheets that define the rendered graphics differently. For instance, the mapping application of some embodiments uses a style sheet that defines gray for buildings, white for roads, and rounded corners for blocks in the 3D view during map browsing, and a style sheet that defines white for buildings, gray for roads, and sharp corners for blocks in the 3D view during navigation. In some embodiments, the mapping application applies these style sheets to the same map tiles for a given region of the map. In other embodiments, the mapping application applies these style sheets to different map tiles (e.g., map tiles, navigation tiles, etc.) for the given region. The use of style sheets for rendering maps is further described in the above-incorporated U.S. Patent Application Serial No. 13/632,035.

2. Virtual Camera

FIG. 5 presents a simplified example to illustrate the concept of a virtual camera 505. When rendering a 3D map, a virtual camera is a conceptualization of the position in the 3D map scene from which the device renders the scene to generate a 3D view of the map. FIG. 5 illustrates a location in a 3D map scene 535 that includes four objects: two buildings and two intersecting roads.
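The browse-versus-navigation styling contrast described above can be sketched as a pair of style-sheet dictionaries. The structure and values here are assumptions made for illustration, not actual asset data from the application:

```python
# Illustrative style sheets, per the contrast described above.
STYLE_SHEETS = {
    "browsing":   {"building": "gray",  "road": "white", "corners": "rounded"},
    "navigation": {"building": "white", "road": "gray",  "corners": "sharp"},
}

def style_for(mode):
    """Return the style sheet used to render the 3D map in `mode`."""
    return STYLE_SHEETS[mode]
```

Keeping the style definitions separate from the map tiles is what lets the same tile data be rendered differently in the two modes, as the first of the two embodiments above describes.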
To illustrate the virtual camera concept, this figure illustrates three scenarios, each of which corresponds to a different virtual camera position (i.e., a different rendering position) and a different resulting view that is displayed on the device. The first stage 510 shows the virtual camera at a first perspective position, pointing downward toward the 3D scene at a first angle (e.g., -30°). In this position, the camera points to a location that may be a stationary position of the device or of the location being explored, or a moving position in front of the device's moving location if the map is used for navigation. In some embodiments, the default position of the camera is at a particular orientation relative to the current location, but this orientation can be modified when the user rotates the map. Rendering the 3D scene from the first perspective position results in the 3D map view 525. The second stage 515 shows the virtual camera at a different, second perspective position, pointing downward at a larger second angle (e.g., -45°). Rendering the 3D scene from this angle results in the 3D map view 530, in which the buildings and the roads are smaller than in their illustration in the first map view 525. The third stage 520 shows the virtual camera in a top-down view, looking straight down on a location on a 2D map 545 that corresponds to the location in the 3D map scene 535 that was used to render the 3D views 525 and 530. The scene rendered from this perspective is the 2D map view 540. Unlike the 3D rendering operations of the first and second stages, which in some embodiments are perspective 3D rendering operations, the rendering operation in the third stage is relatively simple, as it only needs to crop a portion of the 2D map that is identified by the zoom level specified by the application or the user.
Thus, the virtual camera characterization in this situation somewhat unnecessarily complicates the description of the application's operation, since cropping a portion of a 2D map is not a perspective rendering operation. Accordingly, as in the third stage 520, when the camera switches from the 3D perspective view to the 2D top-down view, the mapping application of some embodiments switches from rendering a 3D scene from a particular perspective to cropping a 2D scene. This is because, in these embodiments, the application is designed to use a simplified rendering operation that is easier and that does not generate unnecessary perspective artifacts. In other embodiments, however, the mapping application uses a perspective rendering operation to render a 3D scene from a top-down virtual camera position. In these embodiments, the resulting 2D map view is somewhat different from the map view 540 illustrated in the third stage 520, because any object that is away from the center of the view is distorted, with the distortion being greater the farther the object is from the center of the view.

In different embodiments, the virtual camera 505 moves along different trajectories or arcs. Two such trajectories 550 and 555 are illustrated in FIG. 5. In both of these trajectories, the camera moves in an arc and rotates more downward as the camera moves upward along the arc. The trajectory 555 differs from the trajectory 550 in that, in the trajectory 555, the camera also moves backward from the current location as it moves up along the arc. While moving along one of the arcs, the camera rotates to maintain a desired location on the map at the focal point of the camera. In some cases, the desired location is a stationary location of the device or a stationary location that the user is browsing on the map. In other cases, the desired location is a moving location in front of the device's moving location as the user moves with the device.
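The relationship between the camera's angle on such an arc and its position over the map can be sketched with basic trigonometry. This is a conceptual illustration under simple assumptions (camera focused on the origin at a fixed straight-line distance), not the application's code:

```python
import math

def camera_position(pitch_deg, distance):
    """Position of a virtual camera focused on the origin (illustrative).

    pitch_deg: angle below the horizon (30 corresponds to the first
    stage's -30° view; 90 is the straight-down, 2D-like view).
    distance: straight-line distance from the camera to its focal point.
    Returns (horizontal_offset, height) of the camera.
    """
    theta = math.radians(pitch_deg)
    return (distance * math.cos(theta), distance * math.sin(theta))
```

This makes the arc behavior concrete: as the pitch grows from 30° toward 90°, the camera climbs higher and moves over the focal point, which is why the buildings and roads appear smaller in the second and third stages.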
In addition to having the navigation application control the camera (e.g., turning from 3D to 2D when going around corners), or instead of such control, some embodiments allow the user to adjust the position of the camera. Some embodiments allow the user to make a two-finger gesture to adjust the distance (height) and angle of the camera. Some embodiments even allow several types of gestures to control the camera. FIG. 6 conceptually illustrates a perspective adjustment feature provided by the mapping application of some embodiments. In particular, FIG. 6 illustrates a virtual camera 600 at three different stages 605 to 615 that show the movement of the virtual camera 600's position in response to perspective adjustments. As shown, FIG. 6 illustrates a location in a 3D map 635 that includes four objects: two buildings and two intersecting roads. The first stage 605 shows the virtual camera 600 at a first perspective position, pointing downward toward the 3D map 635 at a first angle (e.g., 45 degrees) relative to the horizon. In this position, the camera 600 points to a location that may be a stationary position of the device or of the location being explored, or a moving position in front of the device's moving location when the map is used for navigation. In some embodiments, the default position of the camera 600 is at a particular orientation relative to the current location, but this orientation can be modified when the user rotates the map. Rendering a 3D map view based on the virtual camera 600's position results in the 3D map view 625. The second stage 610 shows the virtual camera 600 at a different, second perspective position, pointing at a lower perspective angle toward the 3D map 635, at a smaller second angle (e.g., 30 degrees) relative to the horizon.
The stage 610 also shows that the user has provided input to adjust the perspective of the view of the 3D map 635 by touching the screen with two fingers and dragging the two fingers in an upward direction (e.g., a swipe action). The scene rises as the virtual camera 600 lowers and decreases its viewing angle relative to the horizon. Rendering a 3D map view using the virtual camera 600 positioned at this angle results in a 3D map view 630 in which the buildings and the roads appear taller than in their illustration in the first map view 625. As indicated by the dashed version of the virtual camera 600, the virtual camera 600 has moved farther down along the arc 650 while tilting (e.g., pitching) farther up.

The third stage 615 shows the virtual camera 600 at a different, third perspective position, pointing downward toward the 3D map 635 at a larger third angle (e.g., 80°) relative to the horizon, from a higher perspective position (e.g., with the camera's focal point near the same location in the 3D map 635). The stage 615 also shows that the user has provided input to adjust the perspective of the view of the 3D map 635 by touching the screen with two fingers and dragging the two fingers in a downward direction (e.g., a swipe action). The scene drops or flattens out as the virtual camera 600 rises and increases its angle relative to the horizon. As shown at the stage 615, in some embodiments, the mapping application flattens the buildings in the 3D map 635 (i.e., reduces the z-axis components of the polygons to the ground level) when the virtual camera 600 is positioned in a top-down or near top-down position, such that the 3D map view rendered using the virtual camera 600 appears 2D. Rendering a 3D map view using the virtual camera 600 positioned at this angle in the third stage 615 results in a 3D map view 640 in which the buildings appear smaller and flatter, and the roads appear smaller, than in their illustration in the second map view 630.
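The two-finger perspective adjustment just described can be sketched as a mapping from drag direction to camera pitch, with building-flattening near the top-down position. The sensitivity, bounds, and flattening threshold below are invented for illustration; the patent only specifies the qualitative behavior:

```python
def adjust_pitch(pitch_deg, drag_up_pixels, degrees_per_pixel=0.1,
                 min_pitch=30.0, max_pitch=90.0, flatten_threshold=85.0):
    """Illustrative two-finger perspective adjustment.

    Dragging up (positive pixels) raises the scene by *decreasing* the
    camera's angle relative to the horizon; dragging down flattens it
    by increasing that angle. Returns (new_pitch, flatten_buildings),
    where flatten_buildings is True near the top-down position, per the
    description above.
    """
    new_pitch = pitch_deg - drag_up_pixels * degrees_per_pixel
    new_pitch = max(min_pitch, min(max_pitch, new_pitch))
    return new_pitch, new_pitch >= flatten_threshold
```

Clamping at the low end models the arc's bottom (the scene cannot rise indefinitely), and clamping at 90° models the top-down position at which the z-axis components of the building polygons are reduced to ground level.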
As indicated by the dashed version of the virtual camera 600, the virtual camera 600 has moved farther up along the arc 650 while tilting (e.g., pitching) farther down. In some embodiments, the virtual camera 600 can be moved in this manner when the mapping application receives input for adjusting the perspective angle for viewing the 3D map 635. In some of these embodiments, the mapping application switches to a top-down mode (in which the rendering position points straight down) that produces 2D map views when the zoom level reaches a particular level. While moving along an arc, the virtual camera rotates to maintain a desired location on the map at its focal point. In some cases, the desired location is a stationary location of the device or a stationary location that the user is browsing on the map. In other cases, the desired location is a moving location in front of the device's moving location as the user moves with the device.

3. Gestures to Enter or Exit 3D

In addition to the 3D control, the mapping application of some embodiments allows a user to transition a map view from a two-dimensional (2D) presentation to a 3D presentation through gesture inputs on the multi-touch interface of the device. For instance, through a two-finger gesture input, the user can be made to experience "pushing" a 2D map view down into a 3D map view, or "pulling" a 3D map view up into a 2D map view. This can also be thought of as pulling a virtual camera down from a 2D (directly overhead) view into a 3D (side angle) view through the two-finger gesture. Different embodiments use different two-finger gesture operations to push a 2D map view down into a 3D map view, or to pull a 3D map view up into a 2D map view. FIG. 7 illustrates an example of a two-finger gesture for pushing a 2D map down into a 3D map. This figure presents this example in terms of four stages of operation of the application's UI.
The first stage 705 shows the application's UI presenting a 2D map view around the device's current location 725. The second stage 710 then shows the start of a two-finger gesture operation that pushes the 2D view 740 down until a 3D view is presented. In some embodiments, the application identifies a push-down of the 2D map when it detects two contacts that are placed horizontally or approximately horizontally on the 2D map and that move upward together. Some embodiments require the movement to exceed a certain amount, in order to impose a bit of inertia that resists pushing the 2D map into a 3D map, and thereby to prevent this transition from occurring accidentally. Other embodiments use other schemes for transitioning from a 2D map to a 3D map through gesture input. For instance, the application of some embodiments performs this transition when a user places two fingers vertically with respect to each other and exerts greater force on the upper finger so as to trigger one of the device's sensors (e.g., a gyroscope) or a certain rotation of the fingers. Still other embodiments require the opposite operation to transition from a 2D map view to a 3D map view; for instance, some embodiments require the two horizontally aligned fingers to move downward in unison on the 2D map in order to push the 2D view down into a 3D view. The third stage 715 shows the fingers after the user's two fingers have moved up by a certain amount across the device's screen. It also shows that the 2D map 740 has been replaced by a 3D map 745. The fourth stage 720 shows the 3D map 745 at the end of the two-finger gesture movement. In this stage, the 3D control 150 appears highlighted to indicate that the current map view is a 3D map view. In some embodiments, a 3D map view can be pulled up into a 2D map view by performing the opposite two-finger operation.
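The push-down detection described above can be sketched as a simple recognizer over two touch contacts. The slope and movement thresholds are assumptions for illustration; the patent specifies only "horizontally or approximately horizontally" placed contacts that move up together by more than a certain amount:

```python
def detect_push_down(c1_start, c2_start, c1_end, c2_end,
                     max_slope=0.3, min_upward=50.0):
    """Illustrative check for the two-finger 2D-to-3D push gesture.

    Points are (x, y) with y increasing toward the top of the screen.
    The two contacts must be roughly horizontal with respect to each
    other, and both must move upward together by more than `min_upward`
    pixels (the inertia that prevents accidental transitions).
    """
    dx = abs(c1_start[0] - c2_start[0])
    dy = abs(c1_start[1] - c2_start[1])
    roughly_horizontal = dx > 0 and dy / dx <= max_slope
    rise1 = c1_end[1] - c1_start[1]
    rise2 = c2_end[1] - c2_start[1]
    return roughly_horizontal and rise1 > min_upward and rise2 > min_upward
```

The opposite (3D-to-2D) gesture would use the same shape test with the movement direction reversed.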
In particular, in these embodiments, the mapping application transitions from a 3D map to a 2D map when it detects two horizontal or nearly horizontal contacts moving downward in unison on the 3D map by more than a threshold amount.

D. Animation for Entering and Exiting 3D

When transitioning from a 2D map view to a 3D map view, some embodiments provide an animation that shows objects that appear flat in the 2D map view rising and becoming larger in the 3D map view. Generating such an animation that shows objects rising and growing is described in U.S. Patent Application Serial No. 13/632,027, entitled "Displaying 3D Objects in a 3D Map Presentation," filed on September 30, 2012, which is incorporated herein by reference. FIG. 8 illustrates this animation in three stages. The first stage 805 shows the user selecting the 3D control 150 while viewing a 2D map view. The second stage 810 and the third stage 815 show subsequent (though not necessarily consecutive) views that the mapping application provides after it starts providing a 3D map view. As the zoom level increases between the second and third stages, the heights of the buildings in the map views increase, providing an animation that conveys that the view is moving from the 2D view into a 3D scene. When transitioning from a 3D view to a 2D view, the mapping application of some embodiments provides the opposite animation, showing the objects in the scene shrinking until they collapse into flat objects in the 2D map. In some embodiments, the mapping application provides these 2D-to-3D or 3D-to-2D transitions while the mapping application is operating in the navigation mode or in a route inspection mode. These two operational modes of the mapping application are further described below.
FIG. 9 illustrates, in terms of six different stages 905 to 930, how the mapping application of some embodiments changes the appearance of the 3D control to indicate different 2D and 3D states of the map view. The first stage 905 illustrates the mapping application displaying a map and the floating controls, including the 3D control 150. The application is displaying the map in 2D at a certain low zoom level (the map has not been zoomed in by much), as shown. The 3D control 150 is displayed with a first appearance (e.g., a gray letter "3D") to indicate that 3D map data is not available at this particular zoom level. The first stage 905 also shows the mapping application receiving a user's gesture input to zoom in on the map (i.e., to increase the zoom level). The second stage 910 shows the mapping application displaying the map at a higher zoom level than in the previous stage 905. However, the 3D control 150 keeps its first appearance, because 3D map data is unavailable even at this particular higher zoom level. The second stage 910 also shows the mapping application receiving another gesture input to zoom in on the map further. The third stage 915 shows the mapping application displaying the map at a higher zoom level than in the previous stage 910. The application has changed the appearance of the 3D control 150 to a second appearance (e.g., "3D" in black letters) to indicate that 3D map data is available at this zoom level. When the mapping application receives a selection of the 3D control 150, the mapping application of some embodiments changes the appearance of the 3D control 150 to a third appearance (e.g., "3D" in blue letters) and displays the map in 3D (e.g., by changing from a straight-down view for 2D into a perspective view).
This third appearance therefore indicates that the map is being displayed in 3D. The third stage 915 shows the mapping application receiving yet another gesture input to zoom the map in further, to a higher zoom level. The third stage 915 also shows that the mapping application of some embodiments displays the buildings in the map as gray boxes at this zoom level. The fourth stage 920 shows the mapping application displaying the map at a higher zoom level than in the previous stage 915. The application has changed the appearance of the 3D control 150 to a fourth appearance (e.g., a building icon in a first color, as shown) to indicate that 3D immersive map data for rendering an immersive 3D map view is available at this zoom level. The fourth stage 920 also shows the mapping application receiving a selection of the 3D control 150. The fifth stage 925 and the sixth stage 930 show subsequent (though not necessarily consecutive) views that the mapping application provides after it starts providing a 3D immersive map view. In some embodiments, the zoom level does not change between the fifth and sixth stages, but the heights of the buildings in the map views increase to provide an animation that conveys that the view is moving from the 2D view into the 3D immersive view. Also, from the stage 920 to 925, the mapping application has changed the appearance of the 3D control to a fifth appearance (e.g., a building icon in a second color, as shown) to indicate that the map is displayed in the 3D immersive view.

E. Browsing

1. Swiping

In some embodiments, the mapping application allows a user to browse around a location that is displayed in the map through a variety of mechanisms. For instance, as mentioned above, the mapping application of some embodiments allows a user to browse around a location by performing one or more swipe operations (e.g., by dragging a finger) on the touch-sensitive screen of the device.
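The appearance progression of the 3D control across the stages of FIG. 9 can be summarized as a small selection function. The zoom thresholds below are invented for the sketch; the description specifies only the ordering (no 3D data, then 3D data, then immersive 3D data as the zoom level grows):

```python
def control_appearance(zoom, threedee_zoom=14, immersive_zoom=17,
                       showing_3d=False):
    """Illustrative 3D-control appearance per the stages above.

    Returns one of the five appearances described: gray "3D" (no 3D
    data), black "3D" (3D data, map in 2D), blue "3D" (map in 3D), or
    a building icon in one of two colors when immersive data exists.
    """
    if zoom < threedee_zoom:
        return "gray 3D"                    # no 3D data at this zoom
    if zoom < immersive_zoom:
        return "blue 3D" if showing_3d else "black 3D"
    # Immersive data available: building icon, colored by 2D/3D state.
    return "building icon #2" if showing_3d else "building icon #1"
```

The function treats the control's color purely as a function of data availability and current view state, which matches how the stages of FIG. 9 drive the appearance changes from zooming and from selecting the control.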
These operations move the view that the application renders to new locations on the map. An example of a swipe operation in a 3D map view is described above by reference to FIG.

2. Rotation

In some embodiments, the mapping application also allows a user to rotate a 2D or 3D map through gesture input. The mapping application of some embodiments is a vector mapping application that allows direct manipulation of the map (such as rotation and 2D/3D manipulations) while browsing the map. However, some of the effects applied to the map can be disorienting. Without an easy way to return to a north-up orientation (i.e., an orientation in which the north direction is aligned with the top of the device), some users may have a hard time interacting with the map views. To address this, the mapping application of some embodiments provides a floating compass control on the map. As mentioned, this compass serves both as an indicator that points to north and as a button for restoring a north-up orientation. To further minimize clutter on the map, the mapping application displays this button only when the map is rotated.

Figure 10 illustrates an example of rotating a 2D map and using the compass to straighten out the rotated map in some embodiments of the invention. This figure illustrates this example in four stages. The first stage 1005 illustrates a 2D map view 1025. The second stage 1010 illustrates the rotation of this map view through a two-finger gesture. In this example, the user performs the gesture by placing two fingers on the map view and pulling one finger down while pushing the other finger up. This rotational movement of the fingers causes the application to rotate the map into the rotated map view 1030. In some embodiments, the mapping application computes the midpoint between the two fingers and uses that midpoint as the anchor point for the rotation. In some such embodiments, if one of the two fingers does not move, the mapping application uses the location of the non-moving finger as the anchor point.
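The anchor-selection logic described above can be sketched as a small helper. This is an illustrative reading of the behavior, not the actual implementation; all names and the `stationary` convention are invented, and the location-indicator case from the following paragraph is included as the highest-priority rule.

```python
def rotation_anchor(touch_a, touch_b, stationary=None, location_indicator=None):
    """Choose the anchor point for a two-finger rotation gesture.

    Priority mirrors the behavior described in the text:
    1. if a position/location indicator is shown on the map, rotate about it;
    2. else if one finger is stationary ("a" or "b"), rotate about that finger;
    3. otherwise rotate about the midpoint between the two fingers.
    """
    if location_indicator is not None:
        return location_indicator
    if stationary == "a":
        return touch_a
    if stationary == "b":
        return touch_b
    return ((touch_a[0] + touch_b[0]) / 2.0, (touch_a[1] + touch_b[1]) / 2.0)
```

For example, with both fingers moving, `rotation_anchor((0, 0), (4, 2))` yields the midpoint `(2.0, 1.0)`.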
In some embodiments, when the position control 326 is present in the view (e.g., through a selection of the position control 145), the mapping application uses the location of the position control as the anchor for the rotation, regardless of the locations of the fingers. The second stage 1010 also shows that, in response to the rotation of the map, the application has presented a compass 300 on the map to indicate the north direction on the rotated map. The third stage 1015 then shows the user selecting the compass 300. The fourth stage 1020 shows that, after the selection of the compass, the application has rotated the map back to the map view 1025 (i.e., a north-up orientation).

Figure 11 illustrates another example of rotating a map in some embodiments of the invention. This figure illustrates this example in four stages 1105 through 1120. In this example, the map is a 3D map. Accordingly, the first stage 1105 illustrates a 3D map view 1125. The second stage 1110 illustrates rotating this map view through a two-finger gesture. As before, in this example the user performs the gesture by placing two fingers on the map view and pulling one finger down while pushing the other finger up. This rotational movement of the fingers causes the application to rotate the map into the rotated map view 1130. In this example, the rotation is about the current position of the device because, as described above, the current position indicator 326 is present in the map view 1125. The second stage 1110 also shows that, in response to the rotation of the map, the application has presented a compass 300 on the map to indicate the north direction on the rotated map. The third stage 1115 then shows a further rotation of the map in response to another two-finger gesture by the user. The compass 300 still indicates the north direction but has rotated along with the rotated map. The fourth stage 1120 then shows still further rotation of the map and of the compass 300.
In some embodiments, the mapping application does not allow the user to rotate a 2D or 3D map at certain zoom levels. For example, when the map is zoomed out (to a low zoom level), the mapping application does not rotate the map upon receiving gesture input from the user to rotate the map (e.g., a two-finger rotate operation). In some embodiments, the module of the mapping application that is responsible for moving the virtual camera checks the current zoom level and decides to ignore such instructions if the map should not be rotated at the current zoom level. In some other embodiments, the mapping application rotates the map by a certain amount while the user provides gesture input for rotating the map, but rotates the map back to a default orientation (e.g., north) when the user releases or stops the gesture input.

In some embodiments, the mapping application provides an inertia effect for rotation of the map. When a user provides a particular type of gesture input to rotate the map (e.g., input that terminates at an angular or translational rate greater than a threshold rate), the mapping application generates an inertia effect that causes the map to continue rotating and decelerate to a stop. The inertia effect in some embodiments provides the user with a more realistic interaction with the map that mimics behaviors in the real world.

Figure 12 illustrates a rotation operation along with an inertia effect for the rotation operation in three different stages 1205 through 1215. For simplicity, the inertia effect is shown in this figure in terms of a 2D map view. However, the mapping application of some embodiments also provides the inertia effect when viewing maps in 3D mode. The first stage 1205 presents a 2D map view 1220 of a 2D map. In some embodiments, the mapping application performs a process 1300, described below by reference to Figure 13, to carry out the rotation operation.
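The zoom-level check described above can be sketched as a simple gate in the camera module. This is a minimal sketch under assumed names and an assumed threshold value (the text gives zoom levels 10 through 20 as one example of a rotatable range), not the actual implementation.

```python
MIN_ROTATION_ZOOM = 11  # illustrative threshold; the text's example allows rotation at levels 10-20

def handle_rotation_gesture(current_zoom, requested_degrees):
    """Drop rotation instructions when the map is zoomed out too far.

    Sketch of the virtual-camera module's check described above: at low
    zoom levels, the instruction to rotate is simply ignored and the map
    orientation is left unchanged.
    """
    if current_zoom < MIN_ROTATION_ZOOM:
        return 0.0                # instruction ignored; no rotation applied
    return requested_degrees      # rotation applied as requested
```

At a high zoom level the requested rotation passes through unchanged; at a low zoom level the function returns zero degrees of applied rotation.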
As shown, the 2D map view 1220 includes several streets that run in parallel or perpendicular directions. The first stage 1205 also shows that the user is providing input to rotate the 2D map view 1220. Specifically, the user is performing a gesture to rotate the 2D map view 1220 by touching two locations on the touch screen with two fingers and rotating the two fingers in a clockwise direction (as indicated by the two arrows depicted in the figure). In this example, highlights around the fingertips are illustrated for purposes of explanation. In some embodiments, the mapping application does not actually display such highlights around fingertips.

The second stage 1210 shows the 2D map immediately after the user has completed the input for rotating the 2D map. For this example, the user completed the input by lifting the two fingers off the device's touch screen, as indicated by the absence of the highlights around the fingertips. In addition, the second stage 1210 shows a 2D map view 1225 of the 2D map rendered by the mapping application. As shown, the mapping application has rotated the 2D map in a clockwise direction, from the 2D map view 1220 to the 2D map view 1225. The streets shown in the first stage 1205 have been rotated approximately 45 degrees clockwise.

Different embodiments of the mapping application use different methods to implement the inertia effect for the rotation operation. For example, in some embodiments, when the user stops moving the fingers or lifts the fingers off the touch screen, the mapping application determines an average angular (or translational) rate of the user's input at or near that instant.
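One plausible reading of averaging the rate "at or near that instant" is to average the instantaneous rates over the last few touch samples before release. The sketch below assumes a list of `(timestamp_seconds, angle_degrees)` samples and a made-up window size; the actual averaging scheme is not specified in the text.

```python
def release_angular_rate(samples, window=3):
    """Estimate the angular rate (degrees/second) at the moment of release.

    `samples` is a chronological list of (timestamp_seconds, angle_degrees)
    pairs for the gesture. The rate is averaged over the last `window`
    intervals, as one way of averaging "at or near the instant" the user
    stops moving or lifts the fingers. Illustrative only.
    """
    recent = samples[-(window + 1):]
    if len(recent) < 2:
        return 0.0
    rates = []
    for (t0, a0), (t1, a1) in zip(recent, recent[1:]):
        rates.append((a1 - a0) / (t1 - t0))
    return sum(rates) / len(rates)
```

A gesture rotating steadily at 10 degrees per second yields an estimated release rate of 10.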
When the user repeatedly stops the fingers without lifting them and then starts moving the fingers again, the mapping application of some embodiments treats each stop as the end of an input, while in other embodiments the mapping application treats the operation as one continuous input until the user lifts the fingers off the screen. The mapping application uses the angular rate to determine an angular amount (e.g., degrees) for the inertia effect, and rotates the virtual camera used to view the 2D map by the determined angular amount while decelerating the angular rate (e.g., in a constant, exponential, logarithmic, etc. manner). In some embodiments, the mapping application renders and displays an animation of the inertia effect (e.g., a decelerating rotation of the 2D map, starting from the 2D map view 1225, that rotates the 2D map by the determined angular amount).

In some embodiments, the mapping application does not itself analyze the user's gesture inputs. For example, the mapping application of these embodiments does not determine the angular rate of the user's input. Instead, the mapping application of these embodiments receives an angular rate determined by the operating system of the device on which the mapping application executes. The operating system of the device has an interface for receiving gesture inputs from the user. The operating system analyzes the received input and supplies the analysis to the mapping application. The mapping application then determines the inertia effect to apply based on this analysis of the input.

The third stage 1215 illustrates the 2D map after the mapping application has rendered and displayed the animation of the inertia effect. As shown, a 2D map view 1230 of the 2D map rendered by the mapping application is displayed. After the user completed the input in the second stage 1210, the mapping application rotated the 2D map further clockwise, as shown in the third stage 1215.
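For the constant-deceleration profile mentioned above, the extra angle contributed by the inertia effect follows directly from the release rate: a body decelerating at a constant rate alpha from angular rate omega travels omega^2 / (2 * alpha) before stopping. The deceleration constant below is invented for illustration.

```python
def inertia_angle(release_rate_dps, deceleration_dps2=90.0):
    """Total extra rotation (degrees) contributed by the inertia effect.

    Assumes the constant-deceleration profile described above: starting
    from `release_rate_dps` degrees/second and decelerating at
    `deceleration_dps2` degrees/second^2, the map keeps rotating through
    omega^2 / (2 * alpha) degrees before coming to a stop. The default
    deceleration value is a made-up stand-in.
    """
    return (release_rate_dps ** 2) / (2.0 * deceleration_dps2)
```

For example, releasing at 30 degrees per second with a deceleration of 90 degrees per second squared yields 5 additional degrees of rotation.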
As shown, the 2D map view 1230 in the third stage 1215 shows the streets rotated further clockwise compared to the streets shown in the 2D map view 1225. In some embodiments, the mapping application also provides inertia effects for operations other than rotating the map, such as panning the map or entering or exiting 3D (e.g., panning, rotation, entering 3D from 2D). The inertia effects for these other operations are further described in the above-incorporated U.S. Patent Application 13/632,035.

Figure 13 conceptually illustrates a process 1300 of some embodiments for rotating a map view based on gesture input. In some embodiments, the mapping application performs the process 1300 when the mapping application is in a map viewing mode (e.g., a location browsing mode, a navigation mode, a 2D viewing mode, a 3D viewing mode, etc.) and the mapping application receives a gesture through the touch screen of the device on which the mapping application is executing.

The process 1300 begins by receiving (at 1310) a gesture for rotating the map view. In some embodiments, a gesture for rotating the map view includes a multi-touch gesture received through the touch screen (e.g., touching the touch screen with multiple fingers simultaneously). In this example, the process 1300 receives a two-touch rotate gesture. Next, the process 1300 identifies (at 1320) a rotation component of the received gesture. The process 1300 of some embodiments identifies the rotation component of the gesture by identifying an amount of rotation of the gesture's touch points.
For instance, in some such embodiments, the process 1300 identifies the amount of rotation of the gesture's touch points by (1) determining a first vector from an initial location of one touch point to an initial location of the other touch point, (2) determining a second vector from a second location of the one touch point to a second location of the other touch point, and (3) determining a direction of rotation based on the initial locations and the second locations of the touch points.

The process 1300 then determines (at 1330) whether the amount of rotation is within a threshold amount. When the process 1300 determines that the amount of rotation is not within the threshold amount, the process 1300 ends. Otherwise, the process 1300 determines (at 1340) an axis of rotation based on the gesture. In some embodiments, the process 1300 determines the axis of rotation by (1) identifying a point along the vector that runs from the initial location of one touch point to the initial location of the other touch point, and (2) determining a point on the map view that corresponds to the point along the vector (e.g., the point on the map that coincides with the point along the vector). The process 1300 uses the determined point on the map view as the location of the axis (e.g., a z-axis) about which the map view is rotated.

Next, the process 1300 adjusts (at 1350) the map view based on the axis of rotation and the amount of rotation. In some embodiments, the process 1300 adjusts the map view by rotating the map view about the determined axis of rotation in the determined direction of rotation. Different embodiments use different coordinate spaces for the map. For example, the map of some embodiments uses a Mercator unit coordinate space. In such embodiments, the process 1300 adjusts the position of the virtual camera with respect to the map in order to adjust the map view.
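Operations 1320 and 1330 of the process described above can be sketched as follows: form the between-touch-points vector at the start and at the current instant, and measure the signed angle between the two vectors. Function names and the threshold value are invented for illustration.

```python
import math

def rotation_component(a0, b0, a1, b1):
    """Signed rotation (degrees) between the initial and current touch pairs.

    Sketch of operations 1310-1320: build a vector from one touch point to
    the other at the initial locations (a0 -> b0) and at the second
    locations (a1 -> b1), then measure the signed angle between them.
    A positive result is counterclockwise in standard math coordinates.
    """
    v0 = (b0[0] - a0[0], b0[1] - a0[1])
    v1 = (b1[0] - a1[0], b1[1] - a1[1])
    delta = math.degrees(math.atan2(v1[1], v1[0]) - math.atan2(v0[1], v0[0]))
    # normalize to (-180, 180] so the direction of rotation is unambiguous
    while delta <= -180.0:
        delta += 360.0
    while delta > 180.0:
        delta -= 360.0
    return delta

def should_rotate(delta_degrees, threshold=2.0):
    """Sketch of operation 1330: ignore rotations below a (made-up) threshold."""
    return abs(delta_degrees) > threshold
```

For instance, one finger fixed at the origin while the other moves from `(1, 0)` to `(0, 1)` gives a rotation component of 90 degrees.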
As another example, the map of some embodiments uses the World Geodetic System (e.g., WGS 84) as the coordinate space for the map. In some such embodiments, the process 1300 adjusts the map with respect to the position of the virtual camera in order to adjust the map view. Finally, the process 1300 renders (at 1360) the adjusted map view for display on the device. In some embodiments, the rendered map view is an image that represents the adjusted map view. The process 1300 then ends.

In some embodiments, the 3D map is rotatable at a defined range and/or set of zoom levels. For example, in some embodiments, the mapping application allows the 3D map to be rotated at a defined number of the highest zoom levels (e.g., zoom levels 10 through 20) and prevents the 3D map from being rotated at the remaining lower zoom levels (e.g., zoom levels 1 through 10). In some such embodiments, the mapping application does not generate instructions to rotate the 3D map when the mapping application receives input to rotate the 3D map at a zoom level that is defined to not allow rotation. In other such embodiments, the mapping application generates instructions to rotate the 3D map when input to rotate the 3D map is received at a zoom level at which rotation is not allowed, but the mapping application simply ignores those instructions. Those of ordinary skill in the art will recognize that the zoom levels at which rotation operations on a 3D map are allowed may be defined in many different ways in different embodiments.

3. Legend and name rotation

The mapping application of some embodiments uses novel techniques to adjust, or leave unadjusted, the text and/or symbols that appear in the map view as the map view rotates. Figure 14 illustrates an example of one such novel approach in terms of four stages 1405 through 1420 of UI operations. In this example, the name Apple Inc.
appears at the location of 1 Infinite Loop, Cupertino, California. In the first stage 1405, the name Apple Inc. is upright in a particular map view. In the second stage 1410 and the third stage 1415, the map view rotates in response to the user's two-finger rotation operation. In these two stages, the name Apple Inc. is shown rotating slightly, at an angle much smaller than the angle by which the map has rotated. The name Apple Inc. behaves as if it were pinned to the map at its center or top, with the weight of the name pointing downward. In other words, whenever the map rotates, the name rotates slightly, but the weight of the name causes it to rotate less and eventually pulls it back to its upright position. This upright position of the name Apple Inc. is shown in the fourth stage 1420, which shows the map view after the rotation operation has completed.

However, keeping all text and/or symbols constantly upright in a rotating map can be a bit distracting when the map has many text characters or symbols and many of those characters or symbols counter-rotate in order to stay upright. Accordingly, for some of the characters and/or symbols, the mapping application of some embodiments uses an alternative mechanism to adjust their orientation during a rotation. Figure 15 illustrates one such alternative example in terms of four stages 1505 through 1520 of UI operations. In this example, the street names are the characters that rotate only after the map view has rotated by a threshold amount. In the first stage 1505, the street names are aligned with the upward and rightward directions of travel along their streets. In the second stage 1510 and the third stage 1515, the map view rotates in response to the user's two-finger rotation operation. In these two stages, none of the street names rotates, because the map has not yet rotated past the necessary threshold.
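The pinned-name behavior described above (the name dragged slightly by the map's rotation, then pulled back upright by its "weight") can be sketched as a simple per-step simulation. The coupling and restoring constants are invented; the text describes only the qualitative behavior.

```python
def label_angle_steps(map_deltas, coupling=0.3, restoring=0.5):
    """Simulate a label pinned to the map with its weight pointing down.

    On each step the label picks up a fraction (`coupling`) of the map's
    rotation delta, then relaxes back toward upright (angle 0) by the
    `restoring` fraction. After the map stops rotating, the label keeps
    relaxing until it is upright again. Both constants are illustrative.
    """
    angle = 0.0
    history = []
    for delta in map_deltas:
        angle += coupling * delta      # dragged slightly by the map rotation
        angle *= (1.0 - restoring)     # the name's weight pulls it back upright
        history.append(angle)
    for _ in range(20):                # map at rest: settle back to upright
        angle *= (1.0 - restoring)
    history.append(angle)
    return history
```

With the map rotating 10 degrees per step, the label tilts only a few degrees and returns essentially to upright once the rotation stops.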
However, by the time the map has rotated to its orientation in the fourth stage 1520, the map has rotated enough to pass the threshold, which requires some of the street names (streets 1 through 4) to be rotated in order to keep the names aligned with the upward direction of travel.

4. Zoom and bounce

Figure 16 illustrates an example of a user transitioning from a 3D map view to a 2D map view through a two-finger gesture operation. This figure illustrates this transition in four stages 1605 through 1620. In the first three stages 1605 through 1615, the user performs a pinch operation that causes the application to zoom out of the 3D view presented in the first stage, in successive steps, until the view becomes the 2D view illustrated in the fourth stage 1620.

Alternatively, or in conjunction with the perspective adjustment feature described above by reference to Figure 6, the mapping application of some embodiments allows the user to zoom in and out of a view of a 3D map (e.g., by providing gesture input with two fingers). Figure 17 illustrates a zoom adjustment feature provided by the mapping application of some embodiments. In particular, Figure 17 illustrates a virtual camera 1712 at three different stages 1701 through 1703 that show the movement of the virtual camera 1712 in response to zoom adjustments. As shown, Figure 17 shows a location in a 3D map 1710 that contains two buildings and two roads forming a T-junction.

The first stage 1701 shows the 3D map 1710 with the virtual camera 1712 at a particular position pointing toward the 3D map 1710. In this position, the camera 1712 points to a location that may be a stationary position of the device or of the location being browsed, or a moving position in front of a moving location of the device in a case where the map is used for navigation. Rendering a 3D map view based on this position of the virtual camera 1712 results in the 3D map view 1714.
The second stage 1702 shows the virtual camera 1712 at a different zoom-level position pointing toward the 3D map 1710. The stage 1702 shows that the user has provided input to increase the zoom level of the view of the 3D map 1710 by placing two fingers near each other on the device's screen and moving the fingertips apart while they touch the screen (e.g., a spread gesture). The zoom-in adjustment is accomplished by moving the virtual camera 1712 closer to the 3D map 1710 along the line 1750. In some embodiments, the line 1750 along which the mapping application moves the virtual camera 1712 is a line formed by the front of the virtual camera 1712 and the virtual camera 1712's point of focus. The mapping application of some embodiments moves the virtual camera 1712 along a line formed by the front of the virtual camera 1712 and a location in the 3D map 1710 based on the user's input to zoom into the view of the 3D map 1710.

Rendering a 3D map view using the virtual camera 1712 at this position results in the 3D map view 1724, in which the buildings and the roads appear closer than in the 3D map view 1714. As indicated by the dashed version of the virtual camera 1712, the virtual camera 1712 moved closer to the 3D map 1710 along the line 1750.

The third stage 1703 shows the virtual camera 1712 at a different zoom-level position pointing toward the 3D map 1710. In this stage 1703, the user has provided input to decrease the zoom level of the view of the 3D map 1710 by placing two fingers far apart on the device's screen and moving the fingertips closer together while they touch the screen (e.g., a pinch gesture). The zoom-out adjustment is accomplished by moving the virtual camera 1712 farther away from the 3D map 1710 along the line 1755. In some embodiments, the line 1755 along which the mapping application moves the virtual camera 1712 is a line formed by the front of the virtual camera 1712 and the virtual camera 1712's point of focus.
The mapping application of some embodiments moves the virtual camera 1712 along a line formed by the front of the virtual camera 1712 and a location in the 3D map 1710 based on the user's input to zoom out of the view of the 3D map 1710. Rendering a 3D map view using the virtual camera 1712 at this position results in the 3D map view 1734, in which the buildings and the roads appear farther away than in the 3D map view 1724. As shown by the dashed version of the virtual camera 1712, the virtual camera 1712 moved farther from the 3D map 1710 along the line 1755.

As described above, Figure 17 illustrates several example zoom adjustment operations and the corresponding movements of a virtual camera in a 3D map that render 3D map views of the 3D map. Those of ordinary skill in the art will recognize that many different zoom adjustments are possible. In addition, the mapping application of some embodiments performs zoom adjustment operations in response to additional and/or different types of input (e.g., tapping the screen, double-tapping the screen, etc.).

Figure 18 conceptually illustrates a feature provided by the mapping application of some embodiments for maintaining the position of a virtual camera within a defined range along an arc. In particular, Figure 18 illustrates a virtual camera 1800 at three different stages 1805 through 1815 that show the virtual camera 1800's position being maintained within a defined range of the arc 1850. As shown in Figure 18, a location in a 3D map 1835 contains two buildings and two roads forming a T-junction.

The first stage 1805 shows the virtual camera 1800 at a particular position along the arc 1850. As shown, the arc 1850 represents a defined range (e.g., an angular range) within which the virtual camera 1800 is movable. The first stage 1805 also shows three positions 1855 through 1865 along the arc 1850 (e.g., perspective viewing angles).
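The zoom adjustments described by reference to Figure 17 amount to translating the camera along the line from the camera toward its focus point. A minimal sketch, with an invented parameterization (`amount` is the fraction of the remaining camera-to-focus distance to travel; negative values zoom out):

```python
def zoom_camera(camera, focus, amount):
    """Move a virtual camera along the camera-to-focus line.

    Positive `amount` moves the camera toward the focus (zoom in, as along
    line 1750); negative `amount` moves it away (zoom out, as along line
    1755). `camera` and `focus` are (x, y, z) points.
    """
    cx, cy, cz = camera
    fx, fy, fz = focus
    return (cx + (fx - cx) * amount,
            cy + (fy - cy) * amount,
            cz + (fz - cz) * amount)
```

For example, a camera at height 10 above its focus moves halfway down for `amount=0.5`, and away to height 15 for `amount=-0.5`.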
In this example, the mapping application allows the virtual camera 1800 to move along the arc 1850 between the high perspective end of the arc 1850 (e.g., the position at which the virtual camera 1800 is maximally tilted downward) and the position 1855, in a manner similar to that described above with respect to Figure 5. Rendering a 3D map view based on the position of the virtual camera 1800 in the first stage 1805 results in the 3D map view 1825.

When the virtual camera 1800 passes the position 1855 while moving toward the low perspective end of the arc 1850, the mapping application reduces the speed at which the virtual camera 1800 moves toward the low perspective end of the arc 1850 (e.g., decelerates), regardless of the input provided by the user. In some embodiments, the mapping application decreases the speed of the virtual camera 1800 at a constant rate, while in other embodiments the mapping application decreases the speed of the virtual camera 1800 at an exponential rate. Additional and/or different methods of decreasing the speed of the virtual camera 1800 are used in some embodiments.

The second stage 1810 shows that the virtual camera 1800 has moved to a position along the arc 1850 at or near the low perspective end of the arc 1850. As shown, the user is providing input to adjust the perspective of the view of the 3D map 1835 by touching the screen with two fingers and dragging the two fingers in an upward direction (e.g., a swipe gesture). In response to this input, the mapping application moves the virtual camera 1800 toward the low perspective end of the arc 1850 while tilting the virtual camera 1800 up.
When the virtual camera reaches the position 1865 along the arc 1850, the mapping application prevents the virtual camera 1800 from moving any lower, beyond the position 1865, even while the user continues to provide input to decrease the perspective of the view of the 3D map 1835 (e.g., the user continues to drag the two fingers up on the screen). In some embodiments, when the user stops providing input to decrease the perspective of the view of the 3D map 1835 (e.g., the user lifts the two fingers off the touch screen), the mapping application causes the position of the virtual camera 1800 to "bounce" or "snap" back up from the position 1865, along the arc 1850, to the position 1860. As the mapping application generates or renders 3D map views of the 3D map 1835 based on the view of the virtual camera 1800 during this bounce or snap motion, the resulting 3D map views provide a bounce animation in which the 3D map view briefly dips down and bounces back up, to indicate to the user that the perspective of the map view cannot be decreased any farther. Rendering a 3D map view using the virtual camera 1800 positioned at this angle results in the 3D map view 1830, in which the buildings and the roads appear taller than in the map view 1825.

The third stage 1815 shows the virtual camera 1800 after the mapping application, in response to the user having stopped providing input, has bounced or snapped the virtual camera 1800 back to the position 1860. Different embodiments use different techniques to implement the bounce or snap of the virtual camera 1800. For instance, the mapping application of some embodiments starts by quickly accelerating the virtual camera 1800 along the arc 1850 for a defined distance, or until the virtual camera 1800 reaches a defined speed. Then the mapping application decelerates the virtual camera 1800 along the arc 1850 over the remaining distance to the position 1860. Other ways of implementing the bounce or snap effect are used in some embodiments.
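The slow-down, hard-stop, and snap-back behavior along the arc can be sketched as follows. All numeric limits are invented stand-ins for the positions on arc 1850: the start of the slow zone plays the role of position 1855, the hard stop position 1865, and the rest angle position 1860.

```python
def apply_tilt_input(angle, delta, low_limit=20.0, slow_zone=30.0, high_limit=75.0):
    """Move the camera's perspective angle along its arc.

    Below `slow_zone` the camera decelerates (the input's effect is
    damped), and the angle is clamped so it can never pass `low_limit`
    (the hard stop) or `high_limit`. Angles are in degrees; values are
    illustrative.
    """
    if delta < 0 and angle < slow_zone:
        delta *= 0.3                      # decelerate near the low perspective end
    return max(low_limit, min(high_limit, angle + delta))

def on_release(angle, rest_angle=25.0, low_limit=20.0):
    """When the user lets go at the hard stop, snap back up to the rest angle."""
    if angle <= low_limit:
        return rest_angle
    return angle
```

Dragging past the slow zone produces smaller and smaller changes until the camera pins at the hard stop; releasing there snaps it back to the rest angle.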
Rendering a 3D map view using the virtual camera 1800 positioned at the position 1860 along the arc 1850 in the third stage 1815 results in the 3D map view 1840, in which the buildings appear slightly smaller and flatter than in the map view 1830 and the roads appear a little smaller as well.

As noted above, Figure 18 illustrates a technique for preventing a virtual camera from moving beyond the low perspective end of an arc. Alternatively, or in conjunction with preventing the virtual camera from moving beyond the low perspective end of the arc, the mapping application of some embodiments utilizes a similar technique for preventing the virtual camera from moving beyond the high perspective end of the arc. In addition, Figure 18 shows an example of a position along the arc at which the virtual camera is slowed down, a position along the arc past which the virtual camera is prevented from moving, and a position along the arc to which the virtual camera snaps or bounces back. Different embodiments define these positions in many different ways. For instance, in some embodiments, the position along the arc at which the virtual camera is slowed down is the same as, or near, the position along the arc to which the virtual camera snaps or bounces back.

5. Rendering module

Figure 19 conceptually illustrates a processing pipeline, or map rendering pipeline 1900, performed by the mapping application of some embodiments in order to render a map for display at a client device (e.g., on the display of the client device). In some embodiments, the map rendering pipeline 1900 may be referred to collectively as a map rendering module. A more detailed version of this processing pipeline is described in U.S. Patent Application 13/632,035, incorporated above by reference.
As illustrated, the processing pipeline 1900 includes a basemap skimmer 1905, a set of mesh builders 1915, a set of mesh building processors 1910, a controller 1975, a basemap provider 1920, a virtual camera 1930, and a map rendering engine 1925.

In some embodiments, the basemap skimmer 1905 performs various processes to retrieve map basemaps in response to requests from the mesh builders 1915 for map basemaps. The mesh builders 1915, as described below, identify existing map basemaps (stored on a mapping service server or in a cache on the device performing the processing pipeline 1900) that are needed to build their respective meshes. The basemap skimmer 1905 receives the requests for the map basemaps, determines the best location from which to retrieve the map basemaps (e.g., from the mapping service, from a cache on the device, etc.), and decompresses the map basemaps if needed.

The mesh builders 1915 (also referred to as basemap sources) of some embodiments are instantiated by the basemap provider 1920 in order to build different layers of view basemaps. Depending on the type of map being displayed by the mapping application, the basemap provider 1920 may instantiate different numbers and different types of mesh builders 1915. For instance, for a flyover (or satellite) view map, the basemap provider 1920 might instantiate only one mesh builder 1915, because the flyover map basemaps of some embodiments do not contain multiple layers of data. In fact, in some embodiments, the flyover map basemaps contain an already-built mesh, generated at the mapping service, for which flyover images (taken by satellites, airplanes, helicopters, etc.) are used as the texture of the mesh. However, in some embodiments, an additional mesh builder may be instantiated for generating labels to overlay on the flyover images when the application is in a hybrid mode.
For a 2D or 3D rendered vector map (i.e., a non-satellite-image map), some embodiments instantiate separate mesh builders 1915 to build meshes for land cover polygon data (e.g., parks, bodies of water, etc.), roads, place-of-interest markers, point labels (e.g., labels for parks, etc.), road labels, traffic (if displaying traffic), buildings, raster data (for certain objects at certain zoom levels), as well as other layers of data to incorporate into the map. Generating flyover view maps is described in detail in PCT Application PCT/EP2011/054155, entitled "3D Streets", which is incorporated herein by reference.

The mesh builders 1915 of some embodiments receive "empty" view basemaps from the basemap provider 1920 and return "built" view basemaps to the basemap provider 1920. That is, the basemap provider 1920 sends to each of the mesh builders 1915 one or more view basemaps (not shown). Each of the view basemaps indicates an area of the world for which a mesh is to be drawn. Upon receiving such a view basemap, a mesh builder 1915 identifies the map basemaps needed from the mapping service and sends its list to the basemap skimmer 1905.

Upon receiving the basemaps back from the basemap skimmer 1905, the mesh builder uses the vector data stored in the basemaps to build a polygon mesh for the area described by the view basemap. In some embodiments, the mesh builder 1915 uses several different mesh building processors 1910 to build the mesh. These functions may include a mesh generator, a triangulator, a shadow generator, and/or a texture decoder. In some embodiments, these functions (and additional mesh building functions) are available to every mesh builder, with different mesh builders 1915 using different functions. After building its mesh, each mesh builder 1915 returns its view basemaps, filled with its layer of the mesh, to the basemap provider 1920.
The tile provider 1920 receives from the controller 1975 a particular view (i.e., a volume, or viewing frustum) that represents the map view to be rendered (i.e., the volume visible from the virtual camera 1930). The tile provider performs any culling (e.g., identifying the surface area to be displayed in the view tiles), then sends these view tiles to the mesh builders 1915. The tile provider 1920 subsequently receives the built view tiles from the mesh builders and, in some embodiments, uses the particular view from the virtual camera 1930 to perform culling on the built meshes (e.g., removing surface area that is too far away, removing objects that will be entirely behind other objects, etc.). In some embodiments, the tile provider 1920 receives the built view tiles from the different mesh builders at different times (e.g., due to different processing times to complete more and less complicated meshes, different times to receive the necessary map tiles from the tile retriever 1905, etc.). Once all of the layers of view tiles have been returned, the tile provider 1920 of some embodiments puts the layers together and releases the data to the controller 1975 for rendering. The virtual camera 1930 generates a volume or surface for the pipeline 1900 to render, and sends this information to the controller 1975. Based on a particular location and orientation from which the map will be rendered (i.e., the point in 3D space from which the user "views" the map), the virtual camera identifies a field of view to actually send to the tile provider 1920. In some embodiments, when the mapping application is rendering the 3D perspective view for navigation, the field of view of the virtual camera is determined according to an algorithm that generates a new virtual camera location and orientation at regular intervals based on the movement of the user device.
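The culling step described above — keeping only the built meshes that fall inside the virtual camera's view and are not too far away — might be sketched as below. The 2D axis-aligned "frustum," the per-mesh center point, and the distance field are all simplifying assumptions for illustration; a real renderer would test bounding volumes against the actual 3D frustum planes.

```python
# Illustrative sketch of frustum/distance culling on built meshes.
# The 2D rectangle stands in for a real 3D viewing frustum.

def cull(meshes, frustum, max_distance):
    """Keep meshes whose center lies inside the frustum and near enough."""
    visible = []
    for mesh in meshes:
        x, y = mesh["center"]
        inside = (frustum["xmin"] <= x <= frustum["xmax"] and
                  frustum["ymin"] <= y <= frustum["ymax"])
        near = mesh["distance"] <= max_distance
        if inside and near:
            visible.append(mesh["name"])
    return visible

frustum = {"xmin": 0, "xmax": 10, "ymin": 0, "ymax": 10}
meshes = [
    {"name": "road_a", "center": (3, 4),  "distance": 5},
    {"name": "bldg_b", "center": (12, 4), "distance": 5},   # outside frustum
    {"name": "park_c", "center": (6, 6),  "distance": 40},  # too far away
]
print(cull(meshes, frustum, max_distance=30))   # ['road_a']
```

Occlusion culling (objects entirely behind other objects) would require a further visibility test and is omitted here.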
In some embodiments, the controller 1975 is responsible for managing the tile provider 1920, the virtual camera 1930, and the map rendering engine 1925. In some embodiments, multiple tile providers may actually be instantiated, and the controller puts several view tiles (e.g., map tiles and building tiles) together to create a scene that is handed off to the map rendering engine 1925. The map rendering engine 1925 is responsible for generating a drawing to output to a display device based on the mesh tiles (not shown) sent from the virtual camera. The map rendering engine 1925 of some embodiments has several subprocesses. In some embodiments, each different type of map element is rendered by a different subprocess, with the rendering engine 1925 handling the occlusion of different layers of objects (e.g., placing labels above or behind different buildings, generating roads on top of land cover, etc.). Examples of such rendering processes include a road rendering process, a building rendering process, a label rendering process, a vegetation rendering process, a raster traffic rendering process, a raster road rendering process, a satellite rendering process, a polygon rendering process, a background raster rendering process, etc. The operation of the rendering pipeline 1900 in some embodiments will now be described. Based on user input to view a particular map region at a particular zoom level, the virtual camera 1930 specifies a location and orientation from which to view the map region, and sends this viewing frustum, or volume, to the controller 1975. The controller 1975 instantiates one or more tile providers. While one tile provider 1920 is shown in this figure, some embodiments allow multiple tile providers to be instantiated at once. For instance, some embodiments instantiate separate tile providers for building tiles and for map tiles.
The tile provider 1920 performs any culling necessary in order to generate an empty view tile identifying the regions of the map for which a mesh needs to be built, and sends the empty view tile to the mesh builders 1915, which are instantiated for the different layers of the drawn map (e.g., roads, land cover, POI labels, etc.). The mesh builders 1915 use a manifest received from the mapping service that identifies the different tiles available on the mapping service server (i.e., as nodes of a quadtree). The mesh builders 1915 request specific map tiles from the tile retriever 1905, which returns the requested map tiles to the mesh builders 1915. Once a particular mesh builder 1915 has received its map tiles, it begins using the vector data stored in the map tiles to build the mesh for the view tiles sent from the tile provider 1920. After building the mesh for its map layer, the mesh builder 1915 sends the built view tile back to the tile provider 1920. The tile provider 1920 waits until it has received all of the view tiles from the various mesh builders 1915, then layers these view tiles together and sends the completed view tile to the controller 1975. The controller stitches together the returned tiles from all of its tile providers (e.g., a map view tile and a building view tile) and sends this scene to the rendering engine 1925. The map rendering engine 1925 uses the information in the map tiles to draw the scene for display.

II. Location Search

A. Search Field Behavior

1. Search Field Function and Appearance

In some embodiments, the search field of the mapping application is another UI tool that the application uses to make the transition between the different modalities seamless. In some embodiments, a user can initiate a search by tapping in the search field.
The tap directs the application to present an animation that (1) presents an on-screen keyboard and (2) opens a search table full of invaluable completions. This table has some important subtleties. When the search field is tapped and before any terms are edited, or when the search field is empty, the table contains a list of "recents," which in some embodiments are recent searches and route directions that the user has requested. This makes it very easy to quickly bring up recently accessed results. After any edit in the search field, the table is filled with search completions both from local sources (e.g., bookmarks, contacts, recent searches, recent route directions, etc.) and from remote server suggestions. Incorporating the user's contact card into the search interface adds additional flexibility to the design. When showing recents, in some embodiments a route from the current location to the user's home is always offered, while in other embodiments that route is offered in contexts that are deemed "appropriate." Also, when the search term matches at least part of an address label (e.g., "Wo" for "Work"), in some embodiments the application presents the user's labeled address as a completion in the search table. Together, these behaviors make the search UI a very powerful way to get results onto a map from a variety of sources. In addition to allowing a user to initiate a search, the presence of the search field in the primary map view in some embodiments also allows the user to see the query corresponding to the search results on the map and to remove those search results by clearing the query.

a) Viewing Recents

As described above, the search table displays a list of recently searched terms and recently searched route directions when the search field is initially tapped and before any search terms have been provided or edited, or when the search field is empty.
Figure 20 illustrates, in four stages 2005 through 2020, a user's interaction with an application executing on the user's device in order to display a search table with a list of the user's recent searches and recent route directions. The first stage 2005 shows the device after the mapping application has opened. As described above, the mapping application's UI has a startup page that, in some embodiments, (1) displays a map of the current location of the device, and (2) has several UI controls arranged in a top bar 140 and as floating controls. In the first stage 2005, the user taps the currently empty search field 165. The top bar 140 includes the direction control 160 and the bookmark control 170. The second stage 2010 illustrates the application displaying a search table 2040 after receiving the user's tap on the search field. This search table is displayed regardless of whether or not the user provides any search terms in the search field. The search table 2040 provides a list of suggested completions, including recently searched terms and route directions. Specifically, the search table indicates that the user recently searched for "John Smith" and "Pizzeria." Each of the completions listed in the search table also indicates certain other useful information. For instance, an icon 2045 displayed next to "John Smith" indicates that this person is included in a contact list on the user's device, and "Pizzeria" is currently stored as a bookmark, as indicated by the bookmark icon 2050. The search table also lists the user's recent route directions. These include directions to "Royal Burgers," illustrated near the bottom of the search table 2040. Also, the search table 2040 lists an option for obtaining directions from the user's current location to the user's home address, illustrated as the first item of the search table 2040.
In some embodiments, a route from the current location to the user's home is always offered when recents are shown. In addition, the mapping application of some embodiments displays recent route directions only when the search field is empty. That is, once the user starts typing a search query, the mapping application no longer includes recent route directions in the list of suggested completions. The second stage 2010 also illustrates that the mapping application of some embodiments removes the direction control 160 and the bookmark control 170 from the top bar 140. The mapping application inserts a cancel control 2055 for canceling out of the search table 2040 and returning to the map view shown in the previous stage 2005. The third stage 2015 shows that the user has selected the directions to "Home" option, listed as the first item in the search table. By providing some of the user's most frequently requested searches and direction requests (including the option for directions home) at the top of the search table, the application gives the user the ability to quickly obtain information for the most common requests without having to navigate extensively through the application. The fourth stage 2020 illustrates the mapping application displaying a route corresponding to directions from the user's current location to the user's home. The mapping application of some embodiments also removes the search field 165 and the cancel control 2055 from the top bar 140 and places there a clear control 255 and a start control 2060. The start control 2060 in some embodiments is for starting navigation according to the selected route. In some embodiments, the mapping application centers the current location indicator in the display area, such that the route from the current location is displayed from the center of the display area. Users can also provide search queries in the search field.
When a user types a full search term into the search field, the mapping application of some embodiments provides a list of items that match or include the search terms typed into the search field so far. For any particular search, the user has the option to select from the list of items displayed in the search table, or the user can select the "search" button on the keyboard to execute a full search of the search term as it relates to the map and the user's current location.

b) Full Search Term

Figure 21 illustrates, in four stages 2105 through 2120, an example of the user entering a search query in the search field and executing a search. In the first stage 2105, the user taps the search field 165. This example assumes that the user has not executed any search before typing in the search query, or that the search history has been cleared by the user or by the mapping application. Therefore, the mapping application does not provide a search table with recently searched terms and directions. The next stage 2110 shows the device after the user has typed in a search term, "Pavilion". The search table 2155 illustrates a list of search completions for the query term "Pavilion" typed by the user. The listed completions include suggested searches for "Pavilion Market" and "Pavilion Theaters", as well as directions from the user's current location to "Bowling Pavilion". However, in some embodiments, the text "current location" is not displayed for the route search. Rather, as shown, the mapping application displays the address of the destination. This is because the mapping application of some embodiments assumes that a user typing letters indicates the user's intent to get to a destination matching the search query, and therefore the destination address is more useful information for the user than an indication that the route runs from the current location to the destination.
Furthermore, the list of suggested completions in the search table does not display any bookmarks or contacts, because in this example there are no items matching "Pavilion" stored locally on the user's device. In addition, because the user has not performed any recent searches or requested directions involving "Pavilion," all of the suggested completions listed in the search table in some embodiments have been obtained from a remote server, as described further below. The listed completions may also include completions matching a local geography (e.g., a street, a neighborhood, a city, a state, or a country). For instance, "Pavilion Street" or "City of Pavilion" could appear in the completion list when such a street or city exists. Moreover, when the user types part of an address (e.g., "220 Pavilion"), the remote server can retrieve the most meaningful completion for this address if the address exists (e.g., "220 Promenade Drive, Skycity, CA"). The third stage 2115 shows the user ignoring the suggested completions listed in the search table 2155 and instead selecting the "Search" button on the keyboard. The fourth stage 2120 illustrates a map with a pin 2190 for "Pavilion Theaters." In some embodiments, the mapping application adjusts the zoom level of the map, as described further below. In some such embodiments, when a location does not fall within the map at the current zoom level, the mapping application does not display a pin for that location. Therefore, a pin for "Bowling Pavilion" is not shown on the map.
The mapping application has also removed the cancel control 2055 from the top bar 140 and restored the direction control 160 and the bookmark control 170 in the top bar 140.

c) Partial Search Terms and Autocomplete

After any edit in the search field, the mapping application of some embodiments instantly fills the search table with automatic search completions. That is, as the user types a search term into the search field, the mapping application provides a list of suggested completions based on the characters that have been typed at that point in time. The mapping application of some embodiments obtains these suggested completions from local sources (e.g., bookmarks, contacts, recent searches, recent route directions, etc.) and also from a remote server. Figure 22 illustrates, in four stages 2205 through 2220, the user initiating a search query and the instantaneous display of a search table with a list of recommended completions. In the first stage 2205, the user is tapping the search field 165 to initiate a search. The second stage 2210 shows the mapping application presenting an on-screen keyboard 2250, which the user has used to type the letter "P" into the search field. Upon receiving this first letter, the application instantly presents a search table with a list of suggested completions gathered from various sources (e.g., local device suggestions and remote server suggestions). The search table will continually adjust and refine the list of suggested searches as it receives more user input (i.e., more alphanumeric characters and symbols) for the query term in the search field. In some embodiments, the mapping application adjusts and refines the list as the user provides more input even when the user misspells the word being typed. For instance, when the user types "Piza", the mapping application will display searches containing the correctly spelled word "Pizza".
The mapping application of some embodiments uses a spell-checking and correction mechanism as well as other data (e.g., search histories, etc.) to find similarly spelled words in order to generate the list of suggested completions. Each suggested completion may originate from a variety of sources, locally on the user's device or from remote sources and servers. The mapping application of some embodiments lists completions from local sources before completions from remote sources and servers. For instance, the search table illustrated in the second stage 2210 lists several completions in order from the top of the list to the bottom: "Paul", "Pizza", "Police Station", and "Promenade". "Paul" comes from a contact card on the user's device; "Pizza" comes from a previous user search stored in a search history file on the user's device; and the directions to "Police Station" come from recently searched route directions. As mentioned above, in some embodiments the text "current location" is not displayed for route searches. Rather, the mapping applications of some of these embodiments display the address of the destination. In some cases, the mapping application neither indicates that the route starts from the current location nor displays the destination's address. For instance, the directions to "Police Station" do not separately show an address, since the completion itself includes the police station's address. "Promenade", however, is a completion obtained from the remote server. The remote map server may suggest it based on search queries that other users of the map server have used from the device's current location. Therefore, "Promenade" is listed at the bottom of the search table 2255, after the three suggested completions that the mapping application obtained locally.
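The local-before-remote ordering described above can be sketched with a simple stable sort over a per-source rank. The rank values and the completion records are illustrative assumptions; the patent does not specify a concrete data structure.

```python
# Sketch of the ordering described above: completions from local sources
# (contacts, search history, recent route directions) precede remote-server
# suggestions. Source ranks are assumptions for illustration.

SOURCE_RANK = {"contacts": 0, "history": 1, "recent_routes": 2, "remote": 3}

def order_completions(completions):
    # Python's sort is stable, so ties keep their original per-source order.
    return sorted(completions, key=lambda c: SOURCE_RANK[c["source"]])

completions = [
    {"text": "Promenade", "source": "remote"},
    {"text": "Paul", "source": "contacts"},
    {"text": "Pizza", "source": "history"},
    {"text": "To Police Station", "source": "recent_routes"},
]
print([c["text"] for c in order_completions(completions)])
# ['Paul', 'Pizza', 'To Police Station', 'Promenade']
```

With these ranks, the example of stage 2210 comes out in the order described: the three local completions first, the remote "Promenade" last.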
As described further below, the mapping application of some embodiments sorts the completions it obtains locally. In some embodiments, the mapping application's suggested completions and search results are based on the current location of the device. That is, the suggested completions and search results falling within a region of the map near the device's current location are preferred. Alternatively or conjunctively, the region of the map currently shown in the display area is what the suggestions and search results of the mapping application of some embodiments are based on. In these embodiments, the mapping application favors completions and search results that fall within the currently displayed region of the map. In addition, the mapping application considers other factors when defining and adjusting the list of suggested completions. In some embodiments, the mapping application considers temporal factors. For instance, the mapping application breaks the search history (i.e., the list of previously used completions) into different periods of the day (e.g., late night, early morning, etc.), different periods of the week, of the month, and/or of the year, and favors the completions and search results belonging to the particular set of time periods into which the current search time falls. The third stage 2215 shows the user selecting "Pizza" from the list of completions displayed in the search table. The fourth stage 2220 illustrates that the mapping application now displays a map with the location of "Pizza Place" shown on the map as a banner 2290 and a pin. The user can then select various icons displayed on the banner to perform a variety of functions, including getting reviews for the restaurant, invoking navigation to the restaurant or receiving directions to the restaurant, and various other features as described further below.
The banner 2290 includes a route fetch control 2295 (illustrated as an icon showing a car) for fetching a route (e.g., a driving route) from the current location to the pin without ever leaving the map view. The route fetch control is also for initiating a navigation experience. For instance, the mapping application of some embodiments, upon receiving a selection of the route fetch control, provides one or more routes from the current location of the device to the location of the pin. When a route is selected, the mapping application can start operating in a navigation mode or in a route inspection mode. Figure 23 illustrates, in three stages 2305 through 2315, the user initiating a search query and the instantaneous display of a search table with a list of recommended completions. The three stages 2305 through 2315 of Figure 23 are similar to the stages 2205, 2215, and 2220 of Figure 22, except that the map is in 3D view. In the first stage 2305, the user is tapping the search field 165 to initiate a search. The second stage 2310 shows that after receiving a letter in the search field 165, the application instantly presents a search table with a list of suggested completions gathered from various sources (e.g., local device suggestions and remote server suggestions). The second stage 2310 also shows the user selecting "Pizza" from the list of completions displayed in the search table. The third stage 2315 illustrates that the mapping application now displays a map with the locations of "Pizza PLC1" and "Pizza PLC2" shown on the map as associated pins 2390 and 2395, respectively (banners not shown).
The user can then select various icons displayed on a banner to perform a variety of functions, including getting reviews for the restaurant, invoking navigation to the restaurant or receiving directions to the restaurant, and various other features as described further below.

d) Preference for Local Results

In order to provide some of the completions in the search table, the mapping application of some embodiments analyzes a variety of local information stored on the user's device. For instance, each user's device may contain a contact list containing several contact cards. Each contact card may contain a variety of information and labels for the contact. For instance, each contact card may contain contact labels with information for, when applicable, the contact's first and last name, company name, home address, work address, mobile phone number, work phone number, e-mail address, URL, and various other information. Likewise, the contact list may contain a specific contact card corresponding to the particular user of the mapping application, and the mapping application may designate this specific contact card as a "ME card." The mapping application may frequently access the user's ME card to utilize certain information required by certain application features, including the feature of providing directions from the user's current location to the user's home address or work address, which is offered in many different contexts within the mapping application. In particular, the mapping application of some embodiments lists completions obtained from the ME card at the top of the search table. Figure 24 illustrates, in four stages 2405 through 2420, the user entering a partial address and obtaining directions to the user's home address as derived from the user's contact card, or ME card.
In particular, Figure 24 illustrates the mapping application of some embodiments listing the user's home address at the top of the search table 2455. In the first stage 2405, the user taps the search field to begin the process of entering a search query. During the second stage 2410, the user has typed a partial number, "12", which can at least partially match an address or a search term. In some embodiments, the application first matches the search query typed by the user against the information contained in the user's ME card and the user's contact list stored on the user's device. If the application detects any contact labels matching between the search query and the user's ME card, the application of some embodiments will display the information found in the identified contact as a suggested completion listed at the top of the search table. In some embodiments, the mapping application displays the information found in the identified contact as a suggested search only when the matched information of the contact is an address. Underneath the suggested search, the mapping application displays text (e.g., "Current Location") to indicate that the route runs from the current location to the home. However, as described above, the mapping application of some embodiments may instead display the destination address, display the destination address in conjunction with displaying the text, or display neither the text nor the destination address. The application will display additional matching contact cards below the completion from the ME card. In some embodiments, the mapping application may also present suggested searches unrelated to the ME card. For instance, when the user types "12", the mapping application will present matching completions from previous local searches (including social networking site messages) along with matching completions from the remote servers.
The second stage 2410 illustrates the application automatically presenting the user's labeled home address, derived from the user's ME card, as a completion in the search table. The application has detected a match between the user's typed query, "12", and the user's ME card, which contains a home address label of "1234 A Street, Santa...". Since the match comes from the user's ME card, the application prioritizes this information over that from other local sources and displays it at the top of the suggested completions in the search table. The search table also shows other completions, including "Bob Smith", "12 Hour Fitness", and "John Doe", all derived from various local and remote sources. For instance, Bob Smith is currently stored in the contact list on the user's device. The third stage 2415 illustrates the user selecting the directions to home. The fourth stage 2420 illustrates the application displaying a map with a route corresponding to directions from the user's current location to the user's home. The application may also analyze other information or labels stored in the user's contact list or contact cards. Figure 25 illustrates, in four stages 2505 through 2520, an example of the user typing in a partial search term and obtaining directions to the user's work address as derived from the user's contact card. In particular, this figure illustrates the mapping application of some embodiments listing the directions to the work address at or near the top of the search table. In the first stage 2505, the user taps the search field 165 to begin the process of entering a search query. During the second stage 2510, the user has typed a partial search term, "Wo", which the application detects as part of the address label for "Work" or "A's Work" stored in the work label field of the user's contact card (or ME card).
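The matching in these two examples — "12" hitting the ME card's home address and "Wo" hitting the "Work" label — might be sketched as a case-insensitive prefix match over the labeled addresses of the card. The card contents, the prefix rule, and the function name are all hypothetical; the patent leaves the matching mechanism unspecified.

```python
# Illustrative sketch of matching a partial query against the labeled
# addresses on the user's ME card. The card data and the prefix-matching
# rule are assumptions for illustration.

ME_CARD = {
    "Home": "1234 A Street, Santa Cruz, CA",
    "Work": "900 B Avenue, Cupertino, CA",
}

def match_me_card(query, card=ME_CARD):
    """Return (label, address) pairs whose label or address starts with query."""
    q = query.lower()
    return [(label, addr) for label, addr in card.items()
            if label.lower().startswith(q) or addr.lower().startswith(q)]

print(match_me_card("Wo"))   # [('Work', '900 B Avenue, Cupertino, CA')]
print(match_me_card("12"))   # [('Home', '1234 A Street, Santa Cruz, CA')]
```

Matches found this way would then be placed at the top of the search table, ahead of other contacts and remote suggestions, as the text describes.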
In some embodiments, the application presents the user's labeled work address as a completion in the search table. As shown, the user's labeled work address is at the top of the search table 2555. Underneath the suggested search, the mapping application displays text (e.g., "Current Location") to indicate that the route runs from the current location to the work address. However, as described above, the mapping application of some embodiments may instead display the destination address, display the destination address in conjunction with displaying the text, or display neither the text nor the destination address. As described above, the mapping application of some embodiments displays, near the top of the list of items in the search table but below any information matched from the user's ME card, any matching information derived from the contact list on the user's device. For instance, the search table 2555 also lists, near its top but below the user's labeled work address, contact information for "Bob Woods", derived from another contact card stored in the user's contact list. The search table then lists "World Market", a suggested search provided by a remote server. The particular order in which each suggested completion appears in the search table may be derived from various ranking algorithms and heuristics that rank the strength of the relationship between the search query term and the suggested completion. One such heuristic is described further below. In some embodiments, completions originating from local sources (e.g., a contact list) generally receive a higher priority than information originating from remote servers, and these completions are displayed at the top of the search table. The third stage 2515 illustrates the user selecting from the search table the list item corresponding to the directions to "Work".
The fourth stage 2520 illustrates the application displaying a map with a route corresponding to the directions from the user's current location to the user's work.

e) Bookmarks

In some embodiments, the completions listed in the search table may also be obtained by accessing a variety of other information stored locally on the user's device. For instance, some embodiments may analyze bookmark information stored on the user's device. Each bookmark may contain various location information for places the user has marked as of interest. Figure 26 illustrates, in four stages 2605 through 2620, the user entering a partial search query and selecting a completion from the list in the search table. In the first stage 2605, the user taps the search field to begin the process of entering a search query. During the second stage 2610, the user has typed a partial search term, "Bur", into the search field. The application matches this partial query term against various local and remote suggestions. The application's matches include "Burt Smith", "Burger Palace", and directions to "Royal Burgers". For this search query, the application presents one of the user's labeled bookmarks as a suggested completion in the search table. In particular, the application has matched "Bur" to "Burger Palace", since this restaurant is currently stored as a bookmark on the user's device, as indicated by the bookmark icon 2630 next to "Burger Palace". In some embodiments, information matched against the bookmarks on the user's device may be displayed in the search table in a particular sort order. Some embodiments display bookmark-derived completions below completions from the user's contact list. However, bookmark-derived completions may still be displayed above any remote server search suggestions.
The search table in stage 2610 shows that the contact "Burt Smith" is displayed at the top of the list of suggested search completions because this completion is derived from the contact list on the user's device. Similarly, the user's bookmark for "Burger Palace" is displayed as the second item in the search table, and the directions from the current location to "Royal Burgers" are displayed at the bottom of the list. In some embodiments, the search table may define different priorities for displaying items that originate from on-device sources. For example, some embodiments may factor in the search history and the frequency with which the user selects different suggested search completions when determining the particular order in which the suggested search completions appear in the search table. For example, if the user frequently searches for and selects the bookmark for "Burger Palace" on the device, the application may display that suggested search completion at the top of the list and display the completion corresponding to the contact card for "Burt Smith" as the second item in the search table. The third stage 2615 shows the user selecting the bookmark for "Burger Palace." The fourth stage 2620 shows the pin and banner for "Burger Palace."
f) Ordering Search Completions
As described above, for any particular search query, the search table is populated with search completions from 1) local sources (e.g., bookmarks, contacts, recent searches, recent route directions, etc.) and 2) remote server sources. The particular display order of the suggested search completions in the search table is derived using a variety of heuristics. In some embodiments, the display order generally places search completions from local sources ahead of those from remote sources.
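The source-priority ordering described above can be sketched as follows. This is a hypothetical illustration, not the patent's actual ranking algorithm: the source names, the priority values, and the selection-frequency tie-breaker are all assumptions.

```python
# Hypothetical sketch of ordering suggested search completions.
# Source priorities and the frequency tie-breaker are illustrative
# assumptions, not the actual algorithm of the mapping application.
SOURCE_PRIORITY = {"me_card": 0, "contacts": 1, "bookmarks": 2,
                   "history": 3, "remote": 4}

def order_completions(completions, selection_counts):
    """Order completions first by source priority (local sources before
    the remote server), then by how often the user has previously
    selected each completion."""
    return sorted(
        completions,
        key=lambda c: (SOURCE_PRIORITY[c["source"]],
                       -selection_counts.get(c["text"], 0)))

completions = [
    {"text": "Royal Burgers", "source": "remote"},
    {"text": "Burger Palace", "source": "bookmarks"},
    {"text": "Burt Smith", "source": "contacts"},
]
ordered = order_completions(completions, {"Burger Palace": 12})
# The contact precedes the bookmark, which precedes the remote suggestion.
```

A frequently selected completion could also be promoted across sources by folding the selection count into a single combined score, as the text suggests for a user who often picks "Burger Palace."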
Figure 27 conceptually illustrates a process 2700 performed by some embodiments for determining the order in which suggested search completions from different sources appear in a search table. In some embodiments, the process 2700 is performed by a mapping application and begins when the user starts typing a search query into the search field. The process 2700 begins by capturing or receiving (at 2705) the search query typed into the search field. The process 2700 then retrieves (at 2710) matching information from one or more local sources. As noted above, in some embodiments the local sources include the user's contact list, bookmarks, search history, and recent route directions. The process 2700 can match the query term against information and tags stored in contact cards (including address tags, phone numbers, names, URLs, and other information stored in a contact card). The process can also match the search query against other local information, including bookmarks and the user's search history. Next, the process 2700 determines (at 2715) the display order of the retrieved matches from the local sources. In some embodiments, the process 2700 first sorts the matches from each local source based on certain criteria (e.g., the frequency of use of the completion, the strength of the association between the search query and the match, etc.) and selects only a certain number (for example, three) of the top matches from each source. In some embodiments, the process 2700 then sorts the retrieved matches by local source. For example, the process 2700 displays the matches in the order of ME card, contact list, bookmarks, and search history; other embodiments may use a different order. The process 2700 then displays (at 2720) the retrieved matches from the local sources in the search table according to the display order determined at 2715.
The process 2700 then receives or retrieves (at 2725) search suggestions from a remote source (e.g., a remote map server) by sending the search query to the remote source. In some embodiments, the process 2700 sends the search query to the remote source at the same time that it searches the local sources for matching information. In some embodiments, the server applies its own search ranking algorithm to identify and score particular search suggestions. The server can then send a certain number of the identified search suggestions to the mapping application, and the process 2700 can sort and filter (at 2730) the identified search suggestions using its own heuristics (e.g., the frequency of use of the completion, the strength of the association between the search query and the match, etc.). For example, in some embodiments the process 2700 may include only the top three server search suggestions in the list of suggested search completions in the search table. Next, the process 2700 displays (at 2735) the matches from the remote source below the matches from the local sources. The process 2700 then ends. Some embodiments present the local search suggestions immediately and adjust the list to include the remote server suggestions when those suggestions arrive, which gives the search procedure a near-instant feel. For example, if the search table provides enough screen space to list ten search completions or suggestions, the process may initially list up to ten suggestions from local sources (e.g., bookmarks, contacts, search history). The process can then replace some of the local search suggestions with the suggestions received from the server once those suggestions arrive over the network.
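The local-first display behavior described above can be sketched roughly as follows. The list capacity of ten and the allowance of three server slots are illustrative assumptions taken from the example in the text, not fixed parameters of the application.

```python
def merge_remote(local, remote, capacity=10, remote_slots=3):
    """Show local completions immediately; once server suggestions
    arrive, replace the tail of the displayed list with up to
    `remote_slots` server suggestions, keeping local items on top.
    Capacity and slot counts are illustrative assumptions."""
    remote_shown = remote[:remote_slots]
    keep = min(len(local), capacity - len(remote_shown))
    return local[:keep] + remote_shown

# Before the server responds, the table is filled from local sources:
initial = merge_remote([f"local-{i}" for i in range(10)], [])
# When server suggestions arrive over the network, they replace the tail:
updated = merge_remote([f"local-{i}" for i in range(10)],
                       ["srv-a", "srv-b", "srv-c", "srv-d"])
```

Calling the function again whenever new server results arrive models the continual updating of the list described in the text.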
The process 2700 can continually update the displayed list of search suggestions when information is received from the server that is considered more important than the information currently listed in the search table.
g) Displaying Search Results as a List
Figure 28 illustrates an example of the mapping application of some embodiments executing on a device 2800 (e.g., a tablet such as the iPad® sold by Apple, Inc.) that has a relatively large display area compared with smaller devices (e.g., a smart phone such as the iPhone® sold by Apple, Inc.). In particular, Figure 28 illustrates, in four different stages 2805 through 2820, the interaction of a user with the mapping application to display a list of search results on a device having a relatively large display area. The first stage 2805 shows that the mapping application has a top bar 2806 that includes a set of controls, including a search field 2830 and a list view control 2835 for displaying search results as a list. In some embodiments, the mapping application displays the list view control 2835 in the top bar 2806 when a search based on a search query has been completed, as shown. In other embodiments, the mapping application places the list view control 2835 in the search field 2830. In still other embodiments, the mapping application introduces the list view control 2835 by sliding it out from under the 3D control 2890. In some such embodiments, the mapping application displays the list view control 2835 whenever the mapping application displays search results on the map. In this example, the first stage 2805 shows that the mapping application has performed a search using "Pizza" as the search query. The mapping application displays the search results as two pins 2840 and 2845 in the map view.
The mapping application of some embodiments also displays an information banner 2846 for one of the two pins, indicating that the point of interest (POI) represented by that pin is the top suggested result. The first stage 2805 also shows the user selecting the list view control 2835. The second stage 2810 shows the mapping application displaying a list 2850 of the POIs that the mapping application has found using the search query. In this example, the list has three POIs. The first two POIs correspond to pins 2845 and 2840, respectively. The mapping application does not display a pin corresponding to the third POI, "Pizza Planet," because the third POI is not located within the region of the map that includes the first two POIs. In some embodiments, when a POI selected from the list is not within the region of the currently displayed map, the mapping application shifts the map to display another region. The third stage 2815 shows the mapping application receiving a selection of the third entry. The fourth stage 2820 shows that the mapping application has shifted the map to display another region of the map that includes the pin 2855 corresponding to the third POI. In some embodiments, the mapping application displays an animation for a duration to show that the map is being shifted to another region. In some embodiments, the mapping application displays an information banner for the third POI because the pin for that POI is the only pin in the displayed map region.
h) Software Architecture
Figure 29 illustrates an example architecture of a mapping application that provides a list of suggested search completions based on search queries. In this example, the mapping application 2900 of some embodiments executes on a device 2905.
As shown, the mapping application 2900 includes a search completion manager 2910, a local source manager 2915, a remote source manager 2920, a list manager 2925, a search query parser 2930, a recent search completions repository 2935, and a recent route directions repository 2940. This figure also illustrates a search and map server 2960. The mapping application 2900, the device 2905, and the search and map server 2960 can each have numerous other modules that are not depicted in this figure for simplicity of discussion. The search query parser 2930 receives a search query entered by the user through one of the input managers (not shown) of the device 2905. The query parser 2930 sends the parsed query to the search completion manager 2910 so that the search completion manager 2910 can generate search requests for the remote source manager 2920 and the local source manager 2915. The search query parser 2930 also receives touch input on a search field (e.g., the search field 165) and notifies the search completion manager of the input so that the search completion manager can retrieve recent search completions and recent route directions from the recent search completions repository 2935 and the recent route directions repository 2940. When the search completion manager 2910 receives a notification from the search query parser that the user has touched the search field while the search field is empty, the search completion manager 2910 looks up the recent search completions repository 2935 and the recent route directions repository 2940. In some embodiments, the search completion manager retrieves the search completions and route directions used within a certain period of time (e.g., hours, days, weeks, etc.) prior to receiving the touch input. The search completion manager 2910 also directs the remote source manager 2920 and the local source manager 2915 to find search completions based on the parsed search query.
The search completion manager 2910 then receives the search completions and route directions returned by the remote source manager 2920 and the local source manager 2915. The search completion manager collects the search completions and route directions received from the remote source manager 2920 and the local source manager 2915 and filters out any duplicate completions and directions. The search completion manager then sends these completions and directions to the list manager 2925. The list manager 2925 sorts the search completions and route directions based on certain criteria. As mentioned above, these criteria include the time at which the search completions and route directions were used and whether the completions and directions originate from local sources or remote sources. The list manager 2925 then passes the sorted list to one of the display managers (not shown) of the device 2905 so that the list can be displayed to the user. The search completion manager also forwards search requests (i.e., a full search query selected from the list of search completions, or the search query entered in the search field when the user selects the enter or search control) and selected route directions to a search request manager (not shown), which uses the search requests to perform searches or compute routes. The search completion manager 2910 stores the search requests (i.e., the queries actually searched) and the selected route directions (i.e., the start and destination locations) in the recent search completions repository 2935 and the recent route directions repository 2940. The recent search completions repository 2935 and the recent route directions repository 2940 are memory spaces that store recently used search requests and route directions that have been used to compute routes. In some embodiments, the two repositories are caches that provide fast access.
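The duplicate filtering that the search completion manager performs on the merged results might look like the following sketch. Keeping local items first and matching case-insensitively are assumptions on our part; the patent only states that duplicates are filtered out.

```python
def collect_completions(local, remote):
    """Collect completions from the local and remote source managers,
    keeping local items first and dropping later duplicates.
    Case-insensitive matching is an illustrative assumption."""
    seen = set()
    collected = []
    for item in local + remote:
        key = item.lower()
        if key not in seen:          # keep only the first occurrence
            seen.add(key)
            collected.append(item)
    return collected
```

The de-duplicated list would then be handed to the list manager 2925 for sorting.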
The local source manager 2915 looks in the contacts repository 2950 and the bookmarks repository 2955 for contacts (e.g., the ME card) and bookmarks that at least partially match the parsed search queries received from the search completion manager 2910. The local source manager 2915 then generates search completions based on the matched contacts and bookmarks and returns these search completions to the search completion manager 2910. The contacts and bookmarks stored in the repositories 2950 and 2955 are generated, maintained, and/or accessed by applications executing on the device 2905, including the mapping application 2900. The remote source manager 2920 sends the parsed search query received from the search completion manager 2910 to one or more servers (not all shown), including the search and map server 2960. The remote source manager 2920 receives the search completions and/or route directions that the server 2960 returns in response to the search query. The remote source manager 2920 then sends the completions and route directions to the search completion manager 2910. As shown, the search and map server includes a search completions repository 2965 and a route directions repository 2970. In some embodiments, the search and map server stores in the repositories 2965 and 2970 the search requests and the route directions used to compute routes. The search and map server 2960 receives these search requests and route directions from different devices (including the device 2905) that execute the mapping application of some embodiments (such as the mapping application 2900). The search and map server then generates suggested search completions and route directions and sends them to the device 2905. In some embodiments, the search and map server includes two servers that respectively serve map data and generate routes.
The search completions repository 2965 and the route directions repository 2970 of some embodiments are data storage structures used to store search requests and route directions.
2. Clearing Search Results via the Search Field
In addition to allowing a user to initiate a search, in some embodiments the presence of the search field in the primary map view also allows the user to see the query corresponding to the search results on the map and to remove those search results by clearing the query. Figure 30 illustrates, in three stages 3005 through 3015, the user clearing results from a map 3000. The first stage 3005 illustrates the map displaying a pin 3025 for "Pizza Place." This pin 3025 may have been placed on the map 3000 through a variety of different mechanisms. For example, the user may have dropped the pin on the map and received a reverse lookup, or the user may have typed a search query such as "Pizza Place 321." The second stage 3010 illustrates the user selecting the "X" button 3030 in the search field 165 to clear the search query displayed in the search field 165. In addition, when the search query is cleared, all search results (pins) associated with the displayed search query are also cleared from the map 3000. The third stage shows that after the user selects the "X" button 3030, the search field 165 is now empty and the pin for "Pizza Place" is no longer displayed on the map. Figure 31 illustrates, in four stages 3105 through 3120, the interaction of the user with the application executing on the user's device to clear a selected search result displayed on the map. The first stage 3105 shows the device after the mapping application has been opened. The second stage 3110 illustrates the application displaying the search table 3140 after receiving the user's tap on the search field.
This search table is displayed regardless of whether the user has provided any search terms in the search field. The search table 3140 provides a list of suggested search completions, including recently searched terms and route directions. In particular, the search table indicates that the user has recently searched for "John Smith" and "Pizzeria." The search table also lists the user's recent route directions, including the directions to "Royal Burgers" shown at the bottom of the search table 3140. In addition, the search table 3140 lists an option to obtain directions from the user's current location to the user's home address, illustrated as the first item of the search table 3140. The top bar 140 includes a direction control 160 and a bookmark control 170. The second stage 3110 illustrates the user selecting the directions to "Home." The second stage 3110 also illustrates that the mapping application of some embodiments removes the direction control 160 and the bookmark control 170 from the top bar 140 and inserts a cancel control 2055. The third stage 3115 illustrates the mapping application displaying a route corresponding to the directions from the user's current location to the user's home. As shown, the route has two pins marking the start and end points of the route. The mapping application of some embodiments also removes the cancel control 2055 from the top bar 140 and places a clear control 255 and a start control 2060 there. The third stage 3115 also illustrates the selection of the clear control 255. The fourth stage 3120 shows that the search field 165 is now empty and that the pins for the start and end points of the route are no longer displayed on the map, because the mapping application of some embodiments removes them from the map upon receiving the selection of the clear control 255.
B.
Zoom Level Settings for Displaying Search Results on the Map
When the user is viewing the map in a particular view and performs a search query, some embodiments transition to a new map view that contains the search results of the user's query. A particular type of transition may include continuously adjusting the map zoom level, and may include displaying an animation between the original map view and the new map view. The application considers a number of factors when deciding on the particular type of transition and whether to provide an animation between the different map views. These factors may include the distance between the different map views at a given zoom level, the data available for providing an animation between the map views, the data bandwidth capabilities of the user's Internet connection, and various other factors. Figure 32 illustrates a process 3200 performed by some embodiments for determining the particular type of transition to display between a user's current map view and a target map view that contains the search results of the user's search query. In some embodiments, the process 3200 is performed by a mapping application and begins when the mapping application generates search results based on the user's search query. The process 3200 begins by capturing (at 3205) the search results. The process then defines (at 3210) the original region and the target region. In some embodiments, the process 3200 considers the map currently being displayed to the user and defines this map display as the current map view with the original map region. The process 3200 then determines a proposed target map view with a target map region, which is the map view that needs to be displayed to the user to provide an optimal view showing some or all of the search results. In some embodiments, the process 3200 initially defines (at 3210) the original region and the target region at the same zoom level.
In some such embodiments, the process 3200 initially maintains the zoom level of the original region and sets the zoom level of the target region to that of the original region. The process 3200 also sets the orientation of the target region to the orientation of the original region. Moreover, different embodiments position the target region in different ways. For example, in some embodiments the process 3200 defines the target region to include at least one search result. Also, the process 3200 of some embodiments defines the target region by computing the average coordinates of the search results and setting the center of the target region to those average coordinates. Next, the process 3200 determines (at 3215) whether the original region and the target region at least partially overlap. When the process 3200 determines (at 3215) that the two regions at least partially overlap, the process 3200 proceeds to 3225, which is described further below. When the process 3200 determines (at 3215) that the original region does not overlap the target region, the process 3200 determines (at 3220) whether the two regions are separated by more than a threshold distance. In some embodiments, the process 3200 dynamically computes this threshold distance based on the current zoom level of the original region and the target region. For example, the computed threshold is inversely proportional to the zoom level; that is, the more the regions are zoomed in, the shorter the computed threshold distance. When the process 3200 determines (at 3220) that the two regions are separated by more than the threshold distance, the process 3200 displays (at 3230) the target region without displaying an animation from the original region to the target region. Otherwise, the process displays (at 3235) an animation from the original region to the target region. Different embodiments use different animation techniques.
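The distance test for non-overlapping regions described above might be sketched as follows. The base constant and the exact inverse-proportional form of the threshold are assumptions for illustration; the patent states only that the threshold is inversely proportional to the zoom level.

```python
def choose_transition(screen_distance, zoom_level, base=5000.0):
    """Decide between an immediate jump and an animated transition for
    two non-overlapping regions. The threshold distance is inversely
    proportional to the zoom level: the more zoomed in the view, the
    shorter the threshold. `base` is an illustrative assumption."""
    threshold = base / zoom_level
    return "jump" if screen_distance > threshold else "animate"

# Zoomed far in (level 20), a modest distance exceeds the threshold:
decision_far = choose_transition(screen_distance=400, zoom_level=20)
# Zoomed out (level 2), the same distance falls under the threshold:
decision_near = choose_transition(screen_distance=400, zoom_level=2)
```

This captures why the Cupertino-to-Washington example below jumps without animation: at a detailed zoom level, the screen distance far exceeds the computed threshold.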
For example, in some embodiments the process 3200 transitions from the original region to the target region using a cross-fade of the original region and the target region. In some embodiments, the process 3200 can transition from the original region to the target region as if the viewpoint of a virtual camera overlooking the original region were moving to the target region. When the process 3200 determines (at 3215) that the original region at least partially overlaps the target region, the process determines (at 3225) whether to modify the target region. Operation 3225 is described in more detail below by reference to Figure 33. The process 3200 then determines (at 3240) whether to display an animation to the target region. Operation 3240 is described in more detail below by reference to Figure 34. Figure 33 illustrates a process 3300 performed by some embodiments for determining whether to modify the target region when the target region at least partially overlaps the original region initially defined by the mapping application. In some embodiments, the process 3300 is performed by a mapping application. The process 3300 begins by determining (at 3305) whether (1) the original region includes any search results and (2) the zoom level of the original region is less than a threshold zoom level (i.e., the original region is not zoomed out beyond a threshold zoom level). When the process determines (at 3305) that the original region does not include any search results or that the zoom level of the original region is not less than the threshold zoom level, the process 3300 proceeds to 3315, which is described further below. When the process determines (at 3305) that the original region includes at least one search result and that the zoom level of the original region is less than the threshold zoom level, the process 3300 uses (at 3310) the original region as the target region. The process 3300 then proceeds to 3335, which is described further below.
When the process determines (at 3305) that the original region does not include any search results or that the zoom level of the original region is not less than the threshold zoom level, the process 3300 determines (at 3315) whether the original region includes any search results. When the process 3300 determines (at 3315) that the original region includes at least one search result, the process 3300 proceeds to 3335, which is described further below. When the process 3300 determines (at 3315) that the original region does not include any search results, the process 3300 expands (at 3320) the original region to include at least one search result and uses the expanded original region as the target region. Different embodiments expand the original region in different ways. For example, in some embodiments the process 3300 expands the original region in all directions from its center to include at least one search result, while in other embodiments the process 3300 does not expand in all directions from the center of the original region. In some such embodiments, the process 3300 expands in the direction of the search result that is closest to the boundary of the original region. Next, the process 3300 determines (at 3325) whether (1) the most important result is outside the target region and (2) all of the search results inside the target region are significantly less important. Different embodiments evaluate the importance of search results in different ways. For example, some embodiments quantify the closeness of a search result to the search query and use the quantified closeness to determine importance. In particular, the process 3300 of some embodiments may treat the closest-matching search result as the most important search result. Other embodiments use other techniques to assess the importance of search results.
Moreover, when the difference between the quantified closeness of two search results is greater than a margin, the process 3300 of some embodiments considers one search result to be significantly less important than the other. When the process 3300 determines (at 3325) that the most important result is outside the target region and all of the search results inside the target region are significantly less important, the process 3300 expands (at 3330) the target region by a certain amount to include one or more additional search results. The process 3300 then loops back to 3325 to make another determination of whether the target region should be expanded further. When the process 3300 determines (at 3325) that the most important result is not outside the target region, or that not all of the search results inside the target region are significantly less important, the process 3300 further expands (at 3335) the target region as necessary to ensure that the target region can accommodate any UI items associated with the search results (for example, the information banners). The process 3300 then ends. Figure 34 illustrates a process 3400 performed by some embodiments for determining whether to display an animation from the original region to the target region when the target region at least partially overlaps the original region initially defined by the mapping application, after the target region has been considered for modification. In some embodiments, the process 3400 is performed by a mapping application. The process 3400 begins by determining (at 3405) whether the zoom levels of the original and target regions differ by more than a first margin. In some embodiments, the first margin represents a larger difference between the zoom levels of the original region and the target region; in this case, the zoom levels of the original region and the target region are considered significantly different.
When the process 3400 determines (at 3405) that the zoom levels are significantly different, the process 3400 displays (at 3410) the target region without displaying an animation from the original region to the target region. When the process 3400 determines (at 3405) that the zoom levels are not significantly different, the process 3400 determines (at 3415) whether the zoom levels differ by more than a second margin. In some embodiments, the second margin represents a smaller difference between the zoom levels of the original region and the target region. When the difference between the zoom levels falls between the smaller margin and the larger margin, the zoom levels of the original region and the target region are considered moderately different. When the process 3400 determines (at 3415) that the zoom levels of the original and target regions are moderately different, the process 3400 displays (at 3420) an animation from the original region to the target region. When the process 3400 determines (at 3415) that the zoom levels of the original and target regions are neither moderately nor significantly different, the process 3400 determines (at 3425) whether the original region includes all of the search results. When the process 3400 determines (at 3425) that the original region includes all of the search results, the process ends. Otherwise, the process 3400 proceeds to 3430 to determine whether displaying the animation would make more search results visible. In some embodiments, the process 3400 examines the animation to see whether any additional search results would appear while the animation is displayed. When the process 3400 determines (at 3430) that displaying the animation would make more search results visible, the process 3400 displays (at 3420) an animation from the original region to the target region. Otherwise, the process 3400 ends.
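The zoom-margin logic of Figure 34 can be condensed into the following sketch. The concrete margin values are illustrative assumptions; the patent leaves the margins unspecified.

```python
def animation_decision(zoom_diff, first_margin=4.0, second_margin=1.0,
                       original_has_all_results=False,
                       animation_reveals_results=False):
    """Sketch of process 3400: significantly different zoom levels
    suppress the animation (3405/3410); moderately different levels
    show it (3415/3420); otherwise animate only if doing so would
    make more search results visible (3425/3430). Margin values are
    illustrative assumptions."""
    if zoom_diff > first_margin:        # significantly different
        return "no animation"
    if zoom_diff > second_margin:       # moderately different
        return "animate"
    if original_has_all_results:        # nothing more to reveal
        return "no animation"
    return "animate" if animation_reveals_results else "no animation"

d_large = animation_decision(6.0)
d_moderate = animation_decision(2.5)
d_small = animation_decision(0.5, animation_reveals_results=True)
```

With these assumed margins, a six-level zoom change jumps directly, a 2.5-level change animates, and a small change animates only when the animation would bring additional results into view.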
Figure 35 illustrates, in four stages 3505 through 3520, a situation in which the application displays the target map region containing the corresponding search results without providing any animation between the current map view and the target map view. The first stage 3505 illustrates an original region of the map showing, in this example, Cupertino, California. The map is displayed at a particular zoom level that shows various highways. The user is also adjusting the map (via a gesture input, moving the thumb and index finger in an outward direction) to zoom in to a more detailed view. The second stage 3510 illustrates that the map is now at a more detailed zoom level (i.e., zoomed in), displaying individual streets including "First Street," "Main Street," and "Second Street." The user also taps the search field 165 to initiate a search. The third stage 3515 shows the user typing the search query "Smithsonian" into the search field and selecting "Smithsonian Museum, Washington, DC" from the list of suggested search completions in the search table 3555. When this completion is selected, the application immediately displays a map of Washington, DC without any animation. In this example, because Cupertino, CA and Washington, DC are separated by a considerable screen distance for the current map view at the particular zoom level, the application immediately jumps to the map of Washington, DC without providing any animation. For this search, the application has determined that the screen distance between the map region displayed in stage 3510 and the target map region required to display Washington, DC is greater than a certain threshold and that, therefore, providing an animation at the given zoom level is not reasonable or feasible.
In some embodiments, when the target map area is too far from the currently displayed map area or the user's current location (e.g., a distance of more than a few hundred miles or thousands of miles), the mapping application displays a message (for example, "Did you mean XYZ place in location A...?") to ask the user whether he or she really wants to search for the distant target area. Alternatively or in conjunction, the mapping application can present the message to the user by voice (e.g., by reading the message aloud). In some embodiments, the mapping application does not provide search results until the user responds to the message. In some embodiments, in addition to the message, the mapping application also provides alternative search results. For example, the mapping application can provide a list of search results (e.g., "Smith's Onion, Cupertino, CA") that can be found in or near the currently displayed area, or can provide search results from performing a search with a query similar to the original one but relating to an area closer to the currently displayed area. If the user selects an alternative result, the mapping application displays the search results on a region of the map that is closer to the currently displayed region. Figure 36 illustrates, in four stages 3605 through 3620, a situation in which the application detects the search results within the original current map view and therefore does not have to zoom the map or display an animation to any new target map area. The first stage 3605 illustrates the user viewing the map 3630 of Cupertino, California. The map 3630 is at a particular zoom level that shows various highways. The user is also adjusting the map to zoom in to a more detailed view, with the user's thumb and forefinger moving in the outward direction. The second stage 3610 shows that the map is now at a more detailed zoom level, and the displayed individual streets include "First Street", "Main Street" and "Second Street".
The user also taps the search field to initiate a search. The third stage 3615 shows that the user types the search query "Coffee Shop" into the search field and selects the coffee shop located on "First Street" from the suggested search completions list in the search table. When the coffee shop is selected, the application displays the same current map view that the user was viewing before the search request, as shown in the fourth stage 3620. Since the search result for the coffee shop located on First Street can be viewed in the user's current map view, the application does not have to adjust the zoom settings or provide any animation to display the target map area. The application has set the target map area with the relevant search results to be the current map area and, in this case, avoids transitions between views of different areas of the map. Figure 37 illustrates, in four different stages 3705 through 3720, the mapping application of some embodiments zooming out the current map area view to present a number of search results found based on a search query. The first stage 3705 illustrates the mapping application after the user has typed the search query "Tech Companies Cupertino". The search table 3755 displays a list of search results that includes a previous search query 3760. As shown, the user is selecting the search query 3760 from the search table 3755. The mapping application of some embodiments stores the search results of previous searches performed with the mapping application. The second stage 3710 displays a map of the current location at a detailed zoom level, in which the displayed individual streets include "First Street", "Main Street" and "Second Street". Since the search results (the tech companies in Cupertino) are not located within the original current map area of the current map view, the application of some embodiments expands the map view such that the target area includes all of the search results.
The mapping application of some embodiments also determines that an animation of the map view from the current map area to the target area is required because the zoom levels of the current area and the target area are significantly different. The third stage 3715 illustrates the map at a different zoom level. The mapping application displays the map at this zoom level only briefly, as part of the animation that the mapping application displays in order to zoom the map out to a zoom level for displaying a target area larger than the original area. In some embodiments, the mapping application displays a 2D/3D transition as part of the animation. The fourth stage 3720 illustrates the map at the zoom level of the target area. That is, the mapping application has completed displaying the animation from the original area displayed at the second stage 3710 to the target area. In some embodiments, the mapping application changes the duration of the animation for transitioning from the original area to the target area based on the amount of change involved in the transition. For example, the mapping application of some embodiments animates the transition for a short duration when the original area is not far from the target area or when the original area overlaps the target area. When the distance between the two areas is relatively large (e.g., hundreds of miles), the mapping application displays a longer animation. In some such embodiments, the mapping application may not display an animation at all when the distance between the two areas is extremely large (e.g., thousands of miles).

III. Controls for previewing items on the map

Some embodiments of the invention provide a novel user interface for presenting different types of detailed information about a point of interest (POI). This user interface is referred to as a "detailed information screen" in the description above and below.
In some embodiments, the detailed information screen includes a display area for displaying images of the POI and several index tabs under which different types of information about the POI are grouped and presented to the user. The mapping application of some embodiments provides several different ways to display the detailed information screen for a POI. As described above, the mapping application of some embodiments displays a banner above each of the pins displayed as search results. The user can select the banner of a POI to open the detailed information screen for that POI. The mapping application also allows the user to open the detailed information screen for a POI by selecting the POI from a list of POIs that the mapping application of some embodiments presents for the search results of a search query. The mapping application likewise allows the user to open the detailed information screen after placing a pin at a location. In addition, the mapping application allows the user to open a detailed information screen for the current location. Figure 38 conceptually illustrates a GUI 3800, which is the "detailed information screen" of a selected POI. In particular, Figure 38 illustrates, in six different stages 3805 through 3830, a mapping application of some embodiments that displays a 3D animation of the POI in the media display area 3835 of the GUI 3800. The figure illustrates that the GUI 3800 includes a media display area 3835, index tabs 3840, an information display area 3845, and a top bar 3850. The media display area 3835 of some embodiments is for displaying different media of the POI. In some embodiments, when the GUI 3800 is launched, the mapping application initially displays a 3D animation of the POI. For example, when the POI is a building, the mapping application displays an animated 3D view of the building and the building's surroundings.
In some embodiments, the mapping application displays the building as if it were viewed from a camera mounted on a helicopter hovering around the top of the building. Different embodiments produce the 3D animation (a 3D video presentation) in different ways. For example, the 3D animation may be a video clip captured by a video capture device on a manned or unmanned vehicle (e.g., a satellite, space shuttle, airplane, helicopter, etc.) that orbits the earth along a track or flies at a lower altitude. In some embodiments, the mapping application generates the 3D video presentation by performing a blending operation (e.g., a three-dimensional perspective blending operation) on several images of a particular location captured by flying objects (such as helicopters, airplanes, satellites, etc.). Such images may be still images or images from a portion of a video clip captured by such objects. In some embodiments, the 3D rendering operation produces a video clip from the images by transitioning through the plurality of images over a set amount of time. In some embodiments, this transition generates multiple video frames by capturing, at different times, different subsets of the location from different perspective rendering positions in a 3D scene. In some embodiments, the mapping application generates the 3D video presentation by moving a virtual camera above and around the POI (e.g., a building) and its surroundings in a 3D immersive map view or in a low-flying overhead view. For example, the mapping application can move the virtual camera as if the virtual camera were capturing the POI and the surrounding environment from a flying object circling around the top of the building. The virtual camera and the 3D immersive map view are described in detail in the above-incorporated U.S. Patent Application Serial No. 13/632,035.
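The virtual-camera movement described above, circling around the top of a building, could be sketched as follows; the radius, height, and frame count are illustrative assumptions:

```python
import math

# Sketch of moving a virtual camera counterclockwise around the top of a POI,
# as if mounted on a helicopter circling the building. The center, radius,
# height, and frame count used here are illustrative assumptions.

def orbit_positions(center_x, center_y, radius, height, frames):
    """Return (x, y, z) camera positions for one counterclockwise loop."""
    positions = []
    for i in range(frames):
        angle = 2 * math.pi * i / frames   # counterclockwise in the x-y plane
        positions.append((center_x + radius * math.cos(angle),
                          center_y + radius * math.sin(angle),
                          height))
    return positions
```

Each frame of the animation would render the scene from the next position in the loop while aiming the camera at the building, producing the hovering effect the stages of Figure 38 depict.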
When the data for the animated 3D view of the POI is not available (e.g., the data is not available in the map server or another storage), the mapping application of some embodiments finds the next available type of image to display in the media display area 3835. For example, the mapping application can display a satellite view of the POI. When the data for the animated 3D view of the POI is available but it takes some time to obtain the necessary data for displaying the animated 3D view (e.g., because of a slow network connection from the device to the source of the necessary data), the mapping application of some embodiments identifies the next available type of media for the POI and displays that media in the media display area 3835 first. For example, the mapping application of some embodiments displays a satellite image of the POI in the media display area 3835. In some embodiments, to provide an animation effect, the mapping application rotates the satellite image of the POI (e.g., clockwise) rather than statically displaying the 2D satellite image. When sufficient data for displaying the animated 3D view has been obtained, the mapping application of some embodiments switches from displaying the satellite image of the POI to displaying the animated 3D view of the POI. Different embodiments of the mapping application use different effects to make this switch. For example, the mapping application of some embodiments cross-fades the 2D satellite image of the POI into the animated 3D view. In other embodiments, the mapping application can use the Ken Burns effect to transition from the satellite imagery to the 3D animated view. In some embodiments, the mapping application determines the type of media of the POI to display initially in the media display area 3835 based on the type of the selected POI. For example, when the POI is a restaurant, the mapping application of some embodiments initially displays images of dishes served by the restaurant or images of the restaurant's interior.
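The fallback behavior just described, showing a lower-level image while richer data loads, might be sketched as follows; the level names and the availability/readiness interface are assumptions, and the process 4200 described below covers this behavior in more detail:

```python
# Sketch of the image-level fallback: prefer the richest available image
# level and show a lower level while the richer one is still downloading.
# Level names and the availability/readiness interface are assumptions.

LEVELS = ["animated_3d", "satellite", "general_map"]  # richest to plainest

def choose_display(available, ready):
    """Return (shown_now, still_loading) for the given POI image levels.

    `available`: levels the server has data for; `ready`: levels whose data
    has been fully downloaded. Falls back to a plain background image when
    nothing is ready yet.
    """
    # Find the richest level the server can supply at all.
    current = next((lvl for lvl in LEVELS if lvl in available), None)
    if current is None:
        return ("background", None)
    if current in ready:                       # sufficient data obtained
        return (current, None)
    # Otherwise show the best ready lower level while downloading continues.
    for lvl in LEVELS[LEVELS.index(current) + 1:]:
        if lvl in ready:
            return (lvl, current)
    return ("background", current)
```

For example, `choose_display({"animated_3d", "satellite"}, {"satellite"})` yields the satellite image while the 3D animation loads, matching the switch-over described above.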
When displaying such images, the mapping application of different embodiments uses different effects to display the different images. The different effects that the mapping application can use include the Ken Burns effect, vignetting effects, cross-fades, tilts, slide shows, and more. The mapping application of some embodiments overlays informational text on the media displayed in the media display area 3835. In some embodiments, the informational text is displayed toward the left side of the media display area 3835, as shown in the third stage 3815. However, the location of the informational text can be anywhere (e.g., the center) in the media display area 3835. In some embodiments, the mapping application applies different effects to the portion of the media display area that the informational text overlays in order to keep the text legible. For example, the mapping application can change the color of the portion or blur the portion. In other embodiments, the mapping application does not modify the image to make the informational text legible. Instead, the mapping application adjusts the informational text to keep it legible as the overlaid portion of the image changes. The mapping application of some embodiments may also switch away from the type of media that was initially displayed when the GUI 3800 was launched. For example, when the user selects an entry from among the entries displayed under a "Media" index tab (not shown), the mapping application displays the image associated with the selected entry in the media display area 3835, or plays the video associated with the selected entry in the media display area 3835. The index tabs 3840 are tabs for displaying different sets of entries grouped by the different types of information associated with the different tabs. In some embodiments, as shown, the GUI 3800 initially includes an "Info" index tab, a "Reviews" index tab, and a "Photos" index tab.
When the "Info" index tab is selected, the mapping application displays entries related to overall information about the POI in the information display area 3845. As shown, the overall information about the POI includes a phone number, a URL for the POI, an address, and the like. When the "Reviews" index tab is selected, the entries to be displayed include all of the reviews collected by information gathering entities (e.g., Yelp, Facebook, Twitter, etc.) and supplied to the mapping application. Similarly, when the "Photos" index tab is selected, the entries to be displayed include photos collected by the information gathering entities. The index tabs and the entries displayed in the information display area 3845 are described in more detail below. The top bar 3850 of some embodiments includes a "previous page" button 3895 for returning to the state prior to launching the GUI 3800. When a map with the search results was displayed before the GUI 3800 was displayed, the "previous page" button 3895 instructs the mapping application to return to displaying the map with the search results. When a list of POIs was displayed before the GUI 3800 was displayed, the "previous page" button 3895 instructs the mapping application to return to displaying the list. The operation of the GUI 3800 will now be described. In the first stage 3805, as a result of typing the search query "Famous Building", the mapping application displays a pin 3860 and a banner 3865. In the next stage 3810, the user selects the arrow 3875 to launch the "detailed information screen" for this POI. In the third stage 3815, the GUI 3800 has been launched. The mapping application displays the initial collection of components of the GUI 3800, which includes the top bar 3850, the media display area 3835, the index tabs 3840, and the information display area 3845.
In the media display area 3835, the mapping application displays an animated 3D view of the famous building and the other buildings in its vicinity. The mapping application also displays informational text about the famous building toward the left side of the media display area 3835. The displayed information includes the name of the building, the address, the star rating, and the number of reviews, as shown in the media display area 3835. Behind the text, the mapping application initially displays an image of the famous building. As shown, the building appears faded because the mapping application has faded the portion of the image that appears behind the informational text to make the text more prominent and legible. At this stage 3815, the information display area 3845 displays overall information about the building (e.g., a phone number, the building's URL, an address, etc.) because the "Info" index tab is, in some embodiments, the default index tab selection. The next stage 3820 shows that the virtual camera's viewpoint has changed such that the famous building appears next to the displayed informational text. Some other nearby buildings are displayed behind the informational text. The virtual camera's viewpoint has also begun to circle around the top of the famous building. The next stage 3825 shows that the virtual camera has moved to another side of the famous building. As shown, the "FMS BLDG" sign displayed on one side of the building now appears toward the southwest of the media display area 3835, showing that the virtual camera has moved counterclockwise relative to the top of the building. The "H" in the circle at the top of the building also appears to have rotated from the previous stage 3820. In addition, because the viewpoint of the virtual camera has changed, different neighboring buildings are displayed behind the informational text. The next stage 3830 shows that the virtual camera has moved to yet another side of the famous building.
The virtual camera continues to move and is now on a side of the building from which "FMS BLDG" is no longer visible. The "H" in the circle at the top of the building appears to have rotated further. The mapping application of some embodiments continues to display this animated 3D view (i.e., the virtual camera continues to hover over the top of the building) until the user provides additional input (e.g., an input to close the GUI 3800, a selection of a photo from the entries under the "Photos" index tab, etc.). In some embodiments, the mapping application uses the 3D rendering operation to generate 3D presentations in its other modes of operation. For example, in some embodiments, the mapping application uses this operation to render a 3D video presentation whenever the user performs a search for a particular location or specifies other search criteria, or whenever the user explores a location on the map. Figure 39 illustrates, in five different stages 3905 through 3925, one instance in which the mapping application uses the 3D rendering operation to present a particular search result. The first stage 3905 illustrates the mapping application displaying the search table 3940 after receiving the first letters of the search query typed by the user in the search field 165. As shown, the search table 3940 includes several search completions, including "X Corp.", "X Station 555..." and "Promenade X". The first stage 3905 also shows that "X Corp." is selected. The second stage 3910 illustrates a 3D map view, as indicated by the highlighted 3D control 150. In some embodiments, the mapping application automatically displays the 3D animation of the POI when the user views the POI in 3D mode. The mapping application of some of these embodiments still allows the user to rotate the map view (e.g., with a two-finger gesture).
In some embodiments, the mapping application begins rendering the 3D animation of the POI after a certain amount of time (e.g., a few seconds) passes without receiving input from the user. The second stage 3910 also illustrates that the mapping application begins to render the 3D animation of the building. As shown, the 3D animation is showing sides 1 and 2 of the X Corporation building, with side 2 bearing the name of X Corporation. In this example, the 3D animation is rendered from a 3D video taken from a flying object that spirals counterclockwise around the top of the building. The third stage 3915 illustrates that the viewpoint has changed and the building appears to rotate clockwise as the flying object spirals counterclockwise. The 3D presentation is showing sides 2 and 3 of the X Corporation building. As shown, the "H" mark on the top of the building has also rotated. The fourth stage 3920 and the fifth stage 3925 show further rotation of the building as the flying object continues to circle it. The mapping application of some embodiments repeats the 3D presentation until the user provides an input to stop or change the animation (e.g., a two-finger gesture, an input to exit 3D mode, etc.). Figure 40 conceptually illustrates the GUI 3800. Specifically, Figure 40 illustrates, in eight different stages 4005 through 4040, the mapping application of some embodiments initially displaying an animation of a satellite image of the POI in the media display area 3835 of the GUI 3800 and switching to a 3D animation of the POI once sufficient data for displaying the 3D animation has been obtained. In the first stage 4005, as a result of typing the search query "MyWork Building", the mapping application displays a pin 4060 and a banner 4065. The user has also selected the lower right corner of the map to peel back the map and display the group of buttons described above.
The next stage 4010 shows the user selecting the "List" button 4070 to cause the mapping application to display the search results as a list. In the third stage 4015, the mapping application displays the list of POIs after the user has selected the list button 4070 in the previous stage 4010. In this example, the list happens to include only one POI because the search query was adequately targeted to a particular result. In the next stage 4020, the user selects the entry 4075 to launch the "detailed information screen" for this POI. As shown, the user has selected the "MyWork Building" POI. In the fifth stage 4025, the GUI 3800 has been launched. However, in contrast to stage 3815 described above with reference to Figure 38, the mapping application displays a satellite image of the actual building associated with the "MyWork Building" POI, which also shows other buildings near the building, rather than an animated 3D view of the building. The satellite image is a 2D image of the top of the building taken from a considerable distance above the building (i.e., from a satellite). As shown, the mapping application also fades the portion of the image that is overlaid by the informational text to make the text clear and legible. The next stage 4030 shows that the mapping application has rotated the satellite image clockwise in order to animate the satellite imagery. In the next stage 4035, the mapping application is cross-fading the satellite imagery into the 3D animated view of the building because the mapping application now has enough data (e.g., from a map server or another source of the data) to display the 3D animated view of the building. The next stage 4040 shows that the virtual camera for the 3D animated view has moved to another side of the "MyWork Building".
As shown, the black corner of the building now appears toward the east of the media display area 3835, showing that the virtual camera has moved counterclockwise relative to the top of the building. Also, because the viewpoint of the virtual camera has changed, different neighboring buildings are displayed behind the informational text, as shown. Figure 41 conceptually illustrates the GUI 3800. In particular, Figure 41 illustrates, in six different stages 4105 through 4130, the mapping application of some embodiments initially displaying images of the POI in the media display area 3835 of the GUI 3800, instead of an animated image of the building, when such images are more meaningful to the user and contain more information. For example, when the selected POI is a business (e.g., a restaurant or a coffee shop), the mapping application determines the appropriate images to display based on the user's expectations. For example, the mapping application can display images of the food and beverages available in a restaurant or images of the interior of a coffee shop, as such images may better meet the user's expectations than images showing the exteriors of the buildings in which the businesses are located. The first stage 4105 and the second stage 4110 are similar to stages 3805 and 3810, respectively, in which the mapping application displays a collection of POIs from which the user can select a POI. The second stage 4110 shows the user selecting the "Little Coffee Shop". In the third stage 4115, the GUI 3800 has been launched. However, in contrast to stages 3815 and 4025 described above with reference to Figures 38 and 40, the mapping application displays a collection of images (e.g., images of coffee, donuts, etc.) rather than images of a building. As described above, such images are collected by information gathering entities such as Facebook, Twitter, etc., and the mapping application obtains information about the selected POI from those entities.
As also mentioned above, the mapping application of different embodiments uses different techniques to display the images. In this example, the mapping application is using the Ken Burns effect to display the images sequentially. At this stage 4115, the mapping application displays an image of a cup of coffee and a piece of muffin offered by the Little Coffee Shop (assuming that the owner of the image or the owner of the store uploaded the image to an information gathering entity such as Yelp). The next stage 4120 shows that the mapping application has zoomed in on the cup of coffee and the muffin as part of displaying the images with the Ken Burns effect. The next stage 4125 shows that the mapping application is cross-fading the imagery of the coffee and the muffin into an image of the interior of the Little Coffee Shop. The next stage 4130 shows that the mapping application has fully zoomed in on the interior image. Figure 42 conceptually illustrates a process 4200 performed by some embodiments to display different types of images when launching a "detailed information screen" for displaying detailed information about a POI. In particular, the process 4200 is performed to display media in the media display area of the detailed information screen (e.g., the media display area 3835 described above with reference to Figures 38, 40, and 41). In some embodiments, the process 4200 is performed by the mapping application. The process begins when the mapping application displays the results of a search query. The process 4200 begins by receiving (at 4205) a selection of a POI. The process 4200 receives the selection in one of several different manners. For example, the process 4200 receives the selection when the user selects the banner displayed above the pin of a POI about which the user wishes to find more information.
The process 4200 can also receive the selection when the user selects a POI from a list of POIs that the mapping application of some embodiments displays when the user selects a "list" button. The process 4200 next determines (at 4210) a preferred order for displaying different types of images. As mentioned above, the mapping application of some embodiments displays different types of images. In some embodiments, the different types of images include images for animated 3D views, satellite images, general map images, and business-related images (e.g., images of served dishes, images of interior furnishings, images of a merchant's employees, etc.). In some embodiments, the process 4200 determines the preferred order based on the type of the selected POI. For example, when the POI is a business such as a restaurant, the process 4200 prefers to display images of the dishes served by the restaurant. When the POI is a famous landmark such as the Empire State Building, the process 4200 prefers to display images of the building and its surroundings. The process 4200 analyzes the data for the selected POI to determine the type of the POI when determining the preferred order. There are also different levels of images for a particular type of image. For example, for landmark images, the different levels of images include animated 3D images, satellite images, general map images, and the like. These different image levels contain different amounts of data, and therefore it takes different amounts of time to retrieve the data over the network. For example, an animated 3D image will likely contain the largest amount of data, and thus it takes the longest time to download the image data from its source over the network. The process 4200 then determines (at 4215) whether the data for the current level of image is available from the source.
In some embodiments, the process 4200 obtains the data from a map server or another server that serves the data upon receiving a request. However, not every POI has images of every level at the server. For example, a building that is not famous or well known will likely not have imagery for displaying a 3D animated view of the building. In some embodiments, the current level of image to be initially displayed defaults to the 3D animated image level. When the process 4200 determines (at 4215) that the data for the current level of image is not available, the process 4200 selects (at 4220) the next level of image to display for the POI. For example, when imagery for an animated 3D view of the selected POI is not available, the process 4200 selects (at 4220) a satellite image of the POI. The process 4200 then loops back to 4215 to determine whether the data for this next level of image is available. When the process 4200 determines (at 4215) that the data for displaying the current level of image is available, the process 4200 determines (at 4225) whether all of the data has been received and whether the mapping application is ready to display the current level of image. In some cases, when the mapping application runs with a slow network connection to obtain image data from a remote server, there may not yet be enough image data to display the current level of image for the POI. When the process 4200 determines (at 4225) that sufficient data has been obtained for displaying the current level of image, the process 4200 proceeds to 4230, which is described further below. When the process 4200 determines (at 4225) that the data for displaying the current level of image is not ready, the process 4200 displays (at 4235) an image of a next level that is ready for display. That is, the process 4200 displays (at 4235) an image of a next level for which sufficient data has been obtained.
When there is insufficient data for displaying even a next-level image, the process 4200 displays a plain background image (e.g., a black image) in the media display area. The process 4200 then determines (at 4240) whether the mapping application has obtained sufficient data for the current level of image. For example, while the process 4200 displays a satellite image of the POI in the media display area, the process 4200 checks whether sufficient data for the animated 3D view has been obtained. When the process 4200 determines (at 4240) that the data obtained for the current level of image is still insufficient, the process loops back to 4235 and continues displaying the next-level image in the media display area while checking whether the data for the current level of image is ready. When the process 4200 determines (at 4240) that sufficient data has been obtained for the current level of image, the process switches (at 4245) from the next-level image being displayed in the media display area to the current level of image. For example, when sufficient data for the 3D animated view becomes ready for display while the satellite image of the POI is being displayed, the process 4200 switches from the satellite image of the POI to the animated 3D view of the POI. Different embodiments use different techniques for this switch. For example, the process 4200 can apply the Ken Burns effect to make the transition from the next-level image to the current level of image. The process 4200 next displays (at 4230) the current level of image. The process 4200 then ends. Figure 43 conceptually illustrates the GUI 3800. In particular, Figure 43 illustrates, in four different stages 4305 through 4320, that the index tabs 3840 are not scrolled away from the "detailed information screen" of the selected POI (i.e., the GUI 3800). In some embodiments, the GUI 3800 is scrollable.
That is, the components of the GUI 3800, such as the media display area 3835, the tabs 3840, and the information display area 3845, move up or down as the user scrolls the GUI 3800 up or down. As noted above, the tabs 3840 are for displaying different sets of entries grouped for the different types of information associated with the different tabs. That is, the mapping application of some embodiments obtains POI-related data from different data collection entities (e.g., Facebook, Yelp, Yahoo, Twitter, blogs, etc. — for instance, tweets, RSS feeds, updates) and classifies the data into different groups based on the type of information the data contains. For example, the mapping application classifies pieces of data that carry comments about the POI into a "reviews" group and displays those pieces of data as entries in the information display area 3845 when the user selects the "reviews" tab. Different embodiments use different techniques to identify the type of information a piece of data carries. For example, the mapping application looks for a set of keywords in the piece of data. Examples of tabs that a user may create, or that the mapping application may additionally provide, include an "activities" or "events" tab whose entries, when the tab is selected, include any time-related information about the POI. For example, if the selected POI is a Major League ballpark, the entries under this tab may include information about the scheduled games of the team associated with the ballpark. As another example, when the selected POI is a movie theater, the selection of an entry may cause the mapping application to launch a ticketing application (e.g., Fandango), which the user can use to purchase tickets for movies showing at that theater. One of ordinary skill in the art will recognize that many other possible tabs, with various functionalities corresponding to the type of tab, can be created.
In some embodiments, the mapping application allows the user to create a new group by defining several keywords for the mapping application to use to filter the obtained pieces of data. In such embodiments, the mapping application adds a new tab for the new group and, when the new tab is selected, lists the pieces of data that include the keywords. When the "detailed information screen" for a POI includes more tabs than can be displayed at once, the mapping application of some embodiments makes the tabs 3840 scrollable (e.g., horizontally) so that the user can scroll through the tabs 3840 and select any desired tab. The mapping application of some embodiments also allows the user to remove unwanted tabs. As described above, the GUI 3800 can be scrolled in the vertical direction. However, regardless of how the user scrolls the GUI 3800, the tabs 3840 are not scrolled off the screen of the device. That is, for example, when the GUI 3800 is scrolled up such that the tabs 3840 would otherwise leave the screen, the mapping application of some embodiments keeps the tabs 3840 within the screen display area and slides the information display area 3845 under the tabs 3840. An example operation of the GUI 3800 when the user scrolls the GUI 3800 is described by reference to the four stages 4305 through 4320 of Figure 43. The first stage 4305 shows the GUI 3800, which has been launched after the user selected the banner of the pin for the "Little Coffee Shop" on the map or selected the POI from a list of POIs. The second stage 4310 shows the user making an upward swipe gesture on the GUI 3800. The third stage 4315 shows that the media display area 3835, the tabs 3840, and the information display area 3845 have moved up within the screen.
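The keyword-based grouping described above — both the application's built-in groups and a user-defined group — might be sketched as follows. The tab names, keyword sets, and function names here are invented purely for illustration; the patent does not specify any of them.

```python
# Hypothetical keyword filters mapping a tab name to the keywords that
# identify the type of information a piece of data carries.
TAB_KEYWORDS = {
    "reviews": {"review", "stars", "rating"},
    "deals": {"deal", "coupon", "discount"},
    "events": {"game", "concert", "showtime"},
}

def classify(piece, tab_keywords=TAB_KEYWORDS):
    """Assign a piece of data (a string) to the first tab whose keyword
    set matches a word in the piece; unmatched pieces go to 'info'."""
    words = set(piece.lower().split())
    for tab, keywords in tab_keywords.items():
        if words & keywords:
            return tab
    return "info"

def group_for_tabs(pieces, tab_keywords=TAB_KEYWORDS):
    """Group the obtained pieces of data into per-tab entry lists."""
    groups = {}
    for piece in pieces:
        groups.setdefault(classify(piece, tab_keywords), []).append(piece)
    return groups
```

A user-created group would then simply add another entry to the keyword dictionary (e.g., `TAB_KEYWORDS["gluten_free"] = {"gluten-free"}`), which yields a new tab whose selection lists the matching pieces.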
In particular, the media display area 3835 has almost completely scrolled off the screen, appearing to have slid under the top bar 3850. The mapping application has also expanded the information display area 3845 to display more entries for the selected information tab. The user is also performing another upward swipe gesture (or continuing the initial upward swipe gesture, with or without a pause) to scroll the GUI 3800 up further. The fourth stage 4320 shows that the tabs 3840 have moved up to the bottom of the top bar 3850 and that the user is scrolling the GUI 3800 up further. However, the tabs 3840 neither leave the screen nor slide under the top bar 3850. The mapping application has also stopped expanding the information display area 3845. As the user scrolls the GUI 3800 up further, the mapping application slides the entries displayed in the information display area 3845 under the tabs 3840 and the top bar 3850, and additional entries that were not previously displayed in the information display area 3845 appear from the bottom of the information display area 3845, as shown in the fourth stage 4320. The fourth stage 4320 also illustrates that the mapping application of some embodiments displays a set of selectable UI items 4360 through 4380. The UI item 4360 is for showing directions from the current location to the POI. The UI item 4365 is for showing directions from the POI to the current location. The UI item 4370 is for adding this POI to the contact list of the mapping application. The UI item 4375 is for sharing this POI with other users of the mapping application on other devices. The UI item 4380 is for bookmarking this POI. Figure 44 conceptually illustrates the GUI 3800. In particular, Figure 44 illustrates, in four different stages 4405 through 4420, the mapping application of some embodiments launching a third-party application when the user selects an entry displayed in the information display area 3845.
In some embodiments, when an entry is selected from the list of entries displayed for a selected tab of the "detailed information screen" for a selected POI, the mapping application launches a third-party application, or opens the third party's website in a browser application that executes on the device on which the mapping application executes. This third-party application is typically the source of the selected entry (e.g., Yelp). For example, when the user taps one of the reviews displayed in the information display area 3845, the mapping application of some embodiments launches the third-party application (e.g., a Yelp application) to display the full text of the selected review or to allow the user to add a new review. In such embodiments, the mapping application also provides a means for the user to return to the "detailed information screen" (e.g., the GUI 3800) for the POI. For example, the mapping application may display a "back" button in the top bar of the third-party application, and that button, when selected, causes the "detailed information screen" for the POI to be displayed again. In other embodiments, the mapping application does not launch a third-party application when an entry is selected from the list of entries. The mapping application instead displays the information associated with the entry "in place." That is, for example, the mapping application may expand the entry in the information display area 3845 and display the full review, or provide a means to add a new review, without leaving the mapping application. The mapping application of these embodiments utilizes an application programming interface (API) to request and obtain the complete review from the source. In some embodiments, the API is provided by the source (e.g., Yelp). The first stage 4405 of Figure 44 illustrates the user selecting the reviews tab 4450.
In the second stage 4410, the mapping application, upon receiving the selection, displays several reviews related to the Little Coffee Shop. As shown, these reviews originate from different sources, such as Twitter, Facebook, Yelp, and the like. In some embodiments, each review entry includes a limited amount of information. The mapping application of some of these embodiments allows the user to view the full review by selecting the particular entry. The second stage 4410 shows the user selecting review 4455 from the list of reviews displayed in the information display area 3845. In this example, as shown, the review 4455 originates from Yelp. The third stage 4415 shows that the mapping application has launched the Yelp application, which displays the entire content of the review 4455. The top bar 4460 of the page includes a "back" button that, when selected, causes the GUI 3800 to be displayed again with the reviews tab selected. The third stage 4415 also shows the user selecting the "back" button. The fourth stage 4420 shows the GUI 3800 displayed again. This stage shows the GUI 3800 identical to the GUI 3800 at the stage 4410. Figure 45 conceptually illustrates the GUI 3800. In particular, Figure 45 illustrates, in four different stages 4505 through 4520, the mapping application of some embodiments launching a third-party application when the user selects a UI item for the third-party application. The first stage 4505 illustrates the user selecting the reviews tab 4550 while the info tab 4555 is the currently selected tab. In the second stage 4510, the mapping application displays reviews related to the Little Coffee Shop in response to receiving the selection of the reviews tab. As shown, each review entry includes a thumbnail image of the reviewer, the star rating given by the reviewer, the date the review was uploaded, the review itself (depicted as lines), etc.
These reviews originate from different sources, such as Twitter, Facebook, Yelp, etc. The second stage 4510 also shows that the user is scrolling the GUI 3800 up. The third stage 4515 illustrates that the mapping application of some embodiments displays a set of selectable UI items 4560 through 4575 near the bottom of the expanded information display area 4545 as the GUI 3800 is scrolled up. The UI item 4560 is for launching a third-party application (e.g., Yelp's mobile application) or a website (e.g., Yelp's website) to view more reviews about the POI. The UI item 4565 is for launching another third-party application (e.g., Facebook's mobile application) or a website (e.g., Facebook's website) to trigger a special feature provided by that third-party application (e.g., checking in from Facebook). The UI item 4570 is for adding a quick tip. In some embodiments, a quick tip is a type of comment that the mapping application handles without relying on a third-party application (i.e., without having to go through the third-party application or website). The UI item 4575 is for adding a review. In some embodiments, the selection of the UI item 4575 causes the mapping application to launch a third-party application or a website for leaving a review. The third stage 4515 also shows the selection of the UI item 4560. The fourth stage 4520 shows a third-party application (e.g., Yelp's mobile application) that allows the user to add reviews about the POI. In some embodiments, the information provided by a third-party application (e.g., Yelp) may differ for different users, based on each user's personal preference settings for the particular third-party application. In order to determine a particular user's preference settings, the mapping application communicates with the third-party application to determine the user's preferences and to retrieve information tailored to that user.
In some embodiments, in order for personalized information from a third-party application to be displayed for a user, the user must have installed the third-party application on the same device that the user uses to execute the mapping application. In addition, in some embodiments, the user must also be currently logged into the third-party application. For example, the user must download the Yelp application onto the same device that the user uses to execute the mapping application, and the user must be logged into the Yelp application when accessing the Yelp features from the mapping application. If the user satisfies these conditions, the particular information about a POI that is displayed to the user through the mapping application will be tailored to the user's personal preferences. Therefore, different users will see different information about the POI, based on each user's personal preferences. In order to apply the preferences of a specific user, the mapping application and the third-party application must first verify certain information. The mapping application initially verifies with the third-party application that the user is currently logged into the third-party application. After verifying that the user is currently logged into the application, the mapping application forwards data related to the particular POI of interest to the user (e.g., an identification of the particular POI) to the third-party application. The third-party application retrieves the information about the POI that is specific to the particular user and passes the information back to the mapping application. In some embodiments, the mapping application then displays the information "in place" to the user according to the user's preferences (i.e., without switching to the third-party application). In some embodiments, the mapping application obtains authentication information (e.g., a token) from the third-party application. The mapping application uses this token to access a server that provides information about POIs to the third-party application.
For example, the Yelp application has been authenticated and authorized to access the Yelp server. The mapping application can then be authorized to access the Yelp server by using the authentication information obtained from the Yelp application executing on the same device. In such embodiments, the server maintains the user's personal preferences, so that when the mapping application requests information about a POI (e.g., reviews), the server returns information that matches the user's preferences. In other embodiments, the user is not required to be logged into the third-party application. In these embodiments, the mapping application allows the user to log into the server directly through the mapping application. Figure 46 illustrates two instances of the mapping application executing on two different devices of two different users 4601 and 4602 (the users are not shown), the two instances displaying two different sets of information about the same POI, obtained from a particular third-party application executing on those devices. Figure 46 illustrates the mapping application for each user in three stages 4605 through 4615. In the first stage 4605, both users are viewing the map view of the mapping application. Both users have used the search field to search for "Little Coffee Shop". Both users also select the arrow (not shown, being blocked by the fingers 4603 and 4604 of the users 4601 and 4602, respectively) to launch the detailed information screen for the POI. In the second stage 4610, the GUI 3800 has been launched. As shown, the mapping application displays various information about the POI, including the address, phone number, website URL, and user reviews. Both users 4601 and 4602 also select the reviews tab 4550 to obtain the user reviews of the coffee shop. In the third stage 4615, the GUI 3800 presents the reviews obtained from a third-party application (e.g., Yelp).
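The borrowed-token flow described above might be sketched as follows. Everything here is hypothetical: the class and method names do not correspond to any real Yelp or Apple API, and the "server" is a stand-in keyed by token so that preference-tailored results can be demonstrated.

```python
# Hypothetical sketch of the authentication hand-off: the mapping
# application borrows a token from an already-authorized third-party
# app and uses it to fetch preference-tailored POI data from a server.
class ThirdPartyApp:
    def __init__(self, logged_in_user, token):
        self.logged_in_user = logged_in_user   # None if nobody is logged in
        self.token = token

    def current_token(self):
        # Only hand out the token when a user is actually logged in.
        return self.token if self.logged_in_user else None

class ReviewServer:
    """Stands in for the third party's server, which keeps per-user
    preferences and returns reviews tailored to the token's owner."""
    def __init__(self, reviews_by_token):
        self.reviews_by_token = reviews_by_token  # token -> {poi: reviews}

    def reviews_for(self, poi, token):
        if token not in self.reviews_by_token:
            raise PermissionError("token rejected")
        return self.reviews_by_token[token].get(poi, [])

def fetch_tailored_reviews(poi, third_party_app, server):
    """What the mapping application does: verify login, borrow the
    token, and query the server 'in place'."""
    token = third_party_app.current_token()
    if token is None:
        return None  # user not logged in; personalization unavailable
    return server.reviews_for(poi, token)
```

With two tokens carrying different preferences, two users querying the same POI get different review sets, mirroring the two instances in Figure 46.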
However, the displayed user reviews are different for the two users 4601 and 4602, because the reviews are based on each user's personal preferences. The first user sees three reviews by Jane, Bo, and Sue, while the second user sees reviews from Jon, Brad, and Nat. This can occur when the users 4601 and 4602 have specified different personal preferences for the particular third-party application. The first user 4601 may have indicated to the third-party application (i.e., Yelp in this example) that the user 4601 only wants to see reviews made by "food experts," and thus the reviews from experts Jane, Bo, and Sue are displayed. The second user 4602 may have indicated that the user 4602 would like to see reviews from every individual, and thus the reviews from Jon, Brad, and Nat, who may or may not be experts, are displayed. Thus, each user obtains information about a particular POI from a third-party application that is customized and tailored to the user's personal preferences. Figure 47 conceptually illustrates the GUI 3800. In particular, Figure 47 illustrates, in three different stages 4705 through 4715, the mapping application of some embodiments displaying a tag on one of the tabs of the "detailed information screen" for a selected POI, and displaying a particular piece of information when the tag is selected. In some embodiments, the mapping application displays a tag on a tab of the GUI 3800 to indicate that, when the tab is selected, one of the displayed entries will include special information about the POI. For example, the tag may be a "deal" tag indicating that the POI has a deal, and the information about the deal may be included in one of the entries that are displayed when the tag, or the tab carrying the tag, is selected. The tag thus acts as an unread marker, indicating that there are entries that have not been viewed by the user since the entries were obtained from their respective sources. The first stage 4705 shows the GUI 3800.
The reviews tab 4450 has a tag 4750 attached to it. In some embodiments, the tag 4750 appears on the tab 4450 when there is an entry that includes information about a deal for the POI. The mapping applications of different embodiments use tags of different appearances. For example, in some embodiments, the tag resembles a physical tag that might be attached to a file. The tag may also include one or more dollar signs to indicate that the POI offers its customers a money-saving deal. The second stage 4710 shows the selection of the tag 4750 (e.g., by tapping). The third stage 4715 shows that the tag has disappeared from the tab 4450 and that the details of the deal are displayed in the information display area 3845. In some embodiments, the mapping application instead launches a third-party application, or opens a third-party website in a browser application executing on the device on which the mapping application executes; the third-party application or web page then displays the details of the deal for the selected POI. Figure 48 conceptually illustrates an example of the "detailed information screen" (e.g., the GUI 3800) of a POI displayed in the display area of a device 4820 that provides a relatively large display area. An example of such a device is a tablet device (e.g., an iPad® sold by Apple Inc.). In particular, Figure 48 illustrates, in three different stages 4805 through 4815, the interaction of a user with the mapping application to display the "detailed information screen" for a particular POI. As illustrated, in contrast to devices with relatively small display areas (e.g., a smartphone such as the iPhone® sold by Apple Inc.), the device 4820 provides a larger screen display area 4825 for viewing the mapping application. The larger display area allows the mapping application to efficiently utilize the screen space to display different GUIs within the same map view and to minimize switching to different GUI screens.
In particular, when the user selects the arrow for a specific POI, the mapping application opens the "detailed information screen" for the POI directly overlaid on the map view of the mapping application, without changing to a new GUI page. In the first stage 4805, the mapping application displays a pin 4830 and a banner 4835 as a result of the user typing a search query for "Little Coffee Shop". The second stage 4810 illustrates the user selecting the arrow 3875 to launch the "detailed information screen" for this POI. In the third stage 4815, the mapping application has launched the GUI 3800 for the detailed information screen. The GUI 3800 is fully displayed within the map view of the mapping application. In some embodiments, the mapping application fully displays the "detailed information screen" for a particular POI when the user is viewing the mapping application in portrait orientation. In some embodiments, when the user views the mapping application on the tablet in landscape orientation, the mapping application also displays the "detailed information screen" for the POI, but may require the user to scroll through the "detailed information screen" to view all the information.

IV. Route generation

As mentioned above, the mapping application tightly integrates the search and route identification experience by providing several different ways to get directions. One such way is through a selectable UI control (e.g., a button) on the main map view (e.g., in its top-left corner), which, when selected, presents a modal interface for editing directions and enables the user to request more customized routes, such as routes that do not begin from the current location, or walking routes rather than just driving routes. In some embodiments, the mapping application allows the user to inspect these customized routes by sliding signs showing maneuver instructions into and out of the UI page displaying the route.
This mode of operation of the mapping application is referred to as a route inspection mode or a (manual) stepping mode, which is one of several operational modes in which the mapping application of some embodiments can operate. Examples of these operational modes include a navigation mode, a map browsing mode, and the route inspection mode. A juncture is where two or more road segments meet. A route is a path between a starting location and a destination location on the map. A typical route has zero or many junctures along the path between the two locations. A maneuver instruction for a juncture in the route identifies the direction of the road segment to take from the juncture. In some embodiments, the mapping application provides the user with maneuver instructions for only some of the junctures along the route, because the user may not need to perform a maneuver at every juncture in the route in order to reach the destination location. For instance, the user carrying the device may recognize that she only needs to go straight through several junctures until reaching a juncture at which to make a turn in order to get to the destination location. In this patent application, when a juncture has a maneuver instruction to be displayed, that maneuver instruction is referred to as a "step." In the navigation mode, the mapping application of some embodiments provides the user with a set of steps for a route between the device's current location and a destination location. Typically, the mapping application provides these steps to the user visually and audibly in the navigation mode. When the user carrying the device deviates from the route, the mapping application of some embodiments tracks the location of the device and recalculates a new route from the deviated location, in order to redirect the user from the deviated location to the destination location.
In other words, the mapping application of some embodiments operating in the navigation mode requires the device to be on a route at all times. Moreover, the mapping application of some embodiments operating in the navigation mode displays each step by presenting it immediately, rather than sliding the steps in and out of the display area. Furthermore, in some embodiments, the information in the steps (i.e., the maneuver instructions) that the mapping application displays while operating in the navigation mode is dynamic. That is, information such as the estimated arrival time, the time remaining to the destination location, the remaining distance from the current location of the device to the destination location or to the next juncture with a step, and so on, is updated by the mapping application as the device moves along the route. In the route inspection mode, the mapping application of some embodiments allows the user to slide the steps in and out of the display area in order to inspect each step in the route. Alternatively, the mapping application allows the user to manipulate the map (e.g., by zooming in and out, or sliding the map in different directions) to display different junctures in the route. When a juncture that has a step is displayed in the display area as a result of the user's manipulation of the map, the mapping application slides that step in (and slides out the previously displayed step, along with any intermediate steps between the previously displayed step and the currently displayed step). In this manner, the user can inspect the route by manually sliding the steps into and out of the display area, or by manipulating the map to display certain junctures of the route in the display area.

A. Route start and search

Figure 49 illustrates an example of the interaction between a user and the mapping application when using the directions feature, in four stages 4905 through 4920. Specifically, this figure illustrates the mapping application starting to operate in the route inspection mode.
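The juncture/step distinction drawn above can be illustrated with a small data model: a route is a sequence of junctures, and only the junctures carrying a maneuver instruction count as steps. The field and function names are invented for this sketch and are not from the patent.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical model of a route as a sequence of junctures, only some
# of which carry a maneuver instruction and therefore count as "steps".
@dataclass
class Juncture:
    location: tuple                    # (lat, lon) of the juncture
    instruction: Optional[str] = None  # e.g. "Turn right onto 7th St"

def steps(route):
    """Return only the junctures with instructions: the route's steps."""
    return [j for j in route if j.instruction is not None]

# A four-juncture route with two steps: the user goes straight through
# the instruction-less junctures without being prompted.
route = [
    Juncture((37.0, -122.0)),                            # go straight
    Juncture((37.1, -122.0), "Turn right onto 7th St"),  # step 1
    Juncture((37.1, -122.1)),                            # go straight
    Juncture((37.2, -122.1), "Arrive at destination"),   # step 2
]
```

In the navigation mode the application would walk through `steps(route)` automatically; in the route inspection mode the user slides between them manually.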
The first stage 4905 illustrates the device after the user has selected the direction control 160 (not shown in this figure). The first stage 4905 also illustrates that the user has entered the starting and ending locations of the route in the start field 245 and the end field 250. The second stage 4910 illustrates the selection of the route generation control 240. In some embodiments, upon the selection of the route generation control, the mapping application sends the starting and ending location information to a remote server to obtain the routes. The third stage 4915 shows two routes, route 1 and route 2. In some embodiments, the mapping application renders the two routes on the map based on the route information obtained from the remote server. The third stage 4915 also shows that the mapping application has selected route 1 by default. The user selects the start control 4950 for starting the navigation according to the selected route. The mapping application of some embodiments begins to operate in the route inspection mode upon receiving the selection of the start control 4950. The fourth stage 4920 illustrates that the mapping application displays an instruction sign 4930, which in some embodiments is the first sign of a series of scrollable turn-by-turn instruction signs (not all of which are illustrated in the figure) for browsing the selected route. The mapping application allows the user to browse the selected route by sliding the signs along a particular axis (e.g., horizontally). These scrollable instruction signs are described in more detail below. In some embodiments, the mapping application allows the user to browse the selected route when the starting location of the selected route is not the user's current location.
Also, when the mapping application is in this mode for allowing the user to browse or inspect a selected route (as illustrated in the stage 4920), the mapping application of some embodiments disables or does not display the page curl. In addition to entering the starting and ending locations of the route in the start field 245 and the end field 250, the mapping application of some embodiments also allows the user to select a route from a list of previously searched routes. Figure 50 illustrates an example of the interaction between a user and the mapping application when using the directions feature, in four stages 5005 through 5020. This example is provided in the context of using the direction control 160 to obtain a route between two locations. The first stage 5005 illustrates the mapping application displaying a map of a street view of a city. The user is initiating a tap of the direction control 160, located at the top-left corner of the display next to the search field 165. The second stage 5010 next illustrates that the application presents a look-up table 5055 with a list of recent routes that the user has previously searched. In this example, as shown, the user selects a route to a police station. The third stage 5015 illustrates the display of a map with the selected route from the device's current location to the destination of the selected route. This stage 5015 also illustrates the selection of the list view control 235. The fourth stage 5020 illustrates that the mapping application presents a list of the turn-by-turn instructions for getting to the destination. As shown, each instruction in the list includes a direction icon 5035 that shows the direction of the particular turn associated with the instruction. In some embodiments, each instruction in the list appears identical to the corresponding instruction sign 4935 described above by reference to Figure 49.
Figure 51 conceptually illustrates an example of displaying routing directions in the display area of a device that provides a relatively large display area. An example of such a device is a tablet device (e.g., an iPad® sold by Apple Inc.). Specifically, Figure 51 illustrates, in three different stages 5105 through 5115, the interaction of a user with the mapping application to display a set of routing directions. As illustrated, compared to the display area of a smaller device (e.g., a smartphone such as the iPhone® sold by Apple Inc.), the device provides a larger screen display area for viewing the mapping application. The larger display area allows the mapping application to efficiently utilize the screen space to display different UI items within the map view of the mapping application and to minimize changing UI screens. For example, when the user requests a list of the routing directions, the mapping application displays the list of routing directions directly overlaid on the map, without changing to a different UI screen. The first stage 5105 illustrates a tablet device 5120 executing the mapping application of some embodiments, which is displaying a map view of a particular route between two locations. In particular, the user has obtained a route between the user's current location and the location of the POI "Pizza Place". The user may have obtained this route in several ways, including through the search feature, dropping a pin on the map, and various other mechanisms. The mapping application is also displaying a set of floating controls, including the list view control 145. The second stage 5110 shows that the user is selecting the list view control 145 to obtain a list of the routing directions. The third stage 5115 illustrates that the mapping application now displays the list of routing directions overlaid on the map view of the mapping application.
In some embodiments, when the user selects an individual direction from the list of routing directions, the mapping application displays on the map the corresponding portion of the route associated with the selected direction. If the corresponding portion of the route is not within the currently displayed map region, the mapping application shifts the map so that the region of the map containing the corresponding portion is displayed.

B. Route display and review

In some embodiments, when the mapping application presents an identified route to the user, the application allows the user to select and scroll through a set of selectable UI items that represent the signs for the junctures of the selected route. As the user scrolls through each sign, the portion of the route associated with the sign currently in focus is presented or highlighted (e.g., via color highlighting, or via another geometry, such as a circle or other marker, on that portion). This mode of operation of the mapping application of some embodiments is referred to as a route inspection mode. The mapping application operating in this mode allows the user to inspect the route by manipulating the UI items representing the instruction signs of some of the junctures of the route. In some embodiments, the mapping application operates in the route inspection mode (1) when the route being inspected is between two locations, neither of which is the current location of the device on which the mapping application executes, and (2) when the route is computed for walking (rather than driving). Figure 52 illustrates an example of the route inspection mode, in four stages 5205 through 5220, in which the user scrolls through a set of scrollable instruction signs for a particular route selected by the user.
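The map-shifting behavior described above — recentering only when the highlighted portion falls outside the displayed region — reduces to a simple geometric check. The rectangle representation below is, of course, only an illustration; the actual application would operate on map coordinates and viewports of its rendering engine.

```python
# Hypothetical sketch: a viewport is an axis-aligned rectangle
# (min_x, min_y, max_x, max_y); a route portion is reduced to a point.
def contains(viewport, point):
    min_x, min_y, max_x, max_y = viewport
    x, y = point
    return min_x <= x <= max_x and min_y <= y <= max_y

def ensure_visible(viewport, portion_center):
    """If the portion's center is outside the viewport, shift the
    viewport (without rescaling) so the portion becomes centered;
    otherwise leave the viewport unchanged."""
    if contains(viewport, portion_center):
        return viewport
    min_x, min_y, max_x, max_y = viewport
    w, h = max_x - min_x, max_y - min_y
    cx, cy = portion_center
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)
```

Selecting a direction whose route portion is already on screen would thus leave the map still, while selecting an off-screen one shifts the map to it.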
The first stage 5205 illustrates the user initiating the inspection of a selected route between a starting and an ending location by tapping the start control in the top bar 140, as described above. The second stage 5210 illustrates the first scrollable sign 5225 (a selectable UI item) presented for the particular route, as indicated by the "1 of 1" text displayed at the center of the top bar 140. Also, the mapping application displays the current junction indicator 5290 at the junction indicated by the displayed sign 5225. Further details regarding the current junction indicator are described in U.S. Patent Application Serial No. 13/632,002, filed on Sep. U.S. Patent Application Serial No. 13/632,002 is incorporated herein by reference. The second stage 5210 also illustrates the user beginning a leftward swipe on the first scrollable sign. The swipe gesture shifts the sign toward the left of the map. In some embodiments, the user can instead tap the sign to shift it to the left (or right) of the map and display the next sign. The third stage 5215 illustrates that part of the first scrollable sign 5225 has been scrolled off the map display area and that part of a new sign 5235 for the route has become visible. The user can see that the new sign shows a right-turn arrow. The mapping application has not moved the current junction indicator 5290, because the sign 5225 is still the current sign. The fourth stage 5220 illustrates the display after the user has completed the swipe gesture, moving the first sign off the map. The mapping application now displays the second sign 5235 of the list of route instructions, as indicated by the "2 of 2" text displayed in the top bar 140. This sign indicates that in 0.1 miles, the user needs to turn right onto 7th Street.
In addition, the application has zoomed in on a portion of the map display area and highlighted the section 5250 of the route corresponding to the sign currently in focus. The application has also moved the current junction indicator 5290 to the junction indicated by the second sign 5235. Alternatively or conjunctively, the user can scroll through the signs by selecting different junctions of the route (e.g., by tapping) or by navigating the map (through gestural input) in order to view a particular scrollable sign associated with a particular junction. Figure 53 illustrates, in three stages 5305 through 5315, the user navigating the map to scroll through different scrollable signs. The first stage 5305 illustrates the map with a sign overlaid for a particular junction of the route between the starting and ending locations (corresponding to the "2 of 2" sign of the route). The application has also highlighted the portion of the route corresponding to this sign. The sign indicates that in 0.1 miles, the user needs to turn right onto 7th Street. The first stage 5305 also illustrates the user beginning a swipe gesture to navigate the map (e.g., swiping a finger to the right) in order to view a different area of the map. The second stage 5310 illustrates a new region of the map displayed after the swipe gesture, which corresponds to a shift to the left. The third stage 5315 illustrates that, after the completed swipe gesture, a new scrollable sign is now overlaid on the map, corresponding to the portion of the route that is now displayed in this particular area of the map. This sign is the third sign of the route, as indicated by the "3 of 3" text displayed at the top center of the map. The sign indicates that in 350 feet, the user will arrive at the destination.
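The tap-on-a-junction behavior described above — selecting a junction of the route to bring up the scrollable sign associated with it — amounts to a nearest-junction lookup. A minimal sketch (function and data names are illustrative assumptions):

```python
import math

# Illustrative sketch of tap-to-sign selection: find the route
# junction closest to the tapped point, then show the sign for that
# junction. One sign per junction is assumed, as described above.

def closest_junction(junctions, tap_point):
    """Return the index of the junction nearest to the tapped point."""
    tx, ty = tap_point
    return min(range(len(junctions)),
               key=lambda i: math.hypot(junctions[i][0] - tx,
                                        junctions[i][1] - ty))

def sign_for_tap(junctions, signs, tap_point):
    # The tap scrolls the sign strip to the sign of the nearest junction.
    return signs[closest_junction(junctions, tap_point)]
```

The same lookup would serve the later-described behavior of tapping a section of the route, with the tap point compared against the junctions bounding that section.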
To navigate a set of route directions, the user thus has the option of scrolling through the signs overlaid on the map or of navigating the map itself to scroll through the different signs. Also, when the user taps a particular section of the route, the mapping application scrolls through the signs to display the sign corresponding to that particular section of the route. In some embodiments, the mapping application displays the sign for the junction of the route that is closest to the tapped portion. This scrolling feature of the mapping application allows the user to quickly ascertain all of the necessary maneuvers when traveling between two locations. This may be particularly useful in driving situations in which an upcoming turn is expected to require a large number of lane changes. In some embodiments, the directional arrows shown in the scrollable instruction signs are simple arrows. In other embodiments, when there is sufficient space on the sign or presentation for the use of a larger sign, the mapping application of some embodiments in the navigation mode identifies a maneuver to be performed at a junction along the route by using a larger graphical direction indicator that includes (1) a prominent stylized arrow roughly indicating the path of the vehicle, and (2) a de-emphasized set of lines and curves corresponding to the other elements of the junction. In some embodiments that use this approach, a right turn at a T-junction is indicated by a large arrow with a right angle joined with a smaller, dimmer segment that runs parallel to one of the segments of the large arrow. In some embodiments, the smaller segment is also pushed off to the side, so that the path taken by the vehicle dominates. Further details regarding such arrows are described in U.S. Patent Application Serial No. 13/632,121, filed on Sep.
30, 2012, entitled "Context-Aware Voice Guidance". U.S. Patent Application Serial No. 13/632,121 is incorporated herein by reference. In some embodiments, the mapping application replaces the instruction signs bearing directional icons with realistic-looking road traffic signs. Figure 54 illustrates an example of displaying a set of scrollable instruction signs for a particular route selected by a user. The mapping application is executed on a device 5400 having a relatively large display area. An example of such a device is a tablet device (e.g., an iPad®). The figure illustrates, in two different stages 5405 and 5410, the user's interaction with the mapping application to step through the instruction signs. When the mapping application executes on a device with a large display area, the application displays more signs in the display area at any given time. In some embodiments, the mapping application displays the signs in the top portion of the display area, with the current sign in the middle of that top portion. The number of signs that the mapping application can display varies depending on the orientation of the device. That is, the mapping application can display more signs when the display area is in a landscape orientation than when the display area is in a portrait orientation. The first stage 5405 shows the mapping application displaying three signs 5415 through 5425 and part of a fourth sign 5430. In this example, the signs 5415 through 5430 represent the first through fourth instructions of the selected route, which has a total of six steps. As indicated by the top bar 5435, the second instruction of the route is the current instruction, and the sign 5420 is highlighted and placed in the middle of the top portion of the display area to indicate that it shows the current instruction. The first stage 5405 also shows the user swiping the sign 5420 to the left.
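The sign-strip layout described above — the current sign centered in the top portion of the display, with the number of fully or partially visible signs depending on the display width (landscape showing more than portrait) — can be sketched as follows. The widths and names here are purely illustrative assumptions:

```python
# Sketch of the sign-strip layout described above. The sign width and
# display widths are illustrative assumptions, not actual UI metrics.

def visible_signs(num_signs, current, display_width, sign_width=200):
    """Return the indices of signs at least partially visible when the
    current sign is centered horizontally in the display."""
    center = display_width / 2
    visible = []
    for i in range(num_signs):
        # Each sign's left edge, laid out relative to the centered
        # current sign.
        left = center - sign_width / 2 + (i - current) * sign_width
        if left < display_width and left + sign_width > 0:
            visible.append(i)
    return visible
```

With this model, widening the display (rotating to landscape) can only keep or grow the set of visible signs, matching the orientation-dependent behavior described above.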
The second stage 5410 shows the mapping application displaying the sign 5425 for the third instruction of the route in the middle of the top portion of the display area. In this example, the mapping application has also highlighted the sign 5425, and the section of the route corresponding to this sign is highlighted, as shown. The top bar indicates that the current instruction is the third of the six instructions. Also, most of the sign 5415 has now slid out of the display area, and the sign 5430 is now fully displayed. The mapping application is also displaying a portion of a fifth sign 5445, which represents the fifth instruction of the route. The mapping application of some embodiments allows the user to switch to an overview mode while reviewing a selected route by scrolling through the instruction signs. In the overview mode, the mapping application of some embodiments adjusts the zoom level of the map so that the full route can appear on the map. The mapping application also allows the user to go back to the mode in which the user can resume reviewing the instructions. Figure 55 illustrates, in three stages 5505 through 5515, an example of the user's interaction with the application to switch to the overview mode while reviewing a selected route. The first stage 5505 is identical to the stage 5315 described above by reference to Figure 53. That is, the user has scrolled to the last instruction sign 5520. The next stage 5510 illustrates the selection of the overview control 5525. The third stage 5515 illustrates the map in the overview mode. The mapping application of some embodiments displays the map in the overview mode in response to receiving the selection of the overview control 5525. The mapping application has zoomed out the map so that the full route is displayed within the map.
In some cases, when the current location of the device is very close to the destination location (e.g., within 100 meters), the mapping application displays only the portion of the route from the current location of the device to the destination location. Also, the top bar 5530 shows that the destination (in this example, a police station) is 7 minutes or 0.5 miles away from the current location of the device or from the starting location of this particular route. The top bar 5530 now includes a resume control 5540, which in some embodiments is for resuming the navigation or inspection of the selected route. The mapping application also displays the list view control 235 on the map. The top bar 5530 also shows an end control 5570. When the mapping application receives a selection of the end control 5570 while the mapping application is displaying the overview of the selected route, the mapping application of some embodiments stops the inspection of the selected route by going back to the map browsing mode. The mapping application of some embodiments goes back to the map browsing mode by removing the selected route from the map, putting back the page curl, and replacing the information and controls in the top bar with a set of other controls, including a direction control, the search field, and a bookmark control. The mapping application of some embodiments does not shift the map to another region when switching from the inspection mode to the map browsing mode. The mapping application of some embodiments leaves pins for the starting and ending locations on the map when the mapping application enters the map browsing mode. Figure 56 conceptually illustrates a process 5600 that some embodiments perform to allow a user to browse the signs for a set of instructions for the junctions of a route between a starting location and an ending location. In some embodiments, the process 5600 is performed by the mapping application.
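The process 5600 just introduced is essentially an input-handling loop, and it can be condensed into a small sketch. The step numbers in the comments refer to Figure 56's operations as walked through in this section; the input kinds and function names are illustrative assumptions (the real process also pans and zooms the map):

```python
# Condensed sketch of process 5600's input-handling loop. Input kinds
# and names are illustrative assumptions, not the actual implementation.

def run_inspection(inputs, num_signs):
    """Consume a list of (kind, value) user inputs; return the final
    sign index and whether the overview is still showing."""
    sign = 0               # first junction's sign, displayed at 5615
    in_overview = False
    for kind, value in inputs:
        if in_overview:
            if kind == "end":                # end control (5655)
                return sign, False
            if kind == "resume":             # resume control (5660)
                in_overview = False          # redisplay the sign (5665)
            continue
        if kind == "move_sign":              # swipe/tap on a sign (5625, 5630)
            sign = max(0, min(num_signs - 1, sign + value))
        elif kind == "show_junction":        # pan or tap near a junction (5635, 5640)
            sign = value                     # may be a non-neighboring sign
        elif kind == "overview":             # overview control (5645, 5650)
            in_overview = True
        elif kind == "end":                  # end control (5670)
            return sign, False
    return sign, in_overview
```

A usage example: swiping forward twice, tapping back to the first junction, entering and leaving the overview, and swiping forward once leaves the second sign displayed.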
The process 5600 starts when the mapping application has computed one or more routes between the starting location and the ending location. The process 5600 begins by receiving (at 5605) a selection of a route. As shown in Figure 49 above, the mapping application of some embodiments provides a recommended route when two or more routes are generated between the starting location and the ending location. When the user does not select another route, the process takes the recommended route as the selected route upon receiving a selection of a start control, such as the start control 4950. Next, the process 5600 receives (at 5610) a user input for starting the browsing mode. In some embodiments, the mapping application enters the browsing mode when the user selects a start control, such as the start control 4950. At 5615, the process 5600 of some embodiments then displays on the map a sign for the first junction (i.e., the starting location) of the route and the junction corresponding to the sign (i.e., the first junction). The process 5600 then receives (at 5620) a user input. In some embodiments, a user input includes any gestural interaction with the mapping application. For example, the user can zoom or swipe the map by touching the map with one or more fingers. The user can also tap, swipe, etc. the currently displayed sign. The process 5600 then determines (at 5625) whether the user input is for moving the currently displayed sign. In some embodiments, the process 5600 determines that the user input is for moving the sign when the user taps the currently displayed sign or swipes the currently displayed sign in a certain direction. When the process 5600 determines that the user input is not for moving the currently displayed sign, the process 5600 proceeds to 5635, which is described further below.
When the process 5600 determines that the user input is for moving the currently displayed sign, the process 5600 displays (at 5630) a neighboring sign (when possible) based on the user input. For example, the process 5600 displays the next or previous sign for the next or previous junction of the route, based on the user input. The process 5600 also displays the corresponding junction of the route. In some embodiments, the process 5600 may zoom or shift to another region of the map in order to display the junction corresponding to the sign being displayed. The process 5600 then loops back to 5620 to receive another user input. When the process 5600 determines (at 5625) that the user input is not for moving the currently displayed sign, the process 5600 determines (at 5635) whether the input is for displaying a junction different from the currently displayed junction. In some embodiments, the process determines that the input is for displaying another junction when the user manipulates the map (e.g., by swiping, zooming, etc.) to display another region of the map, or when the user taps a portion of the route that is closer to another junction of the displayed route. When the process 5600 determines (at 5635) that the user input is not for displaying another junction, the process 5600 proceeds to 5645, which is described further below. When the process 5600 determines (at 5635) that the user input is for displaying another junction, the process 5600 displays (at 5640) the other junction and the corresponding sign for that junction. This sign may not be a neighboring sign of the currently displayed sign. The process 5600 then loops back to 5620 to receive another user input. When the process 5600 determines (at 5635) that the user input is not for displaying another junction, the process 5600 determines (at 5645) whether the user input is for presenting an overview of the route.
In some embodiments, the process 5600 determines that the input is for displaying an overview of the route when the mapping application receives a selection of an overview control. When the process 5600 determines (at 5645) that the user input is not for presenting an overview of the route, the process 5600 proceeds to 5670, which is described further below. When the process 5600 determines (at 5645) that the user input is for displaying an overview of the route, the process 5600 of some embodiments displays (at 5650) the full route in the map. The process also receives another user input while displaying the overview of the route. The process 5600 then determines (at 5655) whether the input is for ending the browsing mode. In some embodiments, the process 5600 determines that the input is for ending the browsing mode when the mapping application receives a selection of an end control, such as the end control 5570 described above by reference to Figure 55. When the process 5600 determines (at 5655) that the user input is for ending the browsing mode, the process 5600 ends. When the process 5600 determines (at 5655) that the user input is not for ending the browsing mode, the process determines (at 5660) whether the input is for exiting the overview of the route. In some embodiments, the process 5600 determines that the input is for exiting the overview when the mapping application receives a selection of a resume control, such as the resume control 5540 described above by reference to Figure 55. When the process 5600 determines (at 5660) that the input is not for exiting the overview mode, the process loops back to 5650 to display the route and receive another user input. When the process 5600 determines (at 5660) that the input is for exiting the overview mode, the process exits the overview of the route and displays (at 5665) the sign, and its corresponding junction, that were displayed before the overview was presented.
The process then loops back to 5620 to receive another user input. When the process 5600 determines (at 5645) that the input received (at 5620) is not an input for presenting an overview of the route, the process 5600 determines (at 5670) whether the input is for ending the browsing mode. In some embodiments, the process 5600 determines that the input is for ending the browsing mode when the mapping application receives a selection of an end control. When the process 5600 determines (at 5670) that the user input is for ending the browsing mode, the process 5600 ends. Otherwise, the process loops back to 5620 to receive another user input.

C. Navigation Mode

Figure 57 illustrates an example of a device 5700 that executes the mapping application of some embodiments. This figure also illustrates an example of launching route navigation in this application. Figure 57 shows six stages 5705, 5710, 5715, 5717, 5719, and 5721 of interaction with the mapping application. The first stage 5705 shows a UI 5720 that includes several icons of several applications in a dock area 5725 and on a page of the UI. One of the icons on this page is the icon for the mapping application 5730. The first stage shows the user selecting the mapping application through a touch contact with the device's screen at the location of this application's icon on the screen. The second stage 5710 shows the device after the mapping application has opened. As shown in this stage, the UI of the mapping application has a starting page that, in some embodiments, (1) displays a map of the current location of the device, and (2) has several UI controls arranged in a top bar 5740 and as floating controls. The third stage 5715 of Figure 57 illustrates that the selection of the direction control 5760 opens a direction entry page 5780, which is shown in the fourth stage 5717.
The direction control is one of three mechanisms through which the mapping application can be directed to identify and display a route between two locations; the other two mechanisms are (1) a control displayed in an information banner for a selected item in the map, and (2) recent routes that the device displays in the search field 5765. Accordingly, the information banner control and the search field 5765 are two UI tools that the application employs to make the transition between the different modalities seamless. The fourth stage 5717 illustrates the user selecting one of the entries 5782 that are automatically populated under the recent directions. The fifth stage 5719 then shows, on a 2D map view, three routes between the starting and ending locations specified through the page 5780. This stage also shows the selection of the second route and some information about this route in a bar at the top of the layout. This bar is shown to include start and end buttons. The start button is shown as being selected in the fifth stage. As shown in the sixth stage 5721, the selection of the start button directs the application to enter a turn-by-turn navigation mode. In this example, the application has entered a 2D turn-by-turn navigation mode. In other embodiments, the application will by default enter a 3D turn-by-turn navigation mode. In this mode, the application displays a realistic sign 5784 that identifies the distance to the next junction maneuver of the navigated route and some other pertinent information. The application also displays a top bar that includes some information about the navigation as well as end and overview buttons, for ending the navigation and for obtaining an overview of the remaining portion of the navigated route or (in other embodiments) the entire portion of the navigated route. The application further displays the floating 3D control 5750 and the floating list control described above.
It should be noted that the list control is adaptively added to the floating control cluster upon entry into the route inspection and route navigation modalities, and that the position indicator is removed from the floating controls upon entry into the route navigation modality. Also, upon the transition from the route inspection mode to the route navigation mode, the application in some embodiments performs an animation that involves the page curl fully unfurling before the application transitions into the navigation presentation. In some embodiments, the animated transition includes removing the top bar, its associated controls, and the floating controls from the navigation presentation, and moving the sign 5784 to the top edge of the presentation a short time period after the navigation presentation starts. As further described below, in some embodiments the application requires the user to tap the navigation map to bring back the top bar, its controls, and the floating controls, and requires another tap to remove these controls again from the map. Other embodiments provide other mechanisms for viewing and removing these controls. The navigation application of some embodiments can display navigation in either a 2D mode or a 3D mode. As mentioned above, one of the floating controls is the 3D control 5750, which allows a user to view a navigation presentation in three dimensions (3D). Figure 58 illustrates how the navigation application of some embodiments provides the 3D control 150 as a quick mechanism for entering a 3D navigation mode. This figure illustrates this operation in three stages 5805 through 5815. The first stage 5805 illustrates the user selecting the 3D control 150 while viewing a two-dimensional navigation presentation. The second stage 5810 illustrates the navigation presentation in the midst of its transition to a 3D presentation.
As shown in this figure, the 3D control appears highlighted at this stage, to indicate that the navigation presentation has entered the 3D mode. As mentioned above, the navigation application of some embodiments generates the 3D view of the navigation map by rendering the map view from a particular position in a three-dimensional scene that can conceptually be thought of as the position of a virtual camera that is capturing the map view. This rendering is further described below by reference to Figure 59. The third stage 5815 then illustrates the navigation presentation at the end of its transition to its 3D appearance. As shown by the difference between the heights of the buildings in the second and third stages, in some embodiments the transition from 2D to 3D navigation includes an animation that shows the three-dimensional objects in the navigation map becoming larger. The navigation application of some embodiments is capable of displaying a navigation map from multiple perspectives. The application can show the map in three dimensions (3D) or in two dimensions (2D). The 3D map is a simulated rendering of a virtual scene as seen by a virtual camera. Figure 59 presents a simplified example to illustrate the concept of the virtual camera 5912. When rendering a 3D navigation map, the virtual camera is a conceptualization of the position in the 3D map scene from which the device renders a 3D view of the scene. Figure 59 illustrates a location in a 3D navigation map scene 5910 that includes four objects: two buildings and two intersecting roads. To illustrate the virtual camera concept, this figure illustrates three scenarios, each of which corresponds to a different virtual camera location (i.e., a different rendering position) and a different resulting view that is displayed on the device. The first stage 5901 shows the virtual camera 5912 at a first position, pointing downward toward the 3D scene 5910 at an angle (e.g., a 30-degree angle with the horizon).
The application generates the 3D map view 5918 by rendering the 3D scene from the position and angle shown in stage 5901. From this position and angle, the camera points at a location that is a moving position in front of the device. The virtual camera 5912 is kept behind the current location of the device. "Behind the current location" in this case means backward along the navigation application's defined path, in the direction opposite to the current direction in which the device is moving. The navigation map view 5918 looks as though it were shot by a camera from above and behind the device's location indicator 5916. The location and angle of the virtual camera place the location indicator 5916 near the bottom of the navigation map view 5918. This also results in the majority of the screen being filled with the streets and buildings ahead of the current location of the device. In contrast, in some embodiments, the location indicator 5916 is in the center of the screen, with half of the screen representing things ahead of the device and the other half representing things behind the device. The second stage 5902 shows the virtual camera 5912 at a different position, pointing downward toward the scene 5910 at a second, larger angle (e.g., -45°). The application renders the scene 5910 from this angle, resulting in the 3D navigation map view 5928. The buildings and the roads are smaller than their illustration in the first navigation map view 5918. Again, the virtual camera 5912 is located above and behind the location indicator 5916 in the scene 5910. This again results in the location indicator appearing in the lower portion of the 3D map view 5928. The location and orientation of the camera also again result in the majority of the screen displaying what lies ahead of the car carrying the device, which is what someone navigating needs to know.
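The virtual-camera geometry described for stages 5901 and 5902 — the camera sitting above and behind the location indicator, with a steeper downward pitch at a greater distance making the rendered buildings and roads appear smaller — can be sketched with simple trigonometry. The numbers and function names are illustrative assumptions, not the application's actual rendering parameters:

```python
import math

# Illustrative sketch of the virtual-camera geometry described for
# stages 5901 and 5902. All numbers and names are assumptions.

def camera_pose(pitch_degrees, distance):
    """Camera position relative to the location indicator, for a
    camera aimed down at the scene at the given pitch (the angle
    below the horizon)."""
    rad = math.radians(pitch_degrees)
    height = distance * math.sin(rad)    # altitude above the map plane
    behind = distance * math.cos(rad)    # offset behind the indicator
    return height, behind

def apparent_scale(distance):
    # On-screen size of scene objects falls off with camera distance
    # (a simple pinhole-style model), so the farther second-stage
    # camera renders smaller buildings and roads.
    return 1.0 / distance
```

For example, a camera at a 30-degree pitch and unit distance 100 sits lower and closer than one at a 45-degree pitch and distance 150, and the latter renders the scene at a smaller apparent scale, matching the two stages above.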
The third stage 5903 shows the virtual camera 5912 in a top-down view, looking straight down at a location on a 2D map that corresponds to the location in the 3D map scene 5910 that was used to render the 3D views 5918 and 5928. The scene that is rendered from this perspective is the 2D map view 5938. Unlike the 3D rendering operations of the first and second stages, which in some embodiments are perspective 3D rendering operations, the rendering operation in the third stage is relatively simple, as this operation only needs to crop a portion of the 2D map that is identified by a zoom level specified by the application or the user. Accordingly, the virtual camera characterization in this situation somewhat unnecessarily complicates the description of the operation of the application, because cropping a portion of a 2D map is not a perspective rendering operation. In some embodiments, as further described below, the virtual camera can be moved by changing the zoom level for viewing the map after the map enters the 3D mode. In some of these embodiments, the application switches to a top-down mode that generates 2D views when the zoom level reaches a particular level (at which the rendering position faces straight down). As in the third stage 5903, in some embodiments the mapping application switches from rendering a 3D scene from a particular perspective to cropping a 2D scene when the camera switches from the 3D perspective view to a 2D top-down view. This is because, in these embodiments, the application is designed to use a simplified rendering operation that is easier and that does not generate unnecessary perspective artifacts. In other embodiments, however, the mapping application uses a perspective rendering operation to render a 3D scene from a top-down virtual camera position.
In these embodiments, the resulting 2D map view is somewhat different from the map view 5938 illustrated in the third stage 5903, because any object away from the center of the view is distorted, with the distortion being greater the farther the object is from the center of the view. The virtual camera 5912 moves along different trajectories in different embodiments. Two such trajectories 5950 and 5955 are illustrated in Figure 59. In both of these trajectories, the camera moves in an arc and rotates further downward as the camera moves upward along the arc. The trajectory 5955 differs from the trajectory 5950 in that, in the trajectory 5955, the camera also moves backward from the current location as it moves up the arc. While moving along one of the arcs, the camera rotates so as to maintain a point ahead of the location indicator at the focal point of the camera. In some embodiments, the user can turn off the three-dimensional view and navigate with a purely two-dimensional view. For example, the application of some embodiments allows the 3D mode to be turned on and off through the 3D button 5960. The 3D button 5960 is important to the turn-by-turn navigation feature, in which it acts as both an indicator and a toggle. When 3D is turned off, the camera maintains a 2D navigation experience; when 3D is turned on, there may still be some top-down perspectives when a 3D viewing angle is not appropriate (e.g., when going around a corner that would be obstructed in the 3D mode). As another way of allowing the user to obtain a navigation experience, the mapping application of some embodiments provides a UI item in the information banner that appears by a pin that represents a POI. Figure 60 illustrates, in three stages 6005 through 6015, an example of the user's interaction with the mapping application to obtain routing directions. This example is provided in the context of using the car icon 6030. The first stage 6005 illustrates a map in a 3D map view.
As shown, the 3D control 150 appears highlighted, to indicate that the map is in a 3D map view. The first stage 6005 also illustrates two information banners for the two pins of a search that was performed with the displayed search query "Pizza". The user selects the car icon 6030. As mentioned above, the car icon 6030 is for displaying one or more routes to the location indicated by the pin with which the banner that includes the car icon 6030 is associated. The banner 6040, which includes the car icon 6030, also displays a brief description of the place, a star rating, and an arrow for launching a detailed-information screen for the POI. The second stage 6010 illustrates that the mapping application of some embodiments displays two routes, Route 1 and Route 2, in response to the selection of the car icon 6030 in the previous stage 6005. The user has selected Route 1, as indicated by the highlighted banner 6050. The user also selects the start control 2060. As mentioned above, the start control 4950 in some embodiments is for starting the navigation according to the selected route. The third stage 6015 illustrates that the mapping application displays an instruction sign 6060, which is the sign for the first instruction. The mapping application has replaced the clear control 255 and the start control 2060 with an end control 5570 and an overview control 6075 in the top bar 140. The end control 5570 is for ending the navigation of the route, and the overview control 6075 is for showing the full route in the map view by adjusting the zoom level of the displayed map, if adjusting the zoom level is necessary to show the full route. In some embodiments, the mapping application displays in the top bar 140 the estimated arrival time, the amount of time it takes to get to the destination, and the remaining distance to the destination, as shown.
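The top-bar summary described above (estimated arrival time, time to destination, and remaining distance) can be sketched as a small computation. The constant-speed model and all names here are illustrative assumptions, not the application's actual estimation logic:

```python
from datetime import datetime, timedelta

# Sketch of the top-bar navigation summary described above. The
# constant average speed is an illustrative assumption; a real
# implementation would use traffic and route data.

def navigation_summary(remaining_miles, avg_speed_mph, now):
    """Return (estimated arrival time, minutes remaining, miles remaining)."""
    hours = remaining_miles / avg_speed_mph
    minutes = round(hours * 60)
    eta = now + timedelta(minutes=minutes)
    return eta, minutes, remaining_miles
```

For instance, with 0.5 miles remaining at an assumed 30 mph, the bar would show roughly one minute to the destination.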
When the mapping application receives a selection of the end control 5570 while operating in the route inspection mode, the mapping application of some embodiments stops inspecting the selected route and returns to the map browsing mode. The mapping application of some embodiments returns to the map browsing mode by removing the selected route from the map, putting the page curl back, and replacing the information and controls in the top bar with a set of controls including the navigation control, the search field, and the bookmark control. That is, the mapping application returns the appearance of the UI page to one similar to the UI page shown in the first stage 6005. The mapping application of some embodiments does not shift the map to another region when switching from the inspection mode to the map browsing mode. It should be noted that although the route history entries and the quick-route navigation control in the search field do not perform actions that cannot be achieved with a selectable directions item, they serve as important accelerators that make it much easier to obtain the most commonly requested routes. Some embodiments use a cinematic transition from the 2D map view to the 3D map view, or vice versa. For instance, when the mapping application receives a selection of the 3D control 150 while displaying the starting location of a route, the mapping application begins with the 2D map view and transitions smoothly from a first virtual camera view for the 2D map to a new virtual camera 3D view that is more zoomed in and points in the direction of the start of the route. In doing so, the virtual camera performs a combination of panning, zooming, and rotating operations to reach the start of the route for navigation. That is, the virtual camera moves along an arc and rotates further upward as it moves down the arc.
Also, the mapping application can rotate the arc itself to align the virtual camera viewpoint with the initial road segment of the route. In other words, the mapping application rotates the map during the cinematic transition. Figure 61 illustrates, in six stages 6105 through 6130, a device 6100 displaying a mapping application as the application transitions from a non-immersive map view for map browsing to an immersive map view for navigation. The first stage 6105 illustrates the user selecting the quick-route button for the location "Pizza Place" to generate a route from the user's current location (near the center of the screen of device 6100) to the selected location. The second stage 6110 illustrates that the mapping application displays the route 6135 for reaching the location "Pizza Place". In the second stage 6110, the user selects the "Start" UI control 6140, and the application begins navigation. As shown in the third through sixth stages 6115 through 6130, some embodiments use a cinematic transition from the 2D (or 3D) non-immersive map view to the 3D immersive map view. The application display begins with its current state (shown at 6110) and transitions smoothly from the first virtual camera view to the new virtual camera view, which is more zoomed in and points in the direction of the start of the route. In doing so, the virtual camera may perform a combination of panning, zooming, and rotating operations to reach the start of the route for navigation. As shown in these stages, the virtual camera moves and rotates into its final position, shown in the sixth stage 6130, behind the navigation location indicator (i.e., the puck).
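The cinematic transition described above blends panning, zooming, and rotating into one smooth camera move. The sketch below is an illustration only, not the patented implementation; the parameter names and the smoothstep easing curve are assumptions chosen to show how a combined interpolation of all camera parameters produces a single fluid transition.

```python
def cinematic_transition(start, end, steps=5):
    """Hypothetical sketch: interpolate a virtual camera from a browsing view
    to a navigation view, blending pan (x, y), zoom (altitude), and rotation
    (heading, pitch) in one smooth move.

    `start` and `end` are dicts of camera parameters; an ease-in/ease-out
    curve keeps the motion from starting or stopping abruptly.
    """
    def ease(t):  # smoothstep: zero velocity at both endpoints
        return t * t * (3 - 2 * t)

    frames = []
    for i in range(1, steps + 1):
        t = ease(i / steps)
        frames.append({k: start[k] + (end[k] - start[k]) * t for k in start})
    return frames

overhead = {"x": 0.0, "y": 0.0, "altitude": 1000.0, "heading": 0.0, "pitch": 90.0}
behind_puck = {"x": 120.0, "y": 80.0, "altitude": 150.0, "heading": 35.0, "pitch": 50.0}
frames = cinematic_transition(overhead, behind_puck)
assert frames[-1] == behind_puck   # the move ends exactly at the target view
```

Because every parameter is interpolated in the same pass, the camera appears to pan, zoom, and rotate simultaneously rather than performing three separate moves.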
V. Multi-Mode Mapping Application

Figures 62A and 62B conceptually illustrate a state diagram 6200 that depicts different states of the integrated mapping, search, and navigation application of some embodiments (e.g., the application described in the sections above) and the transitions between those states. Those of ordinary skill in the art will recognize that the application of some embodiments can have many different states relating to all the different types of input events, and that the state diagram 6200 is focused on a subset of those events. The state diagram 6200 depicts and refers to various gestural interactions (e.g., multi-touch gestures) for changing the state of the application. Those skilled in the art will recognize that various other interactions (such as cursor-controller gestures and button selections, keyboard input, touchpad/trackpad input, etc.) can also be used for similar selection operations. When the user initially opens the mapping application, the application is in state 6205, the map browsing state. In this state 6205, the application will have generated and displayed a map view. To generate and display this map view, the application of some embodiments identifies a required set of map tiles for a region, requests the map tiles (e.g., from a mapping service server), generates a view of the map tiles according to a particular location, orientation, and perspective of a virtual camera, and renders the map view to the device display. When in state 6205, the map view is static. With the application in state 6205, the user can perform numerous operations to modify the map view, search for entities (e.g., places of interest, addresses, etc.), retrieve routes for navigation, and the like. In some embodiments, the integrated application is displayed on a device having an integrated touch-sensitive display.
Various gestural interactions over the map cause the application to perform different modifications to the map view (e.g., panning, rotating, zooming, modifying the map perspective, etc.). When the integrated application receives a gestural interaction over the map display (as opposed to a touch input over the various floating or non-floating controls overlaid on the map display), the application transitions to state 6210 to perform gestural input recognition. The gestural input recognition state 6210 differentiates between different types of gestural input and translates these types of input into different map view modification operations. In some embodiments, the mapping application receives the gestural input as translated by the operating system of the device with the integrated touch-sensitive display. The operating system translates the touch input into gesture types and locations (e.g., a "tap" at coordinates (x, y), a "pinch" operation with separate touch inputs at two different locations, etc.). At state 6210, the integrated mapping application of some embodiments translates these into the different map view modification operations. When the application receives a first type of gestural input (e.g., two separate touch inputs moving together in a rotational motion over the map view), the application transitions to state 6215 to rotate the map. To rotate the map view, some embodiments modify the virtual camera that determines which portion of the map is rendered to create the map view, thereby modifying the location and/or orientation of the view. For instance, when in 3D mode, the mapping application rotates the virtual camera about a particular position (e.g., the center of the touch inputs, the center of the display, a location indicator identifying the user's location, etc.). As the first type of gestural input continues, the mapping application remains in state 6215 to continue rotating the map.
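The dispatch from the gestural input recognition state to the individual map-modification states can be sketched as a simple lookup. This is an illustration only; the gesture names are hypothetical stand-ins for the OS-translated gesture types, and only the state numbers come from the state diagram described in the text.

```python
# Hypothetical sketch of the gesture-recognition state (6210) dispatching
# OS-translated gestures to the map-modification states of diagram 6200.
BROWSE, RECOGNIZE, ROTATE, PAN, ZOOM, PERSPECTIVE = (
    "6205", "6210", "6215", "6220", "6225", "6235")

GESTURE_TO_STATE = {
    "two-finger rotate": ROTATE,              # first type of gestural input
    "single-finger drag": PAN,                # second type
    "pinch": ZOOM,                            # third type
    "two-finger vertical drag": PERSPECTIVE,  # fourth type
}

def recognize(gesture):
    """Map an OS-level gesture name to the next application state;
    unrecognized gestures fall back to the browsing state."""
    return GESTURE_TO_STATE.get(gesture, BROWSE)

assert recognize("pinch") == ZOOM
assert recognize("three-finger tap") == BROWSE
```

The same table-driven dispatch also models the filtering described later for the navigation mode, where entries not associated with the permitted operations are simply absent.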
When the user releases the first type of gestural input, the application of some embodiments transitions to state 6230 to perform an inertia calculation. In some embodiments, after the user releases certain types of touch inputs, the application continues to perform the associated map view modification for a particular amount of time and/or distance. In this case, after the user releases the rotation input, the application transitions to the inertia calculation state 6230 to calculate the additional rotation amount and the time over which this rotation should be performed. In some embodiments, the application slows the rotation from the map's current angular rate, as if a "frictional" force were applied to the map. As such, the inertia calculation of some embodiments is based on the speed of the first type of gestural input. From state 6230, the application transitions back to the map modification state that the application was previously in. That is, when the application transitions from state 6215 (the rotation state) to the inertia calculation state 6230, it then transitions back to state 6215 after performing the inertia calculation. After the rotation of the map is complete, the application transitions back to state 6205. When the application receives a second type of gestural input (e.g., a single touch input moving over the map view), the application transitions to state 6220 to pan the map. To pan the map view, some embodiments modify the virtual camera that determines which portion of the map is rendered to create the map view, thereby modifying the location of the view. This causes the map to appear to slide in a direction derived from the direction in which the second type of gestural input moves.
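The "frictional" deceleration described above can be modeled with constant-deceleration kinematics: if the gesture is released at speed v and friction decelerates it at rate a, the extra movement is v²/(2a) and it lasts v/a. The sketch below is an assumption-laden illustration, not the patented calculation; the function name and the constant-friction model are mine.

```python
def inertia(release_speed, friction=2.0):
    """Hypothetical constant-deceleration ("friction") inertia model for a
    released gesture.

    Given the speed at release (units/sec) and a deceleration rate
    (units/sec^2), return how much farther the map should rotate/pan/zoom
    and how long the extra movement lasts:
        distance = v^2 / (2a),  time = v / a
    """
    duration = release_speed / friction
    extra = release_speed ** 2 / (2 * friction)
    return extra, duration

# A faster flick coasts farther and longer than a slow one, matching the
# text's statement that the inertia is based on the speed of the gesture:
fast = inertia(8.0)   # -> (16.0, 4.0)
slow = inertia(2.0)   # -> (1.0, 1.0)
assert fast[0] > slow[0] and fast[1] > slow[1]
```

As the text notes later, each modification type (rotation, panning, zooming, perspective) may use its own deceleration constants rather than sharing one model.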
In some embodiments, when the map view is in the 3D perspective mode, the panning process involves performing a correlation of the location of the touch input to a location on the flat map, in order to avoid sudden unwanted jumps in the map view. As the second type of gestural input continues, the mapping application remains in state 6220 to continue panning the map. When the user releases the second type of gestural input, the application of some embodiments transitions to state 6230 to perform an inertia calculation. In some embodiments, after the user releases certain types of touch inputs, the application continues to perform the associated map view modification for a particular amount of time and/or distance. In this case, after the user releases the panning input, the application transitions to the inertia calculation state 6230 to calculate the additional amount to move the map view (i.e., move the virtual camera) and the time over which this movement should be performed. In some embodiments, the application slows the panning movement from the current rate at which the map is panning, as if a "frictional" force were applied to the map. As such, the inertia calculation of some embodiments is based on the speed of the second type of gestural input. From state 6230, the application transitions back to the map modification state that the application was previously in. That is, when the application transitions from state 6220 (the panning state) to the inertia calculation state 6230, it then transitions back to state 6220 after performing the inertia calculation. After the panning of the map is complete, the application transitions back to state 6205. When the application receives a third type of gestural input (e.g., two separate touch inputs moving closer together or further apart), the application transitions to state 6225 to zoom in on or out of the map.
To change the zoom level of the map view, some embodiments modify the virtual camera that determines which portion of the map is rendered to create the map view, thereby modifying the location (i.e., the height) of the view. This causes the map view to include more (if zooming out) or less (if zooming in) of the map. In some embodiments, as the user zooms in or out, the application retrieves different map tiles (for different zoom levels) to generate and render the new map view. As the third type of gestural input continues, the mapping application remains in state 6225 to continue zooming in on or out of the map. When the user releases the third type of gestural input, the application of some embodiments transitions to state 6230 to perform an inertia calculation. In some embodiments, after the user releases certain types of touch inputs, the application continues to perform the associated map view modification for a particular amount of time and/or distance (i.e., moving the virtual camera higher or lower). In this case, after the user releases the zoom input, the application transitions to the inertia calculation state 6230 to calculate the additional amount to zoom the map view (i.e., move the virtual camera) and the time over which this movement should be performed. In some embodiments, the application slows the zooming movement from the current rate at which the map is zooming in or out (i.e., the speed at which the virtual camera changes altitude), as if a "frictional" force were applied to the camera. As such, the inertia calculation of some embodiments is based on the speed of the third type of gestural input. From state 6230, the application transitions back to the map modification state that the application was previously in. That is, when the application transitions from state 6225 (the zooming state) to the inertia calculation state 6230, it then transitions back to state 6225 after performing the inertia calculation.
After the zooming of the map is complete, the application transitions back to state 6205. For simplicity, the state diagram 6200 illustrates the map panning, zooming, and rotation processes as using the same inertia calculation process (state 6230). However, in some embodiments, each of these different map modification processes actually uses a different inertia calculation to identify the slowing and stopping of its particular type of movement. In addition, some embodiments calculate and modify the inertia variables as the input is received rather than when the user removes the gestural input. When the application receives a fourth type of gestural input (e.g., two separate touch inputs moving up or down the touch-sensitive display in unison), the application transitions to state 6235 to modify the perspective view of the map. To change the perspective view of the map, some embodiments move the virtual camera along an arc over the map, thereby modifying both the location and the orientation of the virtual camera (as the camera keeps the center of its field of view at a particular location on the map). In some embodiments, different zoom levels use different arcs along which the virtual camera moves. Each of these arcs has a top point at which the virtual camera points straight down, providing a 2D perspective view of the map. In addition, each arc has a bottom point, which is the lowest point on the arc to which the virtual camera can move. Thus, in some embodiments, the fourth type of gestural input can cause the application to change between a 2D map view and a 3D perspective map view. As the fourth type of gestural input continues, the mapping application remains in state 6235 to continue modifying the perspective view of the map. When the user releases the fourth type of gestural input, the application of some embodiments transitions to state 6240 to perform an inertia calculation.
In some embodiments, after the user releases certain types of touch inputs, the application continues to perform the associated map view modification for a particular amount of time and/or distance (i.e., moving the virtual camera higher or lower). In this case, after the user releases the perspective-change input, the application transitions to the inertia calculation state 6240 to calculate the additional amount by which to modify the perspective of the map view (i.e., move the virtual camera along its arc) and the time over which this movement should be performed. In some embodiments, the application slows the movement from the current rate at which the map is changing its perspective angle (i.e., the speed at which the virtual camera moves along its arc), as if a "frictional" force were applied to the camera. As such, the inertia calculation of some embodiments is based on the speed with which the fourth type of gestural input was performed. In addition, for the perspective-change operation, some embodiments transition to a rebound calculation state 6245. As described, the perspective-change operation has, in some embodiments, allowed maximum and minimum perspective shifts, which may depend on the zoom level of the current map view. Thus, in addition to the inertia calculation, the application performs a rebound calculation at state 6245. The rebound calculation uses the inertia calculation to determine whether the maximum point along the virtual camera arc will be reached and, if so, the velocity of the virtual camera at that point. Some embodiments allow the virtual camera to move slightly past the maximum point to hit a "rebound" point, at which the application turns the virtual camera back around on its arc, moving it back toward the maximum point.
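The rebound behavior above can be sketched as a clamp with a small permitted overshoot. The following is an illustration under stated assumptions only: the function name, the degree units, and the fixed limits and overshoot are hypothetical (the text says the real limits may vary with zoom level), and the sketch returns endpoints rather than animating the bounce.

```python
def perspective_with_bounce(angle, min_angle=0.0, max_angle=75.0, overshoot=5.0):
    """Hypothetical rebound sketch: clamp a requested perspective angle to
    its allowed range, letting the camera travel slightly past the limit to
    a 'rebound' point before settling back at the limit.

    Returns (rebound_angle, settled_angle): the farthest angle shown during
    the bounce, and the angle at which the camera comes to rest.
    """
    if angle > max_angle:
        return min(angle, max_angle + overshoot), max_angle
    if angle < min_angle:
        return max(angle, min_angle - overshoot), min_angle
    return angle, angle   # within range: no bounce needed

# Pushing 12 degrees past a 75-degree limit bounces to 80 and settles at 75:
assert perspective_with_bounce(87.0) == (80.0, 75.0)
```

Applying the overshoot at only one end (say, the bottom of the arc) models the embodiments that include bounce-back at a single end rather than both.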
Some embodiments include this bounce-back functionality at only one end of the virtual camera arc (e.g., the bottom of the arc), while other embodiments include the functionality at both ends of the arc. From the rebound calculation state 6245, the application transitions back to the inertia calculation state 6240 and then back to the perspective-change state 6235 to display the map view movement. In addition, when the user performs the fourth type of touch input for long enough that the perspective angle reaches its maximum point, the application transitions directly from state 6235 to state 6245 to calculate the rebound information and then transitions back to state 6235. After the modification of the perspective view of the map is complete, the application transitions back to state 6205. The states above relate to the various multi-touch gestures over the map presentation that the integrated mapping, search, and navigation application translates into different modifications to the map presentation. Various other touch inputs can also cause the application to change state and perform various functions. For example, some embodiments overlay a 3D selectable item on the map view (e.g., as a floating control), and selecting (e.g., with a tap input) the 3D item causes the application to transition to state 6235 to modify the perspective of the map view. When the map view starts in a 3D perspective view, the application modifies the perspective into a 2D view; when the map view starts in a 2D view, the application modifies the perspective into a 3D view. After the modification, the application returns to state 6205. When the user is viewing the map in state 6205, the application presents various labels as part of the map view. Some of these labels indicate places of interest or other locations.
When the user selects certain labels (e.g., for certain businesses, parks, etc.), the application transitions to state 6250 to display a banner for the selected location (e.g., an information display banner), then returns to the map browsing state (with the banner displayed over the map). In some embodiments, this banner includes (1) a quick-route navigation UI control (e.g., a button) that causes the application to retrieve a route (e.g., a driving route) from the current location of the device to the selected location without leaving the map view, and (2) an information UI control (e.g., a button) that causes the application to provide additional information about the location. When the user selects the information UI control, the application transitions from state 6205 to state 6255 to display a staging area for the selected location. In some embodiments, this staging area displays a media presentation of the selected location (e.g., a 3D video presentation, a flyover view of the selected location, a series of images captured for the location, etc.), as well as various information for the selected location (contact information, reviews, etc.). The application remains in state 6255 as the user performs various operations to navigate the staging area and view the information within it. When the user selects a UI control to transfer back to the map view, the application transitions to state 6205. From the map browsing view, the user can also easily access the search function of the application. When a particular UI control (e.g., a search bar) is selected, the application transitions to a search entry suggestion state 6260. In the search entry state, some embodiments display a touchscreen keyboard with which the user can enter a search term. The search term may be a business name, an address, a type of location (e.g., coffee shops), etc.
As the user enters characters, the application remains in state 6260 and provides suggestions based on recent searches, the letters already entered, and the like. Some embodiments may use prefix-based suggestions (e.g., suggestions starting with the characters already entered) as well as other suggestions (e.g., spelling corrections that add characters at the beginning of the entered string, swap characters, etc.). In some embodiments, the selections may include recently entered routes in addition to locations. If the user selects a cancel UI control at this stage, the application transfers back to state 6205 without performing a search. When the user selects a search term (either a suggested term or a term entered entirely by the user), the application transitions to state 6265 to display the search results over the map view, then transitions to state 6205 with the search results displayed. Some embodiments display the search results as selectable items (e.g., pins) on the map; selection of one of the items causes a transition to state 6250 to display the banner for the selected item. In addition, the application of some embodiments automatically selects one of the search results (e.g., a "best" result) and displays this banner as part of state 6265. As the application is a tightly integrated mapping, search, route selection, and navigation application, the user can easily access the route selection function from the map browsing state. When a particular UI control (e.g., a route entry button) is selected, the application transitions to a route entry state 6270. In the route entry state, some embodiments display a touchscreen keyboard with which the user can enter locations (e.g., addresses, place names, place types, etc.) into both "to" and "from" fields in order to request a route.
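The prefix-based and recency-based suggestion behavior described for state 6260 can be sketched as a small ranking function. This is a simplified illustration, not the application's actual algorithm; the scoring scheme (recents before catalog entries, prefix matches before substring matches) and all names are assumptions.

```python
def suggest(typed, recents, catalog, limit=5):
    """Hypothetical sketch of search-entry suggestions: rank recent searches
    ahead of catalog entries, and prefix matches ahead of substring matches."""
    typed = typed.lower()

    def score(term, is_recent):
        t = term.lower()
        if t.startswith(typed):
            base = 0          # best: prefix match
        elif typed in t:
            base = 2          # weaker: substring match elsewhere in the term
        else:
            return None       # no match: excluded from suggestions
        return base + (0 if is_recent else 1)

    scored = []
    for is_recent, terms in ((True, recents), (False, catalog)):
        for term in terms:
            s = score(term, is_recent)
            if s is not None:
                scored.append((s, term))
    return [term for _, term in sorted(scored)][:limit]

results = suggest("piz", ["Pizza Place"], ["Pizzeria Uno", "Best Pizza", "Tacos"])
assert results[0] == "Pizza Place"     # recent prefix match ranks first
assert "Tacos" not in results
```

Recently entered routes could be merged into the same ranked list, matching the embodiments where route suggestions appear alongside location suggestions.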
As the user enters characters, the application remains in state 6270 and provides suggestions based on recent routes, recent searches, an autocomplete similar to that described for search entry, and the like. If the user selects a cancel UI control at this stage, the application transfers back to state 6205 without retrieving a route. When the user selects a route (e.g., by entering a "to" location and a "from" location), the application transitions to a route display state 6275. In this state, the application displays one or more routes from the first selected location to the second selected location over the map view (e.g., by overlaying route lines on the map view). Some embodiments automatically select the first of the routes. The user can select any of the other routes (e.g., by tapping over an unselected route), with the application remaining in state 6275 (but modifying the display of the routes to indicate the selection of the other route). In addition, when in state 6275, the application of some embodiments displays different UI controls related to route selection and navigation, including a directions list control, a navigation start control, and others. Also, various gestural interactions over the map displaying the route can cause the application to perform different modifications to the map view (e.g., panning, rotating, zooming, modifying the map perspective, etc.). When the integrated application receives a gestural interaction over the map display while in the route display state 6275, the application transitions to state 6210 to perform gestural input recognition, with all of the gestural map modification operations (e.g., the corollaries to states 6215 through 6245) available.
That is, the application translates the gestural input into panning, rotation, zooming, and/or perspective-change operations similar to those described above for states 6215 through 6245, with similar inertia and rebound features for the virtual camera movement. Whereas the operations 6215 through 6245 return to the map browsing state 6205, the corollary operations accessed from the route display state 6275 return to the route display state 6275. In some embodiments, the route display state 6275 is accessible from other states as well. For instance, if the user selects the quick-route UI control on a banner while in state 6205, the application retrieves one or more routes from the current location of the device to the location with which the banner is associated. In addition, some embodiments display previously requested routes among the search suggestions at state 6260. When the user selects one of these suggested routes, the application transitions directly from state 6260 to state 6275 to display one or more routes over the map. From the route display state 6275, the application can transition into various different modes depending on the different controls selected by the user. When the user selects a UI control to clear the routes, the application transitions back to state 6205 to display the map without any routes. In addition, the integrated application may enter one or more navigation modalities from the route display state 6275. When the selected route displayed at state 6275 starts at the current location of the device and the user selects a navigation start control, the application transitions to the navigation state 6280. In some embodiments, the application displays a cinematic transition from the map view into a more immersive 3D view for navigation. Within the navigation state 6280 of some embodiments, a virtual camera follows the location of the user along the selected route in order to present the upcoming portions of the route.
When either the route is completed (the device reaches the destination location) or the user selects a control to end navigation, the application transitions to state 6205 to present the map browsing view 6205. In some embodiments, while in the navigation mode 6280, various gestural interactions over the map displaying the route can cause the application to perform different modifications to the map view (e.g., panning, rotating, zooming, modifying the map perspective, etc.). In some embodiments, only some of the described map modification operations are available in the navigation mode. For instance, some embodiments allow the user to zoom in or out, but do not allow any other modifications to the map. Thus, when the user provides a gestural input, the gestural input recognition state 6210 filters out the types of gestural input not associated with the zoom operation (and the application then returns to state 6280). When a type of gestural input associated with the zoom operation is received, the gestural input recognition state recognizes this input and the application transitions to a state similar to state 6225 for changing the zoom level of the map (with, in some embodiments, the inertia calculation). Other embodiments may enable different map modification operations. For instance, in some embodiments all of the gestural map modification operations (e.g., the corollaries to states 6215 through 6245) are available while in the navigation mode. Some embodiments allow a subset of the gestural map modification operations, such as zooming and a limited panning operation. Upon receiving a type of gestural input associated with panning, the panning operation of some embodiments moves the virtual camera (while in the navigation mode) to the side, then returns the virtual camera back to pointing along the route.
Whereas the operations 6215 through 6245 return to the map browsing state 6205, the corollary operations accessed from the navigation state 6280 return to the navigation state 6280. When the selected route displayed at state 6275 starts at a location other than the current location of the device (or the route is a walking route) and the user selects the navigation start control, the application transitions into a stepping mode, or route inspection mode, at state 6285. In some embodiments, the application displays the maneuvers to be performed along the route one at a time (e.g., as navigation signs). By providing gestural input (e.g., swipe gestures) to the maneuvers, the user can view the different maneuvers while in the route inspection mode. The maneuvers are overlaid on the map, and at least a portion of the route is displayed on the map. As in the route display mode, various gestural interactions over the map can cause the application to perform different modifications to the map view (e.g., panning, rotating, zooming, modifying the map perspective, etc.). When the integrated application receives a gestural interaction over the map display while in the stepping mode 6285, the application transitions to state 6210 to perform gestural input recognition, with all of the gestural map modification operations (e.g., the corollaries to states 6215 through 6245) available. That is, the application translates the gestural input into panning, rotation, zooming, and/or perspective-change operations similar to those described above for states 6215 through 6245, with similar inertia and rebound features for the virtual camera movement. Whereas the operations 6215 through 6245 return to the map browsing state 6205, the corollary operations accessed from the stepping mode 6285 return to the stepping mode 6285. Furthermore, in some embodiments, the gestural input recognition recognizes at least one type of gestural input over the displayed maneuvers in order to switch between the maneuvers.
When the particular type of gestural input (e.g., a swipe gesture) is received over the displayed maneuver (as opposed to over the map view), the application transitions to a state (not shown) for changing the displayed maneuver, then returns to state 6285. When the integrated application receives a gestural interaction over the map displayed while in the stepping state 6285, the application transitions to state 6210 to perform gestural input recognition, with all of the gestural map modification operations (e.g., the corollaries to states 6215 through 6245) available. When the modification operations are complete, the application returns to state 6285. When the user selects a control to end stepping through the maneuvers, the application transitions to state 6205 to present the map browsing view. In addition, in some embodiments the application can transition from the stepping mode 6285 to an auto-stepping state 6290. When the user selects a location tracking control while the application is in state 6285, the application transitions to the auto-stepping mode 6290, which is a different navigation modality. When in the auto-stepping mode of some embodiments, the integrated mapping, search, and navigation application displays the maneuver to which the device's location is closest (e.g., as measured by the juncture at which the maneuver is performed). As the device moves (e.g., along the route) to a location closer to a different maneuver, the auto-stepping mode automatically displays that different maneuver. When the user deselects the location tracking control, the application transitions back to the stepping mode 6285. When the user selects a control to end navigation while in the auto-stepping state 6290, the application transitions to state 6205 to present the map browsing view. As in the stepping mode 6285, various gestural interactions over the map can cause the application to perform different modifications to the map view (e.g., panning, rotating, zooming, modifying the map perspective, etc.).
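The auto-stepping behavior above (display the instruction whose juncture is closest to the device) reduces to a nearest-point query. The sketch below is an illustration only; the flat (x, y) coordinates and all names are hypothetical stand-ins for real map positions, and a real implementation would use geographic distance.

```python
import math

def nearest_maneuver(device_xy, maneuvers):
    """Hypothetical auto-stepping sketch (state 6290): pick the maneuver
    whose juncture is closest to the device's current position.
    `maneuvers` maps a maneuver label to the (x, y) of its juncture."""
    dx, dy = device_xy
    return min(
        maneuvers,
        key=lambda m: math.hypot(maneuvers[m][0] - dx, maneuvers[m][1] - dy),
    )

route = {
    "Turn right on 1st St": (0.0, 10.0),
    "Turn left on Main St": (0.0, 50.0),
    "Arrive at destination": (0.0, 90.0),
}
# As the device moves along the route, the displayed maneuver advances:
assert nearest_maneuver((0.0, 5.0), route) == "Turn right on 1st St"
assert nearest_maneuver((0.0, 60.0), route) == "Turn left on Main St"
```

Re-running this query as location updates arrive gives exactly the described behavior: the displayed sign changes automatically once the device is nearer to a different juncture.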
When the integrated application receives a gestural interaction over the map display while in the auto-stepping mode 6290, the application transitions to state 6210 to perform gestural input recognition, with all of the gestural map modification operations (e.g., the corollaries to states 6215 through 6245) available. That is, the application translates the gestural input into panning, rotation, zooming, and/or perspective-change operations similar to those described above for states 6215 through 6245, with similar inertia and rebound features for the virtual camera movement. Whereas the operations 6215 through 6245 return to the map browsing state 6205, the corollary operations accessed from the auto-stepping mode 6290 return to the auto-stepping mode 6290. In addition, some embodiments automatically turn the location tracking control off when the user pans the map a particular distance, in which case the application returns to the stepping mode state 6285 rather than the auto-stepping state 6290.

VI. Electronic System

Many of the above-described features and applications are implemented as software processes that are specified as a set of instructions recorded on a computer-readable storage medium (also referred to as a computer-readable medium). When these instructions are executed by one or more computational or processing units (e.g., one or more processors, cores of processors, or other processing units), they cause the processing units to perform the actions indicated in the instructions. Examples of computer-readable media include, but are not limited to, CD-ROMs, flash drives, random-access memory (RAM) chips, hard drives, erasable programmable read-only memories (EPROMs), and electrically erasable programmable read-only memories (EEPROMs). The computer-readable media does not include carrier waves and electronic signals passing wirelessly or over wired connections.
In this specification, the term "software" is meant to include firmware residing in read-only memory or applications stored in magnetic storage that can be read into memory for processing by a processor. Also, in some embodiments, multiple software inventions can be implemented as sub-parts of a larger program while remaining distinct software inventions. In some embodiments, multiple software inventions can also be implemented as separate programs. Finally, any combination of separate programs that together implement a software invention described here is within the scope of the invention. In some embodiments, the software programs, when installed to operate on one or more electronic systems, define one or more specific machine implementations that execute and perform the operations of the software programs.

A. Mobile Device

The multi-mode mapping application of some embodiments operates on mobile devices, such as smartphones (e.g., iPhone®) and tablets (e.g., iPad®). Figure 63 is an example of an architecture 6300 of such a mobile computing device. Examples of mobile computing devices include smartphones, tablets, laptops, etc. As shown, the mobile computing device 6300 includes one or more processing units 6305, a memory interface 6310, and a peripherals interface 6315. The peripherals interface 6315 is coupled to various sensors and subsystems, including a camera subsystem 6320, a wireless communication subsystem 6325, an audio subsystem 6330, an I/O subsystem 6335, etc. The peripherals interface 6315 enables communication between the processing units 6305 and the various peripherals. For example, an orientation sensor 6345 (e.g., a gyroscope) and an acceleration sensor 6350 (e.g., an accelerometer) are coupled to the peripherals interface 6315 to facilitate orientation and acceleration functions.
The camera subsystem 6320 is coupled to one or more optical sensors 6340 (e.g., a charge-coupled device (CCD) optical sensor, a complementary metal-oxide-semiconductor (CMOS) optical sensor, etc.). The camera subsystem 6320 coupled with the optical sensors 6340 facilitates camera functions, such as image and/or video capture. The wireless communication subsystem 6325 serves to facilitate communication functions. In some embodiments, the wireless communication subsystem 6325 includes radio frequency receivers and transmitters, and optical receivers and transmitters (not shown in Figure 63). These receivers and transmitters of some embodiments are implemented to operate over one or more communication networks, such as a GSM network, a Wi-Fi network, a Bluetooth network, etc. The audio subsystem 6330 is coupled to a speaker to output audio (e.g., to output voice navigation instructions). Additionally, the audio subsystem 6330 is coupled to a microphone to facilitate voice-enabled functions, such as voice recognition (e.g., for searching), digital recording, etc. The I/O subsystem 6335 handles the transfer between input/output peripheral devices (such as a display, a touchscreen, etc.) and the data bus of the processing units 6305 through the peripherals interface 6315. The I/O subsystem 6335 includes a touchscreen controller 6355 and other input controllers 6360 to facilitate this transfer. As shown, the touchscreen controller 6355 is coupled to a touchscreen 6365. The touchscreen controller 6355 detects contact and movement on the touchscreen 6365 using any of multiple touch-sensitivity technologies. The other input controllers 6360 are coupled to other input/control devices, such as one or more buttons.
Some embodiments include a near-touch-sensitive screen and a corresponding controller that can detect near-touch interactions instead of, or in addition to, touch interactions. The memory interface 6310 is coupled to memory 6370. In some embodiments, the memory 6370 includes volatile memory (e.g., high-speed random access memory), non-volatile memory (e.g., flash memory), a combination of volatile and non-volatile memory, and/or any other type of memory. As illustrated in Figure 63, the memory 6370 stores an operating system (OS) 6372. The OS 6372 includes instructions for handling basic system services and for performing hardware-dependent tasks. The memory 6370 also includes: communication instructions 6374 to facilitate communicating with one or more additional devices; graphical user interface instructions 6376 to facilitate graphical user interface processing; image processing instructions 6378 to facilitate image-related processing and functions; input processing instructions 6380 to facilitate input-related (e.g., touch input) processes and functions; audio processing instructions 6382 to facilitate audio-related processes and functions; and camera instructions 6384 to facilitate camera-related processes and functions. The instructions described above are merely exemplary, and in some embodiments the memory 6370 includes additional and/or other instructions. For instance, the memory for a smartphone may include phone instructions to facilitate phone-related processes and functions. Additionally, the memory may include instructions for a multi-mode mapping application. The instructions identified above need not be implemented as separate software programs or modules. Various functions of the mobile computing device can be implemented in hardware and/or in software, including in one or more signal processing and/or application-specific integrated circuits.
While the components illustrated in Figure 63 are shown as separate components, one of ordinary skill in the art will recognize that two or more components may be integrated into one or more integrated circuits. In addition, two or more components may be coupled together by one or more communication buses or signal lines. Also, while many of the functions have been described as being performed by one component, one of ordinary skill in the art will realize that the functions described with respect to Figure 63 may be split into two or more integrated circuits.

B. Computer System

Figure 64 conceptually illustrates another example of an electronic system 6400 with which some embodiments of the invention are implemented. The electronic system 6400 may be a computer (e.g., a desktop computer, personal computer, tablet computer, etc.), a phone, a PDA, or any other sort of electronic or computing device. Such an electronic system includes various types of computer-readable media and interfaces for various other types of computer-readable media. The electronic system 6400 includes a bus 6405, processing unit(s) 6410, a graphics processing unit (GPU) 6415, a system memory 6420, a network 6425, a read-only memory 6430, a permanent storage device 6435, input devices 6440, and output devices 6445. The bus 6405 collectively represents all system, peripheral, and chipset buses that communicatively connect the numerous internal devices of the electronic system 6400. For instance, the bus 6405 communicatively connects the processing unit(s) 6410 with the read-only memory 6430, the GPU 6415, the system memory 6420, and the permanent storage device 6435. From these various memory units, the processing unit(s) 6410 retrieve instructions to execute and data to process in order to carry out the processes of the invention. The processing unit(s) may be a single processor or a multi-core processor in different embodiments. Some instructions are passed to and executed by the GPU 6415.
The GPU 6415 can offload various computations or complement the image processing provided by the processing unit(s) 6410. In some embodiments, such functionality can be provided using CoreImage's kernel shading language. The read-only memory (ROM) 6430 stores static data and instructions that are needed by the processing unit(s) 6410 and other modules of the electronic system. The permanent storage device 6435, on the other hand, is a read-and-write memory device. This device is a non-volatile memory unit that stores instructions and data even when the electronic system 6400 is off. Some embodiments of the invention use a mass storage device (such as a magnetic or optical disk and its corresponding disk drive, or integrated flash memory) as the permanent storage device 6435. Other embodiments use a removable storage device (such as a floppy disk or flash memory device, and its corresponding drive) as the permanent storage device. Like the permanent storage device 6435, the system memory 6420 is a read-and-write memory device. Unlike the storage device 6435, however, the system memory 6420 is a volatile read-and-write memory, such as random access memory. The system memory 6420 stores some of the instructions and data that the processor needs at runtime. In some embodiments, the invention's processes are stored in the system memory 6420, the permanent storage device 6435, and/or the read-only memory 6430. From these various memory units, the processing unit(s) 6410 retrieve instructions to execute and data to process in order to carry out the processes of some embodiments. The bus 6405 also connects to the input and output devices 6440 and 6445. The input devices 6440 enable the user to communicate information and select commands to the electronic system.
The input devices 6440 include alphanumeric keyboards and pointing devices (also called "cursor control devices"), cameras (e.g., webcams), microphones or similar devices for receiving voice commands, etc. The output devices 6445 display images generated by the electronic system or otherwise output data. The output devices 6445 include printers and display devices, such as cathode ray tubes (CRT) or liquid crystal displays (LCD), as well as speakers or similar audio output devices. Some embodiments include devices that function as both input and output devices, such as touchscreens. Finally, as shown in Figure 64, the bus 6405 also couples the electronic system 6400 to a network 6425 through a network adapter (not shown). In this manner, the computer can be a part of a network of computers, such as a local area network ("LAN"), a wide area network ("WAN"), an intranet, or a network of networks such as the Internet. Any or all components of the electronic system 6400 may be used in conjunction with the invention. Some embodiments include electronic components, such as microprocessors, storage, and memory, that store computer program instructions in a machine-readable or computer-readable medium (alternatively referred to as computer-readable storage media, machine-readable media, or machine-readable storage media). Some examples of such computer-readable media include RAM, ROM, read-only compact discs (CD-ROM), recordable compact discs (CD-R), rewritable compact discs (CD-RW), read-only digital versatile discs (e.g., DVD-ROM, dual-layer DVD-ROM), a variety of recordable/rewritable DVDs (e.g., DVD-RAM, DVD-RW, DVD+RW, etc.), flash memory (e.g., SD cards, mini-SD cards, micro-SD cards, etc.), magnetic and/or solid-state hard drives, read-only and recordable Blu-Ray® discs, ultra-density optical discs, any other optical or magnetic media, and floppy disks.
The computer-readable media may store a computer program that is executable by at least one processing unit and that includes sets of instructions for performing various operations. Examples of computer programs or computer code include machine code, such as is produced by a compiler, and files including higher-level code that are executed by a computer, an electronic component, or a microprocessor using an interpreter. While the above discussion primarily refers to microprocessors or multi-core processors that execute software, some embodiments are performed by one or more integrated circuits, such as application-specific integrated circuits (ASICs) or field-programmable gate arrays (FPGAs). In some embodiments, such integrated circuits execute instructions that are stored on the circuit itself. In addition, some embodiments execute software stored in programmable logic devices (PLDs), ROM, or RAM devices. As used in this specification and any claims of this application, the terms "computer", "server", "processor", and "memory" all refer to electronic or other technological devices. These terms exclude people or groups of people. For the purposes of this specification, the term "display" means displaying on an electronic device. As used in this specification and any claims of this application, the terms "computer-readable medium", "computer-readable media", and "machine-readable medium" are entirely restricted to tangible, physical objects that store information in a form that is readable by a computer. These terms exclude any wireless signals, wired download signals, and any other ephemeral signals.

VII. Map Service Environment

Various embodiments may operate within a map service operating environment. Figure 65 illustrates a map service operating environment according to some embodiments.
A map service 6530 (also referred to as a mapping service) may provide map services for one or more client devices 6502a-6502c in communication with the map service 6530 through various communication methods and protocols. In some embodiments, the map service 6530 provides map information and other map-related data, such as two-dimensional map image data (e.g., an aerial view of roads utilizing satellite imagery), three-dimensional map image data (e.g., a traversable map with three-dimensional features, such as buildings), route and direction calculations (e.g., ferry route calculations or directions between two points for a pedestrian), real-time navigation data (e.g., turn-by-turn visual navigation data in two or three dimensions), location data (e.g., where the client device is currently located), and other geographic data (e.g., wireless network coverage, weather, traffic information, or nearby points of interest). In various embodiments, the map service data may include localized labels for different countries or regions. Localized labels may be utilized to present map labels (e.g., street names, city names, points of interest) in different languages on client devices. The client devices 6502a-6502c may utilize these map services by obtaining map service data. The client devices 6502a-6502c may implement various techniques to process the map service data. The client devices 6502a-6502c may then provide map services to various entities, including, but not limited to, users, internal software or hardware modules, and/or other systems or devices external to the client devices 6502a-6502c. In some embodiments, the map service is implemented by one or more nodes in a distributed computing system. Each node may be assigned one or more services or components of a map service. Some nodes may be assigned the same map service or component of a map service.
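The localized-label behavior above can be sketched with a minimal selection function. The patent describes localized labels but not a data format, so the shape below (a mapping from language code to display string, plus an ordered list of preferred languages) is an illustrative assumption.

```python
# Minimal sketch of localized map-label selection. The wire format is not
# specified by the text; a per-language dictionary is assumed here.

def pick_label(labels: dict, preferred_langs: list, default_lang: str = "en") -> str:
    """Return the label in the first preferred language available,
    falling back to the default language, then to any label at all."""
    for lang in preferred_langs:
        if lang in labels:
            return labels[lang]
    if default_lang in labels:
        return labels[default_lang]
    # Deterministic last resort: the first available label by language code.
    return labels[sorted(labels)[0]]

street = {"en": "Market Street", "zh-TW": "市場街", "ja": "マーケット通り"}
```

A client could apply such a function per label when rendering, so the same tile data serves users with different language preferences.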
In some embodiments, a load-balancing node distributes access or requests to other nodes within the map service. In some embodiments, the map service is implemented as a single system, such as a single server. Different modules or hardware devices within a server may implement one or more of the various services provided by the map service. In some embodiments, the map service provides map services by generating map service data in various formats. In some embodiments, one format of map service data is map image data. Map image data provides image data to a client device so that the client device may process the image data (e.g., render and/or display the image data as a two-dimensional or three-dimensional map). Map image data, whether two- or three-dimensional, may specify one or more map tiles. A map tile may be a portion of a larger map image. Assembling the map tiles of a map produces the original map. Tiles may be generated from map image data, routing or navigation data, or any other map service data. In some embodiments, map tiles are raster-based map tiles, with tile sizes ranging both larger and smaller than the commonly used 256-pixel-by-256-pixel tile. Raster-based map tiles may be encoded in any number of standard digital image representations, including, but not limited to, Bitmap (.bmp), Graphics Interchange Format (.gif), Joint Photographic Experts Group (.jpg, .jpeg, etc.), Portable Network Graphics (.png), or Tagged Image File Format (.tiff). In some embodiments, map tiles are vector-based map tiles, encoded using vector graphics, including, but not limited to, Scalable Vector Graphics (.svg) or Drawing File (.drw). Some embodiments also include tiles with a combination of vector and raster data.
Metadata or other information pertaining to a map tile may also be included within or along with the tile, providing further map service data to a client device. In various embodiments, a map tile is encoded for transport utilizing various standards and/or protocols, some of which are described in examples below. In various embodiments, map tiles may be constructed from image data of different resolutions depending on zoom level. For instance, for low zoom levels (e.g., a world or globe view), the resolution of the map or image data need not be as high relative to the resolution at a high zoom level (e.g., city or street level). For example, when in a globe view, there may be no need to render street-level artifacts, as such objects would be so small as to be negligible in many cases. In some embodiments, the map service performs various techniques to analyze a map tile before encoding the tile for transport. This analysis optimizes map service performance for both client devices and the map service. In some embodiments, map tiles are analyzed for complexity according to vector-based graphics techniques, and constructed utilizing complex and non-complex layers. Map tiles may also be analyzed for common image data or patterns that may be rendered as image textures, and constructed by relying on image masks. In some embodiments, raster-based image data in a map tile contains certain mask values, which are associated with one or more textures. Some embodiments also analyze map tiles for specified features that may be associated with certain map styles that contain style identifiers. In some embodiments, other map services generate map service data separate from map tiles, relying on various data formats.
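The zoom-dependent resolution argument above can be quantified. Under the same assumed 256-pixel Web Mercator tiling (the text itself does not fix a scheme), the ground distance covered by one pixel halves with each zoom-level increase, which is why globe-level tiles need far coarser source imagery than street-level tiles.

```python
import math

# Ground resolution per pixel under an assumed 256-pixel Web Mercator
# tiling: circumference / (256 * 2^zoom), scaled by cos(latitude).

EARTH_CIRCUMFERENCE_M = 40_075_016.686  # equatorial circumference, meters

def meters_per_pixel(zoom: int, lat_deg: float = 0.0) -> float:
    """Ground meters covered by one pixel at a given zoom and latitude."""
    return (EARTH_CIRCUMFERENCE_M * math.cos(math.radians(lat_deg))
            / (256 * 2 ** zoom))
```

At zoom 0 one pixel spans roughly 156 km at the equator, while at zoom 19 it spans about 30 cm; street-level artifacts rendered at zoom 0 would occupy far less than a pixel, supporting the text's point that they can be omitted.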
For instance, map services that provide location data may utilize data formats conforming to location service protocols, such as, but not limited to, the Radio Resource Location services Protocol (RRLP), TIA 801 for Code Division Multiple Access (CDMA), the Radio Resource Control (RRC) position protocol, or the LTE Positioning Protocol (LPP). Embodiments may also receive or request data from client devices identifying device capabilities or attributes (e.g., hardware specifications or operating system version) or communication capabilities (e.g., device communication bandwidth as determined by wireless signal strength or wired or wireless network type). A map service may obtain map service data from internal or external sources. For example, satellite imagery used in map image data may be obtained from external services, or from internal systems, storage devices, or nodes. Other examples may include, but are not limited to, GPS assistance servers, wireless network coverage databases, business or personal directories, weather data, government information (e.g., construction updates or road name changes), or traffic reports. Some embodiments of a map service may update map service data (e.g., wireless network coverage) for analyzing future requests from client devices. Various embodiments of a map service respond to client device requests for map services. These requests may be for a specific map or a portion of a map. Some embodiments format requests for a map as requests for certain map tiles. In some embodiments, requests also supply the map service with a starting location (or current location) and a destination location for a route calculation. A client device may also request map service rendering information, such as map textures or style sheets. In at least some embodiments, requests are also one of a series of requests implementing turn-by-turn navigation.
Requests for other geographic data may include, but are not limited to, requests for current location, wireless network coverage, weather, traffic information, or nearby points of interest. In some embodiments, the map service analyzes client device requests to optimize a device or map service operation. For instance, the map service may recognize that the location of a client device is in an area of poor communications (e.g., weak wireless signal) and send more map service data to supply the client device in the event of a loss of communication, or send instructions to utilize different client hardware (e.g., orientation sensors) or software (e.g., utilizing wireless location services or Wi-Fi positioning instead of GPS-based services). In another example, the map service may analyze a client device request for vector-based map image data and determine that raster-based map data better optimizes the map image data according to the image's complexity. Embodiments of other map services may perform similar analysis on client device requests, and as such, the above examples are not intended to be limiting. Various embodiments of client devices (e.g., the client devices 6502a-6502c) are implemented on different types of portable multifunction devices. The client devices 6502a-6502c utilize the map service 6530 through various communication methods and protocols. In some embodiments, the client devices 6502a-6502c obtain map service data from the map service 6530. The client devices 6502a-6502c request or receive map service data. The client devices 6502a-6502c then process the map service data (e.g., render and/or display the data) and may send the data to another software or hardware module on the device or to an external device or system. A client device, according to some embodiments, implements techniques to render and/or display maps.
These maps may be requested or received in various formats, such as the map tiles described above. A client device may render a map in two-dimensional or three-dimensional views. Some embodiments of a client device display a rendered map and allow a user, system, or device providing input to manipulate a virtual camera in the map, changing the map display according to the virtual camera's position, orientation, and field of view. Various forms of input and input devices are implemented to manipulate the virtual camera. In some embodiments, touch input, through certain single or combination gestures (e.g., touch-and-hold or a swipe), manipulates the virtual camera. Other embodiments allow manipulation of the device's physical location to manipulate the virtual camera. For instance, a client device may be tilted up from its current position to manipulate the virtual camera to rotate up. In another example, a client device may be tilted forward from its current position to move the virtual camera forward. Other input devices to the client device may be implemented, including, but not limited to, auditory input (e.g., spoken words), a physical keyboard, a mouse, and/or a joystick. Some embodiments provide various visual feedback for virtual camera manipulations, such as displaying an animation of possible virtual camera manipulations when transitioning from a two-dimensional map view to a three-dimensional map view. Some embodiments also allow input to select a map feature or object (e.g., a building) and highlight the object, producing a blur effect that maintains the virtual camera's perception of three-dimensional space. In some embodiments, a client device implements a navigation system (e.g., turn-by-turn navigation). The navigation system provides directions or route information, which may be displayed to a user. Some embodiments of a client device request directions or a route calculation from a map service.
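The virtual-camera model described above (a position, an orientation, and a field of view that together determine the map display) can be sketched as follows. The attribute names, the pitch clamp, and the mapping from device tilt to camera motion are illustrative assumptions, not the patent's implementation.

```python
import math
from dataclasses import dataclass

# Minimal illustrative virtual camera. The text describes manipulating a
# camera's position, orientation, and field of view in response to input
# (e.g., tilting the device); the specific mapping below is assumed.

@dataclass
class VirtualCamera:
    x: float = 0.0            # map-plane position (east)
    y: float = 0.0            # map-plane position (north)
    altitude_m: float = 1000.0
    pitch_deg: float = 0.0    # 0 = straight down (2D view); larger tilts toward 3D
    heading_deg: float = 0.0  # 0 = north
    fov_deg: float = 60.0

    def tilt_up(self, degrees: float) -> None:
        """Device tilted up: rotate the camera view upward (clamped)."""
        self.pitch_deg = min(85.0, self.pitch_deg + degrees)

    def move_forward(self, meters: float) -> None:
        """Device tilted forward: advance the camera along its heading."""
        self.x += meters * math.sin(math.radians(self.heading_deg))
        self.y += meters * math.cos(math.radians(self.heading_deg))
```

Gesture handlers (pan, rotate, zoom, perspective change) would each mutate one or more of these fields, and the renderer would redraw the map from the updated camera each frame.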
A client device may receive map image data and route data from the map service. In some embodiments, a client device implements a turn-by-turn navigation system, which provides real-time route and direction information based upon location information and route information received from a map service and/or another location system, such as a Global Positioning Satellite (GPS) system. A client device may display map image data that reflects the current location of the client device and update the map image data in real time. The navigation system may provide auditory or visual directions to follow a certain route. A virtual camera is implemented to manipulate navigation map data, according to some embodiments. Some embodiments of client devices allow the device to adjust the virtual camera display orientation to bias toward the route destination. Some embodiments also allow the virtual camera to navigate turns by simulating the inertial motion of the virtual camera. Client devices implement various techniques to utilize map service data from the map service. Some embodiments implement techniques to optimize rendering of two-dimensional and three-dimensional map image data. In some embodiments, a client device stores rendering information locally. For instance, a client stores a style sheet, which provides rendering directions for image data containing style identifiers. In another example, common image textures may be stored to decrease the amount of map image data transferred from the map service. Client devices in various embodiments implement various modeling techniques to render two-dimensional and three-dimensional map image data, examples of which include, but are not limited to, generating three-dimensional buildings out of two-dimensional building footprint data.
Other such techniques include: modeling two-dimensional and three-dimensional map objects to determine the client device communication environment; generating models to determine whether map labels are visible from a certain virtual camera position; and generating models to smooth transitions between map image data. In some embodiments, client devices also order or prioritize map service data using certain techniques. For instance, a client device detects the motion or velocity of the virtual camera and, if it exceeds certain threshold values, loads and renders less-detailed image data for certain areas. Other examples include: rendering vector-based curves as a series of points; preloading map image data for areas of poor communication with the map service; adapting textures based on display zoom level; or rendering map image data according to complexity. In some embodiments, client devices communicate utilizing various data formats separate from map tiles. For instance, some client devices implement Assisted Global Positioning Satellites (A-GPS) and communicate with location services that utilize data formats conforming to location service protocols, such as, but not limited to, the Radio Resource Location services Protocol (RRLP), TIA 801 for Code Division Multiple Access (CDMA), the Radio Resource Control (RRC) position protocol, or the LTE Positioning Protocol (LPP). Client devices may also receive GPS signals directly. Embodiments may also send data (whether or not solicited by a map service) identifying the client device's capabilities or attributes (e.g., hardware specifications or operating system version) or communication capabilities (e.g., device communication bandwidth as determined by wireless signal strength or wired or wireless network type). Figure 65 illustrates one possible embodiment of an operating environment 6500 for the map service 6530 and the client devices 6502a-6502c.
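The velocity-based prioritization mentioned above — loading less-detailed imagery while the virtual camera moves quickly — can be sketched as a simple threshold rule. The threshold value and the choice of dropping exactly one zoom level are illustrative assumptions; the text says only that detail is reduced when certain thresholds are exceeded.

```python
# Sketch of the camera-velocity detail heuristic described above. The
# threshold and the one-level drop are assumptions for illustration.

FAST_CAMERA_THRESHOLD = 500.0  # camera speed in meters/second (assumed)

def choose_detail_level(camera_speed: float, base_zoom: int) -> int:
    """Drop one zoom level (roughly half the linear resolution) while the
    camera moves quickly; restore full detail once it slows down."""
    if camera_speed > FAST_CAMERA_THRESHOLD:
        return max(0, base_zoom - 1)
    return base_zoom
```

Because a fast-moving camera blurs past on-screen content anyway, the coarser tiles are rarely noticeable, while the request and rendering cost drops substantially.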
In some embodiments, the devices 6502a, 6502b, and 6502c communicate over one or more wired or wireless networks 6510. For example, a wireless network 6510, such as a cellular network, can communicate with a wide area network (WAN) 6520, such as the Internet, by use of a gateway 6514. In some embodiments, the gateway 6514 provides a packet-oriented mobile data service, such as General Packet Radio Service (GPRS), or another mobile data service allowing wireless networks to transmit data to other networks, such as the wide area network 6520. Likewise, an access device 6512 (e.g., an IEEE 802.11g wireless access device) provides communication access to the WAN 6520. The devices 6502a and 6502b can be any portable electronic or computing device capable of communicating with a map service. The device 6502c can be any non-portable electronic or computing device capable of communicating with a map service. In some embodiments, both voice and data communications are established over the wireless network 6510 and the access device 6512. For instance, the device 6502a can place and receive phone calls (e.g., using Voice over Internet Protocol (VoIP) protocols), send and receive e-mail messages (e.g., using Simple Mail Transfer Protocol (SMTP) or Post Office Protocol 3 (POP3)), and retrieve electronic documents and/or streams, such as web pages, photographs, and videos, over the wireless network 6510, the gateway 6514, and the WAN 6520 (e.g., using Transmission Control Protocol/Internet Protocol (TCP/IP) or User Datagram Protocol (UDP)). Likewise, in some implementations, the devices 6502b and 6502c can place and receive phone calls, send and receive e-mail messages, and retrieve electronic documents over the access device 6512 and the WAN 6520. In various embodiments, any of the illustrated client devices may communicate with the map service 6530 and/or other service(s) 6550 using a persistent connection established in accordance with one or more security protocols, such as the Secure Sockets Layer (SSL) protocol or the Transport Layer Security (TLS) protocol.

The devices 6502a and 6502b can also establish communications by other means. For example, the wireless device 6502a can communicate with other wireless devices (e.g., other devices 6502b, cell phones, etc.) over the wireless network 6510. Likewise, the devices 6502a and 6502b can establish peer-to-peer communications 6540 (e.g., a personal area network) by use of one or more communication subsystems, such as Bluetooth® communication from Bluetooth Special Interest Group, Inc. of Kirkland, Washington. The device 6502c can also establish peer-to-peer communications with the devices 6502a or 6502b (not shown). Other communication protocols and topologies can also be implemented. The devices 6502a and 6502b may also receive Global Positioning Satellite (GPS) signals from GPS satellites 6560. The devices 6502a, 6502b, and 6502c can communicate with the map service 6530 over one or more wired and/or wireless networks, 6510 or 6512. For instance, the map service 6530 can provide map service data to the rendering devices 6502a, 6502b, and 6502c. The map service 6530 may also communicate with other services 6550 to obtain data to implement map services. The map service 6530 and other services 6550 may also receive GPS signals from the GPS satellites 6560. In various embodiments, the map service 6530 and/or other service(s) 6550 are configured to process search requests from any of the client devices. Search requests may include, but are not limited to, queries for businesses, addresses, residential locations, points of interest, or some combination thereof.
The map service 6530 and/or other service(s) 6550 may be configured to return results related to a variety of parameters, including, but not limited to, input typed into an address bar or other text entry field (including abbreviations and/or other shorthand notation), the current map view (e.g., the user may be viewing one location on the multifunction device while residing in another location), the user's current location (e.g., in cases where the current map view does not include the search results), and the current route (if any). In various embodiments, these parameters may affect the composition of the search results (and/or the ordering of the search results) based on different priority weightings. In various embodiments, the search results that are returned may be a subset of results selected based on specific criteria, including, but not limited to, a quantity of times the search result (e.g., a particular point of interest) has been requested, a measure of quality associated with the search result (e.g., highest user or editorial review rating), and/or the volume of reviews for the search result (e.g., the number of times the search result has been reviewed or rated). In various embodiments, the map service 6530 and/or other service(s) 6550 are configured to provide auto-complete search results that are displayed on the client device, such as within the mapping application. For instance, auto-complete search results may populate a portion of the screen as the user enters one or more search keywords on the multifunction device. In some cases, this feature may save the user time, as the desired search result may be displayed before the user enters the full search query. In various embodiments, the auto-complete search results may be search results found by the client on the client device (e.g., bookmarks or contacts), search results found elsewhere (e.g., from the Internet) by the map service 6530 and/or other service(s) 6550, and/or some combination thereof.
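The ranking criteria above (request count, review quality, review volume, each with its own priority weighting) can be sketched as a weighted scoring function. The weights, the normalization constants, and the field names are all assumptions made for this example; the text says only that such parameters may carry different priority weightings.

```python
# Illustrative weighted ranking over the result criteria mentioned above.
# Weights and normalization denominators are assumed, not specified.

WEIGHTS = {"times_requested": 0.5, "rating": 0.3, "review_count": 0.2}

def score(result: dict) -> float:
    """Combine normalized criteria into a single ranking score in [0, 1]."""
    return (WEIGHTS["times_requested"] * result["times_requested"] / 1000.0
            + WEIGHTS["rating"] * result["rating"] / 5.0
            + WEIGHTS["review_count"] * result["review_count"] / 500.0)

def rank(results: list) -> list:
    """Return results ordered best-first by weighted score."""
    return sorted(results, key=score, reverse=True)
```

Adjusting the weights would implement the text's point that different parameters can dominate the composition and ordering of returned results in different contexts.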
As with commands, any of the search queries may be entered by the user via voice or through typing. The multifunction device may be configured to display the search results graphically within any of the map displays described herein. For instance, a pin or other graphical indicator may specify the location of a search result as a point of interest. In various embodiments, in response to the user selecting one of the points of interest (e.g., a touch selection, such as a tap), the multifunction device is configured to display additional information about the selected point of interest, including, but not limited to, ratings, reviews or review snippets, hours of operation, store status (e.g., open for business, permanently closed, etc.), and/or images of a storefront for the point of interest. In various embodiments, any of this information may be displayed on a graphical information card that is displayed in response to the user's selection of the point of interest. In various embodiments, map service 6530 and/or other services 6550 provide one or more feedback mechanisms for receiving feedback from client devices 6502a through 6502c. For instance, client devices may provide feedback on search results to map service 6530 and/or other services 6550 (e.g., feedback specifying ratings, reviews, temporary or permanent business closures, errors, etc.); this feedback may be used to update information about points of interest in order to provide more accurate or more up-to-date search results in the future. In some embodiments, map service 6530 and/or other services 6550 may provide testing information to a client device (e.g., an A/B test) to determine which search results are best. For instance, at random intervals, the client device may receive and present two search results to the user and allow the user to indicate the best result.
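The feedback mechanism described above can be sketched as a point-of-interest record that folds client feedback (ratings, reviews, closure reports) into the stored data so that future searches return fresher information. All class, method, and field names here are illustrative assumptions, not part of the patent.

```python
class PointOfInterest:
    """Holds the details shown when a user selects a pin: ratings,
    reviews, and an open/closed status, plus feedback received from
    client devices."""

    def __init__(self, name):
        self.name = name
        self.ratings = []
        self.reviews = []
        self.permanently_closed = False

    def apply_feedback(self, rating=None, review=None, closed=None):
        """Fold one piece of client feedback into the stored record so
        that future search results are more accurate and up to date."""
        if rating is not None:
            self.ratings.append(rating)
        if review is not None:
            self.reviews.append(review)
        if closed is not None:
            self.permanently_closed = closed

    @property
    def average_rating(self):
        """Aggregate rating derived from all feedback received so far."""
        return sum(self.ratings) / len(self.ratings) if self.ratings else None
```

A closure report from a single client flips the stored status, which would then be reflected in the store status shown on the information card for later users.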
The client device may report the best results to map service 6530 and/or other services 6550 to improve future search results based on the chosen testing technique, such as an A/B testing technique in which a baseline control sample is compared to a variety of single-variable test samples in order to improve results. While the invention has been described with respect to specific embodiments, it will be understood that the invention may be embodied in other specific forms without departing from the spirit of the invention. For example, many of the figures illustrate various touch gestures (e.g., taps, double taps, swipe gestures, press-and-hold gestures, etc.). However, many of the illustrated operations could be performed via different touch gestures (e.g., a swipe instead of a tap, etc.) or by non-touch input (e.g., using a cursor controller, a keyboard, a touchpad/trackpad, a near-touch sensitive screen, etc.). In addition, a number of the figures conceptually illustrate processes. The specific operations of these processes may not be performed in the exact order shown and described. The specific operations may not be performed in one continuous series of operations, and different specific operations may be performed in different embodiments. Furthermore, a process could be implemented using several sub-processes, or as part of a larger macro process.
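The A/B flow described above can be sketched as follows, with `choose` standing in for the user's judgment between the baseline result set and the single-variable variant. The callback and tally structure are hypothetical illustrations; the patent describes the behavior, not an API.

```python
import random

def run_ab_test(result_a, result_b, choose, trials, seed=0):
    """Simulate the A/B flow described above: at random, present the
    baseline result set A or the variant B first, ask the user (here,
    the `choose` callback) which result is best, and tally the answers
    so the preferred variant can be reported back to the service."""
    rng = random.Random(seed)  # seeded for reproducibility in this sketch
    tally = {"A": 0, "B": 0}
    for _ in range(trials):
        shown_first = rng.choice(["A", "B"])  # randomize presentation order
        preferred = choose(result_a, result_b, shown_first)
        tally[preferred] += 1
    return tally
```

Aggregated over many client devices, such tallies let the service compare the control sample against each single-variable test sample, as in the A/B technique described above.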

100‧‧‧ devices
105‧‧‧First stage
110‧‧‧ second stage
115‧‧‧ third stage
120‧‧‧User Interface (UI)
125‧‧‧Parking area
130‧‧‧Map drawing application
140‧‧‧top column
145‧‧‧Location Control Item/List View Control
150‧‧‧3D control
155‧‧‧Page Curl Control
160‧‧‧Guidelines
165‧‧‧Search field
170‧‧‧Bookmark Controls
205‧‧‧ first stage
210‧‧‧ second stage
215‧‧‧ third stage
220‧‧‧ fourth stage
225‧‧‧ fifth stage
230‧‧‧ sixth stage
235‧‧‧List view control
240‧‧‧ route generation control
245‧‧‧Start control
250‧‧‧End control
255‧‧‧Clear control
260‧‧‧ route
261‧‧‧ route
300‧‧‧ compass
305‧‧‧ first stage
310‧‧‧ second stage
315‧‧‧ third stage
320‧‧‧ fourth stage
325‧‧‧ fifth stage
326‧‧‧ current position indicator
345‧‧‧simulated light projection
405‧‧‧ first stage
410‧‧‧ second stage
415‧‧‧ third stage
420‧‧‧ fourth stage
425‧‧‧User's current location
505‧‧‧Virtual Camera
510‧‧‧ first stage
515‧‧‧ second stage
520‧‧‧ third stage
525‧‧‧3D map view
530‧‧‧3D map view
535‧‧‧3D map scene
540‧‧‧2D map view
545‧‧‧2D map
550‧‧‧ track
555‧‧‧Track
600‧‧‧Virtual Camera
605‧‧‧ first stage
610‧‧‧ second stage
615‧‧‧ third stage
625‧‧‧3D map view
630‧‧‧3D map view
635‧‧‧3D map
640‧‧‧3D map view
705‧‧‧ first stage
710‧‧‧ second stage
715‧‧‧ third stage
720‧‧‧ fourth stage
725‧‧‧Current location of the device
740‧‧‧2D view/2D map
745‧‧‧3D map
805‧‧‧ first stage
810‧‧‧ second stage
815‧‧‧ third stage
905‧‧‧ first stage
910‧‧‧ second stage
915‧‧‧ third stage
920‧‧‧ fourth stage
925‧‧‧ fifth stage
930‧‧‧6th stage
1005‧‧‧ first stage
1010‧‧‧ second stage
1015‧‧‧ third stage
1020‧‧‧ fourth stage
1025‧‧‧2D map view
1030‧‧‧Rotated map view
1105‧‧‧First stage
1110‧‧‧ second stage
1115‧‧‧ third stage
1120‧‧‧ fourth stage
1125‧‧‧Map view
1130‧‧‧Rotated map view
1205‧‧‧First stage
1210‧‧‧ second stage
1215‧‧‧ third stage
1220‧‧‧2D map view
1225‧‧‧2D map view
1230‧‧‧2D map view
1300‧‧‧Program
1405‧‧‧First stage
1410‧‧‧ second stage
1415‧‧‧ third stage
1420‧‧‧ fourth stage
1505‧‧‧First stage
1510‧‧‧ second stage
1515‧‧‧ third stage
1520‧‧‧ fourth stage
1605‧‧‧ first stage
1610‧‧‧ second stage
1615‧‧‧ third stage
1620‧‧‧ fourth stage
1701‧‧‧First stage
1702‧‧‧ second stage
1703‧‧‧ third stage
1710‧‧‧3D map
1712‧‧‧Virtual Camera
1714‧‧‧3D map view
1724‧‧‧3D map view
1734‧‧‧3D map view
1750‧‧‧Line
1755‧‧‧Line
1800‧‧‧Virtual Camera
1805‧‧‧First stage
1810‧‧‧ second stage
1815‧‧‧ third stage
1825‧‧‧3D map view
1830‧‧‧3D map view
1835‧‧‧3D map
1840‧‧‧3D map
1850‧‧‧ arc
1855‧‧‧Location
1860‧‧‧ position
1865‧‧‧ position
1900‧‧‧Processing (or map display) pipeline
1905‧‧‧ base map extractor
1910‧‧‧Grid build processor
1915‧‧‧Grid Builder
1920‧‧‧ basemap provider
1925‧‧‧Map Display Engine
1930‧‧‧Virtual camera
1975‧‧‧ Controller
2005‧‧‧First Stage
2010‧‧‧Second stage
2015‧‧‧Third stage
2020‧‧‧ fourth stage
2040‧‧‧Search Form
2045‧‧‧ icon
2050‧‧‧ Bookmark icon
2055‧‧‧Cancel control
2060‧‧‧Start control
2105‧‧‧ first stage
2110‧‧‧ second stage
2115‧‧‧ third stage
2120‧‧‧ fourth stage
2155‧‧‧Search Form
2190‧‧‧Pushpin
2205‧‧‧ first stage
2210‧‧‧ second stage
2215‧‧‧ third stage
2220‧‧‧ fourth stage
2250‧‧‧Screen keypad
2255‧‧‧Search Form
2290‧‧‧ banner
2295‧‧‧ route extraction control
2305‧‧‧First stage
2310‧‧‧ second stage
2315‧‧‧ third stage
2390‧‧‧Pushpin
2395‧‧‧Pushpin
2405‧‧‧First stage
2410‧‧‧ second stage
2415‧‧‧ third stage
2420‧‧‧ fourth stage
2455‧‧‧Search Form
2505‧‧‧ first stage
2510‧‧‧ second stage
2515‧‧‧ third stage
2520‧‧‧ fourth stage
2555‧‧‧Search Form
2605‧‧‧First stage
2610‧‧‧ second stage
2615‧‧‧ third stage
2620‧‧‧ fourth stage
2630‧‧‧ Bookmark icon
2700‧‧‧Program
2805‧‧‧First stage
2806‧‧‧top column
2810‧‧‧ second stage
2815‧‧‧ third stage
2820‧‧‧ fourth stage
2830‧‧‧Search field
2835‧‧‧List view control
2840‧‧‧Pushpin
2845‧‧‧Pushpin
2846‧‧‧Information Banner
2850‧‧‧ points of interest
2855‧‧‧Pushpin
2890‧‧‧3D illustration
2900‧‧‧Map drawing application
2905‧‧‧Device
2910‧‧‧Search Completion Manager
2915‧‧‧Local Source Manager
2920‧‧‧Remote Source Manager
2925‧‧‧List Manager
2930‧‧‧Search Query Profiler
2935‧‧‧New search completed repository
2940‧‧‧New Route Guidance Repository
2950‧‧‧Contacts Repository
2955‧‧‧Bookmark repository
2960‧‧‧Search and Map Server
2965‧‧‧Search completed repository
2970‧‧‧ Route Guidance Repository
3000‧‧‧Map
3005‧‧‧ first stage
3010‧‧‧ second stage
3015‧‧‧ third stage
3025‧‧‧Pushpin
3030‧‧‧"X" button
3105‧‧‧ first stage
3110‧‧‧ second stage
3115‧‧‧ third stage
3120‧‧‧ fourth stage
3140‧‧‧Search Form
3200‧‧‧ Procedure
3300‧‧‧Program
3400‧‧‧Program
3505‧‧‧ first stage
3510‧‧‧ second stage
3515‧‧‧ third stage
3520‧‧‧ fourth stage
3555‧‧‧Search Form
3605‧‧‧ first stage
3610‧‧‧ second stage
3615‧‧‧ third stage
3620‧‧‧ fourth stage
3630‧‧‧Map
3705‧‧‧ first stage
3710‧‧‧Second stage
3715‧‧‧ third stage
3720‧‧‧ fourth stage
3755‧‧‧Search Form
3760‧‧‧Search query
3800‧‧‧Graphical User Interface (GUI)
3805‧‧‧First stage
3810‧‧‧Second stage
3815‧‧‧ third stage
3820‧‧‧ fourth stage
3825‧‧‧The fifth stage
3830‧‧‧6th stage
3835‧‧‧Media display area
3840‧‧‧ index label
3845‧‧‧Information display area
3850‧‧‧top column
3860‧‧·Pushpin
3865‧‧‧ banner
3875‧‧‧ arrow
3895‧‧‧"Previous" button
3905‧‧‧First stage
3910‧‧‧ second stage
3915‧‧‧ third stage
3920‧‧‧ fourth stage
3925‧‧‧ fifth stage
3940‧‧‧Search Form
4005‧‧‧First stage
4010‧‧‧ second stage
4015‧‧‧ third stage
4020‧‧‧ fourth stage
4025‧‧‧ fifth stage
4030‧‧‧Sixth stage
4035‧‧‧ seventh stage
4040‧‧‧8th stage
4060‧‧‧Pushpin
4065‧‧‧ banner
4070‧‧‧"List" button
4075‧‧‧ Inputs
4105‧‧‧ first stage
4110‧‧‧ second stage
4115‧‧‧ third stage
4120‧‧‧ fourth stage
4125‧‧‧ fifth stage
4130‧‧‧6th stage
4200‧‧‧ Procedure
4305‧‧‧First stage
4310‧‧‧Second stage
4315‧‧‧ third stage
4320‧‧‧Fourth stage
4360‧‧‧User Interface (UI) Project
4365‧‧‧User Interface (UI) Project
4370‧‧‧User Interface (UI) Project
4375‧‧‧User Interface (UI) Project
4380‧‧‧User Interface (UI) Project
4405‧‧‧First stage
4410‧‧‧ second stage
4415‧‧‧ third stage
4420‧‧‧ fourth stage
4450‧‧‧ index label
4455‧‧‧Comment
4460‧‧‧top column
4505‧‧‧First Stage
4510‧‧‧ second stage
4515‧‧‧ third stage
4520‧‧‧ fourth stage
4545‧‧‧ Expanded information display area
4550‧‧‧Comment Index Label
4555‧‧‧Information Index Label
4560‧‧‧User Interface (UI) Project
4565‧‧‧User Interface (UI) Project
4575‧‧‧User Interface (UI) Project
4601‧‧‧Users
4602‧‧‧Users
4603‧‧‧ User's finger
4604‧‧‧ User's finger
4605‧‧‧First stage
4610‧‧‧ second stage
4615‧‧‧ third stage
4705‧‧‧First Stage
4710‧‧‧ second stage
4715‧‧‧ third stage
4750‧‧‧ mark
4805‧‧‧ first stage
4810‧‧‧ second stage
4815‧‧‧ third stage
4820‧‧‧Device
4825‧‧‧large screen display area
4830‧‧‧Pushpin
4835‧‧‧ banner
4905‧‧‧First stage
4910‧‧‧ second stage
4915‧‧‧ third stage
4920‧‧‧ fourth stage
4930‧‧‧ instruction mark
4950‧‧‧Start control
5005‧‧‧ first stage
5010‧‧‧ second stage
5015‧‧‧ third stage
5020‧‧‧ fourth stage
5035‧‧‧ Directional icon
5055‧‧‧Search Form
5105‧‧‧First stage
5110‧‧‧ second stage
5115‧‧‧ third stage
5120‧‧‧Tablet device
5205‧‧‧First stage
5210‧‧‧ second stage
5215‧‧‧ third stage
5220‧‧‧ fourth stage
5225‧‧‧First scrollable sign
5235‧‧‧New sign
5250‧‧‧New section
5290‧‧‧Current junction indicator
5305‧‧‧First stage
5310‧‧‧Second stage
5315‧‧‧ third stage
5400‧‧‧Device
5405‧‧‧First stage
5410‧‧‧ second stage
5415‧‧‧ sign
5420‧‧‧ mark
5425‧‧‧ mark
5430‧‧‧ mark
5435‧‧‧top column
5505‧‧‧First stage
5510‧‧‧ second stage
5515‧‧‧ third stage
5520‧‧‧Last instruction mark
5525‧‧‧Overview control
5530‧‧‧top column
5540‧‧‧Continue control
5570‧‧‧End control
5600‧‧‧Program
5700‧‧‧ devices
5705‧‧‧First Stage
5710‧‧‧ second stage
5715‧‧‧ third stage
5717‧‧‧ fourth stage
5719‧‧‧The fifth stage
5720‧‧‧User interface
5721‧‧‧6th stage
5725‧‧‧Parking area
5730‧‧‧Map drawing application
5740‧‧‧top column
5750‧‧‧ Floating 3D Controls
5760‧‧‧Guidelines
5765‧‧‧Search field
5780‧‧‧Directions entry page
5782‧‧‧Table
5784‧‧‧ mark
5805‧‧‧First Stage
5810‧‧‧ second stage
5815‧‧‧ third stage
5901‧‧‧First stage
5902‧‧‧ second stage
5903‧‧‧ third stage
5910‧‧‧3D navigation map scene
5912‧‧‧Virtual Camera
5916‧‧‧Device orientation indicator
5918‧‧‧3D map view
5928‧‧‧Three-dimensional navigation map view
5938‧‧‧2D map view
5950‧‧‧Track
5955‧‧‧Track
5960‧‧‧3D button
6005‧‧‧ first stage
6010‧‧‧ second stage
6015‧‧‧ third stage
6030‧‧‧Car icon
6040‧‧‧ banner
6050‧‧‧ banner
6060‧‧‧ instruction mark
6075‧‧‧ Overview Controls
6100‧‧‧Device
6105‧‧‧First stage
6110‧‧‧ second stage
6115‧‧‧ third stage
6120‧‧‧ fourth stage
6125‧‧‧ fifth stage
6130‧‧‧6th stage
6135‧‧‧Route
6140‧‧‧ "Start" user interface control
6200‧‧‧ State diagram
6205‧‧‧Map browsing status
6210‧‧‧Gesture input recognition state
6215‧‧‧ State (rotation state)
6220‧‧‧ Status (mobile browsing status)
6225‧‧‧ Status
6230‧‧‧Inertial calculation status
6235‧‧‧Perspective change state
6240‧‧‧Inertial calculation status
6245‧‧‧Rebound calculation status
6250‧‧‧The status of the banner used to display the selected orientation
6255‧‧‧Used to display the status of the prepared area of the selected position
6260‧‧‧Search typing suggestion state
6265‧‧‧Used to display the status of search results
6270‧‧‧ route entry status
6275‧‧‧Route display status
6280‧‧‧Navigation Status/Navigation Mode
6285‧‧‧Step mode status
6290‧‧‧Automatic stepping state
6300‧‧‧Mobile computing device architecture/Mobile computing device
6305‧‧‧Processing unit
6310‧‧‧ memory interface
6315‧‧‧ peripheral device interface
6320‧‧‧ camera subsystem
6325‧‧‧Wireless communication subsystem
6330‧‧‧ Audio subsystem
6335‧‧‧I/O subsystem
6340‧‧‧Optical sensor
6345‧‧‧Orientation sensor
6350‧‧‧Acceleration sensor
6355‧‧‧Touch Screen Controller
6360‧‧‧Other input controllers
6365‧‧‧ touch screen
6370‧‧‧ memory
6372‧‧‧Operating System (OS)
6374‧‧‧Communication Directive
6376‧‧‧Graphical user interface instructions
6378‧‧‧Image Processing Instructions
6380‧‧‧Input processing instructions
6382‧‧‧Audio processing instructions
6384‧‧‧ camera instructions
6400‧‧‧Electronic system
6405‧‧‧Bus
6410‧‧‧Processing unit
6415‧‧‧Graphical Processing Unit (GPU)
6420‧‧‧System Memory
6425‧‧‧Network
6430‧‧‧Read-only memory (ROM)
6435‧‧‧Permanent storage device
6440‧‧‧Input device
6445‧‧‧ Output device
6500‧‧‧Operating environment
6502a‧‧‧Client device
6502b‧‧‧Client device
6502c‧‧‧Client device
6510‧‧‧Wired or wireless network
6512‧‧‧Access device
6514‧‧‧ gateway
6520‧‧‧ Wide Area Network (WAN)
6530‧‧‧Map service
6550‧‧‧Other services
6560‧‧‧GPS satellite

The novel features of the invention are set forth in the appended claims. However, for purposes of explanation, several embodiments of the invention are set forth in the following figures. Figure 1 illustrates an example of a device that executes an integrated mapping application of some embodiments. Figure 2 illustrates an example of an integrated application adaptively modifying a floating control cluster. Figure 3 illustrates another example of an integrated application adaptively modifying a floating control cluster. Figure 4 illustrates how the mapping application of some embodiments provides a 3D control as a quick mechanism for entering a 3D mode for viewing a map location in three dimensions. Figure 5 presents a simplified example to illustrate the concept of a virtual camera. Figure 6 conceptually illustrates a perspective adjustment feature provided by the mapping application of some embodiments. Figure 7 illustrates one example of a two-finger gesture for pushing a 2D map down into a 3D map. Figure 8 illustrates the transition from a 2D map view to a 3D map view. Figure 9 illustrates how the mapping application of some embodiments changes the appearance of the 3D control to indicate different 2D and 3D states of the map view.
Figure 10 illustrates an example of rotating a 2D map and using a compass to straighten the rotated map. Figure 11 illustrates another example of rotating a map in some embodiments of the invention. Figure 12 illustrates a rotation operation together with an inertia effect. Figure 13 illustrates another rotation operation together with an inertia effect. Figure 14 illustrates how the mapping application of some embodiments uses novel techniques to adjust, or leave unadjusted, the text and/or symbols appearing in the map view as the map view rotates. Figure 15 illustrates an example of oriented text and/or symbols. Figure 16 illustrates an example of a user transitioning from a 3D map view to a 2D map view through a two-finger gesture operation. Figure 17 illustrates a zoom adjustment feature provided by the mapping application of some embodiments. Figure 18 conceptually illustrates a feature provided by the mapping application of some embodiments for maintaining the position of a virtual camera within a defined range along an arc. Figure 19 conceptually illustrates a processing, or map rendering, pipeline performed by the mapping application of some embodiments in order to render a map for display at a client device. Figure 20 illustrates the interaction between a user and an application executing on the user's device in order to display a search table with a list of the user's recent searches and recent route directions. Figure 21 illustrates an example of entering a search query in the search field and executing a search. Figure 22 illustrates an example of the instantaneous display of a search table, with a list of recommended search completions, upon initiating a search query. Figure 23 illustrates another example of the instantaneous display of a search table, with a list of recommended search completions, upon initiating a search query. Figure 24 illustrates an example of entering a partial address and obtaining directions to a home address sourced from a contact card.
Figure 25 illustrates an example of a user entering a partial search term and obtaining directions to a work address sourced from a contact card. Figure 26 illustrates an example of entering a partial search query and selecting a bookmark from the list of search completions in the search table. Figure 27 conceptually illustrates a process that some embodiments perform to determine the order in which suggested search completions from different sources are displayed in the search table. Figure 28 illustrates an example of the mapping application executing on a device whose display area is relatively large compared to the display area of a smaller device. Figure 29 illustrates an example architecture of a mapping application that provides a list of suggested search completions based on a search query. Figure 30 illustrates an example of clearing search results from a map. Figure 31 illustrates an example of the interaction between a user and an application executing on the user's device in order to clear selected search results displayed on a map. Figure 32 illustrates a process that some embodiments perform to determine the particular type of transition to display between a user's current map view and a target map view containing the search results of the user's executed search query. Figure 33 illustrates a process that some embodiments perform to determine whether to modify a target region when the target region at least partially overlaps the original region initially defined by the mapping application. Figure 34 illustrates a process that some embodiments perform to determine whether to display an animation from the original region to the target region when (1) the target region at least partially overlaps the original region initially defined by the mapping application and (2) modification of the target region is under consideration.
Figure 35 illustrates a situation in which the application displays a transition to a target map region containing the corresponding search results without providing any animation between the current map view and the target map view. Figure 36 illustrates a situation in which the application detects the search results within the original current map view and therefore does not have to zoom or animate to any new target map region. Figure 37 illustrates a situation in which the search results in the target map view are not within the original region, so that the application expands the target region and displays an animation between the original region and the target region. Figure 38 conceptually illustrates a UI page for displaying detailed information about a selected location. Figure 39 illustrates an example of the mapping application using a 3D rendering operation to display a particular search result. Figure 40 conceptually illustrates an animation for displaying a satellite image of a location. Figure 41 illustrates displaying images of a location. Figure 42 conceptually illustrates a process that some embodiments perform to display different types of images when launching a "details screen" for displaying detailed information about a location. Figure 43 illustrates scrolling tabs off the screen. Figure 44 illustrates launching a third-party application when the user selects an entry displayed in the information display area. Figure 45 illustrates launching a third-party application when the user selects a UI item for the third-party application. Figure 46 illustrates two instances of the mapping application, executing on different devices of two different users, displaying two different sets of information for the same location. Figure 47 illustrates displaying a tag on a tab in the details screen for a selected location.
Figure 48 illustrates the interaction between a user and the mapping application of some embodiments to display a "details screen" for a particular location. Figure 49 illustrates an example of the interaction between a user and the mapping application to obtain routing directions. Figure 50 illustrates another example of the interaction between a user and the mapping application to obtain routing directions. Figure 51 conceptually illustrates an example of displaying routing directions in a relatively large display area of a device. Figure 52 illustrates scrolling through a set of scrollable instruction signs for a particular route selected by the user. Figure 53 illustrates scrolling through different scrollable signs. Figure 54 illustrates an example of displaying a set of scrollable instruction signs for a particular route selected by the user. Figure 55 illustrates an example of the interaction between a user and the application to switch to an overview mode while reviewing a selected route. Figure 56 conceptually illustrates a process that some embodiments perform to allow a user to browse signs for a set of instructions for the junctions in a route between a starting location and an ending location. Figure 57 illustrates an example of a device that executes the mapping application of some embodiments. Figure 58 illustrates how the navigation application of some embodiments provides the 3D control as a quick mechanism for entering a 3D navigation mode. Figure 59 illustrates the concept of a virtual camera. Figure 60 illustrates an example of the interaction between a user and the mapping application to obtain routing directions. Figure 61 illustrates a device displaying the mapping application as the application transitions from a non-immersive map view for map browsing to an immersive map view for navigation.
Figures 62A-62B conceptually illustrate a state diagram that describes different states of the integrated mapping, search, and navigation application of some embodiments, and the transitions between those states. Figure 63 is an example of the architecture of a mobile computing device. Figure 64 conceptually illustrates an example of an electronic system with which some embodiments of the invention are implemented. Figure 65 illustrates a map service operating environment according to some embodiments.

150‧‧‧3D control

405‧‧‧First stage

410‧‧‧Second stage

415‧‧‧Third stage

420‧‧‧Fourth stage

425‧‧‧User's current location

Claims (1)

A method of presenting a map on a device that includes a touch-sensitive screen and a touch input interface, the method comprising: displaying a 3D presentation of the map from a particular perspective view of a three-dimensional (3D) map scene; receiving a touch input from the touch input interface; in response to the received touch input, changing the perspective view of the 3D map scene to a maximum perspective view; and when the received touch input attempts to push the perspective view past the maximum perspective view, providing a bounce animation that displays the 3D map presentation with a brief bounce to indicate that the maximum perspective view has been reached.
TW106113163A 2012-06-05 2013-06-04 Method, machine-readable medium and electronic device for presenting a map TWI625706B (en)

Applications Claiming Priority (30)

Application Number Priority Date Filing Date Title
US201261655997P 2012-06-05 2012-06-05
US201261655995P 2012-06-05 2012-06-05
US61/655,997 2012-06-05
US61/655,995 2012-06-05
US201261656043P 2012-06-06 2012-06-06
US201261656032P 2012-06-06 2012-06-06
US201261656015P 2012-06-06 2012-06-06
US201261656080P 2012-06-06 2012-06-06
US61/656,043 2012-06-06
US61/656,015 2012-06-06
US61/656,032 2012-06-06
US61/656,080 2012-06-06
US201261657880P 2012-06-10 2012-06-10
US201261657858P 2012-06-10 2012-06-10
US201261657864P 2012-06-10 2012-06-10
US61/657,880 2012-06-10
US61/657,864 2012-06-10
US61/657,858 2012-06-10
US201261699842P 2012-09-11 2012-09-11
US201261699841P 2012-09-11 2012-09-11
US201261699851P 2012-09-11 2012-09-11
US201261699855P 2012-09-11 2012-09-11
US61/699,842 2012-09-11
US61/699,841 2012-09-11
US61/699,851 2012-09-11
US61/699,855 2012-09-11
US13/632,102 2012-09-30
US13/632,124 2012-09-30
US13/632,102 US9886794B2 (en) 2012-06-05 2012-09-30 Problem reporting in maps
US13/632,124 US9367959B2 (en) 2012-06-05 2012-09-30 Mapping application with 3D presentation

Publications (2)

Publication Number Publication Date
TW201727597A true TW201727597A (en) 2017-08-01
TWI625706B TWI625706B (en) 2018-06-01

Family

ID=50550547

Family Applications (3)

Application Number Title Priority Date Filing Date
TW106113163A TWI625706B (en) 2012-06-05 2013-06-04 Method, machine-readable medium and electronic device for presenting a map
TW105117425A TWI592913B (en) 2012-06-05 2013-06-04 Method, machine-readable medium and electronic device for presenting a map
TW102119821A TWI550568B (en) 2012-06-05 2013-06-04 Mapping application with 3d presentation

Family Applications After (2)

Application Number Title Priority Date Filing Date
TW105117425A TWI592913B (en) 2012-06-05 2013-06-04 Method, machine-readable medium and electronic device for presenting a map
TW102119821A TWI550568B (en) 2012-06-05 2013-06-04 Mapping application with 3d presentation

Country Status (1)

Country Link
TW (3) TWI625706B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI701535B (en) * 2018-08-29 2020-08-11 和碩聯合科技股份有限公司 Method and system for planning trajectory
TWI726539B (en) * 2019-12-16 2021-05-01 英業達股份有限公司 Processing method of range selector
CN112967360A (en) * 2021-04-23 2021-06-15 自然资源部国土卫星遥感应用中心 Synthetic aperture radar image Voronoi polygon mosaic method considering multiple dimensions
TWI779592B (en) * 2021-05-05 2022-10-01 萬潤科技股份有限公司 Map editing method and device
TWI781538B (en) * 2020-04-03 2022-10-21 大陸商騰訊科技(深圳)有限公司 Navigation method, device, computer equipment and storage medium
CN112967360B (en) * 2021-04-23 2024-05-03 自然资源部国土卫星遥感应用中心 Multi-dimensional synthetic aperture radar image Voronoi polygon mosaic method

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI743519B (en) * 2019-07-18 2021-10-21 萬潤科技股份有限公司 Self-propelled device and method for establishing map
JP7424784B2 (en) * 2019-09-30 2024-01-30 株式会社小松製作所 Control device, working machine and control method

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004271901A (en) * 2003-03-07 2004-09-30 Matsushita Electric Ind Co Ltd Map display system
JP4198513B2 (en) * 2003-04-18 2008-12-17 パイオニア株式会社 MAP INFORMATION PROCESSING DEVICE, MAP INFORMATION PROCESSING SYSTEM, POSITION INFORMATION DISPLAY DEVICE, ITS METHOD, ITS PROGRAM, AND RECORDING MEDIUM CONTAINING THE PROGRAM
JP4138574B2 (en) * 2003-05-21 2008-08-27 株式会社日立製作所 Car navigation system
JP4855654B2 (en) * 2004-05-31 2012-01-18 ソニー株式会社 On-vehicle device, on-vehicle device information providing method, on-vehicle device information providing method program, and on-vehicle device information providing method program
US8355862B2 (en) * 2008-01-06 2013-01-15 Apple Inc. Graphical user interface for presenting location information
US8493408B2 (en) * 2008-11-19 2013-07-23 Apple Inc. Techniques for manipulating panoramas
US8640020B2 (en) * 2010-06-02 2014-01-28 Microsoft Corporation Adjustable and progressive mobile device street view
US8315791B2 (en) * 2010-06-18 2012-11-20 Nokia Coporation Method and apparatus for providing smart zooming of a geographic representation
KR101638918B1 (en) * 2010-08-17 2016-07-12 엘지전자 주식회사 Mobile terminal and Method for converting display mode thereof
TW201222460A (en) * 2010-11-19 2012-06-01 Data Statistical Analysis Corp Interface with map location as a key point to facilitate searching rental or sale housing object

Also Published As

Publication number Publication date
TWI625706B (en) 2018-06-01
TW201407560A (en) 2014-02-16
TWI592913B (en) 2017-07-21
TW201638906A (en) 2016-11-01
TWI550568B (en) 2016-09-21

Similar Documents

Publication Publication Date Title
US11727641B2 (en) Problem reporting in maps
US9047691B2 (en) Route display and review
US9135751B2 (en) Displaying location preview
US9367959B2 (en) Mapping application with 3D presentation
US9311750B2 (en) Rotation operations in a mapping application
US10176633B2 (en) Integrated mapping and navigation application
EP2672229A2 (en) Mapping application with novel search field
US20130326380A1 (en) Triage Tool for Problem Reporting in Maps
TWI625706B (en) Method, machine-readable medium and electronic device for presenting a map
TWI521187B (en) Integrated mapping and navigation application
TWI533264B (en) Route display and review
TW201407562A (en) Mapping application with novel search field