US20160116297A1 - Intuitive Preview of Upcoming Navigational Instructions - Google Patents
- Publication number
- US20160116297A1 (U.S. application Ser. No. 14/990,145)
- Authority
- US
- United States
- Prior art keywords
- navigational
- maneuvers
- indicators
- distance
- sequence
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3632—Guidance using simplified or iconic instructions, e.g. using arrows
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3655—Timing of guidance instructions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K15/00—Arrangements for producing a permanent visual presentation of the output data, e.g. computer output printers
- G06K15/02—Arrangements for producing a permanent visual presentation of the output data, e.g. computer output printers using printers
- G06K15/18—Conditioning data for presenting it to the physical printing elements
- G06K15/1835—Transforming generic data
- G06K15/1842—Geometric transformations, e.g. on raster data
- G06K15/1843—Changing size or raster resolution
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0968—Systems involving transmission of navigation instructions to the vehicle
- G08G1/096855—Systems involving transmission of navigation instructions to the vehicle where the output is provided in a suitable form to the driver
- G08G1/096861—Systems involving transmission of navigation instructions to the vehicle where the output is provided in a suitable form to the driver where the immediate route instructions are output to the driver, e.g. arrow signs for next turn
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/003—Maps
Definitions
- the present disclosure relates generally to navigational systems.
- the present disclosure is directed to systems and methods for providing an intuitive preview of upcoming navigational instructions.
- Navigational devices are becoming increasingly commonplace in the modern world. For example, navigational devices can be used for navigating a vehicle such as a car, boat, or airplane or for use when walking through an unfamiliar location.
- navigational devices are no longer limited to devices specifically designed with the sole purpose of providing navigational instructions. Instead, navigational devices can include a large variety of computing devices capable of implementing one or more applications to provide near-instantaneous instructions for navigating from almost any location to any other location.
- such applications often offer “turn-by-turn” navigational instruction, which provides navigation over a sequence of navigational maneuvers (e.g. driving maneuvers such as “turn right”).
- the sequence of maneuvers can be described by a group of textual entries that respectively describe the upcoming maneuvers.
- the navigational device can provide a group of graphical icons that respectively represent the upcoming maneuvers or can output audio in the form of human speech that describes the upcoming maneuvers.
- certain display methods implemented by current navigational devices can fail to provide users with an intuitive, user-friendly sense of the scale and relationship between the upcoming maneuvers.
- the navigational device may fail to indicate the distance between upcoming maneuvers or may provide the navigational instruction only upon approaching a predefined distance from the maneuver location.
- the navigation device may fail to provide sufficient advance warning to enable the user to be in proper position or otherwise appropriately anticipate the maneuver. For example, a driver may be required to merge or change lanes immediately after a first maneuver in order to be in position to make a second maneuver. As such, if the navigational device fails to give the driver appropriate notice, then the driver may miss the second maneuver.
- if the navigational device provides the distance between upcoming maneuvers in a textual format, the user may struggle to mentally convert the textual distance information into a full comprehension of the physical distance.
- user effort to comprehend textual distances or time spent looking at the device display to read the text can undesirably distract the user from the navigational activity (e.g. driving the car).
- One example aspect of the present disclosure is directed to a method for providing navigational instruction.
- the method includes obtaining, by one or more computing devices, navigational information describing a sequence of navigational maneuvers associated with a route.
- the method includes determining, by the one or more computing devices, a distance between each navigational maneuver and the previous sequential navigational maneuver.
- the method includes displaying, by the one or more computing devices, a user interface providing a sequence of indicators respectively representing the sequence of navigational maneuvers. A space between each indicator and the previous sequential indicator is proportional to the distance between the navigational maneuver represented by such indicator and the navigational maneuver represented by the previous sequential indicator.
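- As a concrete illustration of this proportional layout (not part of the patent text), the minimal sketch below maps the distance between sequential maneuvers to vertical pixel spacing; the function name, data fields, and pixels-per-unit scale are assumptions.

```python
# Minimal sketch: place maneuver indicators along a vertical axis so that the
# gap between adjacent indicators is proportional to the distance (or travel
# time) separating the maneuvers they represent. Names and units are illustrative.

def layout_indicators(maneuvers, px_per_unit=4.0, top_margin_px=10.0):
    """maneuvers: list of dicts such as
    {"text": "Turn right", "distance_from_previous": 15.0},
    where distance_from_previous is measured from the previous sequential
    maneuver (0 for the first indicator shown)."""
    y = top_margin_px
    layout = []
    for m in maneuvers:
        y += m["distance_from_previous"] * px_per_unit
        layout.append((y, m["text"]))
    return layout

example_route = [
    {"text": "Bear left onto Stone Ave.", "distance_from_previous": 0.0},
    {"text": "Turn right onto 1st St.", "distance_from_previous": 15.0},
    {"text": "Merge left", "distance_from_previous": 10.0},
]
for y, text in layout_indicators(example_route):
    print(f"{y:6.1f} px  {text}")
```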
- FIG. 1 depicts an example user interface according to an example embodiment of the present disclosure.
- FIG. 2 depicts an example navigational system according to an example embodiment of the present disclosure.
- FIG. 3 depicts a flow chart of an example method for providing navigational instruction according to an example embodiment of the present disclosure.
- FIG. 4 depicts a flow chart of an example method for providing navigational instruction according to an example embodiment of the present disclosure.
- a navigational device can obtain navigational information describing a sequence of navigational maneuvers associated with a route.
- the device can display a user interface that provides a sequence of indicators that respectively represent the sequence of navigational maneuvers.
- the device can determine its current location and then identify one or more maneuvers that should occur within a given timeframe.
- a space can be provided between each pair of adjacent indicators on the user interface.
- the space between each indicator and the previous sequential indicator can be proportional to a distance between the navigational maneuver represented by such indicator and the navigational maneuver represented by the previous sequential indicator.
- the user of the navigational device can be provided with an intuitive visual sense of the spatial and/or temporal relationship between upcoming navigational maneuvers.
- a navigational device implementing the present disclosure can obtain navigational information describing a route from an origin to a destination.
- the navigational device can communicate with a server over a network to obtain the navigational information.
- the route can include a sequence of navigational maneuvers.
- navigational maneuvers can include driving or walking maneuvers such as “turn right”, transit maneuvers such as “board the southbound L-Train”, or other suitable forms of navigational maneuvers.
- a distance can exist between each pair of sequential navigational maneuvers.
- the distance can be a physical distance between the locations respectively associated with the pair of navigational maneuvers.
- the distance can be a driving distance (e.g. the distance that a car must travel along one or more roadways).
- the distance between each pair of navigational maneuvers can be a travel time such as, for example, an average travel time between the locations respectively associated with the pair of navigational maneuvers.
- the distance between each pair of navigational maneuvers can be a current expected travel time that incorporates real-time information concerning traffic conditions, weather conditions, current device speed, or other factors.
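- The patent lists the kinds of real-time inputs but no formula, so the following is only a hedged sketch of how a current expected travel time might be derived; the adjustment factors and their multiplicative combination are invented for illustration.

```python
# Sketch: fold real-time conditions into an expected travel time between two
# maneuvers. The adjustment factors and their multiplicative combination are
# illustrative assumptions, not a method defined by the patent.

def expected_travel_time_min(base_time_min, traffic_factor=1.0,
                             weather_factor=1.0, speed_ratio=1.0):
    """traffic_factor and weather_factor > 1 mean slower than typical;
    speed_ratio is the current speed divided by the typical segment speed."""
    return base_time_min * traffic_factor * weather_factor / max(speed_ratio, 0.1)

# 10-minute segment, moderate congestion, travelling at 80% of typical speed:
print(round(expected_travel_time_min(10, traffic_factor=1.3, speed_ratio=0.8), 1))  # 16.2
```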
- the navigational device can identify one or more upcoming maneuvers for display in the user interface.
- the navigational device can determine the current position of the device or device user. Based on such current position, the navigational device can identify one or more upcoming maneuvers. For example, the device can identify the next three upcoming maneuvers.
- the navigational device can identify both the current position and speed of the device or device user. Based on such information, the device can determine which of the sequence of navigational maneuvers the user is expected to reach within a threshold amount of time.
- the navigational device can simply display upcoming maneuvers received from a server.
- the device can report its current location and/or speed to the server and the server can, in turn, provide the navigational device with data identifying the upcoming maneuvers.
- the data from the server can identify a plurality of upcoming maneuvers and a plurality of distances respectively associated with the upcoming maneuvers and the navigational device can use such information to determine the appropriate presentation of the upcoming maneuvers.
- the data from the server can include a listing of upcoming maneuvers and associated distances or spacings along with a style sheet.
- the device can then apply the style sheet to the provided listing.
- the server can provide the navigational device with a web page or other data structure in which the upcoming maneuvers are already organized (e.g. spaced according to distance) for display.
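- No wire format is specified for such data; purely as a hypothetical illustration, a server listing of upcoming maneuvers, their spacings, and presentation hints might resemble the structure below, with all field names assumed.

```python
# Hypothetical server payload: upcoming maneuvers with the distance from the
# previous maneuver, plus optional presentation hints that a device could apply
# much like a style sheet. The structure and field names are illustrative only.
import json

response_body = json.dumps({
    "maneuvers": [
        {"instruction": "Bear left onto Stone Ave.", "icon": "bear-left",
         "minutes_from_previous": 0},
        {"instruction": "Turn right onto 1st St.", "icon": "turn-right",
         "minutes_from_previous": 15},
        {"instruction": "Merge left", "icon": "merge-left",
         "minutes_from_previous": 10},
    ],
    "presentation": {"px_per_minute": 4, "highlight_next": True},
})

payload = json.loads(response_body)
print(payload["maneuvers"][1]["instruction"])  # Turn right onto 1st St.
```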
- the navigational device can then indicate the identified maneuvers to the user via a user interface.
- the device can provide a sequence of indicators in the user interface that respectively represent the identified upcoming navigational maneuvers.
- the indicators can be textual entries that describe the upcoming maneuvers using text.
- the indicators can be graphical icons such as, for example, a graphical arrow showing a right turn.
- the indicators can be displayed at different positions along a first axis of the user interface that is representative of distance (e.g. physical distance, travel time, current expected travel time, etc.).
- the first axis can be the y-axis of the user interface.
- an interval or space can be provided between each pair of adjacent indicators displayed along the first axis.
- the interval between each pair of adjacent indicators can be proportional to the distance between the pair of maneuvers such pair of indicators represent.
- the user interface can be updated on a periodic basis to reflect the user's progress along the route.
- the navigational device can determine when a navigational maneuver has been performed and remove the corresponding indicator from the user interface.
- the device can periodically determine its position relative to the route and update the user interface accordingly. For example, in some implementations, the device can scroll the displayed indicators along the distance axis as the user progresses along the route (e.g. scrolled upwards when the indicators are presented along the y-axis with the next maneuver shown at the top).
- the navigational device can periodically communicate with a server to receive additional information, refresh a web page, or otherwise update the display of upcoming maneuvers.
- indicators representing maneuvers that the user is newly approaching can be presented in a fashion which visually simulates the indicator moving onto the bottom of the display area from previously being below the display area and out of sight. In such fashion, the user can be given the impression that the device display area is virtually scrolling through the entire sequence of indicators so as to display only the most relevant upcoming indicators.
- the navigational device can determine a scale of the distance axis of the user interface based on a current speed at which the device or the device user is travelling.
- the scale of the distance axis can decrease (e.g. show a smaller amount of distance over the same display space) when the speed is smaller and increase (e.g. show a larger amount of distance over the same display space) when the speed is greater.
- the navigational device can periodically adjust the scale of the distance axis based on the current speed of the device or the device user.
- the device can then determine which upcoming maneuvers should be indicated in the user interface display area based on the scale of the distance axis and the respective distances associated with the upcoming maneuvers.
- the identified maneuvers can then be respectively represented by a sequence of indicators at corresponding positions along the distance axis.
- determinations regarding the scale of the distance axis can be performed at a server and then communicated to the navigational device.
- the navigational device can then update the display according to the most recent information received from the server.
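- For illustration only, one way such a speed-dependent scale and the resulting selection of visible maneuvers could be computed is sketched below; the reference speed, constants, and simple inverse-speed rule are assumptions rather than anything recited above.

```python
# Sketch: derive a distance-axis scale from the current speed, then keep only
# the upcoming maneuvers that fit within the distance the display can show.
# The constants and the inverse-speed rule are illustrative assumptions.

def px_per_km_from_speed(speed_m_s, base_px_per_km=80.0, reference_kmh=30.0):
    # Faster travel -> fewer pixels per kilometre, i.e. more route shown in the
    # same display space (and less route when moving slowly).
    speed_kmh = max(speed_m_s * 3.6, 5.0)   # floor avoids a runaway scale at rest
    return base_px_per_km * reference_kmh / speed_kmh

def maneuvers_that_fit(upcoming, speed_m_s, display_height_px=320):
    scale = px_per_km_from_speed(speed_m_s)
    visible_km = display_height_px / scale
    return [m for m in upcoming if m["distance_ahead_km"] <= visible_km]

upcoming = [
    {"text": "Turn right", "distance_ahead_km": 0.4},
    {"text": "Merge left", "distance_ahead_km": 2.5},
    {"text": "Take the exit", "distance_ahead_km": 9.0},
]
print(len(maneuvers_that_fit(upcoming, speed_m_s=25.0)))  # ~90 km/h: 3 maneuvers fit
print(len(maneuvers_that_fit(upcoming, speed_m_s=1.4)))   # walking pace: 1 fits
```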
- the systems and methods of the present disclosure can assist in providing an intuitive preview of upcoming navigational instructions.
- a space between each indicator displayed in a user interface and a previous sequential indicator can be proportional to a distance between the navigational maneuver represented by such indicator and the navigational maneuver represented by the previous sequential indicator.
- the user of the navigational device can be provided with an intuitive visual sense of the spatial and/or temporal relationship between upcoming navigational maneuvers.
- FIG. 1 depicts an example user interface 100 according to an example embodiment of the present disclosure.
- user interface 100 is shown as provided on the display of a navigational device 150 (e.g. a smartwatch).
- Each indicator can represent an upcoming navigational maneuver.
- Each indicator can include a textual entry (e.g. “Bear Left Onto Stone Ave.”) and/or a graphical icon (e.g. an arrow showing a leftwards turn).
- the plurality of indicators can be ordered into a sequence based on the expected order in which they should be performed. For example, as shown in FIG. 1 , the plurality of indicators can be presented at different positions along a y-axis of interface 100 .
- the indicator 102 for the next upcoming maneuver can be shown in a larger font/icon size at the top of the display area.
- a particular color or other spatial designations can be used to highlight the next upcoming indicator.
- a space can be provided between each pair of adjacent indicators.
- space 112 is provided between indicators 102 and 104 while space 114 is provided between indicators 104 and 106 .
- the space between each pair of adjacent indicators can be proportional to the distance between the pair of maneuvers such pair of indicators represent. For example, as shown in FIG. 1 , space 112 which corresponds to a distance of 15 minutes is larger than space 114 which corresponds to a distance of 10 minutes.
- FIG. 1 shows spaces 112 and 114 as based on distance in terms of a travel time
- the distance can be a physical distance between the locations respectively associated with the pair of navigational maneuvers; a driving distance (e.g. the distance that a car must travel along one or more roadways); a travel time such as, for example, an average travel time between the locations respectively associated with the pair of navigational maneuvers; or a current expected travel time that incorporates real-time information concerning traffic conditions, weather conditions, current device speed, or other factors.
- user interface 100 is provided as an example.
- User interfaces implementing the present disclosure may include various other colors, patterns, divisions of space, fonts, icons, or other visual characteristics different from those shown in FIG. 1.
- user interface 100 provides an intuitive preview of upcoming navigational instructions.
- a space between each indicator displayed in user interface 100 and a previous sequential indicator can be proportional to a distance between the navigational maneuver represented by such indicator and the navigational maneuver represented by the previous sequential indicator.
- the user of the navigational device can be provided with an intuitive visual sense of the spatial and/or temporal relationship between upcoming navigational maneuvers.
- FIG. 2 depicts an example navigational system 200 according to an example embodiment of the present disclosure.
- Navigational system 200 includes a navigational device 210 in communication with a server 230 over a network 250 . Although a single navigational device 210 is depicted, navigational system 200 can include a client-server architecture in which any number of navigational devices can be connected to server 230 over network 250 .
- Navigational device 210 can be any suitable device used for navigation, including a sole-purpose navigational device, a smartphone, a tablet, a laptop, a PDA, a device installed within a dashboard of a vehicle, a heads up display in a vehicle, a wearable computing device (e.g. eyeglasses containing one or more embedded computing devices), or any other device that is configured to display navigational instructions.
- Navigational device 210 can include one or more processor(s) 212 , a memory 214 , a display 218 , a positioning system 220 , and a network interface 222 .
- the processor(s) 212 can be any suitable processing device, such as a microprocessor, microcontroller, integrated circuit, or other suitable processing device.
- the memory 214 can include any suitable computing system or media, including, but not limited to, non-transitory computer-readable media, RAM, ROM, hard drives, flash drives, or other memory devices.
- the memory 214 can store information accessible by processor(s) 212 , including instructions that can be executed by processor(s) 212 .
- the instructions can be any set of instructions that when executed by the processor(s) 212 , cause the processor(s) 212 to provide desired functionality.
- memory 214 can store an application module 216 .
- Navigational device 210 can implement application module 216 to execute aspects of the present disclosure, including directing communications with server 230 and providing navigational instructions to a user (e.g. generating and/or displaying a navigational user interface).
- module refers to computer logic utilized to provide desired functionality.
- a module can be implemented in hardware, application specific circuits, firmware and/or software controlling a general purpose processor.
- the modules are program code files stored on the storage device, loaded into memory and executed by a processor or can be provided from computer program products, for example computer executable instructions, that are stored in a tangible computer-readable storage medium such as RAM, hard disk or optical or magnetic media.
- Memory 214 can also include data, such as geographic data, that can be retrieved, manipulated, created, or stored by processor(s) 212 . In some implementations, such data can be accessed and used to generate maps and navigational instructions.
- the navigational device 210 can also include a positioning system 220 that can be used to identify the position of the navigational device 210 .
- the positioning system 220 can be any device or circuitry for monitoring the position, speed, and/or heading of the navigational device 210 .
- the positioning system 220 can determine actual or relative position by using a satellite navigation positioning system (e.g. a GPS system, a Galileo positioning system, the GLObal Navigation satellite system (GLONASS), the BeiDou Satellite Navigation and Positioning system), an inertial navigation system, a magnetic field positioning system, a dead reckoning system, based on IP address, by using triangulation and/or proximity to cellular towers or WiFi hotspots, and/or other suitable techniques for determining position.
- the navigational device 210 can include various input/output devices for providing and receiving information from a user, such as a touch screen, touch pad, data entry keys, speakers, mouse, and/or a microphone suitable for voice recognition.
- the navigational device 210 can use display 218 to present information to the user, including textual or graphical navigational instructions.
- Network interface 222 can be any suitable device or circuitry for providing communications across network 250 .
- network interface 222 can include one or more of a receiver, a transmitter, an antenna, a modem, a port, or other suitable components.
- the navigational device 210 can exchange data with one or more servers 230 over the network 250 via network interface 222 .
- Server 230 can be any suitable form of server or other computing device configured to supply navigational device 210 with the appropriate information.
- multiple servers are accessed in a sequence or in parallel by navigational device 210 to retrieve or obtain the desired information or functionality.
- server 230 can include a processor(s) 232 , a memory 234 , and a network interface 238 .
- the memory 234 can store information accessible by processor(s) 232 , including instructions 236 that can be executed by processor(s) and data.
- Server 230 can include or be in communication with one or more databases, including a traffic database 240 and/or a geographic information system 242 . Server 230 can access databases 240 and 242 over a LAN, WAN, or other suitable computing construct.
- Traffic database 240 can store or provide data describing real-time or daily traffic conditions. For example, traffic database 240 can provide data describing the locations of any current traffic stoppages, congestions, or other traffic conditions.
- Geographic information system 242 can store or provide geographic data, including map data, point of interest data, road categorization data, or other suitable data.
- server 230 can use data obtained from geographic information system 242 to determine and provide navigational instructions from an origin to a destination.
- the network 250 can be any type of communications network, such as a local area network (e.g. intranet), wide area network (e.g. Internet), or some combination thereof.
- communication between the navigational device 210 and server 230 can be carried via network interface using any type of wired and/or wireless connection, using a variety of communication protocols (e.g. TCP/IP, HTTP), encodings or formats (e.g. HTML, XML), and/or protection schemes (e.g. VPN, secure HTTP, SSL).
- FIG. 3 depicts a flow chart of an example method ( 300 ) for providing navigational instruction according to an example embodiment of the present disclosure. Although method ( 300 ) will be discussed with reference to system 200 of FIG. 2 , method ( 300 ) can be performed by any suitable computing system.
- FIG. 3 depicts steps performed in a particular order for purposes of illustration and discussion. Those of ordinary skill in the art, using the disclosures provided herein, will understand that the various steps of method ( 300 ) can be omitted, adapted, and/or rearranged in various ways without departing from the scope of the present disclosure.
- navigational information describing a route can be obtained.
- navigational device 210 can communicate with server 230 over network 250 to receive navigation information describing a plurality of navigational maneuvers to be performed according to a route.
- Example navigational maneuvers can include driving or walking maneuvers such as “turn right”, transit maneuvers such as “board the southbound L-Train”, or other suitable forms of navigational maneuvers.
- a current position of the device can be determined.
- navigational device 210 can operate positioning system 220 to determine a current position of the device 210 .
- a plurality of upcoming maneuvers can be determined based on the current position of the device.
- navigational device 210 can analyze the current position of the device relative to the route so as to identify the next upcoming navigational maneuvers. For example, in some implementations, the next three anticipated maneuvers can be identified at ( 306 ).
- the navigational device 210 can identify both the current position and speed of the device or device user. Based on such information, the device can determine at ( 306 ) which of the sequence of navigational maneuvers the user is expected to reach within a threshold amount of time.
- the navigational device 210 can receive data from server 230 that identifies the plurality of upcoming maneuvers.
- the device 210 can report its current location and/or speed to server 230 and server 230 can, in turn, provide the navigational device with data identifying the upcoming maneuvers.
- the data from the server 230 can identify the plurality of upcoming maneuvers along with a plurality of distances respectively associated with the upcoming maneuvers.
- the navigational device 210 can then use such information to determine the appropriate presentation of the upcoming maneuvers at ( 308 ).
- the data received at ( 306 ) from the server 230 can include a listing of upcoming maneuvers and associated distances or spacings along with a style sheet.
- the device 210 can then apply the style sheet to the provided listing at ( 308 ).
- the server 230 can provide the navigational device 210 with a web page or other data structure in which the upcoming maneuvers are already organized (e.g. spaced according to distance) for display.
- a plurality of indicators respectively representing the upcoming maneuvers determined at ( 306 ) can be displayed on a user interface.
- a space provided between each pair of adjacent indicators can be proportional to a distance between the pair of maneuvers such pair of indicators represent.
- the indicators can be displayed at different positions along a first axis of the user interface that is representative of distance (e.g. physical distance, driving distance, travel time, current expected travel time, etc.).
- the first axis can be the y-axis of the user interface.
- the interval or space between each pair of adjacent indicators can be proportional to the distance between the pair of maneuvers such pair of indicators represent.
- the distance can be a physical distance between the locations respectively associated with the pair of navigational maneuvers.
- the distance can be a driving distance (e.g. the distance that a car must travel along one or more roadways).
- the distance between each pair of navigational maneuvers can be a travel time such as, for example, an average travel time between the locations respectively associated with the pair of navigational maneuvers.
- the distance between each pair of navigational maneuvers can be a current expected travel time that incorporates real-time information concerning traffic conditions, weather conditions, current device speed, or other factors.
- method ( 300 ) can return to ( 304 ) and re-determine the current position of the device.
- the navigational device 210 can periodically determine its position relative to the route and update the user interface accordingly.
- the navigational device 210 can periodically communicate with the server 230 to receive additional information, refresh a web page, or otherwise update the display of upcoming maneuvers.
- the device can scroll the displayed indicators along the distance axis as the user progresses along the route (e.g. scrolled upwards when the indicators are presented along the y-axis with the next maneuver shown at the top).
- indicators representing maneuvers that the user is newly approaching can be presented in a fashion which visually simulates the indicator moving onto the bottom of the display area from previously being below the display area and out of sight.
- the user can be given the impression that the device display area is virtually scrolling through the entire sequence of indicators so as to display only the most relevant upcoming indicators.
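- Purely as an illustrative sketch of this flow (the callbacks, field names, and update interval are assumptions, not elements recited by the patent), a device-side loop following the shape of method ( 300 ) might look like the following.

```python
# Sketch of a loop in the shape of method (300): obtain the route once, then
# repeatedly determine position, identify the next few maneuvers, and redraw
# the spaced indicator list. get_route, get_progress_m and render are
# placeholder callbacks supplied by the caller.
import time

def run_guidance(get_route, get_progress_m, render, count=3, interval_s=2.0):
    maneuvers = get_route()                   # (302) maneuvers sorted by route offset
    while True:
        progress_m = get_progress_m()         # (304) distance travelled along the route
        upcoming = [m for m in maneuvers if m["route_offset_m"] > progress_m]
        if not upcoming:
            break                             # destination reached
        render(upcoming[:count], progress_m)  # (306)/(308) next few, spaced by distance
        time.sleep(interval_s)                # then re-determine the position at (304)

if __name__ == "__main__":
    route = [{"text": "Turn right", "route_offset_m": 400},
             {"text": "Merge left", "route_offset_m": 2500}]
    progress = iter([0, 500, 3000])           # simulated position fixes
    run_guidance(lambda: route,
                 lambda: next(progress),
                 lambda ms, p: print(p, [m["text"] for m in ms]),
                 interval_s=0.0)
```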
- FIG. 4 depicts a flow chart of an example method ( 400 ) for providing navigational instruction according to an example embodiment of the present disclosure.
- Although method ( 400 ) will be discussed with reference to system 200 of FIG. 2 , method ( 400 ) can be performed by any suitable computing system.
- FIG. 4 depicts steps performed in a particular order for purposes of illustration and discussion. Those of ordinary skill in the art, using the disclosures provided herein, will understand that the various steps of method ( 400 ) can be omitted, adapted, and/or rearranged in various ways without departing from the scope of the present disclosure.
- navigational information describing a route can be obtained.
- navigational device 210 can communicate with server 230 over network 250 to receive navigation information describing a plurality of navigational maneuvers to be performed according to a route.
- a current position of the device can be determined.
- navigational device 210 can operate positioning system 220 to determine a current position of the device 210 .
- a current speed of the device can be determined. For example, at ( 406 ) navigational device 210 can compare recent position determinations to determine a speed and a heading. As another example, navigational device 210 can receive data input from other devices or components that identify the current speed of the device.
- a scale of a distance axis of a user interface can be determined based at least in part on the current speed.
- the scale of the distance axis can decrease (e.g. show a smaller amount of distance over the same display space) when the speed is smaller and increase (e.g. show a larger amount of distance over the same display space) when the speed is greater.
- the navigational device can determine the scale of the distance axis at ( 408 ) based at least in part on the current speed.
- the server 230 can determine the scale of the distance axis at ( 408 ) based at least in part on the current speed and communicate such information to the navigational device 210 .
- one or more upcoming maneuvers can be identified for display based at least in part on the scale of the distance axis.
- navigational device 210 can consider the scale of the distance axis as determined at ( 408 ) with respect to the available display space of display 218 .
- navigational device 210 can determine at ( 410 ) which of the upcoming navigational maneuvers should be displayed in the display area of display 218 .
- the server 230 can identify the one or more upcoming maneuvers for display based at least in part on the scale of the distance axis and then communicate such information to the navigational device 210 .
- the data communicated by the server 230 can include the plurality of maneuvers along with distances or spacings in a style sheet.
- the data communicated by the server 230 can be a web page or other data structure in which the upcoming maneuvers are already spaced for display according to the scale of the distance axis determined at ( 408 ).
- one or more indicators respectively representing the one or more upcoming maneuvers identified at ( 410 ) can be displayed along the distance axis at positions corresponding to their distance from the navigational device.
- method ( 400 ) can return to ( 404 ) and re-determine the current position of the device.
- the navigational device 210 can periodically adjust the scale of the distance axis of the user interface based on the current speed of the device or the device user.
- the device 210 can then determine which upcoming maneuvers should be indicated in the user interface display area based on the scale of the distance axis and the respective distances associated with the upcoming maneuvers.
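- Likewise, an update cycle in the shape of method ( 400 ) could be arranged roughly as sketched below; the callbacks, the scale rule, and the loop interval are illustrative assumptions.

```python
# Sketch of an update cycle in the shape of method (400): each pass re-reads
# position and speed, rescales the distance axis from the speed, re-selects the
# maneuvers that fit, and redraws. All callbacks and constants are placeholders.
import time

def run_speed_scaled_guidance(get_position, get_speed_m_s, pick_visible, render,
                              interval_s=2.0):
    while True:
        position = get_position()                    # (404) current position
        speed_kmh = max(get_speed_m_s() * 3.6, 5.0)  # (406) current speed
        px_per_km = 80.0 * 30.0 / speed_kmh          # (408) scale from speed
        visible = pick_visible(position, px_per_km)  # (410) maneuvers that fit
        if not visible:
            break                                    # nothing left to show
        render(visible, px_per_km)                   # indicators along the axis
        time.sleep(interval_s)                       # then back to (404)
```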
- server processes discussed herein may be implemented using a single server or multiple servers working in combination.
- Databases and applications may be implemented on a single system or distributed across multiple systems. Distributed components may operate sequentially or in parallel.
- computing tasks discussed herein as being performed at a server can instead be performed at a client device (e.g. navigational device communicating with a server).
- computing tasks discussed herein as being performed at the client device can instead be performed at the server.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Business, Economics & Management (AREA)
- Mathematical Physics (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- General Engineering & Computer Science (AREA)
- User Interface Of Digital Computer (AREA)
- Navigation (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
- The present application is a continuation of and claims priority to U.S. patent application Ser. No. 14/267,235, filed May 1, 2014, which claims priority to U.S. Provisional Patent Application Ser. No. 61/941,116 filed Feb. 18, 2014. The present application claims priority to and benefit of all such applications and hereby incorporates all such applications herein by reference in their entirety.
- The present disclosure relates generally to navigational systems. In particular, the present disclosure is directed to systems and methods for providing an intuitive preview of upcoming navigational instructions.
- Navigational devices are becoming increasingly commonplace in the modern world. For example, navigational devices can be used for navigating a vehicle such as a car, boat, or airplane or for use when walking through an unfamiliar location.
- Furthermore, as smartphones, tablets, or other computing devices become increasingly able to determine their own position in the world using GPS or other positioning systems, navigational devices are no longer limited to devices specifically designed with the sole purpose of providing navigational instructions. Instead, navigational devices can include a large variety of computing devices capable of implementing one or more applications to provide near-instantaneous instructions for navigating from almost any location to any other location.
- In particular, such applications often offer “turn-by-turn” navigational instruction, which provides navigation over a sequence of navigational maneuvers (e.g. driving maneuvers such as “turn right”). As an example, the sequence of maneuvers can be described by a group of textual entries that respectively describe the upcoming maneuvers. As other examples, the navigational device can provide a group of graphical icons that respectively represent the upcoming maneuvers or can output audio in the form of human speech that describes the upcoming maneuvers.
- However, certain display methods implemented by current navigational devices can fail to provide users with an intuitive, user-friendly sense of the scale and relationship between the upcoming maneuvers. As an example, the navigational device may fail to indicate the distance between upcoming maneuvers or may provide the navigational instruction only upon approaching a predefined distance from the maneuver location.
- However, if there are two maneuvers within relative proximity to one another, the navigation device may fail to provide sufficient advance warning to enable the user to be in proper position or otherwise appropriately anticipate the maneuver. For example, a driver may be required to merge or change lanes immediately after a first maneuver in order to be in position to make a second maneuver. As such, if the navigational device fails to give the driver appropriate notice, then the driver may miss the second maneuver.
- As another example, in the instance that the navigational device provides the distance between upcoming maneuvers in a textual format, the user may struggle to mentally convert the textual distance information into a full comprehension of the physical distance. Alternatively, user effort to comprehend textual distances or time spent looking at the device display to read the text can undesirably distract the user from the navigational activity (e.g. driving the car).
- Aspects and advantages of the present disclosure will be set forth in part in the following description, or may be obvious from the description, or may be learned through practice of embodiments of the present disclosure.
- One example aspect of the present disclosure is directed to a method for providing navigational instruction. The method includes obtaining, by one or more computing devices, navigational information describing a sequence of navigational maneuvers associated with a route. The method includes determining, by the one or more computing devices, a distance between each navigational maneuver and the previous sequential navigational maneuver. The method includes displaying, by the one or more computing devices, a user interface providing a sequence of indicators respectively representing the sequence of navigational maneuvers. A space between each indicator and the previous sequential indicator is proportional to the distance between the navigational maneuver represented by such indicator and the navigational maneuver represented by the previous sequential indicator.
- These and other features, aspects and advantages of the present disclosure will become better understood with reference to the following description and appended claims. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
- A full and enabling description of the present disclosure, directed to one of ordinary skill in the art, is set forth in the specification, which makes reference to the appended figures, in which:
- FIG. 1 depicts an example user interface according to an example embodiment of the present disclosure;
- FIG. 2 depicts an example navigational system according to an example embodiment of the present disclosure;
- FIG. 3 depicts a flow chart of an example method for providing navigational instruction according to an example embodiment of the present disclosure; and
- FIG. 4 depicts a flow chart of an example method for providing navigational instruction according to an example embodiment of the present disclosure.
- Reference now will be made in detail to embodiments of the present disclosure, one or more examples of which are illustrated in the drawings. Each example is provided by way of explanation of the present disclosure, not limitation of the present disclosure. In fact, it will be apparent to those skilled in the art that various modifications and variations can be made to the present disclosure without departing from the scope or spirit of the disclosure. For instance, features illustrated or described as part of one embodiment can be used with another embodiment to yield a still further embodiment. Thus, it is intended that the present disclosure covers such modifications and variations as come within the scope of the appended claims and their equivalents.
- Generally, the present disclosure is directed to systems and methods for providing an intuitive preview of upcoming navigational instructions. In particular, a navigational device can obtain navigational information describing a sequence of navigational maneuvers associated with a route. The device can display a user interface that provides a sequence of indicators that respectively represent the sequence of navigational maneuvers. As an example, the device can determine its current location and then identify one or more maneuvers that should occur within a given timeframe. A space can be provided between each pair of adjacent indicators on the user interface. In particular, the space between each indicator and the previous sequential indicator can be proportional to a distance between the navigational maneuver represented by such indicator and the navigational maneuver represented by the previous sequential indicator. In such fashion, the user of the navigational device can be provided with an intuitive visual sense of the spatial and/or temporal relationship between upcoming navigational maneuvers.
- More particularly, a navigational device implementing the present disclosure can obtain navigational information describing a route from an origin to a destination. For example, the navigational device can communicate with a server over a network to obtain the navigational information.
- The route can include a sequence of navigational maneuvers. For example, navigational maneuvers can include driving or walking maneuvers such as “turn right”, transit maneuvers such as “board the southbound L-Train”, or other suitable forms of navigational maneuvers.
- A distance can exist between each pair of sequential navigational maneuvers. For example, the distance can be a physical distance between the locations respectively associated with the pair of navigational maneuvers. As another example, the distance can be a driving distance (e.g. the distance that a car must travel along one or more roadways).
- As yet another example, the distance between each pair of navigational maneuvers can be a travel time such as, for example, an average travel time between the locations respectively associated with the pair of navigational maneuvers. As another example, the distance between each pair of navigational maneuvers can be a current expected travel time that incorporates real-time information concerning traffic conditions, weather conditions, current device speed, or other factors.
- According to an aspect of the present disclosure, the navigational device can identify one or more upcoming maneuvers for display in the user interface. As an example, in some implementations, the navigational device can determine the current position of the device or device user. Based on such current position, the navigational device can identify one or more upcoming maneuvers. For example, the device can identify the next three upcoming maneuvers.
- As another example, in some implementations, the navigational device can identify both the current position and speed of the device or device user. Based on such information, the device can determine which of the sequence of navigational maneuvers the user is expected to reach within a threshold amount of time.
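- A minimal sketch of this threshold test follows; the straight-line time estimate, the field names, and the ten-minute threshold are simplifying assumptions used only for illustration.

```python
# Sketch: given the current position and speed, keep the maneuvers the user is
# expected to reach within a time threshold. The equirectangular distance
# approximation and the field names are simplifying assumptions.
import math

def maneuvers_within(maneuvers, lat, lon, speed_m_s, threshold_s=600):
    def metres_to(m):
        dx = math.radians(m["lon"] - lon) * 6_371_000 * math.cos(math.radians(lat))
        dy = math.radians(m["lat"] - lat) * 6_371_000
        return math.hypot(dx, dy)
    speed = max(speed_m_s, 0.5)            # avoid division by zero when stopped
    return [m for m in maneuvers if metres_to(m) / speed <= threshold_s]

upcoming = [
    {"text": "Turn right", "lat": 37.7758, "lon": -122.4190},
    {"text": "Merge left", "lat": 37.7905, "lon": -122.4010},
]
# At walking pace only the nearby maneuver falls inside the 10-minute window.
print(maneuvers_within(upcoming, lat=37.7749, lon=-122.4194, speed_m_s=1.4))
```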
- As yet another example, the navigational device can simply display upcoming maneuvers received from a server. In particular, the device can report its current location and/or speed to the server and the server can, in turn, provide the navigational device with data identifying the upcoming maneuvers. For example, the data from the server can identify a plurality of upcoming maneuvers and a plurality of distances respectively associated with the upcoming maneuvers and the navigational device can use such information to determine the appropriate presentation of the upcoming maneuvers.
- As another example, in some implementations, the data from the server can include a listing of upcoming maneuvers and associated distances or spacings along with a style sheet. The device can then apply the style sheet to the provided listing. As yet another example, the server can provide the navigational device with a web page or other data structure in which the upcoming maneuvers are already organized (e.g. spaced according to distance) for display.
- The navigational device can then indicate the identified maneuvers to the user via a user interface. In particular, the device can provide a sequence of indicators in the user interface that respectively represent the identified upcoming navigational maneuvers. As an example, the indicators can be textual entries that describe the upcoming maneuvers using text. As another example, the indicators can be graphical icons such as, for example, a graphical arrow showing a right turn.
- In some implementations of the present disclosure, the indicators can be displayed at different positions along a first axis of the user interface that is representative of distance (e.g. physical distance, travel time, current expected travel time, etc.). For example, the first axis can be the y-axis of the user interface.
- In particular, an interval or space can be provided between each pair of adjacent indicators displayed along the first axis. The interval between each pair of adjacent indicators can be proportional to the distance between the pair of maneuvers such pair of indicators represent.
- The user interface can be updated on a periodic basis to reflect the user's progress along the route. For example, in some implementations, the navigational device can determine when a navigational maneuver has been performed and remove the corresponding indicator from the user interface.
- As another example, the device can periodically determine its position relative to the route and update the user interface accordingly. For example, in some implementations, the device can scroll the displayed indicators along the distance axis as the user progresses along the route (e.g. scrolled upwards when the indicators are presented along the y-axis with the next maneuver shown at the top).
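- As a rough sketch of that scrolling behaviour (the scale factor and names are assumptions), the pixel offset applied to the whole indicator column could simply track the distance travelled since the last anchor point, as below.

```python
# Sketch: scroll the indicator column upward as the user advances by converting
# distance travelled along the route into a pixel offset; indicators drawn at
# fixed route offsets then drift toward the top of the display.

def scroll_offset_px(travelled_km_since_anchor, px_per_km=80.0):
    return travelled_km_since_anchor * px_per_km

def scrolled_positions(indicator_ys_px, travelled_km_since_anchor):
    offset = scroll_offset_px(travelled_km_since_anchor)
    return [y - offset for y in indicator_ys_px]

print(scrolled_positions([40.0, 160.0, 300.0], travelled_km_since_anchor=0.25))
# -> [20.0, 140.0, 280.0]: everything has moved 20 px closer to the top
```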
- As yet another example, the navigational device can periodically communicate with a server to receive additional information, refresh a web page, or otherwise update the display of upcoming maneuvers.
- According to another aspect of the present disclosure, indicators representing maneuvers that the user is newly approaching can be presented in a fashion which visually simulates the indicator moving onto the bottom of the display area from previously being below the display area and out of sight. In such fashion, the user can be given the impression that the device display area is virtually scrolling through the entire sequence of indicators so as to display only the most relevant upcoming indicators.
- In further embodiments of the present disclosure, the navigational device can determine a scale of the distance axis of the user interface based on a current speed at which the device or the device user is travelling. For example, the scale of the distance axis can decrease (e.g. show a smaller amount of distance over the same display space) when the speed is smaller and increase (e.g. show a larger amount of distance over the same display space) when the speed is greater.
- In particular, the navigational device can periodically adjust the scale of the distance axis based on the current speed of the device or the device user. The device can then determine which upcoming maneuvers should be indicated in the user interface display area based on the scale of the distance axis and the respective distances associated with the upcoming maneuvers. The identified maneuvers can then be respectively represented by a sequence of indicators at corresponding positions along the distance axis.
- As another example, determinations regarding the scale of the distance axis can be performed at a server and then communicated to the navigational device. The navigational device can then update the display according to the most recent information received from the server.
- Thus, the systems and methods of the present disclosure can assist in providing an intuitive preview of upcoming navigational instructions. In particular, a space between each indicator displayed in a user interface and a previous sequential indicator can be proportional to a distance between the navigational maneuver represented by such indicator and the navigational maneuver represented by the previous sequential indicator. In such fashion, the user of the navigational device can be provided with an intuitive visual sense of the spatial and/or temporal relationship between upcoming navigational maneuvers.
- With reference now to the FIGS., example embodiments of the present disclosure will be discussed in further detail.
- FIG. 1 depicts an example user interface 100 according to an example embodiment of the present disclosure. In particular, user interface 100 is shown as provided on the display of a navigational device 150 (e.g. a smartwatch).
- Provided in user interface 100 are a plurality of indicators, such as, for example, indicators 102, 104, and 106. Each indicator can represent an upcoming navigational maneuver. Each indicator can include a textual entry (e.g. “Bear Left Onto Stone Ave.”) and/or a graphical icon (e.g. an arrow showing a leftwards turn).
- The plurality of indicators can be ordered into a sequence based on the expected order in which they should be performed. For example, as shown in FIG. 1, the plurality of indicators can be presented at different positions along a y-axis of interface 100.
- As an example, as shown in FIG. 1, the indicator 102 for the next upcoming maneuver can be shown in a larger font/icon size at the top of the display area. For example, a particular color or other spatial designations can be used to highlight the next upcoming indicator.
- According to an aspect of the present disclosure, a space can be provided between each pair of adjacent indicators. For example, space 112 is provided between indicators 102 and 104 while space 114 is provided between indicators 104 and 106.
- The space between each pair of adjacent indicators can be proportional to the distance between the pair of maneuvers such pair of indicators represent. For example, as shown in FIG. 1, space 112, which corresponds to a distance of 15 minutes, is larger than space 114, which corresponds to a distance of 10 minutes.
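- As a quick worked illustration of that proportionality (FIG. 1 does not specify a scale, so the 4 px-per-minute figure below is an assumption):

```python
# With an assumed scale of 4 px per minute, the 15-minute gap (space 112) and
# the 10-minute gap (space 114) from FIG. 1 would be drawn as:
px_per_minute = 4
space_112 = 15 * px_per_minute   # 60 px
space_114 = 10 * px_per_minute   # 40 px
print(space_112, space_114)      # 60 40 -> space 112 is 1.5x as tall as space 114
```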
FIG. 1 showsspaces - In addition, it will be understood that
user interface 100 is provided as an example. User interfaces implementing the present disclosure may include many other various colors, patterns, divisions of space, fonts, icons, or other visual characteristics that are different than those shown inFIG. 1 . - Thus,
user interface 100 provides an intuitive preview of upcoming navigational instructions. In particular, a space between each indicator displayed inuser interface 100 and a previous sequential indicator can be proportional to a distance between the navigational maneuver represented by such indicator and the navigational maneuver represented by the previous sequential indicator. In such fashion, the user of the navigational device can be provided with an intuitive visual sense of the spatial and/or temporal relationship between upcoming navigational maneuvers. -
FIG. 2 depicts an examplenavigational system 200 according to an example embodiment of the present disclosure.Navigational system 200 includes anavigational device 210 in communication with aserver 230 over anetwork 250. Although a singlenavigational device 210 is depicted,navigational system 200 can include a client-server architecture in which any number of navigational devices can be connected toserver 230 overnetwork 250. -
Navigational device 210 can be any suitable device used for navigation, including a sole-purpose navigational device, a smartphone, a tablet, a laptop, a PDA, a device installed within a dashboard of a vehicle, a heads up display in a vehicle, a wearable computing device (e.g. eyeglasses containing one or more embedded computing devices), or any other device that is configured to display navigational instructions.Navigational device 210 can include one or more processor(s) 212, a memory 214, adisplay 218, apositioning system 220, and a network interface 222. - The processor(s) 212 can be any suitable processing device, such as a microprocessor, microcontroller, integrated circuit, or other suitable processing device. The memory 214 can include any suitable computing system or media, including, but not limited to, non-transitory computer-readable media, RAM, ROM, hard drives, flash drives, or other memory devices. The memory 214 can store information accessible by processor(s) 212, including instructions that can be executed by processor(s) 212. The instructions can be any set of instructions that when executed by the processor(s) 212, cause the processor(s) 212 to provide desired functionality.
- In particular, in some devices, memory 214 can store an
application module 216. Navigational device 210 can implement application module 216 to execute aspects of the present disclosure, including directing communications with server 230 and providing navigational instructions to a user (e.g. generating and/or displaying a navigational user interface). - It will be appreciated that the term “module” refers to computer logic utilized to provide desired functionality. Thus, a module can be implemented in hardware, application specific circuits, firmware, and/or software controlling a general purpose processor. In one embodiment, the modules are program code files stored on a storage device, loaded into memory, and executed by a processor, or can be provided from computer program products (for example, computer-executable instructions) that are stored in a tangible computer-readable storage medium such as RAM, a hard disk, or optical or magnetic media.
- Memory 214 can also include data, such as geographic data, that can be retrieved, manipulated, created, or stored by processor(s) 212. In some implementations, such data can be accessed and used to generate maps and navigational instructions.
- The
navigational device 210 can also include a positioning system 220 that can be used to identify the position of the navigational device 210. The positioning system 220 can be any device or circuitry for monitoring the position, speed, and/or heading of the navigational device 210. For example, the positioning system 220 can determine actual or relative position by using a satellite navigation positioning system (e.g. a GPS system, a Galileo positioning system, the GLObal NAvigation Satellite System (GLONASS), or the BeiDou Satellite Navigation and Positioning System), an inertial navigation system, a magnetic field positioning system, a dead reckoning system, an IP address, triangulation and/or proximity to cellular towers or WiFi hotspots, and/or other suitable techniques for determining position. - The
navigational device 210 can include various input/output devices for providing and receiving information from a user, such as a touch screen, touch pad, data entry keys, speakers, mouse, and/or a microphone suitable for voice recognition. For instance, the navigational device 210 can use display 218 to present information to the user, including textual or graphical navigational instructions. - Network interface 222 can be any suitable device or circuitry for providing communications across
network 250. For example, network interface 222 can include one or more of a receiver, a transmitter, an antenna, a modem, a port, or other suitable components. - The
navigational device 210 can exchange data with one or more servers 230 over the network 250 via network interface 222. Server 230 can be any suitable form of server or other computing device configured to supply navigational device 210 with the appropriate information. In particular, in some implementations, multiple servers are accessed in a sequence or in parallel by navigational device 210 to retrieve or obtain the desired information or functionality. - Similar to
navigational device 210, server 230 can include one or more processor(s) 232, a memory 234, and a network interface 238. The memory 234 can store information accessible by processor(s) 232, including instructions 236 that can be executed by processor(s) 232, as well as data. -
Server 230 can include or be in communication with one or more databases, including a traffic database 240 and/or a geographic information system 242. Server 230 can access databases 240 and 242 as needed. -
Traffic database 240 can store or provide data describing real-time or daily traffic conditions. For example, traffic database 240 can provide data describing the locations of any current traffic stoppages, congestion, or other traffic conditions. -
Geographic information system 242 can store or provide geographic data, including map data, point of interest data, road categorization data, or other suitable data. In some implementations, server 230 can use data obtained from geographic information system 242 to determine and provide navigational instructions from an origin to a destination. - The
network 250 can be any type of communications network, such as a local area network (e.g. an intranet), a wide area network (e.g. the Internet), or some combination thereof. In general, communication between the navigational device 210 and server 230 can be carried via a network interface using any type of wired and/or wireless connection, using a variety of communication protocols (e.g. TCP/IP, HTTP), encodings or formats (e.g. HTML, XML), and/or protection schemes (e.g. VPN, secure HTTP, SSL). -
FIG. 3 depicts a flow chart of an example method (300) for providing navigational instructions according to an example embodiment of the present disclosure. Although method (300) will be discussed with reference to system 200 of FIG. 2, method (300) can be performed by any suitable computing system. - In addition,
FIG. 3 depicts steps performed in a particular order for purposes of illustration and discussion. Those of ordinary skill in the art, using the disclosures provided herein, will understand that the various steps of method (300) can be omitted, adapted, and/or rearranged in various ways without departing from the scope of the present disclosure. - At (302) navigational information describing a route can be obtained. For example,
navigational device 210 can communicate with server 230 over network 250 to receive navigational information describing a plurality of navigational maneuvers to be performed according to a route. Example navigational maneuvers can include driving or walking maneuvers such as “turn right”, transit maneuvers such as “board the southbound L-Train”, or other suitable forms of navigational maneuvers. - At (304) a current position of the device can be determined. For example,
navigational device 210 can operate positioning system 220 to determine a current position of the device 210. - At (306) a plurality of upcoming maneuvers can be determined based on the current position of the device. As an example,
navigational device 210 can analyze the current position of the device relative to the route so as to identify the next upcoming navigational maneuvers. For example, in some implementations, the next three anticipated maneuvers can be identified at (306). - As another example, in some implementations, at (306) the
navigational device 210 can identify both the current position and speed of the device or device user. Based on such information, the device can determine at (306) which of the sequence of navigational maneuvers the user is expected to reach within a threshold amount of time.
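As a concrete illustration of this threshold-based selection, the following is a minimal sketch; the function name, the route representation, and the threshold value are assumptions for illustration rather than part of the disclosure.

```python
# Hypothetical sketch of selecting maneuvers the user is expected to reach
# within a threshold amount of time, given position and speed along the route.
def maneuvers_within_threshold(route_maneuvers, position_along_route_m,
                               speed_m_per_s, threshold_s=600):
    """route_maneuvers: ordered list of (label, position_along_route_m)."""
    upcoming = []
    for label, maneuver_pos_m in route_maneuvers:
        remaining_m = maneuver_pos_m - position_along_route_m
        if remaining_m < 0:
            continue  # maneuver already passed
        eta_s = remaining_m / max(speed_m_per_s, 0.1)  # guard against zero speed
        if eta_s <= threshold_s:
            upcoming.append((label, eta_s))
    return upcoming
```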
- As another example, at (306) the navigational device 210 can receive data from server 230 that identifies the plurality of upcoming maneuvers. In particular, the device 210 can report its current location and/or speed to server 230 and server 230 can, in turn, provide the navigational device with data identifying the upcoming maneuvers. For example, the data from the server 230 can identify the plurality of upcoming maneuvers along with a plurality of distances respectively associated with the upcoming maneuvers. The navigational device 210 can then use such information to determine the appropriate presentation of the upcoming maneuvers at (308). - As yet another example, in some implementations, the data received at (306) from the
server 230 can include a listing of upcoming maneuvers and associated distances or spacings along with a style sheet. The device 210 can then apply the style sheet to the provided listing at (308). As another example, at (306) the server 230 can provide the navigational device 210 with a web page or other data structure in which the upcoming maneuvers are already organized (e.g. spaced according to distance) for display. - At (308) a plurality of indicators respectively representing the upcoming maneuvers determined at (306) can be displayed on a user interface. A space provided between each pair of adjacent indicators can be proportional to a distance between the pair of maneuvers such pair of indicators represent.
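One possible shape for such a server response is shown below. This is purely illustrative; the disclosure does not define a wire format, and every field name and value here is an assumption.

```python
# Purely illustrative payload for the maneuver listing described above;
# the field names, units, and style entries are assumptions, not a defined format.
example_response = {
    "maneuvers": [
        {"instruction": "Turn right onto Main St", "distance_to_next": 15},
        {"instruction": "Merge onto the highway",  "distance_to_next": 10},
        {"instruction": "Take exit 12",            "distance_to_next": None},
    ],
    "distance_unit": "minutes",   # could equally be meters or seconds
    "style": {"indicator_font_px": 18, "highlight_next": True},
}
```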
- As an example, in some implementations, at (308) the indicators can be displayed at different positions along a first axis of the user interface that is representative of distance (e.g. physical distance, driving distance, travel time, current expected travel time, etc.). For example, the first axis can be the y-axis of the user interface.
- The interval or space between each pair of adjacent indicators can be proportional to the distance between the pair of maneuvers such pair of indicators represent. For example, the distance can be a physical distance between the locations respectively associated with the pair of navigational maneuvers. As another example, the distance can be a driving distance (e.g. the distance that a car must travel along one or more roadways).
- As yet another example, the distance between each pair of navigational maneuvers can be a travel time such as, for example, an average travel time between the locations respectively associated with the pair of navigational maneuvers. As another example, the distance between each pair of navigational maneuvers can be a current expected travel time that incorporates real-time information concerning traffic conditions, weather conditions, current device speed, or other factors.
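One way to treat these alternative notions of “distance” uniformly is a small selector that maps a route segment to a single number used for spacing. This is a sketch under assumed field names, not an interface defined by the disclosure.

```python
# Illustrative selector for the spacing "distance" between two adjacent maneuvers.
# The leg fields and the traffic adjustment are assumptions for illustration.
def pairwise_distance(leg, metric="current_travel_time"):
    """leg: dict describing the segment between a pair of adjacent maneuvers."""
    if metric == "physical":
        return leg["straight_line_m"]
    if metric == "driving":
        return leg["road_distance_m"]
    if metric == "average_travel_time":
        return leg["avg_travel_time_s"]
    # Current expected travel time: scale the average by a real-time factor,
    # e.g. derived from traffic, weather, or current device speed.
    return leg["avg_travel_time_s"] * leg.get("traffic_factor", 1.0)
```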
- After (308), method (300) can return to (304) and re-determine the current position of the device. In such fashion, the
navigational device 210 can periodically determine its position relative to the route and update the user interface accordingly. Alternatively or additionally, thenavigational device 210 can periodically communicate with theserver 230 to receive additional information, refresh a web page, or otherwise update the display of upcoming maneuvers. - As an example of periodic updates, in some implementations, the device can scroll the displayed indicators along the distance axis as the user progresses along the route (e.g. scrolled upwards when the indicators are presented along the y-axis with the next maneuver shown at the top).
- Thus, indicators representing maneuvers that the user is newly approaching can be presented in a fashion which visually simulates the indicator moving onto the bottom of the display area from previously being below the display area and out of sight. In such fashion, the user can be given the impression that the device display area is virtually scrolling through the entire sequence of indicators so as to display only the most relevant upcoming indicators.
-
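The virtual-scrolling behavior can be thought of as a sliding window over the laid-out sequence of indicators. The sketch below is illustrative only; the viewport size and names are assumptions, and a real implementation would delegate the actual drawing to a UI toolkit.

```python
# Illustrative sketch of scrolling indicators along the distance axis as the
# user progresses; names and the pixel viewport are assumptions.
def visible_indicators(indicator_positions_px, progress_px, viewport_px=480):
    """indicator_positions_px: (label, y) pairs laid out along the full route.
    progress_px: how far the user has advanced, in the same pixel scale.
    Returns indicators currently inside the viewport, so new ones appear to
    scroll in from the bottom of the display area as the user advances."""
    on_screen = []
    for label, y in indicator_positions_px:
        shifted = y - progress_px  # scroll everything upward as progress grows
        if 0 <= shifted <= viewport_px:
            on_screen.append((label, shifted))
    return on_screen
```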
FIG. 4 depicts a flow chart of an example method (400) for providing navigational instructions according to an example embodiment of the present disclosure. Although method (400) will be discussed with reference to system 200 of FIG. 2, method (400) can be performed by any suitable computing system. - In addition,
FIG. 4 depicts steps performed in a particular order for purposes of illustration and discussion. Those of ordinary skill in the art, using the disclosures provided herein, will understand that the various steps of method (400) can be omitted, adapted, and/or rearranged in various ways without departing from the scope of the present disclosure. - At (402) navigational information describing a route can be obtained. For example,
navigational device 210 can communicate with server 230 over network 250 to receive navigational information describing a plurality of navigational maneuvers to be performed according to a route. - At (404) a current position of the device can be determined. For example,
navigational device 210 can operate positioning system 220 to determine a current position of the device 210. - At (406) a current speed of the device can be determined. For example, at (406)
navigational device 210 can compare recent position determinations to determine a speed and a heading. As another example,navigational device 210 can receive data input from other devices or components that identify the current speed of the device. - At (408) a scale of a distance axis of a user interface can be determined based at least in part on the current speed. For example, the scale of the distance axis can decrease (e.g. show a smaller amount of distance over the same display space) when the speed is smaller and increase (e.g. show a larger amount of distance over the same display space) when the speed is greater.
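To ground the speed determination and the speed-dependent scale in something concrete, here is a minimal sketch. The haversine helper, the look-ahead constant, and the viewport size are assumptions for illustration; the disclosure does not prescribe these formulas.

```python
# Illustrative sketch: estimate speed from two recent position determinations,
# then derive a distance-axis scale so that a faster device shows more distance
# in the same display space. Names and constants are assumptions.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude fixes."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def axis_scale_m_per_px(fix_a, fix_b, viewport_px=480, lookahead_s=600):
    """fix_a, fix_b: (lat, lon, t_seconds) recent position determinations."""
    dist_m = haversine_m(fix_a[0], fix_a[1], fix_b[0], fix_b[1])
    dt_s = max(fix_b[2] - fix_a[2], 1e-3)
    speed_m_s = dist_m / dt_s
    # Represent roughly lookahead_s seconds of travel across the viewport.
    return (speed_m_s * lookahead_s) / viewport_px
```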
- As an example, in some implementations, the navigational device can determine the scale of the distance axis at (408) based at least in part on the current speed. As another example, in some implementations, the
server 230 can determine the scale of the distance axis at (408) based at least in part on the current speed and communicate such information to the navigational device 210. - At (410) one or more upcoming maneuvers can be identified for display based at least in part on the scale of the distance axis. Thus, for example,
navigational device 210 can consider the scale of the distance axis as determined at (408) with respect to the available display space of display 218. Given the scale of the distance axis, navigational device 210 can then determine at (410) which of the upcoming navigational maneuvers should be displayed in the display area of display 218. - As another example, at (410) the
server 230 can identify the one or more upcoming maneuvers for display based at least in part on the scale of the distance axis and then communicate such information to the navigational device 210. For example, the data communicated by the server 230 can include the plurality of maneuvers along with distances or spacings in a style sheet. As another example, the data communicated by the server 230 can be a web page or other data structure in which the upcoming maneuvers are already spaced for display according to the scale of the distance axis determined at (408). - At (412) one or more indicators respectively representing the one or more upcoming maneuvers identified at (410) can be displayed along the distance axis at positions corresponding to their distance from the navigational device.
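A sketch of this selection step, under the same illustrative assumptions as above (the scale value and display height are hypothetical parameters, not values fixed by the disclosure):

```python
# Illustrative sketch of step (410): keep only maneuvers whose distance from the
# device fits within what the scaled distance axis can represent on screen.
def maneuvers_for_display(upcoming, scale_m_per_px, display_height_px=480):
    """upcoming: ordered list of (label, distance_from_device_m)."""
    max_represented_m = scale_m_per_px * display_height_px
    return [(label, dist_m / scale_m_per_px)   # (label, y-position in pixels)
            for label, dist_m in upcoming
            if 0 <= dist_m <= max_represented_m]
```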
- After (412), method (400) can return to (404) and re-determine the current position of the device. In such fashion, the
navigational device 210 can periodically adjust the scale of the distance axis of the user interface based on the current speed of the device or the device user. The device 210 can then determine which upcoming maneuvers should be indicated in the user interface display area based on the scale of the distance axis and the respective distances associated with the upcoming maneuvers. - Therefore, only indicators for those maneuvers that are expected to be reached within a navigationally-significant period of time, as determined by the current speed of the device, will be displayed. Further, the space between the indicators will be proportional to their respective distances from one another, as they will be positioned along the distance axis according to their respective distances.
- The technology discussed herein makes reference to servers, databases, software applications, and other computer-based systems, as well as actions taken and information sent to and from such systems. One of ordinary skill in the art will recognize that the inherent flexibility of computer-based systems allows for a great variety of possible configurations, combinations, and divisions of tasks and functionality between and among components. For instance, server processes discussed herein may be implemented using a single server or multiple servers working in combination. Databases and applications may be implemented on a single system or distributed across multiple systems. Distributed components may operate sequentially or in parallel.
- Furthermore, computing tasks discussed herein as being performed at a server can instead be performed at a client device (e.g. navigational device communicating with a server). Likewise, computing tasks discussed herein as being performed at the client device can instead be performed at the server.
- While the present subject matter has been described in detail with respect to specific example embodiments and methods thereof, it will be appreciated that those skilled in the art, upon attaining an understanding of the foregoing, may readily produce alterations to, variations of, and equivalents to such embodiments. Accordingly, the scope of the present disclosure is by way of example rather than by way of limitation, and the subject disclosure does not preclude inclusion of such modifications, variations, and/or additions to the present subject matter as would be readily apparent to one of ordinary skill in the art.
Claims (21)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/990,145 US20160116297A1 (en) | 2014-02-18 | 2016-01-07 | Intuitive Preview of Upcoming Navigational Instructions |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201461941116P | 2014-02-18 | 2014-02-18 | |
US14/267,235 US9243921B2 (en) | 2014-02-18 | 2014-05-01 | Intuitive preview of upcoming navigational instructions |
US14/990,145 US20160116297A1 (en) | 2014-02-18 | 2016-01-07 | Intuitive Preview of Upcoming Navigational Instructions |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/267,235 Continuation US9243921B2 (en) | 2014-02-18 | 2014-05-01 | Intuitive preview of upcoming navigational instructions |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160116297A1 true US20160116297A1 (en) | 2016-04-28 |
Family
ID=53797842
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/267,235 Active 2034-05-28 US9243921B2 (en) | 2014-02-18 | 2014-05-01 | Intuitive preview of upcoming navigational instructions |
US14/990,145 Abandoned US20160116297A1 (en) | 2014-02-18 | 2016-01-07 | Intuitive Preview of Upcoming Navigational Instructions |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/267,235 Active 2034-05-28 US9243921B2 (en) | 2014-02-18 | 2014-05-01 | Intuitive preview of upcoming navigational instructions |
Country Status (3)
Country | Link |
---|---|
US (2) | US9243921B2 (en) |
DE (1) | DE202015009170U1 (en) |
WO (1) | WO2015126891A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113264036A (en) * | 2021-05-19 | 2021-08-17 | 广州小鹏汽车科技有限公司 | Guiding method and device based on parking function in automatic driving |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8203438B2 (en) | 2008-07-29 | 2012-06-19 | Masimo Corporation | Alarm suspend system |
US9243921B2 (en) * | 2014-02-18 | 2016-01-26 | Google Inc. | Intuitive preview of upcoming navigational instructions |
JP6542708B2 (en) * | 2016-04-22 | 2019-07-10 | 株式会社Subaru | Display device |
US10883848B2 (en) | 2018-09-20 | 2021-01-05 | Here Global B.V. | Methods and systems for providing an improved maneuver countdown bar |
US11014577B2 (en) * | 2018-11-28 | 2021-05-25 | Here Global B.V. | Method and apparatus for presenting a feedforward cue in a user interface before an upcoming vehicle event occurs |
CN110618848A (en) * | 2018-12-25 | 2019-12-27 | 北京时光荏苒科技有限公司 | Page display method, device, equipment and storage medium |
US11473924B2 (en) * | 2020-02-28 | 2022-10-18 | Lyft, Inc. | Transition of navigation modes for multi-modal transportation |
JP2023537959A (en) | 2020-08-18 | 2023-09-06 | グーグル エルエルシー | Navigation guidance preview |
EP4283611A3 (en) | 2020-09-11 | 2024-02-21 | Google LLC | Detecting and improving simultaneous navigation sessions on multiple devices |
US11508194B1 (en) * | 2021-05-02 | 2022-11-22 | Jeffrey Scott VanDeusen | Position keyed lockbox |
DE102022120675A1 (en) | 2022-08-16 | 2024-02-22 | Bayerische Motoren Werke Aktiengesellschaft | Providing navigation instructions |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9243921B2 (en) * | 2014-02-18 | 2016-01-26 | Google Inc. | Intuitive preview of upcoming navigational instructions |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5654892A (en) | 1991-10-18 | 1997-08-05 | Zexel Usa Corporation | Navigation system displaying forthcoming turns |
US6199013B1 (en) | 1997-07-15 | 2001-03-06 | Navigation Technologies Corp. | Maneuver generation program and method |
US6397145B1 (en) | 2000-03-06 | 2002-05-28 | Magellan Dis, Inc. | Navigation system with complex maneuver instruction |
US6728630B1 (en) | 2002-03-07 | 2004-04-27 | General Motors Corporation | Method for providing route instructions to a mobile vehicle |
US7321824B1 (en) | 2002-12-30 | 2008-01-22 | Aol Llc | Presenting a travel route using more than one presentation style |
US20050131638A1 (en) | 2003-12-11 | 2005-06-16 | Sencaj Randall W. | Route sequence viewing in navigation system |
KR101479773B1 (en) | 2007-12-27 | 2015-01-13 | 엘지전자 주식회사 | Navigation apparatus and Method for providing TBT(Turn-By-Turn Position) List |
US8260550B2 (en) | 2009-06-19 | 2012-09-04 | GM Global Technology Operations LLC | Presentation of navigation instructions using variable levels of detail |
US20120150436A1 (en) | 2010-12-10 | 2012-06-14 | Volkswagen Ag | Method for Displaying a Travel Route |
WO2012167154A2 (en) | 2011-06-03 | 2012-12-06 | Apple Inc. | Systems and methods for printing maps and directions |
2014
- 2014-05-01: US US14/267,235 patent/US9243921B2/en active Active
2015
- 2015-02-18: WO PCT/US2015/016292 patent/WO2015126891A1/en active Application Filing
- 2015-02-18: DE DE202015009170.9U patent/DE202015009170U1/en active Active
2016
- 2016-01-07: US US14/990,145 patent/US20160116297A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US20150233722A1 (en) | 2015-08-20 |
DE202015009170U1 (en) | 2016-11-08 |
WO2015126891A1 (en) | 2015-08-27 |
US9243921B2 (en) | 2016-01-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9243921B2 (en) | Intuitive preview of upcoming navigational instructions | |
US10996067B2 (en) | Pickup location selection and augmented reality navigation | |
EP3516337B1 (en) | Providing traffic warnings to a user based on return journey delays | |
US7613331B2 (en) | Recording medium storing map information, map information processing device, map information processing system, map information processing method, map information processing program and recording medium storing the map information processing program | |
EP2017580A2 (en) | Navigation device and navigation program | |
US9267810B2 (en) | Systems and methods for displaying navigational information | |
US20200134645A1 (en) | Demand prediction information display control method, display control device, and non-transitory recording medium storing display control program | |
CN103047993A (en) | Method and device for path planning | |
US8831881B1 (en) | Interactive user interface for displaying available trips | |
US11408747B2 (en) | In-vehicle apparatus and information presentation method | |
RU2706606C1 (en) | Method, system and device for marking events of turning and routing of vehicle | |
CN104236571A (en) | Vehicle-mounted dynamic navigator | |
US9857197B2 (en) | Linear route condition interface | |
JP6053638B2 (en) | Display control apparatus and display control method | |
US11188994B2 (en) | Display control method, display control device, non-transitory recording medium storing display control program, and display control system for displaying forecasted demand for a vehicle dispatch | |
JP6804899B2 (en) | A computer-readable recording medium on which a display control device, a display control method, a display control program, and a display control program are recorded. | |
US20140039789A1 (en) | Dynamic progressive map granularity for navigation | |
JP2022023880A (en) | Display control device, display control method, display control program, and computer-readable storage medium on which display control program is recorded | |
JP6711785B2 (en) | Information processing system, information processing apparatus, and program | |
JP6964967B2 (en) | A computer-readable recording medium on which a display control device, a display control method, a display control program, and a display control program are recorded. | |
JP2024026491A (en) | Device, method and program for display control, and computer-readable recording medium having display control program recorded therein | |
JP2010203947A (en) | Navigation apparatus, program and server system | |
JP2018185258A (en) | Device and method for searching for route | |
JP2012207924A (en) | Information presentation device, information presentation method, information presentation program, and recording medium storing information presentation program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GOOGLE INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FAABORG, ALEXANDER JAMES;KAPLAN, JOSHUA ROBIN;SIGNING DATES FROM 20140220 TO 20140222;REEL/FRAME:037430/0889 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: GOOGLE LLC, CALIFORNIA Free format text: CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:044144/0001 Effective date: 20170929 |
|
AS | Assignment |
Owner name: GOOGLE LLC, CALIFORNIA Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE THE REMOVAL OF THE INCORRECTLY RECORDED APPLICATION NUMBERS 14/149802 AND 15/419313 PREVIOUSLY RECORDED AT REEL: 44144 FRAME: 1. ASSIGNOR(S) HEREBY CONFIRMS THE CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:068092/0502 Effective date: 20170929 |