US20150177018A1 - Touch screen based interaction with traffic data - Google Patents
Touch screen based interaction with traffic data
- Publication number
- US20150177018A1 (U.S. application Ser. No. 14/637,357)
- Authority
- US
- United States
- Prior art keywords
- presentation
- traffic
- touch screen
- virtual broadcast
- presenter
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3635—Guidance using 3D or perspective road maps
- G01C21/3638—Guidance using 3D or perspective road maps including 3D objects and buildings
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3691—Retrieval, searching and output of information related to real-time traffic, weather, or environmental conditions
- G01C21/3694—Output thereof on a road map
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/29—Geographical information databases
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0108—Measuring and analyzing of parameters relative to traffic conditions based on the source of data
- G08G1/0116—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0108—Measuring and analyzing of parameters relative to traffic conditions based on the source of data
- G08G1/012—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from other sources than vehicle or roadside beacons, e.g. mobile networks
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0137—Measuring and analyzing of parameters relative to traffic conditions for specific applications
- G08G1/0141—Measuring and analyzing of parameters relative to traffic conditions for specific applications for traffic information dissemination
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/091—Traffic information broadcasting
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
Definitions
- Existing broadcast presentations generally include a variety of maps, images, and animations that display current or forecasted conditions for reference by a presenter (i.e., a news reporter) during a broadcast presentation such as a traffic or weather report.
- The broadcast presentation is often produced prior to a scheduled broadcast for presentation by a traffic or weather reporter in a fixed arrangement (much like a slide show) with a pre-rehearsed script.
- Although the presenter has the ability to control the speed and manner in which the broadcast presentation is presented to a viewing audience, the content in the maps and images remains fixed. That is, the content presented during the broadcast presentation is not in real-time and is outdated.
- The reporting of outdated information (e.g., traffic or weather information) may have a drastic effect on a viewing audience who may rely on the reported information to make decisions about such things as travel or logistics.
- Another shortcoming of existing broadcast technology is the lack of interaction with the content of the virtual broadcast presentation. Since the presentation contains pre-determined content, a presenter is unable to directly interact with or manipulate the maps and images of the presentation. The presenter cannot, for example, retrieve real-time conditions or other information associated with the maps or images of the presentation. As such, there is a need in the art for touch screen based interaction with traffic data and other related data.
- Embodiments of the present invention allow a presenter to interact with traffic data and other related data in real-time using a touch screen or other suitable display.
- In a first claimed embodiment, a method for touch screen based interaction with traffic data is claimed.
- Through the method, a virtual broadcast presentation is generated based on traffic data received from one or more information sources.
- A signal based on user interaction with a touch screen is generated and received.
- User interaction may include the selection of an interactive element included in the virtual broadcast presentation.
- The signal generated by the touch screen is then processed and the virtual broadcast presentation is updated in response to the processed signal.
- In a second claimed embodiment, a system for touch screen based interaction with traffic data includes at least a communications module and a presentation rendering module, each module stored in memory and executable by a processor.
- Execution of the communications module by the processor receives a signal generated by the touch screen.
- The signal may be based on user interaction with the touch screen, wherein the user interaction includes selection of an interactive element included in a virtual broadcast presentation.
- Execution of the presentation rendering module by the processor generates the virtual broadcast presentation based on traffic data received from one or more information sources, processes the signal generated by the touch screen, and updates the virtual broadcast presentation in response to the processed signal.
- In a third claimed embodiment, a non-transitory computer-readable storage medium includes a computer program that is executable by a processor to perform a method for touch screen based interaction with traffic data.
- A virtual broadcast presentation is generated based on traffic data received from one or more information sources.
- A signal based on user interaction with a touch screen is generated and received. User interaction may include the selection of an interactive element included in the virtual broadcast presentation.
- The signal generated by the touch screen is then processed and the virtual broadcast presentation is updated in response to the processed signal.
- FIG. 1 illustrates a block diagram of an environment for the broadcast of a virtual broadcast presentation that a user may interact with and reference in real-time
- FIG. 2 illustrates the virtual broadcast presentation engine of FIG. 1 .
- FIGS. 3A-3B illustrate a virtual broadcast presentation displayed on a touch screen.
- FIG. 4 illustrates an interactive element appearing in a virtual broadcast presentation.
- FIG. 5 illustrates the interaction technique of “pinching” used with a virtual broadcast presentation.
- FIGS. 6A-6B illustrate a virtual broadcast presentation in ‘trip time’ mode.
- FIGS. 7A-7B illustrate a traffic camera appearing within a virtual broadcast presentation.
- FIG. 8 is a flowchart illustrating a method for touch screen based interaction with traffic data presented in a virtual broadcast presentation.
- The present invention provides for the use of a touch screen to interact with traffic information and other related data during a virtual broadcast presentation.
- The virtual broadcast presentation may include maps, images, graphics, animations, multimedia overlays, and the like, that are rendered in a two-dimensional or three-dimensional manner on a display such as a touch screen.
- A presenter may refer to the presentation in real-time and may manipulate a view of the virtual broadcast presentation using the touch screen. The presenter may also use the touch screen to select an interactive element included in the broadcast presentation.
- FIG. 1 illustrates a block diagram of an environment for the broadcast of a virtual broadcast presentation that a user may interact with and reference in real-time.
- The environment 100 of FIG. 1 includes a computing device 110 having a virtual broadcast presentation engine 120.
- The computing device 110 of FIG. 1 is communicatively coupled to information sources 130, a touch screen 140, and a broadcast system 150.
- While FIG. 1 illustrates one particular environment 100 including certain elements for the broadcast of a virtual presentation, alternative embodiments may be implemented that utilize differing elements than those disclosed in FIG. 1 (or combinations of the same), but that otherwise fall within the scope and spirit of the present invention.
- the computing device 110 and the virtual broadcast presentation engine 120 may generate a composite presentation that includes a virtual broadcast presentation.
- the virtual broadcast presentation may be two-dimensional or three-dimensional.
- the composite presentation may be generated using information obtained in real-time (or near real-time) from the information sources 130 as described in further detail below.
- the virtual broadcast presentation engine 120 in particular, is discussed with respect to FIG. 2 .
- the computing device 110 may include various components such as one or more of communications interfaces, a processor, memory, storage, and any number of buses providing communication therebetween (not depicted).
- the processor may execute instructions implemented through computing modules or engines while the memory and storage may both permanently or temporarily store data including the aforementioned modules and engines.
- Information sources 130 may be provided by various organizations and in a variety of forms.
- Information sources 130 may include data sources related to traffic data such as traffic flow and as described in U.S. patent application Ser. No. 11/302,418, now U.S. Pat. No. 7,221,287, or weather data such as forecasts.
- Information sources 130 may also include data sources related to newsworthy events or incidents, school closings, election results, and other information that may be featured in a virtual broadcast presentation.
- Information sources 130 may require subscription or authentication for access and may be accessible via Telnet, FTP, or web services protocols.
- Information may be received from information sources 130 in real-time or near real-time to allow for generation of an equally real-time or near real-time presentation. That presentation may, in turn, be manipulated in real-time.
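- The sketch below illustrates one way such real-time or near real-time retrieval might be structured: a simple poller that refreshes a set of feeds on a fixed interval. The feed URLs, field layout, and function names are illustrative assumptions, not anything specified in this disclosure.

```python
import json
import time
import urllib.request

# Hypothetical feed endpoints standing in for sources such as 511.org, the CHP
# server, or PeMS; real feeds may require subscription or authentication.
FEEDS = {
    "traffic_flow": "https://example.com/feeds/traffic_flow.json",
    "incidents": "https://example.com/feeds/incidents.json",
}

def fetch_feed(url: str) -> dict:
    """Fetch one feed and parse it as JSON (assumes a JSON web service)."""
    with urllib.request.urlopen(url, timeout=10) as response:
        return json.load(response)

def poll_information_sources(interval_seconds: float = 10.0):
    """Yield a merged snapshot of all feeds roughly every interval_seconds."""
    while True:
        snapshot = {}
        for name, url in FEEDS.items():
            try:
                snapshot[name] = fetch_feed(url)
            except (OSError, ValueError):
                snapshot[name] = None  # keep presenting the last good data on failure
        yield snapshot
        time.sleep(interval_seconds)
```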
- In an embodiment of the present invention utilizing traffic data specific to the San Francisco Bay area, information sources 130 may include one or more of the 511.org system (a collaboration of public agencies including the California Highway Patrol, Metropolitan Transportation Commission, and CALTRANS), the California Highway Patrol (CHP) World Wide Web server, the PeMS system at the University of California at Berkeley, various public event listings, or a publicly or privately accessible user input mechanism.
- For weather data, the information sources 130 may include the National Weather Service among other weather information sources.
- Other data sources or alternative types of data sources (e.g., non-traffic and non-weather related sources) may be incorporated and utilized in various embodiments of the present invention.
- Touch screen 140 may be any multi-touch touch screen known in the art capable of recognizing complex gestures.
- Touch screen 140 may employ any touch screen technology known in the art including but not limited to resistive technology, surface acoustic wave technology, capacitive sensing (e.g., surface capacitance, projected capacitance, mutual capacitance, self capacitance), infrared, force panel technology, optical imaging, dispersive signal technology, acoustic pulse recognition, and the like.
- A presenter may interact with touch screen 140 using any interaction technique known in the art such as touch-drag motions, “pinching” (e.g., zooming in or out of a web page or photo by touching the user interface and either spreading two fingers apart or bringing two fingers close together), scrolling (e.g., sliding a finger up and down or left and right to scroll through a page), or other user-centered interactive effects (e.g., a horizontally sliding sub-section, bookmarks menu, menu bars, and a “back” button).
- Touch screen 140 may include various sensors such as a light sensor for adjusting touch screen brightness. Touch screen 140 may also include a tilt sensor, an accelerometer, a gyroscopic component, and/or magnetometer for sensing orientation of touch screen 140 and/or switching between landscape and portrait modes. Touch screen 140 may be a liquid crystal display or any other suitable display.
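- As a rough illustration of how a drag might be distinguished from a pinch using raw finger positions, consider the sketch below; the event format and function names are assumptions rather than anything defined by this disclosure.

```python
import math

def classify_gesture(prev_points, curr_points):
    """Classify one touch screen update as a drag or a pinch.

    prev_points and curr_points are lists of (x, y) finger positions from two
    consecutive samples; this event format is an assumption.
    """
    if len(prev_points) == 1 and len(curr_points) == 1:
        dx = curr_points[0][0] - prev_points[0][0]
        dy = curr_points[0][1] - prev_points[0][1]
        return ("drag", (dx, dy))
    if len(prev_points) == 2 and len(curr_points) == 2:
        prev_gap = math.dist(prev_points[0], prev_points[1])
        curr_gap = math.dist(curr_points[0], curr_points[1])
        if prev_gap > 0 and curr_gap > prev_gap:
            return ("pinch_out", curr_gap / prev_gap)  # fingers spread apart: zoom in
        if curr_gap > 0 and curr_gap < prev_gap:
            return ("pinch_in", curr_gap / prev_gap)   # fingers brought together: zoom out
    return ("none", None)

# Example: two fingers moving apart is reported as a zoom-in pinch.
print(classify_gesture([(0, 0), (10, 0)], [(0, 0), (20, 0)]))  # ('pinch_out', 2.0)
```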
- the broadcast system 150 disseminates the composite presentation to viewers. Dissemination may occur via radio waves such as UHF or VHF, cable, satellite, or the World Wide Web. Hardware and software necessary to effectuate a broadcast may be included in the broadcast system 150 and are generally known to those skilled in the broadcasting art.
- FIG. 2 illustrates the virtual broadcast presentation engine of FIG. 1 .
- the virtual broadcast presentation engine 120 of FIG. 2 includes a communications module 210 , a presentation rendering module 220 , a selection module 230 , a feedback module 240 , and a trip calculation module 250 .
- the virtual broadcast presentation engine 120 and its constituent modules may be stored in memory and executed by a processing device to effectuate the functionality corresponding thereto.
- the virtual broadcast presentation engine 120 may be composed of more or less modules (or combinations of the same) and still fall within the scope of the present invention.
- the functionality of the selection module 230 and the functionality of the feedback module 240 may be combined into a single module.
- Execution of the communications module 210 allows for receipt of a signal generated by touch screen 140 , which may be based at least partially on a user selection such as the selection by a presenter of an interactive element displayed within the virtual broadcast presentation.
- the signal may additionally be based on—in part or in whole—the actuation of other components included within the virtual broadcast presentation such as various soft keys with different functionalities.
- In addition to the signal generated by touch screen 140, execution of the communications module 210 may also allow for receipt of dynamic information from information sources 130. This dynamic information may be used by other modules for generating, manipulating, and interacting with the virtual broadcast presentation.
- Execution of the presentation rendering module 220 allows for the generation of a virtual broadcast presentation based on the dynamic information received through execution of the communications module 210.
- the dynamic information may include traffic information, weather information, newsworthy events or incidents, election results, school closings, or other information that may be featured in a virtual broadcast presentation.
- Execution of the presentation rendering module 220 may also allow for manipulation of a view of the virtual broadcast presentation in response to the signal received by the communications module 210 from touch screen 140 .
- Manipulating the view of the presentation may include one or more of panning across, rotating, tilting, or zooming in/out of the virtual broadcast presentation.
- Signals corresponding to various motions of touch screen 140 may be assigned to various other manipulations of the virtual broadcast presentation. For example, touching touch screen 140 with one finger and moving the finger upwards may adjust the view and scroll upwards along the map or presentation.
- As another example, actuation of a soft key displayed within the virtual broadcast presentation may affect zoom speed, whereas actuation of a different soft key may affect zoom direction.
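- A minimal sketch of the kind of view state such manipulation implies is shown below; the class, method names, and clamping limits are hypothetical, chosen only to illustrate pan, rotate, tilt, and zoom operations.

```python
from dataclasses import dataclass

@dataclass
class PresentationView:
    """Hypothetical camera state for the rendered presentation."""
    x: float = 0.0          # pan position (map units)
    y: float = 0.0
    rotation: float = 0.0   # degrees; 0 means north is at the top of the screen
    tilt: float = 45.0      # degrees from vertical
    zoom: float = 1.0       # scale factor

    def pan(self, dx: float, dy: float) -> None:
        self.x += dx / self.zoom   # a drag covers less ground when zoomed in
        self.y += dy / self.zoom

    def rotate(self, degrees: float) -> None:
        self.rotation = (self.rotation + degrees) % 360.0

    def tilt_by(self, degrees: float) -> None:
        self.tilt = max(0.0, min(90.0, self.tilt + degrees))

    def zoom_by(self, factor: float) -> None:
        self.zoom = max(0.1, min(20.0, self.zoom * factor))

    def reset_orientation(self) -> None:
        """Return 'north' to the top of the screen (the orientation key behavior)."""
        self.rotation = 0.0
```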
- Execution of the selection module 230 allows for selection of an interactive element included in the virtual broadcast presentation in response to the received signal.
- An interactive element may include a soft key displayed within the virtual broadcast presentation.
- the interactive element may also represent a traffic alert. For example, if road construction is taking place at a given intersection of two streets, an icon indicative of road construction may be placed in the virtual broadcast presentation at a position that corresponds to that given intersection. Execution of the selection module 230 may also select the interactive element when the interactive element is positioned near the center of the virtual broadcast presentation.
- Selecting the interactive element may cause one of a variety of responses from the virtual broadcast presentation. For example, selection of an interactive element may cause additional information related to the interactive element to be displayed within the virtual broadcast presentation.
- In one embodiment, the interactive element may correspond to a traffic camera wherein selection of the interactive element causes a live camera view to appear within the virtual broadcast presentation.
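- The following sketch shows one plausible way to combine hit-testing with the near-center selectability described above; the element structure, radii, and coordinate conventions are assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InteractiveElement:
    """Hypothetical marker placed at a projected screen position."""
    element_id: str
    kind: str   # e.g. "incident", "construction", "camera"
    x: float
    y: float

def select_element(elements, touch_x, touch_y, screen_w, screen_h,
                   touch_radius=40.0, center_radius=150.0) -> Optional[InteractiveElement]:
    """Return the element under the touch point, if it is also near screen center.

    Mirrors the described behavior that an element may be selectable only when
    positioned near the center of the presentation; both radii are assumptions.
    """
    cx, cy = screen_w / 2.0, screen_h / 2.0
    for element in elements:
        near_touch = (element.x - touch_x) ** 2 + (element.y - touch_y) ** 2 <= touch_radius ** 2
        near_center = (element.x - cx) ** 2 + (element.y - cy) ** 2 <= center_radius ** 2
        if near_touch and near_center:
            return element
    return None
```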
- Execution of the feedback module 240 provides feedback to the presenter to inform the presenter that a given interactive element is selectable.
- For example, the interactive element may be selectable in certain regions of the virtual broadcast presentation, such as the center. When the interactive element enters or leaves the center of the virtual broadcast presentation, the presenter may be informed via feedback.
- The feedback may include highlighting of the interactive element.
- To avoid distracting or otherwise undesirable imagery, such as a cursor being included in the virtual broadcast presentation, non-visible feedback may be invoked. Examples of non-visible feedback include a vibration of touch screen 140 or an audible tone.
- Execution of the feedback module 240 also provides feedback to the presenter that a given interactive element has been successfully selected. For example, if the presenter has selected a particular interactive element, feedback module 240 may highlight the interactive element, change the color or appearance of the interactive element, or cause the interactive element to blink or flash continually. Such feedback confirms the selection of the interactive element and prevents the presenter from selecting the same interactive element multiple times.
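- A small sketch of how such selectability and selection feedback might be dispatched follows; the FeedbackChannel class is a stand-in for the touch screen's actual capabilities, not an API defined here.

```python
class FeedbackChannel:
    """Stand-in for the touch screen's feedback capabilities (hypothetical API)."""

    def highlight(self, element_id: str) -> None:
        print(f"highlight {element_id}")

    def vibrate(self, milliseconds: int) -> None:
        print(f"vibrate {milliseconds} ms")

    def flash(self, element_id: str) -> None:
        print(f"flash {element_id}")

_selected = set()

def on_element_near_center(element_id, feedback):
    """Tell the presenter an element is selectable, without drawing a cursor."""
    feedback.highlight(element_id)
    feedback.vibrate(30)          # non-visible alternative: a short vibration or tone

def on_element_selected(element_id, feedback):
    """Confirm a selection and guard against selecting the same element twice."""
    if element_id in _selected:
        return False
    _selected.add(element_id)
    feedback.flash(element_id)    # blink, flash, or change color as confirmation
    return True
```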
- Execution of the trip calculation module 250 may allow for the determination or calculation of an estimated amount of time (e.g., ‘trip time’) needed to travel from a selected location to another location. For example, the presenter may select a first interactive element displayed in the virtual broadcast presentation wherein the first interactive element corresponds to a starting point or location. The presenter may then select a second interactive element displayed in the presentation that corresponds to a desired end point or destination location.
- An interactive element or starting/end point may include a particular street, road, landmark or point of interest, highway, neighborhood, town, city, area, region or the like.
- Trip calculation module 250 may calculate the estimated amount of time required to traverse the real world distance from the first selected interactive element to the second interactive element in real-time considering, at least in part, information from information sources 130 .
- When calculating a trip time, trip calculation module 250 may consider the actual distance from the starting point to the end point, as well as various conditions affecting travel, including current weather conditions or traffic conditions such as a recent accident or road closure.
- In another embodiment, trip calculation module 250 may be used to calculate an estimated travel distance between two selected locations. Execution of trip calculation module 250 may occur following the actuation of a ‘mode key’ as discussed further in FIG. 3A below.
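- One simple way to realize such an estimate is to sum per-segment travel times computed from real-time speeds and then add delays for incidents or weather, as in the sketch below; the data layout, function name, and numbers are illustrative only.

```python
def estimate_trip_time_minutes(segments, incident_delays_min=()):
    """Estimate trip time along a route from distance and current speeds.

    segments is an iterable of (length_miles, current_speed_mph) pairs built
    from real-time flow data; incident_delays_min holds extra minutes for
    accidents, closures, weather, and similar conditions.
    """
    minutes = 0.0
    for length_miles, speed_mph in segments:
        speed_mph = max(speed_mph, 1.0)          # guard against stopped traffic
        minutes += (length_miles / speed_mph) * 60.0
    return minutes + sum(incident_delays_min)

# Example: 12 miles at 55 mph, 5 miles at 20 mph, plus a 6 minute incident delay.
print(round(estimate_trip_time_minutes([(12, 55), (5, 20)], [6])))  # about 34
```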
- Execution of the virtual broadcast presentation engine 120 may output the virtual broadcast presentation to other components of the computing device 110 for generation of the composite presentation. Accordingly, the computing device 110 may output the composite presentation to the broadcast system 150 for dissemination to viewers.
- FIG. 3A illustrates a virtual broadcast presentation 300 displayed on a touch screen 140 .
- the presentation 300 of FIG. 3A includes traffic information.
- the principles described herein with respect to traffic are equally applicable to embodiments of the present invention that include weather information, newsworthy events or incidents, school closings, election results, or other information that may be featured on a virtual broadcast presentation.
- Presentation 300 may be generated and manipulated by execution of the presentation rendering module 220 in real-time.
- Presentation 300 may include satellite images of a given area with an animated road traffic report. A detailed description of animated road traffic reports may be found in U.S. patent application Ser. No. 11/302,418, now U.S. Pat. No. 7,221,287, the disclosure of which is incorporated by reference.
- Satellite images may be manipulated by execution of the presentation rendering module 220 to aid in generating three-dimensional information.
- Two-dimensional satellite images may be processed in the context of other geographical information (e.g., topographical information) to generate a three-dimensional satellite image that reflects information along an x-, y-, and z-axis as illustrated in presentation 300.
- The textured three-dimensional representation of the landscape of a particular urban area aligns with and provides the three-dimensional coordinates for the roadways that may be animated and overlaid on the satellite images.
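- The sketch below shows, in simplified form, how a flat satellite image might be draped over elevation data to produce three-dimensional coordinates of this kind; the grid representation and elevation lookup are assumptions, not the actual rendering pipeline.

```python
def drape_image_on_terrain(width, height, elevation_at, scale=1.0):
    """Turn a flat image grid into 3-D vertices using topographical data.

    elevation_at(col, row) is a hypothetical lookup into elevation data; each
    pixel becomes an (x, y, z) vertex so that roadways overlaid on the image
    can share the same three-dimensional coordinates.
    """
    vertices = []
    for row in range(height):
        for col in range(width):
            vertices.append((col * scale, row * scale, elevation_at(col, row)))
    return vertices

# Toy example: a 3x3 image draped over a gentle slope.
verts = drape_image_on_terrain(3, 3, elevation_at=lambda c, r: 10.0 * r)
print(len(verts), verts[-1])  # 9 vertices; last one has z = 20.0
```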
- the presentation 300 may also include a variety of markers ( 310 A- 310 C) to identify or label various locations, landmarks, or points of interest appearing in presentation 300 such as exit ramps, highways, named sections of highways, or city streets. These markers may be readily or universally recognizable, such as a highway marker resembling a California state highway sign with the appropriate highway number. Presentation 300 may also include markers or icons corresponding to the location of traffic incidents, road construction, and traffic cameras. Some or all of these markers 310 C may be interactive elements of the virtual broadcast presentation 300 and show real-time conditions, such as an average traffic speed associated with a particular location. An interactive element may include any marker, icon, label, object, or image appearing in presentation 300 that may be associated with real-time content or data. An interactive element, for example, may include a street, road, bridge, highway, landmark, point of interest, traffic incident or alert, road construction, or traffic camera.
- a presenter 305 may select an interactive element using touch screen 140 .
- FIG. 4 illustrates an interactive element appearing in a virtual broadcast presentation 300 displayed on touch screen 140 .
- When an interactive element 410 (i.e., a traffic incident) is selected, additional information related to that interactive element may be displayed.
- For example, an interactive element marking a traffic incident may be selected, resulting in detailed textual information describing the traffic incident being displayed within presentation 300 (not shown).
- presentation 300 may include images of vehicles 315 appearing along a specific roadway or highway.
- a vehicle 315 may be animated, for example, to show the speed and direction of traffic along a particular highway.
- Presentation 300 may also use color coding to demonstrate real-time traffic conditions. Color coding may help a viewer of the presentation 300 to quickly understand real-time traffic conditions associated with a depicted map or location.
- Presentation 300 may include a legend 320 describing various objects or color representations used in presentation 300 .
- A ‘green’ colored section of a road, street, or highway, for example, may represent that real-time traffic is moving at a speed of 50 miles per hour or higher (e.g., normal or optimal conditions).
- A ‘yellow’ colored highway may represent traffic speeds of 25 miles per hour or higher (e.g., delayed conditions), while a ‘red’ colored highway may represent traffic speeds that are less than 25 miles per hour (e.g., slow or impacted conditions).
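- The speed thresholds above translate directly into a small mapping function, sketched here for illustration only.

```python
def traffic_color(speed_mph: float) -> str:
    """Map a real-time average speed to the legend colors described above."""
    if speed_mph >= 50:
        return "green"    # normal or optimal conditions
    if speed_mph >= 25:
        return "yellow"   # delayed conditions
    return "red"          # slow or impacted conditions

assert traffic_color(62) == "green"
assert traffic_color(30) == "yellow"
assert traffic_color(12) == "red"
```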
- the presentation 300 may also display one or more soft keys with various functionalities such as orientation key 325 , tilt key 330 , rotation key 335 , synchronization key 340 , previous and next presentation display keys 345 A- 345 B, and mode key 350 .
- Presenter 305 may actuate a soft key to facilitate or enhance the understanding of the content of presentation 300 .
- presenter 305 may use tilt key 330 to adjust or modify a view or perspective of presentation 300 vertically or horizontally.
- the presenter 305 may also change the perspective of presentation 300 by actuating rotation key 335 . Changing the perspective of presentation 300 may alter the orientation of the presentation such that a ‘north’ direction of a map or image is not oriented at the top of touch screen 140 .
- presenter 305 may actuate orientation key 325 to return the ‘north’ direction to the top of touch screen 140 .
- presenter 305 may touch a soft key with one finger or hand (e.g., tilt key 330 or rotation key 335 ) while using the other hand to activate the functionality of the soft key (e.g., move or adjust the touch screen in the desired direction).
- the presentation 300 may also include a synchronization key 340 .
- Presentation 300 may be generated based on information received in real-time or near real-time through execution of communications module 210 .
- Presenter 305 may actuate synchronization key 340 to cause the synchronization of data in real time such that presentation 300 reflects the most current information and conditions.
- Synchronization of data may be done automatically.
- Presenter 305 or another user may program or instruct computing device 110 to synchronize data at regular time periods (e.g., every 10 seconds, every minute, every two minutes, etc.).
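- A minimal sketch of such automatic synchronization on a fixed period follows; the sync_fn callback stands in for whatever re-fetches data from the information sources and refreshes the presentation, the same work the synchronization key triggers manually.

```python
import threading

def start_auto_sync(sync_fn, period_seconds=10.0):
    """Call sync_fn every period_seconds until the returned event is set."""
    stop = threading.Event()

    def loop():
        # Event.wait returns False on timeout, so the body runs once per period.
        while not stop.wait(period_seconds):
            sync_fn()

    threading.Thread(target=loop, daemon=True).start()
    return stop

# Usage: stop_event = start_auto_sync(lambda: print("synchronizing presentation"))
# ...later: stop_event.set()
```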
- a presenter 305 may zoom in or out of presentation 300 by actuating keys corresponding to a particular view of the presentation, such as a previous key 345 A and a next key 345 B.
- previous key 345 A may revert to a presentation that offers a zoom out view while next key 345 B may allow a view that zooms in from the current view.
- Previous and next keys may, alternatively, be assigned zoom in or zoom out functionality.
- Presenter 305 may actuate a particular key ( 345 A, 345 B) multiple times to further zoom in or out of the current view.
- the previous key 345 A and next key 345 B may be used to display or shift to a different image or map within presentation 300 .
- the presentation 300 may also include mode key 350 .
- Presenter 305 may operate presentation 300 in different modes such as ‘trip time mode’ or ‘navigation mode.’
- Presenter 305 may switch between various modes by actuating mode key 350 .
- Presenter 305 may use navigation mode to view presentation 300 as described in FIG. 3B below.
- Trip time mode is discussed in further detail in FIGS. 6A and 6B below.
- FIG. 3B illustrates the virtual broadcast presentation 300 of FIG. 3A following manipulation by presenter 305 .
- Presenter 305 may manipulate presentation 300 in navigation mode to review or illustrate real-time traffic conditions (e.g., average traffic speeds, traffic incidents, etc.) associated with various locations depicted in presentation 300 .
- a view of presentation 300 may be manipulated to give the effect of ‘flying,’ or scrolling through the three-dimensional virtual representation of the traffic map and images.
- FIG. 3B illustrates presentation 300 of FIG. 3A following presenter 305 touching touch screen 140 and scrolling through presentation 300 .
- presentation 300 shows a magnified portion of presentation 300 (i.e., the intersection of highways 287 and 107 ) and the associated traffic conditions (e.g., traffic speeds).
- Presenter 305 may interact with presentation 300 using other interaction techniques known in the art.
- FIG. 5 illustrates the interaction technique of “pinching” (e.g., zooming in or out of presentation 300 ) used with virtual broadcast presentation 300 displayed on touch screen 140 .
- presenter 305 may interact with presentation 300 by touching touch screen 140 and bringing two fingers closer together (on one hand or with two). Such motion may cause the view associated with presentation 300 to zoom out of the current viewpoint.
- presenter 305 may also manipulate presentation 300 by panning, tilting, and/or rotating the view. For example, as presenter 305 touches touch screen 140 to scroll through presentation 300 , touch screen 140 generates a corresponding signal that is received in conjunction with execution of the communications module 210 . In turn, the presentation rendering module 220 may be executed to move or rotate the presentation 300 a corresponding amount as presenter 305 manipulated the touch screen 140 . The correspondence of the presentation 300 to manipulation of the touch screen 140 gives the presenter 305 the sensation of directly controlling the presentation 300 . Such manipulation of the view may also be used in selecting interactive elements. For example, if a particular interactive element may be selected only when near the center of the presentation 300 , the presenter may cause the view to be manipulated such that the particular interactive element is centered and therefore selectable.
- FIGS. 6A-6B illustrate a virtual broadcast presentation in ‘trip time mode.’
- Presenter 305 may activate trip time mode by actuating mode key 350. Once trip time mode has been activated, presenter 305 may select an interactive element corresponding to a first location or starting point by touching the interactive element within presentation 300 displayed on touch screen 140. As shown in FIG. 6A, presenter 305 has selected or designated “83rd Ave” as a starting point. Following selection of the first location, display 355A may appear confirming the selection of presenter 305.
- Presenter 305 may then select another interactive element corresponding to a second location or end point or travel destination by touching a second interactive element within presentation 300 displayed on touch screen 140 . As shown in FIG. 6B , presenter 305 has selected or designated “1st Ave” as an end point.
- Trip calculation module 250 may calculate the estimated amount of time required to traverse the real world distance from the first selected interactive element (i.e., “83rd Ave”) to the second interactive element (i.e., “1st Ave”) in real-time considering, at least in part, information from information sources 130. For example, trip calculation module 250 may consider various conditions affecting travel such as weather conditions or traffic conditions such as a recent accident, a road closure, or any other delay.
- Display 355 B may then display the estimated trip time (i.e., “28 minutes”), as well as any condition affecting travel such as weather conditions or a traffic delay, within presentation 300 on touch screen 140 .
- Display 355 B may also show the route (i.e., highway “25”) associated with the calculated trip time.
- Trip calculation module 250 may calculate or forecast the estimated trip time based on a time of day and/or date (i.e., a special day or occasion) designated by presenter 305.
- For example, presenter 305 may want to determine the estimated trip time at 9:00 AM (e.g., morning rush hour) or at 8:00 PM (e.g., a later evening hour).
- Similarly, presenter 305 may want to determine the estimated trip time when departing at a particular time on the Labor Day holiday or on a date when a sporting event, concert, or other large gathering is scheduled at a venue.
- In trip time mode, presenter 305 may input the desired time of day and/or date and select a starting point and end point for trip time calculation.
- Trip time mode may also be used to calculate an estimated travel distance between two selected locations (not shown). The calculated estimated travel distance may also be displayed within presentation 300.
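- The two-tap interaction described above can be summarized as a small state machine, sketched below with invented names and a stubbed calculator; it is an illustration of the flow, not the patent's implementation.

```python
class TripTimeMode:
    """Minimal state machine for the two-tap trip time interaction (names assumed)."""

    def __init__(self, calculate_trip_time):
        self._calc = calculate_trip_time   # e.g. wraps the trip calculation module
        self.start = None
        self.end = None

    def on_element_tapped(self, element_id, depart_at=None):
        """First tap sets the starting point, second tap sets the end point."""
        if self.start is None:
            self.start = element_id
            return f"Starting point: {element_id}"
        if self.end is None:
            self.end = element_id
            minutes, route = self._calc(self.start, self.end, depart_at)
            return f"{self.start} -> {self.end} via {route}: about {minutes} minutes"
        return "Trip already calculated; press the mode key to reset."

# Usage with a stubbed calculator resembling the example result of FIG. 6B:
mode = TripTimeMode(lambda start, end, depart_at: (28, "highway 25"))
mode.on_element_tapped("83rd Ave")
print(mode.on_element_tapped("1st Ave"))  # 83rd Ave -> 1st Ave via highway 25: about 28 minutes
```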
- FIGS. 7A-7B illustrate a traffic camera appearing within virtual broadcast presentation 300 displayed on touch screen 140 .
- an interactive element appearing in presentation 300 may include a traffic camera ( 710 A, 710 B).
- Presenter 305 may select traffic camera 710 A by touching the traffic camera 710 A within presentation 300 displayed on touch screen 140 (as shown in FIG. 7A ).
- a live video feed 720 corresponding to the location of a real-world traffic camera may be displayed within presentation 300 (as shown in FIG. 7B ).
- Presenter 305 may then use live video 720 feed to view actual traffic conditions associated with the real world location of traffic camera 710 A.
- FIG. 8 is a flow chart illustrating a method 800 for touch screen interaction with traffic data presented in a virtual broadcast presentation.
- the steps of method 800 may be performed in varying orders. Steps may be added or subtracted from the method 800 and still fall within the scope of the present invention.
- the steps of the process of FIG. 8 may be embodied in hardware or software including a non-transitory computer-readable storage medium comprising instructions executable by a processor of a computing device.
- At step 810, a real-time virtual broadcast presentation 300 is generated.
- The presentation 300 may be based on dynamic information and may be two-dimensional or three-dimensional.
- Execution of the presentation rendering module 220 may perform step 810.
- The dynamic information may include real-time traffic information or real-time weather information and be received from the information sources 130 in conjunction with execution of the communications module 210.
- At step 820, a signal generated by touch screen 140 may be received.
- The signal generated by touch screen 140 may be based at least partially on the selection by a presenter of an interactive element displayed within presentation 300 on touch screen 140.
- The signal may also be based on the actuation of other components included in the touch screen 140 such as soft keys.
- Step 820 may be performed by execution of the communications module 210 . Receipt of the signal in step 820 allows for processing of presentation 300 at step 830 .
- At step 830, presentation 300 is processed in response to the signal received at step 820.
- Execution of the presentation rendering module 220 may perform step 830 .
- Presentation 300 may be processed, for example, to allow for real-time manipulation of presentation 300 and various views thereof such as zooming in and out, scrolling, panning across, tilting, or rotating presentation 300 .
- Presentation 300 may also be processed based on the actuation of a particular soft key displayed within presentation 300 on touch screen 140 .
- At step 840, presentation 300 is updated in response to the processed signal from step 830.
- Execution of the presentation rendering module 220 may perform step 840 .
- presentation 300 may be updated to show a manipulated viewpoint desired by presenter 305 (e.g., rotated or tilted presentation).
- Presentation 300 may also be updated to show presentation 300 in a particular mode such as ‘navigation mode’ or ‘trip time mode.’
- Presentation 300 may also be updated to display information associated with an interactive element selected by presenter 305 , such as information regarding a traffic incident, road closure, or average travel speeds.
- Any number of additional and/or optional steps that are not otherwise depicted may be included in method 800. These steps may include selection of an interactive element included in the virtual broadcast presentation using touch screen 140 or feedback being provided to the presenter to inform the presenter that an interactive element is selectable.
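- Putting the steps of method 800 together, the sketch below shows one way the generate, receive, process, and update cycle might look in code; the signal formats, field names, and helper functions are invented for illustration and are not the patent's modules.

```python
def generate_presentation(traffic_data):
    """Step 810: build a presentation structure from real-time traffic data."""
    return {"view": {"zoom": 1.0, "rotation": 0.0}, "data": traffic_data, "mode": "navigation"}

def process_signal(presentation, signal):
    """Step 830: interpret a touch screen signal against the current presentation."""
    if signal["type"] == "pinch":
        return {"op": "zoom", "factor": signal["factor"]}
    if signal["type"] == "soft_key" and signal["key"] == "mode":
        return {"op": "toggle_mode"}
    return {"op": "none"}

def update_presentation(presentation, action):
    """Step 840: apply the processed signal to the presentation."""
    if action["op"] == "zoom":
        presentation["view"]["zoom"] *= action["factor"]
    elif action["op"] == "toggle_mode":
        presentation["mode"] = "trip_time" if presentation["mode"] == "navigation" else "navigation"
    return presentation

# Step 820 would receive signals like the one below from the touch screen driver.
p = generate_presentation({"US-101": {"speed_mph": 42}})
p = update_presentation(p, process_signal(p, {"type": "pinch", "factor": 1.5}))
print(p["view"]["zoom"], p["mode"])  # 1.5 navigation
```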
- Computer-readable storage media refer to any medium or media that participate in providing instructions to a central processing unit (CPU) for execution. Such media can take forms, including, but not limited to, non-volatile and volatile media such as optical or magnetic disks and dynamic memory, respectively. Common forms of computer-readable storage media include a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM disk, digital video disk (DVD), any other optical medium, RAM, PROM, EPROM, a FLASHEPROM, or any other memory chip or cartridge.
- a bus may carry data to system RAM, from which a CPU retrieves and executes the instructions.
- the instructions received by system RAM may optionally be stored on a fixed disk either before or after execution by a CPU.
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Remote Sensing (AREA)
- General Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Analytical Chemistry (AREA)
- Chemical & Material Sciences (AREA)
- Human Computer Interaction (AREA)
- Automation & Control Theory (AREA)
- Databases & Information Systems (AREA)
- Atmospheric Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Ecology (AREA)
- Environmental Sciences (AREA)
- Biodiversity & Conservation Biology (AREA)
- Environmental & Geological Engineering (AREA)
- Multimedia (AREA)
- Data Mining & Analysis (AREA)
- Navigation (AREA)
- Controls And Circuits For Display Device (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Circuits Of Receivers In General (AREA)
- User Interface Of Digital Computer (AREA)
Description
- The present application is a continuation and claims the priority benefit of U.S. application Ser. No. 12/860,700 filed Aug. 20, 2010, which is a continuation-in-part and claims the priority benefit of U.S. patent application Ser. No. 12/398,120 filed Mar. 4, 2009, the disclosures of which are incorporated herein by reference.
- Existing broadcast presentations generally include a variety of maps, images, and animations that display current or forecasted conditions for reference by a presenter (i.e., news reporter) during a broadcast presentation such as a traffic or weather report. The broadcast presentation is often produced prior to a scheduled broadcast for presentation by a traffic or weather reporter in a fixed arrangement (much like a slide show) with a prerehearsed script. Although the presenter has the ability to control the speed and manner in which the broadcast presentation is presented to a viewing audience, the content in the maps and images remains fixed. That is, the content presented during the broadcast presentation is not in real-time and is outdated. The reporting of outdated information (e.g., traffic or weather information) may have a drastic effect on a viewing audience who may rely on the reported information to make decisions about such things as travel or logistics.
- Another shortcoming of existing broadcast technology is the lack of interaction with the content of the virtual broadcast presentation. Since the presentation contains pre-determined content, a presenter is unable to directly interact with or manipulate the maps and images of the presentation. The presenter cannot, for example, retrieve real-time conditions or other information associated with the maps or images of the presentation.
- As such, there is a need in the art for touch screen based interaction with traffic data and other related data.
- Embodiments of the present invention allow a presenter to interact with traffic data and other related data in real-time using a touch screen or other suitable display.
- In a first claimed embodiment, a method for touch screen based interaction with traffic data is claimed. Through the method, a virtual broadcast presentation is generated based on traffic data received from one or more information sources. A signal based on user interaction with a touch screen is generated and received. User interaction may include the selection of an interactive element included in the virtual broadcast presentation. The signal generated by the touch screen is then processed and the virtual broadcast presentation is updated in response to the processed signal.
- In a second claimed embodiment, a system for touch screen based interaction with traffic data is claimed. The system includes at least a communications module and a presentation rendering module, each module stored in memory and executable by a processor. Execution of the communications module by the processor receives a signal generated by the touch screen. The signal may be based on user interaction with the touch screen, wherein the user interaction includes selection of an interactive element included in a virtual broadcast presentation. Execution of the presentation rendering module by the processor generates the virtual broadcast presentation based on traffic data received from one or more information sources, processes the signal generated by the touch screen, and updates the virtual broadcast presentation in response to the processed signal.
- In a third claimed embodiment, a non-transitory computer-readable storage medium is claimed. The storage medium includes a computer program that is executable by a processor to perform a method for touch screen based interaction with traffic data. A virtual broadcast presentation is generated based on traffic data received from one or more information sources. A signal based on user interaction with a touch screen is generated and received. User interaction may include the selection of an interactive element included in the virtual broadcast presentation. The signal generated by the touch screen is then processed and the virtual broadcast presentation is updated in response to the processed signal.
-
FIG. 1 illustrates a block diagram of an environment for the broadcast of a virtual broadcast presentation that a user may interact with and reference in real-time -
FIG. 2 illustrates the virtual broadcast presentation engine ofFIG. 1 . -
FIGS. 3A-3B illustrate a virtual broadcast presentation displayed on a touch screen. -
FIG. 4 illustrates an interactive element appearing in a virtual broadcast presentation. -
FIG. 5 illustrates the interaction technique of “pinching” used with a virtual broadcast presentation. -
FIGS. 6A-6B illustrate a virtual broadcast presentation in ‘trip time’ mode. -
FIGS. 7A-7B illustrate a traffic camera appearing within a virtual broadcast presentation. -
FIG. 8 is a flowchart illustrating a method for touch screen based interaction with traffic data presented in a virtual broadcast presentation. - The present invention provides for the use of a touch screen to interact with traffic information and other related data during a virtual broadcast presentation. The virtual broadcast presentation may include maps, images, graphics, animations, multimedia overlays, and the like, that are rendered in a two-dimensional or three-dimensional manner on a display such as a touch screen. A presenter may refer to the presentation in real-time and may manipulate a view of the virtual broadcast presentation using the touch screen. The presenter may also use the touch screen to select an interactive element included in the broadcast presentation.
-
FIG. 1 illustrates a block diagram of an environment for the broadcast of a virtual broadcast presentation that a user may interact with and reference in real-time. Theenvironment 100 ofFIG. 1 includes acomputing device 110 having a virtualbroadcast presentation engine 120. Thecomputing device 110 ofFIG. 1 is communicatively coupled toinformation sources 130, atouch screen 140, and abroadcast system 150. WhileFIG. 1 illustrates oneparticular environment 100 including certain elements for the broadcast of a virtual presentation, alternative embodiments may be implemented that utilize differing elements than those disclosed inFIG. 1 (or combinations of the same), but that otherwise fall within the scope and spirit of the present invention. - The
computing device 110 and the virtualbroadcast presentation engine 120 may generate a composite presentation that includes a virtual broadcast presentation. The virtual broadcast presentation may be two-dimensional or three-dimensional. The composite presentation may be generated using information obtained in real-time (or near real-time) from theinformation sources 130 as described in further detail below. The virtualbroadcast presentation engine 120, in particular, is discussed with respect toFIG. 2 . Thecomputing device 110 may include various components such as one or more of communications interfaces, a processor, memory, storage, and any number of buses providing communication therebetween (not depicted). The processor may execute instructions implemented through computing modules or engines while the memory and storage may both permanently or temporarily store data including the aforementioned modules and engines. - The
information sources 130 may be provided by various organizations and in a variety of forms.Information sources 130 may include data sources related to traffic data such as traffic flow and as described in U.S. patent application Ser. No. 11/302,418, now U.S. Pat. No. 7,221,287, or weather data such as forecasts.Information sources 130 may also include data sources related to newsworthy events or incidents, school closings, election results, and other information that may be featured in a virtual broadcast presentation.Information sources 130 may require subscription or authentication for access and may be accessible via Telnet, FTP, or web services protocols. Information may be received frominformation sources 130 in real-time or near real-time to allow for generation of an equally real-time or near real-time presentation. That presentation may, in turn, be manipulated in real-time. - In an embodiment of the present invention utilizing traffic data specific to the San Francisco Bay area,
information sources 130 may include one or more of the 511.org system (a collaboration of public agencies including the California Highway Patrol, Metropolitan Transportation Commission, and CALTRANS), the California Highway Patrol (CHP) World Wide Web server, the PeMS system at the University of California at Berkeley, various public event listings, or a publicly or privately accessible user input mechanism. For weather data, theinformation sources 130 may include the National Weather Service among other weather information sources. Other data sources or alternative types of data sources (e.g., non-traffic and non-weather related sources) may be incorporated and utilized in various embodiments of the present invention. -
Touch screen 140 may be any multi-touch touch screen known in the art capable of recognizing complex gestures.Touch screen 140 may employ any touch screen technology known in the art including but not limited to resistive technology, surface acoustic wave technology, capacitive sensing (e.g., surface capacitance, projected capacitance, mutual capacitance, self capacitance), infrared, force panel technology, optical imaging, dispersive signal technology, acoustic pulse recognition, and the like. A presenter may interact withtouch screen 140 using any interaction technique known in the art such as touch-drag motions, “pinching,” (e.g., zooming in or out of a web page or photo by touching the user interface and either spreading two fingers apart or bringing two fingers close together), scrolling (e.g. sliding a finger up and down or left and right to scroll through a page), or other user-centered interactive effects (e.g. horizontally sliding sub-section, bookmarks menu, menu bars, and a “back” button).Touch screen 140 may include various sensors such as a light sensor for adjusting touch screen brightness.Touch screen 140 may also include a tilt sensor, an accelerometer, a gyroscopic component, and/or magnetometer for sensing orientation oftouch screen 140 and/or switching between landscape and portrait modes.Touch screen 140 may be a liquid crystal display or any other suitable display. - The
broadcast system 150 disseminates the composite presentation to viewers. Dissemination may occur via radio waves such as UHF or VHF, cable, satellite, or the World Wide Web. Hardware and software necessary to effectuate a broadcast may be included in thebroadcast system 150 and are generally known to those skilled in the broadcasting art. -
FIG. 2 illustrates the virtual broadcast presentation engine ofFIG. 1 . The virtualbroadcast presentation engine 120 ofFIG. 2 includes acommunications module 210, apresentation rendering module 220, aselection module 230, afeedback module 240, and atrip calculation module 250. The virtualbroadcast presentation engine 120 and its constituent modules may be stored in memory and executed by a processing device to effectuate the functionality corresponding thereto. The virtualbroadcast presentation engine 120 may be composed of more or less modules (or combinations of the same) and still fall within the scope of the present invention. For example, the functionality of theselection module 230 and the functionality of thefeedback module 240 may be combined into a single module. - Execution of the
communications module 210 allows for receipt of a signal generated bytouch screen 140, which may be based at least partially on a user selection such as the selection by a presenter of an interactive element displayed within the virtual broadcast presentation. The signal may additionally be based on—in part or in whole—the actuation of other components included within the virtual broadcast presentation such as various soft keys with different functionalities. - In addition to the signal generated by
touch screen 140, execution of thecommunications module 210 may also allow for receipt of dynamic information frominformation sources 130. This dynamic information may be used by other modules for generating, manipulating, and interacting with the virtual broadcast presentation. - Referring again to
FIG. 2 , execution of thepresentation rendering module 220 allows for the generation of a virtual broadcast presentation based on the dynamic information received through execution of thecommunications module 210. The dynamic information may include traffic information, weather information, newsworthy events or incidents, election results, school closings, or other information that may be featured in a virtual broadcast presentation. - Execution of the
presentation rendering module 220 may also allow for manipulation of a view of the virtual broadcast presentation in response to the signal received by thecommunications module 210 fromtouch screen 140. Manipulating the view of the presentation may include one or more of panning across, rotating, tilting, or zooming in/out of the virtual broadcast presentation. Signals corresponding to various motions oftouch screen 140 may be assigned to various other manipulations of the virtual broadcast presentation. For example, touchingtouch screen 140 with one finger and moving the finger upwards may adjust the view and scroll upwards along the map or presentation. As another example, actuation of a soft key displayed within the virtual broadcast presentation may affect zoom speed, whereas actuation of a different soft key may affect zoom direction. - Execution of the
selection module 230 allows for selection of an interactive element included in the virtual broadcast presentation in response to the received signal. An interactive element may include a soft key displayed within the virtual broadcast presentation. The interactive element may also represent a traffic alert. For example, if road construction is taking place at a given intersection of two streets, an icon indicative of road construction may be placed in the virtual broadcast presentation at a position that corresponds to that given intersection. Execution of theselection module 230 may also select the interactive element when the interactive element is positioned near the center of the virtual broadcast presentation. - Selecting the interactive element may cause one of a variety of responses from the virtual broadcast presentation. For example, selection of an interactive element may cause additional information related to the interactive element to be displayed within the virtual broadcast presentation. In one embodiment, the interactive element may correspond to a traffic camera wherein selection of the interactive element causes a live camera view to appear within the virtual broadcast presentation.
- Execution of the feedback module 240 provides feedback to the presenter to inform the presenter that a given interactive element is selectable. For example, the interactive element may be selectable only in certain regions of the virtual broadcast presentation, such as the center. When the interactive element enters or leaves the center of the virtual broadcast presentation, the presenter may be informed via feedback. The feedback may include highlighting of the interactive element. To avoid distracting or otherwise undesirable imagery, such as a cursor appearing in the virtual broadcast presentation, non-visible feedback may be invoked instead. Examples of non-visible feedback include a vibration of touch screen 140 or an audible tone.
- Execution of the feedback module 240 also provides feedback to the presenter that a given interactive element has been successfully selected. For example, if the presenter has selected a particular interactive element, feedback module 240 may highlight the interactive element, change the color or appearance of the interactive element, or cause the interactive element to blink or flash continually. Such feedback confirms the selection of the interactive element and prevents the presenter from inadvertently selecting the same interactive element multiple times.
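A combined sketch of both kinds of feedback: signaling that an element has become selectable (optionally by non-visible means) and confirming a successful selection while ignoring repeated touches. The vibrate, tone, and highlight functions are placeholders for whatever the touch screen hardware and renderer actually expose.

```python
# Sketch: feedback when an element becomes selectable and when it is selected.
# vibrate(), play_tone(), and highlight() stand in for device-specific calls.
def vibrate() -> None:
    print("touch screen vibrates briefly")


def play_tone() -> None:
    print("audible tone")


def highlight(label: str, blinking: bool = False) -> None:
    print(f"highlight {label}{' (blinking)' if blinking else ''}")


already_selected = set()  # labels of elements the presenter has already selected


def on_element_entered_center(label: str, visible_feedback: bool = False) -> None:
    """Inform the presenter that an element is now selectable."""
    if visible_feedback:
        highlight(label)
    else:
        vibrate()  # non-visible feedback avoids cursor-like imagery on air; or play_tone()


def on_element_selected(label: str) -> None:
    """Confirm a selection and ignore repeated touches on the same element."""
    if label in already_selected:
        return
    already_selected.add(label)
    highlight(label, blinking=True)
```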
- Execution of the trip calculation module 250 may allow for the determination or calculation of an estimated amount of time (e.g., ‘trip time’) needed to travel from a selected location to another location. For example, the presenter may select a first interactive element displayed in the virtual broadcast presentation, wherein the first interactive element corresponds to a starting point or location. The presenter may then select a second interactive element displayed in the presentation that corresponds to a desired end point or destination location. An interactive element or starting/end point may include a particular street, road, landmark or point of interest, highway, neighborhood, town, city, area, region, or the like. Trip calculation module 250 may calculate the estimated amount of time required to traverse the real-world distance from the first selected interactive element to the second interactive element in real time, considering, at least in part, information from information sources 130. When calculating a trip time, trip calculation module 250 may consider, for example, the actual distance from the starting point to the end point, as well as various conditions affecting travel, including current weather conditions or traffic conditions such as a recent accident or road closure. In another embodiment, trip calculation module 250 may be used to calculate an estimated travel distance between two selected locations. Execution of trip calculation module 250 may occur following the actuation of a ‘mode key’ as discussed further in FIG. 3A below.
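A minimal sketch of a trip-time estimate that combines route distance with current conditions. The condition fields and the way delays are applied are assumptions for illustration; the patent does not specify a formula.

```python
# Sketch: estimate trip time from distance, observed speed, and current conditions.
# The field names and delay handling are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class RouteConditions:
    distance_miles: float
    average_speed_mph: float       # current observed speed along the route
    accident_delay_min: float = 0.0
    closure_detour_miles: float = 0.0
    weather_factor: float = 1.0    # > 1.0 slows travel (rain, snow, fog)


def estimate_trip_minutes(c: RouteConditions) -> float:
    effective_distance = c.distance_miles + c.closure_detour_miles
    drive_minutes = 60.0 * effective_distance / max(c.average_speed_mph, 1.0)
    return drive_minutes * c.weather_factor + c.accident_delay_min


# Example: 18 miles at 42 mph in light rain with a 6-minute incident delay.
print(round(estimate_trip_minutes(
    RouteConditions(18.0, 42.0, accident_delay_min=6.0, weather_factor=1.1))))
```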
- Execution of the virtual broadcast presentation engine 120 may output the virtual broadcast presentation to other components of the computing device 110 for generation of the composite presentation. Accordingly, the computing device 110 may output the composite presentation to the broadcast system 150 for dissemination to viewers.
- FIG. 3A illustrates a virtual broadcast presentation 300 displayed on a touch screen 140. The presentation 300 of FIG. 3A includes traffic information. The principles described herein with respect to traffic are equally applicable to embodiments of the present invention that include weather information, newsworthy events or incidents, school closings, election results, or other information that may be featured in a virtual broadcast presentation. Presentation 300 may be generated and manipulated by execution of the presentation rendering module 220 in real time. Presentation 300 may include satellite images of a given area with an animated road traffic report. A detailed description of animated road traffic reports may be found in U.S. patent application Ser. No. 11/302,418, now U.S. Pat. No. 7,221,287, the disclosure of which is incorporated by reference.
- Satellite images may be manipulated by execution of the presentation rendering module 220 to aid in generating three-dimensional information. For example, two-dimensional satellite images may be processed in the context of other geographical information (e.g., topographical information) to generate a three-dimensional satellite image that reflects information along an x-, y-, and z-axis, as illustrated in presentation 300. The textured three-dimensional representation of the landscape of a particular urban area aligns with and provides the three-dimensional coordinates for the roadways that may be animated and overlaid on the satellite images.
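One hypothetical way to combine a two-dimensional satellite image with topographical data is to drape the image over an elevation grid, giving every pixel an x, y, and z coordinate. The array shapes and the uniform grid spacing below are assumptions for the example.

```python
# Sketch: draping a two-dimensional satellite image over an elevation grid so that
# every pixel gains x, y, and z coordinates for the landscape and overlaid roadways.
# Array shapes and the uniform 30 m grid spacing are assumptions for illustration.
import numpy as np


def drape_image(elevation: np.ndarray, cell_size_m: float) -> np.ndarray:
    """Return an (H, W, 3) array of world coordinates, one triple per image pixel."""
    h, w = elevation.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float) * cell_size_m
    return np.dstack([xs, ys, elevation])


coords = drape_image(np.random.rand(4, 4) * 50.0, cell_size_m=30.0)
print(coords.shape)  # (4, 4, 3): x, y, and z for every pixel
```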
- The presentation 300 may also include a variety of markers (310A-310C) to identify or label various locations, landmarks, or points of interest appearing in presentation 300, such as exit ramps, highways, named sections of highways, or city streets. These markers may be readily or universally recognizable, such as a highway marker resembling a California state highway sign with the appropriate highway number. Presentation 300 may also include markers or icons corresponding to the location of traffic incidents, road construction, and traffic cameras. Some or all of these markers 310C may be interactive elements of the virtual broadcast presentation 300 and show real-time conditions, such as an average traffic speed associated with a particular location. An interactive element may include any marker, icon, label, object, or image appearing in presentation 300 that may be associated with real-time content or data. An interactive element, for example, may include a street, road, bridge, highway, landmark, point of interest, traffic incident or alert, road construction, or traffic camera.
- A presenter 305 may select an interactive element using touch screen 140. FIG. 4 illustrates an interactive element appearing in a virtual broadcast presentation 300 displayed on touch screen 140. In one embodiment, an interactive element 410 (i.e., a traffic incident) may be marked by a particular icon, image, or symbol (e.g., an arrow pointing to the location of the traffic incident), as shown in FIG. 4. When an interactive element is selected, additional information related to that interactive element may be displayed. In one embodiment, selecting an interactive element marking a traffic incident may result in detailed textual information describing the traffic incident being displayed within presentation 300 (not shown).
- Returning to FIG. 3A, presentation 300 may include images of vehicles 315 appearing along a specific roadway or highway. A vehicle 315 may be animated, for example, to show the speed and direction of traffic along a particular highway. Presentation 300 may also use color coding to demonstrate real-time traffic conditions. Color coding may help a viewer of the presentation 300 quickly understand real-time traffic conditions associated with a depicted map or location. Presentation 300 may include a legend 320 describing various objects or color representations used in presentation 300. A ‘green’ colored section of a road, street, or highway, for example, may indicate that real-time traffic is moving at a speed of 50 miles per hour or higher (e.g., normal or optimal conditions). A ‘yellow’ colored highway may represent traffic speeds of 25 miles per hour or higher but below 50 miles per hour (e.g., delayed conditions), while a ‘red’ colored highway may represent traffic speeds that are less than 25 miles per hour (e.g., slow or impacted conditions).
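The color coding described above maps directly to a small threshold function; the 50 and 25 mile-per-hour breakpoints follow the legend in the preceding paragraph.

```python
# The color thresholds follow the legend described above: green at 50 mph and
# faster, yellow from 25 to just under 50 mph, red below 25 mph.
def speed_to_color(speed_mph: float) -> str:
    if speed_mph >= 50:
        return "green"   # normal or optimal conditions
    if speed_mph >= 25:
        return "yellow"  # delayed conditions
    return "red"         # slow or impacted conditions


assert speed_to_color(63) == "green"
assert speed_to_color(31) == "yellow"
assert speed_to_color(12) == "red"
```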
- The presentation 300 may also display one or more soft keys with various functionalities, such as orientation key 325, tilt key 330, rotation key 335, synchronization key 340, previous and next presentation display keys 345A-345B, and mode key 350. Presenter 305 may actuate a soft key to facilitate or enhance the understanding of the content of presentation 300. For example, presenter 305 may use tilt key 330 to adjust or modify a view or perspective of presentation 300 vertically or horizontally. The presenter 305 may also change the perspective of presentation 300 by actuating rotation key 335. Changing the perspective of presentation 300 may alter the orientation of the presentation such that a ‘north’ direction of a map or image is not oriented at the top of touch screen 140. As such, presenter 305 may actuate orientation key 325 to return the ‘north’ direction to the top of touch screen 140. In one embodiment, presenter 305 may touch a soft key (e.g., tilt key 330 or rotation key 335) with one finger or hand while using the other hand to activate the functionality of the soft key (e.g., move or adjust the view on the touch screen in the desired direction).
- The presentation 300 may also include a synchronization key 340. Presentation 300 may be generated based on information received in real time or near real time through execution of communications module 210. Presenter 305 may actuate synchronization key 340 to cause the synchronization of data in real time such that presentation 300 reflects the most current information and conditions. In one embodiment, synchronization of data may be done automatically. In another embodiment, presenter 305 or another user may program or instruct computing device 110 to synchronize data at regular time periods (e.g., every 10 seconds, every minute, every two minutes, etc.).
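A rough sketch of synchronizing data at a configured interval; the timer-based approach and the refresh placeholder are assumptions, since the patent only states that synchronization may be manual, automatic, or periodic.

```python
# Sketch: refreshing presentation data at a configured interval as an alternative
# to actuating the synchronization key manually. refresh is a placeholder callable.
import threading


def schedule_sync(refresh, period_seconds: float = 60.0) -> threading.Timer:
    """Call refresh() now, then schedule the next call period_seconds later."""
    refresh()
    timer = threading.Timer(period_seconds, schedule_sync,
                            args=(refresh, period_seconds))
    timer.daemon = True
    timer.start()
    return timer


schedule_sync(lambda: print("synchronizing traffic and weather data"), 10.0)
```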
- A presenter 305 may zoom in or out of presentation 300 by actuating keys corresponding to a particular view of the presentation, such as a previous key 345A and a next key 345B. For example, previous key 345A may revert to a presentation that offers a zoomed-out view, while next key 345B may provide a view that zooms in from the current view. Previous and next keys may, alternatively, be assigned zoom-in or zoom-out functionality. Presenter 305 may actuate a particular key (345A, 345B) multiple times to further zoom in or out of the current view. In one embodiment, the previous key 345A and next key 345B may be used to display or shift to a different image or map within presentation 300.
- The presentation 300 may also include mode key 350. Presenter 305 may operate presentation 300 in different modes, such as ‘trip time mode’ or ‘navigation mode.’ Presenter 305 may switch between various modes by actuating mode key 350. Presenter 305 may use navigation mode to view presentation 300 as described in FIG. 3B below. Trip time mode is discussed in further detail in FIGS. 6A and 6B below.
- FIG. 3B illustrates the virtual broadcast presentation 300 of FIG. 3A following manipulation by presenter 305. Presenter 305 may manipulate presentation 300 in navigation mode to review or illustrate real-time traffic conditions (e.g., average traffic speeds, traffic incidents, etc.) associated with various locations depicted in presentation 300. A view of presentation 300 may be manipulated to give the effect of ‘flying,’ or scrolling, through the three-dimensional virtual representation of the traffic map and images. As presenter 305 scrolls through presentation 300, various interactive elements may be highlighted and/or become available for selection. FIG. 3B illustrates presentation 300 of FIG. 3A following presenter 305 touching touch screen 140 and scrolling through presentation 300. As a result, presentation 300 (as shown in FIG. 3B) shows a magnified portion of presentation 300 (i.e., the intersection of highways 287 and 107) and the associated traffic conditions (e.g., traffic speeds).
- Presenter 305 may interact with presentation 300 using other interaction techniques known in the art. For example, FIG. 5 illustrates the interaction technique of “pinching” (e.g., zooming in or out of presentation 300) used with virtual broadcast presentation 300 displayed on touch screen 140. As shown in FIG. 5, presenter 305 may interact with presentation 300 by touching touch screen 140 and bringing two fingers closer together (on one hand or with two). Such motion may cause the view associated with presentation 300 to zoom out of the current viewpoint.
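The pinch interaction reduces to comparing the distance between the two touch points before and after the fingers move; a ratio below 1.0 corresponds to the fingers coming together and the view zooming out. The coordinate values below are illustrative.

```python
# Sketch: deriving a zoom factor from a pinch gesture by comparing the distance
# between two touch points before and after the fingers move.
import math


def pinch_zoom_factor(p1_start, p2_start, p1_end, p2_end) -> float:
    """Return > 1.0 when the fingers spread apart, < 1.0 when they pinch together."""
    start = math.dist(p1_start, p2_start)
    end = math.dist(p1_end, p2_end)
    return end / start if start else 1.0


# Fingers moving closer together zooms out of the current viewpoint.
print(pinch_zoom_factor((100, 300), (500, 300), (200, 300), (400, 300)))  # 0.5
```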
- Besides zooming in or out of presentation 300, presenter 305 may also manipulate presentation 300 by panning, tilting, and/or rotating the view. For example, as presenter 305 touches touch screen 140 to scroll through presentation 300, touch screen 140 generates a corresponding signal that is received in conjunction with execution of the communications module 210. In turn, the presentation rendering module 220 may be executed to move or rotate the presentation 300 by an amount corresponding to how presenter 305 manipulated the touch screen 140. The correspondence of the presentation 300 to manipulation of the touch screen 140 gives the presenter 305 the sensation of directly controlling the presentation 300. Such manipulation of the view may also be used in selecting interactive elements. For example, if a particular interactive element may be selected only when near the center of the presentation 300, the presenter may cause the view to be manipulated such that the particular interactive element is centered and therefore selectable.
- FIGS. 6A-6B illustrate a virtual broadcast presentation in ‘trip time mode.’ Presenter 305 may activate trip time mode by actuating mode key 350. Once trip time mode has been activated, presenter 305 may select an interactive element corresponding to a first location or starting point by touching the interactive element within presentation 300 displayed on touch screen 140. As shown in FIG. 6A, presenter 305 has selected or designated “83rd Ave” as a starting point. Following selection of the first location, display 355A may appear, confirming the selection made by presenter 305.
- Presenter 305 may then select another interactive element corresponding to a second location, end point, or travel destination by touching a second interactive element within presentation 300 displayed on touch screen 140. As shown in FIG. 6B, presenter 305 has selected or designated “1st Ave” as an end point. Following selection of the second interactive element, trip calculation module 250 may calculate the estimated amount of time required to traverse the real-world distance from the first selected interactive element (i.e., “83rd Ave”) to the second interactive element (i.e., “1st Ave”) in real time, considering, at least in part, information from information sources 130. For example, trip calculation module 250 may consider various conditions affecting travel, such as weather conditions or traffic conditions such as a recent accident, a road closure, or any other delay. Display 355B may then display the estimated trip time (i.e., “28 minutes”), as well as any condition affecting travel such as weather conditions or a traffic delay, within presentation 300 on touch screen 140. Display 355B may also show the route (i.e., highway “25”) associated with the calculated trip time.
- Besides calculating the estimated trip time in real time, trip calculation module 250 may calculate or forecast the estimated trip time based on a time of day and/or date (i.e., a special day or occasion) designated by presenter 305. For example, presenter 305 may want to determine the estimated trip time at 9:00 AM (e.g., morning rush hour) or at 8:00 PM (e.g., a later evening hour). As another example, presenter 305 may want to determine the estimated trip time when departing at a particular time on the Labor Day holiday or on a date when a sporting event, concert, or other large gathering is scheduled at a venue. In trip time mode, presenter 305 may input the desired time of day and/or date and select a starting point and end point for the trip time calculation. In another embodiment, trip time mode may also be used to calculate an estimated travel distance between two selected locations (not shown). The calculated estimated travel distance may also be displayed within presentation 300.
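A hypothetical sketch of forecasting a trip time for a designated departure time rather than the current moment. The rush-hour multipliers and the special-event adjustment are invented for the example and are not values from the patent.

```python
# Sketch: forecasting a trip time for a chosen departure time rather than "now".
# The rush-hour multipliers and special-event adjustment are invented for the example.
from datetime import datetime


def forecast_trip_minutes(base_minutes: float, depart: datetime,
                          special_event: bool = False) -> float:
    hour = depart.hour
    if 7 <= hour <= 9 or 16 <= hour <= 18:
        factor = 1.4   # weekday rush hours
    elif hour >= 22 or hour <= 5:
        factor = 0.8   # late night, light traffic
    else:
        factor = 1.0
    if special_event:  # holiday, concert, or sporting event near the route
        factor += 0.3
    return base_minutes * factor


# Departing 9:00 AM on Labor Day with a large gathering scheduled nearby.
print(round(forecast_trip_minutes(28.0, datetime(2010, 9, 6, 9, 0), special_event=True)))
```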
- FIGS. 7A-7B illustrate a traffic camera appearing within virtual broadcast presentation 300 displayed on touch screen 140. In one embodiment, an interactive element appearing in presentation 300 may include a traffic camera (710A, 710B). Presenter 305 may select traffic camera 710A by touching the traffic camera 710A within presentation 300 displayed on touch screen 140 (as shown in FIG. 7A). Following selection of traffic camera 710A associated with a particular location, a live video feed 720 corresponding to the location of a real-world traffic camera may be displayed within presentation 300 (as shown in FIG. 7B). Presenter 305 may then use live video feed 720 to view actual traffic conditions associated with the real-world location of traffic camera 710A.
- FIG. 8 is a flow chart illustrating a method 800 for touch screen interaction with traffic data presented in a virtual broadcast presentation. The steps of method 800 may be performed in varying orders. Steps may be added to or subtracted from the method 800 and still fall within the scope of the present invention. The steps of the process of FIG. 8 may be embodied in hardware or software, including a non-transitory computer-readable storage medium comprising instructions executable by a processor of a computing device.
- At step 810, a real-time virtual broadcast presentation 300 is generated. The presentation 300 may be based on dynamic information and may be two-dimensional or three-dimensional. Execution of the presentation rendering module 220 may perform step 810. The dynamic information may include real-time traffic information or real-time weather information and may be received from the information sources 130 in conjunction with execution of the communications module 210.
- At step 820, a signal generated by touch screen 140 may be received. The signal generated by touch screen 140 may be based at least partially on the selection by a presenter of an interactive element displayed within presentation 300 on touch screen 140. The signal may also be based on the actuation of other components included in the touch screen 140, such as soft keys. Step 820 may be performed by execution of the communications module 210. Receipt of the signal in step 820 allows for processing of presentation 300 at step 830.
- At step 830, presentation 300 is processed in response to the signal received at step 820. Execution of the presentation rendering module 220 may perform step 830. Presentation 300 may be processed, for example, to allow for real-time manipulation of presentation 300 and various views thereof, such as zooming in and out, scrolling, panning across, tilting, or rotating presentation 300. Presentation 300 may also be processed based on the actuation of a particular soft key displayed within presentation 300 on touch screen 140.
- At step 840, presentation 300 is updated in response to the processed signal from step 830. Execution of the presentation rendering module 220 may perform step 840. For example, presentation 300 may be updated to show a manipulated viewpoint desired by presenter 305 (e.g., a rotated or tilted presentation). Presentation 300 may also be updated to show presentation 300 in a particular mode, such as ‘navigation mode’ or ‘trip time mode.’ Presentation 300 may also be updated to display information associated with an interactive element selected by presenter 305, such as information regarding a traffic incident, road closure, or average travel speeds.
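The ordering of steps 810 through 840 can be summarized with a small end-to-end sketch; the function bodies are placeholders, and only the sequence of generate, receive, process, and update follows the description above.

```python
# Sketch of the ordering in method 800: generate the presentation (step 810),
# receive a touch screen signal (step 820), process it (step 830), and update
# the presentation (step 840). Function bodies are placeholders only.
def generate_presentation(dynamic_information: dict) -> dict:   # step 810
    return {"view": "default", "data": dynamic_information}


def receive_touch_signal() -> dict:                             # step 820
    return {"gesture": "drag", "dx": 0, "dy": 40}


def process_signal(presentation: dict, signal: dict) -> dict:   # step 830
    signal["handled"] = True
    return signal


def update_presentation(presentation: dict, processed: dict) -> dict:  # step 840
    presentation["view"] = f"adjusted by {processed['gesture']}"
    return presentation


presentation = generate_presentation({"traffic": [], "weather": []})
processed = process_signal(presentation, receive_touch_signal())
presentation = update_presentation(presentation, processed)
print(presentation["view"])
```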
- Any number of additional and/or optional steps that are not otherwise depicted may be included in method 800. These steps may include selection of an interactive element included in the virtual broadcast presentation using touch screen 140, or feedback being provided to the presenter to inform the presenter that an interactive element is selectable.
- It is noteworthy that any hardware platform suitable for performing the processing described herein is suitable for use with the invention. Computer-readable storage media refer to any medium or media that participate in providing instructions to a central processing unit (CPU) for execution. Such media can take many forms, including, but not limited to, non-volatile and volatile media such as optical or magnetic disks and dynamic memory, respectively. Common forms of computer-readable storage media include a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM disk, digital video disk (DVD), any other optical medium, RAM, PROM, EPROM, a FLASH-EPROM, or any other memory chip or cartridge.
- Various forms of transmission media may be involved in carrying one or more sequences of one or more instructions to a CPU for execution. A bus may carry data to system RAM, from which a CPU retrieves and executes the instructions. The instructions received by system RAM may optionally be stored on a fixed disk either before or after execution by a CPU.
- The foregoing detailed description of the technology herein has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the technology to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. The described embodiments were chosen in order to best explain the principles of the technology and its practical application to thereby enable others skilled in the art to best utilize the technology in various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the technology be defined by the claims appended hereto.
Claims (19)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/637,357 US20150177018A1 (en) | 2009-03-04 | 2015-03-03 | Touch screen based interaction with traffic data |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/398,120 US8619072B2 (en) | 2009-03-04 | 2009-03-04 | Controlling a three-dimensional virtual broadcast presentation |
US12/860,700 US8982116B2 (en) | 2009-03-04 | 2010-08-20 | Touch screen based interaction with traffic data |
US14/637,357 US20150177018A1 (en) | 2009-03-04 | 2015-03-03 | Touch screen based interaction with traffic data |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/860,700 Continuation US8982116B2 (en) | 2009-03-04 | 2010-08-20 | Touch screen based interaction with traffic data |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150177018A1 true US20150177018A1 (en) | 2015-06-25 |
Family
ID=45605714
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/860,700 Active 2030-12-03 US8982116B2 (en) | 2009-03-04 | 2010-08-20 | Touch screen based interaction with traffic data |
US14/637,357 Abandoned US20150177018A1 (en) | 2009-03-04 | 2015-03-03 | Touch screen based interaction with traffic data |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/860,700 Active 2030-12-03 US8982116B2 (en) | 2009-03-04 | 2010-08-20 | Touch screen based interaction with traffic data |
Country Status (3)
Country | Link |
---|---|
US (2) | US8982116B2 (en) |
EP (1) | EP2635989A4 (en) |
WO (1) | WO2012024694A2 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150067540A1 (en) * | 2013-09-02 | 2015-03-05 | Samsung Electronics Co., Ltd. | Display apparatus, portable device and screen display methods thereof |
US9127959B2 (en) | 2003-07-25 | 2015-09-08 | Pelmorex Canada Inc. | System and method for delivering departure notifications |
US9293039B2 (en) | 2012-01-27 | 2016-03-22 | Pelmorex Canada Inc. | Estimating time travel distributions on signalized arterials |
US9368029B2 (en) | 2002-03-05 | 2016-06-14 | Pelmorex Canada Inc. | GPS generated traffic information |
US9390620B2 (en) | 2011-05-18 | 2016-07-12 | Pelmorex Canada Inc. | System for providing traffic data and driving efficiency data |
US9448690B2 (en) | 2009-03-04 | 2016-09-20 | Pelmorex Canada Inc. | Controlling a three-dimensional virtual broadcast presentation |
CN108228737A (en) * | 2016-12-13 | 2018-06-29 | 通用电气航空系统有限责任公司 | Travel path and data integrated system based on map |
US10223909B2 (en) | 2012-10-18 | 2019-03-05 | Uber Technologies, Inc. | Estimating time travel distributions on signalized arterials |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8607167B2 (en) * | 2007-01-07 | 2013-12-10 | Apple Inc. | Portable multifunction device, method, and graphical user interface for providing maps and directions |
US8302033B2 (en) * | 2007-06-22 | 2012-10-30 | Apple Inc. | Touch screen device, method, and graphical user interface for providing maps, directions, and location-based information |
US8171432B2 (en) * | 2008-01-06 | 2012-05-01 | Apple Inc. | Touch screen device, method, and graphical user interface for displaying and selecting application options |
US8327272B2 (en) | 2008-01-06 | 2012-12-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars |
US8982116B2 (en) | 2009-03-04 | 2015-03-17 | Pelmorex Canada Inc. | Touch screen based interaction with traffic data |
US9046924B2 (en) | 2009-03-04 | 2015-06-02 | Pelmorex Canada Inc. | Gesture based interaction with traffic data |
US8464182B2 (en) | 2009-06-07 | 2013-06-11 | Apple Inc. | Device, method, and graphical user interface for providing maps, directions, and location-based information |
US8862576B2 (en) | 2010-01-06 | 2014-10-14 | Apple Inc. | Device, method, and graphical user interface for mapping directions between search results |
US8456297B2 (en) * | 2010-01-06 | 2013-06-04 | Apple Inc. | Device, method, and graphical user interface for tracking movement on a map |
WO2012065188A2 (en) | 2010-11-14 | 2012-05-18 | Triangle Software Llc | Crowd sourced traffic reporting |
WO2012154870A2 (en) * | 2011-05-09 | 2012-11-15 | Zoll Medical Corporation | Systems and methods for ems navigation user interface |
US9792017B1 (en) | 2011-07-12 | 2017-10-17 | Domo, Inc. | Automatic creation of drill paths |
US9202297B1 (en) | 2011-07-12 | 2015-12-01 | Domo, Inc. | Dynamic expansion of data visualizations |
US10001898B1 (en) | 2011-07-12 | 2018-06-19 | Domo, Inc. | Automated provisioning of relational information for a summary data visualization |
TW201312452A (en) * | 2011-09-02 | 2013-03-16 | Inventec Corp | Display method and electronic device using the same |
US20130104039A1 (en) * | 2011-10-21 | 2013-04-25 | Sony Ericsson Mobile Communications Ab | System and Method for Operating a User Interface on an Electronic Device |
US8737821B2 (en) | 2012-05-31 | 2014-05-27 | Eric Qing Li | Automatic triggering of a zoomed-in scroll bar for a media program based on user input |
US9686451B2 (en) * | 2015-01-21 | 2017-06-20 | Toyota Jidosha Kabushiki Kaisha | Real time driving difficulty categorization |
JP6193912B2 (en) * | 2015-04-24 | 2017-09-06 | 株式会社パイ・アール | Drive recorder |
CN106095823A (en) * | 2016-05-31 | 2016-11-09 | 青岛海信移动通信技术股份有限公司 | A kind of map-indication method and device |
DE112017007078B4 (en) * | 2017-03-16 | 2024-08-22 | Ford Global Technologies, Llc | VEHICLE EVENT IDENTIFICATION |
SG11201910178SA (en) * | 2017-05-11 | 2019-11-28 | Channelfix Com Llc | Video-tournament platform |
US12022359B2 (en) | 2020-05-18 | 2024-06-25 | Apple Inc. | User interfaces for viewing and refining the current location of an electronic device |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7835858B2 (en) * | 2002-11-22 | 2010-11-16 | Traffic.Com, Inc. | Method of creating a virtual traffic network |
US7847708B1 (en) * | 2005-09-29 | 2010-12-07 | Baron Services, Inc. | System for providing site-specific, real-time environmental condition information to vehicles and related methods |
Family Cites Families (299)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5297049A (en) * | 1982-11-08 | 1994-03-22 | Hailemichael Gurmu | Vehicle guidance system |
US5126941A (en) | 1982-11-08 | 1992-06-30 | Hailemichael Gurmu | Vehicle guidance system |
US5247439A (en) | 1982-11-08 | 1993-09-21 | Hailemichael Gurmu | Vehicle guidance system |
US4796191A (en) * | 1984-06-07 | 1989-01-03 | Etak, Inc. | Vehicle navigational system and method |
US4914605A (en) * | 1984-10-22 | 1990-04-03 | Etak, Inc. | Apparatus and method for displaying a map |
JPH0827811B2 (en) | 1985-02-28 | 1996-03-21 | 株式会社日立製作所 | Transportation planning method and system |
US4734863A (en) * | 1985-03-06 | 1988-03-29 | Etak, Inc. | Apparatus for generating a heading signal for a land vehicle |
CA1277043C (en) | 1985-07-25 | 1990-11-27 | Marvin S. White, Jr. | Apparatus storing a representation of topological structures and methods of building and searching the representation |
US4788645A (en) | 1986-03-21 | 1988-11-29 | Etak, Incorporated | Method and apparatus for measuring relative heading changes in a vehicular onboard navigation system |
US4878170A (en) | 1987-03-17 | 1989-10-31 | Zeevi Eliahu I | Vehicle navigation system |
US4792803A (en) | 1987-06-08 | 1988-12-20 | Madnick Peter A | Traffic monitoring and reporting system |
US5095532A (en) * | 1989-12-29 | 1992-03-10 | Robert Bosch Gmbh | Method and apparatus for route-selective reproduction of broadcast traffic announcements |
US5173691A (en) | 1990-07-26 | 1992-12-22 | Farradyne Systems, Inc. | Data fusion process for an in-vehicle traffic congestion information system |
US5164904A (en) | 1990-07-26 | 1992-11-17 | Farradyne Systems, Inc. | In-vehicle traffic congestion information system |
US5182555A (en) * | 1990-07-26 | 1993-01-26 | Farradyne Systems, Inc. | Cell messaging process for an in-vehicle traffic congestion information system |
US5276785A (en) * | 1990-08-02 | 1994-01-04 | Xerox Corporation | Moving viewpoint with respect to a target in a three-dimensional workspace |
JP2689357B2 (en) | 1990-09-04 | 1997-12-10 | 株式会社ゼクセル | Relative direction detection method |
JP2767315B2 (en) | 1990-09-04 | 1998-06-18 | 株式会社ゼクセル | Vehicle relative heading detection method |
US5220507A (en) | 1990-11-08 | 1993-06-15 | Motorola, Inc. | Land vehicle multiple navigation route apparatus |
US5068656A (en) | 1990-12-21 | 1991-11-26 | Rockwell International Corporation | System and method for monitoring and reporting out-of-route mileage for long haul trucks |
US5845227A (en) | 1991-02-01 | 1998-12-01 | Peterson; Thomas D. | Method and apparatus for providing shortest elapsed time route and tracking information to users |
USRE38724E1 (en) | 1991-02-01 | 2005-04-12 | Peterson Thomas D | Method and apparatus for providing shortest elapsed time route and tracking information to users |
JP2955073B2 (en) | 1991-08-05 | 1999-10-04 | ビステオン・テクノロジーズ,エル・エル・シー | Vehicle navigation system |
US5297028A (en) * | 1991-08-27 | 1994-03-22 | Zexel Corporation Daihatsu-Nissan | Method and apparatus for correcting drift errors in an angular rate sensor |
US5311195A (en) | 1991-08-30 | 1994-05-10 | Etak, Inc. | Combined relative and absolute positioning method and apparatus |
US5515284A (en) | 1991-09-25 | 1996-05-07 | Zexel Corporation | Storage medium for map information for navigation system and system for offering map information for navigation system |
US5283575A (en) * | 1991-11-08 | 1994-02-01 | Zexel Corporation | System and method for locating a travelling vehicle |
US5394333A (en) * | 1991-12-23 | 1995-02-28 | Zexel Usa Corp. | Correcting GPS position in a hybrid naviation system |
US5339246A (en) | 1992-03-17 | 1994-08-16 | Zexel Corporation Diahatsu-Nissan | Apparatus for correcting vehicular compass heading with the aid of the global positioning system |
US5291412A (en) * | 1992-03-24 | 1994-03-01 | Zexel Corporation | Navigation system |
US5262775A (en) | 1992-04-07 | 1993-11-16 | Zexel Corporation | Navigation system with off-route detection and route recalculation |
US5608635A (en) * | 1992-04-14 | 1997-03-04 | Zexel Corporation | Navigation system for a vehicle with route recalculation between multiple locations |
US5303159A (en) * | 1992-04-14 | 1994-04-12 | Zexel Corporation Daihatsu-Nissan | Navigation system with off-route detection and route recalculation |
US5291414A (en) * | 1992-04-14 | 1994-03-01 | Zexel Corporation Diahatsu-Nissan Ikebukuro | Navigation system for guiding a vehicle along a precomputed optimal route |
US5291413A (en) * | 1992-04-14 | 1994-03-01 | Zexel Corporation Daihatsu-Nissan Ikebukuro | Navigation system for guiding a vehicle along a precomputed optimal route |
JPH05313578A (en) | 1992-05-13 | 1993-11-26 | Sumitomo Electric Ind Ltd | Navigation device |
US5345382A (en) | 1992-05-15 | 1994-09-06 | Zexel Corporation | Calibration method for a relative heading sensor |
US5359529A (en) | 1992-05-15 | 1994-10-25 | Zexel Corporation | Route guidance on/off-route state filter |
JP2983111B2 (en) | 1992-07-20 | 1999-11-29 | 富士通株式会社 | Electrical equipment |
US5390123A (en) * | 1992-06-09 | 1995-02-14 | Zexel Corporation Daihatsu-Nissay | Navigation system with accurate determination of angular velocity |
JPH0634384A (en) | 1992-07-16 | 1994-02-08 | Zexel Corp | Vehicular navigation device |
US5465079A (en) | 1992-08-14 | 1995-11-07 | Vorad Safety Systems, Inc. | Method and apparatus for determining driver fitness in real time |
SE470367B (en) | 1992-11-19 | 1994-01-31 | Kjell Olsson | Ways to predict traffic parameters |
US5374933A (en) | 1993-01-05 | 1994-12-20 | Zexel Corporation | Position correction method for vehicle navigation system |
JP2999339B2 (en) | 1993-01-11 | 2000-01-17 | 三菱電機株式会社 | Vehicle route guidance device |
JPH076295A (en) | 1993-06-15 | 1995-01-10 | Aisin Seiki Co Ltd | Mobile station position monitoring system |
US5550538A (en) | 1993-07-14 | 1996-08-27 | Zexel Corporation | Navigation system |
US5583972A (en) | 1993-08-02 | 1996-12-10 | Miller; Richard L. | 3-D weather display and weathercast system |
US5488559A (en) * | 1993-08-02 | 1996-01-30 | Motorola, Inc. | Map-matching with competing sensory positions |
US5402120A (en) * | 1993-08-18 | 1995-03-28 | Zexel Corporation | Navigation system |
US5414630A (en) | 1993-09-29 | 1995-05-09 | Zexel Corporation | Vehicle-mounted navigation system |
US5539645A (en) | 1993-11-19 | 1996-07-23 | Philips Electronics North America Corporation | Traffic monitoring system with reduced communications requirements |
EP0660083B1 (en) | 1993-12-27 | 2000-09-27 | Aisin Aw Co., Ltd. | Vehicular information display system |
US5793310A (en) * | 1994-02-04 | 1998-08-11 | Nissan Motor Co., Ltd. | Portable or vehicular navigating apparatus and method capable of displaying bird's eye view |
SE516278C2 (en) * | 1994-03-04 | 2001-12-10 | Volvo Ab | Traffic information systems and procedures for providing traffic information |
US5751245A (en) | 1994-03-25 | 1998-05-12 | Trimble Navigation Ltd. | Vehicle route and schedule exception reporting system |
US5546107A (en) | 1994-04-05 | 1996-08-13 | Etak, Inc. | Automatic chain-based conflation of digital maps |
US5706503A (en) * | 1994-05-18 | 1998-01-06 | Etak Inc | Method of clustering multi-dimensional related data in a computer database by combining the two verticles of a graph connected by an edge having the highest score |
US5515283A (en) | 1994-06-20 | 1996-05-07 | Zexel Corporation | Method for identifying highway access ramps for route calculation in a vehicle navigation system |
US6321158B1 (en) | 1994-06-24 | 2001-11-20 | Delorme Publishing Company | Integrated routing/mapping information |
JP3099934B2 (en) | 1994-09-08 | 2000-10-16 | 株式会社東芝 | Travel time prediction device |
US5931888A (en) | 1994-09-22 | 1999-08-03 | Aisin Aw Co., Ltd. | Navigation system for vehicles with alternative route searching capabilities |
CA2158500C (en) | 1994-11-04 | 1999-03-30 | Ender Ayanoglu | Navigation system for an automotive vehicle |
US5554845A (en) | 1994-11-14 | 1996-09-10 | Santa Barbara Research Center | Method and apparatus to effectively eliminate optical interference structure in detector response |
US5610821A (en) | 1994-11-18 | 1997-03-11 | Ibm Corporation | Optimal and stable route planning system |
US5485161A (en) * | 1994-11-21 | 1996-01-16 | Trimble Navigation Limited | Vehicle speed control based on GPS/MAP matching of posted speeds |
US5499182A (en) * | 1994-12-07 | 1996-03-12 | Ousborne; Jeffrey | Vehicle driver performance monitoring system |
US5699056A (en) | 1994-12-28 | 1997-12-16 | Omron Corporation | Traffic information system |
US5682525A (en) | 1995-01-11 | 1997-10-28 | Civix Corporation | System and methods for remotely accessing a selected group of items of interest from a database |
US5938720A (en) | 1995-02-09 | 1999-08-17 | Visteon Technologies, Llc | Route generation in a vehicle navigation system |
US5712788A (en) * | 1995-02-09 | 1998-01-27 | Zexel Corporation | Incremental route calculation |
US5532690A (en) | 1995-04-04 | 1996-07-02 | Itt Corporation | Apparatus and method for monitoring and bounding the path of a ground vehicle |
SE9501919L (en) | 1995-05-19 | 1996-07-01 | Dimbis Ab | Detection and prediction of traffic disturbances |
US5731978A (en) * | 1995-06-07 | 1998-03-24 | Zexel Corporation | Method and apparatus for enhancing vehicle navigation through recognition of geographical region types |
DE69628091T2 (en) | 1995-06-13 | 2004-04-01 | Matsushita Electric Industrial Co., Ltd., Kadoma | Vehicle navigation device and recording medium for program storage therefor |
US5862244A (en) * | 1995-07-13 | 1999-01-19 | Motorola, Inc. | Satellite traffic reporting system and methods |
US5911773A (en) | 1995-07-24 | 1999-06-15 | Aisin Aw Co., Ltd. | Navigation system for vehicles |
JP3425276B2 (en) * | 1995-08-11 | 2003-07-14 | 株式会社日立製作所 | Information notification system |
US5898390A (en) * | 1995-09-14 | 1999-04-27 | Zexel Corporation | Method and apparatus for calibration of a distance sensor in a vehicle navigation system |
JPH09114367A (en) | 1995-10-24 | 1997-05-02 | Mitsubishi Electric Corp | On-vehicle traveling controller |
DE19539641C2 (en) | 1995-10-25 | 2000-02-17 | Daimler Chrysler Ag | Method and device for traffic situation-dependent vehicle route guidance |
US5933100A (en) | 1995-12-27 | 1999-08-03 | Mitsubishi Electric Information Technology Center America, Inc. | Automobile navigation system with dynamic traffic data |
US5729458A (en) * | 1995-12-29 | 1998-03-17 | Etak, Inc. | Cost zones |
US5797134A (en) | 1996-01-29 | 1998-08-18 | Progressive Casualty Insurance Company | Motor vehicle monitoring system for determining a cost of insurance |
US5742922A (en) * | 1996-02-12 | 1998-04-21 | Hyundai Motor Company | Vehicle navigation system and method for selecting a route according to fuel consumption |
US5774827A (en) | 1996-04-03 | 1998-06-30 | Motorola Inc. | Commuter route selection system |
DE69730262T2 (en) | 1996-04-16 | 2005-09-01 | Xanavi Informatics Corp., Zama | Map display device, navigation device and map display method |
US5893898A (en) * | 1996-07-30 | 1999-04-13 | Alpine Electronics, Inc. | Navigation system having intersection routing using a road segment based database |
US6111521A (en) | 1996-09-18 | 2000-08-29 | Mannesmann Vdo Ag | Apparatus for supplying traffic-related information |
US5922042A (en) | 1996-09-30 | 1999-07-13 | Visteon Technologies, Llc | Automatic resumption of route guidance in vehicle navigation system |
US5904728A (en) | 1996-10-11 | 1999-05-18 | Visteon Technologies, Llc | Voice guidance timing in a vehicle navigation system |
JP3698835B2 (en) | 1996-10-25 | 2005-09-21 | 三菱電機株式会社 | Traffic information display device, display method thereof, and medium on which display control program for traffic information display device is recorded |
US5902350A (en) | 1996-10-30 | 1999-05-11 | Visteon Technologies, Llc | Generating a maneuver at the intersection through a turn lane |
US5948043A (en) | 1996-11-08 | 1999-09-07 | Etak, Inc. | Navigation system using GPS data |
US5982298A (en) | 1996-11-14 | 1999-11-09 | Microsoft Corporation | Interactive traffic display and trip planner |
JP4397972B2 (en) | 1996-11-19 | 2010-01-13 | サージックス コーポレイション | Transient voltage protection device and manufacturing method thereof |
US6253154B1 (en) | 1996-11-22 | 2001-06-26 | Visteon Technologies, Llc | Method and apparatus for navigating with correction of angular speed using azimuth detection sensor |
US5916299A (en) | 1996-11-25 | 1999-06-29 | Etak, Inc. | Method for determining exits and entrances for a region in a network |
US5893081A (en) * | 1996-11-25 | 1999-04-06 | Etak, Inc. | Using multiple levels of costs for a pathfinding computation |
US6058390A (en) | 1996-11-26 | 2000-05-02 | Visteon Technologies, Llc | Vehicle navigation assistance device having fast file access capability |
US5910177A (en) | 1996-12-09 | 1999-06-08 | Visteon Technologies, Llc | Navigating close proximity routes with a vehicle navigation system |
US5862509A (en) * | 1996-12-20 | 1999-01-19 | Zexel Corporation | Vehicle navigation using timed turn and timed lane restrictions |
US5928307A (en) | 1997-01-15 | 1999-07-27 | Visteon Technologies, Llc | Method and apparatus for determining an alternate route in a vehicle navigation system |
US5978730A (en) | 1997-02-20 | 1999-11-02 | Sony Corporation | Caching for pathfinding computation |
US5850190A (en) | 1997-03-06 | 1998-12-15 | Sony Corporation | Traffic information pager |
US6209026B1 (en) * | 1997-03-07 | 2001-03-27 | Bin Ran | Central processing and combined central and local processing of personalized real-time traveler information over internet/intranet |
US5987381A (en) | 1997-03-11 | 1999-11-16 | Visteon Technologies, Llc | Automobile navigation system using remote download of data |
JPH10261188A (en) | 1997-03-21 | 1998-09-29 | Omron Corp | Travel time prediction device and travel time prediction mechanism |
US20010018628A1 (en) | 1997-03-27 | 2001-08-30 | Mentor Heavy Vehicle Systems, Lcc | System for monitoring vehicle efficiency and vehicle and driver perfomance |
JP3560761B2 (en) | 1997-04-01 | 2004-09-02 | 富士通テン株式会社 | Navigation system |
JPH10293533A (en) | 1997-04-21 | 1998-11-04 | Mitsubishi Electric Corp | Topographic information display system |
US5999882A (en) | 1997-06-04 | 1999-12-07 | Sterling Software, Inc. | Method and system of providing weather information along a travel route |
US6091956A (en) | 1997-06-12 | 2000-07-18 | Hollenberg; Dennis D. | Situation information system |
DE19724919A1 (en) * | 1997-06-12 | 1999-01-07 | Adolph Michael Dr | Method for generating, merging and updating data usable in a route guidance system |
EP0922201B1 (en) | 1997-07-01 | 2002-09-11 | Siemens Aktiengesellschaft | Navigation system for use in a vehicle |
US5991687A (en) | 1997-07-02 | 1999-11-23 | Case Corporation | System and method for communicating information related to a geographical area |
US6091359A (en) | 1997-07-14 | 2000-07-18 | Motorola, Inc. | Portable dead reckoning system for extending GPS coverage |
US6680694B1 (en) | 1997-08-19 | 2004-01-20 | Siemens Vdo Automotive Corporation | Vehicle information system |
DE19746904B4 (en) | 1997-10-23 | 2004-09-30 | Telefonaktiebolaget L M Ericsson (Publ) | Traffic data evaluation device and associated method for a network with dynamic switching |
US6021406A (en) * | 1997-11-14 | 2000-02-01 | Etak, Inc. | Method for storing map data in a database using space filling curves and a method of searching the database to find objects in a given area and to find objects nearest to a location |
US6065120A (en) | 1997-12-09 | 2000-05-16 | Phone.Com, Inc. | Method and system for self-provisioning a rendezvous to ensure secure access to information in a database from multiple devices |
US6097399A (en) | 1998-01-16 | 2000-08-01 | Honeywell Inc. | Display of visual data utilizing data aggregation |
US6038509A (en) * | 1998-01-22 | 2000-03-14 | Etak, Inc. | System for recalculating a path |
EP1717677B1 (en) * | 1998-01-26 | 2015-06-17 | Apple Inc. | Method and apparatus for integrating manual input |
US6252544B1 (en) | 1998-01-27 | 2001-06-26 | Steven M. Hoffberg | Mobile communication device |
US6016485A (en) * | 1998-02-13 | 2000-01-18 | Etak, Inc. | System for pathfinding |
US6144919A (en) | 1998-03-27 | 2000-11-07 | Visteon Technologies, Llc | Method and apparatus for using non-digitized cities for route calculation |
JPH11311533A (en) | 1998-04-28 | 1999-11-09 | Xanavi Informatics Corp | Routing device |
US6298305B1 (en) | 1998-07-15 | 2001-10-02 | Visteon Technologies, Llc | Methods and apparatus for providing voice guidance in a vehicle navigation system |
EP0987665A3 (en) | 1998-07-30 | 2000-10-25 | Visteon Technologies, LLC | Vehicle navigation system, method and apparatus |
US6147626A (en) | 1998-08-11 | 2000-11-14 | Visteon Technologies, Llc | Determination of zero-angular-velocity output level for angular velocity sensor |
JP2000055675A (en) | 1998-08-14 | 2000-02-25 | Toyota Motor Corp | Map display device for vehicle and its method |
JP3449240B2 (en) | 1998-09-24 | 2003-09-22 | 株式会社デンソー | Vehicle current position detection device, vehicle current position display device, navigation device, and recording medium |
US6161092A (en) | 1998-09-29 | 2000-12-12 | Etak, Inc. | Presenting information using prestored speech |
JP3654009B2 (en) | 1998-10-08 | 2005-06-02 | 日産自動車株式会社 | Navigation device |
US6598016B1 (en) | 1998-10-20 | 2003-07-22 | Tele Atlas North America, Inc. | System for using speech recognition with map data |
US6504541B1 (en) * | 1998-10-21 | 2003-01-07 | Tele Atlas North America, Inc. | Warping geometric objects |
US6532304B1 (en) * | 1998-10-21 | 2003-03-11 | Tele Atlas North America, Inc. | Matching geometric objects |
FI106823B (en) * | 1998-10-23 | 2001-04-12 | Nokia Mobile Phones Ltd | Information retrieval system |
DE69942924D1 (en) | 1998-11-23 | 2010-12-16 | Integrated Transp Information | System for direct traffic monitoring |
US6150961A (en) | 1998-11-24 | 2000-11-21 | International Business Machines Corporation | Automated traffic mapping |
DE19856187A1 (en) | 1998-12-05 | 2000-06-15 | Alcatel Sa | Satellite-based map matching process |
DE19856704C2 (en) | 1998-12-09 | 2001-09-13 | Daimler Chrysler Ag | Method and device for vehicle route guidance and / or travel time estimation |
US6885937B1 (en) * | 1998-12-10 | 2005-04-26 | Tele Atlas North America, Inc. | Shortcut generator |
US6754485B1 (en) * | 1998-12-23 | 2004-06-22 | American Calcar Inc. | Technique for effectively providing maintenance and information to vehicles |
US6188956B1 (en) * | 1998-12-30 | 2001-02-13 | Garmin Corporation | Navigation device and method for selectively displaying thoroughfare names |
US6295492B1 (en) | 1999-01-27 | 2001-09-25 | Infomove.Com, Inc. | System for transmitting and displaying multiple, motor vehicle information |
DE19903909A1 (en) | 1999-02-01 | 2000-08-03 | Delphi 2 Creative Tech Gmbh | Method and device for obtaining relevant traffic information and for dynamic route optimization |
US6463400B1 (en) | 1999-02-01 | 2002-10-08 | Tele Atlas North America Inc | Quarter sectioning algorithm |
WO2000050917A1 (en) | 1999-02-22 | 2000-08-31 | Magellan Dis Inc. | Vehicle navigation system with correction for selective availability |
US6466862B1 (en) | 1999-04-19 | 2002-10-15 | Bruce DeKock | System for providing traffic information |
US6710774B1 (en) | 1999-05-12 | 2004-03-23 | Denso Corporation | Map display device |
US6222485B1 (en) * | 1999-05-20 | 2001-04-24 | Garmin Corporation | Use of desired orientation in automotive navigation equipment |
AU5029900A (en) | 1999-05-21 | 2000-12-12 | Etak Inc. | Computing sign text for branches of an electronic map network |
US6362730B2 (en) | 1999-06-14 | 2002-03-26 | Sun Microsystems, Inc. | System and method for collecting vehicle information |
JP3532492B2 (en) | 1999-06-25 | 2004-05-31 | 株式会社ザナヴィ・インフォマティクス | Road traffic information providing system, information providing device, and navigation device |
US6256577B1 (en) | 1999-09-17 | 2001-07-03 | Intel Corporation | Using predictive traffic modeling |
US6360165B1 (en) * | 1999-10-21 | 2002-03-19 | Visteon Technologies, Llc | Method and apparatus for improving dead reckoning distance calculation in vehicle navigation system |
US6282496B1 (en) | 1999-10-29 | 2001-08-28 | Visteon Technologies, Llc | Method and apparatus for inertial guidance for an automobile navigation system |
US6335765B1 (en) * | 1999-11-08 | 2002-01-01 | Weather Central, Inc. | Virtual presentation system and method |
US6253146B1 (en) | 1999-12-06 | 2001-06-26 | At&T Corp. | Network-based traffic congestion notification service |
JP3941312B2 (en) * | 1999-12-24 | 2007-07-04 | 株式会社日立製作所 | Road traffic system and information processing method thereof |
US6353795B1 (en) * | 2000-02-01 | 2002-03-05 | Infospace, Inc. | Method and system for matching an incident to a route |
US6317685B1 (en) | 2000-03-13 | 2001-11-13 | Navigation Technologies Corp. | Method and system for providing alternate routes with a navigation system |
US20010026276A1 (en) | 2000-03-17 | 2001-10-04 | Kiyomi Sakamoto | Map display device and navigation device |
JP2001330451A (en) | 2000-03-17 | 2001-11-30 | Matsushita Electric Ind Co Ltd | Map display and automobile navigation system |
US6480783B1 (en) | 2000-03-17 | 2002-11-12 | Makor Issues And Rights Ltd. | Real time vehicle guidance and forecasting system under traffic jam conditions |
US6388612B1 (en) * | 2000-03-26 | 2002-05-14 | Timothy J Neher | Global cellular position tracking device |
US6456935B1 (en) | 2000-03-28 | 2002-09-24 | Horizon Navigation, Inc. | Voice guidance intonation in a vehicle navigation system |
US6282486B1 (en) | 2000-04-03 | 2001-08-28 | International Business Machines Corporation | Distributed system and method for detecting traffic patterns |
US6731940B1 (en) | 2000-04-28 | 2004-05-04 | Trafficmaster Usa, Inc. | Methods of using wireless geolocation to customize content and delivery of information to wireless communication devices |
GB0011797D0 (en) | 2000-05-16 | 2000-07-05 | Yeoman Group Plc | Improved vehicle routeing |
JP2001331893A (en) * | 2000-05-22 | 2001-11-30 | Matsushita Electric Ind Co Ltd | Traffic violation warning and storing device |
US6317686B1 (en) | 2000-07-21 | 2001-11-13 | Bin Ran | Method of providing travel time |
US6292745B1 (en) | 2000-07-24 | 2001-09-18 | Navigation Technologies Corp. | Method and system for forming a database of geographic data for distribution to navigation system units |
US6675085B2 (en) * | 2000-08-17 | 2004-01-06 | Michael P. Straub | Method and apparatus for storing, accessing, generating and using information about speed limits and speed traps |
DE50014953D1 (en) | 2000-08-24 | 2008-03-20 | Siemens Vdo Automotive Ag | Method and navigation device for querying destination information and navigating in a map view |
US6556905B1 (en) * | 2000-08-31 | 2003-04-29 | Lisa M. Mittelsteadt | Vehicle supervision and monitoring |
US6735516B1 (en) | 2000-09-06 | 2004-05-11 | Horizon Navigation, Inc. | Methods and apparatus for telephoning a destination in vehicle navigation |
US6539302B1 (en) | 2000-09-06 | 2003-03-25 | Navigation Technologies Corporation | Method, system, and article of manufacture for providing notification of traffic conditions |
AU2001295820A1 (en) * | 2000-09-25 | 2002-04-02 | Transactions, Inc. | System and method to correlate and access related text with locations on an electronically displayed map |
US7135961B1 (en) | 2000-09-29 | 2006-11-14 | International Business Machines Corporation | Method and system for providing directions for driving |
US6424910B1 (en) | 2000-11-22 | 2002-07-23 | Navigation Technologies Corp. | Method and system for providing related navigation features for two or more end users |
US6603405B2 (en) | 2000-12-05 | 2003-08-05 | User-Centric Enterprises, Inc. | Vehicle-centric weather prediction system and method |
JP2002190091A (en) | 2000-12-20 | 2002-07-05 | Pioneer Electronic Corp | Traveling time setting method and device, method and device for calculating route using it |
US6542814B2 (en) * | 2001-03-07 | 2003-04-01 | Horizon Navigation, Inc. | Methods and apparatus for dynamic point of interest display |
US6456931B1 (en) | 2001-03-07 | 2002-09-24 | Visteon Technologies, Llc | Indicating directions to destination and intermediate locations in vehicle navigation systems |
JP2002267467A (en) | 2001-03-09 | 2002-09-18 | Mitsubishi Electric Corp | Navigation system |
BR0208409A (en) | 2001-03-27 | 2004-08-31 | Computer Ass Think Inc | System and method for determining spatial hierarchy for polygonal data by using cubic root scaling |
US6484092B2 (en) | 2001-03-28 | 2002-11-19 | Intel Corporation | Method and system for dynamic and interactive route finding |
US6584400B2 (en) | 2001-04-09 | 2003-06-24 | Louis J C Beardsworth | Schedule activated management system for optimizing aircraft arrivals at congested airports |
US6552656B2 (en) * | 2001-04-12 | 2003-04-22 | Horizon Navigation, Inc. | Method and apparatus for generating notification of changed conditions behind a vehicle |
US6728605B2 (en) * | 2001-05-16 | 2004-04-27 | Beacon Marine Security Limited | Vehicle speed monitoring system and method |
US6600994B1 (en) | 2001-05-17 | 2003-07-29 | Horizon Navigation, Inc. | Quick selection of destinations in an automobile navigation system |
US6560532B2 (en) | 2001-05-25 | 2003-05-06 | Regents Of The University Of California, The | Method and system for electronically determining dynamic traffic information |
US6594576B2 (en) * | 2001-07-03 | 2003-07-15 | At Road, Inc. | Using location data to determine traffic information |
US6690294B1 (en) | 2001-07-10 | 2004-02-10 | William E. Zierden | System and method for detecting and identifying traffic law violators and issuing citations |
US6700503B2 (en) * | 2001-08-06 | 2004-03-02 | Siemens Energy & Automation, Inc | Method of communicating conditions within a storage tank level |
WO2003014671A1 (en) | 2001-08-10 | 2003-02-20 | Aisin Aw Co., Ltd. | Traffic information search method, traffic information search system, mobile body communication device, and network navigation center |
US6470268B1 (en) | 2001-08-14 | 2002-10-22 | Horizon Navigation, Inc. | Navigation destination entry via glyph to digital translation |
US20030035518A1 (en) | 2001-08-16 | 2003-02-20 | Fan Rodric C. | Voice interaction for location-relevant mobile resource management |
US20030046158A1 (en) * | 2001-09-04 | 2003-03-06 | Kratky Jan Joseph | Method and system for enhancing mobile advertisement targeting with virtual roadside billboards |
JP3882554B2 (en) * | 2001-09-17 | 2007-02-21 | 日産自動車株式会社 | Navigation device |
US6650997B2 (en) | 2001-09-28 | 2003-11-18 | Robert Bosch Gmbh | System and method for interfacing mobile units using a cellphone |
US6473000B1 (en) | 2001-10-24 | 2002-10-29 | James Secreet | Method and apparatus for measuring and recording vehicle speed and for storing related data |
US6606557B2 (en) | 2001-12-07 | 2003-08-12 | Motorola, Inc. | Method for improving dispatch response time |
US6545637B1 (en) | 2001-12-20 | 2003-04-08 | Garmin, Ltd. | Systems and methods for a navigational device with improved route calculation capabilities |
US6999873B1 (en) * | 2001-12-21 | 2006-02-14 | Garmin Ltd. | Navigation system, method and device with detour algorithm |
US6687615B1 (en) * | 2001-12-21 | 2004-02-03 | Garmin Ltd. | Navigation system, method and device with detour algorithm |
US6901330B1 (en) | 2001-12-21 | 2005-05-31 | Garmin Ltd. | Navigation system, method and device with voice guidance |
US6728628B2 (en) | 2001-12-28 | 2004-04-27 | Trafficgauge, Inc. | Portable traffic information system |
US6983204B2 (en) * | 2002-01-09 | 2006-01-03 | International Business Machines Corporation | Mapping travel routes |
US20030135304A1 (en) | 2002-01-11 | 2003-07-17 | Brian Sroub | System and method for managing transportation assets |
US6622086B2 (en) | 2002-02-11 | 2003-09-16 | Horizon Navigation, Inc. | Method of representing a location in a database for a vehicle navigation system |
US7221287B2 (en) * | 2002-03-05 | 2007-05-22 | Triangle Software Llc | Three-dimensional traffic report |
US6989765B2 (en) * | 2002-03-05 | 2006-01-24 | Triangle Software Llc | Personalized traveler information dissemination system |
US6681176B2 (en) * | 2002-05-02 | 2004-01-20 | Robert Bosch Gmbh | Method and device for a detachable navigation system |
AU2003243646A1 (en) * | 2002-06-21 | 2004-01-06 | Nuride, Inc. | System and method for facilitating ridesharing |
US7103854B2 (en) | 2002-06-27 | 2006-09-05 | Tele Atlas North America, Inc. | System and method for associating text and graphical views of map information |
JP4657728B2 (en) | 2002-08-29 | 2011-03-23 | アイティス・ホールディングス・ピーエルシー | Apparatus and method for providing traffic information |
US7116326B2 (en) | 2002-09-06 | 2006-10-03 | Traffic.Com, Inc. | Method of displaying traffic flow data representing traffic conditions |
US6807483B1 (en) | 2002-10-11 | 2004-10-19 | Televigation, Inc. | Method and system for prediction-based distributed navigation |
US6845316B2 (en) * | 2002-10-14 | 2005-01-18 | Mytrafficnews.Com, Inc. | Distribution of traffic and transit information |
US20040080624A1 (en) * | 2002-10-29 | 2004-04-29 | Yuen Siltex Peter | Universal dynamic video on demand surveillance system |
CN1711545A (en) | 2002-11-15 | 2005-12-21 | 欧姆龙株式会社 | Charging method in service providing system, service providing server, service providing program, recording medium containing the service providing program, terminal device, terminal processing progra |
US7406543B2 (en) | 2002-11-29 | 2008-07-29 | Traffic.Com, Inc. | Remote radio spectrum information acquisition |
US9108107B2 (en) | 2002-12-10 | 2015-08-18 | Sony Computer Entertainment America Llc | Hosting and broadcasting virtual events using streaming interactive video |
JP3928721B2 (en) * | 2003-01-23 | 2007-06-13 | アイシン・エィ・ダブリュ株式会社 | Vehicle navigation device |
KR101168423B1 (en) | 2003-02-05 | 2012-07-25 | 가부시키가이샤 자나비 인포메틱스 | Path search method of navigation apparatus and display method of traffic information |
PT1611416E (en) | 2003-02-26 | 2007-09-12 | Tomtom Int Bv | Navigation device and method for displaying alternative routes |
JP4255007B2 (en) | 2003-04-11 | 2009-04-15 | 株式会社ザナヴィ・インフォマティクス | Navigation device and travel time calculation method thereof |
US6931309B2 (en) | 2003-05-06 | 2005-08-16 | Innosurance, Inc. | Motor vehicle operating data collection and analysis |
US7440842B1 (en) | 2003-05-09 | 2008-10-21 | Dimitri Vorona | System for transmitting, processing, receiving, and displaying traffic information |
US8825356B2 (en) | 2003-05-09 | 2014-09-02 | Dimitri Vorona | System for transmitting, processing, receiving, and displaying traffic information |
WO2004104968A1 (en) * | 2003-05-15 | 2004-12-02 | Landsonar, Inc. | System and method for evaluating vehicle and operator performance |
US7610145B2 (en) * | 2003-07-25 | 2009-10-27 | Triangle Software Llc | System and method for determining recommended departure time |
JP3994937B2 (en) * | 2003-07-29 | 2007-10-24 | アイシン・エィ・ダブリュ株式会社 | Vehicle traffic information notification system and navigation system |
US7634352B2 (en) * | 2003-09-05 | 2009-12-15 | Navteq North America, Llc | Method of displaying traffic flow conditions using a 3D system |
US7756639B2 (en) | 2003-10-06 | 2010-07-13 | Sirf Technology, Inc. | System and method for augmenting a satellite-based navigation solution |
US7561966B2 (en) | 2003-12-17 | 2009-07-14 | Denso Corporation | Vehicle information display system |
WO2005064567A1 (en) * | 2003-12-19 | 2005-07-14 | Bayerische Motoren Werke Aktiengesellschaft | Traffic status recognition with a threshold value method |
JP2007522548A (en) | 2004-01-30 | 2007-08-09 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 3D cursor control system |
US7180501B2 (en) | 2004-03-23 | 2007-02-20 | Fujitsu Limited | Gesture based navigation of a handheld user interface |
DE102004038740A1 (en) * | 2004-08-10 | 2006-02-23 | Robert Bosch Gmbh | Method for displaying map information |
US7511634B2 (en) * | 2004-12-22 | 2009-03-31 | HNTB Corporation | Retrieving and presenting dynamic traffic information
KR20070106709A (en) * | 2005-01-03 | 2007-11-05 | 부미, 인코포레이티드 | Systems and methods for night time surveillance |
US7548197B2 (en) * | 2005-06-20 | 2009-06-16 | At&T Intellectual Property I, L.P. | GPS parasite system |
US7933395B1 (en) * | 2005-06-27 | 2011-04-26 | Google Inc. | Virtual tour of user-defined paths in a geographic information system |
US7885758B2 (en) | 2005-06-30 | 2011-02-08 | Marvell World Trade Ltd. | GPS-based traffic monitoring system |
US7603138B2 (en) | 2005-08-22 | 2009-10-13 | Toshiba American Research, Inc. | Environmental monitoring using mobile devices and network information server |
US7927216B2 (en) * | 2005-09-15 | 2011-04-19 | Nintendo Co., Ltd. | Video game system with wireless modular handheld controller |
JP4794957B2 (en) * | 2005-09-14 | 2011-10-19 | 任天堂株式会社 | GAME PROGRAM, GAME DEVICE, GAME SYSTEM, AND GAME PROCESSING METHOD |
US7486201B2 (en) * | 2006-01-10 | 2009-02-03 | Myweather, Llc | Combined personalized traffic and weather report and alert system and method |
US7912627B2 (en) * | 2006-03-03 | 2011-03-22 | Inrix, Inc. | Obtaining road traffic condition data from mobile data sources |
US8700296B2 (en) * | 2006-03-03 | 2014-04-15 | Inrix, Inc. | Dynamic prediction of road traffic conditions |
US7912628B2 (en) | 2006-03-03 | 2011-03-22 | Inrix, Inc. | Determining road traffic conditions using data from multiple data sources |
US8014936B2 (en) | 2006-03-03 | 2011-09-06 | Inrix, Inc. | Filtering road traffic condition data obtained from mobile data sources |
JP5424373B2 (en) | 2006-03-09 | 2014-02-26 | 任天堂株式会社 | Image processing apparatus, image processing program, image processing system, and image processing method |
JP4837405B2 (en) | 2006-03-09 | 2011-12-14 | 任天堂株式会社 | Coordinate calculation apparatus and coordinate calculation program |
JP4151982B2 (en) | 2006-03-10 | 2008-09-17 | 任天堂株式会社 | Motion discrimination device and motion discrimination program |
JP4798705B2 (en) | 2006-03-23 | 2011-10-19 | 任天堂株式会社 | POSITION CALCULATION DEVICE, POSITION CALCULATION PROGRAM, GAME DEVICE, AND GAME PROGRAM |
JP3990716B1 (en) | 2006-04-07 | 2007-10-17 | 富士重工業株式会社 | Vehicle display device |
US7558674B1 (en) | 2006-04-24 | 2009-07-07 | Wsi, Corporation | Weather severity and characterization system |
KR100796339B1 (en) | 2006-05-13 | 2008-01-21 | 삼성전자주식회사 | Method and apparatus for providing traffic information using schedule registration information |
US7908076B2 (en) * | 2006-08-18 | 2011-03-15 | Inrix, Inc. | Representative road traffic flow information based on historical data |
US8125448B2 (en) * | 2006-10-06 | 2012-02-28 | Microsoft Corporation | Wearable computer pointing device |
US20080133120A1 (en) | 2006-11-30 | 2008-06-05 | Romanick Ian D | Method for determining and outputting travel instructions for most fuel-efficient route |
US20080255754A1 (en) | 2007-04-12 | 2008-10-16 | David Pinto | Traffic incidents processing system and method for sharing real time traffic information |
US8100769B2 (en) | 2007-05-09 | 2012-01-24 | Nintendo Co., Ltd. | System and method for using accelerometer outputs to control an object rotating on a display |
US8175802B2 (en) * | 2007-06-28 | 2012-05-08 | Apple Inc. | Adaptive route guidance based on preferences |
US20090061971A1 (en) * | 2007-08-31 | 2009-03-05 | Visual Sports Systems | Object Tracking Interface Device for Computers and Gaming Consoles |
US8190359B2 (en) | 2007-08-31 | 2012-05-29 | Proxpro, Inc. | Situation-aware personal information management for a mobile device |
US20110037619A1 (en) * | 2009-08-11 | 2011-02-17 | On Time Systems, Inc. | Traffic Routing Using Intelligent Traffic Signals, GPS and Mobile Data Devices |
US8068974B2 (en) * | 2007-09-10 | 2011-11-29 | GM Global Technology Operations LLC | Methods and systems for determining driver efficiency and operating modes in a hybrid vehicle |
DE102007045991A1 (en) | 2007-09-26 | 2009-04-02 | Siemens Ag | Method for determining consumption and / or emission values |
KR20090038540A (en) * | 2007-10-16 | 2009-04-21 | 주식회사 현대오토넷 | Apparatus and method for changing image position on the screen, and navigation system using the same
US8428856B2 (en) | 2007-10-29 | 2013-04-23 | At&T Intellectual Property I, L.P. | Methods, systems, devices, and computer program products for implementing condition alert services |
US8050862B2 (en) | 2007-10-30 | 2011-11-01 | GM Global Technology Operations LLC | Vehicular navigation system for recalling preset map views |
US9183744B2 (en) | 2008-01-29 | 2015-11-10 | Here Global B.V. | Method for providing images of traffic incidents |
US8881040B2 (en) | 2008-08-28 | 2014-11-04 | Georgetown University | System and method for detecting, collecting, analyzing, and communicating event-related information |
US8279086B2 (en) | 2008-09-26 | 2012-10-02 | Regents Of The University Of Minnesota | Traffic flow monitoring for intersections with signal controls |
US8386157B2 (en) | 2008-10-10 | 2013-02-26 | Jin Hong Kim | Universal GPS traffic monitoring system |
US8185297B2 (en) | 2008-10-15 | 2012-05-22 | Navteq NA LLC | Personalized traffic reports |
US8335647B2 (en) * | 2008-12-04 | 2012-12-18 | Verizon Patent And Licensing Inc. | Navigation based on popular user-defined paths |
US8255151B2 (en) | 2008-12-09 | 2012-08-28 | Motorola Mobility Llc | Method and system for providing environmentally-optimized navigation routes |
GR1006698B (en) | 2008-12-22 | 2010-02-05 | | Method and system for the collection, processing and distribution of traffic data for optimizing routing in satellite navigation systems of vehicles.
US8364389B2 (en) | 2009-02-02 | 2013-01-29 | Apple Inc. | Systems and methods for integrating a portable electronic device with a bicycle |
US8619072B2 (en) | 2009-03-04 | 2013-12-31 | Triangle Software Llc | Controlling a three-dimensional virtual broadcast presentation |
US8982116B2 (en) | 2009-03-04 | 2015-03-17 | Pelmorex Canada Inc. | Touch screen based interaction with traffic data |
US9046924B2 (en) | 2009-03-04 | 2015-06-02 | Pelmorex Canada Inc. | Gesture based interaction with traffic data |
KR20110049548A (en) | 2009-11-05 | 2011-05-12 | 엘지전자 주식회사 | Navigation method of mobile terminal and apparatus thereof |
JP2012003408A (en) | 2010-06-15 | 2012-01-05 | Rohm Co Ltd | Drive recorder |
WO2012065188A2 (en) | 2010-11-14 | 2012-05-18 | Triangle Software Llc | Crowd sourced traffic reporting |
TW201227381A (en) | 2010-12-20 | 2012-07-01 | Ind Tech Res Inst | Real-time traffic situation awareness system and method |
CA2839866C (en) | 2011-05-18 | 2021-04-13 | Triangle Software Llc | System for providing traffic data and driving efficiency data |
DE102011076370A1 (en) | 2011-05-24 | 2012-11-29 | Robert Bosch Gmbh | Machine tool braking device |
CA2883973C (en) | 2012-01-27 | 2021-02-23 | Edgar Rojas | Estimating time travel distributions on signalized arterials |
US8606508B2 (en) | 2012-02-09 | 2013-12-10 | Flightaware, Llc | System and method for sending air traffic data to users for display |
US10223909B2 (en) | 2012-10-18 | 2019-03-05 | Uber Technologies, Inc. | Estimating time travel distributions on signalized arterials |
2010
- 2010-08-20 US US12/860,700 patent/US8982116B2/en active Active

2011
- 2011-08-22 WO PCT/US2011/048680 patent/WO2012024694A2/en active Application Filing
- 2011-08-22 EP EP11818901.8A patent/EP2635989A4/en not_active Withdrawn

2015
- 2015-03-03 US US14/637,357 patent/US20150177018A1/en not_active Abandoned
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7835858B2 (en) * | 2002-11-22 | 2010-11-16 | Traffic.Com, Inc. | Method of creating a virtual traffic network |
US7847708B1 (en) * | 2005-09-29 | 2010-12-07 | Baron Services, Inc. | System for providing site-specific, real-time environmental condition information to vehicles and related methods |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9602977B2 (en) | 2002-03-05 | 2017-03-21 | Pelmorex Canada Inc. | GPS generated traffic information |
US9368029B2 (en) | 2002-03-05 | 2016-06-14 | Pelmorex Canada Inc. | GPS generated traffic information |
US9401088B2 (en) | 2002-03-05 | 2016-07-26 | Pelmorex Canada Inc. | Method for predicting a travel time for a traffic route |
US9489842B2 (en) | 2002-03-05 | 2016-11-08 | Pelmorex Canada Inc. | Method for choosing a traffic route |
US9640073B2 (en) | 2002-03-05 | 2017-05-02 | Pelmorex Canada Inc. | Generating visual information associated with traffic |
US9127959B2 (en) | 2003-07-25 | 2015-09-08 | Pelmorex Canada Inc. | System and method for delivering departure notifications |
US9644982B2 (en) | 2003-07-25 | 2017-05-09 | Pelmorex Canada Inc. | System and method for delivering departure notifications |
US10289264B2 (en) | 2009-03-04 | 2019-05-14 | Uber Technologies, Inc. | Controlling a three-dimensional virtual broadcast presentation |
US9448690B2 (en) | 2009-03-04 | 2016-09-20 | Pelmorex Canada Inc. | Controlling a three-dimensional virtual broadcast presentation |
US9547984B2 (en) | 2011-05-18 | 2017-01-17 | Pelmorex Canada Inc. | System for providing traffic data and driving efficiency data |
US9390620B2 (en) | 2011-05-18 | 2016-07-12 | Pelmorex Canada Inc. | System for providing traffic data and driving efficiency data |
US9293039B2 (en) | 2012-01-27 | 2016-03-22 | Pelmorex Canada Inc. | Estimating time travel distributions on signalized arterials |
US10223909B2 (en) | 2012-10-18 | 2019-03-05 | Uber Technologies, Inc. | Estimating time travel distributions on signalized arterials |
US10971000B2 (en) | 2012-10-18 | 2021-04-06 | Uber Technologies, Inc. | Estimating time travel distributions on signalized arterials |
US20150067540A1 (en) * | 2013-09-02 | 2015-03-05 | Samsung Electronics Co., Ltd. | Display apparatus, portable device and screen display methods thereof |
CN108228737A (en) * | 2016-12-13 | 2018-06-29 | 通用电气航空系统有限责任公司 | Travel path and data integrated system based on map |
US11237016B2 (en) | 2016-12-13 | 2022-02-01 | Ge Aviation Systems Llc | Map-based trip trajectory and data integration system |
US11946770B2 (en) | 2016-12-13 | 2024-04-02 | Ge Aviation Systems Llc | Map-based trip trajectory and data integration system |
Also Published As
Publication number | Publication date |
---|---|
WO2012024694A3 (en) | 2012-04-12 |
US20100312462A1 (en) | 2010-12-09 |
EP2635989A4 (en) | 2016-12-21 |
US8982116B2 (en) | 2015-03-17 |
WO2012024694A2 (en) | 2012-02-23 |
EP2635989A2 (en) | 2013-09-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20150177018A1 (en) | | Touch screen based interaction with traffic data
US9046924B2 (en) | | Gesture based interaction with traffic data
US10289264B2 (en) | | Controlling a three-dimensional virtual broadcast presentation
US20200378779A1 (en) | | Augmented reality interface for navigation assistance
TWI575224B (en) | | A navigation device, methods for providing navigation instructions on a device and related non-transitory machine readable mediums
TWI459283B (en) | | Adjustable and progressive mobile device street view
US20090160873A1 (en) | | Interactive virtual weather map
US9087362B2 (en) | | Traffic collision incident visualization with location context
DE112013002792T5 (en) | | Navigation application
US9593959B2 (en) | | Linear projection-based navigation
US11140510B2 (en) | | Contextual map view
WO2022262193A1 (en) | | Navigation method and apparatus, and computer device and computer-readable storage medium
JP3950085B2 (en) | | Map-guided omnidirectional video system
EP4334683A2 (en) | | User interfaces for maps and navigation
WO2024066881A1 (en) | | Map navigation method and apparatus, and computer device and storage medium
EP3113523A1 (en) | | A video region indicator that indicates that video content is available
CN115131978A (en) | | Method, device and equipment for displaying data and storage medium
US20240331660A1 (en) | | Program, terminal control method, terminal, information processing method, and information processing device
WO2022261621A2 (en) | | User interfaces for maps and navigation
NO323509B1 (en) | | Method of animating a series of still images
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: 1836549 ONTARIO LIMITED C/O PELMOREX MEDIA INC., C
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TRIANGLE SOFTWARE LLC;REEL/FRAME:039899/0802
Effective date: 20120824

Owner name: TRIANGLE SOFTWARE LLC, CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GUEZIEC, ANDRE;BLANQUART, BRIAC;REEL/FRAME:039899/0766
Effective date: 20100823

Owner name: PELMOREX CANADA INC., CANADA
Free format text: CHANGE OF NAME;ASSIGNOR:1836549 ONTARIO LIMITED C/O PELMOREX MEDIA INC.;REEL/FRAME:040184/0048
Effective date: 20130205
|
AS | Assignment |
Owner name: MUDDY RIVER, SERIES 97 OF ALLIED SECURITY TRUST I,
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PELMOREX CANADA, INC.;REEL/FRAME:041069/0953
Effective date: 20170118
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: UBER TECHNOLOGIES, INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MUDDY RIVER, SERIES 97 OF ALLIED SECURITY TRUST I;REEL/FRAME:045204/0800
Effective date: 20171103