GB2524514A - Navigation system - Google Patents

Navigation system

Info

Publication number
GB2524514A
GB2524514A
Authority
GB
United Kingdom
Prior art keywords
navigation
route
user
navigation system
directions
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1405304.5A
Other versions
GB2524514B (en)
GB201405304D0 (en)
Inventor
Naseem Akhtar
Brian Gerrard
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB1405304.5A priority Critical patent/GB2524514B/en
Publication of GB201405304D0 publication Critical patent/GB201405304D0/en
Priority to PCT/EP2015/056369 priority patent/WO2015144751A1/en
Priority to EP15712130.2A priority patent/EP3123115A1/en
Priority to AU2015238339A priority patent/AU2015238339B2/en
Priority to US15/128,777 priority patent/US10408634B2/en
Publication of GB2524514A publication Critical patent/GB2524514A/en
Application granted granted Critical
Publication of GB2524514B publication Critical patent/GB2524514B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/365 Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

A navigation system for providing navigation directions to a user comprising an input for receiving user commands to define a navigation route and processing means arranged to determine navigation directions relating to the user selected route. An output is arranged to output navigation directions to the user wherein the processing means is arranged to determine the boundary of the navigation route and the output is arranged to output an indication signal to mark the boundaries of the navigation route. The projection system may project lane markers 12 on the ground to define the boundaries of the route. A head up display (HUD) 14 may also display the lane markers. Other route features such as bends, corners, junctions and width restrictions may be displayed. The width of the navigation route may be determined by analysing aerial or satellite imagery or a look-up table. The display may be used when the route has been obscured by poor weather conditions.

Description

NAVIGATION SYSTEM
TECHNICAL FIELD
The present disclosure relates to a navigation system and particularly, but not exclusively, to a navigation system and method which provide functionality to highlight the boundaries of a route to a user. Aspects of the invention relate to a system, to a method and to a vehicle.
BACKGROUND
Navigation systems, e.g. GPS-based navigation devices, are becoming increasingly popular and provide users with a number of capabilities, such as location services on a map, turn-by-turn navigation directions delivered to a user in charge of a vehicle via a text, speech or graphical interface, and traffic services to reroute users around congestion.
Such navigation systems will commonly use global positioning system (GPS) signals to determine latitude and longitude information for the device's location on Earth. Such navigation systems may be able to provide route options between two points, alternative route options in the event of route congestion or route closures, and the location of services en route (e.g. food, fuel etc.).
Navigation systems may take a number of forms such as a dedicated GPS navigation device, a mobile phone with GPS capability or a laptop computer with navigation software.
Users may plan routes using a navigation system either by entering a start and end point or by searching for a location to travel to and allowing the device to determine its current location as the start point. In many instances a user may be presented with multiple route options for their journey and may additionally have the ability to customise the route slightly by selecting different roads either via a direct interaction with the suggested route or by selecting or de-selecting journey options in a settings menu (e.g. a user may choose to avoid roads with tolls or avoid motorways).
It is noted that for convenience reference is made within this document to GPS navigation systems. It is to be acknowledged however that alternative location services may be used, e.g. the GLONASS, Beidou (or Compass) or Galileo satellite navigation systems.
Once a route has been selected the navigation system outputs navigation directions to the user to reach their destination.
In certain circumstances however it may be difficult to follow the navigation directions because the boundary of the selected route is not clear to the user, e.g. because of adverse weather conditions (snow on route, flooding, landslip, thick fog, heavy rain etc.).
The present invention has been devised to mitigate or overcome the above mentioned problems with following route directions.
SUMMARY OF THE INVENTION
According to a first aspect of the present invention there is provided a navigation system for providing navigation directions to a user comprising: an input for receiving user commands to define a navigation route; processing means arranged to determine navigation directions relating to the user selected route; and an output arranged to output navigation directions to the user wherein the processing means is arranged to determine the boundary of the navigation route and the output is arranged to output an indication signal to mark the boundaries of the navigation route.
The indication signal may be arranged to be sent to a projection system for projecting lane markers on the ground/external to the navigation system (or in the case of a vehicle example, external to the vehicle).
The indication signal may be arranged to be sent to a head-up display to indicate the boundaries of the navigation route to the user.
The indication signal may comprise one or more of the following route features: bends, corners, junctions, width restrictions.
The processing means may determine the width of the navigation route by analysing aerial or satellite imagery.
The processing means may determine the width of the navigation route by looking up boundary information in a look up table.
The processing means may be arranged to determine whether the boundaries of the navigation route are obscured. In order to determine whether a route is obscured, the processing means may be arranged to analyse aerial or satellite imagery to determine whether the route is obscured. Alternatively, the processing means may be arranged to determine a likelihood that the route has been obscured based on a driving mode of a traction control system (e.g. if a snow driving mode has been engaged then the processing means may determine that there is a chance the route has been obscured).
The processing means may be arranged to determine whether the route has been obscured based on an external communication, e.g. a weather report.
The invention extends to a navigation arrangement comprising a navigation system according to the first aspect of the invention and a route boundary display means for displaying the boundary of the navigation route. The navigation arrangement may further comprise user control means for allowing a user to switch the route boundary display ON or OFF.
The invention extends to a vehicle comprising a navigation system according to the first aspect of the invention or a navigation arrangement incorporating a navigation system according to the first aspect of the invention. The navigation system or navigation arrangement may be integrated into the vehicle. The navigation system may comprise a mobile communications device in communication with the vehicle.
According to a second aspect of the present invention there is provided a method for providing navigation directions to a user comprising: determining navigation directions relating to a user selected route; and outputting navigation directions to the user, wherein determining navigation directions comprises determining the boundary of the navigation route and outputting navigation directions comprises outputting an indication signal to mark the boundaries of the navigation route.
The invention extends to a computer program product comprising computer readable code for controlling a computing device to carry out the method of the second aspect of the invention.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic of a navigation system in accordance with an embodiment of the present invention;
Figure 2 is a flow chart of a method of using the navigation system of Figure 1;
Figures 3a and 3b are respectively a map image (photograph) of an area of land comprising a number of roads in a road network and an off-road area, and the corresponding map view of the same area of land;
Figure 4 is a schematic of a navigation system in accordance with a further embodiment of the present invention;
Figure 5a is a flow chart of a method of using the navigation system of Figure 4 in accordance with the further embodiment of the present invention; and
Figure 5b is a flow chart of a method of using the navigation system of Figure 4 in accordance with the further embodiment of the present invention.
DETAILED DESCRIPTION
Figures 1 and 4 show navigation systems in accordance with embodiments of the present invention.
The navigation system 10 shown in Figure 1 is in communication with a display device 12, a head-up display 14 and a projection system 16.
In the example of a vehicle navigation system, the display device 12 may either comprise a stand-alone display device (e.g. a tablet computing device such as an iPad®) or may be integrated into the vehicle (e.g. a dashboard mounted screen). The navigation system may also comprise a stand-alone system (for example it may comprise a software application running on the tablet device) or may be integrated into the vehicle.
The navigation system 10 further comprises a route module 18 which may define a route between points based on a user's input and a trajectory guidance module 20 which may determine driving/routing directions based on the route selected.
The route module 18 may be in communication with a user input device. In the example shown in Figure 1 user input is via a touch-screen enabled display screen 12 and/or via physical user controls 22. It is noted however that other user input devices may be used e.g. keyboard, mouse, trackball inputs.
Planning the directions (trajectory) that the navigation system needs to take once the route has been set by the user is handled by the trajectory guidance system 20, the output of which may be supplied on one or more of the display screen 12, a head-up display 14 or a projection system 16.
The navigation system 10 may determine, as part of the route generation and trajectory guidance generation processes, the boundary of the route that has been generated. For example, where the route comprises a route via a road network then the navigation system may determine (e.g. via GPS location and a suitable look up table) the width (boundaries) of the route along the path selected by the user.
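The width look-up step lends itself to a simple geometric treatment. The sketch below (in Python; the table layout, segment identifiers and helper name are illustrative assumptions rather than anything taken from this application) offsets each centreline point of the located road segment by half of the looked-up carriageway width to give left and right boundary points:

    import math

    # Hypothetical width table keyed by road segment identifier (metres).
    SEGMENT_WIDTHS = {"A438:017": 7.3, "A449:042": 6.5}

    def route_boundaries(centreline, segment_id, widths=SEGMENT_WIDTHS):
        """Offset each centreline point (x, y, heading in radians) by half the
        looked-up carriageway width to produce left/right boundary points."""
        half = widths.get(segment_id, 6.0) / 2.0   # fall back to a nominal width
        left, right = [], []
        for x, y, heading in centreline:
            # Unit normal (to the left) of the direction of travel.
            nx, ny = -math.sin(heading), math.cos(heading)
            left.append((x + half * nx, y + half * ny))
            right.append((x - half * nx, y - half * ny))
        return left, right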
The boundaries of the route may then be displayed to the user. The route boundaries may be displayed via a head-up display 14 or via a suitable projection system 16, such as an LED or laser projection system, which projects the boundary 22 of the route onto the ground outside the vehicle 24 (in effect it would project "lane markers" 22 or a "virtual lane").
Where the route selected by the user comprises off road elements then the navigation system 10 may analyse satellite imagery to determine the width of any off-road tracks included within the selected path.
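One way such an image analysis might be approximated, assuming the satellite image has already been segmented into a binary track/not-track mask (the mask, the per-pixel ground resolution and the helper name are all assumptions made for illustration), is to step outwards from a centreline pixel along the normal to the local track direction until the mask ends:

    def track_width_m(track_mask, row, col, direction, pixel_size_m, max_half_width_px=60):
        """Estimate track width (metres) at one centreline pixel.
        `track_mask` is a 2D NumPy boolean array (True = drivable track) and
        `direction` is a unit (d_row, d_col) vector along the track."""
        d_row, d_col = direction
        n_row, n_col = -d_col, d_row          # unit normal to the track direction
        width_px = 1                          # count the centreline pixel itself
        for sign in (+1, -1):                 # walk out to each side in turn
            for step in range(1, max_half_width_px):
                r = int(round(row + sign * step * n_row))
                c = int(round(col + sign * step * n_col))
                if not (0 <= r < track_mask.shape[0] and 0 <= c < track_mask.shape[1]):
                    break
                if not track_mask[r, c]:
                    break
                width_px += 1
        return width_px * pixel_size_m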
Regardless of the mechanism for displaying the boundary of the selected route to the user it is noted that the "virtual lanes" may be displayed so as to show road features such as bends, corners, junctions, route width restrictions etc. The navigation system 10 may be arranged to determine whether the route is obscured such that the virtual lanes need to be displayed to the user and may be further arranged to output the boundary 22 of the route to the user in the event that such a determination has been made. The navigation system 10 may be arranged to analyse image data received from image sensors (e.g. cameras) to determine if the route has been obscured. Alternatively, in the event that the navigation system is in communication with a vehicle traction control system the navigation system may determine that the route is obscured if the traction control system has entered certain driving modes (e.g. a snow mode may indicate that the route is likely to be obscured).
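A minimal decision helper of the kind the navigation system might use is sketched below; the driving-mode names, the confidence threshold and the weather categories are invented for illustration and are not drawn from the application:

    # Driving modes that suggest the road surface (and its markings) may be hidden.
    OBSCURING_MODES = {"snow", "grass_gravel_snow", "mud_ruts"}

    def route_probably_obscured(traction_mode=None, camera_lane_confidence=None,
                                weather_report=None):
        """Return True if the virtual lane display should be offered to the user.
        All inputs are optional; any single positive indicator is enough."""
        if traction_mode is not None and traction_mode.lower() in OBSCURING_MODES:
            return True
        if camera_lane_confidence is not None and camera_lane_confidence < 0.3:
            return True   # the image sensors cannot find the lane markings
        if weather_report is not None and weather_report in {"snow", "flood", "fog"}:
            return True
        return False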
Alternatively or additionally the user of the navigation system may be provided with a user command option to switch the route boundary display "on" or "off".
Figure 2 illustrates a method according to an embodiment of the present invention for displaying route boundaries 22 to a user. Figure 2 is described below with reference to the relevant features of Figure 1.
In a first example a user may activate, at 50, their navigation system 10. A set of menu options may then be displayed, at 52, to the user via the display screen 12 of the navigation system 10. The user may then request, at 54, a route to a destination from the navigation system which plots, at 56, the route using a route laying program. It is noted that the route request and route plotting steps 54, 56 are described in more detail with respect to Figures 5a and 5b below for a particular example where the user is plotting an "off-road" route.
Once the route has been plotted the user may determine, at 58, whether the route is visible or not. Alternatively, the navigation system may make a determination, at 58, about the visibility of the route. If the route is visible then navigation directions may be presented to the user of the navigation system either via the display device 12 of the navigation system or, in the event that the system is for example being used in a vehicle, via a head-up display (HUD) 14. The navigation directions may also be sent to a steering controller 60 for autonomous vehicles.
If the route is invisible however (e.g. obscured from the user's view in the environment) then the navigation system 10 may determine the boundary 22 of the route and may send this information to a laser projection system 16 which then projects virtual lanes on the ground outside the vehicle 24 for the user to see. In an alternative embodiment the boundary information may be supplied to the user via a HUD display 14.
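Before boundary points can be projected onto the ground or drawn on a HUD they have to be expressed relative to the vehicle rather than the map. A short sketch of that change of frame is given below; the coordinate conventions (x forward, y left) and the 30 m look-ahead window are assumptions for illustration:

    import math

    def boundary_to_vehicle_frame(boundary_points, vehicle_x, vehicle_y, vehicle_heading):
        """Convert route-boundary points from map coordinates (metres east/north)
        into the vehicle frame so a projection system or HUD can draw them."""
        cos_h, sin_h = math.cos(vehicle_heading), math.sin(vehicle_heading)
        local = []
        for px, py in boundary_points:
            dx, dy = px - vehicle_x, py - vehicle_y
            forward = dx * cos_h + dy * sin_h    # distance ahead of the vehicle
            left = -dx * sin_h + dy * cos_h      # lateral offset to the left
            if 0.0 < forward < 30.0:             # keep only points a short way ahead
                local.append((forward, left))
        return local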
The above provision of "virtual lanes" to the user is applicable in both on-road and off-road scenarios. In the event that the user wishes to plot an off-road route then they may select an off-road navigation mode 62 and then select a map image icon 64 in order to initiate a route planning mode for off-road use.
The off-road option is described in more detail with reference to Figures 3a, 3b, 4, 5a and 5b below.
Figure 3a shows an aerial image of an area of land and Figure 3b shows substantially the same area of land in "map view". The land comprises a number of roads 100 within a road network and an "off road" area 102 comprising a number of un-tarmacked tracks.
A known navigation system 10 tasked with planning a route between points A and B would return a route using the road network, e.g. path 104 (via roads A438/449/417 as shown in Figure 3b). However, it is noted that for vehicles capable of traversing un-tarmacked tracks and/or open ground a more direct route may be represented by path 106.
The present invention provides a user with the functionality to define a route such as path 106 using the map image of Figure 3a. It is noted that such functionality may be used where there is no road network, e.g. in rural locations or more exotic locations such as jungle, desert or snowfield. It is also noted that the invention would allow a user to define an alternative route in the event of road slippage or other obstruction in the event of an extreme weather event such as storms and flooding.
Figure 4 shows a navigation system in accordance with an embodiment of the present invention. The navigation system 10 shown in Figure 4 is in communication with a display device 12, a head-up display 14, a steering control system 60, a terrain information system 110, a number of sensors 112 (which may, in the example of a vehicle navigation system, be either on-board sensors, sensors that are not part of the vehicle or a mixture of the two) and a projection system 16.
In the example of a vehicle navigation system, the display device 12 may either comprise a stand-alone display device (e.g. a tablet computing device such as an iPad®) or may be integrated into the vehicle (e.g. a dashboard mounted screen). The navigation system 10 may also comprise a stand-alone system (for example it may comprise a software application running on the tablet device) or may be integrated into the vehicle.
For clarity only one sensor input 112 is shown in Figure 4. However, it is to be appreciated that the navigation system 10 may receive input signals from a plurality of sensors such as visual sensors (e.g. vehicle mounted cameras), radar-based sensors, gyroscopic sensors (to determine pitch, roll and yaw), inertial sensors, e.g. accelerometers (to determine linear, longitudinal and lateral vehicle acceleration, velocity and distance/displacement), and a GPS antenna.
The navigation system 10 further comprises a route module 18 which may define a route between points based on a user's input and a trajectory guidance module 20 which may determine driving/routing directions based on the route selected.
The route module 18 may be in communication with a user input device 22. In the example shown in Figure 4 user input is via a touch-screen enabled display screen 12 and/or via physical user controls 22. It is noted however that other user input devices may be used e.g. keyboard, mouse, trackball inputs.
As described in relation to Figures 5a and 5b below the navigation system 10 in accordance with embodiments of the present invention enables a user to trace a route on a displayed image 114 (e.g. an aerial map image) in order to plan a journey. The navigation system 10 is therefore additionally in communication with a map image data store. The map image data store is located remotely from the navigation system.
Map images may be fed directly into the route module 18 (e.g. if the supplied map images are accompanied by latitude/longitude location meta-data, thereby enabling the route module to locate the system with respect to the supplied map image using, for example, GPS location data derived from the GPS antenna).
Alternatively, where a data connection between the navigation system 10 and the map image data store is temporary, map images covering a preselected area of the Earth may be downloaded to a data store (labelled "115" in Figure 4) within or associated with the navigation system.
If map images supplied to the image data store 115 comprise latitude/longitude location meta-data then they may be supplied directly to the route module 18.
If map images that are either supplied to the image data store 115 or are served directly for use in the route module 18 do not comprise latitude/longitude location meta-data then such map images may be routed through a simultaneous localisation and mapping (SLAM) module 120.
SLAM is a known technique used by autonomous vehicles to build a map within an unknown environment. In the present invention the SLAM module 120 operates to estimate the navigation system's position within the supplied map image 114 using available sensor information. In the example where the navigation system 10 is within or associated with a vehicle, the SLAM module may take advantage of existing vehicle sensors in order to estimate the vehicle's position within the supplied map image. Sensor data may be derived from on-board cameras and radar units.
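The predict/correct structure that such a module might follow is illustrated, in heavily simplified form, below. This is only a toy sketch under assumed inputs (wheel speed, gyro yaw rate and matched image features); a real SLAM implementation also builds and refines the landmark map itself and handles uncertainty properly:

    import math

    class SimplePoseEstimator:
        """Toy predict/correct loop for keeping a pose estimate within a map image."""

        def __init__(self, x=0.0, y=0.0, heading=0.0):
            self.x, self.y, self.heading = x, y, heading

        def predict(self, speed_mps, yaw_rate_rps, dt):
            # Dead-reckoning from wheel speed and gyroscope yaw rate.
            self.heading += yaw_rate_rps * dt
            self.x += speed_mps * dt * math.cos(self.heading)
            self.y += speed_mps * dt * math.sin(self.heading)

        def correct(self, observed_xy, expected_xy, gain=0.2):
            # Nudge the pose towards agreement between where a recognised feature
            # was observed and where the map image says it should be.
            self.x += gain * (expected_xy[0] - observed_xy[0])
            self.y += gain * (expected_xy[1] - observed_xy[1])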
Map images that either have been supplied with location meta-data or have had such data estimated using the SLAM module can then be used by the route module 18.
Planning the directions (trajectory) that the navigation system 10 needs to take once the route has been set by the user is handled by the trajectory guidance system 20, the output of which may be supplied on one or more of the display screen 12, a head-up display 14, a steering control system 60 or a projection system 16.
The method according to the present invention is now described in relation to Figures 5a and 5b.
In Figure 5a, a user of the navigation system 10 initially requests, 200, a map image (i.e. a photographic image) in order to plan a route. The request may be made via, for example, a touch-screen input on the display device 12.
In response to the user command the navigation system 10 locates itself, 202, on the map image. The map image may be supplied from an image source, e.g. Google or other mapping services, that incorporates location metadata. In that case the navigation system may conveniently look up the map image coordinates that correspond to the GPS fix that it has itself determined.
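A sketch of that look-up, assuming the imagery is north-up and carries a six-value affine geotransform in its metadata (the layout mirrors the convention used by common geospatial libraries such as GDAL; the function name is an assumption):

    def gps_to_pixel(lat, lon, geotransform):
        """Map a GPS fix to (row, col) pixel coordinates in a north-up,
        geo-referenced image. `geotransform` is
        (origin_lon, lon_per_px, 0, origin_lat, 0, lat_per_px)."""
        origin_lon, lon_per_px, _, origin_lat, _, lat_per_px = geotransform
        col = (lon - origin_lon) / lon_per_px
        row = (lat - origin_lat) / lat_per_px   # lat_per_px is negative for north-up images
        return int(round(row)), int(round(col))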
In the event that the map image has been supplied from an image source that does not include location metadata then the map image may be passed to a position estimator module which then estimates the location based on data received from a number of sensors.
The position estimator module may comprise a SLAM module.
Once the position estimator module has estimated the navigation system's location this is supplied along with the map image to the routing module within the system and is then displayed, 204, on the display screen. In the event that the route is obscured, e.g. because of snow on the ground or flooding, then the navigation system 10 may additionally determine the boundary 22 of the route from the map images 114 and display these to the user either via the HUD 14 or via the projection system 16.
It is noted that the location estimation process may use six degrees of freedom state estimation using any one or more of the following: GPS/INS (Inertial navigation system)/wheel speed sensors; mobile communication signals; visual data from image sensors; radar data for feature localisation.
The user may then adjust the size of the map image being displayed by a known "pinch and zoom" method or by double tapping the touchscreen of the display screen (see steps "Zooming required?" 206, "user requirements" 208 where the user interacts with the image, "map adjustment" 210 where the routing module/navigation system alters the display of the map image and "Map/image displayed" 212 where the user-demanded map image is displayed on the display screen).
The user may then be prompted, 214, to input route information onto the map image. This may be accomplished, 216, by the user placing start/end icons on the map image and then tracing a route to be followed, e.g. by sliding their finger over map image as displayed on the display screen 12.
As the user interacts with the touchscreen the x, y positions of the user selected track will be stored, 218, such that the completed route can then be displayed.
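One way the stored screen positions might be turned into geographic waypoints, reusing the geotransform convention from the earlier sketch (the pan/zoom model and parameter names are assumptions for illustration):

    def trace_to_waypoints(trace_px, view_origin_px, zoom_scale, geotransform):
        """Convert a finger trace of screen (x, y) pixels into (lat, lon) waypoints.
        `view_origin_px` is the image pixel shown at the top-left of the screen and
        `zoom_scale` is screen pixels per image pixel."""
        origin_lon, lon_per_px, _, origin_lat, _, lat_per_px = geotransform
        waypoints = []
        for sx, sy in trace_px:
            # Undo the current pan/zoom to recover image pixel coordinates.
            col = view_origin_px[0] + sx / zoom_scale
            row = view_origin_px[1] + sy / zoom_scale
            waypoints.append((origin_lat + row * lat_per_px,
                              origin_lon + col * lon_per_px))
        return waypoints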
Optionally the navigation system 10 may interact with a terrain information system 220 (this could be a standalone system or an integrated module within the navigation system). The terrain information system 220 may be able to provide, 222, terrain details relevant to the user selected route. For example, elevation (z) data may be incorporated into the selected route and the user warned of potentially difficult terrain within the selected route track.
Route quality may be indicated by red, yellow and green colours. High risk routes may be shaded red and any private property or international/state boundaries may also be indicated.
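One possible basis for that shading is the gradient of each leg of the traced route, computed from the elevation data returned by the terrain information system. The sketch below uses invented grade thresholds and a crude planar distance approximation:

    import math

    def classify_route_risk(waypoints, steep_grade=0.25, moderate_grade=0.12):
        """Colour each leg of the route red/yellow/green by its gradient.
        Each waypoint is (lat, lon, elevation_m)."""
        colours = []
        for (lat1, lon1, z1), (lat2, lon2, z2) in zip(waypoints, waypoints[1:]):
            # Roughly 111 km per degree of latitude; longitude scaled by cos(latitude).
            dx = (lon2 - lon1) * 111_000 * math.cos(math.radians(lat1))
            dy = (lat2 - lat1) * 111_000
            run = max(math.hypot(dx, dy), 1.0)   # avoid division by zero
            grade = abs(z2 - z1) / run
            if grade > steep_grade:
                colours.append("red")
            elif grade > moderate_grade:
                colours.append("yellow")
            else:
                colours.append("green")
        return colours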
Once the complete route has been displayed on the display screen (and any available terrain information incorporated) the user may be prompted to confirm, 224, the route selection (see Figure 5b).
In the event the user rejects the route then the system may prompt the user, in 228, to select an alternate route and the system may cycle round to the start of the route selection process.
In the event that the user accepts the route then the system may output the route to the trajectory guidance module 20 to generate, in 230, routing directions for the user. These may be output to the display device 12, to a head-up display 14 (e.g. in a vehicle or via wearable technology such as Google® Glasses), a projection system 16, and may be output to an autonomous steering controller system 60.
It is noted that where the position estimator module, 120, has estimated the location of the navigation system 10 within the map image 114 then the route developed between the user and the route module 18 will be an initial route (and the directions generated by the trajectory guidance module will be initial directions). As the navigation system moves along the selected route however the location data relative to the initially supplied map image will be improved by further data received from the sensors and the route will become more accurate as it is traversed.
The route that is traversed may be recorded by the navigation system and then uploaded to a central server so that other users may take advantage of the route.
Within the present application the term "map image" is taken to mean photographic imagery of a region of the Earth, e.g. an aerial or satellite photograph.
The invention extends to the features described in the following numbered paragraphs:
1. A navigation system for providing navigation directions to a user comprising: an input for receiving user commands to define a navigation route; a processor arranged to determine navigation directions relating to the user selected route; an output arranged to output navigation directions to the user wherein the processor is arranged to determine the boundary of the navigation route and the output is arranged to output an indication signal to mark the boundaries of the navigation route.
2. A navigation system as claimed in paragraph 1, wherein the indication signal is arranged to be sent to a projection system for projecting lane markers on the ground.
3. A navigation system as claimed in paragraph 1, wherein the indication signal is arranged to be sent to a head-up display to indicate the boundaries of the navigation route to the user.
4. A navigation system as claimed in paragraph 1, wherein the indication signal comprises one or more of the following route features: bends, corners, junctions, width restrictions.
5. A navigation system as claimed in paragraph 1, wherein the processor determines the width of the navigation route by analysing aerial or satellite imagery.
6. A navigation system as claimed in paragraph 1, wherein the processor determines the width of the navigation route by looking up boundary information in a look up table.
7. A navigation system as claimed in paragraph 1, wherein the processor is arranged to determine whether the boundaries of the navigation route are obscured.
8. A navigation system as claimed in paragraph 7, wherein the processor is arranged to analyse aerial or satellite imagery to determine whether the route is obscured.
9. A navigation system as claimed in paragraph 7, wherein the processor is arranged to determine whether the route has been obscured based on a driving mode of a traction control system.
10. A navigation system as claimed in paragraph 7, wherein the processor is arranged to determine whether the route has been obscured based on an external communication.
11. A navigation arrangement comprising a navigation system as claimed in paragraph 1 and a route boundary display for displaying the boundary of the navigation route.
12. A navigation arrangement as claimed in paragraph 11, further comprising user control means for allowing a user to switch the route boundary display ON or OFF.
13. A vehicle comprising a navigation system according to paragraph 1 or a navigation arrangement according to paragraph 11.
14. A vehicle as claimed in paragraph 13, wherein the navigation system or navigation arrangement is integrated into the vehicle.
15. A vehicle as claimed in paragraph 13, wherein the navigation system comprises a mobile communications device in communication with the vehicle.
16. A method for providing navigation directions to a user comprising: receiving user commands to define a navigation route; determining navigation directions relating to the user selected route; outputting navigation directions to the user wherein determining navigation directions comprises determining the boundary of the navigation route and outputting navigation directions comprises outputting an indication signal to mark the boundaries of the navigation route.
17. A non-transitory computer readable medium storing a program for controlling a computing device to carry out the method of paragraph 16.

Claims (17)

CLAIMS
  1. A navigation system for providing navigation directions to a user comprising: an input for receiving user commands to define a navigation route; processing means arranged to determine navigation directions relating to the user selected route; and an output arranged to output navigation directions to the user; wherein the processing means is arranged to determine the boundary of the navigation route and the output is arranged to output an indication signal to mark the boundaries of the navigation route.
  2. A navigation system as claimed in Claim 1, wherein the indication signal is arranged to be sent to a projection system for projecting lane markers on the ground.
  3. A navigation system as claimed in Claim 1 or Claim 2, wherein the indication signal is arranged to be sent to a head-up display to indicate the boundaries of the navigation route to the user.
  4. A navigation system as claimed in any preceding claim, wherein the indication signal comprises one or more of the following route features: bends, corners, junctions, width restrictions.
  5. A navigation system as claimed in any preceding claim, wherein the processing means determines the width of the navigation route by analysing aerial or satellite imagery.
  6. A navigation system as claimed in any preceding claim, wherein the processing means determines the width of the navigation route by looking up boundary information in a look up table.
  7. A navigation system as claimed in any preceding claim, wherein the processing means is arranged to determine whether the boundaries of the navigation route are obscured.
  8. A navigation system as claimed in Claim 7, wherein the processing means is arranged to analyse aerial or satellite imagery to determine whether the route is obscured.
  9. A navigation system as claimed in Claim 7 or Claim 8, wherein the processing means is arranged to determine whether the route has been obscured based on a driving mode of a traction control system.
  10. A navigation system as claimed in any one of Claims 7 to 9, wherein the processing means is arranged to determine whether the route has been obscured based on an external communication.
  11. A navigation arrangement comprising a navigation system as claimed in any preceding claim and a route boundary display means for displaying the boundary of the navigation route.
  12. A navigation arrangement as claimed in Claim 11, further comprising user control means for allowing a user to switch the route boundary display ON or OFF.
  13. A vehicle comprising a navigation system according to any one of Claims 1 to 10 or a navigation arrangement according to Claim 11 or 12.
  14. A vehicle as claimed in Claim 13, wherein the navigation system or navigation arrangement is integrated into the vehicle.
  15. A vehicle as claimed in Claim 13, wherein the navigation system comprises a mobile communications device in communication with the vehicle.
  16. A method for providing navigation directions to a user comprising: determining navigation directions relating to a user selected route; and outputting navigation directions to the user; wherein determining navigation directions comprises determining the boundary of the navigation route and outputting navigation directions comprises outputting an indication signal to mark the boundaries of the navigation route.
  17. A computer program product comprising computer readable code for controlling a computing device to carry out the method of Claim 16.
GB1405304.5A 2014-03-25 2014-03-25 Navigation system Active GB2524514B (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
GB1405304.5A GB2524514B (en) 2014-03-25 2014-03-25 Navigation system
PCT/EP2015/056369 WO2015144751A1 (en) 2014-03-25 2015-03-25 Navigation system
EP15712130.2A EP3123115A1 (en) 2014-03-25 2015-03-25 Navigation system
AU2015238339A AU2015238339B2 (en) 2014-03-25 2015-03-25 Navigation system
US15/128,777 US10408634B2 (en) 2014-03-25 2015-03-25 Navigation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1405304.5A GB2524514B (en) 2014-03-25 2014-03-25 Navigation system

Publications (3)

Publication Number Publication Date
GB201405304D0 GB201405304D0 (en) 2014-05-07
GB2524514A true GB2524514A (en) 2015-09-30
GB2524514B GB2524514B (en) 2017-08-16

Family

ID=50686857

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1405304.5A Active GB2524514B (en) 2014-03-25 2014-03-25 Navigation system

Country Status (1)

Country Link
GB (1) GB2524514B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3058106A1 (en) * 2016-10-28 2018-05-04 Valeo Vision ADAPTING THE LIGHTING OF A VEHICLE TO MATERIALIZE A MARKING
DE102017213104A1 (en) * 2017-07-28 2019-01-31 Continental Automotive Gmbh Method and device for operating a motor vehicle

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6977630B1 (en) * 2000-07-18 2005-12-20 University Of Minnesota Mobility assist device
US20070013495A1 * 2005-06-15 2007-01-18 Denso Corporation Vehicle drive assist system
WO2007039877A2 (en) * 2005-10-05 2007-04-12 Politecnico Di Torino System for assisting in vehicle driving in presence of fog and/or on irregular roads
EP1916153A2 (en) * 2006-10-26 2008-04-30 Bayerische Motoren Werke Aktiengesellschaft Method for displaying information
US20100253540A1 (en) * 2009-04-02 2010-10-07 Gm Global Technology Operations, Inc. Enhanced road vision on full windshield head-up display
US20120249589A1 (en) * 2011-03-29 2012-10-04 Bayerische Motoren Werke Aktiengesellschaft Method for the Output of Graphic Driving Indications
WO2013113500A1 (en) * 2012-02-02 2013-08-08 Audi Ag Driver assistance system and method for virtual representation of a road layout under obscured visibility and/or poor visibility conditions
DE202013006071U1 (en) * 2013-07-05 2013-09-12 Stephan Kaut Projected light grids from vehicles
WO2014095067A1 (en) * 2012-12-21 2014-06-26 Harman Becker Automotive Systems Gmbh A system for a vehicle

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8781730B2 (en) * 2011-04-11 2014-07-15 Garmin Switzerland Gmbh Route selection employing metrics


Also Published As

Publication number Publication date
GB2524514B (en) 2017-08-16
GB201405304D0 (en) 2014-05-07

Similar Documents

Publication Publication Date Title
US10408634B2 (en) Navigation system
JP7045628B2 (en) Vehicle equipment, vehicles, and computer programs for controlling vehicle behavior
JP7125214B2 (en) Programs and computing devices
CN105765345B (en) Apparatus and method for displaying navigation instructions
US11092456B2 (en) Object location indicator system and method
US8315796B2 (en) Navigation device
CN108891417A (en) For operating the method and data processing system of automatic driving vehicle
EP3573013A1 (en) Method and device for constructing traffic route on basis of longitude/latitude lines and performing map search
US20220332349A1 (en) Navigation based on partially occluded pedestrians
CN113631885A (en) Navigation method and device
JP2009500765A (en) Method for determining traffic information and apparatus configured to perform the method
US9086293B2 (en) Navigation device and control method for displaying detour
US20220228881A1 (en) Method and system for controlling head up display
EP3339808B1 (en) Positioning objects in an augmented reality display
CN110081899A (en) A kind of high-precision electronic map road conditions display methods and device
KR20140000488A (en) Method for guiding turn point of crossroad and navigation system
GB2524514A (en) Navigation system
KR101131871B1 (en) Navigation apparatus and output method of highway rest area reaching guide information thereof
GB2524513A (en) Navigation system
KR20080019690A (en) Navigation device with camera-info
KR102610411B1 (en) Apparatus for route guidance
JP2023106427A (en) Map display device, control method therefor, program, and storage medium
JP2023075913A (en) Vehicle and control method therefore
CN118089758A (en) Collaborative awareness method, apparatus, device, storage medium, and program product