WO2009132681A1 - Navigation device and method of operational preference selection therefor - Google Patents

Publication number: WO2009132681A1
Authority: WO
Grant status: Application
Application number: PCT/EP2008/003711
Other languages: French (fr)
Inventors: Jeroen Trum, Breght Boschker
Original Assignee: TomTom International B.V.
Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in preceding groups
    • G01C21/26 - Navigation; Navigational instruments not provided for in preceding groups specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/3453 - Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C21/3484 - Personalized, e.g. from learned user behaviour or user-defined profiles
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in preceding groups
    • G01C21/26 - Navigation; Navigational instruments not provided for in preceding groups specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/36 - Input/output arrangements of navigation systems
    • G01C21/3605 - Destination input or retrieval
    • G01C21/3608 - Destination input or retrieval using speech input, e.g. using speech recognition

Abstract

A navigation apparatus (200) comprises a processing resource (210) operably coupled to a data store (230). An operational environment is also provided that is supported, when in use, by the processing resource (210) and arranged to store operational preference profiles and respective associated user identifier data in the data store (230). The processing resource (210) is also arranged to determine contactlessly (550, 552, 554, 556) an identity of a user. The identity of the user determined is used to retrieve an operational preference profile from the operational preference profiles in the data store (230).

Description

NAVIGATION DEVICE AND METHOD OF OPERATIONAL PREFERENCE SELECTION THEREFOR

Field of the Invention

The present invention relates to a navigation device of the type that, for example, is capable of configuration on a per-user basis. The present invention also relates to a method of operational preference selection for a navigation device of the type that, for example, is capable of configuration on a per-user basis.

Background to the Invention

Portable computing devices, for example Portable Navigation Devices (PNDs) that include GPS (Global Positioning System) signal reception and processing functionality are well known and are widely employed as in-car or other vehicle navigation systems. In general terms, a modern PND comprises a processor, memory (at least one of volatile and non-volatile, and commonly both), and map data stored within said memory. The processor and memory cooperate to provide an execution environment in which a software operating system may be established, and additionally it is commonplace for one or more additional software programs to be provided to enable the functionality of the PND to be controlled, and to provide various other functions.

Typically these devices further comprise one or more input interfaces that allow a user to interact with and control the device, and one or more output interfaces by means of which information may be relayed to the user. Illustrative examples of output interfaces include a visual display and a speaker for audible output; illustrative examples of input interfaces include one or more physical buttons to control on/off operation or other features of the device (which buttons need not necessarily be on the device itself but could be on a steering wheel if the device is built into a vehicle), and a microphone for detecting user speech. In one particular arrangement, the output interface display may be configured as a touch sensitive display (by means of a touch sensitive overlay or otherwise) additionally to provide an input interface by means of which a user can operate the device by touch.

Devices of this type will also often include one or more physical connector interfaces by means of which power and optionally data signals can be transmitted to and received from the device, and optionally one or more wireless transmitters/receivers to allow communication over cellular telecommunications and other signal and data networks, for example Bluetooth, Wi-Fi, WiMAX, GSM, UMTS and the like. PNDs of this type also include a GPS antenna by means of which satellite-broadcast signals, including location data, can be received and subsequently processed to determine a current location of the device.

The PND may also include electronic gyroscopes and accelerometers which produce signals that can be processed to determine the current angular and linear acceleration, and in turn, and in conjunction with location information derived from the GPS signal, velocity and relative displacement of the device and thus the vehicle in which it is mounted. Such features are most commonly provided in in-vehicle navigation systems, but may also be provided in PNDs if it is expedient to do so.

The utility of such PNDs is manifested primarily in their ability to determine a route between a first location (typically a start or current location) and a second location (typically a destination). These locations can be input by a user of the device, by any of a wide variety of different methods, for example by postcode, street name and house number, previously stored "well known" destinations (such as famous locations, municipal locations (such as sports grounds or swimming baths) or other points of interest), and favourite or recently visited destinations.

Typically, the PND is enabled by software for computing a "best" or "optimum" route between the start and destination address locations from the map data. A "best" or "optimum" route is determined on the basis of predetermined criteria and need not necessarily be the fastest or shortest route. The selection of the route along which to guide the driver can be very sophisticated, and the selected route may take into account existing, predicted and dynamically and/or wirelessly received traffic and road information, historical information about road speeds, and the driver's own preferences for the factors determining road choice (for example the driver may specify that the route should not include motorways or toll roads).
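A route computation of the kind described above can be sketched as a shortest-path search over a road graph whose edge weights are inflated by per-user penalties, so that the "best" route need not be the fastest or shortest. The graph, travel times, tags and preference keys below are invented for the illustration and are not taken from the disclosure.

```python
import heapq

# Toy road graph: each edge is (neighbour, travel time in minutes, road tags).
# All names, times and tags are invented for this sketch.
ROADS = {
    "A": [("B", 10, {"motorway"}), ("C", 15, set())],
    "B": [("D", 5, {"toll"})],
    "C": [("D", 12, set())],
    "D": [],
}

def edge_cost(time, tags, prefs):
    """Weigh an edge by travel time plus per-user penalties."""
    cost = time
    if "motorway" in tags and prefs.get("avoid_motorways"):
        cost += 1000  # heavy penalty rather than a hard exclusion
    if "toll" in tags and prefs.get("avoid_tolls"):
        cost += 1000
    return cost

def best_route(start, goal, prefs):
    """Dijkstra's algorithm over the preference-weighted graph."""
    queue = [(0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for nxt, time, tags in ROADS[node]:
            heapq.heappush(queue, (cost + edge_cost(time, tags, prefs), nxt, path + [nxt]))
    return None

print(best_route("A", "D", {}))                     # fastest route, via B
print(best_route("A", "D", {"avoid_tolls": True}))  # detours via C instead
```

A production device would additionally fold dynamically received traffic data and historical road speeds into the edge weights, but the principle of criteria-dependent cost remains the same.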

In addition, the device may continually monitor road and traffic conditions, and offer to or choose to change the route over which the remainder of the journey is to be made due to changed conditions. Real time traffic monitoring systems, based on various technologies (e.g. mobile phone data exchanges, fixed cameras, GPS fleet tracking) are being used to identify traffic delays and to feed the information into notification systems.

PNDs of this type may typically be mounted on the dashboard or windscreen of a vehicle, but may also be formed as part of an on-board computer of the vehicle radio or indeed as part of the control system of the vehicle itself. The navigation device may also be part of a hand-held system, such as a PDA (Portable Digital Assistant), a media player, a mobile phone or the like, and in these cases, the normal functionality of the hand-held system is extended by means of the installation of software on the device to perform both route calculation and navigation along a calculated route.

Route planning and navigation functionality may also be provided by a desktop or mobile computing resource running appropriate software. For example, the Royal Automobile Club (RAC) provides an on-line route planning and navigation facility at http://www.rac.co.uk, which facility allows a user to enter a start point and a destination whereupon the server with which the user's computing resource is communicating calculates a route (aspects of which may be user specified), generates a map, and generates a set of exhaustive navigation instructions for guiding the user from the selected start point to the selected destination. The facility also provides for pseudo three-dimensional rendering of a calculated route, and route preview functionality which simulates a user travelling along the route and thereby provides the user with a preview of the calculated route.

In the context of a PND, once a route has been calculated, the user interacts with the navigation device to select the desired calculated route, optionally from a list of proposed routes. Optionally, the user may intervene in, or guide the route selection process, for example by specifying that certain routes, roads, locations or criteria are to be avoided or are mandatory for a particular journey. The route calculation aspect of the PND forms one primary function, and navigation along such a route is another primary function. During navigation along a calculated route, it is usual for such PNDs to provide visual and/or audible instructions to guide the user along a chosen route to the end of that route, i.e. the desired destination. It is also usual for PNDs to display map information on-screen during the navigation, such information regularly being updated on-screen so that the map information displayed is representative of the current location of the device, and thus of the user or user's vehicle if the device is being used for in-vehicle navigation.

An icon displayed on-screen typically denotes the current device location, and is centred, with the map information of current and surrounding roads in the vicinity of the current device location and other map features also being displayed. Additionally, navigation information may be displayed, optionally in a status bar above, below or to one side of the displayed map information; examples of navigation information include a distance to the next deviation from the current road required to be taken by the user, the nature of that deviation possibly being represented by a further icon suggestive of the particular type of deviation, for example a left or right turn. The navigation function also determines the content, duration and timing of audible instructions by means of which the user can be guided along the route. As can be appreciated, a simple instruction such as "turn left in 100 m" requires significant processing and analysis. As previously mentioned, user interaction with the device may be by a touch screen, or additionally or alternately by steering column mounted remote control, by voice activation or by any other suitable method. A further important function provided by the device is automatic route recalculation in the event that: a user deviates from the previously calculated route during navigation (either by accident or intentionally); real-time traffic conditions dictate that an alternative route would be more expedient and the device is suitably enabled to recognize such conditions automatically; or a user actively causes the device to perform route re-calculation for any reason.

It is also known to allow a route to be calculated with user defined criteria; for example, the user may prefer a scenic route to be calculated by the device, or may wish to avoid any roads on which traffic congestion is likely, expected or currently prevailing. The device software would then calculate various routes and weigh more favourably those that include along their route the highest number of points of interest (known as POIs) tagged as being for example of scenic beauty, or, using stored information indicative of prevailing traffic conditions on particular roads, order the calculated routes in terms of a level of likely congestion or delay on account thereof. Other POI-based and traffic information-based route calculation and navigation criteria are also possible. Although the route calculation and navigation functions are fundamental to the overall utility of PNDs, it is possible to use the device purely for information display, or "free-driving", in which only map information relevant to the current device location is displayed, and in which no route has been calculated and no navigation is currently being performed by the device. Such a mode of operation is often applicable when the user already knows the route along which it is desired to travel and does not require navigation assistance.
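The POI-based weighting described above amounts to scoring each candidate route by the number of suitably tagged points of interest along it. A minimal sketch follows; the route names, POI identifiers and "scenic" tags are invented for the illustration.

```python
# Hypothetical candidate routes, each carrying the POIs found along it,
# and a POI table mapping identifiers to tags. All data is invented.
ROUTES = {
    "coastal": {"poi_ids": [1, 2, 3, 5]},
    "bypass":  {"poi_ids": [4]},
}
POIS = {1: {"scenic"}, 2: {"scenic"}, 3: {"fuel"}, 4: {"scenic"}, 5: {"scenic"}}

def scenic_score(route):
    """Count the POIs along the route tagged as being of scenic beauty."""
    return sum(1 for pid in route["poi_ids"] if "scenic" in POIS[pid])

# Weigh more favourably the routes with the highest scenic POI count.
ranked = sorted(ROUTES, key=lambda name: scenic_score(ROUTES[name]), reverse=True)
print(ranked)  # ['coastal', 'bypass']
```

The same pattern applies to traffic-based ordering: replace the POI count with a congestion estimate per road segment and sort ascending instead.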

Devices of the type described above, for example the 720T model manufactured and supplied by TomTom International B.V., provide a reliable means for enabling users to navigate from one position to another. Such devices are of great utility when the user is not familiar with the route to the destination to which they are navigating. Before embarking on a journey, a user typically configures settings of the PND to best suit the operational preferences of the user in respect of the PND. However, it is not uncommon for a period during which the PND is operated by one user to be succeeded by a period of use by a different user. Whilst the ability to change settings to improve convenience of use of the PND is advantageous, re-configuration of the PND to suit the different user is a time-consuming activity. The absence of a shortcut to select pre-defined settings associated with the user of the PND can require a level of preparation prior to use of the PND that is unacceptable to some users. Additionally, even with the provision of the shortcut to select a pre-defined setting, it is typically necessary to navigate a menu structure in order to reach a part of a user interface that permits selection by the user of the pre-defined setting.

In the field of computing devices, it is known for a computing device, for example a Personal Computer (PC), to have respective different settings to accommodate different users and it is also known to provide a fingerprint reader in order to capture identifying information concerning a prospective user. The identifying information, in this example a fingerprint, is used to query a database of user data in order to obtain log-in information associated with the prospective user. The log-in information is then provided to an existing log-in interface of, for example, an operating system supported by the PC in order to provide the operating system with the log-in information obtained from the database, such as a username and a password.

Whilst use of the fingerprint reader speeds access to a working environment provided by the PC and reserved for the prospective user, the fingerprint reader is not always able to capture the fingerprint reliably upon a single "swipe" of a finger against a scanning window of the fingerprint reader. Consequently, the prospective user sometimes has to attempt recognition of the fingerprint a number of times before successful recognition and hence access is granted to use the PC.

In the automotive industry, a simpler approach has been taken and driver preference settings, for example seat position, steering wheel and pedal settings, are selectable by depression of a button provided in the cockpit of a vehicle, for example on a steering wheel. Sometimes more than one button needs to be depressed in a short sequence. However, this approach also requires a conscious act by the user in order to ensure selection of a setting associated with the user.

In the context of a PND, equipping the PND with a fingerprint reader, in an analogous manner to that described above in relation to PCs, is not only a manufacturing expense but also prone to unreliability, for example an inability to provide a readable fingerprint when required before departure due to sweat or temporary deformation of the finger pads. The consequent need to make several attempts to obtain recognition of the fingerprint is time-consuming, delays departure, frustrates the user and increases driver workload. Indeed, the mere act of interacting with the fingerprint reader constitutes additional user workload. The provision of dedicated buttons for selection of the pre-defined settings associated with different users, whilst more reliable, can serve to make the housing of the PND less attractive. Again, the provision of the dedicated button results in a need for action from the user, resulting in increased user workload prior to commencement of a journey. Indeed, in a vehicle, where other driver selections may need to be made prior to commencement of the journey, for example driving position-related settings, the need also to configure the PND constitutes an additional inconvenience and hindrance to commencement of the journey. Additionally, the provision of the dedicated button frustrates the objective of providing a PND that operates almost exclusively through a touch-screen "soft" interface. In this respect, providing access to selection of the pre-defined settings at a higher level of the menu structure serves either to overcrowd the higher level of the menu structure, and hence provide a less user-friendly menu structure, or necessitates sacrifice of a menu option at the higher level in order to accommodate the ability to select pre-defined settings at the higher level. The sacrificed menu option then has to be provided at a lower level of the menu structure and therefore requires more finger navigation of the menu structure in order to reach it. Clearly, this can be inconvenient and hence undesirable if the sacrificed menu option is of some importance and/or frequently used.

Summary of the Invention

According to a first aspect of the present invention, there is provided a navigation apparatus comprising: a processing resource operably coupled to a data store; an operational environment supported, when in use, by the processing resource and arranged to store operational preference profiles and respective associated user identifier data in the data store; wherein the processing resource is arranged to determine contactlessly an identity of a user and use the identity of the user to retrieve an operational preference profile from the operational preference profiles in the data store.

The processing resource may be arranged to implement an operational preference of the operational preference profile in response to identification of the user.

The identity of the user may be determined using a recognition algorithm. The identity of the user may be determined using candidate physical characteristic data derived from the user. The candidate physical characteristic data may be candidate biometric data. The data store may comprise physical characteristic data respectively identifying users. The physical characteristic data may be biometric data.

The data store may maintain respective associations between the physical characteristic data and the user identifier data.

The processing resource may be arranged to sample information in relation to the user via an input device and generate the candidate physical characteristic data from the information sampled. The information sampled may be voice data.

The input device may be an audio input device. The audio input device may be a microphone. Additionally or alternatively, the information sampled may be image data. The image data may be facial image data.

The input device may be an optical input device. The optical input device may be a camera, for example a digital camera.

The image data may be time-varying image data; the candidate physical characteristic data may relate to a change of the image data with time. The apparatus may further comprise an operational preference profile database that may comprise the operational preference profiles and respective associated user identifier data. The apparatus may further comprise a physical characteristics database that may comprise the user identifier data and respectively associated physical characteristic data. The processing resource may be arranged to generate, at least periodically, the candidate physical characteristic data in order to determine the identity of the user.
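One plausible reading of the two databases just described can be sketched as follows: candidate physical characteristic data (represented here by an invented numeric feature vector) is matched against stored characteristic data to recover a user identifier, which in turn keys the operational preference profile database. The feature vectors, distance threshold and profile fields are assumptions made purely for illustration; the disclosure does not prescribe a particular matching technique.

```python
import math

# Illustrative stand-ins for the two databases described in the text.
CHARACTERISTICS_DB = {           # user identifier -> physical characteristic data
    "alice": [0.9, 0.1, 0.3],
    "bob":   [0.2, 0.8, 0.5],
}
PROFILES_DB = {                  # user identifier -> operational preference profile
    "alice": {"voice": "female-uk", "avoid_motorways": True},
    "bob":   {"voice": "male-nl", "volume": 7},
}

def identify(candidate, threshold=0.5):
    """Return the user whose stored characteristic data lies nearest the
    candidate data, or None if no stored entry is close enough."""
    best_user, best_dist = None, threshold
    for user, stored in CHARACTERISTICS_DB.items():
        dist = math.dist(candidate, stored)
        if dist < best_dist:
            best_user, best_dist = user, dist
    return best_user

def load_profile(candidate):
    """Contactless selection: identify the user, then retrieve the profile."""
    user = identify(candidate)
    return PROFILES_DB.get(user) if user else None

print(load_profile([0.85, 0.15, 0.25]))  # alice's profile
print(load_profile([0.0, 0.0, 0.0]))     # None: unrecognised user
```

In a real device the candidate vector would be derived from sampled voice or facial image data rather than supplied directly, but the database lookup chain is the same.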

The processing resource may be arranged to learn the identity of the user from use of a core operational feature supported by the operational environment. The use of the core operational feature may be a voice recognition facility for controlling the navigation device.
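One simple way such learning might work (an assumption for illustration; the disclosure does not specify an algorithm) is to fold the feature vector of each accepted voice command into a running average for the active user, so that a voice signature accumulates from ordinary use of the voice recognition facility:

```python
# Minimal sketch of learning a voice signature from normal use of the
# voice-command feature. Feature vectors are invented for the example.
VOICE_SIGNATURES = {}  # user identifier -> (mean feature vector, sample count)

def learn_from_command(user, features):
    """Update the user's running-mean voice signature with one utterance."""
    mean, n = VOICE_SIGNATURES.get(user, ([0.0] * len(features), 0))
    n += 1
    mean = [m + (f - m) / n for m, f in zip(mean, features)]
    VOICE_SIGNATURES[user] = (mean, n)

learn_from_command("alice", [0.5, 0.25])
learn_from_command("alice", [1.0, 0.75])
print(VOICE_SIGNATURES["alice"])  # ([0.75, 0.5], 2)
```

The accumulated signature could then serve as the stored physical characteristic data against which later candidate data is matched, without any explicit enrolment step by the user.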

The operational preference may relate to an apparatus external to the navigation apparatus and controllable via the operational environment.

According to a second aspect of the present invention, there is provided a method of operational preference selection, the method comprising: providing an operational environment; storing operational preference profiles and respective associated user identifier data; determining contactlessly an identity of a user; using the identity of the user to retrieve an operational preference profile from the stored operational preference profiles.

According to a third aspect of the present invention, there is provided a computer program element comprising computer program code means to make a computer execute the method as set forth in accordance with the second aspect of the invention.

The computer program element may be embodied on a computer readable medium.

It is thus possible to provide a navigation apparatus and a method of operational preference selection that makes operation of the navigation apparatus more convenient for a user. The need to negotiate a menu structure is obviated or at least mitigated. The apparatus and method also obviate the need to sacrifice menu options in the menu structure by pushing such menu options further down the structure in order to accommodate a menu option to permit users to select operational preference profiles. Additionally, it is possible to recall and set operational preferences associated with apparatus and devices capable of being controlled by the navigation apparatus.

Other advantages of these embodiments are set out hereafter, and further details and features of each of these embodiments are defined in the accompanying dependent claims and elsewhere in the following detailed description.

Brief Description of the Drawings

At least one embodiment of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:

Figure 1 is a schematic illustration of an exemplary part of a Global Positioning System (GPS) usable by a navigation device;

Figure 2 is a schematic illustration of electronic components arranged to provide a navigation device;

Figure 3 is a schematic illustration of the manner in which a navigation device may receive information over a wireless communication channel;

Figures 4A and 4B are illustrative perspective views of a navigation device;

Figure 5 is a schematic representation of an architectural stack employed by the navigation device;

Figure 6 is a schematic diagram of an arrangement supported by a processing resource and constituting an embodiment of the invention;

Figure 7 is a flow diagram of a first method of acquiring characteristic data, employing the arrangement of Figure 6;

Figure 8 is a flow diagram of a first method of operational preference selection, employing the arrangement of Figure 6;

Figure 9 is a schematic diagram of another arrangement supported by the processing resource and constituting another embodiment of the invention;

Figure 10 is a flow diagram of a second method of acquiring characteristic data, employing the arrangement of Figure 9; and

Figure 11 is a flow diagram of a second method of operational preference selection, employing the arrangement of Figure 9.

Detailed Description of Preferred Embodiments

Throughout the following description identical reference numerals will be used to identify like parts.

Embodiments of the present invention will now be described with particular reference to a PND. It should be remembered, however, that the teachings of the present invention are not limited to PNDs but are instead universally applicable to any type of processing device that is configured to execute navigation software so as to provide route planning and navigation functionality. It follows therefore that in the context of the present application, a navigation device is intended to include (without limitation) any type of route planning and navigation device, irrespective of whether that device is embodied as a PND, a navigation device built into a vehicle, or indeed a computing resource (such as a desktop or portable personal computer (PC), mobile telephone or portable digital assistant (PDA)) executing route planning and navigation software.

It will also be apparent from the following that the teachings of the present invention even have utility in circumstances where a user is not seeking instructions on how to navigate from one point to another, but merely wishes to be provided with a view of a given location. In such circumstances the "destination" location selected by the user need not have a corresponding start location from which the user wishes to start navigating, and as a consequence references herein to the "destination" location or indeed to a "destination" view should not be interpreted to mean that the generation of a route is essential, that travelling to the "destination" must occur, or indeed that the presence of a destination requires the designation of a corresponding start location.

With the above provisos in mind, the Global Positioning System (GPS) of Figure 1 and the like are used for a variety of purposes. In general, the GPS is a satellite-radio based navigation system capable of determining continuous position, velocity, time, and in some instances direction information for an unlimited number of users. Formerly known as NAVSTAR, the GPS incorporates a plurality of satellites which orbit the earth in extremely precise orbits. Based on these precise orbits, GPS satellites can relay their location to any number of receiving units.

The GPS system is implemented when a device, specially equipped to receive GPS data, begins scanning radio frequencies for GPS satellite signals. Upon receiving a radio signal from a GPS satellite, the device determines the precise location of that satellite via one of a plurality of different conventional methods. The device will continue scanning, in most instances, for signals until it has acquired at least three different satellite signals (noting that position is not normally determined with only two signals, although it can be, using other triangulation techniques). Implementing geometric triangulation, the receiver utilizes the three known positions to determine its own two-dimensional position relative to the satellites. This can be done in a known manner. Additionally, acquiring a fourth satellite signal allows the receiving device to calculate its three-dimensional position by the same geometrical calculation in a known manner. The position and velocity data can be updated in real time on a continuous basis by an unlimited number of users.
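The geometric triangulation step can be illustrated in two dimensions: subtracting the first range equation from the other two linearizes the circle equations into a 2x2 linear system. This is a deliberate simplification; a real GPS receiver solves in three dimensions and additionally estimates its clock bias, which is why a fourth satellite is needed in practice. The reference points and ranges below are invented for the example.

```python
import math

def trilaterate_2d(p1, r1, p2, r2, p3, r3):
    """Solve for a 2-D position from three known points and measured ranges.
    Subtracting the equation of the first circle from the other two yields
    a linear system, solved here by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = (x2**2 - x1**2) + (y2**2 - y1**2) - (r2**2 - r1**2)
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = (x3**2 - x1**2) + (y3**2 - y1**2) - (r3**2 - r1**2)
    det = a1 * b2 - a2 * b1
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# A receiver at (3, 4) with exact ranges to three reference points:
pos = trilaterate_2d((0, 0), 5.0,
                     (10, 0), math.dist((10, 0), (3, 4)),
                     (0, 10), math.dist((0, 10), (3, 4)))
print(pos)  # approximately (3.0, 4.0)
```

With noisy ranges the three circles no longer intersect in a single point, and the position is instead found by least squares over however many satellites are in view.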

As shown in Figure 1, the GPS system is denoted generally by reference numeral 100. A plurality of satellites 120 are in orbit about the earth 124. The orbit of each satellite 120 is not necessarily synchronous with the orbits of other satellites 120 and, in fact, is likely to be asynchronous. A GPS receiver 140 is shown receiving spread spectrum GPS satellite signals 160 from the various satellites 120.

The spread spectrum signals 160, continuously transmitted from each satellite 120, utilize a highly accurate frequency standard accomplished with an extremely accurate atomic clock. Each satellite 120, as part of its data signal transmission 160, transmits a data stream indicative of that particular satellite 120. It is appreciated by those skilled in the relevant art that the GPS receiver device 140 generally acquires spread spectrum GPS satellite signals 160 from at least three satellites 120 for the GPS receiver device 140 to calculate its two-dimensional position by triangulation. Acquisition of an additional signal, resulting in signals 160 from a total of four satellites 120, permits the GPS receiver device 140 to calculate its three-dimensional position in a known manner.

Referring to Figure 2, it should be noted that the block diagram of the navigation device 200 is not inclusive of all components of the navigation device, but is only representative of many example components. The navigation device 200 is located within a housing (not shown). The housing includes a processing resource, for example a processor 210, coupled to an input device 220 and a display device, for example a display screen 240. Although reference is made here to the input device 220 in the singular, the skilled person should appreciate that the input device 220 represents any number of input devices, including a keyboard device, voice input device, touch panel and/or any other known input device utilised to input information. Likewise, the display screen 240 can include any type of display screen such as a Liquid Crystal Display (LCD), for example. For ease of reference, two exemplary input devices are shown separately in Figure 2. In this respect, an audio input device, for example a microphone 226 is operatively coupled to the processor 210. Additionally or alternatively, an optical input device, for example a camera is operatively coupled to the processor 210. The audio input device and/or the optical input device do not require physical contact with a user of the navigation device 200. In one arrangement, one aspect of the input device 220, the touch panel, and the display screen 240 are integrated so as to provide an integrated input and display device, including a touchpad or touchscreen input so that a user need only touch a portion of the display screen 240 to select one of a plurality of display choices or to activate one of a plurality of virtual or "soft" buttons. In this respect, the processor 210 supports a graphical user interface that operates in conjunction with the touchscreen.

The navigation device may include an output device 260, for example an audible output device (e.g. a loudspeaker). As the output device 260 can produce audible information for a user of the navigation device 200, it should equally be understood that the input device 220 can include the microphone 226 and software for receiving input voice commands as well.

In the navigation device 200, the processor 210 is operatively connected to and set to receive input information from the input device 220 via a connection 225, and operatively connected to at least one of the display screen 240 and the output device 260, via respective output connections 245, to output information thereto. Further, the processor 210 is operatively connected to memory 230 via connection 235 and is further adapted to receive/send information from/to input/output (I/O) ports 270 via connection 275, wherein the I/O port 270 is connectible to an I/O device 280 external to the navigation device 200. The external I/O device 280 may include, but is not limited to, an external listening device such as an earpiece, for example. The connection to I/O device 280 can further be a wired or wireless connection to any other external device such as a car stereo unit for hands-free operation and/or for voice activated operation for example, for connection to an earpiece or headphones, and/or for connection to a mobile phone for example, wherein the mobile phone connection may be used to establish a data connection between the navigation device 200 and the internet or any other network for example, and/or to establish a connection to a server via the internet or some other network for example.

Figure 2 further illustrates an operative connection between the processor 210 and an antenna/receiver 250 via connection 255, wherein the antenna/receiver 250 can be a GPS antenna/receiver for example. It will be understood that the antenna and receiver designated by reference numeral 250 are combined schematically for illustration, but that the antenna and receiver may be separately located components, and that the antenna may be a GPS patch antenna or helical antenna for example.

Further, it will be understood by one of ordinary skill in the art that the electronic components shown in Figure 2 are powered by power sources (not shown) in a conventional manner. As will be understood by one of ordinary skill in the art, different configurations of the components shown in Figure 2 are contemplated. For example, the components shown in Figure 2 may be in communication with one another via wired and/or wireless connections and the like. Thus, the navigation device 200 described herein can be a portable or handheld navigation device 200. In addition, the portable or handheld navigation device 200 of Figure 2 can be connected or "docked" in a known manner to a vehicle such as a bicycle, a motorbike, a car or a boat for example. Such a navigation device 200 is then removable from the docked location for portable or handheld navigation use.

Referring now to Figure 3, the navigation device 200 may establish a "mobile" or telecommunications network connection with a server 302 via a mobile device (not shown) (such as a mobile phone, PDA, and/or any device with mobile phone technology) establishing a digital connection (such as a digital connection via known Bluetooth technology for example). Thereafter, through its network service provider, the mobile device can establish a network connection (through the internet for example) with a server 302. As such, a "mobile" network connection is established between the navigation device 200 (which can be, and often times is mobile as it travels alone and/or in a vehicle) and the server 302 to provide a "real-time" or at least very "up to date" gateway for information.

The establishing of the network connection between the mobile device (via a service provider) and another device such as the server 302, using the internet for example, can be done in a known manner. This can include use of the TCP/IP layered protocol for example. The mobile device can utilize any number of communication standards such as CDMA2000, GSM, IEEE 802.11 a/b/g/n, etc.
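By way of illustration only, the TCP/IP layered exchange underlying such a connection can be sketched as follows using a loopback client and server; a real navigation device would instead reach the server 302 over a mobile data bearer, and all names and payloads here are illustrative assumptions, not details of the specification:

```python
import socket
import threading

# Minimal loopback sketch of a device-to-server exchange over TCP/IP.
# serve_once() stands in for the server 302; the "client" stands in for
# the navigation device 200 requesting navigation data.

def serve_once(server_sock):
    conn, _ = server_sock.accept()
    request = conn.recv(1024)                       # receive the device's request
    conn.sendall(b"navigation data for " + request) # reply with (stub) data
    conn.close()

server = socket.socket()
server.bind(("127.0.0.1", 0))   # ephemeral port on loopback
server.listen(1)
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

client = socket.socket()
client.connect(("127.0.0.1", server.getsockname()[1]))
client.sendall(b"route request")
reply = client.recv(1024)
client.close()
server.close()
```

The same request/response shape applies regardless of whether the underlying bearer is a wired link, Bluetooth-tethered GPRS, or any other medium carrying TCP/IP.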

As such, an internet connection may be utilised which is achieved via data connection, via a mobile phone or mobile phone technology within the navigation device 200 for example. For this connection, an internet connection between the server 302 and the navigation device 200 is established. This can be done, for example, through a mobile phone or other mobile device and a GPRS (General Packet Radio Service) connection. The navigation device 200 can further complete a data connection with the mobile device, and eventually with the internet and server 302, via existing Bluetooth technology for example, in a known manner, wherein any number of appropriate data communications protocols can be employed.

The navigation device 200 may include its own mobile phone technology within the navigation device 200 itself (including an antenna for example, or optionally using the internal antenna of the navigation device 200). The mobile phone technology within the navigation device 200 can include internal components as specified above, and/or can include an insertable card (e.g. Subscriber Identity Module (SIM) card), complete with necessary mobile phone technology and/or an antenna for example. As such, mobile phone technology within the navigation device 200 can similarly establish a network connection between the navigation device 200 and the server 302, via the internet for example, in a manner similar to that of any mobile device.

For GPRS phone settings, so that a Bluetooth enabled navigation device works correctly with the ever-changing spectrum of mobile phone models, manufacturers, etc., model/manufacturer-specific settings may be stored on the navigation device 200 for example. The data stored for this information can be updated.

In Figure 3 the navigation device 200 is depicted as being in communication with the server 302 via a generic communications channel 318 that can be implemented by any of a number of different arrangements. The server 302 and the navigation device 200 can communicate when a connection via the communications channel 318 is established between the server 302 and the navigation device 200 (noting that such a connection can be a data connection via mobile device, a direct connection via personal computer via the internet, etc.).

The server 302 includes, in addition to other components which may not be illustrated, a processor 304 operatively connected to a memory 306 and further operatively connected, via a wired or wireless connection 314, to a mass data storage device 312. The processor 304 is further operatively connected to transmitter 308 and receiver 310, to transmit and receive information to and from navigation device 200 via communications channel 318. The signals sent and received may include data, communication, and/or other propagated signals. The transmitter 308 and receiver 310 may be selected or designed according to the communications requirement and communication technology used in the communication design for the navigation system 200. Further, it should be noted that the functions of transmitter 308 and receiver 310 may be combined into a single transceiver.

Server 302 is further connected to (or includes) a mass storage device 312, noting that the mass storage device 312 may be coupled to the server 302 via communication link 314. The mass storage device 312 contains a store of navigation data and map information, and can again be a separate device from the server 302 or can be incorporated into the server 302.

The navigation device 200 is arranged to communicate with the server 302 through communications channel 318, and includes processor, memory, etc. as previously described with regard to Figure 2, as well as transmitter 320 and receiver 322 to send and receive signals and/or data through the communications channel 318, noting that these devices can further be used to communicate with devices other than server 302. Further, the transmitter 320 and receiver 322 are selected or designed according to communication requirements and communication technology used in the communication design for the navigation device 200 and the functions of the transmitter 320 and receiver 322 may be combined into a single transceiver.

Software stored in server memory 306 provides instructions for the processor 304 and allows the server 302 to provide services to the navigation device 200. One service provided by the server 302 involves processing requests from the navigation device 200 and transmitting navigation data from the mass data storage 312 to the navigation device 200. Another service provided by the server 302 includes processing the navigation data using various algorithms for a desired application and sending the results of these calculations to the navigation device 200.

The communication channel 318 generically represents the propagating medium or path that connects the navigation device 200 and the server 302. Both the server 302 and navigation device 200 include a transmitter for transmitting data through the communication channel and a receiver for receiving data that has been transmitted through the communication channel.

The communication channel 318 is not limited to a particular communication technology. Additionally, the communication channel 318 is not limited to a single communication technology; that is, the channel 318 may include several communication links that use a variety of technology. For example, the communication channel 318 can be adapted to provide a path for electrical, optical, and/or electromagnetic communications, etc. As such, the communication channel 318 includes, but is not limited to, one or a combination of the following: electric circuits, electrical conductors such as wires and coaxial cables, fibre optic cables, converters, radio-frequency (RF) waves, the atmosphere, empty space, etc. Furthermore, the communication channel 318 can include intermediate devices such as routers, repeaters, buffers, transmitters, and receivers, for example. In one illustrative arrangement, the communication channel 318 includes telephone and computer networks. Furthermore, the communication channel 318 may be capable of accommodating wireless communication, for example, infrared communications, radio frequency communications, such as microwave frequency communications, etc. Additionally, the communication channel 318 can accommodate satellite communication.

The communication signals transmitted through the communication channel 318 include, but are not limited to, signals as may be required or desired for given communication technology. For example, the signals may be adapted to be used in cellular communication technology such as Time Division Multiple Access (TDMA), Frequency Division Multiple Access (FDMA), Code Division Multiple Access (CDMA), Global System for Mobile Communications (GSM), etc. Both digital and analogue signals can be transmitted through the communication channel 318. These signals may be modulated, encrypted and/or compressed signals as may be desirable for the communication technology.

The server 302 includes a remote server accessible by the navigation device 200 via a wireless channel. The server 302 may include a network server located on a local area network (LAN), wide area network (WAN), virtual private network (VPN), etc.

The server 302 may include a personal computer such as a desktop or laptop computer, and the communication channel 318 may be a cable connected between the personal computer and the navigation device 200. Alternatively, a personal computer may be connected between the navigation device 200 and the server 302 to establish an internet connection between the server 302 and the navigation device 200. Alternatively, a mobile telephone or other handheld device may establish a wireless connection to the internet, for connecting the navigation device 200 to the server 302 via the internet. The navigation device 200 may be provided with information from the server 302 via information downloads which may be periodically updated automatically or upon a user connecting navigation device 200 to the server 302 and/or may be more dynamic upon a more constant or frequent connection being made between the server 302 and navigation device 200 via a wireless mobile connection device and TCP/IP connection for example. For many dynamic calculations, the processor 304 in the server 302 may be used to handle the bulk of processing needs, however, processor 210 of navigation device 200 can also handle much processing and calculation, oftentimes independent of a connection to a server 302.

As indicated above in Figure 2, the navigation device 200 includes the processor 210, the input device 220, and the display screen 240. The input device 220 and display screen 240 are integrated into an integrated input and display device to enable both input of information (via direct input, menu selection, etc.) and display of information through a touch panel screen, for example. Such a screen may be a touch input LCD screen, for example, as is well known to those of ordinary skill in the art. Further, the navigation device 200 can also include any additional input device 220 and/or any additional output device, such as audio input/output devices for example. Referring to Figure 4A, the navigation device 200 may be a unit that includes an integrated input and display device 290 (a touch panel screen for example) and the other components of Figure 2 (including, but not limited to, the internal GPS receiver 250, the microprocessor 210, a power supply (not shown), memory systems 230, etc.). The navigation device 200 may sit on an arm 292, which itself may be secured to a vehicle dashboard/window/etc, using a suction cup 294. This arm 292 is one example of a docking station to which the navigation device 200 can be docked.

As shown in Figure 4B, the navigation device 200 can be docked or otherwise connected to the arm 292 of the docking station by snap connecting the navigation device 200 to the arm 292 for example. The navigation device 200 may then be rotatable on the arm 292, as shown by the arrow of Figure 4B. To release the connection between the navigation device 200 and the docking station, a button (not shown) on the navigation device 200 may be pressed, for example. Other equally suitable arrangements for coupling and decoupling the navigation device to a docking station are well known to persons of ordinary skill in the art.

Referring now to Figure 5 of the accompanying drawings, the processor 210 and memory 230 cooperate to support a BIOS (Basic Input/Output System) 340 that functions as an interface between the functional hardware components 330 of the navigation device 200 and the software executed by the device. The processor 210 then loads an operating system 350 from the memory 230, which provides an environment in which application software 360 (implementing some or all of the above described route planning and navigation functionality) can run. The application software 360 provides an operational environment that supports core functions of the navigation device, for example map viewing, route planning, navigation functions and any other functions associated therewith. Additionally, a recognition module 370 is supported by the operating system 350 and provides functionality beyond the core functions provided by the application software 360. However, the skilled person should appreciate that the recognition module 370 can be incorporated into the application software 360, although the functionality of the incorporated recognition module remains beyond that of the core functions.

The application software 360 supports non-simultaneous use of the navigation device 200 by different users. In this respect, a user preferences database (not shown) is maintained in the memory 230, in which an operational preference profile comprising one or more user operational preferences relating to use of the navigation device 200 can be stored against an identifier for a user. Of course, the skilled person should appreciate that a number of operational preferences can therefore be stored against the identifier for the user, for example one or more of: driving-related preferences, for example safety preferences, such as showing safety reminders, identification of preferred points of interest and/or preferences associated with toll-roads; audible output-related preferences, for example audible announcement preferences, such as voice type, loudspeaker volume, speaker preferences, such as choice of audio output device; display preferences, for example appearance of a symbol identifying the current location, map colour schemes, brightness preferences, preferences associated with display of a compass, two- or three-dimensional view preference, preferences associated with an informational status bar, automatic zooming preferences or preferences associated with units of measurement used; device control preferences, for example preferences associated with display of icons for quick access to menu options, preferences associated with naming details on a displayed portion of a map, left- or right-handed operation preferences, route planning preferences, keypad or keyboard preferences, media playback preferences, wireless communications preferences, such as Bluetooth preferences, server access preferences, battery saving preferences, language preferences and/or menu display preferences.
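By way of illustration only, the storage of an operational preference profile against a user identifier can be sketched as follows; the field names, default values and class names are illustrative assumptions for clarity, not details of the specification:

```python
from dataclasses import dataclass

# Sketch of an operational preference profile and the user preferences
# database that stores one such profile per user identifier.

@dataclass
class OperationalPreferenceProfile:
    announcement_voice: str = "female"     # audible output-related preference
    loudspeaker_volume: int = 70           # percent
    colour_scheme: str = "day"             # display preference
    show_safety_reminders: bool = True     # driving-related preference

class UserPreferencesDatabase:
    def __init__(self):
        self._profiles = {}                # identifier -> profile

    def store(self, identifier, profile):
        self._profiles[identifier] = profile

    def recall(self, identifier):
        return self._profiles.get(identifier)   # None if user is unknown

db = UserPreferencesDatabase()
db.store("alice", OperationalPreferenceProfile(loudspeaker_volume=55))
profile = db.recall("alice")
```

On a successful recognition, the matching engine need only supply the identifier; the remainder of the profile follows from this lookup.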

In this example, a user operational preference profile already stored by the navigation device 200 can be selected manually via a settings menu option of a menu structure provided by the application software 360, the user being presented with a list of users known to the navigation device 200 for selection therefrom. The menu structure also comprises a menu option to record current operational preferences set in respect of the navigation device 200 as a new operational preference profile.

Turning to Figure 6, the microphone 226 is operably coupled to the recognition module 370 supported by the processor 210, the recognition module 370 comprising an audio processing sub-module 400 and an audio matching engine sub-module 402. The audio processing sub-module 400 is operably coupled to the microphone 226 and the audio matching engine sub-module 402. The audio matching engine sub-module 402 is capable of communicating with the application software 360 via a suitable software interface 403. Additionally, the memory 230 stores an audio profile database 404 comprising audio profile data characterising, in this example, voices of users, and associated identifiers of users. The audio matching engine sub-module 402 and the audio processing sub-module 400 are capable of accessing the audio profile database 404.

In operation (Figure 7), a first user acquires the navigation device 200 in order to use the navigation device for a first time. The first user thus eventually powers-up the navigation device 200 and sets (Step 500) one or more operational preferences associated with use of the navigation device 200, for example preferred announcement voice (female), volume of loudspeaker, display colour scheme and/or safety alerts. The first user then selects (Step 502) the menu option to record the selected operational preferences as a new operational preference profile and the first user is prompted by the user interface of the navigation device 200 to select a name or handle for association and storage of the operational preferences set in the user preferences database, the name or handle constituting a first identifier associated with the first user. The navigation device 200 then provides the first user with an option to provide a voice sample for rapid and/or automatic recall of the new operational preference profile on future occasions. The voice sample can be, if desired, free speech, or as in this example a predetermined word, such as "TomTom" or a set of words.

Assuming the first user wishes to provide the voice sample, the application software 360 prompts (Step 504) the user to provide the voice sample and the voice sample is recorded by the audio processing sub-module 400, whereupon the voice sample is subjected to signal processing in order to generate voice profile data that characterises the voice sample. In this example, the voice sample is subjected to Fourier analysis and a resulting Fourier profile of the voice sample generated (Step 506) constitutes the voice profile data. The voice profile data is stored (Step 508) in the audio profile database 404 by the audio processing sub-module 400 along with the identifier previously provided by the first user. The navigation device 200 is then used by the first user at the discretion of the first user in order to satisfy the navigational, location determination and/or map presentation needs of the first user until the navigation device 200 is powered-down. The above process can be repeated for a second user and any subsequent new users upon use of the navigation device 200 for a first time.

Referring to Figure 8, on a subsequent occasion when the first user powers the navigation device up in preparation, for example, to embark upon a journey requiring the operation and/or assistance of the navigation device 200, the navigation device 200 asks the user if a search for a configuration profile should be made. The application software 360 then prompts (Step 550) the user to provide a voice sample, for example the predetermined keyword or words, such as "TomTom". The audio processing sub-module 400 records, via the microphone 226, the keyword spoken by the first user and subjects (Step 552) the voice sample to signal processing, for example the Fourier analysis mentioned above, in order to generate sample voice profile data. The sample voice profile data is an example of candidate physical characteristic data that can be derived from user-related measurements and/or observations. Alternatively, collection of the voice sample can be performed automatically upon powering-up the navigation device 200 without requesting confirmation from the user, or during use of other features of the navigation device 200 where voice input is required, for example when the voice recognition facility of the navigation device 200 mentioned above is used, such as to locate a destination or target address. As a further alternative, the prompting for the provision of the voice sample can be triggered by the first user, for example by tapping on the display screen 290 a predetermined number of times, for example more than once, such as three times in short succession.
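By way of illustration only, the derivation of voice profile data by Fourier analysis can be sketched as follows. The naive discrete Fourier transform over one raw sample buffer is an assumption made for clarity; a practical implementation would typically apply an FFT to windowed audio frames, and all names here are illustrative:

```python
import cmath
import math

# Sketch: derive a coarse spectral "voice profile" from an audio sample
# by taking the magnitudes of the first few DFT coefficients.

def voice_profile(samples, bins=8):
    n = len(samples)
    profile = []
    for k in range(bins):
        # k-th DFT coefficient of the sample buffer
        coeff = sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))
        profile.append(abs(coeff) / n)     # normalised magnitude
    return profile

# Synthetic stand-in for a recorded keyword: a pure tone whose energy
# falls in frequency bin 2 of a 64-sample buffer.
sample = [math.sin(2 * math.pi * 2 * t / 64) for t in range(64)]
profile = voice_profile(sample)
```

Matching then reduces to comparing such profile vectors, so the database need store only the profile data, not the raw audio.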

Once generated, the sample voice profile data is provided to the audio matching engine sub-module 402 where the sample voice profile data is subjected (Step 554) to a recognition algorithm, for example a pattern matching algorithm, by the audio matching engine sub-module 402 using the voice profile data retrieved from the audio profile database 404.

If a match is not found (Step 556) by the audio matching engine sub-module 402 between the sample voice profile data and any of the voice profile data stored in the audio profile database 404, then the audio matching engine sub-module 402 checks (Step 558) an internal counter, c, to determine whether the counter has been set to a value greater than 2 (the counter is, of course, initialised when the navigation device 200 is powered-up). If the value of the internal counter, c, does not exceed 2, the internal counter, c, is incremented by unity (Step 560), the first user is prompted again to provide a voice sample, and generation and matching are attempted again (Steps 550 to 556). The above process (Steps 550 to 560) is repeated until either a match is obtained or the value of the internal counter, c, exceeds 2. When the value of the internal counter, c, exceeds 2, the value of the internal counter, c, is reset to zero (Step 562), the first user is advised that a match cannot be found, and no further attempts to identify stored operational preferences associated with the first user from voice samples are made unless the process is specifically re-initiated by the first user.
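By way of illustration only, the bounded retry behaviour of Steps 550 to 562 can be sketched as follows, with the prompting, recording and matching steps stubbed out as supplied functions; all names are illustrative assumptions:

```python
# Sketch of the internal counter logic: retry sample capture and matching
# until a match is found or the counter c exceeds 2, then give up.

MAX_COUNTER_VALUE = 2   # the internal counter, c, may not exceed 2

def identify_user(capture_sample, match_profile):
    """Return the matched identifier, or None once the retries are spent."""
    c = 0                                   # counter initialised at power-up
    while True:
        sample = capture_sample()           # prompt, record, analyse (Steps 550/552)
        identifier = match_profile(sample)  # pattern matching (Step 554)
        if identifier is not None:
            return identifier               # Step 556: match found
        if c > MAX_COUNTER_VALUE:
            return None                     # Step 562: reset, advise user, give up
        c += 1                              # Step 560: increment and retry

# Demonstration with a matcher that never succeeds: the loop runs a
# bounded number of times and then reports failure.
attempts = []
def never_match(sample):
    attempts.append(sample)
    return None

result = identify_user(lambda: "voice-sample", never_match)
```

Note that, as described, the counter is checked after a failed match and before being incremented, so the loop makes one initial attempt plus the retries permitted by the counter.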

In the event that a match (Step 556) has been found between the sample voice profile data and the voice profile data stored in the audio profile database 404, the first identifier mentioned above and associated with the matching voice profile data contained in the audio profile database 404, and associated with the first user, is then passed by the audio matching engine sub-module 402 to the application software 360 as a message, the application software 360 recovering the operational preferences stored against the first identifier associated with the first user in the user preferences database. The application software 360 then implements (Step 564) the operational preference or preferences associated with the operational preference profile recovered from the user preferences database, and the navigation device 200 continues to operate in accordance with the programming of the application software 360 in order to provide navigation assistance, location determination assistance and/or map information.

In another embodiment (Figure 9), the camera 228 is operably coupled to the recognition module 370 supported by the processor 210, the recognition module 370 comprising an image processing sub-module 406 and an image matching engine sub-module 408. The image processing sub-module 406 is operably coupled to the camera 228 and the image matching engine sub-module 408. The image matching engine sub-module 408 is capable of communicating with the application software 360 via a suitable software interface 409. Additionally, the memory 230 stores an image profile database 410 comprising physical characteristic data concerning respective visual appearances of users and associated identifiers of users. The image matching engine sub-module 408 is capable of accessing the image profile database 410.

Referring to Figure 10, a first user acquires the navigation device 200 in order to use the navigation device for a first time. The first user thus eventually powers-up the navigation device 200 and sets (Step 600) one or more operational preferences associated with use of the navigation device 200, for example preferred announcement voice (female), volume of loudspeaker, display colour scheme and/or safety alerts. The first user then selects (Step 602) the menu option to record the selected operational preferences as a new operational preference profile and the first user is prompted by the user interface of the navigation device 200 to select a name or handle for association and storage of the operational preferences set in the user preferences database, the name or handle constituting a first identifier associated with the first user. The navigation device 200 then provides the first user with an option to capture an image of the face of the first user (hereinafter referred to as an "image capture sample") for rapid and/or automatic recall of the new operational preference profile on future occasions.

Assuming the first user wishes to provide the image capture sample, the application software 360 prompts (Step 604) the user to present his or her face to the camera 228 and the camera 228 captures the image of the face of the first user. In order to provide consistent and accurate results, it is desirable, but not essential, to capture the image of the face within a predetermined region defined by a part of the angle of coverage of the camera 228. In this example, a visual feedback technique is therefore employed, whereby the image received by the camera 228 is displayed by the display screen 240 and a two-dimensional shape, for example a rectangle, is superimposed upon the display of the image received by the camera 228. The first user is then instructed by the navigation device 200 to present his or her face to the camera 228 so as to ensure that the face of the first user is within the rectangle prior to capture of the image of the face for image processing purposes. In this respect, the presence of the image of the face of the first user within the rectangle can be automatically detected by appropriate image processing software or indicated manually by the user by touching the display screen 240 or a vocal command via the microphone 226. Once captured, the image capture sample is then subjected to signal processing in order to generate facial profile data that characterises a region of the image capture sample relating to the face of the first user. In this example, the image capture sample is subjected to a facial feature recognition algorithm (Step 606), for example a so-called eigenface or eigenimage algorithm, though other algorithms can be employed depending upon processing capabilities. The skilled person should appreciate that other processing steps can be employed in the process of face recognition, for example threshold setting and the generation of matrices of weight. 
However, any suitable technique can be employed and so for the sake of conciseness and clarity of description, specific face recognition techniques will not be described in further detail herein. Once processed by the image processing sub-module 406, the image profile data output is stored (Step 608) in the image profile database 410 by the image processing sub-module 406 along with the identifier previously provided by the first user. The navigation device 200 is then used by the first user at the discretion of the first user in order to satisfy the navigational, location determination and/or map presentation needs of the first user until the navigation device 200 is powered-down. The above process can be repeated for a second user and any subsequent new users upon use of the navigation device 200 for a first time.
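By way of illustration only, whichever face recognition algorithm is employed, the matching stage ultimately compares a feature vector derived from the image capture sample against stored facial profile data. A simple nearest-neighbour comparison with a distance threshold can be sketched as follows; the vectors, names and threshold value are illustrative assumptions, not details of any particular eigenface implementation:

```python
import math

# Sketch of the final matching stage of a face-recognition pipeline:
# find the stored profile vector nearest to the sample vector, and accept
# the match only if the distance falls within a threshold.

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_face(sample_vec, database, threshold=0.5):
    """database maps identifier -> stored facial profile vector."""
    best_id, best_d = None, float("inf")
    for identifier, stored_vec in database.items():
        d = distance(sample_vec, stored_vec)
        if d < best_d:
            best_id, best_d = identifier, d
    return best_id if best_d <= threshold else None

# Illustrative stored profiles for two enrolled users.
db = {"user1": [0.1, 0.9, 0.3], "user2": [0.8, 0.2, 0.7]}
```

The threshold prevents an arbitrary unenrolled face from being forced onto the nearest enrolled user, which matters for the no-match branch described below.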

Referring to Figure 11, on a subsequent occasion when the first user powers the navigation device up in preparation, for example, to embark upon a journey requiring the operation and/or assistance of the navigation device 200, the navigation device 200 asks the user if a search for a configuration profile should be made. The application software 360 then prompts (Step 650) the user to present his or her face to the camera 228 in order to obtain an image capture sample. The face is therefore aligned within a capture region of the camera using the visual feedback technique. The image processing sub-module 406 then subjects (Step 652) the image capture sample to signal processing, for example the eigenface or eigenimage analysis described above and any suitable ancillary processing, in order to generate image profile sample data. The image profile sample data is an example of candidate physical characteristic data that can be derived from user-related measurements and/or observations. Alternatively, collection of the image profile sample data can be performed automatically upon powering-up the navigation device 200 without requesting confirmation from the user, or during use of the navigation device 200, for example by repeatedly capturing images until a match is obtained during use of the navigation device 200. As a further alternative, the prompting for the presentation of the face of the first user can be triggered by the first user, for example by tapping on the display screen 290 a predetermined number of times, for example more than once, such as three times in short succession.

Once generated, the image profile sample data is provided to the image matching engine sub-module 408 where the image profile sample data is subjected (Step 654) to a recognition algorithm, for example a pattern matching algorithm, by the image matching engine sub-module 408 using the image profile data retrieved from the image profile database 410.

If a match is not found (Step 656) by the image matching engine sub-module 408 between the image profile sample data and any of the image profile data stored in the image profile database 410, then the image matching engine sub-module 408 checks (Step 658) an internal counter, c, to determine whether the counter has been set to a value greater than 2 (the counter is, of course, initialised when the navigation device 200 is powered-up). If the value of the internal counter, c, does not exceed 2, the internal counter, c, is incremented (Step 660) by unity, the first user is prompted again to present his or her face for image profile sample data generation, and matching is attempted again (Steps 650 to 656). The above process (Steps 650 to 660) is repeated until either a match is obtained or the value of the internal counter, c, exceeds 2. When the value of the internal counter, c, exceeds 2, the value of the internal counter, c, is reset to zero (Step 662), the first user is advised that a match cannot be found, and no further attempts to identify stored operational preferences associated with the first user from captured images of the first user are made unless the process is specifically re-initiated by the first user.

In the event that a match (Step 656) has been found between the image profile sample data and the image profile data stored in the image profile database 410, the first identifier mentioned above and associated with the matching image profile data contained in the image profile database 410, and associated with the first user, is then passed by the image matching engine sub-module 408 to the application software 360 as a message, the application software 360 recovering the operational preferences stored against the first identifier associated with the first user in the user preferences database. The application software 360 then implements (Step 664) the operational preference or preferences associated with the operational preference profile recovered from the user preferences database, and the navigation device 200 continues to operate in accordance with the programming of the application software 360 in order to provide navigation assistance, location determination assistance and/or map information.

In a further embodiment, the processor 210 supports the techniques of any of the preceding embodiments, for example the first and second embodiments, in order to improve reliability of retrieval of the identifier associated with the first user. In another embodiment, the camera 228 can be used to capture images that comprise more than the face of the user for generation of the physical characteristic data, for example a space occupied by a driver of a vehicle. Furthermore, the physical characteristic data in this and previous embodiments can have, if desired, a time-varying aspect, for example the images collected can vary with time, the varying images revealing a pattern of behaviour, ritualistic or otherwise, that can be used to characterise and identify a user.

Although the above embodiments have been described in the context of face recognition or voice recognition, the skilled person should appreciate that any other suitable physical characteristic data that can be generated by contactless derivation from a user, for example biometric data, can be employed in order to recognise the user of the navigation device 200 and retrieve operational preferences for implementation in respect of the recognised user.

It will also be appreciated that whilst various aspects and embodiments of the present invention have heretofore been described, the scope of the present invention is not limited to the particular arrangements set out herein and instead extends to encompass all arrangements, and modifications and alterations thereto, which fall within the scope of the appended claims.

For example, the recognition module can be arranged to improve the profile data held in relation to the first user (or other users) by implementing a learning algorithm that continuously or periodically acquires voice samples and/or capture image samples and uses the acquired samples to refine the profile data held, for example by employing an averaging technique in relation to the profile sample data acquired. In relation to the voice samples, the samples can be acquired when the first user uses the voice recognition facility of the navigation device 200 to control the navigation device 200. In relation to the image capture samples, the image processing sub-module 406 can periodically monitor a region of, for example, a vehicle cockpit where the first user is located to drive the vehicle.
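One possible averaging technique for refining the stored profile data, as mentioned above, is an exponential moving average over profile feature vectors. This is a minimal sketch; the weighting `alpha` and the vector representation are assumptions, not taken from the description.

```python
def refine_profile(stored, sample, alpha=0.1):
    """Blend a newly acquired profile sample into the stored profile,
    element-wise, so the stored data slowly tracks the user over time."""
    return [(1 - alpha) * s + alpha * x for s, x in zip(stored, sample)]
```

With a small `alpha`, each new sample nudges the stored profile only slightly, so occasional poor-quality samples (for example, a badly lit image) do not corrupt the profile data.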

By way of another example, although the attempt to recognise the first user is made when the first user powers-up the navigation device, it should be appreciated that the attempt to recognise the first user can be deferred until some time after powering-up the navigation device 200, for example when the first user is en-route to a destination, the voice samples and/or image capture samples being acquired at least periodically during use of the navigation device 200.

As a further example, whilst embodiments described in the foregoing detailed description refer to GPS, it should be noted that the navigation device may utilise any kind of position sensing technology as an alternative to (or indeed in addition to) GPS. For example, the navigation device may utilise other global navigation satellite systems (GNSS), such as the proposed European Galileo system, when available. Equally, it is not limited to satellite-based systems, but could readily function using ground-based beacons or any other kind of system that enables the device to determine its geographic location, for example the long range navigation (LORAN)-C system.

It should also be appreciated that although reference has been made herein to preferences associated with the navigation device 200, the operational preferences can also relate to other devices that the navigation device is capable of controlling, or to which the navigation device is capable of providing control signals or data, for example a tuner or stereo of a vehicle in which the navigation device 200 is located. For example, one of the operational preferences can be an equalizer setting or settings or other sound parameter control facility of the in-vehicle stereo.
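An operational preference targeting an external device, such as the in-vehicle stereo mentioned above, could be packaged as a control message in the following manner. The message layout and field names are hypothetical; the description does not specify a format.

```python
def build_stereo_control_message(user_id, equaliser):
    """Package equaliser settings as a control message that the navigation
    device could send to an in-vehicle stereo it is able to control."""
    return {
        "target": "in_vehicle_stereo",        # external device addressed
        "user": user_id,                      # identifier of recognised user
        "settings": {"equaliser": dict(equaliser)},
    }
```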

Alternative embodiments of the invention can be implemented as a computer program product for use with a computer system, the computer program product being, for example, a series of computer instructions stored on a tangible data recording medium, such as a diskette, CD-ROM, ROM, or fixed disk, or embodied in a computer data signal, the signal being transmitted over a tangible medium or a wireless medium, for example, microwave or infrared. The series of computer instructions can constitute all or part of the functionality described above, and can also be stored in any memory device, volatile or non-volatile, such as semiconductor, magnetic, optical or other memory device.

It will also be well understood by persons of ordinary skill in the art that whilst the preferred embodiment implements certain functionality by means of software, that functionality could equally be implemented solely in hardware (for example by means of one or more ASICs (application-specific integrated circuits)) or indeed by a mix of hardware and software. As such, the scope of the present invention should not be interpreted as being limited only to being implemented in software.

Lastly, it should also be noted that whilst the accompanying claims set out particular combinations of features described herein, the scope of the present invention is not limited to the particular combinations hereafter claimed, but instead extends to encompass any combination of features or embodiments herein disclosed irrespective of whether or not that particular combination has been specifically enumerated in the accompanying claims at this time.

Claims

1. A navigation apparatus comprising: a processing resource operably coupled to a data store; an operational environment supported, when in use, by the processing resource and arranged to store operational preference profiles and respective associated user identifier data in the data store; wherein the processing resource is arranged to determine contactlessly an identity of a user and use the identity of the user to retrieve an operational preference profile from the operational preference profiles in the data store.
2. An apparatus as claimed in Claim 1, wherein the processing resource is arranged to implement an operational preference of the operational preference profile in response to identification of the user.
3. An apparatus as claimed in Claim 1 or Claim 2, wherein the identity of the user is determined using a recognition algorithm.
4. An apparatus as claimed in any one of the preceding claims, wherein the identity of the user is determined using candidate physical characteristic data derived from the user.
5. An apparatus as claimed in Claim 4, wherein the data store comprises physical characteristic data respectively identifying users.
6. An apparatus as claimed in Claim 5, wherein the data store maintains respective associations between the physical characteristic data and the user identifier data.
7. An apparatus as claimed in Claim 4 or Claim 5, wherein the processing resource is arranged to sample information in relation to the user via an input device and generate the candidate physical characteristic data from the information sampled.
8. An apparatus as claimed in any one of Claims 4 to 7, wherein the information sampled is voice data.
9. An apparatus as claimed in Claim 7, wherein the input device is an audio input device.
10. An apparatus as claimed in any one of Claims 4 to 7, wherein the information sampled is image data.
11. An apparatus as claimed in Claim 10, wherein the image data is facial image data.
12. An apparatus as claimed in Claim 7, wherein the input device is an optical input device.
13. An apparatus as claimed in Claim 10, wherein the image data is time-varying image data, the candidate physical characteristic data relating to a change of the image data with time.
14. An apparatus as claimed in any one of the preceding claims, further comprising an operational preference profile database comprising the operational preference profiles and respective associated user identifier data.
15. An apparatus as claimed in any one of the preceding claims, further comprising a physical characteristics database comprising the user identifier data and respectively associated physical characteristic data.
16. An apparatus as claimed in any one of the preceding claims, wherein the processing resource is arranged to generate, at least periodically, the candidate physical characteristic data in order to determine the identity of the user.
17. An apparatus as claimed in any one of the preceding claims, wherein the processing resource is arranged to learn the identity of the user from use of a core operational feature supported by the operational environment.
18. An apparatus as claimed in Claim 2, wherein the operational preference relates to an apparatus external to the navigation apparatus and controllable via the operational environment.
19. A method of operational preference selection, the method comprising: providing an operational environment; storing operational preference profiles and respective associated user identifier data; determining contactlessly an identity of a user; and using the identity of the user to retrieve an operational preference profile from the stored operational preference profiles.
20. A computer program element comprising computer program code means to make a computer execute the method as claimed in Claim 19.
21. A computer program element as claimed in Claim 20, embodied on a computer readable medium.
PCT/EP2008/003711 2008-05-02 2008-05-02 Navigation device and method of operational preference selection therefor WO2009132681A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2008/003711 WO2009132681A1 (en) 2008-05-02 2008-05-02 Navigation device and method of operational preference selection therefor


Publications (1)

Publication Number Publication Date
WO2009132681A1 2009-11-05

Family

ID=40229807

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2008/003711 WO2009132681A1 (en) 2008-05-02 2008-05-02 Navigation device and method of operational preference selection therefor

Country Status (1)

Country Link
WO (1) WO2009132681A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050137753A1 (en) * 2003-12-22 2005-06-23 International Business Machines Corporation Medical applications in telematics
US20060155398A1 (en) * 1991-12-23 2006-07-13 Steven Hoffberg Adaptive pattern recognition based control system and method
US7142696B1 (en) * 1999-11-03 2006-11-28 Robert Bosch Gmbh Assistance device in a motor vehicle



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08758418

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct app. not ent. europ. phase

Ref document number: 08758418

Country of ref document: EP

Kind code of ref document: A1