WO2011072746A1 - Methods of adaptively determining the accessibility of features provided through a user interface and navigation apparatuses using the same - Google Patents


Info

Publication number
WO2011072746A1
WO2011072746A1 (PCT/EP2009/067456)
Authority
WO
WIPO (PCT)
Prior art keywords
user
features
accessibility
feature
navigation device
Prior art date
Application number
PCT/EP2009/067456
Other languages
French (fr)
Inventor
Michel Alders
Jasper Michiel Van Hemert
Original Assignee
Tomtom International B.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tomtom International B.V. filed Critical Tomtom International B.V.
Priority to PCT/EP2009/067456 priority Critical patent/WO2011072746A1/en
Publication of WO2011072746A1 publication Critical patent/WO2011072746A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3605 Destination input or retrieval
    • G01C21/3611 Destination input or retrieval using character input or menus, e.g. menus of POIs
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968 Systems involving transmission of navigation instructions to the vehicle
    • G08G1/096805 Systems involving transmission of navigation instructions to the vehicle where the transmitted instructions are used to compute a route
    • G08G1/096827 Systems involving transmission of navigation instructions to the vehicle where the transmitted instructions are used to compute a route where the route is computed onboard
    • G08G1/096833 Systems involving transmission of navigation instructions to the vehicle where different aspects are considered when computing the route
    • G08G1/096838 Systems involving transmission of navigation instructions to the vehicle where different aspects are considered when computing the route where the user preferences are taken into account or the user selects one route out of a plurality
    • G08G1/096877 Systems involving transmission of navigation instructions to the vehicle where the input to the navigation device is provided by a suitable I/O arrangement
    • G08G1/096888 Systems involving transmission of navigation instructions to the vehicle where the input to the navigation device is provided by a suitable I/O arrangement where input information is obtained using learning systems, e.g. history databases

Definitions

  • At least one embodiment of the present application generally relates to methods of adaptively determining the accessibility of features provided to a user through a user interface (UI) and electronic devices using the same.
  • At least one embodiment generally relates to a user interface of an electronic device which is a navigation device, for example portable navigation devices (so-called PNDs) or in-vehicle navigation devices; in particular to PNDs or in-vehicle navigation devices that include Global Positioning System (GPS) signal reception and processing functionality or other positioning systems.
  • Other embodiments relate, more generally, to any type of device including a user interface.
  • Navigation devices that include GPS (Global Positioning System) signal reception and processing functionality are well known and are widely employed as in-vehicle or other vehicle navigation systems.
  • Such devices include user interfaces that provide a user with access to many features.
  • feature creep and device clutter in a PND are addressed in a limited way by allowing a user to view a smaller subset of features (e.g., 'show FEWER' menu options).
  • a user may not find the 'show FEWER' feature.
  • existing methods do not tailor the accessibility of features to a particular user, and the accessibility of the features does not change with the user.

Summary

  • example embodiments disclose methods for adaptively determining the accessibility of features provided to a user through a user interface and/or electronic devices using the same.
  • a method of adaptively determining accessibility of features provided to a user through a user interface (UI) of an electronic device may include adaptively determining whether to increase accessibility of at least one of the features provided to the user through the UI based on a first criteria and adaptively determining whether to decrease the accessibility of the at least one of the features based on a second criteria.
  • a computer readable medium may include computer readable instructions stored thereon for execution by a processor to perform adaptive determination of the accessibility of features provided to a user through a UI.
  • an electronic device may include a processor, a memory, a display and a user input device.
  • the electronic device may be configured to provide instructions to the processor according to application software in order to adaptively determine whether to increase accessibility of at least one feature provided to a user by the electronic device based on a first criteria and to adaptively determine whether to decrease the accessibility of the at least one feature based on a second criteria.
  • FIG. 1 is a schematic illustration of a Global Positioning System (GPS);
  • FIG. 2 is a schematic illustration of electronic components arranged to provide a navigation device
  • FIG. 3 is a schematic illustration of the manner in which a navigation device may receive information over a wireless communication channel
  • FIGS. 4A and 4B are illustrative perspective diagrams of a navigation device
  • FIGS. 5a-5i are illustrative screenshots from a navigation device for a destination input process
  • FIG. 6 is an illustrative screenshot from a navigation device depicting a start location for an illustrative calculated route
  • FIG. 7 is a schematic representation of an architectural stack employed by the navigation device of FIG. 3;
  • FIG. 8 is a high level diagram of constituent parts of a user profile
  • FIG. 9 is a detailed schematic of the navigation system of FIG. 2;
  • FIG. 10 is a flowchart illustrating a method of determining the accessibility of a feature based on defined criteria.
  • FIG. 11 is a flowchart illustrating incremental change in the accessibility of a feature according to iterations of the method of FIG. 10.

Detailed Description of Example Embodiments
  • spatially relative terms, e.g., "beneath," "below," "lower," "above," "upper" and the like, may be used herein for ease of description to describe one element's or feature's relationship to another element or feature as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as "below" or "beneath" other elements or features would then be oriented "above" the other elements or features. Thus, for example, the term "below" can encompass both an orientation which is above as well as below. The device may be otherwise oriented (rotated 90 degrees or viewed or referenced at other orientations) and the spatially relative descriptors used herein should be interpreted accordingly.
  • the software implemented aspects of example embodiments are typically encoded on some form of computer readable medium or implemented over some type of transmission medium.
  • the computer readable medium may be magnetic (e.g., a floppy disk or a hard drive) or optical (e.g., a compact disk read only memory, or "CD ROM"), and may be read only or random access.
  • the transmission medium may be twisted wire pairs, coaxial cable, optical fiber, or some other suitable transmission medium known to the art. Example embodiments are not limited by these aspects of any given implementation.
  • Example embodiments of the present disclosure will now be described with particular reference to a PND.
  • a user interface is intended to include any device through which a user accesses a plurality of features. Examples of such devices include computing resources such as a desktop or portable personal computer (PC), mobile telephone or portable digital assistant (PDA).
  • FIG. 1 illustrates an example view of a Global Positioning System (GPS), usable by navigation devices.
  • the GPS, known as NAVSTAR, incorporates a plurality of satellites which orbit the earth in extremely precise orbits. Based on these precise orbits, GPS satellites can relay their location to any number of receiving units.
  • the GPS system is implemented when a device, specially equipped to receive GPS data, begins scanning radio frequencies for GPS satellite signals. Upon receiving a radio signal from a GPS satellite, the device determines the precise location of that satellite via one of a plurality of different conventional methods. The device will continue scanning, in most instances, for signals until it has acquired at least three different satellite signals (noting that position is not normally determined with only two signals, but can be, using other triangulation techniques). Implementing geometric triangulation, the receiver utilizes the three known positions to determine its own two-dimensional position relative to the satellites. This can be done in a known manner. Additionally, acquiring a fourth satellite signal will allow the receiving device to calculate its three dimensional position by the same geometrical calculation in a known manner. The position and velocity data can be updated in real time on a continuous basis by an unlimited number of users.
  • the GPS system is denoted generally by reference numeral 100.
  • a plurality of satellites 120 are in orbit about the earth 124.
  • the orbit of each satellite 120 is not necessarily synchronous with the orbits of other satellites 120 and, in fact, is likely asynchronous.
  • a GPS receiver 140 is shown receiving spread spectrum GPS satellite signals 160 from the various satellites 120.
  • the spread spectrum signals 160, continuously transmitted from each satellite 120, utilize a highly accurate frequency standard accomplished with an extremely accurate atomic clock.
  • Each satellite 120, as part of its data signal transmission 160, transmits a data stream indicative of that particular satellite 120.
  • the GPS receiver device 140 generally acquires spread spectrum GPS satellite signals 160 from at least three satellites 120 for the GPS receiver device 140 to calculate its two-dimensional position by triangulation. Acquisition of an additional signal, resulting in signals 160 from a total of four satellites 120, permits the GPS receiver device 140 to calculate its three-dimensional position in a known manner.
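As an illustration of the geometric triangulation described above, the following is a minimal toy sketch computing a two-dimensional fix from three known satellite positions and measured ranges. It is not the device's actual positioning code; the coordinates and ranges are invented for the example.

```python
# Minimal toy sketch (not the device's code): a 2-D position fix by the
# geometric triangulation described above, from three known satellite
# positions and measured ranges. Coordinates and ranges are invented.

def fix_2d(sats, ranges):
    """Solve (x, y) from three (xi, yi) positions and distances ri by
    subtracting circle equations, which yields a 2x2 linear system."""
    (x0, y0), (x1, y1), (x2, y2) = sats
    r0, r1, r2 = ranges
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = r0**2 - r1**2 + x1**2 - x0**2 + y1**2 - y0**2
    b2 = r0**2 - r2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a11 * a22 - a12 * a21
    if det == 0:
        raise ValueError("satellites are collinear; no unique 2-D fix")
    return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det

sats = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
ranges = [50.0, 6500.0 ** 0.5, 4500.0 ** 0.5]  # distances from (30, 40)
print(fix_2d(sats, ranges))  # ~ (30.0, 40.0)
```

As the text notes, acquiring a fourth range extends the same calculation to a three-dimensional fix.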
  • FIG. 2 is an illustrative representation of electronic components of a navigation device 200 according to an example embodiment of the present disclosure, in block component format. It should be noted that the block diagram of the navigation device 200 is not inclusive of all components of the navigation device, but is only representative of many example components.
  • the navigation device 200 is located within a housing (not shown).
  • the housing includes a processor 210 connected to an input device 220 and a display device 240.
  • the input device 220 can include a keyboard device, voice input device, touch panel and/or any other known input device utilized to input information; and the display device 240 can include any type of display screen such as an LCD display, for example.
  • the input device 220 and display device 240 are integrated into an integrated input and display device, including a touchpad or touch screen input so that a user need only touch a portion of the display screen 240 to select one of a plurality of display choices or to activate one of a plurality of virtual buttons.
  • the navigation device may include an output device 260, for example an audible output device (e.g. a loudspeaker).
  • While output device 260 can produce audible information for a user of the navigation device 200, it should equally be understood that input device 220 can include a microphone and software for receiving input voice commands as well.
  • processor 210 is operatively connected to and set to receive input information from input device 220 via a connection 225, and operatively connected to at least one of display device 240 and output device 260, via output connections 245, to output information thereto. Further, the processor 210 is operably coupled to a memory 230 via connection 235 and is further adapted to receive/send information from/to input/output (I/O) ports 270 via connection 275, wherein the I/O port 270 is connectible to an I/O device 280 external to the navigation device 200.
  • the memory 230 comprises, for example, a volatile memory, such as a Random Access Memory (RAM) and a non-volatile memory, for example a digital memory, such as a flash memory.
  • the external I/O device 280 may include, but is not limited to, an external listening device such as an earpiece, for example.
  • the connection to I/O device 280 can further be a wired or wireless connection to any other external device such as a car stereo unit for hands-free operation and/or for voice activated operation for example, for connection to an ear piece or head phones, and/or for connection to a mobile phone for example, wherein the mobile phone connection may be used to establish a data connection between the navigation device 200 and the internet or any other network for example, and/or to establish a connection to a server via the internet or some other network for example.
  • FIG. 2 further illustrates an operative connection between the processor 210 and an antenna/receiver 250 via connection 255, wherein the antenna/receiver 250 may be a GPS antenna/receiver for example.
  • the antenna and receiver designated by reference numeral 250 are combined schematically for illustration, but the antenna and receiver may be separately located components, and the antenna may be a GPS patch antenna or helical antenna, for example.
  • the electronic components shown in FIG. 2 are powered by power sources (not shown) in a conventional manner.
  • different configurations of the components shown in FIG. 2 are considered to be within the scope of the present application.
  • the components shown in FIG. 2 may be in communication with one another via wired and/or wireless connections and the like.
  • the scope of the navigation device 200 of the present application includes a portable or handheld navigation device 200.
  • the portable or handheld navigation device 200 of FIG. 2 may be connected or "docked" in a known manner to a vehicle such as a bicycle, a motorbike, a car or a boat for example. Such a navigation device 200 may then be removable from the docked location for portable or handheld navigation use.
  • the navigation device 200 may establish a "mobile" or telecommunications network connection with a server 302 via a mobile device (not shown) (e.g., a mobile phone, PDA, and/or any device with mobile phone technology) establishing a digital connection (e.g., via known Bluetooth technology). Thereafter, through a network service provider, the mobile device may establish a network connection (e.g., through the internet) with a server 302.
  • a "mobile” network connection may be established between the navigation device 200 (which can be, and often times is mobile as it travels alone and/ or in a vehicle) and the server 302 to provide a "real-time” or at least very “up to date” gateway for information.
  • the establishing of the network connection between the mobile device (via a service provider) and another device such as the server 302, using an internet such as the World Wide Web for example, can be done in a known manner. This can include use of TCP/IP layered protocol for example.
  • the mobile device can utilize any number of communication standards such as CDMA, GSM, and/or WAN.
  • An internet connection may be utilised, achieved via a data connection through a mobile phone or mobile phone technology within the navigation device 200, for example.
  • an internet connection between the server 302 and the navigation device 200 may be established. This can be done, for example, through a mobile phone or other mobile device and a GPRS (General Packet Radio Service) connection.
  • A GPRS connection is a high-speed data connection for mobile devices provided by telecom operators; GPRS is one method of connecting to the internet.
  • the navigation device 200 can further complete a data connection with the mobile device, and eventually with the internet and server 302, via existing Bluetooth technology for example, in a known manner, wherein the data protocol can utilize any number of standards, such as the GSRM, the Data Protocol Standard for the GSM standard, for example.
  • the navigation device 200 may include its own mobile phone technology within the navigation device 200 itself, including an antenna for example, or optionally using the internal antenna of the navigation device 200.
  • the mobile phone technology within the navigation device 200 can include internal components as specified above, and/or can include an insertable card (e.g., Subscriber Identity Module or SIM card), complete with necessary mobile phone technology and/or an antenna for example.
  • mobile phone technology within the navigation device 200 can similarly establish a network connection between the navigation device 200 and the server 302, via the internet for example, in a manner similar to that of any mobile device.
  • For a Bluetooth-enabled navigation device to work correctly with the ever-changing spectrum of mobile phone models and manufacturers, model- and manufacturer-specific settings may be stored on the navigation device 200, for example.
  • the data stored for this information can be updated.
  • the navigation device 200 is depicted as being in communication with the server 302 via a generic communications channel 318 that can be implemented by any of a number of different arrangements.
  • the server 302 and a navigation device 200 can communicate when a connection via communications channel 318 is established between the server 302 and the navigation device 200 (noting that such a connection can be a data connection via mobile device, a direct connection via personal computer via the internet, etc.).
  • the server 302 may include, in addition to other components which may not be illustrated, a processor 304 operatively connected to a memory 306 and further operatively connected, via a wired or wireless connection 314, to a mass data storage device 312.
  • the processor 304 is further operatively connected to transmitter 308 and receiver 310, to transmit and send information to and from navigation device 200 via communications channel 318.
  • the signals sent and received may include data, communication, and/or other propagated signals.
  • the transmitter 308 and receiver 310 may be selected or designed according to the communication requirements and communication technology used in the communication design for the navigation device 200. Further, it should be noted that the functions of transmitter 308 and receiver 310 may be combined into a single transceiver.
  • Server 302 is further connected to (or includes) a mass storage device 312, noting that the mass storage device 312 may be coupled to the server 302 via communication link 314.
  • the mass storage device 312 may contain a store of navigation data and map information, and may again be a separate device from the server 302 or can be incorporated into the server 302.
  • the navigation device 200 may be adapted to communicate with the server 302 through communications channel 318, and may include processor 210, memory 230, etc. as previously described with regard to FIG. 2, as well as transmitter 320 and receiver 322 to send and receive signals and/or data through the communications channel 318, noting that these devices can further be used to communicate with devices other than server 302. Further, the transmitter 320 and receiver 322 are selected or designed according to communication requirements and communication technology used in the communication design for the navigation device 200 and the functions of the transmitter 320 and receiver 322 may be combined into a single transceiver.
  • Software stored in server memory 306 may provide instructions for the processor 304 and may allow the server 302 to provide services to the navigation device 200.
  • One service that may be provided by the server 302 involves processing requests from the navigation device 200 and transmitting navigation data from the mass data storage 312 to the navigation device 200.
  • Another service that may be provided by the server 302 includes processing the navigation data using various algorithms for a desired application and sending the results of these calculations to the navigation device 200.
  • the communication channel 318 generically represents the propagating medium or path that connects the navigation device 200 and the server 302.
  • Both the server 302 and navigation device 200 may include a transmitter for transmitting data through the communication channel and a receiver for receiving data that has been transmitted through the communication channel.
  • the communication channel 318 is not limited to a particular communication technology. Additionally, the communication channel 318 is not limited to a single communication technology; that is, the channel 318 may include several communication links that use a variety of technology. For example, the communication channel 318 can be adapted to provide a path for electrical, optical, and/or electromagnetic communications, etc. As such, the communication channel 318 includes, but is not limited to, one or a combination of the following: electric circuits, electrical conductors such as wires and coaxial cables, fibre optic cables, converters, radio-frequency (RF) waves, the atmosphere, empty space, etc. Furthermore, the communication channel 318 can include intermediate devices such as routers, repeaters, buffers, transmitters, and receivers, for example.
  • the communication channel 318 includes telephone and computer networks. Furthermore, the communication channel 318 may be capable of accommodating wireless communication such as radio frequency, microwave frequency, infrared communication, etc. Additionally, the communication channel 318 can accommodate satellite communication.
  • the communication signals transmitted through the communication channel 318 include, but are not limited to, signals as may be required or desired for given communication technology.
  • the signals may be adapted to be used in cellular communication technology such as Time Division Multiple Access (TDMA), Frequency Division Multiple Access (FDMA), Code Division Multiple Access (CDMA), Global System for Mobile Communications (GSM), etc.
  • Both digital and analogue signals may be transmitted through the communication channel 318.
  • These signals may be modulated, encrypted and/or compressed signals as may be desirable for the communication technology.
  • the server 302 may include a remote server accessible by the navigation device 200 via a wireless channel.
  • the server 302 may include a network server located on a local area network (LAN), wide area network (WAN), virtual private network (VPN), etc.
  • the server 302 may include a personal computer such as a desktop or laptop computer, and the communication channel 318 may be a cable connected between the personal computer and the navigation device 200.
  • a personal computer (not shown) may be connected between the navigation device 200 and the server 302 to establish an internet connection between the server 302 and the navigation device 200.
  • a mobile telephone or other handheld device (not shown) may establish a wireless connection to the internet, for connecting the navigation device 200 to the server 302 via the internet.
  • the navigation device 200 may be provided with information from the server 302 via information downloads, which may be periodically updated automatically or upon a user connecting the navigation device 200 to the server 302, and/or may be more dynamic upon a more constant or frequent connection being made between the server 302 and navigation device 200 via a wireless mobile connection device and TCP/IP connection, for example.
  • the processor 304 in the server 302 may be used to handle the bulk of the processing needs; however, the processor 210 of the navigation device 200 can also handle much processing and calculation, oftentimes independent of a connection to the server 302.
  • a navigation device 200 may include a processor 210, an input device 220, and a display device 240.
  • the input device 220 and display device 240 may be integrated into an integrated input and display device to enable both input of information (via direct input, menu selection, etc.) and display of information through a touch panel screen, for example.
  • Such a screen may be a touch input LCD screen, for example, as is well known to those of ordinary skill in the art.
  • the navigation device 200 can also include any additional input device 220 and/or any additional output device 260, such as audio input/ output devices for example.
  • FIGS. 4A and 4B are perspective views of a navigation device 200.
  • the navigation device 200 may be a unit that includes an integrated input and display device 290 (a touch panel screen for example) and the other components of FIGS. 2 and 3 (including but not limited to an internal GPS, antenna/receiver 250, processor 210, a power supply, memory 230, etc.).
  • the navigation device 200 may sit on an arm 292, which itself may be secured to a vehicle dashboard/window/etc. using a suction cup 294.
  • This arm 292 is one example of a docking station to which the navigation device 200 can be docked.
  • the navigation device 200 can be docked or otherwise connected to an arm 292 of the docking station by snap connecting the navigation device 200 to the arm 292, for example.
  • the navigation device 200 may then be rotatable on the arm 292, as shown by the arrow of FIG. 4B.
  • To release the navigation device 200 from the docking station, a button on the navigation device 200 may be pressed, for example.
  • Other equally suitable arrangements for coupling and decoupling the navigation device to a docking station are well known to persons of ordinary skill in the art.
  • In FIGS. 5a-5i there is depicted a series of screenshots from a navigation device 200.
  • This navigation device 200 may have a touch screen interface for displaying information to a user and for accepting input to the device from the user.
  • the screenshots show an illustrative example embodiment of a destination location input process for a user whose home location has been set to the offices in The Hague of the European Patent Office, and who wishes to navigate to a street address in Amsterdam, The Netherlands for which they know the street name and building number.
  • the device may acquire a GPS fix and calculate (in a known manner) the current location of the navigation device 200.
  • the user may then be presented, as shown in FIG. 5a, with a display 340 showing in pseudo three-dimensions the local environment 342 in which the navigation device 200 is determined to be located, and in a region 344 of the display 340 below the local environment a series of control and status messages.
  • the navigation device 200 switches to display (as shown in FIG. 5b) a series of virtual buttons 346 by means of which a user can, inter alia, input a destination that they wish to navigate to.
  • the navigation device 200 may switch to display (as shown in FIG. 5c) a plurality of virtual buttons that are each associated with a different category of selectable destinations.
  • the display shows a "home" button that if pressed would set the destination to the stored home location.
  • the "favourite" button, if pressed, may reveal a list of destinations that the user has previously stored in the navigation device 200, and if one of these destinations is then selected the destination for the route to be calculated may be set to the selected previously stored destination.
  • the "recent destination" button, if pressed, may reveal a list of selectable destinations held in the memory 230 of the navigation device 200 and to which the user has recently navigated. Selection of one of the destinations populating this list may set the destination location for this route to the selected (previously visited) location.
  • the "point of interest" button, if pressed, may reveal a number of options by which a user can opt to navigate to any of a plurality of locations, such as cash machines, petrol stations or tourist attractions for example, that have been pre-stored in the device as locations that a user of the device might want to navigate to.
  • the "arrow" shaped virtual button may open a new menu of additional options, and the "address" button 350 may commence a process by which the user can input the street address of the destination that they wish to navigate to.
  • the user knows the street address and house number of the destination and hence selects the "street and house number" virtual button 352, whereupon the user may then be presented, as shown in FIG. 5e, with a prompt 354 to enter the name of the city that they wish to navigate to, a flag button 356 by which the user can select the country in which the desired city is located, and a virtual keyboard 358 that may be operated by the user, if necessary, to input the name of the destination city.
  • the user has previously navigated to locations in Rijswijk and Amsterdam, and the PND therefore additionally provides the user with a list 360 of selectable cities.
  • the user in this instance wishes to navigate to Amsterdam, and on selection of Amsterdam from the list 360 the navigation device 200 displays, as shown in FIG. 5f, a virtual keyboard 362 by means of which a user can input street names, a prompt 364 for entry of a street name and, in this instance, as the user has previously navigated to a street in Amsterdam, a list 366 of selectable streets in Amsterdam.
  • the navigation device 200 may display a smaller virtual keypad 368 and prompts the user, via prompt 370, to enter the number of the house in the selected street and city that they wish to navigate to. If the user has previously navigated to a house number in this street, then that number (as shown in FIG. 5g) is initially shown. If, as in this instance, the user wishes to navigate to No. 35, Rembrandtplein once again, then the user need only touch a "done" virtual button 372 displayed at the bottom right hand corner of the display. If the user should wish to navigate to a different house number in Rembrandtplein, then all they need do is operate the keypad 368 to input the appropriate house number.
  • the user is asked in FIG. 5h, whether they wish to arrive at a particular time. If the user should push the "yes" button, then functionality is invoked that estimates the time required to travel to the destination and advises the user when they should leave (or if they are running late, should have left) their current location in order to arrive at their destination on time. In this instance the user is not concerned about arriving at a particular time and hence selects the "no" virtual button.
  • Selecting the "no" button 374 may cause the navigation device 200 to calculate a route between the current location and the selected destination and to display that route 376, as shown in FIG. 5i, on a relatively low magnification map that shows the entire route.
  • the user may be provided with a "done" virtual button 378 which they can press to indicate that they are happy with the calculated route, a "find alternative" button 380 that the user can press to cause the navigation device 200 to calculate another route to the selected destination, and a "details" button 382 that a user can press to reveal selectable options for the display of more detailed information concerning the currently displayed route 376.
  • In FIGS. 5a-5i it is assumed that the user is happy with the displayed route, and once the "done" button 378 has been pressed the user may be presented, as shown in FIG. 6, with a pseudo three-dimensional view of the current, start, location for the navigation device 200.
  • the display depicted in FIG. 6 is similar to that shown in FIG. 5a except that the displayed local environment 342 now includes a start location flag 384 and a waypoint indicator 386 indicating the next manoeuvre (in this instance, a left hand turn).
  • the lower part of the display has also changed and now displays the name of the street in which the navigation device 200 is currently located, an icon 388 indicating the distance to and type of the next manoeuvre (from the current location of the navigation device 200), and a dynamic display 390 including the distance and time to the selected destination.
  • the user may then commence their journey and the navigation device 200 may guide the user, in a known manner, by updating the map in accordance with determined changes in navigation device 200 location, and by providing the user with visual and, optionally, audible navigation instructions.
  • FIG. 7 is an example diagram of hardware and software cooperation of a navigation device 200.
  • When the navigation device 200 is powered on, the hardware 281 (e.g., processor 210 and memory 230) may first execute a Basic Input/Output System (BIOS).
  • the processor 210 may load an operating system 284 from the memory 230, which provides an environment in which application software 286 (implementing the functionality of the navigation device 200) can run.
  • the application software 286 may provide an operational environment including a graphical user interface (GUI) 288 that supports core functions of the navigation device, for example map viewing and route planning, as described above.
  • the application software 286 may provide features to a user through the GUI 288 by displaying, for example, virtual buttons 346 graphically displayed as icons.
  • FIG. 8 is an example high level diagram of constituent parts of a user profile.
  • the navigation device 200 may be configured to determine the accessibility of features provided to a user through a user interface based on, inter alia, the profile of the user. As shown in FIG. 8, the navigation device 200 may be adapted to receive input from interaction with a user 616 indicative of physiological 602, psychological 604, behavioural 606, external 608 and interactive 610 factors. These factors 602-610 may be logged in a memory 230 of the navigation device 200 and used to determine a user profile 614.
  • the user profile 614 may be updated in real time in order to keep the profile 614 current and relevant. Changes in the profile 614 may be indicative of a variety of factors associated with the user.
  • Although the user profile is described with respect to the user, the user profile 614 may be the profile of the user, of other known users, and/or a profile of a prototypical user (e.g., a Safe Driving user profile).
  • a user profile 614 may include subjective input provided by a user (e.g., explicit measure).
  • a user may populate a profile by, for example, answering questions. For example, questions may be provided through a web interface or start-up wizard to create an extensive profile. The answers to the questions may identify subjective characteristics of the user.
  • Example subjective characteristics of the user may include, but are not limited to, a level of experience with a particular user interface, a level of interest in technology, a degree of desired user control of a device 200, acceptance of user initiatives, a level of desired interaction (e.g., interaction with a device 200 while driving), a knowledge of available functionality provided by a device 200, an expected usage of the available functionality, a type of functionality desired, etc.
  • a user profile 614 may also include feedback from a user, for example, feedback concerning events initiated by the device 200.
  • a user profile 614 may include a wide variety of subjective input of a user.
  • a user profile 614 may include objective data.
  • the application software 286 may collect data related to the use of a navigation device 200 by a user.
  • the objective data may be used, for example, to infer characteristics of a user (e.g., implicit measure).
  • the type of collected data may include, for example, a type of functionality used, an amount of functionality used, how functionality is used over time, the amount of functionality added or removed by a user, a technological level of the functionality used by the user (e.g., advanced or basic), an amount of modification of functionality (e.g., use of route changing functionality), a quality of user interaction with a navigation device 200 (e.g., smoothness while driving), a frequency of use of the navigation device 200, an amount of use of explicit measures, a rate a user rejects or accepts functionality proposed by a navigation device 200, etc.
  • a user profile 614 may include a wide variety of objective data based on, for example, the use of the navigation device 200.
  • a user profile may include a specific assignment of the accessibility of each feature made available through the application software 286.
  • the application software 286 may consider both explicit and implicit measures in determining the accessibility of features provided to a user through a user interface. For example, although the accessibility of functionality provided to a user through a user interface may initially be based predominantly on explicit measures, data concerning implicit measures may indicate error or change in the explicit measures.
  • the application software 286 may adaptively conform the functionality provided by the navigation device 200 (described below) based on both subjective and objective characteristics of the user. However, example embodiments are not so limited. For example, the functionality provided by the navigation device 200 may be based on only explicit or implicit measures. In at least one example embodiment, the weight given to implicit and explicit measures is different.
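One possible way to blend explicit and implicit measures into a per-feature score, with implicit evidence gradually outweighing stale explicit settings as observations accumulate, is sketched below. The function name, weights, and confidence formula are assumptions for illustration, not the patent's algorithm.

```python
# Illustrative sketch only (names and weights are assumptions): blending an
# explicit (subjective) measure with an implicit (objective) one into a
# per-feature score, trusting implicit data more as observations accumulate
# so that it can indicate error or change in stale explicit settings.

def feature_score(explicit, implicit, implicit_samples, base_weight=0.3):
    """All measures in [0, 1]; returns a blended score in [0, 1]."""
    # Confidence in implicit data grows with the number of observations.
    confidence = implicit_samples / (implicit_samples + 20.0)
    w_implicit = base_weight + (1.0 - base_weight) * confidence
    return (1.0 - w_implicit) * explicit + w_implicit * implicit

# A user who declared low interest (0.2) but heavily uses the feature (0.9):
print(feature_score(0.2, 0.9, implicit_samples=5))    # ~0.51, leans explicit
print(feature_score(0.2, 0.9, implicit_samples=200))  # ~0.86, leans implicit
```

Giving implicit and explicit measures different weights, as the text suggests, then reduces to choosing the base weight and confidence schedule.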
  • FIG. 9 is an example of a detailed schematic of the navigation system of FIG. 2.
  • the navigation device 200 is shown in more detail in FIG. 9. It will be appreciated that a skilled person may select only some of the described components or factors used when implementing the device or alternate components or factors not described herein.
  • the navigation device 200 may include a processor 210 coupled to memory 230.
  • the memory 230 may be arranged to store mapping and navigational data 704 and user profiles 614, and/or the data and user profiles may be accessed from a remote data store or server 728.
  • the server 728 may be coupled to and/or include remote databases or information services 730-734 (e.g., traffic database 730, map database 732 and/or weather database 734).
  • the navigation device 200 may be equipped with a positioning system 726 such as a GPS system or a mobile communication network triangulation system, as shown in FIG. 1, for determining the current location of the vehicle.
  • the navigation device 200 may be coupled with input devices 708-718 for collecting data indicative of factors for use in determining the user's profile.
  • Such input devices may include, for example, physiological sensors 708, microphones 710, user input devices 712, links 714 to remote databases, wireless signal receivers 716 such as Bluetooth™ receivers and mobile phone signal receivers, and cameras 718, etc.
  • the navigation device 200 may also be provided with output devices such as a display 720, a vibrating alert device 722, an audio output device 724, etc.
  • the navigation device 200 may also be adapted to collect, store and update data input by a user by using the user input devices 712 to receive input from the user. Examples of suitable user input devices include touch screens, a keyboard, buttons, roller balls, soft keys or virtual keys, a link to a personal computer, etc.
  • the interface 288 may be operable to present prompts and/or questions to the user and obtain input representing, for example, personal preferences and characteristics from the user. Examples of such preferences may include, for example, whether a user prefers accessibility to every feature of the navigation device 200 or less than all the features.
  • the interface 288 may also be arranged to prompt and/or obtain input from the user regarding whether or not to change the accessibility of features, for example.
  • the navigation device 200 may be provided with a communications module, for example a USB port or wi-fi device, so as to allow the navigation device 200 to link to a personal computer to transfer details of the collected data from the computer to the device.
  • the navigation device 200 may be adapted to collect and, where appropriate, update the variables forming the user's profile 614.
  • the entire profile 614 may be analysed using algorithms in order to determine the user's characteristics.
  • Accessibility may mean absolute accessibility in which a user either does or does not have access to a feature.
  • Accessibility may also mean relative accessibility, where accessibility may be a measure of the difficulty in accessing, or time needed to access, a feature relative to other features.
  • For example, a feature that may be accessed through a single input of the user interface (e.g., a single voice command or application of tactile pressure) is relatively more accessible than a feature that requires several inputs to reach.
  • Relative accessibility may also describe the level of perceptibility of a feature relative to other features in terms of visual or auditory prominence.
  • a visual prominence of a feature may refer to aspects of a graphical representation of the feature, for example aspects of an icon.
  • Example aspects of an icon may include a background color of an icon, a border of an icon, a size of the icon, a blink speed of an icon, a location of an icon on a display screen, and/or the form of the icon, etc.
  • a feature may be made more or less prominent by altering one or more of the example aspects (e.g., adding or removing a border) and/or aspects other than those described herein.
  • Although visual prominence has been described in terms of the prominence of a graphical representation of a feature, any changes in visual prominence are contemplated. For example, changing the brightness of pixels of the display device 240 associated with a feature may change a visual prominence of the feature.
  • auditory prominence of a feature may refer to an auditory cue associated with the feature.
  • a sound may alert a user that a particular feature may be used. Auditory alerting may include, for example, specific tones, series of tones, changes in volume, changes in speed of a series of tones, and/or changes in pitch.
  • Although example embodiments are described with respect to visual and auditory stimuli, any stimulus involving a physiological method of perception used to alert a user is contemplated by example embodiments, for example tactile stimulus (e.g., vibration).
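A hypothetical sketch of how a relative-accessibility level might be mapped to the icon aspects listed above (size, border, blink speed, screen position) follows. The dataclass fields and thresholds are invented for illustration, not the patent's API; the least accessible tier has no icon at all and would live in the inactive menu described further below.

```python
# Hypothetical sketch: mapping a relative-accessibility level to icon
# aspects. Fields and thresholds are invented, not the patent's API.
from dataclasses import dataclass

@dataclass
class IconStyle:
    size_px: int
    has_border: bool
    blink_hz: float  # 0.0 = no blinking
    row: int         # lower-numbered rows are more prominent in this layout

def style_for(accessibility: float) -> IconStyle:
    """accessibility in [0, 1]; 1.0 = most prominent feature."""
    if accessibility > 0.8:
        return IconStyle(size_px=96, has_border=True, blink_hz=1.0, row=0)
    if accessibility > 0.5:
        return IconStyle(size_px=64, has_border=True, blink_hz=0.0, row=1)
    if accessibility > 0.2:
        return IconStyle(size_px=48, has_border=False, blink_hz=0.0, row=2)
    # Least accessible: no icon; the feature is relegated to an inactive menu.
    return IconStyle(size_px=0, has_border=False, blink_hz=0.0, row=-1)

print(style_for(0.9))  # large, bordered, blinking icon on the top row
```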
  • the application software 286 may initially assign an accessibility level to the features of a navigation device 200 based on criteria determined by, for example, a manufacturer.
  • the manufacturer may pre-configure the navigation device 200 to make available all or a limited subset of the features based on, for example, whether the navigation device 200 is new, the model of the device and/or the target user.
  • the limited subset of features may include a base number of essential features to initially provide a streamlined or simplified version of the GUI 288 to all users.
  • the accessibility of features provided through the application software 286 may be based on, for example, the user profile 614, the profile of other known users, template users (e.g., prototype users), and/or the characteristics of users where the device will be sold.
  • Although the application software 286 may initially preset the accessibility of the features provided by the navigation device 200, in an embodiment of the present application a user may change the preset accessibility.
  • the accessibility preset by the application software 286 may be changed in a variety of ways. For example, the user may manually set the accessibility of a single feature, a subset of features or all the features at once.
  • a corner of the display 340 may be a virtual button used by the user to activate a functionality that facilitates manually changing the accessibility of features.
  • the user may also set the accessibility of features by selecting, for example, profiles of other users, known users or prototype users. These profiles may be saved in the memory 230 and/or may be imported from an external source (e.g., the server 302). User profiles, such as the user profile 614, may include a specific assignment of the accessibility of each feature made available through the application software 286.
  • a user may also be provided with assistance in determining the accessibility of features through the GUI 288.
  • the navigation device 200 may provide advice to a user by way of a prompt on the display device 240 based on a criteria (e.g., usage statistics).
  • the user may ask the navigation device 200 to propose a change in accessibility.
  • the user may ask the navigation device 200 to propose adding or removing one or more features.
  • the navigation device 200 may propose adding or removing a feature based on, for example, a criteria (described below), a short questionnaire provided to the user, and/or based upon history and speed of adaption when a next set of features is provided by a manufacturer.
  • Other ways a user may change the accessibility of features provided by the navigation device 200 through the GUI 288 include going back in history (e.g., reverting to a user profile as it existed in the past) and/or synchronizing the navigation device 200 with GPS traces.
  • GPS traces may include positional data collected by the navigation device 200. Synchronization of the navigation device 200 with GPS traces may provide objective data that may be used by the application software 286 to determine the accessibility of features (described below).
  • a change in the accessibility of features may involve expanding or contracting the GUI 288 (e.g., adding or removing features).
  • Although expansion or contraction may refer to changing the absolute availability of a feature on a navigation device 200, it may also mean that the accessibility of the feature is significantly changed.
  • a GUI 288 may be contracted by deleting or hiding a feature from the user, or by removing the feature from display as an icon (virtual button) and placing the feature into an 'inactive menu' listing inactive functions.
  • the inactive menu itself is generally reduced in accessibility so that it requires several user inputs to access and therefore is buried in a selection tree.
  • FIGS. 5a-5c illustrate examples of how a GUI 288 selection tree may be navigated by activating virtual buttons (e.g., virtual buttons 346) in sequence to locate a feature or menu of interest.
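The notion of relative accessibility as depth in the selection tree can be made concrete with a small sketch: represent the GUI 288 selection tree as nested mappings and count the user inputs needed to reach a feature, so that a feature "buried" in the inactive menu requires several selections. The menu names here are invented examples.

```python
# Sketch under assumptions: the GUI 288 selection tree as nested mappings,
# with relative accessibility measured as the number of selections needed
# to reach a feature. Menu names are invented for illustration.
from typing import Optional

MENU_TREE = {
    "navigate_to": {"home": {}, "address": {"street_and_number": {}}},
    "settings": {"menus": {"inactive_menu": {"itinerary_planning": {}}}},
}

def taps_to_reach(tree: dict, feature: str, depth: int = 1) -> Optional[int]:
    """Return how many selections activate `feature`, or None if absent."""
    for name, subtree in tree.items():
        if name == feature:
            return depth
        found = taps_to_reach(subtree, feature, depth + 1)
        if found is not None:
            return found
    return None

print(taps_to_reach(MENU_TREE, "home"))                # 2: highly accessible
print(taps_to_reach(MENU_TREE, "itinerary_planning"))  # 4: buried in the tree
```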
  • the navigation device 200 may be configured to adaptively determine the accessibility of features provided to a user through a GUI 288. Adaptive determination may be an optional feature or may be an essential function. In general, if users are satisfied with the functionality they are using, the GUI 288 may not change. If the user desires to make use of a greater amount of the available functionality, the accessibility of previously unused features may be increased. If the user does not make use of currently available features, the accessibility of those features may be decreased. Whether or not the accessibility of a feature is changed may be determined by the application software 286 based on, for example, one or more criteria.
  • an example embodiment of the present application is directed to an electronic device (200), including a processor (210), a memory (230), a display (720), and a user input device (712), wherein the electronic device (200) is configured to provide instructions to the processor according to application software (286) to adaptively determine whether to increase accessibility of at least one feature provided to a user by the electronic device (200) based on a first criteria and to adaptively determine whether to decrease the accessibility of the at least one feature based on a second criteria.
  • the determination of whether or not to change the accessibility of one or more features by the application software 286 may not be independent and may require complex calculation applying one or more criteria to one or more features and comparing results of such calculations to similar calculations (or groups of calculations) made for every other feature (or groups of features). For example, one or more calculations may be made for each feature based on one or more criteria. Each feature may be assigned an accessibility ranking based on the calculations. To finally determine an accessibility of each feature, the accessibility ranking for each feature may be compared to the accessibility ranking of every other feature.
  • the comparison may be made using a hierarchical ordering algorithm of varying complexity that considers any number of parameters.
  • the weight given to each criteria may be different between criteria and features.
  • Individual criteria and features, or groups of criteria and features may be given more or less importance for any number of reasons. For example, a feature that is used less often than another feature but is usable less often may be given a greater weight than a feature that is used more often but is usable more often. Frequent use of an infrequently usable feature may indicate a user preference for the feature.
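The following sketch illustrates one way such a weighted comparison could work: each feature receives a score from weighted criteria, and the features are then ordered against one another. Weighting 'usage per opportunity' more heavily mirrors the infrequently-usable-feature example above; the names and weights are assumptions, not the patent's algorithm.

```python
# Minimal sketch of the ranking idea above (names and weights assumed):
# each feature gets a score from weighted criteria, and features are then
# ordered against one another; 'usage per opportunity' is weighted more
# heavily, mirroring the infrequently-usable-feature example in the text.

def accessibility_ranking(features, weights):
    """features: name -> {criterion: value in [0, 1]}; best-first order."""
    def score(values):
        return sum(weights.get(c, 1.0) * v for c, v in values.items())
    return sorted(features, key=lambda f: score(features[f]), reverse=True)

features = {
    # Rarely usable but used at nearly every opportunity: preference signal.
    "traffic_reroute": {"usage_rate": 0.3, "usage_per_opportunity": 0.95},
    # Often usable yet only occasionally used.
    "poi_search": {"usage_rate": 0.5, "usage_per_opportunity": 0.20},
}
weights = {"usage_rate": 1.0, "usage_per_opportunity": 2.0}
print(accessibility_ranking(features, weights))
# ['traffic_reroute', 'poi_search']
```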
  • An example embodiment of the present application is directed to a method of adaptively determining accessibility of features provided to a user through a user interface (288) of an electronic device (200), the method including adaptively determining whether to increase accessibility of at least one of the features provided to the user through the user interface (288) based on a first criteria and adaptively determining whether to decrease the accessibility of the at least one of the features based on a second criteria.
  • FIG. 10 is a flowchart illustrating an example embodiment of a method of determining the accessibility of a feature based on defined criteria (e.g., categories of criteria).
  • the application software 286 is set to an Initial State S1000 where the accessibility of each feature of the navigation device 200 is definite.
  • the application software 286 may determine the accessibility ranking of one or more features based on one or more of the criteria S1010, S1020, S1030, S1040, S1050 and S1060. Based on the accessibility rankings, the application software 286 may make a decision S1075 whether or not to change the accessibility of one or more features of the GUI 288.
  • the application software may also determine how the accessibility of one or more features will change (e.g., increase or decrease). Once a determination is made, if the accessibility of each feature will remain the same (e.g., "NO" in FIG. 10), the Initial State S1000 is maintained. If the decision is to change the accessibility of one or more features based on at least one of the criteria (e.g., "YES" in FIG. 10), accessibility is changed and New State S1080 is achieved. At New State S1080, the accessibility of each feature of the navigation device 200 is definite.
  • Although FIG. 10 is illustrated with respect to criteria S1010, S1020, S1030, S1040, S1050 and S1060, one having ordinary skill in the art will understand that any number of criteria may be used to decide whether accessibility of a feature will change and these criteria are intended as examples.
  • Although a method of adaptively determining the accessibility of features is described with reference to accessibility rankings, example embodiments are not so limited.
  • the accessibility of features may be determined according to separate iterations in which only one feature is considered at a time without hierarchical ranking.
  • the accessibility of a feature may be set with or without respect to the accessibility of every other feature in many different ways.
  • Criteria S1010 is a usage characteristic of a user.
  • Usage characteristics generally refer to statistics compiled by the application software 286.
  • the usage characteristic is a number of times a feature is used by the user over a period of time in which the navigation device 200 is actively used. For example, if the device has been in use for 96 hours and a feature is never used, the application software 286 may lower the accessibility ranking of the feature.
  • a usage characteristic may include a number of previously accepted or rejected proposals.
  • the usage characteristic of criteria S1010 may include, for example, a usage characteristic of a different user and/or a statistical aggregation of a plurality of users.
  • a statistical aggregation of a plurality of users may include, for example, grouping users according to one or more usage characteristics. Accessibility of features may be changed based on, for example, trends related to users falling within the aggregation. For example, user 'User 1' has the following characteristics: male; traffic subscription; frequent user of traffic features; and device profile expert. 'User 1' belongs to an aggregation of users that have the same or similar usage characteristics.
  • the application software 286 may change the accessibility of features based on data indicating, for example, that other users falling within the aggregation often use a given feature or have accepted the device recommendation of that feature. For example, the application software 286 may adjust the likelihood that the accessibility of the feature is increased or the likelihood that the feature is recommended to 'User 1'.
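The aggregation idea might be sketched as follows; the profile fields, the grouping key and the proportional boost are illustrative assumptions, not the disclosed implementation.

```python
# Sketch of the user-aggregation idea: group users with similar
# characteristics and boost a feature other members of the group use.
# All profile fields and the boost rule are illustrative assumptions.

def aggregation_key(profile: dict) -> tuple:
    """Group users by a few coarse characteristics."""
    return (profile["traffic_subscription"], profile["device_profile"])

def recommendation_boost(user: dict, peers: list[dict], feature: str) -> float:
    """Increase the likelihood of recommending `feature` in proportion
    to how many similar users (same aggregation) actually use it."""
    group = [p for p in peers if aggregation_key(p) == aggregation_key(user)]
    if not group:
        return 0.0
    adopters = sum(1 for p in group if feature in p["features_used"])
    return adopters / len(group)  # 0.0 .. 1.0

user1 = {"traffic_subscription": True, "device_profile": "expert",
         "features_used": set()}
peers = [{"traffic_subscription": True, "device_profile": "expert",
          "features_used": {"Traffic Summary"}}]
print(recommendation_boost(user1, peers, "Traffic Summary"))  # 1.0
```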
  • Criteria S1020 is a user profile.
  • the navigation device 200 in communication with the server 302 and/or 728 may track the changes in a profile of a different user (e.g., a similar and/or specified user) and/or a prototypical user. These changes may be tracked with or without the user activating such tracking. Accessibility of a feature may be changed based on changes in the different user profile and/or a prototypical user profile. The changes in the different or prototypical user profile may be changes to the accessibility of features and/or any other profile parameter.
  • Criteria S1030 is a manner in which the PND is used.
  • the memory 230 may store selection sequences of navigation through the GUI 288 selection tree. Specific features may be associated with the one or more of the selection sequences.
  • the application software 286 may monitor the selections made by the user. If the user navigates through the GUI 288 selection tree such that the user selects options matching a selection sequence, the application software 286 may change the accessibility ranking of a feature. For example, if a specific selection sequence activates a specific type of feature, the application software 286 may increase the accessibility ranking of similar features.
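One possible reading of criteria S1030, sketched below: the stored sequence and its associated feature are hypothetical examples built from the destination-input selections described later in this document.

```python
# Sketch of criteria S1030: match the user's path through the GUI 288
# selection tree against stored selection sequences and raise the
# ranking of features associated with a matched sequence.
# Sequence contents and feature names are illustrative assumptions.

stored_sequences = {
    ("navigate to", "address", "street and house number"): "Recent Destinations",
}

def check_selection_sequence(history: list[str],
                             rankings: dict[str, float]) -> None:
    """If the tail of the user's selection history matches a stored
    sequence, increase the associated feature's accessibility ranking."""
    for sequence, feature in stored_sequences.items():
        n = len(sequence)
        if tuple(history[-n:]) == sequence:
            rankings[feature] = rankings.get(feature, 0.0) + 1.0

rankings = {"Recent Destinations": 0.0}
check_selection_sequence(["navigate to", "address",
                          "street and house number"], rankings)
print(rankings)  # {'Recent Destinations': 1.0}
```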
  • Criteria S1040 is a behaviour of a user, such as a driving behaviour.
  • the workload and/or stress of a driver may be monitored by a PND. If the workload and/or stress of a user increases, the user can generally attend to less information. Attending to information from the PND rather than the primary driving task in situations where the user is under increased workload and/or stress can potentially lead to unsafe situations.
  • the application software 286 may increase or decrease the accessibility of a feature based on workload and/or stress of the user.
  • Other example driving behaviours may include, for example, the usual time of the day that a user drives, the speed at which a user drives, and/or the average daily distances driven by the user. According to an example embodiment, the behaviour of the user may be a number of features used by the user.
  • Criteria S1050 is a preference of a user.
  • a preference of a user may be used to override a determination of the application software 286 as to whether or not to increase or decrease the accessibility ranking of a feature.
  • a user can prevent the application software 286 from increasing or decreasing the accessibility of a feature by locking the feature. For example, a user may deselect a feature so that it remains perpetually hidden despite a determination by the application software 286 that the feature should be made more accessible. According to at least one example embodiment, the user may remove a feature from the navigation device 200.
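A small sketch of how such a user lock might override the adaptive logic; the class layout is an assumption, not the disclosed implementation.

```python
# Sketch of criteria S1050: a user preference that locks a feature so
# the adaptive logic can neither raise nor lower its accessibility.

class AccessibilityTable:
    def __init__(self):
        self.levels: dict[str, float] = {}
        self.locked: set[str] = set()

    def lock(self, feature: str) -> None:
        self.locked.add(feature)          # user override (criteria S1050)

    def propose_change(self, feature: str, delta: float) -> None:
        if feature in self.locked:
            return                        # preference overrides the software
        self.levels[feature] = self.levels.get(feature, 0.0) + delta

table = AccessibilityTable()
table.lock("Turn Off Voice")
table.propose_change("Turn Off Voice", +2.0)  # ignored: feature is locked
print(table.levels)  # {}
```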
  • Criteria S1060 is a new feature.
  • a new feature may be a feature recently downloaded (e.g., from the server 302), one available on the device but hidden until an event occurs (e.g., unlocking of the feature via activation for a trial or by payment), or a feature shared between users (e.g., shared content).
  • new features have increased accessibility relative to pre-existing features.
  • example embodiments may decrease the accessibility of new features. For example, a user may wish to maintain a simple GUI 288 and hide all new features made available through the GUI 288 by the navigation device 200.
  • Criteria S1070 is the usability of a feature. If the application software 286 detects that the user is in a situation in which a feature may be used, the accessibility of the feature may be increased. The following examples are provided to illustrate criteria S1070. According to at least one embodiment, if a user consistently detours around a route provided by the PND, the feature "Avoid Part of the Route" may be made more accessible to the user. If "Avoid Part of the Route" is used frequently, the feature "Map Corrections" may be increased in accessibility. According to at least one embodiment, if a user is on a familiar route, the user may be prompted that the feature "Turn Off Voice" is available so that voice instruction may be turned off.
  • If a user frequently drives to a particular parking garage, the user may be prompted that the feature "Park Assist" is available.
  • Similarly, the user may be prompted that "POI" functionality is available.
  • If a user frequently drives through a street designated as blocked by the device 200, the user may be prompted that the feature "(Un)Block Street" is available.
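The situation-to-feature prompts above suggest a simple lookup, sketched below; the situation keys are invented labels, and the detection of each situation is assumed to happen elsewhere.

```python
# Sketch of criteria S1070: map detected driving situations to features
# whose accessibility should be raised or which should be offered to
# the user. Feature names are taken from the examples above; the
# situation labels and detection logic are illustrative assumptions.

situation_to_feature = {
    "consistent_detour": "Avoid Part of the Route",
    "frequent_avoid_part_of_route": "Map Corrections",
    "familiar_route": "Turn Off Voice",
    "frequent_parking_garage": "Park Assist",
    "drives_through_blocked_street": "(Un)Block Street",
}

def on_situation_detected(situation: str) -> str | None:
    """Return the feature to prompt for a detected situation, if any."""
    feature = situation_to_feature.get(situation)
    if feature is not None:
        print(f"Prompt user: '{feature}' is available")
    return feature

on_situation_detected("familiar_route")  # Prompt user: 'Turn Off Voice' ...
```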
  • a determination of whether or not to change the accessibility of one or more features provided to a user through a user interface may occur iteratively. If the accessibility of the feature will change, a determination of how the accessibility of the feature will change may occur.
  • FIG. 11 is a flowchart illustrating incremental change in the accessibility of a feature according to iterations of the method of FIG. 10.
  • step S1110 may represent a first iteration of the method of FIG. 10. For example, if criteria S1010 (usage characteristic) is the applied criteria and the user has frequently used a feature that is set at a relatively low accessibility level, the accessibility may be increased. The graphical representation of the feature may be moved to a position of the GUI 288 selection tree that requires fewer user inputs to activate.
  • a second iteration of the method of FIG. 10 is performed and the accessibility of the feature is increased.
  • the user continues to frequently use the feature and the feature is used more frequently than other similar features.
  • the feature is moved to the top level of the GUI 288 selection tree and the background color of the icon may be changed.
  • in step S1130, although the user continues to frequently use the feature, other features receive a higher ranking than in step S1120. Accordingly, the feature is moved down in the GUI 288 selection tree and a border is added so that it exceeds the visual prominence of other features at that level of the GUI 288 selection tree.
  • the icon is positioned so that it is more visually prominent than other icons at that level of the GUI 288 selection tree.
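The FIG. 11 walk-through might be modelled as moving a feature between levels of the selection tree while toggling visual prominence, as in this sketch; the level numbering and highlight flag are illustrative assumptions.

```python
# Sketch of the FIG. 11 iterations: each pass may move a feature up or
# down one level of the GUI 288 selection tree and adjust its visual
# prominence (background colour, border). Level 0 is the top level.

class FeaturePlacement:
    def __init__(self, name: str, level: int):
        self.name = name
        self.level = level            # depth in the selection tree
        self.highlighted = False      # border / background emphasis

    def promote(self) -> None:        # e.g., steps S1110 and S1120
        self.level = max(0, self.level - 1)
        if self.level == 0:
            self.highlighted = True   # changed background at top level

    def demote_with_border(self) -> None:  # e.g., step S1130
        self.level += 1
        self.highlighted = True       # border keeps the icon prominent
                                      # at the lower level, as described

f = FeaturePlacement("Map Corrections", level=2)
f.promote(); f.promote()              # two iterations of FIG. 10
print(f.level, f.highlighted)         # 0 True
```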
  • FIG. 11 is illustrative of an iterative method of determining the accessibility of a feature.
  • steps S1110-S1130 are examples only.
  • Although adaptive determination of accessibility of a feature is described as occurring in sequential iterative steps, the rate of change for each feature or group of features may be based on different frequencies (e.g., a feature may be eligible for change only every other iteration or for no iterations).
  • the sequence of increasing or decreasing accessibility is not limited by FIG. 11.
  • Although steps S1110 and S1120 show increases in accessibility and step S1130 shows a decrease in accessibility, the determination resulting from each iteration may be any one of an increase, a decrease or no change to the accessibility of at least one feature.
  • step S1110 shows an increase of the accessibility of a feature and the increase is described as a change in position of the feature in a GUI 288 selection tree.
  • any type of change of accessibility may be determined at each iteration, or no determination of a type of change in accessibility may occur (e.g., no determination where a same type of change occurs).
  • the accessibility of a new feature may be set to a maximum accessibility in step S1110.
  • the accessibility of a new feature may also include accessibility not available to existing features.
  • a new feature may be merged into a GUI 288 selection tree in such a manner that a new feature is easily distinguished from pre-existing features.
  • the distinguishing characteristic may be reserved solely for new features (e.g., reserved borders).
  • a new feature may also remain at a maximum accessibility despite satisfying criteria instructing a decrease in accessibility for pre-existing features, or at least remain at a maximum accessibility for a longer time period and/or until a condition is satisfied.
  • a new feature may remain at a maximum accessibility for a longer time period than a pre-existing feature or until criteria S1070 (usability of a feature) occurs. If criteria S1070 occurs and the navigation device 200 has detected an occasion on which the new feature can be used, the newly added feature may be made available to the user. For example, the feature may blink and/or prompt the user and/or be accompanied by an auditory alert.
  • the feature may be incrementally decreased in accessibility. For example, the new feature may decrease in accessibility until it is indistinguishable from pre-existing features. Once the new feature is indistinguishable from pre-existing features, the new feature may become subject to the same criteria and given the same accessibility as pre-existing features.
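A sketch of that new-feature lifecycle, assuming a numeric accessibility level, a reserved "new" marker and a fixed per-iteration decay; all three are assumptions for illustration.

```python
# Sketch of the new-feature lifecycle described above: start at maximum
# accessibility (with a marker reserved for new features), then decay
# per iteration until indistinguishable from pre-existing features.

MAX_LEVEL = 10.0

class NewFeature:
    def __init__(self, name: str):
        self.name = name
        self.level = MAX_LEVEL
        self.new_marker = True        # e.g., a reserved border style

    def iterate(self) -> None:
        """One decay step; once the feature no longer stands out it is
        treated like any pre-existing feature."""
        if self.new_marker:
            self.level -= 1.0
            if self.level <= 5.0:     # now indistinguishable
                self.new_marker = False

f = NewFeature("Park Assist")
for _ in range(6):
    f.iterate()
print(f.level, f.new_marker)          # 5.0 False
```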
  • a new feature may be initially set to a minimum accessibility until it may be used, and when the new feature may be used, the accessibility of the new feature is increased to an accessibility reserved for new features.
  • a prompt may be displayed indicating to the user that at least one of the features is available.
  • a feature may eventually be decreased in accessibility until it reaches a minimum accessibility.
  • a minimum accessibility may include unavailability of the feature on the device, availability on the device but not in the GUI 288, or removal of the graphical representation of the feature and placement of the feature in an inactive menu.
  • Change in accessibility of a feature may or may not be relatively transparent. For example, if the navigation device 200 detects that a user never uses a feature, it may become less distinguishable over time (e.g., over iterations as described with respect to FIG. 11, above). The graphical representation of the feature may change in background color, the border may change, and auditory alerting may be removed. Eventually the navigation device 200 may fade the feature until it is no longer represented graphically. In this case the user is never prompted and each change in accessibility is invisible to the user, so the progression to minimum accessibility is gradual and relatively transparent to the user. Alternatively, the progression to minimum accessibility may be abrupt and involve the user at every decrease of accessibility.
  • the application software 286 may prompt the user each time a decrease in accessibility of the feature is determined to be appropriate.
  • the transparency of change in accessibility of features may include a variable amount of involvement of the user and the transparency of change may be set accordingly.
  • a speed at which the application software 286 changes the accessibility of features may be adjustable.
  • the application software 286 may be set to change the accessibility of one or more features each iteration of FIG. 10.
  • the application software may be set to change the accessibility of the one or more features based on any number of iterations.
  • a speed may be assigned feature by feature, to subgroups of features or to all features.
  • the application software 286 may be set to never change the accessibility of features.
  • the speed and transparency at which the application software 286 changes the accessibility of features may be preset (e.g., by a manufacturer).
  • An initial global speed and transparency setting may be determined, for example, through empirical studies of the satisfaction of an average user or types of users to various rates of change of the GUI 288.
  • the speed and transparency of changing feature accessibility may be subsequently alterable by the user so that adaptive determination conforms to the comfort level of each user.
  • the alteration may be made by the user feature by feature, by subgroups of features or with respect to all features.
  • a speed and/or transparency of the accessibility of features may be based on a preference of the user, a profile selected by the user and/or a usage characteristic of the user.
  • the number of times a feature is used may be counted over a 96-hour period.
  • a user may, for example, extend or shorten that period with respect to one or more features or for every feature.
  • the GUI 288 may provide an overall setting for speed and transparency at which the application software 286 changes accessibility so that a user may adjust the speed and transparency of adaptive determination globally.
  • the feature accessibility may be set to change quickly for a progressive type of user (e.g., Michael Schumacher), may change at a slow pace for a conservative user and/or may change at a rate falling somewhere in between for an average user.
  • the global speed and transparency setting may adjust feature by feature settings by, for example, a common factor and/or according to a setting template.
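The global setting acting on per-feature speeds by a common factor could look like the following sketch; the preset names and factor values are assumptions.

```python
# Sketch of the global speed/transparency setting: a single user-facing
# control scales the per-feature rates of change by a common factor.

per_feature_speed = {"Map Corrections": 1.0, "Park Assist": 0.5}

presets = {"progressive": 2.0, "average": 1.0, "conservative": 0.25}

def apply_global_setting(profile: str) -> dict[str, float]:
    """Scale every feature's adaptation speed by the preset factor."""
    factor = presets[profile]
    return {name: speed * factor for name, speed in per_feature_speed.items()}

print(apply_global_setting("conservative"))
# {'Map Corrections': 0.25, 'Park Assist': 0.125}
```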
  • example embodiments are applicable to any device including a plurality of features provided to the user through a user interface.
  • Although the example embodiments implement certain functionality by means of software, that functionality could equally be implemented solely in hardware (for example, by way of one or more ASICs (application specific integrated circuits)) or indeed by a mix of hardware and software.
  • Alternative embodiments of the invention can be implemented as a computer program product for use with a computer system, the computer program product being, for example, a series of computer instructions or program segments stored on a tangible data recording medium (computer readable medium), such as a diskette, CD-ROM, ROM, or fixed disk, or embodied in a computer data signal, the signal being transmitted over a tangible medium or a wireless medium, for example, microwave or infrared.
  • the series of computer instructions or program segments can constitute all or part of the functionality of the method of embodiments described above, and can also be stored in any memory device, volatile or non-volatile, such as semiconductor, magnetic, optical or other memory device.

Abstract

Adaptive determination (S1075) of the accessibility of features provided to a user through a user interface (288) and electronic devices (200) including the same. Accessibility of a feature with respect to every other feature may be determined (S1075) based on criteria (S1005) at a speed and transparency that may be adjustable by the user.

Description

TITLE
METHODS OF ADAPTIVELY DETERMINING THE ACCESSIBILITY OF FEATURES PROVIDED THROUGH A USER INTERFACE AND NAVIGATION APPARATUSES USING THE SAME
Field
[0001] At least one embodiment of the present application generally relates to methods of adaptively determining the accessibility of features provided to a user through a user interface (UI) and electronic devices using the same. At least one embodiment generally relates to a user interface of an electronic device which is a navigation device, for example, portable navigation devices (so-called PNDs) or in-vehicle navigation devices; in particular to PNDs or in-vehicle navigation devices that include Global Positioning System (GPS) signal reception and processing functionality or other positioning systems. Other embodiments relate, more generally, to any type of device including a user interface.
Background
[0002] When purchasing an electronic device, consumers generally look for a device that simplifies their lives and which has many attractive features. For this reason, manufacturers often include a large number of features that appeal to a broad range of consumers. However, as more and more features are added (e.g., feature creep), a device may become cluttered and difficult to use.
[0003] Although consumers may purchase an electronic device based on the number and attractiveness of features included in the device, most will initially use very few of the available features. Further, even experienced users tend to only use about 15% of available features. As a result, electronic devices may be cumbersome to use because unused features prevent or delay the user from accessing functionality they want. In some cases, users become frustrated if they cannot access a desired feature and return the device as 'malfunctioning' even though the device is working as intended.
[0004] If a user continues to use a device and becomes more experienced, they are often ready to expand the number of features they access. However, it may be difficult for them to find new features they want to use, either because of the sheer amount of features made available by the device or because they do not know that a feature exists. This may lead to an overall lower satisfaction of the user or the user may even purchase a different device to obtain a desired feature, despite the fact that the current device includes that feature.
[0005] Navigation devices that include GPS (Global Positioning System) signal reception and processing functionality are well known and are widely employed as in-vehicle or other vehicle navigation systems. Such devices include user interfaces that provide a user with access to many features. Currently, feature creep and device clutter in a PND is addressed in a limited way by allowing a user to view a smaller subset of features (e.g., 'show FEWER' menu options). However, due to device clutter, a user may not find the 'show FEWER' feature. Additionally, existing methods do not tailor the accessibility of features to a particular user and the accessibility of the features does not change with the user.
Summary
[0006] Accordingly, to address at least one of the above and/or other problems, example embodiments disclose methods for adaptively determining the accessibility of features provided to a user through a user interface and/or electronic devices using the same.
[0007] According to at least one example embodiment, a method of adaptively determining accessibility of features provided to a user through a user interface (UI) of an electronic device may include adaptively determining whether to increase accessibility of at least one of the features provided to the user through the UI based on a first criteria and adaptively determining whether to decrease the accessibility of the at least one of the features based on a second criteria.
[0008] According to at least one example embodiment, a computer readable medium may include computer readable instructions stored thereon for execution by a processor to perform adaptive determination of the accessibility of features provided to a user through a UI.
[0009] According to at least one example embodiment, an electronic device may include a processor, a memory, a display and a user input device. The electronic device may be configured to provide instructions to the processor according to application software in order to adaptively determine whether to increase accessibility of at least one feature provided to a user by the electronic device based on a first criteria and to adaptively determine whether to decrease the accessibility of the at least one feature based on a second criteria.
[0010] Advantages of these and other embodiments are set out hereafter, and further details and features of each of these embodiments are defined in the accompanying claims and elsewhere in the following detailed description.
Brief Description of the Drawings
[0011] Various aspects of the teachings of the present disclosure, and arrangements embodying those teachings, will hereafter be described by way of illustrative example with reference to the accompanying drawings, in which:
[0012] FIG. 1 is a schematic illustration of a Global Positioning System (GPS);
[0013] FIG. 2 is a schematic illustration of electronic components arranged to provide a navigation device;
[0014] FIG. 3 is a schematic illustration of the manner in which a navigation device may receive information over a wireless communication channel;
[0015] FIGS. 4A and 4B are illustrative perspective diagrams of a navigation device;
[0016] FIGS. 5a-5i are illustrative screenshots from a navigation device for a destination input process;
[0017] FIG. 6 is an illustrative screenshot from a navigation device depicting a start location for an illustrative calculated route;
[0018] FIG. 7 is a schematic representation of an architectural stack employed by the navigation device of FIG. 3;
[0019] FIG. 8 is a high level diagram of constituent parts of a user profile;
[0020] FIG. 9 is a detailed schematic of the navigation system of FIG. 2;
[0021] FIG. 10 is a flowchart illustrating a method of determining the accessibility of a feature based on defined criteria; and
[0022] FIG. 11 is a flowchart illustrating incremental change in the accessibility of a feature according to iterations of the method of FIG. 10.
Detailed Description of Example Embodiments
[0023] Various example embodiments will now be described more fully with reference to the accompanying drawings in which some example embodiments are illustrated.
[0024] Accordingly, while example embodiments are capable of various modifications and alternative forms, embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that there is no intent to limit example embodiments to the particular forms disclosed, but on the contrary, example embodiments are to cover all modifications, equivalents, and alternatives falling within the scope of example embodiments. Like numbers refer to like elements throughout the description of the figures.
[0025] It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of example embodiments. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
[0026] It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., "between" versus "directly between," "adjacent" versus "directly adjacent," etc.).
[0027] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms "a," "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used herein, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.
[0028] Spatially relative terms, e.g., "beneath," "below," "lower," "above," "upper" and the like, may be used herein for ease of description to describe one element or a relationship between a feature and another element or feature as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as "below" or "beneath" other elements or features would then be oriented "above" the other elements or features. Thus, for example, the term "below" can encompass both an orientation which is above as well as below. The device may be otherwise oriented (rotated 90 degrees or viewed or referenced at other orientations) and the spatially relative descriptors used herein should be interpreted accordingly.
[0029] It should also be noted that in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two acts shown in succession may in fact be executed substantially concurrently or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
[0030] Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example embodiments belong. It will be further understood that terms, e.g., those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
[0031] Portions of example embodiments and corresponding detailed description are presented in terms of software, or algorithms and symbolic representations of operation on data bits within a computer memory. These descriptions and representations are the ones by which those of ordinary skill in the art effectively convey the substance of their work to others of ordinary skill in the art. An algorithm, as the term is used here, and as it is used generally, is conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of optical, electrical, or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It is convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
[0032] In the following description, illustrative embodiments will be described with reference to acts and symbolic representations of operations (e.g., in the form of flowcharts) that may be implemented as program modules or functional processes, including routines, programs, objects, components, and data structures, that perform particular tasks or implement particular abstract data types, and may be implemented using existing hardware at existing network elements or control nodes (e.g., a database). Such existing hardware may include one or more Central Processing Units (CPUs), digital signal processors (DSPs), application-specific integrated circuits, field programmable gate arrays (FPGAs), computers or the like.
[0033] It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise, or as is apparent from the discussion, terms such as "processing" or "computing" or "calculating" or "determining" or "displaying" or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical, electronic quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
[0034] Note also that the software implemented aspects of example embodiments are typically encoded on some form of computer readable medium or implemented over some type of transmission medium. The computer readable medium may be magnetic (e.g., a floppy disk or a hard drive) or optical (e.g., a compact disk read only memory, or "CD ROM"), and may be read only or random access. Similarly, the transmission medium may be twisted wire pairs, coaxial cable, optical fiber, or some other suitable transmission medium known to the art. Example embodiments are not limited by these aspects of any given implementation.
[0035] Example embodiments of the present disclosure will now be described with particular reference to a PND. It should be remembered, however, that the teachings of the present disclosure are not limited to PNDs but are instead universally applicable to any user interface. It follows therefore that in the context of the present application, a user interface is intended to include any device through which a user accesses a plurality of features. Examples of such devices include computing resources such as a desktop or portable personal computer (PC), mobile telephone or portable digital assistant (PDA).
[0036] With the above provisos in mind, FIG. 1 illustrates an example view of the Global Positioning System (GPS), usable by navigation devices. Such systems are known and are used for a variety of purposes. In general, GPS is a satellite-radio based navigation system capable of determining continuous position, velocity, time, and in some instances direction information for an unlimited number of users. Formerly known as NAVSTAR, the GPS incorporates a plurality of satellites which orbit the earth in extremely precise orbits. Based on these precise orbits, GPS satellites can relay their location to any number of receiving units.
[0037] The GPS system is implemented when a device, specially equipped to receive GPS data, begins scanning radio frequencies for GPS satellite signals. Upon receiving a radio signal from a GPS satellite, the device determines the precise location of that satellite via one of a plurality of different conventional methods. The device will continue scanning, in most instances, for signals until it has acquired at least three different satellite signals (noting that position is not normally determined with only two signals, but can be, using other triangulation techniques). Implementing geometric triangulation, the receiver utilizes the three known positions to determine its own two-dimensional position relative to the satellites. This can be done in a known manner. Additionally, acquiring a fourth satellite signal allows the receiving device to calculate its three-dimensional position by the same geometrical calculation in a known manner. The position and velocity data can be updated in real time on a continuous basis by an unlimited number of users.
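For concreteness, the two-dimensional case of the geometric triangulation described above can be sketched as solving the linearised circle-intersection equations; this is an illustrative simplification (real GPS receivers solve the three-dimensional problem, including receiver clock bias, with a fourth satellite).

```python
# Illustrative 2-D trilateration, echoing the geometric triangulation
# described above: three known satellite positions and measured ranges
# determine the receiver's two-dimensional position.
import math

def trilaterate_2d(p1, r1, p2, r2, p3, r3):
    """Solve the linearised circle-intersection equations."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Receiver actually at (1, 2); ranges computed from satellite positions.
sats = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
ranges = [math.dist(s, (1.0, 2.0)) for s in sats]
print(trilaterate_2d(sats[0], ranges[0], sats[1], ranges[1],
                     sats[2], ranges[2]))  # approx (1.0, 2.0)
```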
[0038] As shown in FIG. 1, the GPS system is denoted generally by reference numeral 100. A plurality of satellites 120 are in orbit about the earth 124. The orbit of each satellite 120 is not necessarily synchronous with the orbits of other satellites 120 and, in fact, is likely asynchronous. A GPS receiver 140 is shown receiving spread spectrum GPS satellite signals 160 from the various satellites 120.
[0039] The spread spectrum signals 160, continuously transmitted from each satellite 120, utilize a highly accurate frequency standard accomplished with an extremely accurate atomic clock. Each satellite 120, as part of its data signal transmission 160, transmits a data stream indicative of that particular satellite 120. It is appreciated by those skilled in the relevant art that the GPS receiver device 140 generally acquires spread spectrum GPS satellite signals 160 from at least three satellites 120 for the GPS receiver device 140 to calculate its two-dimensional position by triangulation. Acquisition of an additional signal, resulting in signals 160 from a total of four satellites 120, permits the GPS receiver device 140 to calculate its three-dimensional position in a known manner.
[0040] FIG. 2 is an illustrative representation of electronic components of a navigation device 200 according to an example embodiment of the present disclosure, in block component format. It should be noted that the block diagram of the navigation device 200 is not inclusive of all components of the navigation device, but is only representative of many example components.
[0041] The navigation device 200 is located within a housing (not shown). The housing includes a processor 210 connected to an input device 220 and a display device 240. The input device 220 can include a keyboard device, voice input device, touch panel and/or any other known input device utilized to input information; and the display device 240 can include any type of display screen such as an LCD display, for example. In an example arrangement, the input device 220 and display device 240 are integrated into an integrated input and display device, including a touchpad or touch screen input so that a user need only touch a portion of the display screen 240 to select one of a plurality of display choices or to activate one of a plurality of virtual buttons.
[0042] The navigation device may include an output device 260, for example an audible output device (e.g., a loudspeaker). As output device 260 can produce audible information for a user of the navigation device 200, it should equally be understood that input device 220 can include a microphone and software for receiving input voice commands as well.
[0043] In the navigation device 200, processor 210 is operatively connected to and set to receive input information from input device 220 via a connection 225, and operatively connected to at least one of display device 240 and output device 260, via output connections 245, to output information thereto. Further, the processor 210 is operably coupled to a memory 230 via connection 235 and is further adapted to receive/ send information from/ to input/ output (I/O) ports 270 via connection 275, wherein the I/O port 270 is connectible to an I/O device 280 external to the navigation device 200. The memory 230 comprises, for example, a volatile memory, such as a Random Access Memory (RAM) and a non-volatile memory, for example a digital memory, such as a flash memory. The external I/O device 280 may include, but is not limited to an external listening device such as an earpiece for example. The connection to I/O device 280 can further be a wired or wireless connection to any other external device such as a car stereo unit for hands-free operation and/ or for voice activated operation for example, for connection to an ear piece or head phones, and/or for connection to a mobile phone for example, wherein the mobile phone connection may be used to establish a data connection between the navigation device 200 and the internet or any other network for example, and/ or to establish a connection to a server via the internet or some other network for example.
[0044] FIG. 2 further illustrates an operative connection between the processor 210 and an antenna/receiver 250 via connection 255, wherein the antenna/receiver 250 may be a GPS antenna/receiver for example. It will be understood that the antenna and receiver designated by reference numeral 250 are combined schematically for illustration, but that the antenna and receiver may be separately located components, and that the antenna may be a GPS patch antenna or helical antenna for example.
[0045] Further, it will be understood by one of ordinary skill in the art that the electronic components shown in FIG. 2 are powered by power sources (not shown) in a conventional manner. As will be understood by one of ordinary skill in the art, different configurations of the components shown in FIG. 2 are considered to be within the scope of the present application. For example, the components shown in FIG. 2 may be in communication with one another via wired and/or wireless connections and the like. Thus, the scope of the navigation device 200 of the present application includes a portable or handheld navigation device 200.
[0046] In addition, the portable or handheld navigation device 200 of FIG. 2 may be connected or "docked" in a known manner to a vehicle such as a bicycle, a motorbike, a car or a boat for example. Such a navigation device 200 may then be removable from the docked location for portable or handheld navigation use.
[0047] Referring now to FIG. 3, the navigation device 200 may establish a "mobile" or telecommunications network connection with a server 302 via a mobile device (not shown) (e.g., a mobile phone, PDA, and/or any device with mobile phone technology) establishing a digital connection (e.g., as a digital connection via known Bluetooth technology). Thereafter, through a network service provider, the mobile device may establish a network connection (e.g., through the internet) with a server 302. As such, a "mobile" network connection may be established between the navigation device 200 (which can be, and often times is mobile as it travels alone and/or in a vehicle) and the server 302 to provide a "real-time" or at least very "up to date" gateway for information.
[0048] The establishing of the network connection between the mobile device (via a service provider) and another device such as the server 302, using an internet such as the World Wide Web for example, can be done in a known manner. This can include use of TCP/IP layered protocol for example. The mobile device can utilize any number of communication standards such as CDMA, GSM, and/or WAN.
[0049] An internet connection may be utilised which is achieved via data connection, via a mobile phone or mobile phone technology within the navigation device 200 for example. For this connection, an internet connection between the server 302 and the navigation device 200 may be established. This can be done, for example, through a mobile phone or other mobile device and a GPRS (General Packet Radio Service) connection. A GPRS connection is a high-speed data connection for mobile devices provided by telecom operators; GPRS is a method to connect to the internet.
[0050] The navigation device 200 can further complete a data connection with the mobile device, and eventually with the internet and server 302, via existing Bluetooth technology for example, in a known manner, wherein the data protocol can utilize any number of standards, such as the GSRM, the Data Protocol Standard for the GSM standard, for example.
[0051] The navigation device 200 may include its own mobile phone technology within the navigation device 200 itself, including an antenna for example, or optionally using the internal antenna of the navigation device 200. The mobile phone technology within the navigation device 200 can include internal components as specified above, and/or can include an insertable card (e.g., Subscriber Identity Module or SIM card), complete with necessary mobile phone technology and/or an antenna for example. As such, mobile phone technology within the navigation device 200 can similarly establish a network connection between the navigation device 200 and the server 302, via the internet for example, in a manner similar to that of any mobile device.
[0052] For GPRS phone settings, a Bluetooth enabled navigation device may be used to work correctly with the ever changing spectrum of mobile phone models, manufacturers, etc., and model/manufacturer specific settings may be stored on the navigation device 200, for example. The data stored for this information can be updated.
[0053] In FIG. 3 the navigation device 200 is depicted as being in communication with the server 302 via a generic communications channel 318 that can be implemented by any of a number of different arrangements. The server 302 and a navigation device 200 can communicate when a connection via communications channel 318 is established between the server 302 and the navigation device 200 (noting that such a connection can be a data connection via mobile device, a direct connection via personal computer via the internet, etc.).
[0054] The server 302 may include, in addition to other components which may not be illustrated, a processor 304 operatively connected to a memory 306 and further operatively connected, via a wired or wireless connection 314, to a mass data storage device 312. The processor 304 is further operatively connected to transmitter 308 and receiver 310, to send and receive information to and from the navigation device 200 via communications channel 318. The signals sent and received may include data, communication, and/or other propagated signals. The transmitter 308 and receiver 310 may be selected or designed according to the communications requirement and communication technology used in the communication design for the navigation device 200. Further, it should be noted that the functions of transmitter 308 and receiver 310 may be combined into a single transceiver.
[0055] Server 302 is further connected to (or includes) a mass storage device 312, noting that the mass storage device 312 may be coupled to the server 302 via communication link 314. The mass storage device 312 may contain a store of navigation data and map information, and may again be a separate device from the server 302 or can be incorporated into the server 302.
[0056] The navigation device 200 may be adapted to communicate with the server 302 through communications channel 318, and may include processor 210, memory 230, etc. as previously described with regard to FIG. 2, as well as transmitter 320 and receiver 322 to send and receive signals and/or data through the communications channel 318, noting that these devices can further be used to communicate with devices other than server 302. Further, the transmitter 320 and receiver 322 are selected or designed according to communication requirements and communication technology used in the communication design for the navigation device 200 and the functions of the transmitter 320 and receiver 322 may be combined into a single transceiver.
[0057] Software stored in server memory 306 may provide instructions for the processor 304 and may allow the server 302 to provide services to the navigation device 200. One service that may be provided by the server 302 involves processing requests from the navigation device 200 and transmitting navigation data from the mass data storage 312 to the navigation device 200. Another service that may be provided by the server 302 includes processing the navigation data using various algorithms for a desired application and sending the results of these calculations to the navigation device 200.
[0058] The communication channel 318 generically represents the propagating medium or path that connects the navigation device 200 and the server 302. Both the server 302 and navigation device 200 may include a transmitter for transmitting data through the communication channel and a receiver for receiving data that has been transmitted through the communication channel.
[0059] The communication channel 318 is not limited to a particular communication technology. Additionally, the communication channel 318 is not limited to a single communication technology; that is, the channel 318 may include several communication links that use a variety of technology. For example, the communication channel 318 can be adapted to provide a path for electrical, optical, and/or electromagnetic communications, etc. As such, the communication channel 318 includes, but is not limited to, one or a combination of the following: electric circuits, electrical conductors such as wires and coaxial cables, fibre optic cables, converters, radio-frequency (RF) waves, the atmosphere, empty space, etc. Furthermore, the communication channel 318 can include intermediate devices such as routers, repeaters, buffers, transmitters, and receivers, for example.
[0060] In one illustrative arrangement, the communication channel 318 includes telephone and computer networks. Furthermore, the communication channel 318 may be capable of accommodating wireless communication such as radio frequency, microwave frequency, infrared communication, etc. Additionally, the communication channel 318 can accommodate satellite communication.
[0061] The communication signals transmitted through the communication channel 318 include, but are not limited to, signals as may be required or desired for given communication technology. For example, the signals may be adapted to be used in cellular communication technology such as Time Division Multiple Access (TDMA), Frequency Division Multiple Access (FDMA), Code Division Multiple Access (CDMA), Global System for Mobile Communications (GSM), etc. Both digital and analogue signals may be transmitted through the communication channel 318. These signals may be modulated, encrypted and/or compressed signals as may be desirable for the communication technology.
[0062] The server 302 may include a remote server accessible by the navigation device 200 via a wireless channel. The server 302 may include a network server located on a local area network (LAN), wide area network (WAN), virtual private network (VPN), etc.
[0063] The server 302 may include a personal computer such as a desktop or laptop computer, and the communication channel 318 may be a cable connected between the personal computer and the navigation device 200. Alternatively, a personal computer (not shown) may be connected between the navigation device 200 and the server 302 to establish an internet connection between the server 302 and the navigation device 200. Alternatively, a mobile telephone or other handheld device (not shown) may establish a wireless connection to the internet, for connecting the navigation device 200 to the server 302 via the internet.
[0064] The navigation device 200 may be provided with information from the server 302 via information downloads which may be periodically updated automatically or upon a user connecting the navigation device 200 to the server 302, and/or may be more dynamic upon a more constant or frequent connection being made between the server 302 and navigation device 200 via a wireless mobile connection device and TCP/IP connection, for example. For many dynamic calculations, the processor 304 in the server 302 may be used to handle the bulk of the processing needs; however, processor 210 of navigation device 200 can also handle much processing and calculation, oftentimes independent of a connection to a server 302.
[0065] As indicated above in FIG. 2, a navigation device 200 may include a processor 210, an input device 220, and a display device 240. The input device 220 and display device 240 may be integrated into an integrated input and display device to enable both input of information (via direct input, menu selection, etc.) and display of information through a touch panel screen, for example. Such a screen may be a touch input LCD screen, for example, as is well known to those of ordinary skill in the art. Further, the navigation device 200 can also include any additional input device 220 and/or any additional output device 260, such as audio input/ output devices for example.
[0066] FIGS. 4A and 4B are perspective views of a navigation device 200. As shown in FIG. 4A, the navigation device 200 may be a unit that includes an integrated input and display device 290 (a touch panel screen for example) and the other components of FIGS. 2 and 3 (including but not limited to an internal GPS, and antenna/ receiver 250, processor 210, a power supply, memory 230, etc.).
[0067] The navigation device 200 may sit on an arm 292, which itself may be secured to a vehicle dashboard/ window/ etc. using a suction cup 294. This arm 292 is one example of a docking station to which the navigation device 200 can be docked.
[0068] As shown in FIG. 4B, the navigation device 200 can be docked or otherwise connected to an arm 292 of the docking station by snap connecting the navigation device 200 to the arm 292 for example. The navigation device 200 may then be rotatable on the arm 292, as shown by the arrow of FIG. 4B. To release the connection between the navigation device 200 and the docking station, a button on the navigation device 200 may be pressed, for example. Other equally suitable arrangements for coupling and decoupling the navigation device to a docking station are well known to persons of ordinary skill in the art.
[0069] Referring now to FIGS. 5a-5i there is depicted a series of screenshots from a navigation device 200. This navigation device 200 may have a touch screen interface for displaying information to a user and for accepting input to the device from the user. The screenshots show an illustrative example embodiment of a destination location input process for a user whose home location has been set to the offices in The Hague of the European Patent Office, and who wishes to navigate to a street address in Amsterdam, The Netherlands for which they know the street name and building number.
[0070] When the user switches on their navigation device 200, the device may acquire a GPS fix and calculate (in a known manner) the current location of the navigation device 200. The user may then be presented, as shown in FIG. 5a, with a display 340 showing in pseudo three-dimensions the local environment 342 in which the navigation device 200 is determined to be located, and in a region 344 of the display 340 below the local environment a series of control and status messages.
[0071] By touching the display of the local environment 342, the navigation device 200 switches to display (as shown in FIG. 5b) a series of virtual buttons 346 by means of which a user can, inter alia, input a destination that they wish to navigate to.
[0072] By touching the "navigate to" virtual button 348, the navigation device 200 may switch to display (as shown in FIG. 5c) a plurality of virtual buttons that are each associated with a different category of selectable destinations. In this instance, the display shows a "home" button that if pressed would set the destination to the stored home location. However, in this instance as the user is already at their home location (namely the EPO's offices in the Hague) selecting this option may not cause a route to be generated. The "favourite" button, if pressed, may reveal a list of destinations that the user has previously stored in the navigation device 200 and if one of these destinations is then selected the destination for the route to be calculated may be set to the selected previously stored destination. The "recent destination" button, if pressed, may reveal a list of selectable destinations held in the memory 230 of the navigation device 200 and to which the user has recently navigated. Selection of one of the destinations populating this list may set the destination location for this route to the selected (previously visited) location. The "point of interest" button, if pressed, may reveal a number of options by which a user can opt to navigate to any of a plurality of locations, such as cash machines, petrol stations or tourist attractions for example, that have been pre-stored in the device as locations that a user of the device might want to navigate to. The "arrow" shaped virtual button may open a new menu of additional options, and the "address" button 350 may commence a process by which the user can input the street address of the destination that they wish to navigate to.
[0073] Since the user, in this example, knows the street address of the destination that they wish to navigate to, it is assumed that this "address" button is operated (by touching the button displayed on the touch screen), whereupon (as shown in FIG. 5d) the user is presented with a series of address input options - in particular for address input by "city centre", by "postcode", by "crossing or intersection" (for example a junction of two roads) and by "street and house number".
[0074] In this example the user knows the street address and house number of the destination and hence selects the "street and house number" virtual button 352 whereupon the user may then be presented with, as shown in FIG. 5e, a prompt 354 to enter the name of the city that they wish to navigate to, a flag button 356 by which the user can select the country in which the desired city is located, and a virtual keyboard 358 that may be operated by the user, if necessary, to input the name of the destination city. In this instance the user has previously navigated to locations in Rijswijk and Amsterdam, and the PND therefore additionally provides the user with a list 360 of selectable cities.
[0075] The user in this instance wishes to navigate to Amsterdam, and on selection of Amsterdam from the list 360 the navigation device 200 displays, as shown in FIG. 5f, a virtual keyboard 362 by means of which a user can input street names, a prompt 364 for entry of a street name and, in this instance, as the user has previously navigated to a street in Amsterdam, a list 366 of selectable streets in Amsterdam.
[0076] In this example the user wishes to return to the street, Rembrandtplein, that they have previously visited and so selects Rembrandtplein from the displayed list 366.
[0077] Once a street has been selected, the navigation device 200 may then display a smaller virtual keypad 368 and prompt the user, via prompt 370, to enter the number of the house in the selected street and city that they wish to navigate to. If the user has previously navigated to a house number in this street, then that number (as shown in FIG. 5g) is initially shown. If, as in this instance, the user wishes to navigate to No. 35, Rembrandtplein once again, then the user need only touch a "done" virtual button 372 displayed at the bottom right hand corner of the display. If the user should wish to navigate to a different house number in Rembrandtplein, then all they need do is operate the keypad 368 to input the appropriate house number.
[0078] Once the house number has been input, the user is asked in FIG. 5h, whether they wish to arrive at a particular time. If the user should push the "yes" button, then functionality is invoked that estimates the time required to travel to the destination and advises the user when they should leave (or if they are running late, should have left) their current location in order to arrive at their destination on time. In this instance the user is not concerned about arriving at a particular time and hence selects the "no" virtual button.
[0079] Selecting the "no" button 374 may cause the navigation device 200 to calculate a route between the current location and the selected destination and to display that route 376, as shown in FIG. 5i, on a relatively low magnification map that shows the entire route. The user may be provided with a "done" virtual button 378 which they can press to indicate that they are happy with the calculated route, a "find alternative" button 380 that the user can press to cause the navigation device 200 to calculate another route to the selected destination, and a "details" button 382 that a user can press to reveal selectable options for the display of more detailed information concerning the currently displayed route 376.
[0080] In the above described FIGS. 5a-5i, it is assumed that the user is happy with the displayed route, and once the "done" button 378 has been pressed the user may be presented, as shown in FIG. 6, with a pseudo three-dimensional view of the current (start) location of the navigation device 200. The display depicted in FIG. 6 is similar to that shown in FIG. 5a except that the displayed local environment 342 now includes a start location flag 384 and a waypoint indicator 386 indicating the next manoeuvre (in this instance, a left hand turn). The lower part of the display has also changed and now displays the name of the street in which the navigation device 200 is currently located, an icon 388 indicating the distance to and type of the next manoeuvre (from the current location of the navigation device 200), and a dynamic display 390 including the distance and time to the selected destination.
[0081] The user may then commence their journey and the navigation device 200 may guide the user, in a known manner, by updating the map in accordance with determined changes in navigation device 200 location, and by providing the user with visual and, optionally, audible navigation instructions.
[0082] FIG. 7 is an example diagram of hardware and software cooperation of a navigation device 200. Turning to FIG. 7, hardware 281 (e.g., processor 210 and memory 230) may cooperate to support a BIOS (Basic Input/Output System) 282 that functions as an interface between functional hardware components 281 of the navigation device 200 and the software executed by the device. The processor 210 may load an operating system 284 from the memory 230, which provides an environment in which application software 286 (implementing the functionality of the navigation device 200) can run. The application software 286 may provide an operational environment including a graphical user interface (GUI) 288 that supports core functions of the navigation device, for example map viewing and route planning, as described above. The application software 286 may provide features to a user through the GUI 288 by displaying, for example, virtual buttons 346 graphically displayed as icons.
[0083] FIG. 8 is an example high level diagram of constituent parts of a user profile. The navigation device 200 may be configured to determine the accessibility of features provided to a user through a user interface based on, inter alia, the profile of the user. As shown in FIG. 8, the navigation device 200 may be adapted to receive input from interaction with a user 616 indicative of physiological 602, psychological 604, behavioural 606, external 608 and interactive 610 factors. These factors 602-610 may be logged in a memory 230 of the navigation device 200 and used to determine a user profile 614. The user profile 614 may be updated in real time in order to keep the profile 614 current and relevant. Changes in the profile 614 may be indicative of a variety of factors associated with the user. Although the user profile is described with respect to the user, the user profile 614 may be the profile of the user, of other known users, and/or a profile of a prototypical user (e.g., Safe Driving user profile).
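By way of illustration only, the factor categories 602-610 feeding the user profile 614 might be modelled as a simple record type. The following is a minimal sketch, assuming one Python dictionary per category; the class and field names are illustrative assumptions, not terms defined by this application.

    from dataclasses import dataclass, field

    @dataclass
    class UserProfile:
        # One bucket per factor category 602-610 logged from user interaction 616.
        # The dictionary representation is an assumption of this sketch.
        physiological: dict = field(default_factory=dict)
        psychological: dict = field(default_factory=dict)
        behavioural: dict = field(default_factory=dict)
        external: dict = field(default_factory=dict)
        interactive: dict = field(default_factory=dict)

        def log(self, category: str, key: str, value) -> None:
            # Update the profile in real time so it stays current and relevant.
            getattr(self, category)[key] = value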
[0084] A user profile 614 may include subjective input provided by a user (e.g., explicit measure). A user may populate a profile by, for example, answering questions. For example, questions may be provided through a web interface or start-up wizard to create an extensive profile. The answers to the questions may identify subjective characteristics of the user. Example subjective characteristics of the user may include, but are not limited to, a level of experience with a particular user interface, a level of interest in technology, a degree of desired user control of a device 200, acceptance of user initiatives, a level of desired interaction (e.g., interaction with a device 200 while driving), a knowledge of available functionality provided by a device 200, an expected usage of the available functionality, a type of functionality desired, etc. A user profile 614 may also include feedback from a user, for example, feedback concerning events initiated by the device 200. One having ordinary skill in the art will understand that a user profile 614 may include a wide variety of subjective input of a user.
[0085] A user profile 614 may include objective data. For example, the application software 286 may collect data related to the use of a navigation device 200 by a user. The objective data may be used, for example, to infer characteristics of a user (e.g., implicit measure). The type of collected data may include, for example, a type of functionality used, an amount of functionality used, how functionality is used over time, the amount of functionality added or removed by a user, a technological level of the functionality used by the user (e.g., advanced or basic), an amount of modification of functionality (e.g., use of route changing functionality), a quality of user interaction with a navigation device 200 (e.g., smoothness while driving), a frequency of use of the navigation device 200, an amount of use of explicit measures, a rate at which a user rejects or accepts functionality proposed by a navigation device 200, etc. One having ordinary skill in the art will understand that a user profile 614 may include a wide variety of objective data based on, for example, the use of the navigation device 200. In at least one embodiment, a user profile may include a specific assignment of the accessibility of each feature made available through the application software 286.
[0086] The application software 286 may consider both explicit and implicit measures in determining the accessibility of features provided to a user through a user interface. For example, although the accessibility of functionality provided to a user through a user interface may initially be based predominantly on explicit measures, data concerning implicit measures may indicate error or change in the explicit measures. The application software 286 may adaptively conform the functionality provided by the navigation device 200 (described below) based on both subjective and objective characteristics of the user. However, example embodiments are not so limited. For example, the functionality provided by the navigation device 200 may be based on only explicit or implicit measures. In at least one example embodiment, the weight given to implicit and explicit measures is different.
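A minimal sketch of how explicit and implicit measures might be combined, assuming each measure has been normalised to a score between 0 and 1 and that the weights are tunable parameters (both assumptions of this illustration, not requirements of the embodiments):

    def blended_score(explicit: float, implicit: float,
                      w_explicit: float = 0.7, w_implicit: float = 0.3) -> float:
        # Weighted combination of an explicit (subjective) and an implicit
        # (objective) measure. The initial bias toward explicit measures can
        # be reduced as implicit usage evidence accumulates.
        return w_explicit * explicit + w_implicit * implicit

Shifting weight from w_explicit to w_implicit over time would mirror the case where implicit data indicates error or change in the explicit measures.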
[0087] FIG. 9 is an example of a detailed schematic of the navigation system of FIG. 2. The navigation device 200 is shown in more detail in FIG. 9. It will be appreciated that a skilled person may select only some of the described components or factors when implementing the device, or may substitute alternative components or factors not described herein.
[0088] The navigation device 200 may include a processor 210 coupled to memory 230. The memory 230 may be arranged to store mapping and navigational data 704 and user profiles 614, and/or the data and user profiles may be accessed from a remote data store or server 728. The server 728 may be coupled to and/or include remote databases or information services 730-734 (e.g., traffic database 730, map database 732 and/or weather database 734). The navigation device 200 may be equipped with a positioning system 726 such as a GPS system or a mobile communication network triangulation system, as shown in Figure 1, for determining the current location of the vehicle.
[0089] The navigation device 200 may be coupled with input devices 708-718 for collecting data indicative of factors for use in determining the user's profile. Such input devices may include, for example, physiological sensors 708, microphones 710, user input devices 712, links 714 to remote databases, wireless signal receivers 716 (such as Bluetooth™ receivers and mobile phone signal receivers), cameras 718, etc. The navigation device 200 may also be provided with output devices such as a display 720, a vibrating alert device 722, an audio output device 724, etc.
[0090] The navigation device 200 may also be adapted to collect, store and update data input by a user by using the user input devices 712 to receive input from the user. Examples of suitable user input devices include touch screens, a keyboard, buttons, roller balls, soft keys or virtual keys, a link to a personal computer, etc.
[0091] The interface 288 may be operable to present prompts and/or questions to the user and obtain input representing, for example, personal preferences and characteristics from the user. Examples of such preferences may include, for example, whether a user prefers accessibility to every feature of the navigation device 200 or less than all the features. The interface 288 may also be arranged to prompt and/or obtain input from the user regarding whether or not to change the accessibility of features, for example.
[0092] The navigation device 200 may be provided with a communications module, for example a USB port or wi-fi device, so as to allow the navigation device 200 to link to a personal computer to transfer details of the collected data from the computer to the device. The navigation device 200 may be adapted to collect and, where appropriate, update the variables forming the user's profile 614. The entire profile 614 may be analysed using algorithms in order to determine the user's characteristics.
[0093] An example embodiment of a determination of the accessibility of features provided to a user through a user interface will now be described. Accessibility as described herein may mean absolute accessibility in which a user either does or does not have access to a feature. Accessibility may also mean relative accessibility, where accessibility may be a measure of the difficulty in accessing, or time needed to access, a feature relative to other features. For example, a feature that may be accessed through a single input of the user interface (e.g., a single voice command or application of tactile pressure) may be considered relatively more accessible than a feature that is accessed using more than one input (e.g., multiple voice commands or multiple applications of tactile pressure). Relative accessibility may also describe the level of perceptibility of a feature relative to other features in terms of visual or auditory prominence.
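One way to make relative accessibility concrete, assuming for the sake of the sketch that it is measured solely by the number of user inputs needed to reach a feature (only one of the several measures described above):

    def relative_accessibility(inputs_needed: int) -> float:
        # Fewer inputs means more accessible: a feature reached by a single
        # voice command or touch scores 1.0; deeper features score lower.
        return 1.0 / inputs_needed

    # A top-level virtual button (one touch) outranks a feature four menus deep.
    assert relative_accessibility(1) > relative_accessibility(4)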
[0094] In a case where relative accessibility refers to visual or auditory prominence, a visual prominence of a feature may refer to aspects of a graphical representation of the feature, for example aspects of an icon. Example aspects of an icon may include a background color of an icon, a border of an icon, a size of the icon, a blink speed of an icon, a location of an icon on a display screen, and/or the form of the icon, etc. A feature may be made more or less prominent by altering one or more of the example aspects (e.g., adding or removing a border) and/or aspects other than those described herein. Although visual prominence has been described in terms of the prominence of a graphical representation of a feature, any changes in visual prominence are contemplated. For example, changing the brightness of pixels of the display device 240 associated with a feature may change a visual prominence of the feature.
[0095] Similarly, in a case where relative accessibility refers to auditory prominence, auditory prominence of a feature may refer to an auditory cue associated with the feature. For example, a sound may alert a user that a particular feature may be used. Auditory alerting may include, for example, specific tones, series of tones, changes in volume, changes in speed of a series of tones, and/or changes in pitch.
[0096] Although example embodiments are described with respect to visual and auditory stimuli, any stimulus involving a physiological method of perception used to alert a user is contemplated by example embodiments, for example tactile stimulus (e.g., vibration).
[0097] The application software 286 may initially assign an accessibility level to the features of a navigation device 200 based on criteria determined by, for example, a manufacturer. The manufacturer may pre-configure the navigation device 200 to make available all or a limited subset of the features based on, for example, whether the navigation device 200 is new, the model of the device and/or the target user. The limited subset of features may include a base number of essential features to initially provide a streamlined or simplified version of the GUI 288 to all users. However, other possibilities exist. The accessibility of features provided through the application software 286 may be based on, for example, the user profile 614, the profile of other known users, template users (e.g., prototype users), and/or the characteristics of users where the device will be sold. One having ordinary skill in the art understands that the selection of base features may be according to any number of different factors.
[0098] Although the application software 286 may initially preset the accessibility of the features provided by the navigation device 200, in an embodiment of the present application, a user may change the preset accessibility. By using the input devices 712 and/or the input device 204 to provide input from the user to the navigation device 200, the accessibility preset by the application software 286 may be changed in a variety of ways. For example, the user may manually set the accessibility of a single feature, a subset of features or all the features at once. A corner of the display 340 may be a virtual button used by the user to activate a functionality that facilitates manually changing the accessibility of features. The user may also set the accessibility of features by selecting, for example, profiles of other users, known users or prototype users. These profiles may be saved in the memory 214 and/or may be imported from an external source (e.g., the server 150). User profiles, such as the user profile 614, may include a specific assignment of the accessibility of each feature made available through the application software 286.
[0099] A user may also be provided with assistance in determining the accessibility of features through the GUI 288. For example, the navigation device 200 may provide advice to a user by way of a prompt on the display device 240 based on a criteria (e.g., usage statistics). According to another example embodiment, the user may ask the navigation device 200 to propose a change in accessibility. For example, the user may ask the navigation device 200 to propose adding or removing one or more features. The navigation device 200 may propose adding or removing a feature based on, for example, a criteria (described below), a short questionnaire provided to the user, and/or based upon history and speed of adaptation when a next set of features is provided by a manufacturer.
[0100] Other examples in which a user may change the accessibility of features provided by the navigation device 200 through the GUI 288 include going back in history (e.g., reverting to a user profile as it existed in the past) and/or synchronizing the navigation device 200 with GPS traces. GPS traces may include positional data collected by the navigation device 200. Synchronization of the navigation device 200 with GPS traces may provide objective data that may be used by the application software 286 to determine the accessibility of features (described below).
[0101] A change in the accessibility of features may involve expanding or contracting the GUI 288 (e.g., adding or removing features). Although expansion or contraction may refer to changing the absolute availability of a feature on a navigation device 200, it may also mean that the accessibility of the feature is significantly changed. For example, a GUI 288 may be contracted by deleting or hiding a feature from the user, or by removing the feature from display as an icon (virtual button) and placing the feature into an 'inactive menu' listing inactive functions. Although not required, the inactive menu itself is generally reduced in accessibility so that it requires several user inputs to access and is therefore buried in a selection tree. FIGS. 5a-5c illustrate examples of how a GUI 288 selection tree may be navigated by activating virtual buttons (e.g., virtual buttons 346) in sequence to locate a feature or menu of interest.
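Contraction and expansion of the GUI 288 might be sketched as follows; the container names (visible_icons, inactive_menu) are assumptions of this illustration rather than identifiers used by the application software 286:

    class Gui:
        def __init__(self) -> None:
            self.visible_icons: list[str] = []   # features shown as virtual buttons
            self.inactive_menu: list[str] = []   # buried deeper in the selection tree

        def contract(self, feature: str) -> None:
            # Remove the feature's icon and park it among inactive functions.
            if feature in self.visible_icons:
                self.visible_icons.remove(feature)
                self.inactive_menu.append(feature)

        def expand(self, feature: str) -> None:
            # Restore the feature's icon (virtual button) to the display.
            if feature in self.inactive_menu:
                self.inactive_menu.remove(feature)
                self.visible_icons.append(feature)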
[0102] The navigation device 200 may be configured to adaptively determine the accessibility of features provided to a user through a GUI 288. Adaptive determination may be an optional feature or may be an essential function. In general, if users are satisfied with the functionality they are using, the GUI 288 may not change. If the user desires to make use of a greater amount of the available functionality, the accessibility of previously unused features may be increased. If the user does not make use of currently available features, the accessibility of those features may be decreased. Whether or not the accessibility of a feature is changed may be determined by the application software 286 based on, for example, one or more criteria.
[0103] Thus, an example embodiment of the present application is directed to an electronic device (200), including a processor (210), a memory (230), a display (720), and a user input device (712), wherein the electronic device (200) is configured to provide instructions to the processor according to application software (286) to adaptively determine whether to increase accessibility of at least one feature provided to a user by the electronic device (200) based on a first criteria and to adaptively determine whether to decrease the accessibility of the at least one feature based on a second criteria.
[0104] According to example embodiments of the present application, the determination of whether or not to change the accessibility of one or more features by the application software 286 may not be independent and may require complex calculation applying one or more criteria to one or more features and comparing the results of such calculations to similar calculations (or groups of calculations) made for every other feature (or groups of features). For example, one or more calculations may be made for each feature based on one or more criteria. Each feature may be assigned an accessibility ranking based on the calculations. To finally determine an accessibility of each feature, the accessibility ranking for each feature may be compared to the accessibility ranking of every other feature. One having ordinary skill in the art will understand that the comparison may be made using a hierarchical ordering algorithm of varying complexity that considers any number of parameters.
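As a sketch of this ranking-and-comparison step, assume each criterion yields a numeric score per feature and let a simple weighted sum stand in for the hierarchical ordering algorithm of varying complexity (both simplifying assumptions of the illustration):

    def rank_features(scores: dict[str, dict[str, float]],
                      weights: dict[str, float]) -> list[str]:
        # scores maps feature -> {criterion: score};
        # weights maps criterion -> weight.
        # Features are returned from highest to lowest accessibility ranking.
        def ranking(feature: str) -> float:
            return sum(weights[criterion] * score
                       for criterion, score in scores[feature].items())
        return sorted(scores, key=ranking, reverse=True)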
[0105] It should be understood that the weight applied to each criteria, and to each criteria as applied to specific features, may differ between criteria and features. Individual criteria and features, or groups of criteria and features, may be given more or less importance for any number of reasons. For example, a feature that is used less often than another feature, but is also usable less often, may be given a greater weight than a feature that is used more often but is usable more often. Frequent use of an infrequently usable feature may indicate a user preference for the feature.
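The example in the preceding paragraph amounts to normalising raw usage by opportunity. A hedged numeric illustration (the figures are invented):

    def opportunity_adjusted_use(times_used: int, times_usable: int) -> float:
        # Raw usage normalised by how often the feature could have been used.
        return times_used / times_usable

    # Feature A: used on 5 of the 6 occasions it was usable   -> 0.83
    # Feature B: used on 20 of the 200 occasions it was usable -> 0.10
    # B is used more often in absolute terms, yet A earns the greater weight.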
[0106] An example embodiment of the present application is directed to a method of adaptively determining accessibility of features provided to a user through a user interface (288) of an electronic device (200), the method including adaptively determining whether to increase accessibility of at least one of the features provided to the user through the user interface (288) based on a first criteria and adaptively determining whether to decrease the accessibility of the at least one of the features based on a second criteria.
[0107] FIG. 10 is a flowchart illustrating an example embodiment of a method of determining the accessibility of a feature based on defined criteria (e.g., categories of criteria). Referring to FIG. 10, the application software 286 is set to an Initial State S1000 where the accessibility of each feature of the navigation device 200 is definite. According to S1005, the application software 286 may determine the accessibility ranking of one or more features based on one or more of the criteria S1010, S1020, S1030, S1040, S1050 and S1060. Based on the accessibility rankings, the application software 286 may make a decision S1075 whether or not to change the accessibility of one or more features of the GUI 288. The application software may also determine how the accessibility of one or more features will change (e.g., increase or decrease). Once a determination is made, if the accessibility of each feature will remain the same (e.g., "NO" in FIG. 10), the Initial State S1000 is maintained. If the decision is to change the accessibility of one or more features based on at least one of the criteria (e.g., "YES" in FIG. 10), accessibility is changed and New State S1080 is achieved. At New State S1080, the accessibility of each feature of the navigation device 200 is definite.
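The control flow of FIG. 10 might be sketched as below, where each criterion is a callable returning a score contribution and decide_change stands in for decision S1075 (both placeholders of this illustration, not identifiers defined by the application):

    def adapt_accessibility(features, criteria, decide_change, apply_change):
        # Initial State S1000: the accessibility of every feature is definite.
        rankings = {feature: [criterion(feature) for criterion in criteria]
                    for feature in features}          # S1005 over S1010-S1060
        for feature, scores in rankings.items():
            change = decide_change(feature, scores)   # decision S1075
            if change is not None:                    # "YES": apply the change
                apply_change(feature, change)         # New State S1080
        # "NO" leaves a feature's accessibility at the initial state.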
[0108] Although FIG. 10 is illustrated with respect to criteria S1010, S1020, S1030, S1040, S1050 and S1060, one having ordinary skill in the art will understand that any number of criteria may be used to decide whether accessibility of a feature will change and these criteria are intended as examples. Further, although a method of adaptively determining the accessibility of features is described with reference to accessibility rankings, example embodiments are not so limited. For example, the accessibility of features may be determined according to separate iterations in which only one feature is considered at a time without hierarchical ranking. One having ordinary skill in the art will recognize that the accessibility of a feature may be set with or without respect to the accessibility of every other feature in many different ways.
[0109] A description of example criteria S1010, S1020, S1030, S1040, S1050 and S1060 according to FIG. 10 will now be provided.
[0110] Criteria S1010 is a usage characteristic of a user. Usage characteristics generally refer to statistics compiled by the application software 286. According to an example embodiment, the usage characteristic is a number of times a feature is used by the user over a period of time in which the navigation device 200 is actively used. For example, if the device has been in use for 96 hours and a feature is never used, the application software 286 may lower the accessibility ranking of the feature. According to another example embodiment, in a case where the application software 286 proposes features to a user, a usage characteristic may include a number of previously accepted or rejected proposals. Although example embodiments have been described with respect to a usage characteristic of a user, the usage characteristic of criteria S1010 may include, for example, a usage characteristic of a different user and/or a statistical aggregation of a plurality of users.
[0111] According to at least one embodiment, a statistical aggregation of a plurality of users may include, for example, grouping users according to one or more usage characteristics. Accessibility of features may be changed based on, for example, trends related to users falling within the aggregation. For example, user 'User 1' has the following characteristics: male; traffic subscription; frequent user of traffic features; and device profile expert. 'User 1' belongs to an aggregation of users that have the same or similar usage characteristics. The application software 286 may change the accessibility of features based on data indicating that, for example, other users falling within the aggregation often use feature 'Y' or have accepted the device recommendation of feature 'Y'. For example, the application software 286 may adjust the likelihood that the accessibility of feature 'Y' is increased or the likelihood that feature 'Y' is recommended to 'User 1'.
[0112] Criteria S1020 is a user profile. For example, the navigation device 200 in communication with the server 302 and/or 728 may track the changes in a profile of a different user (e.g., a similar and/or specified user) and/or a prototypical user. These changes may be tracked with or without the user activating such tracking. Accessibility of a feature may be changed based on changes in the different user profile and/or a prototypical user profile. The changes in the different or prototypical user profile may be changes to the accessibility of features and/or any other profile parameter.
[0113] Criteria S1030 is a manner in which the PND is used. For example, the memory 230 may store selection sequences of navigation through the GUI 288 selection tree. Specific features may be associated with one or more of the selection sequences. The application software 286 may monitor the selections made by the user. If the user navigates through the GUI 288 selection tree such that the user selects options matching a selection sequence, the application software 286 may change the accessibility ranking of a feature. For example, if a specific selection sequence activates a specific type of feature, the application software 286 may increase the accessibility ranking of similar features.
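Matching the user's navigation against a stored selection sequence could be as simple as a contiguous-subsequence check (a sketch; the list-of-selections representation is an assumption of this illustration):

    def matches_sequence(user_selections: list[str],
                         stored_sequence: list[str]) -> bool:
        # True if the stored sequence occurs contiguously within the user's
        # recent navigation through the GUI selection tree.
        n = len(stored_sequence)
        return any(user_selections[i:i + n] == stored_sequence
                   for i in range(len(user_selections) - n + 1))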
[0114] Criteria S1040 is a behaviour of a user, such as a driving behaviour. For example, the workload and/or stress of a driver may be monitored by a PND. If the workload and/or stress of a user increases, the user can generally attend to less information. Attending to information from the PND rather than the primary driving task in situations where the user is under increased workload and/or stress can potentially lead to unsafe situations. The application software 286 may increase or decrease the accessibility of a feature based on workload and/or stress of the user. Other example driving behaviours may include, for example, the usual time of the day that a user drives, the speed at which a user drives, and/or the average daily distances driven by the user. According to an example embodiment, the behaviour of the user may be a number of features used by the user.
[0115] Criteria S1050 is a preference of a user. A preference of a user may be used to override a determination of the application software 286 as to whether or not to increase or decrease the accessibility ranking of a feature. A user can prevent the application software 286 from increasing or decreasing the accessibility of a feature by locking the feature. For example, a user may deselect a feature so that it remains perpetually hidden despite a determination by the application software 286 that the feature should be made more accessible. According to at least one example embodiment, the user may remove a feature from the navigation device 200.
[0116] Criteria S1060 is a new feature. A new feature may be a feature recently downloaded (e.g., from the server 302), one available on the device but hidden until an event occurs (e.g., unlocking of the feature via activation for a trial or by payment), or a feature shared between users (e.g., shared content). Generally, new features have increased accessibility relative to pre-existing features. However, example embodiments may decrease the accessibility of new features. For example, a user may wish to maintain a simple GUI 288 and hide all new features made available through the GUI 288 by the navigation device 200.
[0117] Criteria S1070 is the usability of a feature. If the application software 286 detects that the user is in a situation in which a feature may be used, the accessibility of the feature may be increased. The following examples are provided to illustrate criteria S1070. According to at least one embodiment, if a user consistently detours around a route provided by the PND, the feature "Avoid Part of the Route" may be made more accessible to the user. If "Avoid Part of the Route" is used frequently, the feature "Map Corrections" may be increased in accessibility. According to at least one embodiment, if a user is on a familiar route, the user may be prompted that the feature "Turn Off Voice" is available so that voice instruction may be turned off. According to at least one embodiment, if a user frequently drives to a particular parking garage, the user may be prompted that the feature "Park Assist" is available. According to at least one embodiment, if a user frequently travels to a point of interest (POI), the user may be prompted that "POI" functionality is available. According to at least one embodiment, if a user frequently drives through a street designated as blocked by the device 200, the user may be prompted that the feature "(Un)Block Street" is available.
[0118] According to an example embodiment, a determination of whether or not to change the accessibility of one or more features provided to a user through a user interface (e.g., GUI 288) may occur iteratively. If the accessibility of the feature will change, a determination of how the accessibility of the feature will change may occur.
[0119] FIG. 11 is a flowchart illustrating incremental change in the accessibility of a feature according to iterations of the method of FIG. 10. Referring to FIG. 11, step S1110 may represent a first iteration of the method of FIG. 10. For example, if criteria S1010 (usage characteristic) is the criteria and the user has frequently used a feature that is set at a relatively low accessibility level, the accessibility may be increased. The graphical representation of the feature may be moved to a position of a GUI 288 selection tree that requires fewer user inputs to activate. At step S1120, a second iteration of the method of FIG. 10 is performed and the accessibility of the feature is increased. For example, the user continues to frequently use the feature and the feature is used more frequently than other similar features. The feature is moved to the top level of the GUI 288 selection tree and the background color of the icon may be changed. At step S1130, although the user continues to frequently use the feature, other features receive a higher ranking. Accordingly, the feature is moved down in the GUI 288 selection tree and a border is added so that it exceeds the visual prominence of other features at that level of the GUI 288 selection tree. Moreover, the icon is positioned so that it is more visually prominent than other icons at that level of the GUI 288 selection tree.
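The incremental behaviour of FIG. 11 could be approximated by stepping a per-feature accessibility level once per iteration while clamping it to a fixed range; the integer scale and bounds are assumptions of this sketch:

    def step_accessibility(level: int, delta: int,
                           minimum: int = 0, maximum: int = 10) -> int:
        # One iteration of the method of FIG. 10: delta may be +1 (increase),
        # -1 (decrease) or 0 (no change); the result stays within bounds.
        return max(minimum, min(maximum, level + delta))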
[0120] FIG. 11 is illustrative of an iterative method of determining the accessibility of a feature. However, steps S1110-S1130 are examples only. For example, although adaptive determination of accessibility of a feature is described as occurring in sequential iterative steps, the rate of change for each feature or group of features may be based on different frequencies (e.g., a feature may be eligible for change only every other iteration or for no iterations).
[0121] Further, the sequence of increasing or decreasing accessibility is not limited by FIG. 11. Although steps S1110 and S1120 show increases in accessibility, and step S1130 shows a decrease in accessibility, the determination resulting from each iteration may be any one of an increase, a decrease or no change to the accessibility of at least one feature. Similarly, although FIG. 11 is described according to specific types of changes of accessibility, these changes are only examples and example embodiments encompass any type of change to accessibility. For example, step S1110 shows an increase of the accessibility of a feature and the increase is described as a change in position of the feature in a GUI 288 selection tree. However, any type of change of accessibility may be determined at each iteration, or no determination of a type of change in accessibility may occur (e.g., no determination where a same type of change occurs).
[0122] For example, according to criteria S1060 (new feature) the accessibility of a new feature may be set to a maximum accessibility in step S1110. The accessibility of a new feature may also include accessibility not available to existing features. For example, a new feature may be merged into a GUI 288 selection tree in such a manner that a new feature is easily distinguished from pre-existing features. The distinguishing feature may be reserved solely for new features (e.g., reserved borders). A new feature may also remain at a maximum accessibility despite satisfying criteria instructing a decrease in accessibility for pre-existing features, or at least remain at a maximum accessibility for a longer time period and/or until a condition is satisfied. For example, a new feature may remain at a maximum accessibility for a longer time period than a pre-existing feature or until S1070 (usability of a feature) occurs. If criteria S1070 occurs and the navigation device 200 has detected an occasion that the new feature can be used, the newly added feature may be made available to the user. For example, the feature may blink and/or prompt the user and/or be accompanied by an auditory alert.
[0123] If a new feature satisfies a criteria instructing a decrease in accessibility for new features, for instance if a user fails to use the feature despite a response according to criteria S1070, the feature may be incrementally decreased in accessibility. For example, the new feature may decrease in accessibility until it is indistinguishable from pre-existing features. Once the new feature is indistinguishable from pre-existing features, the new feature may become subject to the same criteria and given the same accessibility as pre-existing features.
[0124] According to at least one example embodiment, a new feature may be initially set to a minimum accessibility until it may be used, and when the new feature may be used, the accessibility of the new feature is increased to an accessibility reserved for new features. A prompt may be displayed indicating to the user that at least one of the features is available.
[0125] Although not shown in FIGS. 10 or 11, a feature may eventually be decreased in accessibility until it reaches a minimum accessibility. For example, a minimum accessibility may include unavailability of the feature on the device, availability on the device but not in the GUI 288, or removal of the graphical representation of the feature and placement of the feature in an inactive menu.
[0126] Change in accessibility of a feature may or may not be relatively transparent. For example, if the navigation device 200 detects that a user never uses a feature, it may become less distinguishable over time (e.g., over iterations as described with respect to FIG. 11, above). The graphical representation of the feature may change in background color, the border may change, and auditory alerting may be removed. Eventually the navigation device 200 may fade the feature until it is no longer represented graphically. In this example, the user is never prompted and each change in accessibility is invisible to the user. Accordingly, the progression to minimum accessibility is gradual and relatively transparent to the user. Alternatively, the progression to minimum accessibility may be abrupt and involve the user at every decrease of accessibility. For example, the application software 286 may prompt the user each time a decrease in accessibility of the feature is determined to be appropriate. One having ordinary skill in the art will understand that the transparency of change in accessibility of features may include a variable amount of involvement of the user and the transparency of change may be set accordingly.
[0127] A speed at which the application software 286 changes the accessibility of features may be adjustable. For example, the application software 286 may be set to change the accessibility of one or more features on each iteration of FIG. 10. The application software may instead be set to change the accessibility of the one or more features only after any number of iterations. A speed may be assigned to individual features, to groups of features or to all the features. As an alternative, the application software 286 may be set to never change the accessibility of features.
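An adjustable speed of change might be realised by giving each feature a change period, so that a feature becomes eligible for change only on every Nth iteration (a sketch; the period encoding is an assumption of this illustration):

    def due_for_change(iteration: int, period: int) -> bool:
        # period = 1 allows change on every iteration; period = N on every
        # Nth iteration; period = 0 encodes "never change the accessibility".
        return period > 0 and iteration % period == 0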
[0128] The speed and transparency at which the application software 286 changes the accessibility of features may be preset (e.g., by a manufacturer). An initial global speed and transparency setting may be determined, for example, through empirical studies of the satisfaction of an average user, or of types of users, with various rates of change of the GUI 288. The speed and transparency of changing feature accessibility may be subsequently alterable by the user so that adaptive determination conforms to the comfort level of each user. The alteration may be made by the user feature by feature, by subgroups of features or with respect to all features. According to at least one example embodiment, a speed and/or transparency of the accessibility of features may be based on a preference of the user, a profile selected by the user and/or a usage characteristic of the user. For example, as discussed above with reference to criteria S1010 (usage characteristic of a user), the number of times a feature is used may be counted over a 96 hour period. However, a user may, for example, extend or shorten that period with respect to one or more features or for every feature.
[0129] Additionally, the GUI 288 may provide an overall setting for speed and transparency at which the application software 286 changes accessibility so that a user may adjust the speed and transparency of adaptive determination globally. The feature accessibility may be set to change quickly for a progressive type of user (e.g., Michael Schumacher), may change at a slow pace for a conservative user and/or may change at a rate falling somewhere in between for an average user. The global speed and transparency setting may adjust feature-by-feature settings by, for example, a common factor and/or according to a setting template.
[0130] While embodiments described in the foregoing detailed description refer to a navigation device 200, it should be noted that example embodiments are applicable to any device including a plurality of features provided to the user through a user interface. For example, any electronic device, computing or non-computing, operating via software or via hardware, and/or that is mobile or fixed, is contemplated by example embodiments. Accordingly, while the example embodiments implement certain functionality by means of software, that functionality could equally be implemented solely in hardware (for example by way of one or more ASICs (application specific integrated circuits)) or indeed by a mix of hardware and software. As such, the scope of the present disclosure should not be interpreted as being limited only to being implemented in software.
[0131] Alternative embodiments of the invention can be implemented as a computer program product for use with a computer system, the computer program product being, for example, a series of computer instructions or program segments stored on a tangible data recording medium (computer readable medium), such as a diskette, CD-ROM, ROM, or fixed disk, or embodied in a computer data signal, the signal being transmitted over a tangible medium or a wireless medium, for example, microwave or infrared. The series of computer instructions or program segments can constitute all or part of the functionality of the method of embodiments described above, and can also be stored in any memory device, volatile or non-volatile, such as semiconductor, magnetic, optical or other memory device.
[0132] It will also be appreciated that whilst various aspects and embodiments of the present disclosure have heretofore been described, the scope of the present disclosure is not limited to the particular arrangements set out herein and instead extends to encompass all arrangements, and modifications and alterations thereto, which fall within the scope of the appended claims.
[0133] Lastly, it should also be noted that whilst the accompanying claims set out particular combinations of features described herein, the scope of the present disclosure is not limited to the particular combinations hereafter claimed, but instead extends to encompass any combination of features or embodiments herein disclosed irrespective of whether or not that particular combination has been specifically enumerated in the accompanying claims at this time.

Claims

What is claimed is:
1. A method of adaptively determining (S1075) accessibility of features provided to a user through a user interface (288) of an electronic device (200), the method comprising:
adaptively determining (S1075) whether to increase accessibility of at least one of the features (S1110, S1120) provided to the user through the user interface (288) based on a first criteria (S1005); and
adaptively determining (S1075) whether to decrease the accessibility of the at least one of the features (S1130) based on a second criteria (S1005).
2. The method of claim 1, wherein at least one of the first and second criteria (S1005) include at least one of a usage characteristic of the user (S1010) and a usage characteristic of at least one other user (S1010).
3. The method of claim 2, wherein the at least one of the usage characteristic of the user and the at least one other user (S1010) is a number of times the at least one of the features is used over a period of time.
4. The method of claim 1, wherein at least one of the first and second criteria (S1005) is based on at least one of a user profile (S1020), a manner in which the user interface is used (S1030), a behaviour of the user (S1040), and a preference of the user (S1050).
5. The method of claim 4, wherein the behaviour of the user (S1040) is at least one of a driving behaviour of the user and a number of features used by the user.
6. The method of claim 4, wherein the user profile (S1020) includes at least one of a profile of a similar user, a profile of a specified user and a profile template.
7. The method of claim 4, wherein the manner in which the user interface is used (S1030) includes a sequence of selections made by the user through the user interface (288).
8. A method as claimed in any of the preceding claims, wherein the increase or decrease of the accessibility of the at least one of the features (S1110, S1120, S1130) includes at least one of changing a visual prominence of the at least one of the features, changing auditory alerting of the user, and changing a number of selections required to be made by the user through the user interface (288) to activate the at least one of the features.
9. A method as claimed in any of the preceding claims, wherein the at least one of the features is a new feature, and
the at least one of the features is initially set to a maximum accessibility.
10. The method of claim 8, wherein the at least one of the features is prominently displayed and accompanied by at least one of an auditory alert and a visual alert.
11. The method of claim 8, wherein the changing of the visual prominence includes changing at least one of a background color of an icon used to represent the at least one of the features, a border of the icon, a size of the icon, a blink speed of the icon, and a location of the icon.
12. A method as claimed in any of the preceding claims, wherein the decrease of the accessibility of the at least one of the features (S1130) includes at least one of removing a graphical representation of the feature from the user interface (288) and placing the at least one of the features in a menu including inactive functionality.
13. A method as claimed in any of the preceding claims, wherein the first and second criteria (S1005) are the same.
14. A method as claimed in any of the preceding claims, further comprising: performing the increase or decrease of the accessibility of the at least one of the features (S1110, S1120, S1130),
wherein the user is prompted for approval one of before or after the performing of the increase or decrease of the accessibility of the at least one of the features (S1110, S1120, S1130).
15. The methods of any of claims 1-13, further comprising:
performing the increase or decrease of the accessibility of the at least one of the features (S1110, S1120, S1130),
wherein the electronic device increases or decreases the accessibility of the at least one of the features (S1110, S1120, S1130) without prompting the user.
16. The method of claim 1, further comprising:
performing the increase or decrease of the accessibility of the at least one of the features (S1110, S1120, S1130),
wherein the user manually sets the accessibility of the at least one of the features.
17. The methods of any of claims 1-13, further comprising:
performing the increase or decrease of the accessibility of the at least one of the features (S1110, S1120, S1130),
wherein the electronic device (200) increases or decreases the accessibility of the at least one of the features (S1110, S1120, S1130) incrementally, and
a speed of the increase or decrease of the accessibility of the at least one of the features is changeable based on at least one of a speed preference of a user (S1050), the profile selected by the user (S1020) and the usage characteristic of the user (S1010).
18. The method of claim 4, further comprising:
performing the increase or decrease of the accessibility of the at least one of the features (S1110, S1120, S1130),
wherein the preference of the user (S1050) locks the at least one of the features so that the at least one of the features is removed or the accessibility of the at least one of the features remains fixed.
19. The method of claim 9, wherein the maximum accessibility of the new feature (S1060) is different from a maximum accessibility of a pre-existing feature, such that the new and pre-existing features are distinguishable.
20. The method of claim 1, further comprising:
performing the increase or decrease of the accessibility of the at least one of the features (S1110, S1120, S1130),
wherein the at least one of the features is a new feature,
the first criteria (S1005) is an indication that the at least one of the features can be used (S1070), and
the at least one of the features is initially set to a minimum accessibility until the at least one of the features can be used, and when the at least one of the features can be used the accessibility is increased to an accessibility reserved for new features and a prompt is displayed indicating to the user that the at least one of the features is available.
21. The method of claim 20, wherein if the at least one of the features is set to the accessibility reserved for new features and the second criteria (S1005) includes a number of times the at least one of the features is used over a period of time, and the criteria is satisfied, the at least one of the features is incrementally made less accessible until the at least one of the features is indistinguishable from pre-existing features, at which time the at least one of the features becomes subject to the same criteria as the pre-existing features.
22. A computer readable medium having computer readable instructions stored thereon for execution by a processor (210) to perform the method of at least one of the preceding claims.
23. An electronic device including a user interface (288) configured to execute the method of any of claims 1-21.
24. An electronic device (200), comprising:
a processor (210);
a memory (230);
a display (720); and
a user input device (710, 712),
wherein the electronic device (200) is configured to provide instructions to the processor according to application software (286) to adaptively determine (S1075) whether to increase accessibility of at least one feature (S1110, S1120) provided to a user by the electronic device (200) based on a first criteria (S1005) and to adaptively determine (S1075) whether to decrease the accessibility of the at least one feature (S1130) based on a second criteria (S1005).
25. The electronic device (200) of claim 24, wherein at least one of the first and second criteria is based on at least one of a usage characteristic of the user (S1010), a usage characteristic of at least one other user (S1010), a user profile (S1020), a manner in which the user input device (712) is used (S1030), a behaviour of the user (S1040), a preference of the user (S1050), whether the feature is a new feature (S1060) and the usability of the feature (S1070).
26. The electronic device (200) of either claim 24 or 25, further comprising:
a vibrating alert device (722); and
an audio output device (724).
27. The electronic device (200) of either claim 24 or 25, wherein the increase or decrease of the accessibility of the at least one feature (S1110-S1130) includes at least one of changing a visual prominence of the at least one feature, changing auditory alerting of the user, and changing a number of selections required to be made by the user through the user input device (712) to activate the at least one feature.
28. An electronic device (200) as claimed in any of claims 24-26, wherein the electronic device (200) is a navigation device.
29. The navigation device of claim 28, wherein the navigation device is a portable navigation device (PND).
30. The navigation device of claim 28, wherein the navigation device is an in-vehicle navigation device.
PCT/EP2009/067456 2009-12-17 2009-12-17 Methods of adaptively determining the accessibility of features provided through a user interface and navigation apparatuses using the same WO2011072746A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2009/067456 WO2011072746A1 (en) 2009-12-17 2009-12-17 Methods of adaptively determining the accessibility of features provided through a user interface and navigation apparatuses using the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2009/067456 WO2011072746A1 (en) 2009-12-17 2009-12-17 Methods of adaptively determining the accessibility of features provided through a user interface and navigation apparatuses using the same

Publications (1)

Publication Number Publication Date
WO2011072746A1 true WO2011072746A1 (en) 2011-06-23

Family

ID=42545447

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2009/067456 WO2011072746A1 (en) 2009-12-17 2009-12-17 Methods of adaptively determining the accessibility of features provided through a user interface and navigation apparatuses using the same

Country Status (1)

Country Link
WO (1) WO2011072746A1 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020085043A1 (en) * 2000-12-28 2002-07-04 International Business Machines Corporation Context-responsive in-vehicle display system
US20070194902A1 (en) * 2006-02-17 2007-08-23 Microsoft Corporation Adaptive heads-up user interface for automobiles
EP1992515A1 (en) * 2007-05-16 2008-11-19 Volkswagen AG Multifunctional display and operational device and method for operating a multifunctional display and operational device in a motor vehicle
US20090150814A1 (en) * 2007-12-06 2009-06-11 Sony Corporation Dynamic update of a user interface based on collected user interactions


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09806089

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09806089

Country of ref document: EP

Kind code of ref document: A1