WO2003088013A2 - Method and system of managing a user interface of a communications device - Google Patents

Method and system of managing a user interface of a communications device Download PDF

Info

Publication number
WO2003088013A2
WO2003088013A2 (PCT/EP2003/001610)
Authority
WO
WIPO (PCT)
Prior art keywords
user
display screen
content
user interface
keypad
Prior art date
Application number
PCT/EP2003/001610
Other languages
French (fr)
Other versions
WO2003088013A3 (en)
Inventor
Raquel Navarro-Prieto
Paul Dominic Baker
James Alexander Rex
Original Assignee
Motorola Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Motorola Inc filed Critical Motorola Inc
Priority to AU2003206916A priority Critical patent/AU2003206916A1/en
Publication of WO2003088013A2 publication Critical patent/WO2003088013A2/en
Publication of WO2003088013A3 publication Critical patent/WO2003088013A3/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1601Constructional details related to the housing of computer displays, e.g. of CRT monitors, of flat displays
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72466User interfaces specially adapted for cordless or mobile telephones with selection means, e.g. keys, having functions defined by the mode or the status of the device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/161Indexing scheme relating to constructional details of the monitor
    • G06F2200/1614Image rotation following screen orientation, e.g. switching from landscape to portrait mode
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163Indexing scheme relating to constructional details of the computer
    • G06F2200/1637Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/045Zooming at least part of an image, i.e. enlarging it or shrinking it



Abstract

A method and a system of managing a user interface of a mobile wireless communications device for use in data devices, comprises the step of determining a relative location of the user and the device. The determination is performed on the basis of the signals from an array (208) of sensors (210). A control unit (204) determines this relative location. Knowing the relative location of the user, the control unit (204) can automatically change the orientation of the user interface. If the user interface (202) is a display screen, its content can be rotated to a position that ensures the user the best possible view. If the user interface (202) is a keypad or a touch screen, functions of each key can be redefined to ensure the best possible position for use. Knowing the distance between the user and the user interface, the control unit (204) can change the size of the most important information window on the display screen. The change is made automatically as the distance between the user and the user interface changes. The importance of information windows is ranked, and the ranking is stored in memory (206).

Description

Method and System of Managing a User Interface of a Communications Device
Technical Field
The present invention relates to managing a content and/or an orientation of a mobile wireless communications device. The invention may, for example, be useful when a keyboard or mouse is not convenient.
Background
With the growth of wireless telecommunications, there has been an increased focus on small mobile data devices. These provide a wide range of services. This wide range of services on small devices has dramatically increased the number of design issues related to the presentation of this information. The mobile data devices may provide information comprising video, audio, data or a combination of these.
Estimating the direction of talkers and other sound sources using microphone arrays is known in the prior art, as are videoconferencing systems that use microphone arrays to locate talkers and direct cameras at them.
Diverse stimuli are used as inputs for interface navigation, e.g. gaze, voice. Systems that use eye gaze interaction are attractive because they use people's spontaneous eye movements to control the interface. Although these systems could accurately track a user's attention on the display, they are not practical for many products. One of the reasons is the high cost of the eye-tracking technology. Another disadvantage of these systems is the high level of intrusiveness of the eye tracking devices, e.g. head mounted systems.
In a method known in the prior art, the three-dimensional (tilt) movements of a mobile communication device are used to control the display orientation. In other devices, the user can change the orientation of the display content by actuating special features. The user can also actively select the orientation, for example by pressing a button.
Statement of Invention
It is an object of the present invention to provide a novel method and system for managing a user interface of a mobile wireless communication device which overcomes the disadvantages of the prior art.
The invention comprises a method in accordance with appended independent claim 1, a system in accordance with appended independent claim 16, and devices in accordance with claim 21.
The prime benefit of a user interface in accordance with the present invention is that the user does not need to re-position either him/herself or the device in order to use it. Additionally, multiple users could more easily time-share a single display.
With a multi-orientation hand-held device, the user does not have to consider how to pick up the device. The device will automatically adapt itself to provide the best possible presentation of information. Also, as the display could rotate freely, left-handed people may not have some of the difficulties that the non-symmetric displays of PDAs (Personal Digital Assistants) present at the moment.
Another application of the present invention is for managing orientation of other kinds of user interfaces than just displays. It can be easily applied for managing functions of keys on keypads and touch screens. The keypad and touch screen of the device will be automatically redefined to provide orientation that is easiest to use.
This is a significant departure from the current art, as seen in WO 01/88679 (MathEngine). The cited prior art only concerns displays, and relies on knowledge of the actual position of the device.
Brief description of the drawings
The present invention will be understood and appreciated more fully from the following detailed description taken in conjunction with the drawings in which:
Figure 1 is a flowchart illustrating a method of managing content and orientation of a user interface in accordance with an embodiment of the present invention;
Figure 2 is a schematic illustration of a system for managing content and orientation of a user interface in accordance with an embodiment of the present invention.
Detailed description of the preferred embodiment
The term a "content of a user interface" herein below refers, in case of a display screen, to a picture presented on it. In the cases of keypads and touch screens, this refers to a function that is assigned to each key.
The term "information window" herein below refers to an integral part of a picture on the display screen with consistent content. This may be, for example, one of several windows presented by a 'Windows' ™ operating system.
The term "redefining" herein below refers to a process of changing assignments of each of the keys of the keypad and/or touch screen.
The present invention allows a mobile wireless communications device or system to gather information about the orientation and relative position of the user. Technologies used for tracking relative position of the user may involve voice, ultrasonic, capacitive, radar and infrared tracking. This information is used to select how the user interface should be adapted and oriented to enhance user interaction.
Hence, this invention presents a distinctive way of managing the information presented on the displays of small mobile devices. It is, however, within the contemplation of a person reasonably skilled in the art to extend this principle to the interfaces of fixed devices. One example of such a fixed device is a public information kiosk, which may for example be built into a table.
In accordance with the present invention, there is thus provided a method and system for managing diverse "levels" of information on the display. For instance, it can be applied when the device offers diverse windows, each using a different medium. In such a case the user can select which one is most important. For example, a window showing a diagram may be more significant to the user than another window that is showing text.
The method allows optimisation of the organization (zooming in and out) of the windows presented to the user. In the case where the device has a single window, the distance between the user and the display can automatically cause the window to zoom in. The advantage of this method is that it helps automate both window management, and the zoom ratio for a user. The method takes into account what the user can see at any given distance from the display. Automatic window managing becomes very important in scenarios where the user is mobile and not able to use a mouse or keyboard for this task.
The device ascertains information about the relative position of the user and the device. Therefore the device can zoom in to the image displayed as the angle between the normal to the display screen and the user's line of sight increases, i.e. the user moves away from the optimal viewing axis of the device. The device can zoom out from the image displayed as the angle between the normal to the display screen and the user's line of sight decreases, i.e. the user moves towards the optimal viewing axis of the device. Prior art zooming strategies based simply upon the distance of the user from the device may also be incorporated.
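As an illustration only (not part of the patent text), the angle-driven zoom rule described above can be sketched as a simple mapping from viewing angle to zoom factor; the function name, gain, and limit are assumptions chosen for the example:

```python
def zoom_factor(angle_deg, base=1.0, gain=0.02, max_zoom=3.0):
    """Map the angle between the display normal and the user's line of
    sight (in degrees) to a zoom factor: the further the user is from
    the optimal viewing axis, the more the image is zoomed in, up to a
    maximum. All constants are illustrative assumptions."""
    factor = base + gain * abs(angle_deg)
    return min(factor, max_zoom)
```

A real implementation would feed this from the sensor-derived line-of-sight estimate and could combine it with the distance-based strategies the text also mentions.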
In accordance with another aspect of the present invention there is provided a method and a system for automatically orienting displays. It can be applied in small data devices that can be used irrespective of their orientation. The display screen of the device will show images in various orientations relative to the body of the device. This useful feature enables a device's user to interact with it from multiple viewing directions. Additionally if it had such screens on more than one face, it could be held in yet more different orientations. This concept can also be applied to mobile phones.
Referring to figures 1 and 2, in step 100 a relative location of a user and user interface is determined. For this purpose, a signal from an array 208 of sensors 210 is used. Control unit 204 determines this relative location.
The determination performed in step 100 consists of two independent substeps, 102 and 104. In step 102, the distance between the user and the user interface 202 is determined. In step 104, control unit 204 determines the relative position of the user and the user interface.
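The two independent sub-steps of step 100 can be sketched as follows (an illustrative aside, not part of the patent text; the 2-D coordinate model is an assumption):

```python
import math

def relative_location(user_xy, screen_xy=(0.0, 0.0)):
    """Sketch of step 100: sub-step 102 computes the user-to-interface
    distance, sub-step 104 the relative position, here expressed as a
    bearing angle in degrees. Coordinates are illustrative."""
    dx = user_xy[0] - screen_xy[0]
    dy = user_xy[1] - screen_xy[1]
    distance = math.hypot(dx, dy)               # sub-step 102
    bearing = math.degrees(math.atan2(dy, dx))  # sub-step 104
    return distance, bearing
```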
One example of a user interface 202 that can be managed according to the present invention is a display screen.
The control unit 204, in step 106, compares the determined distance with predefined values. These predefined values are distances related to sizes of information windows on the display screen. Each predefined distance value has a corresponding information-window size that assures the necessary detail recognition. When the determined distance exceeds one of these predefined values, the control unit 204, in step 108, zooms in on the information window with the most important content. In step 110, the information window with the most important content is zoomed out; this is performed when the determined distance is shorter than the value predefined for this window size. After zooming out, other windows may also be displayed on the display screen. Steps 108 and 110 are only performed when the display screen contains more than one information window.
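Purely as an illustrative sketch (not part of the patent text), the window-management logic of steps 106-110 might look like this; the data model and single threshold are assumptions:

```python
def manage_windows(distance, windows, threshold):
    """Sketch of steps 106-110: if the user is farther than the
    predefined threshold, show only the most important window zoomed
    in (step 108); otherwise zoom out so all windows are visible,
    ordered by importance (step 110). 'windows' is a list of
    (name, importance) pairs; all names are illustrative."""
    if len(windows) <= 1:
        return [w[0] for w in windows]  # only one window: nothing to manage
    ranked = sorted(windows, key=lambda w: w[1], reverse=True)
    if distance > threshold:
        return [ranked[0][0]]           # step 108: most important only
    return [w[0] for w in ranked]       # step 110: show all, ranked
```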
The importance of an information window is ranked by the user. The ranking is stored in memory 206 and can be changed dynamically. The ranking can also be predefined by a kind of content of the information window. The information window may contain a video picture, a text, or a graphic. The user will select the one of these that is most useful when relatively far from the display. The relative position of the user and the user interface determined in step 104 can be used for changing orientation of the display screen.
The control unit 204, in step 112, compares the determined relative position of the user and the display screen with a predefined set of relative positions. Each predefined position has a corresponding orientation of the display screen content that assures the best possible view direction. In step 114, the display screen content is rotated when the determined relative position of the user does not match the predefined position of current orientation of the display screen content. After rotation, the relative position of the user matches the predefined position of current orientation of the display screen content.
The angle of rotation of the display screen content can be freely chosen. However for some kinds of display screens the angle of rotation should be approximately equal to 90 degrees or a multiple thereof.
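For display screens constrained to 90-degree steps as noted above, the rotation choice of steps 112-114 reduces to snapping the user's bearing to the nearest multiple of 90 degrees. A minimal sketch, not part of the patent text:

```python
def display_rotation(user_bearing_deg):
    """Snap the display-content rotation to the multiple of 90 degrees
    closest to the user's bearing around the device, so the content
    faces the user. The bearing convention is an assumption."""
    return (round(user_bearing_deg / 90.0) * 90) % 360
```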
Other examples of the user interface 202 that can be managed according to the present invention are a keypad and a touch screen.
The control unit 204, in step 112, compares the determined relative position of the user and the keypad and/or the touch screen with a predefined set of positions. Each predefined position has corresponding orientation of the keys of the keypad and the touch screen that assures the easiest possible use. In step 116, the keys' definitions are redefined when the determined relative position of the user does not match the predefined position of current keys' definitions. After redefining, the relative position of the user matches the predefined position of the current keys' definitions.
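The key-redefinition of step 116 can be sketched as remapping key positions to functions for the side the user is on; the two-sided (front/back) model below is an illustrative assumption, not the patent's specification:

```python
def redefine_keys(layout, user_side):
    """Sketch of step 116: 'layout' maps (row, col) key positions to
    functions for a user at the 'front' of the device; for a user at
    the 'back' the grid is mirrored top-bottom and left-right so each
    function stays in the easiest position to reach."""
    if user_side == "front":
        return dict(layout)
    rows = max(r for r, c in layout) + 1
    cols = max(c for r, c in layout) + 1
    # mirror both axes for an inverted user
    return {(rows - 1 - r, cols - 1 - c): f for (r, c), f in layout.items()}
```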
The relative position of the user and the screen is known. Therefore, the image or a window may be zoomed in as the angle between the user and the normal axis to the screen increases. Conversely, the image or window may be zoomed out as the angle between the user and the normal axis to the screen decreases. All changes of the screen content and its orientation, as well as redefining the keys of the keypad and the touch screen, can be referred to as adapting. This adapting according to the present invention is performed automatically, but the user of the device may also manually influence it.
Reference is now made to Fig. 2, which depicts a system that can manage the content and/or orientation of the user interface in accordance with an embodiment of the present invention. In another embodiment of the present invention, a device incorporating system 200 has the display screen on more than one face.
A system 200 managing the content and/or orientation of the user interface 202, according to the present invention, comprises an array 208 of sensors 210. The array 208 of sensors 210 is connected to a control unit 204.
The control unit 204 is able to determine relative location of the user and the user interface. The determination is made on the basis of signals from array 208 of sensors 210. The control unit 204 is connected to the user interface 202 and controls its content and/or orientation.
The user interface 202 can be a display screen and/or a keypad and/or a touch screen. A memory 206, for storing a predefined ranking of an importance of the content of the display screen, is connected to the control unit 204.
The array 208 of sensors 210 can consist of at least two microphones, or infrared sensors, capacitive sensors, a radar transponder, or ultrasonic sensors, or any combination of them.
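As one hedged illustration of how the simplest array listed above (two microphones) could yield a bearing, the time difference of arrival between the microphones maps to an angle from broadside; this sketch is not part of the patent text, and the constants are assumptions:

```python
import math

def bearing_from_tdoa(delay_s, mic_spacing_m, speed_of_sound=343.0):
    """Estimate the user's bearing (degrees from broadside) from the
    time difference of arrival between two microphones a known
    distance apart, using the far-field approximation
    sin(theta) = delay * c / spacing."""
    ratio = delay_s * speed_of_sound / mic_spacing_m
    ratio = max(-1.0, min(1.0, ratio))  # clamp against measurement noise
    return math.degrees(math.asin(ratio))
```

A control unit such as 204 would then feed this bearing into the orientation and zoom decisions described earlier.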
A system in accordance with the invention, or the method of the invention may be used in various data devices. In particular, the invention is usable in portable or mobile radio communication devices. Therefore the system may be used in a mobile telephone or a portable or mobile PMR radio. The invention also may be used in a personal digital assistant (PDA) or laptop computer, linked for example by a radio or infra-red communication link to a cellular network. Such a network may be in a building, or be a cellular telephone network, or a UMTS/3G network.

Claims

Claims
1. A method of managing a content and/or an orientation of a user interface (202) of a mobile wireless communication device, the method comprising the steps of:
- the mobile wireless communication device sensing a relative location (102, 104) of a user and said user interface of said device;
- adapting said content and/or said orientation of said user interface.
2. A method according to claim 1, wherein said sensing of said relative location is performed on the basis of signals from an array of sensors.
3. A method according to claim 1, wherein said content comprises an image presented on said user interface, and the step of adapting the content comprises the steps of:
- zooming in to the information window, when the angle between the normal to the display screen and the line-of-sight to the user increases;
- zooming back from the information window, when the angle between the normal to the display screen and the line-of-sight to the user decreases.
4. A method according to claim 1, wherein said content comprises an image presented on said user interface, and the step of adapting the content is applied when said display screen contains more than one information window, and said adapting comprises the steps of:
- zooming in to the information window with the most important content, when said distance between said display screen and said user exceeds a predefined value;
- zooming back from the information window with the most important content, when said distance between said display screen and said user is shorter than said predefined value, allowing other information windows with content of a lower ranked importance to be displayed on said display screen.
5. A method according to claim 4, wherein the ranking of said importance of said content of said display screen is predefined by said user, and can be changed dynamically.
6. A method according to claim 4, wherein the ranking of said importance of said content of said display screen is predefined in dependence on the kind of content.
7. A method according to claim 4, wherein said kind of content can be a video picture, text, or a graphic.
8. A method according to claim 5 or 6, wherein said predefined ranking of said content is stored in a memory of said device.
9. A method according to claim 1, wherein the user interface is a display screen, and said sensing of said relative position consists of:
- determining a distance between a user and said display screen; and/or
- determining said relative position of said display screen and said user; characterised in that the adapting of said orientation of said display screen consists of a step of rotating said content of said display screen, when said relative position of said user and said display screen would not ensure a practical position for viewing of said display screen.
10. A method according to claim 9, wherein an angle of said rotation is approximately equal to 90 degrees, or a multiple thereof.
11. A method according to claim 1, wherein said user interface is a keypad and/or a touch screen.
12. A method according to claim 11, wherein said adapting of said orientation of said keypad and/or said touch screen consists of a step of redefining the functions of each key of said keypad and/or said touch screen, when said relative position of said user and said keypad and/or said touch screen does not ensure a practical position for using said keypad and/or said touch screen.
13. A method according to claim 12 wherein said redefining of functions of each key of said keypad and/or said touch screen ensures said user a practical position for using said keypad and/or said touch screen.
14. A method according to claim 1 wherein said adapting is performed automatically.
15. A method according to claim 1 wherein said user can manually influence said adapting.
16. A system for managing a content and/or an orientation of a user interface (202) of a mobile wireless communication device, said system comprising:
- a control unit (204), adapted to determine a relative location of a user and said user interface (202), and to control said content and/or said orientation;
- a user interface (202) for entering and/or providing information, connected to said control unit;
- an array (208) of sensors (210), for detecting said user, connected to said control unit.
17. A system according to claim 16, wherein said user interface is a display screen and/or a keypad and/or a touch screen.
18. A system according to claim 17, comprising a memory for storing a predefined ranking of an importance of said content of said display screen, that is connected to said control unit.
19. A system according to claim 17, wherein said system has said display screen on more than one face.
20. A system according to claim 16, wherein said array of sensors consists of at least two microphones, or infrared sensors, or ultrasonic sensors, or capacitive sensors, or radar, or any combination thereof.
21. A mobile telephone, a portable or mobile (PMR) radio, a personal digital assistant (PDA), or a laptop computer according to any of claims 16-20, or adapted to operate in accordance with the method of any of claims 1-15.
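The adaptation behaviour claimed above (zooming to the highest-ranked information window when the user is far from the screen, zooming back when near, and rotating the content by a multiple of 90 degrees when the viewing position is impractical) can be illustrated with a short sketch. The thresholds, window names and rotation rule below are hypothetical assumptions for illustration, not values taken from the claims.

```python
def choose_adaptation(distance_mm, angle_deg, windows,
                      distance_threshold_mm=500, angle_threshold_deg=45):
    """Return a list of (action, argument) pairs adapting the display
    to the sensed user location.

    `windows` is a list of (name, importance) pairs, ranked as in
    claims 4-8; higher importance wins when zooming in.  The two
    threshold defaults and the angle-to-rotation rule are assumptions.
    """
    actions = []
    if distance_mm > distance_threshold_mm:
        # User is far away: zoom in to the most important window (claim 4).
        most_important = max(windows, key=lambda w: w[1])[0]
        actions.append(("zoom_in", most_important))
    else:
        # User is near: zoom back so lower-ranked windows are visible.
        actions.append(("zoom_out", None))
    if angle_deg > angle_threshold_deg:
        # Impractical viewing angle: rotate in 90-degree steps (claim 10).
        quarter_turns = round(angle_deg / 90) % 4 or 1
        actions.append(("rotate", 90 * quarter_turns))
    return actions
```

For example, a distant user facing the screen head-on would trigger only a zoom to the top-ranked window, while a nearby user viewing at a steep angle would trigger a zoom-out plus a rotation.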
PCT/EP2003/001610 2002-04-12 2003-02-18 Method and system of managing a user interface of a communications device WO2003088013A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2003206916A AU2003206916A1 (en) 2002-04-12 2003-02-18 Method and system of managing a user interface of a communications device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0208402.8 2002-04-12
GB0208402A GB2387504B (en) 2002-04-12 2002-04-12 Method and system of managing a user interface of a communication device

Publications (2)

Publication Number Publication Date
WO2003088013A2 true WO2003088013A2 (en) 2003-10-23
WO2003088013A3 WO2003088013A3 (en) 2004-07-29

Family

ID=9934698

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2003/001610 WO2003088013A2 (en) 2002-04-12 2003-02-18 Method and system of managing a user interface of a communications device

Country Status (3)

Country Link
AU (1) AU2003206916A1 (en)
GB (1) GB2387504B (en)
WO (1) WO2003088013A2 (en)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005071604A2 (en) * 2004-01-20 2005-08-04 Koninklijke Philips Electronics N.V. Graphical user interface
US20050245204A1 (en) * 2004-05-03 2005-11-03 Vance Scott L Impedance matching circuit for a mobile communication device
JP2005328204A (en) * 2004-05-12 2005-11-24 Pentax Corp Digital camera and portable equipment
GB0512503D0 (en) 2005-06-18 2005-07-27 Jkid Ltd A portable device
JP2014035562A (en) * 2012-08-07 2014-02-24 Sony Corp Information processing apparatus, information processing method, and computer program


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11143604A (en) * 1997-11-05 1999-05-28 Nec Corp Portable terminal equipment
GB0011455D0 (en) * 2000-05-13 2000-06-28 Mathengine Plc Browser system and method for using it

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001027727A2 (en) * 1999-10-13 2001-04-19 Gateway, Inc. A system and method utilizing motion input for manipulating a display of data
WO2002093331A1 (en) * 2001-05-16 2002-11-21 Myorigo Oy Method and device for browsing information on a display
EP1316877A1 (en) * 2001-11-14 2003-06-04 Nokia Corporation A method for controlling the displaying of information in an electronic device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ANON.: "Personal Computer environmental control via a proximity sensor" IBM TECHNICAL DISCLOSURE BULLETIN, vol. 36, no. 8, August 1993 (1993-08), pages 343-345, XP000390248 Armonk, NY, US *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013043419A1 (en) * 2011-09-20 2013-03-28 Microsoft Corporation Adjusting user interfaces based on entity location
US9293107B2 (en) 2011-09-20 2016-03-22 Microsoft Technology Licensing, Llc Adjusting user interfaces based on entity location
AU2012312850B2 (en) * 2011-09-20 2016-12-08 Microsoft Technology Licensing, Llc Adjusting user interfaces based on entity location
RU2627106C2 (en) * 2011-09-20 2017-08-03 МАЙКРОСОФТ ТЕКНОЛОДЖИ ЛАЙСЕНСИНГ, ЭлЭлСи User interface customisation based upon object location
US10241806B2 (en) 2011-09-20 2019-03-26 Microsoft Technology Licensing, Llc Adjusting user interfaces based on entity location
EP3044647A1 (en) * 2013-09-11 2016-07-20 Google Technology Holdings LLC Electronic device and method for detecting presence and motion
WO2015194705A1 (en) * 2014-06-18 2015-12-23 Lg Electronics Inc. Mobile terminal and method for controlling the same
US9377917B2 (en) 2014-06-18 2016-06-28 Lg Electronics Inc. Mobile terminal and method for controlling the same
US11202325B2 (en) 2016-11-29 2021-12-14 Pacesetter, Inc. Managing dynamic connection intervals for implantable and external devices
US11778674B2 (en) 2016-11-29 2023-10-03 Pacesetter, Inc. Managing dynamic connection intervals for implantable and external devices

Also Published As

Publication number Publication date
WO2003088013A3 (en) 2004-07-29
AU2003206916A8 (en) 2003-10-27
AU2003206916A1 (en) 2003-10-27
GB2387504A (en) 2003-10-15
GB2387504B (en) 2005-03-16
GB0208402D0 (en) 2002-05-22

Similar Documents

Publication Publication Date Title
US8928723B2 (en) Mobile terminal and control method thereof
US9167072B2 (en) Mobile terminal and method of controlling the same
EP2180676B1 (en) Mobile communication terminal and screen scrolling method thereof
US8423076B2 (en) User interface for a mobile device
CN102238275B (en) Mobile terminal and method for displaying an image in a mobile terminal
US9262867B2 (en) Mobile terminal and method of operation
KR100981200B1 (en) A mobile terminal with motion sensor and a controlling method thereof
US20090299730A1 (en) Mobile terminal and method for correcting text thereof
EP2385687B1 (en) Mobile terminal and control method thereof
US20100115407A1 (en) Mobile terminal and displaying method thereof
KR20120046991A (en) Mobile terminal and method for controlling the same
KR20090107853A (en) Mobile terminal and method for processing screen thereof
KR20140049290A (en) Mobile termina, broadcasting terminal and controlling method thereof
KR101689171B1 (en) Mobile Terminal And Method Of Photographing Image Using The Same
US8260268B2 (en) Method for transmitting and receiving data in mobile terminal and mobile terminal using the same
KR100917527B1 (en) User interface controlling method by detecting user's gestures
WO2003088013A2 (en) Method and system of managing a user interface of a communications device
KR101781849B1 (en) Mobile terminal and method for controlling the same
CN109902679B (en) Icon display method and terminal equipment
KR101638906B1 (en) Mobile terminal and method for controlling the same
US20110093793A1 (en) Method for attaching data and mobile terminal thereof
KR20100045434A (en) A method for inputting orders using gestures
KR101604698B1 (en) Mobile terminal and method for controlling the same
KR101741399B1 (en) Mobile terminal and method for controlling display
KR101774314B1 (en) Mobile terminal and Method for cotrolling the same

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP