Method and System of Managing A User Interface of a Communications
Device
Technical Field
The present invention relates to managing the content and/or orientation of a user interface of a mobile wireless communications device. The invention may, for example, be useful when a keyboard or mouse is not convenient.
Background
With the growth of wireless telecommunications, there has been an increased focus on small mobile data devices that provide a wide range of services. Providing this wide range of services on small devices has dramatically increased the number of design issues related to the presentation of information. The mobile data devices may provide information comprising video, audio, data, or a combination of these.
Estimating the direction of talkers and other sound sources using microphone arrays is known in the prior art, as are videoconferencing systems that use microphone arrays to locate talkers and direct cameras at them.
Diverse stimuli are used as inputs for interface navigation, e.g. gaze and voice. Systems that use eye-gaze interaction are attractive because they use people's spontaneous eye movements to control the interface. Although these systems can accurately track a user's attention on the display, they are not practical for many products. One reason is the high cost of eye-tracking technology. Another disadvantage of these systems is the high level of intrusiveness of the eye-tracking devices, e.g. head-mounted systems.
In a method known in the prior art, the three-dimensional (tilt) movements of a mobile communication device are used to control the display orientation.
In other devices, the user can change the orientation of the display content by actuating special features. The user can also actively select the orientation, for example by pressing a button.
Statement of Invention
It is an object of the present invention to provide a novel method and system for managing a user interface of a mobile wireless communication device which overcomes the disadvantages of the prior art.
The invention comprises a method in accordance with appended independent claim 1, a system in accordance with appended independent claim 16, and devices in accordance with claim 21.
The prime benefit of a user interface in accordance with the present invention is that the user does not need to re-position either him/herself or the device in order to use it. Additionally, multiple users could more easily time-share a single display.
With a multi-orientation hand-held device, the user does not have to consider how to pick up the device. The device will automatically adapt itself to provide the best possible presentation of information. Also, as the display content can rotate freely, left-handed people may not have some of the difficulties that the non-symmetric displays of PDAs (Personal Digital Assistants) present at the moment.
Another application of the present invention is managing the orientation of kinds of user interface other than displays. It can easily be applied to managing the functions of keys on keypads and touch screens. The keypad and touch screen of the device will be automatically redefined to provide the orientation that is easiest to use.
This is a significant departure from the current art, as seen in WO 01/88679 (MathEngine). The cited prior art only concerns displays, and relies on knowledge of the actual position of the device.
Brief description of the drawings
The present invention will be understood and appreciated more fully from the following detailed description taken in conjunction with the drawings in which:
Figure 1 is a flowchart illustrating a method of managing content and orientation of a user interface in accordance with an embodiment of the present invention;
Figure 2 is a schematic illustration of a system for managing content and orientation of a user interface in accordance with an embodiment of the present invention.
Detailed description of the preferred embodiment
The term a "content of a user interface" herein below refers, in case of a display screen, to a picture presented on it. In the cases of keypads and touch screens, this refers to a function that is assigned to each key.
The term "information window" herein below refers to an integral part of a picture on the display screen with consistent content. This may be, for example, one of several windows presented by a 'Windows' ™ operating system.
The term "redefining" herein below refers to a process of changing assignments of each of the keys of the keypad and/or touch screen.
The present invention allows a mobile wireless communications device or system to gather information about the orientation and relative position of the user. Technologies used for tracking relative position of the user may involve voice, ultrasonic, capacitive, radar and infrared tracking. This information is used to select how the user interface should be adapted and oriented to enhance user interaction.
Hence, this invention presents a distinctive way of managing the information presented on the displays of small mobile devices. It is, however, within the contemplation of a person reasonably skilled in the art to extend this principle to the interfaces of fixed devices. One example of such a fixed device is a public information kiosk, which may for example be built into a table.
In accordance with the present invention, there is thus provided a method and system for managing diverse "levels" of information on the display. For instance, it can be applied when the device offers diverse windows, each using a different medium. In such a case the user can select which one is the most important. For example, a window showing a diagram may be more significant to the user than another window showing text.
The method allows optimisation of the organisation (zooming in and out) of the windows presented to the user. In the case where the device has a single window, the distance between the user and the display can automatically cause the window to zoom in. The advantage of this method is that it helps automate both window management and the zoom ratio for a user. The method takes into account what the user can see at any given distance from the display. Automatic window management becomes very important in scenarios where the user is mobile and not able to use a mouse or keyboard for this task.
The device ascertains information about the relative position of the user and the device. Therefore the device can zoom in on the displayed image as the angle between the normal to the display screen and the user's line of sight increases, i.e. as the user moves away from the optimal viewing axis of the device. The device can zoom out from the displayed image as that angle decreases, i.e. as the user moves towards the optimal viewing axis of the device. Prior-art zooming strategies based simply upon the distance of the user from the device may also be incorporated.
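Purely by way of illustration, the angle-dependent zooming described above might be sketched as follows. The 1/cos compensation law, the function name, and the clamping limits are hypothetical choices and are not prescribed by the invention:

```python
import math

# Illustrative sketch: the further the user's line of sight is from the
# display normal, the more the image is magnified, compensating for
# foreshortening. The 1/cos law below is an assumed example, not a
# requirement of the method.

def zoom_for_viewing_angle(angle_deg: float, max_zoom: float = 4.0) -> float:
    """Zoom factor: 1.0 on-axis, growing as the viewing angle increases."""
    a = min(abs(angle_deg), 85.0)  # avoid blow-up as the angle nears 90 degrees
    return min(1.0 / math.cos(math.radians(a)), max_zoom)
```

A distance-based strategy, as mentioned above, could be combined with this by multiplying the two factors.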
In accordance with another aspect of the present invention there is provided a method and a system for automatically orienting displays. It can be applied in small data devices that can be used irrespective of their orientation. The display screen of the device will show images in various orientations relative to the body of the device. This useful feature enables a device's user to interact with it from multiple viewing directions. Additionally, if the device had such screens on more than one face, it could be held in yet more orientations. This concept can also be applied to mobile phones.
Referring to figures 1 and 2, in step 100 a relative location of a user and user interface is determined. For this purpose, a signal from an array 208 of sensors 210 is used. Control unit 204 determines this relative location.
The determination performed in step 100 consists of two independent substeps, 102 and 104. In step 102, the distance between the user and the user interface 202 is determined. In step 104, control unit 204 determines the relative position of the user and the user interface.
One example of a user interface 202 that can be managed according to the present invention is a display screen.
The control unit 204, in step 106, compares the determined distance with predefined values. These predefined values are distances related to sizes of the information windows on the display screen. Each predefined distance value has a corresponding information-window size that assures the necessary detail recognition. When the determined distance exceeds one of these predefined values, the control unit 204, in step 108, zooms in on the information window with the most important content. In step 110, the information window with the most important content is zoomed out; zooming out is performed when the determined distance is shorter than the value predefined for the current size of information window. After zooming out, other windows may also be displayed on the display screen. Steps 108 and 110 may only be performed when the display screen contains more than one information window.
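The comparison of steps 106 to 110 can be sketched, purely as an illustration, as a lookup from the measured distance to a window scale. The threshold distances, scale values, and function name below are hypothetical:

```python
# Illustrative sketch of steps 106-110: choose a display scale for the
# most important information window from the user's distance.
ZOOM_LEVELS = [
    # (max_distance_m, window_scale): a larger distance selects a larger window
    (0.5, 1.0),   # close: normal size; other windows may also be shown
    (1.0, 1.5),
    (2.0, 2.5),   # far: the most important window dominates the screen
]

def scale_for_distance(distance_m: float) -> float:
    """Return the display scale for the most important window."""
    for max_dist, scale in ZOOM_LEVELS:
        if distance_m <= max_dist:
            return scale
    return ZOOM_LEVELS[-1][1]  # beyond the last threshold: maximum zoom
```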
The importance of an information window is ranked by the user. The ranking is stored in memory 206 and can be changed dynamically. The ranking can also be predefined by a kind of content of the information window. The information window may contain a video picture, a text, or a graphic. The user will select the one of these that is most useful when relatively far from the display.
The relative position of the user and the user interface determined in step 104 can be used for changing orientation of the display screen.
The control unit 204, in step 112, compares the determined relative position of the user and the display screen with a predefined set of relative positions. Each predefined position has a corresponding orientation of the display screen content that assures the best possible view direction. In step 114, the display screen content is rotated when the determined relative position of the user does not match the predefined position of current orientation of the display screen content. After rotation, the relative position of the user matches the predefined position of current orientation of the display screen content.
The angle of rotation of the display screen content can be freely chosen. However, for some kinds of display screen the angle of rotation should be approximately equal to 90 degrees or a multiple thereof.
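For such quarter-turn displays, the selection of step 114 amounts to snapping the user's bearing to the nearest multiple of 90 degrees. A minimal illustrative sketch follows; the function name and the angle convention (bearing measured around the display normal) are assumptions:

```python
def snap_orientation(user_bearing_deg: float) -> int:
    """Snap the user's bearing, in degrees around the display normal,
    to the nearest multiple of 90 degrees (0, 90, 180 or 270), as
    suggested for displays supporting only quarter-turn rotation."""
    return int(round(user_bearing_deg / 90.0) % 4) * 90
```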
Other examples of the user interface 202 that can be managed according to the present invention are a keypad and a touch screen.
The control unit 204, in step 112, compares the determined relative position of the user and the keypad and/or the touch screen with a predefined set of positions. Each predefined position has a corresponding orientation of the keys of the keypad and the touch screen that assures the easiest possible use. In step 116, the keys' definitions are redefined when the determined relative position of the user does not match the predefined position of the current keys' definitions. After redefining, the relative position of the user matches the predefined position of the current keys' definitions.
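The redefinition of step 116 can be illustrated as remapping a grid of physical keys so that the logical layout follows the user. The sketch below rotates an R x C key grid in quarter turns; the grid addressing and function name are hypothetical:

```python
# Illustrative sketch of step 116: keys are addressed as (row, col) on a
# rows x cols grid, and the logical layout is rotated in 90-degree steps
# so that it remains upright with respect to the user.

def rotate_key(row: int, col: int, rows: int, cols: int, quarter_turns: int):
    """Return the physical (row, col) that should carry the logical key
    (row, col) after rotating the layout by quarter_turns * 90 degrees."""
    for _ in range(quarter_turns % 4):
        # one 90-degree clockwise turn: (r, c) -> (c, rows - 1 - r),
        # and the grid dimensions swap
        row, col = col, rows - 1 - row
        rows, cols = cols, rows
    return row, col
```

In a device, the control unit would apply such a mapping to every key, reassigning each physical key the function of the logical key that lands on it.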
The relative position of the user and the screen is known. Therefore, the image or a window may be zoomed in on as the angle between the user's line of sight and the normal axis to the screen increases. Conversely, the image or window may be zoomed out as that angle decreases.
All changes of the screen content and its orientation, as well as redefining the keys of the keypad and the touch screen, can be referred to as adapting. This adapting according to the present invention is performed automatically, but it is also possible that the user of the device can manually influence this adapting.
Reference is now made to Fig. 2, which depicts a system that can manage the content and/or orientation of the user interface in accordance with an embodiment of the present invention. In another embodiment of the present invention, a device incorporating system 200 has the display screen on more than one face.
A system 200 managing the content and/or orientation of the user interface 202, according to the present invention, comprises an array 208 of sensors 210. The array 208 of sensors 210 is connected to a control unit 204.
The control unit 204 is able to determine the relative location of the user and the user interface. The determination is made on the basis of signals from the array 208 of sensors 210. The control unit 204 is connected to the user interface 202 and controls its content and/or orientation.
The user interface 202 can be a display screen and/or a keypad and/or a touch screen. A memory 206, for storing a predefined ranking of an importance of the content of the display screen, is connected to the control unit 204.
The array 208 of sensors 210 can consist of at least two microphones, infrared sensors, capacitive sensors, a radar transponder, ultrasonic sensors, or any combination of these.
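By way of illustration of the two-microphone case, the user's bearing can be estimated from the time difference of arrival (TDOA) of speech at the two sensors. The far-field geometry and the function below are an assumed sketch, not a limitation of the claimed sensor array:

```python
import math

# Illustrative sketch: a far-field source at angle theta from broadside
# (the perpendicular bisector of the microphone pair) arrives with a
# delay of (d * sin(theta)) / c between two microphones d metres apart.

SPEED_OF_SOUND = 343.0  # m/s, approximately, at room temperature

def bearing_from_tdoa(delay_s: float, mic_spacing_m: float) -> float:
    """Return the source bearing in degrees from broadside (-90..90)."""
    s = (delay_s * SPEED_OF_SOUND) / mic_spacing_m
    s = max(-1.0, min(1.0, s))  # clamp against measurement noise
    return math.degrees(math.asin(s))
```

The control unit 204 could use such an estimate, alone or fused with the other sensor types listed above, as the relative position determined in step 104.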
A system or method in accordance with the invention may be used in various data devices. In particular, the invention is usable in portable or mobile radio communication devices. The system may therefore be used in a mobile telephone or in a portable or mobile PMR radio. The invention may also be used in a personal digital assistant (PDA) or laptop computer, linked for example by a radio or infra-red communication link to a cellular network. Such a network may be in a building, or may be a cellular telephone network or a UMTS/3G network.