GB2298112A - Computer device with position-based information input device and operating system tool input - Google Patents
- Publication number
- GB2298112A (GB 2298112 A); application number GB9502773A
- Authority
- GB
- United Kingdom
- Prior art keywords
- input
- operating system
- application layer
- tool
- presented
- Prior art date
- Legal status
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0489—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Input From Keyboards Or The Like (AREA)
Abstract
A computer device having a display or screen (15) arranged to display a set of different application layer fields (122, 123, 124) and an operating system input tool (121) presented in place of selected ones of the application layer fields. A digitizer or other position-based information input device (17) is mounted co-spatial with the information display device for receiving input actions. A processor (10) accepts input actions from the input device and causes the information display device to display information corresponding to the input actions. The operating system input tool (121) has selectable tool parameters and different application layer fields comprise different tool parameters, whereby input actions which are presented in different locations of application layer fields cause different operating system input tools to be presented for entry of information into the different application layer fields. E.g. if the input stylus indicates a numerical field, then a virtual numerical keypad appears, whereas if a date field is indicated, a date input dialog box appears.
Description
COMPUTER DEVICE WITH POSITION-BASED INFORMATION
INPUT DEVICE AND OPERATING SYSTEM TOOL INPUT
Field of the Invention
This invention relates to a computer device, for example a hand-held data terminal, comprising an information display device and a position-based information input device such as a digitizer. Such computer devices include, for example, pen-based computers.
Background of the Invention
In the field of pen computers with no physical keyboard, it is known to emulate a keyboard on a computer screen. A screen keyboard works by displaying a picture of a keyboard. The user "types" on the keyboard by touching the keys with a pen or by "clicking" with a mouse. Such keyboards can be used on a touch-screen computer with no pen but using the operator's finger.
On-screen keyboards currently in use are very inconvenient. Since pen computers usually have a small screen and the size of the keyboard is relative to the screen size, existing keyboards are either too small and can hardly be seen, or are too large, cover most of the screen and hide too much information.
When popping up, the keyboard is located in a constant place and might cover the currently focused ("active") field. The way it works is either by activating an iconized keyboard for each field and deactivating it before moving to another field (which is very cumbersome), or by double-clicking the field, activating a keyboard, entering an input and removing the keyboard.
In summary, it is known to provide a computer comprising an LCD or CRT display arranged to display (a) an application layer set of input and display fields and (b) an on-screen keyboard which is provided to the application by the operating system and is presented on the display in place of application layer input and display fields. Such a computer also has a position-based information input device such as an electronic pen or a mouse-and-cursor co-spatial with the display device for receiving input actions and has a processor for accepting input actions from the input device and causing the display to display information corresponding to the input actions. The processor software is arranged (a) to interpret an input action which is presented in a location corresponding to an application layer field as an association between the operating system tool and the application layer field (e.g. the keyboard or other operating system tool becomes active for a current "active" field of the application), and (b) to interpret an input action (e.g. a "click" of a mouse) which is presented in a location corresponding to the operating system input tool as an application layer input for the application layer field (e.g. the active field) associated with the operating system input tool.
There is a need for an improved input tool which is presented on a display device.
Summary of the Invention
According to the present invention, a computer device is provided comprising an information display device (e.g. an LCD or a three-dimensional display) arranged to display (a) an application layer set of input and display fields and (b) an operating system input tool (e.g. an alpha keyboard, a numeric keyboard or an alphanumeric keyboard or pushbuttons, a system panel, or a time or date keyboard), which is presented in place of application layer input and display fields. The computer device further comprises a position-based information input device (such as a digitizer and stylus, an electronic pen, a mouse-and-cursor or a position sensing glove with position image) co-spatial with the display device for receiving input actions; and processing means for accepting input actions from the input device and causing the display device to display information corresponding to the input actions. The processing means are arranged (a) to interpret an input action which is presented in a location corresponding to an application layer field as an association between the operating system tool and the application layer field (e.g. the keyboard or other operating system tool becomes active for a current "active" field of the application), and (b) to interpret an input action (e.g. a "click" of a mouse) which is presented in a location corresponding to the operating system input tool as an application layer input for the application layer field (e.g. the active field) associated with the operating system input tool. The invention is characterised in that the operating system input tool has selectable tool parameters and different application layer fields comprise different tool parameters, whereby input actions which are presented in different application layer field locations cause different operating system input tools to be presented for entry of information into the different application layer fields.
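The two interpretation rules (a) and (b) above can be sketched in C as a simple dispatch on the location of an input action. All type and function names in this fragment are invented for illustration and are not part of the patent's API.

```c
/* Hypothetical sketch of rules (a) and (b): an input action inside the
   operating system input tool is treated as input for the associated field;
   an input action inside an application layer field associates the tool
   with that field. Names are illustrative assumptions. */
#include <stdbool.h>

typedef struct { int x, y, w, h; } rect;

static bool hit(rect r, int x, int y) {
    return x >= r.x && x < r.x + r.w && y >= r.y && y < r.y + r.h;
}

typedef enum { ACTION_ASSOCIATE, ACTION_TOOL_INPUT, ACTION_NONE } action_kind;

/* fields: the application layer fields; tool_rect: where the operating
   system input tool is currently presented on the display. */
action_kind interpret(rect tool_rect, const rect *fields, int n_fields,
                      int x, int y, int *field_out) {
    if (hit(tool_rect, x, y))
        return ACTION_TOOL_INPUT;      /* rule (b): input for the active field */
    for (int i = 0; i < n_fields; i++) {
        if (hit(fields[i], x, y)) {
            *field_out = i;            /* rule (a): this field becomes active */
            return ACTION_ASSOCIATE;
        }
    }
    return ACTION_NONE;
}
```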
The invention has the advantage of being able to provide an operating system input device (e.g. an on-screen keyboard) which is customisable to the particular application field with which it is to be associated.
Moreover, by defining a standard interface for the operating system input tool, i.e. a set of parameters, any application run on the computer device is able to utilise the operating system input device.
The arrangement has advantages in terms of ease-of-use.
The operating system input tool parameters may include location parameters whereby input actions which are presented in different application layer field locations cause operating system input tools to be presented in different locations. In this manner, the location of the on-screen keyboard or other operating system input device can be optimised for the particular circumstances. The need for the user to move the on-screen keyboard or other operating system input tool is reduced or eliminated.
It is preferred that the processing means comprise operating system input tool location computation means (e.g. a novel algorithm in the operating system) for computing a location for the operating system input tool from input tool parameters provided by different application layer fields, wherein the location of the operating system input tool computed is dictated in part by the application layer field with which the operating system tool is associated at a given time (e.g. the active field) and in part by the operating system.
This feature has the advantage that the position of the on-screen keyboard or other operating system input device is not solely dictated by the location of the active field, but rather its location can be held steady when possible or convenient and moved when necessary.
For example, the application may define a primary location and a secondary location and the operating system will position the input tool in the primary position unless there is a conflict in particular circumstances, in which case it will locate the input tool in the secondary location. Such particular circumstances include a conflict with other operating system functions (e.g. another input or display tool) or a conflict with the active field of the application layer.
For successive input actions accepted from the input device presented in locations corresponding to different application layer fields following a given input action in a location corresponding to a first application layer field, it is preferred that the operating system input tool location computation means are arranged to maintain the position of the operating system input tool constant for input actions presented in application layer fields located relatively close to the first application layer field and to generate a new position for the operating system input tool for input actions presented in application layer fields located relatively remote from the first application layer field. This can be achieved by judicious selection of the primary and secondary locations of the input tool for each application layer field.
In this manner, a user can commence using an on-screen keyboard (for example) and continue to use that keyboard in the same location for successive input actions, but when the cursor, stylus or pen has travelled a long distance across the display, it becomes advantageous to move the position of the keyboard (or other input device) to a new position. An advantage is that the keyboard does not jump around with each small movement of the input device, but moves after a major movement of the input device so as to "open up" or uncover a new area (or region) of the display for further use.
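The keep-or-move behaviour described above can be sketched as a simple distance threshold; the Manhattan metric and the threshold are assumptions for illustration, not specified in the text.

```c
/* Illustrative sketch: keep the input tool where it is for small
   inter-field movements, relocate it only after a major movement of the
   input device. The distance metric and threshold are assumed. */
#include <stdlib.h>

/* Manhattan distance between the first field and the newly selected field. */
static int field_distance(int x0, int y0, int x1, int y1) {
    return abs(x1 - x0) + abs(y1 - y0);
}

/* Returns 1 if the tool should be given a new position, 0 to keep it. */
int should_relocate(int prev_x, int prev_y, int new_x, int new_y, int threshold) {
    return field_distance(prev_x, prev_y, new_x, new_y) > threshold;
}
```

With this rule the keyboard does not jump around for adjacent fields, but "opens up" a new region of the display after a long traversal.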
The operating system input tool parameters may include a parameter defining a set of input fields for the operating system input tool.
In this manner input tools with different numbers of input fields (e.g. an alpha keyboard, a numeric keyboard, an alphanumeric keyboard, pushbuttons, function keys, a panel, a system panel, or a time or date keyboard) can be presented as appropriate for the application field in question.
In this manner, a contextual sub-keyboard is provided which is based on the content of the active field. There are many selectable sub-keyboards, each containing the minimum number of keys that are required for a certain field. In this way, the keyboard can be large in size and yet not cover the screen due to its small number of keys.
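The selection of a contextual sub-keyboard from the content of the active field can be sketched as a plain mapping; the enumerators mirror the keyboard types named in the text, but the function and type names are invented for this sketch.

```c
/* Hedged sketch: choose the minimal sub-keyboard for the active field's
   content type. Enum and function names are illustrative assumptions. */
typedef enum { FIELD_TEXT, FIELD_NUMBER, FIELD_DATE, FIELD_TIME } field_content;
typedef enum { KB_ALPHA, KB_NUMERIC, KB_DATE, KB_TIME } kb_type;

kb_type keyboard_for(field_content c) {
    switch (c) {
    case FIELD_NUMBER: return KB_NUMERIC;  /* e.g. a virtual numeric keypad */
    case FIELD_DATE:   return KB_DATE;     /* e.g. a date input dialog box */
    case FIELD_TIME:   return KB_TIME;
    default:           return KB_ALPHA;    /* letters suffice for plain text */
    }
}
```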
It is preferred that the keyboard constantly stays on the screen and changes its type according to the field content. When it pops up, it is automatically located in the optimal location on screen, far enough from the active field not to hide the relevant data, and it may be moved manually by dragging.
A valuable novel feature is that the size of the keyboard can be easily enlarged or reduced, e.g. by dragging its corner, to be clear and readable even on small screens, in a manner otherwise known in the field of windows in general (e.g. application screens and operating system screens), but not heretofore applied to operating system tools.
The location of the operating system input tool may depend on its type. E.g. a keyboard may be held in a fixed primary location with an alternative secondary location, or may always be located adjacent the field with which it is associated at a given time.
Other parameters of the operating system tool can be offered by the operating system and requested by the application.
It is also possible to specify, through the application program interface (API), labels for the keyboard which are defined dynamically and it is possible to specify the orientation of a keyboard, e.g. horizontal or vertical, and to specify the number of buttons presented.
It is a novel feature to provide a keyboard that constantly stays on the screen and just relocates each time a new application layer field becomes active, in order not to hide the relevant field.
A preferred embodiment of the invention is now described, by way of example, with reference to the drawings.
Glossary
The following abbreviations are used:
API application program interface
APM advanced power management
APPLET a mini-application, designed to be accessible only from within an application
BIOS basic input/output system
DLL dynamic link library
HDT hand-held data terminal
LCD liquid crystal display
RAM random access memory
ROM read only memory
TX see TX for Windows application, SRS document no. 970.0344, Revision 3.0
Brief Description of the Drawings
FIG. 1 is a hardware diagram of a preferred embodiment of a computer device according to the invention.
FIG. 2 is a diagram of the overall software structure of the computer device of FIG. 1.
FIG. 3 is a diagram of the digitizer driver software structure of the computer device of FIG. 1.
FIGS. 4 and 5 are illustrations of examples of on-screen keyboards used in the preferred embodiment of the invention.
FIGS. 6 and 7 are illustrations of examples of on-screen time and date keyboards used in the preferred embodiment of the invention.
FIG. 8 is an illustration of a pushbutton keyboard.
FIG. 9 is a software flow diagram illustrating an algorithm performed by the operating system software of FIG. 2.
FIGS. 10 and 11 are illustrations of a screen of the computer device of FIG. 1 showing the keyboard of FIG. 5 in different positions, for explanation of the software flow diagram of FIG. 9.
FIGS. 12 and 13 are illustrations of examples of other on-screen panels used in the preferred embodiment of the invention.
Detailed Description of the Drawings
Referring to Fig. 1, the computer device comprises a microprocessor 10 with RAM memory 11 and ROM memory 12 and hand data terminal circuitry 13 including an output circuit 14 connected to an
LCD or other display 15 and an input circuit 16 connected to a digitizer 17.
The digitizer 17 is pressure based and operated by a stylus 18 (or by a user's finger), but it may take other forms. The digitizer 17 forms an input device which is co-spatial with the display 15. Other forms of co-spatial display and input devices include a cathode ray tube with a mouse and cursor. Even a three-dimensional display and a three-dimensional input glove can be used.
Also shown in FIG. 1 are data radio transceiver circuitry 19 and power control circuitry 20, both coupled to the microprocessor 10.
Stored in the ROM 12 are various programs for controlling the microprocessor 10. These include an operating system, which in this case is preferably 'Windows' (trademark) from Microsoft, Inc. The ROM 12 also includes one or more application programs which can be opened and closed under the control of the operating system.
Referring to the overall structure of the programs stored in ROM 12,
FIG. 2 shows that the operating system comprises an operating system core 30, a power manager module 31, a radio frequency (RF) manager 32, a file system 33, an HDT driver 34 and a virtual keyboard module 35.
In outline terms, the power manager module 31 controls the power control circuitry 20 (to minimize where possible supply of power to the hand data terminal circuitry 13, radio transceiver circuitry 19 and other elements), the RF manager module 32 controls the transceiver circuitry 19 and the HDT driver 34 controls the HDT circuitry 13.
In FIG. 2, the broad arrows 37, 38, 39, 40, 41 and 42 are hardware interfaces. Arrow 37 is an X86-SL interface to the power control circuitry 20. Arrow 38 is a hardware interface to the transceiver circuitry 19.
Arrow 39 is an interface to the RAM memory 11 and arrow 40 is a flash disk interface to flash disk memory in RAM 11. Arrow 41 is a hardware interface to the HDT circuitry 13 and arrow 42 is a hardware interface between the core of the microprocessor 10 and the digitizer input circuit 16.
Arrows 50, 51 and 52 are application program interfaces (APIs), interfacing with various application programs (not individually shown) which are stored in ROM 12. These interfaces are described in detail below. Arrow 53 to the core 30 of the operating system is also an API.
Arrows 54 and 55 are APM BIOS interfaces.
The arrows between the various modules 30 to 35 are internal software interfaces. The arrow from the operating system core 30 to the virtual keyboard module 35 is a pen event interrupt and the arrow from the virtual keyboard module 35 to the operating system core 30 is a keyboard event interrupt. The double headed arrow between core 30 and RF manager module 32 is a communications API. The arrow from RF manager module 32 to HDT driver 34 represents RF events. The arrow from operating system core 30 to HDT driver 34 represents system hooks.
To summarize FIG. 2, each sub-system has three types of interface: (a) hardware interfaces represented by the wide arrows 37 to 42; (b) application programmer interfaces (arrows 50 to 53) which define sub-system services and can be used by any application or any other system component; and (c) intra-system interfaces, which define interfaces between sub-systems. In addition to these intra-system interfaces, every software component in the system is a valid 'Windows' and APM aware component and responds to the following events: windows initialization and shutdown; APM suspend, resume and critical resume.
Details of the power manager module 31, the RF manager module 32 and the file system 33 need not be described.
Referring to the HDT driver 34, this sub-system controls the following hardware components: a keyboard controller (for example type 8051); shift register inputs and outputs (for example via 8051 software extension); speaker and microphone level; battery level gauge; A/D via 8051 software extension and LED via 8051 software extension.
The API 51 is constructed from HDT hardware access services and from event logging services including: read level from the A/D in the microprocessor 10 (for example battery voltage); write to HDT input and output commands (e.g. audio routing); APM BIOS interface. Via the system hooks interface from the operating system core 30 the HDT sub-system will receive system events such as windows crash, system management state change, special key events and emergency. Via the RF events interface from the RF manager module 32 the HDT sub-system will receive RF system notifications such as radio battery low, radio operation state, in-out coverage, receive, transmit etc.
The details of the HDT driver need not be described. The HDT sub-system has an optional physical keyboard (not shown), which the HDT driver 34 scans; upon occurrence of events on this keyboard (or other managing interface), data is passed by interface 55 to the appropriate application.
Referring to virtual keyboard module 35, this is a user input subsystem, the purpose of which is to provide the user an input medium which replaces a physical keyboard. This medium is implemented by graphic icons, select lists and keyboards. It enables the user to run the
HDT application without the need for a physical keyboard.
Module 35 has an API 50 and user input services are exported to the application by three main functions: (a) create user input object - this service allows the application to define new user input objects; (b) show user input object - this service provides pop-up user input objects to the application window; and (c) destroy user input object - this service removes user input objects from the application window.
The user input sub-system sends 'Windows' messages upon activation of a key which is defined as a message key. The pen events represented by the arrow from the operating system core 30 to the virtual keyboard module 35 pass standard mouse API information from the operating system to the virtual keyboard module 35. Keyboard events generated by the stylus 18 on the digitizer 17 cause generation of 'Windows' standard keyboard messages, which pass from the virtual keyboard module 35 to the operating system core 30.
The pen driver in Windows core 30 accepts an input from the hardware interface 42 and its output is a pen event. The pen event is the input to the virtual keyboard module 35 and its output is a keyboard event.
Referring now to FIG. 3, a block diagram of digitizer driver software in operating system core 30 is shown. The function of the digitizer driver is to transfer pen co-ordinates from the digitizer 17 to the 'Windows' operating system core 30.
The digitizer driver software comprises a standard pen driver module 100, a virtual pen driver 102, a pen DLL 104 and a pen APPLET 106. Pen APPLET 106 interfaces with a system initialisation pen section 108. The standard pen driver 100 interfaces with the rest of the 'Windows' operating system 110.
In operation, the operating system 110 provides a drive enable signal to the standard pen driver 100. The standard pen driver provides a hardware initialization signal on line 112 to the digitizer 17. The connection between the digitizer 17 and the virtual pen driver 102 is by means of a UART interrupt request 114. The virtual pen driver 102 receives buffer addresses 116 from the standard pen driver 100 and provides pen packets 118 to the standard pen driver 100.
The digitizer driver is based on a 'Kurta' driver, as is known in the art, with certain enhancements, as follows. Within the HDT circuitry 13 there is a UART, having a first-in-first-out memory (FIFO). This FIFO is used to reduce the overall load on the microprocessor 10 whilst the system acquires pen (i.e. stylus) movement co-ordinates. Every pen co-ordinate packet generates one interrupt (instead of one interrupt per byte as in a standard 'Kurta' driver). An interrupt handler in the virtual pen driver 102 reads all the available characters from the FIFO. Before leaving the handler it checks the UART received-data-ready bit. If this is asserted, it jumps to the place where the receive holding register is read.
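The drain-then-recheck behaviour of the interrupt handler can be sketched against a simulated UART; the register model, ring-buffer FIFO and names below are assumptions for illustration, not the actual 'Kurta' driver code.

```c
/* Sketch of the enhanced interrupt handler described above, run against a
   simulated UART FIFO: read all available characters, and keep re-checking
   the received-data-ready condition before leaving so a byte arriving
   meanwhile is not missed. All names and the FIFO model are assumed. */
#define FIFO_SIZE 16

typedef struct {
    unsigned char fifo[FIFO_SIZE];
    int head, tail;                 /* ring buffer indices */
} uart_t;

/* Simulated received-data-ready bit. */
static int uart_data_ready(const uart_t *u) { return u->head != u->tail; }

/* Simulated read of the receive holding register. */
static unsigned char uart_read(uart_t *u) {
    unsigned char c = u->fifo[u->tail];
    u->tail = (u->tail + 1) % FIFO_SIZE;
    return c;
}

/* Returns the number of bytes drained into out[] (at most FIFO_SIZE). */
int pen_irq_handler(uart_t *u, unsigned char *out) {
    int n = 0;
    while (uart_data_ready(u))      /* re-checked before the handler exits */
        out[n++] = uart_read(u);
    return n;
}
```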
Referring again to FIG. 2, it can be seen that the virtual keyboard module 35 has an API 50. This API is a novel interface which provides a standardised definition of on-screen keyboards that the virtual keyboard module 35 will present on the display 15. Actions entered into the virtual keyboard appearing on the display 15 will be received by the virtual pen driver 102 and supplied to the pen DLL 104.
The API 50 is a DLL with five functions, written in C. The five functions are as follows:
1) bool initKBDPackage(...);
2) keyboard_type *CreatePanel(...);
3) bool ShowPanel(...);
4) bool DestroyPanel(...);
5) bool terminateKBDPackage(...);
All functions return a value indicating whether the function has been successful or not. Every created panel or keyboard must receive a DestroyPanel(...) call before the application is terminated. The DLL must receive an initKBDPackage(...) call to be initialized. Before quitting the application a terminateKBDPackage(...) call should be made as well.
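The call-order rules above can be illustrated with a minimal mock of the five functions; the signatures, the panel pool and all internals below are invented for this sketch and are not the patent's actual DLL.

```c
/* Minimal mock of the lifecycle rules stated above: the package must be
   initialised before panels are created, and every created panel must be
   destroyed before the package is terminated. Everything beyond the five
   function names is an assumption for illustration. */
#include <stdbool.h>

typedef struct { int n_buttons; bool shown; } keyboard_type;

static bool g_initialised = false;
static int  g_live_panels = 0;
static keyboard_type g_pool[8];
static int g_next = 0;

bool initKBDPackage(void) { g_initialised = true; return true; }

keyboard_type *CreatePanel(int n_buttons) {
    if (!g_initialised || g_next >= 8) return 0;   /* init call required first */
    g_pool[g_next] = (keyboard_type){ n_buttons, false };
    g_live_panels++;
    return &g_pool[g_next++];
}

bool ShowPanel(keyboard_type *kt) {
    if (!kt) return false;
    kt->shown = true;
    return true;
}

bool DestroyPanel(keyboard_type *kt) {
    if (!kt) return false;
    g_live_panels--;
    return true;
}

/* Fails while any panel is still live, mirroring the rule that every panel
   must receive a DestroyPanel(...) call before termination. */
bool terminateKBDPackage(void) {
    if (g_live_panels != 0) return false;
    g_initialised = false;
    return true;
}
```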
It is a significant feature of the equipment that a number of keyboards and panels (generically referred to as "input tools") can be offered by the operating system core 30 through the virtual keyboard module 35 to any application being run on the microprocessor 10. This is achieved through the unified API 50.
There are six different types of predefined keyboards plus a pushbutton panel and a system panel. The predefined keyboards are: alpha, numeric, alphanumeric, full keyboard, time and date.
The first and second of these are shown in FIGS. 4 and 5. The alpha keyboard is a standard QWERTY keyboard with only letters, plus: esc, backspace, control, shift, caps, tab, alt, space and enter. The alphanumeric keyboard is the keyboards of FIGS. 4 and 5 combined.
The time and date keyboards are shown in FIGS. 6 and 7. Other keyboards are possible, as shown in FIGS. 8 and 12, but these would also need to be defined in the API.
The keyboards are created in the initializing call to the DLL. Thus they do not require another CreatePanel(...) call. The programmer calls a ShowPanel(...) function with the proper parameter as described in the API.
The various applications (spreadsheet, data-base, word processor, diary etc.) operate the keyboards and panels by calling the DLL's functions.
The input entered on a keyboard-type window is displayed immediately in the field, character by character, while typing.
The push button panel is a programmer defined keyboard.
Buttons on this panel have three possible actions: send message, post message, key event. At run time the programmer calls two functions to show a panel. The first is a CreatePanel(...) function. This function creates the panel as specified in the parameters. After a successful execution of the function a call to a ShowPanel(...) function is made. All customization of the panel is done through the parameters as specified in the API of the function calls.
The system panel is a predefined panel. This panel is also created when the function initKBDPackage(...) is called. The system panel has the "always on top" option activated. To show the system panel, ShowPanel(...) must be called.
The API functions definitions are as follows.
CREATEPANEL() FUNCTION
CreatePanel (                  // format parameters:
    Bool on_top,               // always on top is enabled
    Bool has_frame,            // a frame exists -> resizable
    int size,                  // size=3. During run time, size will receive the value which the keyboard was closed at.
    int n_button,              // number of buttons
    int n_rows,                // number of rows
    int n_col,                 // number of columns
    location_type loc,         // bottom/top...
    orient_type orient,        // horizontal
    fill_type fill,            // compact
    but_act_type *but_act      // structure of buttons
)
A CreatePanel function will create the panel according to the specified number of buttons and specified labels. With minor modifications, buttons can be created with pictures too. The create function is called once in TX initialization. TX initialization is an initialization procedure for initialising data and creating objects, that is done once when a TX is run.
DATA STRUCTURES USED IN CREATEPANEL()
location_type = enum { top, bottom, left, right, exact }
orient_type   = enum { horizontal, vertical }
fill_type     = enum { compact, full }
but_act_type  = struct {
    act_type key_stroke,   // see **
    str key_string,
    MSG msg,               // message to be sent
    size_type size,
    str alias,
    h_but hbut             // button handle
}
** Data structure used in act_type:
act_type = enum { key_stroke, post, send }
SHOWPANEL(...) FUNCTION
ShowPanel (                // format parameters:
    keyboard_type *kt,     // pointer to structure
    point locpoint,        // location
    hWND hwnd              // handle of window
)
DATA STRUCTURES USED IN SHOWPANEL()
keyboard_type * = pointer to the keyboard created in CreatePanel(). Pointers have the following names: (alpha, num, alnum, full, date, time)
point = (x1, y1)           // The user either gives the top left point of the keyboard or leaves the system the option to decide where to place the keyboard.
hWND = handle of window to show the panel in.
Through the above commands and definitions, virtual keyboards are defined, created and displayed which serve the purpose of a physical keyboard or a mouse. Some buttons are regular keys like keyboard keys and some are pushbuttons, icons or functional keys. For each type of data, e.g. each field of each application, a different input tool window can be defined with the minimum number of buttons which suits the context of the current input. In this way the window can include large and clear fonts and a minimum number of buttons or keys.
There is preferably always at least one input tool window that exists on the screen. This avoids the need for any other input (except a reset button). There may be more than one. Some pop up automatically and some are activated by the user. No matter which kind of input tool window is presented, they all have the same API and certain similar characteristics, including changing size and automatic moving (both described below).
Functionally they may be divided into two different groups: adjustable format window tools, which should always appear on the screen (either in full display or iconized) and can be generated per application with a flexible format; and fixed format keyboards, which are prepared once and are fixed throughout all applications. The latter pop on and off automatically as the type of data field changes and should not stay resident on the screen all the time.
In addition to the keyboard types defined above, a pushbutton keyboard can be provided. This is a flexible sized panel of pushbuttons.
The CreatePanel function will create this panel according to the programming. The number of buttons and the labels are taken from the configuration, and the required return keycode or message is given. It should automatically order the keys in rows and columns according to the number of keys. An example is shown in FIG. 8.
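The automatic ordering of keys into rows and columns can be sketched with a squarest-grid heuristic; the text does not specify the rule, so the heuristic below is an assumption for illustration.

```c
/* Illustrative sketch (heuristic assumed, not specified in the text):
   pick the smallest number of columns whose square grid holds n_keys,
   then take just enough rows to fit all keys. */
void auto_grid(int n_keys, int *rows, int *cols) {
    int c = 1;
    while (c * c < n_keys)
        c++;                           /* c = ceil(sqrt(n_keys)) */
    *cols = c;
    *rows = (n_keys + c - 1) / c;      /* ceiling division */
}
```

For example, 9 keys give a 3x3 panel, while 10 keys give 3 rows of 4 columns.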
A significant preferred feature of the new automatic floating screen keyboard is its automatic location feature. The location algorithm is shown in FIG. 9.
Each keyboard or input device has defined for it through the API a field type: "regular" or "adjacent". Where a field type is "adjacent", this defines that it is a field for which the input device will be located adjacent the field. Where the field type is "regular", it will have defined a primary location and a secondary location for the input device. The primary location is the preferred location.
As the user uses the stylus 18 on the digitizer 17 to select a new field in the application displayed on the screen 15, the algorithm of Fig. 9 commences at step 300. Step 301 determines whether the new field is an "adjacent" type of field. If it is, the program passes to step 302 and the location for the input device is selected as a location near to the new active field. The actual relative locations of the field and the input device are a matter of design choice; for example, the location for the input device may be to the right of the field, below the field, or computed in some other manner.
For this type of input device, the location is always convenient for the current field.
If the type of input device is "regular", the program passes from step 301 to step 303 to select a location for the input device. In step 303, the algorithm determines whether there is a collision between the primary location of the input device and the current field. A collision exists if the input device partially overlaps the current field, thereby partially obscuring it. A collision can also be defined to exist if the current field and the input device are too close together, i.e. the current field lies within a certain margin around the input device. If no collision exists, the algorithm passes to step 304 and the input device is placed in its primary location. If there is a collision between the primary location of the input device and the location of the new field, step 305 causes the input device to be located in its secondary location. The secondary location is selected to be remote from the primary location, i.e. in another part of the screen altogether, clear of the collision.
The location algorithm for a regular type of input device prevents it from jumping around the screen unnecessarily.
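The logic of steps 301 to 305, including the overlap-or-margin collision test, might be sketched as follows. The rectangle representation, the 4-pixel proximity margin, and the below-the-field placement for the "adjacent" case are all illustrative choices not fixed by the description:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

def collides(field: Rect, tool: Rect, margin: int = 0) -> bool:
    """True if the tool overlaps the field, or lies within `margin`
    pixels of it (the optional too-close-together rule)."""
    grown = Rect(tool.x - margin, tool.y - margin,
                 tool.w + 2 * margin, tool.h + 2 * margin)
    return not (field.x >= grown.x + grown.w or
                field.x + field.w <= grown.x or
                field.y >= grown.y + grown.h or
                field.y + field.h <= grown.y)

def locate_tool(field: Rect, field_type: str, tool: Rect,
                primary: tuple, secondary: tuple) -> tuple:
    """Sketch of the FIG. 9 algorithm (steps 301 to 305)."""
    if field_type == "adjacent":
        # step 302: place the tool next to the active field
        # (just below it here, a design choice)
        return (field.x, field.y + field.h)
    # "regular": step 303 checks the primary location for a collision
    at_primary = Rect(primary[0], primary[1], tool.w, tool.h)
    if not collides(field, at_primary, margin=4):
        return primary   # step 304
    return secondary     # step 305
```

Because the primary location is kept whenever it does not collide with the active field, the keyboard stays put for a whole column of fields and only jumps to the remote secondary location when the user works near it, exactly the Figs. 10 and 11 behaviour.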
The above algorithm is illustrated in Figs. 10 and 11. In Fig. 10, a spreadsheet 120 is shown displayed on the display 15 as well as a numeric keypad 121. The stylus 18 is shown as entering an action in field 122. For field 122, the keyboard type defined is a numeric keypad of "regular" type and the primary location is defined as being co-ordinates X1, Y1. The same primary location can be defined for field 123. For all the fields in the same column, it is convenient to have the keyboard located in the position shown as the primary location. In this position, the information surrounding the fields in question is visible and the keypad 121 remains static and does not jump around the screen.
A problem arises, however, when the stylus is used in field 124 (in the column headed "price"). For the fields above field 124, the primary location X1, Y1 is satisfactory, but as the stylus has moved closer to the keypad 121, the keyboard begins to obscure the area of interest of the screen. Accordingly, a secondary location for the keypad 121 is defined as shown in Fig. 11. This secondary location has co-ordinates X2, Y1. This secondary location is selected when the stylus 18 is used in field 124 or field 125 or any of the fields beneath those fields in the application display (i.e. in the "price" or "date" columns of the spreadsheet).
Note that for field 125, the type of keyboard will be defined as a date keyboard, as shown in FIG. 7. In other words, when the stylus 18 is used to activate field 125, the keyboard type will automatically change from a numeric keyboard to a date keyboard. This is a very significant timesaving feature. The user does not need to select the appropriate keyboard, nor is it necessary to have both keyboards appearing simultaneously and cluttering up the screen.
In summary, the default location is not a simple fixed place but rather a short function that decides on the best place to position the keyboard in relation to the current window or field in focus. It is the DLL's responsibility to obtain the location of the user's input in order to position the panel correctly.
It is an option as to whether the initial location is the default location or the last one used.
All panels have the same man-machine interface with standard features: a frame with a title in the middle; a button on the right for iconizing; a button with a pop-down menu on the left; and a bottom right corner used for sizing. Not all of these features are illustrated in FIGS. 4-8, 12 and 13.
A permanent status control panel includes a group of pushbuttons to control hardware and indicate status of hardware components.
This panel should always exist on screen or can be iconized if it hides a form. This window is always on top. An example is shown in FIG. 12.
The system panel includes an OFF button 200, a power button 201, a radio ON/OFF button 202, a keyboard button 203, and speaker increment and decrement buttons 204 and 205. It also includes two indicators: a battery indicator 206 and a radio coverage indicator 207.
The panel can be iconized as shown in FIG. 13, in which a stylus action in iconize box 220 causes panel 222 to be replaced by a single icon 224 which itself acts as a button. Upon the occurrence of a stylus input in the area of the icon 224, the panel 222 is returned.
In prior arrangements there has been a trade-off between a full keyboard which is too small and hard to read and a large size keyboard which is easy to read but is too big and covers most of the screen data.
Since there is no one keyboard that contains all necessary features, an optimized on-screen keyboard has been described which fits not only pen computers but also touch-screen computers.
The size of the keyboard can be controlled by the user by stretching or shrinking its window, like any other 'Windows' window, by dragging its edge. The program is also able to change the size by changing a parameter. The tool is adjustable in size by locating the stylus at a location on the display 15 corresponding to a size box 226 or other predetermined portion of the operating system tool and by moving the stylus to a selected new location.
This feature enables any size of keyboard to be implemented, which is very useful in small pen computers or touch-screen computers. The fonts can be bigger and the keys much more readable. It is an option that the initial size can be a default size or the last size used. When a user changes the size, it changes only in predefined steps.
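The stepped resizing could be implemented by snapping the requested dimension to a grid of predefined steps. The minimum size and step values below are illustrative assumptions:

```python
def snap_size(requested: int, minimum: int = 80, step: int = 40) -> int:
    """Snap a requested keyboard width or height to the nearest
    predefined step, never going below a minimum usable size."""
    if requested <= minimum:
        return minimum
    steps = round((requested - minimum) / step)
    return minimum + steps * step
```

Snapping keeps every intermediate size one at which the key grid still renders cleanly, rather than allowing arbitrary pixel sizes that would distort the key labels.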
It is possible to choose a key even by a touch of a finger. When the keyboard covers some important information on the screen, it can be decreased in size to the optimal size for the method of use.
The on-screen keyboard pops up by double tapping a field with the stylus 18 or by double clicking the mouse. Once it is on, it remains resident on the screen. There is no need to deactivate it before each new activation. Each application field is able to have a keyboard defined for that field through a common API, so that in cases where the current type of field is different from the previous field used, the keyboard will disappear and a new keyboard will pop up automatically.
The type of keyboard depends on the type of field, whether it is a numeric keyboard or an alphanumeric, letters-only keyboard or a keyboard with arrows, ENTER and ESC keys only. Another type can be a keyboard with functional keys only, F1 - F12.
The keyboard is a contextual keyboard which changes according to the field's type. This minimizes the size of the keyboard and lets the user type very quickly and with the fewest errors, since the number of keys offered is small and every key is valid.
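A minimal sketch of the contextual switch follows. The field-type names and key sets here are hypothetical; in the patent each application field defines its own keyboard (numeric, date, function keys, etc.) through the common API:

```python
# Hypothetical field-type names mapped to the key sets described
# in the text (numeric, letters-only, navigation, F1-F12).
FIELD_KEYBOARDS = {
    "numeric":    list("0123456789") + ["ENTER", "ESC"],
    "letters":    [chr(c) for c in range(ord("A"), ord("Z") + 1)],
    "navigation": ["UP", "DOWN", "LEFT", "RIGHT", "ENTER", "ESC"],
    "function":   [f"F{i}" for i in range(1, 13)],
}

def keyboard_for(field_type: str) -> list:
    """Key set for a field type, falling back to a full letter set."""
    return FIELD_KEYBOARDS.get(field_type, FIELD_KEYBOARDS["letters"])

def on_field_change(prev_type, new_type, show, hide):
    """Pop the old keyboard off and the new one on, but only when the
    field type actually changes (the timesaving behaviour described:
    no manual selection, no simultaneous keyboards)."""
    if prev_type == new_type:
        return
    hide(prev_type)
    show(new_type)
```

The `show` and `hide` callbacks stand in for whatever window-management calls the operating system layer provides; only one contextual keyboard is ever visible at a time.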
Keyboards and windows (generically "operating system input tools") can be fixed and adjustable.
Fixed format keyboards are used only in editing mode. They pop on and off automatically upon touching a data field in the application. Only one fixed format keyboard can exist on the screen at once, as they substitute for one another: when a new one appears, the previous one disappears. It is also possible that none appears at all, when no editing is being done or the user has already removed it.
Adjustable format input windows are used for operating specific functions by one touch of a pen. They exist permanently on the screen, either in full display or iconized. They are created dynamically at run time according to specified parameters. By contrast with fixed format keyboards, more than one adjustable input window can appear on the screen at a time. In the preferred embodiment at least one system panel is always present. The others can be iconized and their icons will be part of the permanent panel. The permanent one should always be on top, meaning that it is always in the foreground and no application will overwrite it. It is activated through a DLL when 'Windows' comes up (in the initialization process of 'Windows').
Since the pushbutton panel is flexible in content and size, an option is for it to be created according to the TX configuration program. The number of buttons may depend on the number of defined functional keys and on how they were programmed in the TX configuration program.
A very advantageous feature is the optimal location of the keyboard on the screen. The first priority is to display it in a vacant space on the screen. If there is no vacant space, it will be displayed further away from the associated field's location, while attempting not to overlap fields that belong to the same group. In the case where all fields belong to the same group, the keyboard will be located further away from the current field to avoid covering the field's content.
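The placement priorities just described (vacant space first, then avoiding fields of the same group, then merely keeping clear of the current field) might be sketched as follows. The corner-anchored candidate positions are an illustrative assumption; any set of candidate locations would do:

```python
def overlaps(a, b):
    """Axis-aligned rectangle overlap; rects are (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def place_keyboard(screen, kb_size, fields, current, same_group):
    """Choose a keyboard position by the stated priority order.
    All names are illustrative."""
    sw, sh = screen
    kw, kh = kb_size
    # candidate anchor positions: the four screen corners
    candidates = [(0, 0), (sw - kw, 0), (0, sh - kh), (sw - kw, sh - kh)]

    def rect(pos):
        return (pos[0], pos[1], kw, kh)

    # 1. first priority: a completely vacant space
    for pos in candidates:
        if not any(overlaps(rect(pos), f) for f in fields):
            return pos
    # 2. otherwise avoid the current field and fields of its group
    for pos in candidates:
        r = rect(pos)
        if not overlaps(r, current) and not any(overlaps(r, f) for f in same_group):
            return pos
    # 3. last resort: at least keep clear of the current field itself
    for pos in candidates:
        if not overlaps(rect(pos), current):
            return pos
    return candidates[0]
```

For instance, with a toolbar occupying the whole top of the screen, the first vacant candidate found is a bottom corner, so the keyboard lands there without covering any field.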
The keyboard can be manually moved to a different location by dragging it either by mouse or by pen.
The above input tools are considered to be the optimal input tools for pen computers. They comprise all features essential for easy activation of an adaptive keyboard with the fewest operations. They are distinguished by flexibility, lack of complexity, clarity and a nondisruptive appearance on the screen. They always pop up immediately, in the right shape and at the right moment, with just a tap of the pen and no redundant operations, and can be adjusted easily to the most convenient size. They can be generated dynamically at run time by specifying selected parameters. They facilitate the making of a customized keyboard.
Claims (8)
1. A computer device comprising:
an information display device arranged to display (a) an application layer set of input and display fields and (b) an operating system input tool presented in place of application layer input and display fields;
a position-based information input device co-spatial with the display device for receiving input actions; and
processing means for accepting input actions from the input device and causing the information display device to display information corresponding to the input actions;
wherein the processing means are arranged to (a) interpret an input action which is presented in a location corresponding to an application layer field as an association between the operating system input tool and the application layer field, and (b) interpret an input action which is presented in a location corresponding to the operating system input tool as an application layer input for the application layer field associated with the operating system input tool,
characterised in that the operating system input tool has selectable tool parameters and
different application layer fields comprise different tool parameters, whereby input actions which are presented in different application layer field locations cause different operating system input tools to be presented for entry of information into the different application layer fields.
2. A computer device according to claim 1, wherein the operating system input tool parameters include location parameters whereby input actions which are presented in different application layer field locations cause operating system input tools to be presented in different locations.
3. A computer device according to claim 2, wherein the processing means comprises operating system input tool location computation means for computing a location for the operating system input tool from input tool parameters provided by different application layer fields, wherein the location of the operating system input tool computed is dictated in part by the application layer field with which the operating system tool is associated at a given time (an active field) and in part by the location of the position-based information input device.
4. A computer device according to claim 3 wherein, for successive input actions accepted from the input device presented in locations corresponding to different application layer fields following a given input action in a location corresponding to a first application layer field, the operating system input tool location computation means are arranged to maintain the position of the operating system input tool constant for input actions presented in application layer fields located relatively close to the first application layer field and to generate a new position for the operating system input tool for input actions presented in application layer fields located relatively remote from the first application layer field.
5. A computer device according to claim 1, wherein the operating system input tool parameters include a parameter defining a set of input fields for the operating system input tool, whereby input actions which are presented in different application layer field locations cause operating system input tools with different numbers of input fields to be presented.
6. A computer device according to claim 1, wherein the operating system input tool has a variable size parameter.
7. A computer device according to claim 6, wherein the operating system input tool is adjustable in size by locating the position-based input device at a location on the information display device corresponding to a predetermined portion of the operating system tool and moving the position-based input device to a selected new location.
8. A computer device comprising:
an information display device arranged to display (a) a set of different application layer input and display fields and (b) an operating system input tool presented in place of selected ones of the application layer input and display fields;
a position-based information input device co-spatial with the information display device for receiving input actions; and
a processor for accepting input actions from the input device and causing the information display device to display information corresponding to the input actions;
wherein:
the processor is arranged to (a) interpret an input action which is presented in a location corresponding to an application layer input and display field as an association between the operating system input tool and the application layer input and display field, and (b) interpret an input action which is presented in a location corresponding to the operating system input tool as an application layer input for the application layer input and display field associated with the operating system input tool;
the operating system input tool has selectable tool parameters; and
different application layer input and display fields comprise different tool parameters, whereby input actions which are presented in different locations of application layer input and display fields cause different operating system input tools to be presented for entry of information into the different application layer input and display fields.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB9502773A GB2298112B (en) | 1995-02-11 | 1995-02-11 | Computer device with position-based information input device and operating system tool input |
Publications (3)
Publication Number | Publication Date |
---|---|
GB9502773D0 GB9502773D0 (en) | 1995-03-29 |
GB2298112A true GB2298112A (en) | 1996-08-21 |
GB2298112B GB2298112B (en) | 1999-11-03 |
Family
ID=10769510
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB9502773A Expired - Fee Related GB2298112B (en) | 1995-02-11 | 1995-02-11 | Computer device with position-based information input device and operating system tool input |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2298112B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2322779A (en) * | 1996-10-30 | 1998-09-02 | Samsung Electronics Co Ltd | Character-based information retrieving apparatus and method therefor |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0249293A2 (en) * | 1986-06-10 | 1987-12-16 | Philips Electronics Uk Limited | Processor-based data and/or graphics display apparatus |
GB2251774A (en) * | 1990-11-21 | 1992-07-15 | Smiths Industries Plc | Radar display |
Non-Patent Citations (1)
Title |
---|
Paul McFedries, "WordPerfect for Windows Cheat Sheet", Alpha Books, 1994, "Using QuickMenus", pp. 35-36 * |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PCNP | Patent ceased through non-payment of renewal fee |