GB2425700A - Data entry using a three dimensional visual user interface


Info

Publication number
GB2425700A
GB2425700A (application GB0508841A)
Authority
GB
United Kingdom
Prior art keywords
data
dimensional
characters
character
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0508841A
Other versions
GB0508841D0 (en)
Inventor
Gordon Frederick Ross
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to GB0508841A priority Critical patent/GB2425700A/en
Publication of GB0508841D0 publication Critical patent/GB0508841D0/en
Publication of GB2425700A publication Critical patent/GB2425700A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/018Input/output arrangements for oriental characters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233Character input methods
    • G06F3/0236Character input methods using selection techniques to select from displayed items
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/048023D-info-object: information is displayed on the internal or external surface of a three dimensional manipulable object, e.g. on the faces of a cube that can be rotated by the user

Abstract

A method for data entry is proposed, comprising a virtual keyboard data entry capability that uses three dimensional interactive object elements for the entry selection process. These three dimensional objects can decompose into constituent parts showing a greater amount of detail, and can be rotated to present a number of facets offering additional choices for selection. The method is applicable to a variety of input methods including character based, graphical based, phrase based, phonetic and semantic input as well as touch input, joystick or rocker input and voice input amongst others. The method can be adapted for physical keyboard devices that are tailored to the three dimensional input method described here. It is particularly applicable to text input into mobile devices, e.g. phones.

Description

A METHOD FOR DATA ENTRY USING A THREE DIMENSIONAL VISUAL USER
INTERFACE WITH IMPLOSION AND EXPLOSION OF CONSTITUENT PARTS
This invention relates to a method for user entry of data, including (but by no means limited to) alphanumeric characters, text and messages, on electronic devices. The invention is particularly suitable for use with, but is by no means limited to, portable electronic devices such as mobile phones, personal digital assistants and tablet computers.
Copyright Notice
Portions of the disclosure of this patent document contain material that is subject to copyright. The copyright owner has no objection to the reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
Background to the Invention
Data entry into electronic devices is a ubiquitous part of modern life, with applications ranging, for example, from the DVD control panel to data entry into Personal Information Managers (PIMs) on Personal Computers, Personal Digital Assistants and Mobile Smart and Feature Phones. On small devices with small screens in particular there have been difficulties finding effective and elegant ways to input data. Elaborate ways have been devised to enable people to input data, most frequently with a combination of predictive text and context-sensitive option display and with multiple presses on a single key until the desired character is identified. Less frequently, two dimensional navigation of an on-screen two dimensional virtual keyboard is possible via touch, joystick or remote control.
These methods of data entry have a lot in common with Morse Code and they share some of the same limitations of multiple presses for a single word. Multiple key entry is not just inefficient, it can also give rise to problems with conditions such as Repetitive Strain Injury (RSI). Yet until now the small size of both the keyboard and the screen of many intelligent devices has prevented effective alternatives being devised. Now, with the realisation of three dimensional user interactions with structured information displays, new opportunities are opened up. This present application applies these breakthroughs to the design of a virtual 'three dimensional keyboard'.
The core of the problem with any current data entry on the Mobile Phone is the small device size. Everything we mention in regard to the Mobile Phone applies equally well to other small or small-screen devices such as Personal Digital Assistants, Geographic Positioning Systems, Consumer Electronic Devices with digital displays, Home Automation and other Control Systems, Information Kiosks, and other similar intelligent devices with a small footprint and/or small visual display(s). The solution described here in relation to Mobile Phones applies to all these other devices equally well, and it applies just as well to large-screen devices.
Various approaches have been made to overcome the small keyboard and screen size of devices. Pointers can be provided to allow selection of items from small displays, or handwriting or voice recognition systems can be used. These have enjoyed limited take-up to date as they are not seen as intuitive by many people. They typically also require two free hands for effective entry, which is a drawback in many situations.
Summary of the Invention
According to a first aspect of the invention there is provided a method for enabling data to be entered on an electronic device, the method comprising: providing a visual user interface in which a three dimensional object is displayed, said object being rotatable in response to user input, wherein faces of the object are populated with characters, graphical icons or other data items for selection by a user; receiving input from the user selecting a character, graphical icon or other data item on a face of the object; and adding the selected character, graphical icon or data item to a data stream or string for subsequent processing.
This three dimensional object enables a wide range of characters, graphical icons, phrases, words and other data items to be presented for selection by the user in a simple and intuitive manner.
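The following is a minimal sketch, not taken from the patent itself, of the data model implied by this first aspect: a rotatable object whose faces carry selectable items, and a buffer that accumulates each selection into a string for subsequent processing. All class and method names are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Face:
    """One face of the three dimensional object, populated with selectable items."""
    items: List[str]                      # characters, graphical icons or phrases


@dataclass
class Rotatable3DObject:
    """A rotatable object with several faces; one face is towards the user."""
    faces: List[Face]
    front: int = 0                        # index of the face currently shown

    def rotate(self, step: int = 1) -> None:
        # Simplified stand-in for true 3D rotation: cycle which face is in front.
        self.front = (self.front + step) % len(self.faces)

    def visible_items(self) -> List[str]:
        return self.faces[self.front].items


@dataclass
class EntryBuffer:
    """The data string built up from successive selections."""
    text: str = ""

    def add(self, item: str) -> None:
        self.text += item                 # appended for subsequent processing


if __name__ == "__main__":
    cube = Rotatable3DObject(faces=[Face(list("ABCDEFGHI")),
                                    Face(list("JKLMNOPQR"))])
    buffer = EntryBuffer()
    buffer.add(cube.visible_items()[0])   # user selects "A"
    cube.rotate()                         # user spins the object
    buffer.add(cube.visible_items()[0])   # user selects "J"
    print(buffer.text)                    # -> "AJ"
```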
Said three dimensional object may be a sub-object of a parent three dimensional object, the parent three dimensional object being rotatable in response to user input and, further in response to user input, capable of decomposing or exploding to thereby present the sub-object for rotation and selection of a character or other data item. This step of decomposition may include separation of the sub-object away from the parent object, and/or enlargement of the sub-object into the foreground of the three dimensional environment in which the three dimensional objects are displayed. In the physical world when we want to inspect an object more closely we bring that object from the background into our visual foreground and bring it closer for inspection. This capability of movement from background to foreground and rotation to present different facets requires a three dimensional landscape and this is what this present method provides for data input. New possibilities for data input are opened up by presenting the user with a three dimensional structure of whatever desired shape; for ease of exposition we focus on the cube shape in the description that follows.
Other three dimensional shapes and surfaces are equally appropriate.
The new possibilities described here are in relation to a rotating interactive data entry structure that decomposes into elements that can be re-composed, with different levels of more inclusive and less inclusive objects that themselves can be rotated and interacted with for content selection and data input.
In one embodiment, preferably the faces of the parent object are populated with characters, graphical icons or other data items for selection by a user and, on selection of a first character, graphical icon or other data item on a face of the parent object, the sub-object that is presented is populated with characters, graphical icons or other data items the identity of which are dependent on the identity of said first character, graphical icon or other data item.
In this embodiment, preferably the visual user interface is configured for character or graphical icon entry in one or more human alphabets or languages, and the characters or graphical icons with which the sub-object is populated are only those characters or graphical icons which, in a given human alphabet or language, may linguistically follow or combine with said first character or graphical icon. For example, in the English language using the Roman alphabet, if the first character selected is a "q", then the sub-object may be populated only with a letter "u", since (in most cases) only the letter "u" may follow a "q" when forming a word. Further predictive techniques may be employed to restrict the choice of characters or graphical icons available to the user after a first character has been selected. Such techniques are known to those skilled in the art of mobile telephones having text messaging capabilities, and to those skilled in the art of computer programming for linguistic applications.
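As an illustration of this restriction (an assumption about one possible realisation, not code from the patent), the characters offered on the sub-object can be drawn from a follow-map keyed by the first selection; the map below is a hypothetical fragment for English:

```python
from typing import Dict, List

# Hypothetical follow-map for a fragment of English; a real implementation would
# derive this from a dictionary or a predictive-text engine.
FOLLOW_MAP: Dict[str, List[str]] = {
    "q": ["u"],
    "t": ["a", "e", "h", "i", "o", "r", "u", "w", "y"],
}

ALL_LETTERS: List[str] = [chr(c) for c in range(ord("a"), ord("z") + 1)]


def candidates_after(first: str) -> List[str]:
    """Characters used to populate the sub-object once `first` has been selected."""
    return FOLLOW_MAP.get(first, ALL_LETTERS)


print(candidates_after("q"))        # ['u'] -> the sub-object offers only "u"
print(len(candidates_after("x")))   # 26   -> no restriction known, offer all
```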
The one or more human alphabets may be selected from a group comprising Japanese, Chinese, Korean, Roman, Cyrillic, Greek, Armenian, Etruscan, Georgian, International Phonetic Alphabet, Hebrew, Arabic, Bulgarian, Cherokee, Croatian, Russian, Czech, Estonian, German, Hawaiian, Hungarian, Icelandic, Indian, Khmer/Cambodian, Kurdish, Lao, Latvian, Lithuanian, Macedonian, Manchu, Mongolian, Polish, Singaporese, Slavonic, Slovak, Slovene, Somali, Tibetan, Thai, Turkish and Urdu. This is not an exhaustive list, and the present disclosure is applicable to other alphabets and languages.
In particularly preferred embodiments, the human alphabet is Japanese or Chinese, and the characters or graphical icons are Japanese kana or kanji, or Chinese kanji. In Japanese, characters representing different syllables may be combined to form compound characters or words, and accordingly in one such embodiment the kana with which the sub-object is populated may be only those kana which, in Japanese, may linguistically follow or combine with a kana selected on the parent object.
One or more faces of the object may be populated with phrases or icons for selection by the user. The provision of common pre-prepared phrases is of particular benefit when composing SMS text messages, since it saves the user having to compose the message letter by letter or word by word.
The processing of the data stream or string may comprise transmitting it as a text message, or supplying it to a word processor or other software application, or supplying it as a response to a query made by a software application. Again, this is not an exhaustive list, and other processing or transmission operations may be performed on the data stream or string, as will be apparent to those skilled in the art.
According to a second aspect of the invention there is provided an electronic device configured to implement a method in accordance with the first aspect of the invention.
According to a third aspect of the invention there is provided a computer program executable to cause processing means to implement a method in accordance with the first aspect of the invention.
According to a fourth aspect of the invention there is provided a computer program stored on a data carrier, said computer program being executable to cause processing means to implement a method in accordance with the first aspect of the invention.
Brief Description of the Drawings
Embodiments of the invention will now be described, by way of example, and with reference to the drawings in which: Figure 1a shows an example of a visual user interface where the parent object is in the form of a three-dimensional rotatable 3x3x3 cube, here showing one face of this cube viewed straight on - the cube components (the sub-objects or elements that "explode" or "implode") may be treated in the same way as physical objects occupying a defined space; Figure 1b shows another face of the 3x3x3 cube of Figure 1a, the face shown in Figure 1b being adjacent to the face shown in Figure 1a; Figure 2a shows another face of the 3x3x3 cube of Figure 1a, the face shown in Figure 2a being adjacent to the face shown in Figure 1b; Figures 2b and 3a show a straight-on view of the effect of selecting a component of the overall structure and bringing it into the foreground for further selection and interaction; Figure 3b shows an angled view of the three dimensional rotatable interface of Figure 3a, having a background and a foreground and the cube object in the foreground; Figures 4a and 4b show straight-on and angled views of how different data entry selections can appear on different facets of an object, here in the foreground; Figures 5a and 5b show examples of a mixture of different alphabets on the different facets of the 'parent' cube (in this particular example the Roman letters could, for example, be used as access to an address book application or contacts list, while the Japanese facets that are shown could be used for entry of a Japanese language message), the cube having been rotated such that it is seen here in an angled view; Figures 6a and 6b show the highlighting of particular characters that have been selected from the available options, for entry into a data display window; Figures 7a and 7b show how the same interface can be used not just for single data elements but for collections of elements, in this case common phrases used in communications, in both English and Japanese; Figures 8a and 8b illustrate examples of how the user interface presents a common look and feel across devices, and common interaction and data entry methods; and
Figures 9a and 9b illustrate examples of a soft key toolbar, here in two dimensions although it could equally well be a three dimensional toolbar. These soft keys replace or supplement the hard coded keys that many intelligent devices have.
These figures are illustrative of the power of the method and show how the realisation is not restricted to a particular software framework or method, but can be realised in a number of ways dependent on the skills and experience of the person skilled in the art who wants to apply the method. Further clarification and insight into the method may be obtained from the examples of a range of possible applications provided herein, although the list is not intended to be exhaustive and one skilled in the art will have no difficulty in thinking of other relevant applications.
Detailed Description of Preferred Embodiments
The present embodiments represent the best ways known to the applicant of putting the invention into practice. However they are not the only ways in which this can be achieved.
One skilled in the art will be able to define many variations and modifications of the invention, and this disclosure is intended to encompass both the principal method and variations and modifications thereof. There are also many computer languages and methods of hardware and software combinations that may be used to realise the innovations detailed below. Application of the present embodiments is not restricted to any one or more computer languages but applies across all software and computer languages and combinations, and all hardware and software combinations, to realise the same or similar ends.
Nor does our description refer only to one particular shape, although in the examples we use the shape of a cube for interaction and explosion and implosion for convenience. Rather the invention and the steps of realisation can be used in relation to any regular or irregular shape or surface, and our method of invention is not restricted to any particular shape or group of shapes or surfaces.
The present embodiments involve processing steps that may form part of a computer program or a set of instruction code, that may be executed on a mobile phone, computer or other processing device. The computer program or instruction code may be supplied on a data carrier such as a CD-ROM, floppy diskette, USB memory stick or flash memory card, or may be downloadable as a digital signal over a network such as the Internet. Alternatively a processor arranged to execute the processing steps may be hard coded to implement the program, or the program may be provided in firmware.
The current method establishes a landscape within the interaction device, whereby objects can be brought into foreground and exploded or imploded and rotated and examined.
Information objects are represented as features within the landscape and they are capable of rotation either in background or in foreground, or in any intermediate position in the virtual space between the extremes. Such objects can not just be rotated; they can, if required, be decomposed into constituent parts. So a single "Parent" object can be decomposed into a number of constituent parts, or "Children", and these Children can then be re-combined into the same or different Parent Objects and so on.
This ability to associate Children Objects with Parent Objects makes it possible to create a multi-dimensional display and interaction space. In addition, although the overall information structure remains the same, it is possible to change the textures on the objects in the landscape, so transforming the features and functions of the input device. In essence this changes the "personality" of the data entry structure, so for example one set of textures may create a space for interacting in one language. Another set of textures could create a space for interacting in another language. And a third set of textures could create a space for interaction in multiple languages. Each texture change gives rise to a new "personality", although the navigation controls remain the same across these personality changes, ensuring the user is provided with a consistent user interface across different domains of data input activity. The activity sequence will give rise to further exploded shapes or, when the object is at the lowest level in any hierarchy, the selection action will give rise to the corresponding element in the data display window.
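A brief sketch of this "personality" idea follows, under assumed names and example data: the navigation state and controls are untouched, while swapping the texture set re-labels the faces for a different language or data entry domain.

```python
from typing import Dict, List

# Each "personality" maps face positions to the items painted onto those faces.
PERSONALITIES: Dict[str, List[List[str]]] = {
    "letters": [list("ABCDEFGHI"), list("JKLMNOPQR"), list("STUVWXYZ.")],
    "digits":  [list("123456789"), list("0.,+-*/()"), list("@$%&=<>?!")],
}


class VirtualKeyboard:
    def __init__(self, personality: str) -> None:
        self.faces = PERSONALITIES[personality]
        self.front = 0                        # navigation state

    def set_personality(self, personality: str) -> None:
        # Only the textures (labels) change; the rotation controls stay the same.
        self.faces = PERSONALITIES[personality]

    def rotate(self) -> None:
        self.front = (self.front + 1) % len(self.faces)

    def visible(self) -> List[str]:
        return self.faces[self.front]


kb = VirtualKeyboard("letters")
print(kb.visible())            # letter entry
kb.set_personality("digits")
print(kb.visible())            # same controls, numeric/symbol entry
```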
Realisation of a multi-dimensional visual user interface and associated data entry method
The realisation of a multi-dimensional, multi-level data entry structure with multiple personalities for data entry into intelligent devices consists of the following steps and such variations and modifications as will be apparent to someone skilled in the art.
Step 1: Create within the device software, computer hardware and networks a visual display with multi-dimensional landscapes within which objects are placed in background and foreground, such objects based on any three dimensional polyhedron, or a ball, sphere or other three dimensional shape.
Step 2: Iteratively apply the process in Step 1 to create a collection of shapes organised within a landscape which the user can navigate and interact with in defined ways.
Step 3: Organise said shapes created in Steps 1 and 2 into multi-level clusters with higher-level clusters enclosing lower-level clusters, thereby establishing Parent Objects and Child Objects with as many levels as are required. Such objects, or parts thereof, can have functions and actions associated with them so when an object is selected and activated the corresponding data entry action is initiated.
Step 4: Create a texture to go on all sides, including if need be the inside, of the shapes so the structures acquire a solid, semi-solid or transparent form which can, if required, change when selected and activated. Different textures can be created although for usability only one texture is likely to be visible on an underlying structure at any one time. Changing the textures changes the underlying data entry functionality with which particular areas of textures are associated.
Step 5: The creation of the texture includes the information or options for data entry on or within the textures, and the activation of the area encompassing the data element, howsoever this is achieved, causes the corresponding character or data element or information to be displayed in an appropriate display area.
Step 6: The three dimensional landscape requires navigation controls and such controls give users the ability to move around the landscape and to rotate, extract, replace, include or select any particular shapes or clusters and allow one or more of the shapes to be rotated around any axis up to 360 degrees, together or independently. These navigational controls can be physically implemented on the device or they can be virtual soft-key controls or a combination of the two.
Step 7: The underlying software routines calculate the texture modifications to provide visual integrity of the object as it is rotated, thereby providing an apparently rotating three-dimensional object within the flat physical characteristics of a display screen, VDU, or other display or interaction device. Such three-dimensional capability allows objects to be brought into the foreground, placed into the background, or positioned in the middle ground as required. When areas on the shapes at the lowest level are selected, this gives rise to the appropriate character, graphic, word, phrase or object being placed in the associated data display window.
Step 8: If the device supports connectivity, the data that is created with the above steps can be delivered to a third party for communications and information exchange.
Step 9: Having created a structure, given it organisation and texture, provided further information on or in the structure, and placed its rotation and actions under some form of automated or user control, the user can input data, create messages and provide information to themselves or others.
The above is an indicative sequence to create the novel information structure, although the novel method described here covers other steps and sequences that someone skilled in the art would have no difficulty in specifying. The method is applicable to all language types whatever their underlying structure and organisation, so it is relevant to European, Middle Eastern and Asian languages amongst others.
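To make Steps 6 and 7 more concrete, the sketch below (an assumption about one possible realisation, not the patent's own code) models the navigation controls as 90-degree rotations that permute which labelled face of a cube is presented to the user; the face labels are illustrative only.

```python
from typing import Dict

Cube = Dict[str, str]   # face name -> label of the content shown on that face


def yaw_left(c: Cube) -> Cube:
    """Rotate the cube 90 degrees to the left about the vertical axis."""
    return {**c, "front": c["right"], "right": c["back"],
            "back": c["left"], "left": c["front"]}


def pitch_up(c: Cube) -> Cube:
    """Rotate the cube 90 degrees upwards about the horizontal axis."""
    return {**c, "front": c["down"], "down": c["back"],
            "back": c["up"], "up": c["front"]}


cube: Cube = {"front": "kana set 1", "right": "kana set 2", "back": "Roman",
              "left": "phrases", "up": "digits", "down": "symbols"}

cube = yaw_left(cube)     # one 90-degree turn reveals the adjacent face
print(cube["front"])      # -> "kana set 2"
cube = yaw_left(cube)     # a further turn reveals the next face
print(cube["front"])      # -> "Roman"
```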
Examples of the use of the three dimensional visual user interface
Use for entry of Japanese kana and/or Roman alphabet characters
There are two types of script in Japanese: Kanji (graphical characters) and Kana (syllabary). Kanji reflects the meaning of a word, whilst Kana represents the pronunciation.
Kana is written as a combination of lines, symbols and other characters.
In a first embodiment of the present invention, as shown in Figure 1a, a 3x3x3 cube 11, which effectively serves as a virtual keyboard, is displayed in a window 10 in the visual display of an electronic device (e.g. a mobile phone or PDA). The cube 11 may be manipulated and rotated by a user, in a preferred embodiment by a dragging action using a cursor control device or (particularly in the case of mobile devices and PDAs) by using a touch and drag action on a touch-sensitive display.
The general operation of this visual user interface is that the user selects characters, phrases or other data items from the three dimensional cube 11. The selected characters are then added to a string which is displayed in region 14 of the window 10 (see Figure 6b for an illustration of this region 14 containing such a string 15). This string may then be transmitted to another mobile device as a text message, or may be supplied to another software application such as a word processor. Alternatively the selected characters may be transmitted as a data stream to a software application or to another device, which may take place over a network.
The 3x3x3 cube 11 is composed of a number of sub-cubes to form a 3x3x3 array. Each of the sub-cubes may be decomposed into further sub-cubes.
'Command buttons' 16 are also provided in the window 10, to enable the user to indicate that the text string has been completed, or to delete or edit characters that have been entered.
Further command buttons for other functions may be provided, as will be apparent to those skilled in the art.
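A minimal sketch of the display region 14 and command buttons 16 follows; the names, and the callback used to hand the completed string on (for example to an SMS send routine or a word processor), are assumptions for illustration rather than the patent's own implementation.

```python
from typing import Callable, List


class EntryWindow:
    """Display region 14 plus command buttons 16, in simplified form."""

    def __init__(self, on_complete: Callable[[str], None]) -> None:
        self.display: List[str] = []      # the string being built up
        self.on_complete = on_complete    # e.g. SMS send or a word processor

    def select(self, item: str) -> None:  # a selection from a face of the object
        self.display.append(item)

    def delete_last(self) -> None:        # a "delete/edit" command button
        if self.display:
            self.display.pop()

    def send(self) -> None:               # the "string completed" command button
        self.on_complete("".join(self.display))
        self.display.clear()


window = EntryWindow(on_complete=lambda text: print("dispatching:", text))
window.select("h")
window.select("j")
window.delete_last()                      # correct a mistaken selection
window.select("i")
window.send()                             # -> dispatching: hi
```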
As shown in Figure 1a, on face 12 of the cube 11, nine Japanese characters are provided on the sub-cubes for selection by the user. Figure 1b shows the same cube rotated through 90° to the left, to reveal a second face 13 which is adjacent to face 12. In this example, the interrelation between the faces 12 and 13 is such that the three characters on the right of the face 12 are the three characters on the left of the face 13.
As shown in Figure 1b, some of the sub-cubes may be labelled to refer to phrases or expressions instead of individual characters. For example, sub-cube 17 is labelled "Greetings" and sub-cube 18 is labelled "Occasions". The use of such phrases is described in more detail below. Figure 1b also illustrates that a mixture of alphabets can be handled using this interface, with the sub-cubes on the right of Figure 1b being in the Roman alphabet, whilst the sub-cubes on the left are Japanese.
Figure 2a shows the same cube rotated through a further 90° to the left, to reveal a third face 19 which is adjacent to face 13. In this example, the interrelation between the faces 13 and 19 is such that the characters on the right of the face 13 are the characters on the left of the face 19. It can be seen that face 13 is predominantly populated with characters from the Roman alphabet.
With Japanese kana, once a first character has been chosen, only certain other characters may linguistically follow or combine with that first character. The present user interface may be configured such that, once the user has selected a first character, only those characters which may linguistically follow or combine with the first character are presented for selection by the user. This may be implemented as shown in Figure 2b. Here, character 12a has been selected from face 12 of the cube 11 (as shown in Figure 1a). In response to the selection of character 12a from cube 11, the surface of the cube is adapted to display cube 25, the face of which shows the first selected character 12a in the centre, and presents for selection four further characters (20, 21, 22 and 23) which linguistically may follow character 12a. Cube 25 may be regarded as a sub-cube or a sub-object of 'parent' cube 11, and the transition between cube 11 and sub-cube 25 may be displayed as sub-cube 25 exploding from cube 11. The user may select one of these four characters (20, 21, 22 or 23), or may rotate the cube 25 to reveal other characters for selection (as shown in Figure 3b).
Alternatively, the user may revert to the initial 'parent' cube 11, in which case the sub-cube 25 may implode back into the 'parent' cube 11.
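The explode/implode behaviour can be pictured as a foreground stack, as in the sketch below (assumed names and illustrative kana, not the patent's own code): selecting a character on the parent cube 11 pushes an exploded sub-cube into the foreground, and reverting pops it so the parent implodes back into view.

```python
from typing import List


class CubeView:
    def __init__(self, label: str, items: List[str]) -> None:
        self.label = label
        self.items = items                # items offered on the visible face


class Foreground:
    """Tracks which object is currently exploded into the foreground."""

    def __init__(self, parent: CubeView) -> None:
        self.stack: List[CubeView] = [parent]

    def explode(self, child: CubeView) -> None:
        self.stack.append(child)          # child comes forward for selection

    def implode(self) -> None:
        if len(self.stack) > 1:
            self.stack.pop()              # child folds back into its parent

    def current(self) -> CubeView:
        return self.stack[-1]


parent = CubeView("cube 11", ["ka", "ki", "ku", "ke", "ko"])
fg = Foreground(parent)
fg.explode(CubeView("sub-cube 25", ["kya", "kyu", "kyo"]))
print(fg.current().label)                 # -> sub-cube 25
fg.implode()
print(fg.current().label)                 # -> cube 11
```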
Figures 4a and 4b illustrate the rotation of a sub-cube populated using Roman alphabet characters.
Figure 5a illustrates the rotation of a parent cube populated using Japanese characters, and Figure 5b illustrates the rotation of a parent cube populated using a mixture of Japanese and Roman characters.
Once a character has been selected by a user (e.g. by touching on it), the character may be highlighted to signify that the character has been selected. This is illustrated in Figures 6a and 6b, which show characters 28 and 29 highlighted. Figure 6b also shows a string 15 of Japanese characters which have been entered, displayed in region 14 of the window.
As mentioned above, one or more sub-cubes may be used to offer the user pre-prepared phrases or expressions for selection. As shown in Figure 7b, a sub-cube 30 (for "Greetings") may be exploded from face 13 of parent cube 11, once the user selects region 17. The "Greetings" face of sub-cube 30 offers a number of pre-prepared greetings phrases (in this example, in Japanese) for selection by the user.
Figures 7a and 8b illustrate further sub-cube faces, providing a series of pre-prepared phrases, here relating to problems, in English and Japanese. Figures 9a and 9b provide further examples of sub-cube faces, containing phrases relating to arrangements, in English and Japanese. Figure 8a shows how a sub-cube of pre-prepared phrases and expressions may be rotated in three dimensions, to enable the user to select a desired face from which to select a phrase for entry.
Further Examples
The novel method described here can be applied in many different situations for many different applications for data entry into smart devices, limited only by the individual's imagination given a knowledge of the prior art and an understanding of the novel methods presented here. For the purpose of clarity a selection of example applications is provided to show some of the scope and depth of this novel method, as these apply to mobile phones, Personal Digital Assistants (PDAs), Home and other Control Systems and other similar devices.
* Message Creation The method can be used to provide a three dimensional interactive experience for data entry using virtual keyboards for users of mobile phones, personal digital assistants, palm tops and similar devices.
* Service Interaction The same three dimensional user experience can be provided to users of mobile networked devices for interaction with services across fixed and wireless networks.
* Data Entry The three dimensional method can be used for effective data entry.
* Response Creation The novel method can be used for generation of responses to interactive content services.
* Mobile Database Interaction The method can be applied for mobile devices when selecting and interacting with information from databases of different sorts.
* Personal, Corporate, Enterprise or Organisational Mobile information stores and repositories and electronic catalogues The method applies to all forms of three dimensional mobile information display and interaction of the type described, for any individual, group or organisation, for both creating and selecting information from such sources.
* Mobile Catalogue Interaction Likewise the method can be used with mobile devices to create messages and enter data of a variety of types including but not limited to house details for estate agents, job details for recruitment firms and products for product sales organisations and similar applications.
* Mobile Payment Systems The method extends easily to the display of pricing, costing, or payment information on a mobile phone or similar display device, such payment information displayed as part of the texture information in background or on Parent and Child Objects and selected in the same way as for the data entry requirements described above.
* Mobile Security Systems The method has applicability to interactive security systems where remote access via a mobile phone or similar device is enabled as part of the solution with the three dimensional interface to give proactive responses when challenged in person or across a network. The novel application could form part of a Personal Identity System replacing the need for physical identity cards with a powerful and connected virtual identity card with easy input of data into the security systems.
* Mobile Network connected Information Kiosk Activities The method applies equally well to information kiosk activities of a variety of types and purposes where there is a requirement to input data in an intuitive and effective manner other than standard keyboard entry.
* Other intelligent device display and interaction with Small footprint display screens The novel method can be used for a variety of types of intelligent devices including in-car entertainment and information systems, home entertainment, information and security systems, and educational solutions.
* Mobile Electronic Programme Guides The method can be used for interaction with three dimensional electronic programme guides and associated material, including ordering and paying for additional optional items of information, communication, entertainment and transaction activity with simple virtual keyboard data entry.
* Mobile Software applications of various types including but not limited to personal and group productivity tools e.g. word processor, spreadsheets, presentation graphics and other software.
The novel method has applications to the interaction on devices with a variety of software programmes, and the content contained therein, with the three dimensional features herein described if need be associated with their traditional two dimensional forms and any multi- dimensional variations thereof to create a multi-dimensional virtual mobile office.
* Geographic Information and Location Systems and the delivery of local or national directory information organised in three dimensional forms as described The three dimensional interface can be used to select information as required by the user, including information relating to a locality based on the ability to select geographic location and identify services within the locality with interaction based on the three dimensional virtual keyboard.
* Mobile Personal or Group tools such as Calendar, Diary, To Do list, Task Lists, Schedulers, Planners and similar Being able to enter information in three dimensions and link the different dimensions and facets to each other opens up new opportunities for products and services and software for applications like a three dimensional time and task manager using the novel methods described here, separate from or within a more inclusive two or three dimensional Portal.
* Software packages and extensions for Mobile Interaction including business planning and business information, Executive Information Systems and Solutions, Accounting Packages, Information Packages, Presentation and Planning Packages including Project and Programme Management and so on The features and functionality of the novel methods described here are ideal for a wide variety of mobile phone interaction for personal and business solutions for information selection, organisation and interaction, amongst other things.
* Mobile Gaming applications for individuals or groups, as stand-alone applications or for playing across networks of various types Many current games are already three dimensional, but while the game itself may be three dimensional the data entry is typically still two dimensional. The novel methods described here can be applied to deliver 3D data entry interfaces for gaming, or even novel 3D games themselves, for the data entry requirements of such applications.
* Music and Video information selection, playing and payment on the Move using Mobile delivery or streaming capabilities The novel methods are ideal for providing selection based on data input of audio, video and other forms of multi-media information.
* Public Information Kiosks and Displays Linked to Mobile Devices and Networks Whenever there is a need to input simple or complex information the novel methods described here can present the information and interaction in different and novel ways quite distinct from the prior art.
* Shop Displays linked to Mobile Devices and Networks Likewise in-shop, in-store or in-location displays can be used to present options and data entry in three-dimensional interactive formats.
* Central and Local Government information displays and Enterprise Information, Charity Information and other similar information interaction applications organised and presented for information entry from Mobile Phone devices and other intelligent devices.
There are major efforts to put Government information, both Central and Local, on-line, and the novel methods described here allow this to happen for information entry from Mobile Phones and similar devices.
* Personal Information records including historical records, financial information, medical records, transaction information and logs and such like Information is sometimes difficult to locate, and the data entry for subsequent search and/or selection provided by this novel method allows the solution to be used in a variety of applications within the Mobile Phone or similar device.
* Entry of information to legacy databases, individually or in combination with other information, one-way or two-way, using if required the ability to display different information on different faces Much data and information is locked into legacy databases on a variety of computer and software systems. Using extraction techniques and the novel presentation method it is possible to extract information from multiple databases.
* Tourist information displays and information for regional or local portals for Mobile Devices Information on a region, its attractions, skills, resources and business activities is often useful to business and consumer visitors to electronic information sources and corresponding physical or virtual locations and the novel method of information entry makes it easy for the non-expert user to find the information they are looking for.
* Indirect applications in the products and services of others, for example graphics designers, webmasters, hardware engineers, manufacturers and such like designing solutions for data entry Software and novel methods of the sort described can be applied directly, or they can be applied by third parties or others removed from the origination of the method, and such indirect use is a possible application of the novel methods described for third party development of solutions requiring data entry.
* Device manufacturers with hardware or hardware and software and display facsimiles of the novel methods described here in, for example, data entry for Internet Protocol phones, multi-dimensional displays, electronic time managers, in-home entertainment and security systems and such like The novel methods can extend to new device hardware solutions embodying the principles of information and data capture and the display and interaction with the data thus generated as described in the method.
* Information Interaction for an Application Service Provider, Systems Integrators or similar Service Providers, developers of Applications and Business Process solutions, and information and entertainment providers can use the novel methods described here to capture data and information.
* Educational material delivered to Mobile Devices The novel methods can be used for the selection of education material for schools and institutes of higher education, and for ongoing and lifetime training for individuals, or used by institutions and organisations for the ongoing training and development of employees and customers.
* Embodiment of the Virtual Keyboard ideas identified here into new forms of physical entry keyboards of various designs using the novel three dimensional keyboard methods described here.
For entry of data into traditional computers using keyboards of various sorts with the capability to interact with the virtual keyboards described herein.
Summary
The present disclosure relates to a method to realise software programmes, hardware devices and networks to create three dimensional virtual keyboard data entry to input data into intelligent devices. The use of a three dimensional keyboard opens up direct spin and single key depression interaction by providing options on different facets of the three dimensional objects. This reduces or eliminates multi-depression of a single key to select the desired input element. By enabling foreground display with rotation of components of the overall data entry system it is possible to provide spin and click and single click entry of data or selection of content from displayed choices.
The present disclosure provides a method for the creation, capture, modification, updating and changing of data using a three dimensional virtual keyboard that delivers objects in a landscape with background and foreground, said objects capable of rotation and movement within the three dimensional space, with form and function associated with areas of the objects, said objects capable of decomposition or integration into Parent and Child Objects, with the ability by changing the texture or look and feel of the object to change the function of said object thereby providing an ability to impart multiple data entry types onto a single device with consistent navigation and interaction for the user across the different data entry types, so establishing a ubiquitous user interface for data entry, capture, modification and updating across different applications within intelligent devices, to enable users to enter data and content into free form or defined applications, stand-alone or connected by one or more networks.
The present disclosure further provides a three-dimensional human-device-content interface for data capture, capable of display and interaction of multiple data entry types when created by the method described above.

Claims (15)

1. A method for enabling data to be entered on an electronic device, the method comprising: providing a visual user interface in which a three dimensional object is displayed, said object being rotatable in response to user input, wherein faces of the object are populated with characters, graphical icons or other data items for selection by a user; receiving input from the user selecting a character, graphical icon or other data item on a face of the object; and adding the selected character, graphical icon or data item to a data stream or string for subsequent processing.
2. A method as claimed in Claim 1, wherein said three dimensional object is a sub- object of a parent three dimensional object, the parent three dimensional object being rotatable in response to user input and, further in response to user input, capable of decomposing or exploding to thereby present the sub-object for rotation and selection of a character or other data item.
3. A method as claimed in Claim 2, wherein faces of the parent object are populated with characters, graphical icons or other data items for selection by a user, and wherein, on selection of a first character, graphical icon or other data item on a face of the parent object, the sub-object that is presented is populated with characters, graphical icons or other data items the identity of which are dependent on the identity of said first character, graphical icon or other data item.
4. A method as claimed in Claim 3, wherein the visual user interface is configured for character or graphical icon entry in one or more human alphabets or languages, and wherein the characters or graphical icons with which the sub-object is populated are only those characters or graphical icons which, in a given human alphabet or language, may linguistically follow or combine with said first character or graphical icon.
5. A method as claimed in Claim 4, wherein the one or more human alphabets are selected from a group comprising Japanese, Chinese, Korean, Roman, Cyrillic, Greek, Armenian, Etruscan, Georgian, International Phonetic Alphabet, Hebrew, Arabic, Bulgarian, Cherokee, Croatian, Russian, Czech, Estonian, German, Hawaiian, Hungarian, Icelandic, Indian, Khmer/Cambodian, Kurdish, Lao, Latvian, Lithuanian, Macedonian, Manchu, Mongolian, Polish, Singaporese, Slavonic, Slovak, Slovene, Somali, Tibetan, Thai, Turkish and Urdu.
6. A method as claimed in Claim 4, wherein the human alphabet is Japanese or Chinese, and the characters or graphical icons are Japanese kana or kanji, or Chinese kanji.
7. A method as claimed in any preceding claim, wherein one or more faces of the object are populated with phrases or icons for selection by the user.
8. A method as claimed in any preceding claim, wherein the processing of said data stream or string comprises transmitting it as a text message.
9. A method as claimed in any of Claims 1 to 7, wherein the processing of said data stream or string comprises supplying it to a word processor or other software application.
10. A method as claimed in any of Claims 1 to 7, wherein the processing of said data stream or string comprises supplying it as a response to a query made by a software application.
11. An electronic device configured to implement a method as claimed in any preceding claim.
12. A computer program executable to cause processing means to implement a method as claimed in any of Claims 1 to 10.
13. A computer program stored on a data carrier, said computer program being executable to cause processing means to implement a method as claimed in any of Claims 1 to 10.
14. A method substantially as herein described with reference to and as illustrated in any combination of the accompanying drawings.
15. A visual user interface substantially as herein described with reference to and as illustrated in any combination of the accompanying drawings.
GB0508841A 2005-04-29 2005-04-29 Data entry using a three dimensional visual user interface Withdrawn GB2425700A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0508841A GB2425700A (en) 2005-04-29 2005-04-29 Data entry using a three dimensional visual user interface

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0508841A GB2425700A (en) 2005-04-29 2005-04-29 Data entry using a three dimensional visual user interface

Publications (2)

Publication Number Publication Date
GB0508841D0 GB0508841D0 (en) 2005-06-08
GB2425700A true GB2425700A (en) 2006-11-01

Family

ID=34674142

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0508841A Withdrawn GB2425700A (en) 2005-04-29 2005-04-29 Data entry using a three dimensional visual user interface

Country Status (1)

Country Link
GB (1) GB2425700A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2626209A1 (en) * 1988-01-26 1989-07-28 Aerospatiale Machine for cutting material in strip form using high-pressure fluid jets
US5515486A (en) * 1994-12-16 1996-05-07 International Business Machines Corporation Method, apparatus and memory for directing a computer system to display a multi-axis rotatable, polyhedral-shape panel container having front panels for displaying objects
WO2000004440A1 (en) * 1998-07-13 2000-01-27 Koninklijke Philips Electronics N.V. Virtual 3d object control
EP1052566A1 (en) * 1999-05-14 2000-11-15 Alcatel Graphical user interface
US20030156146A1 (en) * 2002-02-20 2003-08-21 Riku Suomela Graphical user interface for a mobile device

Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1998245A3 (en) * 2007-05-14 2009-01-14 Samsung Electronics Co., Ltd. Method and apparatus for inputting characters in mobile communication terminal
US9176659B2 (en) 2007-05-14 2015-11-03 Samsung Electronics Co., Ltd. Method and apparatus for inputting characters in a mobile communication terminal
US20090109184A1 (en) * 2007-10-24 2009-04-30 Lg Electronics Inc. Mobile terminal and controlling method thereof
EP2056215A1 (en) * 2007-10-24 2009-05-06 LG Electronics Inc. Mobile terminal and controlling method thereof
US8217907B2 (en) 2007-10-24 2012-07-10 Lg Electronics Inc. Mobile terminal and controlling method thereof
EP2079010A2 (en) 2007-12-28 2009-07-15 HTC Corporation Handheld electronic device and operation method thereof
EP2079010A3 (en) * 2007-12-28 2009-11-18 HTC Corporation Handheld electronic device and operation method thereof
US8514186B2 (en) 2007-12-28 2013-08-20 Htc Corporation Handheld electronic device and operation method thereof
WO2009149925A1 (en) * 2008-06-12 2009-12-17 Alcatel Lucent Method and system for switching between video sources
EP2134083A1 (en) * 2008-06-12 2009-12-16 Alcatel, Lucent Method and system for switching between video sources
JP2011525732A (en) * 2008-06-12 2011-09-22 アルカテル−ルーセント Method and system for switching between video sources
WO2010018126A1 (en) * 2008-08-13 2010-02-18 Deutsche Telekom Ag Mobile telephone with menu guidance on the screen
DE102008038897A1 (en) * 2008-08-13 2010-02-18 Deutsche Telekom Ag Mobile phone with menu navigation on the screen
US8234219B2 (en) 2008-09-09 2012-07-31 Applied Systems, Inc. Method, system and apparatus for secure data editing
WO2010128991A1 (en) * 2009-05-05 2010-11-11 Sony Ericsson Mobile Communications Ab User input for hand-held device
US20100287505A1 (en) * 2009-05-05 2010-11-11 Sony Ericsson Mobile Communications Ab User Input for Hand-Held Device
KR101570695B1 (en) * 2009-05-29 2015-11-27 엘지전자 주식회사 Image Display Device and Control Method for the Same
WO2011085553A1 (en) 2010-01-15 2011-07-21 Nokia Corporation Virtual keyboard
EP2524283A4 (en) * 2010-01-15 2016-03-09 Nokia Technologies Oy Virtual keyboard
EP3306454A1 (en) * 2010-05-25 2018-04-11 Sony Mobile Communications, Inc A user interface for a touch sensitive display on an electronic device
CN103069376B (en) * 2010-05-25 2016-11-09 索尼移动通信株式会社 The user interface of the touch-sensitive display on electronic equipment
CN103069376A (en) * 2010-05-25 2013-04-24 索尼移动通信公司 A user interface for a touch sensitive display on an electronic device
WO2011148210A1 (en) * 2010-05-25 2011-12-01 Sony Ericsson Mobile Communications Ab A user interface for a touch sensitive display on an electronic device
US8656294B2 (en) 2010-05-25 2014-02-18 Sony Corporation User interface for a touch sensitive display on an electronic device
WO2011153848A1 (en) * 2010-06-09 2011-12-15 腾讯科技(深圳)有限公司 Method and system for applying three-dimensional (3d) switch panels in instant messaging tool
US8719732B2 (en) 2010-06-09 2014-05-06 Tencent Technology (Shenzhen) Company Limited Method and system for applying 3D switch panel in instant messaging tool
CN101924809A (en) * 2010-08-26 2010-12-22 北京播思软件技术有限公司 Touch screen-based intelligent three-dimensional dial and quick dialing method
CN102063245A (en) * 2010-09-03 2011-05-18 文军 Three-dimensional virtual keyboard
EP2557491A3 (en) * 2011-08-08 2016-03-02 Acer Incorporated Hand-held devices and methods of inputting data
US9069581B2 (en) 2011-08-19 2015-06-30 Giga-Byte Technology Co., Ltd. Method and system for parameter configuration
EP2560092A1 (en) * 2011-08-19 2013-02-20 Giga-Byte Technology Method and system for parameter configuration
CN102981719B (en) * 2011-08-19 2017-03-01 技嘉科技股份有限公司 parameter setting method and system
EP2821901A4 (en) * 2012-04-06 2015-06-24 Zte Corp Method and apparatus for processing keyboard input
EP2733593A3 (en) * 2012-11-14 2017-11-29 Samsung Electronics Co., Ltd Method and electronic device for providing virtual keyboard
GB2510443A (en) * 2013-02-01 2014-08-06 Appycube Ltd Accessing a database using a puzzle cube
EP2763070A1 (en) * 2013-02-01 2014-08-06 Sap Ag Graphical user interface (GUI) that receives directional input to change face for receiving passcode
US9304655B2 (en) 2013-02-01 2016-04-05 Sap Se Graphical user interface (GUI) that receives directional input to change face for receiving passcode
WO2015196703A1 (en) * 2014-06-23 2015-12-30 中兴通讯股份有限公司 Application icon display method and apparatus
CN105302407A (en) * 2014-06-23 2016-02-03 中兴通讯股份有限公司 Application icon display method and apparatus

Also Published As

Publication number Publication date
GB0508841D0 (en) 2005-06-08

Similar Documents

Publication Publication Date Title
GB2425700A (en) Data entry using a three dimensional visual user interface
Heer et al. Vizster: Visualizing online social networks
CN102999255B (en) Dynamic navigation bar used for expanded communication services
Fowler et al. Web application design handbook: Best practices for web-based software
US7631273B2 (en) Interactive inventor's menus within a software computer and video display system
US20060085763A1 (en) System and method for using an interface
US20100070910A1 (en) Data-Oriented User Interface for Mobile Device
Grant 101 UX principles: A definitive design guide
Kim et al. Menu design for computers and cell phones: Review and reappraisal
Fong ARPA does windows: the defense underpinning of the PC revolution
Björk et al. POWERVIEW Using information links and information views to navigate and visualize information on small displays
CN108292324A (en) The inline order of content creation
Trewin et al. Abstract representations as a basis for usable user interfaces
Tidwell A pattern language for human-computer interface design
US8930842B2 (en) Method for generating a search query
Lee et al. Menu-driven systems
Marcial A comparison of screen size and interaction technique: Examining execution times on the smartphone, tablet and traditional desktop computer
Nelson What’s on my mind
Robbins et al. TapGlance: designing a unified smartphone interface
Nielsen Classification of dialog techniques
GB2412818A (en) Three dimensional interactive menus for a mobile device
Tavakoli Cultural heritage and user interface design
Nascimento et al. Chapter Three. Digital Metaphors
Chawla et al. Unconventional Content Strategy and Hierarchical Taxonomy for Modern Web and Mobile App Design
Burns Beginning Windows 8 Application Development-XAML Edition

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)