GB2597055A - Dynamic context-specific input device and method - Google Patents

Dynamic context-specific input device and method

Info

Publication number
GB2597055A
Authority
GB
United Kingdom
Prior art keywords
behaviour
mapping
context
input device
behaviours
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2010169.7A
Other versions
GB202010169D0 (en)
Inventor
Joca Eriand
Coveva Tony
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Coveva Ltd
Original Assignee
Coveva Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Coveva Ltd filed Critical Coveva Ltd
Priority to GB2010169.7A priority Critical patent/GB2597055A/en
Publication of GB202010169D0 publication Critical patent/GB202010169D0/en
Publication of GB2597055A publication Critical patent/GB2597055A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0238Programmable keyboards
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0489Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof
    • G06F3/04895Guidance during keyboard input operation, e.g. prompting

Abstract

Manual input device 100, e.g. a virtual keyboard, switches inputs to remote processing devices dynamically and context-specifically. A processor of the manual input device 100: detects, from a remote processing device, a first context and generates a first behaviour mapping based on the first context; detects, from a remote processing device, a second context and generates a second behaviour mapping based on the second context. The first and second mappings comprise respective mappings of behaviours to corresponding grid locations, the behaviours having behaviour indicators 114, 130. Input device 100 comprises a touch-screen displaying the behaviour indicators according to the grid locations of the corresponding behaviours; the touchscreen display receives inputs from a user at physical locations on the touchscreen display, said inputs comprising interactions with indicators 114, 130. A first behaviour mapping can have indicators 114 providing convenient virtual keyboard shortcuts in the first context of an operating system, and immutable indicators 130 for operability crucial to the first context. It also comprises vacant grid locations 132 for customisable behaviours. Navigational behaviours 134 allow selection of mappings for a different context, e.g. photo editing or video editing software.

Description

DYNAMIC CONTEXT-SPECIFIC INPUT DEVICE AND METHOD

Field of the Invention

The present invention relates to input devices such as keyboards, and particularly to virtual input devices of this kind.
Background to the Invention
Professional software applications are widely used for both work and leisure. Examples include productivity, design and media, video editing, music production, word processing, and gaming applications, among many others. These software applications typically rely on a standard keyboard layout to provide quick- and easy-access shortcuts to specific commands and functions within the software. Such shortcuts can improve accessibility to the software and sustain interest and productivity.
Such shortcuts work well if the application is used regularly and the user has therefore invested enough time and effort to learn the specific key associations for the particular software. If the application is not used regularly, however, or if a user takes an extended break between uses, the user typically has difficulty recalling the specific key associations, which greatly affects interest and productivity. It also reduces the efficiency of key presses, with the user often cycling through incorrect presses to identify the intended command. The same issues arise when a user relies on shortcuts across many software applications, as is often the case for complex tasks requiring a variety of different software tools. This non-intuitive method of key association can result in long periods spent navigating the menus available in the software to identify the correct command.
Some existing solutions provide static layouts whose keys are more intuitively linked to specific processes, but these still have drawbacks: several such layouts are required to cover multiple applications, their functionality is limited and does not scale to suit the iterative nature of software development, and different users have different preferences as to which layout feels most intuitive. Static layouts therefore present much the same problems as a standard keyboard.
It is therefore desirable to provide a manual input device which is dynamic, flexible and intuitive, maximising user-effectiveness and uptake while minimising causes of reduced productivity and efficiency.
Summary of the Invention
The present application is directed to manual input devices having a plurality of inputs, each input being arranged to be mapped to a corresponding behaviour. Manual input devices of the present invention comprise two or more such behaviour mappings which are context-specific.
The manual input device is preferably arranged to dynamically switch between the two or more context-specific behaviour mappings, which may be driven by the detection of a change in context. In such embodiments the change may be from a first context to a second context, wherein the manual input device is arranged to switch from a first behaviour mapping associated with the first context to a second behaviour mapping associated with the second context, optionally following detection of the change by the manual input device. In preferable embodiments, the manual input device is arranged to accept actuation of one or more of the plurality of inputs from a user, and is further arranged to output a behaviour trigger to a remote processing device, the behaviour trigger being associated with the corresponding behaviour mapped to the one or more actuated inputs. In some preferable embodiments, the behaviour mappings of the manual input device may be at least in part customisable by a user.
Suitable elements of customisation may, for example, include the location of a mapped behaviour or corresponding input, or indeed any suitable characteristic of a specific mapped behaviour or corresponding input. The corresponding input may, for example, take the form of interaction with an interactive indicator associated with the respective behaviour. A behaviour may, in some embodiments, be mapped to more than one input, location or indicator. Some embodiments of the manual input device may permit dynamic switching, by a user, between a first context-specific behaviour mapping associated with a current context, and a second, standard or ubiquitous behaviour mapping having a predetermined and non-adjustable layout of behaviour indicators or inputs, wherein the second behaviour mapping is not associated with a context. Non-context-related switching between the context-specific behaviour mapping and the standard behaviour mapping may be performed by a user interacting with an associated behaviour-mapped input on the context-specific and standard behaviour mappings. In particular embodiments, the invention is directed to a virtual keyboard, but other examples of suitable manual input devices in accordance with the present invention will be envisaged.
In accordance with a first aspect of the present invention, there is provided a manual input device arranged to: detect a first context; generate, based on the first context, a first behaviour mapping; detect a second context; and generate, based on the second context, a second behaviour mapping; the first and second behaviour mappings comprising a respective mapping of each of a plurality of behaviours to one or more corresponding grid locations, each behaviour having one or more corresponding behaviour indicators; the device further comprising: a touch-screen having a touchscreen display arranged to display the behaviour indicators, each said indicator being displayed at a physical location on the touchscreen display according to the one or more grid locations of the corresponding said behaviour; the touchscreen display being arranged to receive input from a user at the one or more physical locations on the touchscreen display, said input comprising an interaction with one or more of the plurality of behaviour indicators.
In some embodiments, the detection of the first and second context, and corresponding generation of the first and second behaviour mappings, may be performed on a host device (such as a mobile smart device or a computer comprising the present manual input device) on which the first context and the second context occur. In such embodiments, the first context and the second context may be different software applications or states running on the host device. In preferable embodiments, the host device is a remote processing device, being remote to the manual input device, the manual input device comprising a processor arranged to: detect, from a remote processing device (which may be the remote host device), a first context; generate, based on the first context, a first behaviour mapping; detect, from the remote processing device, a second context; and generate, based on the second context, a second behaviour mapping. In preferable embodiments, the manual input device is arranged to output the corresponding behaviour associated with the one or more behaviour indicators interacted with by the user. The output is preferably received and interpreted by the remote processing device.
In the context of the present invention, the term "behaviour mapping" will be understood by the skilled addressee to mean one or more behaviours each paired with a corresponding location on a grid, each grid location being capable of being mapped to a location in physical space. The location may be absolute and determined, for example, by a vector or tuple defining coordinates of the location. The location may alternatively, or in addition, be relative, for example to the position of a particular point in space or an edge of the touchscreen display.
The location may be scalable to permit touchscreen displays of varying sizes and aspect ratios, and may therefore comprise a scalar component, or exponent, for example. The term "behaviour" will be understood to mean a function triggered, such as on the remote processing device, as a result of actuation of, or interaction with, the associated behaviour indicator by a user. Such a function may be triggered, for example, by a function or method call, or by reference to an associated reference or pointer. As an example, a behaviour indicator of a first behaviour mapping may comprise a visible parameter labelled "displayText", the displayText having a string value of "Copy" which will be displayed to the user on the touchscreen display of the manual input device, at the corresponding grid location of the behaviour. Following interaction with the "Copy" behaviour indicator by the user, the manual input device may be arranged to output the corresponding behaviour by way of a signal to the remote processing device, the signal comprising an indication of the keyboard key "c" and the keyboard key modifier/mutator "control" (or "CTRL"). The remote processing device may, in some embodiments, directly interpret the output from the manual input device as an appropriate input for the first context and perform the associated behaviour in the first context running thereon. In other embodiments, the remote processing device may be a server running a server application arranged to interpret the behaviour from the output of the manual input device, and arranged to output the interpreted behaviour to a context running thereon, or to a processing device running the context. The present invention will be understood to be directed to any suitable means of input, for example to a remote processing device, which is arranged to dynamically switch between context-specific input layouts in accordance with corresponding switches between contexts.
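To make the foregoing concrete, the following TypeScript sketch models a behaviour mapping and the "Copy" example above. It is one possible data layout only, assuming a rectangular grid of relative cells; apart from displayText, which appears in the description, the type and field names (BehaviourMapping, GridLocation, output, and so on) are hypothetical and not prescribed by the patent.

```typescript
// A minimal sketch of a behaviour mapping, assuming a rectangular grid
// whose cells scale to the physical touchscreen. All names are illustrative.

interface GridLocation {
  col: number;            // grid column (relative; scaled to screen width)
  row: number;            // grid row (relative; scaled to screen height)
  colSpan?: number;       // optional two-dimensional area of the indicator
  rowSpan?: number;
}

interface Behaviour {
  displayText: string;                           // visible label on the indicator
  output: { key: string; modifiers?: string[] }; // signal sent to the remote device
  locations: GridLocation[];                     // one behaviour may map to several cells
  customisable?: boolean;                        // fixed behaviours cannot be moved by the user
}

interface BehaviourMapping {
  context: string;         // context this mapping is generated for
  behaviours: Behaviour[];
}

// The "Copy" example from the description: interacting with the indicator
// outputs the key "c" with the "CTRL" modifier to the remote processing device.
const firstMapping: BehaviourMapping = {
  context: "standard-operating-system",
  behaviours: [
    {
      displayText: "Copy",
      output: { key: "c", modifiers: ["CTRL"] },
      locations: [{ col: 2, row: 1 }],
      customisable: false,
    },
  ],
};
```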
Users of current keyboard solutions are required to associate the same standard set of keys with a variety of tasks across various software applications and computational contexts. Users cannot currently utilise context-specific layouts which are activated for the user upon detection of a specific context. A standard keyboard layout is non-intuitive for some contexts and inflexible to non-binary demands. It is therefore difficult for such users to pool multiple software applications in a convenient manner to perform specific tasks with maximum efficiency using present solutions, and without a considerable initial learning curve. Even once mastered, which itself requires considerable time and sustained effort and interest from a user, the non-intuitive nature of present solutions causes users to frequently forget key associations, and there is presently no easy way to solve this issue. These problems can have a severe negative impact on the uptake of particular software tools, owing to the considerable skill, time, and sustained interest and effort required to overcome them.
The present invention aims to provide a flexible and scalable solution which enables maximum accessibility of software tools to a user. Dynamically switching between touchscreen layouts by employing context-specific behaviour mappings provides a user with maximum comfort and convenience while permitting the pooling of multiple software tools for performing a particular task swiftly and with ease. The improved ease and speed with which software tools can be pooled for a specific task preferably contributes to the deskilling of operations, thereby lowering the barrier to entry for particular tasks. The dynamic context-specific switching of the present invention preferably provides the user with a touchscreen layout which is consistently optimised to a particular context, software or tool, and therefore minimises the number of interactions required from the user to perform a specific task, thereby maximising efficiency. An optional provision of multiple customisable behaviour mappings or touchscreen layouts preferably enables further optimisation by the user, to suit the user's own style of working or to incorporate future functionality for a specific context or software application. Such embodiments are therefore intended to be scalable to suit an iterative or incremental software development model.
In preferable embodiments of the present invention, the processor is further arranged to change the display of the plurality of behaviour indicators following said detection of the first or second context.
Context-specific switching of the behaviour indicators preferably enables the user to remain focussed on a particular task without being required to switch their key-associations to those of a new context. The behaviour indicators preferably comprise a context-specific and/or behaviour-specific visual component which maintains readability and intuitiveness across contexts for the user. The sustained focus from a user preferably permits maximum engagement and sustained speed in performing a particular task.
Said input may, in some embodiments, further comprise a modification of one or more of: the first behaviour mapping; the second behaviour mapping; one or more of the plurality of behaviours; one or more of the grid locations; one or more of the physical locations.
In some embodiments, user-customisation of one or more elements of the input device, such as the behaviour mappings (their location or behaviour components), the touchscreen layout or the visual component of the behaviour indicators, can be key to maintaining minimal long-term interaction from the user and maximum deskilling of operation of a particular software tool or context. Permitting the user to customise the present invention to suit their own style of working can allow the user to grow the operability of the present invention as their own knowledge of the particular software or context grows. Software developers frequently implement an iterative or incremental development model for their software tools, which can mean that current solutions become deprecated or obsolete with time. The customisability of the present invention preferably provides robustness to changes in the functionality of contexts or software tools over time. Such functionality may, for example, progress from requiring only binary input commands, such as key-down or key-up events, to ranged inputs such as sliders and angular or rotational inputs. Such a feature preferably extends the lifetime of the present invention and additionally permits a more varied range of potential applications.
The input device of the present invention is preferably arranged to output one or more behaviours to the remote processing device, the remote processing device being arranged to implement said behaviours. Such behaviours may include indications of a key press, which may include multi-key mutators or modifiers. The terms "mutators" and "modifiers" will be understood in the present context as representing functionality wherein one or more keys are used to modify the function of a further key, for example, holding one or more first keys and selecting a second key. In some embodiments therefore, a single physical location may be mapped to a plurality of behaviours, wherein access to one or more of said plurality of behaviours may be dependent upon activation of a preceding behaviour, which may be associated with a different grid location of the behaviour mapping.
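As a non-authoritative illustration of the gating just described, the following TypeScript sketch maps a single grid cell to two behaviours, the second reachable only while a preceding "shift" behaviour is active. All names (GatedBehaviour, resolve, the "shift" identifier) are hypothetical.

```typescript
// Sketch: one grid cell carries two behaviours; the second is reachable
// only while a gating behaviour (e.g. a held "Shift" indicator) is active.

interface GatedBehaviour {
  output: string;          // behaviour trigger sent when actuated
  requires?: string;       // id of a behaviour that must be active first
}

const cellBehaviours: GatedBehaviour[] = [
  { output: "key:c" },                       // plain press
  { output: "key:C", requires: "shift" },    // gated by the shift indicator
];

function resolve(active: Set<string>): GatedBehaviour {
  // Prefer the gated behaviour when its prerequisite is currently held.
  return (
    cellBehaviours.find((b) => b.requires && active.has(b.requires)) ??
    cellBehaviours.find((b) => !b.requires)!
  );
}

console.log(resolve(new Set()).output);          // "key:c"
console.log(resolve(new Set(["shift"])).output); // "key:C"
```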
Output of such behaviours may comprise an output of an event trigger, such as an acknowledgment of a key-up or key-down event. Other outputs may comprise a method or function call, a request or a request response, a change of a variable, parameter or argument, or any other suitable implementation of the associated behaviour on the remote processing device. Communication between the manual input device of the present invention and the remote processing device is preferably direct, but may in some embodiments occur by way of an intermediary device such as a server. As such, the present device may be used in any suitable remote manner, which may be driven by comfort, usability or safety requirements.
Such embodiments may, for example, require multiple-user engagement with the same context on the same remote device. In such cases remote interaction between the present device and the remote processing device may be preferred. In embodiments wherein multiple-users interact simultaneously with the remote processing device, each with corresponding manual input devices of the present invention, any suitable queueing and subsequent asynchronous processing of outputted behaviours may be implemented as will be appreciated.
The input device is therefore preferably remote to the remote processing device. Embodiments will be appreciated wherein "remote" is understood to refer to a wired connection between the present manual input device and a remote processing device. Other embodiments will be appreciated wherein "remote" refers to any suitable wireless communication between the manual input device and a remote processing device. Embodiments will also be appreciated wherein the manual input device is arranged to interact with a non-remote processing device.
In some embodiments, the first and/or second behaviour mapping may further comprise a sub-mapping of behaviours, the sub-mapping comprising a different mapping of each of a plurality of behaviours to one or more corresponding grid locations. The touchscreen may be arranged to display a plurality of behaviour indicators associated with the sub-mapping, overlaying the behaviour indicators associated with the first and/or second behaviour mapping, such that the sub-mapping behaviour indicators are prioritised for user interaction over the behaviour indicators associated with the first and/or second behaviour mapping. In such embodiments, the sub-mapping may comprise a fixed plurality of grid locations. In some such embodiments, a behaviour indicator associated with the first and/or second behaviour mapping is preferably arranged to initiate a display of the behaviour indicators associated with the sub-mapping on the touchscreen.
A behaviour mapping of the present invention, in accordance with such embodiments, may further comprise a separate "sub-mapping" which may or may not be related to the respective context of the behaviour mapping. The "sub-mapping" may, in some embodiments, represent a mapping of low-priority behaviours in the respective context, as compared with the behaviour mapping. The priority level of behaviours in such embodiments may, for example, be determined according to the frequency of interactions from a user, or any other suitable factor, which may be user-defined. In some embodiments, wherein the "sub-mapping" is unrelated to the context, the "sub-mapping" may comprise standard behaviours common or ubiquitous across all contexts. In an example wherein the manual input device is a virtual keyboard, the "sub-mapping" may comprise a behaviour mapping associated with a standard keyboard layout (which may be region- or language-specific). Activation, or visualisation, of the "sub-mapping" and/or an associated behaviour indicator layout on the touchscreen display may be initiated by a user through interaction with the context-specific behaviour mapping.
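One way the overlay priority described above could be realised is sketched below in TypeScript: while the sub-mapping layer is shown, its indicators are hit-tested before those of the underlying behaviour mapping. The names (Layer, TouchRouter, hitTest) are hypothetical; this is a sketch of the routing idea, not the claimed implementation.

```typescript
// Sketch of sub-mapping overlay priority: while the overlay is shown,
// its indicators are hit-tested before those of the underlying mapping.

interface Layer {
  name: string;
  hitTest(x: number, y: number): string | null; // returns a behaviour id or null
}

class TouchRouter {
  private overlay: Layer | null = null;

  constructor(private base: Layer) {}

  showSubMapping(layer: Layer): void { this.overlay = layer; }
  hideSubMapping(): void { this.overlay = null; }

  route(x: number, y: number): string | null {
    // The overlay, when present, is prioritised for user interaction;
    // touches that miss its indicators fall through to the base mapping.
    return this.overlay?.hitTest(x, y) ?? this.base.hitTest(x, y);
  }
}
```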
By providing a separate sub-mapping, embodiments of the present invention preferably provide compartmentalisation of behaviours according to a priority level, which may in some embodiments be determined by a user-specific use-pattern such as frequency of use. In such cases, the present invention preferably permits the freeing-up of the display of the present input device, thereby providing an uncluttered and personalised display of inputs which reduces wasted interaction from a user through, for example, incorrect input presses. The "sub-mapping" may, in some embodiments, be associated with a different context to the behaviour mapping, for example wherein the different context is a background function or software application (such as, for example, a music player; a video player; streaming, tele/videoconferencing, recording or "screen-sharing" software; or a social media platform). In such cases, the present invention may permit simultaneous interaction with two separate software applications without leaving the first or second application, thereby maximising the efficiency of a user in performing a particular task. The "sub-mapping" may additionally allow the user to revert to a globally- or ubiquitously-effective behaviour mapping such as that of a standard keyboard layout. The present invention thereby provides flexibility in the form of a context-specific layout when this is required, while permitting a user to revert to a more familiar and commonly-used layout for, for example, more global-scope functions, thereby maximising the flexibility of the present invention.
The first behaviour mapping and the second behaviour mapping are preferably freely adjustable by a user. The first behaviour mapping and/or the second behaviour mapping may comprise a subset of fixed behaviours which are not adjustable by a user and a subset of adjustable behaviours which are freely adjustable by a user.
User customisability may, in some embodiments, be provided in only a limited capacity, which may be necessary to, for example, prevent UI traps, or to prevent a user from customising the present invention in such a way as to render the context or software application non-functional, or the input device non-responsive. Such a limitation on customisability may be provided using any suitable feature, such as by providing behaviours having fixed location components in the behaviour mapping, or by preventing a specific critical or essential functional behaviour from being removed or relegated to a "sub-mapping". Such a feature preferably maximises the usability of the present invention and acts to additionally deskill complex contexts and software applications.
Each behaviour of the first and second behaviour mapping may comprise one or more behaviour indicator characteristics associated with each behaviour indicator, the behaviour indicator characteristics comprising one or more selected from: grid location; two-dimensional area; shape; colour; icon; display text; animation sequence; highlighted properties; key down properties; key up properties.
The behaviour indicator characteristics may act to define the associated behaviour, the mode of interaction accepted from a user, or a visual characteristic of the behaviour indicator. The behaviour indicator may, for example, comprise an icon, symbol or logo associated with the corresponding behaviour, thereby providing a user with an immediate visual indication of the behaviour. The behaviour indicator may comprise an array of visual characteristics which may be linked to one or more states of the behaviour indicator. For example, the behaviour indicator may comprise a first symbol when unselected by a user, and a second symbol when selected by a user. In the case of a rotational element or slider, the behaviour indicator may provide a dynamic numerical display of a value which is updated according to an adjusted position or angle of rotation of the slider or rotational element. The behaviour indicator may additionally comprise a sequence of associated images, or an animation or particle effect associated with a state of the behaviour indicator. It will be appreciated that a behaviour indicator may comprise one of any number of suitable states, such as "selected", "unselected", "position", "angle of rotation" and "duration of selection", among others.
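A brief TypeScript sketch of the state-dependent visuals described above, assuming three illustrative states (the state and function names are hypothetical): the indicator swaps symbols between unselected and selected, and a slider state yields a dynamic numerical display.

```typescript
// Sketch: visual characteristics keyed to indicator state.

type IndicatorState =
  | { kind: "unselected" }
  | { kind: "selected" }
  | { kind: "slider"; value: number }; // e.g. position along a ranged input

function displayFor(state: IndicatorState): string {
  switch (state.kind) {
    case "unselected": return "▷";                     // first symbol when unselected
    case "selected":   return "▶";                     // second symbol when selected
    case "slider":     return state.value.toFixed(0);  // dynamic numerical display
  }
}
```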
The first and second behaviour mapping may each comprise a prioritised subset of behaviours, the prioritised subset of behaviours comprising a priority value, and a grid location according to said priority value. The prioritised subset of behaviours may each comprise a behaviour indicator having a two-dimensional area according to the priority value.
Behaviour indicators may comprise a visible component (such as, for example, size, shape, colour or brightness, among others) which is relative to a priority value of the associated behaviour. Such prioritisation may be driven, for example, by frequency of interaction from a user, or by a predetermined level of importance of the corresponding behaviour to the associated context. In such embodiments, important behaviours are preferably prioritised for the user's attention by their colour, brightness or animation, or made easier for the user to interact with by their size.
Each behaviour of the first and second behaviour mapping may comprise an action set defining one or more actions according to one or more states of the corresponding behaviour indicator, the one or more states being selected from: a binary state-pair; a range of three or more states.
In preferable embodiments, the present invention provides a dynamic input device which accepts a variety of input methods extending beyond mere binary key presses (for example defining one or more first actions or behaviours according to a first "key-down" state, and one or more second actions or behaviours according to a second "key-up" state), to ranged inputs such as sliders or pressure-associated inputs (having more than two possible states, each state having one or more associated actions or behaviours). In some embodiments, the possible states of a behaviour indicator may comprise one or more characteristics of an adjustable position or location of the behaviour indicator, such as the adjustable position or location of a trackpad input or virtual joystick used to output corresponding mouse cursor coordinates. Such an input device may, for example, provide a continuous output to the remote processing device according to a state of the behaviour indicator, in order to permit such mouse tracking functionality. In order to limit any effect of continued output signalling on the processing performance of the remote processing device or of the manual input device, some embodiments may only permit output of a state-change of such a behaviour indicator. As such, the manual input device may be arranged to only output an action or behaviour associated with a change to the state of a particular behaviour indicator, such that the need for continuous communication with the remote processing device during periods of no change to said behaviour indicator state is circumvented. Such output, only as a result of a detected state-change of a behaviour indicator by the manual input device, preferably reduces the processing performance issues associated with continued behaviour indicator state/action/behaviour output to the remote processing device. As such the present invention may offer greater functionality than mere key presses, and may additionally include functionality for optionally frame-rate-dependent inputs, such as updates to a mouse cursor position. This preferably provides a varied functionality not found in current solutions.
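The state-change-only output described above might look like the following TypeScript sketch, in which a continuously polled cursor indicator emits to the remote processing device only when its encoded state differs from the last value sent. All names are hypothetical.

```typescript
// Sketch: suppress output while a continuous indicator's state is unchanged;
// only a state change is sent to the remote processing device.

type Send = (msg: string) => void;

function makeStateChangeEmitter(send: Send) {
  let last: string | null = null;
  return (state: { x: number; y: number }) => {
    const encoded = `cursor:${state.x},${state.y}`;
    if (encoded !== last) {   // no change: no communication
      last = encoded;
      send(encoded);
    }
  };
}

// Usage: polled each frame, but the remote device hears only changes.
const emit = makeStateChangeEmitter((m) => console.log(m));
emit({ x: 10, y: 4 }); // sent
emit({ x: 10, y: 4 }); // suppressed
emit({ x: 11, y: 4 }); // sent
```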
In preferable embodiments, the present invention is cross-platform compatible. The term "cross-platform compatible" in the context of the present invention will be understood to mean independent of a specific core operating system or platform, thereby suitable for use with any software application regardless of the operating system on which the software application is running.
In accordance with a second aspect of the present invention, there is provided a method of providing a context-specific behaviour mapping for a manual input device, the method comprising: detecting, by a processor of an input device, from a remote processing device, a first context; generating, by the processor, a first behaviour mapping associated with the first context, the first behaviour mapping comprising a mapping of each of a plurality of first behaviours to one or more corresponding grid locations; displaying, by a touchscreen of the input device, a plurality of first behaviour indicators each associated with a corresponding first behaviour, each said indicator being displayed at a location on the touchscreen display according to the one or more grid locations of the corresponding first behaviour; detecting, by the processor, from a remote processing device, a second context; and generating, by the processor, a second behaviour mapping associated with the second context, the second behaviour mapping comprising a mapping of each of a plurality of second behaviours to one or more corresponding grid locations.
The method preferably further comprises the step of: updating the display, by the touchscreen, to display a plurality of second behaviour indicators each associated with a corresponding second behaviour, each said indicator being displayed at a location on the touchscreen display according to the one or more grid locations of the corresponding second behaviour.
In such a way the method preferably comprises automatically switching the context-specific layout of the behaviour indicators according to a corresponding switch in context.
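Read as pseudocode, the method amounts to an event loop: detect a context, generate (or retrieve) its behaviour mapping, and redraw the indicators whenever the context changes. A TypeScript sketch under those assumptions, with hypothetical names throughout:

```typescript
// Sketch of the second-aspect method: detect a context, generate its
// behaviour mapping, display the indicators, and update the display
// on each context switch.

interface Mapping { context: string; /* behaviours elided for brevity */ }
interface Display { render(m: Mapping): void; }

class ContextSwitcher {
  private current: string | null = null;

  constructor(
    private display: Display,
    private generate: (context: string) => Mapping, // builds the behaviour mapping
  ) {}

  // Called whenever a context is detected from the remote processing device.
  onContextDetected(context: string): void {
    if (context === this.current) return;         // same context: keep the display
    this.current = context;
    this.display.render(this.generate(context));  // switch the indicator layout
  }
}
```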
The manual input device may be a manual input device in accordance with the first aspect of the present invention.
Any features described herein as being suitable for incorporation into one or more aspects or embodiments of the present invention, will be understood as being intended to be generalizable across any and all aspects and embodiments of the present disclosure.
Detailed Description
Specific embodiments will now be described by way of example only, and with reference to the accompanying drawings, in which:
FIG. 1 shows a schematic representation of an example manual input device in accordance with the first aspect of the present invention;
FIG. 2 shows a plan view of the manual input device of FIG. 1 displaying a behaviour mapping associated with a first context;
FIG. 3 shows a plan view of the manual input device of FIG. 1 displaying a behaviour mapping associated with a second context;
FIG. 4 shows a plan view of the manual input device of FIG. 1 displaying a behaviour mapping associated with a sub-mapping;
FIG. 5 shows a plan view of the manual input device of FIG. 1 displaying a behaviour mapping associated with a third context;
FIG. 6 shows a plan view of the manual input device of FIG. 1 displaying a behaviour mapping associated with a fourth context; and
FIG. 7 shows a flow chart representing steps of an example method in accordance with the second aspect of the present invention.
Referring to FIG. 1, a schematic view of a manual input device 100 in accordance with the first aspect of the present invention is shown. The device 100 comprises a processor 102 arranged to detect, from a remote processing device 104, a first context and generate, based on the first context, a first behaviour mapping 106. The processor 102 is arranged to store the first behaviour mapping on a memory module 108 of the device 100. The first behaviour mapping 106 comprises a mapping of each of a plurality of behaviours 110 to one or more corresponding grid locations 112, each behaviour 110 having one or more corresponding behaviour indicators 114. The one or more corresponding behaviour indicators 114 in the embodiment shown each comprise a plurality of corresponding characteristics 116, which include an image (not shown) providing an indication of the corresponding behaviour 110 and transform characteristics of the image (including relative size and orientation values determined according to a screen size). The device 100 further comprises a touch-screen having a touchscreen display 118. The processor 102 is arranged to access the first behaviour mapping 106 from the memory 108, and further arranged to display the behaviour indicators 114 of the first behaviour mapping 106 at the respective physical locations of the touchscreen display 118 according to the grid locations 112 of the corresponding said behaviour 110. The touchscreen display 118 is arranged to receive input from a user at the one or more physical locations on the touchscreen display 118, said input comprising an interaction with one or more of the plurality of displayed behaviour indicators 114. Following said interaction with a behaviour indicator 114, the processor 102 is arranged to provide an associated output trigger of the associated behaviour 110 on the remote processing device 104.
The processor 102 is further arranged to detect, from the remote processing device 104, a second context, and generate, based on the second context, a second behaviour mapping 122; the second behaviour mapping 122 being of similar construction to the first behaviour mapping 106 and comprising a respective mapping of each of a plurality of behaviours 124 to one or more corresponding grid locations 126, each behaviour 124 having one or more corresponding behaviour indicators 128 with corresponding transform characteristics 129 of the image (including relative size and orientation values determined according to a screen size). Upon detection of the second context, the processor 102 is arranged to stop displaying the behaviour indicators 114 of the first behaviour mapping 106 on the touchscreen display 118 and begin displaying the behaviour indicators 128 of the second behaviour mapping 122 on the touchscreen display 118. Following interaction with a behaviour indicator 128 of the second behaviour mapping 122, the processor 102 is arranged to provide an associated output trigger 127 of the associated behaviour 124 on the remote processing device 104.
As such, a dynamic context-specific input device 100 is provided which is arranged to adapt the display of input indicators to a particular context of a remote device 104, thereby providing for more suitable and sustained interaction with the remote device 104.
Referring to FIG. 2, a plan view of the example manual input device of FIG. 1 is shown following detection of a first context by the processor 102 and the subsequent displaying of the behaviour indicators 114 associated with the first context. In the example shown, the manual input device 100 is a virtual keyboard, and the first context is a standard operating system. The first behaviour mapping 106 provides convenient virtual keyboard shortcuts, in the form of the associated behaviour indicators 114, for providing navigational functionality of the standard operating system on the remote device 104. In the embodiment shown, the first behaviour mapping 106 comprises a set of standard, immutable behaviour indicators 130 associated with key operability of the operating system, which cannot be customised by a user. These immutable behaviours 130 define operability which is crucial to the functioning of the first context, the standard operating system; any customising of these behaviours 130 by a user would risk loss of function of the operating system, and they are therefore not customisable in the embodiment shown. Embodiments will be appreciated wherein any number of behaviour mappings may be fully customisable, partly customisable, or fixed.
In the embodiment of FIG. 2, the first behaviour mapping 106 further comprises an area of vacant grid locations 132 which correspond to customisable behaviours (not shown) of the first behaviour mapping 106. The customisable behaviours are arranged to accept location, output and behaviour indicator parameters from a user. As such, the first behaviour mapping 106 is partly customisable by the user.
The first behaviour mapping 106 further comprises an area of navigational behaviours 134 arranged to cause the touchscreen display 118 to display a different context-specific behaviour mapping. As such, the embodiment shown can allow the user to select to display a behaviour mapping for a context which is not the first context.
FIG. 3 shows the example manual input device of FIG. 1 following detection of a second context, the second context being a photo editing software. In FIG. 3, and following detection of the second context, the processor has accessed the second behaviour mapping 122 and displayed the associated behaviour indicator layout on the touchscreen display 118. A first group of the behaviour indicators 128 comprise icons indicating the corresponding output behaviour of the behaviour indicator. A second group of the behaviour indicators 128 comprise text which indicates that these behaviour indicators are customisable by the user. In the embodiment shown, by selecting one or more of the customisable indicators, the user may allocate a corresponding behaviour output to the selected indicators. A behaviour output may be allocated to more than one indicator for ease of access to the specific behaviour.
The second behaviour mapping 122 comprises a behaviour indicator 135 which, when interacted with by a user, causes the processor 102 to initiate display of a "sub-mapping" 136 on the touchscreen display 118. The sub-mapping 136 is shown displayed on the touchscreen display 118 in FIG. 4, and provides a standard keyboard layout overlaid over the second behaviour mapping 122 and prioritised for interaction from a user. Each behaviour indicator 138 of the sub-mapping 136, following interaction with said behaviour indicator 138, causes output of an associated behaviour to the remote processing device 104. The sub-mapping 136 additionally comprises a behaviour indicator which causes a return to the underlying layout of the second behaviour mapping 122.
In use, the user may themselves navigate between the behaviour mapping of the first context, the behaviour mapping of the second context and the sub-mapping, at will. Additionally, the processor is arranged to dynamically switch between behaviour mappings according to corresponding switches between contexts at the remote processing device. Other embodiments will be appreciated wherein this switching functionality is solely driven by the remote processing device and wherein a user may not switch between context-specific behaviour mappings. It is preferable, however, that the user should have maximum flexibility of use of the present invention and therefore be able to navigate the behaviour mappings of various contexts at will.
FIG. 5 and FIG. 6 show plan views of the manual input device of FIG. 1 having displayed on the touchscreen thereof examples of further behaviour indicator layouts of context-specific behaviour mappings associated with further contexts. In FIG. 5, for example, a behaviour indicator layout 140 of a behaviour mapping associated with a photo editing software is shown, and in FIG. 6, a behaviour indicator layout 150 of a behaviour mapping associated with a video editing software is shown. These layouts of FIG. 5 and FIG. 6 demonstrate behaviour indicators having varying functionalities when interacted with by a user. For example, the behaviour indicator layout 140 of FIG. 5 provides behaviour indicators 142 representing sliders arranged for, for example, colour selection. The behaviour output of said sliders 142 is therefore one of a range of possible outputs, such as an RGB colour vector or hex code, compared with a standard binary output of a key-press. The behaviour indicator layout 150 of FIG. 6 provides behaviour indicators 152 representing a rotational tool arranged for, for example, video scrubbing. The behaviour output of said rotational tool 152 is therefore one of a range of possible outputs, such as a time-point in a video sequence, compared with a standard binary output of a key-press.
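For illustration only, the ranged outputs just described could be computed as in the following TypeScript sketch: a three-slider colour input combines into an RGB hex code, and a rotational tool maps a dial angle to a time-point in a video sequence. The output formats are assumptions, not prescribed by the description.

```typescript
// Sketch: ranged behaviour outputs. A colour slider emits a hex code;
// a rotational scrubbing tool emits a time-point in the video sequence.

function sliderToHex(r: number, g: number, b: number): string {
  // Each slider position is in [0, 1]; combine into an RGB hex code.
  const c = (v: number) =>
    Math.round(Math.max(0, Math.min(1, v)) * 255).toString(16).padStart(2, "0");
  return `#${c(r)}${c(g)}${c(b)}`;
}

function dialToTimePoint(angleDeg: number, durationSec: number): number {
  // Map a dial angle in [0, 360) onto a time-point within the video.
  const turn = ((angleDeg % 360) + 360) % 360;
  return (turn / 360) * durationSec;
}

console.log(sliderToHex(1, 0.5, 0));    // "#ff8000"
console.log(dialToTimePoint(90, 120));  // 30 seconds into a 2-minute clip
```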
As such the present invention may, in some embodiments, provide a vast array of functionality, which, together with the context-specific switchable nature of the behaviour mappings, provides a dynamic, flexible and intuitive context-specific interface which maximises user-effectiveness and uptake while minimising causes for reduced productivity and efficiency.
FIG. 7 shows a flow chart representing example steps of a method 200 in accordance with the second aspect of the present invention. The method 200 is a method of providing a context-specific behaviour mapping for a manual input device, such as that described in relation to FIG. 1 to FIG. 6. The method comprises the steps of:
- detecting, by a processor of an input device, from a remote processing device, a first context 202;
- generating, by the processor, a first behaviour mapping associated with the first context, the first behaviour mapping comprising a mapping of each of a plurality of first behaviours to one or more corresponding grid locations 204;
- displaying, by a touchscreen of the input device, a plurality of first behaviour indicators each associated with a corresponding first behaviour, each said indicator being displayed at a location on the touchscreen display according to the one or more grid locations of the corresponding first behaviour 206;
- detecting, by the processor, from a remote processing device, a second context 208;
- generating, by the processor, a second behaviour mapping associated with the second context, the second behaviour mapping comprising a mapping of each of a plurality of second behaviours to one or more corresponding grid locations 210; and
- updating the display, by the touchscreen, to display a plurality of second behaviour indicators each associated with a corresponding second behaviour, each said indicator being displayed at a location on the touchscreen display according to the one or more grid locations of the corresponding second behaviour 212.
The method 200 will be understood in the context of the manual input device 100 described in relation to FIG. 1 to FIG. 6, and additional embodiments may incorporate any features described as suitable therefor.
It will be appreciated that the above described embodiments are given by way of example only and that various modifications may be made to the described embodiments without departing from the scope of the invention as defined in the appended claims. For example, the present invention may be used in conjunction with any context or software of the remote processing device, such as, for example, productivity, design and media, video editing, music production, word processing, and gaming applications, among many others which will be appreciated.

Claims (17)

  1. A manual input device comprising: a processor arranged to: detect, from a remote processing device, a first context; generate, based on the first context, a first behaviour mapping; detect, from a remote processing device, a second context; and generate, based on the second context, a second behaviour mapping; the first and second behaviour mappings comprising a respective mapping of each of a plurality of behaviours to one or more corresponding grid locations, each behaviour having one or more corresponding behaviour indicators; the device further comprising: a touch-screen having a touchscreen display arranged to display the behaviour indicators, each said indicator being displayed at a physical location on the touchscreen display according to the one or more grid locations of the corresponding said behaviour; the touchscreen display being arranged to receive input from a user at the one or more physical locations on the touchscreen display, said input comprising an interaction with one or more of the plurality of behaviour indicators.
  2. An input device as claimed in claim 1, wherein the input device is arranged to output one or more behaviours to the remote processing device, the remote processing device being arranged to implement said behaviours.
  3. An input device as claimed in claim 2, wherein the input device is remote to the remote processing device.
  4. An input device as claimed in claim 1, claim 2 or claim 3, wherein the processor is further arranged to change the display of the plurality of behaviour indicators following said detection of the first or second context.
  5. An input device as claimed in any one of the preceding claims, wherein said input further comprises a modification of one or more of: the first behaviour mapping; the second behaviour mapping; one or more of the plurality of behaviours; one or more of the grid locations; one or more of the physical locations.
  6. An input device as claimed in any one of the preceding claims, wherein the first and/or second behaviour mapping may further comprise a sub-mapping of behaviours, the sub-mapping comprising a different mapping of each of a plurality of behaviours to one or more corresponding grid locations.
  7. An input device as claimed in claim 6, wherein the touchscreen is arranged to display a plurality of behaviour indicators associated with the sub-mapping, overlaying the behaviour indicators associated with the first and/or second behaviour mapping, such that the sub-mapping behaviour indicators are prioritised for user interaction over the behaviour indicators associated with the first and/or second behaviour mapping.
  8. An input device as claimed in claim 6 or claim 7, wherein the sub-mapping comprises a fixed plurality of grid locations.
  9. An input device as claimed in claim 6, claim 7 or claim 8, wherein a behaviour indicator associated with the first and/or second behaviour mapping is arranged to initiate a display of the behaviour indicators associated with the sub-mapping on the touchscreen.
  10. An input device as claimed in any one of the preceding claims, wherein the first behaviour mapping and the second behaviour mapping are freely adjustable by a user.
  11. An input device as claimed in claim 10, wherein the first behaviour mapping and/or the second behaviour mapping comprises a subset of fixed behaviours which are not adjustable by a user and a subset of adjustable behaviours which are freely adjustable by a user.
  12. An input device as claimed in any one of the preceding claims, wherein each behaviour of the first and second behaviour mapping comprises one or more behaviour indicator characteristics associated with each behaviour indicator, the behaviour indicator characteristics comprising one or more selected from: grid location; two-dimensional area; shape; colour; icon; display text; animation sequence; highlighted properties; key down properties; key up properties.
  13. An input device as claimed in claim 12, wherein the first and second behaviour mapping each comprise a prioritised subset of behaviours, the prioritised subset of behaviours comprising a priority value, and a grid location according to said priority value.
  14. An input device as claimed in claim 13, wherein the prioritised subset of behaviours each comprise a behaviour indicator having a two-dimensional area according to the priority value.
  15. An input device as claimed in any one of the preceding claims, wherein each behaviour of the first and second behaviour mapping comprises an action set defining one or more actions according to one or more states of the corresponding behaviour indicator, the one or more states being selected from: a binary state-pair; a range of three or more states.
  16. A method of providing a context-specific behaviour mapping for a manual input device, the method comprising: detecting, by a processor of an input device, from a remote processing device, a first context; generating, by the processor, a first behaviour mapping associated with the first context, the first behaviour mapping comprising a mapping of each of a plurality of first behaviours to one or more corresponding grid locations; displaying, by a touchscreen of the input device, a plurality of first behaviour indicators each associated with a corresponding first behaviour, each said indicator being displayed at a location on the touchscreen display according to the one or more grid locations of the corresponding first behaviour; detecting, by the processor, from a remote processing device, a second context; and generating, by the processor, a second behaviour mapping associated with the second context, the second behaviour mapping comprising a mapping of each of a plurality of second behaviours to one or more corresponding grid locations.
  17. A method as claimed in claim 16, the method further comprising: updating the display, by the touchscreen, to display a plurality of second behaviour indicators each associated with a corresponding second behaviour, each said indicator being displayed at a location on the touchscreen display according to the one or more grid locations of the corresponding second behaviour.
GB2010169.7A 2020-07-02 2020-07-02 Dynamic context-specific input device and method Pending GB2597055A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2010169.7A GB2597055A (en) 2020-07-02 2020-07-02 Dynamic context-specific input device and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2010169.7A GB2597055A (en) 2020-07-02 2020-07-02 Dynamic context-specific input device and method

Publications (2)

Publication Number Publication Date
GB202010169D0 GB202010169D0 (en) 2020-08-19
GB2597055A true GB2597055A (en) 2022-01-19

Family

ID=72050558

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2010169.7A Pending GB2597055A (en) 2020-07-02 2020-07-02 Dynamic context-specific input device and method

Country Status (1)

Country Link
GB (1) GB2597055A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100265183A1 (en) * 2009-04-20 2010-10-21 Microsoft Corporation State changes for an adaptive device
US20100265182A1 (en) * 2009-04-20 2010-10-21 Microsoft Corporation Context-based state change for an adaptive input device
US20130162411A1 (en) * 2011-12-22 2013-06-27 Qualcomm Incorporated Method and apparatus to adapt a remote control user interface
US20180032203A1 (en) * 2016-07-29 2018-02-01 Apple Inc. Systems, devices, and methods for dynamically providing user interface controls at a touch-sensitive secondary display
US20180267704A1 (en) * 2017-03-14 2018-09-20 Adobe Systems Incorporated File-based custom configuration of dynamic keyboards


Also Published As

Publication number Publication date
GB202010169D0 (en) 2020-08-19
