WO2018154327A1 - Computer interface system and method - Google Patents

Computer interface system and method

Info

Publication number
WO2018154327A1
Authority
WO
WIPO (PCT)
Prior art keywords
computer
interface
inputs
controller
input
Prior art date
Application number
PCT/GB2018/050490
Other languages
French (fr)
Inventor
Adam Peter SHORTLAND
Rebecca Hazell EAST
Original Assignee
Guy's And St. Thomas' Nhs Foundation Trust
Priority date
Filing date
Publication date
Application filed by Guy's And St. Thomas' Nhs Foundation Trust filed Critical Guy's And St. Thomas' Nhs Foundation Trust
Publication of WO2018154327A1 publication Critical patent/WO2018154327A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry

Definitions

  • a mapping is determined by the processor 120.
  • a number of factors may affect the selection of the mapping, including:
  • the computer interface system 100 may instead suggest a different gesture such as leaning of the trunk, head nod or clap for "move left”.
  • An example set of gestures that may be considered is:
  • processor 130 has been described as a separate
  • the database 130 may be stored in the data repository 30.
  • controllers 150 may be used to obtain a broad range of inputs for a user and/or provide a more sophisticated system from a combination of relatively simple controllers 150.
  • Figure 3 is a schematic diagram of a computer interface system
  • the computer interface system 100 may also include a bias control 160.
  • via the bias control 160, the user or other operator may override the mapping and designate one or more control actions, such as gestures, to be included.
  • a user wishing to play a game or use a computer system such as those described above may cause sub-optimal gestures (from the perspective of the computer system, in effect making it slightly harder to control the game or computer system than if the default input was used) to be included in the mapping.
  • while the gestures may be sub-optimal for control of the computer system, requiring more effort or control than other gestures that may be available, they can also be selected or determined so as to force the user, while playing a game, to perform a particular exercise or gesture to increase mobility, strength etc.
  • the alternate user interface can be used to convert standard computer systems into computer systems that additionally aid in therapy.
  • the computer interface system 100 may include different mappings for a user, for example one set which includes sub-optimal gestures that must be performed for a predetermined (or increasing) time period each day, after which a more optimal (comfortable, less tiring) set may be substituted.
  • the processor 120 may monitor confidence in the detected gestures and if confidence drops, suggest switching to a different mapping (and likewise if confidence increases, adjust difficulty of the gesture or suggest re-training to determine if a better mapping can be generated).
  • while the database is preferably stored locally, it may also be synchronised to a central data store 170 for monitoring and/or
  • the database (and preferably the central database) may be accessible by clinicians or others involved in
  • Inputs to a computer system that may be needed/desirable in mappings may be determined in a number of ways. This may be a manual process with the default input device 40 being connected to the computer interface system 100 or otherwise being monitored during mapping assignment/training; they may be selected from a list (e.g. up, down, a particular keystroke, button "A" etc) or they may be matched or determined from knowledge or analysis of the computer system 10.
  • Inputs may be matched in a number of ways - for example they may be classified according to detectable attributes of the computer system 10 such as process identifier, file being executed or other attributes that can be obtained from the computer system 10.
  • a default mapping may be provided (or a mapping according to genre type such as first person shooter, running and obstacle avoidance, driving, platform, TV remote control, Xbox type controller, etc). Games may be mapped in many ways.
  • the processor may select from the hierarchy:
  • FIG. 4 is a schematic diagram of a computer system including a computer interface system according to another embodiment of the present invention.
  • the computer system 10 is a home entertainment system 10.
  • the computer interface system 100 includes a controller 200, a controller interface 110, a processor 120, a database 130 and an output interface 140.
  • the controller 200 is a wearable device such as a smartwatch, or another worn device incorporating an accelerometer, magnetometer, gyroscope or a combination of these such as an inertial measurement device.
  • the controller 200 may be bespoke or may be a pre-configured device having these capabilities that are accessed through an API or similar on the device 200. Likewise, a number of worn devices may be used as the controller 200.
  • the controller interface 110 connects to the controller 200, typically via Bluetooth. The connection may be direct or, alternatively, and as illustrated, it may be via an intermediate device such as a smartphone 210, with the smartphone 210 monitoring the controller 200 and then relaying the data to the controller interface 110, for example over a data communications network 220 such as a cellular or WiFi network.
  • the database 130 maps inputs received from the controller 200 to inputs expected by the home entertainment system 10.
  • An input from the controller 200 via the smartphone 210, network 220 and controller interface 110 is passed to the processor 120, which in turn cross-references it with the database 130 to determine if there is a mapped home entertainment input. If there is, the processor 120 causes the mapped input to be output to the home entertainment system 10 via the output interface 140.
  • an output device 230 may be the recipient of the output via the output interface 140.
  • this may be an infra-red transmitter box or similar that receives the home
  • the output device can act as a proxy/signal converter and enable the control interface system 100 to work with devices/computer systems 10 that would not otherwise be controllable.
  • code (e.g., a software algorithm or program) or firmware, embodied on a computer useable medium having control logic for enabling execution on a computer system having a computer processor.
  • Such a computer system typically includes memory storage configured to provide output from execution of the code which configures a processor in accordance with the execution.
  • the code can be arranged as firmware or software, and can be organized as a set of modules such as discrete code modules, function calls, procedure calls or objects in an object-oriented
  • the code can comprise a single module or a plurality of modules that operate in cooperation with one another.
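The confidence-monitoring behaviour described in the points above (suggesting a different mapping when confidence drops, and re-training or a harder mapping when it rises) can be sketched as follows. This is an illustrative sketch only: the patent specifies no algorithm, and the class name, thresholds and rolling-average window are all assumptions.

```python
from collections import deque

class ConfidenceMonitor:
    """Track a rolling average of gesture-detection confidence scores."""

    def __init__(self, window=20, low=0.5, high=0.9):
        self.scores = deque(maxlen=window)  # keep only the most recent scores
        self.low, self.high = low, high

    def record(self, confidence: float) -> str:
        """Record a detection confidence and return a suggested action."""
        self.scores.append(confidence)
        avg = sum(self.scores) / len(self.scores)
        if avg < self.low:
            # User is struggling: suggest a more comfortable mapping.
            return "switch_to_easier_mapping"
        if avg > self.high:
            # User has improved: adjust difficulty or re-train.
            return "suggest_retraining_or_harder_mapping"
        return "keep_current_mapping"
```

A more complete system would act on these suggestions by swapping the active mapping in the database 130 or launching the training routine.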

Abstract

A computer interface system and method are described for providing an alternate interface to a computer system. The computer interface system includes a controller interface, a processor, a database and an output interface. The computer interface system is connectable to the computer system via the output interface and to a controller via the controller interface. The database maps each of a plurality of user inputs of the controller to a respective input of the computer system. Upon receipt of a user input at the controller interface via the controller, the processor is configured to determine, from the database, the mapped respective input to the computer system and communicate the mapped input to the computer system via the output interface.

Description

COMPUTER INTERFACE SYSTEM AND METHOD
Field of the Invention
The present invention relates to a computer interface system and method that are particularly applicable for providing access to computer systems such as games to users with different abilities.
Background to the Invention
There are various types of interfaces to computer systems. Common interface types include: physical interface devices such as mice, keyboards and joysticks; touch-screen interfaces; and voice interfaces such as the chatbots and virtual assistants popularised through phones and smart speakers. Interface types are selected, and the interface then designed, to suit the particular application and audience.
In the case of computer games (which also include video games) various interface types are used. Some games have actions in response to particular keyboard keypresses or controller button presses. Some games utilise input devices such as mice, joysticks, steering wheels or the like via which a user can control the game.
It is common for computer games to be targeted at a particular age demographic. While this is predominantly concerned with the maturity of the user in terms of the content (games with violence or dark themes are not targeted at young users, for example), the age rating/targeting also impacts the interface type. For example a game targeted
predominantly at relatively young users (for example those aged 6-10) would commonly have a much simpler interface than one that is targeted at adults. This is done by the game producer at the time of designing and writing the game so as to provide an appropriate interface that matches the expected abilities of the target audience. If the interface is too complex or unintuitive, the game will likely be less successful as the interface will make the game harder and/or less engaging.
Touch-screen interfaces have been driven by smartphones and tablet computing devices as the touchscreen is the primary input system for these device types. Many of these devices allow apps to be downloaded and installed from an associated Appstore. In order to maintain security, publishing via an Appstore is often controlled by the
manufacturer and only apps passing the manufacturer's checks are permitted to publish via the Appstore. Certain manufacturers include design requirements as well as security constraints and an app's user interface must also meet these for the app to be published (they argue that this is to maintain quality and user experience). Voice interfaces have become popular for interfaces to search systems as they can be intuitive, particularly as recent advances in artificial intelligence have significantly improved their speech recognition and conversational abilities. However, voice interfaces have seen very little success in computer games as the latency involved in voice capture and processing is unacceptable in most games.
The design principles associated with user interfaces apply across all computing/computer device areas. A relatively new discipline, known as user experience design (UX) has stemmed from this which is often separated from programming/system design and focused on user satisfaction and interaction with a product/system.
This approach means that computer interfaces, whether for games, home automation systems, websites or other computer systems, have common themes and properties. While user interface (UI) and user experience (UX) designers will work to differentiate their products from their competition, the user interface type is often very similar. However, user interface and user experience design focusses on the optimal experience for the primary targeted user demographic. For example, while age is a strong indicator of ability, there are other factors that may mean that a user otherwise fitting a demographic finds an interface difficult to use effectively. For example, potential users may have physical and/or mental disabilities and/or they may have temporary injuries.
The market of users with these different abilities is generally a small subset of the overall user base and is often overlooked. While a limited number of products, systems and computer games are produced that are targeted at this market, they are generally not mainstream and lack the popularity, refinement and support of other games.
Statement of Invention
According to an aspect of the present invention, there is provided a computer interface system for providing an alternate interface to a computer system, the computer interface system comprising a
controller interface, a processor, a database and an output interface, and being connectable to the computer system via the output interface and to a controller via the controller interface, the database mapping each of a plurality of user inputs of the controller to a respective input of the computer system, wherein upon receipt of a user input at the controller interface via the controller, the processor is configured to determine, from the database, the mapped respective input to the computer system and communicate the mapped input to the computer system via the output interface.
In embodiments of the present invention, an alternate user input mechanism is available to users. Preferably operation is transparent to the computer gaming system and it may also be transparent to the controller used to obtain the alternate user input. Preferably, the mapping is configurable either by a user or operator and/or as a result of a training routine. The system may select the user's best gestures for the mapping or may assist the user/operator in doing so. It may also allow sub-optimal gestures to be included in the mapping to force the user while playing a game to perform a particular exercise or gesture to increase mobility, strength etc. It will also be appreciated that embodiments of the present invention may be used to improve a user's efficiency in games. For example, substitute commands could be trained that are as efficient as possible and reduce latency introduced by a user's actions when making the standard game movements.
Brief Description of the Drawings
Embodiments of the present invention will now be described by way of example only with reference to the accompanying drawings, in which: Figure 1 is a schematic diagram of a computer system including a computer interface system according to an embodiment of the present invention;
Figure 2 is a schematic diagram of a computer interface system
according to an embodiment of the present invention that is suitable for use in the system of Figure 1;
Figure 3 is a schematic diagram of a computer interface system
according to an embodiment of the present invention illustrating various further, optional, features; and,
Figure 4 is a schematic diagram of a computer system including a computer interface system according to another embodiment of the present invention.
Detailed Description
Figure 1 is a schematic diagram of a computer interface system 100 according to an embodiment of the present invention for use in controlling a computer system 10.
The computer system 10 generally includes a processor 20, a data repository 30 and a default user interface device 40. However, it will be appreciated from the description below that embodiments of the present invention can accommodate different types of computer system having different components. For example, the computer system 10 may be one of a known type such as a PC, a video gaming system such as an Xbox (RTM), a TV set top box, virtual or augmented reality system, smart speaker, home automation system, home entertainment system, etc. It may alternatively be a bespoke system. The computer system 10 may include or may be connected to a display 50 such as a monitor, TV, projector etc. The default user interface device 40 (which may be a separate device or built into the computer system 10 or another component such as the display 50) provides access to the user interface of the computer system as defined/designed by the producer of the computer system (or software being run by the computer system). This may, for example, be a keyboard, mouse, touch-screen, game controller, voice recognition system etc. Preferably, in the absence of the computer interface system described below, the computer system 10 operates in a conventional way and enables users to operate the computer system using the default user interface device 40 (for example a keyboard, controller etc) as expected by the computer system's producer.
However, when the computer interface system 100 is connected to (or executed on) the computer system 10 an alternate user input
mechanism is available to users as described below. Preferably, the alternate user input mechanism does not replace the default user interface device 40, but it instead provides an alternate user input mechanism that operates in parallel to the default user interface device 40. Preferably, the computer interface system 100 and the alternate user input mechanism operate transparently to the computer system 10, which operates and reacts as if user inputs were being received via the default user interface device 40.
Preferably, the computer interface system 100 operates the alternate user input mechanism by mapping alternate user inputs to the inputs expected by the computer system (or game being executed etc) and converts alternate user inputs as they are received to their mapped inputs before communicating them to the computer system 10.
The computer interface system 100 may be part of the computer system 10 or it may be a separate component. It may be in the form of hardware, software (such as a driver or other software installed on the computer system 10) or some combination of the two.
Preferably, although it is not essential, the alternate user input mechanism is configured to accept substantially different input types and/or substantially more granularly assignable and detectable inputs than those of the default user interface device 40 so that the computer system 10 is useable by users that cannot use, or cannot reliably use, the default user interface device 40. For example, a gesture-based system may be provided to receive alternate inputs that are mapped to inputs of a game controller, remote control etc.
Figure 2 is a schematic diagram of a computer interface system 100 according to an embodiment of the present invention that is suitable for use in the system of Figure 1.
In this embodiment, the computer system 10 is a computer gaming system 10 (or is being used as a computer gaming system) that executes standard computer games available online (either to be played online or downloaded to the data repository 30 for execution from there) or purchased and installed or executed from a removable data store (not shown) such as a CD, DVD, cartridge etc. The computer interface system 100 includes a controller interface 110, a processor 120, a database 130 and an output interface 140.
The controller interface 110 connects to a controller 150 that may or may not be offered as part of the computer interface system 100. The connection may be via cables or wireless (such as Bluetooth).
The database 130 maps inputs received from the controller 150 to inputs expected by a game executed on the computer system 10. An input from the controller 150 via the controller interface 110 is passed to the processor 120, which in turn cross-references it with the database 130 to determine if there is a mapped game input. If there is a mapped game input, the processor 120 causes the mapped game input to be output to the computer system 10, and in turn the game, via the output interface 140.
Therefore, a computer game having a game controller as its default user interface device 40 may expect up-down-left-right-A-B-X-Y etc inputs. The computer interface system maps its controller inputs (which might, for example, be gestures captured by a camera or motion detection device) to individual outputs and translates the inputs using the mapping to provide the appropriate output to the game. For example, it may identify a right arm being raised, find in the mapping that this is the "right" input to the game and therefore output "right" (or its digital or otherwise encoded equivalent) to the game.
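The translation described above can be sketched as a simple lookup table. This is an illustrative sketch under assumptions, not the patent's implementation: the gesture labels, game inputs and function names are all invented for the example.

```python
from typing import Callable, Optional

# Illustrative mapping database (the role of database 130): classified
# alternate input -> input expected by the game. The concrete entries here
# are assumptions; the patent does not prescribe a particular encoding.
GESTURE_TO_GAME_INPUT = {
    "raise_right_arm": "right",
    "raise_left_arm": "left",
    "head_nod": "A",
    "clap": "B",
}

def translate(gesture: str) -> Optional[str]:
    """Look up the mapped game input; None means the gesture is unmapped."""
    return GESTURE_TO_GAME_INPUT.get(gesture)

def handle_controller_input(gesture: str,
                            send_to_game: Callable[[str], None]) -> bool:
    """Forward the mapped input to the game; ignore unmapped gestures."""
    mapped = translate(gesture)
    if mapped is None:
        return False
    send_to_game(mapped)
    return True
```

In the patent's terms, `handle_controller_input` plays the role of the processor 120 and `send_to_game` stands in for the output interface 140.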
In a preferred embodiment, the controller 150 is a gesture or motion sensing input device. For example, it may be a controller such as the Microsoft Kinect (RTM) device, which includes a camera and a depth sensor (and other features such as a time-of-flight sensor, infra-red camera and microphone array). The processor 120 may rely on detection capabilities of the controller 150, such as motion analysis and feature extraction, to determine what is or is not a particular gesture or motion, or it may additionally or alternatively do this itself. A combination of the two approaches may also be taken, for example taking the detected gesture from the controller 150 but also determining the gesture and confidence via another approach such as heuristics, AI etc.
Preferably, inputs generated by the controller 150 (or derived from data from the controller 150) comprise a classification of the detected gesture together with a confidence score that the detected gesture is that which it has been classified as.
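As a minimal sketch of such a classified input with its confidence score, inputs below a threshold might simply be discarded. The data shape and the 0.7 threshold are assumptions for illustration, not values from the specification.

```python
# Sketch: each detected input carries a gesture classification and a
# confidence score; low-confidence detections are discarded. The field
# names and the threshold value are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class DetectedInput:
    gesture: str       # classified gesture label
    confidence: float  # 0.0 .. 1.0

def accept(inp, threshold=0.7):
    """Accept the input only if the classifier is sufficiently confident."""
    return inp.gesture if inp.confidence >= threshold else None

print(accept(DetectedInput("clap", 0.92)))  # clap
print(accept(DetectedInput("nod", 0.40)))   # None
```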
While the alternate user inputs may be fixed or set as defaults in the mappings of the database 130 (so that the computer interface system 100 can be used straight after installation), it is preferred that this is configurable.
In a preferred embodiment, the processor 120 includes a training routine that can be executed to evaluate the abilities of a user, with appropriate alternate user inputs then being suggested and stored in the database 130.
Preferably, guidance or instructions for the training routine are displayed on screen by the processor 120 in place of the game, and the user is prompted to make a sequence of inputs via the controller 150. In the case of gestures/motion, a dummy or avatar may be shown making the desired gesture or motion for the user to copy. The detected gesture and confidence score are then recorded in the database 130 before the next gesture/motion in the sequence is shown. Gestures may be required to be performed multiple times to test repeatability.
Once motions corresponding to all of the sequence have been evaluated, a mapping is determined by the processor 120.
Various factors may affect the selection of the mapping, including:
Degree of confidence of the user being able to make the gesture;
Degree of differentiation over other gestures; and,
Inputs needed by the game.
For example, if following the training routine it is determined that the user cannot hold his or her left arm out to the side or overhead (or cannot do so such that it is reliably and repeatably detected), the computer interface system 100 may instead suggest a different gesture such as leaning of the trunk, head nod or clap for "move left".
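One way such a selection could be performed is a greedy assignment: each required game input is given the remaining gesture the user performed most reliably during training. The names, scores and the selection rule itself are illustrative assumptions; a fuller implementation would also weigh differentiation between gestures.

```python
# Sketch of mapping selection after the training routine: assign each
# required game input to the remaining gesture with the highest training
# confidence. All names and values are illustrative assumptions.

def choose_mapping(required_inputs, gesture_scores):
    # gesture_scores: gesture label -> confidence recorded during training
    available = dict(gesture_scores)
    mapping = {}
    for game_input in required_inputs:
        if not available:
            break
        best = max(available, key=available.get)  # most reliable gesture left
        mapping[best] = game_input
        del available[best]
    return mapping

scores = {"left_arm_out": 0.35, "trunk_lean_left": 0.90,
          "clap": 0.85, "nod": 0.80}
print(choose_mapping(["move left", "move right", "jump"], scores))
# {'trunk_lean_left': 'move left', 'clap': 'move right', 'nod': 'jump'}
```

Here "left_arm_out" is never selected because its training confidence is low, mirroring the "move left" example above.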
An example set of gestures that may be considered are:
Neutral (hands close to sides)
Left/right arm out to the side.
Left/right arm bent at the elbow (hands more or less level with head)
Left/right arm up (vertical)
Left/right arm swipe to side
Left/right arm swipe to belly
Left/right arm punch (straight out in front)
Left/right arm push (straight out in front)
Left/right arm straight in front (not punch/not push)
Left/right arm supinate (rotation of the forearm to face hands upwards)
Trunk lean to side (left/right)
Trunk lean forward
Clap
Nod
Although the processor 120 has been described as a separate component to the processor 20 of the computer system 10, it need not be and the same processor may be used. Likewise, the database 130 may be stored in the data repository 30.
Optionally, multiple different controllers 150 may be used to obtain a broad range of inputs for a user and/or provide a more sophisticated system from a combination of relatively simple controllers 150.
Figure 3 is a schematic diagram of a computer interface system according to an embodiment of the present invention illustrating various further, optional, features.
The computer interface system 100 may also include a bias control 160. Using the bias control 160, the user or other operator may override the mapping and designate one or more control actions, such as particular gestures, to be included. In this way, for a user wishing to play a game or use a computer system such as those described above, gestures that are sub-optimal from the perspective of the computer system (in effect making it slightly harder to control the game or computer system than if the default input were used) can be included in the mapping. While such gestures may be sub-optimal for control of the computer system, requiring more effort or control than other gestures that may be available, they can be selected or determined so as to require the user, while playing a game, to perform a particular exercise or gesture to increase mobility, strength etc. In this way, the alternate user interface can be used to convert standard computer systems into computer systems that additionally aid in therapy.
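A bias control of this kind can be sketched as an override applied on top of the automatically chosen mapping. The function and all names are illustrative assumptions, not from the specification.

```python
# Sketch of the bias control: an operator forces particular (possibly
# sub-optimal, therapeutic) gestures into the mapping, replacing the
# system's automatic choices. All names are illustrative assumptions.

def apply_bias(mapping, overrides):
    """Replace automatically chosen gestures with operator-designated ones.

    mapping:   gesture -> game input (as chosen by the system)
    overrides: game input -> gesture the operator requires (e.g. an
               exercise the user should perform during play)
    """
    inverse = {v: k for k, v in mapping.items()}
    for game_input, gesture in overrides.items():
        old = inverse.get(game_input)
        if old is not None:
            del mapping[old]           # drop the system's choice
        mapping[gesture] = game_input  # force the therapeutic gesture
    return mapping

auto = {"trunk_lean_left": "move left", "clap": "jump"}
biased = apply_bias(auto, {"move left": "left_arm_out"})  # exercise left arm
print(biased)  # {'clap': 'jump', 'left_arm_out': 'move left'}
```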
The computer interface system 100 may include different mappings for a user, for example one set which includes sub-optimal gestures that must be performed for a predetermined (or increasing) time period each day, after which a more optimal (comfortable, less tiring) set may be substituted. Optionally, the processor 120 may monitor confidence in the detected gestures and if confidence drops, suggest switching to a different mapping (and likewise if confidence increases, adjust difficulty of the gesture or suggest re-training to determine if a better mapping can be generated).
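The confidence monitoring described above might be implemented as a rolling mean of recent scores with low and high thresholds. The window size, threshold values and suggestion strings below are illustrative assumptions only.

```python
# Sketch of confidence monitoring: a rolling mean of recent confidence
# scores triggers a suggestion to switch mapping or re-train. Window
# size and thresholds are illustrative assumptions.

from collections import deque

class ConfidenceMonitor:
    def __init__(self, window=20, low=0.5, high=0.9):
        self.scores = deque(maxlen=window)  # rolling window of scores
        self.low, self.high = low, high

    def record(self, confidence):
        self.scores.append(confidence)
        mean = sum(self.scores) / len(self.scores)
        if mean < self.low:
            return "suggest easier mapping"
        if mean > self.high:
            return "suggest re-training for a better mapping"
        return "keep current mapping"

mon = ConfidenceMonitor(window=3)
for c in (0.40, 0.45, 0.42):
    result = mon.record(c)
print(result)  # suggest easier mapping
```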
Although the database is preferably stored locally, it may also be synchronised to a central data store 170 for monitoring and/or consolidation with the databases of other users (and also optionally allowing a user to use his or her mappings across multiple gaming systems). Additionally, the database (and preferably the central database) may be accessible by clinicians or others involved in treatment/therapy to enable monitoring or adjustment of the gestures to be performed. Likewise, settings for training, heuristic classification etc may be pushed out to local systems from the central data store 170.
Inputs to a computer system that may be needed/desirable in mappings may be determined in a number of ways. This may be a manual process with the default input device 40 being connected to the computer interface system 100 or otherwise being monitored during mapping assignment/training; they may be selected from a list (eg. up, down, a particular keystroke, button "A" etc) or they may be matched or determined from knowledge or analysis of the computer system 10.
Inputs may be matched in a number of ways - for example, they may be classified according to detectable attributes of the computer system 10 such as process identifier, file being executed or other attributes that can be obtained from the computer system 10. A default mapping may be provided (or a mapping according to genre type such as first person shooter, running and obstacle avoidance, driving, platform, TV remote control, Xbox type controller, etc). Games may be mapped in many ways. For example, the processor may select from the hierarchy:
Mapping for specific game;
Mapping for genre of game;
Default mapping.
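The hierarchy above amounts to a cascading lookup. The identifiers (a process/file name standing in for the specific game) and the mappings themselves are illustrative assumptions.

```python
# Sketch of the mapping hierarchy: prefer a mapping for the specific
# game, then one for its genre, then fall back to the default mapping.
# The game identifier and all mappings here are illustrative assumptions.

def select_mapping(game_id, genre, specific, by_genre, default):
    if game_id in specific:   # mapping for specific game
        return specific[game_id]
    if genre in by_genre:     # mapping for genre of game
        return by_genre[genre]
    return default            # default mapping

specific = {"racer2000.exe": {"trunk_lean_left": "steer left"}}
by_genre = {"driving": {"left_arm_out": "steer left"}}
default = {"left_arm_out": "left"}

# Unknown game of a known genre falls through to the genre mapping:
print(select_mapping("platformer.exe", "driving", specific, by_genre, default))
# {'left_arm_out': 'steer left'}
```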
A similar hierarchy may be provided for non-gaming systems/applications. The mappings may be obtained or updated from the central data store 170. It will be appreciated that the controllers 150 need not be the same type/model and could be selected based on the abilities of the user and also the types of games they wish to play, the computer interface system being adaptable to both controller types and game types.
Figure 4 is a schematic diagram of a computer system including a computer interface system according to another embodiment of the present invention. In this embodiment, the computer system 10 is a home entertainment system 10.
In this embodiment, the computer interface system 100 includes a controller 200, a controller interface 110, a processor 120, a database 130 and an output interface 140.
The controller 200 is a wearable device such as a smartwatch, or another worn device such as an accelerometer, magnetometer, gyroscope or a combination such as an inertial measurement device. The controller 200 may be bespoke or may be a pre-configured device having these capabilities that are accessed through an API or similar on the device 200. Likewise, a number of worn devices may be used as the controller 200. The controller interface 110 connects to the controller 200, typically via Bluetooth. The connection may be direct or alternately, and as illustrated, it may be via an intermediate device such as a smartphone 210, with the smartphone 210 monitoring the controller 200 via Bluetooth to determine gestures made by the user wearing the controller 200 and then relaying the data to the controller interface 110, for example over a data communications network 220 such as a cellular or WiFi network. In such an arrangement, the determination of whether a gesture has been made may take place in an app running on the smartphone 210, in the computer interface system 100 or in the controller 200 (or may in some way be distributed between them).
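The smartphone-side part of this arrangement might look like the following sketch: samples from the worn device are classified, and only detected gestures are forwarded. The detection rule (a simple acceleration-magnitude threshold), the gesture label and all names are illustrative assumptions; a real app would use the worn device's API and a network connection.

```python
# Sketch of the smartphone relay: read samples from the worn device,
# decide whether a gesture was made, and forward only detected gestures
# to the controller interface. The threshold rule and names are
# illustrative assumptions.

import math

def detect_gesture(sample, threshold=1.5):
    """Very rough detection: a large acceleration magnitude => 'shake'."""
    magnitude = math.sqrt(sum(axis * axis for axis in sample))
    return "shake" if magnitude > threshold else None

def relay(samples, send):
    """Forward each detected gesture via the supplied send callable
    (standing in for transmission over the network 220)."""
    for sample in samples:
        gesture = detect_gesture(sample)
        if gesture is not None:
            send(gesture)

sent = []
relay([(0.1, 0.0, 1.0), (2.0, 1.5, 0.3)], sent.append)
print(sent)  # ['shake']
```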
As in previous embodiments, the database 130 maps inputs received from the controller 200 to inputs expected by the home entertainment system 10. An input from the controller 200 via the smartphone 210, network 220 and controller interface 110 is passed to the processor 120, which in turn cross-references it with the database 130 to determine if there is a mapped home entertainment input. If there is a mapped home entertainment input, the processor 120 causes the mapped home entertainment input to be output to the home entertainment system 10 via the output interface 140.
It will be appreciated that an output device 230 may be the recipient of the output via the output interface 140. For example, this may be an infra-red transmitter box or similar that receives the home entertainment input and then outputs this as one or more infra-red signals. In this way, the output device can act as a proxy/signal converter and enable the computer interface system 100 to work with devices/computer systems 10 that would not otherwise be controllable.
It is to be appreciated that certain embodiments of the invention as discussed above may be incorporated as code (e.g., a software algorithm or program) residing in firmware and/or on computer useable medium having control logic for enabling execution on a computer system having a computer processor. Such a computer system typically includes memory storage configured to provide output from execution of the code which configures a processor in accordance with the execution. The code can be arranged as firmware or software, and can be organized as a set of modules such as discrete code modules, function calls, procedure calls or objects in an object-oriented programming environment. If implemented using modules, the code can comprise a single module or a plurality of modules that operate in cooperation with one another.
Optional embodiments of the invention can be understood as including the parts, elements and features referred to or indicated herein, individually or collectively, in any or all combinations of two or more of the parts, elements or features, and wherein specific integers are mentioned herein which have known equivalents in the art to which the invention relates, such known equivalents are deemed to be incorporated herein as if individually set forth. Although illustrated embodiments of the present invention have been described, it should be understood that various changes, substitutions, and alterations can be made by one of ordinary skill in the art without departing from the present invention which is defined by the recitations in the claims below and equivalents thereof.
The content of the abstract filed herewith and GB 1703073.5 from which priority is claimed are hereby incorporated by reference.

Claims
1. A computer interface system for providing an alternate interface to a computer system, the computer interface system comprising a controller interface, a processor, a database and an output interface, and being connectable to the computer system via the output interface and to a controller via the controller interface, the database mapping each of a plurality of user inputs of the controller to a respective input of the computer system, wherein upon receipt of a user input at the controller interface via the controller, the processor is configured to determine, from the database, the mapped respective input to the computer system and communicate the mapped input to the computer system via the output interface.
2. The computer interface system of claim 1, wherein the processor is configured to execute a calibration routine to determine a user input, the calibration routine including:
receiving inputs from the user via the controller;
classifying the inputs into distinguishable inputs;
determining, from the inputs of each distinguishable input, a threshold or range classifying the inputs;
receiving a selection of one of the distinguishable inputs and one of the inputs of the computer system; and,
recording the threshold or range for the selected distinguishable input and the input of the computer system as a mapping in the database.
3. The computer interface system of claim 2, wherein the processor is arranged to adjust the threshold or range in the database over time in dependence on predetermined criteria.
4. The computer interface system of claim 3, wherein the predetermined criteria include feedback on the user achieving the distinguishable input.
5. The computer interface system of claim 3 or 4, wherein the processor is configured to adjust the threshold or range over time to drive it toward a predetermined optimal threshold or range.
6. The computer interface system of claim 5, wherein the predetermined criteria include deviation from a predetermined optimal threshold or range.
7. The computer interface system of any preceding claim, wherein the computer system comprises a computer game system.
8. The computer interface system of any preceding claim, wherein the computer interface system is executed independently of the computer system.
9. The computer interface system of any preceding claim, wherein the computer interface system is executed remotely of the computer system and the output interface is arranged to communicate with the computer system via a data communications network.
10. The computer interface system of any preceding claim, wherein the controller comprises a gesture detection system.
11. The computer interface system of claim 10, wherein the gesture detection system is configured to monitor the user and includes one or more of:
a camera, a depth sensor, time of flight sensor, a microphone, an inertial measurement device, an accelerometer, a gyroscope, and a magnetometer.
12. A computer implemented interface method for providing an alternate interface to a computer system, comprising:
mapping, in a database, each of a plurality of user inputs receivable from a controller to a respective input of a computer system;
receiving one of the user inputs from the controller;
determining the mapped input of the computer system from the database; and,
outputting the determined input to the computer system.
13. The method of claim 12, further comprising:
executing, by a processor, a calibration routine to determine a user input, the calibration routine including:
receiving inputs from the user via the controller;
classifying the inputs into distinguishable inputs;
determining, from the inputs of each distinguishable input, a threshold or range classifying the inputs;
receiving a selection of one of the distinguishable inputs and one of the inputs of the computer system; and,
recording the threshold or range for the selected distinguishable input and the input of the computer system as a mapping in the database.
14. The method of claim 13, further comprising adjusting the threshold or range in the database over time in dependence on feedback on the user achieving the distinguishable input during use of the alternate interface.
15. The method of claim 13 or 14, further comprising adjusting the threshold or range over time to drive it toward a predetermined optimal threshold or range.
PCT/GB2018/050490 2017-02-24 2018-02-26 Computer interface system and method WO2018154327A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1703073.5 2017-02-24
GBGB1703073.5A GB201703073D0 (en) 2017-02-24 2017-02-24 Computer game interface system and method

Publications (1)

Publication Number Publication Date
WO2018154327A1 true WO2018154327A1 (en) 2018-08-30

Family

ID=58544245

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2018/050490 WO2018154327A1 (en) 2017-02-24 2018-02-26 Computer interface system and method

Country Status (2)

Country Link
GB (1) GB201703073D0 (en)
WO (1) WO2018154327A1 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100199229A1 (en) * 2009-01-30 2010-08-05 Microsoft Corporation Mapping a natural input device to a legacy system
US20130339850A1 (en) * 2012-06-15 2013-12-19 Muzik LLC Interactive input device
US20140194065A1 (en) * 2013-01-04 2014-07-10 Kopin Corporation AD-HOC Network

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
TOYIN OSUNKOYA ET AL: "Gesture-Based Human-Computer-Interaction Using Kinect for Windows Mouse Control and PowerPoint Presentation", 1 December 2013 (2013-12-01), Proc. The 46th Midwest instruction and computing symposium (MICS2013), XP055479545, Retrieved from the Internet <URL:https://pdfs.semanticscholar.org/6fec/f4e7f95e806b3304fb0a88c7262e7ef9ce29.pdf> [retrieved on 20180530] *
VINCENT TAM ET AL: "Integrating the Kinect camera, gesture recognition and mobile devices for interactive discussion", TEACHING, ASSESSMENT AND LEARNING FOR ENGINEERING (TALE), 2012 IEEE INTERNATIONAL CONFERENCE ON, IEEE, 20 August 2012 (2012-08-20), pages H4C - 11, XP032268883, ISBN: 978-1-4673-2417-5, DOI: 10.1109/TALE.2012.6360362 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11806630B1 (en) 2022-05-31 2023-11-07 Sony Interactive Entertainment LLC Profile-based detection of unintended controller errors
WO2023235089A1 (en) * 2022-05-31 2023-12-07 Sony Interactive Entertainment LLC Profile-based detection of unintended controller errors

Also Published As

Publication number Publication date
GB201703073D0 (en) 2017-04-12

Similar Documents

Publication Publication Date Title
US10933313B2 (en) Programmable actuation inputs of an accessory and methods thereof
US11435825B2 (en) Haptic interaction method, tool and system
US11701585B2 (en) Gaming device with independent gesture-sensitive areas
US10086267B2 (en) Physical gesture input configuration for interactive software and video games
US9409087B2 (en) Method and apparatus for processing gestures
US20110195781A1 (en) Multi-touch mouse in gaming applications
US20220296996A1 (en) Variable actuators of an accessory and methods thereof
US20140329589A1 (en) Method and apparatus for configuring a gaming environment
JP2023502243A (en) Latency compensation using user-input machine learning prediction
CN115427122A (en) Virtual console game controller
US20220365591A1 (en) Method and apparatus for virtualizing a computer accessory
US20140160037A1 (en) Method and apparatus for configuring and selectively sensing use of a device
WO2018154327A1 (en) Computer interface system and method
TWI492095B (en) A method of handling a computer by a portable calling device
US11745101B2 (en) Touch magnitude identification as input to game
TWM449618U (en) Configurable hand-held system for interactive games
US11908097B2 (en) Information processing system, program, and information processing method
US20230128658A1 (en) Personalized vr controls and communications
JP2024512346A (en) Controller state management for client-server networking
AU2022234397A1 (en) Virtual automatic aiming
KR20140030390A (en) Input device having different usability according to grip condition

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18715792

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18715792

Country of ref document: EP

Kind code of ref document: A1