GB2619543A - Input mapping modification system and method - Google Patents


Info

Publication number
GB2619543A
Authority
GB
United Kingdom
Prior art keywords
user
mapping
input
peripherals
inputs
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2208459.4A
Other versions
GB202208459D0 (en)
Inventor
Cavalla Nicola
Current Assignee
Sony Interactive Entertainment Inc
Original Assignee
Sony Interactive Entertainment Inc
Priority date
Filing date
Publication date
Application filed by Sony Interactive Entertainment Inc filed Critical Sony Interactive Entertainment Inc
Priority to GB2208459.4A
Publication of GB202208459D0 (en)
Publication of GB2619543A (en)
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/22: Setup operations, e.g. calibration, key configuration or button assignment
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

Input(s) received 700 from peripheral(s) (e.g. 400, 42 figs. 3, 4) result in function(s) being performed 710 in accordance with input mapping defined for respective peripheral(s). A physical state of the user is determined 720 by a user determination unit (620, fig. 5) and may include user difficulty, injury or fatigue. The physical state of a user may be determined based on response-times, a user consistently failing to press a button, duration of use, captured audio / images, gaze tracking, performance, focus or biometric information. Alternatively, the user may identify physical limitations or restrictions in their user profile. Input mapping of peripheral(s) is modified 730 in response to a user’s physical state. For example, by remapping the functions of buttons on a controller; modifying input mapping from a first peripheral to a second (e.g. from voice to gaze inputs); modifying input mapping from one hand to another; modifying input mapping from a single peripheral to a plurality of peripherals.

Description

INPUT MAPPING MODIFICATION SYSTEM AND METHOD

BACKGROUND OF THE INVENTION
Field of the invention
This disclosure relates to an input mapping modification system and method.
Description of the Prior Art
The "background" description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent it is described in this background section, as well as aspects of the description which may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present invention.
With increasing levels of interactivity and immersion offered by virtual experiences (such as video games, video content, and immersive applications) in recent years, there has also been an increase in the complexity of interacting with those experiences. This can be realised in a number of ways, such as increasing the number of peripherals and/or the number of functions able to be performed with each peripheral. The range of inputs that can be provided may also be increased; for instance, a user may be encouraged to provide motion inputs (such as hand gestures) while holding a peripheral.
For instance, consider virtual reality arrangements; in many cases, these require (or at least encourage) the use of two handheld controllers for a complete user experience. Each of these controllers is often expected to be used for motion-tracked inputs as well as more traditional button inputs. This can place a significant burden on users, particularly those who may have difficulty with operating peripherals for any of a number of reasons; such a burden may also be increased throughout the duration of a play session, due to user fatigue or the like.
Such a burden may be experienced differently by different users, and as such a flexible approach may be desired in which users are able to reduce the operational complexity of a peripheral. One example of this is the use of customised controllers; these have been relied upon to address user accessibility concerns, as well as to provide more convenient inputs. Examples of customisations include more responsive buttons/triggers (so as to reduce the difficulty of pressing them), and the addition of new inputs such as paddles on the rear of the controller to replace other inputs (such as replacing shoulder buttons on a controller, which can be harder for users to reach).
It is therefore considered advantageous to reduce the operational burden upon a user of peripherals for providing inputs for controlling a processing device, such as a games console, to control an interactive user experience.
It is in the context of the above discussion that the present disclosure arises.

SUMMARY OF THE INVENTION
This disclosure is defined by claim 1.
Further respective aspects and features of the disclosure are defined in the appended claims.
It is to be understood that both the foregoing general description of the invention and the following detailed description are exemplary, but are not restrictive, of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
A more complete appreciation of the disclosure and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:

Figures 1 and 2 schematically illustrate a user playing a game using a Sony® PlayStation® games console;
Figures 3 and 4 schematically illustrate exemplary input devices;
Figure 5 schematically illustrates a method for mitigating user problems with operating input devices;
Figure 6 schematically illustrates a system for modifying an input mapping for one or more peripherals; and
Figure 7 schematically illustrates a method for modifying an input mapping for one or more peripherals.

DESCRIPTION OF THE EMBODIMENTS

Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views, embodiments of the present disclosure are described.
Figure 1 schematically illustrates a user wearing a head-mountable display (HMD) connected to a Sony® PlayStation® games console 300 as an example of a base device. The games console 300 is connected to a mains power supply 310 and (optionally) to a main display screen (not shown). A cable 82, acting as power supply and signal cables, links the HMD 20 to the games console 300 and is, for example, plugged into a USB socket 320 on the console 300.
In Figure 1, the user is also shown holding a hand-held controller 330 which may be, for example, a Sony® Move® controller which communicates wirelessly with the games console 300 to control (or to contribute to the control of) operations relating to a currently executed program at the games console.
A camera 305 is associated with the console 300 to capture images of the user 10 and/or the controller 330.
The Move controller comprises a handle portion 100 and an illuminated end portion 110. The handle portion 100 can carry one or more control buttons and houses a so-called inertial measurement unit (IMU) which will be described in more detail below. The illuminated end portion 110 comprises one or more light emitting diodes (LEDs) inside a translucent spherical shell and which are capable of being illuminated, for example under the control of an apparatus such as the games console 300.
The Move controller 330 provides an example of a control device comprising an elongate handle portion 100 which houses the inertial detector 332, the wireless interface 334 and an illuminated end portion 110 at an end of the handle portion.
In use, the IMU transmits inertial detections to the games console 300 and the games console 300 also tracks, using images captured by the camera 305, the location of the illuminated end portion 110.
A pair of video displays in the HMD 20 are arranged to display images provided via the games console 300, and a pair of earpieces 60 in the HMD 20 are arranged to reproduce audio signals generated by the games console 300. The games console may be in communication with a video server. The USB connection from the games console 300 also provides power to the HMD 20, according to the USB standard.
Figure 2 schematically illustrates a similar arrangement in which the games console is connected (by a wired or wireless link) to a so-called "break out box" acting as a base or intermediate device 350, to which the HMD 20 is connected by a cabled link 82. The breakout box has various functions in this regard. One function is to provide a location, near to the user, for some user controls relating to the operation of the HMD, such as (for example) one or more of a power control, a brightness control, an input source selector, a volume control and the like. Another function is to provide a local power supply for the HMD (if one is needed according to the embodiment being discussed). Another function is to provide a local cable anchoring point. In this last function, it is not envisaged that the break-out box 350 is fixed to the ground or to a piece of furniture, but rather than having a very long trailing cable from the games console 300, the break-out box provides a locally weighted point so that the cable 82 linking the HMD 20 to the break-out box will tend to move around the position of the break-out box. This can improve user safety and comfort by avoiding the use of very long trailing cables.
The arrangements of Figures 1 and 2 are considered to be purely exemplary, in that the hardware arrangements used to implement embodiments of the present disclosure are not limited to the configurations shown in these Figures. For example, in some cases it may be considered appropriate to use a display device such as a television rather than an HMD; similarly, any suitable peripheral (or peripherals) may be operated by the user. Figures 3 and 4 show examples of two such peripherals that may be used to provide inputs to a processing device. In some embodiments, rather than a games console it is considered that other processing devices may be used, such as general purpose computers, mobile phones, or processing devices integrated with display devices (such as televisions or HMDs).
Figure 3 schematically illustrates a controller 400 that is intended to be operated in a two-handed fashion by a user (although in some cases, a single hand operation of the controller may be suitable). The controller 400 comprises a number of elements to enable a user to provide inputs to control processing by a processing device (such as playing a game); these include buttons 410, directional buttons 420, joysticks 430, and a touchpad 440. Further inputs may also be provided using a microphone, or through motion tracking of the controller (via camera-based tracking and/or inertial motion sensors, for example). The controller 400 may also include elements such as haptic feedback and/or speaker arrangements which increase the interactivity of the controller.
Figure 4 schematically illustrates two views of a controller 42 that is intended to be operated in a single-handed fashion by a user. Such a controller is primarily used by being held in the user's hand, and tracking movements of the user's respective hand to assist with representing the user within a virtual environment. The illustrated controller 42 comprises a tracking object 450 such as an illuminated ball, which may be used to optically track the controller's position in space by a camera connected to a host device hosting the virtual reality environment. Although not shown, the VR controller may also comprise for example one or more accelerometers and/or gyroscopes to track changes in acceleration and hence optionally also velocity and position, these being integrated either within the controller or by the host device.
The VR controller also comprises a handle 454 and a number of inputs 452, here labelled A-G. The position and number of controls in the figures are purely exemplary. These controls may be arranged in any manner suitable to the functioning and ergonomics of the controller. In the illustrated example, there are three basic groups; buttons A-D may correspond to buttons on a standard videogame controller and have corresponding functions, and similarly button E may correspond to a button on the standard controller such as a trigger or action button, but transposed relative to buttons A to D so as to be easily accessible by the user's thumb. Button F is a trigger typically used with an index finger. Finally, button G is physically separate from the other buttons to avoid accidental use and may be used to trigger or select out-of-gameplay functions such as game menus or operating system menus.
Each of these exemplary controllers offers a range of inputs that vary in their ease of use; this ease of use may vary for respective inputs between users. For instance, users with smaller hands and/or limited mobility may have difficulty operating triggers located on the shoulders of the controller 400, or those buttons closer to the centre of the controller. Similarly, some users may find the use of joysticks 430 easier than the buttons due to physical constraints. With reference to the controller 42, it is considered that a user may find the trigger F easier to operate than the buttons A-E; it is also noted that the ease of operation for the buttons A-D may vary depending on which hand the controller 42 is held in, as this will change the distance the thumb has to reach to press these buttons.
Of course, in some cases the controllers may be used to provide motion-based inputs instead of (or in addition to) button presses or the like. Motion-based inputs may have varying levels of difficulty for different users, and can also generate fatigue, particularly for those who may have found the motion a challenge to begin with. Such fatigue may also be experienced with non-motion-based inputs, for example through repetitive pressing of the same button, particularly if the button is hard for the user to reach.
It is therefore considered that the ease of interaction with input devices may vary in accordance with a number of variables. Different users may experience different challenges, and these may be different for each of a user's hands (for instance, if the hands have different levels of dexterity). Different users may also have different levels of resilience when it comes to using an input device, such that some users will fatigue more quickly and/or experience different difficulties resulting from this fatigue. User fatigue may relate to the overall condition of the user, such as general energy levels, or may be determined on a more specific basis such as hand fatigue, neck fatigue, wrist fatigue, elbow fatigue, general arm fatigue, or fatigue for any specific body part or collection of body parts.
Of course, the present disclosure should not be limited only to difficulties resulting from fatigue; any deterioration in the ability of a user to interact over the course of a gameplay (or other interaction) session may be detected, such as detecting cramps or injuries resulting from overexertion as a part of the interaction process.
Figure 5 schematically illustrates a method for mitigating such problems for a user of one or more input devices associated with a particular processing device. This is distinct from a user having difficulty with the content they are interacting with; instead, this method seeks to identify (or predict) difficulty with operating input devices. For instance, failing a level repeatedly would not be considered a difficulty with operating an input device, but consistently failing to press a button (which could lead to failing a level) would be.

A step 500 comprises receiving inputs from a user controlling processing of an application executed by a processing device. These may be any combination of motion inputs, button presses, or other interactive element manipulations (such as using touch pads, touch screens, or joysticks).
A step 510 comprises determining increasing user difficulty with providing inputs; this may comprise a determination of the physical state of the user. This increasing difficulty may be determined in any suitable manner, and may be predictive or detected as appropriate for a given implementation. For instance, user fatigue may correlate with user difficulty with inputs (as operation will become more challenging as the user tires) and as such user fatigue may be the quantity that is determined. This detection may be based upon any suitable input information, with examples of the detection being provided below. Exemplary information that may be utilised includes user response times, user performance, biometric information, information about the duration of play, user inputs (and variations in the inputs over time), captured audio or images of the user, and/or gaze tracking information. Any combination of these may be used to generate an indication or prediction of user difficulty in providing inputs.
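The determination of step 510 could be sketched as a weighted combination of some of the signals listed above. The following Python sketch is purely illustrative; the function name, weights, and thresholds are assumptions for this example rather than values taken from the disclosure.

```python
def fatigue_score(response_ms, baseline_ms, session_minutes,
                  missed_inputs, total_inputs):
    """Return a 0..1 fatigue estimate from a few monitored signals.

    Illustrative only: the weights and saturation points are assumptions.
    """
    # Slower-than-baseline responses suggest tiring (capped at a doubling).
    slowdown = min(max(response_ms / baseline_ms - 1.0, 0.0), 1.0)
    # Session duration contributes gradually (saturates after two hours).
    duration = min(session_minutes / 120.0, 1.0)
    # Missed or failed inputs are a direct sign of difficulty.
    miss_rate = missed_inputs / total_inputs if total_inputs else 0.0
    # Illustrative weighting of the three signals into one score.
    return 0.4 * slowdown + 0.3 * duration + 0.3 * miss_rate
```

A score exceeding some threshold could then trigger the modification step described next, or contribute to a prediction before difficulty becomes severe.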
A step 520 comprises modifying the interaction of a user with the one or more input devices; this may include modifying an input mapping for one or more controllers operated by the user, and/or modifying one or more other characteristics of the controller or controllers. This can include changing the functions associated with respective input elements, for example, and/or changing the sensitivity of inputs to better accommodate a user's physical state. In some cases, the type of input can be modified; for example, if a user's hands become fatigued then it can be useful to remap functions to voice or gaze inputs even if these would otherwise provide a less precise or accurate input. Such a decrease in performance may not be noticed by a user, however, in the case that their physical state has deteriorated such that inputs are not able to be accurately provided using the default inputs.
The modification of an input mapping can be implemented in a number of ways; the purpose of this is to assign functions to input elements (such as buttons on a controller) in such a manner so as to ease the operation burden upon the user. In some embodiments, the modification may be based upon information about the user state; for instance, the modifications may be tailored to particular user difficulties so as to remap inputs to alleviate specific issues of the user.
Other characteristics that may be modified include the reduction of haptic feedback, for example, as this can increase user discomfort or difficulty if they are particularly sensitive to such sensations. This step may include the provision of a notification to (and/or require a confirmation by) the user before changes are implemented.
A step 530 comprises receiving further inputs from a user to control the processing based upon the modification of interactions as performed in step 520. For instance, this may comprise the user providing inputs using a new mapping of buttons and functions.
A method such as this can therefore be seen to mitigate user issues with using input devices throughout a gameplay session (or other session of interaction with a processing device) by modifying the way in which the user interacts in response to the detection or prediction of such issues.
A prediction of user issues with operating an input device may be based upon any number of inputs, such as past performance (which can be game or activity specific) and detections of changes in user performance (in-game, or in operation of the input device). A user may provide information as a part of their user profile to assist in this process, indicating any physical limitations or restrictions that may impact their ability to interact freely with an input device throughout a gameplay session. For instance, if a user knows that their right hand tires after a particular duration of gameplay then they may include this in their profile information. Such information may also be recorded by the system without requiring specific user input, by monitoring how the user interacts.
A detection of user issues may be based upon similar information; the detection and prediction may differ in the timing of the determination of the user state (that is, before the fatigue reaches a threshold level or after). For example, a user profile may indicate explicit indications of user fatigue or injury that they have encountered in the past; for instance, a user may be able to indicate in their profile that they struggle to press certain buttons when fatigued, and as a result such a struggle can be interpreted as a sign of user fatigue.
In some examples, reaction times may be monitored as a part of the determination; these may be compared to average values for a user, or against values for a user throughout a particular interaction session. The monitoring may include detection of how long it takes a user to provide a requested or expected input in response to a stimulus, for instance, and/or a detection of how a user's input timing changes over time. For instance, if a user begins to show difficulty executing particular combinations of successive button presses then it may be considered that their reactions are slowed.
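Reaction-time monitoring of this kind could be sketched as a rolling average compared against a per-user baseline. The class name, window size, and 25% threshold below are assumptions made for illustration.

```python
from collections import deque

class ReactionMonitor:
    """Flag slowing reactions by comparing a rolling average of recent
    reaction times against the user's baseline. Illustrative sketch only."""

    def __init__(self, baseline_ms, window=20, threshold=1.25):
        self.baseline_ms = baseline_ms
        self.threshold = threshold
        self.samples = deque(maxlen=window)  # most recent reaction times

    def record(self, reaction_ms):
        # Store the time taken to respond to a stimulus.
        self.samples.append(reaction_ms)

    def slowing_detected(self):
        # Only decide once a full window of samples has been collected.
        if len(self.samples) < self.samples.maxlen:
            return False
        avg = sum(self.samples) / len(self.samples)
        # Flag when the recent average exceeds baseline by the threshold.
        return avg > self.baseline_ms * self.threshold
```

In practice the baseline could be a long-term average for the user, or values measured earlier in the same session, as described above.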
If a user neglects to use certain buttons or inputs for an extended period of time, then this can be considered to be an indication of difficulty in doing so. The determination of an 'extended period' may be based upon the game or application in some embodiments; this may be based upon the importance of the function associated with the input, for instance, and/or an expected frequency of use (based upon user history or average user values). As an example, if a user has not pressed the run button for ten in-game minutes in a football game, then it may be considered likely that the user is having difficulty using that button and that a remapping may be desirable.
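Detecting neglected inputs could be implemented by tracking the last use of each button against a per-button idle threshold. The names and the example thresholds below are assumptions for this sketch; real thresholds would derive from function importance or expected frequency, as discussed above.

```python
class NeglectDetector:
    """Report buttons that have gone unused past a per-button threshold.

    'thresholds' maps button names to the idle period (in seconds) after
    which neglect is suspected; values are illustrative assumptions.
    """

    def __init__(self, thresholds):
        self.thresholds = thresholds
        # Assume all buttons were last "used" at time zero.
        self.last_used = {button: 0.0 for button in thresholds}

    def press(self, button, now):
        # Record the time at which a button was pressed.
        self.last_used[button] = now

    def neglected(self, now):
        # Buttons idle for longer than their threshold are candidates
        # for remapping to an easier-to-reach input element.
        return [b for b, t in self.last_used.items()
                if now - t > self.thresholds[b]]
```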
In some cases, a user's performance within a game may be used as an indicator of user difficulty with providing inputs. While it may be the case that overall performance is measured (such as completion rate or total score), it may also or instead be patterns or changes in user performance that are measured. For instance, failure at particular tasks may be more indicative of user issues by considering the challenges posed by that task. For example, a task that relies upon precision operation (such as shooting a small target) may be considered differently to a task that relies upon fast reflexes (such as dodging enemies). Even if the task as a whole is successful, analysis of weaknesses in the user's performance at particular aspects of the task may still be useful in determining the user's physical state.
Gaze tracking information (or other information that can be used to determine a user reaction, such as capturing audio or head motion) may also be used as a part of the determination process. Such information can be used to determine when a user has mentally responded to a particular stimulus (such as first spotting an enemy), which can provide a reference point against which to measure reactions. It is also considered that if the user reaction becomes more delayed, then this may be a sign of fatigue.
In some cases, a change in the force or force profile with which a user provides inputs (such as how hard and/or fast a user presses a button or trigger) may be considered; similarly, for motion-based inputs, a speed and/or range of motion may be considered. Such information may be indicative of user fatigue or injury as a lesser force, speed, and/or range of motion may be considered to represent a user not performing an input to their usual standard. It is therefore considered appropriate to monitor user inputs over time, and to detect deviations between similar inputs at different times as an indicator of the user's physical state. In some cases, it may be necessary to consider in-game conditions or an equivalent to ensure that the stimulus is sufficiently similar such that the same input by the user would be expected to be provided.
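A deviation measure of this kind could be as simple as the relative drop of the latest input force (or speed, or range of motion) against the user's historical mean for similar inputs. The function below is an illustrative assumption, not a method specified by the disclosure.

```python
def input_deviation(current, history):
    """Relative drop of the latest input magnitude (force, speed, or range
    of motion) against the historical mean for similar inputs.

    Returns 0.0 when the current input matches or exceeds the user's usual
    standard; values approaching 1.0 indicate a large drop-off.
    """
    mean = sum(history) / len(history)
    # Only a *drop* relative to the mean is treated as a fatigue signal.
    return max((mean - current) / mean, 0.0)
```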
In one example, the modification of a mapping for an input device may be implemented on-the-fly, changing the assignment of one or more inputs until it is determined that the user is no longer suffering from fatigue or the like (or at least their ability to operate the input device is no longer suffering). This can be based upon information about which buttons are used most frequently, and which are used least frequently, and modifying the mapping so as to cause those being used most to be nearer the user. This may be particularly useful in cases in which the dominant inputs vary throughout gameplay, for instance; an example of this is a game which has puzzle segments and action segments: in the former case, there may be a greater reliance on a 'use' or 'interact' button, while in the latter case there may be a greater reliance on 'shoot' or 'run' buttons. The mapping may also consider user reaction times for operating each of the input elements; those which have the slowest reaction time (such as if a user has to stretch to reach a button) may be assigned the least consequential functions for particular gameplay.
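The frequency-based remapping described above can be sketched as pairing the most-used functions with the easiest-to-reach buttons. The button names and reachability ordering below are illustrative assumptions.

```python
def remap_by_frequency(press_counts, reachability):
    """Assign the most-used functions to the easiest-to-reach buttons.

    press_counts: function name -> number of uses in the current segment.
    reachability: button names ordered from easiest to hardest to reach
                  (an assumed, possibly per-user, ordering).
    Returns a new mapping of button -> function.
    """
    # Functions sorted by usage, most frequent first.
    by_use = sorted(press_counts, key=press_counts.get, reverse=True)
    # Pair the most-used function with the most reachable button.
    return dict(zip(reachability, by_use))
```

For example, when a game enters an action segment where 'shoot' dominates, the same call with updated counts would move 'shoot' onto the most reachable button.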
Alternatively, or in addition, the mapping may utilise two or more predefined profiles which can be switched between in dependence upon a determination of a particular user state; these can be associated with a particular user profile, or defined by a particular game or processing device/platform.
For instance, a user who expects certain difficulties interacting with input devices over a period of time may be able to define personal mapping profiles to reflect this.
One example of this is a left-handed and a right-handed profile for a device that is intended to be used single-handedly; when it is determined that a user is fatigued using the controller in their right hand, the mapping can be changed to the left-handed mapping which can allow the user to switch hands and continue playing with their less-fatigued hand. In a two-handed controller example, a comparable mapping change may involve a remapping between buttons 410 and 420 as shown in Figure 3, for instance. This can effectively change the handedness of the controller despite both hands still being used.
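Switching handedness amounts to swapping functions between mirrored input elements while leaving unmirrored ones untouched. The button pairing used below is an assumed example for illustration.

```python
def mirror_mapping(mapping, mirror_pairs):
    """Swap functions between mirrored buttons to flip the handedness of a
    mapping; which buttons mirror which is an assumption of this sketch.

    mapping:      button -> function for the current (e.g. right-handed) profile.
    mirror_pairs: pairs of buttons that mirror one another across hands.
    """
    swap = {}
    for a, b in mirror_pairs:
        swap[a], swap[b] = b, a
    # Buttons without a mirror partner keep their current function.
    return {swap.get(button, button): fn for button, fn in mapping.items()}
```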
In some embodiments, the remapping may include assigning multiple functions to a single button; this is similar to the use of macros or other shortcuts. This can be advantageous as complex functions can be performed more easily, although the lack of flexibility may impact user performance. For instance, if a button was mapped so as to perform two jump actions in a row, then the user would no longer be able to control the timing of the second jump and so may perform worse in a game. However, this may be a suitable trade-off if a user is having difficulty operating the input device -for instance, if they were not able to press the same button twice in quick succession, and so were unable to otherwise perform a double jump. These remappings may be predefined for a game and/or a user, or may be defined on the fly based upon information about the expected inputs from a user (such as based upon information about a game state).
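A macro-style remapping of this kind could expand a single press into a fixed sequence of actions; the double-jump example matches the one above, while the function and action names are assumptions.

```python
def expand_macros(pressed, macros):
    """Expand button presses into action sequences using macro mappings.

    pressed: the buttons pressed, in order.
    macros:  button -> list of actions it should trigger; buttons without a
             macro pass through as themselves. Purely an illustrative sketch.
    """
    actions = []
    for button in pressed:
        # A macro-mapped button emits its whole sequence; note the user
        # loses control of the timing within that sequence.
        actions.extend(macros.get(button, [button]))
    return actions
```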
The remapping may also consider the use of alternative input devices or input methods. For instance, if a user is deemed to have particular fatigue in one hand while using a two-handed controller then the remapping may include mapping inputs to a single-handed controller that the user is to operate instead. Similarly, if the user's neck is determined to have become fatigued then head motion inputs can be remapped to a controller (this may be a controller already in use, which may require a remapping of that controller's inputs, or the user may be instructed to begin using a controller). In a similar manner, the remapping may include a one-to-many or many-to-one input device mapping; that is, a user may be instructed to use a single input device in place of multiple input devices or multiple input devices in place of a single input device. This may be particularly useful when the input devices have significantly different form factors and/or input methods, as the user may experience relief from injury or fatigue even without resting the body part in question.
Such changes to the mapping may be maintained for any suitable duration. In some cases, the mapping may be maintained until a further deterioration in the user's physical state is determined (such as fatigue using the new mapping). Alternatively, or in addition, user confirmation of a further change to the mapping (or reversion to the original mapping) may be required. It is also considered that the mapping may be maintained until the end of the gameplay session, or interaction session with the input device/processing device combination. In some cases, the remapping may be maintained at the start of the next gameplay session or the like, so as to provide greater continuity of inputs for a user by allowing them to use the same mapping as that which was most recently used.
Figure 6 schematically illustrates a system for modifying an input mapping for one or more peripherals. The system comprises an input receiving unit 600, a processing unit 610, a user state determining unit 620, and a remapping unit 630. These units may be embodied in any suitable processing device, such as a games console, general purpose computer, or a server (such as a cloud gaming arrangement). In some embodiments, the functionality may be provided by a number of devices operating in conjunction with one another; this may include any number of processing devices, servers, and/or peripherals (for instance, in the case that the remapping is handled by a peripheral).
The input receiving unit 600 is configured to receive one or more inputs from one or more peripherals. The peripherals may include any suitable controllers, or other devices for providing inputs such as motion-tracked devices (including HMDs).
The processing unit 610 is configured to perform one or more functions in response to the received inputs in accordance with an input mapping defined for respective peripherals.
The user state determining unit 620 is configured to determine a physical state of the user. This determination may be a prediction or a detection of the user's physical state at a particular time; while detections may be more accurate, in some cases predictions may be advantageous as this can allow remapping to be performed before the user's physical state deteriorates too far. The user state determination unit 620 may be configured to determine a physical state of the user in dependence upon one or more of characteristics of inputs provided by the user, user response times, biometric information of the user, user performance, and/or user focus. Examples of these are discussed above with reference to step 510 of Figure 5.
The determination of the physical state may comprise a determination of user difficulty in operating a peripheral; in some embodiments a more specific determination is made, such that the determination of the physical state comprises a determination of injury and/or fatigue associated with one or more parts of the user's body. The determination of injury and/or fatigue is considered to be an example of a detection of the physical state of the user that can be indicative of user difficulty in operating an input device. Rather than being limited to a determination of the user's physical state at a specific time, it should be considered that numerous determinations may be performed, or the determination may use data gathered over a period of time, so as to enable changes in the user's physical state to be identified. This change can be the basis for a determination of user difficulty in some cases, rather than the physical state itself; for instance, if a user is determined to be tiring then this may be sufficient, rather than waiting for a determination that the user has reached a threshold tiredness.
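A minimal sketch of detecting such a change over a period of time, here from user response times alone, is given below; the window size and slowdown threshold are illustrative assumptions rather than values taken from the disclosure.

```python
# Detect a trend towards fatigue from response times gathered over a period,
# rather than from a single snapshot of the user's physical state.
def is_user_tiring(response_times_ms, window=5, slowdown_factor=1.3):
    """Return True if recent responses are markedly slower than earlier ones."""
    if len(response_times_ms) < 2 * window:
        return False  # not enough data to identify a change
    baseline = sum(response_times_ms[:window]) / window   # early-session average
    recent = sum(response_times_ms[-window:]) / window    # most recent average
    return recent > baseline * slowdown_factor
```

A comparable trend test could equally be applied to the other signals mentioned above, such as input characteristics or biometric information.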
The remapping unit 630 is configured to modify the input mapping for one or more of the peripherals in response to a change in the physical state of the user. This remapping may be implemented in a number of different ways, including modifying how inputs are provided by a single device, changing the number of devices being used as inputs, and/or changing the devices being used for inputs. The remapping unit 630 may be configured to modify the input mapping by switching between two or more predefined input mappings, or it may be configured to reassign functions freely by generating a new input mapping as required. The remapping unit 630 may in some cases be configured to modify the input mapping in dependence upon the detected physical state of the user, such that particular aspects of a user's difficulty can be addressed.
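The option of switching between predefined input mappings keyed by the detected physical state might be sketched as follows; the mapping names and button assignments are illustrative assumptions.

```python
# Predefined input mappings selected according to the detected physical state.
PREDEFINED_MAPPINGS = {
    "normal":        {"R2": "accelerate", "L2": "brake"},
    "right_fatigue": {"L2": "accelerate", "R2": "brake"},  # shift load leftwards
}

def select_mapping(physical_state):
    """Switch to the predefined mapping for this state, defaulting to normal."""
    return PREDEFINED_MAPPINGS.get(physical_state, PREDEFINED_MAPPINGS["normal"])
```

The alternative described above, generating a new mapping as required, would replace the lookup with a function that builds a mapping from the specific aspects of the user's difficulty.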
In some embodiments, the remapping unit 630 is configured to modify the input mapping from a first peripheral to a second peripheral. This can be used to cause a user to change which peripheral they are using to provide inputs, as the different form factor, input elements, and/or functionality may ease the difficulty of operation for a user in response to a detection of difficulty.
In some embodiments, the remapping unit 630 is configured to modify the input mapping such that functionality associated with a first of the user's hands is instead associated with the second of the user's hands. In such cases, it is also considered that the remapping unit 630 may be further configured to modify the input mapping such that functionality associated with the second of the user's hands is instead associated with the first of the user's hands. This can include swapping the mappings for peripherals held in the user's left and right hands (or, if a single peripheral is used, changing the mapping to be more compatible with the user's other hand), or modifying the mapping for a single controller such that one or more functions mapped to the left/right side of the controller are remapped to the other of the left/right side.
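Mirroring a single controller's mapping between its left and right sides might be sketched as below; the button pairs assume a typical two-sided game controller and are not taken from the disclosure.

```python
# Pairs of input elements on opposite sides of a typical two-sided controller.
MIRROR_PAIRS = {
    "L1": "R1", "R1": "L1",
    "L2": "R2", "R2": "L2",
    "left_stick": "right_stick", "right_stick": "left_stick",
}

def swap_hands(mapping):
    """Return a mapping with left/right input elements exchanged.

    Elements without a mirrored counterpart keep their original assignment.
    """
    return {MIRROR_PAIRS.get(button, button): function
            for button, function in mapping.items()}
```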
In some embodiments the remapping unit 630 is configured to modify the input mapping from a mapping for a plurality of peripherals to a mapping for a single peripheral, or to modify the input mapping from a mapping for a single peripheral to a mapping for a plurality of peripherals. This can encourage a user to rest a hand by removing the need for a second controller, for example, or to encourage a user to divide a physical burden of operating a peripheral between multiple peripherals by assigning a portion of the functionality to each of a plurality of peripherals.
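Consolidating the mappings of a plurality of peripherals onto a single peripheral, so that the user may rest a hand, might be sketched as follows; the peripheral and input identifiers are illustrative assumptions.

```python
# Merge a second peripheral's functions onto the free inputs of a first,
# so that the user can set the second peripheral down.
def consolidate(primary, secondary, free_inputs):
    """Return the primary mapping extended with the secondary's functions."""
    merged = dict(primary)
    spare = list(free_inputs)  # unassigned inputs on the primary peripheral
    for function in secondary.values():
        if not spare:
            raise ValueError("not enough free inputs on the primary peripheral")
        merged[spare.pop(0)] = function
    return merged
```

The converse case, dividing the burden of one peripheral between several, would partition a single mapping across a plurality of peripherals in the same manner.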
In some embodiments, the remapping unit 630 is configured to modify the input mapping such that multiple functions are reassigned from multiple respective inputs to a single input for performing each of the multiple functions in combination or in succession. An example of this is the use of macros assigned to a particular input element. This may be advantageous in that it can reduce the operational burden of a peripheral by requiring fewer button presses, for example.
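The macro example above, in which multiple functions are reassigned to a single input and performed in succession, might be sketched as below; the function names are illustrative.

```python
# Combine several functions into a single input handler (a macro),
# reducing the number of presses the peripheral requires of the user.
def make_macro(*functions):
    """Return one handler that performs each function in succession."""
    def macro():
        return [function() for function in functions]
    return macro
```

A variant performing the functions in combination rather than in succession would dispatch them together instead of sequentially.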
In some embodiments, the remapping unit 630 may be configured to modify the outputs of one or more peripherals, including adjusting an audio output, a haptic feedback output, and/or a light output. In some cases this can be realised simply by reducing the impact of these outputs (such as by decreasing brightness/intensity), while in others there can be a remapping of the feedback. For example, if a user's physical state indicates that haptic feedback is not appropriate then there may be a remapping of haptic feedback to audio or visual outputs.
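Such output remapping might be sketched as follows; the channel names and intensity scaling are illustrative assumptions.

```python
# Reroute or attenuate a feedback event according to the user's state.
def remap_feedback(event, haptics_allowed, intensity_scale=1.0):
    """Return the event routed to a channel suited to the user's state."""
    channel, intensity = event
    if channel == "haptic" and not haptics_allowed:
        channel = "audio"  # reroute a rumble cue to an audio cue instead
    return (channel, intensity * intensity_scale)
```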
The arrangement of Figure 6 is an example of a processor (for example, a GPU and/or CPU located in a games console or any other computing device) that is operable to modify an input mapping for one or more peripherals, and in particular is operable to: receive one or more inputs from one or more peripherals; perform one or more functions in response to the received inputs in accordance with an input mapping defined for respective peripherals; determine a physical state of the user; and modify the input mapping for one or more of the peripherals in response to a change in the physical state of the user.
Figure 7 schematically illustrates a method for modifying an input mapping for one or more peripherals. This method may be implemented by the functional units described with reference to Figure 6; as noted above, these functional units may be implemented using any combination of processing devices including games consoles, general purpose computing devices, servers, and peripherals which offer processing capabilities.
A step 700 comprises receiving one or more inputs from one or more peripherals.
A step 710 comprises performing one or more functions in response to the received inputs in accordance with an input mapping defined for respective peripherals.
A step 720 comprises determining a physical state of the user.
A step 730 comprises modifying the input mapping for one or more of the peripherals in response to a change in the physical state of the user.
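The four steps of Figure 7 might be sketched as a single update cycle; the state estimator and remapping policy are placeholder callables, not part of the disclosure.

```python
# One cycle of the method of Figure 7 (steps 700-730).
def mapping_cycle(inputs, mapping, estimate_state, remap, previous_state):
    # Step 700/710: receive inputs and perform the mapped functions
    performed = [mapping[i]() for i in inputs if i in mapping]
    # Step 720: determine the user's physical state
    state = estimate_state(inputs)
    # Step 730: modify the mapping in response to a change in that state
    if state != previous_state:
        mapping = remap(mapping, state)
    return performed, mapping, state
```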
The techniques described above may be implemented in hardware, software or combinations of the two. In the case that a software-controlled data processing apparatus is employed to implement one or more features of the embodiments, it will be appreciated that such software, and a storage or transmission medium such as a non-transitory machine-readable storage medium by which such software is provided, are also considered as embodiments of the disclosure.
Thus, the foregoing discussion discloses and describes merely exemplary embodiments of the present invention. As will be understood by those skilled in the art, the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Accordingly, the disclosure of the present invention is intended to be illustrative, but not limiting of the scope of the invention, as well as other claims. The disclosure, including any readily discernible variants of the teachings herein, defines, in part, the scope of the foregoing claim terminology such that no inventive subject matter is dedicated to the public.

Claims (15)

  1. A system for modifying an input mapping for one or more peripherals, the system comprising: an input receiving unit configured to receive one or more inputs from one or more peripherals; a processing unit configured to perform one or more functions in response to the received inputs in accordance with an input mapping defined for respective peripherals; a user state determination unit configured to determine a physical state of the user; and a remapping unit configured to modify the input mapping for one or more of the peripherals in response to a change in the physical state of the user.
  2. A system according to claim 1, wherein the determination of the physical state comprises a determination of user difficulty in operating a peripheral.
  3. A system according to any preceding claim, wherein the determination of the physical state comprises a determination of injury and/or fatigue associated with one or more parts of the user's body.
  4. A system according to any preceding claim, wherein the remapping unit is configured to modify the input mapping from a first peripheral to a second peripheral.
  5. A system according to any preceding claim, wherein the remapping unit is configured to modify the input mapping such that functionality associated with a first of the user's hands is instead associated with the second of the user's hands.
  6. A system according to claim 5, wherein the remapping unit is further configured to modify the input mapping such that functionality associated with the second of the user's hands is instead associated with the first of the user's hands.
  7. A system according to any preceding claim, wherein the remapping unit is configured to modify the input mapping from a mapping for a plurality of peripherals to a mapping for a single peripheral, or to modify the input mapping from a mapping for a single peripheral to a mapping for a plurality of peripherals.
  8. A system according to any preceding claim, wherein the remapping unit is configured to modify the input mapping by switching between two or more predefined input mappings.
  9. A system according to any preceding claim, wherein the remapping unit is configured to modify the input mapping such that multiple functions are reassigned from multiple respective inputs to a single input for performing each of the multiple functions in combination or in succession.
  10. A system according to any preceding claim, wherein the remapping unit is configured to modify the input mapping in dependence upon the detected physical state of the user.
  11. A system according to any preceding claim, wherein the remapping unit is configured to modify the outputs of one or more peripherals, including adjusting an audio output, a haptic feedback output, and/or a light output.
  12. A system according to any preceding claim, wherein the user state determination unit is configured to determine a physical state of the user in dependence upon one or more of characteristics of inputs provided by the user, user response times, biometric information of the user, user performance, and/or user focus.
  13. A method for modifying an input mapping for one or more peripherals, the method comprising: receiving one or more inputs from one or more peripherals; performing one or more functions in response to the received inputs in accordance with an input mapping defined for respective peripherals; determining a physical state of the user; and modifying the input mapping for one or more of the peripherals in response to a change in the physical state of the user.
  14. Computer software which, when executed by a computer, causes the computer to carry out the method of claim 13.
  15. A non-transitory machine-readable storage medium which stores computer software according to claim 14.
GB2208459.4A 2022-06-09 2022-06-09 Input mapping modification system and method Pending GB2619543A (en)

Publications (2)

Publication Number Publication Date
GB202208459D0 GB202208459D0 (en) 2022-07-27
GB2619543A true GB2619543A (en) 2023-12-13


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110136568A1 (en) * 2009-12-09 2011-06-09 Sony Computer Entertainment America Inc. Portable Game Controller Settings
US20130288777A1 (en) * 2012-04-25 2013-10-31 Electronic Entertainment Design And Research Interactive gaming analysis systems and methods
US20160209928A1 (en) * 2015-01-16 2016-07-21 Samsung Electronics Co., Ltd. Virtual input device and method for receiving user input using the same
US20210405783A9 (en) * 2015-07-27 2021-12-30 Jordan A. Berger Universal keyboard

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110136568A1 (en) * 2009-12-09 2011-06-09 Sony Computer Entertainment America Inc. Portable Game Controller Settings
US20130288777A1 (en) * 2012-04-25 2013-10-31 Electronic Entertainment Design And Research Interactive gaming analysis systems and methods
US20160209928A1 (en) * 2015-01-16 2016-07-21 Samsung Electronics Co., Ltd. Virtual input device and method for receiving user input using the same
US20210405783A9 (en) * 2015-07-27 2021-12-30 Jordan A. Berger Universal keyboard

