GB2614712A - Security method for extended reality applications - Google Patents

Security method for extended reality applications

Info

Publication number
GB2614712A
Authority
GB
United Kingdom
Prior art keywords
user
extended reality
user interface
interface elements
reality environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2200331.3A
Inventor
Roscoe Jonathan
Smith-Creasey Max
Martins Andrade Tiago
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
British Telecommunications PLC
Original Assignee
British Telecommunications PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by British Telecommunications PLC filed Critical British Telecommunications PLC
Priority to GB2200331.3A priority Critical patent/GB2614712A/en
Publication of GB2614712A publication Critical patent/GB2614712A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/52Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems during program execution, e.g. stack integrity ; Preventing unwanted data erasure; Buffer overflow
    • G06F21/54Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems during program execution, e.g. stack integrity ; Preventing unwanted data erasure; Buffer overflow by adding security routines or objects to programs
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/44Program or device authentication
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/70Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
    • G06F21/82Protecting input, output or interconnection devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/70Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
    • G06F21/82Protecting input, output or interconnection devices
    • G06F21/83Protecting input, output or interconnection devices input devices, e.g. keyboards, mice or controllers thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/70Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
    • G06F21/71Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer to assure secure computing or processing of information
    • G06F21/75Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer to assure secure computing or processing of information by inhibiting the analysis of circuitry or operation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2109Game systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Mathematical Physics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A computer implemented security method for applications executing in an extended reality environment is described. The method includes receiving sensor data indicative of a user interacting S2 with an application in the extended reality environment S3, and comparing the sensor data, or a user profile generated from the sensor data, with a usage profile for the application S4. If a degree of similarity between the sensor data or the user profile and the usage profile is greater than a threshold S5, the extended reality environment is modified S6. The modifications include modifying the user interface so that the sequence of interactions (e.g. gestures) differs between accesses, thereby mitigating a side-channel attack. The modification may be applied only when the security risk is high, and may include changing the presentation of the elements of the user interface, such as changing the time of display, the position, the size and/or the layout of the elements (figs 3A, 3B).

Description

Security Method for Extended Reality Applications

The present invention relates to a security method for extended reality applications. In particular, embodiments of the present invention relate to a security method which seeks to mitigate the risk of side-channel attacks in an extended reality (XR) environment.
Applications which execute in an extended reality environment, such as augmented reality, virtual reality and mixed reality, typically require a user to utilise gestures (input using handheld controller(s)) to interact with the environment and engage with the application. In particular, extended reality environments involve the representation of sophisticated human gestures in a virtual environment for the purpose of interacting with, and controlling, an application.
Where such interactions take place in respect of a secure or sensitive application, sequences of gestures themselves may act as a fingerprint for a user interacting with such an application, in a similar way to a record of keystrokes in a conventional mouse and keyboard application.
A security threat to such systems may therefore arise in the capture and replay of sequences of gestures to gain access to an application and/or to utilise functions of such an application that may otherwise be sensitive or restricted. For example, the use of accelerometer data from ancillary devices such as smartwatches and the like can provide rich and detailed insights into the gestures of a user interacting with an XR system. In particular, these insights may be used as part of a side-channel attack, that is, an attack based on information gained from the implementation of a computer system, and in this case a user's interactions with it, rather than weaknesses in the implemented application itself.
According to a first aspect of the present invention, there is provided a computer implemented security method for an application executing in an extended reality environment, the method comprising: receiving sensor data indicative of a user interacting with the application in the extended reality environment; comparing the sensor data, or a user profile generated from the sensor data, with a usage profile for the application; and if a degree of similarity between the sensor data or the user profile and the usage profile is greater than a threshold, modifying the extended reality environment.
By introducing adjustments to the application that cause a sequence of interactions (for example gestures) by a user using the application to differ between accesses, a side-channel attack can be mitigated. By carrying out such adjustments specifically when a user's interactions (as measured by the sensor data) are similar to those expected based on a usage profile for the application, a variety of benefits ensue, such as reducing processing (since the adjustments are only carried out when the security risk is high, rather than all the time), and reducing disruption for the user (who may notice the adjustments).
Preferably, the extended reality environment comprises one or more user interface elements generated by the application for eliciting a sequence of user interactions by the user, and wherein the modifying of the extended reality environment comprises modifying the presentation of the one or more user interface elements. Examples of such user interface elements may include a virtual keyboard or keypad, selectable/manipulable virtual objects, and text or graphical pop-ups.
The user interface elements may be modified in a number of different ways. These different types of modification may be used individually, or in any combination. In one example, modifying the user interface elements comprises adjusting a time of display of the one or more user interface elements within the extended reality environment. In another example, modifying of the user interface elements comprises adjusting a position of display of the one or more user interface elements within the extended reality environment. In another example, modifying of the user interface elements comprises adjusting an appearance, size and/or layout of the one or more user interface elements within the extended reality environment.
While the above ways of modifying user interface elements can be applied to either a single element, or multiple elements, certain ways of modifying user interface elements relate only to multiple user interface elements, and relate generally to modifying how plural user interface elements relate to each other temporally and/or spatially within the extended reality environment. In one such example, modifying the user interface elements comprises adjusting a temporal or spatial order in which the user interface elements are presented within the extended reality environment. In another such example, modifying the user interface elements comprises adjusting a time delay between the display of successive ones of the plurality of user interface elements within the extended reality environment.
Preferably, the sensor data is accelerometer data indicative of gestures being performed by the user. Such accelerometer data may be collected by one or more controllers held by the user.
The method may further comprise identifying user interaction events from the sensor data, and generating the user profile from the identified user interaction events, wherein the usage profile comprises expected user interaction events for the application, and wherein a degree of similarity between the user profile and the usage profile is determined by detecting a correlation with respect to time between user interaction events identified from the sensor data with expected user interaction events in the usage profile.
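By way of illustration only, the following Python sketch shows one way such an event-based comparison could be expressed; the InteractionEvent structure, the event_similarity function and the 0.5 second matching tolerance are assumptions made for the purposes of the example rather than details prescribed by the present technique.

```python
from dataclasses import dataclass

@dataclass
class InteractionEvent:
    kind: str         # e.g. "select", "grab", "keypress"
    timestamp: float  # seconds since the application session started

def event_similarity(identified, expected, tolerance_s=0.5):
    """Fraction of expected interaction events that are matched, in kind and
    approximate time, by an event identified from the live sensor data."""
    if not expected:
        return 0.0
    matched = sum(
        1 for exp in expected
        if any(ev.kind == exp.kind and abs(ev.timestamp - exp.timestamp) <= tolerance_s
               for ev in identified)
    )
    return matched / len(expected)
```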
Alternatively, rather than utilising a comparison of features (for example interactions) derived by classification from the sensor data, a similarity between stored and live sensor data may be compared (potentially after signal processing such as smoothing functions). In this case, determining the degree of similarity between the sensor data and the usage profile may comprise comparing a window of (optionally processed) sensor data with respect to time (captured during live use of the application) with a window of stored sensor data of the usage profile. The usage profile may be generated for the user based on previous interactions of the user with the extended reality environment. Alternatively, the usage profile may be the same for all users, and fixed for the application.
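A minimal sketch of such a window-on-window comparison is given below, assuming the sensor data has been reduced to a one-dimensional acceleration-magnitude series; the moving-average smoothing, the use of cosine similarity and the function names are illustrative assumptions rather than required details.

```python
import numpy as np

def smooth(signal, kernel=5):
    """Moving-average smoothing of a 1-D acceleration-magnitude series."""
    return np.convolve(np.asarray(signal, dtype=float),
                       np.ones(kernel) / kernel, mode="same")

def window_similarity(live_window, stored_window):
    """Cosine similarity between a smoothed live window of sensor data and a
    stored window of the usage profile (equal length, same time span)."""
    a, b = smooth(live_window), smooth(stored_window)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0
```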
In some implementations the modifications may simply be to cycle through a set of two or more predetermined variations of the user interface element(s) (and/or its manner, position or timing of display). However, in more robust implementations the modification of the extended reality environment may include a random or pseudo random component.
According to a second aspect of the present invention, there is provided a computer system including a processor and memory storing computer program code for performing the steps of the method set out above.
According to a third aspect of the present invention, there is provided a computer program element comprising computer program code to, when loaded into a computer system and executed thereon, cause the computer to perform the steps of the method set out above.
Embodiments of the present invention will now be described, by way of example only, with reference to the accompanying drawings, in which: Figure 1 is a schematic block diagram of a computer system suitable for the operation of embodiments of the present invention; Figure 2 is a schematic block diagram of the high-level logical components and processes of a security system of an application executing in an extended reality environment; Figures 3A and 3B are schematic diagrams showing one example adjustment of user interface elements; Figures 4A and 4B are schematic diagrams showing another example adjustment of user interface elements; and Figure 5 is a flowchart of a security method for an application executing in an extended reality environment according to embodiments of the present invention.
Figure 1 is a block diagram of a computer system suitable for the operation of embodiments of the present invention. A central processor unit (CPU) 102 is communicatively connected to a storage 104 and an input/output (I/O) interface 106 via a data bus 108. The storage 104 can be any read/write storage device such as a random-access memory (RAM) or a non-volatile storage device. An example of a non-volatile storage device includes a disk or tape storage device. The I/O interface 106 is an interface to devices for the input or output of data, or for both input and output of data. Examples of I/O devices connectable to I/O interface 106 include a keyboard, a mouse, a display (such as a monitor) and a network connection. For the present technique the I/O interface 106 is shown to be operatively connected to an extended reality (XR) headset 110, and to one or more XR controller(s) 112.
The XR headset 110 includes a display for positioning in front of a user's eyes, for presenting the user with a visual representation of the extended reality environment, and sensors for detecting motion of the user's head. The headset 110 may also include a speaker for providing audio output to the user (such audio itself potentially constituting a user interface element). As the user moves their head, the movement is sensed and the visual representation of the extended reality environment adjusted to reflect a change in viewing direction within the extended reality environment. The XR controllers 112 are generally hand-held, and typically two are provided, for the user to hold one in each hand. The XR controllers 112 may be provided with one or more input devices such as buttons, switches, joypads or the like, to permit interaction by the user with the extended reality environment. Further, the position of the XR controllers 112, generally with respect to the XR headset 110, is tracked, and used to define the position of the user's left and right hands within the extended reality environment. In this way, the user is able to move their head (and thus the XR headset 110) in the real world, which causes an associated change in the field of view of the extended reality environment displayed on the XR headset 110. The XR controllers 112 may also be able to provide haptic feedback, which again may constitute a user interface element.
The user is able to move their hands (and thus the controllers 112) in the real world, which causes an associated change in a position of the user's virtual hands within the extended reality environment, which may be represented in the display on the XR headset 110 if the current field of view thereof includes the virtual hand position(s). As a result, the user is able to interact with virtual objects within the extended reality environment by moving their virtual hands close to the virtual objects within the extended reality environment (by moving the XR controller(s) 112), and optionally by manipulating one or more of the input devices on the controller(s), for example to initiate an "interact" function such as selecting or grabbing the virtual object. It will be appreciated that, to interact with a particular user interface element, such as a virtual keyboard or keypad within the extended reality environment, in a predictable way, for example to enter a particular password or passcode, a sequence of movements of the controllers to sequentially align with and select specific keys of the virtual keyboard or keypad will also be predictable. If an external device such as a smartwatch is able to track the movement of a user's hand(s), then these movements may be usable to identify the password or passcode entered by the user. Similar principles may apply to other interactions with the extended reality environment. The use of this external (accelerometer) data collected while a user is interacting with the application in the extended reality environment is a form of side-channel attack.
Extended reality (XR) is an umbrella term that encompasses the spectrum of technologies combining real and virtual environments, such as virtual reality (VR), augmented reality (AR) and mixed reality (MR). Virtual Reality (VR) makes different cognitive interactions possible in a computer-generated environment which models a 3D virtual space or virtual world. Typically, VR requires a head-mounted display (HMD), such as the headset 110, to visualise the virtual world, which enables navigation in the environment as well as manipulation of objects, for example using the controller(s) 112 described above. Rather than creating a completely simulated environment (as in VR), Augmented Reality (AR) preserves the real environment and its surroundings, but combines these real world elements with virtual elements, allowing the user to interact with virtual 3D objects that are placed in the environment. Mixed Reality (MR) is a combination of both VR and AR which produces new environments and visualisations where physical and digital objects co-exist, so that real or virtual objects can be added to virtual environments and virtual objects can be added to the real world. In all such cases, it will be understood that a user typically navigates an XR environment, at least in part, using gestures, carried out using the controllers 112.
It is common for devices such as smart watches or fitness bands to collect user data (for example heart rate, GPS position, and accelerometer measurements) and send it elsewhere. This opens the possibility for a malicious attacker to compromise those devices and build a user biometric profile with the goal of reconstructing the user's steps in the same application to gain access to sensitive information, such as (but not limited to) the password or passcode entry described above.
Given advancements in user activity detection (for example gait recognition on mobile devices and, more recently, activity detection within VR environments) it is important to ensure that sensitive actions are shielded from attackers for privacy reasons. The fact that activities within extended reality environments carry different movements, and therefore different accelerometer footprints, makes it feasible for an attacker to use such smartwatches to identify a user's current activity. It is even feasible to use this information to determine what they are doing within that particular activity (for example, if the attacker knows the user is within a typing application using a standard QWERTY keyboard, they could model how the movement of the smartwatch could correlate with letters selected by the user and their XR controller 112).
Different VR games or other applications can be expected to produce distinct accelerometer outputs. The differences between the accelerometer outputs may in some cases be sufficient to enable the training of a classifier to identify particular interactions made by the user with the environment within which the application is executing. These identified interactions may then constitute a security breach.
In order to mitigate this risk, the present technique proposes the introduction of functional "chaff" (obfuscating features) into the application, such as randomised adjustments to an arrangement of the virtual world, an order of events, or other features of the XR application, so that any captured sequence of gestures is unsuitable for engaging with the similarly adjusted application in a subsequent session. This represents a form of data obfuscation.
Figure 2 schematically illustrates the proposed VR system 200, being used by a user 1.
The VR system 200 stores a plurality of biometric profiles, in this example including a game profile 210, a VSOC (virtual security operations centre) profile 220 and a virtual chat profile 230. Each of the biometric profiles is a model profile generated from gesture information for a user (either actual or notional) using the specific application in a default state where no adjustment takes place to obfuscate a user's gestural inputs. The model profile may take the form of an expected sequence of interactions/gestures by the user. It will be appreciated that the model profile for each of the applications corresponding to the profiles 210, 220, 230 can be expected to be different due to the different interactions expected of the user for the different applications.
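Purely as an illustration, a stored model profile might be represented by a structure along the following lines; the BiometricProfile fields, the default threshold value and the profile names used here are assumptions made for the example, not features of the system as described.

```python
from dataclasses import dataclass, field

@dataclass
class BiometricProfile:
    """Model profile for one application in its default (unobfuscated) state."""
    application: str
    expected_windows: list = field(default_factory=list)  # stored accelerometer windows
    expected_events: list = field(default_factory=list)   # expected interactions, in order
    similarity_threshold: float = 0.8                      # may vary per application/context

# Hypothetical registry mirroring the profiles 210, 220 and 230 of Figure 2.
PROFILES = {
    "game": BiometricProfile("game"),
    "vsoc": BiometricProfile("vsoc"),
    "virtual_chat": BiometricProfile("virtual_chat"),
}
```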
An activity tracker 240 is provided which continuously tracks the current VR activity of the user in relation to an application they are currently interacting with. The tracked activity may take the form of accelerometer signals from the controllers 112, or specific interactions and/or gestures performed by the user and extracted/classified from the accelerometer signals. That is, the same or similar information to that stored in the biometric profiles 210, 220, 230 is generated by the activity tracker 240. A similarity engine 250 is provided. Whilst the user is engaged in an application, the collected accelerometer data is continuously compared to the biometric accelerometer profiles by the similarity engine, which uses techniques such as Long Short Term Memory (LSTM) and/or Dynamic Time Warping (DTW).
These algorithms are useful because of their ability to compare time series data such as accelerometer data. In this way, the similarity engine continuously compares interactions by the user as tracked by the activity tracker 240 with the model profile 210, 220, 230 corresponding to the same application to identify a degree of similarity.
The degree of similarity (similarity score) is then compared with a threshold, which may optionally be different for each application/profile. The threshold may also be context specific. In particular, in certain contexts such as the entry of passwords or personal information where security is more important, a lower threshold may be used (thus requiring a lower degree of similarity to trigger obfuscation). Where there is considerable similarity between the user's tracked sequence of gestures as output by the activity tracker 240 and those reflected in the model profile 210, 220, 230 (that is, when the threshold is met or exceeded), a dynamic and randomised "chaff" generation function 260 is triggered to adjust subsequent gesture requirements to cause the user's future gesture sequences to deviate from those characteristic of the unadapted application. In this way, a user's gesture profile will not be suitable for unauthorised capture and replay to access and/or use the XR application. Explained differently, the chaff generation function may enforce an action within the VR application that is abnormal and therefore lowers the similarity score to that application. This can be expected to thwart attackers with access to smartwatch accelerometers from identifying the user activity. The chaff function may increase or decrease the speed of an aspect of a certain game or require the user to reach higher or lower in a certain application (within the XR environment). Conversely, if the similarity engine 250 does not find a high similarity score then the similarity comparison is simply continued with no invocation of the chaff generation function 260.
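The sketch below illustrates, under stated assumptions, how a DTW-based similarity check and chaff trigger of this kind could be implemented; the length-normalised DTW distance, the 1/(1 + distance) mapping to a similarity score and the function names are choices made for the example, and the LSTM-based comparison mentioned above is not shown.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic Dynamic Time Warping distance between two 1-D series, which may
    be of different lengths; normalised by the combined sequence length."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m] / (n + m)

def check_and_trigger_chaff(live_series, profile_series, threshold, chaff_fn):
    """Map the DTW distance to a similarity score and invoke the chaff
    generation function when live behaviour is too close to the profile."""
    similarity = 1.0 / (1.0 + dtw_distance(live_series, profile_series))
    if similarity >= threshold:
        chaff_fn()  # e.g. reposition or reshuffle user interface elements
    return similarity
```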
The use of model profiles 210, 220, 230, activity tracker 240, similarity engine 250 and chaff action generator 260, and in particular the generation of randomised adaptations to effect changes to gestures based on a comparison with a model profile for an application, makes it possible to achieve obfuscation of user gestures in an efficient and unobtrusive manner.
The adjustments made by way of this technique may take many forms. Typically, the extended reality environment comprises one or more user interface elements generated by the application for eliciting a sequence of user interactions by the user. Generally these are visual elements, but audio or haptic cues could be used too (and thus adjusted by the chaff generation function 260). The presentation of the user interface elements may be modified in various ways to correspondingly modify the resulting interactions of the user with the extended reality environment. For example, a time of display of the user interface elements may be adjusted, with the result that the timing of the resultant user gesture or other interaction will be earlier or later than would be expected based on the application profile. A position of display of the user interface elements within the extended reality environment may be adjusted, with the result that the user's movement in interacting with that element would differ from that expected based on the application profile. Similarly, an appearance, size or layout of the user interface elements within the extended reality environment may be adjusted, again with the result that an interaction of the user with those elements would be achieved by way of a movement which differs from that expected based on the application profile.
In the case of multiple user interface elements being used to elicit a gestural response from the user, a temporal or spatial order in which the user interface elements are presented within the extended reality environment may be adjusted, thereby resulting in a timing of a gesture and/or the movements involved in the gesture, differing from that expected based on the application profile. Similarly, a time delay between the display of successive ones of a plurality of user interface elements within the extended reality environment may be adjusted, causing a timing of a resulting gesture differing from that expected based on the application profile.
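As a hedged sketch of such presentation adjustments, the following example randomly perturbs the position, size, display timing and ordering of a set of user interface elements; the UIElement structure and the particular jitter ranges are assumptions chosen for illustration only.

```python
import random
from dataclasses import dataclass

@dataclass
class UIElement:
    name: str
    position: tuple          # (x, y, z) within the extended reality environment
    scale: float             # relative size
    display_delay_s: float   # delay before the element is shown

def apply_presentation_chaff(elements, rng=None):
    """Randomly perturb how user interface elements are presented: jitter their
    position, size and display timing, and shuffle their presentation order."""
    rng = rng or random.Random()
    for el in elements:
        el.position = tuple(c + rng.uniform(-0.2, 0.2) for c in el.position)
        el.scale *= rng.uniform(0.8, 1.2)
        el.display_delay_s += rng.uniform(0.0, 1.5)
    rng.shuffle(elements)  # change the temporal/spatial order of presentation
    return elements
```

In practice the magnitude of any such perturbations would be chosen so that the adjusted presentation still appears natural to the user, as discussed later in this description.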
In Figures 3A and 3B, one specific example is provided in which a visual representation of a keyboard 310, as a user interface element (or a group of user interface elements) is provided within a field of view of an extended reality environment 300. Figure 3A illustrates a position and size of the keyboard 310 within the field of view for the base case represented by the application profile, whereas Figure 3B illustrates a position and size of the keyboard 310 within the field of view for a live case following positional adjustment of the keyboard 310 in response to a similarity threshold being satisfied as discussed above. It can be seen from a comparison of Figures 3A and 3B that the keyboard 310 has been moved upwardly and to the right as a result of the adjustment, and has also been decreased in size (or alternatively moved further away from the user within the extended reality environment). These adjustments result in the movements required of the user (in moving the controllers 112), differing from that expected based on the application profile.
In Figures 4A and 4B, another specific example is provided in which a visual representation of four selectable user interface elements 410, 412, 414, 416 within a field of view of an extended reality environment 400 is shown. The elements are labelled "A", "B", "C" and "D", and may represent different options which may be selected by the user by interacting with these elements. In Figure 4A, the elements 410, 412, 414, 416 are presented, left to right, in order "A", "B", "C", "D", corresponding to the base case represented by the application profile. In Figure 4B, the elements 410, 412, 414, 416 are presented, left to right, in a modified order "D", "C", "B", "A". In order to achieve the same selection of one or a sequence of these user interface elements, the user will require a different movement or series of movements of the controller 112.
In both of the above examples it can be understood that the user's movements will differ from "expected" movements (corresponding to the base usage profile) due to the variations applied to the extended reality environment, and in particular to the user interface elements. An attack based on tracking the user's movements (for example using a wrist-based accelerometer) is therefore less likely to succeed, since the same sequence of movements will not generally achieve the same interactions with the extended reality environment in a subsequent session.
Figure 5 is a flowchart of the method carried out by the system shown in Figures 1 and 2. The process starts with a step S1, at which a default accelerometer profile is built for an application. Separate profiles can be expected to be built for each of a plurality of applications which a user can be expected to interact with within the extended reality environment. A single base profile may be used, independently of specific users, or alternatively a user-dependent profile may be generated, which will take into account the specific nature of the motions used by that user. This may improve accuracy, since different motions may be used by different users to achieve the same interactions with the application, due for example to differences in height, arm length, mobility and movement style of different users. In the case of a single base profile, this may be fixed in advance during development of the application, and based for example on test usage of the application during its development. In the case of user-specific profiles, these may be generated once, for example during a test mode of the application, or may be refined over time by monitoring interactions of the user with the application.
At a step S2, the user engages with the application in a "live" situation and starts interacting with the XR environment. At a step S3, a real-time accelerometer profile of the user is generated, using accelerometer sensor data from the controller(s) 112. At a step S4, a similarity between the default profile generated at the step S1 and the real-time accelerometer profile generated at the step S3 is evaluated. At a step S5, the evaluated similarity is compared with a similarity threshold. If it is determined, at the step S5, that the similarity threshold is exceeded, then at a step S6 the virtual environment/user experience is adjusted, in the manner(s) described above. Following the adjustment, the process returns to the step S2 for further user engagement with the application, which continues to be monitored, forming a loop which continues while the application is being executed and the user continues to interact with the XR environment. If, however, it is determined at the step S5 that the similarity threshold is not exceeded, then no adjustment takes place and the process returns to the step S2 for further user engagement with the application, which continues to be monitored, forming a loop. It will therefore be appreciated that adjustments may be made when the degree of similarity between the live usage of the application and the base profile becomes too great (and the similarity threshold exceeded), but that when the degree of similarity is relatively low, no adjustments take place, with the result that the user experience is in line with user expectations.
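The loop of Figure 5 might be expressed, in outline, as follows; the callable parameters are placeholders for the profile building, tracking, similarity and adjustment functions described above, and the structure is a sketch rather than a definitive implementation.

```python
def security_loop(build_default_profile, track_live_profile, similarity,
                  adjust_environment, threshold, application_running):
    """Outline of the monitoring loop of Figure 5 (steps S1 to S6)."""
    default_profile = build_default_profile()              # S1: default accelerometer profile
    while application_running():                            # S2: user keeps interacting
        live_profile = track_live_profile()                 # S3: real-time accelerometer profile
        score = similarity(default_profile, live_profile)   # S4: evaluate similarity
        if score > threshold:                                # S5: compare with the threshold
            adjust_environment()                             # S6: adjust the XR environment
        # otherwise continue monitoring without adjustment
```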
Accelerometer sensors have been commonly used in related literature to detect user activities. Given the observed differences in accelerometer readings between different activities whilst gaming, it would be feasible to perform activity detection within XR/VR environments. To detect the general activity, a Long Short-Term Memory (LSTM) Recurrent Neural Network (RNN) may be appropriate because it has an awareness of historical events within the inputted signals and could use these to aid classification of sections of input data.
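As an indicative sketch only, an LSTM classifier of the kind referred to above could be assembled as follows; the window length, layer sizes, number of activity classes and the use of the Keras API are assumptions rather than details given in the present description.

```python
import tensorflow as tf

def build_activity_classifier(window_len=128, channels=3, n_activities=5):
    """Small LSTM network mapping a window of tri-axial accelerometer samples
    to one of a fixed set of activity classes."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(window_len, channels)),
        tf.keras.layers.LSTM(64),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(n_activities, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```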
Furthermore, specific actions within activities could be detected through methods such as Dynamic Time Warping (DTW), which would require the input of a specific section of a signal where an action occurred and matching it, via DTW, to other known actions. DTW may be useful in this technique because it can be used to classify time series data where sequences may be of different lengths or contain unique events but at different times in the series.
In "Discerning User Activity in Extended Reality Through Side-Channel Accelerometer Observations" -Tiago, Smith-Creasey, Roscoe (2020), the viability of building biometric profiles for XR applications using accelerometer data is demonstrated.
In WO2019199769A1, a technique of chaff generation in computer networks is described, in which it is highlighted that chaff is often identifiable by outsiders due to its random nature.
The present technique is also able to ensure that chaff is not easily identifiable (or make it harder to identify), by way of the intelligent chaff action generation module 260.
Both the generation of application profiles and the live tracking utilise accelerometer data to profile users engaged in an XR activity. By comparing the user's live profile to a known activity in a database (the stored profiles 210, 220, 230), it is possible to determine the degree of similarity between them. When the similarity reaches a certain threshold, a random modification is applied to the application environment/game. The modification should preferably be experienced by the user as natural within the game/application space. This has the purpose of reducing the similarity of the accelerometer profile. Therefore, it would reduce the similarity for an attacker too, obfuscating the real meaning of the behaviour. In this way, side-channel attack mitigation can be achieved.
From the above, it will be appreciated that the present technique provides a security method for an application executing in a VR/AR or other XR environment. The method comprises receiving a predetermined accelerometer profile for the particular application (reflecting typical use of the application), generating an accelerometer profile for a user interacting with the application in use, performing a comparison of the profiles (for example using ML LSTM) to determine a degree of similarity therebetween, and responsive to the comparison, applying a modification to the application environment/game for the purpose of distinguishing the accelerometer profile for the user from the accelerometer profile for the application.
The present technique is designed to capture accelerometer data from VR controllers and compare the accelerometer data to profiles of VR applications. The profiles for each application may have been created for a user as they use the application (up to a point of sufficient data collection) or provided by the application vendor. Each profile has a distinct set of information that illustrates how a user will (or is expected to) interact with that application.
This technique may be provided as an Application Programming Interface (API) within developed VR applications, such that application vendors can create their own chaff requirements and actions that are not common to the application.
Random action generation may be carried out locally by the application without the need of any server connection. As described above, the random action can be a new random position to specific objects in the 3D environment or it can be a modification of the 3D objects themselves (for example shuffling the keys of a keyboard) or the way the object behaves to the user actions (while still making sense to that user). As an example, while a user is typing a password on a virtual keyboard to gain access to sensitive information, the biometric profiles are being compared with live accelerometer data in real-time. If a high degree of similarity is reached, then the keyboard is randomly moved to a new position or the keys of the virtual keyboard shuffled (while still making sense for the user). With this technique being applied whenever the degree of similarity is high, the compromised devices will not be able to categorise and collect the password entered through the virtual keyboard.
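The virtual keyboard example could be sketched as follows; the dictionary-based keyboard representation, the 50/50 choice between repositioning and key shuffling, and the coordinate ranges are assumptions made for illustration.

```python
import random

def chaff_virtual_keyboard(keyboard, rng=None):
    """When the live similarity score is high, either move the virtual keyboard
    to a new position or shuffle its keys, so that the hand movements needed to
    type the same password differ from those in any captured profile."""
    rng = rng or random.Random()
    if rng.random() < 0.5:
        # Reposition the whole keyboard within the user's reachable volume.
        keyboard["position"] = tuple(rng.uniform(-0.3, 0.3) for _ in range(3))
    else:
        # Shuffle the key layout (the labels shown to the user still make sense).
        keys = list(keyboard["keys"])
        rng.shuffle(keys)
        keyboard["keys"] = keys
    return keyboard
```

Either action changes the hand movements needed to enter the same password, so a previously captured accelerometer trace no longer corresponds to the entry.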
Insofar as embodiments of the invention described are implementable, at least in part, using a software-controlled programmable processing device, such as a microprocessor, digital signal processor or other processing device, data processing apparatus or system, it will be appreciated that a computer program for configuring a programmable device, apparatus or system to implement the foregoing described methods is envisaged as an aspect of the present invention. The computer program may be embodied as source code or undergo compilation for implementation on a processing device, apparatus or system or may be embodied as object code, for example.
Suitably, the computer program is stored on a carrier medium in machine or device readable form, for example in solid-state memory, magnetic memory such as disk or tape, optically or magneto-optically readable memory such as compact disk or digital versatile disk etc., and the processing device utilises the program or a part thereof to configure it for operation. The computer program may be supplied from a remote source embodied in a communications medium such as an electronic signal, radio frequency carrier wave or optical carrier wave. Such carrier media are also envisaged as aspects of the present invention.
It will be understood by those skilled in the art that, although the present invention has been described in relation to the above described example embodiments, the invention is not limited thereto and that there are many possible variations and modifications which fall within the scope of the invention.
The scope of the present invention includes any novel features or combination of features disclosed herein. The applicant hereby gives notice that new claims may be formulated to such features or combination of features during prosecution of this application or of any such further applications derived therefrom. In particular, with reference to the appended claims, features from dependent claims may be combined with those of the independent claims and features from respective independent claims may be combined in any appropriate manner and not merely in the specific combinations enumerated in the claims.

Claims (14)

  1. A computer implemented security method for an application executing in an extended reality environment, the method comprising: receiving sensor data indicative of a user interacting with the application in the extended reality environment; comparing the sensor data, or a user profile generated from the sensor data, with a usage profile for the application; and if a degree of similarity between the sensor data or the user profile and the usage profile is greater than a threshold, modifying the extended reality environment.
  2. The method of claim 1, wherein the extended reality environment comprises one or more user interface elements generated by the application for eliciting a sequence of user interactions by the user, and wherein the modifying of the extended reality environment comprises modifying the presentation of the one or more user interface elements.
  3. The method of claim 2, wherein said modifying of the user interface elements comprises adjusting a time of display of the one or more user interface elements within the extended reality environment.
  4. The method of claim 2 or claim 3, wherein said modifying of the user interface elements comprises adjusting a position of display of the one or more user interface elements within the extended reality environment.
  5. The method of any one of claims 2 to 4, wherein said modifying of the user interface elements comprises adjusting an appearance, size and/or layout of the one or more user interface elements within the extended reality environment.
  6. The method of any one of claims 2 to 5, wherein a plurality of the user interface elements are provided, and said modifying of the user interface elements comprises adjusting a temporal or spatial order in which the user interface elements are presented within the extended reality environment.
  7. The method of any one of claims 2 to 6, wherein a plurality of the user interface elements are provided, and said modifying of the user interface elements comprises adjusting a time delay between the display of successive ones of the plurality of user interface elements within the extended reality environment.
  8. The method of any preceding claim, wherein the sensor data is accelerometer data indicative of gestures being performed by the user.
  9. The method of any preceding claim, comprising identifying user interaction events from the sensor data, and generating the user profile from the identified user interaction events, wherein the usage profile comprises expected user interaction events for the application, and wherein a degree of similarity between the user profile and the usage profile is determined by detecting a correlation with respect to time between user interaction events identified from the sensor data with expected user interaction events in the usage profile.
  10. The method of any one of claims 1 to 8, wherein determining the degree of similarity between the sensor data and the usage profile comprises comparing a window of sensor data with respect to time with a window of stored sensor data of the usage profile.
  11. The method of any preceding claim, wherein the usage profile is generated for the user based on previous interactions of the user with the extended reality environment.
  12. The method of any preceding claim, wherein the modification of the extended reality environment is random or pseudo random.
  13. A computer system including a processor and memory storing computer program code for performing the steps of the method of any preceding claim.
  14. A computer program element comprising computer program code to, when loaded into a computer system and executed thereon, cause the computer to perform the steps of a method as claimed in any of claims 1 to 12.
GB2200331.3A 2022-01-12 2022-01-12 Security method for extended reality applications Pending GB2614712A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2200331.3A GB2614712A (en) 2022-01-12 2022-01-12 Security method for extended reality applications

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2200331.3A GB2614712A (en) 2022-01-12 2022-01-12 Security method for extended reality applications

Publications (1)

Publication Number Publication Date
GB2614712A true GB2614712A (en) 2023-07-19

Family

ID=86895739

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2200331.3A Pending GB2614712A (en) 2022-01-12 2022-01-12 Security method for extended reality applications

Country Status (1)

Country Link
GB (1) GB2614712A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140125574A1 (en) * 2012-11-05 2014-05-08 Mike Scavezze User authentication on display device
US20170324726A1 (en) * 2015-08-28 2017-11-09 Thomson Licensing Digital authentication using augmented reality
US20180107839A1 (en) * 2016-10-14 2018-04-19 Google Inc. Information privacy in virtual reality
US20180158053A1 (en) * 2016-12-02 2018-06-07 Bank Of America Corporation Augmented Reality Dynamic Authentication
WO2021179968A1 (en) * 2020-03-07 2021-09-16 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method and system for authenticating a user for providing access to a content on a wearable computing device

Similar Documents

Publication Publication Date Title
US9747436B2 (en) Method, system, and device of differentiating among users based on responses to interferences
US9531701B2 (en) Method, device, and system of differentiating among users based on responses to interferences
US9526006B2 (en) System, method, and device of detecting identity of a user of an electronic device
US10747305B2 (en) Method, system, and device of authenticating identity of a user of an electronic device
US9071969B2 (en) System, device, and method of detecting identity of a user of an electronic device
US10565569B2 (en) Methods and systems related to multi-factor, multidimensional, mathematical, hidden and motion security pins
Serwadda et al. When kids' toys breach mobile phone security
US7337466B2 (en) Information hiding through time synchronization
US9483115B2 (en) Triggering control of audio for walk-around characters
CN103761460B (en) Method for authenticating users of display equipment
EP3557384A1 (en) Device and method for providing dynamic haptic playback for an augmented or virtual reality environments
WO2018007821A1 (en) Obscuring data when gathering behavioral data
WO2013147084A1 (en) Information input apparatus, screen display apparatus, and information input program
CN105144028B (en) Haptic effect signal exchanges unlock
Sivasamy et al. VRCAuth: continuous authentication of users in virtual reality environment using head-movement
EP3011483B1 (en) System, device, and method of detecting identity of a user of a mobile electronic device
KR102312900B1 (en) User authentication on display device
Andrade et al. Discerning user activity in extended reality through side-channel accelerometer observations
GB2614712A (en) Security method for extended reality applications
CN114547581A (en) Method and apparatus for providing a captcha system
Sluganovic Security of mixed reality systems: authenticating users, devices, and data
KR102297532B1 (en) Method and system for providing content based on virtual reality sound
Grenga Android based behavioral biometric authentication via multi-modal fusion
Wang et al. Identity authentication based on dynamic touch behavior on smartphone
WO2019240766A1 (en) Gesture based accesses