GB2605570A - Methods and systems for recording a user experience - Google Patents

Methods and systems for recording a user experience

Info

Publication number
GB2605570A
GB2605570A GB2104444.1A GB202104444A GB2605570A GB 2605570 A GB2605570 A GB 2605570A GB 202104444 A GB202104444 A GB 202104444A GB 2605570 A GB2605570 A GB 2605570A
Authority
GB
United Kingdom
Prior art keywords
capture
user
user experience
recording
deterministic data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB2104444.1A
Other versions
GB202104444D0 (en)
Inventor
Howie Scott
Gilardi Marco
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of the West of Scotland
Original Assignee
University of the West of Scotland
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of the West of Scotland filed Critical University of the West of Scotland
Priority to GB2104444.1A priority Critical patent/GB2605570A/en
Publication of GB202104444D0 publication Critical patent/GB202104444D0/en
Publication of GB2605570A publication Critical patent/GB2605570A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45Controlling the progress of the video game
    • A63F13/49Saving the game status; Pausing or ending the game
    • A63F13/497Partially or entirely replaying previous game actions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Human Computer Interaction (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

A method for recording a user experience generated by a computer-implemented application, wherein the application is configured to receive one or more user inputs and to process the one or more user inputs to generate the user experience; the method comprising: generating a first capture of the user experience by recording the one or more user inputs; and recording the user experience by generating a second capture of the user experience, whereby the application processes the user inputs from the first capture to generate a set of deterministic data.

Description

METHODS AND SYSTEMS FOR RECORDING A USER EXPERIENCE
The present disclosure relates to methods and systems for recording a user experience. In particular, the present disclosure relates to computer implemented methods and systems for recording a simulated user experience.
BACKGROUND
Simulated user experiences such as virtual reality (VR), augmented reality (AR) or mixed reality (MR) training simulations allow real-world or fictitious environments to be replicated for various purposes such as training, teaching, testing or entertainment. For example, virtual reality simulations are becoming increasingly popular for training applications, since they allow real-world environments to be visually replicated such that trainees can operate and train in an immersive life-like environment without any danger or safety concerns, on-demand and anywhere in the world.
Using virtual reality technology, trainees can conduct training simulations with a variety of devices, such as computers, laptops, mobiles and tablets, or with VR head-mounted displays. Unlike classroom-based training, virtual solutions require no scheduling or setup time, and are indefinitely re-usable. VR allows trainees to replicate physical movements and interactions with their virtual surroundings as in a real-life scenario, creating an engaging and immersive experience that can be conducted by the trainee remotely. These virtual solutions can provide staff who are isolated on ships, oil-rigs or on-site production facilities with the ability to conduct training virtually.
The ability to record and reconstruct simulated user experiences has provided a viable route to reviewing user gameplay in various types of simulated user experiences such as video game, VR or AR experiences.
Reviewing gameplay may be useful for various reasons, such as assessing a user's performance in a training or exam session, identifying strengths and defects in the design and software of the experience, reproducing the experience for training purposes, reviewing experiences carried out under examination to identify any potential misconduct, identifying any bugs or abnormal events in the simulated experience, and so on.
However, at present, virtual training solutions are limited mostly to self-directed training, with no reliable method for the training expert to assess the trainee, or for the trainee to receive feedback from a human training expert as would be expected in classroom-based training. The lack of technology required for remote observation, monitoring, assessment and feedback of trainees' performance can significantly reduce the effectiveness of VR-, AR- and MR-based training as compared to classroom-based approaches. Therefore, without direct observation, outcomes from virtual training solutions may be less reliable.
Recording of VR, AR and MR user experiences has traditionally been approached using several different techniques, such as video recording the user throughout the experience or capturing animation data, using techniques traditionally adopted in visual effects for cutscenes, like those designed for movies and TV shows.
Animation capture records animation data throughout an experience; that is, it records any modification of an object throughout the game. The recorded modifications can then be played back so that the animation of the object(s) is reproduced. However, animation techniques are designed for visual uses, and the repetition of the recorded modifications does not provide any information on the gameplay experience, since the animation is formed independently of all other game data. This makes animation capture insufficient for any in-depth assessment of simulated user experiences, since the information required for a proper assessment mostly resides in the gameplay data.
Simulated user experiences are designed to respond to user inputs, hence the first and most basic recording techniques started by only capturing user inputs. Although recording user input makes it possible to recreate all deterministic gameplay data, recording user input alone is not sufficient to enable a full reconstruction of user gameplay, since simulated user experiences generally also comprise some elements which are non-deterministic. For example, any user action whose outcome is not predetermined is non-deterministic.
Hence, more sophisticated recording methods record all non-deterministic data alongside user inputs, such that during reconstruction, user inputs can be synced with the recorded non-deterministic data to reconstruct the gameplay experience linearly. These methods enable the reconstruction and reproduction of all outcomes of the recorded gameplay experience.
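This linear capture-and-sync approach can be sketched as follows (names and data layout are illustrative assumptions, not taken from the disclosure): user inputs and non-deterministic data are kept in a single time-ordered event list, which replay must consume from the beginning.

```python
def record_linear(events):
    """Lightweight linear capture: user inputs and non-deterministic data
    kept in acquisition order, each tagged with its time frame.
    (Illustrative structure; the patent does not fix a format.)"""
    return [(frame, kind, payload) for frame, kind, payload in events]

def reconstruct_linear(capture, apply_event):
    """Linear reconstruction: every event is applied from the start.
    Deterministic state is only known once replay reaches it, so the
    recording cannot be skipped, reversed, or fast-forwarded."""
    for frame, kind, payload in capture:
        apply_event(frame, kind, payload)

# Example: replay a tiny capture and observe the event order.
capture = record_linear([(0, "input", "grab"), (1, "non_deterministic", 0.42)])
log = []
reconstruct_linear(capture, lambda f, k, p: log.append(k))
```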
All the above-mentioned recording methods enable a linear capture and reconstruction of the gameplay. That is, the above-mentioned recording methods provide a formatted set of user input actions and non-deterministic data modifications which are ordered according to their time of acquisition and which can be reconstructed according to said time of acquisition. This ensures that during reconstruction of a simulated user experience all data are reconstructed in linear order according to their acquisition time, leading to the correct replication of any event (be it a user action or object modification), and in turn, to the reconstruction of the full user gameplay experience.
However, since the recorded data only capture user inputs and non-deterministic elements of the simulated user experience, the above approach provides no prior knowledge of gameplay until the data is reconstructed. The unrecorded data of the simulated user experience can only be identified once the events and actions conducted during gameplay are reproduced by linearly reconstructing the recorded data.
Therefore, such linear recording and reconstruction methods have various limitations. The recording must be reconstructed from the beginning in order to reproduce the simulated user experience; it cannot be skipped, reversed, or fast-forwarded. Each time a passage of the user experience needs to be re-assessed, the whole user experience must be reconstructed from the beginning, because the linear nature of the recording requires all previous data to be known in order to reproduce the same outcomes at any given point of the reconstructed user experience. As a result, reproducing a simulated user experience and reviewing gameplay can be slow, tedious, and of limited use. Because linear reconstructions of the gameplay experience lack information on unrecorded game data, which can only be identified after the reproduction has taken place during reconstruction, the advantages and effectiveness of the recording are very much limited, providing little benefit compared to video or animation recordings.
The following example illustrates the disadvantage of linear recordings of user inputs and non-deterministic data: if directions were provided for a journey from a predefined starting location on a map, it would not be possible to determine the end point of the journey unless the directions were followed in a linear fashion. Only by following all directions linearly would it be possible to determine the correct end point or to identify landmarks visited during the journey. Any divergence from the linear directions, as would happen if the directions were followed in the wrong order, would not lead to the expected end point.
Some prior art recording methods address the above problem by also capturing all data in the simulated user experience. That is, along with capturing user-input and non-deterministic data, they enable developers to record all deterministic data and any other data which are necessary to enable non-linear reconstruction of gameplay.
Assuming enough data is captured and the recorded data is configured correctly, the additional data recorded by these methods allows the simulated user experience to be reconstructed starting from any timeframe, thereby enabling non-linear reconstruction functionalities such as rewinding, skipping or fast-forwarding without the reconstruction diverging from the actual simulated user experience.
Additional deterministic data to be captured in order to enable non-linear reconstruction can be defined by the developer of the simulated user experience and can be used to reconstruct a specified time instance of the simulated user experience without having to linearly go through the recorded data from the beginning.
However, due to the large amount and complexity of data used in simulated user experiences, methods which attempt to record both non-deterministic and all deterministic data in real-time tend to be prone to errors during data capture. If any of the additional data is missed during recording and/or if data recording is not configured correctly, the reconstruction will likely diverge when using any non-linear functionality. Moreover, these methods have the significant disadvantage that, since large amounts of additional data are captured and saved in real time during the simulated user experience, software and hardware performance can be negatively impacted.
The recording process as well as the reconstruction process require additional overhead in processing and computing power to deal with the large amount of additional data. Often, this results in the gameplay experienced by the user deteriorating, with stuttering and overall poor performance, since the hardware must run the simulated user experience application alongside the recording of a large amount of data.
Some systems attempt to circumvent this problem by setting a limit on the time available for data to be recorded during each single time frame of the simulated user experience, any remaining data not saved during that frame being pushed for recording to the next frame. According to this approach, if the time limit for recording data of a given timeframe of the experience is reached, the additional deterministic data not yet saved is pushed to the next timeframe of the simulated user experience and acquired during that next timeframe. If in the next timeframe the time limit for recording is once again reached, data may be pushed to yet the following timeframe, and so on. This recording approach minimises the impact on the simulated user experience by prioritising the responsiveness of the real-time experience at the expense of the accuracy of the recorded data. If a considerable amount of data is shifted by one or more timeframes, the recorded data may be visibly out of sync with respect to the actual timeframe in which it was generated. The more data needs to be saved and the longer the recording, the more likely it is that data are pushed out of sync, causing increasing deterioration in the quality of the recording.
Due to the above reasons, recording methods which capture all data necessary for non-linear reconstruction in real-time are generally only suitable for capturing and replaying short gameplay clips and/or for capturing the gameplay of a single user or from a single point of view, such as to re-watch events from the perspective of a kill cam in a modern shooter game. However, this does not offer complete insight into the simulated user experience and significantly limits the depth of the analysis which can be carried out on the recorded data. An AI analysis tool, for example, would be unlikely to learn anything significant from such limited recordings.
SUMMARY
It is desirable to provide methods and system which overcome one or more of the above limitations.
According to a first aspect of the present disclosure there is provided a method for recording a user experience generated by a computer-implemented application, wherein the application is configured to receive one or more user inputs and to process the one or more user inputs to generate the user experience; the method comprising: generating a first capture of the user experience by recording the one or more user inputs; and recording the user experience by generating a second capture of the user experience, whereby the application processes the user inputs from the first capture to generate a set of deterministic data.
Optionally, generating the second capture of the user experience comprises recording the set of deterministic data generated by the application.
Optionally, generating the first capture of the user experience comprises recording a set of non-deterministic data generated by the application; and generating a second capture of the user experience comprises the application processing the non-deterministic data from the first capture.
Optionally, the user experience comprises one or more objects; and the method further comprises providing a database for recording changes of the one or more objects; and generating in the database an object instance for each object.
Optionally, each object instance comprises one or more default components which are common to all objects and one or more object specific components, each component comprising one or more variables.
Optionally, each object instance comprises one or more variables, each variable corresponding to one of a user input, a deterministic data and a non-deterministic data; and recording user input, recording non-deterministic data and recording deterministic data each comprises updating the value of the corresponding variable in the database.
Optionally, the first capture is generated over a first time interval and the second capture is generated over a second time interval; the first time interval comprises a first plurality of consecutive time frames; the second time interval comprises a second plurality of consecutive time frames, each time frame of the second plurality corresponding to a time frame of the first plurality; and recording a user input, non-deterministic data or deterministic data each comprises: at each time frame, determining if a value of said user input, non-deterministic data or deterministic data has changed with respect to the previous time frame by querying the corresponding variable in the database; and if a change is detected, creating a new instance of the corresponding variable in the database.
Optionally, the method further comprises timestamping each new instance of a variable with the time frame at which the change was detected.
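A minimal sketch of such a change-tracking database (the class and method names are illustrative; the disclosure does not prescribe an implementation) appends a new timestamped instance of a variable only when its value differs from the previously recorded one, which also enables looking up a value at an arbitrary time frame:

```python
class ChangeDatabase:
    """Stores, per variable, a list of (frame, value) instances.
    A new instance is appended only when the value has changed since
    the previously recorded frame."""
    def __init__(self):
        self.instances = {}  # variable name -> [(frame, value), ...]

    def record(self, frame, name, value):
        history = self.instances.setdefault(name, [])
        if not history or history[-1][1] != value:
            history.append((frame, value))  # timestamped new instance

    def value_at(self, name, frame):
        # Latest instance at or before `frame` -- this is what makes
        # non-linear lookup possible without replaying from the start.
        candidates = [v for f, v in self.instances[name] if f <= frame]
        return candidates[-1] if candidates else None

# A position that only changes at frames 0 and 2 yields just two instances.
db = ChangeDatabase()
db.record(0, "pos", (0, 0))
db.record(1, "pos", (0, 0))  # unchanged: no new instance
db.record(2, "pos", (1, 0))
db.record(3, "pos", (1, 0))  # unchanged: no new instance
```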
Optionally, all variables in the database are of a predetermined variable type, and determining if a value of said user input, non-deterministic data or deterministic data has changed with respect to the previous time frame by querying the corresponding variable in the database comprises: storing in a library an original variable type of said user input, non-deterministic data or deterministic data; converting the value of said user input, non-deterministic data or deterministic data to the predetermined variable type; and comparing the converted value with the instance of the corresponding variable in the database which is timestamped with the previous time frame.
Optionally, generating a second capture of the user experience, whereby the application processes the user inputs from the first capture to generate a set of deterministic data comprises: for each user input retrieving from the library its original variable type; retrieving the variable corresponding to said user input from the database; and converting said variable to the original variable type prior to the application processing the user input.
Optionally, generating a second capture of the user experience further comprises: for each non-deterministic data, retrieving from the library its original variable type; retrieving the variable corresponding to said non-deterministic data from the database; and converting said variable to the original variable type prior to the application processing the non-deterministic data.
Optionally, the user experience is a simulated user experience.
Optionally, the simulated user experience is an extended reality experience.
According to a second aspect of the disclosure there is provided a computer implemented system for recording a user experience generated by an application, wherein the application is configured to receive one or more user inputs and to process the one or more user inputs to generate the user experience; the system comprising: one or more processors configured to execute the application; a first capture module configured to generate a first capture of the user experience by recording the one or more user inputs; and a second capture module configured to record the user experience by generating a second capture of the user experience, whereby the application processes the user inputs from the first capture to generate a set of deterministic data.
Optionally, the second capture module is configured to receive as input the set of deterministic data generated by the application and to record the set of deterministic data to generate the second capture.
Optionally, the first capture module is further configured to record a set of non-deterministic data generated by the application to generate the first capture; and recording the user experience by generating a second capture of the user experience, whereby the application processes the user inputs from the first capture to generate a set of deterministic data comprises the application processing the set of non-deterministic data from the first capture.
Optionally, the user experience comprises one or more objects; and the system further comprises a database configured to record changes of the one or more objects, the database comprising an object instance for each object of the user experience.
The system of the second aspect may also incorporate or provide features of the first aspect and perform various other steps as disclosed herein.
BRIEF DESCRIPTION OF THE DRAWINGS
The disclosure is described in further detail below by way of example and with reference to the accompanying drawings, in which:
Figure 1 is a schematic diagram of a method for recording a user experience according to a first aspect of the present disclosure;
Figure 2A is a schematic diagram illustrating the working of a first step of the method of Figure 1;
Figure 2B is another schematic diagram illustrating the working of the first step of the method of Figure 1;
Figure 3 is a schematic diagram illustrating an example hierarchy for a typical user experience;
Figure 4 is a schematic diagram illustrating the workings of a method for recording a user experience according to a specific embodiment of the present disclosure;
Figure 5 is a schematic diagram of a system for updating a variable in a database according to some embodiments of the present disclosure;
Figure 6 is a further schematic diagram illustrating a method for recording deterministic data during a second step of the method of Figure 1;
Figure 7 is a schematic diagram of a system for updating a variable in the database according to a specific embodiment of the present disclosure;
Figure 8 is a schematic diagram illustrating the workings of a library for converting a variable to a predetermined variable type, in accordance with a specific embodiment of the present disclosure;
Figure 9 is a schematic diagram of an object of an example user experience;
Figure 10 is a schematic diagram illustrating the generation of a first capture of a user experience and a second capture of a user experience according to a specific embodiment of the present disclosure;
Figure 11 is a schematic diagram illustrating a timeline of a recording of a user experience according to some of the methods and systems of the present disclosure;
Figure 12 shows an example graphical user interface for replaying a recorded user experience;
Figure 13 is a schematic diagram of a computer implemented system for recording a user experience generated by an application, in accordance with a second aspect of the present disclosure.
DETAILED DESCRIPTION
The present disclosure provides methods and systems for recording user experiences. In particular, the present disclosure provides computer-implemented methods and systems for recording simulated user experiences. Recording user experiences may be particularly beneficial for monitoring, assessing and supporting training delivered via simulated user experiences.
Simulated user experiences may comprise extended reality experiences as well as any other type of electronic game experiences which do or do not make use of extended reality features. Generally, electronic game means any game that involves interaction with one or more electronic user interfaces or input devices, such as joysticks, controllers, keyboards, or motion sensing devices, to generate feedback for a player, where the feedback may be shown on a video device, such as a standard TV set, monitor, touchscreen, VR or AR headset or screen. Any electronic game or simulated user experience may or may not have a video-based feedback. For example, some electronic games may provide user interactions and feedback only via audio, tactile or other sensor-based input and output devices which allow interaction with any of the user's senses.
Although the present disclosure is directed in particular to the recording of extended reality experiences, such as virtual reality (VR), mixed reality (MR) or augmented reality (AR) user experiences, user experience may refer to any sort of electronic game and/or user experience provided within a digital environment. Although the methods and systems of the present disclosure are described with reference to simulated user experiences, they are suitable for the recording and reconstruction of user actions in any computer-based application, such as the reproduction of a poster in Adobe Photoshop, or the reproduction of models in 3D modelling software like Blender. Such use-cases would be useful, for example, for teaching and training purposes, allowing instructors to supply tutorial formats to students, as well as for providing remote methods of observing students' work and for allowing identification of actions completed by the user for problem solving and assistance tasks.
Simulated user experiences may be classified according to the platform on which they are implemented, such as arcade platforms, console platforms, PCs, smartphones, tablet computers, virtual and augmented reality systems, and remote cloud gaming. It will be appreciated that the present disclosure is not limited to any specific platform. Moreover, it will be appreciated that the word "game(s)" is used herein in a broad sense and is not meant to limit the present disclosure to simulated user experiences executed for the mere entertainment and enjoyment of the user, but refers to simulated user experiences executed for any purpose, such as entertainment, education, training, learning, or physical and/or mental exercise, to name a few. It will also be appreciated that simulated user experiences may involve one or more individuals and/or computer entities and that the word "user" is not limited to a single person but may also refer to a group of users, including non-human users.
Hereinafter, simulated user experience may also be referred to as "game experience", "simulated experience" or "user experience".
Hereinafter, linear reconstruction means that the reconstruction of the simulated user experience is performed from recorded data which needs to be processed in a linear order according to its acquisition times, which may be recorded by applying a timestamp to the recorded data. In a linear reconstruction, data cannot be reproduced or reconstructed out of pre-recorded order and no alteration to the point of reconstruction can take place, meaning the reconstruction cannot be skipped ahead or reversed backwards. The reconstruction must be restarted from the beginning whenever data needs to be re-examined.
Non-linear reconstruction means that the simulated user experience can be reconstructed from the recorded data dynamically, that is starting from any point of the recording regardless of the order the data was acquired in. A non-linear reconstruction allows for skipping, fast-forwarding or reversing from any point of the recording.
Runtime is the time at which the user experience to be recorded takes place. At runtime a user is interacting with the simulated user experience via one or more input devices.
Figure 1 is a schematic diagram of a method 100 for recording a user experience 102 according to a first aspect of the present disclosure. The user experience 102 is generated by a computer-implemented application 104 which is configured to receive one or more user inputs 106 and to process the one or more user inputs to generate the user experience 102. The user experience 102 to be recorded may be, for example, a simulated user experience, such as an extended reality user experience.
The method 100 comprises, at step 110, generating a first capture of the user experience by recording the one or more user inputs; and at step 120, recording the user experience by generating a second capture of the user experience, whereby the application processes the user inputs from the first capture to generate a set of deterministic data.
Hereinafter, the first capture generated at step 110 of method 100 may also be referred to as "lightweight capture" and the second capture generated at step 120 of method 100 may be referred to as "processed capture".
In preferred embodiments, generating the first capture comprises recording the user inputs and one or more non-deterministic data generated by the application 104; and generating the second capture comprises recording one or more deterministic data generated by the application 104.
Non-deterministic data means any data of the user experience which does not have a predetermined value or state and which has an element of randomness applied to it which cannot be accurately reproduced during a reconstructions of the experience from the user inputs alone. For example, non-deterministic data may comprise data related to an object of the user experience which is Al-driven or physics-driven. Deterministic data means any data of the user experience which have a fixed state or which can be reproduced by the application 104 in a deterministic way from the user inputs and the non-deterministic data, according to one or more predetermined rules.
In some embodiments, the lightweight capture may be generated by recording all user inputs and all non-deterministic data which are indispensable to at least enable a linear reconstruction of the user experience. In other words, the lightweight capture may comprise all experience data which are outwith the control of application 104. The processed capture then adds to the lightweight capture all experience data not previously recorded in the lightweight capture, such that non-linear functionalities are enabled.
In some embodiments, the lightweight capture is generated during a first time interval, hereinafter also referred to as "runtime"; and the processed capture is generated during a second time interval, hereinafter also referred to as "reconstruction time interval". The first and second time intervals may overlap, that is, the reconstruction of the user experience from the data comprised in the lightweight capture may begin before the lightweight capture has been fully generated.
It will be appreciated that the lightweight capture may also comprise one or more deterministic data, though in preferred embodiments the lightweight capture will only comprise user inputs and any non-deterministic data, such that the impact on the runtime performance of the user experience is minimal: only the minimum amount of data necessary to enable linear reconstruction is captured, thereby reducing the computing power required to enable the recording.
The lightweight capture generated at step 110 may be saved, for example, as a data file. Thereafter, the lightweight capture may be provided in input to the application 104, which processes the user inputs and/or non-deterministic data comprised in the lightweight capture in order to re-generate the deterministic data of the user experience which have not been recorded in the lightweight capture.
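A minimal sketch of this replay step is given below (the event structure and the toy `application_step` rule are hypothetical, introduced only to illustrate re-generation of deterministic data from recorded inputs):

```python
def replay_lightweight_capture(capture, application_step):
    """Feed the recorded user inputs and non-deterministic data back to the
    application in timestamp order; the application re-generates (and here
    returns) the deterministic data that was not recorded at step 110."""
    regenerated = []
    for event in sorted(capture, key=lambda e: e["t"]):
        regenerated.extend(application_step(event))
    return regenerated

# Toy application step: a deterministic rule "the door opens when the button
# is pressed" (an assumed example, not part of the disclosure).
def application_step(event):
    if event["name"] == "buttonA" and event["value"]:
        return [{"t": event["t"], "name": "door_open", "value": True}]
    return []

capture = [{"t": 2, "name": "buttonA", "value": True},
           {"t": 1, "name": "buttonA", "value": False}]
deterministic = replay_lightweight_capture(capture, application_step)
```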
Figures 2A and 2B are schematic diagrams illustrating the working of the first step 110 of the method 100. Figure 2A shows a virtual reality experience 200 in which a user 204 interacts with the application 104 via user inputs 206. Figure 2B illustrates a virtual reality experience 200' reconstructed from a lightweight capture of virtual reality experience 200. The lightweight capture comprises recorded user inputs 206'. By reconstructing the virtual experience 200 from the lightweight recording it is possible to reproduce the user's inputs and actions, imitating the experience of a user being present and interacting with the virtual reality environment of the virtual user experience.
It will be appreciated that the computer-implemented application 104 may be configured to be executed on a variety of devices, such as laptops, desktops, consoles, smartphones, tablets, etc. and that the application 104 may be run on the same or on different devices at runtime and at reconstruction time. Moreover, each of the captures may be generated using the same or a different device than the device employed to run the application 104 which generates the virtual experience at runtime.
At step 120 the application 104 linearly reconstructs the user experience by processing the user inputs and non-deterministic data recorded in the lightweight capture, thereby generating the deterministic data. Step 120 may commence at any time after the start of the generation of the lightweight capture. Preferably, if step 120 is performed on the same computing device as step 110, step 120 will only commence once the generation of the lightweight capture has terminated, such that the execution of step 120 does not impact the user experience at runtime.
By adding the deterministic data to the processed capture, non-linear reconstructing and replaying functionalities are enabled.
This two-layer recording approach, in which only the lightweight recording is captured during runtime, overcomes the lag introduced by prior art methods which attempt to record all data during runtime, whilst at the same time allowing the full experience data to be captured during step 120, thereby ensuring non-linear functionalities of the reconstruction and replay of the user experience.
Moreover, the length of the recording is no longer an issue since the processed capture is generated after runtime and with no limit applied to the time allowed for recording data, allowing all data to be recorded with the correct acquisition time and therefore ensuring that they are correctly synced.
Figure 3 is a schematic diagram illustrating an example hierarchy for a typical user experience 300. The user experience 300 comprises one or more objects 302, such as Object A and Object B shown in the figure. Each object 302 comprises one or more components 304, such as component 1 and component 2 of objects A and B respectively in the figure. Each component 304 comprises one or more variables 306; for example variables "Object A - Comp1 - Var 1", "Object A - Comp1 - Var 2" and "Object A - Comp1 - Var 3" of Component 1 of Object A; "Object A - Comp2 - Var 1" of Component 2 of Object A; and so forth. Each object 302 may comprise a different number of components 304 and each component 304 may comprise a different number of variables 306.
It will be appreciated that different user experiences may have different hierarchies from the specific example shown in Figure 3. In particular, the objects of the user experiences may themselves have different hierarchies.
In some embodiments, the method 100 may further comprise the step of providing a database for recording changes of the objects 302 and generating in the database an object instance for each object of the user experience.
Figure 4 is a schematic diagram illustrating the workings of a method for recording a user experience according to a specific embodiment of the method 100 wherein a database 420 for storing the evolution of each object 302 in the recorded user experience is provided. Common reference numerals and variables between figures represent common features.
At step 110, the first capture 402 (lightweight capture) is generated and at step 120 the second capture (processed capture) 404 is generated. During the first step 110, the non-deterministic data 406 and user inputs 106 are recorded in real-time.
During the second step 120, the user inputs from the first capture 402 are processed by the application 104 to re-generate the deterministic data 408 of the user experience, which were not recorded at step 110. At step 120, the deterministic data 408 are recorded to generate the second capture 404.
In some embodiments, the second capture may be generated by creating a copy of the first capture 402 and overwriting said copy. In other embodiments the second capture may be generated by directly overwriting the first capture 402.
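The "copy" variant described above might be sketched as follows (event names and structure are assumptions for illustration only):

```python
import copy

def generate_processed_capture(first_capture, deterministic_events):
    """Generate the second capture as a copy of the first capture extended
    with the re-generated deterministic data (the 'copy' variant, which
    leaves the original lightweight capture intact)."""
    second = copy.deepcopy(first_capture)
    second.extend(deterministic_events)
    second.sort(key=lambda e: e["t"])  # keep all events in acquisition order
    return second

first = [{"t": 1, "name": "buttonA", "value": True}]
extra = [{"t": 2, "name": "door_open", "value": True}]
second = generate_processed_capture(first, extra)
```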
The user inputs 106 may comprise for example a user role and/or one or more signals generated by one or more hardware input devices, such as a headset device, a motion sensor, a joystick device, and so on. The signals may represent for example a motion of the device, or the state of a button or a switch, or an audio, visual or other type of signal captured by a sensor, such as an eye-tracking signal. The methods and systems of the present disclosure may be configured to account for functionality and usability of any sort of user input device and to simultaneously record inputs from different types of input devices.
In preferred embodiments, only non-deterministic data and user inputs are recorded at step 110. However, in some embodiments some deterministic data may be recorded at step 110. The amount of data recorded to generate the first capture 402 is selected such that the recording at step 110 has negligible impact on the user experience performance at runtime.
In both steps 110 and 120, the database comprises an instance 430 of each object 302 in the user experience (not shown). Object instances may have different properties depending on the specific user experience and on the specific object.
In some embodiments, each object instance comprises one or more default components which are common to all objects and one or more object specific components, wherein default and object specific components may in turn comprise one or more variables. In other embodiments, an object hierarchy may be more complex and comprise various intermediate entities between components and variables; in yet other embodiments the hierarchy may be simpler and an object only comprises one or more variables. The structure and hierarchy of each object instance in the database 420 reflects the structure and hierarchy of the corresponding object in the user experience.
Each variable of an object instance corresponds to a user input, a deterministic data or a non-deterministic data. User inputs, non-deterministic data and deterministic data are recorded by updating the value of the corresponding variable in the database 420.
In the specific example of Figure 4, database 420 comprises object instance 430 and object instance 430 comprises variables 432, 434, 436. Variables 432 and 434 correspond to non-deterministic data and variable 436 corresponds to a deterministic data.
The first capture 402 is generated during runtime and the second capture 404 is generated over a reconstruction time interval. The runtime and reconstruction time intervals are each divided into a plurality of consecutive time frames, wherein each time frame of the reconstruction time interval corresponds to a time frame of the runtime interval, such that deterministic data recorded at step 120 are properly synced with non-deterministic data and user inputs recorded at step 110.
At each time frame of the runtime, the user inputs and the non-deterministic data are monitored to determine whether their value has changed with respect to the previous time frame. That is, the database is queried to retrieve the corresponding variable and to compare the value of the variable stored in the database with the user input or non-deterministic data at the present time frame. Only if a change is detected is a new instance of the variable created in the database with the present value of the user input or non-deterministic data.
Similarly, during the reconstruction time interval, at each time frame all deterministic data which have not been previously recorded in the lightweight capture are monitored and compared with the values stored in the database as explained above and, only if a change is detected, a new instance of the corresponding variable is created.
For example, a new variable instance 432i of variable 432 is created at time t2 (Var 1, t2); a new variable instance 434i of variable 434 is created at times t3 and t4; and so forth. Each new variable instance is time-stamped by recording the time frame at which the instance was created and associating said time frame with the new instance. Any time the first capture is processed by the application 104 or the second capture is replayed by a user for monitoring purposes, the timestamps associated with each variable instance ensure that all experience data are reconstructed in sync with each other and with the user inputs.
If a variable which is being monitored at either step 110 or 120 has not changed value, no recording takes place for that variable at that specific time frame. If no value exists for a variable which is being monitored, i.e. the variable has not yet been initialized in the database, the present value is assigned to the variable in the database.
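The per-time-frame recording described above can be sketched as follows (a simplified model in which the database is an in-memory dictionary of timestamped instance lists; all names are illustrative, not from the disclosure):

```python
def record_frame(db, frame, values):
    """At each time frame, create a new timestamped instance of a variable
    only when its value differs from the latest instance in the database;
    an uninitialized variable is simply assigned its present value."""
    for name, value in values.items():
        history = db.setdefault(name, [])           # list of (frame, value)
        if not history or history[-1][1] != value:  # changed, or not yet set
            history.append((frame, value))

db = {}
record_frame(db, 1, {"pos": (0, 0), "button": False})
record_frame(db, 2, {"pos": (0, 0), "button": True})  # only "button" changed
record_frame(db, 3, {"pos": (1, 0), "button": True})  # only "pos" changed
```

After the three frames, each variable's history holds only the frames at which its value actually changed, which is what keeps the recording lightweight.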
The time frames of the recording may or may not match a frame rate at which the user experience is executed. In preferred embodiments the time frames of the recording will be equal to or greater than a minimum predetermined frame rate considered suitable for the recorded user experience. For example, VR applications typically have a minimum framerate of 72 frames per second. By only updating a variable when its value has changed, the methods and systems of the present disclosure help ensure that the required framerate is met during recording.
Figure 5 is a schematic diagram of a system 500 for updating a variable 506 stored in the database 402. The system 500 comprises a monitoring module 510 configured to, for each variable 506 stored in the database 402, compare the value of the latest variable instance 506i with the value of the corresponding user input or non-deterministic data or deterministic data at the current time frame. If a change is detected, the value of the input/data at the current time frame is provided in input to a saving module 520 which is configured to create a new instance 506i of the variable in the database. The saving module 520 is further configured to attach a timestamp 507i to the instance 506i. If no change is detected, the variable in the database is left unchanged and no action is performed by the saving module 520. The value of the variable, whether updated or unchanged, is provided to the application 104 for the next time frame.
The timestamps 507i attached to variables which correspond to user inputs or non-deterministic data enable the linear reconstruction of the data recorded in the lightweight capture at step 120. User inputs and non-deterministic data are fed into the application 104 according to their timestamps such that the application 104 is "tricked" into believing that the actions and input modifications are being provided by a user currently going through the experience. At step 120 the lightweight recording is reproduced linearly and any data evolution not previously captured is recorded.
Figure 6 is a schematic diagram illustrating a method 600 for recording deterministic data during step 120, according to a specific embodiment of the method 100.
The monitoring and recording of data in the database follow the same procedure as during the step 110. However, during step 120 all data not previously captured are monitored and recorded. The method 600 is configured to, at the beginning of step 120, perform a scan of all variables in the experience hierarchy and compare said scan with the variables currently saved in the database, in order to identify variables which have not yet been recorded by the lightweight capture. These variables are monitored during the processing of the lightweight recording and all missing experience data are recorded to generate the second capture, which may then be saved as a data file.
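The initial scan that identifies variables missing from the lightweight capture can be sketched as a simple set difference (the path-like naming scheme below is an assumed convention, not part of the disclosure):

```python
def unrecorded_variables(hierarchy, db):
    """Scan the full experience hierarchy and return the variables that the
    lightweight capture has not yet recorded; these are the variables to be
    monitored while the lightweight recording is reprocessed at step 120."""
    all_vars = {
        f"{obj}/{comp}/{var}"
        for obj, components in hierarchy.items()
        for comp, variables in components.items()
        for var in variables
    }
    return all_vars - set(db)

hierarchy = {"ObjectA": {"Comp1": ["Var1", "Var2"], "Comp2": ["Var1"]}}
db = {"ObjectA/Comp1/Var1": [(1, 0.0)]}  # already in the lightweight capture
missing = unrecorded_variables(hierarchy, db)
```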
It will be appreciated that different embodiments may implement different monitoring and recording architectures. For example, in some embodiments a new instance of each object may be created in the database each time any component or variable of the object is updated, as opposed to only creating a new instance of the specific variable which has been updated. The same concept may apply at any level of the object hierarchy; for example, in other embodiments wherein each object comprises one or more components and each component comprises one or more variables, a new instance of each component may be created in the database any time a variable within the component is updated.
The system and methods of the present disclosure may be configured to handle multiple reconstructions in parallel or sequentially, meaning the system and the method of the present disclosure may be configured to automatically process multiple lightweight captures simultaneously or sequentially to generate the corresponding processed captures.
Because generating the processed capture requires considerable computing resources, it is expected that in some applications the processed capture generation may be conducted on a remote server or computer hardware which is not the same server or computer hardware employed to execute the user experience and/or to generate the lightweight capture.
Each step/component of the methods and system according to the present disclosure is configured to work independently with no input required by the developer or user of the user experience.
Variables of the objects in the user experience may be of different types. For example, a variable may be of any of the types which are common in computer programming, such as integer, string, bool, float. Variables may also be of types specific to user devices typically used in VR, AR or gaming experiences in general, such as "ButtonStatus" or "PlayerRole". Custom variable types may be created by the developer of the user experience.
In some embodiments, the methods and systems of the present disclosure may employ software based on object-oriented programming. As will be known to the skilled person, all variable types within object-oriented programming stem from the base type 'object' and may have various predetermined fields, such as functions and methods which depend on the variable type. For example, a variable of type string will always have methods that enable string functionality, such as "ToUpper" / "ToLower" methods, which allow the value of the variable to be made into upper or lower case.
In order to simplify the handling and monitoring of variables of different types, the database 420 may be configured to store only variables of a predetermined variable type, such as a "string" type. In such embodiments, the systems and methods of the present disclosure may be configured to ignore all fields which are unique to the specific variable type and to deal with all variables as of type 'object', that is, to cast each variable to the 'object' type. By ignoring the predetermined fields, access to some information or functionalities may be lost; however, these are not indispensable for the purpose of saving and monitoring the variables in the database. Casting all variables to 'object' simplifies the handling, monitoring and storage of each variable, avoiding the need to configure each function to work for each individual variable type.
When variables are cast to 'object', only a reference to their value stored in memory is maintained. Said value can then be converted to the predetermined value type that the database is configured to accept, e.g. "string". Meanwhile, the original variable type can be stored in a library for subsequent recovery of the full variable information.
Some embodiments may be configured to: store in a library the original variable type of the values of each user input and deterministic or non-deterministic data; convert the values into a predetermined type (e.g. string); and compare the converted value with the instance of the variable in the database. This minimizes coding and computing costs by enabling the same steps/components to be used for comparing all inputs and data with the corresponding variables in the database, as explained above.
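The store-and-compare scheme just described might look like the following (Python's dynamic types stand in for the 'object' casting described in the disclosure; function names are hypothetical):

```python
type_library = {}  # variable name -> original type, kept for later recovery

def to_stored_string(name, value):
    """Record the original type in the library and return the string form
    that the database is configured to accept."""
    type_library[name] = type(value)
    return str(value)

def has_changed(stored_value, name, value):
    """Compare a live value, after conversion, with the stored instance,
    so the same comparison step works for every variable type."""
    return stored_value != to_stored_string(name, value)
```

With this approach a single comparison routine serves integers, booleans, and any custom type, since everything is compared in its converted string form.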
When the application 104 processes the user inputs and non-deterministic data from the first capture in order to re-generate the deterministic data, each user input and/or non-deterministic data is retrieved from the database by: retrieving from the library its original variable type; retrieving the variable corresponding to the user input or non-deterministic data from the database; and converting said variable to the original variable type prior to providing the user input or data to the application 104.
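The reverse conversion, from the stored string back to the original type, can be sketched as follows (only a few built-in types are handled; the bool special case is needed because converting a non-empty string always yields a truthy value):

```python
def from_stored_string(name, stored, type_library):
    """Convert a stored string back to its original variable type before the
    application processes it (only simple types are handled in this sketch)."""
    original_type = type_library[name]
    if original_type is bool:
        return stored == "True"  # bool("False") would wrongly be truthy
    return original_type(stored)

library = {"score": int, "speed": float, "button": bool}
```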
Figure 7 is a schematic diagram of a system 700 for updating a variable 706 in the database 420, according to a specific embodiment of the method 100. The system 700 comprises a recording module 710 and a saving module 720. The recording module comprises a conversion module 712 which is configured to cast the variable 706 from an original variable type to a string type, such that the variable value is converted into a string. The saving module 720 is configured to save the converted variable value into the database 420.
Figure 8 is a schematic diagram 800 illustrating the workings of a library for converting a variable 706 to a predetermined variable type, in accordance with a specific embodiment of the method 100. The library may comprise a list of pre-defined and custom variable types. When the data or user input corresponding to a string variable stored in the database 402 needs to be reconstructed, the original variable type of the data or input is retrieved from the library and the string variable is converted to the original variable type prior to being provided in input to the application 104.
In some embodiments, each object of the user experience may have a unique name and unique identifier-key such that each object instance in the database 402 can be easily retrieved during recording and reconstruction. This is illustrated for example in Figure 9, which is a schematic diagram of an object 900 according to a specific example of the user experience 102. The object 900 comprises one or more predefined components 902 which are common to each object. Each predefined component may have a predefined hierarchy. For example, each predefined component may comprise one or more sub-components and/or one or more variables, such as variables related to an activation state, position, rotation, or scale of the object. The object 900 further comprises a plurality of custom components 904 which are specific to the object 900. Each component may comprise one or more variable data, whose values determine the properties of the object. For example, an object may comprise a variable corresponding to a virtual health bar value of the object.
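An object instance of this kind might be structured as below (the `Transform` default component and the `Health` custom component are assumed examples in the style of common game engines, not terms from the disclosure):

```python
import uuid

def make_object_instance(name, custom_components=None):
    """Build a database object instance with a unique identifier-key, the
    default components common to all objects, and any custom components."""
    return {
        "id": str(uuid.uuid4()),  # unique identifier-key for fast retrieval
        "name": name,
        "components": {
            # predefined component present on every object
            "Transform": {"active": True, "position": (0, 0, 0),
                          "rotation": (0, 0, 0), "scale": (1, 1, 1)},
            **(custom_components or {}),
        },
    }

enemy = make_object_instance("Enemy", {"Health": {"bar_value": 100}})
```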
Default variable data may be, for example, all variables of an object which are pre-defined by the game engine which generates a virtual reality experience, whilst custom components 904 may comprise all variables defined by a developer of the user experience.
In some embodiments, the method 100 may be configured to store one or more default variables of each object without converting them to a predetermined variable type. Since default data are pre-defined data which normally exist for all objects and all user experiences and which are likely to change between most time frames, handling and storing these data without converting them to the predetermined variable type allows for better performance and a lower computational cost of the recording and reconstructing operations.
All other variables corresponding to user input/experience data which are not default variables are handled by converting their type as previously explained. This handling approach makes it possible to dynamically capture all variable data regardless of their type, such that the recording methods and systems of the present disclosure have universal compatibility with all sorts of user experience data.
The above approach makes it possible to record all object and variable data while preserving the full hierarchy and granularity of the user experience.
Figure 10 is a schematic diagram illustrating an example scenario in which non-deterministic data are recorded during the generation of the first capture of a user experience according to a specific embodiment of the present disclosure (data shown in red in the figure) and all remaining experience data are captured during the generation of a second capture of the user experience (data shown in blue in the figure).
The above recording approach can be adopted both for the generation of the first and the second capture. The above recording operation is performed for all objects present within a user experience. If an object has no non-deterministic properties, none of the object information would be recorded during the generation of the lightweight capture. In this scenario only the user inputs would be recorded during the generation of the first capture and all data relating to the object and its variable information would be recorded during the generation of the second capture.
Figure 11 is a schematic diagram illustrating a timeline 1100 of a recording of a user experience according to some of the methods and systems of the present disclosure.
The methods and systems according to the present disclosure provide a complete datafile containing a history of all modifications of gameplay data of the user experience.
The processed capture may be saved in a datafile for future use. This allows the recorded user experience to be searched without having to reconstruct it. It would be possible to use a query to search a database of various datafiles to discover which user experiences contained a specific set of themes, events, or data outcomes. Because all the data modified during the user experience is recorded, it is easier to identify cause and effect relationships of the gameplay. For example, a query looking at user experiences in which a user pressed additional input buttons just prior to interacting with an object might be used to identify potential flaws in game design and user interaction, without having to watch every recorded experience.
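Such a query over saved datafiles can be sketched as below (the session names, event fields, and predicate are all illustrative assumptions):

```python
def find_experiences(datafiles, predicate):
    """Return the names of the recorded experiences in which at least one
    recorded data modification satisfies the predicate."""
    return [name for name, events in datafiles.items()
            if any(predicate(e) for e in events)]

datafiles = {
    "session_a": [{"t": 3, "var": "buttonX", "value": True},
                  {"t": 4, "var": "object_grabbed", "value": True}],
    "session_b": [{"t": 2, "var": "object_grabbed", "value": True}],
}
# Sessions in which a particular button was pressed during the experience.
hits = find_experiences(datafiles,
                        lambda e: e["var"] == "buttonX" and e["value"])
```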
In some embodiments, the method 100 may further comprise the step of providing a graphical user interface for interacting with a datafile of the recorded user experience and enabling various replay functionalities. Figure 12 shows an example graphical user interface which may be provided in some embodiments of the present disclosure for replaying a recorded user experience.
Figure 13 is a schematic diagram of a computer implemented system 1300 for recording a user experience 1302 generated by an application, in accordance with a second aspect of the present disclosure. The application (not shown) is configured to receive one or more user inputs 1306 and to process the one or more user inputs 1306 to generate the user experience 1302.
The system 1300 comprises: a first processor 1308a configured to execute the application; a first capture module 1310 configured to generate a first capture 1312 of the user experience by recording the one or more user inputs 1306; and a second capture module 1320 configured to record the user experience by generating a second capture 1322 of the user experience, whereby the application processes the user inputs 1306' from the first capture 1312 to generate a set of deterministic data 1324.
The disclosed methods and systems dynamically capture and reconstruct all data of a user experience at variable level, making it possible to dynamically access, capture, save and reconstruct individual variables from the recorded user experience. All the data from the user experience is stored in the database, preserving the hierarchy of the data in the simulated user experience. This dynamic approach allows individual values of variable data to be searched, accessed, reconstructed or modified without having to reconstruct the entire dataset of information, whilst minimizing the computing and memory costs of the methods and systems according to the present disclosure.
All variables, including public, private and protected variables, can be dynamically monitored and accessed.
Prior art methods for recording simulated user experiences either record too little data in order not to impact performance at runtime, which may result in a shallow reconstruction of gameplay which does not provide useful insight, or record too much data, which may result in limited performance at runtime and/or out-of-sync recorded data which can cause the reconstruction to diverge and become unrepresentative of the simulated user experience.
The methods and systems of the present disclosure provide a two-layer recording of a simulated user experience, comprising a lightweight recording and a processed recording, which enables the capture of a simulated user experience with minimal impact on the runtime performance of the simulated experience, since the lightweight recording only captures the minimum amount of data necessary to allow linear reconstruction of the experience during runtime. This makes the lightweight recording suitable for use on any device, including low-end hardware such as portable VR equipment and smartphones, without having any significant impact on the responsiveness and interactivity of the hardware.
The lightweight recording is provided in input to the processing module, which is configured to provide in output the processed recording of the simulated user experience by reconstructing and recording all remaining data of the simulated experience not previously captured by the lightweight recording. The processed recording allows for the identification of any game events, themes, and structures at any time frame of the recorded simulated experience via a simple query of the data modifications stored in the database. Hence the processed recording provides full non-linear functionalities and enables a more useful analysis of the simulated user experience. The methods of the present disclosure enable a lag-free experience at runtime and are therefore particularly useful for recording VR, AR or MR simulated experiences, where a consistently high refresh rate is required to provide an optimal user experience and to minimize motion sickness. The methods and systems of the present disclosure enable developer debugging, training feedback, generation of timeline data, summary of gameplay modifications, and searching and indexing of variable modifications.
It will be appreciated that the methods and systems of the present disclosure may be implemented by any suitable combination of hardware and software components and are not limited to the specific examples provided herein. For example, the database 420 need not be stored on a single device and may be provided on one or more memory components which comprise one or more of a hard-drive, a cloud-based storage, a server, and so on.
The systems and methods according to the present disclosure may be configured to be embedded in a software-based system which provides the simulated user experience.
It will be appreciated that in different embodiments some of the steps of the above methods may be executed in different order without departing from the scope of the present disclosure and that the stated steps do not preclude the presence or addition of one or more other steps.
It will also be appreciated that the components and method steps described above with reference to specific embodiments may be interchangeable and that many other embodiments may be obtained by combining individual components and methods steps in different ways without departing from the scope of the present disclosure.
In conclusion, the methods and systems of the present disclosure employ a new two-layered recording approach and a new way of handling data of simulated user experiences in order to provide a recording of the full simulated experience and non-linear reproduction of the recording without impacting performance at runtime or quality of the recording. Hence the methods and systems of the present disclosure provide training experts with unparalleled control of their observation of trainee performance, allowing monitoring from any perspective or time-period.
Various improvements and modifications may be made to the above without departing from the scope of the disclosure.

Claims (17)

CLAIMS

1. A method for recording a user experience generated by a computer-implemented application, wherein the application is configured to receive one or more user inputs and to process the one or more user inputs to generate the user experience; the method comprising: generating a first capture of the user experience by recording the one or more user inputs; and recording the user experience by generating a second capture of the user experience, whereby the application processes the user inputs from the first capture to generate a set of deterministic data.
2. The method of claim 1, wherein generating the second capture of the user experience comprises recording the set of deterministic data generated by the application.
3. The method of claim 2, wherein generating the first capture of the user experience comprises recording a set of non-deterministic data generated by the application; and generating a second capture of the user experience comprises the application processing the non-deterministic data from the first capture.
4. The method of claim 3, wherein the user experience comprises one or more objects; and the method further comprises providing a database for recording changes of the one or more objects; and generating in the database an object instance for each object.
5. The method of claim 4, wherein each object instance comprises one or more default components which are common to all objects and one or more object-specific components, each component comprising one or more variables.
6. The method of claim 4, wherein each object instance comprises one or more variables, each variable corresponding to one of a user input, a deterministic data and a non-deterministic data; and recording user input, recording non-deterministic data and recording deterministic data each comprises updating the value of the corresponding variable in the database.
7. The method of claim 6, wherein the first capture is generated over a first time interval and the second capture is generated over a second time interval; the first time interval comprises a first plurality of consecutive time frames; the second time interval comprises a second plurality of consecutive time frames, each time frame of the second plurality corresponding to a time frame of the first plurality; and recording a user input, non-deterministic data or deterministic data each comprises, at each time frame, determining if a value of said user input, non-deterministic data or deterministic data has changed with respect to the previous time frame by querying the corresponding variable in the database; and, if a change is detected, creating a new instance of the corresponding variable in the database.
8. The method of claim 7, wherein the method further comprises timestamping each new instance of a variable with the time frame at which the change was detected.
9. The method of claim 8, wherein all variables in the database are of a predetermined variable type and determining if a value of said user input, non-deterministic data or deterministic data has changed with respect to the previous time frame by querying the corresponding variable in the database comprises: storing in a library an original variable type of said user input, non-deterministic data or deterministic data; converting the value of said user input, non-deterministic data or deterministic data to the predetermined variable type; and comparing the converted value with the instance of the corresponding variable in the database which is timestamped with the previous time frame.
10. The method of claim 9, wherein generating a second capture of the user experience, whereby the application processes the user inputs from the first capture to generate a set of deterministic data, comprises, for each user input: retrieving from the library its original variable type; retrieving the variable corresponding to said user input from the database; and converting said variable to the original variable type prior to the application processing the user input.
11. The method of claim 10, wherein generating a second capture of the user experience further comprises, for each non-deterministic data: retrieving from the library its original variable type; retrieving the variable corresponding to said non-deterministic data from the database; and converting said variable to the original variable type prior to the application processing the non-deterministic data.
12. The method of claim 1, wherein the user experience is a simulated user experience.
13. The method of claim 12, wherein the simulated user experience is an extended reality experience.
14. A computer-implemented system for recording a user experience generated by an application, wherein the application is configured to receive one or more user inputs and to process the one or more user inputs to generate the user experience; the system comprising: one or more processors configured to execute the application; a first capture module configured to generate a first capture of the user experience by recording the one or more user inputs; and a second capture module configured to record the user experience by generating a second capture of the user experience, whereby the application processes the user inputs from the first capture to generate a set of deterministic data.
15. The computer-implemented system of claim 14, wherein the second capture module is configured to receive as input the set of deterministic data generated by the application and to record the set of deterministic data to generate the second capture.
16. The computer-implemented system of claim 14, wherein the first capture module is further configured to record a set of non-deterministic data generated by the application to generate the first capture; and recording the user experience by generating a second capture of the user experience, whereby the application processes the user inputs from the first capture to generate a set of deterministic data, further comprises the application processing the set of non-deterministic data from the first capture.
17. The computer-implemented system of claim 14, wherein the user experience comprises one or more objects; and the system further comprises a database configured to record changes of the one or more objects, the database comprising an object instance for each object of the user experience.
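Claims 4, 5 and 17 describe a database holding one object instance per recorded object, each instance combining default components common to all objects with object-specific components, each component holding variables. The claims prescribe no concrete implementation; the following is a minimal Python sketch in which every name (`ObjectDatabase`, `Component`, the `"transform"` default) is a hypothetical illustration:

```python
# Hypothetical sketch of the object/component recording database of
# claims 4-5 and 17: every registered object gets an instance with
# default components shared by all objects plus object-specific ones.
from dataclasses import dataclass, field


@dataclass
class Component:
    # A component is just a named bag of variables in this sketch.
    variables: dict = field(default_factory=dict)


@dataclass
class ObjectInstance:
    # Default components are common to every object (here, a transform).
    default_components: dict = field(
        default_factory=lambda: {"transform": Component({"position": (0, 0, 0)})}
    )
    # Object-specific components are supplied at registration time.
    specific_components: dict = field(default_factory=dict)


class ObjectDatabase:
    def __init__(self):
        self.objects = {}  # object id -> ObjectInstance

    def register(self, object_id, specific=None):
        """Create an object instance, attaching any object-specific components."""
        instance = ObjectInstance()
        for name, variables in (specific or {}).items():
            instance.specific_components[name] = Component(dict(variables))
        self.objects[object_id] = instance
        return instance
```

Registering, say, a door with a hinge component then gives an instance exposing both the shared transform and the door-only hinge variables.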
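Claims 6 to 9 describe per-frame change detection: every recorded value is converted to one predetermined storage type, its original type is kept in a library, and a new timestamped variable instance is created only when the converted value differs from the previous frame's. A minimal sketch of that scheme, assuming `str` as the predetermined storage type and with all names hypothetical:

```python
# Hypothetical sketch of claims 6-9: change detection against the last
# stored instance, storing all values as one common type (str) while a
# type library remembers each variable's original type for later restore.
class RecordingDatabase:
    def __init__(self):
        self.instances = {}     # variable name -> list of (frame, value-as-str)
        self.type_library = {}  # variable name -> original Python type

    def record(self, name, value, frame):
        """Store a new timestamped instance only if the value changed."""
        self.type_library.setdefault(name, type(value))
        converted = str(value)  # convert to the predetermined storage type
        history = self.instances.setdefault(name, [])
        if not history or history[-1][1] != converted:
            history.append((frame, converted))  # timestamp the new instance

    def value_at(self, name, frame):
        """Return the variable's value at `frame`, restored to its original type."""
        original_type = self.type_library[name]
        latest = None
        for timestamp, stored in self.instances.get(name, []):
            if timestamp <= frame:
                latest = stored
        if latest is None:
            return None
        if original_type is bool:  # bool("False") is truthy, so special-case it
            return latest == "True"
        return original_type(latest)
```

Because unchanged frames create no new instances, the recording grows with the number of changes rather than with the number of frames.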
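The replay idea in claims 2, 3, 10 and 11 is that the first capture stores only the user inputs and the non-deterministic data; the second capture is produced by feeding both back through the application's own deterministic logic, which regenerates the deterministic data identically. A toy sketch of that two-pass structure, with a made-up `update` function standing in for the application (all names are illustrative assumptions):

```python
# Hypothetical sketch of first/second capture: record inputs plus
# non-deterministic data, then replay them through deterministic logic
# so the deterministic data is regenerated rather than stored.

def update(position, user_input, wind):
    """Deterministic stand-in for the application's per-frame update."""
    return position + user_input + wind


def first_capture(inputs, winds):
    """Record only inputs and non-deterministic data (the wind samples)."""
    return {"inputs": list(inputs), "non_deterministic": list(winds)}


def second_capture(capture):
    """Replay the first capture, regenerating the deterministic positions."""
    position = 0
    positions = []
    for user_input, wind in zip(capture["inputs"], capture["non_deterministic"]):
        position = update(position, user_input, wind)
        positions.append(position)
    return positions
```

Since `update` is deterministic, replaying the same first capture any number of times yields the same second capture, which is what lets the deterministic data be omitted from the recording.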
GB2104444.1A 2021-03-29 2021-03-29 Methods and systems for recording a user experience Withdrawn GB2605570A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2104444.1A GB2605570A (en) 2021-03-29 2021-03-29 Methods and systems for recording a user experience


Publications (2)

Publication Number Publication Date
GB202104444D0 GB202104444D0 (en) 2021-05-12
GB2605570A true GB2605570A (en) 2022-10-12

Family

ID=75783848


Country Status (1)

Country Link
GB (1) GB2605570A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100251031A1 (en) * 2009-03-24 2010-09-30 Jason Nieh Systems and methods for recording and replaying application execution
US20110281645A1 (en) * 2010-05-11 2011-11-17 Roger Daniel Wolfson Method and apparatus for online rendering of game files
US8732670B1 (en) * 2010-06-29 2014-05-20 Ca, Inc. Ensuring determinism during programmatic replay in a virtual machine
US20140194211A1 (en) * 2013-01-09 2014-07-10 Blizzard Entertainment, Inc. Restoring gameplay by replaying past inputs
US20170246544A1 (en) * 2016-02-26 2017-08-31 Microsoft Technology Licensing, Llc Video game streaming for spectating



Similar Documents

Publication Publication Date Title
US11227439B2 (en) Systems and methods for multi-user virtual reality remote training
Pongnumkul et al. Pause-and-play: automatically linking screencast video tutorials with applications
MacIntyre et al. DART: a toolkit for rapid design exploration of augmented reality experiences
US20220172633A1 (en) Augmented reality and virtual reality systems
US20200310842A1 (en) System for User Sentiment Tracking
US9472119B2 (en) Computer-implemented operator training system and method of controlling the system
CN108619723A (en) A kind of processing method of application operating, device and storage medium
US11823587B2 (en) Virtual reality system with inspecting function of assembling and disassembling and inspection method thereof
EP4235629A1 (en) Recorded physical interaction playback
US20220223067A1 (en) System and methods for learning and training using cognitive linguistic coding in a virtual reality environment
US11327775B2 (en) Method for recording and playing back a media-synchronized user browsing session
Kamarianakis et al. Less is more: Efficient networked VR transformation handling using geometric algebra
TWI826764B (en) Method for producing and replaying courses based on virtual reality and system thereof
Steptoe et al. Multimodal data capture and analysis of interaction in immersive collaborative virtual environments
GB2605570A (en) Methods and systems for recording a user experience
CN106293703A (en) The method automatically generated based on developmental game software under particular model
WO2019190722A1 (en) Systems and methods for content management in augmented reality devices and applications
US20230162420A1 (en) System and method for provision of personalized multimedia avatars that provide studying companionship
US20210366299A1 (en) Selecting lesson asset information based on a physicality assessment
US20120215507A1 (en) Systems and methods for automated assessment within a virtual environment
Aziz et al. Linking computer game engines with remote experiments
US11922595B2 (en) Redacting content in a virtual reality environment
TWI773014B (en) Virtual reality interactive self-learning system
Siebenmann et al. Investigation into Recording, Replay and Simulation of Interactions in Virtual Reality
張純 Development of visual programming environment for virtual reality application

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused after publication under section 16(1)