WO2018205809A1 - Virtual reality application data processing method, computer device and storage medium - Google Patents

Virtual reality application data processing method, computer device and storage medium

Info

Publication number
WO2018205809A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual reality
playback
reality application
manipulation data
computer device
Prior art date
Application number
PCT/CN2018/083584
Other languages
English (en)
French (fr)
Inventor
周扬
林龙芳
Original Assignee
Tencent Technology (Shenzhen) Company Limited (腾讯科技(深圳)有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited
Priority to EP18799360.5A priority Critical patent/EP3611610A4/en
Publication of WO2018205809A1 publication Critical patent/WO2018205809A1/zh
Priority to US16/508,483 priority patent/US11307891B2/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46 Multiprogramming arrangements
    • G06F9/48 Program initiating; Program switching, e.g. by interrupt
    • G06F9/4806 Task transfer initiation or dispatching
    • G06F9/4843 Task transfer initiation or dispatching by program, e.g. task dispatcher, supervisor, operating system
    • G06F9/485 Task life-cycle, e.g. stopping, restarting, resuming execution
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/67 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45 Controlling the progress of the video game
    • A63F13/48 Starting a game, e.g. activating a game device or waiting for other players to join a multiplayer session
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45 Controlling the progress of the video game
    • A63F13/49 Saving the game status; Pausing or ending the game
    • A63F13/493 Resuming a game, e.g. after pausing, malfunction or power failure
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45 Controlling the progress of the video game
    • A63F13/49 Saving the game status; Pausing or ending the game
    • A63F13/497 Partially or entirely replaying previous game actions
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 Changing parameters of virtual cameras
    • A63F13/5252 Changing parameters of virtual cameras using two or more virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes room or displaying a rear-mirror view in a car-driving game
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46 Multiprogramming arrangements
    • G06F9/50 Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F9/5005 Allocation of resources, e.g. of the central processing unit [CPU] to service a request
    • G06F9/5027 Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resource being a machine, e.g. CPUs, Servers, Terminals
    • G06F9/5038 Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resource being a machine, e.g. CPUs, Servers, Terminals considering the execution order of a plurality of tasks, e.g. taking priority or time dependency constraints into consideration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/20 Scenes; Scene-specific elements in augmented reality scenes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/70 Game security or game management aspects
    • A63F13/77 Game security or game management aspects involving data related to game devices or game servers, e.g. configuration data, software version or amount of memory

Definitions

  • the present application relates to the field of computer technologies, and in particular, to a virtual reality application data processing method, a computer device, and a storage medium.
  • VR is an abbreviation of Virtual Reality.
  • some virtual reality applications require the user to perform a series of operations, where the user can enter the next operation only after completing the previous one. If the user wants to redo an operation from the historical operations, the user must restart from the beginning and operate step by step until reaching the target operation to be reproduced.
  • a virtual reality application data processing method a computer device, and a storage medium are provided.
  • a virtual reality application data processing method comprising:
  • the computer device receives a playback start instruction
  • the computer device restores the virtual reality application to an initial operating state in response to the playback start instruction
  • the computer device acquires pre-captured and stored manipulation data that was sent by the virtual reality operating hardware to the runtime library and that has a time sequence;
  • the computer device imports the manipulation data into the runtime library according to the corresponding time sequence; and
  • the computer device transmits the imported manipulation data through the runtime library to the virtual reality application in the initial running state for playback processing.
  • a computer device comprising a memory and a processor, the memory storing computer readable instructions, the computer readable instructions being executed by the processor such that the processor performs the following steps:
  • the imported manipulation data is transferred to the virtual reality application in an initial running state for playback processing.
  • One or more storage media storing computer readable instructions that, when executed by one or more processors, cause the one or more processors to perform the following steps:
  • the imported manipulation data is transferred to the virtual reality application in an initial running state for playback processing.
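The claimed flow can be illustrated with a minimal sketch. All class and function names here are hypothetical stand-ins, not from the patent: a mock runtime library relays pre-captured, time-ordered events to an application that has first been reset to its initial running state.

```python
from dataclasses import dataclass, field

@dataclass
class VRApp:
    """Stand-in for a virtual reality application."""
    state: list = field(default_factory=list)

    def reset(self):
        # restore the application to its initial running state
        self.state.clear()

    def handle(self, event):
        self.state.append(event["id"])

@dataclass
class RuntimeLib:
    """Stand-in for the runtime library that relays events to the app."""
    app: VRApp

    def deliver(self, event):
        self.app.handle(event)

def playback(app, runtime, stored_events):
    """Replay pre-captured manipulation data in its recorded time order."""
    app.reset()
    for event in sorted(stored_events, key=lambda e: e["t"]):
        runtime.deliver(event)  # import into the runtime, which relays to the app
    return app.state
```

The key point the sketch captures is that the application itself is unmodified: it only ever sees events arriving from the runtime layer, whether live or replayed.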
  • FIG. 1 is an application environment diagram of a virtual reality application data processing method in an embodiment
  • FIG. 2 is a schematic diagram showing the internal structure of a computer device in an embodiment
  • FIG. 3 is a schematic flowchart of a virtual reality application data processing method in an embodiment
  • Figure 4 is a block diagram of a system in one embodiment
  • FIG. 5 is a structural diagram of a parameter definition structure of a registration event callback function and an event import function in a runtime library in an embodiment
  • FIG. 6 is a schematic flowchart of a third-party view screen generating step in an embodiment
  • FIG. 7 is a timing diagram of a virtual reality application data processing method in an embodiment
  • FIG. 8 is a schematic flowchart of a step of generating an instant virtual reality picture in an embodiment
  • FIG. 9 is a schematic flow chart of a virtual reality application data processing method in another embodiment.
  • FIG. 10 is a block diagram of a virtual reality application data processing apparatus in an embodiment
  • FIG. 11 is a block diagram of a virtual reality application data processing apparatus in another embodiment
  • FIG. 12 is a block diagram of a virtual reality application data processing apparatus in still another embodiment.
  • Figure 13 is a block diagram of a virtual reality application data processing apparatus in still another embodiment.
  • FIG. 1 is an application environment diagram of a virtual reality application data processing method in an embodiment.
  • an application environment of the virtual reality application data processing method includes a terminal 110, virtual reality operating hardware 120, and a head mounted display 130.
  • a virtual reality application program (which may be simply referred to as a virtual reality application) is installed in the terminal 110.
  • the virtual reality operating hardware 120 is configured to receive or collect control operations and generate manipulation data capable of controlling the operation of the virtual reality application.
  • the virtual reality operating hardware 120 can include an operating handle and/or a position tracking device or the like.
  • the manipulation data may include operational control data generated by operations performed by the handle and/or physical spatial location data generated by the location tracking device tracking the location.
  • a Head Mounted Display (HMD) 130 is a head mounted display device having the function of displaying a virtual reality screen.
  • the terminal 110 may be a desktop computer or a mobile terminal, and the mobile terminal may include at least one of a mobile phone, a tablet, a personal digital assistant, and a wearable device.
  • the terminal 110 may receive a playback start instruction and restore the virtual reality application to an initial running state in response to the playback start instruction.
  • the terminal 110 acquires the pre-captured and stored manipulation data that was sent by the virtual reality operating hardware 120 to the runtime library and that has a time sequence, and imports the manipulation data into the runtime library according to the corresponding time sequence.
  • the terminal 110 passes the imported manipulation data to the virtual reality application in the initial running state through the runtime library for playback processing.
  • the terminal 110 may output a playback screen generated by the virtual reality application according to the manipulation data to the head mounted display 130, and the head mounted display 130 may display the playback screen. It can be understood that in other embodiments, the terminal 110 may not output the playback screen generated during the playback process to the head mounted display 130 for display.
  • the computer device can be the terminal 110 of FIG. 1.
  • the computer device includes a processor, memory, network interface, display screen, and input device connected by a system bus.
  • the memory comprises a non-volatile storage medium and an internal memory.
  • the non-volatile storage medium of the computer device can store an operating system and computer readable instructions that, when executed, cause the processor to perform a virtual reality application data processing method.
  • the processor of the computer device is used to provide computing and control capabilities to support the operation of the entire computer device.
  • the internal memory can store computer readable instructions that, when executed by the processor, cause the processor to perform a virtual reality application data processing method.
  • the network interface of the computer device is used for network communication.
  • the display screen of the computer device may be a liquid crystal display or an electronic ink display screen;
  • the input device of the computer device may be a touch layer covering the display screen, a button, trackball, or touchpad provided on the computer device casing, or an external keyboard, touchpad, or mouse;
  • the touch layer and the display screen form a touch screen.
  • FIG. 2 is only a block diagram of part of the structure related to the solution of the present application, and does not constitute a limitation on the computer device to which the solution of the present application is applied;
  • a specific computer device may include more or fewer components than shown in the figure, combine some components, or have a different arrangement of components.
  • FIG. 3 is a schematic flowchart of a virtual reality application data processing method in an embodiment. This embodiment is mainly illustrated by applying the virtual reality application data processing method to the terminal 110 in FIG. 1 described above. Referring to FIG. 3, the method specifically includes the following steps:
  • the playback start instruction is used to trigger execution of the playback processing logic.
  • a playback switch can be set on the terminal, and a playback start instruction is triggered by a triggering operation (e.g., touch, slide, click, or press) on the playback switch.
  • the playback switch is a function button for controlling playback on and/or off.
  • the playback switch can be a separate function button or a function button composed of a plurality of buttons.
  • the playback switch can be a physical switch or a virtual switch displayed on the terminal screen.
  • the generating of the playback start command may also be triggered by performing a preset playback trigger action on the terminal.
  • a playback mode is entered, which is a mode for performing playback processing.
  • virtual reality refers to using computer simulation to generate a virtual world in three-dimensional space, providing the user with simulations of visual, auditory, tactile, and other senses, so that the user can observe things in the three-dimensional space as if immersed in it.
  • a virtual reality application is an application for realizing virtual reality effects.
  • the virtual reality application can be a virtual reality game application. It can be understood that the virtual reality application can also be other types of applications.
  • the initial running state is the state when the virtual reality application is initially running. Specifically, the virtual reality application may be restarted to restore the virtual reality application to the initial running state, or the virtual reality application may be restored to the initial running state by the restoration mechanism.
  • the virtual reality operating hardware is hardware configured to receive control operations and generate manipulation data capable of controlling the virtual reality application.
  • the virtual reality operating hardware may include an operating handle and/or a position tracking device, and the like. It can be understood that if the head mounted display device and the control handle can themselves implement the position tracking function and can collect and send manipulation data to the terminal, the head mounted display device and the control handle also belong to the position tracking device.
  • the manipulation data is data that corresponds to the control operation and that controls the running of the virtual reality application.
  • the manipulation data may include operational control data generated by operations performed by the handle and/or physical spatial location data generated by the location tracking device tracking the location.
  • the composition of the manipulation data may include a combination of one or more of a data block size, a time node, a manipulation event type, a manipulation device type, a manipulation duration, and a manipulation event additional data.
  • the manipulation event type includes a handle button press and/or a head-mounted display rotation; the manipulation device type includes a handle and/or a head-mounted display, and the like; the additional data of the manipulation event includes physical space position data during manipulation, and the physical space position data may include the physical space position data of the head mounted display (i.e., HMD position data). It will be appreciated that the event additional data may also include other data associated with the manipulation.
  • Table 1 shows a storage structure of manipulation data in one embodiment.
  • as an example of the composition of the manipulation data, consider the record "40 10:44:28.545545 305 2 2 2,3,1", where the fields correspond, in order, to the data block size (40), the time node (10:44:28.545545), the manipulation event type (305), the manipulation device type (2), the manipulation duration (2), and the manipulation event additional data (2,3,1).
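A record in this format can be split into named fields. The sketch below is illustrative only (the field names are my own, not from the patent); it assumes the six space-separated fields listed above, with the additional data itself comma-separated:

```python
def parse_manipulation_record(record: str) -> dict:
    """Split a stored manipulation-data record into its six fields."""
    size, time_node, event_type, device_type, duration, extra = record.split(" ", 5)
    return {
        "block_size": int(size),
        "time_node": time_node,  # kept as text, e.g. "10:44:28.545545"
        "event_type": int(event_type),
        "device_type": int(device_type),
        "duration": int(duration),
        "extra_data": [int(v) for v in extra.split(",")],
    }
```

For instance, `parse_manipulation_record("40 10:44:28.545545 305 2 2 2,3,1")` yields a block size of 40, event type 305, and extra data `[2, 3, 1]`.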
  • the runtime library is a special computer library used by the compiler to implement built-in functions of a programming language and to provide runtime (execution) support for programs in that language.
  • the runtime library acts as a translation layer between the hardware and the application while the program runs.
  • the runtime library can be used to feed back the manipulation data sent by the virtual reality operation hardware to the virtual reality application.
  • FIG. 4 is a system block diagram of an embodiment in which virtual reality operating hardware sends control data to a runtime library, and the runtime library feeds back control data to a virtual reality application.
  • the time-sequenced manipulation data sent by the virtual reality operating hardware to the runtime library can be captured and stored in advance, thereby obtaining the pre-captured and stored manipulation data having a time sequence.
  • after receiving the playback start instruction, the terminal may acquire the pre-captured and stored manipulation data that was sent by the virtual reality operating hardware to the runtime library and that has a time sequence.
  • capture refers to a way of intercepting and passing on data. That is, the terminal can intercept the time-sequenced manipulation data sent by the virtual reality operating hardware to the runtime library in order to acquire the manipulation data, and then continue to pass the manipulation data on to the runtime library, so that acquiring the manipulation data does not affect the user.
  • Timing refers to the chronological order.
  • the stored time-sequenced manipulation data is manipulation data stored in chronological order.
  • the manipulation data can be one or more.
  • each piece of manipulation data may be stored together with its corresponding time node, or the manipulation data may be stored in the corresponding chronological order with only the manipulation data itself being stored.
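One simple way to keep captured manipulation data ordered by time node is a sorted insert. This is only an illustrative data structure, not one specified by the patent:

```python
import bisect

class ManipulationStore:
    """Stores captured manipulation data keyed by time node, in chronological order."""

    def __init__(self):
        self._times = []
        self._events = []

    def add(self, time_node, event):
        # find the chronological slot for this time node and insert there
        i = bisect.bisect(self._times, time_node)
        self._times.insert(i, time_node)
        self._events.insert(i, event)

    def in_order(self):
        return list(self._events)
```

In practice capture already happens in real time, so appends arrive nearly sorted and the bisect step is cheap; the sorted insert simply guarantees chronological order even if events are recorded out of sequence.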
  • in an embodiment, before receiving the playback start instruction, the method further includes: registering a callback function with the runtime library; and, when the virtual reality operating hardware sends the time-sequenced manipulation data to the runtime library, calling the callback function to capture the time-sequenced manipulation data and store it.
  • the callback function is configured to capture time-series manipulation data sent by the virtual reality operating hardware to the runtime library.
  • the callback function can be a hook function.
  • the terminal may register the callback function with the runtime library by calling a registration event callback function (RegisterHookEvent()) in the runtime library when the virtual reality application is in an initial running state.
  • the registration event callback function is used to register a callback function in the terminal and used to capture the manipulation data to the runtime library.
  • as the user operates the virtual reality application, the virtual reality operating hardware generates manipulation data.
  • the terminal can call a callback function to capture the timed manipulation data and store it.
  • the terminal may register a callback function with the runtime library when receiving the walkthrough mode open command, and the virtual reality application is in an initial running state when the walkthrough mode is enabled.
  • the walkthrough mode can be a mode when a virtual reality application is used in normal operation.
  • the walkthrough mode can also be a separate mode that is dedicated to the use of the walkthrough.
  • the walkthrough mode can be a mode that allows the user to practice operations without counting toward the user's level.
  • the terminal may also register a callback function with the runtime library when the virtual reality application is launched. It can be understood that the virtual reality application is initially running when it is started.
  • by registering a callback function with the runtime library and having the callback function capture and store the time-sequenced manipulation data sent by the virtual reality operating hardware to the runtime library, the acquisition of the manipulation data does not affect the user's normal operation and use of the virtual reality application.
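The registration-and-capture mechanism can be sketched as a hook list inside the runtime library: every incoming event is first handed to the registered callbacks (which store it) and then passed on normally, so capture is transparent to the user. Names such as `register_hook_event` are illustrative, modeled loosely on the RegisterHookEvent() function mentioned above.

```python
class RuntimeLib:
    """Stand-in runtime library with a hook mechanism for capturing events."""

    def __init__(self, app_handler):
        self._hooks = []
        self._app_handler = app_handler

    def register_hook_event(self, callback):
        # analogous in spirit to RegisterHookEvent() in the description
        self._hooks.append(callback)

    def receive(self, event):
        for hook in self._hooks:
            hook(event)           # capture first (does not consume the event) ...
        self._app_handler(event)  # ... then continue normal delivery to the app

captured, delivered = [], []
rt = RuntimeLib(delivered.append)
rt.register_hook_event(captured.append)  # store each event for later playback
```

Because the hook runs before normal delivery rather than replacing it, the application receives exactly the same event stream whether or not capture is active.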
  • the terminal can import the corresponding manipulation data into the runtime library in chronological order, from earliest to latest.
  • the terminal can import the manipulation data into the runtime library according to the corresponding timing by calling an event import function in the runtime library.
  • the event import function (PollNextEvent()) is used to implement the import of stored manipulation data.
  • Figure 5 is a diagram showing the structure design of a parameter definition in the runtime library of the registration event callback function (RegisterHookEvent()) and the event import function (PollNextEvent()) in one embodiment.
  • the diamond in FIG. 5 represents an aggregation relationship, that is, a whole-and-part inclusion relationship, with the part pointing to the whole. As shown in FIG. 5, HMD matrix data and HMD vector data are part of the HMD position data; the HMD position data and the control handle event data are part of the event information; and the event information is part of the data to be fetched by the event import function and the callback function.
  • the event import function may be a preset function in the runtime library, or may be a temporary function code injected into the runtime library after receiving the playback start instruction (ie, after the playback mode is turned on).
  • the preset function is a function that is set in advance in the runtime library. Even if the playback start instruction is not received, the preset function exists in the runtime library.
  • temporary functions are functions that are temporarily injected and that automatically become invalid when the system finishes running.
  • the temporary function can be a function that is injected into the runtime library only after receiving the playback start instruction. When the system finishes running, the temporary function does not exist in the runtime library.
  • when the event import function is a temporary function code injected into the runtime library after the playback start instruction is received, the import function does not exist in the runtime library before the playback start instruction is received, which can reduce the occupation of runtime library space.
  • the terminal can import the manipulation data into the runtime library at an accelerated rate according to the corresponding time sequence.
  • the terminal can also import the manipulation data into the runtime library according to the corresponding time nodes, that is, import it into the runtime library at normal speed.
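Importing at normal speed means reproducing the recorded delays between time nodes, while accelerated import skips or shrinks them. A minimal sketch, where the `deliver` callable stands in for the runtime's event-import entry point (in the spirit of the PollNextEvent() function mentioned above):

```python
import time

def import_events(deliver, stored_events, realtime=False):
    """Feed stored events to the runtime in their recorded order.

    With realtime=True the original pacing between time nodes is reproduced;
    otherwise events are imported as fast as possible (accelerated import).
    """
    prev_t = None
    for event in stored_events:  # stored_events is already time-ordered
        if realtime and prev_t is not None:
            time.sleep(event["t"] - prev_t)  # wait out the recorded interval
        deliver(event)
        prev_t = event["t"]
```

Usage is just `import_events(runtime_entry_point, events)` for accelerated import, or the same call with `realtime=True` to restore the original manipulation speed.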
  • S310 Pass the imported manipulation data to the virtual reality application in an initial running state through the runtime library for playback processing.
  • the runtime library is used to feed back the manipulation data sent by the virtual reality operating hardware to the virtual reality application. Therefore, after the terminal imports the manipulation data into the runtime library according to the corresponding time sequence, the runtime library also passes the imported manipulation data to the virtual reality application in the initial running state according to the corresponding time sequence.
  • in an embodiment, to achieve the effect of real-time operation control, the terminal can transmit the manipulation data to the virtual reality application in the initial running state through the runtime library in real time; that is, immediately after the terminal imports the manipulation data into the runtime library, the runtime library can deliver the data to the virtual reality application in the initial running state.
  • in an embodiment, the runtime library may be modified in advance so that, after receiving the imported manipulation data, the terminal does not immediately transmit the manipulation data to the virtual reality application in the initial running state through the runtime library, but instead transmits it uniformly at a preset period, to reduce the frequency of data transmission and save system resources.
  • this application does not limit how the terminal transmits the imported manipulation data to the virtual reality application in the initial running state through the runtime library, as long as the manipulation data is sent to the virtual reality application in the initial running state.
  • the virtual reality application in the initial running state performs the running processing corresponding to the manipulation data. Because the transmitted manipulation data is the manipulation data of the user's previous operations, the virtual reality application performing the running processing corresponding to that manipulation data can implement the playback processing.
  • in the above virtual reality application data processing method, the time-sequenced manipulation data generated by the virtual reality operating hardware and sent to the runtime library is captured and stored in advance. Upon receiving the playback start instruction, the virtual reality application is restored to the initial running state, the pre-captured and stored manipulation data is imported into the runtime library according to the corresponding time sequence, and the imported manipulation data is transmitted through the runtime library to the virtual reality application in the initial running state for playback processing. The automatic playback processing can reproduce an operation in the user's historical operations without requiring the user to re-operate from the beginning, which greatly improves operation efficiency. Moreover, because the virtual reality application itself does not need to be modified, the playback processing can be implemented with good versatility, avoiding the cost and difficulty of modifying each virtual reality application individually.
  • in an embodiment, step S308 includes: determining a playback mode corresponding to the virtual reality application; when the playback mode is the first playback mode, reducing the recording time interval of temporally adjacent manipulation data according to the accelerated import multiple corresponding to the first playback mode; and importing the manipulation data into the runtime library in sequence according to the corresponding time sequence and the reduced recording time interval.
  • the terminal may determine a playback mode corresponding to the virtual reality application currently performing the playback process.
  • the playback mode may include accelerated playback and/or restored playback. Accelerated playback refers to playback at a speed faster than the manipulation speed corresponding to the stored manipulation data, to improve the playback speed.
  • restored playback refers to playback at the manipulation speed corresponding to the stored manipulation data.
  • the recording time interval is the interval between the time nodes of temporally adjacent recorded manipulation data. For example, if the time node of the former manipulation data is t1 and the time node of the latter manipulation data is t2, the recording time interval of the two adjacent pieces of manipulation data is t2 - t1.
  • the terminal may acquire the accelerated import multiple corresponding to the first playback mode.
  • the acceleration import multiple can be set in advance.
  • the terminal reduces the recording time interval of temporally adjacent manipulation data according to the accelerated import multiple, and imports the manipulation data into the runtime library in sequence according to the corresponding time sequence and the reduced recording time interval.
  • for example, if the acceleration import multiple is 2 and the recording interval of two temporally adjacent pieces of manipulation data is 0.02 seconds, the recording time interval between the two is reduced to 0.01 seconds, and the manipulation data is then imported into the runtime library in sequence according to the corresponding timing and the reduced 0.01-second recording interval. Playback efficiency can be improved by speeding up the import.
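  • As an illustrative sketch only, the interval reduction described above can be expressed as follows in Python; the representation of manipulation data as (time node, data) pairs and the function names are assumptions, not part of the embodiment:

```python
import time

def reduced_intervals(time_nodes, accel_multiple):
    """Recording interval between temporally adjacent time nodes, divided by
    the acceleration import multiple (e.g. 0.02 s -> 0.01 s at a multiple of 2)."""
    return [(t2 - t1) / accel_multiple
            for t1, t2 in zip(time_nodes, time_nodes[1:])]

def accelerated_replay(events, accel_multiple, import_fn, sleep=time.sleep):
    """Import (time_node, data) events in sequence, waiting only the reduced
    recording time interval between temporally adjacent events."""
    if not events:
        return
    import_fn(events[0][1])  # the first event is imported immediately
    intervals = reduced_intervals([t for t, _ in events], accel_multiple)
    for interval, (_, data) in zip(intervals, events[1:]):
        sleep(interval)      # reduced interval instead of the recorded one
        import_fn(data)
```

With a multiple of 2, the 0.02-second recording interval from the example above becomes a 0.01-second wait between imports.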
  • the method further includes: when the playback mode is the second playback mode, acquiring the time node corresponding to the manipulation data; and sequentially importing the manipulation data into the runtime library according to the corresponding time node.
  • the terminal may acquire the time node corresponding to the manipulation data, and sequentially import the manipulation data into the runtime library according to the corresponding time node.
  • the time node corresponding to the manipulation data is the time node at which the user originally performed the operation. Importing the manipulation data into the runtime library in sequence according to the corresponding time nodes means the data is imported at the user's original manipulation speed, which enables a faithful restoration of the operation.
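  • A minimal sketch of this restore playback, under the same assumed (time node, data) representation, might look like:

```python
import time

def restore_replay(events, import_fn, sleep=time.sleep):
    """Import (time_node, data) events at the user's original pacing:
    each datum is imported at an offset equal to its recorded time node
    relative to the first one, restoring the original manipulation speed."""
    if not events:
        return
    start = events[0][0]
    elapsed = 0.0
    for t, data in events:
        wait = (t - start) - elapsed  # remaining time until this node
        if wait > 0:
            sleep(wait)
            elapsed += wait
        import_fn(data)
```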
  • first and second in the first playback mode and the second playback mode are only used to distinguish different playback modes, and do not imply magnitude, hierarchy, or order.
  • the playback mode corresponding to the virtual reality application is determined, and the manipulation data is imported according to the corresponding playback mode, so that playback is more accurate and effective, thereby improving playback efficiency.
  • determining a playback mode corresponding to the virtual reality application includes: when the virtual three-dimensional coordinates in the virtual reality application and the physical three-dimensional coordinates in the physical space satisfy a one-to-one mapping relationship, determining that the playback mode corresponding to the virtual reality application is the first playback mode.
  • the virtual three-dimensional coordinates refer to position coordinates corresponding to the user character in the virtual three-dimensional space constructed by the virtual reality application.
  • Physical space is the real three-dimensional space.
  • the physical three-dimensional coordinates are the position coordinates of the user in the real three-dimensional space.
  • the terminal determines whether the virtual three-dimensional coordinates in the virtual reality application and the physical three-dimensional coordinates in the physical space satisfy a one-to-one mapping relationship. If they do, the position of the user in the virtual three-dimensional space constructed by the virtual reality application is determined solely by the physical space position, without involving the manipulation data generated by operating the handle and/or the head-mounted display or the corresponding time nodes.
  • the terminal can therefore determine that the virtual reality application supports accelerated import, that is, that the playback mode corresponding to the virtual reality application is the first playback mode (accelerated playback). It can be understood that in other embodiments, for a virtual reality application that supports accelerated import, the terminal may also import the manipulation data into the runtime library by normal import without performing accelerated import.
  • determining a playback mode corresponding to the virtual reality application includes: when the virtual three-dimensional coordinates in the virtual reality application and the physical three-dimensional coordinates in the physical space do not satisfy a one-to-one mapping relationship, determining that the playback mode corresponding to the virtual reality application is the second playback mode.
  • when the virtual three-dimensional coordinates in the virtual reality application need to be determined jointly by the physical three-dimensional coordinates in the physical space and the manipulation data generated by operating the handle and/or the head-mounted display, together with the corresponding time nodes, the virtual three-dimensional coordinates and the physical three-dimensional coordinates do not satisfy a one-to-one mapping relationship.
  • in that case, the terminal may determine that the virtual reality application does not support accelerated import, that is, that the playback mode corresponding to the virtual reality application is the second playback mode (restore playback).
  • in this way, the playback mode corresponding to the virtual reality application can be determined more accurately, and playback based on an accurately determined playback mode can improve playback efficiency.
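  • One way to check such a one-to-one mapping relationship, sketched here purely for illustration on sampled coordinate pairs (the embodiment does not prescribe how the relationship is detected):

```python
def is_one_to_one_mapping(samples):
    """samples: iterable of (physical_xyz, virtual_xyz) tuple pairs.
    The mapping is one-to-one if every physical coordinate always maps to
    the same virtual coordinate and no two physical coordinates share a
    virtual coordinate; in that case accelerated playback (the first
    playback mode) can be selected, otherwise restore playback."""
    forward, reverse = {}, {}
    for phys, virt in samples:
        if forward.setdefault(phys, virt) != virt:
            return False  # one physical point maps to two virtual points
        if reverse.setdefault(virt, phys) != phys:
            return False  # two physical points share one virtual point
    return True
```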
  • the method further includes a playback picture blurring step, specifically comprising the steps of: acquiring a playback picture generated by the virtual reality application according to the manipulation data; performing blurring processing on the playback picture; and outputting and displaying the blurred playback picture.
  • the corresponding playback screen may be generated according to the manipulation data by the virtual reality application.
  • the terminal may determine corresponding virtual three-dimensional coordinates according to the manipulation data through the virtual reality application, and generate a playback screen corresponding to the determined virtual three-dimensional coordinates.
  • the generated pictures correspond to virtual three-dimensional coordinates in the virtual three-dimensional space, and different virtual three-dimensional coordinates in the virtual three-dimensional space correspond to different pictures. For example, if the user character in the virtual three-dimensional space moves forward, the corresponding picture also changes.
  • the terminal acquires a playback screen generated by the virtual reality application according to the manipulation data, and blurs the playback screen.
  • the terminal can implement the blurring processing on the playback screen by reducing the quality of the playback screen, or by overlaying blurred graphic elements on the playback screen, such as superimposing a mosaic layer.
  • the specific implementation manner of the blurring processing on the playback screen by the terminal is not limited, as long as the playback screen can be blurred.
  • the terminal may output and display the playback screen generated after the fuzzification process, that is, the terminal may send the fuzzified playback screen to the head mounted display for output display. The user can see the blurred playback screen through the head-mounted display.
  • during the playback process, the step of performing the blurring processing on the playback screen is performed. The pictures generated during playback may cause the user to feel dizzy, and blurring the playback screen can avoid dizziness and improve the quality and efficiency of the playback operation.
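  • For illustration only, one cheap way to "reduce the quality of the playback screen" is block averaging on a grayscale frame; representing the frame as a list of pixel rows is an assumption, not part of the embodiment:

```python
def blur(frame, k=2):
    """Blur a grayscale frame (list of rows of pixel values) by averaging
    each k*k block -- effectively downscaling by factor k and upscaling
    back, which reduces picture quality and so blurs the playback screen."""
    h, w = len(frame), len(frame[0])
    out = [[0] * w for _ in range(h)]
    for by in range(0, h, k):
        for bx in range(0, w, k):
            block = [frame[y][x]
                     for y in range(by, min(by + k, h))
                     for x in range(bx, min(bx + k, w))]
            avg = sum(block) / len(block)
            for y in range(by, min(by + k, h)):
                for x in range(bx, min(bx + k, w)):
                    out[y][x] = avg
    return out
```

A mosaic overlay, as also mentioned above, is essentially this operation applied with a large block size to selected regions.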
  • the method further includes: generating a third-party perspective image corresponding to the playback screen; and outputting the third-party perspective image to superimpose the third-party perspective image on the blurred playback screen for display.
  • the playback screen generated by the virtual reality application according to the manipulation data is a first-party perspective image.
  • the terminal may generate a third-party perspective picture corresponding to the playback screen by modifying a screen drawing parameter of the playback screen that generates the first-party perspective. It can be understood that in the third-party perspective picture, the image corresponding to the user character itself is a component of the third-party perspective image, and in the first-party perspective image, the user role is not displayed.
  • the terminal may output the generated third-party perspective image corresponding to the playback screen to superimpose the third-party perspective image on the playback screen after the blurring process for display. In this way, the user can be prevented from being dizzy, and the user can see the current playback progress.
  • the method further comprises: receiving a playback stop instruction; canceling the blurring processing on the playback screen in response to the playback stop instruction, and canceling generating and outputting the third-party perspective image corresponding to the playback screen.
  • the playback stop command is used to trigger execution logic to stop playback.
  • the user can issue a playback stop command by clicking or pressing an operation at any position during playback.
  • the terminal can receive a playback stop command initiated by the user at any point during playback. For example, if all the manipulation data stored by the terminal for playback processing takes 30 seconds to play back in full but the user does not want to watch the entire playback, the user can initiate a playback stop command at any point within the 30-second playback process; for instance, when playback reaches 20 seconds, the user can click or press the playback stop button to initiate a playback stop command.
  • the terminal may cancel the blurring processing on the playback screen in response to the playback stop instruction, and cancel the generation and output of the third-party perspective screen corresponding to the playback screen.
  • the terminal may cancel the blurring process of the playback screen corresponding to the position corresponding to the playback stop command, that is, the playback screen corresponding to the position corresponding to the playback stop command is a picture that has not been subjected to the blurring process and can be normally output and displayed.
  • the terminal can also cancel generating and outputting a third-party perspective picture corresponding to the playback screen. That is, after receiving the playback stop command, the terminal may no longer generate a third-party view screen corresponding to the playback screen corresponding to the playback stop command. That is, after the terminal responds to the playback stop command and performs the above processing, the user finally sees through the head mounted display the playback screen itself that has not been blurred.
  • generating a third-party perspective image corresponding to the playback screen (referred to as a third-party perspective image generation step) specifically includes the following steps:
  • the virtual reality application calls the drawing function to draw a playback screen corresponding to the manipulation data.
  • the terminal may acquire the picture drawing parameter value that the virtual reality application transmits to the drawing function.
  • the acquired picture drawing parameter value is the drawing parameter value used to draw the first-party view picture. It can be understood that after the terminal acquires the picture drawing parameter value transmitted by the virtual reality application to the drawing function, the terminal may continue to pass the picture drawing parameter value to the drawing function, so that the drawing function generates the first-party view playback picture according to the picture drawing parameter value.
  • the drawing parameter value conversion relationship between the first-party view and the third-party view may be a rule that converts the drawing parameter values used to draw a first-party view picture into the drawing parameter values used to draw a third-party view picture.
  • the terminal may modify the obtained drawing parameter value for drawing the first view picture to the drawing parameter value for drawing the third view picture according to the drawing parameter value conversion rule of the first party view and the third party view.
  • the terminal may call a drawing function, and draw a third-party view picture according to the modified picture drawing parameter (ie, the drawing parameter value used to draw the third view picture).
  • the third-party view image is generated by acquiring and modifying the picture drawing parameters, so that the third-party view picture generation is very convenient and the efficiency is improved.
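  • A toy sketch of such a drawing parameter value conversion: the keys "eye", "forward" and "target" and the camera offsets are hypothetical drawing parameters, since the embodiment does not specify the actual parameter layout:

```python
def to_third_person(draw_params, back=2.0, up=1.0):
    """Convert first-party view draw parameters into third-party view ones
    by moving the camera back and up relative to the user character and
    aiming it at the character, so the character becomes part of the scene."""
    x, y, z = draw_params["eye"]
    fx, fy, fz = draw_params["forward"]
    third = dict(draw_params)                      # do not mutate the original
    third["eye"] = (x - back * fx, y + up, z - back * fz)
    third["target"] = (x, y, z)                    # look at the user character
    return third
```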
  • step S602 includes: when the virtual reality application calls the drawing function to draw the playback picture, obtaining, by using a hook function injected in the drawing function, the picture drawing parameter value transmitted by the virtual reality application to the drawing function; step S606 includes: calling the drawing function through the hook function injected in the drawing function, and drawing the third-party view picture according to the modified picture drawing parameter value.
  • the terminal may trigger to inject a hook function into the drawing function when receiving the playback instruction.
  • the terminal can obtain the picture drawing parameter value transmitted by the virtual reality application to the drawing function by drawing the hook function injected in the function.
  • the terminal can call the corresponding drawing function again by the hook function injected in the drawing function, and draw a third-party perspective picture according to the modified picture drawing parameter value.
  • the picture drawing parameters are acquired and modified by the hook function injected in the drawing function to generate a third-party view picture, so that the third-party view picture generation is very convenient and the efficiency is improved.
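  • The hook-based interception can be sketched in Python by replacing the drawing function in its namespace; a real implementation would patch the native drawing API via an injected hook, and all names here are illustrative:

```python
def install_hook(namespace, name, modify):
    """Replace namespace[name] with a hook that captures the draw call's
    parameter value, lets the original first-party draw proceed, and issues
    a second draw with modified (third-party view) parameters."""
    original = namespace[name]
    captured = []
    def hook(params):
        captured.append(params)            # acquire the parameter value
        first = original(params)           # original playback picture
        third = original(modify(params))   # third-party view picture
        return first, third
    namespace[name] = hook
    return captured
```

Every subsequent call to the "drawing function" now yields both pictures, matching the sequence where the drawing program returns the playback picture and the third-party perspective picture together.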
  • FIG. 7 is a timing diagram of a virtual reality application data processing method in an embodiment, specifically including the following steps:
  • the third-party software calls the registration event callback function in the runtime library to register the callback function with the runtime library.
  • the third-party software injects a hook function into the drawing function in the drawing program.
  • the third-party software captures the manipulation data sent by the virtual reality operating hardware to the runtime library through the callback function and stores it.
  • the third-party software calls the event import function in the runtime library to import the stored manipulation data to the runtime library.
  • the runtime library passes the imported manipulation data to the virtual reality application.
  • the virtual reality application calls the drawing function in the drawing program to draw the playback picture corresponding to the manipulation data.
  • the hook function injected in the drawing function of the drawing program acquires and modifies the picture drawing parameter values.
  • the drawing function in the drawing program generates a playback screen based on the original screen drawing parameter values.
  • the drawing program returns the playback screen and the third-party perspective screen to the third-party software.
  • the third-party software blurs the playback screen and superimposes the third perspective image.
  • the third party software sends the blurred playback screen and the superimposed third perspective image to the head mounted display.
  • the head-mounted display outputs and displays the blurred playback screen and the superimposed third-party perspective picture.
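  • The registration and capture steps at the start of this sequence can be sketched as follows; RuntimeLibrary and make_recorder are illustrative stand-ins for the actual runtime library interfaces, not their real names:

```python
import time

class RuntimeLibrary:
    """Sketch of a runtime library that lets third-party software register
    a callback to observe manipulation data sent by the VR hardware."""
    def __init__(self):
        self._callbacks = []
    def register_event_callback(self, fn):
        self._callbacks.append(fn)
    def receive(self, data):
        # called by the virtual reality operating hardware
        for fn in self._callbacks:
            fn(data)

def make_recorder(store, clock=time.monotonic):
    """Callback that stores each manipulation datum with its time node,
    preserving the timing needed for later playback."""
    def on_event(data):
        store.append((clock(), data))
    return on_event
```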
  • the method further includes an instant virtual reality picture generating step, which specifically includes the following steps:
  • the playback stop command is used to trigger execution logic to stop playback.
  • the user can issue a playback stop command by clicking or pressing an operation at any position during playback.
  • the terminal can receive a playback stop command initiated by the user at any position during playback.
  • for example, if all the manipulation data stored by the terminal for playback processing takes 30 seconds to play back in full but the user does not want to watch the entire playback, the user can initiate a playback stop instruction at any point within the 30-second playback process;
  • for instance, when playback reaches 20 seconds, the user can click or press the playback stop button to initiate a playback stop command.
  • after receiving the playback stop command, the terminal stops importing the manipulation data into the runtime library to end the playback process. It can be understood that after the playback process ends, the instant operation processing state is restored, that is, the terminal controls the runtime library to resume receiving the instant manipulation data sent by the virtual reality operating hardware.
  • the user can operate the virtual reality operating hardware to issue an immediate operation on the virtual reality application, and the virtual reality operating hardware sends the real-time manipulation data corresponding to the instant operation to the runtime library.
  • the terminal receives the instant manipulation data sent by the virtual reality operating hardware through the runtime library and transmits it to the virtual reality application.
  • the terminal may form a virtual reality picture and output a display according to the real-time manipulation data through the virtual reality application.
  • the terminal may determine corresponding virtual three-dimensional coordinates according to the real-time manipulation data through the virtual reality application, and generate an instant virtual reality picture corresponding to the determined virtual three-dimensional coordinates.
  • after receiving the playback stop command, the instant operation processing state can be restored, and a virtual reality picture is formed and output for display according to the real-time manipulation data sent by the virtual reality operating hardware. This realizes seamless switching between playback and instant operation, avoiding the need for additional operations to return to the instant operation state, and improves operation efficiency.
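  • A minimal sketch of this switch between playback and instant operation; the mode labels and method names are hypothetical:

```python
class Runtime:
    """Sketch of the runtime library switching between playback (stored,
    imported manipulation data) and instant operation (live hardware data)."""
    def __init__(self):
        self.mode = "live"
        self.delivered = []   # data passed on to the virtual reality application
    def start_playback(self, stored):
        self.mode = "playback"
        for data in stored:
            if self.mode != "playback":   # playback stop command received
                break
            self.delivered.append(("replay", data))
        self.mode = "live"                # restore instant operation state
    def stop_playback(self):
        self.mode = "live"
    def on_hardware_event(self, data):
        if self.mode == "live":           # live data accepted again
            self.delivered.append(("live", data))
```

Because the mode reverts to "live" as soon as playback ends or is stopped, the next hardware event flows to the application with no extra user action, which is the seamless switching described above.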
  • FIG. 9 in an embodiment, another virtual reality application data processing method is provided, and the method specifically includes the following steps:
  • step S908: determining a playback mode corresponding to the virtual reality application; when the playback mode corresponding to the virtual reality application is the first playback mode, the process proceeds to step S910; when the playback mode corresponding to the virtual reality application is the second playback mode, the process proceeds to step S912.
  • S912 Obtain a time node corresponding to the manipulation data; and import the manipulation data into the runtime library according to the corresponding time node.
  • S914 Pass the imported manipulation data to the virtual reality application in an initial running state through the runtime library for playback processing.
  • S918 Modify a picture drawing parameter value according to a mapping parameter value conversion relationship between the first party view and the third party view.
  • the drawing function is called by the hook function injected in the drawing function, and the third-party perspective picture is drawn according to the modified picture drawing parameter value.
  • S922 Acquire a playback screen generated by the virtual reality application according to the manipulation data, perform blurring processing on the playback screen, and output and display the blurred playback screen.
  • S930 Receive real-time manipulation data sent by the virtual reality operation hardware through the runtime library and deliver the data to the virtual reality application.
  • the above virtual reality application data processing method captures and stores in advance the timed manipulation data sent by the virtual reality operating hardware to the runtime library. When a playback start instruction is received, the virtual reality application is automatically restored to the initial running state, the pre-captured and stored manipulation data is imported into the runtime library according to the corresponding timing, and the imported manipulation data is transmitted through the runtime library to the virtual reality application in the initial running state for playback processing. The operations in the user's history can thus be reproduced by automatic playback processing without requiring the user to re-operate from the beginning, which greatly improves operation efficiency.
  • the playback mode corresponding to the virtual reality application is determined, and the manipulation data is imported according to the corresponding playback mode, so that the playback mode is more accurate and effective, thereby improving the playback efficiency. And by speeding up the import, the playback efficiency can be further improved.
  • blurring the playback screen can avoid dizziness and improve the quality and efficiency of the playback operation.
  • the third-party view screen is superimposed on the playback screen after the blurring process for display. In this way, the user can be prevented from being dizzy, and the user can see the current playback progress.
  • the third-party view picture generation is very convenient and the efficiency is improved.
  • the virtual operation processing state can be restored, and the virtual reality image is formed and outputted according to the real-time manipulation data sent by the virtual reality operation hardware. It realizes the seamless docking of playback and instant operation, avoiding the need to perform other operations to restore to the instant operation state and improve the operation efficiency.
  • a terminal is further provided, and the internal structure of the terminal is as shown in FIG. 2. The terminal includes a virtual reality application data processing device, which in turn includes the modules described below; each module may be implemented in whole or in part by software, hardware, or a combination thereof.
  • a virtual reality application data processing apparatus 1000 includes: an obtaining module 1003, a running state management module 1004, a manipulation data importing module 1006, and a playback processing module 1008, wherein:
  • the obtaining module 1003 is configured to receive a playback start instruction.
  • the running state management module 1004 is configured to restore the virtual reality application to an initial running state in response to the playback start instruction.
  • the obtaining module 1003 is further configured to acquire pre-captured and stored manipulation data sent by the virtual reality operating hardware to the runtime library and having timing.
  • the manipulation data import module 1006 is configured to import the manipulation data into the runtime library according to the corresponding timing.
  • the playback processing module 1008 is configured to pass the imported manipulation data to the virtual reality application in an initial running state for playback processing through the runtime library.
  • the apparatus 1000 further includes:
  • the function registration module 1001 is configured to register a callback function with the runtime library.
  • the data capture module 1002 is configured to invoke a callback function to capture the manipulated data with timing and store when the virtual reality operating hardware sends the manipulated data with timing to the runtime library.
  • the manipulation data importing module 1006 is further configured to determine a playback mode corresponding to the virtual reality application; when the playback mode is the first playback mode, reduce the recording time interval of temporally adjacent manipulation data according to the acceleration import multiple corresponding to the first playback mode; and sequentially import the manipulation data into the runtime library according to the corresponding timing and the reduced recording time interval.
  • the manipulation data importing module 1006 is further configured to determine a playback mode corresponding to the virtual reality application; and when the playback mode is the second playback mode, acquire the time node corresponding to the manipulation data and sequentially import the manipulation data into the runtime library according to the corresponding time node.
  • the manipulation data importing module 1006 is further configured to: when the virtual three-dimensional coordinates in the virtual reality application and the physical three-dimensional coordinates in the physical space satisfy a one-to-one mapping relationship, determine that the playback mode corresponding to the virtual reality application is the first A playback method.
  • the manipulation data importing module 1006 is further configured to: when the virtual three-dimensional coordinates in the virtual reality application and the physical three-dimensional coordinates in the physical space do not satisfy the one-to-one mapping relationship, determine that the playback mode corresponding to the virtual reality application is The second playback mode.
  • the apparatus 1000 further includes:
  • the playback screen processing module 1010 is configured to acquire a playback screen generated by the virtual reality application according to the manipulation data; perform blurring processing on the playback screen; and output and display the blurred playback screen.
  • the playback screen processing module 1010 is further configured to generate a third-party perspective image corresponding to the playback screen, and output the third-party perspective image to superimpose the third-party perspective image on the blurred playback screen for display.
  • the playback picture processing module 1010 is further configured to receive a playback stop instruction; cancel the blurring process performed on the playback picture in response to the playback stop instruction, and cancel generating and outputting the third-party view picture corresponding to the playback picture.
  • the playback screen processing module 1010 is further configured to: when the virtual reality application invokes the drawing function to draw the playback screen, acquire a picture drawing parameter value that is transmitted by the virtual reality application to the drawing function; according to the first party perspective and the third party perspective Draw the parameter value conversion relationship, modify the picture drawing parameter value; call the drawing function to draw the third-party view picture according to the modified picture drawing parameter value.
  • the playback picture processing module 1010 is further configured to obtain, by using a hook function injected in the drawing function, the picture drawing parameter value transmitted by the virtual reality application to the drawing function when the virtual reality application calls the drawing function to draw the playback picture.
  • the playback picture processing module 1010 is further configured to call a drawing function by using a hook function injected in the drawing function, and draw a third-party view picture according to the modified picture drawing parameter value.
  • the apparatus 1000 further includes:
  • the playback processing end module 1012 is configured to stop importing the manipulation data into the runtime library after receiving the playback stop command to end the playback process.
  • the real-time manipulation module 1014 is configured to receive the real-time manipulation data sent by the virtual reality operation hardware and transmit the same to the virtual reality application through the runtime library; and form a virtual reality image according to the real-time manipulation data and output the display through the virtual reality application.
  • a computer apparatus comprising a memory and a processor, the memory storing computer readable instructions that, when executed by the processor, cause the processor to execute The following steps: receiving a playback start instruction; in response to the playback start instruction, restoring the virtual reality application to an initial running state; acquiring pre-captured and stored manipulation data sent by the virtual reality operating hardware to the runtime library and having timing; The manipulation data is imported into the runtime library according to a corresponding sequence; and the imported manipulation data is transmitted to the virtual reality application in an initial running state for playback processing by the runtime library.
  • before the receiving of the playback start instruction, the computer readable instructions further cause the processor to perform the steps of: registering a callback function with the runtime library; and when the virtual reality operating hardware sends the timed manipulation data to the runtime library, calling the callback function to capture the timed manipulation data and store it.
  • the importing, performed by the processor, of the manipulation data into the runtime library according to the corresponding timing includes: determining a playback mode corresponding to the virtual reality application; when the playback mode is the first playback mode, reducing the recording time interval of temporally adjacent manipulation data according to the acceleration import multiple corresponding to the first playback mode; and sequentially importing the manipulation data into the runtime library according to the corresponding timing and the reduced recording time interval.
  • the computer readable instructions further cause the processor to perform the steps of: determining a playback mode corresponding to the virtual reality application; and when the playback mode is the second playback mode, acquiring the time corresponding to the manipulation data a node; the manipulation data is sequentially imported into the runtime library according to a corresponding time node.
  • the determining, by the processor, the playback mode corresponding to the virtual reality application includes: when the virtual three-dimensional coordinates in the virtual reality application and the physical three-dimensional coordinates in the physical space satisfy a one-to-one mapping relationship, Determining a playback mode corresponding to the virtual reality application is a first playback mode.
  • the determining, by the processor, the playback mode corresponding to the virtual reality application includes: when the virtual three-dimensional coordinates in the virtual reality application and the physical three-dimensional coordinates in the physical space do not satisfy a one-to-one mapping relationship, Then, it is determined that the playback mode corresponding to the virtual reality application is the second playback mode.
  • the computer readable instructions further cause the processor to: acquire a playback screen generated by the virtual reality application according to the manipulation data; perform blurring processing on the playback screen; Playback screen output display.
  • the computer readable instructions further cause the processor to: generate a third party view picture corresponding to the playback picture; output the third party view picture to overlay the third party view picture The playback screen after the blurring process is displayed.
  • the computer readable instructions further cause the processor to: receive a playback stop instruction; in response to the playback stop instruction, cancel the blurring process on the playback screen and cancel the generation and output The third-party perspective screen corresponding to the playback screen.
  • The generating, by the processor, of the third-party perspective picture corresponding to the playback picture includes: when the virtual reality application calls a drawing function to draw the playback picture, acquiring the picture drawing parameter values that the virtual reality application passes to the drawing function; modifying the picture drawing parameter values according to a drawing-parameter conversion relationship between the first-party perspective and the third-party perspective; and calling the drawing function to draw the third-party perspective picture according to the modified picture drawing parameter values.
  • The acquiring, by the processor, of the picture drawing parameter values that the virtual reality application passes to the drawing function when the virtual reality application calls the drawing function to draw the playback picture includes: acquiring those parameter values through a hook function injected into the drawing function.
  • The calling, by the processor, of the drawing function to draw the third-party perspective picture according to the modified picture drawing parameter values includes: calling the drawing function through the hook function injected into the drawing function.
  • The computer readable instructions further cause the processor to: after receiving a playback stop instruction, stop importing the manipulation data into the runtime library to end the playback processing; receive, through the runtime library, real-time manipulation data sent by the virtual reality operation hardware and pass it to the virtual reality application; and, through the virtual reality application, form a virtual reality picture according to the real-time manipulation data and output it for display.
  • A storage medium storing computer readable instructions that, when executed by one or more processors, cause the one or more processors to perform the steps of: receiving a playback start instruction; in response to the playback start instruction, restoring a virtual reality application to an initial running state; acquiring pre-captured and stored time-sequenced manipulation data sent by virtual reality operation hardware to a runtime library; importing the manipulation data into the runtime library according to the corresponding time sequence; and passing, through the runtime library, the imported manipulation data to the virtual reality application in the initial running state for playback processing.
  • Before the receiving of the playback start instruction, the computer readable instructions further cause the processor to perform the steps of: registering a callback function with the runtime library; and, when the virtual reality operation hardware sends time-sequenced manipulation data to the runtime library, calling the callback function to capture and store the time-sequenced manipulation data.
  • The importing, by the processor, of the manipulation data into the runtime library according to the corresponding time sequence includes: determining a playback mode corresponding to the virtual reality application; when the playback mode is the first playback mode, reducing the recorded time intervals between time-adjacent manipulation data according to an accelerated-import multiple corresponding to the first playback mode; and sequentially importing the manipulation data into the runtime library according to the corresponding time sequence and the reduced recorded time intervals.
  • The computer readable instructions further cause the processor to perform the steps of: determining a playback mode corresponding to the virtual reality application; when the playback mode is the second playback mode, acquiring the time nodes corresponding to the manipulation data; and sequentially importing the manipulation data into the runtime library according to the corresponding time nodes.
  • The determining, by the processor, of the playback mode corresponding to the virtual reality application includes: when the virtual three-dimensional coordinates in the virtual reality application and the physical three-dimensional coordinates in physical space satisfy a one-to-one mapping relationship, determining that the playback mode corresponding to the virtual reality application is the first playback mode.
  • The determining, by the processor, of the playback mode corresponding to the virtual reality application includes: when the virtual three-dimensional coordinates in the virtual reality application and the physical three-dimensional coordinates in physical space do not satisfy a one-to-one mapping relationship, determining that the playback mode corresponding to the virtual reality application is the second playback mode.
  • The computer readable instructions further cause the processor to: acquire a playback picture generated by the virtual reality application according to the manipulation data; perform blurring processing on the playback picture; and output the blurred playback picture for display.
  • The computer readable instructions further cause the processor to: generate a third-party perspective picture corresponding to the playback picture; and output the third-party perspective picture so that it is overlaid on the blurred playback picture for display.
  • The computer readable instructions further cause the processor to: receive a playback stop instruction; and, in response to the playback stop instruction, cancel the blurring processing of the playback picture and cancel the generation and output of the third-party perspective picture corresponding to the playback picture.
  • The generating, by the processor, of the third-party perspective picture corresponding to the playback picture includes: when the virtual reality application calls a drawing function to draw the playback picture, acquiring the picture drawing parameter values that the virtual reality application passes to the drawing function; modifying the picture drawing parameter values according to a drawing-parameter conversion relationship between the first-party perspective and the third-party perspective; and calling the drawing function to draw the third-party perspective picture according to the modified picture drawing parameter values.
  • The acquiring, by the processor, of the picture drawing parameter values that the virtual reality application passes to the drawing function when the virtual reality application calls the drawing function to draw the playback picture includes: acquiring those parameter values through a hook function injected into the drawing function.
  • The calling, by the processor, of the drawing function to draw the third-party perspective picture according to the modified picture drawing parameter values includes: calling the drawing function through the hook function injected into the drawing function.
  • The computer readable instructions further cause the processor to: after receiving a playback stop instruction, stop importing the manipulation data into the runtime library to end the playback processing; receive, through the runtime library, real-time manipulation data sent by the virtual reality operation hardware and pass it to the virtual reality application; and, through the virtual reality application, form a virtual reality picture according to the real-time manipulation data and output it for display.
  • It should be understood that the steps in the various embodiments of this application are not necessarily performed sequentially in the order indicated by the step numbers. Unless explicitly stated herein, there is no strict ordering constraint on the execution of these steps, and they may be performed in other orders. Moreover, at least some of the steps in each embodiment may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different times; their execution order is likewise not necessarily sequential, and they may be performed in turn or in alternation with other steps or with at least part of the sub-steps or stages of other steps.
  • Non-volatile memory can include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory can include random access memory (RAM) or external cache memory.
  • RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).

Abstract

A virtual reality application data processing method includes: receiving a playback start instruction; in response to the playback start instruction, restoring a virtual reality application to an initial running state; acquiring pre-captured and stored time-sequenced manipulation data sent by virtual reality operation hardware to a runtime library; importing the manipulation data into the runtime library according to the corresponding time sequence; and passing, through the runtime library, the imported manipulation data to the virtual reality application in the initial running state for playback processing.

Description

Virtual reality application data processing method, computer device and storage medium
This application claims priority to Chinese Patent Application No. 2017103223363, entitled "Virtual reality application data processing method, computer device and storage medium" and filed with the Chinese Patent Office on May 9, 2017, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of computer technology, and in particular, to a virtual reality application data processing method, a computer device, and a storage medium.
Background
With the rapid development of science and technology, applications have become increasingly varied. Virtual reality (VR) applications, with their good interactivity and immersive experience, are increasingly favored by users.
At present, some virtual reality applications require the user to perform a series of operations, where the next operation can be entered only after the previous one is completed. If the user wants to redo a particular operation from the operation history, the user must start over from the beginning and operate step by step until reaching the target operation to be reproduced.
Therefore, when a user currently wants to redo an operation from the operation history, the user needs to start over from the beginning, which results in low operation efficiency.
Summary
According to various embodiments provided in this application, a virtual reality application data processing method, a computer device, and a storage medium are provided.
A virtual reality application data processing method includes:
a computer device receiving a playback start instruction;
the computer device, in response to the playback start instruction, restoring a virtual reality application to an initial running state;
the computer device acquiring pre-captured and stored time-sequenced manipulation data sent by virtual reality operation hardware to a runtime library;
the computer device importing the manipulation data into the runtime library according to the corresponding time sequence; and
the computer device passing, through the runtime library, the imported manipulation data to the virtual reality application in the initial running state for playback processing.
A computer device includes a memory and a processor, the memory storing computer readable instructions that, when executed by the processor, cause the processor to perform the following steps:
receiving a playback start instruction;
in response to the playback start instruction, restoring a virtual reality application to an initial running state;
acquiring pre-captured and stored time-sequenced manipulation data sent by virtual reality operation hardware to a runtime library;
importing the manipulation data into the runtime library according to the corresponding time sequence; and
passing, through the runtime library, the imported manipulation data to the virtual reality application in the initial running state for playback processing.
One or more storage media storing computer readable instructions are provided, the computer readable instructions, when executed by one or more processors, causing the one or more processors to perform the following steps:
receiving a playback start instruction;
in response to the playback start instruction, restoring a virtual reality application to an initial running state;
acquiring pre-captured and stored time-sequenced manipulation data sent by virtual reality operation hardware to a runtime library;
importing the manipulation data into the runtime library according to the corresponding time sequence; and
passing, through the runtime library, the imported manipulation data to the virtual reality application in the initial running state for playback processing.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of this application more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Apparently, the accompanying drawings in the following description show merely some embodiments of this application, and a person of ordinary skill in the art may still derive other drawings from these accompanying drawings without creative effort.
FIG. 1 is a diagram of an application environment of a virtual reality application data processing method in an embodiment;
FIG. 2 is a schematic diagram of the internal structure of a computer device in an embodiment;
FIG. 3 is a schematic flowchart of a virtual reality application data processing method in an embodiment;
FIG. 4 is a system block diagram in an embodiment;
FIG. 5 is a design diagram of the parameter definition structure of the register-event callback function and the event import function in the runtime library in an embodiment;
FIG. 6 is a schematic flowchart of the step of generating a third-party perspective picture in an embodiment;
FIG. 7 is a sequence diagram of a virtual reality application data processing method in an embodiment;
FIG. 8 is a schematic flowchart of the step of generating a real-time virtual reality picture in an embodiment;
FIG. 9 is a schematic flowchart of a virtual reality application data processing method in another embodiment;
FIG. 10 is a block diagram of a virtual reality application data processing apparatus in an embodiment;
FIG. 11 is a block diagram of a virtual reality application data processing apparatus in another embodiment;
FIG. 12 is a block diagram of a virtual reality application data processing apparatus in yet another embodiment; and
FIG. 13 is a block diagram of a virtual reality application data processing apparatus in still another embodiment.
具体实施方式
为了使本申请的目的、技术方案及优点更加清楚明白,以下结合附图及实施例,对本申请进行进一步详细说明。应当理解,此处所描述的具体实施例仅仅用以解释本申请,并不用于限定本申请。
图1为一个实施例中虚拟现实应用数据处理方法的应用环境图。参照图1,该交互数据处理方法的应用环境包括终端110、虚拟现实操作硬件120和头戴式显示器130。其中,终端110中安装有虚拟现实应用程序(可简称为虚拟现实应用)。虚拟现实操作硬件120,用于接收或采集控制操作,并生成 能够控制虚拟现实应用程序的运行的操控数据。虚拟现实操作硬件120可以包括操作手柄和/或位置追踪设备等。操控数据,可以包括用手柄进行的操作所生成的操作控制数据和/或,由位置追踪设备追踪位置所生成的物理空间位置数据。头戴式显示器(HMD,Head Mounted Display)130是具备输出显示虚拟现实画面功能的头戴式显示设备。终端110可以是台式计算机或移动终端,移动终端可以包括手机、平板电脑、个人数字助理和穿戴式设备等中的至少一种。
终端110可以接收回放开始指令,响应于回放开始指令,将虚拟现实应用恢复至初始运行状态。终端110获取预先捕获并存储的由虚拟现实操作硬件120发向运行时库且具有时序的操控数据,将操控数据按照相应时序导入运行时库中。终端110通过运行时库,将导入的操控数据传递至处于初始运行状态的虚拟现实应用进行回放处理。在一个实施例中,终端110可以将虚拟现实应用根据操控数据生成的回放画面输出至头戴式显示器130,头戴式显示器130可以将该回放画面进行显示。可以理解,在其它实施例中,终端110也可以不将回放处理过程中生成的回放画面输出至头戴式显示器130进行显示。
图2为一个实施例中计算机设备的内部结构示意图。该计算机设备可以是图1中的终端110。参照图2,该计算机设备包括通过系统总线连接的处理器、存储器、网络接口、显示屏和输入装置。其中,存储器包括非易失性存储介质和内存储器。该计算机设备的非易失性存储介质可存储操作系统和计算机可读指令,该计算机可读指令被执行时,可使得处理器执行一种虚拟现实应用数据处理方法。该计算机设备的处理器用于提供计算和控制能力,支撑整个计算机设备的运行。该内存储器中可储存有计算机可读指令,该计算机可读指令被处理器执行时,可使得处理器执行一种虚拟现实应用数据处理方法。计算机设备的网络接口用于进行网络通信。计算机设备的显示屏可以是液晶显示屏或者电子墨水显示屏,计算机设备的输入装置可以是显示屏上覆盖的触摸层,也可以是计算机设备外壳上设置的按键、轨迹球或触控板, 还可以是外接的键盘、触控板或鼠标等。触摸层和显示屏构成触控屏。
本领域技术人员可以理解,图2中示出的结构,仅仅是与本申请方案相关的部分结构的框图,并不构成对本申请方案所应用于其上的计算机设备的限定,具体的计算机设备可以包括比图中所示更多或更少的部件,或者组合某些部件,或者具有不同的部件布置。
图3为一个实施例中虚拟现实应用数据处理方法的流程示意图。本实施例主要以该虚拟现实应用数据处理方法应用于上述图1中的终端110来举例说明。参照图3,该方法具体包括如下步骤:
S302,接收回放开始指令。
其中,回放开始指令,用于触发回放处理逻辑的执行。
在一个实施例中,终端上可以设置回放开关,通过对该回放开关的触发操作(比如,触摸、滑动、点击或按压等操作),来触发生成回放开始指令。其中,回放开关是用于控制回放开启和/或关闭的功能按键。回放开关可以是一个独立的功能按键,也可以是由多个按键组合而成的功能按键。回放开关可以是物理开关,也可以是终端屏幕上显示的虚拟开关。
在另一个实施例中,也可以通过在终端上进行预设的回放触发动作,来触发生成回放开始指令。
在一个实施例中,接收回放开始指令后,即可进入回放模式,该回放模式是用于进行回放处理的模式。
S304,响应于回放开始指令,将虚拟现实应用恢复至初始运行状态。
其中,虚拟现实,是指利用电脑模拟产生一个三维空间的虚拟世界,提供使用者关于视觉、听觉、触觉等感官的模拟,让使用者如同身历其境一般地观察三度空间内的事物。
虚拟现实应用,是用于实现虚拟现实效果的应用程序。在一个实施例中,虚拟现实应用可以是虚拟现实游戏应用程序。可以理解,虚拟现实应用也可以是其它类型的应用程序。
初始运行状态,是虚拟现实应用初始运行时的状态。具体地,可以对虚 拟现实应用进行重新启动来将虚拟现实应用恢复至初始运行状态,也可以通过还原机制将虚拟现实应用恢复至初始运行状态。
S306,获取预先捕获并存储的由虚拟现实操作硬件发向运行时库且具有时序的操控数据。
其中,虚拟现实操作硬件,是用于接收控制操作,并生成能够控制虚拟现实应用程序运行的操控数据的硬件。虚拟现实操作硬件可以包括操作手柄和/或位置追踪设备等。可以理解,如果头戴式显示设备和控制手柄本身能够实现位置追踪功能、且能够采集并发送操控数据至终端,则头戴式显示设备和控制手柄也属于位置追踪设备。
操控数据,是与控制操作对应的、且对虚拟现实应用的运行起控制作用的数据。操控数据,可以包括用手柄进行的操作所生成的操作控制数据和/或,由位置追踪设备追踪位置所生成的物理空间位置数据。
在一个实施例中,操控数据的组成可以包括数据块大小、时间节点、操控事件类型、操控设备类型、操控时长和操控事件额外数据等中的一种或几种的组合。
其中,操控事件类型包括手柄按钮和/或头戴式显示器旋转等;操控设备类型包括手柄和/或头戴式显示器等,操控事件额外数据,包括操控时的物理空间位置数据,物理空间位置可以包括头戴式显示器的物理空间位置数据(即HMD位置数据)。可以理解,事件额外数据还可以包括其它与操控关联的数据。参照表1,表1为一个实施例中的操控数据的存储结构。
表1
Figure PCTCN2018083584-appb-000001
现举例说明,操作数据的组成,比如,操控数据“40 10:44:28.545545  305 2 2 2,3,1”,其中:
40:表示存储的数据块大小
10:44:28.545545:表示时间节点
305:表示左手柄滚动按钮
2:表示左手柄
2:表示手柄滚动时间长度
2,3,1:表示手柄滚动时的物理空间位置
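The record layout illustrated above can be sketched as a small parser. This is a minimal illustration only: the field names and the `ManipulationEvent` type are assumptions introduced here, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class ManipulationEvent:
    block_size: int   # size of the stored data block (e.g. 40)
    time_node: str    # timestamp at which the event was recorded
    event_type: int   # e.g. 305 = left-controller scroll button
    device_type: int  # e.g. 2 = left controller
    duration: int     # how long the manipulation lasted
    position: tuple   # physical-space position (x, y, z) during the event

def parse_event(record: str) -> ManipulationEvent:
    """Parse one stored record such as '40 10:44:28.545545 305 2 2 2,3,1'."""
    size, time_node, ev, dev, dur, pos = record.split()
    return ManipulationEvent(int(size), time_node, int(ev), int(dev),
                             int(dur), tuple(int(p) for p in pos.split(",")))

event = parse_event("40 10:44:28.545545 305 2 2 2,3,1")
```

Storing events in this self-describing, time-stamped form is what later allows them to be re-imported in their original order.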
运行时库(Runtime Library),是一种被编译器用来实现编程语言内置函数,以提供该语言程序运行时(执行)支持的一种特殊的计算机程序库。运行时库,用于在语言程序运行起来之后,作为硬件和应用之间的转换层。在本实施例中,运行时库,可以用于将虚拟现实操作硬件发送的操控数据反馈至虚拟现实应用。
图4为一个实施例中系统框图,虚拟现实操作硬件发送操控数据至运行时库,运行时库反馈操控数据至虚拟现实应用。
可以理解,在用户直接操作虚拟现实应用时,可以捕获虚拟现实操作硬件向运行时库发送的具有时序的操控数据并存储,得到预先捕获并存储的具有时序的操控数据。终端则可以在接收回放开始指令后,获取该预先捕获并存储的由虚拟现实操作硬件发向运行时库且具有时序的操控数据。
其中,捕获,是指拦截并传递的一种数据获取方式。即终端可以通过拦截虚拟现实操作硬件发向运行时库且具有时序的操控数据,以获取该操控数据,并将该操控数据继续传递至运行时库,从而实现在获取操控数据时,不影响用户对虚拟现实应用的正常操作及使用。
时序,是指时间先后顺序。存储的具有时序的操控数据,是存储的相互间具有时间先后顺序的操控数据。操控数据可以为一个或多个。在一个实施例中,可以是将各个操控数据和所对应的时间节点进行对应存储,也可以是将操控数据按照对应的时间先后顺序进行存储,仅存储操控数据本身。
在一个实施例中,在接收回放开始指令之前,该方法还包括:向运行时 库注册回调函数;当虚拟现实操作硬件向运行时库发送具有时序的操控数据时,调用回调函数,捕获具有时序的操控数据并存储。
其中,回调函数,用于捕获虚拟现实操作硬件向运行时库发送的具有时序的操控数据。在一个实施例中,回调函数可以是钩子函数。
具体地,终端可以在虚拟现实应用处于初始运行状态时,通过调用运行时库中的注册事件回调函数(RegisterHookEvent()),向运行时库注册回调函数。其中,注册事件回调函数,用于向运行时库注册终端中的、且用于捕获操控数据的回调函数。随着用户对虚拟现实应用的操作,虚拟现实操作硬件会产生操控数据。当虚拟现实操作硬件向运行时库发送具有时序的操控数据时,终端可以调用回调函数,捕获该具有时序的操控数据并存储。在一个实施例中,终端可以在接收演练模式开启指令时,向运行时库注册回调函数,在演练模式开启时,虚拟现实应用处于初始运行状态。演练模式,可以是正常操作使用虚拟现实应用时的模式。演练模式,也可以是一种单独的、且专用于演练使用的模式,比如,对于虚拟现实游戏应用来说,演练模式可以是让用户练习操作,但不用作用户等级的考量的一种模式。
在其它实施例中,终端也可以是在虚拟现实应用启动时,向运行时库注册回调函数。可以理解,虚拟现实应用启动时处于初始运行状态。
上述实施例中,通过向运行时库注册回调函数,并回调函数捕获并存储虚拟现实操作硬件向运行时库发送的具有时序的操控数据,在实现获取操控数据的同时,不影响用户对虚拟现实应用的正常操作及使用。
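The capture path described above — a callback registered with the runtime library that records each event and still lets it reach the application unchanged — can be sketched as follows. The `Runtime`/`App` classes and the event shape are illustrative assumptions; they only model the intercept-then-forward behavior, not any real runtime library API.

```python
class Runtime:
    """Hypothetical runtime library that forwards hardware events to the app."""
    def __init__(self):
        self._hooks = []

    def register_hook_event(self, callback):  # analogous to RegisterHookEvent()
        self._hooks.append(callback)

    def deliver(self, event, app):
        for hook in self._hooks:   # capture first ...
            hook(event)
        app.handle(event)          # ... then pass through unchanged

class App:
    def __init__(self):
        self.handled = []
    def handle(self, event):
        self.handled.append(event)

captured = []                      # time-ordered store of captured events
runtime = Runtime()
runtime.register_hook_event(captured.append)
app = App()
for ev in [("10:00:00.000", "press"), ("10:00:00.020", "release")]:
    runtime.deliver(ev, app)
```

Because the callback only observes and stores, normal operation of the application is unaffected while the manipulation data accumulates.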
S308,将操控数据按照相应时序导入运行时库中。
具体地,终端可以按照由先到后的时序将相应的操控数据导入运行时库中。
在一个实施例中,终端可以通过调用运行时库中的事件导入函数,将操控数据按照相应时序导入运行时库中。事件导入函数(PollNextEvent())用于实现对存储的操控数据的导入。
图5为一个实施例中注册事件回调函数(RegisterHookEvent())和事件导 入函数(PollNextEvent())在运行时库中的参数定义结构设计图。图5中的菱形表示一种聚合关系,即整体和部分的包含关系,由部分指向整体,如图5中的HMD矩阵数据和HMD矢量数据属于HMD位置数据的一部分,HMD位置数据和控制手柄事件数据属于事件信息的一部分,事件信息属于事件导入函数和回调函数所要获取数据中的一部分。其中,事件导入函数可以是运行时库中的预设函数,也可以是接收回放开始指令后(即开启回放模式后),向运行时库中注入的临时函数代码。预设函数,是预先在运行时库中设置好的函数,即使没有接收回放开始指令,该预设函数也存在于运行时库中。临时函数,是临时性注入的函数,随着系统的运行结束自动失效的函数。临时函数可以是只有接收回放开始指令后才向运行时库中注入的函数,等到系统结束运行后,该临时函数就不存在于运行时库中。
可以理解,当事件导入函数是接收回放开始指令后,向运行时库中注入的临时函数代码时,那么在未接收回放开始指令的情况下,运行时库中就不存在这个导入函数,可以减少对运行时库空间的占用。
其中,终端可以将操控数据按照相应时序加速导入运行时库中。终端也可以将操控数据按照相应的时间节点导入运行时库中,即以正常速度导入运行时库中。
S310,通过运行时库,将导入的操控数据传递至处于初始运行状态的虚拟现实应用进行回放处理。
可以理解,运行时库是用于将虚拟现实操作硬件发送的操控数据反馈至虚拟现实应用,那么,终端是将操控数据按照相应时序导入运行时库中,则运行时库也会将导入的操控数据按照相应时序传递至处于初始运行状态的虚拟现实应用。
在一个实施例中,为了满足实时操作控制的效果,终端可以通过运行时库实时向处于初始运行状态的虚拟现实应用传递操控数据,即终端向运行时库导入操控数据后,即可通过运行时库可以向处于初始运行状态的虚拟现实应用实时传递数据。在另一个实施例中,也可以预先对运行时库作改进,终 端可以通过运行时库在接收到导入的操控数据后,不将操控数据实时的传递至处于初始运行状态的虚拟现实应用,而是以预设周期统一传递至处于初始运行状态的虚拟现实应用,以减少数据发送频率,节省系统资源。这里对终端通过运行时库如何将导入的操控数据传递至处于初始运行状态的虚拟现实应用不做限定,只要能够满足将操控数据发送至处于初始运行状态的虚拟现实应用即可。
可以理解,处于初始运行状态的虚拟现实应用执行与操控数据对应的运行处理,而传递的操控数据是之前用户操作时的操控数据,因此虚拟现实应用执行与操控数据对应的运行处理,可以实现回放处理。
上述虚拟现实应用数据处理方法,通过预先捕获并存储由实现虚拟现实操作的硬件发向运行时库且具有时序的操控数据,在接收回放开始指令时,将虚拟现实恢复至初始运行状态,并自动的将该预先捕获并存储的操控数据按照相应时序导入运行时库中,通过运行时库将导入的操控数据传递至处于初始运行状态的虚拟现实应用进行回放处理,通过自动回放处理即可重现用户历史操作中的操作,而不需要用户从头开始重新操作,大大提高了操作效率。
此外,未对虚拟应用程序本身作改进,即可实现回放处理,具有通用性,避免了针对每个虚拟应用程序本身作改进造成的成本问题及难度问题。
在一个实施例中,步骤S308包括:确定与虚拟现实应用对应的回放方式;当回放方式为第一回放方式时,则根据与第一回放方式对应的加速导入倍数,缩小按时序相邻的操控数据的记录时间间隔;将操控数据按照相应时序和缩小后的记录时间间隔,依次导入运行时库。
具体地,终端在导入操控数据至运行时库前,可以确定当前进行回放处理的虚拟现实应用所对应的回放方式。在一个实施例中,回放方式可以包括加速回放和/或还原回放。加速回放,是指在所存储的操控数据所对应的操控速度上进行加速,以提高回放的速度。还原回放,是指以所存储的操控数据所对应的操控速度进行还原回放处理。
可以理解,终端根据用户操作生成相应的操控数据时,终端会记录该操控数据所产生的时间节点。记录时间间隔,是记录的按时序相邻的操控数据的时间节点之间的间隔。比如,记录的前一个操控数据的时间节点为t1,记录的后一个操控数据的时间节点为t2,则该两个相邻操控数据的记录时间间隔为t2-t1。
当与虚拟现实应用对应的回放方式为第一回放方式(即加速回放)时,终端可以获取与该第一回放方式对应的加速导入倍数。该加速导入倍数可以是预先设置的。终端根据该加速导入倍数,缩小按时序相邻的操控数据的记录时间间隔;将操控数据按照相应时序和缩小后的记录时间间隔,依次导入运行时库。
比如,加速导入倍数为2倍,按时序排列的相邻的两个操控数据的记录时间间隔为0.02秒,则缩小两者的记录时间间隔为0.01秒,然后将操控数据按照相应时序和缩小后的0.01秒的记录时间间隔,依次导入运行时库。通过加速导入,可以提高回放效率。
在一个实施例中,该方法还包括:当回放方式为第二回放方式时,则获取操控数据所对应的时间节点;将操控数据按照对应的时间节点,依次导入运行时库。
当与虚拟现实应用对应的回放方式为第二回放方式(即还原回放)时,终端可以获取操控数据所对应的时间节点,并将操控数据按照对应的时间节点,依次导入运行时库。可以理解,操控数据所对应的时间节点,是所存储的用户进行操作的时间节点,将操控数据按照对应的时间节点,依次导入运行时库,就能够实现将操控数据按照用户之前进行操作时的操控速度进行导入至运行时库,即可以实现对操作的一致性还原。
需要说明的是,这里的第一回放方式和第二回放方式中的“第一”和“第二”仅用于区分不同的回放方式,并不作大小、从属或先后等方面的限定。
上述实施例中,在将存储的操控数据导入运行时库前,确定虚拟现实应用所对应的回放方式,根据相应的回放方式来导入操控数据,使得回放方式 更加的准确、有效,从而提高了回放效率。
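The two import schedules described above can be sketched in a few lines: accelerated import (first playback mode) divides each recorded inter-event interval by the speed-up multiple, while restoring import (second playback mode) keeps the original time nodes. The function name and the numeric time nodes are illustrative assumptions.

```python
def import_schedule(time_nodes, speedup=1.0):
    """Delays (in seconds) to wait before importing each stored event.

    time_nodes: recorded time nodes in seconds, in time order.
    speedup:    accelerated-import multiple; 1.0 reproduces the recorded pace
                (restoring playback), 2.0 halves every recorded interval, etc.
    """
    delays = [0.0]  # the first event is imported immediately
    for earlier, later in zip(time_nodes, time_nodes[1:]):
        delays.append((later - earlier) / speedup)  # shrink the recorded gap
    return delays

recorded = [0.0, 0.02, 0.05]                   # recorded time nodes (seconds)
fast = import_schedule(recorded, speedup=2.0)  # the 0.02 s gap becomes 0.01 s
normal = import_schedule(recorded)             # the recorded pace is preserved
```

This mirrors the example in the text: with a 2x accelerated-import multiple, a recorded interval of 0.02 seconds between adjacent events shrinks to 0.01 seconds.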
在一个实施例中,确定与虚拟现实应用对应的回放方式,包括:当虚拟现实应用中的虚拟三维坐标与物理空间中的物理三维坐标满足一一映射关系时,则判定与虚拟现实应用对应的回放方式为第一回放方式。
其中,虚拟三维坐标,是指在虚拟现实应用所构造的虚拟三维空间中,用户角色所对应的位置坐标。物理空间,是真实三维空间。物理三维坐标,是在真实三维空间中,用户所处的位置坐标。
具体地,终端会判断虚拟现实应用中的虚拟三维坐标与物理空间中的物理三维坐标是否满足一一映射关系。如果虚拟三维坐标与物理三维坐标之间满足一一映射关系,说明用户在虚拟现实应用所构造的虚拟三维空间中的位置,仅由物理空间位置决定,不会牵涉到手柄和/或头戴式显示器等操作产生的操控数据和相应时间节点的约束,则终端可以判定该虚拟现实应用支持加速导入,即终端可以判定与该虚拟现实应用对应的回放方式为第一回放方式(即加速回放)。可以理解,在其它实施例中,对于支持加速导入的虚拟现实应用,终端也可以不进行加速导入,而采取正常导入的方式导入至运行时库中。
在一个实施例中,确定与虚拟现实应用对应的回放方式,包括:当虚拟现实应用中的虚拟三维坐标与物理空间中的物理三维坐标不满足一一映射关系时,则判定与虚拟现实应用对应的回放方式为第二回放方式。
具体地,当虚拟现实应用中的虚拟三维坐标与物理空间中的物理三维坐标不满足一一映射关系,终端则可以判定该虚拟现实应用不支持加速导入,即判定与虚拟现实应用对应的回放方式为第二回放方式(即还原回放)。在一个实施例中,当虚拟现实应用中的虚拟三维坐标,需要由物理空间中的物理三维坐标和手柄和/或头戴式显示器等操作产生的操控数据和相应时间节点共同决定,则说明虚拟现实应用中的虚拟三维坐标与物理空间中的物理三维坐标不满足一一映射关系。
上述实施例中,通过判断虚拟现实应用中的虚拟三维坐标与物理空间中 的物理三维坐标之间是否满足一一映射关系,来确定虚拟现实应用所对应的回放方式,可以更加准确的确定回放方式,从而基于准确的回放方式进行回放,能够提高回放效率。
在一个实施例中,该方法还包括回放画面模糊化处理的步骤,具体包括以下步骤:获取虚拟现实应用根据操控数据生成的回放画面;将回放画面进行模糊化处理;将模糊化处理后的回放画面输出显示。
具体地,终端通过运行时库将存储的操控数据传递至虚拟现实应用后,可以通过虚拟现实应用根据操控数据生成相应的回放画面。
在一个实施例中,终端可以通过虚拟现实应用根据操控数据,确定对应的虚拟三维坐标,并生成与所确定出的虚拟三维坐标对应的回放画面。可以理解,对于一些需要进行连续性操作的虚拟现实应用(比如,虚拟现实游戏应用中),其所生成的画面与虚拟三维空间中的虚拟三维坐标对应的,在虚拟三维空间中的不同虚拟三维坐标所对应的画面也不相同,比如,虚拟三维空间中的用户角色往前行走,所对应的画面也会发生变化。
终端获取虚拟现实应用根据操控数据生成的回放画面,并将回放画面进行模糊化处理。具体地,终端可以通过降低回放画面的质量来实现对回放画面的模糊化处理,也可以在回放画面上叠加模糊性的图形元素,比如叠加一层马赛克图层,来实现对回放画面的模糊化处理。这里对终端对回放画面进行模糊化处理的具体实现方式不作限定,只要能够实现将回放画面进行模糊化处理即可。进一步地,终端可以将模糊化处理后生成的回放画面输出并显示,即终端可以将模糊化处理后的回放画面发送至头戴式显示器进行输出显示。用户则可以通过头戴式显示器看到模糊化处理后的回放画面。
在一个实施例中,可以是,当终端按照第二回放方式(即还原回放)将操控数据按照对应的时间节点,依次导入运行时库时,再执行上述将回放画面进行模糊化处理的步骤。可以理解,当终端将操控数据通过第二回放方式导入以进行回放处理时,所生成的画面可能使用户产生晕眩感,对回放画面进行模糊化处理,可以避免晕眩感,提高了回放操作的质量和有效率。
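The text leaves the exact blurring technique open (lowering picture quality, overlaying a mosaic layer, etc.). One minimal way to degrade a picture is a 3x3 mean filter over a grayscale grid, sketched below; the list-of-rows image representation is an assumption made purely for illustration.

```python
def box_blur(image):
    """Blur a grayscale image (list of equal-length rows of numbers) with a
    3x3 mean filter, clamping the window at the image borders."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            window = [image[ny][nx]
                      for ny in range(max(0, y - 1), min(h, y + 2))
                      for nx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(window) / len(window)  # mean of the neighborhood
    return out

# A single bright pixel spreads into its neighborhood after blurring.
blurred = box_blur([[0, 0, 0],
                    [0, 9, 0],
                    [0, 0, 0]])
```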
在一个实施例中,该方法还包括:生成与回放画面对应的第三方视角画面;将第三方视角画面输出,以将第三方视角画面叠加于模糊化处理后的回放画面进行显示。
其中,虚拟现实应用根据操控数据生成的回放画面是第一方视角画面。终端可以通过修改生成第一方视角的回放画面的画面绘制参数,来生成与该回放画面对应的第三方视角画面。可以理解,第三方视角画面中,用户角色所对应的形象本身是第三方视角画面的组成部分,而在第一方视角画面中,用户角色并不显示。
终端可以将生成的与回放画面对应的第三方视角画面输出,以将第三方视角画面叠加于模糊化处理后的回放画面进行显示。这样一来,既可以避免用户晕眩,又可以让用户看到当前的回放进度。
在一个实施例中,该方法还包括:接收回放停止指令;响应于回放停止指令,取消对回放画面进行的模糊化处理,并取消生成和输出与回放画面对应的第三方视角画面。
其中,回放停止指令,用于触发停止回放的执行逻辑。
具体地,用户可以在回放过程中的任意位置通过点击或按压等操作,来发出回放停止指令。终端可以接收用户在回放过程中的任意位置上发起的回放停止指令。比如,终端所存储的全部用于回放处理的操控数据,可以进行30秒的回放,但是用户不想要进行全部回放,则用户可以在整个30秒回放过程之前的任意位置发起回放停止指令,比如回放到20秒时,用户就可以点击或按压回放停止按钮来发起回放停止指令了。
终端可以响应于回放停止指令,取消对回放画面进行的模糊化处理,并取消生成和输出与回放画面对应的第三方视角画面。具体地,终端可以取消对回放停止指令所对应位置的回放画面的模糊化处理,即回放停止指令所对应位置的回放画面是未经过模糊化处理的、且可以正常输出显示的画面。并且,终端还可以取消生成和输出与回放画面对应的第三方视角画面。即终端在接收到回放停止指令后,可以不再生成与回放停止指令所对应位置的回放 画面对应的第三方视角画面。即终端在响应于回放停止指令并进行上述处理后,用户最终通过头戴式显示器看到的是未经过模糊化处理的回放画面本身。
如图6所示,在一个实施例中,生成与回放画面对应的第三方视角画面(简称第三方视角画面生成步骤),具体包括以下步骤:
S602,在虚拟现实应用调用绘制函数绘制回放画面时,获取虚拟现实应用向绘制函数传递的画面绘制参数值。
具体地,终端通过运行时库向虚拟现实应用传递所存储的操控数据后,虚拟现实应用会调用绘制函数绘制与该操控数据对应的回放画面。在虚拟现实应用调用绘制函数绘制回放画面时,终端可以获取虚拟现实应用向绘制函数传递的画面绘制参数值。其中,该获取的画面绘制参数值是用于绘制第一视角画面的绘制参数值。可以理解,终端在获取虚拟现实应用向绘制函数传递的画面绘制参数值后,可以继续将该画面绘制参数值传递至绘制函数,以使该绘制函数根据该画面绘制参数值生成第一视角的回放画面。
S604,根据第一方视角和第三方视角的绘制参数值转换关系,修改画面绘制参数值。
其中,第一方视角和第三方视角的绘制参数值转换关系,可以是将第一方视角画面转换为第三方视角画面的绘制参数值转换规则。终端可以根据第一方视角和第三方视角的绘制参数值转换规则,将获取的用于绘制第一视角画面的绘制参数值修改为用于绘制第三视角画面的绘制参数值。
S606,调用绘制函数,根据修改后的画面绘制参数值绘制第三方视角画面。
具体地,终端可以调用绘制函数,根据该修改后的画面绘制参数(即用于绘制第三视角画面的绘制参数值),绘制第三方视角画面。
上述实施例中,通过获取并修改画面绘制参数来生成第三方视角画面,使得第三方视角画面生成非常的便捷,提高了效率。
在一个实施例中,步骤S602包括:在虚拟现实应用调用绘制函数绘制回放画面时,通过绘制函数中注入的钩子函数,获取虚拟现实应用向绘制函数 传递的画面绘制参数值;步骤S606包括:通过绘制函数中注入的钩子函数调用绘制函数,根据修改后的画面绘制参数值绘制第三方视角画面。
具体地,终端可以在接收回放指令时,触发向绘制函数注入钩子函数。在虚拟现实应用调用绘制函数绘制回放画面时,终端则可以通过绘制函数中注入的钩子函数,获取虚拟现实应用向绘制函数传递的画面绘制参数值。
在对所获取的画面绘制参数值修改完毕后,终端可以通过绘制函数中注入的钩子函数再次调用所对应的绘制函数,根据修改后的画面绘制参数值绘制第三方视角画面。
上述实施例中,通过绘制函数中注入的钩子函数获取并修改画面绘制参数以生成第三方视角画面,使得第三方视角画面生成非常的便捷,提高了效率。
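The hook mechanism described above — intercepting the parameter values passed to the drawing function, converting them from the first-party perspective to the third-party perspective, and calling the drawing function again — can be sketched by wrapping a function in Python. The `draw` function and the camera parameters are illustrative assumptions, not any real graphics API.

```python
frames = []

def draw(view):
    """Stand-in for the real drawing function: records what gets drawn."""
    frames.append(dict(view))

def to_third_person(view):
    """Illustrative first-person -> third-person parameter conversion."""
    shifted = dict(view)
    shifted["camera_z"] -= 3.0     # pull the camera back behind the player
    shifted["show_avatar"] = True  # the player's model becomes visible
    return shifted

original_draw = draw

def hooked_draw(view):
    """Injected hook: draws the original picture, then a converted one."""
    original_draw(view)                   # first-person playback picture
    original_draw(to_third_person(view))  # extra third-party perspective picture

draw = hooked_draw                        # "inject" the hook
draw({"camera_z": 0.0, "show_avatar": False})
```

The same draw routine thus produces both pictures per frame, which is what allows the third-party perspective picture to be overlaid on the blurred playback picture.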
可以理解,实际上,终端是通过运行于终端中的第三方软件来实现本实施例中的虚拟现实应用数据处理方法。图7为一个实施例中虚拟现实应用数据处理方法的时序图,具体包括以下步骤:
1)第三方软件调用运行时库中的注册事件回调函数,向运行时库注册回调函数。
2)第三方软件向绘图程序中的绘制函数注入钩子函数。
3)第三方软件通过回调函数捕获由虚拟现实操作硬件向运行时库发送的操控数据并存储。
4)第三方软件调用运行时库中的事件导入函数,导入所存储的操控数据至运行时库。
5)运行时库传递所导入的操控数据至虚拟现实应用。
6)虚拟现实应用调用绘图程序中的绘制函数,以绘制操控数据所对应的回放画面。
7)绘图程序的绘制函数中所注入的钩子函数获取并修改画面绘制参数值。
8)绘图程序中的绘制函数根据原始的画面绘制参数值,生成回放画面。
9)绘图程序的绘制函数中所注入的钩子函数,调用绘制函数根据修改后的画面绘制参数值,生成第三方视角画面。
10)绘图程序返回该回放画面和第三方视角画面至第三方软件。
11)第三方软件对回放画面模糊化处理,并叠加上第三视角画面。
12)第三方软件发送模糊化回放画面和叠加的第三视角画面至头戴式显示器。
13)头戴式显示器输出显示模糊化回放画面,和叠加的第三视角画面。
如图8所示,在一个实施例中,该方法还包括即时虚拟现实画面生成步骤,具体包括以下步骤:
S802,接收到回放停止指令后,停止将操控数据导入运行时库中,以结束回放处理。
其中,回放停止指令,用于触发停止回放的执行逻辑。
具体地,用户可以在回放过程中的任意位置通过点击或按压等操作,来发出回放停止指令。终端可以接收用户在回放过程中的任意位置上发起的回放停止指令。比如,终端所存储的全部用户回放处理的操控数据,可以进行30秒的回放,但是用户不想要进行全部回放,则用户可以在整个30秒回放过程之前的任意位置发起回放停止指令,比如回放到20秒时,用户就可以点击或按压回放停止按钮来发起回放停止指令了。
终端在接收到回放停止指令后,停止将操控数据导入运行时库中,以结束回放处理。可以理解,结束回放处理后,即恢复到即时操作处理状态,终端可以控制运行时库恢复到可接收虚拟现实操作硬件发送的即时操控数据的状态。
S804,通过运行时库,接收由虚拟现实操作硬件发送的即时操控数据并传递至虚拟现实应用。
具体地,用户可以操作虚拟现实操作硬件发出对虚拟现实应用的即时操作,虚拟现实操作硬件将与该即时操作对应的即时操控数据发送至运行时库。终端通过运行时库,接收由虚拟现实操作硬件发送的即时操控数据并传递至 虚拟现实应用。
S806,通过虚拟现实应用,根据即时操控数据形成虚拟现实画面并输出显示。
具体地,终端可以通过虚拟现实应用,根据即时操控数据形成虚拟现实画面并输出显示。在一个实施例中,终端可以通过虚拟现实应用根据即时操控数据,确定对应的虚拟三维坐标,并生成与所确定出的虚拟三维坐标对应的即时的虚拟现实画面。
上述实施例中,在接收回放停止指令后,即可恢复至即时操作处理状态,并根据虚拟现实操作硬件发送的即时操控数据,形成虚拟现实画面并输出显示。实现了回放和即时操作的无缝对接,避免了还需要进行其他操作才能恢复至即时操作状态,提高了操作效率。
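The seamless switch described above — stop importing stored data on a playback stop instruction, then immediately accept live hardware events again — can be sketched as a small state machine. The `Player` class and method names are assumptions made for illustration.

```python
class Player:
    """Illustrative playback/live switch for the imported-event pipeline."""
    def __init__(self):
        self.mode = "live"
        self.delivered = []   # (source, event) pairs passed to the VR app
        self._queue = []

    def start_playback(self, stored_events):
        self.mode = "playback"
        self._queue = list(stored_events)

    def stop_playback(self):
        self._queue = []      # stop importing stored manipulation data
        self.mode = "live"    # immediately back to real-time operation

    def tick(self):
        """Import the next stored event, if replaying."""
        if self.mode == "playback" and self._queue:
            self.delivered.append(("replay", self._queue.pop(0)))

    def feed_live(self, event):
        """Real-time manipulation data from the hardware."""
        if self.mode == "live":
            self.delivered.append(("live", event))

p = Player()
p.start_playback(["e1", "e2", "e3"])
p.tick()                # one stored event replayed
p.stop_playback()       # user stops mid-replay
p.feed_live("e4")       # live events flow again with no extra steps
```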
如图9所示,在一个实施例中,提供了另一种虚拟现实应用数据处理方法,该方法具体包括以下步骤:
S902,向运行时库注册回调函数,当虚拟现实操作硬件向运行时库发送具有时序的操控数据时,调用回调函数,捕获具有时序的操控数据并存储。
S904,接收回放开始指令,响应于回放开始指令,将虚拟现实应用恢复至初始运行状态。
S906,获取预先捕获并存储的由虚拟现实操作硬件发向运行时库且具有时序的操控数据。
S908,确定与虚拟现实应用对应的回放方式。当回放方式为第一回放方式时,则进入步骤S910,当回放方式为第二回放方式时,则进入步骤S912。
其中,当虚拟现实应用中的虚拟三维坐标与物理空间中的物理三维坐标满足一一映射关系时,则判定与虚拟现实应用对应的回放方式为第一回放方式。当虚拟现实应用中的虚拟三维坐标与物理空间中的物理三维坐标不满足一一映射关系时,则判定与虚拟现实应用对应的回放方式为第二回放方式。
S910,根据与第一回放方式对应的加速导入倍数,缩小按时序相邻的操控数据的记录时间间隔;将操控数据按照相应时序和缩小后的记录时间间隔, 依次导入运行时库。
S912,获取操控数据所对应的时间节点;将操控数据按照对应的时间节点,依次导入运行时库。
S914,通过运行时库,将导入的操控数据传递至处于初始运行状态的虚拟现实应用进行回放处理。
S916,在虚拟现实应用调用绘制函数绘制回放画面时,通过绘制函数中注入的钩子函数,获取虚拟现实应用向绘制函数传递的画面绘制参数值。
S918,根据第一方视角和第三方视角的绘制参数值转换关系,修改画面绘制参数值。
S920,通过绘制函数中注入的钩子函数调用绘制函数,根据修改后的画面绘制参数值绘制第三方视角画面。
S922,获取虚拟现实应用根据操控数据生成的回放画面,将回放画面进行模糊化处理,将模糊化处理后的回放画面输出显示。
S924,将第三方视角画面输出,以将第三方视角画面叠加于模糊化处理后的回放画面进行显示。
S926,接收回放停止指令,响应于回放停止指令,取消对回放画面进行的模糊化处理,并取消生成和输出与回放画面对应的第三方视角画面。
S928,停止将操控数据导入运行时库中,以结束回放处理。
S930,通过运行时库,接收由虚拟现实操作硬件发送的即时操控数据并传递至虚拟现实应用。
S932,通过虚拟现实应用,根据即时操控数据形成虚拟现实画面并输出显示。
上述虚拟现实应用数据处理方法,通过预先捕获并存储由实现虚拟现实操作的硬件发向运行时库且具有时序的操控数据,在接收回放开始指令时,将虚拟现实恢复至初始运行状态,并自动的将该预先捕获并存储的操控数据按照相应时序导入运行时库中,通过运行时库将导入的操控数据传递至处于初始运行状态的虚拟现实应用进行回放处理,通过自动回放处理即可重现用 户历史操作中的操作,而不需要用户从头开始重新操作,大大提高了操作效率。
其次,在将存储的操控数据导入运行时库前,确定虚拟现实应用所对应的回放方式,根据相应的回放方式来导入操控数据,使得回放方式更加的准确、有效,从而提高了回放效率。且通过加速导入,可以进一步地提高回放效率。
然后,对回放画面进行模糊化处理,可以避免晕眩感,提高了回放操作的质量和有效率。
再者,以将第三方视角画面叠加于模糊化处理后的回放画面进行显示。这样一来,既可以避免用户晕眩,又可以让用户看到当前的回放进度。
此外,通过获取并修改画面绘制参数来生成第三方视角画面,使得第三方视角画面生成非常的便捷,提高了效率。
最后,在接收回放停止指令后,即可恢复至即时操作处理状态,并根据虚拟现实操作硬件发送的即时操控数据,形成虚拟现实画面并输出显示。实现了回放和即时操作的无缝对接,避免了还需要进行其他操作才能恢复至即时操作状态,提高了操作效率。
在一个实施例中,还提供了一种终端,该终端的内部结构可如图2所示,该终端包括虚拟现实应用数据处理装置,该虚拟现实应用数据处理装置中包括各个模块,每个模块可全部或部分通过软件、硬件或其组合来实现。
如图10所示,在一个实施例中,提供了一种虚拟现实应用数据处理装置1000,该装置1000包括:获取模块1003、运行状态管理模块1004、操控数据导入模块1006及回放处理模块1008,其中:
获取模块1003,用于接收回放开始指令。
运行状态管理模块1004,用于响应于回放开始指令,将虚拟现实应用恢复至初始运行状态。
获取模块1002还用于获取预先捕获并存储的由虚拟现实操作硬件发向运行时库且具有时序的操控数据。
操控数据导入模块1006,用于将操控数据按照相应时序导入运行时库中。
回放处理模块1008,用于通过运行时库,将导入的操控数据传递至处于初始运行状态的虚拟现实应用进行回放处理。
如图11所示,在一个实施例中,装置1000还包括:
函数注册模块1001,用于向运行时库注册回调函数。
操控数据捕获模块1002,用于当虚拟现实操作硬件向运行时库发送具有时序的操控数据时,调用回调函数,捕获具有时序的操控数据并存储。
在一个实施例中,操控数据导入模块1006还用于确定与虚拟现实应用对应的回放方式;当回放方式为第一回放方式时,则根据与第一回放方式对应的加速导入倍数,缩小按时序相邻的操控数据的记录时间间隔;将操控数据按照相应时序和缩小后的记录时间间隔,依次导入运行时库。
在一个实施例中,操控数据导入模块1006还用于确定与虚拟现实应用对应的回放方式;当回放方式为第二回放方式时,则获取操控数据所对应的时间节点;将操控数据按照对应的时间节点,依次导入运行时库。
在一个实施例中,操控数据导入模块1006还用于当虚拟现实应用中的虚拟三维坐标与物理空间中的物理三维坐标满足一一映射关系时,则判定与虚拟现实应用对应的回放方式为第一回放方式。
在一个实施例中,操控数据导入模块1006还用于当虚拟现实应用中的虚拟三维坐标与物理空间中的物理三维坐标不满足一一映射关系时,则判定与虚拟现实应用对应的回放方式为第二回放方式。
如图12所示,在一个实施例中,装置1000还包括:
回放画面处理模块1010,用于获取虚拟现实应用根据操控数据生成的回放画面;将回放画面进行模糊化处理;将模糊化处理后的回放画面输出显示。
在一个实施例中,回放画面处理模块1010还用于生成与回放画面对应的第三方视角画面;将第三方视角画面输出,以将第三方视角画面叠加于模糊化处理后的回放画面进行显示。
在一个实施例中,回放画面处理模块1010还用于接收回放停止指令;响 应于回放停止指令,取消对回放画面进行的模糊化处理,并取消生成和输出与回放画面对应的第三方视角画面。
在一个实施例中,回放画面处理模块1010还用于在虚拟现实应用调用绘制函数绘制回放画面时,获取虚拟现实应用向绘制函数传递的画面绘制参数值;根据第一方视角和第三方视角的绘制参数值转换关系,修改画面绘制参数值;调用绘制函数,根据修改后的画面绘制参数值绘制第三方视角画面。
在一个实施例中,回放画面处理模块1010还用于在虚拟现实应用调用绘制函数绘制回放画面时,通过绘制函数中注入的钩子函数,获取虚拟现实应用向绘制函数传递的画面绘制参数值。
回放画面处理模块1010还用于通过绘制函数中注入的钩子函数调用绘制函数,根据修改后的画面绘制参数值绘制第三方视角画面。
如图13所示,在一个实施例中,装置1000还包括:
回放处理结束模块1012,用于接收到回放停止指令后,停止将操控数据导入运行时库中,以结束回放处理。
即时操控模块1014,用于通过运行时库,接收由虚拟现实操作硬件发送的即时操控数据并传递至虚拟现实应用;通过虚拟现实应用,根据即时操控数据形成虚拟现实画面并输出显示。
在一个实施例中,提供了一种计算机设备,包括存储器和处理器,所述存储器中存储有计算机可读指令,所述计算机可读指令被所述处理器执行时,使得所述处理器执行以下步骤:接收回放开始指令;响应于所述回放开始指令,将虚拟现实应用恢复至初始运行状态;获取预先捕获并存储的由虚拟现实操作硬件发向运行时库且具有时序的操控数据;将所述操控数据按照相应时序导入所述运行时库中;通过所述运行时库,将导入的所述操控数据传递至处于初始运行状态的所述虚拟现实应用进行回放处理。
在一个实施例中,在所述接收回放开始指令之前,计算机可读指令还使得处理器执行以下步骤:向运行时库注册回调函数;当虚拟现实操作硬件向所述运行时库发送具有时序的操控数据时,调用所述回调函数,捕获所述具 有时序的操控数据并存储。
在一个实施例中,处理器所执行的所述将所述操控数据按照相应时序导入所述运行时库中,包括:确定与虚拟现实应用对应的回放方式;当所述回放方式为第一回放方式时,则根据与所述第一回放方式对应的加速导入倍数,缩小按时序相邻的所述操控数据的记录时间间隔;将所述操控数据按照相应时序和缩小后的记录时间间隔,依次导入所述运行时库。
在一个实施例中,计算机可读指令还使得处理器执行以下步骤:确定与虚拟现实应用对应的回放方式;当所述回放方式为第二回放方式时,则获取所述操控数据所对应的时间节点;将所述操控数据按照对应的时间节点,依次导入所述运行时库。
在一个实施例中,处理器所执行的所述确定与虚拟现实应用对应的回放方式,包括:当虚拟现实应用中的虚拟三维坐标与物理空间中的物理三维坐标满足一一映射关系时,则判定与所述虚拟现实应用对应的回放方式为第一回放方式。
在一个实施例中,处理器所执行的所述确定与虚拟现实应用对应的回放方式,包括:当虚拟现实应用中的虚拟三维坐标与物理空间中的物理三维坐标不满足一一映射关系时,则判定与所述虚拟现实应用对应的回放方式为第二回放方式。
在一个实施例中,计算机可读指令还使得处理器执行以下步骤:获取所述虚拟现实应用根据所述操控数据生成的回放画面;将所述回放画面进行模糊化处理;将模糊化处理后的回放画面输出显示。
在一个实施例中,计算机可读指令还使得处理器执行以下步骤:生成与所述回放画面对应的第三方视角画面;将所述第三方视角画面输出,以将所述第三方视角画面叠加于模糊化处理后的回放画面进行显示。
在一个实施例中,计算机可读指令还使得处理器执行以下步骤:接收回放停止指令;响应于所述回放停止指令,取消对所述回放画面进行的模糊化处理,并取消生成和输出与所述回放画面对应的第三方视角画面。
在一个实施例中,处理器所执行的所述生成与所述回放画面对应的第三方视角画面,包括:在虚拟现实应用调用绘制函数绘制回放画面时,获取所述虚拟现实应用向所述绘制函数传递的画面绘制参数值;根据第一方视角和第三方视角的绘制参数值转换关系,修改所述画面绘制参数值;调用所述绘制函数,根据修改后的画面绘制参数值绘制第三方视角画面。
在一个实施例中,处理器所执行的在虚拟现实应用调用绘制函数绘制回放画面时,获取所述虚拟现实应用向所述绘制函数传递的画面绘制参数值,包括:在虚拟现实应用调用绘制函数绘制回放画面时,通过所述绘制函数中注入的钩子函数,获取所述虚拟现实应用向所述绘制函数传递的画面绘制参数值。
处理器所执行的所述调用所述绘制函数,根据修改后的画面绘制参数值绘制第三方视角画面,包括:通过所述绘制函数中注入的钩子函数调用所述绘制函数,根据修改后的画面绘制参数值绘制第三方视角画面。
在一个实施例中,计算机可读指令还使得处理器执行以下步骤:接收到回放停止指令后,停止将所述操控数据导入所述运行时库中,以结束所述回放处理;通过所述运行时库,接收由虚拟现实操作硬件发送的即时操控数据并传递至所述虚拟现实应用;通过所述虚拟现实应用,根据所述即时操控数据形成虚拟现实画面并输出显示。
在一个实施例中,提供了一种存储有计算机可读指令的存储介质,该计算机可读指令被一个或多个处理器执行时,使得一个或多个处理器执行以下步骤:接收回放开始指令;响应于所述回放开始指令,将虚拟现实应用恢复至初始运行状态;获取预先捕获并存储的由虚拟现实操作硬件发向运行时库且具有时序的操控数据;将所述操控数据按照相应时序导入所述运行时库中;通过所述运行时库,将导入的所述操控数据传递至处于初始运行状态的所述虚拟现实应用进行回放处理。
在一个实施例中,在所述接收回放开始指令之前,计算机可读指令还使得处理器执行以下步骤:向运行时库注册回调函数;当虚拟现实操作硬件向 所述运行时库发送具有时序的操控数据时,调用所述回调函数,捕获所述具有时序的操控数据并存储。
在一个实施例中,处理器所执行的所述将所述操控数据按照相应时序导入所述运行时库中,包括:确定与虚拟现实应用对应的回放方式;当所述回放方式为第一回放方式时,则根据与所述第一回放方式对应的加速导入倍数,缩小按时序相邻的所述操控数据的记录时间间隔;将所述操控数据按照相应时序和缩小后的记录时间间隔,依次导入所述运行时库。
在一个实施例中,计算机可读指令还使得处理器执行以下步骤:确定与虚拟现实应用对应的回放方式;当所述回放方式为第二回放方式时,则获取所述操控数据所对应的时间节点;将所述操控数据按照对应的时间节点,依次导入所述运行时库。
在一个实施例中,处理器所执行的所述确定与虚拟现实应用对应的回放方式,包括:当虚拟现实应用中的虚拟三维坐标与物理空间中的物理三维坐标满足一一映射关系时,则判定与所述虚拟现实应用对应的回放方式为第一回放方式。
在一个实施例中,处理器所执行的所述确定与虚拟现实应用对应的回放方式,包括:当虚拟现实应用中的虚拟三维坐标与物理空间中的物理三维坐标不满足一一映射关系时,则判定与所述虚拟现实应用对应的回放方式为第二回放方式。
在一个实施例中,计算机可读指令还使得处理器执行以下步骤:获取所述虚拟现实应用根据所述操控数据生成的回放画面;将所述回放画面进行模糊化处理;将模糊化处理后的回放画面输出显示。
在一个实施例中,计算机可读指令还使得处理器执行以下步骤:生成与所述回放画面对应的第三方视角画面;将所述第三方视角画面输出,以将所述第三方视角画面叠加于模糊化处理后的回放画面进行显示。
在一个实施例中,计算机可读指令还使得处理器执行以下步骤:接收回放停止指令;响应于所述回放停止指令,取消对所述回放画面进行的模糊化 处理,并取消生成和输出与所述回放画面对应的第三方视角画面。
在一个实施例中,处理器所执行的所述生成与所述回放画面对应的第三方视角画面,包括:在虚拟现实应用调用绘制函数绘制回放画面时,获取所述虚拟现实应用向所述绘制函数传递的画面绘制参数值;根据第一方视角和第三方视角的绘制参数值转换关系,修改所述画面绘制参数值;调用所述绘制函数,根据修改后的画面绘制参数值绘制第三方视角画面。
在一个实施例中,处理器所执行的在虚拟现实应用调用绘制函数绘制回放画面时,获取所述虚拟现实应用向所述绘制函数传递的画面绘制参数值,包括:在虚拟现实应用调用绘制函数绘制回放画面时,通过所述绘制函数中注入的钩子函数,获取所述虚拟现实应用向所述绘制函数传递的画面绘制参数值。
处理器所执行的所述调用所述绘制函数,根据修改后的画面绘制参数值绘制第三方视角画面,包括:通过所述绘制函数中注入的钩子函数调用所述绘制函数,根据修改后的画面绘制参数值绘制第三方视角画面。
在一个实施例中,计算机可读指令还使得处理器执行以下步骤:接收到回放停止指令后,停止将所述操控数据导入所述运行时库中,以结束所述回放处理;通过所述运行时库,接收由虚拟现实操作硬件发送的即时操控数据并传递至所述虚拟现实应用;通过所述虚拟现实应用,根据所述即时操控数据形成虚拟现实画面并输出显示。
应该理解的是,虽然本申请各实施例中的各个步骤并不是必然按照步骤标号指示的顺序依次执行。除非本文中有明确的说明,这些步骤的执行并没有严格的顺序限制,这些步骤可以以其它的顺序执行。而且,各实施例中至少一部分步骤可以包括多个子步骤或者多个阶段,这些子步骤或者阶段并不必然是在同一时刻执行完成,而是可以在不同的时刻执行,这些子步骤或者阶段的执行顺序也不必然是依次进行,而是可以与其它步骤或者其它步骤的子步骤或者阶段的至少一部分轮流或者交替地执行。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流 程,是可以通过计算机程序来指令相关的硬件来完成,所述的程序可存储于一非易失性计算机可读取存储介质中,该程序在执行时,可包括如上述各方法的实施例的流程。其中,本申请所提供的各实施例中所使用的对存储器、存储、数据库或其它介质的任何引用,均可包括非易失性和/或易失性存储器。非易失性存储器可包括只读存储器(ROM)、可编程ROM(PROM)、电可编程ROM(EPROM)、电可擦除可编程ROM(EEPROM)或闪存。易失性存储器可包括随机存取存储器(RAM)或者外部高速缓冲存储器。作为说明而非局限,RAM以多种形式可得,诸如静态RAM(SRAM)、动态RAM(DRAM)、同步DRAM(SDRAM)、双数据率SDRAM(DDRSDRAM)、增强型SDRAM(ESDRAM)、同步链路(Synchlink)DRAM(SLDRAM)、存储器总线(Rambus)直接RAM(RDRAM)、直接存储器总线动态RAM(DRDRAM)、以及存储器总线动态RAM(RDRAM)等。
以上实施例的各技术特征可以进行任意的组合,为使描述简洁,未对上述实施例中的各个技术特征所有可能的组合都进行描述,然而,只要这些技术特征的组合不存在矛盾,都应当认为是本说明书记载的范围。
以上实施例仅表达了本申请的几种实施方式,其描述较为具体和详细,但并不能因此而理解为对发明专利范围的限制。应当指出的是,对于本领域的普通技术人员来说,在不脱离本申请构思的前提下,还可以做出若干变形和改进,这些都属于本申请的保护范围。因此,本申请专利的保护范围应以所附权利要求为准。

Claims (20)

  1. 一种虚拟现实应用数据处理方法,包括:
    计算机设备接收回放开始指令;
    所述计算机设备响应于所述回放开始指令,将虚拟现实应用恢复至初始运行状态;
    所述计算机设备获取预先捕获并存储的由虚拟现实操作硬件发向运行时库且具有时序的操控数据;
    所述计算机设备将所述操控数据按照相应时序导入所述运行时库中;及
    所述计算机设备通过所述运行时库,将导入的所述操控数据传递至处于初始运行状态的所述虚拟现实应用进行回放处理。
  2. 根据权利要求1所述的方法,其特征在于,还包括:
    所述计算机设备向运行时库注册回调函数;及
    当虚拟现实操作硬件向所述运行时库发送具有时序的操控数据时,所述计算机设备调用所述回调函数,捕获所述具有时序的操控数据并存储。
  3. 根据权利要求1所述的方法,其特征在于,所述计算机设备将所述操控数据按照相应时序导入所述运行时库中包括:
    所述计算机设备确定与虚拟现实应用对应的回放方式;
    当所述回放方式为第一回放方式时,所述计算机设备则根据与所述第一回放方式对应的加速导入倍数,缩小按时序相邻的所述操控数据的记录时间间隔;及
    所述计算机设备将所述操控数据按照相应时序和缩小后的记录时间间隔,依次导入所述运行时库。
  4. 根据权利要求1所述的方法,其特征在于,还包括:
    所述计算机设备确定与虚拟现实应用对应的回放方式;
    当所述回放方式为第二回放方式时,所述计算机设备则获取所述操控数据所对应的时间节点;及
    所述计算机设备将所述操控数据按照对应的时间节点,依次导入所述运 行时库。
  5. 根据权利要求3所述的方法,其特征在于,所述计算机设备确定与虚拟现实应用对应的回放方式包括:
    当虚拟现实应用中的虚拟三维坐标与物理空间中的物理三维坐标满足一一映射关系时,所述计算机设备则判定与所述虚拟现实应用对应的回放方式为第一回放方式。
  6. 根据权利要求4所述的方法,其特征在于,所述计算机设备确定与虚拟现实应用对应的回放方式包括:
    当虚拟现实应用中的虚拟三维坐标与物理空间中的物理三维坐标不满足一一映射关系时,所述计算机设备则判定与所述虚拟现实应用对应的回放方式为第二回放方式。
  7. 根据权利要求1所述的方法,其特征在于,还包括:
    所述计算机设备获取所述虚拟现实应用根据所述操控数据生成的回放画面;
    所述计算机设备将所述回放画面进行模糊化处理;及
    所述计算机设备将模糊化处理后的回放画面输出显示。
  8. 根据权利要求7所述的方法,其特征在于,还包括:
    所述计算机设备生成与所述回放画面对应的第三方视角画面;及
    所述计算机设备将所述第三方视角画面输出,以将所述第三方视角画面叠加于模糊化处理后的回放画面进行显示。
  9. 根据权利要求8所述的方法,其特征在于,所述计算机设备生成与所述回放画面对应的第三方视角画面包括:
    所述计算机设备在虚拟现实应用调用绘制函数绘制回放画面时,获取所述虚拟现实应用向所述绘制函数传递的画面绘制参数值;
    所述计算机设备根据第一方视角和第三方视角的绘制参数值转换关系,修改所述画面绘制参数值;及
    所述计算机设备调用所述绘制函数,根据修改后的画面绘制参数值绘制 第三方视角画面。
  10. 根据权利要求9所述的方法,其特征在于,所述计算机设备在虚拟现实应用调用绘制函数绘制回放画面时,获取所述虚拟现实应用向所述绘制函数传递的画面绘制参数值包括:
    所述计算机设备在虚拟现实应用调用绘制函数绘制回放画面时,通过所述绘制函数中注入的钩子函数,获取所述虚拟现实应用向所述绘制函数传递的画面绘制参数值;及
    所述计算机设备调用所述绘制函数,根据修改后的画面绘制参数值绘制第三方视角画面包括:
    所述计算机设备通过所述绘制函数中注入的钩子函数调用所述绘制函数,根据修改后的画面绘制参数值绘制第三方视角画面。
  11. 根据权利要求1所述的方法,其特征在于,还包括:
    所述计算机设备接收到回放停止指令后,停止将所述操控数据导入所述运行时库中,以结束所述回放处理;
    所述计算机设备通过所述运行时库,接收由虚拟现实操作硬件发送的即时操控数据并传递至所述虚拟现实应用;及
    所述计算机设备通过所述虚拟现实应用,根据所述即时操控数据形成虚拟现实画面并输出显示。
  12. 一种计算机设备,包括存储器和处理器,所述存储器中存储有计算机可读指令,所述计算机可读指令被所述处理器执行时,使得所述处理器执行以下步骤:
    接收回放开始指令;
    响应于所述回放开始指令,将虚拟现实应用恢复至初始运行状态;
    获取预先捕获并存储的由虚拟现实操作硬件发向运行时库且具有时序的操控数据;
    将所述操控数据按照相应时序导入所述运行时库中;及
    通过所述运行时库,将导入的所述操控数据传递至处于初始运行状态的 所述虚拟现实应用进行回放处理。
  13. 根据权利要求12所述的计算机设备,其特征在于,所述计算机可读指令还使得所述处理器执行以下步骤:
    向运行时库注册回调函数;及
    当虚拟现实操作硬件向所述运行时库发送具有时序的操控数据时,调用所述回调函数,捕获所述具有时序的操控数据并存储。
  14. 根据权利要求12所述的计算机设备,其特征在于,所述处理器所执行的将所述操控数据按照相应时序导入所述运行时库中包括:
    确定与虚拟现实应用对应的回放方式;
    当所述回放方式为第一回放方式时,则
    根据与所述第一回放方式对应的加速导入倍数,缩小按时序相邻的所述操控数据的记录时间间隔;及
    将所述操控数据按照相应时序和缩小后的记录时间间隔,依次导入所述运行时库。
  15. The computer device according to claim 12, wherein the computer-readable instructions further cause the processor to perform the following steps:
    determining a playback mode corresponding to the virtual reality application;
    when the playback mode is a second playback mode,
    obtaining time nodes corresponding to the manipulation data; and
    importing the manipulation data into the runtime library sequentially according to the corresponding time nodes.
  16. The computer device according to claim 12, wherein the computer-readable instructions further cause the processor to perform the following steps:
    obtaining a playback picture generated by the virtual reality application according to the manipulation data;
    performing blurring processing on the playback picture; and
    outputting and displaying the blurred playback picture.
  17. The computer device according to claim 16, wherein the computer-readable instructions further cause the processor to perform the following steps:
    generating a third-person perspective picture corresponding to the playback picture; and
    outputting the third-person perspective picture, so that the third-person perspective picture is superimposed on the blurred playback picture for display.
  18. The computer device according to claim 17, wherein the generating a third-person perspective picture corresponding to the playback picture performed by the processor comprises:
    when the virtual reality application calls a drawing function to draw the playback picture, obtaining picture drawing parameter values passed by the virtual reality application to the drawing function;
    modifying the picture drawing parameter values according to a drawing parameter value conversion relationship between a first-person perspective and a third-person perspective; and
    calling the drawing function to draw the third-person perspective picture according to the modified picture drawing parameter values.
  19. The computer device according to claim 12, wherein the computer-readable instructions further cause the processor to perform the following steps:
    after a playback stop instruction is received, stopping importing the manipulation data into the runtime library, to end the playback processing;
    receiving, through the runtime library, real-time manipulation data sent by virtual reality operation hardware, and passing the real-time manipulation data to the virtual reality application; and
    forming, through the virtual reality application, a virtual reality picture according to the real-time manipulation data, and outputting the virtual reality picture for display.
  20. A storage medium storing computer-readable instructions that, when executed by one or more processors, cause the one or more processors to perform the following steps:
    receiving a playback start instruction;
    restoring a virtual reality application to an initial running state in response to the playback start instruction;
    obtaining pre-captured and stored manipulation data that was sent by virtual reality operation hardware to a runtime library and that has a time sequence;
    importing the manipulation data into the runtime library according to the corresponding time sequence; and
    passing, through the runtime library, the imported manipulation data to the virtual reality application in the initial running state for playback processing.
PCT/CN2018/083584 2017-05-09 2018-04-18 Virtual reality application data processing method, computer device and storage medium WO2018205809A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP18799360.5A EP3611610A4 (en) 2017-05-09 2018-04-18 DATA PROCESSING METHODS OF AN APPLICATION OF VIRTUAL REALITY, COMPUTER DEVICE AND STORAGE MEDIUM
US16/508,483 US11307891B2 (en) 2017-05-09 2019-07-11 Virtual reality application data processing method, computer device and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710322336.3A CN108874267B (zh) 2017-05-09 2017-05-09 Virtual reality application data processing method, computer device and storage medium
CN201710322336.3 2017-05-09

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/508,483 Continuation US11307891B2 (en) 2017-05-09 2019-07-11 Virtual reality application data processing method, computer device and storage medium

Publications (1)

Publication Number Publication Date
WO2018205809A1 (zh)

Family

ID=64104831

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/083584 WO2018205809A1 (zh) 2017-05-09 2018-04-18 Virtual reality application data processing method, computer device and storage medium

Country Status (4)

Country Link
US (1) US11307891B2 (zh)
EP (1) EP3611610A4 (zh)
CN (1) CN108874267B (zh)
WO (1) WO2018205809A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112256317B (zh) * 2020-10-21 2022-07-29 Shanghai Manheng Digital Technology Co., Ltd. Rapid construction method, medium and device for a virtual-reality immersive large-screen tracking system
CN112535865A (zh) * 2020-12-15 2021-03-23 NetEase (Hangzhou) Network Co., Ltd. Playback method for game content, terminal, readable storage medium and electronic device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104111864A (zh) * 2013-04-22 2014-10-22 Tencent Technology (Shenzhen) Company Limited Recording method and playback method for application operations, and corresponding apparatus
US20140368519A1 (en) * 2013-06-13 2014-12-18 Microsoft Corporation Managing Transitions of Adaptive Display Rates for Different Video Playback Scenarios
CN104580973A (zh) * 2014-12-30 2015-04-29 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences Method and apparatus for recording and replaying a virtual surgery simulation process

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6064355A (en) * 1994-05-24 2000-05-16 Texas Instruments Incorporated Method and apparatus for playback with a virtual reality system
US20160267720A1 (en) * 2004-01-30 2016-09-15 Electronic Scripting Products, Inc. Pleasant and Realistic Virtual/Augmented/Mixed Reality Experience
US20120086630A1 (en) * 2010-10-12 2012-04-12 Sony Computer Entertainment Inc. Using a portable gaming device to record or modify a game or application in real-time running on a home gaming system
CN102769775A (zh) * 2012-06-12 2012-11-07 严幸华 Overlay image providing system, server and method
JP2014127987A (ja) * 2012-12-27 2014-07-07 Sony Corp Information processing apparatus and recording medium
DE112014001637T5 (de) * 2013-03-28 2015-12-24 Mitsubishi Electric Corporation Playback device, control method and program
US9715764B2 (en) * 2013-10-03 2017-07-25 Honda Motor Co., Ltd. System and method for dynamic in-vehicle virtual reality
US9997199B2 (en) * 2014-12-05 2018-06-12 Warner Bros. Entertainment Inc. Immersive virtual reality production and playback for storytelling content
US11351466B2 (en) * 2014-12-05 2022-06-07 Activision Publishing, Inc. System and method for customizing a replay of one or more game events in a video game
EP3206122A1 (en) * 2016-02-10 2017-08-16 Nokia Technologies Oy An apparatus and associated methods
US20170249785A1 (en) * 2016-02-29 2017-08-31 Vreal Inc Virtual reality session capture and replay systems and methods

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3611610A4

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110598027A (zh) * 2019-09-10 2019-12-20 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method and apparatus for displaying image processing effects, electronic device and storage medium
CN110598027B (zh) * 2019-09-10 2022-09-02 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method and apparatus for displaying image processing effects, electronic device and storage medium

Also Published As

Publication number Publication date
CN108874267B (zh) 2022-12-09
US11307891B2 (en) 2022-04-19
US20190332418A1 (en) 2019-10-31
EP3611610A1 (en) 2020-02-19
EP3611610A4 (en) 2021-01-27
CN108874267A (zh) 2018-11-23


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18799360; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2018799360; Country of ref document: EP; Effective date: 20191111)