Disclosure of Invention
In order to overcome at least the above deficiencies in the prior art, an object of the present application is to provide a virtual reality system and a virtual reality object control device, which can avoid the situation that a virtual reality object cannot be effectively adapted to an application environment scene due to non-uniformity of control modes of the application environment scene and the virtual reality object in the virtual reality scene, and improve the rendering effect of the virtual reality object and the adaptation degree of the application environment scene.
In a first aspect, the present application provides a virtual reality object control method, which is applied to a virtual reality device, where the virtual reality device is communicatively connected to at least one object fluctuation range, and the method includes:
acquiring first control information of an object service corresponding to an object to be displayed in a current virtual reality configuration file and second control information of a scene service corresponding to the virtual reality configuration file;
comparing a control node association set between the first control information and the second control information;
when the control behaviors of any at least two associated control nodes in the control node association set are in conflict, determining a display object adaptation strategy of the scene service for the target associated control node where the conflict exists;
determining target object adaptation areas for controlling the display of the object to be displayed and a control instruction sequence aiming at each target object adaptation area according to the display object adaptation strategy;
and executing object control operation on the object to be displayed according to the determined target object adaptation areas and the control instruction sequence aiming at each target object adaptation area.
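By way of illustration only, the comparing and determining steps above may be sketched as follows; the node structure and the conflict criterion are assumptions invented for this sketch (a conflict is modeled as two associated control nodes prescribing different control behaviors for the same control link), not the claimed implementation:

```python
# Illustrative sketch only; node fields ("id", "behavior") are hypothetical.
def find_conflicting_nodes(association_set):
    """Return associated control node pairs whose control behaviors conflict.

    association_set: iterable of (node_a, node_b) pairs, where each node is a
    dict carrying the control behavior taken from its control information.
    """
    conflicts = []
    for node_a, node_b in association_set:
        # Two associated nodes conflict when they prescribe different
        # behaviors for the same control link.
        if node_a["behavior"] != node_b["behavior"]:
            conflicts.append((node_a, node_b))
    return conflicts

associations = [
    ({"id": "obj-1", "behavior": "rotate"}, {"id": "scene-1", "behavior": "rotate"}),
    ({"id": "obj-2", "behavior": "scale"}, {"id": "scene-2", "behavior": "fix"}),
]
conflicting = find_conflicting_nodes(associations)
```

The conflicting pairs returned here would play the role of the target associated control nodes for which the display object adaptation strategy of the scene service is then determined.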
In a possible design of the first aspect, the step of determining a display object adaptation strategy of the scene service for a target associated control node with a conflict includes:
acquiring a current scene space corresponding to the control node association set from the scene service according to the control node association set;
calculating a first scene linkage space where the current scene space is located according to a preset display object adaptation matrix, performing simulation updating on the range of the first scene linkage space, and acquiring a second scene linkage space where the current scene space is located so as to enable the second scene linkage space to be an initial scene linkage space of a next scene space;
taking the next scene space as the current scene space, updating the preset display object adaptation matrix to obtain an updated display object adaptation matrix, and performing linkage updating on the initial scene linkage space corresponding to the current scene space according to the updated display object adaptation matrix to obtain the initial scene linkage space corresponding to the next scene space until all scene objects in the scene space are subjected to simulated linkage, so as to obtain a simulated linkage result;
calculating a corresponding dynamic adaptation function according to an initial simulation linkage parameter, the linkage times of each scene object in the scene space, the total linkage times of each scene object and a region configuration parameter of the initial scene linkage space;
and outputting the dynamic adaptation function, the simulation linkage result and the scene parameters of the scene space as the display object adaptation strategy of the control node association set.
In a possible design of the first aspect, the step of calculating a corresponding dynamic adaptation function according to an initial simulation linkage parameter, a number of times that each scene object in the scene space is linked, a total number of times that each scene object is linked, and a region configuration parameter of the initial scene linkage space includes:
acquiring a plurality of simulated linkage space coordinates according to the initial simulated linkage parameters, and acquiring a linkage coordinate value of each simulated linkage space coordinate in the plurality of simulated linkage space coordinates;
acquiring simulation linkage aggregation header information of each simulation linkage space coordinate according to the linkage coordinate value of each simulation linkage space coordinate and a simulation linkage interval before simulation linkage of each simulation linkage space coordinate, wherein the simulation linkage aggregation header information comprises the simulation linkage interval, as well as the number of times each corresponding scene object is linked and the total number of times each corresponding scene object is linked;
calculating to obtain an initial value of the simulated linkage interval of each simulated linkage space coordinate according to the simulated linkage type of each simulated linkage space coordinate and the simulated linkage interval of each simulated linkage space coordinate;
querying a simulation linkage information table according to the simulation linkage interval initial value of each simulation linkage space coordinate and the corresponding number of linkages and total number of linkages of each scene object, so as to obtain coordinate offset adaptation parameters of the plurality of simulation linkage space coordinates;
determining parameter fusion information between coordinate offset adaptation parameters of the plurality of simulated linkage space coordinates and region configuration parameters of the initial scene linkage space to obtain a plurality of parameter fusion information;
calculating simulation linkage results of a plurality of parameter fusion information and corresponding simulation linkage control parameters, and processing the simulation linkage control parameters according to a simulation linkage process node sequence in the simulation linkage results to obtain a plurality of simulation linkage control parameter sets;
sequentially extracting simulation linkage adaptation processes in the plurality of simulation linkage control parameter sets, taking matching targets in the plurality of simulation linkage adaptation processes as simulation linkage targets, and respectively and sequentially generating a simulation linkage set corresponding to each simulation linkage target according to the simulation linkage adaptation processes;
respectively matching the linkage amplitude between each matching target in the simulation linkage adaptation process with each simulation linkage set, wherein the linkage amplitude corresponds to the absolute value of the difference between the sequence maximum value and the sequence minimum value of each simulation linkage set;
setting corresponding simulation linkage adaptation nodes for each simulation linkage set according to linkage amplitude matched with each simulation linkage set, carrying out association configuration on the simulation linkage sets provided with the simulation linkage adaptation nodes according to the simulation linkage adaptation process, and applying the simulation linkage sets which are subjected to the association configuration to corresponding simulation linkage controls according to the types of simulation linkage control parameter sets corresponding to the simulation linkage sets which are subjected to the association configuration to obtain target simulation linkage controls;
and combining the dynamic adaptation functions of each target simulation linkage control to obtain the corresponding dynamic adaptation functions.
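As a minimal illustration of the linkage amplitude defined above (the absolute value of the difference between the sequence maximum and the sequence minimum of each simulation linkage set), the following sketch may be considered; the function name and sample values are hypothetical:

```python
def linkage_amplitude(simulation_linkage_set):
    """Linkage amplitude: the absolute value of the difference between the
    sequence maximum and the sequence minimum of a simulation linkage set."""
    return abs(max(simulation_linkage_set) - min(simulation_linkage_set))

# Example set: maximum 7.25, minimum -1.5, so the amplitude is 8.75.
amp = linkage_amplitude([3.0, -1.5, 7.25, 4.0])
```

The simulation linkage adaptation nodes would then be set for each simulation linkage set according to the amplitude matched with it.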
In a possible design of the first aspect, the step of outputting the dynamic adaptation function, the simulation linkage result, and a scene parameter of a scene space as a display object adaptation policy of the control node association set includes:
performing function substitution on each simulation linkage node in the simulation linkage result according to the dynamic adaptation function to determine a simulation linkage adaptation coordinate point of each simulation linkage node, and acquiring a process configuration file of the simulation linkage node according to the simulation linkage adaptation coordinate point;
determining master linkage adaptation configuration information of the simulation linkage nodes according to the process configuration file, searching slave linkage adaptation configuration information corresponding to the simulation linkage nodes based on the master linkage adaptation configuration information, and combining each simulation linkage node into at least one spatial configuration relation chain according to the slave linkage adaptation configuration information;
extracting, from the simulation linkage nodes and based on each spatial configuration relation chain, chain simulation linkage parameters corresponding to each spatial configuration relation chain and used for representing the simulation linkage of that chain;
determining linkage control information of each spatial configuration relation chain when the simulated linkage node is controlled according to the chain simulation linkage parameters, and splicing each spatial configuration relation chain according to the linkage control logic relation of each linkage control information to obtain a spliced spatial configuration relation chain;
extracting corresponding splicing space point adaptation information according to the splicing space points on the spliced spatial configuration relation chain, grouping the splicing space point adaptation information according to different object display labels, calculating adaptation adjustment information of each object display label, and selecting a splicing space point adaptation interval according to the adaptation adjustment information;
when an updating flow for updating the display object adaptation strategy is generated in the splicing space point adaptation information according to the splicing space point adaptation interval, obtaining an adaptation mapping script corresponding to the splicing space point adaptation interval according to the updating flow;
generating a mapping bit space for recording the adaptive mapping script, mapping the adaptive mapping script to the mapping bit space, and setting mapping associated information of the adaptive mapping script according to an object display tag of the splicing space point adaptive information;
judging whether the splicing space point adaptation information is in a state of executing the display object adaptation strategy or not according to the mapping association information, and determining at least one updating parameter and an updating logic flow for updating the display object adaptation strategy according to the updating flow when the splicing space point adaptation information is not in the state of executing the display object adaptation strategy;
and updating the display object adaptation strategy according to the at least one updating parameter and the updating logic flow.
In a possible design of the first aspect, the step of determining, according to the display object adaptation strategy, target object adaptation areas for performing display control on the object to be displayed and a control instruction sequence for each target object adaptation area includes:
according to the display object adaptation strategy, carrying out index search on the fluctuation range of each object related to the object to be displayed, and determining an object control coordinate system corresponding to the object to be displayed;
determining a fluctuation distance segment set according to the object control coordinate system, extracting a dense control area of the object control coordinate system, taking a set threshold value as a fluctuation interval, and extracting a centralized control range of the dense control area associated with the fluctuation distance segment set;
generating, for at least two associated fluctuation distance segment coordinates in the centralized control range, a plurality of visual movement units according to the visual movement directions of the visual areas in the fluctuation distance segment coordinates, calculating the overlapping areas between all the visual areas in the next fluctuation distance segment coordinate and all the visual areas in the previous fluctuation distance segment coordinate, and obtaining a corresponding visual moving direction table according to each obtained overlapping area;
according to the visual moving direction table, obtaining visual moving units whose visual moving directions match and whose overlapping area between their visual areas is smaller than the maximum continuous overlapping area of the object control coordinate system, so as to form a fluctuation distance segment coordinate space;
distributing nodes in each fluctuation distance segment coordinate space to obtain a distribution interval of each distributed fluctuation distance segment coordinate space, generating a corresponding object control coordinate system space according to the dense control area, and indexing the object control coordinate system space to obtain the distribution intervals of a plurality of index nodes;
matching according to the distribution interval on the coordinate space of the fluctuation distance segment and the distribution interval of the index nodes on the object control coordinate system space to obtain an expression logic matching interval;
and determining target object adaptation areas for controlling the display of the object to be displayed and a control instruction sequence aiming at each target object adaptation area from the expression logic matching interval.
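For illustration only, the overlap computation and unit selection described above can be sketched with axis-aligned rectangles standing in for visual areas; the rectangle representation, the names, and the threshold value are assumptions made for this sketch, not the claimed implementation:

```python
def overlap_area(rect_a, rect_b):
    """Overlap area of two axis-aligned visual areas given as (x1, y1, x2, y2)."""
    w = min(rect_a[2], rect_b[2]) - max(rect_a[0], rect_b[0])
    h = min(rect_a[3], rect_b[3]) - max(rect_a[1], rect_b[1])
    return max(0, w) * max(0, h)

def select_units(units, max_continuous_overlap):
    """Keep consecutive pairs of visual movement units whose directions match
    and whose visual-area overlap is below the maximum continuous overlap."""
    selected = []
    for u, v in zip(units, units[1:]):
        if (u["direction"] == v["direction"]
                and overlap_area(u["area"], v["area"]) < max_continuous_overlap):
            selected.append((u, v))
    return selected

units = [
    {"direction": "up", "area": (0, 0, 4, 4)},
    {"direction": "up", "area": (2, 0, 6, 4)},   # overlaps the first by 2 x 4 = 8
    {"direction": "left", "area": (10, 10, 12, 12)},
]
pairs = select_units(units, max_continuous_overlap=10)
```

Only the first pair survives here: its directions match and its overlap (8) is below the threshold, while the second pair fails the direction match.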
In a possible design of the first aspect, the step of performing an object control operation on the object to be displayed according to the determined target object adaptation areas and the control instruction sequence for each target object adaptation area includes:
generating, according to the determined target object adaptation areas, an adaptation control path and adaptation control identification information used when each target object adaptation area allocates its corresponding control instruction sequence;
performing channel identification processing on an allocation channel corresponding to the object to be displayed to obtain a plurality of rendering nodes, determining a rendering label corresponding to each rendering node, and determining corresponding rendering action data according to the rendering labels;
mapping the adaptation control path and the adaptation control identification information to the rendering action data to obtain a rendering label, determining the rendering linkage amplitude between that rendering label and each rendering label in the rendering action data, and determining the rendering adaptation parameters of the rendering label according to the adaptation control path of the rendering label corresponding to the maximum value of the rendering linkage amplitude;
determining a dynamic adjustment interval and a dynamic adjustment direction according to the rendering adaptation parameters, and determining a dynamic adjustment point priority parameter of each dynamic adjustment point in the dynamic adjustment interval and an allocation mapping strategy priority parameter of each allocation mapping strategy in the dynamic adjustment direction according to the obtained dynamic adjustment interval and the obtained dynamic adjustment direction;
obtaining priority coincidence results between the dynamic adjustment point priority parameters of the dynamic adjustment points and the allocation mapping strategy priority parameters of the allocation mapping strategies according to the dynamic adjustment point priority parameter of each dynamic adjustment point in the dynamic adjustment interval and the allocation mapping strategy priority parameter of each allocation mapping strategy in the dynamic adjustment direction, and generating, according to the priority coincidence results, allocation mapping blocks for representing the allocation mapping strategies and their priority coincidence results;
determining an access queue of the dynamic adjustment interval and the dynamic adjustment direction according to each allocation mapping block, determining a first allocation queue of the allocation tasks of the access queue in the dynamic adjustment interval according to the association parameters of the association items of those allocation tasks, and determining a second allocation queue of the allocation tasks of the access queue in the dynamic adjustment direction according to the dynamic adjustment direction parameters of those allocation tasks;
and performing the object control operation on the object to be displayed according to the first allocation queue and the second allocation queue of each allocation task of the access queue.
In one possible design of the first aspect, the method further includes:
and updating the first control information of the object service and the second control information of the scene service according to the result of the object control operation on the object to be displayed.
In a second aspect, an embodiment of the present application further provides a virtual reality object control apparatus, which is applied to a virtual reality device, where the virtual reality device is communicatively connected to at least one object fluctuation range, and the apparatus includes:
an acquisition module, configured to acquire first control information of an object service corresponding to an object to be displayed in a current virtual reality configuration file and second control information of a scene service corresponding to the virtual reality configuration file;
a comparison module configured to compare a control node association set between the first control information and the second control information;
a first determining module, configured to determine, when there is a conflict in control behaviors of any at least two associated control nodes in the control node association set, a display object adaptation policy of the scene service for a target associated control node where the conflict exists;
a second determining module, configured to determine, according to the display object adaptation strategy, target object adaptation areas for performing display control on the object to be displayed and a control instruction sequence for each target object adaptation area;
and an object control module, configured to perform an object control operation on the object to be displayed according to the determined target object adaptation areas and the control instruction sequence for each target object adaptation area.
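The module composition of the apparatus may be illustrated by the following skeleton; the module internals and the service interfaces are placeholders invented for illustration, and only the acquisition and comparison modules are given toy behavior:

```python
# Skeleton sketch only; all data shapes here are hypothetical.
class VirtualRealityObjectControlApparatus:
    """Mirrors the five modules of the apparatus in the second aspect."""

    def acquire(self, profile):
        # Acquisition module: first/second control information from the
        # virtual reality configuration file.
        return profile.get("object_control"), profile.get("scene_control")

    def compare(self, first_info, second_info):
        # Comparison module: pair up associated control nodes between the
        # first and second control information.
        return list(zip(first_info, second_info))

    def control(self, profile):
        first_info, second_info = self.acquire(profile)
        associations = self.compare(first_info, second_info)
        # The first/second determining modules and the object control module
        # would operate on `associations` here (omitted placeholders).
        return associations

apparatus = VirtualRealityObjectControlApparatus()
result = apparatus.control({"object_control": ["n1"], "scene_control": ["m1"]})
```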
In a third aspect, an embodiment of the present application further provides a virtual reality device, where the virtual reality device includes a processor, a machine-readable storage medium, and a network interface, where the machine-readable storage medium, the network interface, and the processor are connected through a bus system, the network interface is configured to be connected in communication with at least one object fluctuation range, the machine-readable storage medium is configured to store a program, an instruction, or code, and the processor is configured to execute the program, the instruction, or the code in the machine-readable storage medium to perform the method for controlling a virtual reality object in the first aspect or any one of the possible designs in the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where instructions are stored in the computer-readable storage medium, and when the instructions are run on a computer, the instructions cause the computer to perform the virtual reality object control method in the first aspect or any one of the possible designs of the first aspect.
Based on any one of the above aspects, the method includes obtaining first control information of an object service corresponding to each object to be displayed and second control information of a corresponding scene service, and, when the control behaviors of any at least two associated control nodes in a control node association set between the first control information and the second control information are in conflict, determining a display object adaptation strategy of the scene service for the conflicting target associated control nodes, so as to determine target object adaptation areas and a control instruction sequence for each target object adaptation area, and thereby executing an object control operation on the object to be displayed. Therefore, the situation that the virtual reality object cannot be effectively adapted to the application environment scene because the control modes of the application environment scene and the virtual reality object in the virtual reality scene are not uniform can be avoided, and the rendering effect of the virtual reality object and the degree of adaptation to the application environment scene are improved.
Detailed Description
The present application will now be described in detail with reference to the drawings, and the specific operations in the method embodiments may also be applied to the apparatus embodiments or the system embodiments.
Fig. 1 is an interaction diagram of a virtual reality system 10 according to an embodiment of the present application. The virtual reality system 10 may include a server 200 and a virtual reality device 100 communicatively connected to the server 200, and a processor executing instruction operations may be included in the virtual reality device 100. The virtual reality system 10 shown in fig. 1 is only one possible example, and in other possible embodiments, the virtual reality system 10 may include only a portion of the components shown in fig. 1 or may include other components.
In some embodiments, the server 200 may be a single server or a server group. The server group may be centralized or distributed (e.g., the server 200 may be a distributed system). In some embodiments, the server 200 may be local or remote to the virtual reality device 100. For example, the server 200 may access information stored in the virtual reality device 100 and a database, or any combination thereof, via a network. As another example, the server 200 may be directly connected to at least one of the virtual reality device 100 and a database to access information and/or data stored therein. In some embodiments, the server 200 may be implemented on a cloud platform; by way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
In some embodiments, the server 200 may include a processor. The processor may process information and/or data related to the service request to perform one or more of the functions described herein. The processor may include one or more processing cores (e.g., a single-core processor or a multi-core processor). Merely by way of example, the processor may include a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), an Application Specific Instruction-set Processor (ASIP), a Graphics Processing Unit (GPU), a Physics Processing Unit (PPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), a Programmable Logic Device (PLD), a controller, a Microcontroller Unit (MCU), a Reduced Instruction Set Computer (RISC), a microprocessor, or the like, or any combination thereof.
The network may be used for the exchange of information and/or data. In some embodiments, one or more components in the virtual reality system 10 (e.g., the server 200, the virtual reality device 100, and the database) may send information and/or data to other components. In some embodiments, the network may be any type of wired or wireless network, or combination thereof. Merely by way of example, the network 130 may include a wired network, a wireless network, a fiber optic network, a telecommunications network, an intranet, the Internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a Bluetooth network, a ZigBee network, a Near Field Communication (NFC) network, or the like, or any combination thereof. In some embodiments, the network may include one or more network access points. For example, the network may include wired or wireless network access points, such as base stations and/or network switching nodes, through which one or more components of the virtual reality system 10 may connect to the network to exchange data and/or information.
The aforementioned database may store data and/or instructions. In some embodiments, the database may store data assigned to the virtual reality device 100. In some embodiments, the database may store data and/or instructions for the exemplary methods described herein. In some embodiments, the database may include mass storage, removable storage, volatile read-write memory, or Read-Only Memory (ROM), or the like, or any combination thereof. By way of example, mass storage may include magnetic disks, optical disks, solid state drives, and the like; removable storage may include flash drives, floppy disks, optical disks, memory cards, zip disks, tapes, and the like; volatile read-write memory may include Random Access Memory (RAM); the RAM may include Dynamic RAM (DRAM), Double Data Rate Synchronous Dynamic RAM (DDR SDRAM), Static RAM (SRAM), Thyristor-based Random Access Memory (T-RAM), Zero-capacitor RAM (Z-RAM), and the like. By way of example, the ROM may include Mask ROM (MROM), Programmable ROM (PROM), Erasable Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), Compact Disc ROM (CD-ROM), Digital Versatile Disc ROM (DVD-ROM), and the like. In some embodiments, the database may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
In some embodiments, the database may be connected to a network to communicate with one or more components in the virtual reality system 10 (e.g., the server 200, the virtual reality device 100, etc.). One or more components in the virtual reality system 10 may access data or instructions stored in the database via the network. In some embodiments, the database may be directly connected to one or more components in the virtual reality system 10 (e.g., the server 200, the virtual reality device 100, etc.). Alternatively, in some embodiments, the database may be part of the server 200.
To solve the technical problem in the foregoing background art, fig. 2 is a schematic flowchart of a virtual reality object control method provided in an embodiment of the present application, and the virtual reality object control method provided in this embodiment may be executed by the virtual reality device 100 shown in fig. 1, which is described in detail below.
Step S110, when receiving the virtual reality display request, obtaining a corresponding virtual reality configuration file from the server 200, and obtaining first control information of an object service corresponding to each object to be displayed in the virtual reality configuration file and second control information of a scene service corresponding to the virtual reality configuration file.
Step S120, comparing the control node association set between the first control information and the second control information, and when the control behaviors of any at least two associated control nodes in the control node association set conflict, determining a display object adaptation strategy of the scene service for the conflicting target associated control node.
Step S130, determining target object adaptation areas for performing display control on the objects to be displayed and a control instruction sequence aiming at each target object adaptation area according to the display object adaptation strategy.
Step S140, according to the determined target object adaptation areas and the control instruction sequence for each target object adaptation area, performing object control operation on the object to be displayed.
In this embodiment, a user may access the server 200 through a user terminal and select a desired option from the virtual reality configuration options provided by the server 200, so as to send a virtual reality display request to the server 200. The server 200 may forward the virtual reality display request to the virtual reality device 100, and the virtual reality device 100 may obtain, from the server 200 according to the virtual reality display request, a virtual reality configuration file corresponding to the option selected by the user. For example, the user may select a preferred scene and some customized virtual reality objects.
The virtual reality configuration file may include first control information of an object service corresponding to each object to be displayed and second control information of a scene service corresponding to the virtual reality configuration file, where the first control information and the second control information are respectively used to represent a control instruction of each control link, and may be specifically configured by a provider of the virtual display model. In addition, the target object adaptation area may be understood as an area for performing adaptation control on an object to be displayed in a specific display process.
Based on this, in this embodiment, by acquiring first control information of an object service corresponding to each object to be displayed and second control information of a corresponding scene service, when there is a conflict between control behaviors of any at least two associated control nodes in a control node association set between the first control information and the second control information, a display object adaptation policy of the scene service for a target associated control node where there is a conflict is determined, so as to determine a target object adaptation area and a control instruction sequence for each target object adaptation area, thereby performing object control operation on the object to be displayed. Therefore, the situation that the virtual reality object cannot be effectively adapted to the application environment scene due to the fact that the control modes of the application environment scene and the virtual reality object in the virtual reality scene are not uniform (namely, the situation of conflict exists) can be avoided, and the rendering effect of the virtual reality object and the adaptation degree of the application environment scene are improved.
In a possible design, for step S120, in order to effectively determine the display object adaptation strategy and improve the rendering effect of the virtual reality object and the degree of adaptation to the application environment scene, in this embodiment, a current scene space corresponding to the target associated control node where there is a conflict is first obtained from the scene service. Then, a first scene linkage space where the current scene space is located is calculated according to a preset display object adaptation matrix, simulation updating is performed on the range of the first scene linkage space, and a second scene linkage space where the current scene space is located is obtained, so that the second scene linkage space serves as the initial scene linkage space of the next scene space.
Then, taking the next scene space as the current scene space, the preset display object adaptation matrix is updated to obtain an updated display object adaptation matrix, and the initial scene linkage space corresponding to the current scene space is updated in linkage according to the updated display object adaptation matrix to obtain the initial scene linkage space corresponding to the next scene space, until all scene objects in the scene space have undergone simulated linkage, whereby a simulated linkage result is obtained.
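The iterative update described above can be sketched as follows. This is only an illustrative skeleton: the matrix update and matrix application rules are stand-ins supplied by the caller, not the specific operations of this embodiment.

```python
def simulate_linkage(scene_spaces, initial_matrix, update_matrix, apply_matrix):
    """Iterate over scene spaces: apply the current display object
    adaptation matrix to the current scene space to obtain its scene
    linkage space, then update the matrix before moving to the next
    scene space, until every scene space has been linked once.

    update_matrix(matrix, space) and apply_matrix(matrix, space) are
    hypothetical callables standing in for the embodiment's operations.
    """
    results = []
    matrix = initial_matrix
    # First scene space uses the preset (initial) adaptation matrix.
    results.append(apply_matrix(matrix, scene_spaces[0]))
    for space in scene_spaces[1:]:
        matrix = update_matrix(matrix, space)   # updated adaptation matrix
        results.append(apply_matrix(matrix, space))
    return results                              # simulated linkage result
```

With trivial numeric stand-ins (matrix as a scalar, update adding 1, application multiplying), `simulate_linkage([1, 2, 3], 2, lambda m, s: m + 1, lambda m, s: m * s)` walks the three spaces in order.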
On this basis, the corresponding dynamic adaptation function can be calculated according to the initial simulated linkage parameters, the number of linkages of each scene object in the scene space, the total number of linkages of all scene objects, and the regional configuration parameters of the initial scene linkage space.
As a possible example, this embodiment may obtain a plurality of simulated linkage spatial coordinates according to the initial simulated linkage parameters, obtain a linkage coordinate value of each of the simulated linkage spatial coordinates, and then obtain simulated linkage set header information of each simulated linkage spatial coordinate according to its linkage coordinate value and its simulated linkage interval before simulated linkage.
It should be noted that the simulated linkage set header information includes the number of linkages between the simulated linkage interval and each corresponding scene object, as well as the total number of linkages.
Then, an initial value of the simulated linkage interval of each simulated linkage spatial coordinate is calculated according to the simulated linkage type and the simulated linkage interval of that coordinate. For example, each simulated linkage type may correspond to an interval coefficient; on this basis, the simulated linkage interval of each simulated linkage spatial coordinate may be multiplied by its interval coefficient to obtain the initial value of that interval.
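A minimal sketch of this interval calculation, assuming the simulated linkage type simply indexes a table of interval coefficients (the type names and coefficient values here are illustrative assumptions, not values from this embodiment):

```python
# Hypothetical per-type interval coefficients (illustrative values).
INTERVAL_COEFFICIENTS = {
    "translation": 0.5,
    "rotation": 1.0,
    "scaling": 2.0,
}

def initial_interval_values(coords):
    """coords: list of dicts with 'linkage_type' and 'linkage_interval'.
    The initial value of each simulated linkage interval is the interval
    multiplied by the coefficient of its simulated linkage type."""
    return [
        c["linkage_interval"] * INTERVAL_COEFFICIENTS[c["linkage_type"]]
        for c in coords
    ]
```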
Then, according to the initial value of the simulated linkage interval of each simulated linkage spatial coordinate and the corresponding number of linkages and total number of linkages of each scene object, a simulated linkage information table is queried to obtain coordinate offset adaptation parameters of the plurality of simulated linkage spatial coordinates. Parameter fusion information between these coordinate offset adaptation parameters and the regional configuration parameters of the initial scene linkage space is then determined to obtain a plurality of pieces of parameter fusion information. The simulated linkage results and the corresponding simulated linkage control parameters of the plurality of pieces of parameter fusion information are calculated, and the simulated linkage control parameters are processed according to the sequence of simulated linkage process nodes in the simulated linkage results to obtain a plurality of simulated linkage control parameter sets.
Then, the simulated linkage adaptation processes in the plurality of simulated linkage control parameter sets are extracted in sequence, the matching targets in the plurality of simulated linkage adaptation processes are taken as simulated linkage targets, and a simulated linkage set corresponding to each simulated linkage target is generated in sequence according to the simulated linkage adaptation processes, so that the linkage amplitude between the matching targets in the simulated linkage adaptation processes can be matched with each simulated linkage set respectively. It should be noted that the linkage amplitude corresponds to the absolute value of the difference between the sequence maximum and the sequence minimum of the simulated linkage set.
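The linkage amplitude defined above can be computed directly as the absolute value of the difference between the sequence maximum and the sequence minimum of a simulated linkage set:

```python
def linkage_amplitude(linkage_set):
    """Linkage amplitude of a simulated linkage set:
    |sequence maximum - sequence minimum|."""
    return abs(max(linkage_set) - min(linkage_set))
```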
Then, corresponding simulated linkage adaptation nodes can be set for each simulated linkage set according to the linkage amplitude matched with that set. The simulated linkage sets provided with the simulated linkage adaptation nodes are configured in association according to the simulated linkage adaptation processes, and the associatively configured simulated linkage sets are applied to the corresponding simulated linkage controls according to the types of their corresponding simulated linkage control parameter sets, so as to obtain target simulated linkage controls. The dynamic adaptation functions of the target simulated linkage controls are then combined to obtain the corresponding dynamic adaptation function.
On the basis of the above description, the dynamic adaptation function, the simulated linkage result, and the scene parameters of the scene space can be output as the display object adaptation policy of the control node association set.
For example, function substitution may be performed on each simulated linkage node in the simulated linkage result according to the dynamic adaptation function, and a simulated linkage adaptation coordinate point of each simulated linkage node is determined. Main linkage adaptation configuration information of the simulated linkage node is determined according to the simulated linkage adaptation coordinate point, slave linkage adaptation configuration information corresponding to the simulated linkage node is found based on the main linkage adaptation configuration information, and the simulated linkage nodes are combined into at least one spatial configuration relationship chain according to the slave linkage adaptation configuration information.
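The combination of simulated linkage nodes into spatial configuration relationship chains can be sketched as follows, under the simplifying assumption that the slave linkage adaptation configuration information of each node reduces to the identifier of its successor node (the data shape is an illustrative assumption):

```python
def build_chains(nodes):
    """nodes: dict mapping a node id to its slave (successor) node id,
    or None at a chain end. Returns the spatial configuration relationship
    chains as lists of node ids, one list per chain head."""
    slaves = {v for v in nodes.values() if v is not None}
    # A chain head is a node that no other node points to.
    heads = [n for n in nodes if n not in slaves]
    chains = []
    for head in heads:
        chain, cur = [], head
        while cur is not None:        # follow slave links to the chain end
            chain.append(cur)
            cur = nodes.get(cur)
        chains.append(chain)
    return chains
```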
Then, for each spatial configuration relationship chain, a chain simulated linkage parameter representing the simulated linkage of that chain can be obtained. Linkage control information of each spatial configuration relationship chain at the linked simulated linkage nodes is determined according to the chain simulated linkage parameter, and the spatial configuration relationship chains are spliced according to the linkage control logical relationships of the pieces of linkage control information, so as to obtain a spliced spatial configuration relationship chain.
Then, corresponding splicing space point adaptation information can be extracted according to the splicing space points on the spliced spatial configuration relationship chain, and the splicing space point adaptation information is grouped according to different object display labels. Adaptation adjustment information of each object display label is calculated, and a splicing space point adaptation interval is selected according to the adaptation adjustment information. When an update process for updating the display object adaptation policy is generated in the splicing space point adaptation information according to the splicing space point adaptation interval, an adaptation mapping script corresponding to the splicing space point adaptation interval is obtained according to the update process, a mapping bit space for recording the adaptation mapping script is generated, the adaptation mapping script is mapped to the mapping bit space, and mapping association information of the adaptation mapping script is set according to the object display labels of the splicing space point adaptation information.
In this way, whether the splicing space point adaptation information is adapted to the display object adaptation policy can be judged according to the mapping association information. When it is adapted, at least one update parameter for updating the display object adaptation policy is determined according to the update process, so that the display object adaptation policy is updated according to the at least one update parameter, wherein the display object adaptation policy includes a control instruction corresponding to each unit area.
In a possible design, for step S130, this embodiment may locate the fluctuation range of each object related to the object to be displayed and determine an object control coordinate system corresponding to the object to be displayed. A fluctuation distance segment set is then determined according to the object control coordinate system, the dense control region of the object control coordinate system is extracted, and, taking the set threshold as the fluctuation interval, the centralized control range of the fluctuation distance segment set associated with the dense control region is extracted.
Here, the dense control region may be used to represent a region of the object control coordinate system in which the number of controllable coordinate points per unit coordinate area is greater than a set number (e.g., 50).
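A minimal sketch of extracting such dense cells, assuming the unit coordinate system is a grid of square cells and each controllable coordinate point is a 2-D tuple (the grid layout and cell size are illustrative assumptions):

```python
from collections import Counter

def dense_control_cells(points, cell_size=1.0, threshold=50):
    """Group controllable coordinate points into unit grid cells and
    keep the cells whose point count exceeds the set number
    (e.g. 50), i.e. the dense control region."""
    counts = Counter(
        (int(x // cell_size), int(y // cell_size)) for x, y in points
    )
    return {cell for cell, n in counts.items() if n > threshold}
```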
Then, according to at least two associated fluctuation distance segment coordinates in the centralized control range, a plurality of visual movement units are generated according to the visual movement directions in the visual areas corresponding to the fluctuation distance segment coordinates. The overlapping area between each visual area in the next fluctuation distance segment coordinate and each visual area in the previous fluctuation distance segment coordinate is calculated, and a corresponding visual movement direction table is obtained from the overlapping areas. In this way, pairs of visual movement units whose visual movement directions match and whose inter-area overlapping area is smaller than the maximum continuous overlapping area of the object control coordinate system can be obtained from the visual movement direction table, so as to form the coordinate space of the fluctuation distance segment.
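Assuming the visual areas are axis-aligned rectangles (an illustrative simplification of the embodiment's visual areas), the overlapping-area calculation and the pairing of visual movement units can be sketched as:

```python
def overlap_area(a, b):
    """Overlap area of two axis-aligned visual areas, each given as a
    (x_min, y_min, x_max, y_max) rectangle; 0.0 when disjoint."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return w * h if w > 0 and h > 0 else 0.0

def paired_units(prev_units, next_units, max_overlap):
    """Each unit is a (direction, rect) tuple. Keep pairs whose visual
    movement directions match and whose overlapping area is smaller
    than the maximum continuous overlapping area max_overlap."""
    pairs = []
    for d1, r1 in prev_units:
        for d2, r2 in next_units:
            if d1 == d2 and overlap_area(r1, r2) < max_overlap:
                pairs.append((r1, r2))
    return pairs
```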
On this basis, the coordinate spaces of the fluctuation distance segments can be adapted to obtain an adapted interval of each adapted fluctuation distance segment coordinate space. The target object adaptation areas for display control of the object to be displayed are determined according to the adapted interval of each fluctuation distance segment coordinate space, so that the control instruction sequence for each target object adaptation area is determined according to the control instruction of the unit area corresponding to that target object adaptation area in the display object adaptation policy.
In a possible design, for step S140, this embodiment generates, according to the determined target object adaptation areas, an adaptation control path and adaptation control identification information used when each target object adaptation area performs adaptation control on its corresponding control instruction sequence. The rendering nodes corresponding to the object to be displayed are then identified to obtain a plurality of rendering nodes, a rendering label corresponding to each rendering node is determined, and corresponding rendering action data is determined according to the rendering labels. The adaptation control path and the adaptation control identification information can thus be matched against the rendering action data to obtain a rendering label, the rendering linkage amplitude between that rendering label and each rendering label in the rendering action data is determined, and the rendering adaptation parameters of the rendering label are determined according to the adaptation control paths of the rendering labels corresponding to the top N rendering linkage amplitudes, where N is a preset positive integer.
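The top-N selection by rendering linkage amplitude can be sketched as follows, under the assumption that the amplitudes are held in a mapping from rendering label to amplitude value:

```python
def top_n_labels(amplitudes, n):
    """amplitudes: dict mapping rendering label -> rendering linkage
    amplitude. Returns the N labels with the largest amplitudes, in
    descending order of amplitude."""
    return sorted(amplitudes, key=amplitudes.get, reverse=True)[:n]
```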
Then, a dynamic adjustment interval and a dynamic adjustment direction are determined according to the rendering adaptation parameters, and the object control operation is executed on the object to be displayed according to the dynamic adjustment interval and the dynamic adjustment direction. In this way, the situation that the virtual reality object cannot be effectively adapted to the application environment scene because the control modes of the application environment scene and the virtual reality object in the virtual reality scene are not uniform can be avoided, and the rendering effect of the virtual reality object and its degree of adaptation to the application environment scene are improved.
On this basis, in order to facilitate subsequent use, this embodiment may update the first control information of the object service and the second control information of the scene service according to the result of the object control operation on the object to be displayed. Thus, the user does not need to perform adaptation again when selecting the same configuration option in subsequent use.
Fig. 3 is a schematic diagram of the functional modules of a virtual reality object control apparatus 300 according to an embodiment of the present application. In this embodiment, the virtual reality object control apparatus 300 may be divided into functional modules according to the foregoing method embodiments. For example, each functional module may be divided corresponding to a respective function, or two or more functions may be integrated into one processing module. The integrated module can be implemented in the form of hardware, or in the form of a software functional module. It should be noted that the division of modules in the present application is schematic and is only a logical function division; there may be other division manners in actual implementation. For example, when each functional module is divided corresponding to a respective function, the virtual reality object control apparatus 300 shown in fig. 3 is only one schematic apparatus diagram. The virtual reality object control apparatus 300 may include an obtaining module 310, a first determining module 320, a second determining module 330, and an object control module 340, and the functions of these functional modules are described in detail below.
The obtaining module 310 is configured to obtain a corresponding virtual reality configuration file from the server 200 when receiving the virtual reality display request, and obtain first control information of an object service corresponding to each object to be displayed in the virtual reality configuration file and second control information of a scene service corresponding to the virtual reality configuration file.
The first determining module 320 is configured to compare the control node association set between the first control information and the second control information, and determine, when the control behaviors of any at least two associated control nodes in the control node association set conflict, a display object adaptation policy of the scene service for the conflicting target associated control node.
The second determining module 330 is configured to determine, according to the display object adaptation policy, a target object adaptation area for performing display control on an object to be displayed and a control instruction sequence for each target object adaptation area.
And the object control module 340 is configured to execute an object control operation on the object to be displayed according to the determined target object adaptation areas and the control instruction sequence for each target object adaptation area.
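The four modules above can be sketched as a single pipeline object. This is an illustrative skeleton only: the callables here are stand-ins supplied by the caller, not the claimed implementations of modules 310 to 340.

```python
class VirtualRealityObjectControlApparatus:
    """Hypothetical skeleton mirroring the obtaining module 310, first
    determining module 320, second determining module 330, and object
    control module 340, wired in that order."""

    def __init__(self, obtain, determine_policy, determine_areas, control):
        self.obtain = obtain                      # module 310
        self.determine_policy = determine_policy  # module 320
        self.determine_areas = determine_areas    # module 330
        self.control = control                    # module 340

    def run(self, request):
        # 310: first/second control information from the configuration file
        first_info, second_info = self.obtain(request)
        # 320: display object adaptation policy for conflicting nodes
        policy = self.determine_policy(first_info, second_info)
        # 330: target object adaptation areas and instruction sequences
        areas, sequences = self.determine_areas(policy)
        # 340: execute the object control operation
        return self.control(areas, sequences)
```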
Further, fig. 4 is a schematic structural diagram of a virtual reality device 100 for executing the virtual reality object control method according to the embodiment of the present application. As shown in fig. 4, the virtual reality device 100 may include a network interface 110, a machine-readable storage medium 120, a processor 130, and a bus 140. There may be one or more processors 130; one processor 130 is illustrated in fig. 4 as an example. The network interface 110, the machine-readable storage medium 120, and the processor 130 may be connected by the bus 140 or in other manners; connection by the bus 140 is taken as an example in fig. 4.
The machine-readable storage medium 120 is a computer-readable storage medium, and can be used to store software programs, computer-executable programs, and modules, such as the program instructions/modules corresponding to the virtual reality object control method in the embodiment of the present application (for example, the obtaining module 310, the first determining module 320, the second determining module 330, and the object control module 340 shown in fig. 3). The processor 130 executes the software programs, instructions, and modules stored in the machine-readable storage medium 120 to perform the various functional applications and data processing of the terminal device, that is, to implement the above-mentioned virtual reality object control method, and details are not described herein again.
The machine-readable storage medium 120 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system and an application program required for at least one function, and the storage data area may store data created according to the use of the terminal, and the like. Further, the machine-readable storage medium 120 may be either volatile memory or non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be Random Access Memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), and Direct Rambus RAM (DR RAM). It should be noted that the memories of the systems and methods described herein are intended to comprise, without being limited to, these and any other suitable types of memory. In some examples, the machine-readable storage medium 120 may further include memory located remotely from the processor 130, which may be connected to the virtual reality device 100 over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The processor 130 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method embodiments may be performed by integrated logic circuits of hardware in the processor 130 or by instructions in the form of software. The processor 130 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and can implement or perform the various methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the methods disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in a decoding processor.
The virtual reality device 100 can interact with other devices (e.g., the server 200) through the network interface 110. The network interface 110 may be a circuit, a bus, a transceiver, or any other apparatus that can be used to exchange information. The processor 130 may send and receive information through the network interface 110.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the present application are produced, in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a network of computers, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, virtual reality device, or data center to another website, computer, virtual reality device, or data center by wired means (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless means (e.g., infrared, radio, microwave). The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a virtual reality device or a data center, that incorporates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the embodiments of the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the embodiments of the present application fall within the scope of the claims of the present application and their equivalents, the present application is also intended to encompass such modifications and variations.