CN112015272B - Virtual reality system and virtual reality object control device - Google Patents

Virtual reality system and virtual reality object control device

Info

Publication number
CN112015272B
CN112015272B (application CN202010860811.4A)
Authority
CN
China
Prior art keywords
linkage
adaptation
control
space
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010860811.4A
Other languages
Chinese (zh)
Other versions
CN112015272A (en)
Inventor
简吉波 (Jian Jibo)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Euro Software Technology Development Co.,Ltd.
Original Assignee
Beijing Euro Software Technology Development Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Euro Software Technology Development Co., Ltd.
Priority to CN202010860811.4A
Publication of CN112015272A
Application granted
Publication of CN112015272B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F9/00: Arrangements for program control, e.g. control units
    • G06F9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44: Arrangements for executing specific programs
    • G06F9/445: Program loading or initiating
    • G06F9/44505: Configuring for program initiating, e.g. using registry, configuration files
    • G06F9/4451: User profiles; Roaming
    • G06F9/451: Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An embodiment of the application provides a virtual reality system and a virtual reality object control device. First control information of the object service corresponding to each object to be displayed, and second control information of the corresponding scene service, are acquired. When the control behaviors of any two or more associated control nodes in the control node association set between the first control information and the second control information conflict, a display object adaptation strategy of the scene service for the conflicting target associated control nodes is determined. From this strategy, the target object adaptation regions and a control instruction sequence for each target object adaptation region are determined, and the object control operation is executed on the object to be displayed. This avoids the situation in which a virtual reality object cannot be effectively adapted to the application environment scene because the control modes of the application environment scene and of the virtual reality objects in the virtual reality scene are not uniform, and improves both the rendering effect of the virtual reality object and its degree of adaptation to the application environment scene.

Description

Virtual reality system and virtual reality object control device
Technical Field
The present application relates to the field of virtual reality technologies, and in particular, to a virtual reality system and a virtual reality object control device.
Background
In a virtual reality scene, some control information, such as the weather environment and the road environment, is usually configured uniformly for the application environment scene, while other control information is configured in association with extended virtual reality objects in that scene. The virtual reality objects may generally be selected by the user or chosen according to a specific virtual reality scene service. In conventional schemes, because the control modes of the application environment scene and of the virtual reality objects are not uniform, a virtual reality object may fail to adapt effectively to the application environment scene. This is especially true when the application environment scene is fixed while the virtual reality objects change frequently, in which case the rendering effect of the virtual reality objects may be seriously degraded.
Disclosure of Invention
To overcome at least the above deficiencies in the prior art, an object of the present application is to provide a virtual reality system and a virtual reality object control device that avoid the situation in which a virtual reality object cannot be effectively adapted to the application environment scene because the control modes of the application environment scene and of the virtual reality objects in the virtual reality scene are not uniform, thereby improving the rendering effect of the virtual reality object and its degree of adaptation to the application environment scene.
In a first aspect, the present application provides a virtual reality object control method applied to a virtual reality device, where the virtual reality device is communicatively connected to at least one object fluctuation range. The method includes:
acquiring first control information of an object service corresponding to an object to be displayed in a current virtual reality configuration file and second control information of a scene service corresponding to the virtual reality configuration file;
comparing a set of control node associations between the first control information and the second control information;
when the control behaviors of any at least two associated control nodes in the control node association set conflict, determining a display object adaptation strategy of the scene service for the conflicting target associated control nodes;
determining target object adaptation areas for controlling the display of the object to be displayed and a control instruction sequence aiming at each target object adaptation area according to the display object adaptation strategy;
and executing object control operation on the object to be displayed according to the determined target object adaptation areas and the control instruction sequence aiming at each target object adaptation area.
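The five steps above can be sketched as a single control pipeline. This is an illustrative sketch only; the names `AdaptationPlan`, `find_conflicts`, and `control_objects`, and the shape of the control information as plain dictionaries, are assumptions made for the example and are not part of the claimed method.

```python
from dataclasses import dataclass, field

@dataclass
class AdaptationPlan:
    """Display object adaptation strategy resolved for conflicting nodes."""
    regions: list                                      # target object adaptation regions
    instructions: dict = field(default_factory=dict)   # region -> control instruction sequence

def find_conflicts(first_info: dict, second_info: dict) -> list:
    """Compare the control node association set between the two pieces of
    control information; a node conflicts when both services drive it with
    different control behaviors."""
    shared = set(first_info) & set(second_info)
    return sorted(node for node in shared if first_info[node] != second_info[node])

def control_objects(first_info: dict, second_info: dict) -> AdaptationPlan:
    conflicts = find_conflicts(first_info, second_info)
    if not conflicts:
        return AdaptationPlan(regions=[])
    # Assumed resolution: adapt the display object to the scene service's
    # behavior, one instruction sequence per conflicting region.
    instructions = {node: [("adapt", second_info[node])] for node in conflicts}
    return AdaptationPlan(regions=conflicts, instructions=instructions)

plan = control_objects({"sky": "day", "fog": "off"}, {"sky": "night", "fog": "off"})
print(plan.regions)  # ['sky']
```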
In a possible design of the first aspect, the step of determining a presentation object adaptation policy of the scenario service for a target associated control node with a conflict includes:
acquiring a current scene space corresponding to the control node association set from the scene service according to the control node association set;
calculating a first scene linkage space where the current scene space is located according to a preset display object adaptation matrix, performing simulation updating on the range of the first scene linkage space, and acquiring a second scene linkage space where the current scene space is located so as to enable the second scene linkage space to be an initial scene linkage space of a next scene space;
taking the next scene space as the current scene space, updating the preset display object adaptation matrix to obtain an updated display object adaptation matrix, and performing linkage updating on the initial scene linkage space corresponding to the current scene space according to the updated display object adaptation matrix to obtain the initial scene linkage space corresponding to the next scene space until all scene objects in the scene space are subjected to simulated linkage, so as to obtain a simulated linkage result;
calculating a corresponding dynamic adaptation function according to an initial simulation linkage parameter, the linkage times of each scene object in the scene space, the total linkage times of each scene object and a region configuration parameter of the initial scene linkage space;
and outputting the dynamic adaptation function, the simulation linkage result and the scene parameters of the scene space as the display object adaptation strategy of the control node association set.
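As an illustration of the iterative linkage simulation described above, the following hedged sketch steps through the scene spaces in order, recomputing the linkage space and updating the adaptation matrix at each step so that each result becomes the initial scene linkage space of the next scene space. The matrix update rule and the data shapes are assumptions made for the example.

```python
def simulate_linkage(scene_spaces, adaptation_matrix, update_matrix, link_space):
    """Iteratively derive each scene space's scene linkage space.

    scene_spaces:      ordered scene spaces from the scene service
    adaptation_matrix: preset display object adaptation matrix
    update_matrix:     callable producing the updated matrix for the next step
    link_space:        callable mapping (space, matrix) -> scene linkage space
    """
    results = []
    matrix = adaptation_matrix
    for space in scene_spaces:
        # Linkage space of the current scene space under the current matrix;
        # the updated matrix then drives the next space's initial linkage space.
        linkage = link_space(space, matrix)
        results.append((space, linkage))
        matrix = update_matrix(matrix)
    return results  # simulated linkage result, one entry per scene space

out = simulate_linkage(
    ["roomA", "roomB"],
    1,
    update_matrix=lambda m: m + 1,
    link_space=lambda s, m: f"{s}*{m}",
)
print(out)  # [('roomA', 'roomA*1'), ('roomB', 'roomB*2')]
```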
In a possible design of the first aspect, the step of calculating a corresponding dynamic adaptation function according to an initial simulation linkage parameter, a number of times that each scene object in the scene space is linked, a total number of times that each scene object is linked, and a region configuration parameter of the initial scene linkage space includes:
acquiring a plurality of simulated linkage space coordinates according to the initial simulated linkage parameters, and acquiring a linkage coordinate value of each simulated linkage space coordinate in the plurality of simulated linkage space coordinates;
acquiring simulation linkage aggregation head information of each simulation linkage space coordinate according to the linkage coordinate value of each simulation linkage space coordinate and a simulation linkage interval before simulation linkage of each simulation linkage space coordinate, wherein the simulation linkage aggregation head information comprises the simulation linkage interval and the number of times of linkage and the total number of times of linkage of each corresponding scene object;
calculating to obtain an initial value of the simulated linkage interval of each simulated linkage space coordinate according to the simulated linkage type of each simulated linkage space coordinate and the simulated linkage interval of each simulated linkage space coordinate;
inquiring a simulation linkage information table to obtain coordinate offset adaptive parameters of the plurality of simulation linkage space coordinates according to the simulation linkage interval initial value of each simulation linkage space coordinate and the corresponding times of linkage and total linkage times of each scene object;
determining parameter fusion information between coordinate offset adaptation parameters of the plurality of simulated linkage space coordinates and region configuration parameters of the initial scene linkage space to obtain a plurality of parameter fusion information;
calculating simulation linkage results of a plurality of parameter fusion information and corresponding simulation linkage control parameters, and processing the simulation linkage control parameters according to a simulation linkage process node sequence in the simulation linkage results to obtain a plurality of simulation linkage control parameter sets;
sequentially extracting simulation linkage adaptation processes in the plurality of simulation linkage control parameter sets, taking matching targets in the plurality of simulation linkage adaptation processes as simulation linkage targets, and respectively and sequentially generating a simulation linkage set corresponding to each simulation linkage target according to the simulation linkage adaptation processes;
respectively matching the linkage amplitude between each matching target in the simulation linkage adaptation process with each simulation linkage set, wherein the linkage amplitude corresponds to the absolute value of the difference between the sequence maximum value and the sequence minimum value of each simulation linkage set;
setting corresponding simulation linkage adaptation nodes for each simulation linkage set according to linkage amplitude matched with each simulation linkage set, carrying out association configuration on the simulation linkage sets provided with the simulation linkage adaptation nodes according to the simulation linkage adaptation process, and applying the simulation linkage sets which are subjected to the association configuration to corresponding simulation linkage controls according to the types of simulation linkage control parameter sets corresponding to the simulation linkage sets which are subjected to the association configuration to obtain target simulation linkage controls;
and combining the dynamic adaptation functions of each target simulation linkage control to obtain the corresponding dynamic adaptation functions.
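The linkage amplitude defined in the steps above is concrete: the absolute difference between a simulation linkage set's sequence maximum and sequence minimum. The sketch below computes it directly; the idea of combining the per-control dynamic adaptation functions by summation is an assumption made for illustration, not the patented combination rule.

```python
def linkage_amplitude(sim_set):
    """Linkage amplitude of a simulated linkage set:
    |sequence maximum - sequence minimum|."""
    return abs(max(sim_set) - min(sim_set))

def combine_adaptation_functions(controls):
    """Combine the dynamic adaptation functions of each target simulation
    linkage control into one; summation is an assumed, illustrative choice."""
    return lambda x: sum(f(x) for f in controls)

print(linkage_amplitude([3, 9, 4, 1]))  # 8
f = combine_adaptation_functions([lambda x: x, lambda x: 2 * x])
print(f(5))  # 15
```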
In a possible design of the first aspect, the step of outputting the dynamic adaptation function, the simulation linkage result, and a scene parameter of a scene space as a display object adaptation policy of the control node association set includes:
performing function substitution on each simulation linkage node in the simulation linkage result according to the dynamic adaptation function to determine a simulation linkage adaptation coordinate point of each simulation linkage node, and acquiring a process configuration file of the simulation linkage node according to the simulation linkage adaptation coordinate point;
determining master linkage adaptation configuration information of the simulation linkage nodes according to the process configuration file, searching slave linkage adaptation configuration information corresponding to the simulation linkage nodes based on the master linkage adaptation configuration information, and combining each simulation linkage node into at least one spatial configuration relation chain according to the slave linkage adaptation configuration information;
extracting chain simulation linkage parameters which are corresponding to each spatial configuration relationship chain and used for representing simulation linkage of each spatial configuration relationship chain from the simulation linkage nodes based on each spatial configuration relationship chain;
determining linkage control information of each spatial configuration relation chain when the simulated linkage node is controlled according to the chain simulation linkage parameters, and splicing each spatial configuration relation chain according to the linkage control logic relation of each linkage control information to obtain a spliced spatial configuration relation chain;
extracting corresponding splicing space point adaptation information according to the splicing space points on the splicing space configuration relation chain, grouping the splicing space point adaptation information according to different object display labels, calculating adaptation adjustment information of each object display label, and selecting a splicing space point adaptation interval according to the adaptation adjustment information;
when an updating process for updating the adaptation strategy of the display object is generated in the adaptation information of the splicing space point according to the adaptation interval of the splicing space point, an adaptation mapping script corresponding to the adaptation interval of the splicing space point is obtained according to the updating process;
generating a mapping bit space for recording the adaptive mapping script, mapping the adaptive mapping script to the mapping bit space, and setting mapping associated information of the adaptive mapping script according to an object display tag of the splicing space point adaptive information;
judging whether the splicing space point adaptation information is in a state of executing the display object adaptation strategy or not according to the mapping association information, and determining at least one updating parameter and an updating logic flow for updating the display object adaptation strategy according to the updating flow when the splicing space point adaptation information is not in the state of executing the display object adaptation strategy;
and updating the display object adaptation strategy according to the at least one updating parameter and the updating logic flow.
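To illustrate the grouping step above, the sketch below groups splicing-space-point adaptation records by their object display label and selects one adaptation interval per label. The record fields and the min/max interval rule are assumptions made for the example.

```python
from collections import defaultdict

def select_adaptation_intervals(records):
    """records: iterable of (display_label, adaptation_value) pairs.
    Groups the splicing space point adaptation information by object
    display label and returns an adaptation interval per label."""
    groups = defaultdict(list)
    for label, value in records:
        groups[label].append(value)
    # Assumed rule: the interval is the [min, max] span of each group's values.
    return {label: (min(vs), max(vs)) for label, vs in groups.items()}

intervals = select_adaptation_intervals(
    [("tree", 0.2), ("tree", 0.8), ("car", 0.5)]
)
print(intervals)  # {'tree': (0.2, 0.8), 'car': (0.5, 0.5)}
```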
In a possible design of the first aspect, the step of determining, according to the display object adaptation policy, target object adaptation regions for display control of the object to be displayed and a control instruction sequence for each target object adaptation region includes:
according to the display object adaptation strategy, carrying out index search on the fluctuation range of each object related to the object to be displayed, and determining an object control coordinate system corresponding to the object to be displayed;
determining a fluctuation distance segment set according to the object control coordinate system, extracting a dense control area of the object control coordinate system, taking a set threshold value as a fluctuation interval, and extracting a centralized control range of the dense control area associated with the fluctuation distance segment set;
generating a plurality of visual movement units according to visual movement directions of visual areas in the fluctuation distance section coordinates according to at least two related fluctuation distance section coordinates in the centralized control range, calculating overlapping areas between all the visual areas in the next fluctuation distance section coordinate and all the visual areas in the previous fluctuation distance section coordinate, and obtaining a corresponding visual movement direction table according to each obtained overlapping area;
according to the visual moving direction table, a visual moving unit which has matched visual moving directions and the overlapping area between the visual areas of the two visual moving units is smaller than the maximum continuous overlapping area of the object control coordinate system in the overlapping area is obtained to form a coordinate space of a fluctuation distance section;
distributing nodes in each fluctuation distance segment coordinate space to obtain a distributed interval of each distributed fluctuation distance segment coordinate space, generating a corresponding object control coordinate system space according to the dense control area, and indexing the object control coordinate system space to obtain distributed intervals of a plurality of index nodes;
matching according to the distribution interval on the coordinate space of the fluctuation distance segment and the distribution interval of the index nodes on the object control coordinate system space to obtain an expression logic matching interval;
and determining target object adaptation areas for controlling the display of the object to be displayed and a control instruction sequence aiming at each target object adaptation area from the expression logic matching interval.
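One concrete reading of the merge test above: two visual movement units join the same fluctuation distance segment coordinate space only when their visual moving directions match and the overlap of their visual areas stays below the coordinate system's maximum continuous overlap area. The axis-aligned rectangle representation and the area arithmetic below are assumptions made for illustration.

```python
def overlap_area(a, b):
    """Overlap of two axis-aligned visual areas given as (x1, y1, x2, y2)."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0) * max(h, 0)

def can_merge(unit_a, unit_b, max_continuous_overlap):
    """Merge rule from the method: matching visual moving directions and an
    overlap area below the coordinate system's maximum continuous overlap."""
    if unit_a["direction"] != unit_b["direction"]:
        return False
    return overlap_area(unit_a["area"], unit_b["area"]) < max_continuous_overlap

a = {"direction": "east", "area": (0, 0, 4, 4)}
b = {"direction": "east", "area": (2, 0, 6, 4)}
print(overlap_area(a["area"], b["area"]))      # 8
print(can_merge(a, b, max_continuous_overlap=10))  # True
```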
In a possible design of the first aspect, the step of performing an object control operation on the object to be displayed according to the determined target object adaptation areas and the control instruction sequence for each target object adaptation area includes:
generating an adaptation control path and adaptation control identification information when the target object adaptation area allocates the corresponding control instruction sequence according to the determined target object adaptation area;
performing channel identification processing on an allocation channel corresponding to the object to be displayed to obtain a plurality of rendering nodes, determining a rendering label corresponding to each rendering node, and determining corresponding rendering action data according to the rendering labels;
identifying the adaptation control path and the adaptation control identification information to the rendering action data to obtain rendering labels, determining rendering linkage amplitude between the rendering labels and each rendering label in the rendering action data, and determining rendering adaptation parameters of the rendering labels according to the adaptation control path of the rendering label corresponding to the maximum value of the rendering linkage amplitude;
determining a dynamic adjustment interval and a dynamic adjustment direction according to the rendering adaptation parameters, and determining a dynamic adjustment point priority parameter of each dynamic adjustment point in the dynamic adjustment interval and an allocation mapping strategy priority parameter of each allocation mapping strategy in the dynamic adjustment direction according to the obtained dynamic adjustment interval and the obtained dynamic adjustment direction;
obtaining the dynamic adjusting point priority parameters of the dynamic adjusting points and the priority coincidence results of the distribution mapping strategy priority parameters of the distribution mapping strategies according to the dynamic adjusting point priority parameters of the dynamic adjusting points in the dynamic adjusting interval and the distribution mapping strategy priority parameters of the distribution mapping strategies in the dynamic adjusting direction, and generating distribution mapping blocks for representing the distribution mapping strategies and the priority coincidence results of the distribution mapping strategies according to the priority coincidence results;
determining an access queue of the dynamic adjustment interval and the dynamic adjustment direction according to each allocation mapping block, determining a first allocation queue of allocation tasks corresponding to allocation tasks of the access queue in the dynamic adjustment interval according to association parameters of association items of the allocation tasks corresponding to the allocation tasks of the access queue in the dynamic adjustment interval, and determining a second allocation queue of the allocation tasks of the access queue corresponding to the allocation tasks in the dynamic adjustment direction according to dynamic adjustment direction parameters of the allocation tasks of the access queue corresponding to the allocation tasks in the dynamic adjustment direction;
and executing object control operation on the object to be displayed according to the first distribution queue and the second distribution queue of each distribution task of the access queue.
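The label-matching step above selects the rendering label whose rendering linkage amplitude with the incoming label is largest, and that label's adaptation control path supplies the rendering adaptation parameters. A minimal sketch, assuming the amplitude is exposed as a numeric function (the toy amplitude used below is purely illustrative):

```python
def best_rendering_label(target, labels, amplitude):
    """Pick the rendering label with the maximum rendering linkage amplitude
    relative to the target label."""
    return max(labels, key=lambda label: amplitude(target, label))

# Assumed toy amplitude: number of characters shared between label names.
amp = lambda a, b: len(set(a) & set(b))
print(best_rendering_label("shadow", ["sky", "shade", "road"], amp))  # shade
```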
In one possible design of the first aspect, the method further includes:
and updating the first control information of the object service and the second control information of the scene service according to the result of the object control operation on the object to be displayed.
In a second aspect, an embodiment of the present application further provides a virtual reality object control apparatus applied to a virtual reality device, where the virtual reality device is communicatively connected to at least one object fluctuation range. The apparatus includes:
an acquisition module, configured to acquire first control information of an object service corresponding to an object to be displayed in a current virtual reality configuration file and second control information of a scene service corresponding to the virtual reality configuration file;
a comparison module configured to compare a control node association set between the first control information and the second control information;
a first determining module, configured to determine, when there is a conflict in control behaviors of any at least two associated control nodes in the control node association set, a display object adaptation policy of the scene service for a target associated control node where the conflict exists;
a second determining module, configured to determine, according to the display object adaptation strategy, target object adaptation regions for display control of the object to be displayed and a control instruction sequence for each target object adaptation region;
and an object control module, configured to execute an object control operation on the object to be displayed according to the determined target object adaptation regions and the control instruction sequence for each target object adaptation region.
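The five modules can be grouped into one device class, as in the sketch below. It mirrors the module list above, but every name and data shape is an assumption made for illustration, not the patented implementation.

```python
class VirtualRealityObjectController:
    """Illustrative grouping of the apparatus's five modules."""

    def acquire(self, profile):
        """Acquisition module: first (object service) and second (scene
        service) control information from the VR configuration file."""
        return profile["object_service"], profile["scene_service"]

    def compare(self, first, second):
        """Comparison module: control node association set between the two."""
        return {node for node in first if node in second}

    def detect_and_plan(self, first, second, associations):
        """First determining module: conflicting target associated control
        nodes for which an adaptation strategy is needed."""
        return sorted(n for n in associations if first[n] != second[n])

    def plan_regions(self, conflicts):
        """Second determining module: adaptation regions with instruction
        sequences (a placeholder sequence per region)."""
        return {node: ["adapt"] for node in conflicts}

    def control(self, regions):
        """Object control module: executes the object control operation."""
        return [(region, seq) for region, seq in regions.items()]

ctl = VirtualRealityObjectController()
first, second = ctl.acquire(
    {"object_service": {"sky": "day"}, "scene_service": {"sky": "night"}}
)
assoc = ctl.compare(first, second)
print(ctl.control(ctl.plan_regions(ctl.detect_and_plan(first, second, assoc))))
# [('sky', ['adapt'])]
```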
In a third aspect, an embodiment of the present application further provides a virtual reality device, where the virtual reality device includes a processor, a machine-readable storage medium, and a network interface, where the machine-readable storage medium, the network interface, and the processor are connected through a bus system, the network interface is configured to be connected in communication with at least one object fluctuation range, the machine-readable storage medium is configured to store a program, an instruction, or code, and the processor is configured to execute the program, the instruction, or the code in the machine-readable storage medium to perform the method for controlling a virtual reality object in the first aspect or any one of the possible designs in the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where instructions are stored in the computer-readable storage medium, and when run on a computer, the instructions cause the computer to perform the virtual reality object control method in the first aspect or any one of the possible designs of the first aspect.
Based on any one of the above aspects, first control information of the object service corresponding to each object to be displayed and second control information of the corresponding scene service are acquired. When the control behaviors of any two or more associated control nodes in the control node association set between the first control information and the second control information conflict, a display object adaptation strategy of the scene service for the conflicting target associated control nodes is determined, from which the target object adaptation regions and a control instruction sequence for each region are determined, and the object control operation is executed on the object to be displayed. This avoids the situation in which a virtual reality object cannot be effectively adapted to the application environment scene because the control modes of the application environment scene and of the virtual reality objects are not uniform, and improves both the rendering effect of the virtual reality object and its degree of adaptation to the application environment scene.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and should therefore not be considered limiting of its scope; those skilled in the art can derive other related drawings from them without inventive effort.
Fig. 1 is a schematic view of an application scenario of a virtual reality system according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a virtual reality object control method according to an embodiment of the present application;
fig. 3 is a schematic functional block diagram of a virtual reality object control apparatus according to an embodiment of the present disclosure;
fig. 4 is a block diagram schematically illustrating a structure of a virtual reality device for implementing the virtual reality object control method according to the embodiment of the present application.
Detailed Description
The present application will now be described in detail with reference to the drawings, and the specific operations in the method embodiments may also be applied to the apparatus embodiments or the system embodiments.
Fig. 1 is an interaction diagram of a virtual reality system 10 according to an embodiment of the present application. The virtual reality system 10 may include a server 200 and a virtual reality device 100 communicatively connected to the server 200, and a processor executing instruction operations may be included in the virtual reality device 100. The virtual reality system 10 shown in fig. 1 is only one possible example, and in other possible embodiments, the virtual reality system 10 may include only a portion of the components shown in fig. 1 or may include other components.
In some embodiments, the server 200 may be a single server or a server group. The server group may be centralized or distributed (e.g., the server 200 may be a distributed system). In some embodiments, the server 200 may be local or remote to the virtual reality device 100. For example, the server 200 may access information stored in the virtual reality device 100 and a database, or any combination thereof, via a network. As another example, the server 200 may be directly connected to at least one of the virtual reality device 100 and a database to access information and/or data stored therein. In some embodiments, the server 200 may be implemented on a cloud platform; by way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
In some embodiments, the server 200 may include a processor. The processor may process information and/or data related to the service request to perform one or more of the functions described herein. A processor may include one or more processing cores (e.g., a single-core or multi-core processor). Merely by way of example, a processor may include a Central Processing Unit (CPU), an Application-Specific Integrated Circuit (ASIC), an Application-Specific Instruction-set Processor (ASIP), a Graphics Processing Unit (GPU), a Physics Processing Unit (PPU), a Digital Signal Processor (DSP), a Field-Programmable Gate Array (FPGA), a Programmable Logic Device (PLD), a controller, a microcontroller unit, a Reduced Instruction Set Computer (RISC), a microprocessor, or the like, or any combination thereof.
The network may be used for the exchange of information and/or data. In some embodiments, one or more components in the virtual reality system 10 (e.g., the server 200, the virtual reality device 100, and the database) may send information and/or data to other components. In some embodiments, the network may be any type of wired or wireless network, or a combination thereof. Merely by way of example, the network may include a wired network, a wireless network, a fiber-optic network, a telecommunications network, an intranet, the Internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a Bluetooth network, a ZigBee network, a Near Field Communication (NFC) network, or the like, or any combination thereof. In some embodiments, the network may include one or more network access points. For example, the network may include wired or wireless network access points, such as base stations and/or network switching nodes, through which one or more components of the virtual reality system 10 may connect to the network to exchange data and/or information.
The aforementioned database may store data and/or instructions. In some embodiments, the database may store data assigned to the virtual reality device 100. In some embodiments, the database may store data and/or instructions for the exemplary methods described herein. In some embodiments, the database may include mass storage, removable storage, volatile read-write memory, Read-Only Memory (ROM), or the like, or any combination thereof. By way of example, mass storage may include magnetic disks, optical discs, solid-state drives, and the like; removable storage may include flash drives, floppy disks, optical discs, memory cards, zip disks, tapes, and the like; volatile read-write memory may include Random Access Memory (RAM), which may in turn include Dynamic RAM (DRAM), Double Data Rate Synchronous Dynamic RAM (DDR SDRAM), Static RAM (SRAM), Thyristor-based RAM (T-RAM), Zero-capacitor RAM (Z-RAM), and the like. By way of example, ROM may include Mask ROM (MROM), Programmable ROM (PROM), Erasable Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), Compact Disc ROM (CD-ROM), Digital Versatile Disc ROM (DVD-ROM), and the like. In some embodiments, the database may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
In some embodiments, the database may be connected to a network to communicate with one or more components in the virtual reality system 10 (e.g., the server 200, the virtual reality device 100, etc.). One or more components in the virtual reality system 10 may access data or instructions stored in the database via the network. In some embodiments, the database may be directly connected to one or more components in the virtual reality system 10 (e.g., the server 200, the virtual reality device 100, etc.), or, in some embodiments, the database may be part of the server 200.
To solve the technical problem in the foregoing background art, fig. 2 is a schematic flowchart of a virtual reality object control method provided in an embodiment of the present application, and the virtual reality object control method provided in this embodiment may be executed by the virtual reality device 100 shown in fig. 1, which is described in detail below.
Step S110, when receiving the virtual reality display request, obtaining a corresponding virtual reality configuration file from the server 200, and obtaining first control information of an object service corresponding to each object to be displayed in the virtual reality configuration file and second control information of a scene service corresponding to the virtual reality configuration file.
Step S120, comparing the control node association set between the first control information and the second control information, and when the control behaviors of any at least two associated control nodes in the control node association set conflict, determining a display object adaptation strategy of the scene service for the conflicting target associated control node.
Step S130, determining target object adaptation areas for performing display control on the objects to be displayed and a control instruction sequence aiming at each target object adaptation area according to the display object adaptation strategy.
Step S140, according to the determined target object adaptation areas and the control instruction sequence for each target object adaptation area, performing object control operation on the object to be displayed.
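Taken together, steps S110–S140 form a pipeline that can be sketched as below. Everything in the sketch — the profile layout, the scene-service-wins conflict rule, and the command tuples — is a hypothetical simplification for illustration, not the patented implementation.

```python
# Hypothetical sketch of steps S110-S140; data layout and the conflict
# rule are illustrative assumptions, not the patent's actual design.
def control_virtual_reality_objects(profile):
    # S110: first control info (per-object services) and second control
    # info (scene service), both taken from the configuration file.
    first_info = profile["object_services"]
    second_info = profile["scene_service"]

    # S120: nodes present in both sets whose control behaviors differ are
    # in conflict; this toy adaptation strategy lets the scene service win.
    strategy = {
        node: second_info[node]
        for node in first_info
        if node in second_info and first_info[node] != second_info[node]
    }

    # S130: one target object adaptation area per conflicting node, each
    # with its own control instruction sequence.
    areas = {node: [("adapt", node, cmd)] for node, cmd in strategy.items()}

    # S140: execute the control instruction sequence of every area.
    return [cmd for sequence in areas.values() for cmd in sequence]

profile = {
    "object_services": {"tree": "sway", "lamp": "glow"},
    "scene_service": {"tree": "still", "wind": "on"},
}
print(control_virtual_reality_objects(profile))  # only "tree" conflicts
```

Here only the "tree" node appears in both services with differing behaviors, so only it yields an adaptation area and a control instruction.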
In this embodiment, a user may access the server 200 through a user terminal and select a desired option from the virtual reality configuration options provided by the server 200, thereby sending a virtual reality display request to the server 200. The server 200 may forward the virtual reality display request to the virtual reality device 100, and the virtual reality device 100 may obtain, from the server 200 according to the virtual reality display request, the virtual reality configuration file corresponding to the option selected by the user. For example, the user can select a preferred scene and some user-customized virtual reality objects.
The virtual reality configuration file may include first control information of an object service corresponding to each object to be displayed and second control information of a scene service corresponding to the virtual reality configuration file, where the first control information and the second control information are respectively used to represent a control instruction of each control link, and may be specifically configured by a provider of the virtual display model. In addition, the target object adaptation area may be understood as an area for performing adaptation control on an object to be displayed in a specific display process.
Based on this, in this embodiment, by acquiring first control information of an object service corresponding to each object to be displayed and second control information of a corresponding scene service, when there is a conflict between control behaviors of any at least two associated control nodes in a control node association set between the first control information and the second control information, a display object adaptation policy of the scene service for a target associated control node where there is a conflict is determined, so as to determine a target object adaptation area and a control instruction sequence for each target object adaptation area, thereby performing object control operation on the object to be displayed. Therefore, the situation that the virtual reality object cannot be effectively adapted to the application environment scene due to the fact that the control modes of the application environment scene and the virtual reality object in the virtual reality scene are not uniform (namely, the situation of conflict exists) can be avoided, and the rendering effect of the virtual reality object and the adaptation degree of the application environment scene are improved.
In a possible design, for step S120, in order to effectively determine the display object adaptation strategy and improve the rendering effect of the virtual reality object and its adaptation degree to the application environment scene, this embodiment obtains, from the scene service, the current scene space corresponding to the target associated control node where the conflict exists. A first scene linkage space where the current scene space is located is then calculated according to a preset display object adaptation matrix, the range of the first scene linkage space is updated by simulation, and a second scene linkage space where the current scene space is located is obtained, so that the second scene linkage space serves as the initial scene linkage space of the next scene space.
And then, updating the preset display object adaptation matrix by taking the next scene space as the current scene space to obtain an updated display object adaptation matrix, and performing linkage updating on the initial scene linkage space corresponding to the current scene space according to the updated display object adaptation matrix to obtain the initial scene linkage space corresponding to the next scene space until all the scene objects in the scene space are subjected to simulated linkage, so as to obtain a simulated linkage result.
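The iterative loop just described — compute the linkage space of the current scene space from the adaptation matrix, hand the result to the next scene space as its initial linkage space, and update the matrix each round until all scene objects have been simulated — might look as follows. The scalar stand-in for the adaptation matrix, the averaging range update, and the 0.9 decay are all assumed for illustration.

```python
# Hedged sketch of the iterative simulated-linkage loop; the scalar
# "adaptation matrix", the averaging range update, and the 0.9 decay are
# illustrative assumptions, not the patent's actual mathematics.
def simulate_linkage(scene_spaces, adaptation_factor):
    linkage_space = list(scene_spaces[0])  # seed linkage space
    results = []
    for space in scene_spaces:
        # First scene linkage space of the current scene space.
        first = [adaptation_factor * x for x in space]
        # Simulated range update -> second linkage space, which becomes
        # the initial linkage space of the next scene space.
        linkage_space = [0.5 * (f + p) for f, p in zip(first, linkage_space)]
        results.append(list(linkage_space))
        # Update the display-object adaptation matrix (here a scalar)
        # before moving on to the next scene space.
        adaptation_factor *= 0.9
    return results
```

Each round's output feeds the next round as its starting state, mirroring the "second linkage space becomes the next initial linkage space" step in the text.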
On the basis, the corresponding dynamic adaptation function can be calculated according to the initial simulation linkage parameters, the linkage times of all scene objects in the scene space, the total linkage times of all scene objects and the regional configuration parameters of the initial scene linkage space.
As a possible example, the present embodiment may obtain a plurality of simulated linkage spatial coordinates according to the initial simulated linkage parameters, obtain a linkage coordinate value of each simulated linkage spatial coordinate in the plurality of simulated linkage spatial coordinates, and then obtain simulated linkage set header information of each simulated linkage spatial coordinate according to the linkage coordinate value of each simulated linkage spatial coordinate and a simulated linkage interval before simulated linkage of each simulated linkage spatial coordinate.
It should be noted that the simulated linkage set header information includes the simulated linkage interval, the number of times each corresponding scene object has been linked, and the total number of linkage times.
And then, calculating to obtain an initial value of the simulated linkage interval of each simulated linkage space coordinate according to the simulated linkage type of each simulated linkage space coordinate and the simulated linkage interval of each simulated linkage space coordinate. For example, the analog linkage type may correspond to an interval coefficient, and on this basis, the analog linkage interval of each analog linkage space coordinate may be multiplied by the interval coefficient to obtain an initial value of the analog linkage interval of each analog linkage space coordinate.
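The interval-coefficient computation in the example above reduces to a lookup and a multiplication; a minimal sketch, with made-up coefficient values per simulated linkage type:

```python
# Minimal sketch of the initial-value computation: each simulated linkage
# type maps to an interval coefficient, and the initial value is the
# simulated linkage interval multiplied by that coefficient. The types
# and coefficient values below are invented for illustration.
INTERVAL_COEFFICIENTS = {"translate": 1.5, "rotate": 0.5, "scale": 1.0}

def initial_interval(linkage_type, interval):
    return INTERVAL_COEFFICIENTS[linkage_type] * interval
```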
Then, according to the initial value of the simulated linkage interval of each simulated linkage space coordinate and the corresponding linkage times and total linkage times of each scene object, a simulated linkage information table is queried to obtain coordinate offset adaptation parameters of the plurality of simulated linkage space coordinates. Next, parameter fusion information between the coordinate offset adaptation parameters of the plurality of simulated linkage space coordinates and the area configuration parameters of the initial scene linkage space is determined to obtain a plurality of pieces of parameter fusion information. The simulated linkage results and the corresponding simulated linkage control parameters of the plurality of pieces of parameter fusion information are then calculated, and the simulated linkage control parameters are processed according to the simulated linkage process node sequence in the simulated linkage results to obtain a plurality of simulated linkage control parameter sets.
And then, sequentially extracting the simulation linkage adaptation processes in the plurality of simulation linkage control parameter sets, taking the matching targets in the plurality of simulation linkage adaptation processes as simulation linkage targets, and sequentially generating a simulation linkage set corresponding to each simulation linkage target according to the simulation linkage adaptation processes, so that the linkage amplitude between each matching target in the simulation linkage adaptation processes can be respectively matched with each simulation linkage set. It should be noted that the linkage amplitude corresponds to the absolute value of the difference between the sequence maximum and the sequence minimum of the analog linkage set.
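The linkage amplitude defined above — the absolute value of the difference between the sequence maximum and the sequence minimum of a simulated linkage set — reduces to a one-liner:

```python
# Linkage amplitude per the text: |max(set) - min(set)| over the
# sequence values of a simulated linkage set.
def linkage_amplitude(linkage_set):
    return abs(max(linkage_set) - min(linkage_set))
```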
Then, corresponding simulated linkage adaptation nodes can be set for each simulated linkage set according to the linkage amplitude matched with that set, and the simulated linkage sets provided with adaptation nodes are associated and configured according to the simulated linkage adaptation process. The associated and configured simulated linkage sets are applied to the corresponding simulated linkage controls according to the types of their corresponding simulated linkage control parameter sets, so as to obtain target simulated linkage controls, and the dynamic adaptation functions of each target simulated linkage control are combined to obtain the corresponding dynamic adaptation function.
On the basis of the description, the dynamic adaptation function, the simulation linkage result and the scene parameters of the scene space can be used as the display object adaptation strategy output of the control node association set.
For example, function substitution may be performed on each simulation linkage node in the simulation linkage result according to a dynamic adaptation function, a simulation linkage adaptation coordinate point of each simulation linkage node is determined, main linkage adaptation configuration information of the simulation linkage node is determined according to the simulation linkage adaptation coordinate point, slave linkage adaptation configuration information corresponding to the simulation linkage node is found out based on the main linkage adaptation configuration information, and each simulation linkage node is combined into at least one spatial configuration relationship chain according to the slave linkage adaptation configuration information.
Then, a chain simulation linkage parameter corresponding to each spatial configuration relationship chain and used for representing the simulated linkage of each spatial configuration relationship chain can be obtained based on each spatial configuration relationship chain, linkage control information of each spatial configuration relationship chain when the spatial configuration relationship chain is in linkage simulation linkage nodes is determined according to the chain simulation linkage parameter, and each spatial configuration relationship chain is spliced according to the linkage control logic relationship of each linkage control information to obtain a spliced spatial configuration relationship chain.
Then, corresponding splicing space point adaptation information can be extracted according to the splicing space points on the spliced spatial configuration relationship chain, and the splicing space point adaptation information is grouped according to different object display labels. Adaptation adjustment information of each object display label is calculated, and a splicing space point adaptation interval is selected according to the adaptation adjustment information. When an update process for updating the display object adaptation strategy is generated in the splicing space point adaptation information according to the splicing space point adaptation interval, an adaptation mapping script corresponding to the splicing space point adaptation interval is obtained according to the update process. A mapping bit space for recording the adaptation mapping script is generated at the same time, the adaptation mapping script is mapped to the mapping bit space, and mapping association information of the adaptation mapping script is set according to the object display labels of the splicing space point adaptation information.
Therefore, whether the splicing space point adaptation information is adapted to the display object adaptation strategy or not can be judged according to the mapping association information, when the splicing space point adaptation information is adapted to the display object adaptation strategy, at least one updating parameter for updating the display object adaptation strategy is determined according to the updating process, so that the display object adaptation strategy is updated according to the at least one updating parameter, wherein the display object adaptation strategy comprises a control instruction corresponding to each unit area.
In a possible design, in step S130, this embodiment may locate the fluctuation range of each object related to the object to be displayed and determine an object control coordinate system corresponding to the object to be displayed. A fluctuation distance segment set is then determined according to the object control coordinate system, the dense control region of the object control coordinate system is extracted, and the centralized control range of the fluctuation distance segment set associated with the dense control region is extracted using a set threshold as the fluctuation interval.
Among them, the dense control region may be used to represent a region in the object control coordinate system formed such that the number of controllable coordinate points in the unit coordinate system is greater than a set number (e.g., 50).
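Dense-region extraction as described can be sketched by bucketing controllable coordinate points into unit cells of the coordinate system and keeping the cells above the threshold (the text suggests 50; the toy data below uses a smaller value). The bucketing scheme is an illustrative assumption.

```python
# Sketch of dense-control-region extraction: partition the object control
# coordinate system into unit cells and keep cells holding more
# controllable coordinate points than a set threshold. Truncation-based
# cell assignment is assumed (valid for non-negative coordinates).
from collections import Counter

def dense_regions(points, threshold):
    cells = Counter((int(x), int(y)) for x, y in points)
    return {cell for cell, count in cells.items() if count > threshold}
```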
Then, according to at least two related fluctuation distance segment coordinates in the centralized control range, a plurality of visual movement units are generated according to the visual movement direction in the visual area corresponding to the fluctuation distance segment coordinates, the overlapping area between all the visual areas in the next fluctuation distance segment coordinate and all the visual areas in the previous fluctuation distance segment coordinate is calculated, and a corresponding visual movement direction table is obtained according to each obtained overlapping area. Therefore, the visual moving units of which the visual moving directions are matched and the overlapping area between the visual areas of the two visual moving units is smaller than the maximum continuous overlapping area of the object control coordinate system in the overlapping area can be obtained according to the visual moving direction table to form the coordinate space of the fluctuation distance section.
On the basis, the coordinate space in the coordinate space of each fluctuation distance segment can be adapted to obtain an adapted interval of each adapted fluctuation distance segment coordinate space, a target object adapted area for display control of an object to be displayed is determined according to the adapted interval of each fluctuation distance segment coordinate space, and therefore a control instruction sequence aiming at each target object adapted area is determined according to a control instruction of a unit area corresponding to each target object adapted area in the displayed object adapted strategy.
In a possible design, for step S140, this embodiment generates, according to the determined target object adaptation areas, an adaptation control path and adaptation control identification information used when each target object adaptation area performs adaptation control over its corresponding control instruction sequence. Rendering nodes corresponding to the object to be displayed are then identified to obtain a plurality of rendering nodes, a rendering label corresponding to each rendering node is determined, and corresponding rendering action data is determined according to the rendering labels. The adaptation control path and the adaptation control identification information can thus be matched against the rendering action data to obtain a rendering label, and a rendering linkage amplitude between that label and each rendering label in the rendering action data is determined. Rendering adaptation parameters are then determined according to the adaptation control paths of the rendering labels ranked in the top N by rendering linkage amplitude, where N is a preset positive integer.
And then, determining a dynamic adjustment interval and a dynamic adjustment direction according to the rendering adaptation parameters, and executing object control operation on the object to be displayed according to the dynamic adjustment interval and the dynamic adjustment direction. Therefore, the situation that the virtual reality object cannot be effectively adapted to the application environment scene due to the fact that the control modes of the application environment scene and the virtual reality object in the virtual reality scene are not uniform can be avoided, and the rendering effect of the virtual reality object and the adaptation degree of the application environment scene are improved.
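The top-N selection in step S140 — ranking rendering labels by their rendering linkage amplitude relative to a target label and keeping the first N — can be sketched as below. Using the absolute difference of numeric label scores as the amplitude is an assumption made for illustration, not the patent's metric.

```python
# Hedged sketch of the top-N ranking in step S140: sort rendering labels
# by linkage amplitude to the target label and keep the first N. The
# numeric "scores" standing in for rendering labels are hypothetical.
def top_n_labels(target_score, label_scores, n):
    ranked = sorted(label_scores, key=lambda kv: abs(kv[1] - target_score))
    return [label for label, _ in ranked[:n]]
```

Python's `sorted` is stable, so labels with equal amplitude keep their original order.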
On the basis, in order to facilitate subsequent use, the present embodiment may update the first control information of the object service and the second control information of the scene service according to the result of the object control operation on the object to be displayed. Thus, the user does not need to adapt again if he selects the same configuration option during subsequent use.
Fig. 3 is a schematic functional module diagram of a virtual reality object control apparatus 300 according to an embodiment of the present application. In this embodiment, the virtual reality object control apparatus 300 may be divided into functional modules according to the foregoing method embodiments. For example, each functional block may correspond to a single function, or two or more functions may be integrated into one processing block. The integrated module may be implemented in hardware or as a software functional module. It should be noted that the division of the modules in the present application is schematic and is only a logical function division; there may be other division manners in actual implementation. For example, when the functional modules are divided according to the respective functions, the virtual reality object control apparatus 300 shown in fig. 3 is only one possible apparatus diagram. The virtual reality object control apparatus 300 may include an obtaining module 310, a first determining module 320, a second determining module 330, and an object control module 340. The functions of these functional modules are described in detail below.
The obtaining module 310 is configured to obtain a corresponding virtual reality configuration file from the server 200 when receiving the virtual reality display request, and obtain first control information of an object service corresponding to each object to be displayed in the virtual reality configuration file and second control information of a scene service corresponding to the virtual reality configuration file.
The first determining module 320 is configured to compare the control node association sets between the first control information and the second control information, and determine, when there is a conflict in control behaviors of any at least two associated control nodes in the control node association sets, a display object adaptation policy of the scene service for a target associated control node where the conflict exists.
The second determining module 330 is configured to determine, according to the display object adaptation policy, a target object adaptation area for performing display control on an object to be displayed and a control instruction sequence for each target object adaptation area.
And the object control module 340 is configured to execute an object control operation on the object to be displayed according to the determined target object adaptation areas and the control instruction sequence for each target object adaptation area.
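The four modules enumerated above can be mirrored by a skeleton class, one method per module. The method bodies below are placeholders invented for illustration; only the module-to-method correspondence follows the text.

```python
# Illustrative skeleton mirroring the four functional modules of the
# apparatus 300; the bodies are toy placeholders, not the real logic.
class VirtualRealityObjectControlApparatus:
    def obtain(self, request):                      # obtaining module 310
        """Fetch the configuration file plus first/second control info."""
        return {"first": {}, "second": {}}

    def determine_strategy(self, first, second):    # first determining 320
        """Compare association sets; resolve conflicts (scene wins here)."""
        return {k: second[k] for k in first if k in second}

    def determine_areas(self, strategy):            # second determining 330
        """Map the adaptation strategy to areas with command sequences."""
        return [(k, [v]) for k, v in strategy.items()]

    def control(self, areas):                       # object control 340
        """Execute the command sequence of every adaptation area."""
        return [cmd for _, seq in areas for cmd in seq]
```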
Further, fig. 4 is a schematic structural diagram of a virtual reality device 100 for executing the virtual reality object control method according to the embodiment of the present application. As shown in fig. 4, the virtual reality device 100 may include a network interface 110, a machine-readable storage medium 120, a processor 130, and a bus 140. The processor 130 may be one or more, and one processor 130 is illustrated in fig. 4 as an example. The network interface 110, the machine-readable storage medium 120, and the processor 130 may be connected by a bus 140 or otherwise, as exemplified by the connection by the bus 140 in fig. 4.
The machine-readable storage medium 120 is a computer-readable storage medium and can be used to store software programs, computer-executable programs, and modules, such as the program instructions/modules corresponding to the virtual reality object control method in the embodiment of the present application (for example, the obtaining module 310, the first determining module 320, the second determining module 330, and the object control module 340 shown in fig. 3). The processor 130 executes the various functional applications and data processing of the terminal device by running the software programs, instructions, and modules stored in the machine-readable storage medium 120, that is, implements the above-mentioned virtual reality object control method, and details are not described herein again.
The machine-readable storage medium 120 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and the application program required for at least one function, and the data storage area may store data created according to the use of the terminal, and the like. Further, the machine-readable storage medium 120 may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be Random Access Memory (RAM), which acts as an external cache. By way of example, but not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), and Direct Rambus RAM (DR RAM). It should be noted that the memories of the systems and methods described herein are intended to comprise, without limitation, these and any other suitable types of memory. In some examples, the machine-readable storage medium 120 may further include memory located remotely from the processor 130, which may be connected to the virtual reality device 100 over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The processor 130 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method embodiments may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 130. The processor 130 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the various methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like. The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor.
The virtual reality device 100 can interact with other devices (e.g., the server 200) through the network interface 110. The network interface 110 may be a circuit, a bus, a transceiver, or any other apparatus that may be used to exchange information. The processor 130 may send and receive information using the network interface 110.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, virtual reality device, or data center to another website, computer, virtual reality device, or data center by wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave, etc.) means. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a virtual reality device or a data center, that integrates one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), a semiconductor medium (e.g., Solid State Disk (SSD)), or the like.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the embodiments of the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the embodiments of the present application fall within the scope of the claims of the present application and their equivalents, the present application is also intended to encompass such modifications and variations.

Claims (7)

1. A virtual reality system is characterized by comprising virtual reality equipment and a server in communication connection with the virtual reality equipment, wherein the server accesses information stored in the virtual reality equipment through a network, the server is implemented on a cloud platform, and the network comprises a wired or wireless network access point;
the virtual reality equipment is used for acquiring a corresponding virtual reality configuration file from the server when receiving a virtual reality display request, and acquiring first control information of an object service corresponding to each object to be displayed in the virtual reality configuration file and second control information of a scene service corresponding to the virtual reality configuration file, wherein the first control information and the second control information are respectively used for representing control instructions of each control link;
the virtual reality device is configured to compare control node association sets between the first control information and the second control information, and when control behaviors of any at least two associated control nodes in the control node association sets conflict, determine a display object adaptation policy of the scene service for a target associated control node with conflict, where a target object adaptation region is a region for adaptive control of an object to be displayed in a specific display process;
the virtual reality device is configured to determine, according to the display object adaptation policy, target object adaptation regions for display control of the object to be displayed and a control instruction sequence for each target object adaptation region;
the virtual reality device is configured to perform an object control operation on the object to be displayed according to the determined target object adaptation regions and the control instruction sequence for each target object adaptation region;
the method for the virtual reality device to specifically determine the display object adaptation policy of the scene service for the target associated control node with the conflict includes:
acquiring a current scene space corresponding to the target associated control node with conflict from the scene service;
calculating a first scene linkage space where the current scene space is located according to a preset display object adaptation matrix, performing simulation updating on the range of the first scene linkage space, and acquiring a second scene linkage space where the current scene space is located so as to enable the second scene linkage space to be an initial scene linkage space of a next scene space;
taking the next scene space as the current scene space, updating the preset display object adaptation matrix to obtain an updated display object adaptation matrix, and performing linkage updating on the initial scene linkage space corresponding to the current scene space according to the updated display object adaptation matrix to obtain the initial scene linkage space corresponding to the next scene space until all scene objects in the scene space are subjected to simulated linkage, so as to obtain a simulated linkage result;
calculating a corresponding dynamic adaptation function according to an initial simulation linkage parameter, the linkage times of each scene object in the scene space, the total linkage times of each scene object and a region configuration parameter of the initial scene linkage space;
and outputting the dynamic adaptation function, the simulation linkage result and the scene parameters of the scene space as the display object adaptation strategy of the control node association set.
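The claim describes this iterative derivation only functionally. As a loose illustration of the loop over scene spaces, the following Python sketch uses invented assumptions throughout: the "adaptation matrix" is reduced to a scalar factor, the simulated range update is a simple increment, and all names are hypothetical rather than taken from the patent:

```python
def simulate_linkage(scene_spaces, matrix, update):
    """Illustrative sketch of the claimed loop: for each scene space, derive a
    first linkage space from the current adaptation 'matrix' (here a scalar),
    simulate a range update to obtain the second linkage space, record it as
    the next space's initial linkage space, then update the matrix."""
    result = []
    for space in scene_spaces:
        first = space * matrix   # first scene linkage space (assumed rule)
        second = first + 1       # simulated range update -> second linkage space
        result.append(second)    # initial scene linkage space of the next space
        matrix = update(matrix)  # updated display object adaptation matrix
    return result

# Hypothetical usage: three scene spaces, matrix doubling each round
assert simulate_linkage([1, 2, 3], 2, lambda m: m * 2) == [3, 9, 25]
```

The point of the sketch is only the control flow: the matrix is updated once per scene space, and each iteration's output seeds the next iteration's initial linkage space.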
2. The virtual reality system according to claim 1, wherein the manner in which the virtual reality device calculates the corresponding dynamic adaptation function according to the initial simulation linkage parameter, the number of linkages of each scene object in the scene space, the total number of linkages of each scene object, and the region configuration parameter of the initial scene linkage space comprises:
acquiring a plurality of simulated linkage space coordinates according to the initial simulated linkage parameters, and acquiring a linkage coordinate value of each simulated linkage space coordinate in the plurality of simulated linkage space coordinates;
acquiring simulation linkage aggregation head information of each simulation linkage space coordinate according to the linkage coordinate value of each simulation linkage space coordinate and a simulation linkage interval before simulation linkage of each simulation linkage space coordinate, wherein the simulation linkage aggregation head information comprises the simulation linkage interval and the number of times of linkage and the total number of times of linkage of each corresponding scene object;
calculating to obtain an initial value of the simulated linkage interval of each simulated linkage space coordinate according to the simulated linkage type of each simulated linkage space coordinate and the simulated linkage interval of each simulated linkage space coordinate;
inquiring a simulation linkage information table to obtain coordinate offset adaptive parameters of the plurality of simulation linkage space coordinates according to the simulation linkage interval initial value of each simulation linkage space coordinate and the corresponding times of linkage and total linkage times of each scene object;
determining parameter fusion information between coordinate offset adaptation parameters of the plurality of simulated linkage space coordinates and region configuration parameters of the initial scene linkage space to obtain a plurality of parameter fusion information;
calculating simulation linkage results of a plurality of parameter fusion information and corresponding simulation linkage control parameters, and processing the simulation linkage control parameters according to a simulation linkage process node sequence in the simulation linkage results to obtain a plurality of simulation linkage control parameter sets;
sequentially extracting simulation linkage adaptation processes in the plurality of simulation linkage control parameter sets, taking matching targets in the plurality of simulation linkage adaptation processes as simulation linkage targets, and respectively and sequentially generating a simulation linkage set corresponding to each simulation linkage target according to the simulation linkage adaptation processes;
respectively matching the linkage amplitude between each matching target in the simulation linkage adaptation process with each simulation linkage set, wherein the linkage amplitude corresponds to the absolute value of the difference between the sequence maximum value and the sequence minimum value of each simulation linkage set;
setting corresponding simulation linkage adaptation nodes for each simulation linkage set according to linkage amplitude matched with each simulation linkage set, carrying out association configuration on the simulation linkage sets provided with the simulation linkage adaptation nodes according to the simulation linkage adaptation process, and applying the simulation linkage sets which are subjected to the association configuration to corresponding simulation linkage controls according to the types of simulation linkage control parameter sets corresponding to the simulation linkage sets which are subjected to the association configuration to obtain target simulation linkage controls;
and combining the dynamic adaptation functions of each target simulation linkage control to obtain the corresponding dynamic adaptation functions.
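The one quantity this claim defines concretely is the linkage amplitude: the absolute difference between the sequence maximum and minimum of a simulation linkage set. A minimal sketch, in which the adaptation-node assignment rule and all names are hypothetical additions for illustration:

```python
def linkage_amplitude(linkage_set):
    # As defined in the claim: |sequence maximum - sequence minimum|
    return abs(max(linkage_set) - min(linkage_set))

def assign_adaptation_nodes(linkage_sets):
    # Hypothetical rule: key each simulation linkage set by its index and
    # tag it with the amplitude it matched (the patent leaves this open)
    return {i: linkage_amplitude(s) for i, s in enumerate(linkage_sets)}

assert linkage_amplitude([3, 7, 5]) == 4
assert assign_adaptation_nodes([[1, 4], [10, 2]]) == {0: 3, 1: 8}
```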
3. The virtual reality system according to claim 1, wherein the manner in which the virtual reality device outputs the dynamic adaptation function, the simulation linkage result, and the scene parameters of the scene space as the display object adaptation policy of the control node association set comprises:
performing function substitution on each simulation linkage node in the simulation linkage result according to the dynamic adaptation function to determine a simulation linkage adaptation coordinate point of each simulation linkage node, determining main linkage adaptation configuration information of the simulation linkage node according to the simulation linkage adaptation coordinate point, finding out slave linkage adaptation configuration information corresponding to the simulation linkage node based on the main linkage adaptation configuration information, and combining each simulation linkage node into at least one spatial configuration relation chain according to the slave linkage adaptation configuration information;
acquiring chain simulation linkage parameters which correspond to each spatial configuration relationship chain and are used for representing simulation linkage of each spatial configuration relationship chain based on each spatial configuration relationship chain;
determining linkage control information of each spatial configuration relation chain when the simulated linkage node is linked according to the chain simulation linkage parameters, and splicing each spatial configuration relation chain according to the linkage control logic relation of each linkage control information to obtain a spliced spatial configuration relation chain;
extracting corresponding splicing space point adaptation information according to the splicing space points on the splicing space configuration relation chain, grouping the splicing space point adaptation information according to different object display labels, calculating adaptation adjustment information of each object display label, and selecting a splicing space point adaptation interval according to the adaptation adjustment information;
when an updating process for updating the adaptation strategy of the display object is generated in the adaptation information of the splicing space point according to the adaptation interval of the splicing space point, an adaptation mapping script corresponding to the adaptation interval of the splicing space point is obtained according to the updating process;
generating a mapping bit space for recording the adaptive mapping script, mapping the adaptive mapping script to the mapping bit space, and setting mapping associated information of the adaptive mapping script according to an object display tag of the splicing space point adaptive information;
judging whether the splicing space point adaptation information is adapted to the display object adaptation strategy or not according to the mapping association information, and determining at least one updating parameter for updating the display object adaptation strategy according to the updating process when the splicing space point adaptation information is adapted to the display object adaptation strategy so as to update the display object adaptation strategy according to the at least one updating parameter, wherein the display object adaptation strategy comprises a control instruction corresponding to each unit area.
4. The virtual reality system according to claim 1, wherein the manner in which the virtual reality device determines, according to the display object adaptation policy, the target object adaptation regions for display control of the object to be displayed and the control instruction sequence for each target object adaptation region comprises:
positioning each object fluctuation range related to the object to be displayed, and determining an object control coordinate system corresponding to the object to be displayed;
determining a fluctuation distance segment set according to the object control coordinate system, extracting a dense control area of the object control coordinate system, taking a set threshold as a fluctuation interval, and extracting a centralized control range of the dense control area associated with the fluctuation distance segment set, wherein the dense control area is used for representing an area formed by the fact that the number of controllable coordinate points in a unit coordinate system in the object control coordinate system is larger than a set number;
generating a plurality of visual movement units according to visual movement directions in the visual areas corresponding to the fluctuation distance section coordinates according to at least two related fluctuation distance section coordinates in the centralized control range, calculating overlapping areas between all the visual areas in the next fluctuation distance section coordinate and all the visual areas in the previous fluctuation distance section coordinate, and obtaining a corresponding visual movement direction table according to each obtained overlapping area;
according to the visual movement direction table, acquiring a visual movement unit of which the visual movement directions are matched and the overlapping area between the visual areas of the two visual movement units is smaller than the maximum continuous overlapping area of the object control coordinate system in the overlapping area so as to form a fluctuation distance section coordinate space;
adapting the coordinate space in each fluctuation distance section coordinate space to obtain an adapted interval of each adapted fluctuation distance section coordinate space, and determining a target object adapted area for controlling the display of the object to be displayed according to the adapted interval of each fluctuation distance section coordinate space;
and determining a control instruction sequence aiming at each target object adaptation area according to the control instruction of the unit area corresponding to each target object adaptation area in the display object adaptation strategy.
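The dense control area in this claim is defined as a unit region of the object control coordinate system containing more than a set number of controllable coordinate points. A sketch of that test, with the cell size, threshold, and binning scheme all chosen for illustration rather than disclosed by the patent:

```python
from collections import Counter

def dense_control_areas(points, cell=1.0, min_count=3):
    """Bucket controllable coordinate points into unit cells of the object
    control coordinate system and keep cells whose point count exceeds the
    set number (cell size and threshold are assumed, not from the patent)."""
    cells = Counter((int(x // cell), int(y // cell)) for x, y in points)
    return {c for c, n in cells.items() if n > min_count}

# Hypothetical usage: four points cluster in cell (0, 0), one outlier elsewhere
pts = [(0.1, 0.2), (0.5, 0.9), (0.3, 0.3), (0.8, 0.1), (5.0, 5.0)]
assert dense_control_areas(pts, cell=1.0, min_count=3) == {(0, 0)}
```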
5. The virtual reality system according to claim 1, wherein the manner in which the virtual reality device performs the object control operation on the object to be displayed according to the determined target object adaptation regions and the control instruction sequence for each target object adaptation region comprises:
generating an adaptation control path and adaptation control identification information when the target object adaptation area performs adaptation control on the corresponding control instruction sequence according to the determined target object adaptation area;
identifying rendering nodes corresponding to the object to be displayed to obtain a plurality of rendering nodes, determining rendering labels corresponding to the rendering nodes, and determining corresponding rendering action data according to the rendering labels;
identifying the adaptation control path and the adaptation control identification information to the rendering action data to obtain rendering labels, determining rendering linkage amplitude between the rendering labels and each rendering label in the rendering action data, and determining rendering adaptation parameters of the rendering labels according to the adaptation control path of the rendering label corresponding to N rendering labels before the rendering linkage amplitude is sequenced, wherein N is a preset positive integer;
and determining a dynamic adjustment interval and a dynamic adjustment direction according to the rendering adaptation parameters, and executing object control operation on the object to be displayed according to the dynamic adjustment interval and the dynamic adjustment direction.
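This claim selects the N rendering labels that rank first by rendering linkage amplitude. The ranking step can be sketched as follows; the dictionary representation and all names are illustrative assumptions, since the patent does not fix a data layout:

```python
def top_n_render_labels(amplitudes, n):
    """Return the n rendering labels ranked highest by rendering linkage
    amplitude (label -> amplitude mapping is an assumed representation)."""
    ranked = sorted(amplitudes.items(), key=lambda kv: kv[1], reverse=True)
    return [label for label, _ in ranked[:n]]

# Hypothetical usage: keep the two labels with the largest amplitudes
assert top_n_render_labels({"pose": 0.9, "light": 0.4, "mesh": 0.7}, 2) == ["pose", "mesh"]
```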
6. The virtual reality system according to any one of claims 1 to 5, wherein the virtual reality device is further configured to update the first control information of the object service and the second control information of the scene service according to an object control operation result on the object to be displayed.
7. A virtual reality object control apparatus, applied to a virtual reality device, wherein the virtual reality device is communicatively connected to a server, the server accesses information stored in the virtual reality device through a network, the server is implemented on a cloud platform, and the network comprises a wired or wireless network access point, the apparatus comprising:
an acquisition module, configured to, upon receiving a virtual reality display request, acquire a corresponding virtual reality configuration file from the server, and to acquire first control information of an object service corresponding to each object to be displayed in the virtual reality configuration file and second control information of a scene service corresponding to the virtual reality configuration file, the first control information and the second control information respectively representing the control instructions of each control link;
a first determining module, configured to compare a control node association set between the first control information and the second control information and, when the control behaviors of any at least two associated control nodes in the control node association set conflict, determine a display object adaptation policy of the scene service for the target associated control node having the conflict;
a second determining module, configured to determine, according to the display object adaptation policy, a target object adaptation area for performing display control on the object to be displayed and a control instruction sequence for each target object adaptation area, where the target object adaptation area is an area for performing adaptation control on the object to be displayed in a specific display process;
the object control module is used for executing object control operation on the object to be displayed according to the determined target object adaptation areas and the control instruction sequence aiming at each target object adaptation area;
the manner in which the first determining module determines the display object adaptation policy of the scene service for the target associated control node having the conflict comprises:
acquiring a current scene space corresponding to the target associated control node with conflict from the scene service;
calculating a first scene linkage space where the current scene space is located according to a preset display object adaptation matrix, performing simulation updating on the range of the first scene linkage space, and acquiring a second scene linkage space where the current scene space is located so as to enable the second scene linkage space to be an initial scene linkage space of a next scene space;
taking the next scene space as the current scene space, updating the preset display object adaptation matrix to obtain an updated display object adaptation matrix, and performing linkage updating on the initial scene linkage space corresponding to the current scene space according to the updated display object adaptation matrix to obtain the initial scene linkage space corresponding to the next scene space until all scene objects in the scene space are subjected to simulated linkage, so as to obtain a simulated linkage result;
calculating a corresponding dynamic adaptation function according to an initial simulation linkage parameter, the linkage times of each scene object in the scene space, the total linkage times of each scene object and a region configuration parameter of the initial scene linkage space;
and outputting the dynamic adaptation function, the simulation linkage result and the scene parameters of the scene space as the display object adaptation strategy of the control node association set.
CN202010860811.4A 2020-03-10 2020-03-10 Virtual reality system and virtual reality object control device Active CN112015272B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010860811.4A CN112015272B (en) 2020-03-10 2020-03-10 Virtual reality system and virtual reality object control device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010159540.XA CN111367414B (en) 2020-03-10 2020-03-10 Virtual reality object control method and device, virtual reality system and equipment
CN202010860811.4A CN112015272B (en) 2020-03-10 2020-03-10 Virtual reality system and virtual reality object control device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202010159540.XA Division CN111367414B (en) 2020-03-10 2020-03-10 Virtual reality object control method and device, virtual reality system and equipment

Publications (2)

Publication Number Publication Date
CN112015272A CN112015272A (en) 2020-12-01
CN112015272B true CN112015272B (en) 2022-03-25

Family

ID=71206781

Family Applications (3)

Application Number Title Priority Date Filing Date
CN202010159540.XA Active CN111367414B (en) 2020-03-10 2020-03-10 Virtual reality object control method and device, virtual reality system and equipment
CN202010860807.8A Active CN112015271B (en) 2020-03-10 2020-03-10 Virtual reality control method and device based on cloud platform and virtual reality equipment
CN202010860811.4A Active CN112015272B (en) 2020-03-10 2020-03-10 Virtual reality system and virtual reality object control device

Family Applications Before (2)

Application Number Title Priority Date Filing Date
CN202010159540.XA Active CN111367414B (en) 2020-03-10 2020-03-10 Virtual reality object control method and device, virtual reality system and equipment
CN202010860807.8A Active CN112015271B (en) 2020-03-10 2020-03-10 Virtual reality control method and device based on cloud platform and virtual reality equipment

Country Status (1)

Country Link
CN (3) CN111367414B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112817447B (en) * 2021-01-25 2024-05-07 暗物智能科技(广州)有限公司 AR content display method and system
CN115981517B (en) * 2023-03-22 2023-06-02 北京同创蓝天云科技有限公司 VR multi-terminal cooperative interaction method and related equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108011886A (en) * 2017-12-13 2018-05-08 上海曼恒数字技术股份有限公司 A kind of cooperative control method, system, equipment and storage medium
CN108665553A (en) * 2018-04-28 2018-10-16 腾讯科技(深圳)有限公司 A kind of method and apparatus for realizing virtual scene conversion
CN109741463A (en) * 2019-01-02 2019-05-10 京东方科技集团股份有限公司 Rendering method, device and the equipment of virtual reality scenario
CN109861948A (en) * 2017-11-30 2019-06-07 腾讯科技(成都)有限公司 Virtual reality data processing method, device, storage medium and computer equipment
CN110519247A (en) * 2019-08-16 2019-11-29 上海乐相科技有限公司 A kind of one-to-many virtual reality display method and device

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4936522B2 (en) * 2006-09-29 2012-05-23 キヤノン株式会社 Image processing method and image processing apparatus
US20170154468A1 (en) * 2015-12-01 2017-06-01 Le Holdings (Beijing) Co., Ltd. Method and electronic apparatus for constructing virtual reality scene model
CN106210861B (en) * 2016-08-23 2020-08-07 上海幻电信息科技有限公司 Method and system for displaying bullet screen
CN106527709B (en) * 2016-10-28 2020-10-02 Tcl移动通信科技(宁波)有限公司 Virtual scene adjusting method and head-mounted intelligent device
EP3340187A1 (en) * 2016-12-26 2018-06-27 Thomson Licensing Device and method for generating dynamic virtual contents in mixed reality
CN106774941B (en) * 2017-01-16 2019-11-19 福建农林大学 Touch screen terminal 3D virtual role moves the solution to conflict with scene camera
CN107168780B (en) * 2017-04-06 2020-09-08 北京小鸟看看科技有限公司 Virtual reality scene loading method and equipment and virtual reality equipment
US10268263B2 (en) * 2017-04-20 2019-04-23 Microsoft Technology Licensing, Llc Vestibular anchoring
CN107885326A (en) * 2017-10-26 2018-04-06 中国电子科技集团公司第二十八研究所 Smart city planning and designing method based on HoloLens
EP3540566B1 (en) * 2018-03-12 2022-06-08 Nokia Technologies Oy Rendering a virtual scene
US11164377B2 (en) * 2018-05-17 2021-11-02 International Business Machines Corporation Motion-controlled portals in virtual reality
CN109508093B (en) * 2018-11-13 2022-08-09 江苏视睿迪光电有限公司 Virtual reality interaction method and device
CN109600525B (en) * 2018-11-15 2021-01-05 中国联合网络通信集团有限公司 Virtual reality-based call center control method and device
CN110187853B (en) * 2019-04-18 2023-01-03 北京奇艺世纪科技有限公司 Picture conflict recognition method, device, picture processing device and medium
CN110096156B (en) * 2019-05-13 2021-06-15 东北大学 Virtual reloading method based on 2D image
CN110478898B (en) * 2019-08-12 2024-03-15 网易(杭州)网络有限公司 Configuration method and device of virtual scene in game, storage medium and electronic equipment

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109861948A (en) * 2017-11-30 2019-06-07 腾讯科技(成都)有限公司 Virtual reality data processing method, device, storage medium and computer equipment
CN108011886A (en) * 2017-12-13 2018-05-08 上海曼恒数字技术股份有限公司 A kind of cooperative control method, system, equipment and storage medium
CN108665553A (en) * 2018-04-28 2018-10-16 腾讯科技(深圳)有限公司 A kind of method and apparatus for realizing virtual scene conversion
CN109741463A (en) * 2019-01-02 2019-05-10 京东方科技集团股份有限公司 Rendering method, device and the equipment of virtual reality scenario
CN110519247A (en) * 2019-08-16 2019-11-29 上海乐相科技有限公司 A kind of one-to-many virtual reality display method and device

Also Published As

Publication number Publication date
CN112015271A (en) 2020-12-01
CN112015272A (en) 2020-12-01
CN112015271B (en) 2022-03-25
CN111367414B (en) 2020-10-13
CN111367414A (en) 2020-07-03

Similar Documents

Publication Publication Date Title
CN111352670B (en) Virtual reality scene loading method and device, virtual reality system and equipment
CN111104291B (en) Environment monitoring method, device and system based on Internet of things and monitoring server
CN112015272B (en) Virtual reality system and virtual reality object control device
CN111132145B (en) Network communication safety monitoring method, device, server and network communication system
CN111209336A (en) Data distribution method and device based on block chain and server
WO2019141225A1 (en) Conflict management method and system for multiple mobile robots
US11832147B2 (en) Profiling location information and network traffic density from telemetry data
CN111367528B (en) Compiling method and device of software development kit, software development system and server
CN112035490B (en) Electric vehicle information monitoring method, device and system based on cloud platform
CN111249106B (en) Training control device and system of old people rehabilitation robot
CN112491985A (en) Remote meter reading data processing method, gas meter system and gas cloud platform
CN112181562A (en) Browser view loading method, device and system
CN111107162B (en) Indoor positioning data processing method, device and system based on Internet of things
CN109271158A (en) A kind of method and its system realized based on graphic programming platform with buyun variable
CN112417668B (en) Ecological protection intelligent early warning method and device and server
CN111209509B (en) Information display method and device based on big data platform and big data platform
CN111178209B (en) Nuclear magnetic resonance interaction processing method and device and nuclear magnetic resonance interaction system
CN111652377A (en) Robot learning method, device and medium based on block chain
CN115617328B (en) Application generation method, system, equipment and medium for public support platform
CN114216451B (en) Robot map updating method and device
CN111988187A (en) Internet connection method and device of central management server
CN113505807A (en) Training method, device and system for security check equipment and storage medium
CN112215744A (en) Image data processing method, device and system based on cloud platform
CN114355939A (en) Path planning method and device of movable equipment and navigation system
CN112613535A (en) Water quality detection control method, device and platform

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220303

Address after: 100089 No. 516, 5 / F, complex building a 1, Qinghe Yongtai Park, Haidian District, Beijing

Applicant after: Beijing Euro Software Technology Development Co.,Ltd.

Address before: 264004 room 1607, building B3, 1861 cultural and Creative Industry Park, 7 Tongshi South Road, Zhifu District, Yantai City, Shandong Province

Applicant before: Jian Jibo

GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Virtual reality systems and virtual reality object control devices

Granted publication date: 20220325

Pledgee: Beijing Bank Co.,Ltd. Beiyuan Road Branch

Pledgor: Beijing Euro Software Technology Development Co.,Ltd.

Registration number: Y2024110000120