CN111831353A - OpenXR standard-based runtime library, data interaction method, device and medium - Google Patents

OpenXR standard-based runtime library, data interaction method, device and medium

Info

Publication number
CN111831353A
CN111831353A (application number CN202010655829.0A)
Authority
CN
China
Prior art keywords
client
application program
module
runtime
library
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010655829.0A
Other languages
Chinese (zh)
Other versions
CN111831353B (en
Inventor
李孟臻
范承鑫
赵凯
李岩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Parallel Cloud Technology Beijing Co ltd
Original Assignee
Parallel Cloud Technology Beijing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Parallel Cloud Technology Beijing Co ltd filed Critical Parallel Cloud Technology Beijing Co ltd
Priority to CN202010655829.0A priority Critical patent/CN111831353B/en
Publication of CN111831353A publication Critical patent/CN111831353A/en
Application granted granted Critical
Publication of CN111831353B publication Critical patent/CN111831353B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/445 Program loading or initiating
    • G06F9/44521 Dynamic linking or loading; Link editing at or after load time, e.g. Java class loading
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46 Multiprogramming arrangements
    • G06F9/54 Interprogram communication
    • G06F9/544 Buffers; Shared memory; Pipes
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The application provides an OpenXR standard-based runtime library, a data interaction method, a device, and a medium, wherein the runtime library comprises: an identification acquisition module, configured to acquire a unique identifier matched with a VR/AR application program request sent by a client, the runtime library being loaded by the VR/AR application program when the application program is started; a configuration information reading module, configured to read the configuration information in the corresponding named shared memory according to the acquired unique identifier; a network communication module, configured to establish a UDP socket link with the client; and a video stream generation module, configured to generate a video stream and send it to the network communication module. By creating the unique identifier, the application achieves one-to-one binding between a VR/AR application program and a client, so that different VR/AR application programs requested by multiple clients can be started simultaneously on one PC without confusion, effectively saving system resources and cost.

Description

OpenXR standard-based runtime library, data interaction method, device and medium
Technical Field
The application relates to the technical field of virtual reality, in particular to an OpenXR standard-based runtime library, a data interaction method, equipment and a medium.
Background
VR (Virtual Reality) and AR (Augmented Reality) have attracted wide attention in recent years. However, although a large number of hardware and software companies have redoubled their efforts in this field, the growing number of devices, each with its own incompatible API and SDK, is making the fragmentation problem increasingly severe.
API fragmentation forces application developers to spend a great deal of time, money, and resources integrating with various hardware in order to be compatible with more devices. Even large teams are forced to choose which platforms and devices to support, and for small teams the problem is more severe still: lacking the money and resources of a large team, they are squeezed out, which leads to a severe polarization of the VR and AR markets and greatly harms the richness and diversity of content.
To address this, Valve introduced OpenVR, a common API for VR/AR devices, which largely solves the problem of API fragmentation. However, the OpenVR runtime library, SteamVR, is not open source, so there is no opportunity for secondary development and some customized requirements cannot be met. In addition, OpenVR limits a computer to running only one VR/AR application program instance at a time, which wastes computer resources: even if one computer has enough resources to run five VR/AR applications of equal resource consumption, only one can run once OpenVR is chosen. If a server is set up for a VR/AR application, that server can then serve only one user at a time.
Currently, some enterprises have developed runtime libraries according to the OpenXR specification; for example, Microsoft and Oculus have each released an OpenXR-based runtime. However, Microsoft's runtime can only connect to its own HoloLens series of VR/AR devices, and the Oculus runtime likewise targets only its own devices, so the compatibility of these runtime libraries is poor. In addition, traditional VR/AR mostly uses a wired connection, that is, an HDMI or DP cable connecting the VR/AR device to the PC, so the user cannot move about freely; convenience of use is poor and the user experience suffers.
Disclosure of Invention
In view of this, the present application provides a runtime library, a data interaction method, a device, and a medium based on the OpenXR standard, aiming to provide a runtime library with better versatility and compatibility, so that users can enjoy VR/AR content on different VR/AR devices.
To achieve this purpose, the technical solution adopted by the present application is as follows:
in a first aspect, the present application provides an OpenXR-standard-based runtime library, where the runtime library is in a PC, the PC further includes a server, a VR/AR application, and a named shared memory, and the runtime library includes:
an identification acquisition module, configured to acquire a unique identifier, where the unique identifier is matched with a VR/AR application program request sent by a client; the runtime library is loaded by the VR/AR application program when the application program is started; the unique identifier identifies the named shared memory, into which the configuration information of the client that sent the VR/AR application program request has been written, the configuration information including the client's field-of-view information, resolution, frame rate, interpupillary distance information, and height information;
the configuration information reading module is used for reading the configuration information in the corresponding named shared memory according to the acquired unique identifier;
the network communication module is used for establishing a UDP socket link with the client, receiving attitude data from the client and sending the generated video stream to the client;
and the video stream generation module is used for processing the received attitude data, generating a video stream and sending the video stream to the network communication module.
Optionally, the video stream generating module specifically includes:
the data operation module is used for performing matrix rotation and matrix translation operation on the posture data, the interpupillary distance information and the height information to obtain processed posture data of the left eye and the right eye and providing the processed posture data to the VR/AR application program;
the texture extraction module is used for extracting the textures of the left eye and the right eye after the VR/AR application program finishes rendering according to the obtained processed attitude data and sending the extracted textures to the rendering and coding module;
and a rendering and encoding module, configured to re-render the received textures, encode and package the rendered textures into a video stream, and submit the video stream to the network communication module.
Optionally, the runtime library further includes:
the buffer queue module is used for buffering the attitude data received from the client;
the monitoring module is used for monitoring the length of the attitude data buffer queue;
and the refreshing module is used for adjusting the frequency of the attitude information provided to the VR/AR application program according to the refresh rate of the client after the runtime library receives the attitude data.
Optionally, the runtime library further includes:
the first key set module is used for storing a handle key action set concerned by the VR/AR application program;
the second key set module is used for storing a handle key action set which can be sent by the client;
and the key action sending module is used for taking the intersection of the key actions stored in the first key set module and the second key set module and providing the intersection to the VR/AR application program.
In a second aspect, the present application provides a data interaction method for an OpenXR-standard-based runtime library, where the runtime library is in a PC, the PC further includes a server, a VR/AR application, and a named shared memory, and the method includes:
a server in the PC receives a VR/AR application program request from a client, wherein the VR/AR application program request comprises configuration information of the client, and the configuration information comprises field angle information, resolution ratio, frame rate, interpupillary distance information and height information of the client;
the server analyzes the configuration information in the VR/AR application program request, writes the analyzed configuration information into a named shared memory, and creates a unique identifier matched with the VR/AR application program request to identify the named shared memory;
the runtime library is loaded by the VR/AR application program requested by the client;
the runtime library acquires the unique identifier matched with the VR/AR application program request, and reads the configuration information in the corresponding shared memory through the unique identifier;
after a UDP socket link is established between the runtime library and the client, the client sends attitude data to the runtime library;
and the runtime library receives and processes the attitude data, generates a video stream, and sends the video stream to the client.
Optionally, the specific method for receiving and processing the attitude data by the runtime library, generating a video stream, and sending the video stream to the client includes:
the runtime library performs matrix rotation and matrix translation operations on the attitude data, the interpupillary distance information, and the height information to obtain processed attitude data for the left and right eyes, and provides the processed attitude data to the VR/AR application program;
the VR/AR application program renders the left eye texture and the right eye texture according to the obtained processed posture data, generates texture information and sends the texture information to the runtime library;
the runtime library receives the texture information and extracts the textures from it;
and the runtime library re-renders the received texture, encodes and packs the rendered texture into a video stream, and sends the video stream to the client.
Optionally, after receiving the attitude data, the runtime library further includes:
and putting the attitude data into a buffer queue.
Optionally, the specific method for establishing the UDP socket link between the client and the runtime library includes:
the runtime library starts the UDP socket service and provides the client with the port number bound to the UDP protocol;
and the client is connected with the runtime library according to the port number.
In a third aspect, an embodiment of the present application further provides an apparatus, including: a processor, a memory and a communication unit;
the memory stores machine-readable instructions executable by the processor, the processor and the memory communicating through the communication unit when the device is operating;
wherein the processor executes the machine-readable instructions to perform the methods of the various aspects described above.
In a fourth aspect, the present application further provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to perform the method in the above aspects.
The beneficial effects of the present application are:
1. By creating the unique identifier, the present application achieves one-to-one binding between a VR/AR application program and a client, so that different VR/AR application programs requested by multiple clients can be started on one PC without confusion, effectively saving system resources and cost;
2. The runtime library of the present application communicates with the client through the UDP protocol, so any client capable of establishing a UDP communication link with the runtime library can connect to it; the manufacturer, model, and software/hardware configuration of the client are not specifically limited, giving the runtime library better universality and compatibility.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained from the drawings without inventive effort.
FIG. 1 is a block diagram of an architecture of a runtime of the present application based on the OpenXR standard;
FIG. 2 is a block diagram of a video stream generation module according to the present application;
fig. 3 is a flowchart of a data interaction method of the runtime library based on the OpenXR standard according to the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments.
The runtime library and a VR/AR (VR or AR) application program are in a one-to-one relationship: whenever a VR/AR application program is started, it loads the runtime library once, and each loading creates an independent space in memory for that runtime library; that is, the runtime libraries loaded each time are independent of one another. Suppose 3 VR/AR applications are launched, A, B, and C, and there are 3 clients, i.e., 3 pairs of VR/AR glasses. If the first pair of glasses starts application A, it should receive the images collected and encoded by the runtime library loaded by application A, not by application B or C. Likewise, the attitude information sent by the first pair of glasses should be sent to the runtime library loaded by application A, not to the runtime libraries loaded by application B or C.
If the first, second, and third pairs of VR/AR glasses come from 3 different manufacturers, their parameter requirements for the received video may also differ, for example in resolution and FOV (field of view). Even when the resolution and FOV requirements are the same, some VR/AR applications deliver a better experience at a higher FPS (frame rate) while others do not, and the height of each client's user may differ as well; in VR/AR applications, user height is a significant parameter affecting the experience. These individual, personalized parameters must each be processed by the runtime library corresponding to the client and then passed to the VR/AR application program through the interface between the runtime library and the VR/AR application program.
In order to solve the technical problem of how to implement one-to-one matching between a VR/AR application and a client, in a first aspect of the present application, a runtime library based on an OpenXR standard is provided, where the runtime library is in a PC, and the PC further includes a server, a VR/AR application, and a named shared memory, as shown in fig. 1, the runtime library includes:
an identifier obtaining module 110, configured to obtain a unique identifier, where the unique identifier is matched with a VR/AR application program request sent by a client; the runtime library is loaded by the VR/AR application program when the application program is started; the unique identifier identifies the named shared memory, into which the configuration information of the client that sent the VR/AR application program request has been written, the configuration information including the client's field-of-view information, resolution, frame rate, interpupillary distance information, and height information;
Usually, starting a VR/AR application program requires loading the runtime library once. Although it is the same runtime library on disk, each load is a distinct instance in memory with its own memory range. The runtime library communicates with external processes through shared memory: each loaded runtime library has its own single shared memory for communicating with the external process. The runtime library is, in essence, a dynamic library.
In the present application, each time the server receives a VR/AR application program request sent by a client, it generates a unique identifier matched with that request, which is used to identify the named shared memory in which the client's configuration information is stored. The export functions of the runtime library include not only the export interfaces specified by the OpenXR standard but also an additional interface for passing the unique identifier; before the runtime library is fully loaded, that is, before the application program calls the interfaces specified by the OpenXR standard, this additional interface is called to pass the unique identifier to the runtime library.
Specifically, the unique identifier may be used as a suffix of the named shared memory name, for example, the named shared memory name is CloudLark _123456, where 123456 is the unique identifier.
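The naming scheme above can be sketched in Python. This is a minimal illustration, not the patent's implementation: it assumes JSON serialization for the configuration (the patent does not specify an encoding), and the `CloudLark_` prefix is taken from the example name given in the text.

```python
import json
from multiprocessing import shared_memory

PREFIX = "CloudLark_"  # name prefix from the text's example; format is illustrative

def write_config(uid: str, config: dict) -> shared_memory.SharedMemory:
    """Server side: serialize the client configuration into a named shared
    memory whose name ends with the unique identifier."""
    payload = json.dumps(config).encode("utf-8")
    shm = shared_memory.SharedMemory(name=PREFIX + uid, create=True,
                                     size=len(payload))
    shm.buf[:len(payload)] = payload
    return shm

def read_config(uid: str) -> dict:
    """Runtime-library side: attach to the shared memory by the unique
    identifier and parse the configuration back out."""
    shm = shared_memory.SharedMemory(name=PREFIX + uid)
    try:
        # The OS may round the segment up to a page, so strip padding bytes.
        raw = bytes(shm.buf).rstrip(b"\x00")
        return json.loads(raw)
    finally:
        shm.close()

# One client request -> one unique identifier -> one named shared memory.
server_shm = write_config("123456", {"fov": 110, "resolution": [1920, 1080],
                                     "fps": 60, "ipd": 0.064, "height": 1.75})
cfg = read_config("123456")
print(cfg["fps"])  # → 60
server_shm.close()
server_shm.unlink()
```

Because the identifier is part of the segment name, each loaded runtime library attaches only to the memory written for its own client, which is what keeps concurrently started applications from reading each other's configuration.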
Specifically, the client exists in a client device, the client device may be an intelligent glasses or a VR all-in-one machine, and the height information is height information of a VR/AR application experiencer or a user.
A configuration information reading module 120, configured to read configuration information in the corresponding named shared memory according to the obtained unique identifier;
After the runtime library obtains the unique identifier, it uses the unique identifier as the name of the named shared memory and reads the contents of that block of shared memory. It should be noted that each time the runtime library is loaded, it has an independent shared memory for communicating with the external process.
The network communication module 130 is configured to establish a UDP socket link with a client, receive gesture data from the client, and send a generated video stream to the client;
In the present application, the UDP protocol is used for communication between the client and the runtime library. To achieve one-to-one correspondence between the client and the VR/AR application program, that is, between the client and the runtime library loaded by that application program, the runtime library needs to inform the client of the port number bound to the UDP protocol. After the UDP component is started, the runtime library informs the client of the UDP port number, and the client then sends attitude data to, and receives the encoded video stream from, the runtime library according to that port number.
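The port handshake and the two-way exchange can be sketched with plain UDP sockets. This is an illustrative loopback demo under assumed message formats (the `pose:` string and the dummy video packet are placeholders, not the patent's wire format):

```python
import socket

# Runtime-library side: start the UDP service on an OS-assigned port.
runtime_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
runtime_sock.bind(("127.0.0.1", 0))       # port 0 -> OS picks a free port
udp_port = runtime_sock.getsockname()[1]  # this number is told to the client

# Client side: send attitude data to the advertised port.
client_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client_sock.sendto(b"pose:0.1,0.2,0.3,1.0", ("127.0.0.1", udp_port))

# Runtime receives the pose and answers with a (here: dummy) video packet.
pose, client_addr = runtime_sock.recvfrom(2048)
runtime_sock.sendto(b"video-frame-0", client_addr)

frame, _ = client_sock.recvfrom(2048)
print(pose, frame)

client_sock.close()
runtime_sock.close()
```

Since each loaded runtime library binds its own port, the port number itself ties a client to exactly one runtime-library instance, mirroring the one-to-one correspondence described above.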
And the video stream generating module 140 is configured to process the received gesture data, generate a video stream, and send the video stream to the network communication module.
Because the unique identifier is matched with the VR/AR application program request, that request is matched with the client that sent it, and the runtime library is loaded once for each started VR/AR application program, the runtime library's acquiring the unique identifier to read the corresponding client's information realizes the matching between the runtime library and the client, i.e., the one-to-one matching between the VR/AR application program that loaded the runtime library and the client.
The present application achieves one-to-one matching between a VR/AR application program and a client by creating the unique identifier, so that different VR/AR application programs requested by multiple clients can be started simultaneously on one PC without confusion. In addition, the runtime library communicates with the client through the UDP protocol, so any client capable of establishing a UDP communication link with the runtime library can connect to it; the manufacturer, model, and software/hardware configuration of the client are not specifically limited, giving the runtime library better universality and compatibility.
Specifically, as shown in fig. 2, the video stream generating module 140 includes:
the data operation module 141 is used for performing matrix rotation and matrix translation operations on the posture data, the pupil distance information and the height information to obtain processed posture data of the left eye and the right eye, and providing the processed posture data to the VR/AR application program;
To give the rendered picture a stereoscopic effect, the attitude data for the left and right eyes must be calculated in combination with the interpupillary distance. The default interpupillary distance is 0.064 m, and it can also be set according to actual conditions.
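The per-eye offset can be illustrated with a small calculation. This is a simplified sketch of the rotation-plus-translation step: it handles only a yaw rotation about the vertical axis, and `user_height` stands in for the height information from the client configuration; a full implementation would use complete rotation matrices or quaternions.

```python
import math

IPD = 0.064  # default interpupillary distance in metres, per the text

def eye_positions(head_pos, yaw_rad, ipd=IPD, user_height=1.75):
    """Offset the head position by half the IPD along the head's local
    right axis (rotated by yaw), and lift it to the user's height."""
    # Right axis of the head after a yaw rotation about the vertical y axis.
    right_axis = (math.cos(yaw_rad), 0.0, -math.sin(yaw_rad))
    x, y, z = head_pos
    y += user_height
    half = ipd / 2.0
    left_eye = (x - right_axis[0] * half, y, z - right_axis[2] * half)
    right_eye = (x + right_axis[0] * half, y, z + right_axis[2] * half)
    return left_eye, right_eye

left, right = eye_positions((0.0, 0.0, 0.0), yaw_rad=0.0)
print(left, right)  # eyes separated by exactly one IPD along the x axis
```

Feeding each eye its own translated pose is what lets the application render two slightly different views and produce the stereoscopic effect.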
The texture extraction module 142 is used for extracting the textures of the left eye and the right eye after the VR/AR application program finishes rendering according to the provided processed posture data and sending the extracted textures to the rendering and encoding module;
specifically, after the VR/AR application program obtains the processed pose data, it renders the left-eye and right-eye textures according to the processed pose data, generates texture information, and sends the texture information to a texture extraction module 142 of a runtime library, where the texture extraction module 142 extracts the left-eye and right-eye textures in the texture information.
In the actual communication process, the VR/AR application program calls an interface of the runtime library to tell it whether the texture is submitted with the left and right eyes separate or combined. If the left-eye and right-eye textures are separate, the texture extraction module 142 must combine them before submitting the result to the rendering and encoding module 143; if they are already combined, the combined texture is submitted to the rendering and encoding module 143 directly.
And the rendering and encoding module 143 is configured to re-render the received texture, encode and package the rendered texture into a video stream, and submit the video stream to the network communication module 130.
The rendering and encoding module 143 renders the received texture on the local device, i.e., the graphics card, encodes and packages the rendered texture into a video stream, and submits it to the network communication module 130. For encoding, the runtime library can use the NVIDIA hardware encoding interface, which is more efficient and stable than software encoding. When encoding the video, the rendering and encoding module 143 encodes according to the resolution requirement in the client's configuration information, so that after decoding the received video the client obtains a resolution matched to itself.
As an optional implementation, the runtime further includes:
a signaling processing module, configured to process various received signaling, for example, start or stop sending a video stream;
for example, when the client disconnects, the VR/AR application is also closed and the video stream stops being sent.
The buffer queue module is used for buffering the attitude data received from the client;
the monitoring module is used for monitoring the length of the attitude data buffer queue;
The runtime library creates a buffer queue for attitude data in preparation for receiving the attitude data sent by the client. At the same time, the length of the attitude data queue must be monitored: if the queue grows too long, the attitude data becomes congested and communication delay increases directly. When the delay reaches a certain value, for example more than 80 milliseconds, the attitude data queue can be cleared to reduce the delay caused by congestion.
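The buffer-and-flush behaviour can be sketched as follows. This is an illustrative model, not the patent's code: poses are timestamped on arrival, and the whole queue is cleared once the oldest buffered pose exceeds the delay threshold (80 ms, per the text).

```python
import time
from collections import deque

class PoseQueue:
    """Buffered attitude queue: each pose carries its arrival time; when
    the oldest buffered pose is older than max_delay_ms, the queue is
    flushed to shed the congestion-induced latency."""
    def __init__(self, max_delay_ms=80.0):
        self.max_delay = max_delay_ms / 1000.0
        self.queue = deque()

    def push(self, pose, now=None):
        now = time.monotonic() if now is None else now
        self.queue.append((now, pose))

    def pop(self, now=None):
        now = time.monotonic() if now is None else now
        # Monitoring step: clear the whole queue if it lags too far behind.
        if self.queue and now - self.queue[0][0] > self.max_delay:
            self.queue.clear()
            return None
        return self.queue.popleft()[1] if self.queue else None

q = PoseQueue()
q.push("pose-1", now=0.00)
q.push("pose-2", now=0.01)
print(q.pop(now=0.02))  # → pose-1 (oldest pose is only 20 ms old)
q.push("pose-3", now=0.10)
print(q.pop(now=0.20))  # → None (oldest pose is 190 ms old, queue flushed)
```

Dropping stale poses wholesale trades a brief visual hiccup for lower motion-to-photon latency, which is usually the right trade-off in VR streaming.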
And the refreshing module is used for adjusting the frequency of the attitude information provided to the VR/AR application program according to the refresh rate of the client after the runtime library receives the attitude data.
After the UDP socket link between the client and the runtime library is established, the client starts sending attitude data to the runtime library. After receiving the attitude data, the runtime library adjusts the frequency at which attitude data is provided to the VR/AR application program according to the refresh rate, i.e., the frame rate, of the client (the client usually sends attitude data to the runtime library at a fixed frequency, typically 60 Hz), keeping the two consistent.
It should be noted that the frame rate the client reports to the runtime library is in fact the frequency at which the client sends attitude data. Each frame of the picture requires one piece of attitude data: the VR/AR application program must obtain one piece of attitude data, perform one rendering, and submit the result to the runtime library to produce one frame of the picture.
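The one-pose-per-frame pacing can be sketched with a simple generator. This is purely illustrative: a real runtime would synchronize with pose arrival and GPU frame completion rather than sleeping on a timer.

```python
import time

def paced_poses(poses, client_hz=60.0):
    """Hand one pose to the application per frame interval, matching the
    client's refresh rate (one pose -> one render -> one frame)."""
    interval = 1.0 / client_hz
    for pose in poses:
        start = time.monotonic()
        yield pose  # the application renders one frame with this pose
        elapsed = time.monotonic() - start
        if elapsed < interval:
            time.sleep(interval - elapsed)

# High rate here only so the demo finishes quickly.
frames = list(paced_poses(["p1", "p2", "p3"], client_hz=1000.0))
print(frames)  # → ['p1', 'p2', 'p3']
```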
As an optional implementation, the runtime further includes:
the first key set module is used for storing a handle key action set concerned by the VR/AR application program;
the second key set module is used for storing a handle key action set which can be sent by the client;
and the key action sending module is used for taking the intersection of the key actions stored in the first key set module and the second key set module and providing the intersection to the VR/AR application program.
In the use of VR/AR applications, the attitude data does not include only the poses of the head-mounted device; many VR/AR applications use a handle to complete interaction between the user and the application. After the VR/AR application is started, the application does not know what type of hardware, i.e., what type of handle, is currently connected to the runtime library, nor does the hardware know which key states the application needs to pay attention to, which creates a problem of mutual blind selection.
To solve this problem, the present application treats all handle types and key actions supported by the OpenXR standard as a mapping relation in the runtime library, forming a full set of key actions. After the application program starts, it informs the runtime library of the key actions it cares about through a specific interface, and the runtime library puts these keys into set 1. After a client, i.e., a VR/AR device, connects to the runtime library, it likewise informs the runtime library of the key actions it can send, which the runtime library puts into set 2. When the application program requests the state of handle key actions, the runtime library provides the application program with the contents of the intersection of set 1 and set 2. In this scheme, set 1 and set 2 must both be subsets of the full set of key actions, i.e., devices and key actions supported in the OpenXR specification.
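The two-set intersection can be sketched directly. The action names below are illustrative OpenXR-style input subpaths, not an exhaustive or authoritative list, and `KeyActionBroker` is a hypothetical name for the combination of the two key-set modules and the sending module.

```python
# Full set: handle key actions supported by the standard (illustrative).
OPENXR_KEY_ACTIONS = {
    "/input/trigger/click", "/input/squeeze/click",
    "/input/thumbstick/click", "/input/a/click", "/input/b/click",
    "/input/menu/click",
}

class KeyActionBroker:
    """Sketch of the two key-set modules and the intersection step."""
    def __init__(self, universe):
        self.universe = set(universe)
        self.app_set = set()     # set 1: actions the application cares about
        self.client_set = set()  # set 2: actions the client can send

    def register_app(self, actions):
        self.app_set = set(actions) & self.universe     # must stay a subset
    def register_client(self, actions):
        self.client_set = set(actions) & self.universe  # must stay a subset
    def deliverable(self):
        # Only actions both requested by the app and producible by the
        # client are reported when the app polls handle key states.
        return self.app_set & self.client_set

broker = KeyActionBroker(OPENXR_KEY_ACTIONS)
broker.register_app({"/input/trigger/click", "/input/a/click",
                     "/input/menu/click"})
broker.register_client({"/input/trigger/click", "/input/b/click",
                        "/input/menu/click"})
print(sorted(broker.deliverable()))
# → ['/input/menu/click', '/input/trigger/click']
```

Constraining both sets to the OpenXR-defined universe is what resolves the mutual blind selection: neither side can request or offer an action the standard does not define.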
In a second aspect, the present application provides a data interaction method for an OpenXR-standard-based runtime library, where the runtime library is in a PC, and the PC further includes a server, a VR/AR application, and a named shared memory, as shown in fig. 3, where the method includes:
S301: a server on the PC receives a VR/AR application request from a client, where the request includes the client's configuration information, and the configuration information includes the client's field-of-view information, resolution, frame rate, interpupillary distance information, and height information;
Specifically, the client resides on a client device, which may be smart glasses or a standalone (all-in-one) VR headset, and the height information is the height of the VR/AR application user.
S302: the server parses the configuration information in the VR/AR application request, writes the parsed configuration information into a named shared memory, and creates a unique identifier matched to the request to identify that named shared memory;
Specifically, the unique identifier may be used as a suffix of the named shared memory's name; for example, if the name is CloudLark_123456, then 123456 is the unique identifier.
S303: the runtime library is loaded by the VR/AR application requested by the client;
S304: the runtime library obtains the unique identifier matched to the VR/AR application request and reads the configuration information in the corresponding shared memory through that identifier;
In the present application, the export table of the runtime library contains not only the export interfaces specified by the OpenXR standard but also an additional interface for passing in the unique identifier. Before the runtime library is fully loaded, that is, before the application calls any interface specified by the OpenXR standard, this additional interface is called to pass in the unique identifier within the current operating system.
After the runtime library obtains the unique identifier, it reads the contents of that block of shared memory, using the identifier as the name of the named shared memory. It should be noted that each time the runtime library is loaded, it has an independent shared memory for communicating with the external process.
An application example follows:
Glasses A request application A at 2400 x 1200 resolution and 60 fps, and glasses B request application B at 2800 x 1400 resolution and 72 fps. Upon the request from the first pair of glasses, the server opens a block of memory under the name app_A with contents 2400, 1200, 60. The name app_A is passed through the dynamic library's export interface to the runtime library loaded by application A; once the runtime library has the shared memory name, it reads the memory block named app_A, from which 2400, 1200, and 60 can be read out and processed correctly.
S305: after a UDP socket link is established between the runtime library and the client, the client sends pose data to the runtime library;
In the present application, the client and the runtime library communicate over UDP. To achieve a one-to-one correspondence between the client and the VR/AR application, that is, between the client and the runtime library loaded by that application, the runtime library must inform the client of the port number bound by its UDP socket. After its UDP component starts, the runtime library informs the client of the UDP port number, and the client sends pose data to the runtime library at that port.
S306: the runtime library receives the pose data, processes it to generate a video stream, and sends the video stream to the client.
After receiving the pose data, the runtime library performs matrix rotation and matrix translation operations on the pose data, the interpupillary distance information, and the height information to obtain processed pose data for the left and right eyes, and sends the processed pose data to the VR/AR application;
To give the rendered picture a stereoscopic effect, the pose data for the left and right eyes must be calculated in combination with the interpupillary distance. The default interpupillary distance is 0.064 m, and it can also be set according to actual conditions.
After receiving the processed pose data, the VR/AR application renders the left-eye and right-eye textures according to it, generates texture information, and sends that texture information to the runtime library, which extracts the left-eye and right-eye textures from it.
In the actual communication process, whether the texture the VR/AR application provides to the runtime library consists of separate left- and right-eye textures or a single combined one must be determined from the earlier configuration interface. If the eyes are separate, the left-eye and right-eye textures are composited before render-encoding; if they are already combined, render-encoding proceeds directly. After rendering, the texture is encoded and packed into a video stream and sent to the client.
For encoding, the runtime library can use the NVIDIA hardware encoding interface, which is more efficient and stable than software encoding. The video is encoded at the resolution required by the client's configuration information, so that after receiving and decoding the video the client obtains a resolution matched to itself.
By creating the unique identifier, the present application achieves one-to-one binding between a VR/AR application and a client, so that different VR/AR applications requested by multiple clients can run simultaneously on one PC without confusion. In addition, because the runtime library communicates with the client over UDP, any client able to establish a UDP communication link with the runtime library can connect to it, with no specific restriction on the client's manufacturer, model, or software/hardware configuration, giving the runtime library better universality and compatibility.
As an optional embodiment, after the runtime library receives the pose data, the method further includes:
placing the pose data into a buffer queue.
The runtime library creates a buffer queue for pose data in preparation for receiving the pose data sent by the client. The length of this queue must also be monitored: if the queue grows too long, the pose data becomes congested and communication latency rises directly. When the latency reaches a certain value, for example more than 80 milliseconds, the pose data queue is flushed to reduce the latency caused by congestion.
It should be noted that the present application can receive, over a wireless network connection, the various information sent by a standalone VR/AR headset (i.e., the client) and pass it to the application when the application needs it. This realizes wireless VR/AR interaction and removes the bulk and inconvenience of wired PC VR.
In a third aspect, an embodiment of the present application further provides an apparatus, including a processor, a memory, and a communication unit;
the memory stores machine-readable instructions executable by the processor, and the processor and the memory communicate through the communication unit when the apparatus operates;
the processor executes the machine-readable instructions to perform the methods of the aspects described above.
The memory may be used to store instructions for execution by the processor and may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as static random-access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, or a magnetic or optical disk. When executed by the processor, the instructions in the memory enable the apparatus to perform some or all of the steps in the method embodiments described above.
The processor is the control center of the apparatus; it connects the various parts of the whole electronic terminal using various interfaces and lines, and performs the functions of the electronic terminal and/or processes data by running or executing the software programs and/or modules stored in the memory and calling the data stored in the memory. The processor may be composed of integrated circuits (ICs), for example a single packaged IC or several packaged ICs with the same or different functions connected together. For example, the processor may include only a central processing unit (CPU); in the embodiments of the present application, the CPU may have a single arithmetic core or multiple arithmetic cores.
The communication unit is used for establishing a communication channel so that the apparatus can communicate with other terminals, receiving user data sent by other terminals, and sending user data to other terminals.
In a fourth aspect, the present application further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program performs the methods of the above aspects.
The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM) or a Random Access Memory (RAM).
According to the present application, data interaction can be realized on a single PC with multiple VR/AR applications that implement the OpenXR standard at the application layer, saving system resources and cost. In addition, because the runtime library communicates with the client over UDP, any client able to establish a UDP communication link with the runtime library can connect to it, with no specific restriction on the client's manufacturer, model, or software/hardware configuration, giving the runtime library better universality and compatibility.
In the embodiments provided in the present application, it should be understood that the disclosed system and method may be implemented in other ways. For example, the embodiments described above are merely illustrative: the division into modules is only one kind of logical functional division, and in actual implementation there may be other divisions; for example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through interfaces, devices, or units, and may be electrical, mechanical, or in another form.
The modules described as separate parts may or may not be physically separate, and the parts shown as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of each embodiment.
In addition, the functional modules in the embodiments of the present application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of hardware plus a software functional unit.
The above description covers only specific embodiments of the present application, but the protection scope of the present application is not limited thereto; any change or substitution that a person skilled in the art can readily conceive within the technical scope disclosed by the present application shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. An OpenXR-standard-based runtime library, wherein the runtime library is in a PC, the PC further comprises a server, a VR/AR application program, and a named shared memory, and the runtime library comprises:
an identifier acquisition module, used for acquiring a unique identifier, wherein the unique identifier is matched with a VR/AR application program request sent by a client, the runtime library is used for being loaded by the VR/AR application program when the VR/AR application program starts, the unique identifier identifies the named shared memory, configuration information of the client sending the VR/AR application program request is written in the named shared memory, and the configuration information comprises field-of-view information, resolution, frame rate, interpupillary distance information, and height information of the client;
a configuration information reading module, used for reading the configuration information in the corresponding named shared memory according to the acquired unique identifier;
a network communication module, used for establishing a UDP socket link with the client, receiving pose data from the client, and sending the generated video stream to the client; and
a video stream generation module, used for processing the received pose data, generating a video stream, and sending the video stream to the network communication module.
2. The OpenXR-standard-based runtime library of claim 1, wherein the video stream generation module specifically comprises:
a data operation module, used for performing matrix rotation and matrix translation operations on the pose data, the interpupillary distance information, and the height information to obtain processed pose data for the left and right eyes, and sending the processed pose data to the VR/AR application program;
a texture extraction module, used for extracting the left-eye and right-eye textures after the VR/AR application program finishes rendering according to the provided processed pose data, and sending the textures to the rendering and encoding module; and
a rendering and encoding module, used for re-rendering the received textures, encoding and packing the rendered textures into a video stream, and submitting the video stream to the network communication module.
3. The OpenXR-standard-based runtime library of claim 1 or 2, further comprising:
a buffer queue module, used for buffering the pose data received from the client;
a monitoring module, used for monitoring the length of the pose data buffer queue; and
a refresh module, used for adjusting the frequency at which pose information is provided to the VR/AR application program according to the refresh rate of the client after the runtime library receives the pose data.
4. The OpenXR-standard-based runtime library of claim 1 or 2, further comprising:
a first key set module, used for storing the set of controller key actions the VR/AR application program is concerned with;
a second key set module, used for storing the set of controller key actions the client can send; and
a key action sending module, used for taking the intersection of the key actions stored in the first key set module and the second key set module and providing the intersection to the VR/AR application program.
5. A data interaction method for an OpenXR-standard-based runtime library, wherein the runtime library is in a PC, and the PC further comprises a server, a VR/AR application program, and a named shared memory, the method comprising the following steps:
a server in the PC receives a VR/AR application program request from a client, wherein the request comprises configuration information of the client, and the configuration information comprises field-of-view information, resolution, frame rate, interpupillary distance information, and height information of the client;
the server parses the configuration information in the VR/AR application program request, writes the parsed configuration information into a named shared memory, and creates a unique identifier matched with the VR/AR application program request to identify the named shared memory;
the runtime library is loaded by the VR/AR application program requested by the client;
the runtime library acquires the unique identifier matched with the VR/AR application program request and reads the configuration information in the corresponding memory through the unique identifier;
after a UDP socket link is established between the runtime library and the client, the client sends pose data to the runtime library; and
the runtime library receives the pose data, processes it to generate a video stream, and sends the video stream to the client.
6. The data interaction method of claim 5, wherein the specific method by which the runtime library receives and processes the pose data, generates a video stream, and sends the video stream to the client comprises:
the runtime library performs matrix rotation and matrix translation operations on the pose data, the interpupillary distance information, and the height information to obtain processed pose data for the left and right eyes, and provides the processed pose data to the VR/AR application program;
the VR/AR application program renders the left-eye and right-eye textures according to the obtained processed pose data, generates texture information, and sends the texture information to the runtime library;
the runtime library receives the texture information and extracts the textures from it; and
the runtime library re-renders the received textures, encodes and packs the rendered textures into a video stream, and sends the video stream to the client.
7. The data interaction method of claim 6, wherein after the runtime library receives the pose data, the method further comprises:
placing the pose data into a buffer queue.
8. The data interaction method of claim 7, wherein the specific method for establishing a UDP socket link between the client and the runtime library is as follows:
the runtime library starts the UDP socket service and provides the port number bound by the UDP protocol to the client; and
the client connects to the runtime library according to the port number.
9. An apparatus, comprising: a processor, a memory, and a communication unit;
wherein the memory stores machine-readable instructions executable by the processor, and the processor and the memory communicate through the communication unit when the apparatus operates;
and the processor executes the machine-readable instructions to perform the method of any one of claims 5 to 8.
10. A computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and the computer program, when executed by a processor, performs the method of any one of claims 5 to 8.
CN202010655829.0A 2020-07-09 2020-07-09 Operation library based on OpenXR standard, data interaction method, device and medium Active CN111831353B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010655829.0A CN111831353B (en) 2020-07-09 2020-07-09 Operation library based on OpenXR standard, data interaction method, device and medium

Publications (2)

Publication Number Publication Date
CN111831353A true CN111831353A (en) 2020-10-27
CN111831353B CN111831353B (en) 2024-02-20

Family

ID=72900368

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010655829.0A Active CN111831353B (en) 2020-07-09 2020-07-09 Operation library based on OpenXR standard, data interaction method, device and medium

Country Status (1)

Country Link
CN (1) CN111831353B (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7266616B1 (en) * 2001-08-08 2007-09-04 Pasternak Solutions Llc Method and system for digital rendering over a network
CN107197342A (en) * 2017-06-16 2017-09-22 深圳创维数字技术有限公司 A kind of data processing method, intelligent terminal, VR equipment and storage medium
CN107203434A (en) * 2017-06-22 2017-09-26 武汉斗鱼网络科技有限公司 A kind of texture shared method, device and computer-readable recording medium
US20180336069A1 (en) * 2017-05-17 2018-11-22 Tsunami VR, Inc. Systems and methods for a hardware agnostic virtual experience
CN109814719A (en) * 2018-07-26 2019-05-28 亮风台(上海)信息科技有限公司 A kind of method and apparatus of the display information based on wearing glasses
US10325410B1 (en) * 2016-11-07 2019-06-18 Vulcan Inc. Augmented reality for enhancing sporting events
US10452868B1 (en) * 2019-02-04 2019-10-22 S2 Systems Corporation Web browser remoting using network vector rendering
CN110413386A (en) * 2019-06-27 2019-11-05 深圳市富途网络科技有限公司 Multiprocessing method, apparatus, terminal device and computer readable storage medium
US10497180B1 (en) * 2018-07-03 2019-12-03 Ooo “Ai-Eksp” System and method for display of augmented reality
US10558824B1 (en) * 2019-02-04 2020-02-11 S2 Systems Corporation Application remoting using network vector rendering
CN111030990A (en) * 2019-11-05 2020-04-17 华为技术有限公司 Method for establishing communication connection, client and server
CN111064985A (en) * 2018-10-16 2020-04-24 北京凌宇智控科技有限公司 System, method and device for realizing video streaming
CN111316334A (en) * 2017-11-03 2020-06-19 三星电子株式会社 Apparatus and method for dynamically changing virtual reality environment

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112612456A (en) * 2020-12-25 2021-04-06 深圳市引力创新科技有限公司 Multi-program systematic management framework and management method
CN115209178A (en) * 2021-04-14 2022-10-18 华为技术有限公司 Information processing method, device and system
WO2022218209A1 (en) * 2021-04-14 2022-10-20 华为技术有限公司 Information processing method, apparatus and system
CN116560858A (en) * 2023-07-07 2023-08-08 北京蔚领时代科技有限公司 VR cloud server container isolation method and system
CN117596377A (en) * 2024-01-18 2024-02-23 腾讯科技(深圳)有限公司 Picture push method, device, electronic equipment, storage medium and program product
CN117596377B (en) * 2024-01-18 2024-05-28 腾讯科技(深圳)有限公司 Picture push method, device, electronic equipment, storage medium and program product

Also Published As

Publication number Publication date
CN111831353B (en) 2024-02-20

Similar Documents

Publication Publication Date Title
CN111831353B (en) Operation library based on OpenXR standard, data interaction method, device and medium
JP7110272B2 (en) Electronic device and its control method
CN102137151B (en) Remote protocol with multi connection channels
US8762544B2 (en) Selectively communicating data of a peripheral device to plural sending computers
CN105704161B (en) For transmitting and receiving the long-range method and system that graph data is presented
WO2022257699A1 (en) Image picture display method and apparatus, device, storage medium and program product
CN102413150A (en) Server and virtual desktop control method and virtual desktop control system
CN101553795A (en) Multi-user display proxy server
CN101681246A (en) Sharing a computer display across a network
CN113034629B (en) Image processing method, image processing device, computer equipment and storage medium
CN102196033B (en) A kind ofly transmit and receive the long-range method and system presenting data
JP2008526107A (en) Using graphics processors in remote computing
CN111984114A (en) Multi-person interaction system based on virtual space and multi-person interaction method thereof
CN115065684B (en) Data processing method, apparatus, device and medium
CN104765636B (en) A kind of synthetic method and device of remote desktop image
CN102664939A (en) Method and device for mobile terminal of screen mirror image
CN114616536A (en) Artificial reality system for transmitting surface data using superframe
US7075544B2 (en) Apparatus and method of processing image in thin-client environment and apparatus and method of receiving the processed image
US11872482B2 (en) Distributed multi-terminal and multi-network supporting system for android online game
CN113778593B (en) Cloud desktop control method and device, electronic equipment, storage medium and program product
CN113327303B (en) Image processing method, image processing device, computer equipment and storage medium
CN114268779A (en) Image data processing method, device, equipment and computer readable storage medium
WO2023179395A1 (en) Data transmission system and method, service system, device, and storage medium
CN115373618B (en) Multi-screen display method and device, vehicle machine and storage medium
US20230153137A1 (en) Remote rendering system, method and device based on virtual mobile architecture

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant