CN111640191B - VR (virtual reality) -based method for collecting and processing projection screen images - Google Patents

VR (virtual reality) -based method for collecting and processing projection screen images

Info

Publication number
CN111640191B
CN111640191B CN202010506055.5A
Authority
CN
China
Prior art keywords
screen
picture
texture
rendering thread
opengles
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010506055.5A
Other languages
Chinese (zh)
Other versions
CN111640191A (en
Inventor
孙鹏飞
高磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing IQIYI Intelligent Technology Co Ltd
Original Assignee
Nanjing IQIYI Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing IQIYI Intelligent Technology Co Ltd filed Critical Nanjing IQIYI Intelligent Technology Co Ltd
Priority to CN202010506055.5A priority Critical patent/CN111640191B/en
Publication of CN111640191A publication Critical patent/CN111640191A/en
Application granted granted Critical
Publication of CN111640191B publication Critical patent/CN111640191B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements
    • G06F9/50Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F9/5005Allocation of resources, e.g. of the central processing unit [CPU] to service a request
    • G06F9/5027Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resource being a machine, e.g. CPUs, Servers, Terminals
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Image Generation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a method for collecting and processing projection-screen pictures based on a VR all-in-one headset. An updated picture from a 3D application is delivered to a screen-projection module, and while the picture is being updated, different threads within the same process can share texture resources. Compared with the prior art, the technical scheme of the invention can flexibly handle multiple screen-projection requests and adapts cleanly to the size of the target projection end. The collected and processed pictures are clear and complete, free of large black borders, and impose little additional load on system rendering, which benefits both the end user and the preparation of advertising material.

Description

VR (virtual reality) -based method for collecting and processing projection screen images
Technical Field
The invention relates to the technical field of VR (virtual reality), and in particular to a method for collecting and processing projection-screen pictures based on a VR all-in-one headset.
Background
VR (Virtual Reality) is a computer simulation system capable of creating and presenting a virtual world: a computer generates a simulated environment that fuses multi-source information into an interactive three-dimensional dynamic view with simulated entity behavior, immersing the user in that environment.
When a VR device performs screen projection, the monocular picture inside the all-in-one headset can be cast to an external device or recorded into a video file. Two screen-projection approaches are currently common. The first prior-art approach records the whole screen directly as the picture source, by means of a virtual display of the Android system. The second prior-art approach relies on the Presentation mechanism of the Android system: the APK renders the picture to be projected onto a separate SurfaceView, associates that SurfaceView with a virtual display through a Presentation, and finally transfers the picture to the virtual display.
Both prior-art solutions still suffer from objective technical problems. The drawback of the first approach: because the Android virtual display records the whole screen directly as the picture source, the projected output consists of two circular pictures; the image seen at the receiving end is unattractive and the user experience is poor. The drawback of the second approach: the Presentation scheme has low rendering efficiency, depends on SurfaceFlinger, and makes it difficult to insert post-processing steps (such as watermarking). In particular, when the image data passes through SurfaceFlinger, an extra copy operation is performed between the APK's picture and the virtual display, which increases transmission latency and system load; and because the copy is synchronous with the APK's rendering, the projected picture may exhibit tearing.
Therefore, how to overcome the above technical problems is a problem that needs to be solved by those skilled in the art.
Disclosure of Invention
Accordingly, an embodiment of the invention provides a method for collecting and processing projection-screen pictures based on a VR all-in-one headset, which solves the above technical problems.
An embodiment of the invention provides a method for collecting and processing a screen-throwing picture based on a VR (virtual reality) integrated machine, which puts an updated picture of a 3D application into a screen-throwing module, and can realize different threads sharing texture resources under the same process while carrying out picture updating, and specifically comprises the following operation steps:
step S100: after the background screen-throwing service unit is started, a background main rendering thread is created, and simultaneously, a texture of OpenGLES is created; the texture of the OpenGLES is used as a texture area of the data terminal to provide shared texture resources;
step S200: the screen projection module with the screen projection requirement sends a screen projection request to a background screen projection service unit; the background screen-throwing service unit identifies the received screen-throwing request, and identifies the ID number of the Surface-x and the size information of the required picture of the Surface-x contained in the screen-throwing request; the size information comprises the height size and the width size of a screen required by Surface-x;
step S300: after receiving the screen projection request in the step S200, the background screen projection service unit creates a rendering thread A based on Surface-x in the screen projection request and shares a context with a background main rendering thread; the rendering thread A and the background main rendering thread share a context, so as to share texture resources recorded by textures of OpenGLES in the background main rendering thread; the rendering thread A corresponds to the Surface-x one by one; the other rendering threads in the background screen-throwing service unit call texture resources recorded by the textures of OpenGLES at will; meanwhile, the background screen-throwing service unit creates a Surface-APP based on the texture of OpenGLES of the background main rendering thread;
step S400: the background screen-throwing service unit transmits a Surface-APP to the 3D application through an AIDL interface, and the Surface-APP is used for establishing information link connection between the 3D application and the background screen-throwing service unit;
step S500: the 3D application comprises a 3D main rendering thread, and after the 3D application receives the Surface-APP, the 3D application creates a rendering thread B sharing the context with the 3D main rendering thread based on the Surface-APP; the rendering thread B is used for rendering and drawing the monocular picture in the 3D main rendering thread, and the rendered monocular picture is dropped on the picture of the Surface-APP so as to ensure that the picture of the Surface-APP is updated, and the updated picture is finally added into the texture of OpenGLES.
Preferably, as one possible embodiment; the screen projection module comprises an Rtmp screen projection module and a Miracast screen projection module.
Preferably, as one possible embodiment; in step S500, the method further includes an operation step of clipping the updated frame to adapt to the size of the projection module:
step S510: after adding the updated picture into the texture of OpenGLES, calling the size information recorded in the Surface-x of the screen projection module of the target; and cutting according to the size information to ensure that the updated picture size is matched with the screen projection module.
Preferably, as one possible embodiment; in step S500, the method further includes the operation step of adding logo watermark to the updated picture:
step S520: after adding the updated picture to the texture of OpenGLES, acquiring logo watermarks to be added, adding the logo watermarks to the updated picture, and finally throwing the updated picture into a screen throwing module of a target.
Preferably, as one possible embodiment; in the above step S520: the method for acquiring the logo watermark to be added and adding the logo watermark to the updated picture comprises the following steps:
step S5210: after adding an updated picture to the texture of OpenGLES, the rendering thread A acquires a logo watermark to be added, adds the logo watermark to the updated picture, and sorts the picture added with the logo watermark into texture resources of the texture of OpenGLES;
step S5220: the screen projection module of the target accesses the texture of OpenGLES, and according to the ID number of the Surface-x, an updated picture corresponding to the same link with the ID number of the Surface-x is called and is projected into the screen projection module of the target.
The embodiment of the invention has the following technical advantages:
the embodiment of the invention provides a technical scheme of a video recording screen picture acquisition processing method based on a VR (virtual reality) integrated machine, which at least comprises the following two technical advantages;
1. according to the VR-based screen capture processing method, the system rendering load is reduced, and the reason for specifically reducing the system rendering load is mainly that the screen of the 3D application is transmitted to the background screen capture service through an AIDL interface without being subjected to surfaceflinger processing. 2. According to the VR-based screen capture processing method, the shared texture area among different rendering threads in the same process is realized, so that processing operation is simplified, rendering storage space is reduced, and system rendering load is further reduced; research shows that different threads in the same process in the prior art are isolated from each other, and sharing of textures cannot be realized; therefore, after the technical scheme is implemented, other rendering threads which are subsequently created in the same process can also be accessed at will to share texture resources recorded by the textures of OpenGLES in the background main rendering thread.
Drawings
In order to more clearly illustrate the technical solutions of the present invention, the drawings that are required for the embodiments will be briefly described, it being understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope of the present invention. Like elements are numbered alike in the various figures.
Fig. 1 shows a main flow diagram of the method for collecting and processing projection-screen pictures based on a VR all-in-one headset according to an embodiment of the present invention;
fig. 2 shows a schematic diagram of the control-principle structure of the method according to an embodiment of the present invention.
Reference numerals: 10-a screen projection module; 20-a background screen-throwing service unit; 21-a background main rendering thread; texture of 22-OpenGLES; 30-3D application; 31-3D master rendering thread.
Description of the embodiments
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments.
The components of the embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the invention, as presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be made by a person skilled in the art without making any inventive effort, are intended to be within the scope of the present invention.
The terms "comprises," "comprising," "including," or any other variation thereof, as used in various embodiments of the present invention, are intended to cover the presence of the stated features, numbers, steps, operations, elements, components, or combinations thereof, without excluding the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
Furthermore, the terms "first," "second," "third," and the like are used merely to distinguish between descriptions and should not be construed as indicating or implying relative importance.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which various embodiments of the invention belong. The terms (such as those defined in commonly used dictionaries) will be interpreted as having a meaning that is the same as the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein in connection with the various embodiments of the invention.
Examples
As shown in fig. 1, the first embodiment of the present invention, a method for collecting and processing projection-screen pictures based on a VR all-in-one headset, is described in detail below.
The invention provides a projection screen picture collecting and processing method based on a VR (virtual reality) integrated machine, which puts an updated picture of a 3D application into a projection screen module, and can realize different threads sharing texture resources under the same process while updating the picture, and specifically comprises the following operation steps:
step S100: after the background screen-throwing service unit is started, a background main rendering thread is created, and simultaneously, a texture of OpenGLES is created; the texture of the OpenGLES is used as a texture area of the data terminal to provide shared texture resources; that is, the background main rendering thread in step S100 is different from the 3D main rendering thread in step S500; renderThread refers to a rendering thread RenderThread, which is a rendering thread within an android system.
Step S200: the screen projection module with the screen projection requirement sends a screen projection request to a background screen projection service unit; the background screen-throwing service unit identifies the received screen-throwing request, and identifies the ID number of the Surface-x and the size information of the required picture of the Surface-x contained in the screen-throwing request; the size information comprises the height size and the width size of a screen required by Surface-x; for example: the screen throwing request comprises Surface-1 and size information of a picture required by the Surface-1. Another screen projection module may send Surface-2 and the size information of the required picture of Surface-2; (i.e., surface-x wherein x is a number, i.e., ID number);
step S300: after receiving the screen projection request in the step S200, the background screen projection service unit creates a rendering thread A based on Surface-x in the screen projection request and shares a context with a background main rendering thread; the rendering thread A and the background main rendering thread share a context and are used for sharing texture resources recorded by textures of OpenGLES in the background main rendering thread; the rendering thread A corresponds to the Surface-x one by one; the rendering thread A is also used for adding the additional logo to the monocular picture and sorting the drawing result into texture resources of the OpenGLES texture; the other rendering threads in the background screen-throwing service unit call texture resources recorded by the textures of OpenGLES at will; meanwhile, the background screen-throwing service unit creates a Surface-APP based on the texture of OpenGLES of the background main rendering thread; the screen projection module comprises an Rtmp screen projection module, a Miracast screen projection module or other forms of screen projection modules.
Step S400: the background screen-throwing service unit transmits a Surface-APP to a 3D application (or 3D application APK) through an AIDL interface, and the Surface-APP is used for establishing information link connection between the 3D application and the background screen-throwing service unit;
Step S500: the 3D application contains a 3D main rendering thread. After the 3D application receives the Surface-APP, it creates, based on the Surface-APP, a rendering thread B that shares a context with the 3D main rendering thread. Rendering thread B renders and draws the monocular picture (monocular texture) of the 3D main rendering thread; that is, every time the 3D application changes its picture, the 3D main rendering thread processes it, and the rendered monocular picture lands on the picture of the Surface-APP, ensuring that the picture of the Surface-APP is updated; the updated picture is finally written into the texture of OpenGLES. Because the Surface-APP was created by the background service unit from the texture of OpenGLES, the background service unit also learns of every change of the Surface-APP's frame data, at which moment the content of the texture of OpenGLES in the background service unit is the latest picture.
It should be noted that the background main rendering thread and rendering thread A, created in the preceding steps S100 and S300, both live in the process of the background screen-projection service unit, while the 3D main rendering thread and the later-created rendering thread B live in the process of the 3D application. On the one hand, the 3D application creates rendering thread B based on the Surface-APP; rendering thread B then draws the monocular texture, and the final picture lands on the Surface-APP. For example, if rendering thread B renders at a frame rate of 30 FPS, the Surface-APP picture is likewise updated at 30 FPS. On the other hand, the Surface-APP was created by the background service unit, so the background service unit also learns of every change of the Surface-APP's frame data: each time the picture is updated by rendering thread B of the 3D application, the background service unit receives a system notification that the Surface-APP has been updated, and at that moment the content of the texture of OpenGLES in the background service unit is the latest picture.
In the process of video rendering development using an embedded open graphics library (Open Graphics Library for Embedded Systems, openGLES), it is necessary to create textures from decoded video frame data and render images from the created textures. According to the technical scheme adopted by the embodiment of the invention, the texture of OpenGLES is used as a texture area of the data terminal to provide shared texture resources;
in the above step S300, the rendering thread a and the background main rendering thread implement the sharing of the context, and the final essential purpose is to share the texture resources described by the texture of OpenGLES in the background main rendering thread, so that other rendering threads (e.g., rendering thread B1, rendering thread B2, rendering thread B3, etc.) may access the texture resources described by the texture of OpenGLES; therefore, the purpose of sharing texture resources is that when other rendering threads in the same process draw, texture resources of textures of OpenGLES using a background main rendering thread can be arbitrarily called to draw directly. In addition, the shared context exists only between different threads of the same process. It should be noted that, the background main rendering thread and the rendering thread a created in the preamble step S100 are both in the process of the background screen-throwing service unit; then the 3D main rendering thread and the rendering thread B which are created later occur in the process of the 3D application; it should be noted that, the background screen-throwing service unit creates a rendering thread a based on the Surface-x in the screen-throwing request each time the screen-throwing request of step S100 is received, that is, creates a rendering thread a for each Surface-x; one rendering thread a is specifically spawned for each Surface-x ID number.
As shown in fig. 2, the control principle applied by the screen-projection picture processing method is illustrated: the screen projection module 10, the background screen-projection service unit 20, the background main rendering thread 21, the texture 22 of OpenGLES within the background main rendering thread, rendering thread A, Surface-x, Surface-APP, the 3D application 30, the 3D main rendering thread 31, rendering thread B, and so on.
In step S500, the method further includes an operation step of clipping the updated frame to adapt to the size of the projection module:
step S510: after adding the updated picture into the texture of OpenGLES, calling the size information recorded in the Surface-x of the screen projection module of the target; and cutting according to the size information to ensure that the updated picture size is matched with the screen projection module.
It should be noted that in the prior art the monocular picture is not cropped. The monocular picture is generally square, whereas the display of the external projection target is rectangular, so the prior art centers the monocular picture on the target display, leaving large black borders on both sides that seriously affect the appearance. The embodiment of the invention instead adopts crop-to-fit adaptation, which avoids large black borders.
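The crop-to-fit arithmetic of step S510 can be sketched as follows (a minimal sketch under the assumption that "matching the target size" means a centered crop of the source to the target aspect ratio; the names are illustrative):

```java
// Hypothetical sketch of the crop-to-fit step (S510): given the (typically
// square) monocular source picture and the target size recorded in
// Surface-x, compute a centered crop of the source that matches the target
// aspect ratio, so that no black borders are needed.
public class CropAdapter {
    public static final class Rect {
        public final int x, y, w, h;
        Rect(int x, int y, int w, int h) { this.x = x; this.y = y; this.w = w; this.h = h; }
    }

    public static Rect centeredCrop(int srcW, int srcH, int dstW, int dstH) {
        // Compare aspect ratios by cross-multiplication to avoid float error.
        if ((long) srcW * dstH > (long) dstW * srcH) {
            // Source is wider than the target: crop the width.
            int cropW = (int) ((long) srcH * dstW / dstH);
            return new Rect((srcW - cropW) / 2, 0, cropW, srcH);
        } else {
            // Source is taller (e.g. square source, 16:9 target): crop the height.
            int cropH = (int) ((long) srcW * dstH / dstW);
            return new Rect(0, (srcH - cropH) / 2, srcW, cropH);
        }
    }
}
```

For example, a square 2048×2048 monocular picture projected to a 1920×1080 target is cropped to a centered 2048×1152 band (2048 × 1080 / 1920 = 1152), which then scales to the target without black borders.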
In step S500, the method further includes the operation step of adding logo watermark to the updated picture:
step S520: after adding the updated picture to the texture of OpenGLES, acquiring logo watermarks to be added, adding the logo watermarks to the updated picture, and finally throwing the updated picture into a screen throwing module of a target.
It should be noted that the prior art cannot add a watermark to the updated picture, whereas in the embodiment of the invention the updated picture is added to the texture of OpenGLES, after which the logo watermark to be added is obtained and drawn onto the updated picture.
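The patent does not specify how the watermark is composited; a common choice, sketched here as an assumption, is the standard per-channel "over" alpha blend. In the real pipeline rendering thread A would perform this on the GPU, but the arithmetic is the same:

```java
// Hypothetical sketch of the watermark compositing in step S520: blend one
// logo channel value over the corresponding picture channel value with the
// standard "over" operator. Channel and alpha values are 0..255.
// result = logo * a + background * (1 - a), with rounding.
public class WatermarkBlend {
    public static int blendChannel(int logo, int background, int alpha) {
        return (logo * alpha + background * (255 - alpha) + 127) / 255;
    }
}
```

With alpha 255 the logo fully replaces the picture pixel, with alpha 0 the picture is untouched, and intermediate alphas give a translucent logo, which is how a semi-transparent watermark would appear on the projected picture.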
In a specific technical solution, in the step S520 described above: the method for acquiring the logo watermark to be added and adding the logo watermark to the updated picture comprises the following steps:
step S5210: after adding an updated picture to the texture of OpenGLES, the rendering thread A acquires a logo watermark to be added, adds the logo watermark to the updated picture, and sorts the picture added with the logo watermark into texture resources of the texture of OpenGLES;
step S5220: the screen projection module of the target accesses the texture of OpenGLES, and according to the ID number of the Surface-x, an updated picture corresponding to the same link with the ID number of the Surface-x is called and is projected into the screen projection module of the target.
In summary, the method for collecting and processing projection-screen pictures based on a VR all-in-one headset provided by the embodiment of the invention can flexibly handle multiple screen-projection requests and adapts cleanly to the size of the target projection end. The picture is clear and complete, without large black borders, and the impact on the system rendering load is small, which benefits both the user and the preparation of advertising material.
The foregoing is merely illustrative of the present invention, and the present invention is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present invention.

Claims (5)

1. The method for collecting and processing the projection screen picture based on the VR integrated machine is characterized in that an updated picture of the 3D application is put into a projection screen module, and different threads under the same process can share texture resources while the picture is updated, and the method specifically comprises the following operation steps:
step S100: after the background screen-throwing service unit is started, a background main rendering thread is created, and simultaneously, a texture of OpenGLES is created; the texture of the OpenGLES is used as a texture area of the data terminal to provide shared texture resources;
step S200: the screen projection module with the screen projection requirement sends a screen projection request to a background screen projection service unit; the background screen-throwing service unit identifies the received screen-throwing request, and identifies the ID number of the Surface-x and the size information of the required picture of the Surface-x contained in the screen-throwing request; the size information comprises the height size and the width size of a screen required by Surface-x;
step S300: after receiving the screen projection request in the step S200, the background screen projection service unit creates a rendering thread A based on Surface-x in the screen projection request and shares a context with a background main rendering thread; the rendering thread A and the background main rendering thread share a context, so as to share texture resources recorded by textures of OpenGLES in the background main rendering thread; the rendering thread A corresponds to the Surface-x one by one; the other rendering threads in the background screen-throwing service unit call texture resources recorded by the textures of OpenGLES at will; meanwhile, the background screen-throwing service unit creates a Surface-APP based on the texture of OpenGLES of the background main rendering thread;
step S400: the background screen-throwing service unit transmits a Surface-APP to the 3D application through an AIDL interface, and the Surface-APP is used for establishing information link connection between the 3D application and the background screen-throwing service unit;
step S500: the 3D application comprises a 3D main rendering thread, and after the 3D application receives the Surface-APP, the 3D application creates a rendering thread B sharing the context with the 3D main rendering thread based on the Surface-APP; the rendering thread B is used for rendering and drawing the monocular picture in the 3D main rendering thread, and the rendered monocular picture is dropped on the picture of the Surface-APP so as to ensure that the picture of the Surface-APP is updated, and the updated picture is finally added into the texture of OpenGLES.
2. The VR integrated machine-based projection screen image acquisition processing method of claim 1, wherein the projection screen module comprises an Rtmp projection screen module and a Miracast projection screen module.
3. The method for collecting and processing a projection screen picture based on a VR integrated machine as set forth in claim 2, further comprising the operation step of clipping the updated picture to adapt to the size of the projection screen module in step S500:
step S510: after adding the updated picture into the texture of OpenGLES, calling the size information recorded in the Surface-x of the screen projection module of the target; and cutting according to the size information to ensure that the updated picture size is matched with the screen projection module.
4. The method for collecting and processing a projection screen picture based on a VR integrated machine as set forth in claim 3, wherein the step S500 further comprises the operation step of adding a logo watermark to the updated picture:
step S520: after the updated picture is added into the OpenGLES texture, the logo watermark to be added is acquired and added to the updated picture, and the updated picture is finally projected into the target screen projection module.
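The watermark step S520 amounts to compositing a small logo over the frame before it leaves for the screen projection module. The patent does the compositing on the OpenGLES texture (typically a blended quad draw); the CPU-side sketch below uses plain ARGB pixel arrays and an ordinary "source over" alpha blend to show the math only. The blend position and sample values are illustrative.

```java
// CPU-side model of step S520: alpha-blend an ARGB logo onto a frame.
public class LogoWatermark {
    /** Blends logo (ARGB ints) onto frame at offset (ox, oy), in place. */
    public static void blend(int[][] frame, int[][] logo, int ox, int oy) {
        for (int y = 0; y < logo.length; y++) {
            for (int x = 0; x < logo[0].length; x++) {
                int src = logo[y][x];
                int a = (src >>> 24) & 0xFF; // logo pixel's alpha
                int dst = frame[oy + y][ox + x];
                int r = blendChannel(src >>> 16, dst >>> 16, a);
                int g = blendChannel(src >>> 8, dst >>> 8, a);
                int b = blendChannel(src, dst, a);
                // Output is opaque: the frame keeps full alpha after blending.
                frame[oy + y][ox + x] = 0xFF000000 | (r << 16) | (g << 8) | b;
            }
        }
    }

    // "Source over" blend of one 8-bit channel.
    private static int blendChannel(int src, int dst, int a) {
        return (((src & 0xFF) * a) + ((dst & 0xFF) * (255 - a))) / 255;
    }

    public static void main(String[] args) {
        int[][] frame = new int[4][4]; // all-black opaque frame
        for (int[] row : frame) java.util.Arrays.fill(row, 0xFF000000);
        int[][] logo = {{0x80FFFFFF}}; // one half-transparent white pixel
        blend(frame, logo, 1, 1);
        System.out.printf("%08X%n", frame[1][1]);
    }
}
```

On the GPU the same result would come from drawing the logo texture with `glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)` enabled.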
5. The method for collecting and processing a projection screen picture based on a VR integrated machine as set forth in claim 4, wherein in the step S520, the method for acquiring the logo watermark to be added and adding the logo watermark to the updated picture comprises the following steps:
step S5210: after the updated picture is added into the OpenGLES texture, the rendering thread A acquires the logo watermark to be added, adds the logo watermark to the updated picture, and files the watermarked picture into the texture resources of the OpenGLES texture;
step S5220: the target screen projection module accesses the OpenGLES texture, retrieves, according to the ID number of its Surface-x, the updated picture linked to the same ID number, and projects the picture into the target screen projection module.
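Step S5220 describes each screen projection module pulling exactly the picture filed under its own Surface-x ID, which keeps multiple simultaneous sinks (e.g. Rtmp and Miracast) from reading each other's frames. A minimal sketch of that bookkeeping, with a map from surface ID to the latest frame standing in for the texture resources of the OpenGLES texture; the ID values and byte payloads are invented for illustration.

```java
import java.util.HashMap;
import java.util.Map;

// Models the ID-keyed frame registry implied by steps S5210/S5220.
public class SurfaceIdLookup {
    private final Map<Integer, byte[]> latestFrameById = new HashMap<>();

    /** Rendering thread A files the watermarked picture under its Surface-x ID (S5210). */
    public void publish(int surfaceId, byte[] frame) {
        latestFrameById.put(surfaceId, frame);
    }

    /** A screen projection module fetches the frame linked to its own Surface-x ID (S5220). */
    public byte[] fetch(int surfaceId) {
        return latestFrameById.get(surfaceId);
    }

    public static void main(String[] args) {
        SurfaceIdLookup pool = new SurfaceIdLookup();
        pool.publish(7, new byte[] {1, 2, 3}); // e.g. one module's Surface-x
        pool.publish(9, new byte[] {4, 5, 6}); // e.g. another module's Surface-x
        // Each module sees only the frame filed under its own ID.
        System.out.println(pool.fetch(9).length);
    }
}
```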
CN202010506055.5A 2020-06-05 2020-06-05 VR (virtual reality) -based method for collecting and processing projection screen images Active CN111640191B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010506055.5A CN111640191B (en) 2020-06-05 2020-06-05 VR (virtual reality) -based method for collecting and processing projection screen images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010506055.5A CN111640191B (en) 2020-06-05 2020-06-05 VR (virtual reality) -based method for collecting and processing projection screen images

Publications (2)

Publication Number Publication Date
CN111640191A CN111640191A (en) 2020-09-08
CN111640191B true CN111640191B (en) 2023-04-21

Family

ID=72329826

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010506055.5A Active CN111640191B (en) 2020-06-05 2020-06-05 VR (virtual reality) -based method for collecting and processing projection screen images

Country Status (1)

Country Link
CN (1) CN111640191B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112565839B (en) * 2020-11-23 2022-11-29 青岛海信传媒网络技术有限公司 Display method and display device of screen projection image
CN116887005B (en) * 2021-08-27 2024-05-03 荣耀终端有限公司 Screen projection method, electronic device and computer readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103902390A (en) * 2014-03-12 2014-07-02 深圳创维-Rgb电子有限公司 Inter-process communication method based on Android application layer and basis application communication system
US9473758B1 (en) * 2015-12-06 2016-10-18 Sliver VR Technologies, Inc. Methods and systems for game video recording and virtual reality replay
CN110248022A (en) * 2019-06-06 2019-09-17 武汉卡比特信息有限公司 Screen projection method for third-party applications based on mobile phone interconnection
CN110908624A (en) * 2018-09-18 2020-03-24 深圳市布谷鸟科技有限公司 Control method and system for screen abnormal display

Similar Documents

Publication Publication Date Title
GB2553892B (en) 2D video with option for projected viewing in modeled 3D space
EP3883256A1 (en) Live stream processing method in webrtc and stream pushing client
US8907968B2 (en) Image rendering device, image rendering method, and image rendering program for rendering stereoscopic panoramic images
CN111133763A (en) Superposition processing method and device in 360 video system
US20100060652A1 (en) Graphics rendering system
US20070070067A1 (en) Scene splitting for perspective presentations
KR101267120B1 (en) Mapping graphics instructions to associated graphics data during performance analysis
CN111640191B (en) VR (virtual reality) -based method for collecting and processing projection screen images
US11095871B2 (en) System that generates virtual viewpoint image, method and storage medium
CN103034969B (en) A kind of display method and system and display device for cartoon
CN115350479B (en) Rendering processing method, device, equipment and medium
WO2019118028A1 (en) Methods, systems, and media for generating and rendering immersive video content
CN103460292B (en) Scene graph for defining a stereoscopic graphical object
US9143754B2 (en) Systems and methods for modifying stereoscopic images
WO2022033162A1 (en) Model loading method and related apparatus
WO2024041238A1 (en) Point cloud media data processing method and related device
CN116069435B (en) Method, system and storage medium for dynamically loading picture resources in virtual scene
JP5026472B2 (en) Image generating apparatus, operation method of image generating apparatus, and recording medium
CN113992679A (en) Automobile image display method, system and equipment
CN115715464A (en) Method and apparatus for occlusion handling techniques
GB2470759A (en) Displaying videogame on 3D display by generating stereoscopic version of game without modifying source code
CN115665461B (en) Video recording method and virtual reality device
CN110944239A (en) Video playing method and device
US20090231330A1 (en) Method and system for rendering a three-dimensional scene using a dynamic graphics platform
US11962743B2 (en) 3D display system and 3D display method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant