CN117611472A - Fusion method for metaspace and cloud rendering - Google Patents

Fusion method for metaspace and cloud rendering

Info

Publication number
CN117611472A
CN117611472A
Authority
CN
China
Prior art keywords
rendering
region
scene
workload
priority
Prior art date
Legal status
Granted
Application number
CN202410096289.5A
Other languages
Chinese (zh)
Other versions
CN117611472B (en)
Inventor
袁梁 (Yuan Liang)
罗翼鹏 (Luo Yipeng)
易洁 (Yi Jie)
Current Assignee
Sichuan Wutong Technology Co ltd
Original Assignee
Sichuan Wutong Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Sichuan Wutong Technology Co ltd filed Critical Sichuan Wutong Technology Co ltd
Priority to CN202410096289.5A
Publication of CN117611472A
Application granted
Publication of CN117611472B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/005 General purpose rendering architectures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/136 Segmentation; Edge detection involving thresholding
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20212 Image combination
    • G06T 2207/20221 Image fusion; Image merging
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Image Generation (AREA)

Abstract

The invention discloses a fusion method for metaspace and cloud rendering, relating to the field of image data processing. The method creates a metaspace and the scene elements applied within it, divides the created scene elements by rendering workload and rendering priority, and renders the scene elements of the metaspace according to the divided rendering workload and rendering priority. Specifically, a three-dimensional coordinate system is created and nodes are set; scene elements are created and fused with the corresponding nodes; the rendering workload is divided according to the scene complexity and resolution of the scene elements; the region rendering priority is divided according to the rendering workload and the scene rendering priority; and a plurality of cloud rendering nodes are set, which render the scene elements in the metaspace from high to low region rendering priority. This improves rendering efficiency, ensures that key models obtain higher rendering priority, and improves the overall user experience.

Description

Fusion method for metaspace and cloud rendering
Technical Field
The invention relates to the field of image data processing, in particular to a fusion method for metaspace and cloud rendering.
Background
The metaspace (metaverse) is a virtual world constructed and presented through digital technology; within it, a large number of social relations, such as business, entertainment and education, can be reconstructed. Real-time cloud rendering is a cloud rendering technology that renders scenes in real time during the rendering process, so that scenes in the real world can be simulated more faithfully. The metaspace is closely related to real-time cloud rendering: the virtual world is presented through cloud rendering, and real-time cloud rendering can provide higher-quality, more realistic virtual scenes. Because the real-time performance of cloud rendering is strong, rendering work can be completed more quickly and efficiently when large-scale rendering tasks are processed, shortening the loading time of the metaspace;
how to complete rendering work more quickly and efficiently, ensure that key models obtain higher rendering priority, and improve the overall user experience is a problem that needs to be solved; a fusion method for metaspace and cloud rendering is therefore provided.
Disclosure of Invention
In order to solve the technical problems, the invention aims to provide a fusion method for metaspace and cloud rendering.
The aim of the invention can be achieved by the following technical scheme: a fusion method for metaspace and cloud rendering, comprising the steps of:
step S1: creating a metaspace and a scene element applied to the metaspace;
step S2: dividing the created scene elements into rendering workload and rendering priority;
step S3: and rendering the scene elements of the metaspace according to the divided rendering workload and the rendering priority.
Further, the process of creating a metaspace and a scene element applied in the metaspace includes:
creating a three-dimensional coordinate system in the created metaspace, setting corresponding nodes in the metaspace, and obtaining the three-dimensional coordinate corresponding to each node; and creating a corresponding scene element according to the node, and fusing the created scene element with the corresponding node.
Further, the process of dividing the rendering workload for the created scene element includes:
dividing a metaspace into a plurality of areas, and setting scene complexity of scene elements in each area;
obtaining the resolution of scene elements in each region;
obtaining a rendering workload evaluation coefficient according to the set scene complexity and the acquired resolution;
setting a threshold range of a rendering workload evaluation coefficient;
obtaining the rendering workload of each region according to the comparison result of the obtained rendering workload evaluation coefficient and the threshold range of the set rendering workload evaluation coefficient;
the rendering workload includes a primary rendering workload, a secondary rendering workload, and a tertiary rendering workload.
Further, the process of setting the scene complexity of the scene elements in each region includes:
acquiring geometric details of scene elements in each region, and setting a threshold range of the geometric details;
and obtaining the scene complexity of the scene elements in each region according to the obtained comparison result of the geometric details of the scene elements in each region and the set threshold range of the geometric details.
Further, the process of prioritizing rendering of the created scene elements includes:
acquiring a view angle range of a user character role, and setting scene rendering priority of scene elements in each region;
judging the zone rendering priority of each zone according to the view angle range of the user character, the rendering workload of each zone and the scene rendering priority of the scene elements in each zone;
the region rendering priorities comprise a highest region rendering priority, a first region rendering priority, a second region rendering priority, a third region rendering priority and a fourth region rendering priority;
if the scene rendering priority of the scene element in the area is the first scene rendering priority and the rendering workload of the area is the first-level rendering workload, marking the area rendering priority of the area as the first area rendering priority;
if the scene rendering priority of the scene element in the area is the first scene rendering priority and the rendering workload of the area is the second rendering workload, marking the area rendering priority of the area as the first area rendering priority;
if the scene rendering priority of the scene elements in the region is the first scene rendering priority and the rendering workload of the region is the three-level rendering workload, marking the region rendering priority of the region as the second region rendering priority;
if the scene rendering priority of the scene element in the area is the second scene rendering priority and the rendering workload of the area is the first-level rendering workload, marking the area rendering priority of the area as the second area rendering priority;
if the scene rendering priority of the scene element in the area is the second scene rendering priority, and the rendering workload of the area is the second rendering workload, marking the area rendering priority of the area as the second area rendering priority;
if the scene rendering priority of the scene element in the region is the second scene rendering priority and the rendering workload of the region is the three-level rendering workload, marking the region rendering priority of the region as the third region rendering priority;
if the scene rendering priority of the scene element in the region is the third scene rendering priority and the rendering workload of the region is the first-level rendering workload, marking the region rendering priority of the region as the third region rendering priority;
if the scene rendering priority of the scene element in the region is the third scene rendering priority and the rendering workload of the region is the second rendering workload, marking the region rendering priority of the region as the third region rendering priority;
if the scene rendering priority of the scene element in the region is the third scene rendering priority and the rendering workload of the region is the three-level rendering workload, the region rendering priority of the region is marked as the fourth region rendering priority;
and obtaining the regions within the view angle range of the user character according to that view angle range, discarding any region rendering priority previously assigned to those regions, and marking the region rendering priority of the regions within the view angle range as the highest region rendering priority.
Further, the highest region rendering priority is higher than the first region rendering priority, which is higher than the second region rendering priority, the third region rendering priority and the fourth region rendering priority, in descending order.
Further, the process of setting the scene rendering priority of the scene elements in each region includes:
acquiring the number of pixel points of scene elements in each region, and setting a threshold range of the pixel points;
and obtaining scene rendering priority of the scene elements in each region according to the obtained comparison result of the number of the pixel points of the scene elements in each region and the set threshold range of the pixel points.
Further, the process of rendering the scene elements of the metaspace according to the partitioned rendering workload and the rendering priority includes:
setting a plurality of cloud rendering nodes, wherein the cloud rendering nodes comprise high-performance cloud rendering nodes, medium-performance cloud rendering nodes and low-performance cloud rendering nodes;
acquiring the rendering workload and the region rendering priority of each region, screening out the region with the highest region rendering priority, adopting a high-performance cloud rendering node to render the region with the screened rendering workload as primary rendering workload, adopting a medium-performance cloud rendering node to render the region with the screened rendering workload as secondary rendering workload, and adopting a low-performance cloud rendering node to render the region with the screened rendering workload as tertiary rendering workload;
screening out the region with the region rendering priority being the first region rendering priority after the rendering is finished, adopting a high-performance cloud rendering node to render the region with the screened rendering workload being the primary rendering workload, and adopting a medium-performance cloud rendering node to render the region with the screened rendering workload being the secondary rendering workload;
screening out the area with the area rendering priority being the second area rendering priority after the rendering is finished, adopting a high-performance cloud rendering node to render the area with the screened rendering workload being the primary rendering workload, adopting a medium-performance cloud rendering node to render the area with the screened rendering workload being the secondary rendering workload, and adopting a low-performance cloud rendering node to render the area with the screened rendering workload being the tertiary rendering workload;
screening out the area with the area rendering priority being the third area rendering priority after the rendering is finished, adopting a high-performance cloud rendering node to render the area with the screened rendering workload being the primary rendering workload, adopting a medium-performance cloud rendering node to render the area with the screened rendering workload being the secondary rendering workload, and adopting a low-performance cloud rendering node to render the area with the screened rendering workload being the tertiary rendering workload;
and after the rendering is completed, rendering the remaining area by adopting a low-performance cloud rendering node.
Compared with the prior art, the invention has the following beneficial effects: a three-dimensional coordinate system is created in the metaspace and nodes are set; corresponding scene elements are created according to the set nodes and fused with the corresponding nodes; the scene elements of the metaspace are divided by rendering workload according to the scene complexity and resolution of the scene elements, and by region rendering priority according to the rendering workload and the scene rendering priority; and a plurality of cloud rendering nodes are set, which render the scene elements of the metaspace from high to low region rendering priority. The rendering work is thus completed quickly and efficiently, key models are ensured a higher rendering priority, and the overall user experience is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application and the technical solutions in the prior art, the drawings needed in the embodiments are briefly described below. It is apparent that the drawings in the following description are only some embodiments of the present invention; a person of ordinary skill in the art may obtain other drawings from these drawings.
Fig. 1 is a schematic diagram of the present invention.
Detailed Description
As shown in fig. 1, a fusion method for metaspace and cloud rendering includes the following steps:
step S1: creating a metaspace and a scene element applied to the metaspace;
step S2: dividing the created scene elements into rendering workload and rendering priority;
step S3: rendering the scene elements of the metaspace according to the divided rendering workload and rendering priority;
it should be further noted that, in the implementation process, the process of creating the metaspace and the scene element applied in the metaspace includes:
creating a three-dimensional coordinate system in the created metaspace, setting corresponding nodes in the metaspace, and obtaining the three-dimensional coordinate corresponding to each node; creating corresponding scene elements according to the nodes, associating one scene element with each node, and fusing the created scene elements with the corresponding nodes;
it should be further noted that, in the implementation process, the process of dividing the created scene element into the rendering workload and the rendering priority includes:
dividing a metaspace into a plurality of regions, marking the divided regions as i, where i = 1, 2, 3, …, n, n ≥ 1 and n is an integer;
obtaining the geometric details of the scene elements in each region, and marking the geometric details as C_i;
The geometric details are the number of colors contained in the scene element;
setting a threshold range of geometric details, denoted (C0, C1) and (C1, C2);
setting scene complexity of scene elements in each region according to the acquired geometric details of the scene elements in each region and the set threshold range of the geometric details;
the scene complexity includes a first scene complexity, a second scene complexity, a third scene complexity, and a fourth scene complexity;
the more geometric details of scene elements within the region, the higher the scene complexity of scene elements within the region;
when C_i ≤ C0, the scene complexity of the scene elements in the region is set to the first scene complexity;
when C0 < C_i < C1, the scene complexity of the scene elements in the region is set to the second scene complexity;
when C1 ≤ C_i ≤ C2, the scene complexity of the scene elements in the region is set to the third scene complexity;
when C_i > C2, the scene complexity of the scene elements in the region is set to the fourth scene complexity;
the scene complexity of the scene elements in each region is marked as F_i: when the scene complexity is set to the first scene complexity, F_i = a; when set to the second scene complexity, F_i = b; when set to the third scene complexity, F_i = c; and when set to the fourth scene complexity, F_i = d;
It should be further noted that, in the implementation process, the first scene complexity is higher than the second scene complexity, which in turn is higher than the third and fourth scene complexities, that is, a > b > c > d;
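The complexity grading above can be sketched in Python. The description leaves the thresholds C0, C1, C2 and the weights a > b > c > d open, so the numeric values below are illustrative assumptions only.

```python
def scene_complexity(c_i, c0=16, c1=64, c2=256):
    """Grade a region's scene complexity from its geometric detail c_i
    (the number of colours contained in the region's scene elements),
    against the threshold ranges (C0, C1) and (C1, C2).
    Returns (grade, F_i) where F_i takes the weights a > b > c > d."""
    a, b, c, d = 4.0, 3.0, 2.0, 1.0   # illustrative F_i weight values
    if c_i <= c0:
        return "first", a
    elif c_i < c1:
        return "second", b
    elif c_i <= c2:
        return "third", c
    else:
        return "fourth", d
```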
obtaining the resolution of scene elements in each region;
the resolution of the scene elements in each acquired region is recorded as X i
According toSetting a rendering workload evaluation coefficient, which is recorded as XV, by the acquired resolution of scene elements in each region and the scene complexity of the set scene elements i
Wherein XV i =s1×X i +s2×F i S1 is a weight coefficient occupied by the resolution of the scene element, s2 is a weight coefficient occupied by the scene complexity of the scene element, and the values of s1 and s2 are both larger than zero;
setting a threshold range of the rendering workload evaluation coefficient, and recording the threshold range of the rendering workload evaluation coefficient as (XV 0, XV 1);
when XV_i ≤ XV0, the rendering workload of the region is recorded as the tertiary rendering workload;
when XV0 < XV_i < XV1, the rendering workload of the region is recorded as the secondary rendering workload;
when XV_i ≥ XV1, the rendering workload of the region is recorded as the primary rendering workload;
it should be further noted that, in the implementation process, the primary rendering workload is greater than the secondary rendering workload, which is greater than the tertiary rendering workload;
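The workload evaluation above reduces to a weighted sum followed by the same thresholding pattern. The weights s1, s2 and the threshold range (XV0, XV1) are unspecified by the description; the values below are assumed for illustration.

```python
def rendering_workload(x_i, f_i, s1=0.6, s2=0.4, xv0=1.0, xv1=2.0):
    """Compute XV_i = s1*X_i + s2*F_i for a region and grade its
    rendering workload against the threshold range (XV0, XV1)."""
    xv = s1 * x_i + s2 * f_i
    if xv <= xv0:
        return "tertiary"     # smallest workload
    elif xv < xv1:
        return "secondary"
    else:
        return "primary"      # largest workload
```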
acquiring the number of pixel points of the scene elements in each region, and marking the number as N_i;
setting a threshold range of the pixel points, marked as (N0, N1);
setting scene rendering priority of scene elements in each region according to the acquired comparison result of the number of pixel points of the scene elements in each region and the set threshold range of the pixel points;
the scene rendering priorities comprise a first scene rendering priority, a second scene rendering priority and a third scene rendering priority;
the more pixels of scene elements in the region, the higher the scene rendering priority of the scene elements in the region;
when N_i ≤ N0, the scene rendering priority of the scene elements in the region is set to the first scene rendering priority;
when N0 < N_i < N1, the scene rendering priority of the scene elements in the region is set to the second scene rendering priority;
when N_i ≥ N1, the scene rendering priority of the scene elements in the region is set to the third scene rendering priority;
it should be further noted that, in the implementation process, the first scene rendering priority is higher than the second scene rendering priority, which is higher than the third scene rendering priority;
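The pixel-count grading follows the same thresholding pattern as the complexity grading; the values of N0 and N1 below are assumptions, since the description leaves them open.

```python
def scene_render_priority(n_i, n0=10_000, n1=100_000):
    """Grade a region's scene rendering priority from the pixel count
    n_i of its scene elements, against the threshold range (N0, N1)."""
    if n_i <= n0:
        return "first"
    elif n_i < n1:
        return "second"
    else:
        return "third"
```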
acquiring a visual angle range of a user character;
judging the zone rendering priority of each zone according to the view angle range of the user character, the rendering workload of each zone and the scene rendering priority of the scene elements in each zone;
the region rendering priorities comprise a highest region rendering priority, a first region rendering priority, a second region rendering priority, a third region rendering priority and a fourth region rendering priority;
if the scene rendering priority of the scene element in the area is the first scene rendering priority and the rendering workload of the area is the first-level rendering workload, marking the area rendering priority of the area as the first area rendering priority;
if the scene rendering priority of the scene element in the area is the first scene rendering priority and the rendering workload of the area is the second rendering workload, marking the area rendering priority of the area as the first area rendering priority;
if the scene rendering priority of the scene elements in the region is the first scene rendering priority and the rendering workload of the region is the three-level rendering workload, marking the region rendering priority of the region as the second region rendering priority;
if the scene rendering priority of the scene element in the area is the second scene rendering priority and the rendering workload of the area is the first-level rendering workload, marking the area rendering priority of the area as the second area rendering priority;
if the scene rendering priority of the scene element in the area is the second scene rendering priority, and the rendering workload of the area is the second rendering workload, marking the area rendering priority of the area as the second area rendering priority;
if the scene rendering priority of the scene element in the region is the second scene rendering priority and the rendering workload of the region is the three-level rendering workload, marking the region rendering priority of the region as the third region rendering priority;
if the scene rendering priority of the scene element in the region is the third scene rendering priority and the rendering workload of the region is the first-level rendering workload, marking the region rendering priority of the region as the third region rendering priority;
if the scene rendering priority of the scene element in the region is the third scene rendering priority and the rendering workload of the region is the second rendering workload, marking the region rendering priority of the region as the third region rendering priority;
if the scene rendering priority of the scene element in the region is the third scene rendering priority and the rendering workload of the region is the three-level rendering workload, the region rendering priority of the region is marked as the fourth region rendering priority;
obtaining the regions within the view angle range of the user character according to that view angle range, discarding any region rendering priority previously assigned to those regions, and marking the region rendering priority of the regions within the view angle range as the highest region rendering priority;
it should be further noted that, in the implementation process, the highest region rendering priority is higher than the first region rendering priority, the second region rendering priority is higher than the third region rendering priority, and the fourth region rendering priority.
It should be further noted that, in the implementation process, the process of rendering the scene element of the metaspace according to the partitioned rendering workload and the rendering priority includes:
setting a plurality of cloud rendering nodes, wherein the cloud rendering nodes comprise high-performance cloud rendering nodes, medium-performance cloud rendering nodes and low-performance cloud rendering nodes;
the high-performance cloud rendering node is used for rendering the region with the primary rendering workload, the medium-performance cloud rendering node is used for rendering the region with the secondary rendering workload, and the low-performance cloud rendering node is used for rendering the region with the tertiary rendering workload;
acquiring the rendering workload and the region rendering priority of each region, screening out the region with the highest region rendering priority, adopting a high-performance cloud rendering node to render the region with the screened rendering workload as primary rendering workload, adopting a medium-performance cloud rendering node to render the region with the screened rendering workload as secondary rendering workload, and adopting a low-performance cloud rendering node to render the region with the screened rendering workload as tertiary rendering workload;
screening out the region with the region rendering priority being the first region rendering priority after the rendering is finished, adopting a high-performance cloud rendering node to render the region with the screened rendering workload being the primary rendering workload, and adopting a medium-performance cloud rendering node to render the region with the screened rendering workload being the secondary rendering workload;
screening out the area with the area rendering priority being the second area rendering priority after the rendering is finished, adopting a high-performance cloud rendering node to render the area with the screened rendering workload being the primary rendering workload, adopting a medium-performance cloud rendering node to render the area with the screened rendering workload being the secondary rendering workload, and adopting a low-performance cloud rendering node to render the area with the screened rendering workload being the tertiary rendering workload;
screening out the area with the area rendering priority being the third area rendering priority after the rendering is finished, adopting a high-performance cloud rendering node to render the area with the screened rendering workload being the primary rendering workload, adopting a medium-performance cloud rendering node to render the area with the screened rendering workload being the secondary rendering workload, and adopting a low-performance cloud rendering node to render the area with the screened rendering workload being the tertiary rendering workload;
after the rendering is completed, rendering the remaining area by adopting a low-performance cloud rendering node;
the scene rendering of the metaspace is completed, and the fusion of the metaspace and the cloud rendering is successful.
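The rendering order described above can be sketched as one loop over descending region rendering priorities, dispatching each region to the node tier matching its workload grade (high-performance for primary, medium for secondary, low for tertiary, as stated earlier). The sketch treats every priority pass uniformly, although the description omits the low-performance tier in the first-priority pass; region IDs are illustrative.

```python
NODE_FOR_WORKLOAD = {
    "primary":   "high-performance",
    "secondary": "medium-performance",
    "tertiary":  "low-performance",
}
PRIORITY_ORDER = ["highest", "first", "second", "third", "fourth"]

def schedule(regions):
    """regions: list of (region_id, region_priority, workload).
    Returns the dispatch order as (region_id, node_tier) pairs,
    highest region rendering priority first."""
    plan = []
    for prio in PRIORITY_ORDER:
        for rid, p, workload in regions:
            if p == prio:
                plan.append((rid, NODE_FOR_WORKLOAD[workload]))
    return plan
```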
The above embodiments are only intended to illustrate the technical method of the present invention, not to limit it; those skilled in the art should understand that the technical method of the present invention may be modified or substituted without departing from its spirit and scope.

Claims (8)

1. A fusion method for metaspace and cloud rendering, comprising the steps of:
step S1: creating a metaspace and a scene element applied to the metaspace;
step S2: dividing the created scene elements into rendering workload and rendering priority;
step S3: and rendering the scene elements of the metaspace according to the divided rendering workload and the rendering priority.
2. The fusion method for metaspace and cloud rendering of claim 1, wherein creating metaspace and scene elements applied within the metaspace comprises:
creating a three-dimensional coordinate system in the created metaspace, setting corresponding nodes in the metaspace, and obtaining the three-dimensional coordinate corresponding to each node; and creating a corresponding scene element according to the node, and fusing the created scene element with the corresponding node.
3. A fusion method for metaspace and cloud rendering as claimed in claim 2, wherein the process of partitioning the rendering workload for the created scene elements comprises:
dividing a metaspace into a plurality of areas, and setting scene complexity of scene elements in each area;
obtaining the resolution of scene elements in each region;
obtaining a rendering workload evaluation coefficient according to the set scene complexity and the acquired resolution;
setting a threshold range of a rendering workload evaluation coefficient;
obtaining the rendering workload of each region according to the comparison result of the obtained rendering workload evaluation coefficient and the threshold range of the set rendering workload evaluation coefficient;
the rendering workload includes a primary rendering workload, a secondary rendering workload, and a tertiary rendering workload.
4. A fusion method for metaspace and cloud rendering according to claim 3, wherein the process of setting scene complexity of scene elements in each region comprises:
acquiring geometric details of scene elements in each region, and setting a threshold range of the geometric details;
and obtaining the scene complexity of the scene elements in each region according to the obtained comparison result of the geometric details of the scene elements in each region and the set threshold range of the geometric details.
5. The fusion method for metaspace and cloud rendering of claim 3, wherein prioritizing the created scene elements comprises:
acquiring the view-angle range of the user character, and setting the scene rendering priority of the scene elements in each region;
determining the region rendering priority of each region according to the view-angle range of the user character, the rendering workload of each region, and the scene rendering priority of the scene elements in each region;
the region rendering priorities include a highest region rendering priority, a first region rendering priority, a second region rendering priority, a third region rendering priority, and a fourth region rendering priority.
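Claim 5 names the three inputs to the region rendering priority but not the combining rule. One hypothetical scoring rule, in which out-of-view regions drop to the lowest priority and in-view regions are ranked by workload tier plus scene rendering priority (the scoring scheme is an assumption, not claim text):

```python
# Hypothetical sketch of claim 5: derive a region rendering priority from
# (a) whether the region falls inside the user character's view-angle range,
# (b) the region's rendering workload tier, and (c) the scene rendering
# priority of its elements. The additive scoring rule is an assumption.

WORKLOAD_SCORE = {"primary": 2, "secondary": 1, "tertiary": 0}
PRIORITIES = ["highest", "first", "second", "third", "fourth"]

def region_priority(in_view: bool, workload: str, scene_priority: int) -> str:
    """scene_priority: 0 (low) .. 2 (high), as set per claim 6."""
    if not in_view:
        return "fourth"  # out-of-view regions are rendered last
    score = WORKLOAD_SCORE[workload] + scene_priority  # 0..4
    return PRIORITIES[4 - score]  # higher score -> higher priority
```

For example, an in-view region with a primary workload and high scene rendering priority scores 4 and receives the highest region rendering priority.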
6. The fusion method for metaspace and cloud rendering of claim 5, wherein setting the scene rendering priority of the scene elements in each region comprises:
acquiring the number of pixels occupied by the scene elements in each region, and setting a threshold range for the pixel count;
obtaining the scene rendering priority of the scene elements in each region by comparing the acquired pixel count against the set threshold range.
7. The fusion method for metaspace and cloud rendering of claim 5, wherein the process of rendering the scene elements of the metaspace according to the divided rendering workload and rendering priority comprises:
setting a plurality of cloud rendering nodes, the cloud rendering nodes including high-performance, medium-performance, and low-performance cloud rendering nodes;
acquiring the rendering workload and region rendering priority of each region, screening out the regions whose region rendering priority is the highest region rendering priority, and rendering them with the corresponding cloud rendering nodes according to their rendering workload;
after the preceding rendering is completed, screening out the regions whose region rendering priority is the first region rendering priority, and rendering them with the corresponding cloud rendering nodes according to their rendering workload;
after the preceding rendering is completed, screening out the regions whose region rendering priority is the second region rendering priority, and rendering them with the corresponding cloud rendering nodes according to their rendering workload;
after the preceding rendering is completed, screening out the regions whose region rendering priority is the third region rendering priority, and rendering them with the corresponding cloud rendering nodes according to their rendering workload;
after the preceding rendering is completed, rendering the remaining regions with low-performance cloud rendering nodes.
8. The fusion method for metaspace and cloud rendering of claim 7, wherein when the rendering workload of a region is the primary rendering workload, the region is rendered by a high-performance cloud rendering node;
when the rendering workload of a region is the secondary rendering workload, the region is rendered by a medium-performance cloud rendering node;
and when the rendering workload of a region is the tertiary rendering workload, the region is rendered by a low-performance cloud rendering node.
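The scheduling described in claims 7 and 8 can be sketched as a priority-ordered dispatch loop. The `Region` tuple and node-class names below are illustrative assumptions; the claims specify only the priority order and the workload-to-node mapping:

```python
# Hypothetical sketch of claims 7-8: process regions in descending region
# rendering priority, dispatching each to a cloud rendering node class
# matched to its workload tier. The data structures are assumptions.

from collections import namedtuple

Region = namedtuple("Region", ["name", "priority", "workload"])

# Claim 8 mapping: workload tier -> cloud rendering node performance class.
NODE_FOR_WORKLOAD = {"primary": "high-performance",
                     "secondary": "medium-performance",
                     "tertiary": "low-performance"}

PRIORITY_ORDER = ["highest", "first", "second", "third", "fourth"]

def render_schedule(regions):
    """Return (region, node) pairs in the order claim 7 renders them."""
    schedule = []
    for prio in PRIORITY_ORDER:
        for region in regions:
            if region.priority == prio:
                if prio == "fourth":
                    # Per claim 7, remaining regions all go to
                    # low-performance nodes regardless of workload.
                    schedule.append((region.name, "low-performance"))
                else:
                    schedule.append((region.name,
                                     NODE_FOR_WORKLOAD[region.workload]))
    return schedule
```

Note the asymmetry the claims describe: within each of the four named priority tiers the node is chosen by workload (claim 8), but the leftover fourth-tier regions are uniformly sent to low-performance nodes.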
CN202410096289.5A 2024-01-24 2024-01-24 Fusion method for metaspace and cloud rendering Active CN117611472B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410096289.5A CN117611472B (en) 2024-01-24 2024-01-24 Fusion method for metaspace and cloud rendering

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410096289.5A CN117611472B (en) 2024-01-24 2024-01-24 Fusion method for metaspace and cloud rendering

Publications (2)

Publication Number Publication Date
CN117611472A true CN117611472A (en) 2024-02-27
CN117611472B CN117611472B (en) 2024-04-09

Family

ID=89960214

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410096289.5A Active CN117611472B (en) 2024-01-24 2024-01-24 Fusion method for metaspace and cloud rendering

Country Status (1)

Country Link
CN (1) CN117611472B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012140360A1 (en) * 2011-04-12 2012-10-18 Real Fusio France Method and system for rendering a virtual scene in three dimensions
CN110738721A (en) * 2019-10-12 2020-01-31 四川航天神坤科技有限公司 Three-dimensional scene rendering acceleration method and system based on video geometric analysis
WO2022116759A1 (en) * 2020-12-03 2022-06-09 Tencent Technology (Shenzhen) Co., Ltd. Image rendering method and apparatus, and computer device and storage medium
WO2022127278A1 (en) * 2020-12-18 2022-06-23 Perfect World (Beijing) Software Technology Development Co., Ltd. Method and apparatus for rendering virtual scene
CN115512019A (en) * 2021-06-21 2022-12-23 华为云计算技术有限公司 Rendering method, device and system
CN115984519A (en) * 2022-12-27 2023-04-18 富春科技股份有限公司 VR-based space scene display method, system and storage medium
CN116028176A (en) * 2022-12-25 2023-04-28 中势科技有限公司 Resource scheduling method applied to meta universe
CN116152410A (en) * 2022-12-02 2023-05-23 浙江毫微米科技有限公司 Method, system, equipment and storage medium for rendering images in meta universe
CN117036574A (en) * 2023-08-11 2023-11-10 北京百度网讯科技有限公司 Rendering method, rendering device, electronic equipment and storage medium
CN117271749A (en) * 2023-11-04 2023-12-22 北京蔚领时代科技有限公司 Creation method and computer for non-player characters in meta-universe scene


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ZHAO, J. et al.: "Research progress of virtual view rendering technology for light field display", Liquid Crystals and Displays, vol. 38, no. 10, 5 October 2023 (2023-10-05), pages 1361-1371 *
ZHU, F. et al.: "Gaze-Contingent Rendering in Virtual Reality", 37th Computer Graphics International (CGI) Conference, vol. 12221, 18 October 2022 (2022-10-18), pages 16-23 *
CAO, Yong: "Design and Optimization of an Efficient Indoor Scene Rendering System", CNKI China Masters' Theses Full-text Database (Information Science and Technology), no. 4, 15 April 2021 (2021-04-15), pages 138-737 *
SHEN, Jianqi: "Research on Edge-Based Resource Allocation Algorithms in the Metaverse", CNKI China Masters' Theses Full-text Database (Information Science and Technology), no. 7, 15 July 2023 (2023-07-15), pages 136-613 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant