CN117611472B - Fusion method for metaspace and cloud rendering - Google Patents
- Publication number
- CN117611472B (application number CN202410096289.5A)
- Authority
- CN
- China
- Prior art keywords
- rendering
- region
- scene
- workload
- priority
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/136—Segmentation; Edge detection involving thresholding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
The invention discloses a fusion method for metaspace and cloud rendering, relating to the field of image data processing. The method creates a metaspace and the scene elements applied in it, divides the created scene elements by rendering workload and rendering priority, and renders the scene elements of the metaspace according to the divided rendering workload and rendering priority. Specifically, a three-dimensional coordinate system is created, nodes are set, scene elements are created and fused with the corresponding nodes; the rendering workload is divided according to the scene complexity and resolution of the scene elements; the region rendering priority is divided according to the rendering workload and the scene rendering priority; and a plurality of cloud rendering nodes are set and used to render the scene elements of the metaspace in order of region rendering priority from high to low. This improves rendering efficiency, ensures that key models obtain higher rendering priority, and improves the overall user experience.
Description
Technical Field
The invention relates to the field of image data processing, in particular to a fusion method for metaspace and cloud rendering.
Background
The metaverse is a virtual world constructed and presented through digital technology, in which a large number of social relations, such as business, entertainment and education, can be reconstructed. Real-time cloud rendering is a cloud rendering technology that renders scenes in real time during the rendering process, so that real-world scenes can be simulated more faithfully. The metaverse is closely related to real-time cloud rendering: the virtual world is presented through cloud rendering, and real-time cloud rendering can provide higher-quality, more realistic virtual scenes. Because of its strong real-time performance, cloud rendering can complete rendering work more quickly and efficiently when processing large-scale rendering tasks, shortening the loading time of the metaverse.
How to complete rendering work more quickly and efficiently, ensure that key models obtain higher rendering priority, and improve the overall user experience is a problem that needs to be solved. A fusion method for metaspace and cloud rendering is therefore provided.
Disclosure of Invention
In order to solve the technical problems, the invention aims to provide a fusion method for metaspace and cloud rendering.
The aim of the invention can be achieved by the following technical scheme: a fusion method for metaspace and cloud rendering, comprising the steps of:
step S1: creating a metaspace and a scene element applied to the metaspace;
step S2: dividing the created scene elements into rendering workload and rendering priority;
step S3: and rendering the scene elements of the metaspace according to the divided rendering workload and the rendering priority.
Further, the process of creating a metaspace and a scene element applied in the metaspace includes:
creating a three-dimensional coordinate system in the created metaspace, setting corresponding nodes in the metaspace, and obtaining the three-dimensional coordinate corresponding to each node; and creating a corresponding scene element according to the node, and fusing the created scene element with the corresponding node.
Further, the process of dividing the rendering workload for the created scene element includes:
dividing a metaspace into a plurality of areas, and setting scene complexity of scene elements in each area;
obtaining the resolution of scene elements in each region;
obtaining a rendering workload evaluation coefficient according to the set scene complexity and the acquired resolution;
setting a threshold range of a rendering workload evaluation coefficient;
obtaining the rendering workload of each region according to the comparison result of the obtained rendering workload evaluation coefficient and the threshold range of the set rendering workload evaluation coefficient;
the rendering workload includes a primary rendering workload, a secondary rendering workload, and a tertiary rendering workload.
Further, the process of setting the scene complexity of the scene elements in each region includes:
acquiring geometric details of scene elements in each region, and setting a threshold range of the geometric details;
and obtaining the scene complexity of the scene elements in each region according to the obtained comparison result of the geometric details of the scene elements in each region and the set threshold range of the geometric details.
Further, the process of prioritizing rendering of the created scene elements includes:
acquiring the view angle range of the user character, and setting the scene rendering priority of the scene elements in each region;
judging the zone rendering priority of each zone according to the view angle range of the user character, the rendering workload of each zone and the scene rendering priority of the scene elements in each zone;
the region rendering priorities comprise a highest region rendering priority, a first region rendering priority, a second region rendering priority, a third region rendering priority and a fourth region rendering priority;
if the scene rendering priority of the scene elements in the region is the first scene rendering priority and the rendering workload of the region is the primary rendering workload, marking the region rendering priority of the region as the first region rendering priority;
if the scene rendering priority of the scene elements in the region is the first scene rendering priority and the rendering workload of the region is the secondary rendering workload, marking the region rendering priority of the region as the first region rendering priority;
if the scene rendering priority of the scene elements in the region is the first scene rendering priority and the rendering workload of the region is the tertiary rendering workload, marking the region rendering priority of the region as the second region rendering priority;
if the scene rendering priority of the scene elements in the region is the second scene rendering priority and the rendering workload of the region is the primary rendering workload, marking the region rendering priority of the region as the second region rendering priority;
if the scene rendering priority of the scene elements in the region is the second scene rendering priority and the rendering workload of the region is the secondary rendering workload, marking the region rendering priority of the region as the second region rendering priority;
if the scene rendering priority of the scene elements in the region is the second scene rendering priority and the rendering workload of the region is the tertiary rendering workload, marking the region rendering priority of the region as the third region rendering priority;
if the scene rendering priority of the scene elements in the region is the third scene rendering priority and the rendering workload of the region is the primary rendering workload, marking the region rendering priority of the region as the third region rendering priority;
if the scene rendering priority of the scene elements in the region is the third scene rendering priority and the rendering workload of the region is the secondary rendering workload, marking the region rendering priority of the region as the third region rendering priority;
if the scene rendering priority of the scene elements in the region is the third scene rendering priority and the rendering workload of the region is the tertiary rendering workload, marking the region rendering priority of the region as the fourth region rendering priority;
and obtaining the regions within the view angle range of the user character according to that view angle range, ignoring any region rendering priority previously assigned to those regions, and marking the region rendering priority of the regions within the view angle range as the highest region rendering priority.
Further, the region rendering priorities are ordered from high to low as: the highest region rendering priority, the first region rendering priority, the second region rendering priority, the third region rendering priority, and the fourth region rendering priority.
Further, the process of setting the scene rendering priority of the scene elements in each region includes:
acquiring the number of pixel points of scene elements in each region, and setting a threshold range of the pixel points;
and obtaining scene rendering priority of the scene elements in each region according to the obtained comparison result of the number of the pixel points of the scene elements in each region and the set threshold range of the pixel points.
Further, the process of rendering the scene elements of the metaspace according to the partitioned rendering workload and the rendering priority includes:
setting a plurality of cloud rendering nodes, wherein the cloud rendering nodes comprise high-performance cloud rendering nodes, medium-performance cloud rendering nodes and low-performance cloud rendering nodes;
acquiring the rendering workload and the region rendering priority of each region, screening out the region with the highest region rendering priority, adopting a high-performance cloud rendering node to render the region with the screened rendering workload as primary rendering workload, adopting a medium-performance cloud rendering node to render the region with the screened rendering workload as secondary rendering workload, and adopting a low-performance cloud rendering node to render the region with the screened rendering workload as tertiary rendering workload;
screening out the region with the region rendering priority being the first region rendering priority after the rendering is finished, adopting a high-performance cloud rendering node to render the region with the screened rendering workload being the primary rendering workload, and adopting a medium-performance cloud rendering node to render the region with the screened rendering workload being the secondary rendering workload;
screening out the area with the area rendering priority being the second area rendering priority after the rendering is finished, adopting a high-performance cloud rendering node to render the area with the screened rendering workload being the primary rendering workload, adopting a medium-performance cloud rendering node to render the area with the screened rendering workload being the secondary rendering workload, and adopting a low-performance cloud rendering node to render the area with the screened rendering workload being the tertiary rendering workload;
screening out the area with the area rendering priority being the third area rendering priority after the rendering is finished, adopting a high-performance cloud rendering node to render the area with the screened rendering workload being the primary rendering workload, adopting a medium-performance cloud rendering node to render the area with the screened rendering workload being the secondary rendering workload, and adopting a low-performance cloud rendering node to render the area with the screened rendering workload being the tertiary rendering workload;
and after the rendering is completed, rendering the remaining area by adopting a low-performance cloud rendering node.
Compared with the prior art, the invention has the following beneficial effects: a three-dimensional coordinate system is created in the metaspace and nodes are set; corresponding scene elements are created according to the set nodes and fused with the corresponding nodes; the scene elements of the metaspace are divided by rendering workload according to their scene complexity and resolution, and by region rendering priority according to the rendering workload and the scene rendering priority; a plurality of cloud rendering nodes are set and used to render the scene elements of the metaspace in order of region rendering priority from high to low. The rendering work is thus completed quickly and efficiently, key models obtain higher rendering priority, and the overall user experience is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the embodiments are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained from these drawings by a person of ordinary skill in the art.
Fig. 1 is a schematic diagram of the present invention.
Detailed Description
As shown in fig. 1, a fusion method for metaspace and cloud rendering includes the following steps:
step S1: creating a metaspace and a scene element applied to the metaspace;
step S2: dividing the created scene elements into rendering workload and rendering priority;
step S3: rendering the scene elements of the metaspace according to the divided rendering workload and rendering priority;
it should be further noted that, in the implementation process, the process of creating the metaspace and the scene element applied in the metaspace includes:
creating a three-dimensional coordinate system in the created metaspace, setting corresponding nodes in the metaspace, and obtaining the three-dimensional coordinate corresponding to each node; creating corresponding scene elements according to the nodes, associating one scene element with each node, and fusing the created scene elements with the corresponding nodes;
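As an illustration, the node-and-scene-element fusion described above can be sketched as follows; the `Node` and `SceneElement` structures and their field names are hypothetical, not part of the invention:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class SceneElement:
    name: str
    geometric_detail: int   # e.g. the number of colors the element contains
    resolution: float

@dataclass
class Node:
    node_id: int
    coord: Tuple[float, float, float]   # position in the metaspace coordinate system
    element: Optional[SceneElement] = None

def fuse(node: Node, element: SceneElement) -> Node:
    """Bind one scene element to one node (the fusion step above)."""
    node.element = element
    return node

# set a node at a three-dimensional coordinate and fuse one scene element with it
n = fuse(Node(1, (0.0, 2.5, -4.0)),
         SceneElement("tree", geometric_detail=120, resolution=0.8))
```

Each node carries exactly one scene element, matching the one-to-one association stated above.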
it should be further noted that, in the implementation process, the process of dividing the created scene element into the rendering workload and the rendering priority includes:
dividing the metaspace into a plurality of regions, marking the divided regions as i, where i = 1, 2, 3, …, n, n ≥ 1 and n is an integer;
obtaining the geometric details of the scene elements in each region, denoted C_i;
the geometric details are the number of colors contained in the scene element;
setting threshold ranges of the geometric details, denoted (C0, C1) and (C1, C2);
setting scene complexity of scene elements in each region according to the acquired geometric details of the scene elements in each region and the set threshold range of the geometric details;
the scene complexity includes a first scene complexity, a second scene complexity, a third scene complexity, and a fourth scene complexity;
the more geometric details of scene elements within the region, the higher the scene complexity of scene elements within the region;
when C_i ≤ C0, setting the scene complexity of the scene elements in the region as the first scene complexity;
when C0 < C_i < C1, setting the scene complexity of the scene elements in the region as the second scene complexity;
when C1 ≤ C_i ≤ C2, setting the scene complexity of the scene elements in the region as the third scene complexity;
when C_i > C2, setting the scene complexity of the scene elements in the region as the fourth scene complexity;
the scene complexity of the scene elements in each region is denoted F_i: when the scene complexity is set to the first scene complexity, F_i = a; when set to the second scene complexity, F_i = b; when set to the third scene complexity, F_i = c; when set to the fourth scene complexity, F_i = d;
It should be further noted that, in the implementation process, the first scene complexity is higher than the second, the second higher than the third, and the third higher than the fourth, that is, a > b > c > d;
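A minimal sketch of the geometric-detail thresholding above; the values of C0, C1, C2 and of the weights a > b > c > d are hypothetical, since the patent does not fix them:

```python
# hypothetical thresholds and complexity weights (a > b > c > d)
C0, C1, C2 = 50, 100, 200
WEIGHTS = {"first": 4.0, "second": 3.0, "third": 2.0, "fourth": 1.0}  # a, b, c, d

def scene_complexity(c_i: int) -> str:
    """Classify a region by the geometric detail C_i of its scene elements."""
    if c_i <= C0:
        return "first"        # C_i <= C0
    if c_i < C1:
        return "second"       # C0 < C_i < C1
    if c_i <= C2:
        return "third"        # C1 <= C_i <= C2
    return "fourth"           # C_i > C2

f_i = WEIGHTS[scene_complexity(120)]   # 120 lies in [C1, C2] -> "third", f_i = 2.0
```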
obtaining the resolution of scene elements in each region;
the resolution of the scene elements in each region is denoted X_i;
setting a rendering workload evaluation coefficient according to the acquired resolution of the scene elements in each region and the set scene complexity, denoted XV_i;
wherein XV_i = s1 × X_i + s2 × F_i, s1 being the weight coefficient of the resolution of the scene element, s2 being the weight coefficient of the scene complexity of the scene element, and the values of s1 and s2 both being greater than zero;
setting a threshold range of the rendering workload evaluation coefficient, denoted (XV0, XV1);
when XV_i ≤ XV0, the rendering workload of the region is recorded as the tertiary rendering workload;
when XV0 < XV_i < XV1, the rendering workload of the region is recorded as the secondary rendering workload;
when XV_i ≥ XV1, the rendering workload of the region is recorded as the primary rendering workload;
it should be further noted that, in the implementation process, the primary rendering workload is greater than the secondary rendering workload, which is greater than the tertiary rendering workload;
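The workload evaluation coefficient and its thresholding can be sketched as follows; the weights s1, s2 and the thresholds XV0, XV1 are hypothetical, since the patent only requires s1, s2 > 0:

```python
def workload_level(x_i: float, f_i: float,
                   s1: float = 0.6, s2: float = 0.4,
                   xv0: float = 1.0, xv1: float = 2.0) -> str:
    """Compute XV_i = s1*X_i + s2*F_i and map it to a workload level."""
    xv = s1 * x_i + s2 * f_i
    if xv <= xv0:
        return "tertiary"     # XV_i <= XV0
    if xv < xv1:
        return "secondary"    # XV0 < XV_i < XV1
    return "primary"          # XV_i >= XV1

level = workload_level(x_i=2.0, f_i=3.0)   # xv = 1.2 + 1.2 = 2.4 -> "primary"
```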
acquiring the number of pixel points of the scene elements in each region, denoted N_i;
setting a threshold range of the pixel points, denoted (N0, N1);
setting scene rendering priority of scene elements in each region according to the acquired comparison result of the number of pixel points of the scene elements in each region and the set threshold range of the pixel points;
the scene rendering priorities comprise a first scene rendering priority, a second scene rendering priority and a third scene rendering priority;
the more pixels of scene elements in the region, the higher the scene rendering priority of the scene elements in the region;
when N_i ≤ N0, setting the scene rendering priority of the scene elements in the region as the first scene rendering priority;
when N0 < N_i < N1, setting the scene rendering priority of the scene elements in the region as the second scene rendering priority;
when N_i ≥ N1, setting the scene rendering priority of the scene elements in the region as the third scene rendering priority;
it should be further noted that, in the implementation process, the first scene rendering priority is higher than the second scene rendering priority, which is higher than the third scene rendering priority;
acquiring the view angle range of the user character;
judging the zone rendering priority of each zone according to the view angle range of the user character, the rendering workload of each zone and the scene rendering priority of the scene elements in each zone;
the region rendering priorities comprise a highest region rendering priority, a first region rendering priority, a second region rendering priority, a third region rendering priority and a fourth region rendering priority;
if the scene rendering priority of the scene elements in the region is the first scene rendering priority and the rendering workload of the region is the primary rendering workload, marking the region rendering priority of the region as the first region rendering priority;
if the scene rendering priority of the scene elements in the region is the first scene rendering priority and the rendering workload of the region is the secondary rendering workload, marking the region rendering priority of the region as the first region rendering priority;
if the scene rendering priority of the scene elements in the region is the first scene rendering priority and the rendering workload of the region is the tertiary rendering workload, marking the region rendering priority of the region as the second region rendering priority;
if the scene rendering priority of the scene elements in the region is the second scene rendering priority and the rendering workload of the region is the primary rendering workload, marking the region rendering priority of the region as the second region rendering priority;
if the scene rendering priority of the scene elements in the region is the second scene rendering priority and the rendering workload of the region is the secondary rendering workload, marking the region rendering priority of the region as the second region rendering priority;
if the scene rendering priority of the scene elements in the region is the second scene rendering priority and the rendering workload of the region is the tertiary rendering workload, marking the region rendering priority of the region as the third region rendering priority;
if the scene rendering priority of the scene elements in the region is the third scene rendering priority and the rendering workload of the region is the primary rendering workload, marking the region rendering priority of the region as the third region rendering priority;
if the scene rendering priority of the scene elements in the region is the third scene rendering priority and the rendering workload of the region is the secondary rendering workload, marking the region rendering priority of the region as the third region rendering priority;
if the scene rendering priority of the scene elements in the region is the third scene rendering priority and the rendering workload of the region is the tertiary rendering workload, marking the region rendering priority of the region as the fourth region rendering priority;
obtaining the regions within the view angle range of the user character according to that view angle range, ignoring any region rendering priority previously assigned to those regions, and marking the region rendering priority of the regions within the view angle range as the highest region rendering priority;
it should be further noted that, in the implementation process, the highest region rendering priority is higher than the first region rendering priority, the second region rendering priority is higher than the third region rendering priority, and the fourth region rendering priority.
It should be further noted that, in the implementation process, the process of rendering the scene element of the metaspace according to the partitioned rendering workload and the rendering priority includes:
setting a plurality of cloud rendering nodes, wherein the cloud rendering nodes comprise high-performance cloud rendering nodes, medium-performance cloud rendering nodes and low-performance cloud rendering nodes;
the high-performance cloud rendering node is used for rendering the region with the primary rendering workload, the medium-performance cloud rendering node is used for rendering the region with the secondary rendering workload, and the low-performance cloud rendering node is used for rendering the region with the tertiary rendering workload;
acquiring the rendering workload and the region rendering priority of each region, screening out the region with the highest region rendering priority, adopting a high-performance cloud rendering node to render the region with the screened rendering workload as primary rendering workload, adopting a medium-performance cloud rendering node to render the region with the screened rendering workload as secondary rendering workload, and adopting a low-performance cloud rendering node to render the region with the screened rendering workload as tertiary rendering workload;
screening out the region with the region rendering priority being the first region rendering priority after the rendering is finished, adopting a high-performance cloud rendering node to render the region with the screened rendering workload being the primary rendering workload, and adopting a medium-performance cloud rendering node to render the region with the screened rendering workload being the secondary rendering workload;
screening out the area with the area rendering priority being the second area rendering priority after the rendering is finished, adopting a high-performance cloud rendering node to render the area with the screened rendering workload being the primary rendering workload, adopting a medium-performance cloud rendering node to render the area with the screened rendering workload being the secondary rendering workload, and adopting a low-performance cloud rendering node to render the area with the screened rendering workload being the tertiary rendering workload;
screening out the area with the area rendering priority being the third area rendering priority after the rendering is finished, adopting a high-performance cloud rendering node to render the area with the screened rendering workload being the primary rendering workload, adopting a medium-performance cloud rendering node to render the area with the screened rendering workload being the secondary rendering workload, and adopting a low-performance cloud rendering node to render the area with the screened rendering workload being the tertiary rendering workload;
after the rendering is completed, rendering the remaining area by adopting a low-performance cloud rendering node;
the scene rendering of the metaspace is completed, and the fusion of the metaspace and the cloud rendering is successful.
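The tier-by-tier dispatch described above can be sketched as a simple scheduler; the node class names and the region tuple format are illustrative assumptions:

```python
# map each workload level to the class of cloud rendering node that serves it
NODE_FOR_WORKLOAD = {"primary": "high-performance",
                     "secondary": "medium-performance",
                     "tertiary": "low-performance"}

def schedule(regions):
    """regions: iterable of (region_id, region_rendering_priority, workload_level),
    with a lower priority number meaning a higher priority (0 = in-view, highest).
    Returns (region_id, node_class) pairs in rendering order: priority tiers from
    highest to lowest, each region assigned the node class matching its workload."""
    ordered = sorted(regions, key=lambda r: r[1])
    return [(rid, NODE_FOR_WORKLOAD[wl]) for rid, _, wl in ordered]

plan = schedule([(3, 2, "secondary"), (1, 0, "primary"), (2, 4, "tertiary")])
# the in-view region 1 is rendered first, on a high-performance node
```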
The above embodiments are only for illustrating the technical method of the present invention and not for limiting the same, and it should be understood by those skilled in the art that the technical method of the present invention may be modified or substituted without departing from the spirit and scope of the technical method of the present invention.
Claims (4)
1. A fusion method for metaspace and cloud rendering, comprising the steps of:
step S1: creating a metaspace and a scene element applied to the metaspace;
step S2: dividing the created scene elements into rendering workload and rendering priority;
step S3: rendering the scene elements of the metaspace according to the divided rendering workload and rendering priority;
the process of creating a metaspace and a scene element applied within the metaspace includes:
creating a three-dimensional coordinate system in the created metaspace, setting corresponding nodes in the metaspace, and obtaining the three-dimensional coordinate corresponding to each node; creating a corresponding scene element according to the node, and fusing the created scene element with the corresponding node;
the process of dividing the rendering workload for the created scene elements includes:
dividing a metaspace into a plurality of areas, and setting scene complexity of scene elements in each area;
obtaining the resolution of scene elements in each region;
obtaining a rendering workload evaluation coefficient according to the set scene complexity and the acquired resolution;
setting a threshold range of a rendering workload evaluation coefficient;
obtaining the rendering workload of each region according to the comparison result of the obtained rendering workload evaluation coefficient and the threshold range of the set rendering workload evaluation coefficient;
the rendering workload comprises primary rendering workload, secondary rendering workload and tertiary rendering workload;
the process of setting scene complexity of scene elements in each region comprises the following steps:
acquiring geometric details of scene elements in each region, and setting a threshold range of the geometric details;
obtaining scene complexity of scene elements in each region according to the obtained comparison result of the geometric details of the scene elements in each region and the set threshold range of the geometric details;
the process of prioritizing rendering of the created scene elements includes:
acquiring the view angle range of the user character, and setting the scene rendering priority of the scene elements in each region;
judging the zone rendering priority of each zone according to the view angle range of the user character, the rendering workload of each zone and the scene rendering priority of the scene elements in each zone;
the region rendering priorities include a highest region rendering priority, a first region rendering priority, a second region rendering priority, a third region rendering priority, and a fourth region rendering priority.
2. The fusion method for metaspace and cloud rendering of claim 1, wherein the process of setting scene rendering priorities of scene elements in each region comprises:
acquiring the number of pixel points of scene elements in each region, and setting a threshold range of the pixel points;
and obtaining the scene rendering priority of the scene elements in each region by comparing the acquired number of pixel points of the scene elements in each region with the set threshold range of the pixel points.
3. The fusion method for metaspace and cloud rendering according to claim 2, wherein the process of rendering the scene elements of the metaspace according to the divided rendering workload and rendering priority comprises:
setting a plurality of cloud rendering nodes, wherein the cloud rendering nodes comprise high-performance cloud rendering nodes, medium-performance cloud rendering nodes and low-performance cloud rendering nodes;
acquiring the rendering workload and the region rendering priority of each region, screening out a region with the region rendering priority being the highest region rendering priority, and rendering the region by adopting a corresponding cloud rendering node according to the rendering workload of the screened region;
screening out a region with the region rendering priority being the first region rendering priority after the rendering is completed, and rendering the region by adopting a corresponding cloud rendering node according to the rendering workload of the screened region;
screening out a region with the region rendering priority being the second region rendering priority after the rendering is completed, and rendering the region by adopting a corresponding cloud rendering node according to the rendering workload of the screened region;
screening out a region with the region rendering priority being the third region rendering priority after the rendering is completed, and rendering the region by adopting a corresponding cloud rendering node according to the rendering workload of the screened region;
and after the rendering is completed, rendering the remaining area by adopting a low-performance cloud rendering node.
4. A fusion method for metaspace and cloud rendering as claimed in claim 3, wherein when the rendering workload of the region is a primary rendering workload, rendering it by using a high-performance cloud rendering node;
when the rendering workload of the region is the secondary rendering workload, rendering the region by adopting a medium-performance cloud rendering node;
and when the rendering workload of the region is the tertiary rendering workload, rendering the region by adopting a low-performance cloud rendering node.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410096289.5A CN117611472B (en) | 2024-01-24 | 2024-01-24 | Fusion method for metaspace and cloud rendering |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410096289.5A CN117611472B (en) | 2024-01-24 | 2024-01-24 | Fusion method for metaspace and cloud rendering |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117611472A CN117611472A (en) | 2024-02-27 |
CN117611472B true CN117611472B (en) | 2024-04-09 |
Family
ID=89960214
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410096289.5A Active CN117611472B (en) | 2024-01-24 | 2024-01-24 | Fusion method for metaspace and cloud rendering |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117611472B (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012140360A1 (en) * | 2011-04-12 | 2012-10-18 | Real Fusio France | Method and system for rendering a virtual scene in three dimensions |
CN110738721A (en) * | 2019-10-12 | 2020-01-31 | 四川航天神坤科技有限公司 | Three-dimensional scene rendering acceleration method and system based on video geometric analysis |
WO2022116759A1 (en) * | 2020-12-03 | 2022-06-09 | 腾讯科技(深圳)有限公司 | Image rendering method and apparatus, and computer device and storage medium |
WO2022127278A1 (en) * | 2020-12-18 | 2022-06-23 | 完美世界(北京)软件科技发展有限公司 | Method and apparatus for rendering virtual scene |
CN115512019A (en) * | 2021-06-21 | 2022-12-23 | 华为云计算技术有限公司 | Rendering method, device and system |
CN115984519A (en) * | 2022-12-27 | 2023-04-18 | 富春科技股份有限公司 | VR-based space scene display method, system and storage medium |
CN116028176A (en) * | 2022-12-25 | 2023-04-28 | 中势科技有限公司 | Resource scheduling method applied to meta universe |
CN116152410A (en) * | 2022-12-02 | 2023-05-23 | 浙江毫微米科技有限公司 | Method, system, equipment and storage medium for rendering images in meta universe |
CN117036574A (en) * | 2023-08-11 | 2023-11-10 | 北京百度网讯科技有限公司 | Rendering method, rendering device, electronic equipment and storage medium |
CN117271749A (en) * | 2023-11-04 | 2023-12-22 | 北京蔚领时代科技有限公司 | Creation method and computer for non-player characters in meta-universe scene |
Non-Patent Citations (4)
Title |
---|
Gaze-Contingent Rendering in Virtual Reality; Zhu, F et al.; 37th Computer Graphics International (CGI) Conference; 2022-10-18; Vol. 12221; 16-23 *
Research progress of virtual view rendering technology for light field display; Zhao, J et al.; Liquid Crystals and Displays; 2023-10-05; Vol. 38, No. 10; 1361-1371 *
Research on Edge-Based Resource Allocation Algorithms in the Metaverse; Shen Jianqi; CNKI China Masters' Theses Full-text Database (Information Science and Technology); 2023-07-15; No. 7; I136-613 *
Design and Optimization of an Efficient Indoor Scene Rendering System; Cao Yong; CNKI China Masters' Theses Full-text Database (Information Science and Technology); 2021-04-15; No. 4; I138-737 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110969589B (en) | Blind restoration method for blurred dynamic-scene images based on a multi-stream attention adversarial network | |
KR101964282B1 (en) | 2d image data generation system using of 3d model, and thereof method | |
CN108182657A (en) | Face image conversion method based on a cycle generative adversarial network | |
WO2002091302A3 (en) | Image sequence enhancement system and method | |
CN105678216A (en) | Spatio-temporal data stream video behavior recognition method based on deep learning | |
CN102741879A (en) | Method for generating depth maps from monocular images and systems using the same | |
CN110264405B (en) | Image processing method, device, server and storage medium based on interpolation algorithm | |
JP2007158510A (en) | Image processor and its control method, computer program, and computer readable memory medium | |
JPH03218581A (en) | Picture segmentation method | |
CN111091151B (en) | Construction method of a generative adversarial network for object detection data augmentation | |
CN101510299A (en) | Adaptive image processing method based on visual saliency | |
GB2606785A (en) | Adaptive convolutions in neural networks | |
CN102750685A (en) | Image processing method and device | |
CN106101858A (en) | Video generation method and device | |
CN113052764A (en) | Video sequence super-resolution reconstruction method based on residual connection | |
CN117611472B (en) | Fusion method for metaspace and cloud rendering | |
CN107203961B (en) | Expression migration method and electronic equipment | |
CN102542528B (en) | Image conversion processing method and system | |
Soni et al. | Removal of high-density salt-and-pepper noise by a modified median filter | |
CN115457448B (en) | Intelligent extraction system for video key frames | |
CN115527258A (en) | Face exchange method based on identity information response | |
CN110772790B (en) | Method and system for resetting strange area of game map brush | |
Huang et al. | Image dehazing in disproportionate haze distributions | |
CN106648634A (en) | Screen shot method and screen shot device | |
CN108389208B (en) | Intelligent image adaptive display method based on semantic segmentation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||