WO2022135050A1 - Rendering method, device, and system - Google Patents

Rendering method, device, and system

Info

Publication number
WO2022135050A1
Authority
WO
WIPO (PCT)
Prior art keywords
rendering
target scene
processing
angle
data
Prior art date
Application number
PCT/CN2021/133713
Other languages
English (en)
French (fr)
Inventor
尹青
谢坤
Original Assignee
华为云计算技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为云计算技术有限公司
Priority to EP21909049.5A (published as EP4258218A4)
Publication of WO2022135050A1
Priority to US18/338,835 (published as US20230351671A1)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/005 General purpose rendering architectures
    • G06T15/10 Geometric effects
    • G06T15/20 Perspective computation
    • G06T2210/00 Indexing scheme for image generation or computer graphics
    • G06T2210/61 Scene description

Definitions

  • the present application relates to the field of three-dimensional rendering, and in particular, to a rendering method, device and system.
  • Rendering refers to the process of using software to generate an image from a model, where a model is a description of a three-dimensional object in a strictly defined language or data structure, including geometry, viewpoint, texture, and lighting information.
  • the images are digital images or bitmap images.
  • Rendering is a term similar to "an artist's rendering of a scene", and is also used to describe "the process of computing effects in a video editing file to produce the final video output”.
  • Rendering can include pre-rendering/offline rendering or real-time rendering/online rendering, where pre-rendering is usually used for simulating real scenes with predetermined scripts, such as movies and advertisements; real-time rendering is usually used for unscripted real-time simulation, such as flight training, 3D games, and interactive architectural demonstrations.
  • Real-time rendering usually uses rasterization rendering. However, rasterization rendering requires a large amount of computation, which can lead to considerable waste of computing resources.
  • the present application provides a rendering method, device and system, which can effectively save computing resources.
  • in a first aspect, a rendering method is provided, which is applied to a rendering application server, where the rendering application server belongs to a rendering system, the rendering system includes a rendering application client and a rendering engine, and the rendering application server and the rendering engine are deployed on a remote rendering node. The method includes: receiving a first rendering request and a second rendering request, where the first rendering request indicates a target scene and a first angle for observing the target scene, and the second rendering request indicates the target scene and a second angle for observing the target scene; performing, by the rendering engine, non-perspective-related processing on the data of the target scene to obtain non-perspective processing data; performing, by the rendering engine, processing including perspective-related processing according to the first angle for observing the target scene and the non-perspective processing data, to obtain a first rendered image; and performing, by the rendering engine, processing including perspective-related processing according to the second angle for observing the target scene and the non-perspective processing data, to obtain a second rendered image.
  • performing, by the rendering engine, the non-perspective-related processing on the data of the target scene to obtain the non-perspective processing data includes: invoking, by the rendering engine, a target image rendering pipeline to perform the non-perspective-related processing on the data of the target scene.
  • the non-perspective-related processing does not include processing related to either the first angle for observing the target scene or the second angle for observing the target scene.
  • the perspective-related processing includes processing related to the first angle for observing the target scene or the second angle for observing the target scene.
  • the non-perspective-related processing includes one or more of vertex specification, vertex shader processing, tessellation, and geometry shader processing; the perspective-related processing includes one or more of clipping and culling.
  • the first rendering request includes: an identifier of the target scene and the first angle for observing the target scene; or, geometric data, texture data, and material data of part or all of the meshes in the target scene, and the first angle for observing the target scene.
  • the second rendering request includes: an identifier of the target scene and the second angle for observing the target scene; or, geometric data, texture data, and material data of part or all of the meshes in the target scene, and the second angle for observing the target scene.
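  • For illustration, the following C++ sketch shows the two forms such a rendering request can take according to the first aspect; all type and field names are the editor's assumptions, not identifiers from the patent.

```cpp
#include <cstdint>
#include <optional>
#include <string>
#include <vector>

// Hypothetical sketch of a rendering request: either the scene is identified
// by an ID that the platform resolves to stored data, or the request carries
// the data of part or all of the meshes directly. Names are illustrative.
struct MeshData {
    std::vector<float>   geometry;  // vertex positions, normals, etc.
    std::vector<uint8_t> texture;   // per-mesh texture data
    std::string          material;  // e.g. "metallic", "specular", "diffuse"
};

struct ViewingAngle {
    float distance;  // vertical distance from the viewpoint to the image
    float theta;     // angle between the viewpoint-to-center line and horizontal
};

struct RenderingRequest {
    // Form 1: the scene is identified by ID; the platform looks up its data.
    std::optional<std::string> sceneId;
    // Form 2: the request carries part or all of the scene data directly.
    std::vector<MeshData> meshes;
    ViewingAngle angle;  // the angle from which this user observes the scene
};
```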
  • in a second aspect, a rendering application server is provided, which includes: a communication module and a rendering module;
  • the communication module is configured to receive a first rendering request and a second rendering request, wherein the first rendering request indicates a target scene and a first angle for observing the target scene, and the second rendering request indicates the target scene and a second angle to observe the target scene;
  • the rendering module is configured to perform non-perspective-related processing on the data of the target scene through the rendering engine to obtain non-perspective processing data;
  • the rendering module is configured to obtain a first rendered image by performing, through the rendering engine, processing including viewing-angle-related processing according to the first angle for observing the target scene and the non-perspective processing data;
  • the rendering module is configured to obtain a second rendered image by performing, through the rendering engine, processing including viewing-angle-related processing according to the second angle for observing the target scene and the non-perspective processing data.
  • performing, by the rendering engine, the non-perspective-related processing on the data of the target scene to obtain the non-perspective processing data includes: invoking, by the rendering engine, a target image rendering pipeline to perform the non-perspective-related processing on the data of the target scene.
  • the non-perspective-related processing does not include processing related to either the first angle for observing the target scene or the second angle for observing the target scene.
  • the perspective-related processing includes processing related to the first angle for observing the target scene or the second angle for observing the target scene.
  • the non-perspective-related processing includes one or more of vertex specification, vertex shader processing, tessellation, and geometry shader processing; the perspective-related processing includes one or more of clipping and culling.
  • the first rendering request includes: an identifier of the target scene and the first angle for observing the target scene; or, geometric data, texture data, and material data of part or all of the meshes in the target scene, and the first angle for observing the target scene.
  • the second rendering request includes: an identifier of the target scene and the second angle for observing the target scene; or, geometric data, texture data, and material data of part or all of the meshes in the target scene, and the second angle for observing the target scene.
  • in a third aspect, a rendering system is provided, which includes a rendering application server, a rendering application client, and a rendering engine, where the rendering application server and the rendering engine are deployed on a remote rendering node;
  • the rendering application server is configured to receive a first rendering request and a second rendering request, where the first rendering request indicates a target scene and a first angle for observing the target scene, and the second rendering request indicates the target scene and a second angle for observing the target scene;
  • the rendering application server is configured to perform non-perspective-related processing on the data of the target scene through the rendering engine to obtain non-perspective processing data;
  • the rendering application server is configured to obtain a first rendered image by performing, through the rendering engine, processing including viewing-angle-related processing according to the first angle for observing the target scene and the non-perspective processing data;
  • the rendering application server is configured to obtain a second rendered image by performing, through the rendering engine, processing including viewing-angle-related processing according to the second angle for observing the target scene and the non-perspective processing data.
  • performing, by the rendering engine, the non-perspective-related processing on the data of the target scene to obtain the non-perspective processing data includes: invoking, by the rendering engine, a target image rendering pipeline to perform the non-perspective-related processing on the data of the target scene.
  • the non-perspective-related processing does not include processing related to either the first angle for observing the target scene or the second angle for observing the target scene.
  • the perspective-related processing includes processing related to the first angle for observing the target scene or the second angle for observing the target scene.
  • the non-perspective-related processing includes one or more of vertex specification, vertex shader processing, tessellation, and geometry shader processing; the perspective-related processing includes one or more of clipping and culling.
  • the first rendering request includes: an identifier of the target scene and the first angle for observing the target scene; or, geometric data, texture data, and material data of part or all of the meshes in the target scene, and the first angle for observing the target scene.
  • the second rendering request includes: an identifier of the target scene and the second angle for observing the target scene; or, geometric data, texture data, and material data of part or all of the meshes in the target scene, and the second angle for observing the target scene.
  • in a fourth aspect, a computing node is provided, which includes a processor and a memory, where the processor executes a program in the memory to perform the method according to any one of the implementations of the first aspect.
  • in a fifth aspect, a computer-readable storage medium is provided, comprising instructions that, when executed on a computing node, cause the computing node to perform the method according to any one of the implementations of the first aspect.
  • in a sixth aspect, a computer program product is provided; when the computer program product is read and executed by a computer, the method according to any one of the implementations of the first aspect is performed.
  • FIGS. 1A to 1B are schematic structural diagrams of some rendering systems involved in the present application.
  • FIG. 2 is a schematic diagram of viewing a target scene from multiple angles involved in the present application
  • FIG. 3 is a schematic diagram of scheduling multiple rendering requests to be executed by multiple first image rendering pipelines involved in the present application
  • FIG. 4 is a schematic structural diagram of a first image rendering pipeline provided by the present application.
  • FIG. 5 is a schematic diagram of a transformation process of a vertex shader provided by the present application.
  • FIG. 6 is a schematic diagram of a tessellation technique provided by the present application.
  • FIG. 7 is a schematic diagram of clipping provided by the present application.
  • FIG. 8 is a schematic diagram of assembled line segments provided by the present application.
  • FIG. 10 is a schematic diagram of rasterization provided by the present application.
  • FIGS. 11A to 11C are schematic diagrams of some second image rendering pipelines provided by the present application.
  • FIG. 12 is a schematic flowchart of a raster rendering method provided by the present application.
  • FIG. 13 is a schematic diagram of the angle at which the first user observes the target scene provided by the present application.
  • FIGS. 14A to 14C are schematic diagrams of rendering respectively corresponding to the second image rendering pipelines of FIGS. 11A to 11C provided in the present application;
  • FIG. 15 is a schematic diagram of scheduling multiple rendering requests to be executed by multiple second image rendering pipelines and multiple third image rendering pipelines involved in the present application;
  • FIG. 16 is a schematic flowchart of a raster rendering method provided by the present application.
  • FIG. 17 is a schematic flowchart of a raster rendering method provided by the present application.
  • FIG. 20 is a schematic structural diagram of a rendering application server provided by the present application.
  • FIG. 21 is a schematic structural diagram of a remote rendering platform provided by the present application.
  • FIG. 1A is a schematic structural diagram of a rendering system involved in the present application.
  • the raster rendering system of the present application is used to obtain a 2D image, that is, a rendered image, by rendering a 3D model of a target scene through a rendering method.
  • the rendering method may include rasterization rendering and the like.
  • the raster rendering system of the present application may include: a plurality of terminal devices 110 , a network device 120 and a remote rendering platform 130 .
  • the remote rendering platform 130 may be deployed on a public cloud.
  • the remote rendering platform 130 and the terminal device 110 are generally deployed in different data centers or even geographic areas.
  • the terminal device 110 may be a device that needs to display a rendered image in real time, for example, a virtual reality (VR) device for flight training, a computer for virtual games, a smartphone for virtual shopping malls, and so on, which is not specifically limited here.
  • the terminal device can be a device with high configuration and high performance (for example, multi-core, high frequency, large memory, etc.), or a device with low configuration and low performance (for example, single core, low frequency, small memory, etc.).
  • the terminal device 110 may include hardware, an operating system, and a rendering application client.
  • the network device 120 is used to transmit data between the terminal device 110 and the remote rendering platform 130 through a communication network of any communication mechanism/communication standard.
  • the communication network may be a wide area network, a local area network, a point-to-point connection, etc., or any combination thereof.
  • the remote rendering platform 130 includes a plurality of remote rendering nodes, and each remote rendering node includes rendering hardware, virtualization services, rendering engines and rendering applications from bottom to top.
  • the rendering hardware includes computing resources, storage resources, and network resources.
  • the computing resources can adopt a heterogeneous computing architecture, for example, a central processing unit (CPU) + graphics processing unit (GPU) architecture, a CPU + AI chip architecture, a CPU + GPU + AI chip architecture, etc., which is not specifically limited here.
  • Storage resources can include memory and so on.
  • Network resources may include network cards and the like.
  • Virtualization service is a service that builds the resources of multiple physical hosts into a unified resource pool through virtualization technology, and flexibly isolates independent resources according to the needs of users to run the user's application program.
  • virtualization services may include virtual machine (virtual machine, VM) services and container (container) services.
  • a rendering engine can be used to implement image rendering algorithms.
  • the rendering application server can be used to call the rendering engine to complete the rendering
  • the rendering application client on the terminal device 110 and the rendering application server of the remote rendering platform 130 are collectively referred to as rendering applications, wherein common rendering applications may include: game applications, VR applications, movie special effects, animations, and the like.
  • the user inputs instructions through the rendering application client, the rendering application client translates the instructions into data and sends the data to the rendering application server, the rendering application server processes the data and returns a result, and the rendering application client then translates the result into a graphical presentation for the user.
  • the rendering application client is an intermediary between the user and the rendering application server.
  • the rendering application server may be provided by a rendering application provider
  • the rendering application client may be provided by the rendering application provider
  • the rendering engine may be provided by a cloud service provider.
  • the rendering application can be a game application.
  • the game application developer installs the game application server on the remote rendering platform provided by the cloud service provider, and provides the game application client through the Internet for users to download and install on their terminal devices.
  • cloud service providers also provide rendering engines, which can provide computing power for game applications.
  • the rendering application client, the rendering application server and the rendering engine may all be provided by a cloud service provider.
  • a management device 140 is also included.
  • the management device 140 may be a user's terminal device, or a device provided by a third party other than the cloud service provider's remote rendering platform 130.
  • the management device 140 may be a device provided by a game developer.
  • the game developer can manage the rendering application through the management device 140.
  • the game developer can specify the image quality of the initial rendered image provided by the rendering application server to the rendering application client through the management device 140, and so on.
  • the management device 140 may be disposed on the remote rendering platform, or may be disposed outside the remote rendering platform, which is not specifically limited here.
  • the target scene includes a light source and a three-dimensional model.
  • the light produced by the light source is projected on the 3D model.
  • as shown in FIG. 2, assuming that the target scene is shown in the upper part of FIG. 2, when the first user of the terminal device 1 observes from the first perspective, the rendered image that needs to be generated is shown on the left in FIG. 2.
  • when the second user of the terminal device 2 observes from the second viewing angle, the rendered image that needs to be generated is shown on the right side of FIG. 2.
  • the terminal device 1 and the terminal device 2 may independently use the resources of the remote rendering platform 130 to perform rasterization rendering on the target scene, so as to obtain rendered images from different angles. Specifically:
  • the terminal device 1 sends a first rendering request to the remote rendering platform 130 through the network device 120, and the remote rendering platform 130 calls the rendering engine to perform rasterization rendering on the target scene from the perspective of the first user according to the first rendering request, thereby obtaining the rendered image of the target scene generated from the first user's perspective.
  • the first rendering request is used to indicate the first viewing angle and the target scene.
  • the terminal device 2 sends a second rendering request to the remote rendering platform 130 through the network device 120, and the remote rendering platform 130 invokes the rendering engine to perform rasterization rendering on the target scene from the perspective of the second user according to the second rendering request, thereby obtaining the rendered image of the target scene generated from the second user's perspective.
  • the second rendering request is used to indicate the second viewing angle and the target scene.
  • the process of performing rasterization rendering on the remote rendering platform may be as follows: as shown in FIG. 3, after the rendering application server receives the concurrent rendering request 1 to rendering request 9 (not shown) sent by the rendering application clients, rendering request 1 to rendering request 9 are passed to the rendering engine. The rendering engine generates rendering task 1 according to rendering request 1, generates rendering task 2 according to rendering request 2, ..., and generates rendering task 9 according to rendering request 9. Then, the rendering engine schedules the first image rendering pipeline 1 to execute rendering task 1, thereby obtaining rendered image 1; schedules the first image rendering pipeline 2 to execute rendering task 2, thereby obtaining rendered image 2; ...; and schedules the first image rendering pipeline 9 to execute rendering task 9, thereby obtaining rendered image 9.
  • it can be seen that different rendering requests occupy different first image rendering pipelines; for example, rendering request 1 to rendering request 9 occupy the first image rendering pipeline 1 to the first image rendering pipeline 9 respectively, even if different rendering requests render the same target scene from different angles.
  • FIG. 4 is a schematic structural diagram of a first image rendering pipeline provided by the present application.
  • the first image rendering pipeline provided by the present application generally includes an application stage, a geometry stage, and a rasterization stage. in,
  • The application stage usually implements collision detection, acceleration algorithms, input detection, animation, force feedback and texture animation, transformations, simulations, geometric deformations, and some calculations not performed in other stages.
  • The geometry stage usually includes the vertex specification, vertex shader, tessellation, geometry shader, vertex post-processing, and primitive assembly stages; the subsequent rasterization, fragment shader, and per-sample operation stages belong to the rasterization stage.
  • The vertex specification stage is usually used to obtain vertex data.
  • the vertex data is generated according to the three-dimensional model in the target scene, the vertex data includes three-dimensional coordinates of the vertex, and the vertex data may also include the normal vector of the vertex, the color of the vertex, and the like.
  • a vertex can be a point on a 3D model, for example, where two edges of a polygon meet in a 3D model, a common endpoint of two edges in a 3D model, and so on.
  • Vertex shaders are usually used to transform the 3D coordinates of vertices from model space (object space) to screen space (screen/image space).
  • the transformation process can be: transforming from model space to world space, then from world space to view space, then from view space to normalized projection space, and then from normalized projection space to screen space.
  • the viewing space includes a viewing frustum, the space inside the viewing frustum is a space that can be seen from the user's perspective, and the space outside the viewing frustum is a space that cannot be seen from the user's perspective.
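  • For illustration, the following C++ sketch walks a vertex through the transform chain just described, using the GLM math library; the use of GLM and the function name are the editor's assumptions, not part of the patent.

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

// Model space -> world space -> view space -> normalized projection (clip)
// space -> screen space, as described above.
glm::vec2 modelToScreen(const glm::vec3& posModel,
                        const glm::mat4& model,       // model -> world
                        const glm::mat4& view,        // world -> view
                        const glm::mat4& projection,  // view -> clip
                        float screenW, float screenH) {
    glm::vec4 clip = projection * view * model * glm::vec4(posModel, 1.0f);
    glm::vec3 ndc  = glm::vec3(clip) / clip.w;  // perspective divide
    // Viewport transform: NDC coordinates in [-1, 1] -> window coordinates.
    return glm::vec2((ndc.x * 0.5f + 0.5f) * screenW,
                     (ndc.y * 0.5f + 0.5f) * screenH);
}
```

  • Note that in this chain only the view and projection matrices depend on the angle from which the user observes the scene; the model-to-world step does not, which is the kind of boundary the pipeline split described later exploits.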
  • Tessellation is used to dramatically increase the number of vertices in a 3D model.
  • for example, as shown in FIG. 6, the three-dimensional model initially includes three vertices that form a triangle.
  • after tessellation, the number of vertices in the 3D model changes from three to six. It can be seen that the 3D model appears rough and rigid before tessellation, and appears realistic and vivid after tessellation.
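  • As an illustration of one such step, the following C++ sketch subdivides a triangle at its edge midpoints, turning 3 vertices into 6 and yielding 4 smaller triangles; real tessellation shaders are configurable and more elaborate, and this minimal example (using GLM vectors) is the editor's, not the patent's.

```cpp
#include <array>
#include <vector>
#include <glm/glm.hpp>

using Tri = std::array<glm::vec3, 3>;

// One midpoint-subdivision step: 3 vertices in, 6 vertices (4 triangles) out.
std::vector<Tri> tessellate(const Tri& t) {
    glm::vec3 m01 = 0.5f * (t[0] + t[1]);  // midpoint of edge 0-1
    glm::vec3 m12 = 0.5f * (t[1] + t[2]);  // midpoint of edge 1-2
    glm::vec3 m20 = 0.5f * (t[2] + t[0]);  // midpoint of edge 2-0
    return {{t[0], m01, m20},   // corner triangle at vertex 0
            {m01, t[1], m12},   // corner triangle at vertex 1
            {m20, m12, t[2]},   // corner triangle at vertex 2
            {m01, m12, m20}};   // center triangle
}
```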
  • Geometry shaders are used to transform one or more vertices in a 3D model into completely different primitives, thereby generating more vertices.
  • Vertex post-processing is used to clip the primitives; that is, if a primitive is partly outside the view frustum and partly inside the view frustum, the part of the primitive outside the view frustum needs to be clipped, and only the portion within the view frustum is preserved.
  • as shown in FIG. 7, the left half and the right half of FIG. 7 are schematic diagrams of the primitives before and after the portions outside the view frustum are clipped, respectively.
  • Primitive assembly is usually used to assemble the vertices in a 3D model into geometric primitives. This stage generates a series of triangles, line segments, and points. As shown in FIG. 8, the assembled line segments may include: (a) independent line segments; (b) line segments connected end to end but not eventually closed; (c) line segments connected end to end and finally closed.
  • the assembled triangles may include: (a) triangles connected in sequence according to the defined order of the points; (b) starting from the first point, one triangle drawn for each group of three points, with the triangles independent of each other; (c) starting from the third point, each point combined with the previous two points to draw a triangle, that is, a linear continuous triangle strip; (d) starting from the third point, each point combined with the previous point and the first point to draw a triangle, that is, a fan-shaped continuous triangle strip.
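  • For reference, the line-segment and triangle variants above correspond closely to the standard OpenGL primitive modes; the patent does not name OpenGL, so this mapping is the editor's annotation.

```cpp
#include <GL/gl.h>

// Assembled line segments:            Assembled triangles:
//   (a) independent  -> GL_LINES        (b) independent  -> GL_TRIANGLES
//   (b) open chain   -> GL_LINE_STRIP   (c) linear strip -> GL_TRIANGLE_STRIP
//   (c) closed chain -> GL_LINE_LOOP    (d) fan          -> GL_TRIANGLE_FAN
const GLenum kLineModes[]     = {GL_LINES, GL_LINE_STRIP, GL_LINE_LOOP};
const GLenum kTriangleModes[] = {GL_TRIANGLES, GL_TRIANGLE_STRIP, GL_TRIANGLE_FAN};
```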
  • culling can also be done, i.e., the removal of invisible objects from the scene.
  • culling may include frustum culling, viewport culling, and occlusion culling.
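  • As a concrete illustration of frustum culling (one of the kinds listed above), the following C++ sketch tests an object's bounding sphere against the six frustum planes; the representation and names are the editor's assumptions.

```cpp
#include <array>
#include <glm/glm.hpp>

// A frustum stored as six planes (n.x*x + n.y*y + n.z*z + d = 0, normals
// pointing inward). An object is culled if its bounding sphere lies entirely
// outside any one plane.
struct Plane { glm::vec3 n; float d; };

bool sphereVisible(const std::array<Plane, 6>& frustum,
                   const glm::vec3& center, float radius) {
    for (const Plane& p : frustum) {
        // Signed distance from the sphere center to the plane.
        if (glm::dot(p.n, center) + p.d < -radius)
            return false;  // entirely outside this plane: cull the object
    }
    return true;  // potentially visible; keep it in the scene
}
```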
  • the rasterization stage includes rasterization, fragment shaders, and per-sample operations.
  • Rasterization is the process of converting vertex data into fragments, and has the effect of converting a geometric figure into an image composed of grids.
  • the characteristic is that each element corresponds to a pixel in the frame buffer. Therefore, the first part of the rasterization work, as shown in FIG. 10, is to determine which integer grid areas in window coordinates are occupied by the primitive; the second part of the work is to assign a color value and a depth value to each of these areas.
  • the rasterization process produces fragments.
  • the fragment shader is used to calculate the final color output of the pixel.
  • Pixel-by-pixel processing includes depth testing as well as transparency processing. It can be understood that if we first draw a nearer object and then draw a farther object, the farther object, being drawn later, will cover the nearer object. This effect is not what we hoped for.
  • the depth test actually records the distance (drawing coordinates) of the pixel from the camera in the 3D world.
  • the depth buffer stores the depth value (Z value) of each pixel drawn on the screen; the greater the depth value, the farther the pixel is from the camera. Therefore, with the depth buffer, the order in which objects are drawn is not so important, as they can be displayed correctly according to their distances (Z values).
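  • A minimal C++ sketch of this per-pixel depth test follows; the buffer layout and names are the editor's assumptions.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// A fragment is written only if its Z value is smaller (nearer) than what the
// depth buffer already holds at that pixel, so draw order stops mattering for
// opaque geometry.
struct FrameBuffers {
    int width = 0, height = 0;
    std::vector<float>    depth;  // one Z value per pixel, initialized to +infinity
    std::vector<uint32_t> color;  // one RGBA value per pixel
};

void writeFragment(FrameBuffers& fb, int x, int y, float z, uint32_t rgba) {
    std::size_t i = static_cast<std::size_t>(y) * fb.width + x;
    if (z < fb.depth[i]) {  // depth test: the nearer fragment wins
        fb.depth[i] = z;
        fb.color[i] = rgba;
    }
}
```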
  • the present application proposes a raster rendering method, device and system, which can effectively reduce the demand for computing resources.
  • the present application proposes a second image rendering pipeline, which divides the processing in the first image rendering pipeline into two parts: non-perspective-related processing and subsequent processing.
  • the second image rendering pipeline only includes non-perspective-related processing and does not include subsequent processing.
  • the subsequent processing includes viewing angle related processing.
  • the non-perspective-related processing refers to processing that is irrelevant to the user's perspective
  • the perspective-dependent processing refers to processing that is related to the user's perspective.
  • for the non-perspective-related processing, although the angle at which the first user observes the target scene is different from the angle at which the second user observes the target scene, since the target scene is the same, the processing is the same.
  • for the perspective-related processing, although the target scene is the same, the processing is different, because the angle at which the first user observes the target scene and the angle at which the second user observes the target scene are different.
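  • The following C++ sketch shows this core idea schematically: the view-independent part of the pipeline runs once per scene, and only the view-dependent part runs once per user. All types and stage functions are placeholders invented by the editor, not the patent's API.

```cpp
#include <utility>

struct SceneData {};     // geometry/texture/material of the target scene
struct NonViewData {};   // output of the view-independent stages
struct ViewingAngle {};  // the angle from which one user observes the scene
struct RenderedImage {};

// Placeholder stubs for the two halves of the split pipeline.
NonViewData runNonViewStages(const SceneData&) { return {}; }
RenderedImage runViewStages(const NonViewData&, const ViewingAngle&) { return {}; }

std::pair<RenderedImage, RenderedImage>
renderTwoUsers(const SceneData& scene,
               const ViewingAngle& a1, const ViewingAngle& a2) {
    NonViewData shared = runNonViewStages(scene);  // computed only once
    return {runViewStages(shared, a1),    // first user's rendered image
            runViewStages(shared, a2)};   // second user's rendered image
}
```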
  • FIGS. 11A to 11C are schematic diagrams of some second image rendering pipelines provided by the present application.
  • the second image rendering pipeline may include the following implementations:
  • the non-view related processing in the second image rendering pipeline may include vertex specifications and vertex shaders.
  • Subsequent processing can include tessellation, geometry shaders, vertex post-processing, primitive assembly, rasterization, fragment shaders, and per-pixel processing, with view-dependent processing including clipping and culling.
  • clipping can occur in any stage of tessellation technology, vertex post-processing, and primitive assembly, and culling can occur in the primitive assembly stage.
  • the non-view related processing in the second image rendering pipeline may include vertex specification, vertex shader, and tessellation, and subsequent processing may include geometry shader, vertex post-processing, primitive assembly, rasterization, fragment shader, and per-pixel processing, with view-dependent processing including clipping and culling.
  • clipping can occur in any stage of geometry shader, vertex post-processing, and primitive assembly, and culling can occur in the primitive assembly stage.
  • the non-view related processing in the second image rendering pipeline may include vertex specification, vertex shader processing, tessellation, and geometry shader, and subsequent processing may include vertex post-processing, primitive assembly, rasterization, fragment shader, and per-pixel processing, with view-dependent processing including clipping and culling.
  • clipping can occur in any stage of vertex post-processing and primitive assembly, and culling can occur in the primitive assembly stage.
  • the non-perspective-related processing may also include some other processing that is independent of the viewing angle; for example, the processing of assembling the vertices in the 3D model into points, line segments, and triangles in primitive assembly (excluding the clipping processing) may also be set as non-perspective-related processing.
  • FIG. 12 is a schematic flowchart of a raster rendering method provided by the present application. As shown in Figure 12, the raster rendering method of this embodiment includes:
  • the rendering application client of the first terminal device sends a first rendering request to the rendering application server of the remote rendering platform.
  • the rendering application server of the remote rendering platform receives the first rendering request sent by the rendering application client of the first terminal device, wherein the first rendering request is used to indicate the target scene and the angle at which the first user observes the target scene.
  • the first rendering request may include the following implementations:
  • the first rendering request includes an identifier of the target scene and an angle from which the first user observes the target scene.
  • the remote rendering platform may pre-store the corresponding relationship between the identifier of the target scene and the data of the target scene, for example, geometric data, texture data, and material data. Therefore, the remote rendering platform can find the data of the corresponding target scene through the identifier of the target scene to perform rendering.
  • the geometric data may include vertex data of each mesh in the target scene, etc.
  • the texture data may include the color of each mesh in the target scene, and the like
  • the material data may include the material of each mesh in the target scene, such as metallic, specular, and diffuse materials, etc.
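  • A minimal C++ sketch of such a pre-stored correspondence follows: a table from scene identifier to the scene's geometry, texture, and material data, so that a request carrying only an identifier can be resolved to renderable data. The class name, field names, and placeholder field types are the editor's assumptions.

```cpp
#include <stdexcept>
#include <string>
#include <unordered_map>
#include <utility>

struct SceneData {
    std::string geometry;   // vertex data of each mesh (placeholder type)
    std::string textures;   // color of each mesh (placeholder type)
    std::string materials;  // e.g. metallic / specular / diffuse (placeholder)
};

class SceneStore {
public:
    void put(const std::string& sceneId, SceneData data) {
        table_[sceneId] = std::move(data);
    }
    // Resolves the identifier carried in a rendering request to scene data.
    const SceneData& lookup(const std::string& sceneId) const {
        auto it = table_.find(sceneId);
        if (it == table_.end()) throw std::runtime_error("unknown scene id");
        return it->second;
    }
private:
    std::unordered_map<std::string, SceneData> table_;
};
```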
  • as shown in FIG. 13, the angle at which the first user observes the target scene can be expressed as (P1, θ1), where P1 is the vertical distance from the first user's viewpoint E1 to the rendered image, and θ1 is the angle between the horizontal line and the line connecting the viewpoint E1 to the center point O of the rendered image.
  • the first rendering request includes geometric data, texture data, material data of the target scene, and an angle from which the first user observes the target scene.
  • the first rendering request may include all geometric data, texture data, material data of the target scene and the angle from which the first user observes the target scene.
  • the first rendering request may include geometric data, texture data, and material data in the target scene that are changed relative to the previous scene, and the angle from which the first user observes the target scene. It can be understood that, at this time, the remote rendering platform may not need to pre-store the correspondence between the identifier of the target scene and the geometric data, texture data and material data of the target scene.
  • the rendering application client of the second terminal device sends a second rendering request to the rendering application server of the remote rendering platform.
  • the rendering application server of the remote rendering platform receives the second rendering request sent by the rendering application client of the second terminal device, wherein the second rendering request is used to indicate the target scene and the angle at which the second user observes the target scene.
  • the content included in the second rendering request is similar to the content included in the first rendering request; for details, please refer to the description of the first rendering request, which will not be repeated here.
  • the rendering engine of the remote rendering platform invokes the second image rendering pipeline to perform non-perspective-related processing on the data of the target scene to obtain non-perspective processing data.
  • the remote rendering platform uses the rendering engine to perform subsequent processing including viewing angle related processing according to the angle at which the first user observes the target scene and the non-perspective processing data, to obtain a first rendered image.
  • the remote rendering platform uses the rendering engine to perform subsequent processing including viewing angle related processing according to the viewing angle at which the second user observes the target scene and the non-viewing angle processing data, to obtain a second rendered image.
  • the rendering application server of the remote rendering platform sends the first rendering image to the rendering application client of the first terminal device.
  • the rendering application client of the first terminal device receives the first rendering image sent by the rendering application server of the remote rendering platform.
  • the rendering application server of the remote rendering platform sends the second rendering image to the rendering application client of the second terminal device.
  • the rendering application client of the second terminal device receives the second rendering image sent by the rendering application server of the remote rendering platform.
  • after the remote rendering platform receives the first rendering request and the second rendering request, it generates a combined rendering task according to the first rendering request and the second rendering request, and calls the second image rendering pipeline to perform non-perspective-related processing to obtain the non-perspective processing data. That is to say, the first rendering request and the second rendering request generate one common combined rendering task, and the same second image rendering pipeline is used to perform the non-perspective-related processing to obtain the non-perspective processing data.
  • the remote rendering platform copies the non-perspective processing data into two copies, one of which is used for subsequent processing in combination with the angle at which the first user observes the target scene, so as to obtain the first rendered image, and the other of which is used for subsequent processing in combination with the angle at which the second user observes the target scene, so as to obtain the second rendered image. Steps S103 to S105 will be described in detail below with reference to FIGS. 11A-11C.
  • as shown in FIG. 11A, after the remote rendering platform receives the first rendering request and the second rendering request, the remote rendering platform invokes the second image rendering pipeline to perform vertex specification and vertex shader processing on the data of the target scene, thereby obtaining vertex shading data.
  • the remote rendering platform then duplicates the vertex shading data into two copies.
  • according to the angle at which the first user observes the target scene and the first copy of the vertex shading data, the remote rendering platform performs tessellation, geometry shader, vertex post-processing (including clipping), primitive assembly (including culling), rasterization, fragment shader, and pixel-by-pixel processing, so as to obtain the first rendered image; according to the angle at which the second user observes the target scene and the second copy of the vertex shading data, the remote rendering platform performs tessellation, geometry shader, vertex post-processing (including clipping), primitive assembly (including culling), rasterization, fragment shader, and pixel-by-pixel processing, so as to obtain the second rendered image. That is to say, although the remote rendering platform receives both the first rendering request and the second rendering request, it performs vertex specification and vertex shader processing only once, so that computing resources can be effectively saved.
  • as shown in FIG. 11B, after the remote rendering platform receives the first rendering request and the second rendering request, the remote rendering platform performs vertex specification, vertex shader, and tessellation processing on the data of the target scene, so as to obtain subdivision data.
  • the remote rendering platform duplicates the subdivision data into two copies.
  • according to the angle at which the first user observes the target scene and the first copy of the subdivision data, the remote rendering platform performs geometry shader, vertex post-processing (including clipping), primitive assembly (including culling), rasterization, fragment shader, and pixel-by-pixel processing to obtain the first rendered image;
  • according to the angle at which the second user observes the target scene and the second copy of the subdivision data, the remote rendering platform performs geometry shader, vertex post-processing (including clipping), primitive assembly (including culling), rasterization, fragment shader, and pixel-by-pixel processing to obtain the second rendered image. That is to say, although the remote rendering platform receives both the first rendering request and the second rendering request, it performs vertex specification, vertex shader, and tessellation processing only once, so that computing resources can be effectively saved.
  • as shown in FIG. 11C, after the remote rendering platform receives the first rendering request and the second rendering request, the remote rendering platform performs vertex specification, vertex shader, tessellation, and geometry shader processing on the data of the target scene, so as to obtain geometry shading data.
  • the remote rendering platform duplicates the geometry shading data into two copies.
  • according to the angle at which the first user observes the target scene and the first copy of the geometry shading data, the remote rendering platform performs vertex post-processing (including clipping), primitive assembly (including culling), rasterization, fragment shader, and pixel-by-pixel processing to obtain the first rendered image; according to the angle at which the second user observes the target scene and the second copy of the geometry shading data, the remote rendering platform performs vertex post-processing (including clipping), primitive assembly (including culling), rasterization, fragment shader, and pixel-by-pixel processing to obtain the second rendered image. That is to say, although the remote rendering platform receives both the first rendering request and the second rendering request, it performs vertex specification, vertex shader, tessellation, and geometry shader processing only once, so that computing resources can be effectively saved.
  • it can be understood that if the rendering system also has a third terminal device and a fourth terminal device, and the third terminal device and the fourth terminal device also need to render the target scene so as to obtain a third rendered image and a fourth rendered image, the remote rendering platform still only needs to perform the non-perspective-related processing once, rather than four times, which reduces the waste of computing resources even more effectively.
  • the remote rendering platform may start a first thread to establish the second image rendering pipeline to perform non-perspective-related processing on the data of the target scene, thereby obtaining the non-perspective processing data; start a second thread to perform subsequent processing including viewing-angle-related processing according to the angle at which the first user observes the target scene and the non-perspective processing data, thereby obtaining the first rendered image; and start a third thread to perform subsequent processing including viewing-angle-related processing according to the angle at which the second user observes the target scene and the non-perspective processing data, thereby obtaining the second rendered image.
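  • A C++ sketch of this thread arrangement follows, using std::async as a stand-in for the first, second, and third threads; the patent does not specify a threading API, and all stage functions and types below are the editor's placeholders.

```cpp
#include <functional>
#include <future>

struct SceneData {}; struct Angle {}; struct NonViewData {}; struct Image {};

// Placeholder stubs for the split pipeline stages.
NonViewData runNonViewStages(const SceneData&) { return {}; }  // thread 1
Image runViewStages(NonViewData, const Angle&) { return {}; }  // threads 2 and 3

void renderInParallel(const SceneData& scene, const Angle& a1, const Angle& a2) {
    // First thread: view-independent processing, done once.
    NonViewData shared =
        std::async(std::launch::async, runNonViewStages, std::cref(scene)).get();
    // Second and third threads: view-dependent processing, one per user; they
    // may run in parallel or sequentially in any order (compare S104 to S107).
    auto f1 = std::async(std::launch::async, runViewStages, shared, std::cref(a1));
    auto f2 = std::async(std::launch::async, runViewStages, shared, std::cref(a2));
    Image img1 = f1.get();   // first rendered image
    Image img2 = f2.get();   // second rendered image
    (void)img1; (void)img2;  // would be returned to the respective clients
}
```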
  • the execution of the foregoing S104 to S105 may be performed in parallel or sequentially in any order, and the execution of the foregoing S106 to S107 may be performed in parallel or sequentially in any order.
  • it should be noted that the foregoing description takes, as an example, an image rendering pipeline whose processing order is: vertex shader, tessellation, geometry shader, vertex post-processing (including clipping), primitive assembly (including culling), rasterization, fragment shader, and pixel-by-pixel processing.
  • the processing order in the image rendering pipeline may change, which is not specifically limited here.
  • FIG. 15 is a schematic diagram of scheduling multiple rendering requests to be executed as multiple second image rendering pipelines and multiple third image rendering pipelines involved in the present application.
  • the process of performing rasterization rendering on the remote rendering platform may be as follows: as shown in FIG. 15, after the rendering application server receives the concurrent rendering request 1 to rendering request 9 (not shown) sent by the rendering application clients, the rendering application server passes rendering request 1 to rendering request 9 to the rendering engine.
  • the rendering engine generates rendering task 1 according to the identity of the target scene carried in rendering request 1 and the identity of the target scene carried in rendering request 2, and calls the second image rendering pipeline 1 to perform non-perspective related processing to obtain non-perspective data 1 , and respectively send the non-perspective data 1 to the third image rendering pipeline 1 and the third image rendering pipeline 2 for subsequent processing including viewing angle-related processing, to obtain the rendered image 1 and the rendered image 2, respectively.
  • the rendering engine generates a rendering task 2 according to the identification of the target scene carried in the rendering request 3 and the identification of the target scene carried in the rendering request 4, and calls the second image rendering pipeline 2 to perform non-perspective related processing to obtain non-perspective data 2 , and respectively send the non-perspective data 2 to the third image rendering pipeline 3 and the third image rendering pipeline 4 for subsequent processing including viewing angle related processing, to obtain the rendered image 3 and the rendered image 4 respectively.
  • the rendering engine generates a rendering task 3 according to the identifier of the target scene carried in rendering request 5, calls the second image rendering pipeline 3 to perform non-perspective-related processing to obtain non-perspective data 3, and sends the non-perspective data 3 to the third image rendering pipeline 5 for subsequent processing including viewing-angle-related processing, to obtain the rendered image 5.
  • the rendering engine generates a rendering task 4 according to the identifier of the target scene carried in rendering request 6, the identifier of the target scene carried in rendering request 7, and the identifier of the target scene carried in rendering request 8, calls the second image rendering pipeline 4 to perform non-perspective-related processing to obtain non-perspective data 4, and respectively sends the non-perspective data 4 to the third image rendering pipeline 6, the third image rendering pipeline 7, and the third image rendering pipeline 8 for subsequent processing including viewing-angle-related processing, to obtain the rendered image 6, the rendered image 7, and the rendered image 8, respectively.
  • the rendering engine generates a rendering task 5 according to the identifier of the target scene carried in rendering request 9, calls the second image rendering pipeline 5 to perform non-perspective-related processing to obtain non-perspective data 5, and sends the non-perspective data 5 to the third image rendering pipeline 9 for subsequent processing including viewing-angle-related processing, to obtain the rendered image 9.
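  • The grouping policy of FIG. 15 can be sketched in C++ as follows: concurrent requests are grouped by the target-scene identifier they carry, each group becomes one rendering task that runs the shared (second) pipeline once, and each member request then gets its own view-dependent (third) pipeline pass. All names and types are the editor's placeholders.

```cpp
#include <map>
#include <vector>

struct Request { int sceneId; float angle; };
struct NonViewData {}; struct Image {};

// Placeholder stubs for the second and third image rendering pipelines.
NonViewData runSecondPipeline(int /*sceneId*/) { return {}; }
Image runThirdPipeline(const NonViewData&, float /*angle*/) { return {}; }

std::vector<Image> schedule(const std::vector<Request>& requests) {
    // One rendering task per distinct target scene.
    std::map<int, std::vector<Request>> tasks;
    for (const Request& r : requests) tasks[r.sceneId].push_back(r);

    std::vector<Image> images;
    for (const auto& [sceneId, group] : tasks) {
        NonViewData shared = runSecondPipeline(sceneId);  // run once per group
        for (const Request& r : group)                    // run once per request
            images.push_back(runThirdPipeline(shared, r.angle));
    }
    return images;
}
```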
  • FIG. 16 is a schematic flowchart of a raster rendering method provided by the present application. As shown in FIG. 16 , the raster rendering method of this embodiment includes:
  • the rendering application client of the first terminal device sends a first rendering request to the rendering application server of the remote rendering platform.
  • the rendering application server of the remote rendering platform receives the first rendering request sent by the rendering application client of the first terminal device, wherein the first rendering request is used to indicate the target scene and the angle at which the first user observes the target scene.
  • the remote rendering platform invokes, through the rendering engine, the second image rendering pipeline to perform non-perspective-related processing on the data of the target scene according to the first rendering request, to obtain non-perspective processing data.
  • the remote rendering platform invokes the rendering engine to perform subsequent processing including viewing-angle-related processing according to the angle at which the first user observes the target scene and the non-perspective processing data, to obtain a first rendered image.
  • the rendering application server of the remote rendering platform sends the first rendered image to the rendering application client of the first terminal device.
  • the rendering application client of the first terminal device receives the first rendering image sent by the rendering application server of the remote rendering platform.
  • the rendering application client of the second terminal device sends a second rendering request to the rendering application server of the remote rendering platform.
  • the rendering application server of the remote rendering platform receives the second rendering request sent by the rendering application client of the second terminal device, wherein the second rendering request is used to indicate the target scene and the angle at which the second user observes the target scene.
  • the remote rendering platform invokes the rendering engine to perform subsequent processing including viewing-angle-related processing according to the angle at which the second user observes the target scene and the non-perspective processing data, to obtain a second rendered image.
  • the rendering application server of the remote rendering platform sends the second rendering image to the rendering application client of the second terminal device.
  • the rendering application client of the second terminal device receives the second rendering image sent by the rendering application server of the remote rendering platform.
  • in this embodiment, the definitions of the first rendering request, the second rendering request, the non-perspective-related processing, the viewing-angle-related processing, and the subsequent processing are the same as those in the embodiment corresponding to FIG. 12; for details, please refer to the related content in the embodiment corresponding to FIG. 12, which will not be repeated here.
  • FIG. 17 is a schematic flowchart of a raster rendering method provided by the present application. As shown in FIG. 17 , the raster rendering method of this embodiment includes:
  • the rendering application client of the first terminal device sends a first rendering request to the rendering application server of the remote rendering platform.
  • the rendering application server of the remote rendering platform receives the first rendering request sent by the rendering application client of the first terminal device, wherein the first rendering request is used to indicate the target scene.
  • the rendering application client of the second terminal device sends a second rendering request to the rendering application server of the remote rendering platform.
  • the rendering application server of the remote rendering platform receives the second rendering request sent by the rendering application client of the second terminal device, wherein the second rendering request is used to indicate the target scene.
  • the remote rendering platform invokes the second image rendering pipeline through the rendering engine to perform non-perspective-related processing on the data of the target scene, thereby obtaining non-perspective processing data.
  • the remote rendering platform sends the non-perspective processing data to the rendering application client of the first terminal device.
  • the rendering application client of the first terminal device receives the non-perspective processing data sent by the remote rendering platform.
  • the remote rendering platform sends the non-perspective processing data to the rendering application client of the second terminal device.
  • the rendering application client of the second terminal device receives the non-perspective processing data sent by the remote rendering platform.
  • S306: the rendering application client of the first terminal device performs subsequent processing including viewing-angle-related processing on the received non-perspective processing data, thereby obtaining a first rendered image.
  • S307: the rendering application client of the second terminal device performs subsequent processing including viewing-angle-related processing on the received non-perspective processing data, thereby obtaining a second rendered image.
  • the execution of the foregoing S304 to S305 may be performed in parallel or sequentially in any order, and the execution of the foregoing S306 to S307 may be performed in parallel or sequentially in any order.
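  • Schematically, the FIG. 17 variant moves the view-dependent half of the split to the terminal device, as in the following C++ sketch; the names are the editor's placeholders, not the patent's API.

```cpp
struct NonViewData {}; struct Angle {}; struct Image {};

// Placeholder stub for the view-dependent stages (clipping, culling,
// rasterization, etc.), which in this variant run on the client.
Image runViewStages(const NonViewData&, const Angle&) { return {}; }

// Runs on the terminal device after it receives the platform's response.
Image clientFinishRendering(const NonViewData& fromPlatform,
                            const Angle& localUserAngle) {
    return runViewStages(fromPlatform, localUserAngle);
}
```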
  • in this embodiment, the definitions of the first rendering request, the second rendering request, the non-perspective-related processing, the viewing-angle-related processing, and the subsequent processing are the same as those in the embodiment corresponding to FIG. 12; for details, please refer to the related content in the embodiment corresponding to FIG. 12, which will not be repeated here.
  • FIG. 18 is a schematic flowchart of a raster rendering method provided by the present application. As shown in FIG. 18 , the raster rendering method of this embodiment includes:
  • the rendering application client of the terminal device sends a first rendering request to the rendering application server of the remote rendering platform.
  • the rendering application server of the remote rendering platform receives the first rendering request sent by the rendering application client of the terminal device, wherein the first rendering request is used to indicate the target scene and the first angle from which the user observes the target scene.
  • the remote rendering platform invokes the second image rendering pipeline through the rendering engine to perform non-perspective-related processing on the data of the target scene to obtain non-perspective processing data.
  • the remote rendering platform uses the rendering engine to perform subsequent processing including viewing-angle-related processing according to the first angle at which the user observes the target scene and the non-perspective processing data, to obtain a first rendered image.
  • S404: the rendering application server of the remote rendering platform sends the first rendered image to the rendering application client of the terminal device.
  • the rendering application client of the terminal device receives the first rendered image sent by the rendering application server of the remote rendering platform.
  • the rendering application client of the terminal device sends a second rendering request to the rendering application server of the remote rendering platform.
  • the rendering application server of the remote rendering platform receives the second rendering request sent by the rendering application client of the terminal device, wherein the second rendering request is used to indicate the target scene and the second angle from which the user observes the target scene.
  • the remote rendering platform uses the rendering engine to perform subsequent processing including viewing-angle-related processing according to the second angle at which the user observes the target scene and the non-perspective processing data, to obtain a second rendered image.
  • the rendering application server of the remote rendering platform sends the second rendering image to the rendering application client of the terminal device.
  • the rendering application client of the terminal device receives the second rendering image sent by the rendering application server of the remote rendering platform.
  • in this embodiment, the definitions of the first rendering request, the second rendering request, the non-perspective-related processing, the viewing-angle-related processing, and the subsequent processing are the same as those in the embodiment corresponding to FIG. 12; for details, please refer to the related content in the embodiment corresponding to FIG. 12, which will not be repeated here.
  • FIG. 19 is a schematic flowchart of a raster rendering method provided by the present application. As shown in FIG. 19 , the raster rendering method of this embodiment includes:
  • the management device sends a first rendering request to the rendering application server of the remote rendering platform.
  • the rendering application server of the remote rendering platform receives the first rendering request sent by the management device, wherein the first rendering request is used to indicate the target scene and the angle from which the first user observes the target scene.
  • S502: The management device sends a second rendering request to the rendering application server of the remote rendering platform. Correspondingly, the rendering application server of the remote rendering platform receives the second rendering request sent by the management device, where the second rendering request is used to indicate the target scene and the angle from which the second user observes the target scene.
  • The content included in the second rendering request is similar to the content included in the first rendering request; for details, please refer to the description of the first rendering request, which will not be repeated here.
  • S503: The rendering engine of the remote rendering platform invokes the second image rendering pipeline to perform non-perspective-related processing on the data of the target scene to obtain non-perspective processing data.
  • S504: The remote rendering platform uses the rendering engine to perform subsequent processing, including perspective-related processing, according to the angle from which the first user observes the target scene and the non-perspective processing data, to obtain a first rendered image.
  • S505: The remote rendering platform uses the rendering engine to perform subsequent processing, including perspective-related processing, according to the angle from which the second user observes the target scene and the non-perspective processing data, to obtain a second rendered image.
  • S506: The rendering application server of the remote rendering platform sends the first rendered image to the rendering application client of the first terminal device. Correspondingly, the rendering application client of the first terminal device receives the first rendered image sent by the rendering application server of the remote rendering platform.
  • S507: The rendering application server of the remote rendering platform sends the second rendered image to the rendering application client of the second terminal device. Correspondingly, the rendering application client of the second terminal device receives the second rendered image sent by the rendering application server of the remote rendering platform.
  • In the foregoing embodiment, the definitions of the first rendering request, the second rendering request, the non-perspective-related processing, the perspective-related processing, and the subsequent processing are the same as those in the raster rendering method shown in FIG. 12; for details, please refer to the related content in the embodiment corresponding to FIG. 12, which will not be described again here.
  • FIG. 20 is a schematic structural diagram of a rendering application server provided by this application. As shown in FIG. 20, the rendering application server of this embodiment includes a communication module 210 and a rendering module 220.
  • The communication module 210 is configured to receive a first rendering request and a second rendering request, where the first rendering request indicates a target scene and a first angle from which the target scene is observed, and the second rendering request indicates the target scene and a second angle from which the target scene is observed.
  • The rendering module 220 is configured to perform, through the rendering engine, non-perspective-related processing on the data of the target scene to obtain non-perspective processing data.
  • The rendering module 220 is configured to perform, through the rendering engine, subsequent processing including perspective-related processing according to the first angle from which the target scene is observed and the non-perspective processing data, to obtain a first rendered image.
  • The rendering module 220 is configured to perform, through the rendering engine, subsequent processing including perspective-related processing according to the second angle from which the target scene is observed and the non-perspective processing data, to obtain a second rendered image.
  • In this embodiment, the definitions of the first rendering request, the second rendering request, the non-perspective-related processing, the perspective-related processing, and the subsequent processing are the same as those in the raster rendering method shown in FIG. 12.
  • The rendering application server shown in FIG. 20 can execute the steps performed by the rendering application server of the remote rendering platform in FIG. 12, FIG. 16, FIG. 17, FIG. 18, and FIG. 19; for details, please refer to FIG. 12, FIG. 16, FIG. 17, FIG. 18, FIG. 19 and the related descriptions, which are not specifically limited here.
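The module split of FIG. 20 can be illustrated with a small Python sketch; all class and method names here are hypothetical stand-ins, not the disclosed code:

    # Hypothetical sketch of FIG. 20: a communication module (210) receives
    # rendering requests, and a rendering module (220) drives the engine.

    class Engine:
        def non_view_processing(self, scene):
            return [v * 2 for v in scene]       # shared, angle-independent work

        def view_processing(self, shared, angle):
            return f"image({len(shared)} verts, angle={angle})"

    class CommunicationModule:                  # module 210
        def receive(self, queue):
            return queue.pop(0)                 # next (scene, angle1, angle2)

    class RenderingModule:                      # module 220
        def __init__(self, engine):
            self.engine = engine

        def render(self, scene, angle1, angle2):
            shared = self.engine.non_view_processing(scene)  # computed once
            return (self.engine.view_processing(shared, angle1),
                    self.engine.view_processing(shared, angle2))

    queue = [([1, 2, 3], 30, 60)]
    scene, a1, a2 = CommunicationModule().receive(queue)
    print(RenderingModule(Engine()).render(scene, a1, a2))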
  • FIG. 21 is a schematic structural diagram of a computing node provided by the present application.
  • the computing node of this embodiment may include a processor 410 , a memory 420 , a network card 430 , and a bus 440 .
  • the computing node may be the rendering node in FIG. 1A or FIG. 1B .
  • The processor 410 may be one or more general-purpose processors, where a general-purpose processor may be any type of device capable of processing electronic instructions, including a central processing unit (CPU), a microprocessor, a microcontroller, a main processor, a controller, an application-specific integrated circuit (ASIC), and so on. The processor 410 executes various types of digitally stored instructions, such as software or firmware programs stored in the memory 420. In a specific embodiment, the processor 410 may be an x86 processor or the like. The processor 410 sends commands to the memory 420 through a physical interface to complete storage-related tasks.
  • The memory 420 may include a read-only memory (ROM), a hard disk drive (HDD), or a solid-state drive (SSD).
  • The memory 420 may be used to store the first quality parameter, the second quality parameter, the first rendered image, the second rendered image, and the difference data.
  • The network card 430 is also known as a network interface controller, a network interface card, or a local area network (LAN) adapter.
  • Optionally, the computing node may further include one or more of an input device and an output device, where the input device may be a mouse, a keyboard, and the like, and the output device may include a display and the like.
  • It can be understood that the computing node shown in FIG. 21 can perform the steps performed by the remote rendering platform in FIG. 12, FIG. 16, FIG. 17, FIG. 18, and FIG. 19; for details, please refer to FIG. 12, FIG. 16, FIG. 17, FIG. 18, FIG. 19 and the related descriptions, which are not specifically limited here.
  • All or part of the foregoing embodiments may be implemented by software, hardware, firmware, or any combination thereof. When software is used for implementation, they may be implemented in whole or in part in the form of a computer program product.
  • The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, all or part of the processes or functions described in the embodiments of this application are generated.
  • The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device.
  • The computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from a website site, computer, server, or data center to another website site, computer, server, or data center in a wired (for example, coaxial cable, optical fiber, or digital subscriber line) or wireless (for example, infrared, radio, or microwave) manner.
  • The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or a data center that integrates one or more available media.
  • The available media may be magnetic media (for example, floppy disks, storage disks, magnetic tapes), optical media (for example, DVDs), or semiconductor media (for example, a solid-state drive (SSD)), among others.


Abstract

This application provides a rendering method, device, and system. The method is applied to a rendering application server; the rendering application server belongs to a rendering system, the rendering system includes a rendering application client and a rendering engine, and the rendering application server and the rendering engine are deployed on a remote rendering node. A first rendering request and a second rendering request are received, where the first rendering request indicates a target scene and a first angle from which the target scene is observed, and the second rendering request indicates the target scene and a second angle from which the target scene is observed; non-perspective-related processing is performed on the data of the target scene through the rendering engine to obtain non-perspective processing data; processing including perspective-related processing is performed according to the first angle from which the target scene is observed and the non-perspective processing data, to obtain a first rendered image; and processing including perspective-related processing is performed according to the second angle from which the target scene is observed and the non-perspective processing data, to obtain a second rendered image.

Description

Rendering method, device, and system
This application claims priority to the Chinese patent application No. 202011521452.6, filed with the Chinese Patent Office on December 21, 2020 and entitled "Rendering Method, Device, and System", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of three-dimensional rendering, and in particular, to a rendering method, device, and system.
Background
Rendering refers to the process of generating an image from a model by software. A model is a description of a three-dimensional object in a strictly defined language or data structure, and includes geometry, viewpoint, texture, and lighting information. The image is a digital image or a bitmap image. The term rendering is analogous to "an artist's rendering of a scene", and is also used to describe "the process of computing effects in a video editing file to produce the final video output". Rendering can include pre-rendering (pre-rendering/offline rendering) or real-time rendering (real-time rendering/online rendering), where pre-rendering is usually used for real-scene simulation with a predetermined script, such as movies and advertisements, and real-time rendering is usually used for real-scene simulation without a predetermined script, such as flight training, 3D games, and interactive architectural demonstrations. Real-time rendering usually uses rasterization rendering; however, rasterization rendering involves a very large amount of computation, which brings a large waste of resources.
Summary
This application provides a rendering method, device, and system, which can effectively save computing resources.
According to a first aspect, a rendering method is provided, applied to a rendering application server. The rendering application server belongs to a rendering system, the rendering system includes a rendering application client and a rendering engine, and the rendering application server and the rendering engine are deployed on a remote rendering node. The method includes: receiving a first rendering request and a second rendering request, where the first rendering request indicates a target scene and a first angle from which the target scene is observed, and the second rendering request indicates the target scene and a second angle from which the target scene is observed; performing, through the rendering engine, non-perspective-related processing on the data of the target scene to obtain non-perspective processing data; performing, through the rendering engine, processing including perspective-related processing according to the first angle from which the target scene is observed and the non-perspective processing data, to obtain a first rendered image; and performing, through the rendering engine, processing including perspective-related processing according to the second angle from which the target scene is observed and the non-perspective processing data, to obtain a second rendered image.
In some possible designs, the performing, through the rendering engine, non-perspective-related processing on the data of the target scene to obtain non-perspective processing data includes: invoking, through the rendering engine, a target image rendering pipeline to perform non-perspective-related processing on the data of the target scene.
In some possible designs, the non-perspective-related processing does not include processing related to the first angle from which the target scene is observed or the second angle from which the target scene is observed.
In some possible designs, the perspective-related processing includes processing related to the first angle from which the target scene is observed or the second angle from which the target scene is observed.
In some possible designs, the non-perspective-related processing includes one or more of vertex specification, vertex shader processing, tessellation, and geometry shader processing, and the perspective-related processing includes one or more of clipping and culling.
In some possible designs, the first rendering request includes: an identifier of the target scene and the first angle from which the target scene is observed; or geometry data, texture data, and material data of some or all of the meshes in the target scene, and the first angle from which the target scene is observed.
In some possible designs, the second rendering request includes: the identifier of the target scene and the second angle from which the target scene is observed; or geometry data, texture data, and material data of some or all of the meshes in the target scene, and the second angle from which the target scene is observed.
According to a second aspect, a rendering application server is provided, where the rendering application server includes a communication module and a rendering module.
The communication module is configured to receive a first rendering request and a second rendering request, where the first rendering request indicates a target scene and a first angle from which the target scene is observed, and the second rendering request indicates the target scene and a second angle from which the target scene is observed.
The rendering module is configured to perform, through the rendering engine, non-perspective-related processing on the data of the target scene to obtain non-perspective processing data.
The rendering module is configured to perform, through the rendering engine, processing including perspective-related processing according to the first angle from which the target scene is observed and the non-perspective processing data, to obtain a first rendered image.
The rendering module is configured to perform, through the rendering engine, processing including perspective-related processing according to the second angle from which the target scene is observed and the non-perspective processing data, to obtain a second rendered image.
In some possible designs, the performing, through the rendering engine, non-perspective-related processing on the data of the target scene to obtain non-perspective processing data includes: invoking, through the rendering engine, a target image rendering pipeline to perform non-perspective-related processing on the data of the target scene.
In some possible designs, the non-perspective-related processing does not include processing related to the first angle from which the target scene is observed or the second angle from which the target scene is observed.
In some possible designs, the perspective-related processing includes processing related to the first angle from which the target scene is observed or the second angle from which the target scene is observed.
In some possible designs, the non-perspective-related processing includes one or more of vertex specification, vertex shader processing, tessellation, and geometry shader processing, and the perspective-related processing includes one or more of clipping and culling.
In some possible designs, the first rendering request includes: an identifier of the target scene and the first angle from which the target scene is observed; or geometry data, texture data, and material data of some or all of the meshes in the target scene, and the first angle from which the target scene is observed.
In some possible designs, the second rendering request includes: the identifier of the target scene and the second angle from which the target scene is observed; or geometry data, texture data, and material data of some or all of the meshes in the target scene, and the second angle from which the target scene is observed.
According to a third aspect, a rendering system is provided, where the rendering system includes a rendering application server, a rendering application client, and a rendering engine, and the rendering application server and the rendering engine are deployed on a remote rendering node.
The rendering application server is configured to receive a first rendering request and a second rendering request, where the first rendering request indicates a target scene and a first angle from which the target scene is observed, and the second rendering request indicates the target scene and a second angle from which the target scene is observed.
The rendering application server is configured to perform, through the rendering engine, non-perspective-related processing on the data of the target scene to obtain non-perspective processing data.
The rendering application server is configured to perform, through the rendering engine, processing including perspective-related processing according to the first angle from which the target scene is observed and the non-perspective processing data, to obtain a first rendered image.
The rendering application server is configured to perform, through the rendering engine, processing including perspective-related processing according to the second angle from which the target scene is observed and the non-perspective processing data, to obtain a second rendered image.
In some possible designs, the performing, through the rendering engine, non-perspective-related processing on the data of the target scene to obtain non-perspective processing data includes: invoking, through the rendering engine, a target image rendering pipeline to perform non-perspective-related processing on the data of the target scene.
In some possible designs, the non-perspective-related processing does not include processing related to the first angle from which the target scene is observed or the second angle from which the target scene is observed.
In some possible designs, the perspective-related processing includes processing related to the first angle from which the target scene is observed or the second angle from which the target scene is observed.
In some possible designs, the non-perspective-related processing includes one or more of vertex specification, vertex shader processing, tessellation, and geometry shader processing, and the perspective-related processing includes one or more of clipping and culling.
In some possible designs, the first rendering request includes: an identifier of the target scene and the first angle from which the target scene is observed; or geometry data, texture data, and material data of some or all of the meshes in the target scene, and the first angle from which the target scene is observed.
In some possible designs, the second rendering request includes: the identifier of the target scene and the second angle from which the target scene is observed; or geometry data, texture data, and material data of some or all of the meshes in the target scene, and the second angle from which the target scene is observed.
According to a fourth aspect, a computing node is provided, where the computing node includes a processor and a memory, and the processor executes a program in the memory to perform the method according to any implementation of the first aspect.
According to a fifth aspect, a computer-readable storage medium is provided, including instructions; when the instructions are run on a computing node, the computing node is caused to perform the method according to any implementation of the first aspect.
According to a sixth aspect, a computer program product is provided; when the computer program product is read and executed by a computer, the method according to any implementation of the first aspect is performed.
In the foregoing solutions, when the first rendering request and the second rendering request concern the same target scene, the non-perspective-related processing needs to be computed only once, which can effectively reduce the amount of computation required for rendering.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of this application or in the background more clearly, the following briefly describes the accompanying drawings used in the embodiments of this application or in the background.
FIG. 1A to FIG. 1B are schematic structural diagrams of some rendering systems involved in this application;
FIG. 2 is a schematic diagram of observing a target scene from multiple angles involved in this application;
FIG. 3 is a schematic diagram of scheduling multiple rendering requests to be executed by multiple first image rendering pipelines involved in this application;
FIG. 4 is a schematic structural diagram of a first image rendering pipeline provided by this application;
FIG. 5 is a schematic diagram of the transformation process of the vertex shader provided by this application;
FIG. 6 is a schematic diagram of tessellation provided by this application;
FIG. 7 is a schematic diagram of clipping provided by this application;
FIG. 8 is a schematic diagram of assembled line segments provided by this application;
FIG. 9 is a schematic diagram of assembled triangles provided by this application;
FIG. 10 is a schematic diagram of rasterization provided by this application;
FIG. 11A to FIG. 11C are schematic diagrams of some second image rendering pipelines provided by this application;
FIG. 12 is a schematic flowchart of a raster rendering method provided by this application;
FIG. 13 is a schematic diagram of the angle from which the first user observes the target scene provided by this application;
FIG. 14A to FIG. 14C are schematic diagrams of rendering with the second image rendering pipelines corresponding to FIG. 11A to FIG. 11C, respectively, provided by this application;
FIG. 15 is a schematic diagram of scheduling multiple rendering requests to be executed by multiple second image rendering pipelines and multiple third image rendering pipelines involved in this application;
FIG. 16 is a schematic flowchart of a raster rendering method provided by this application;
FIG. 17 is a schematic flowchart of a raster rendering method provided by this application;
FIG. 18 is a schematic flowchart of a raster rendering method provided by this application;
FIG. 19 is a schematic flowchart of a raster rendering method provided by this application;
FIG. 20 is a schematic structural diagram of a rendering application server provided by this application;
FIG. 21 is a schematic structural diagram of a remote rendering platform provided by this application.
Detailed Description
Referring to FIG. 1A, FIG. 1A is a schematic structural diagram of a rendering system involved in this application. The raster rendering system of this application is configured to render a 3D model of a target scene by using a rendering method to obtain a 2D image, that is, a rendered image. The rendering method may include rasterization rendering and the like. The raster rendering system of this application may include multiple terminal devices 110, a network device 120, and a remote rendering platform 130. The remote rendering platform 130 may specifically be deployed on a public cloud. The remote rendering platform 130 and the terminal devices 110 are generally deployed in different data centers or even different geographic regions.
The terminal device 110 may be a device that needs to display rendered images in real time, for example, a virtual reality (VR) device used for flight training, a computer used for virtual games, or a smartphone used for a virtual mall, which is not specifically limited here. The terminal device may be a device with a high configuration and high performance (for example, multiple cores, a high clock frequency, a large memory, and so on), or a device with a low configuration and low performance (for example, a single core, a low clock frequency, a small memory, and so on). In a specific embodiment, the terminal device 110 may include hardware, an operating system, and a rendering application client.
The network device 120 is configured to transmit data between the terminal device 110 and the remote rendering platform 130 over a communication network of any communication mechanism or communication standard. The communication network may be a wide area network, a local area network, a point-to-point connection, or any combination thereof.
The remote rendering platform 130 includes multiple remote rendering nodes, and each remote rendering node includes, from bottom to top, rendering hardware, a virtualization service, a rendering engine, and a rendering application. The rendering hardware includes computing resources, storage resources, and network resources. The computing resources may adopt a heterogeneous computing architecture, for example, a central processing unit (CPU) + graphics processing unit (GPU) architecture, a CPU + AI chip architecture, or a CPU + GPU + AI chip architecture, which is not specifically limited here. The storage resources may include a memory and so on. The network resources may include a network card and so on. The virtualization service is a service that builds the resources of multiple physical hosts into a unified resource pool through virtualization technology and flexibly isolates mutually independent resources according to users' needs to run users' applications; commonly, the virtualization service may include a virtual machine (VM) service and a container service. The rendering engine may be used to implement image rendering algorithms. The rendering application server may be used to invoke the rendering engine to complete the rendering of rendered images.
The rendering application client on the terminal device 110 and the rendering application server on the remote rendering platform 130 are collectively referred to as a rendering application; common rendering applications may include game applications, VR applications, movie special effects, animation, and so on. A user inputs instructions through the rendering application client, the rendering application client translates them into data and sends the data to the rendering application server, the rendering application server produces a result after processing, and the rendering application client then translates the result into a graphical presentation for the user. It can be said that the rendering application client is an intermediary between the user and the rendering application server. In a specific implementation, the rendering application server may be provided by a rendering application provider, the rendering application client may be provided by the rendering application provider, and the rendering engine may be provided by a cloud service provider. For example, the rendering application may be a game application: the game developer installs the game application server on the remote rendering platform provided by the cloud service provider, and provides the game application client to users for download over the Internet, to be installed on the users' terminal devices. In addition, the cloud service provider also provides the rendering engine, which can provide computing power for the game application. In another specific implementation, the rendering application client, the rendering application server, and the rendering engine may all be provided by the cloud service provider.
The raster rendering system shown in FIG. 1B further includes a management device 140. The management device 140 may be a device provided by a third party other than the users' terminal devices and the cloud service provider's remote rendering platform 130; for example, the management device 140 may be a device provided by a game developer. The game developer can manage the rendering application through the management device 140; for example, the game developer can use the management device 140 to specify the image quality of the initial rendered images that the rendering application server provides to the rendering application client. It can be understood that the management device 140 may be deployed on the remote rendering platform or outside the remote rendering platform, which is not specifically limited here.
In a virtual scene in which multiple users participate, in order for every user to have the sense of reality of being in the scene, different users often need rendered images of the same target scene generated from different angles. The target scene includes a light source and three-dimensional models, and the light produced by the light source is cast on the three-dimensional models. As shown in FIG. 2, assuming the target scene is as shown in the upper part of FIG. 2, when the first user of terminal device 1 observes from a first viewing angle, the rendered image that needs to be generated is as shown on the left of FIG. 2, and when the second user of terminal device 2 observes from a second viewing angle, the rendered image that needs to be generated is as shown on the right of FIG. 2. Terminal device 1 and terminal device 2 may each independently use the resources of the remote rendering platform 130 to perform rasterization rendering on the target scene, so as to obtain rendered images from different angles. Specifically:
Terminal device 1 sends a first rendering request to the remote rendering platform 130 through the network device 120, and the remote rendering platform 130 invokes the rendering engine to perform rasterization rendering on the target scene from the first user's viewing angle according to the first rendering request, so as to obtain a rendered image of the target scene generated from the first user's viewing angle. The first rendering request is used to indicate the first viewing angle and the target scene.
Terminal device 2 sends a second rendering request to the remote rendering platform 130 through the network device 120, and the remote rendering platform 130 invokes the rendering engine to perform rasterization rendering on the target scene from the second user's viewing angle according to the second rendering request, so as to obtain a rendered image of the target scene generated from the second user's viewing angle. The second rendering request is used to indicate the second viewing angle and the target scene.
The process of rasterization rendering by the remote rendering platform may be as follows: as shown in FIG. 3, after receiving the concurrent rendering request 1 to rendering request 9 (not shown) sent by rendering application clients, the rendering application server passes rendering request 1 to rendering request 9 to the rendering engine. The rendering engine generates rendering task 1 according to rendering request 1, rendering task 2 according to rendering request 2, ..., and rendering task 9 according to rendering request 9. Then, the rendering engine schedules first image rendering pipeline 1 to execute rendering task 1 to obtain rendered image 1, schedules first image rendering pipeline 2 to execute rendering task 2 to obtain rendered image 2, ..., and schedules first image rendering pipeline 9 to execute rendering task 9 to obtain rendered image 9. Moreover, different rendering requests occupy different first image rendering pipelines; for example, rendering request 1 to rendering request 9 occupy first image rendering pipeline 1 to first image rendering pipeline 9, respectively, even if different rendering requests render the same target scene from different angles.
Referring to FIG. 4, FIG. 4 is a schematic structural diagram of a first image rendering pipeline provided by this application. As shown in FIG. 4, the first image rendering pipeline provided by this application generally includes an application stage, a geometry stage, and a rasterization stage.
Application stage: this stage typically implements collision detection, acceleration algorithms, input detection, animation, force feedback, texture animation, transformation, simulation, geometric deformation, and some computations that are not performed in other stages.
Geometry stage: this stage typically includes multiple sub-stages such as vertex specification, vertex shader, tessellation, geometry shader, vertex post-processing, primitive assembly, rasterization, fragment shader, and per-sample operations.
Vertex specification is typically used to obtain vertex data. The vertex data is generated according to the three-dimensional models in the target scene; it includes the three-dimensional coordinates of vertices, and may also include vertex normal vectors, vertex colors, and so on. A vertex may be a point on a three-dimensional model, for example, a place where two edges of a polygon in the model intersect, or a common endpoint of two edges in the model.
The vertex shader is typically used to transform the three-dimensional coordinates of vertices from object space to screen/image space. As shown in FIG. 5, the transformation process may be: transforming from object space to world space, then from world space to view space, then from view space to normalized projection space, and then from normalized projection space to screen space. The view space contains a view frustum; the space inside the view frustum is visible from the user's angle, and the space outside the view frustum is invisible from the user's angle.
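This chain of spaces can be written as a product of matrices applied to a homogeneous vertex. The following numeric sketch assumes NumPy and deliberately simplified model, view, and projection matrices; it illustrates the transform order only and is not the disclosed implementation:

    import numpy as np

    # Illustrative vertex-shader transform chain:
    # object space -> world -> view -> clip -> screen.

    def translate(tx, ty, tz):
        m = np.eye(4)
        m[:3, 3] = [tx, ty, tz]
        return m

    model = translate(0.0, 0.0, -5.0)   # place the object in the world
    view = translate(0.0, -1.0, 0.0)    # camera transform (simplified)
    f, near, far = 1.0, 0.1, 100.0      # perspective parameters
    proj = np.array([[f, 0, 0, 0],
                     [0, f, 0, 0],
                     [0, 0, (far + near) / (near - far), 2 * far * near / (near - far)],
                     [0, 0, -1, 0]])

    v_obj = np.array([1.0, 1.0, 0.0, 1.0])  # a vertex in object space
    clip = proj @ view @ model @ v_obj
    ndc = clip[:3] / clip[3]                # perspective divide
    w, h = 1920, 1080                       # viewport (screen space)
    screen = ((ndc[0] + 1) * w / 2, (1 - ndc[1]) * h / 2)
    print(screen)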
Tessellation is used to greatly increase the number of vertices in a three-dimensional model. As shown in FIG. 6, assume the model consists of three vertices forming a triangle. Before tessellation, as shown on the left of FIG. 6, the model has three vertices. After tessellation, as shown on the right of FIG. 6, the number of vertices in the model changes from three to six. It can be seen that before tessellation the model looks rough and rigid, and after tessellation it looks realistic and vivid.
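A toy midpoint subdivision in the spirit of FIG. 6 is sketched below (an illustrative tessellation scheme, not necessarily the one used by the pipeline): one triangle becomes four, and the vertex count grows from three to six.

    # One-level midpoint subdivision of a triangle.
    def midpoint(a, b):
        return tuple((a[i] + b[i]) / 2 for i in range(3))

    def tessellate(tri):
        a, b, c = tri
        ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
        return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

    tri = ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
    tris = tessellate(tri)
    verts = {v for t in tris for v in t}
    print(len(tris), len(verts))  # 4 triangles, 6 vertices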
The geometry shader is used to transform one or more vertices of a three-dimensional model into completely different primitives, thereby generating more vertices.
Vertex post-processing is used to clip primitives: if a primitive lies partly outside and partly inside the view frustum, the part outside the view frustum needs to be clipped away, and only the part inside the view frustum is kept. Taking FIG. 7 as an example, the left half and the right half of FIG. 7 show the primitives before and after clipping against the view frustum, respectively. As shown in the left half of FIG. 7, before clipping, half of the line segment in the lower-left corner lies outside the view frustum and the other half inside, and half of the triangle in the upper-right corner lies outside the view frustum and the other half inside. As shown in the right half of FIG. 7, after clipping, the part of the lower-left line segment beyond the view frustum has been cut off, and a new vertex is generated at the intersection of the line segment and the view frustum; the part of the upper-right triangle beyond the view frustum has been cut off, and two new vertices are generated at the intersections of the triangle's two edges with the view frustum.
Primitive assembly is typically used to assemble the vertices of a three-dimensional model into geometric primitives; this stage produces a series of triangles, line segments, and points. As shown in FIG. 8, the assembled line segments may include (a) independent line segments; (b) line segments connected end to end but not finally closed; (c) line segments connected end to end and finally closed. As shown in FIG. 9, the assembled triangles may include (a) triangles connected in the order in which the points are defined; (b) starting from the first point, every group of three points forming an independent triangle; (c) starting from the third point, each point combined with the previous two points to form a triangle, that is, a linear continuous triangle strip; (d) starting from the third point, each point combined with the previous point and the first point to form a triangle, that is, a fan of continuous triangles. Culling can also be performed at this stage, that is, removing invisible objects from the scene; culling may include frustum culling, viewport culling, and occlusion culling.
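A simplified frustum-culling test is sketched below (an illustrative check under the usual clip-space convention, not the pipeline's exact algorithm): a primitive whose vertices all lie outside the same clip-space plane can be removed entirely.

    # Cull a primitive if all of its clip-space vertices fail the same plane test.
    def outside_same_plane(clip_verts):
        # clip_verts: list of (x, y, z, w) after the projection transform
        tests = [
            lambda v: v[0] < -v[3], lambda v: v[0] > v[3],   # left / right
            lambda v: v[1] < -v[3], lambda v: v[1] > v[3],   # bottom / top
            lambda v: v[2] < -v[3], lambda v: v[2] > v[3],   # near / far
        ]
        return any(all(t(v) for v in clip_verts) for t in tests)

    inside = [(0.0, 0.0, 0.0, 1.0), (0.5, 0.0, 0.0, 1.0), (0.0, 0.5, 0.0, 1.0)]
    left_of_frustum = [(-2.0, 0.0, 0.0, 1.0), (-3.0, 0.5, 0.0, 1.0), (-2.5, 1.0, 0.0, 1.0)]
    print(outside_same_plane(inside), outside_same_plane(left_of_frustum))  # False True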
The rasterization stage includes rasterization, fragment shader, and per-sample operations.
Rasterization is the process of converting vertex data into fragments; it converts a drawing into an image composed of raster cells, with the characteristic that each element corresponds to one pixel in the frame buffer. The first part of the work of rasterization, as shown in FIG. 10, is to determine which integer raster regions in window coordinates are occupied by a primitive; the second part is to assign a color value and a depth value to each of these regions. The rasterization process produces fragments.
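The first part of that work can be illustrated with edge functions, a common way to decide pixel coverage (one possible technique; the patent does not prescribe it):

    # Toy triangle rasterization: which pixel grid cells does a triangle cover?
    def edge(a, b, p):
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

    def rasterize(tri, width, height):
        a, b, c = tri
        fragments = []
        for y in range(height):
            for x in range(width):
                p = (x + 0.5, y + 0.5)  # sample at the pixel centre
                w0, w1, w2 = edge(b, c, p), edge(c, a, p), edge(a, b, p)
                if w0 >= 0 and w1 >= 0 and w2 >= 0:  # counter-clockwise triangle
                    fragments.append((x, y))
        return fragments

    tri = ((0.0, 0.0), (8.0, 0.0), (0.0, 8.0))
    print(len(rasterize(tri, 8, 8)))  # number of covered pixels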
The fragment shader is used to compute the final color output of a pixel.
Per-sample processing includes the depth test and transparency processing. It can be understood that if a nearer object is drawn first and a farther object is drawn afterwards, the farther object, being drawn later, would cover the nearer object, which is not the desired effect. The depth test records the distance of each pixel from the camera in the 3D world (the drawing coordinate): the depth buffer stores a depth value (Z value) for every pixel drawn on the screen, and the larger the Z value, the farther the pixel is from the camera. With a depth buffer, the order in which objects are drawn is therefore no longer so important, and they can all be displayed correctly according to their distance (Z value).
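A minimal depth-test sketch follows (toy buffers and the convention that a smaller z is closer, consistent with "larger Z = farther" above): with a depth buffer, draw order no longer matters, and the fragment closest to the camera wins at each pixel.

    import math

    W, H = 4, 4
    depth = [[math.inf] * W for _ in range(H)]
    color = [["bg"] * W for _ in range(H)]

    def write_fragment(x, y, z, c):
        if z < depth[y][x]:        # smaller z = closer to the camera here
            depth[y][x] = z
            color[y][x] = c

    write_fragment(1, 1, z=2.0, c="far-object")   # drawn first
    write_fragment(1, 1, z=1.0, c="near-object")  # drawn second, but closer
    write_fragment(2, 2, z=3.0, c="far-object")
    write_fragment(2, 2, z=5.0, c="farther")      # rejected by the depth test
    print(color[1][1], color[2][2])  # near-object far-object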
However, for the remote rendering platform to generate the first rendered image from the first user's viewing angle through a first image rendering pipeline and to generate the second rendered image from the second user's viewing angle through another first image rendering pipeline consumes a large amount of computing resources.
To solve the above problem, this application proposes a raster rendering method, device, and system, which can effectively reduce the demand for computing resources.
This application proposes a second image rendering pipeline, which divides the processing in the first image rendering pipeline into two parts: non-perspective-related processing and subsequent processing. The second image rendering pipeline includes only the non-perspective-related processing and does not include the subsequent processing, where the subsequent processing includes perspective-related processing. Non-perspective-related processing is processing independent of the user's viewing angle, and perspective-related processing is processing related to the user's viewing angle. In the non-perspective-related processing, although the angle from which the first user observes the target scene differs from the angle from which the second user observes the target scene, the processing is identical because the target scene is the same. In the perspective-related processing, although the target scene is the same, the processing differs because the angle from which the first user observes the target scene differs from the angle from which the second user observes the target scene.
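The partition can be pictured as two stage lists: the first runs once per target scene, the second once per viewing angle. The sketch below uses the grouping of the first implementation in FIG. 11A; the stage bodies are placeholders, not real shader work:

    # Illustrative partition of the pipeline into non-perspective-related
    # stages and subsequent (perspective-related) stages.
    NON_VIEW_STAGES = ["vertex specification", "vertex shader"]
    SUBSEQUENT_STAGES = ["tessellation", "geometry shader",
                         "vertex post-processing (clipping)",
                         "primitive assembly (culling)",
                         "rasterization", "fragment shader",
                         "per-sample operations"]

    def run_stages(stages, data, angle=None):
        for stage in stages:
            tag = f"{stage}@{angle}" if angle is not None else stage
            data = data + [tag]     # stand-in for the real stage computation
        return data

    shared = run_stages(NON_VIEW_STAGES, [])          # once per target scene
    img1 = run_stages(SUBSEQUENT_STAGES, shared, 30)  # once per viewing angle
    img2 = run_stages(SUBSEQUENT_STAGES, shared, 60)
    print(len(shared), len(img1), len(img2))          # 2 9 9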
Referring to FIG. 11A to FIG. 11C, FIG. 11A to FIG. 11C are schematic diagrams of some second image rendering pipelines provided by this application. As shown in FIG. 11A to FIG. 11C, the second image rendering pipeline may be implemented in the following ways:
In the first implementation, as shown in FIG. 11A, the non-perspective-related processing in the second image rendering pipeline may include vertex specification and the vertex shader. The subsequent processing may include tessellation, the geometry shader, vertex post-processing, primitive assembly, rasterization, the fragment shader, and per-sample processing, and the perspective-related processing includes clipping and culling. Clipping may occur in any one of the tessellation, vertex post-processing, and primitive assembly stages, and culling may occur in the primitive assembly stage.
In the second implementation, as shown in FIG. 11B, the non-perspective-related processing in the second image rendering pipeline may include vertex specification, the vertex shader, and tessellation. The subsequent processing may include the geometry shader, vertex post-processing, primitive assembly, rasterization, the fragment shader, and per-sample processing, and the perspective-related processing includes clipping and culling. Clipping may occur in any one of the geometry shader, vertex post-processing, and primitive assembly stages, and culling may occur in the primitive assembly stage.
In the third implementation, as shown in FIG. 11C, the non-perspective-related processing in the second image rendering pipeline may include vertex shader processing, tessellation, and the geometry shader. The subsequent processing may include vertex post-processing, primitive assembly, rasterization, the fragment shader, and per-sample processing, and the perspective-related processing includes clipping and culling. Clipping may occur in either the vertex post-processing or the primitive assembly stage, and culling may occur in the primitive assembly stage.
In addition to the foregoing embodiments, the non-perspective-related processing may further include some other processing independent of the viewing angle; for example, the part of primitive assembly that assembles the vertices of the three-dimensional model into points, line segments, and triangles (excluding the clipping processing) may also be placed in the non-perspective-related processing.
Referring to FIG. 12, FIG. 12 is a schematic flowchart of a raster rendering method provided by this application. As shown in FIG. 12, the raster rendering method of this embodiment includes:
S101: The rendering application client of the first terminal device sends a first rendering request to the rendering application server of the remote rendering platform. Correspondingly, the rendering application server of the remote rendering platform receives the first rendering request sent by the rendering application client of the first terminal device, where the first rendering request is used to indicate the target scene and the angle from which the first user observes the target scene.
In a specific embodiment of this application, the first rendering request may be implemented in the following ways:
In the first way, the first rendering request includes an identifier of the target scene and the angle from which the first user observes the target scene. It can be understood that the remote rendering platform may pre-store the correspondence between the identifier of a target scene and the data of the target scene, for example, geometry data, texture data, material data, and so on. Therefore, the remote rendering platform can look up the data of the corresponding target scene through the identifier of the target scene to perform rendering. The geometry data may include the vertex data of each mesh in the target scene, the texture data may include the color of each mesh in the target scene, and the material data may include the material of each mesh in the target scene, for example, metal, mirror, and diffuse materials. As shown in FIG. 13, the angle from which the first user observes the target scene may be expressed as (P1, θ1), where P1 is the perpendicular distance from the first user's viewpoint E1 to the rendered image, and θ1 is the angle between the line from the viewpoint E1 to the center point O of the rendered image and the horizontal line.
In the second way, the first rendering request includes the geometry data, texture data, and material data of the target scene and the angle from which the first user observes the target scene. For example, the first rendering request may include all of the geometry data, texture data, and material data of the target scene and the angle from which the first user observes the target scene; alternatively, the first rendering request may include the geometry data, texture data, and material data of the target scene that have changed relative to the previous scene, and the angle from which the first user observes the target scene. It can be understood that in this case the remote rendering platform does not need to pre-store the correspondence between the identifier of the target scene and the geometry data, texture data, and material data of the target scene.
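The two request forms can be written out as simple data structures. The field names below are illustrative assumptions, not the patent's wire format:

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class ViewAngle:
        p: float      # P1: perpendicular distance from viewpoint E1 to the rendered image
        theta: float  # θ1: angle between the line to the image centre O and the horizontal

    @dataclass
    class RenderRequestById:
        scene_id: str            # the platform resolves this to stored scene data
        angle: ViewAngle

    @dataclass
    class RenderRequestWithData:
        geometry: list = field(default_factory=list)   # mesh vertex data
        textures: dict = field(default_factory=dict)   # per-mesh colours
        materials: dict = field(default_factory=dict)  # metal, mirror, diffuse...
        angle: Optional[ViewAngle] = None

    req = RenderRequestById("scene-42", ViewAngle(p=2.5, theta=0.3))
    print(req)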
S102: The rendering application client of the second terminal device sends a second rendering request to the rendering application server of the remote rendering platform. Correspondingly, the rendering application server of the remote rendering platform receives the second rendering request sent by the rendering application client of the second terminal device, where the second rendering request is used to indicate the target scene and the angle from which the second user observes the target scene.
In a specific embodiment of this application, the content included in the second rendering request is similar to the content included in the first rendering request; for details, please refer to the description of the first rendering request, which will not be repeated here.
S103: The rendering engine of the remote rendering platform invokes the second image rendering pipeline to perform non-perspective-related processing on the data of the target scene to obtain non-perspective processing data.
S104: The remote rendering platform uses the rendering engine to perform subsequent processing, including perspective-related processing, according to the angle from which the first user observes the target scene and the non-perspective processing data, to obtain a first rendered image.
S105: The remote rendering platform uses the rendering engine to perform subsequent processing, including perspective-related processing, according to the angle from which the second user observes the target scene and the non-perspective processing data, to obtain a second rendered image.
S106: The rendering application server of the remote rendering platform sends the first rendered image to the rendering application client of the first terminal device. Correspondingly, the rendering application client of the first terminal device receives the first rendered image sent by the rendering application server of the remote rendering platform.
S107: The rendering application server of the remote rendering platform sends the second rendered image to the rendering application client of the second terminal device. Correspondingly, the rendering application client of the second terminal device receives the second rendered image sent by the rendering application server of the remote rendering platform.
In a specific embodiment of this application, after the remote rendering platform receives the first rendering request and the second rendering request, it generates a merged rendering task according to the first rendering request and the second rendering request, and invokes the second image rendering pipeline to perform non-perspective-related processing to obtain the non-perspective processing data. That is, the first rendering request and the second rendering request generate one common merged rendering task and use the same second image rendering pipeline to perform the non-perspective-related processing to obtain the non-perspective processing data. Then, the remote rendering platform copies the non-perspective processing data into two copies: one copy is used for subsequent processing combined with the angle from which the first user observes the target scene to obtain the first rendered image, and the other copy is used for subsequent processing combined with the angle from which the second user observes the target scene to obtain the second rendered image. Steps S103 to S105 are described in detail below with reference to FIG. 11A to FIG. 11C.
In a specific embodiment, a detailed description is given with reference to the second image rendering pipeline shown in FIG. 11A. As shown in FIG. 14A, after receiving the first rendering request and the second rendering request, the remote rendering platform invokes the second image rendering pipeline to perform vertex specification and vertex shader processing on the data of the target scene, so as to obtain vertex-shaded data, and then copies the vertex-shaded data into two copies. The remote rendering platform performs tessellation, geometry shader, vertex post-processing (including clipping), primitive assembly (including culling), rasterization, fragment shader, and per-sample processing according to the angle from which the first user observes the target scene and the first copy of the vertex-shaded data to obtain the first rendered image, and performs the same subsequent stages according to the angle from which the second user observes the target scene and the second copy of the vertex-shaded data to obtain the second rendered image. That is, although the remote rendering platform received both the first rendering request and the second rendering request, it performed vertex specification and vertex shader processing only once, and therefore computing resources can be effectively saved.
In a specific embodiment, a detailed description is given with reference to the second image rendering pipeline shown in FIG. 11B. As shown in FIG. 14B, after receiving the first rendering request and the second rendering request, the remote rendering platform performs vertex specification, vertex shader, and tessellation processing on the data of the target scene, so as to obtain tessellated data, and copies the tessellated data into two copies. The remote rendering platform then performs geometry shader, vertex post-processing (including clipping), primitive assembly (including culling), rasterization, fragment shader, and per-sample processing according to the angle from which the first user observes the target scene and the first copy of the tessellated data to obtain the first rendered image, and performs the same subsequent stages according to the angle from which the second user observes the target scene and the second copy of the tessellated data to obtain the second rendered image. That is, although the remote rendering platform received both rendering requests, it performed vertex specification, vertex shader, and tessellation processing only once, and therefore computing resources can be effectively saved.
In a specific embodiment, a detailed description is given with reference to the second image rendering pipeline shown in FIG. 11C. As shown in FIG. 14C, after receiving the first rendering request and the second rendering request, the remote rendering platform performs vertex specification, vertex shader, tessellation, and geometry shader processing on the data of the target scene, so as to obtain geometry-shaded data, and copies the geometry-shaded data into two copies. The remote rendering platform then performs vertex post-processing (including clipping), primitive assembly (including culling), rasterization, fragment shader, and per-sample processing according to the angle from which the first user observes the target scene and the first copy of the geometry-shaded data to obtain the first rendered image, and performs the same subsequent stages according to the angle from which the second user observes the target scene and the second copy of the geometry-shaded data to obtain the second rendered image. That is, although the remote rendering platform received both rendering requests, it performed vertex specification, vertex shader, tessellation, and geometry shader processing only once, and therefore computing resources can be effectively saved.
It can be understood that if the rendering system further includes a third terminal device and a fourth terminal device that also need to render the target scene to obtain a third rendered image and a fourth rendered image, the remote rendering platform still needs to perform the non-perspective-related processing only once rather than four separate times, which can further reduce the waste of computing resources.
It can be understood that the subsequent processing corresponding to each rendering request may also be handled by invoking a third image rendering pipeline.
In a specific embodiment of this application, the remote rendering platform may start a first thread to establish the second image rendering pipeline and perform non-perspective-related processing on the data of the target scene to obtain the non-perspective processing data, may start a second thread to perform subsequent processing, including perspective-related processing, according to the angle from which the first user observes the target scene and the non-perspective processing data to obtain the first rendered image, and may start a third thread to perform subsequent processing, including perspective-related processing, according to the angle from which the second user observes the target scene and the non-perspective processing data to obtain the second rendered image.
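This threading arrangement can be sketched as follows (illustrative function names and toy data; the real pipeline work is replaced by placeholders):

    import threading

    def non_view_processing(scene, out):
        out["shared"] = [v * 2 for v in scene]   # first thread: shared pass

    def subsequent_processing(shared, angle, images):
        images[angle] = f"image@{angle} from {len(shared)} verts"

    scene, out, images = [1, 2, 3], {}, {}

    t1 = threading.Thread(target=non_view_processing, args=(scene, out))
    t1.start(); t1.join()  # the shared data must exist before the view passes

    t2 = threading.Thread(target=subsequent_processing, args=(out["shared"], 30, images))
    t3 = threading.Thread(target=subsequent_processing, args=(out["shared"], 60, images))
    t2.start(); t3.start(); t2.join(); t3.join()
    print(images)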
Steps S104 and S105 may be executed in parallel or sequentially in any order, and steps S106 and S107 may be executed in parallel or sequentially in any order.
The above solution is described by using the following processing order within the image rendering pipeline as an example: vertex shader, tessellation, geometry shader, vertex post-processing (including clipping), primitive assembly (including culling), rasterization, fragment shader, and per-sample processing. In practical applications, the processing order within the image rendering pipeline may change, which is not specifically limited here.
Referring to FIG. 15, FIG. 15 is a schematic diagram of scheduling multiple rendering requests to be executed by multiple second image rendering pipelines and multiple third image rendering pipelines involved in this application.
The process of rasterization rendering by the remote rendering platform may be as follows: as shown in FIG. 15, after receiving the concurrent rendering request 1 to rendering request 9 (not shown) sent by rendering application clients, the rendering application server passes rendering request 1 to rendering request 9 to the rendering engine.
Since the identifier of the target scene carried in rendering request 1 is the same as that carried in rendering request 2, the rendering engine generates rendering task 1, invokes second image rendering pipeline 1 to perform non-perspective-related processing to obtain non-perspective data 1, and sends non-perspective data 1 to third image rendering pipeline 1 and third image rendering pipeline 2 respectively for subsequent processing including perspective-related processing, to obtain rendered image 1 and rendered image 2 respectively.
Since the identifier of the target scene carried in rendering request 3 is the same as that carried in rendering request 4, the rendering engine generates rendering task 2, invokes second image rendering pipeline 2 to perform non-perspective-related processing to obtain non-perspective data 2, and sends non-perspective data 2 to third image rendering pipeline 3 and third image rendering pipeline 4 respectively for subsequent processing including perspective-related processing, to obtain rendered image 3 and rendered image 4 respectively.
According to the identifier of the target scene carried in rendering request 5, the rendering engine generates rendering task 3, invokes second image rendering pipeline 3 to perform non-perspective-related processing to obtain non-perspective data 3, and sends non-perspective data 3 to third image rendering pipeline 5 for subsequent processing including perspective-related processing, to obtain rendered image 5.
Since the identifiers of the target scene carried in rendering request 6, rendering request 7, and rendering request 8 are the same, the rendering engine generates rendering task 4, invokes second image rendering pipeline 4 to perform non-perspective-related processing to obtain non-perspective data 4, and sends non-perspective data 4 to third image rendering pipeline 6, third image rendering pipeline 7, and third image rendering pipeline 8 respectively for subsequent processing including perspective-related processing, to obtain rendered image 6, rendered image 7, and rendered image 8 respectively.
According to the identifier of the target scene carried in rendering request 9, the rendering engine generates rendering task 5, invokes second image rendering pipeline 5 to perform non-perspective-related processing to obtain non-perspective data 5, and sends non-perspective data 5 to third image rendering pipeline 9 for subsequent processing including perspective-related processing, to obtain rendered image 9.
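The grouping logic of FIG. 15 can be sketched as follows; the request tuples are toy data, and the grouping key (the target-scene identifier) is the only assumption taken from the text above:

    from collections import defaultdict

    requests = [  # (request id, scene id, viewing angle) – toy data
        (1, "A", 10), (2, "A", 20), (3, "B", 15), (4, "B", 25),
        (5, "C", 30), (6, "D", 5), (7, "D", 35), (8, "D", 55), (9, "E", 40),
    ]

    tasks = defaultdict(list)          # one rendering task per target scene
    for req_id, scene, angle in requests:
        tasks[scene].append((req_id, angle))

    images = {}
    for scene, members in tasks.items():
        non_view = f"non-view data({scene})"   # one shared second-pipeline pass
        for req_id, angle in members:
            images[req_id] = f"{non_view} + view pass@{angle}"  # third pipeline
    print(len(tasks), "shared passes,", len(images), "rendered images")  # 5 and 9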
Referring to FIG. 16, FIG. 16 is a schematic flowchart of a raster rendering method provided by this application. As shown in FIG. 16, the raster rendering method of this embodiment includes:
S201: The rendering application client of the first terminal device sends a first rendering request to the rendering application server of the remote rendering platform. Correspondingly, the rendering application server of the remote rendering platform receives the first rendering request sent by the rendering application client of the first terminal device, where the first rendering request is used to indicate the target scene and the angle from which the first user observes the target scene.
S202: The remote rendering platform invokes the rendering engine to invoke, according to the first rendering request, the second image rendering pipeline to perform non-perspective-related processing on the data of the target scene to obtain non-perspective processing data.
S203: The remote rendering platform invokes the rendering engine to perform subsequent processing, including perspective-related processing, according to the angle from which the first user observes the target scene and the non-perspective processing data, to obtain a first rendered image.
S204: The rendering application server of the remote rendering platform sends the first rendered image to the rendering application client of the first terminal device. Correspondingly, the rendering application client of the first terminal device receives the first rendered image sent by the rendering application server of the remote rendering platform.
S205: The rendering application client of the second terminal device sends a second rendering request to the rendering application server of the remote rendering platform. Correspondingly, the rendering application server of the remote rendering platform receives the second rendering request sent by the rendering application client of the second terminal device, where the second rendering request is used to indicate the target scene and the angle from which the second user observes the target scene.
S206: The remote rendering platform invokes the rendering engine to perform subsequent processing, including perspective-related processing, according to the angle from which the second user observes the target scene and the non-perspective processing data, to obtain a second rendered image.
S207: The rendering application server of the remote rendering platform sends the second rendered image to the rendering application client of the second terminal device. Correspondingly, the rendering application client of the second terminal device receives the second rendered image sent by the rendering application server of the remote rendering platform.
In the foregoing embodiment, the definitions of the first rendering request, the second rendering request, the non-perspective-related processing, the perspective-related processing, and the subsequent processing are the same as those in the raster rendering method shown in FIG. 12; for details, please refer to the related content in the embodiment corresponding to FIG. 12, which will not be described again here.
Referring to FIG. 17, FIG. 17 is a schematic flowchart of a raster rendering method provided by this application. As shown in FIG. 17, the raster rendering method of this embodiment includes:
S301: The rendering application client of the first terminal device sends a first rendering request to the rendering application server of the remote rendering platform. Correspondingly, the rendering application server of the remote rendering platform receives the first rendering request sent by the rendering application client of the first terminal device, where the first rendering request is used to indicate the target scene.
S302: The rendering application client of the second terminal device sends a second rendering request to the rendering application server of the remote rendering platform. Correspondingly, the rendering application server of the remote rendering platform receives the second rendering request sent by the rendering application client of the second terminal device, where the second rendering request is used to indicate the target scene.
S303: The remote rendering platform invokes, through the rendering engine, the second image rendering pipeline to perform non-perspective-related processing on the data of the target scene, so as to obtain non-perspective processing data.
S304: The remote rendering platform sends the non-perspective processing data to the rendering application client of the first terminal device. Correspondingly, the rendering application client of the first terminal device receives the non-perspective processing data sent by the remote rendering platform.
S305: The remote rendering platform sends the non-perspective processing data to the rendering application client of the second terminal device. Correspondingly, the rendering application client of the second terminal device receives the non-perspective processing data sent by the remote rendering platform.
S306: The rendering application client of the first terminal device performs subsequent processing, including perspective-related processing, so as to obtain a first rendered image.
S307: The rendering application client of the second terminal device performs subsequent processing, including perspective-related processing, so as to obtain a second rendered image.
Steps S304 and S305 may be executed in parallel or sequentially in any order, and steps S306 and S307 may be executed in parallel or sequentially in any order.
In the foregoing embodiment, the definitions of the first rendering request, the second rendering request, the non-perspective-related processing, the perspective-related processing, and the subsequent processing are the same as those in the raster rendering method shown in FIG. 12; for details, please refer to the related content in the embodiment corresponding to FIG. 12, which will not be described again here.
Referring to FIG. 18, FIG. 18 is a schematic flowchart of a raster rendering method provided by this application. As shown in FIG. 18, the raster rendering method of this embodiment includes:
S401: The rendering application client of the terminal device sends a first rendering request to the rendering application server of the remote rendering platform. Correspondingly, the rendering application server of the remote rendering platform receives the first rendering request sent by the rendering application client of the terminal device, where the first rendering request is used to indicate the target scene and the first angle from which the user observes the target scene.
S402: The remote rendering platform invokes, through the rendering engine, the second image rendering pipeline to perform non-perspective-related processing on the data of the target scene to obtain non-perspective processing data.
S403: The remote rendering platform uses the rendering engine to perform subsequent processing, including perspective-related processing, according to the first angle from which the user observes the target scene and the non-perspective processing data, to obtain a first rendered image.
S404: The rendering application server of the remote rendering platform sends the first rendered image to the rendering application client of the terminal device. Correspondingly, the rendering application client of the terminal device receives the first rendered image sent by the rendering application server of the remote rendering platform.
S405: The rendering application client of the terminal device sends a second rendering request to the rendering application server of the remote rendering platform. Correspondingly, the rendering application server of the remote rendering platform receives the second rendering request sent by the rendering application client of the terminal device, where the second rendering request is used to indicate the target scene and the second angle from which the user observes the target scene.
S406: The remote rendering platform uses the rendering engine to perform subsequent processing, including perspective-related processing, according to the second angle from which the user observes the target scene and the non-perspective processing data, to obtain a second rendered image.
S407: The rendering application server of the remote rendering platform sends the second rendered image to the rendering application client of the terminal device. Correspondingly, the rendering application client of the terminal device receives the second rendered image sent by the rendering application server of the remote rendering platform.
In the foregoing embodiment, the definitions of the first rendering request, the second rendering request, the non-perspective-related processing, the perspective-related processing, and the subsequent processing are the same as those in the raster rendering method shown in FIG. 12; for details, please refer to the related content in the embodiment corresponding to FIG. 12, which will not be described again here.
Referring to FIG. 19, FIG. 19 is a schematic flowchart of a raster rendering method provided by this application. As shown in FIG. 19, the raster rendering method of this embodiment includes:
S501: The management device sends a first rendering request to the rendering application server of the remote rendering platform. Correspondingly, the rendering application server of the remote rendering platform receives the first rendering request sent by the management device, where the first rendering request is used to indicate the target scene and the angle from which the first user observes the target scene.
S502: The management device sends a second rendering request to the rendering application server of the remote rendering platform. Correspondingly, the rendering application server of the remote rendering platform receives the second rendering request sent by the management device, where the second rendering request is used to indicate the target scene and the angle from which the second user observes the target scene.
In a specific embodiment of this application, the content included in the second rendering request is similar to the content included in the first rendering request; for details, please refer to the description of the first rendering request, which will not be repeated here.
S503: The rendering engine of the remote rendering platform invokes the second image rendering pipeline to perform non-perspective-related processing on the data of the target scene to obtain non-perspective processing data.
S504: The remote rendering platform uses the rendering engine to perform subsequent processing, including perspective-related processing, according to the angle from which the first user observes the target scene and the non-perspective processing data, to obtain a first rendered image.
S505: The remote rendering platform uses the rendering engine to perform subsequent processing, including perspective-related processing, according to the angle from which the second user observes the target scene and the non-perspective processing data, to obtain a second rendered image.
S506: The rendering application server of the remote rendering platform sends the first rendered image to the rendering application client of the first terminal device. Correspondingly, the rendering application client of the first terminal device receives the first rendered image sent by the rendering application server of the remote rendering platform.
S507: The rendering application server of the remote rendering platform sends the second rendered image to the rendering application client of the second terminal device. Correspondingly, the rendering application client of the second terminal device receives the second rendered image sent by the rendering application server of the remote rendering platform.
In the foregoing embodiment, the definitions of the first rendering request, the second rendering request, the non-perspective-related processing, the perspective-related processing, and the subsequent processing are the same as those in the raster rendering method shown in FIG. 12; for details, please refer to the related content in the embodiment corresponding to FIG. 12, which will not be described again here.
Referring to FIG. 20, FIG. 20 is a schematic structural diagram of a rendering application server provided by this application. As shown in FIG. 20, the rendering application server of this embodiment includes a communication module 210 and a rendering module 220.
The communication module 210 is configured to receive a first rendering request and a second rendering request, where the first rendering request indicates a target scene and a first angle from which the target scene is observed, and the second rendering request indicates the target scene and a second angle from which the target scene is observed.
The rendering module 220 is configured to perform, through the rendering engine, non-perspective-related processing on the data of the target scene to obtain non-perspective processing data.
The rendering module 220 is configured to perform, through the rendering engine, subsequent processing including perspective-related processing according to the first angle from which the target scene is observed and the non-perspective processing data, to obtain a first rendered image.
The rendering module 220 is configured to perform, through the rendering engine, subsequent processing including perspective-related processing according to the second angle from which the target scene is observed and the non-perspective processing data, to obtain a second rendered image.
In the foregoing embodiment, the definitions of the first rendering request, the second rendering request, the non-perspective-related processing, the perspective-related processing, and the subsequent processing are the same as those in the raster rendering method shown in FIG. 12; for details, please refer to the related content in the embodiment corresponding to FIG. 12, which will not be described again here. Moreover, the rendering application server shown in FIG. 20 can execute the steps performed by the rendering application server of the remote rendering platform in FIG. 12, FIG. 16, FIG. 17, FIG. 18, and FIG. 19; for details, please refer to FIG. 12, FIG. 16, FIG. 17, FIG. 18, FIG. 19 and the related descriptions, which are not specifically limited here.
Referring to FIG. 21, FIG. 21 is a schematic structural diagram of a computing node provided by this application. The computing node of this embodiment may include a processor 410, a memory 420, a network card 430, and a bus 440. The computing node may specifically be the rendering node in FIG. 1A or FIG. 1B.
The processor 410 may be one or more general-purpose processors, where a general-purpose processor may be any type of device capable of processing electronic instructions, including a central processing unit (CPU), a microprocessor, a microcontroller, a main processor, a controller, an application-specific integrated circuit (ASIC), and so on. The processor 410 executes various types of digitally stored instructions, such as software or firmware programs stored in the memory 420. In a specific embodiment, the processor 410 may be an x86 processor or the like. The processor 410 sends commands to the memory 420 through a physical interface to complete storage-related tasks.
The memory 420 may include a read-only memory (ROM), a hard disk drive (HDD), or a solid-state drive (SSD). The memory 420 may be used to store the first quality parameter, the second quality parameter, the first rendered image, the second rendered image, and the difference data.
The network card 430 is also known as a network interface controller, a network interface card, or a local area network (LAN) adapter.
Optionally, the computing node may further include one or more of an input device and an output device, where the input device may be a mouse, a keyboard, and the like, and the output device may include a display and the like.
It can be understood that the computing node shown in FIG. 21 can perform the steps performed by the remote rendering platform in FIG. 12, FIG. 16, FIG. 17, FIG. 18, and FIG. 19; for details, please refer to FIG. 12, FIG. 16, FIG. 17, FIG. 18, FIG. 19 and the related descriptions, which are not specifically limited here.
In the foregoing solutions, when the first rendering request and the second rendering request concern the same target scene, the non-perspective-related processing needs to be computed only once, which can effectively reduce the amount of computation required for rendering.
All or part of the foregoing embodiments may be implemented by software, hardware, firmware, or any combination thereof. When software is used for implementation, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, all or part of the processes or functions described in the embodiments of this application are generated. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from a website site, computer, server, or data center to another website site, computer, server, or data center in a wired (for example, coaxial cable, optical fiber, or digital subscriber line) or wireless (for example, infrared, radio, or microwave) manner. The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or a data center that integrates one or more available media. The available media may be magnetic media (for example, floppy disks, storage disks, magnetic tapes), optical media (for example, DVDs), or semiconductor media (for example, a solid-state drive (SSD)), among others.

Claims (10)

  1. A rendering method, applied to a rendering application server, wherein the rendering application server belongs to a rendering system, the rendering system comprises a rendering application client and a rendering engine, and the rendering application server and the rendering engine are deployed on a remote rendering node, the method comprising:
    receiving a first rendering request and a second rendering request, wherein the first rendering request indicates a target scene and a first angle from which the target scene is observed, and the second rendering request indicates the target scene and a second angle from which the target scene is observed;
    performing, through the rendering engine, non-perspective-related processing on data of the target scene to obtain non-perspective processing data;
    performing, through the rendering engine, processing comprising perspective-related processing according to the first angle from which the target scene is observed and the non-perspective processing data, to obtain a first rendered image; and
    performing, through the rendering engine, processing comprising perspective-related processing according to the second angle from which the target scene is observed and the non-perspective processing data, to obtain a second rendered image.
  2. The method according to claim 1, wherein the performing, through the rendering engine, non-perspective-related processing on the data of the target scene to obtain non-perspective processing data comprises:
    invoking, through the rendering engine, a target image rendering pipeline to perform non-perspective-related processing on the data of the target scene.
  3. The method according to claim 1 or 2, wherein the non-perspective-related processing does not comprise processing related to the first angle from which the target scene is observed or the second angle from which the target scene is observed.
  4. The method according to any one of claims 1 to 3, wherein the perspective-related processing comprises processing related to the first angle from which the target scene is observed or the second angle from which the target scene is observed.
  5. The method according to any one of claims 1 to 4, wherein the non-perspective-related processing comprises one or more of vertex specification, vertex shader processing, tessellation, and geometry shader processing, and the perspective-related processing comprises one or more of clipping and culling.
  6. The method according to any one of claims 1 to 5, wherein
    the first rendering request comprises: an identifier of the target scene and the first angle from which the target scene is observed; or geometry data, texture data, and material data of some or all of the meshes in the target scene, and the first angle from which the target scene is observed.
  7. A rendering application server, wherein the rendering application server comprises a communication module and a rendering module;
    the communication module is configured to receive a first rendering request and a second rendering request, wherein the first rendering request indicates a target scene and a first angle from which the target scene is observed, and the second rendering request indicates the target scene and a second angle from which the target scene is observed;
    the rendering module is configured to perform, through the rendering engine, non-perspective-related processing on data of the target scene to obtain non-perspective processing data;
    the rendering module is configured to perform, through the rendering engine, processing comprising perspective-related processing according to the first angle from which the target scene is observed and the non-perspective processing data, to obtain a first rendered image; and
    the rendering module is configured to perform, through the rendering engine, processing comprising perspective-related processing according to the second angle from which the target scene is observed and the non-perspective processing data, to obtain a second rendered image.
  8. A rendering system, wherein the rendering system comprises a rendering application server, a rendering application client, and a rendering engine, wherein the rendering application server and the rendering engine are deployed on a remote rendering node;
    the rendering application server is configured to receive a first rendering request and a second rendering request, wherein the first rendering request indicates a target scene and a first angle from which the target scene is observed, and the second rendering request indicates the target scene and a second angle from which the target scene is observed;
    the rendering application server is configured to perform, through the rendering engine, non-perspective-related processing on data of the target scene to obtain non-perspective processing data;
    the rendering application server is configured to perform, through the rendering engine, processing comprising perspective-related processing according to the first angle from which the target scene is observed and the non-perspective processing data, to obtain a first rendered image; and
    the rendering application server is configured to perform, through the rendering engine, processing comprising perspective-related processing according to the second angle from which the target scene is observed and the non-perspective processing data, to obtain a second rendered image.
  9. A computing node, wherein the computing node comprises a processor and a memory, and the processor executes a program in the memory to perform the method according to any one of claims 1 to 6.
  10. A computer-readable storage medium, comprising instructions, wherein when the instructions are run on a computing node, the computing node is caused to perform the method according to any one of claims 1 to 6.
PCT/CN2021/133713 2020-12-21 2021-11-26 Rendering method, device and system WO2022135050A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21909049.5A EP4258218A4 (en) 2020-12-21 2021-11-26 DISPLAY METHOD, APPARATUS AND SYSTEM
US18/338,835 US20230351671A1 (en) 2020-12-21 2023-06-21 Rendering Method, Device, and Rendering System

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011521452.6 2020-12-21
CN202011521452.6A 2020-12-21 2021-11-26 Rendering method, device and system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/338,835 Continuation US20230351671A1 (en) 2020-12-21 2023-06-21 Rendering Method, Device, and Rendering System

Publications (1)

Publication Number Publication Date
WO2022135050A1 true WO2022135050A1 (zh) 2022-06-30

Family

ID=82157360

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/133713 WO2022135050A1 (zh) 2020-12-21 2021-11-26 渲染方法、设备以及系统

Country Status (4)

Country Link
US (1) US20230351671A1 (zh)
EP (1) EP4258218A4 (zh)
CN (1) CN114723862A (zh)
WO (1) WO2022135050A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102196300A (en) * 2010-03-18 2011-09-21 International Business Machines Corporation Method and device for providing an image of a virtual world scene, and method and device for processing the same
US20180033203A1 (en) * 2016-08-01 2018-02-01 Dell Products, Lp System and method for representing remote participants to a meeting
CN110163943A (en) * 2018-11-21 2019-08-23 Shenzhen Tencent Information Technology Co., Ltd. Image rendering method and apparatus, storage medium, and electronic apparatus
CN111191060A (en) * 2019-12-13 2020-05-22 Foshan Oceano Cloud Commerce Technology Co., Ltd. Real-time 3D model rendering method, apparatus, and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9196081B2 (en) * 2011-12-15 2015-11-24 Intel Corporation Techniques for enhancing multiple view performance in a three dimensional pipeline
CN111381967A (en) * 2020-03-09 2020-07-07 China United Network Communications Group Co., Ltd. Virtual object processing method and apparatus

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102196300A (en) * 2010-03-18 2011-09-21 International Business Machines Corporation Method and device for providing an image of a virtual world scene, and method and device for processing the same
US20180033203A1 (en) * 2016-08-01 2018-02-01 Dell Products, Lp System and method for representing remote participants to a meeting
CN110163943A (en) * 2018-11-21 2019-08-23 Shenzhen Tencent Information Technology Co., Ltd. Image rendering method and apparatus, storage medium, and electronic apparatus
CN111191060A (en) * 2019-12-13 2020-05-22 Foshan Oceano Cloud Commerce Technology Co., Ltd. Real-time 3D model rendering method, apparatus, and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4258218A4

Also Published As

Publication number Publication date
EP4258218A4 (en) 2024-06-19
EP4258218A1 (en) 2023-10-11
US20230351671A1 (en) 2023-11-02
CN114723862A (zh) 2022-07-08

Similar Documents

Publication Publication Date Title
US10614549B2 (en) Varying effective resolution by screen location by changing active color sample count within multiple render targets
EP3673463B1 (en) Rendering an image from computer graphics using two rendering computing devices
US10311548B2 (en) Scaling render targets to a higher rendering resolution to display higher quality video frames
US9928637B1 (en) Managing rendering targets for graphics processing units
US20240096007A1 (en) Rendering Method, Device, and System
CN111340928A (zh) Web-based real-time hybrid rendering method combining ray tracing, apparatus, and computer device
EP4213102A1 (en) Rendering method and apparatus, and device
US10078911B2 (en) System, method, and computer program product for executing processes involving at least one primitive in a graphics processor, utilizing a data structure
CN113076152B (zh) Rendering method and apparatus, electronic device, and computer-readable storage medium
KR20170040698A (ko) Method and apparatus for performing a graphics pipeline
US20230206567A1 (en) Geometry-aware augmented reality effects with real-time depth map
CN112316433A (zh) Game image rendering method and apparatus, server, and storage medium
CN113838184A (zh) Rendering method, device, and system
US20150015574A1 (en) System, method, and computer program product for optimizing a three-dimensional texture workflow
WO2021249358A1 (zh) Rendering method, device, and system
JP7160495B2 (ja) Image preprocessing method, apparatus, electronic device, and storage medium
US11302054B2 (en) Varying effective resolution by screen location by changing active color sample count within multiple render targets
US20230316626A1 (en) Image rendering method and apparatus, computer device, and computer-readable storage medium
WO2022135050A1 (zh) Rendering method, device, and system
US10062140B2 (en) Graphics processing systems
CN114327790A (zh) Rendering method based on an Android container on a Linux system
CN115705668A (zh) View rendering method, apparatus, and storage medium
KR102713170B1 (ko) Geometry-aware augmented reality effects with a real-time depth map
WO2023142756A1 (zh) 直播互动方法、装置以及系统
RU2810701C2 (ru) Hybrid rendering

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21909049

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021909049

Country of ref document: EP

Effective date: 20230703

NENP Non-entry into the national phase

Ref country code: DE