WO2023193639A1 - Image rendering method and apparatus, readable medium and electronic device - Google Patents

Image rendering method and apparatus, readable medium and electronic device

Info

Publication number
WO2023193639A1
Authority
WO
WIPO (PCT)
Prior art keywords
height
shadow
pixel
pixel point
scene image
Prior art date
Application number
PCT/CN2023/084542
Other languages
English (en)
Chinese (zh)
Inventor
金祎
Original Assignee
北京字跳网络技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京字跳网络技术有限公司
Publication of WO2023193639A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G06T15/20 Perspective computation
    • G06T15/205 Image-based rendering
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G06T15/60 Shadow generation

Definitions

  • the present disclosure relates to the technical field of image processing, and specifically, to an image rendering method, device, readable medium, electronic equipment, computer program product, and computer program.
  • a depth map of the scene is first rendered from the light source position, and then the entire scene is rendered from the camera position and compared with the previously rendered depth map to obtain the shadow effect of the scene.
  • the amount of data that needs to be processed for two scene renderings is relatively large, resulting in relatively low rendering efficiency.
  • the present disclosure provides an image rendering method, which method includes:
  • the target shadow height is the height of a designated pixel point in the shadow of the scene image, and the designated pixel point is a pixel point that has the same position as the pixel point in the horizontal direction of the world space corresponding to the target scene and whose height in the height direction of the world space is greater than or equal to a preset height threshold.
  • an image rendering device which includes:
  • the image acquisition module is used to obtain the scene image to be rendered in the target scene
  • an image rendering module configured to determine, for each pixel in the scene image, a target shadow height corresponding to the pixel, determine a shadow value corresponding to the pixel according to the target shadow height, and render the pixel according to the shadow value so as to render the scene image;
  • the target shadow height is the height of a designated pixel point in the shadow of the scene image, and the designated pixel point is a pixel point that has the same position as the pixel point in the horizontal direction of the world space corresponding to the target scene and whose height in the height direction of the world space is greater than or equal to a preset height threshold.
  • the present disclosure provides a computer-readable medium having a computer program stored thereon, and when the computer program is executed by a processing device, the steps of the method described in the first aspect of the present disclosure are implemented.
  • the present disclosure provides an electronic device, including: a storage device having a computer program stored thereon; and
  • a processing device configured to execute the computer program in the storage device to implement the steps of the method described in the first aspect of the present disclosure.
  • the present disclosure provides a computer program product, which includes a computer program.
  • when the computer program is executed by a processing device, the steps of the method described in the first aspect of the present disclosure are implemented.
  • the present disclosure provides a computer program.
  • when the computer program is executed by a processing device, the steps of the method described in the first aspect of the present disclosure are implemented.
  • FIG. 1 is a flowchart of an image rendering method according to an exemplary embodiment of the present disclosure.
  • FIG. 2 is a schematic diagram of a target shadow height according to an exemplary embodiment of the present disclosure.
  • FIG. 3 is a flowchart of another image rendering method according to an exemplary embodiment of the present disclosure.
  • Figure 4 is a schematic diagram of a shadow height according to an exemplary embodiment of the present disclosure.
  • FIG. 5 is a block diagram of an image rendering device according to an exemplary embodiment of the present disclosure.
  • FIG. 6 is a block diagram of an electronic device according to an exemplary embodiment of the present disclosure.
  • the term “include” and its variations are open-ended, i.e., “including but not limited to.”
  • the term “based on” means “based at least in part on.”
  • the term “one embodiment” means “at least one embodiment”; the term “another embodiment” means “at least one additional embodiment”; and the term “some embodiments” means “at least some embodiments”. Relevant definitions of other terms will be given in the description below.
  • a prompt message is sent to the user to clearly remind the user that the operation requested will require the acquisition and use of the user's personal information. Therefore, users can autonomously choose whether to provide personal information to software or hardware such as electronic devices, applications, servers or storage media that perform the operations of the technical solution of the present disclosure based on the prompt information.
  • the method of sending prompt information to the user may be, for example, a pop-up window, and the prompt information may be presented in the form of text in the pop-up window.
  • the pop-up window can also contain a selection control for the user to choose "agree” or "disagree” to provide personal information to the electronic device.
  • the current mobile perspective shadow solutions include Cascade Shadow Map algorithm, baked shadows, contact shadows, etc.
  • the Cascade Shadow Map algorithm is equivalent to rendering the scene twice during processing, and the corresponding Drawcall (draw call) volume will be relatively large, resulting in a relatively heavy load on the CPU (Central Processing Unit) and the GPU (Graphics Processing Unit), which affects rendering efficiency.
  • the present disclosure provides an image rendering method, device, readable medium and electronic equipment, which can determine the shadow value of a pixel point according to the target shadow height corresponding to the pixel point and then render the pixel point according to the shadow value, so that the amount of data that needs to be processed during the rendering process is relatively small, thereby improving the efficiency of image rendering.
  • Figure 1 is a flow chart of an image rendering method according to an exemplary embodiment of the present disclosure. As shown in Figure 1, the method may include:
  • the target scene may be a game scene
  • the scene image may be an image that needs to be rendered in the game scene.
  • the target shadow height is the height of the specified pixel in the shadow of the scene image
  • the designated pixel is a pixel that has the same position as the pixel in the horizontal direction of the world space corresponding to the target scene and whose height in the height direction of the world space is greater than or equal to the preset height threshold.
  • the preset height threshold may be the height corresponding to the pixel that has the same position in the horizontal direction of the world space and the highest position in the height direction of the world space.
  • Figure 2 is a schematic diagram of a target shadow height according to an exemplary embodiment of the present disclosure. As shown in Figure 2, the terrain in the scene image is rendered from a top-down perspective.
  • the target shadow height can be the height of the intersection point of a straight line in the height direction of the world space and a straight line in the direction of the light source.
  • Shadow = saturate(exp(MaxOcclusionHeight - C * positionWS.y)) (1)
  • Shadow is the shadow value;
  • MaxOcclusionHeight is the target shadow height;
  • positionWS.y is the y-axis component of the world-space coordinates corresponding to the pixel point;
  • C is a preset constant;
  • C can be determined in advance through experiments; for example, C can be 20.
  • the saturate(x) function returns 0 when the value of x is less than or equal to 0, returns 1 when the value of x is greater than or equal to 1, and returns x when the value of x is between 0 and 1.
  • when the shadow value corresponding to the pixel is 0, it means that the pixel is not in shadow.
  • when the shadow value corresponding to the pixel is 1, it means that the pixel is in shadow and the shadow color is the darkest; when the shadow value corresponding to the pixel is between 0 and 1, the corresponding shadow color can be rendered according to the magnitude of the shadow value; for example, the larger the shadow value, the darker the color, and the smaller the shadow value, the lighter the color.
  • after obtaining the shadow value corresponding to the pixel, the pixel can be rendered with reference to rendering methods in the prior art, which will not be described again here.
  • the shadow value of the pixel can be determined according to the target shadow height corresponding to the pixel, and the pixel can be rendered according to the shadow value; in this way, the amount of data that needs to be processed during the rendering process is relatively small, thereby improving the efficiency of image rendering.
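  • For illustration, a minimal Python sketch of formula (1) as printed above follows (the function and variable names are illustrative and are not taken from the disclosure; C = 20 is the example value mentioned above):

        import math

        def saturate(x):
            # clamp x to [0, 1], matching the saturate(x) behavior described above
            return max(0.0, min(1.0, x))

        def shadow_value(max_occlusion_height, position_ws_y, c=20.0):
            # formula (1): Shadow = saturate(exp(MaxOcclusionHeight - C * positionWS.y))
            return saturate(math.exp(max_occlusion_height - c * position_ws_y))

        # example call; per the description above, values near 1 indicate the pixel is in shadow
        shadow = shadow_value(max_occlusion_height=30.0, position_ws_y=25.0)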
  • Figure 3 is a flowchart of another image rendering method according to an exemplary embodiment of the present disclosure. As shown in Figure 3, the method may include:
  • the target scene may be a game scene
  • the scene image may be an image that needs to be rendered in the game scene.
  • in order to determine the texture coordinates corresponding to the pixel, the world-space coordinates corresponding to the pixel, the pixel size corresponding to the height texture, and the world-space scale corresponding to the height texture can be obtained.
  • the offset value between the center point of the terrain corresponding to the scene image and the origin of the world-space coordinates corresponding to the pixel point is determined, and the texture coordinates corresponding to the pixel point are determined based on the world-space coordinates corresponding to the pixel point, the offset value, the scale and the pixel size.
  • the world-space coordinates corresponding to the pixel point can be obtained through existing techniques, and will not be described again here.
  • the world-space scale corresponding to the height texture can be preset according to the scene image; for example, if one pixel occupies 1m*1m of world space, the scale can be 1. The smaller the scale, the greater the pixel density in the scene image and the better the rendering effect.
  • (cellU, cellV) is the texture coordinate corresponding to the pixel point;
  • positionWS.x is the x-axis component of the world-space coordinates corresponding to the pixel point;
  • positionWS.z is the z-axis component of the world-space coordinates corresponding to the pixel point;
  • HeightOcclusionStartPoint is the offset value;
  • TexelsPerMeter is the world-space scale corresponding to the pixel;
  • HeightOcclusionTex_TexelSize is the pixel size corresponding to the height texture.
  • the pixel size can be represented by a four-dimensional vector.
  • the four-dimensional vector can be Vector4(1/width, 1/height, width, height).
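  • The texture-coordinate formula itself is not reproduced in this text. As an illustration only, the following Python sketch shows one plausible mapping that uses the quantities listed above (positionWS.x/z, HeightOcclusionStartPoint, TexelsPerMeter and HeightOcclusionTex_TexelSize); the exact combination used by the disclosed method may differ:

        def texture_coords(position_ws_x, position_ws_z,
                           start_point_x, start_point_z,
                           texels_per_meter, texel_size):
            # texel_size is assumed to be the four-dimensional vector
            # (1/width, 1/height, width, height) described above
            inv_w, inv_h, _width, _height = texel_size
            # shift the world-space position by the terrain offset, convert meters to texels,
            # then normalize by the texture dimensions to obtain UV coordinates
            cell_u = (position_ws_x - start_point_x) * texels_per_meter * inv_w
            cell_v = (position_ws_z - start_point_z) * texels_per_meter * inv_h
            return cell_u, cell_v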
  • the height texture may include a shadow height corresponding to each pixel of the scene image.
  • the height texture can be obtained, and the designated pixel can be determined from the height texture based on the texture coordinates of the pixel,
  • that is, the pixel that has the same position as the pixel in the horizontal direction of the world space corresponding to the target scene and whose height in the height direction of the world space is greater than or equal to the preset height threshold.
  • the height texture can be obtained in advance in the following way: for each pixel in the scene image, determine the scene depth of the pixel in the direction of the light source in the scene image, determine the shadow height corresponding to the pixel based on the scene depth, and use a set of shadow heights corresponding to multiple pixels as the height texture.
  • for each pixel in the scene image, the implementation of the ShadowMap algorithm can be referred to in order to determine the scene depth of the pixel in the direction of the light source in the scene image. Further, after determining the scene depth of the pixel in the direction of the light source in the scene image, the terrain height corresponding to the pixel can be determined; when it is determined that the position corresponding to the terrain height is in the shadow of the scene image, a preset height interval is obtained, and the shadow height corresponding to the pixel is determined based on the terrain height and the preset height interval.
  • the terrain height may be the height, in the height direction of the world-space coordinates, of a reference plane located directly below the pixel in the scene image.
  • the reference plane may be the ground or the plane on which the highest point of a tree is located; the terrain height can be obtained in advance based on the terrain in the scene image.
  • the preset height interval can be determined in advance through testing based on the scene image. The smaller the preset height interval, the higher the accuracy of determining the shadow height, but the larger the amount of calculation and the worse the performance. For scenarios with relatively high accuracy requirements and low performance requirements, a smaller preset height interval can be set; for example, the preset height interval can be 5m. For scenarios with relatively low accuracy requirements and high performance requirements, a larger preset height interval can be set; for example, the preset height interval can be 7m. The present disclosure does not limit the setting method of the preset height interval.
  • when it is determined that the position corresponding to the terrain height is in the shadow of the scene image, the terrain height can be used as the undetermined height, and the shadow height determination step is executed in a loop until the position corresponding to a new undetermined height is determined not to be in the shadow of the scene image; the difference between the new undetermined height and the preset height interval is then used as the shadow height corresponding to the pixel.
  • the shadow height determination step includes: determining, according to the scene depth, whether the position corresponding to the undetermined height is in the shadow, and if the position corresponding to the undetermined height is in the shadow, using the sum of the undetermined height and the preset height interval as the new undetermined height.
  • the position corresponding to the undetermined height is in the shadow of the scene image.
  • the terrain height h is in the shadow.
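  • A minimal sketch of this precomputation follows. It assumes a caller-supplied in_shadow(height) test derived from the light-source scene depth for the pixel's horizontal position (for example, a comparison against the depth map rendered from the light source), which is not spelled out above:

        def shadow_height(terrain_height, height_interval, in_shadow):
            # if the terrain itself is lit, there is no occluding shadow above it;
            # Table 1 uses 0 for such pixels
            if not in_shadow(terrain_height):
                return 0.0
            pending = terrain_height
            while in_shadow(pending):
                # step upward by the preset height interval while still inside the shadow
                pending += height_interval
            # the first sampled height outside the shadow, minus one interval,
            # is used as the shadow height (the approximate top of the shadow at this pixel)
            return pending - height_interval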
  • Table 1 lists the shadow heights corresponding to multiple pixels in the scene image. As shown in Table 1, the shadow heights corresponding to the pixels in the scene image include 0, 25, 30, and 35. It should be noted that the shadow heights shown in Table 1 are only the shadow heights corresponding to some of the pixels in the scene image.
  • the present disclosure can obtain the shadow height corresponding to each pixel point in the scene image in advance, or can obtain the shadow height corresponding to each pixel point in the scene image in real time, and the present disclosure is not limited to this. Compared with the method of obtaining the shadow height in real time, obtaining the shadow height in advance can reduce the performance overhead during image rendering and further improve rendering efficiency.
  • for each pixel in the scene image, the shadow height corresponding to the pixel can be filtered, and the set of filtered shadow heights is used as the height texture. In this way, the shadow heights are blurred once, making the overall edges of the rendered shadow softer and achieving a soft shadow effect.
  • the logarithmic-space filtering of the ESM (Exponential Shadow Maps) algorithm can be used to filter the shadow height corresponding to each pixel.
  • the filtering formula can be:
  • d is the filtered shadow height corresponding to the current pixel in the scene image;
  • d0 is the shadow height corresponding to the current pixel;
  • di is the shadow height corresponding to the i-th pixel around the current pixel;
  • N is 8;
  • w0 is the weight value corresponding to the current pixel;
  • wi is the weight value corresponding to the i-th pixel;
  • c is a preset constant, which can be determined in advance through experiments; for example, c is 20.
  • the weight value corresponding to each pixel can be predetermined according to the filtering method. For example, if the filtering method is 3*3 BOX filtering, then the weight value corresponding to each pixel is the same, both 1/9. If the filtering method is Gaussian filtering, the weight value corresponding to each pixel is different.
  • for example, the current pixel in the scene image is the pixel in row 2 and column 6 (the pixel with the black background in Table 1).
  • the shadow height corresponding to the current pixel is 30, and the shadow heights (d1 to d8) corresponding to the eight pixels around the current pixel are 30, 30, 30, 30, 0, 0, 0 and 30, respectively.
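  • Since the filtering formula itself is not reproduced in this text, the sketch below uses the standard numerically stable log-space form of ESM filtering, d = d0 + (1/c) * ln(sum_i(w_i * exp(c * (d_i - d0)))), as an assumption that is consistent with the quantities d, d0, di, w and c described above:

        import math

        def esm_filtered_height(d0, neighbor_heights, weights, c=20.0):
            # log-space weighted average of shadow heights; subtracting d0 inside
            # exp() keeps the exponentials from overflowing for large c
            samples = [d0] + list(neighbor_heights)   # d0, d1, ..., dN
            total = sum(w * math.exp(c * (d - d0)) for d, w in zip(samples, weights))
            return d0 + math.log(total) / c

        # the 3*3 box-filter example above: all nine weights are 1/9, d0 = 30, and the
        # eight surrounding shadow heights are 30, 30, 30, 30, 0, 0, 0 and 30
        d = esm_filtered_height(30.0, [30, 30, 30, 30, 0, 0, 0, 30], [1 / 9.0] * 9)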
  • Figure 4 is a schematic diagram of a shadow height according to an exemplary embodiment of the present disclosure. As shown in Figure 4, an image with a resolution of 4096*4096 in R16 format displays the shadow height corresponding to each pixel in the scene image.
  • the target shadow height corresponding to the pixel can be determined, based on the texture coordinates corresponding to the pixel in the scene image, from the pre-obtained height texture corresponding to the scene image; the shadow value of the pixel can then be determined based on the target shadow height, and the pixel is rendered according to the shadow value.
  • the present disclosure can also perform filtering processing on the shadow height obtained in advance to achieve the effect of soft shadow.
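  • As a recap, a sketch of the per-pixel flow under the same assumptions as the previous sketches follows (the sample_height_texture lookup is hypothetical and stands in for reading the pre-obtained height texture at the computed texture coordinates):

        def render_pixel_shadow(position_ws, height_texture, params):
            # 1. texture coordinates from the pixel's world-space x/z position (see sketch above)
            u, v = texture_coords(position_ws.x, position_ws.z,
                                  params.start_point_x, params.start_point_z,
                                  params.texels_per_meter, params.texel_size)
            # 2. target shadow height (MaxOcclusionHeight) read from the height texture
            max_occlusion_height = sample_height_texture(height_texture, u, v)
            # 3. shadow value from formula (1), then used to shade the pixel
            return shadow_value(max_occlusion_height, position_ws.y, params.c)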
  • Figure 5 is a block diagram of an image rendering device according to an exemplary embodiment of the present disclosure. As shown in Figure 5, the device may include:
  • the image acquisition module 501 is used to acquire the scene image to be rendered in the target scene
  • the image rendering module 502 is configured to determine, for each pixel point in the scene image, the target shadow height corresponding to the pixel point, determine the shadow value corresponding to the pixel point according to the target shadow height, and render the pixel point according to the shadow value so as to render the scene image;
  • the target shadow height is the height of a designated pixel point in the shadow of the scene image, and the designated pixel point is a pixel point that has the same position as the pixel point in the horizontal direction of the world space corresponding to the target scene and whose height in the height direction of the world space is greater than or equal to the preset height threshold.
  • the image rendering module 502 is also used to:
  • the shadow height corresponding to the specified pixel point is used as the target shadow height.
  • the image rendering module 502 is also used to:
  • the texture coordinates corresponding to the pixel are determined based on the world-space coordinates corresponding to the pixel, the scale, the pixel size and the offset value.
  • the image rendering module 502 is also used to:
  • for each pixel in the scene image, determine the scene depth of the pixel in the direction of the light source in the scene image, and determine the shadow height corresponding to the pixel according to the scene depth;
  • a set of shadow heights corresponding to multiple pixel points is used as the height texture.
  • the image rendering module 502 is also used to:
  • a preset height interval is obtained, and the shadow height corresponding to the pixel is determined based on the terrain height and the preset height interval.
  • the image rendering module 502 is also used to:
  • the shadow height determination steps include:
  • the sum of the undetermined height and the preset height interval is used as the new undetermined height.
  • the image rendering module 502 is also used to:
  • using the set of the shadow heights corresponding to the plurality of pixels as the height texture includes:
  • the set of filtered shadow heights is used as the height texture.
  • the image rendering module 502 is also used to:
  • the shadow value corresponding to the pixel is determined.
  • the scene image is a game scene image.
  • the shadow value of the pixel can be determined according to the target shadow height corresponding to the pixel, and the pixel can be rendered according to the shadow value; in this way, the amount of data that needs to be processed during the rendering process is relatively small, thereby improving the efficiency of image rendering.
  • Terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile phones, notebook computers, digital broadcast receivers, PDA (Personal Digital Assistant, personal digital assistant), PAD (tablet computer), PMP (Portable Media Player, portable multimedia players), vehicle-mounted terminals (such as vehicle-mounted navigation terminals), etc., and fixed terminals such as digital TVs, desktop computers, etc.
  • the electronic device shown in FIG. 6 is only an example and should not impose any limitations on the functions and scope of use of the embodiments of the present disclosure.
  • the electronic device 600 may include a processing device (such as a central processing unit, a graphics processor, etc.) 601, which may perform various appropriate actions and processing according to a program stored in a read-only memory (ROM) 602 or a program loaded from a storage device 608 into a random access memory (RAM) 603.
  • in the RAM 603, various programs and data required for the operation of the electronic device 600 are also stored.
  • the processing device 601, ROM 602 and RAM 603 are connected to each other via a bus 604.
  • An input/output (I/O) interface 605 is also connected to bus 604.
  • the following devices can be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 607 including, for example, a Liquid Crystal Display (LCD), speaker, vibrator, etc.; storage devices 608 including, for example, a magnetic tape, hard disk, etc.; and a communication device 609.
  • the communication device 609 may allow the electronic device 600 to communicate wirelessly or wiredly with other devices to exchange data.
  • although FIG. 6 illustrates the electronic device 600 with various means, it should be understood that it is not required to implement or provide all of the illustrated means; more or fewer means may alternatively be implemented or provided.
  • embodiments of the present disclosure include a computer program product including a computer program carried on a non-transitory computer-readable medium, the computer program containing program code for performing the method illustrated in the flowchart.
  • the computer program may be downloaded and installed from the network via communication device 609, or from storage device 608, or from ROM 602.
  • when the computer program is executed by the processing device 601, the above functions defined in the method of the embodiments of the present disclosure are performed.
  • the computer-readable medium mentioned above in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the above two.
  • the computer-readable storage medium may be, for example, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or any combination thereof. More specific examples of computer-readable storage media may include, but are not limited to: an electrical connection having one or more wires, a portable computer disk, a hard drive, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code therein. Such propagated data signals may take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the above.
  • a computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium that can send, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer-readable medium may be transmitted using any suitable medium, including but not limited to: wire, optical cable, RF (radio frequency), etc., or any suitable combination of the above.
  • the client and server can communicate using any currently known or future developed network protocol such as HTTP (HyperText Transfer Protocol), and can be interconnected with digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include local area networks (LANs), wide area networks (WANs), internetworks (e.g., the Internet) and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed networks.
  • the above-mentioned computer-readable medium may be included in the above-mentioned electronic device; it may also exist independently without being assembled into the electronic device.
  • the computer-readable medium carries one or more programs.
  • when the one or more programs are executed by the electronic device, the electronic device obtains the scene image to be rendered in the target scene; and, for each pixel point in the scene image, determines the target shadow height corresponding to the pixel point, determines the shadow value corresponding to the pixel point according to the target shadow height, and renders the pixel point according to the shadow value so as to render the scene image;
  • wherein the target shadow height is the height of a designated pixel point in the shadow of the scene image, and the designated pixel point is a pixel point that has the same position as the pixel point in the horizontal direction of the world space corresponding to the target scene and whose height in the height direction of the world space is greater than or equal to a preset height threshold.
  • Computer program code for performing the operations of the present disclosure may be written in one or more programming languages, including but not limited to object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as “C” or similar programming languages.
  • the program code can execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server.
  • the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it can be connected to an external computer (for example, connected via the Internet using an Internet service provider).
  • each block in the flowchart or block diagram may represent a module, program segment, or portion of code that contains one or more executable instructions for implementing the specified logical functions.
  • the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown one after another may actually execute substantially in parallel, or they may sometimes execute in the reverse order, depending on the functionality involved.
  • each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special-purpose hardware-based systems that perform the specified functions or operations, or can be implemented using a combination of special-purpose hardware and computer instructions.
  • the modules involved in the embodiments of the present disclosure can be implemented in software or hardware.
  • the name of the module does not constitute a limitation on the module itself under certain circumstances.
  • the image acquisition module can also be described as "a module that acquires the scene image to be rendered in the target scene.”
  • exemplary types of hardware logic components that can be used include: Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), etc.
  • a machine-readable medium may be a tangible medium that may contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • the machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
  • machine-readable media may include, but are not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems, apparatuses, or devices, or any suitable combination of the foregoing.
  • more specific examples of machine-readable storage media would include an electrical connection based on one or more wires, a portable computer disk, a hard drive, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • Example 1 provides an image rendering method.
  • the method includes: obtaining a scene image to be rendered in a target scene; and, for each pixel point in the scene image, determining the target shadow height corresponding to the pixel point, determining the shadow value corresponding to the pixel point according to the target shadow height, and rendering the pixel point according to the shadow value so as to render the scene image; wherein the target shadow height is the height of a designated pixel point in the shadow of the scene image, and the designated pixel point is a pixel point that has the same position as the pixel point in the horizontal direction of the world space corresponding to the target scene and whose height in the height direction of the world space is greater than or equal to a preset height threshold.
  • Example 2 provides the method of Example 1, wherein determining the target shadow height corresponding to the pixel point includes: determining the texture coordinates corresponding to the pixel point; determining the designated pixel point according to the texture coordinates and the height texture corresponding to the scene image obtained in advance, the height texture including the shadow height corresponding to each pixel point of the scene image; and using the shadow height corresponding to the designated pixel point as the target shadow height.
  • Example 3 provides the method of Example 2, wherein determining the texture coordinates corresponding to the pixel point includes: obtaining the world-space coordinates corresponding to the pixel point, the pixel size corresponding to the height texture, and the world-space scale; determining the offset value between the center point of the terrain corresponding to the scene image and the origin of the world-space coordinates corresponding to the pixel point; and determining the texture coordinates corresponding to the pixel point according to the world-space coordinates corresponding to the pixel point, the scale, the pixel size and the offset value.
  • the height texture is pre-acquired in the following manner: for each pixel point in the scene image, the scene depth of the pixel point in the direction of the light source in the scene image is determined, and the shadow height corresponding to the pixel point is determined according to the scene depth; a set of the shadow heights corresponding to multiple pixel points is used as the height texture.
  • determining the shadow height corresponding to the pixel point according to the scene depth includes: determining the terrain height corresponding to the pixel point, the terrain height being the height, in the height direction of the world-space coordinates corresponding to the pixel point, of the reference plane located directly below the pixel point in the scene image; and, when it is determined that the position corresponding to the terrain height is located in the shadow of the scene image, obtaining a preset height interval and determining the shadow height corresponding to the pixel point based on the terrain height and the preset height interval.
  • determining the shadow height corresponding to the pixel point according to the terrain height and the preset height interval includes: using the terrain height as an undetermined height, and executing the shadow height determination step in a loop until it is determined that the position corresponding to the new undetermined height is not in the shadow of the scene image, and using the difference between the new undetermined height and the preset height interval as the shadow height corresponding to the pixel point; the shadow height determination step includes: determining, according to the scene depth, whether the position corresponding to the undetermined height is in the shadow, and if so, using the sum of the undetermined height and the preset height interval as the new undetermined height.
  • Example 7: before using a set of shadow heights corresponding to a plurality of pixel points as the height texture, the method further includes: performing filtering processing on the shadow heights; and using the set of the shadow heights corresponding to the plurality of pixel points as the height texture includes: using the set of the filtered shadow heights as the height texture.
  • determining the shadow value corresponding to the pixel point according to the target shadow height includes: determining the shadow value corresponding to the pixel point according to the world-space coordinates corresponding to the pixel point and the target shadow height.
  • Example 9 provides the method of any one of Examples 1-8, and the scene image is a game scene image.
  • Example 10 provides an image rendering device, which includes: an image acquisition module for acquiring a scene image to be rendered in a target scene; and an image rendering module for determining, for each pixel in the scene image, the target shadow height corresponding to the pixel, determining the shadow value corresponding to the pixel according to the target shadow height, and rendering the pixel according to the shadow value so as to render the scene image; wherein the target shadow height is the height of a designated pixel point in the shadow of the scene image, and the designated pixel point is a pixel point that has the same position as the pixel point in the horizontal direction of the world space corresponding to the target scene and whose height in the height direction of the world space is greater than or equal to the preset height threshold.
  • Example 11 provides a computer-readable medium having a computer program stored thereon, which, when executed by a processing device, implements the steps of the method described in any one of Examples 1-9.
  • Example 12 provides an electronic device, including: a storage device having a computer program stored thereon; and a processing device configured to execute the computer program in the storage device to implement the steps of the method described in any one of Examples 1-9.
  • Example 13 provides a computer program product, including a computer program that, when executed by a processing device, implements the steps of the method described in any one of Examples 1-9.
  • Example 14 provides a computer program that, when executed by a processing device, implements the steps of the method described in any of Examples 1-9 of the present disclosure.
  • the scene image to be rendered in the target scene is obtained; for each pixel in the scene image, the target shadow height corresponding to the pixel is determined, the shadow value corresponding to the pixel is determined based on the target shadow height, and the pixel is rendered according to the shadow value to render the scene image; wherein the target shadow height is the height of the designated pixel point in the shadow of the scene image.
  • the designated pixel point is a pixel point that has the same position as the pixel point in the horizontal direction of the world space corresponding to the target scene, and has a height greater than or equal to a preset height threshold in the height direction of the world space.
  • the present disclosure can determine the shadow value of a pixel based on the target shadow height corresponding to the pixel, and render the pixel based on the shadow value; in this way, the amount of data that needs to be processed during the rendering process is relatively small, thereby improving image rendering efficiency.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

The present disclosure relates to an image rendering method and apparatus, a readable medium, an electronic device, a computer program product and a computer program. The method comprises: acquiring a scene image to be rendered in a target scene; for each pixel point in the scene image, determining a target shadow height corresponding to the pixel point, determining a shadow value corresponding to the pixel point according to the target shadow height, and rendering the pixel point according to the shadow value so as to render the scene image. The target shadow height is the height of a designated pixel point in the shadow of the scene image, the designated pixel point being a pixel point that has the same position as the pixel point in the horizontal direction of a world space corresponding to the target scene and whose height in the vertical (height) direction of the world space is greater than or equal to a preset height threshold. That is, the present disclosure can determine the shadow value of the pixel point according to the target shadow height corresponding to the pixel point, and render the pixel point according to the shadow value, so that the amount of data to be processed during rendering is small, thereby improving image rendering efficiency.
PCT/CN2023/084542 2022-04-07 2023-03-28 Image rendering method and apparatus, readable medium and electronic device WO2023193639A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210363937.X 2022-04-07
CN202210363937.XA CN114742934A (zh) 2022-04-07 2022-04-07 图像渲染方法、装置、可读介质及电子设备

Publications (1)

Publication Number Publication Date
WO2023193639A1 true WO2023193639A1 (fr) 2023-10-12

Family

ID=82278380

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/084542 WO2023193639A1 (fr) 2022-04-07 2023-03-28 Image rendering method and apparatus, readable medium and electronic device

Country Status (2)

Country Link
CN (1) CN114742934A (fr)
WO (1) WO2023193639A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114742934A (zh) * 2022-04-07 2022-07-12 北京字跳网络技术有限公司 图像渲染方法、装置、可读介质及电子设备
CN118057460A (zh) * 2022-11-21 2024-05-21 荣耀终端有限公司 图像处理方法及装置

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102436673A (zh) * 2011-10-24 2012-05-02 克拉玛依红有软件有限责任公司 一种大规模室外场景的阴影绘制方法
CN104103089A (zh) * 2014-07-29 2014-10-15 无锡梵天信息技术股份有限公司 一种基于图像屏幕空间的实时软阴影实现方法
CN109993823A (zh) * 2019-04-11 2019-07-09 腾讯科技(深圳)有限公司 阴影渲染方法、装置、终端及存储介质
US11232628B1 (en) * 2020-11-10 2022-01-25 Weta Digital Limited Method for processing image data to provide for soft shadow effects using shadow depth information
CN114119854A (zh) * 2021-11-29 2022-03-01 北京字跳网络技术有限公司 阴影渲染方法、游戏文件打包方法及相应装置
CN114742934A (zh) * 2022-04-07 2022-07-12 北京字跳网络技术有限公司 图像渲染方法、装置、可读介质及电子设备

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4001227B2 (ja) * 2002-05-16 2007-10-31 任天堂株式会社 ゲーム装置及びゲームプログラム
US8577170B2 (en) * 2011-09-15 2013-11-05 Microsoft Corporation Shadow detection in a single image
CN106447761B (zh) * 2016-08-31 2019-03-08 北京像素软件科技股份有限公司 一种阴影渲染方法


Also Published As

Publication number Publication date
CN114742934A (zh) 2022-07-12


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23784203

Country of ref document: EP

Kind code of ref document: A1