WO2020114271A1 - Image rendering method, device, and storage medium (图像渲染方法、设备及存储介质)


Info

Publication number
WO2020114271A1
Authority
WO
WIPO (PCT)
Prior art keywords
water
dynamic object
image frame
wave
next image
Prior art date
Application number
PCT/CN2019/120729
Other languages
English (en)
French (fr)
Inventor
覃飏
Original Assignee
腾讯科技(深圳)有限公司
Priority date
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司
Priority to EP19892415.1A (patent EP3822918A4)
Publication of WO2020114271A1
Priority to US17/178,437 (patent US11498003B2)
Priority to US17/959,633 (patent US11826649B2)

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/56: Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A63F13/57: Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00: 2D [Two Dimensional] image generation
    • G06T11/60: Editing figures and text; Combining figures or text
    • G06T13/00: Animation
    • G06T13/20: 3D [Three Dimensional] animation
    • G06T13/60: 3D [Three Dimensional] animation of natural phenomena, e.g. rain, snow, water or plants
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/005: General purpose rendering architectures

Definitions

  • This application relates to the field of computer technology, and in particular, to an image rendering method, device, and storage medium.
  • Some mobile games also attempt a water surface interaction scheme by binding a special effect near the dynamic game object so as to produce a certain disturbance of the water surface. In essence, a special effect is hung on the dynamic game object, and this effect disturbs the water surface. However, when the dynamic game object turns while moving, the special effect swings behind it like a tail, making the entire game screen look very unnatural and actually reducing the user's immersive experience.
  • This application provides an image rendering method that obtains the normal map corresponding to the target water body area where the dynamic object is located in the next image frame based on the water wave map corresponding to that area in the current image frame, and renders the water wave effect in the target water body area based on the normal map. Compared with the special-effect attachment, it provides users with a more realistic water surface interaction effect and improves the immersive experience.
  • the present application also provides an image rendering device, device, computer-readable storage medium, and computer program product.
  • a first aspect of the present application provides an image rendering method, which is applied to a processing device, and the method includes:
  • acquiring a water wave map corresponding to the target water body area where the dynamic object is located in the current image frame;
  • obtaining, according to the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame, a normal map corresponding to the target water body area where the dynamic object is located in the next image frame; and
  • rendering, according to the normal map corresponding to the target water body area where the dynamic object is located in the next image frame, the water wave effect of the target water body area where the dynamic object is located in the next image frame.
  • a second aspect of the present application provides an image rendering device, the device including:
  • the acquisition module is used to acquire the water wave map corresponding to the target water body area where the dynamic object in the current image frame is located;
  • the iterative module is used to obtain the normal map corresponding to the target water area where the dynamic object is located in the next image frame according to the water wave map corresponding to the target water area where the dynamic object is located in the current image frame;
  • the rendering module is configured to render the water wave effect of the target water body region where the dynamic object is located in the next image frame according to the normal map corresponding to the target water body region where the dynamic object is located in the next image frame.
  • a third aspect of the present application provides a processing device.
  • the device includes a processor and a memory:
  • the memory is used to store program code and transmit the program code to the processor
  • the processor is configured to execute the steps of the image rendering method described in the first aspect according to the instructions in the program code.
  • a fourth aspect of the present application provides a computer-readable storage medium for storing program code, where the program code is executed by a processor to implement the image rendering method described in the first aspect above.
  • a fifth aspect of the present application provides a computer program product including instructions that, when run on a computer, cause the computer to execute the image rendering method described in the first aspect above.
  • FIG. 1 is a scene architecture diagram of an image rendering method in an embodiment of this application
  • FIG. 3 is a schematic diagram illustrating the principle of acquiring the water wave map and the normal map of the next image frame based on the water wave map of the current image frame in the embodiment of the present application;
  • FIG. 5 is a schematic diagram of water wave hysteresis effect provided in an embodiment of the present application.
  • FIG. 6 is a schematic diagram of an application scenario of an image rendering method in an embodiment of the present application.
  • FIG. 7 is a schematic diagram of the effect of an image rendering method in an embodiment of the present application.
  • FIG. 8 is a schematic structural diagram of an image rendering device in an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of an image rendering device in an embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of an image rendering device in an embodiment of this application.
  • FIG. 11 is a schematic structural diagram of an image rendering device in an embodiment of the present application.
  • FIG. 12 is a schematic structural diagram of an image rendering device in an embodiment of the present application.
  • FIG. 13 is a schematic structural diagram of a processing device for image rendering in an embodiment of the present application.
  • Water surface rendering refers to the technique of drawing water bodies in a picture, including the geometric representation, coloring, and animation of water bodies. Drawing a water body is mainly directed at the water surface: the water surface is divided into a grid, and the water surface is characterized by the position information and normal information of the grid vertices.
  • The water wave map specifically refers to the map formed by the height displacement information of each grid vertex on the water surface.
  • the normal map specifically refers to a map formed by the normal information of the tangent plane of the water surface.
  • the normal map records the normal information corresponding to each vertex, which can characterize the unevenness of each vertex of the water surface mesh, and achieve a three-dimensional effect based on the unevenness.
  • To this end, this application provides an image rendering method that takes the diffusion motion of water waves into account. Based on the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame, it obtains the normal map corresponding to that area in the next image frame, and renders the water wave effect of the target water body area in the next image frame based on that normal map. On the one hand, iterative frame-by-frame rendering can simulate realistic water wave animation effects, making the water surface interaction very natural and providing users with an immersive experience; on the other hand, because water waves are rendered only in the target water body area, the complexity is simplified and the amount of calculation is reduced, so the method can be adapted to mobile applications.
  • the processing device may specifically be a device including at least one of a graphics processor (Graphics Processing Unit, GPU) and a central processing unit (Central Processing Unit, CPU), for example, the processing device may be a terminal or a server.
  • The terminal refers to any user equipment with image processing capabilities, whether existing, under development, or to be developed in the future, including but not limited to smartphones, tablet computers, laptop personal computers, and desktop personal computers.
  • a server refers to any device that has image processing capabilities and provides computing services.
  • the processing device may be an independent terminal or server, or may be a cluster formed by the terminal or server.
  • the image rendering method of the present application may be stored in the processing device in the form of an application program.
  • the processing device implements the image rendering method by executing the above application program.
  • the application program may be an independent application program, or may be a functional module, a plug-in, an applet, etc. integrated on other application programs.
  • the following uses the terminal as an execution subject to illustrate the specific implementation of the image rendering method.
  • The scene includes a terminal 10. The terminal 10 obtains the water wave map corresponding to the target water body area where the dynamic object in the current image frame is located, and then, according to that water wave map, obtains the normal map corresponding to the target water body area where the dynamic object is located in the next image frame. Based on this normal map, it renders the water wave effect of the target water body area where the dynamic object is located in the next image frame.
  • The terminal 10 may also obtain, according to the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame, the water wave map corresponding to that area in the next image frame; this water wave map is used to obtain the normal map corresponding to the target water body area where the dynamic object is located in the frame after that.
  • FIG. 1 only takes the water wave map as an example and does not limit the application.
  • The method can also be applied to a processing device, which can be a server or a terminal. The method flow is the same in either case; here, only the method as applied to a terminal is described as an example.
  • the method includes:
  • S201 Acquire a water wave map corresponding to the target water body area where the dynamic object is in the current image frame.
  • the image rendering method of this embodiment can be applied to a variety of scenes.
  • a typical scene is to render a water wave animation effect on the water surface in a game scene.
  • In this scene, the image frame is an image frame in the game application, and the dynamic object is a movable game character object in the game application, such as a game player. Using this image rendering method, a realistic water wave animation effect can be dynamically rendered as the game player moves in the water, thereby increasing the game player's immersion.
  • the target water area where the dynamic object is located is a local area of the entire water surface.
  • the shape and size of the local area can be set according to requirements.
  • the shape of the local area can be rectangular, fan-shaped, or circular.
  • the embodiments of the present application provide several implementation methods for acquiring the target water body area:
  • In one implementation, the terminal may acquire the image position of the dynamic object in the image frame, take that image position as the center position of the target water body area where the dynamic object is located, take a preset width and height as the width and height of the target water body area, and obtain the rectangular area where the dynamic object in the image frame is located as the target water body area.
  • In another implementation, the terminal may acquire the image position of the dynamic object in the image frame, take that image position as the vertex position of the target water body area where the dynamic object is located, take a preset radius and preset angle as the fan-shaped radius and fan-shaped angle of the target water body area, and obtain the fan-shaped area where the dynamic object in the image frame is located as the target water body area.
  • Considering that the water wave effect is not significant once the waves spread into the distance, and that game players rarely pay attention to distant water waves, this embodiment renders the water surface only in the target water body area where the dynamic object in the image frame is located. This is able to meet the needs of game applications and, more importantly, rendering only a local area of the water surface greatly reduces the amount of calculation.
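The rectangular variant above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the helper name, grid units, and the choice to clamp the rectangle at the border of the full water surface grid are assumptions.

```python
def target_region(center_x, center_y, width, height, surface_w, surface_h):
    """Return (x0, y0, x1, y1) of the rectangular target water body area
    centered on the dynamic object's image position, using a preset width
    and height, clamped to the bounds of the full water surface grid
    (clamping policy is an assumption)."""
    x0 = max(0, center_x - width // 2)
    y0 = max(0, center_y - height // 2)
    x1 = min(surface_w, x0 + width)
    y1 = min(surface_h, y0 + height)
    return x0, y0, x1, y1
```

For example, a 25-by-15 region around an object at grid position (100, 50) on a 512-by-512 surface stays well inside the surface, while an object near a corner gets a region pushed inward by the clamping.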
  • S202 Obtain a normal map corresponding to the target water body region where the dynamic object is located in the next image frame according to the water wave map corresponding to the target water body region where the dynamic object is located in the current image frame.
  • In this embodiment, the terminal obtains the normal map corresponding to the target water body area where the dynamic object is located in the next image frame according to the water wave map corresponding to that area in the current image frame; the normal map is used to render the water wave effect in the target water body area.
  • Specifically, the terminal may obtain, according to the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame, two images corresponding to that area in the next image frame: the first image is a normal map, and the second image is a water wave map. The water wave map is used to calculate the normal map, or the water wave map and the normal map, of the following image frame, iteratively updating frame by frame.
  • the water wave map may also be obtained when the next image frame is obtained.
  • the embodiment of the present application does not limit the acquisition time of the water wave map.
  • In some embodiments, the water wave map corresponding to the target water body area includes the water surface height displacements of the current image frame and the previous frame, and the normal map corresponding to the target water body area includes, for each water surface vertex in the water wave map of the current image frame, the difference in water surface height displacement compared with the two vertices adjacent to its position.
  • the terminal can obtain the normal map corresponding to the target water area where the dynamic object is located in the next image frame through the following steps:
  • The first step is to read the water surface height information from the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame, where the water surface height information includes the water surface height displacements corresponding to the current image frame and the previous frame;
  • The second step is to obtain, according to the water surface height information, the water wave map corresponding to the target water body area where the dynamic object is located in the next image frame;
  • The third step is to obtain, according to the water wave map corresponding to the target water body area where the dynamic object is located in the next image frame, the normal map corresponding to that area in the next image frame.
  • That is, the terminal can obtain the water wave map corresponding to the target water body area where the dynamic object is located in the next image frame in the second step and use it as the basis for obtaining the normal map corresponding to that area in the next image frame. Alternatively, the terminal may acquire the water wave map again when acquiring the normal map corresponding to the next image frame; this is not limited in the embodiments of the present application.
  • this embodiment provides a specific implementation manner of acquiring the water wave map of the next image frame based on the wave equation.
  • Specifically, the terminal uses the water surface height information as wave equation parameters, obtains the water surface height displacement corresponding to the next image frame through the damped wave equation and those parameters, and generates the water wave map corresponding to the target water body area where the dynamic object is located in the next image frame from the water surface height displacements corresponding to the current image frame and the next image frame.
  • the damped wave equation is specifically:
  • k refers to the damping coefficient
  • c refers to the speed of water wave propagation
  • t is time
  • x, y are the horizontal and vertical coordinates of the position in the image frame.
  • h(t,x,y) is the height displacement of the current image frame at (x,y);
  • h(t-Δt, x, y) is the height displacement of the previous image frame at (x, y);
  • h(t, x+1, y), h(t, x-1, y), h(t, x, y+1), and h(t, x, y-1) are the height displacements in the current image frame of the four vertices around (x, y).
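The equation itself appears as an image in the source and does not survive extraction. A standard finite-difference damped wave update that is consistent with the symbols defined above (continuous form assumed to be the damped wave equation, with unit grid spacing; the patent's exact constants may differ) is:

```latex
% Assumed continuous form: \partial^2 h/\partial t^2 + k\,\partial h/\partial t = c^2\,\nabla^2 h
h(t+\Delta t,x,y) = h(t,x,y)
  + (1 - k\,\Delta t)\bigl[h(t,x,y) - h(t-\Delta t,x,y)\bigr]
  + c^{2}\,\Delta t^{2}\bigl[h(t,x{+}1,y) + h(t,x{-}1,y) + h(t,x,y{+}1) + h(t,x,y{-}1) - 4\,h(t,x,y)\bigr]
```

The bracketed sum of the four neighboring vertices minus four times the center vertex is the discrete Laplacian, the damping coefficient k attenuates the wave over time, and the two stored height displacements h(t, x, y) and h(t-Δt, x, y) are exactly the two channels of the water wave map.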
  • the embodiments of the present application provide two implementation methods.
  • One implementation is to obtain, according to the water wave map corresponding to the target water body area where the dynamic object is located in the next image frame, the water surface height displacement difference of each water surface vertex in the next image frame compared with the upper vertex and the right vertex adjacent to its position, and to obtain the normal map corresponding to the target water body area where the dynamic object is located in the next image frame according to those differences.
  • Another implementation is to obtain, according to the water wave map corresponding to the target water body area where the dynamic object is located in the next image frame, the water surface height displacement difference of each water surface vertex in the next image frame compared with the left vertex and the lower vertex adjacent to its position, and to obtain the normal map corresponding to the target water body area where the dynamic object is located in the next image frame according to those differences.
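A minimal NumPy sketch of the first implementation follows: the two normal-map channels are the height differences of each vertex against its "right" and "upper" neighbors. The mapping of grid directions to array axes and the zero difference at border vertices are assumptions; the patent does not specify either.

```python
import numpy as np

def normal_channels(height):
    """Given the 2D water surface height field of the next frame, return
    the (R, G) channels of the normal map: the height difference of each
    vertex compared with its right neighbor and its upper neighbor.
    Border vertices keep a zero difference (an assumption)."""
    dx = np.zeros_like(height)
    dy = np.zeros_like(height)
    dx[:, :-1] = height[:, :-1] - height[:, 1:]   # vs. right vertex (next column)
    dy[:-1, :] = height[:-1, :] - height[1:, :]   # vs. "upper" vertex (next row; orientation assumed)
    return dx, dy
```

Because only two signed differences per vertex are stored, a two-channel format such as RGHalf suffices, which is the point of the channel layout described below.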
  • In some embodiments, the water wave map and the normal map use a two-channel rendering texture format, for example the RGHalf format, which uses only the R channel and the G channel to store image information. Specifically, the water wave map uses the R channel and the G channel to store the water surface height displacements of the previous image frame and the current image frame respectively, and the normal map uses the R channel and the G channel to store the water surface height displacement differences of the vertices of the water wave map. This can reduce the amount of rendering calculation.
  • The process of obtaining the water wave map and the normal map of the next image frame based on the water wave map of the current image frame is described below with reference to FIG. 3.
  • FIG. 3 provides three rendering textures (RenderTexture, RT), denoted A, B, and Nor, where Nor represents the normal map. All three rendering textures are in the RGHalf format. Rendering textures A and B are used to store the water wave maps of two adjacent image frames, and rendering texture Nor is used to store the normal map of the later of the two adjacent image frames. For each image frame, the water wave map of the current image frame is used as input, and the water wave map and normal map of the next image frame are output as multiple render targets (Multiple Render Targets, MRT).
  • For frame n, rendering texture A stores the water wave map of the current image frame: the R channel stores the water surface height displacement of frame n-1, and the G channel stores the water surface height displacement of frame n.
  • Rendering texture B stores the water wave map of frame n+1, which includes the water surface height displacements of frame n and frame n+1. Accordingly, the value of the G channel of rendering texture A is written into the R channel of rendering texture B, and the water surface height displacement of frame n+1, calculated from the R-channel and G-channel displacements in rendering texture A, is written into the G channel of rendering texture B.
  • Rendering texture Nor stores the normal map of frame n+1, which includes, for each water surface vertex in frame n+1, the water surface height displacement differences compared with the adjacent upper vertex and right vertex; these differences are calculated from the G-channel data of rendering texture B.
  • For the next frame, the two water wave maps are exchanged; that is, rendering texture A and rendering texture B swap roles. Frame n+1 takes rendering texture B as input, and rendering texture A and rendering texture Nor as output.
  • The water wave map of frame n+1 is stored in rendering texture B: the R channel stores the water surface height displacement of frame n, and the G channel stores that of frame n+1. The G-channel displacement of frame n+1 is written into the R channel of rendering texture A, and the water surface height displacement of frame n+2, calculated from the displacements of frames n and n+1, is written into the G channel of rendering texture A. In this way, the water wave map of frame n+2 is generated in rendering texture A.
  • From this, the water surface height displacement difference of each water surface vertex compared with the adjacent upper vertex and right vertex can be calculated and written into the R and G channels of rendering texture Nor, forming the normal map of frame n+2. By analogy, the water wave maps and normal maps of frame n+3, frame n+4, and so on can be generated.
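The ping-pong iteration above can be sketched on the CPU as follows. This is an illustrative sketch, not the patent's shader: the damping and speed constants, the grid size, and the zeroed border are assumptions, and a real implementation would run the update in a fragment shader with MRT output into the A/B/Nor rendering textures.

```python
import numpy as np

def wave_step(prev, curr, c=0.3, k=0.1, dt=1.0):
    """One damped-wave update: from the height fields of frames n-1 (prev,
    the R channel) and n (curr, the G channel), compute frame n+1.
    Constants and the zeroed border are assumptions for this sketch."""
    lap = np.zeros_like(curr)
    lap[1:-1, 1:-1] = (curr[1:-1, 2:] + curr[1:-1, :-2] +
                       curr[2:, 1:-1] + curr[:-2, 1:-1] -
                       4.0 * curr[1:-1, 1:-1])          # discrete Laplacian
    nxt = curr + (1.0 - k * dt) * (curr - prev) + (c * dt) ** 2 * lap
    nxt[0, :] = nxt[-1, :] = 0.0                        # hold border at rest
    nxt[:, 0] = nxt[:, -1] = 0.0
    return nxt

# Ping-pong between two buffers standing in for rendering textures A and B:
# (prev, curr) is the R/G channel pair of the input texture; each step
# produces the next frame's pair, and the roles of the buffers swap.
curr = np.zeros((15, 25))
curr[7, 12] = 1.0            # an initial disturbance at the grid center
prev = np.zeros_like(curr)
for _ in range(4):
    prev, curr = curr, wave_step(prev, curr)
```

After a few steps the disturbance spreads outward from the center and decays, which is exactly the iterative diffusion the A/B swap implements frame by frame.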
  • S203 Render the water wave effect of the target water body region where the dynamic object is located in the next image frame according to the normal map corresponding to the target water body region where the dynamic object is located in the next image frame.
  • the terminal can render the water wave effect of the target water area where the dynamic object is located based on the normal map. Specifically, the terminal renders the water wave effect of the target water body region where the dynamic object is located in the next image frame according to the normal map corresponding to the target water body region where the dynamic object is located in the next image frame.
  • The embodiments of the present application provide an image rendering method that takes the diffusion motion of water waves into account: based on the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame, the normal map corresponding to that area in the next image frame is obtained, and the water wave effect of the target water body area in the next image frame is rendered based on that normal map. Iterative frame-by-frame rendering can simulate a realistic water wave animation effect, making the water surface interaction very natural. In addition, because only the target water body area is rendered, the complexity is simplified, the amount of calculation is reduced, and the method is better suited to mobile applications.
  • The embodiment shown in FIG. 2 mainly describes the iterative process of water waves in detail. When a dynamic object moves in water, rendering a realistic water wave animation effect also requires considering the continuous injection of new water waves as the dynamic object moves; that is, the formation of water waves includes both the injection process and the iterative diffusion process.
  • Refer to the flowchart of the image rendering method shown in FIG. 4. The embodiment shown in FIG. 4 improves on the embodiment shown in FIG. 2, and only the differences between the two embodiments are explained below. The method includes:
  • The water wave injection generated when the dynamic object moves in the water can be characterized by the water wave injection displacement and the water wave diffusion attenuation matrix, where the water wave diffusion attenuation matrix can be represented by a kernel that describes the shape of the water wave injection.
  • The shape of the water wave injection can be a symmetric structure, and its kernel is as follows:
  • the terminal may obtain the water wave injection displacement corresponding to the dynamic object according to at least one of the object attribute of the dynamic object or the action type performed by the dynamic object.
  • The object attributes may include the gender and weight of the game character object, and the action types performed by the dynamic object may specifically be stomping, leg lifting, and so on.
  • The terminal determines the water wave injection displacement corresponding to the game character object according to object attributes such as gender and weight and action types such as stomping and lifting.
  • The water wave injection displacements corresponding to different object attributes or different action types can differ, as can those corresponding to different combinations of object attributes and action types; for example, the water wave injection displacement generated by a female game character stomping in the water is smaller than that generated by a male game character stomping in the water.
  • After that, the terminal performs attenuation processing on the water wave map. Specifically, the terminal obtains the product of the water wave injection displacement corresponding to the dynamic object and the water wave diffusion attenuation matrix as the attenuation parameter matrix, and then superimposes the values in the attenuation parameter matrix onto the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame, obtaining the attenuation-processed water wave map.
  • During superposition, the center point of the attenuation parameter matrix can be aligned with the center point of the water wave map, and the value of each element of the attenuation parameter matrix is then added to the corresponding vertex of the water wave map.
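The superposition step can be sketched as follows. The function name, the uniform example kernel, and the choice to silently skip kernel cells that fall outside the map are assumptions; the patent's actual kernel values are not reproduced in this text.

```python
import numpy as np

def inject_wave(wave_map, kernel, displacement, cy, cx):
    """Superimpose displacement * kernel onto the wave map, with the
    kernel's center aligned to vertex (cy, cx). Kernel cells falling
    outside the map are skipped (an assumption)."""
    out = wave_map.copy()
    kh, kw = kernel.shape
    y0, x0 = cy - kh // 2, cx - kw // 2
    for i in range(kh):
        for j in range(kw):
            y, x = y0 + i, x0 + j
            if 0 <= y < out.shape[0] and 0 <= x < out.shape[1]:
                out[y, x] += displacement * kernel[i, j]
    return out
```

With a 5-by-5 kernel and a 25-by-15 water wave map, as in the application scenario below, this adds the scaled kernel to the 5-by-5 neighborhood around the map's center, which is equivalent to zero-padding the kernel to the map's size and adding the two arrays.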
  • Then, according to the position offset of the dynamic object, the attenuation-processed water wave map is shifted in the direction opposite to the moving direction of the dynamic object to obtain the offset-processed water wave map.
  • The upper part of FIG. 5 shows the effect of the water wave after offset processing. As the dynamic object moves, new water waves are continuously injected, and each newly injected water wave requires offsetting all previously injected waves. For details, see the lower part of FIG. 5: assuming the dynamic object does not move, each time a water wave is injected, the waves injected before it move in the direction opposite to the dynamic object's movement; then, moving the entire water wave map in the dynamic object's moving direction according to its position offset du yields the water wave effect shown in the upper part of FIG. 5.
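The offset step amounts to a whole-map shift. A minimal sketch follows; filling the vacated cells with zeros is an assumption, since the patent only states that the map moves opposite to the object's motion.

```python
import numpy as np

def shift_wave_map(wave_map, dy, dx):
    """Shift the whole wave map by (dy, dx) grid cells, filling vacated
    cells with zeros (an assumption). To offset previous waves against an
    object that moved by (dy, dx) cells, call with (-dy, -dx)."""
    out = np.zeros_like(wave_map)
    h, w = wave_map.shape
    ys0, ys1 = max(0, dy), min(h, h + dy)
    xs0, xs1 = max(0, dx), min(w, w + dx)
    out[ys0:ys1, xs0:xs1] = wave_map[ys0 - dy:ys1 - dy, xs0 - dx:xs1 - dx]
    return out
```

So if the object moves one grid cell to the east (positive x), calling `shift_wave_map(wm, 0, -1)` moves every existing wave one cell to the west, keeping the waves fixed relative to the world while the map stays centered on the object.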
  • Then, the terminal may obtain the water wave map and normal map corresponding to the target water body area where the dynamic object is located in the next image frame according to the offset-processed water wave map, and render the water wave effect of that area in the next image frame according to the normal map.
  • The embodiments of the present application provide an image rendering method that approaches water surface rendering from the perspective of water wave formation, covering both water wave injection and water wave diffusion. According to the water wave injection displacement and the water wave diffusion attenuation matrix corresponding to the dynamic object, the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame is attenuated; then, according to the position offset of the dynamic object, the attenuated water wave map is shifted as a whole in the direction opposite to the dynamic object's movement; the normal map of the next image frame is then obtained from the offset-processed water wave map, and the water wave effect of the target water body area where the dynamic object is located in the next image frame is rendered according to that normal map. With iterative frame-by-frame rendering, a realistic water wave animation effect can be simulated, and because only the target water body area where the dynamic object is located is processed, the complexity is simplified and the amount of calculation is reduced.
  • the scenario includes a terminal 10, and the terminal 10 may specifically be a smart phone, which is used to render a water surface for a mobile game.
  • the game character object swims in the water, and the water surface is disturbed to generate a water wave.
  • the formation of the water wave includes two processes of water wave injection and water wave diffusion.
  • the terminal 10 obtains the water wave injection displacement corresponding to the game character object according to the object attributes of the game character object and the actions performed by the game character object.
  • The terminal 10 computes the attenuation parameter matrix as the product of the predefined water wave diffusion attenuation matrix kernel used to describe the shape of the water wave and the water wave injection displacement obtained from the object attributes and the performed actions.
  • In this embodiment, the attenuation parameter matrix is a 5×5 matrix.
  • After acquiring the water wave map corresponding to the target water body area where the game character object is located in the current image frame, the terminal 10 superimposes the values of the attenuation parameter matrix onto the water wave map. Assuming the water wave map is 25×15, when superimposing, the center point of the attenuation parameter matrix is aligned with the center point of the water wave map, the rows and columns of the attenuation parameter matrix are zero-padded until its size matches that of the water wave map, and the values of the padded attenuation parameter matrix are then superimposed onto the water wave map.
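The superposition step described above can be sketched as follows. This is an illustrative sketch only: the kernel values, the injection displacement, and the function name are assumptions, not values taken from the patent; only the 5×5 kernel size and the 25×15 map size come from the text.

```python
import numpy as np

def superimpose_kernel(wave_map, atten_matrix):
    """Zero-pad the attenuation parameter matrix to the wave map's size,
    align the two center points, and add the padded matrix to the map.
    Illustrative only; the patent does not give concrete values."""
    H, W = wave_map.shape
    kh, kw = atten_matrix.shape
    padded = np.zeros((H, W))
    top = H // 2 - kh // 2          # align kernel center with map center
    left = W // 2 - kw // 2
    padded[top:top + kh, left:left + kw] = atten_matrix
    return wave_map + padded

# Example: a 5x5 attenuation parameter matrix (diffusion kernel scaled by
# the assumed wave injection displacement) superimposed on a 25x15 map
# (stored as a 15-row by 25-column array).
kernel = np.array([[0, 0, 1, 0, 0],
                   [0, 1, 2, 1, 0],
                   [1, 2, -12, 2, 1],
                   [0, 1, 2, 1, 0],
                   [0, 0, 1, 0, 0]], dtype=float)
injection_displacement = 0.5      # assumed scalar, per object attributes
wave = np.zeros((15, 25))
wave = superimpose_kernel(wave, kernel * injection_displacement)
```

The kernel shape determines the footprint of the injected wave; its center value (the depression under the object) lands exactly on the map's center vertex.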
  • Then the terminal 10 performs offset processing on the attenuated water wave map. Specifically, according to the position offset of the game character object, the terminal shifts the attenuated water wave map in the direction opposite to the character's movement to obtain the offset-processed water wave map. In this embodiment, if the game character object moves 1 meter to the east, the terminal 10 shifts the attenuated water wave map as a whole 1 meter to the west to obtain the shifted water wave map.
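A minimal sketch of this whole-map shift, assuming the position offset has already been converted to an integer number of grid cells (the function name and the zero-fill boundary handling are assumptions; the patent does not specify how vacated cells are filled):

```python
import numpy as np

def offset_wave_map(wave_map, dx_cells):
    """Shift the wave map horizontally opposite to the object's movement,
    filling the vacated columns with zeros. dx_cells > 0 means the object
    moved east (towards increasing column index)."""
    shifted = np.zeros_like(wave_map)
    if dx_cells > 0:                       # object moved east -> map shifts west
        shifted[:, :-dx_cells] = wave_map[:, dx_cells:]
    elif dx_cells < 0:                     # object moved west -> map shifts east
        shifted[:, -dx_cells:] = wave_map[:, :dx_cells]
    else:
        shifted[:] = wave_map
    return shifted
```

A vertical offset would be handled the same way along the other axis; the key point is that the existing waves lag behind the object, which is what produces the hysteresis effect described later.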
  • Next, the terminal 10 reads the water surface height information from the offset-processed water wave map; the water surface height information includes the water surface height displacements corresponding to the current image frame and the previous frame. Using this height information as the wave equation parameters, the terminal obtains the water surface height displacement corresponding to the next image frame through the damped wave equation, and generates the water wave map corresponding to the target water body area where the game character object is located in the next image frame according to the height displacements corresponding to the current and next image frames.
  • Then, based on the water wave map corresponding to the target water body area where the game character object is located in the next image frame, the terminal obtains, for each water surface vertex in the next image frame, the water surface height displacement differences relative to the upper vertex and the right vertex adjacent to its position, and acquires from these differences the normal map corresponding to the target water body area where the dynamic object is located in the next image frame.
  • The R channel and G channel of the normal map store, for each water surface vertex in the next image frame, the height difference to its adjacent upper vertex and the height difference to its adjacent right vertex, respectively. These height displacement differences characterize the unevenness of the water surface, and the water wave effect of the target water body area where the game character object is located in the next image frame is rendered based on the normal map.
  • In this way, the terminal 10 can iteratively update the water wave map and the normal map for each subsequent image frame according to the water wave map acquired for the frame before it, so that rendering proceeds iteratively frame by frame and a realistic water wave animation effect can be simulated.
  • When the game character object moves eastward, the water surface is disturbed and a new water wave is injected, while the existing waves are offset in the direction opposite to the character's movement, that is, to the west.
  • In this way, the water wave hysteresis effect shown in the right-hand image of FIG. 7 is formed.
  • the embodiments of the present application further provide an image rendering device.
  • the image rendering device provided by the embodiments of the present application will be introduced from the perspective of functional modularization.
  • the device 700 includes:
  • the obtaining module 710 is used to obtain the water wave map corresponding to the target water body area where the dynamic object in the current image frame is located;
  • the iteration module 720 is used to obtain a normal map corresponding to the target water body region where the dynamic object is located in the next image frame according to the water wave map corresponding to the target water body region where the dynamic object is located in the current image frame;
  • the rendering module 730 is configured to render the water wave effect of the target water body region where the dynamic object is located in the next image frame according to the normal map corresponding to the target water body region where the dynamic object is located in the next image frame.
  • FIG. 9 is a schematic structural diagram of an image rendering apparatus provided by an embodiment of the present application. Based on the structure shown in FIG. 8, the apparatus 700 further includes:
  • The attenuation module 740 is configured to, when the dynamic object moves in the water, attenuate the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame according to the water wave injection displacement and the water wave diffusion attenuation matrix corresponding to the dynamic object, to obtain the attenuated water wave map;
  • the offset module 750 is configured to offset the attenuation-processed water wave map along the reverse direction of the moving direction of the dynamic object according to the position offset of the dynamic object to obtain the offset-processed water wave map;
  • the iterative module 720 is specifically used to obtain the normal map corresponding to the target water area where the dynamic object is located in the next image frame according to the water wave map after the offset processing.
  • FIG. 10 is a schematic structural diagram of an image rendering device provided by an embodiment of the present application. Based on the structure shown in FIG. 9, the device 700 further includes:
  • the first acquiring module 760 is configured to acquire the water wave injection displacement corresponding to the dynamic object according to at least one of the object attribute of the dynamic object or the action type performed by the dynamic object.
  • the attenuation module 740 is specifically used for:
  • superimposing the values of the attenuation parameter matrix (the product of the water wave diffusion attenuation matrix and the water wave injection displacement) onto the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame, to obtain the attenuated water wave map.
  • the iteration module 720 includes:
  • The reading submodule 721 is configured to read the water surface height information from the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame, where the water surface height information includes the water surface height displacements corresponding to the current image frame and the previous frame;
  • the first iteration submodule 722 is used to obtain the water wave map corresponding to the target water body area where the dynamic object is located in the next image frame according to the water surface height information;
  • the second iteration submodule 723 is configured to obtain a normal map corresponding to the target water body area where the dynamic object is located in the next image frame according to the water wave map corresponding to the target water body area where the dynamic object is located in the next image frame.
  • The image rendering device shown in FIG. 11 may also be formed on the basis of the structure shown in FIG. 9 or FIG. 10, which is not limited in the embodiments of the present application.
  • the first iteration submodule 722 is specifically used to:
  • a water wave map corresponding to the target water body region where the dynamic object is located in the next image frame is generated.
  • the second iteration sub-module 723 is specifically used to:
  • obtaining, from the water wave map corresponding to the target water body area where the dynamic object is located in the next image frame, the water surface height displacement difference of each water surface vertex in the next image frame relative to the upper vertex and the right vertex adjacent to its position, and acquiring the normal map corresponding to the target water body area where the dynamic object is located in the next image frame according to the water surface height displacement differences; or,
  • obtaining, from the water wave map corresponding to the target water body area where the dynamic object is located in the next image frame, the water surface height displacement difference of each water surface vertex in the next image frame relative to the left vertex and the lower vertex adjacent to its position, and acquiring the normal map corresponding to the target water body area where the dynamic object is located in the next image frame according to the water surface height displacement differences.
  • FIG. 12 is a schematic structural diagram of an image rendering device provided by an embodiment of the present application. Based on the structure shown in FIG. 8, the device 700 further includes:
  • The second obtaining module 770 is configured to obtain the image position of the dynamic object in the image frame, take the image position as the center of the target water body area where the dynamic object is located, take a preset width and height as the width and height of the target water body area, and thereby obtain the target water body area where the dynamic object is located in the image frame.
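The region computed by this module can be sketched as a simple rectangle centered on the object's image position. The function and parameter names below are illustrative, not from the patent:

```python
def target_water_area(obj_x, obj_y, preset_w, preset_h):
    """Rectangle centered on the dynamic object's image position with a
    preset width and height. Returns (left, top, width, height)."""
    left = obj_x - preset_w / 2.0
    top = obj_y - preset_h / 2.0
    return (left, top, preset_w, preset_h)
```

Only this local rectangle is simulated and rendered, which is what keeps the per-frame cost small enough for mobile devices.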
  • The image rendering device shown in FIG. 12 may also be formed on the basis of the structure shown in FIG. 9 or FIG. 10, which is not limited in the embodiments of the present application.
  • The water wave map and the normal map each use a map format with two color channels.
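For illustration (the array layout and names are assumptions), a two-channel map can hold, for example, the current- and previous-frame heights of the wave map, or the two per-vertex height differences of the normal map, in its R and G channels:

```python
import numpy as np

def pack_rg(channel_r, channel_g):
    """Pack two same-sized 2-D fields into the R and G channels of a
    two-channel texture array of shape (H, W, 2). Illustrative sketch."""
    assert channel_r.shape == channel_g.shape
    return np.stack([channel_r, channel_g], axis=-1)

# e.g. R = current-frame heights, G = previous-frame heights
tex = pack_rg(np.zeros((15, 25)), np.ones((15, 25)))
```

Using only two channels per map keeps texture bandwidth low, consistent with the mobile-oriented design discussed in the text.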
  • the image frame includes an image frame in a game application
  • the dynamic object includes a game character object.
  • In summary, an embodiment of the present application provides an image rendering device that takes the water wave diffusion motion into account: based on the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame, it obtains the normal map corresponding to the target water body area where the dynamic object is located in the next image frame, and renders, based on that normal map, the water wave effect of the target water body area in the next image frame.
  • Iterative frame-by-frame rendering can simulate a realistic water wave animation effect, making the water surface interaction very natural.
  • In addition, the complexity is reduced and the amount of computation is lowered, so the method adapts well to mobile applications.
  • FIG. 8 to FIG. 12 introduce the image rendering device from the perspective of functional modularization.
  • The embodiments of the present application also provide a processing device for image rendering. The following introduces the device provided by the embodiments of the present application from the perspective of hardware instantiation.
  • An embodiment of the present application also provides a processing device.
  • the processing device may be a terminal or a server.
  • The terminal may be any terminal device, including a mobile phone, a tablet computer, a personal digital assistant (PDA), an in-vehicle computer, and the like. The following takes a mobile phone as an example:
  • The mobile phone includes components such as a radio frequency (RF) circuit 1210, a memory 1220, an input unit 1230, a display unit 1240, a sensor 1250, an audio circuit 1260, a wireless fidelity (WiFi) module 1270, a processor 1280, and a power supply 1290.
  • The RF circuit 1210 can be used to receive and send signals during information transmission and reception or during a call.
  • In particular, downlink information from the base station is received and handed to the processor 1280 for processing; in addition, designed uplink data is sent to the base station.
  • Generally, the RF circuit 1210 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like.
  • the RF circuit 1210 can also communicate with the network and other devices through wireless communication.
  • The above wireless communication may use any communication standard or protocol, including but not limited to the Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and the like.
  • the memory 1220 may be used to store software programs and modules.
  • the processor 1280 runs the software programs and modules stored in the memory 1220 to execute various functional applications and data processing of the mobile phone.
  • The memory 1220 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book), and the like.
  • In addition, the memory 1220 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the input unit 1230 may be used to receive input numeric or character information, and generate key signal input related to user settings and function control of the mobile phone.
  • the input unit 1230 may include a touch panel 1231 and other input devices 1232.
  • The touch panel 1231, also known as a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1231 using a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection apparatus according to a preset program.
  • the touch panel 1231 may include a touch detection device and a touch controller.
  • The touch detection device detects the user's touch orientation and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 1280, and can receive and execute commands sent by the processor 1280.
  • the touch panel 1231 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic waves.
  • the input unit 1230 may also include other input devices 1232.
  • other input devices 1232 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), trackball, mouse, joystick, and so on.
  • the display unit 1240 may be used to display information input by the user or information provided to the user and various menus of the mobile phone.
  • the display unit 1240 may include a display panel 1241.
  • Optionally, the display panel 1241 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
  • Further, the touch panel 1231 can cover the display panel 1241. When the touch panel 1231 detects a touch operation on or near it, it transmits the operation to the processor 1280 to determine the type of the touch event, and the processor 1280 then provides corresponding visual output on the display panel 1241 according to the type of the touch event.
  • Although the touch panel 1231 and the display panel 1241 are described here as two separate components implementing the input and output functions of the mobile phone, in some embodiments the touch panel 1231 and the display panel 1241 may be integrated to implement the input and output functions of the mobile phone.
  • the mobile phone may further include at least one sensor 1250, such as a light sensor, a motion sensor, and other sensors.
  • The light sensor may include an ambient light sensor and a proximity sensor, where the ambient light sensor may adjust the brightness of the display panel 1241 according to the brightness of the ambient light, and the proximity sensor may turn off the display panel 1241 and/or the backlight when the mobile phone moves close to the ear.
  • As a kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in various directions (generally along three axes), can detect the magnitude and direction of gravity when at rest, and can be used for applications that recognize the mobile phone's posture (such as landscape/portrait switching, related games, and magnetometer posture calibration) and for vibration-recognition-related functions (such as a pedometer or tapping).
  • Other sensors, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, may also be configured on the mobile phone, and are not described in detail here.
  • the audio circuit 1260, the speaker 1261, and the microphone 1262 can provide an audio interface between the user and the mobile phone.
  • On one hand, the audio circuit 1260 can transmit the electrical signal converted from the received audio data to the speaker 1261, which converts it into a sound signal for output; on the other hand, the microphone 1262 converts the collected sound signal into an electrical signal, which the audio circuit 1260 receives and converts into audio data. The audio data is then output to the processor 1280 for processing and sent, for example, to another mobile phone through the RF circuit 1210, or output to the memory 1220 for further processing.
  • WiFi is a short-range wireless transmission technology.
  • Through the WiFi module 1270, the mobile phone can help users send and receive e-mails, browse web pages, access streaming media, and so on; it provides users with wireless broadband Internet access.
  • Although FIG. 13 shows the WiFi module 1270, it can be understood that it is not a necessary component of the mobile phone and can be omitted as needed without changing the essence of the invention.
  • The processor 1280 is the control center of the mobile phone. It uses various interfaces and lines to connect all parts of the entire mobile phone, and performs the various functions of the mobile phone and processes data by running or executing the software programs and/or modules stored in the memory 1220 and calling the data stored in the memory 1220, thereby monitoring the mobile phone as a whole.
  • Optionally, the processor 1280 may include one or more processing units; preferably, the processor 1280 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, application programs, and the like,
  • and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may also not be integrated into the processor 1280.
  • the mobile phone also includes a power supply 1290 (such as a battery) that supplies power to various components.
  • the power supply can be logically connected to the processor 1280 through the power management system, so as to realize functions such as charging, discharging, and power management through the power management system.
  • the mobile phone may also include a camera, a Bluetooth module, etc., which will not be repeated here.
  • the processor 1280 included in the terminal also has the following functions:
  • obtaining the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame; obtaining, according to that water wave map, the normal map corresponding to the target water body area where the dynamic object is located in the next image frame; and rendering, according to the normal map, the water wave effect of the target water body area where the dynamic object is located in the next image frame.
  • processor 1280 is further configured to execute any step of the image rendering method provided in the embodiments of the present application, specifically as follows:
  • processor 1280 is also used to execute:
  • when the dynamic object moves in the water, attenuating, according to the water wave injection displacement and the water wave diffusion attenuation matrix corresponding to the dynamic object, the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame, to obtain the attenuated water wave map;
  • shifting, according to the position offset of the dynamic object, the attenuated water wave map in the direction opposite to the moving direction of the dynamic object, to obtain the offset-processed water wave map;
  • the processor 1280 is configured to obtain, according to the offset-processed water wave map, the normal map corresponding to the target water body area where the dynamic object is located in the next image frame.
  • The processor 1280 is further configured to obtain the water wave injection displacement corresponding to the dynamic object according to at least one of the object attribute of the dynamic object or the action type performed by the dynamic object.
  • processor 1280 is also used to execute:
  • superimposing the values of the attenuation parameter matrix onto the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame, to obtain the attenuated water wave map.
  • processor 1280 is also used to execute:
  • reading the water surface height information from the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame, where the water surface height information includes the water surface height displacements corresponding to the current image frame and the previous frame;
  • obtaining, according to the water surface height information, the water wave map corresponding to the target water body area where the dynamic object is located in the next image frame; and obtaining, according to that water wave map, the normal map corresponding to the target water body area where the dynamic object is located in the next image frame.
  • processor 1280 is also used to execute:
  • obtaining, by using the water surface height information as wave equation parameters and through the damped wave equation, the water surface height displacement corresponding to the next image frame, and generating, according to the water surface height displacements corresponding to the current and next image frames, the water wave map corresponding to the target water body area where the dynamic object is located in the next image frame.
  • processor 1280 is also used to execute:
  • obtaining, from the water wave map corresponding to the target water body area where the dynamic object is located in the next image frame, the water surface height displacement difference of each water surface vertex in the next image frame relative to the upper vertex and the right vertex adjacent to its position, and acquiring the normal map corresponding to the target water body area where the dynamic object is located in the next image frame according to the water surface height displacement differences; or,
  • obtaining, from the water wave map corresponding to the target water body area where the dynamic object is located in the next image frame, the water surface height displacement difference of each water surface vertex in the next image frame relative to the left vertex and the lower vertex adjacent to its position,
  • and acquiring the normal map corresponding to the target water body area where the dynamic object is located in the next image frame according to the water surface height displacement differences.
  • processor 1280 is also used to execute:
  • obtaining the image position of the dynamic object in the image frame, taking the image position as the center of the target water body area where the dynamic object is located, taking a preset width and height as the width and height of the target water body area, and thereby obtaining the target water body area where the dynamic object is located in the image frame.
  • the water wave map and the normal map each use a map format with two color channels.
  • the image frame includes an image frame in a game application
  • the dynamic object includes a game character object.
  • Embodiments of the present application also provide a computer-readable storage medium for storing program code, which is executed by a processor to implement any one of the image rendering methods described in the foregoing embodiments.
  • Embodiments of the present application also provide a computer program product including instructions, which when run on a computer, causes the computer to execute any one of the image rendering methods described in the foregoing embodiments.
  • It should be understood that the disclosed system, apparatus, and method may be implemented in other ways.
  • the device embodiments described above are only schematic.
  • The division of units is only a division of logical functions; in actual implementation there may be other division manners. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the unit described as a separate component may or may not be physically separated, and the component displayed as a unit may or may not be a physical unit, that is, it may be located in one place, or may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above integrated unit may be implemented in the form of hardware or software functional unit.
  • If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium.
  • Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the existing technology, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods in the various embodiments of the present application.
  • The aforementioned storage media include various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
  • "At least one (item)" refers to one or more, and "multiple" refers to two or more.
  • "And/or" is used to describe the association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: only A exists, only B exists, or both A and B exist, where A and B may be singular or plural.
  • The character "/" generally indicates that the associated objects are in an "or" relationship.
  • "At least one of the following" or a similar expression refers to any combination of these items, including any combination of a single item or multiple items.
  • For example, at least one of a, b, or c can be expressed as: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c may be single or multiple.

Abstract

The image rendering method, device, and storage medium disclosed in the embodiments of the present application obtain, according to the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame, the normal map corresponding to the target water body area where the dynamic object is located in the next image frame, and then render the water wave effect of that target water body area in the next image frame according to the normal map. On the one hand, iterative frame-by-frame rendering can simulate a realistic water wave animation effect, making the water surface interaction very natural and providing users with an immersive experience; on the other hand, because water wave rendering is performed only for the target water body area, the complexity and the amount of computation are reduced, so the method adapts well to mobile applications.

Description

Image rendering method, device, and storage medium

This application claims priority to Chinese patent application No. 2018114974820, filed on December 7, 2018 and entitled "Image rendering method, apparatus, device, and storage medium", the entire contents of which are incorporated herein by reference.

Technical Field

This application relates to the field of computer technologies, and in particular to an image rendering method, device, and storage medium.
Background

With the development of computer graphics technology, virtual scenes are used in more and more applications, and users increasingly expect more realistic interaction effects in virtual scenes so as to enjoy an immersive experience. For example, in game applications, players pay attention to the interaction between the dynamic game objects shown in the game interface and the game scene. For instance, when a player character swims in water, a more realistic water surface interaction effect in which the water fluctuates with the player's movement can increase the player's enjoyment of the game.

Due to the limited hardware performance of mobile terminals and the complexity of the technical implementation, most mobile games currently on the market avoid water surface interaction: the water surface is static, and when a dynamic game object (for example, a player character) moves in the water, the water surface shows no interactive effect at all, which prevents players from experiencing the game scene immersively.

Some mobile games do attempt water surface interaction by binding a special effect near the dynamic game object to disturb the water surface. In essence, a special effect is attached to the dynamic game object and disturbs the water surface; however, when the dynamic game object turns while moving, this effect swings behind the object like a tail, making the whole game picture look very unnatural, which in turn degrades the user's immersive experience.

On this basis, how to provide an image rendering method that offers users a more realistic water surface interaction effect in a virtual scene, and thus an immersive experience, is an urgent problem in current application development.
Summary

This application provides an image rendering method that obtains, based on the water wave map corresponding to the target water body area where the dynamic object is located in an image frame, the normal map corresponding to the target water body area where the dynamic object is located in the next image frame, and renders the water wave effect of the target water body area based on the normal map. Compared with attaching a special-effect pendant, it provides users with a more realistic water surface interaction effect and improves the immersive experience. Correspondingly, this application also provides an image rendering apparatus, a device, a computer-readable storage medium, and a computer program product.

A first aspect of this application provides an image rendering method applied to a processing device, the method including:

obtaining the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame;

obtaining, according to the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame, the normal map corresponding to the target water body area where the dynamic object is located in the next image frame;

rendering, according to the normal map corresponding to the target water body area where the dynamic object is located in the next image frame, the water wave effect of the target water body area where the dynamic object is located in the next image frame.
A second aspect of this application provides an image rendering apparatus, the apparatus including:

an obtaining module, configured to obtain the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame;

an iteration module, configured to obtain, according to the water wave map corresponding to the target water body area where the dynamic object is located in the current image frame, the normal map corresponding to the target water body area where the dynamic object is located in the next image frame;

a rendering module, configured to render, according to the normal map corresponding to the target water body area where the dynamic object is located in the next image frame, the water wave effect of the target water body area where the dynamic object is located in the next image frame.
A third aspect of this application provides a processing device, the device including a processor and a memory:

the memory is configured to store program code and transmit the program code to the processor;

the processor is configured to execute, according to instructions in the program code, the steps of the image rendering method described in the first aspect above.

A fourth aspect of this application provides a computer-readable storage medium configured to store program code, the program code being executed by a processor to implement the image rendering method described in the first aspect above.

A fifth aspect of this application provides a computer program product including instructions that, when run on a computer, cause the computer to execute the image rendering method described in the first aspect above.
Brief Description of the Drawings

FIG. 1 is a scenario architecture diagram of an image rendering method in an embodiment of this application;

FIG. 2 is a flowchart of an image rendering method in an embodiment of this application;

FIG. 3 is a schematic diagram of the principle of obtaining the water wave map and normal map of the next image frame based on the water wave map of the current image frame in an embodiment of this application;

FIG. 4 is a flowchart of an image rendering method in an embodiment of this application;

FIG. 5 is a schematic diagram of the water wave hysteresis effect provided in an embodiment of this application;

FIG. 6 is a schematic diagram of an application scenario of an image rendering method in an embodiment of this application;

FIG. 7 is a schematic diagram of the effect of an image rendering method in an embodiment of this application;

FIG. 8 is a schematic structural diagram of an image rendering apparatus in an embodiment of this application;

FIG. 9 is a schematic structural diagram of an image rendering apparatus in an embodiment of this application;

FIG. 10 is a schematic structural diagram of an image rendering apparatus in an embodiment of this application;

FIG. 11 is a schematic structural diagram of an image rendering apparatus in an embodiment of this application;

FIG. 12 is a schematic structural diagram of an image rendering apparatus in an embodiment of this application;

FIG. 13 is a schematic structural diagram of a processing device for image rendering in an embodiment of this application.
Detailed Description

To help those skilled in the art better understand the solutions of this application, the technical solutions in the embodiments of this application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of this application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of this application without creative effort fall within the protection scope of this application.

The terms "first", "second", "third", "fourth", and so on (if any) in the specification, claims, and accompanying drawings of this application are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence. It should be understood that data so used are interchangeable where appropriate, so that the embodiments of this application described here can be implemented in orders other than those illustrated or described herein. In addition, the terms "include" and "have" and any variants thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that includes a series of steps or units is not necessarily limited to those steps or units clearly listed, but may include other steps or units that are not clearly listed or that are inherent to the process, method, product, or device.
For ease of understanding, the technical terms involved in this application are first introduced.

Water surface rendering refers to the technique of drawing a body of water in video, including the geometric representation, shading, and animation of the water body. When drawing a body of water, it is mainly the water surface that is drawn; specifically, the water surface is divided into a grid, and the surface is characterized by the position information and normal information of the grid vertices.

A water wave map specifically refers to the map formed by the height displacement information of the grid vertices of the water surface.

A normal map specifically refers to the map formed by the normal information of the tangent planes of the water surface. For water surface rendering, the normal map records the normal information corresponding to each vertex; it can characterize the unevenness of each vertex of the water surface grid, and a three-dimensional effect is achieved based on this unevenness.
针对现有技术中通过绑定特效的方式实现水面交互导致画面不自然,降低了用户沉浸式体验的技术问题,本申请提供了一种图像渲染方法,该方法考虑水波扩散运动情况,基于当前图像帧中动态对象所处的目标水体区域对应的水波图获取下一图像帧中动态对象所处的目标水体区域对应的法线贴图,基于法线贴图渲染下一图像帧中目标水体区域的水波效果,一方面通过逐帧迭代渲染,能够模拟逼真的水波动画效果,使得水面交互非常自然,为用户提供沉浸式体验,另一方面,由于仅对目标水体区域进行水波渲染,简化了复杂度,降低了运算量,能够较好地适配移动端应用。
应理解,本申请提供的图像渲染方法可以应用于任意具有图像处理能力的处理设备。该处理设备具体可以是包括图形处理器(Graphics Processing Unit,GPU)和中央处理器(Central Processing Unit,CPU)中的至少一种的设备,例如该处理设备可以是终端或者服务器。其中,终端是指现有的、正在研发的或将来研发的任何具有图像处理能力的用户设备,包括但不限于:现有的、正在研发的或将来研发的智能手机、平板电脑、膝上型个人计算机、桌面型个人计算机等。服务器是指任意具有图像处理能力的、提供计算服务的设备。在具体实现时,处理设备可以是独立的终端或服务器,也可以是终端或服务器形成的集群。
本申请的图像渲染方法可以以应用程序的形式存储于处理设备。处理设备通过执行上述应用程序实现图像渲染方法。其中,应用程序可以是独立的应用程序,也可以是集成于其他应用程序上的功能模块、插件、小程序等等。为了便于理解,后文均以终端作为执行主体对图像渲染方法的具体实现进行示例性说明。
请参见图1所示的图像渲染方法的场景架构图,该场景中包括终端10,终端10获取当前图像帧中动态对象所处的目标水体区域对应的水波图,然后根据当前图像帧中动态对象所处的目标水体区域对应的水波图,获取下一图像帧中动态对象所处的目标水体区域对应的法线贴图,基于下一图像帧中动态对象所处的目标水体区域对应的法线贴图,渲染下一图像帧中动态对象所处的目标水体区域的水波效果,通过一帧一帧地迭代渲染,能够模拟逼真的水波动画效果,实现自然的水面渲染。
在一种可能实现方式中,该终端10还可以根据当前图像帧中动态对象所处的目标水体区域对应的水波图,获取下一图像帧中动态对象所处的目标水体区域对应的水波图,该水波图用于获取下下图像帧中动态对象所处的目标水体区域对应的法线贴图,图1中仅以也获取该水波图为例,但不对本申请造成限定。
为了使得本申请的技术方案更加清楚、易于理解,接下来,将结合附图对本申请实施例提供的图像渲染方法进行详细介绍。参见图2所示的图像渲染方法的流程图,在本申请实施例中,该方法可以应用于处理设备中,处理设备可以为服务器,也可以为终端,无论执行主体为服务器还是终端,方法流程均同理,在此仅以该方法应用于终端为例进行说明,该方法包括:
S201:获取当前图像帧中动态对象所处的目标水体区域对应的水波图。
本实施例的图像渲染方法可以应用于多种场景中,例如比较典型的一种场景是针对游戏场景中的水面渲染水波动画效果,在该场景下,图像帧就是游戏应用中的图像帧,而动态对象就是游戏应用中可移动的游戏角色对象,例如游戏玩家;利用该图像渲染方法能够针对游戏玩家在水中的移动情况动态渲染出逼真的水波动画效果,从而能够增加游戏玩家的沉浸感。
其中,动态对象所处的目标水体区域是整个水面的局部区域,该局部区域的形状和大小可以根据需求而设置,例如该局部区域的形状可以为矩形,也可以为扇形或者圆形等。为了便于理解,本申请实施例提供了获取目标水体区域的几种实现方式:
具体地,一种实现方式是,终端可以获取动态对象在图像帧中的图像位置,以该图像位置为该动态对象所处的目标水体区域的中心位置,以预设宽高为目标水体区域的宽高,获取在图像帧中动态对象所处的矩形区域为目标水体区域。
另一种实现方式是,终端可以获取动态对象在图像帧中的图像位置,以该图像位置为该动态对象所处的目标水体区域的顶点位置,以预设半径和预设角度为目标水体区域的扇形半径和扇形角度,获取在图像帧中动态对象所处的扇形区域为目标水体区域。
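上述第一种以图像位置为中心、预设宽高为尺寸获取矩形目标水体区域的计算,可以用如下Python代码示意(函数名get_target_region及坐标约定均为本文为说明而作的假设,并非本申请限定的实现):

```python
def get_target_region(pos_x, pos_y, width, height):
    """以动态对象的图像位置 (pos_x, pos_y) 为中心、
    预设宽高 width、height 为尺寸,
    计算矩形目标水体区域的边界 (左, 下, 右, 上)。"""
    half_w, half_h = width / 2.0, height / 2.0
    return (pos_x - half_w, pos_y - half_h,
            pos_x + half_w, pos_y + half_h)

# 示例:动态对象位于 (100, 50),预设宽高为 20x10
region = get_target_region(100, 50, 20, 10)
```

扇形区域的获取同理,只需把中心位置换成扇形顶点,并以预设半径和角度界定区域范围。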
由于水波一般是由中心点向邻近区域扩散,当水波扩散到远处时水波效果已经不显著,而且游戏玩家也往往不会关注远处的水波效果,因此,在本实施例中仅对图像帧中的动态对象所处的目标水体区域进行水面渲染就能够满足游戏应用需求,更重要的是,仅对水面的局部区域进行渲染,能够极大减少运算量。
S202:根据该当前图像帧中动态对象所处的目标水体区域对应的水波图,获取下一图像帧中动态对象所处的目标水体区域对应的法线贴图。
在本实施例中,终端根据当前图像帧中动态对象所处的目标水体区域对应的水波图获取下一图像帧中动态对象所处的目标水体区域对应的法线贴图,该法线贴图用于渲染目标水体区域的水波效果。
在一种可能实现方式中,该终端可以根据当前图像帧中动态对象所处的目标水体区域对应的水波图获取下一图像帧中动态对象所处的目标水体区域对应的两张图像,第一张图像即为法线贴图,第二张图像为水波图,该水波图用于计算下下图像帧的法线贴图,或者用于计算下下图像帧的水波图和法线贴图,从而实现一帧一帧地迭代更新。在另一种可能实现方式中,该水波图还可以在获取下下图像帧时再获取,本申请实施例对该水波图的获取时机不作限定。
其中,针对任一图像帧,其目标水体区域对应的水波图包括该图像帧及其上一帧的水面高度位移,其目标水体区域对应的法线贴图包括该图像帧中各水面顶点分别与其位置相邻的两顶点相比的水面高度位移差值。基于此,终端可以通过如下步骤获取下一图像帧中动态对象所处的目标水体区域对应的法线贴图:
第一步,从当前图像帧中动态对象所处的目标水体区域对应的水波图中,读取水面高度信息,其中,水面高度信息包括该当前图像帧及其上一帧各自对应的水面高度位移;
第二步,根据该水面高度信息,获取下一图像帧中动态对象所处的目标水体区域对应的水波图;
第三步,根据下一图像帧中动态对象所处的目标水体区域对应的水波图,获取该下一图像帧中动态对象所处的目标水体区域对应的法线贴图。
需要说明的是,在上述步骤中,终端可以在第二步中即得到了下一图像帧中动态对象所处的目标水体区域对应的水波图,并将其作为获取法线贴图的依据,以及获取下下图像帧中动态对象所处的目标水体区域对应的法线贴图的依据。终端也可以在获取下下图像帧对应的法线贴图时再获取一遍,本申请实施例对此不作限定。
针对第二步,考虑到水波的波动性,本实施例提供了一种基于波动方程获取下一图像帧的水波图的具体实现方式。具体地,终端将水面高度信息作为波动方程参数,通过带阻尼的波动方程和该波动方程参数获得下一图像帧对应的水面高度位移,根据当前图像帧对应的水面高度位移和下一图像帧对应的水面高度位移,生成下一图像帧中动态对象所处的目标水体区域对应的水波图。
在该实施例中,带阻尼的波动方程具体为:
$$\frac{\partial^2 h}{\partial t^2} + k\frac{\partial h}{\partial t} = c^2\left(\frac{\partial^2 h}{\partial x^2} + \frac{\partial^2 h}{\partial y^2}\right) \qquad (1)$$
其中,h是指水面高度位移;k是指阻尼系数;c是指水波传播速度;t是时间;x,y分别为图像帧中位置的横纵坐标。
在将波动方程参数代入上述带阻尼的波动方程后,可以通过有限差分法来近似求解。具体可以参见如下公式:
$$\frac{\partial^2 h}{\partial t^2} \approx \frac{h(t+\Delta t,x,y)-2h(t,x,y)+h(t-\Delta t,x,y)}{\Delta t^2},\qquad \frac{\partial h}{\partial t} \approx \frac{h(t+\Delta t,x,y)-h(t-\Delta t,x,y)}{2\Delta t},$$
$$\frac{\partial^2 h}{\partial x^2}+\frac{\partial^2 h}{\partial y^2} \approx h(t,x+1,y)+h(t,x-1,y)+h(t,x,y+1)+h(t,x,y-1)-4h(t,x,y) \qquad (2)$$
其中网格间距取1。
将式(2)代入式(1)可得:
$$h(t+\Delta t,x,y)=\frac{2h(t,x,y)-\left(1-\frac{k\Delta t}{2}\right)h(t-\Delta t,x,y)+c^2\Delta t^2\left[h(t,x+1,y)+h(t,x-1,y)+h(t,x,y+1)+h(t,x,y-1)-4h(t,x,y)\right]}{1+\frac{k\Delta t}{2}} \qquad (3)$$
其中,h(t+Δt,x,y)是所要求解的下一图像帧的(x,y)处的高度位移;
h(t,x,y)是当前图像帧的(x,y)处的高度位移;
h(t-Δt,x,y)是上一图像帧的(x,y)处的高度位移;
h(t,x+1,y),h(t,x-1,y),h(t,x,y+1),h(t,x,y-1)是(x,y)周围四个顶点在当前图像帧中的高度位移。
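式(3)的逐顶点迭代可以用如下NumPy代码示意(其中k、c、Δt的取值仅为演示假设;边界采用循环填充也仅为简化,并非本申请限定的边界处理方式):

```python
import numpy as np

def step_wave(h_prev, h_curr, k=0.5, c=1.0, dt=0.1):
    """按带阻尼的波动方程的有限差分形式(式(3)),
    由上一帧高度 h_prev 与当前帧高度 h_curr 计算下一帧的水面高度位移。"""
    # 四邻域高度之和减去 4 倍中心高度,即离散拉普拉斯项
    lap = (np.roll(h_curr, 1, axis=0) + np.roll(h_curr, -1, axis=0) +
           np.roll(h_curr, 1, axis=1) + np.roll(h_curr, -1, axis=1) -
           4.0 * h_curr)
    a = k * dt / 2.0  # 阻尼项系数
    h_next = (2.0 * h_curr - (1.0 - a) * h_prev + (c * dt) ** 2 * lap) / (1.0 + a)
    return h_next
```

静止水面(高度处处为零)迭代后保持静止;单点扰动会对称地向四邻域扩散并随阻尼衰减。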
针对第三步,本申请实施例提供了两种实现方式。一种实现方式为,根据下一图像帧中动态对象所处的目标水体区域对应的水波图获取该下一图像帧中各水面顶点分别与其位置相邻的上边顶点和右边顶点相比的水面高度位移差值,根据该水面高度位移差值获取该下一图像帧中动态对象所处的目标水体区域对应的法线贴图。
另一种实现方式为,根据下一图像帧中动态对象所处的目标水体区域对应的水波图获取该下一图像帧中各水面顶点分别与其位置相邻的左边顶点和下边顶点相比的水面高度位移差值,根据该水面高度位移差值获取该下一图像帧中动态对象所处的目标水体区域对应的法线贴图。
还需要说明的是,在本实施例中,该水波图和该法线贴图采用两种颜色渲染贴图格式,例如,RGHalf格式,该格式具体是指采用R通道和G通道存储图像信息,具体地,针对水波图,利用R通道和G通道分别存储当前图像帧及其上一图像帧的水面高度位移,针对法线贴图,利用R通道和G通道分别存储水波图各水面顶点分别与相邻两顶点的水面高度位移差值,如此可以减少渲染的运算量。
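上述"R、G通道分别存储与上边顶点、右边顶点的水面高度位移差值"的做法,可以用如下代码示意(数组轴向与"上边""右边"的对应关系为本文假设):

```python
import numpy as np

def build_normal_map(h_next):
    """由下一帧的水波高度图计算法线贴图的两个通道:
    R 通道存各顶点与其上边相邻顶点的高度位移差值,
    G 通道存各顶点与其右边相邻顶点的高度位移差值。"""
    # 约定:第 0 轴向下为"上->下"方向,第 1 轴向右为"左->右"方向(仅为示意)
    diff_up = h_next - np.roll(h_next, 1, axis=0)     # 与上边顶点的差值
    diff_right = h_next - np.roll(h_next, -1, axis=1)  # 与右边顶点的差值
    return np.stack([diff_up, diff_right], axis=-1)    # 形如 RGHalf 的双通道布局
```

两个通道的差值即反映了各顶点处水面的凹凸程度,供后续光照计算使用。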
下面结合图3对基于当前图像帧的水波图获取下一图像帧的水波图和法线贴图的过程进行示例性说明。如图3所示,其提供了三个渲染纹理(RenderTexture,RT),分别记作A、B以及Nor,其中,Nor表征法线贴图Normal Map。在该实施例中,三个渲染纹理均采用RGHalf格式,渲染纹理A和渲染纹理B分别用于存储相邻的两个图像帧的水波图,渲染纹理Nor用于存储相邻的两个图像帧中下一图像帧的法线贴图。针对每一图像帧,以当前图像帧的水波图为输入,以下一图像帧的水波图和法线贴图为多重渲染目标(Multiple Render Targets,MRT)输出。
针对第n帧,n为正整数,渲染纹理A存储当前图像帧的水波图,具体地,R通道存储第n-1帧的水面高度位移,G通道存储第n帧的水面高度位移;渲染纹理B存储第n+1帧的水波图,而第n+1帧的水波图包括第n帧的水面高度位移和第n+1帧的水面高度位移,如此,可以将渲染纹理A的G通道的数值写入渲染纹理B的R通道,将基于渲染纹理A中R通道和G通道的水面高度位移计算得到的第n+1帧的水面高度位移写入渲染纹理B的G通道,渲染纹理Nor存储第n+1帧的法线贴图,法线贴图包括第n+1帧中各水面顶点分别与相邻的上边顶点和右边顶点相比的水面高度位移差值,如此,可以基于渲染纹理B中G通道的数据计算得到与上边顶点的水面高度位移差值和与右边顶点的水面高度位移差值,分别写入渲染纹理Nor的R、G通道中。
每次更新水波图和法线贴图时,将两张水波图进行交换。如针对第n+1帧,则在第n帧基础上,将渲染纹理A和渲染纹理B交换,如此,第n+1帧以渲染纹理B为输入,以渲染纹理A和渲染纹理Nor为输出,渲染纹理B中存储了第n+1帧的水波图,其R通道存储有第n帧的水面高度位移,其G通道存储有第n+1帧的水面高度位移,基于此,将G通道中第n+1帧的水面高度位移写入渲染纹理A的R通道,将基于第n帧和第n+1帧的水面高度位移计算得到的第n+2帧的水面高度位移写入渲染纹理A的G通道,如此,在渲染纹理A中生成了第n+2帧的水波图,基于第n+2帧的水面高度位移可以计算得到各水面顶点分别与相邻的上边顶点和右边顶点相比的水面高度位移差值,将其分别写入渲染纹理Nor的R、G通道,形成第n+2帧的法线贴图。依此类推,可以生成第n+3帧、第n+4帧的水波图和法线贴图。
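上述渲染纹理A、B交替(ping-pong)更新的控制流程可以简化为如下示意代码(以普通变量代替RenderTexture,step_fn泛指由前两帧高度计算下一帧高度的任意函数,均为本文为说明而作的假设):

```python
def iterate_frames(h0, h1, step_fn, n_frames):
    """模拟水波图的逐帧迭代更新:每帧以(上一帧, 当前帧)高度为输入,
    由 step_fn 计算下一帧高度,随后交换两个缓冲区的角色,
    对应渲染纹理 A、B 的 ping-pong 交换。"""
    prev_h, curr_h = h0, h1
    for _ in range(n_frames):
        next_h = step_fn(prev_h, curr_h)
        prev_h, curr_h = curr_h, next_h  # 交换:当前帧成为新的上一帧
    return prev_h, curr_h
```

交换缓冲区而非拷贝数据,使得每帧只需一次写入,与图3中A、B两个渲染纹理互换输入输出角色的做法一致。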
S203:根据该下一图像帧中动态对象所处的目标水体区域对应的法线贴图,渲染下一图像帧中动态对象所处的目标水体区域的水波效果。
由于法线贴图能够表征动态对象所处的目标水体区域各水面顶点的凹凸程度,终端可以基于法线贴图渲染动态对象所处的目标水体区域的水波效果。具体地,终端根据下一图像帧中动态对象所处的目标水体区域对应的法线贴图,渲染下一图像帧中动态对象所处的目标水体区域的水波效果。通过一帧一帧地迭代渲染,能够模拟逼真的水波动画效果。
由上可知,本申请实施例提供了一种图像渲染方法,该方法考虑水波扩散运动情况,基于当前图像帧中动态对象所处的目标水体区域对应的水波图获取下一图像帧中动态对象所处的目标水体区域对应的法线贴图,基于法线贴图渲染下一图像帧中目标水体区域的水波效果,一方面通过逐帧迭代渲染,能够模拟逼真的水波动画效果,使得水面交互非常自然,为用户提供沉浸式体验,另一方面,由于仅对目标水体区域进行水波渲染,简化了复杂度,降低了运算量,能够较好地适配移动端应用。
图2所示实施例主要对水波的迭代过程进行详细说明,当动态对象在水中运动时,为了渲染出逼真的水波动画效果,还需要考虑随着动态对象在水中运动情况不断注入新的水波的过程,即水波的形成既包括水波的注入过程还包括水波的迭代过程。
接下来,结合图4对包含新的水波注入过程的图像渲染方法进行介绍。参见图4所示的图像渲染方法的流程图,其中,图4所示实施例为在图2所示实施例基础上改进得到的,下面仅针对图4与图2所示实施例的区别之处进行说明,该方法包括:
S401:当该动态对象在水中运动时,根据该动态对象对应的水波注入位移和水波扩散衰减矩阵,对该当前图像帧中动态对象所处的目标水体区域对应的水波图进行衰减处理得到衰减处理后的水波图。
在本实施例中,动态对象在水中运动时产生的水波注入可以通过水波注入位移和水波扩散衰减矩阵进行表征,其中,水波扩散衰减矩阵可以通过kernel表示,kernel可以用于描述水波注入的形状,例如,水波注入的形状可以是对称结构,其kernel如下所示:
(kernel为一个对称的5×5衰减矩阵,其具体数值在原说明书附图中给出)
在具体实现时,终端可以根据该动态对象的对象属性或该动态对象实施的动作类型中至少一项,获取该动态对象对应的水波注入位移。以动态对象为游戏角色对象为例,对象属性可以包括游戏角色对象的性别、体重等,动态对象实施的动作类型具体可以是跺脚、抬脚等,如此,终端根据游戏角色对象的性别、体重等以及实施动作类型如跺脚、抬脚等获取该游戏角色对象对应的水波注入位移。其中,不同对象属性或不同动作类型对应的水波注入位移可以是不同的,不同对象属性和不同动作类型对应的水波注入位移也可以是不同的,例如,女性游戏角色在水中跺脚产生的水波注入位移小于男性游戏角色在水中跺脚产生的水波注入位移。
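按对象属性与动作类型获取水波注入位移的做法,可以用一个简单的查找表示意(表中的属性、动作与位移数值均为本文虚构的示例,仅用于说明"女性角色跺脚的注入位移小于男性角色"这一关系):

```python
# 属性、动作到水波注入位移的示例映射(数值均为假设)
INJECT_TABLE = {
    ("female", "stomp"): 0.6,
    ("male", "stomp"): 1.0,
    ("male", "lift_foot"): 0.4,
}

def get_inject_displacement(attr, action, default=0.5):
    """根据动态对象的对象属性与实施的动作类型,
    查询对应的水波注入位移;查不到时返回默认值。"""
    return INJECT_TABLE.get((attr, action), default)
```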
在获取水波注入位移和水波扩散衰减矩阵后,终端可以对水波图进行衰减处理。具体地,终端获取该动态对象对应的水波注入位移和水波扩散衰减矩阵的乘积,作为衰减参数矩阵,然后将该衰减参数矩阵中的数值叠加至该当前图像帧中动态对象所处的目标水体区域对应的水波图中,得到衰减处理后的水波图。其中,在将衰减参数矩阵中的数值叠加至当前图像帧对应的水波图中时,可以先将衰减参数矩阵的中心点与水波图的中心点进行匹配,然后以中心点为基准,将衰减参数矩阵各个元素的数值叠加到水波图对应顶点中。
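将衰减参数矩阵(水波注入位移与kernel的乘积)以中心对齐方式叠加至水波图的过程,可以用如下代码示意(kernel的具体数值由原说明书附图给出,此处以假设的全1矩阵代替,函数名与参数均为示意):

```python
import numpy as np

def inject_wave(wave_map, kernel, displacement, cx, cy):
    """把 displacement * kernel(即衰减参数矩阵)以 (cx, cy) 为中心
    叠加到水波图 wave_map 中,越界部分直接忽略。"""
    atten = displacement * kernel          # 衰减参数矩阵
    kh, kw = kernel.shape
    top, left = cx - kh // 2, cy - kw // 2  # 中心点对齐后的左上角
    for i in range(kh):
        for j in range(kw):
            x, y = top + i, left + j
            if 0 <= x < wave_map.shape[0] and 0 <= y < wave_map.shape[1]:
                wave_map[x, y] += atten[i, j]
    return wave_map
```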
S402:根据该动态对象的位置偏移量,将该衰减处理后的水波图沿着该动态对象移动方向的反方向进行偏移,得到偏移处理后的水波图。
由于衰减处理后的水波图是以动态对象为中心的,如果直接利用该水波图进行迭代渲染,则渲染出的下一图像帧中水波始终跟随角色而移动,而真实的水波并不会随着角色而移动,而是滞后于角色。基于此,本实施例中提出了将衰减处理后的水波向动态对象移动方向的反方向偏移的方式,以营造水波滞后感,从而模拟逼真的水波效果。
下面结合图5对渲染水波滞后感的实现原理进行解释说明。
参见图5,图5的上半部分为偏移处理后的水波效果图,随着动态对象不断地向后移动,就会不断地有新的水波注入,为了营造出水波滞后感,针对每次新注入的水波,需要对其之前的所有水波作偏移处理;具体参见图5下半部分,先假设动态对象不动,每次有水波注入时,即将注入之前的水波向动态对象移动方向的反方向移动,然后根据动态对象的位置偏移量du,将水波图整体向动态对象移动方向移动,即可得到图5的上半部分所示的水波效果图。
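将衰减处理后的水波图沿动态对象移动方向的反方向整体平移的操作,可以用如下代码示意(以整数顶点偏移为例,移出边界的部分补零,均为本文的简化假设;实际实现中位置偏移量可以是连续量并配合插值):

```python
import numpy as np

def offset_wave_map(wave_map, dx, dy):
    """沿动态对象移动方向 (dx, dy) 的反方向平移水波图,
    移出区域之外补 0,以营造水波滞后于角色的效果。"""
    shifted = np.zeros_like(wave_map)
    h, w = wave_map.shape
    ox, oy = -dx, -dy  # 反方向偏移量
    # 仅拷贝平移后仍落在边界内的子区域
    src_x0, src_x1 = max(0, -ox), min(h, h - ox)
    src_y0, src_y1 = max(0, -oy), min(w, w - oy)
    shifted[src_x0 + ox:src_x1 + ox, src_y0 + oy:src_y1 + oy] = \
        wave_map[src_x0:src_x1, src_y0:src_y1]
    return shifted
```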
基于此,终端在获取当前图像帧中动态对象所处的目标水体区域对应的水波图和法线贴图时,可以根据该偏移处理后的水波图,获取下一图像帧中动态对象所处的目标水体区域对应的水波图和法线贴图。然后,终端基于下一图像帧中动态对象所处的目标水体区域对应的法线贴图渲染下一图像帧中动态对象所处的目标水体区域的水波效果。
由上可知,本申请实施例提供了一种图像渲染方法,该方法从水波形成的角度,从水波注入以及水波扩散两方面进行水面渲染,具体地,当动态对象在水中运动时,根据动态对象对应的水波注入位移和水波扩散衰减矩阵,对当前图像帧中动态对象所处的目标水体区域对应的水波图进行衰减处理,然后根据动态对象的位置偏移量,将衰减处理后的水波图整体沿着动态对象移动方向的反方向偏移,然后基于偏移处理后的水波图获取下一图像帧的法线贴图,根据下一图像帧中动态对象所处的目标水体区域对应的法线贴图,渲染下一图像帧中动态对象所处的目标水体区域的水波效果。通过一帧一帧地迭代渲染,能够模拟较为逼真的水波动画效果,并且由于仅对动态对象所处的目标水体区域进行处理,简化了复杂度,降低了运算量,能够较好地适配移动端应用。
接下来,将结合游戏应用场景,对本申请实施例提供的图像渲染方法进行介绍。参见图6所示的图像渲染方法的应用场景示意图,该场景中包括终端10,终端10具体可以是智能手机,用于对手游进行水面渲染。
具体地,游戏角色对象在水中游泳,水面被扰动进而产生水波,该水波的形成包括水波注入和水波扩散两个过程。终端10根据游戏角色对象的对象属性和游戏角色对象实施的动作获取该游戏角色对象对应的水波注入位移。
接着,终端10计算预先定义的、用于描述水波形状的水波扩散衰减矩阵kernel以及基于对象属性和实施的动作获取出的水波注入位移的乘积,作为衰减参数矩阵,在本实施例中,衰减参数矩阵为5*5的矩阵。终端10在获取当前图像帧中游戏角色对象所处的目标水体区域对应的水波图后,将衰减参数矩阵中的数值叠加至该水波图中,假设水波图的尺寸为25*15,则在进行叠加时,将衰减参数矩阵的中心点与水波图的中心点进行匹配,然后将衰减参数矩阵的行和列补零,使之与水波图的尺寸相一致,然后,再将处理后的衰减参数矩阵中的数值叠加至水波图。
为了营造水波滞后感,终端10将衰减处理后的水波图进行偏移处理,具体地,终端根据游戏角色对象的位置偏移量,将衰减处理后的水波图沿着该游戏角色对象移动方向的反方向进行偏移,得到偏移处理后的水波图。在该实施例中,游戏角色对象向东偏移1米,则终端10将衰减处理后的水波图整体向西偏移1米,得到偏移处理后的水波图。
接着,终端10从偏移处理后的水波图中读取水面高度信息,该水面高度信息包括当前图像帧及其上一帧各自对应的水面高度位移,然后将水面高度信息作为波动方程参数,通过带阻尼的波动方程和该波动方程参数获得下一图像帧对应的水面高度位移,根据该当前图像帧对应的水面高度位移和该下一图像帧对应的水面高度位移,生成下一图像帧中游戏角色对象所处的目标水体区域对应的水波图,再基于该下一图像帧中游戏角色对象所处的目标水体区域对应的水波图,获取该下一图像帧中各水面顶点分别与其位置相邻的上边顶点和右边顶点相比的水面高度位移差值,根据该水面高度位移差值获取该下一图像帧中动态对象所处的目标水体区域对应的法线贴图。
其中,法线贴图的R通道和G通道分别存储下一图像帧中各水面顶点与位置相邻的上边顶点的水面高度位移差值和各水面顶点与位置相邻的右边顶点的水面高度位移差值,该水面高度位移差值能够表征水面的凹凸程度,基于该法线贴图渲染下一图像帧中游戏角色对象所处的目标水体区域的水波效果。
在本实施例中,终端10获取出的下一图像帧的水波图可以用于迭代更新下下图像帧的水波图和法线贴图,如此,可以实现一帧一帧地迭代渲染,从而模拟出逼真的水波动画效果。具体请参见图7,游戏角色对象实施向东游的动作,水面被扰动,注入新的水波,而原有的水波则向游戏角色对象移动方向相反的方向偏移,即向西偏移,从而形成了图7中右图所示的水波滞后效果。
以上为本申请实施例提供的图像渲染方法的一些具体实现方式,基于此,本申请实施例还提供了一种图像渲染装置。接下来,将从功能模块化的角度对本申请实施例提供的图像渲染装置进行介绍。
参见图8所示的图像渲染装置的结构示意图,该装置700包括:
获取模块710,用于获取当前图像帧中动态对象所处的目标水体区域对应的水波图;
迭代模块720,用于根据当前图像帧中动态对象所处的目标水体区域对应的水波图,获取下一图像帧中动态对象所处的目标水体区域对应的法线贴图;
渲染模块730,用于根据该下一图像帧中动态对象所处的目标水体区域对应的法线贴图,渲染下一图像帧中动态对象所处的目标水体区域的水波效果。
可选的,参见图9,图9为本申请实施例提供的图像渲染装置的一个结构示意图,在图8所示结构的基础上,该装置700还包括:
衰减模块740,用于当该动态对象在水中运动时,根据该动态对象对应的水波注入位移和水波扩散衰减矩阵,对该当前图像帧中动态对象所处的目标水体区域对应的水波图进行衰减处理得到衰减处理后的水波图;
偏移模块750,用于根据该动态对象的位置偏移量,将该衰减处理后的水波图沿着该动态对象移动方向的反方向进行偏移,得到偏移处理后的水波图;
则该迭代模块720,具体用于根据该偏移处理后的水波图,获取下一图像帧中动态对象所处的目标水体区域对应的法线贴图。
可选的,参见图10,图10为本申请实施例提供的图像渲染装置的一个结构示意图,在图9所示结构的基础上,该装置700还包括:
第一获取模块760,用于根据该动态对象的对象属性或该动态对象实施的动作类型中至少一项,获取该动态对象对应的水波注入位移。
可选的,该衰减模块740具体用于:
获取该动态对象对应的水波注入位移和水波扩散衰减矩阵的乘积,作为衰减参数矩阵;
将该衰减参数矩阵中的数值叠加至该当前图像帧中动态对象所处的目标水体区域对应的水波图中,得到衰减处理后的水波图。
可选的,参见图11,图11为本申请实施例提供的图像渲染装置的一个结构示意图,在图8所示结构的基础上,该迭代模块720包括:
读取子模块721,用于从当前图像帧中动态对象所处的目标水体区域对应的水波图中,读取水面高度信息,该水面高度信息包括该当前图像帧及其上一帧各自对应的水面高度位移;
第一迭代子模块722,用于根据该水面高度信息,获取下一图像帧中动态对象所处的目标水体区域对应的水波图;
第二迭代子模块723,用于根据该下一图像帧中动态对象所处的目标水体区域对应的水波图,获取该下一图像帧中动态对象所处的目标水体区域对应的法线贴图。
需要说明的是,图11所示的图像渲染装置也可以在图9或图10所示结构基础上形成,本申请实施例对此不作限定。
可选的,该第一迭代子模块722具体用于:
将该水面高度信息作为波动方程参数,通过带阻尼的波动方程和该波动方程参数获得下一图像帧对应的水面高度位移;
根据该当前图像帧对应的水面高度位移和该下一图像帧对应的水面高度位移,生成下一图像帧中动态对象所处的目标水体区域对应的水波图。
可选的,该第二迭代子模块723具体用于:
根据该下一图像帧中动态对象所处的目标水体区域对应的水波图,获取该下一图像帧中各水面顶点分别与其位置相邻的上边顶点和右边顶点相比的水面高度位移差值,根据该水面高度位移差值获取该下一图像帧中动态对象所处的目标水体区域对应的法线贴图;或者,
根据该下一图像帧中动态对象所处的目标水体区域对应的水波图,获取该下一图像帧中各水面顶点分别与其位置相邻的左边顶点和下边顶点相比的水面高度位移差值,根据该水面高度位移差值获取该下一图像帧中动态对象所处的目标水体区域对应的法线贴图。
可选的,参见图12,图12为本申请实施例提供的图像渲染装置的一个结构示意图,在图8所示结构的基础上,该装置700还包括:
第二获取模块770,用于获取动态对象在图像帧中的图像位置,以该图像位置为该动态对象所处的目标水体区域的中心位置,以预设宽高为目标水体区域的宽高,获取在图像帧中动态对象所处的目标水体区域。
需要说明的是,图12所示的图像渲染装置也可以在图9或图10所示结构基础上形成,本申请实施例对此不作限定。
可选的,该水波图和该法线贴图采用两种颜色渲染贴图格式。
可选的,该图像帧包括游戏应用中的图像帧,该动态对象包括游戏角色对象。
由上可知,本申请实施例提供了一种图像渲染装置,该装置考虑水波扩散运动情况,基于当前图像帧中动态对象所处的目标水体区域对应的水波图获取下一图像帧中动态对象所处的目标水体区域对应的法线贴图,基于法线贴图渲染下一图像帧中目标水体区域的水波效果,一方面通过逐帧迭代渲染,能够模拟逼真的水波动画效果,使得水面交互非常自然,为用户提供沉浸式体验,另一方面,由于仅对目标水体区域进行水波渲染,简化了复杂度,降低了运算量,能够较好地适配移动端应用。
图8至图12所示实施例从功能模块化的角度对图像渲染装置进行介绍,本申请实施例还提供了一种用于图像渲染的处理设备,下面将从硬件实体化的角度对本申请实施例提供的设备进行介绍。
本申请实施例还提供了一种处理设备,该处理设备可以为终端,也可以为服务器,在此仅提供终端的结构示意,如图13所示,为了便于说明,仅示出了与本申请实施例相关的部分,具体技术细节未揭示的,请参照本申请实施例装置部分。该终端可以为包括手机、平板电脑、个人数字助理(英文全称:Personal Digital Assistant,英文缩写:PDA)、车载电脑等任意终端设备,以终端为手机为例:
图13示出的是与本申请实施例提供的终端相关的手机的部分结构的框图。参考图13,手机包括:射频(英文全称:Radio Frequency,英文缩写:RF)电路1210、存储器1220、输入单元1230、显示单元1240、传感器1250、音频电路1260、无线保真(英文全称:wireless fidelity,英文缩写:WiFi)模块1270、处理器1280、以及电源1290等部件。本领域技术人员可以理解,图13中示出的手机结构并不构成对手机的限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。
下面结合图13对手机的各个构成部件进行具体的介绍:
RF电路1210可用于收发信息或通话过程中,信号的接收和发送,特别地,将基站的下行信息接收后,给处理器1280处理;另外,将涉及上行的数据发送给基站。通常,RF电路1210包括但不限于天线、至少一个放大器、收发信机、耦合器、低噪声放大器(英文全称:Low Noise Amplifier,英文缩写:LNA)、双工器等。此外,RF电路1210还可以通过无线通信与网络和其他设备通信。上述无线通信可以使用任一通信标准或协议,包括但不限于全球移动通讯系统(英文全称:Global System of Mobile communication,英文缩写:GSM)、通用分组无线服务(英文全称:General Packet Radio Service,GPRS)、码分多址(英文全称:Code Division Multiple Access,英文缩写:CDMA)、宽带码分多址(英文全称:Wideband Code Division Multiple Access,英文缩写:WCDMA)、长期演进(英文全称:Long Term Evolution,英文缩写:LTE)、电子邮件、短消息服务(英文全称:Short Messaging Service,SMS)等。
存储器1220可用于存储软件程序以及模块,处理器1280通过运行存储在存储器1220的软件程序以及模块,从而执行手机的各种功能应用以及数据处理。存储器1220可主要包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)等;存储数据区可存储根据手机的使用所创建的数据(比如音频数据、电话本等)等。此外,存储器1220可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他易失性固态存储器件。
输入单元1230可用于接收输入的数字或字符信息,以及产生与手机的用户设置以及功能控制有关的键信号输入。具体地,输入单元1230可包括触控面板1231以及其他输入设备1232。触控面板1231,也称为触摸屏,可收集用户在其上或附近的触摸操作(比如用户使用手指、触笔等任何适合的物体或附件在触控面板1231上或在触控面板1231附近的操作),并根据预先设定的程式驱动相应的连接装置。可选的,触控面板1231可包括触摸检测装置和触摸控制器两个部分。其中,触摸检测装置检测用户的触摸方位,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成触点坐标,再送给处理器1280,并能接收处理器1280发来的命令并加以执行。此外,可以采用电阻式、电容式、红外线以及表面声波等多种类型实现触控面板1231。除了触控面板1231,输入单元1230还可以包括其他输入设备1232。具体地,其他输入设备1232可以包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆等中的一种或多种。
显示单元1240可用于显示由用户输入的信息或提供给用户的信息以及手机的各种菜单。显示单元1240可包括显示面板1241,可选的,可以采用液晶显示器(英文全称:Liquid Crystal Display,英文缩写:LCD)、有机发光二极管(英文全称:Organic Light-Emitting Diode,英文缩写:OLED)等形式来配置显示面板1241。进一步的,触控面板1231可覆盖显示面板1241,当触控面板1231检测到在其上或附近的触摸操作后,传送给处理器1280以获取触摸事件的类型,随后处理器1280根据触摸事件的类型在显示面板1241上提供相应的视觉输出。虽然在图13中,触控面板1231与显示面板1241是作为两个独立的部件来实现手机的输入和输出功能,但是在某些实施例中,可以将触控面板1231与显示面板1241集成而实现手机的输入和输出功能。
手机还可包括至少一种传感器1250,比如光传感器、运动传感器以及其他传感器。具体地,光传感器可包括环境光传感器及接近传感器,其中,环境光传感器可根据环境光线的明暗来调节显示面板1241的亮度,接近传感器可在手机移动到耳边时,关闭显示面板1241和/或背光。作为运动传感器的一种,加速计传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别手机姿态的应用(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等;至于手机还可配置的陀螺仪、气压计、湿度计、温度计、红外线传感器等其他传感器,在此不再赘述。
音频电路1260、扬声器1261,传声器1262可提供用户与手机之间的音频接口。音频电路1260可将接收到的音频数据转换后的电信号,传输到扬声器1261,由扬声器1261转换为声音信号输出;另一方面,传声器1262将收集的声音信号转换为电信号,由音频电路1260接收后转换为音频数据,再将音频数据输出处理器1280处理后,经RF电路1210以发送给比如另一手机,或者将音频数据输出至存储器1220以便进一步处理。
WiFi属于短距离无线传输技术,手机通过WiFi模块1270可以帮助用户收发电子邮件、浏览网页和访问流式媒体等,它为用户提供了无线的宽带互联网访问。虽然图13示出了WiFi模块1270,但是可以理解的是,其并不属于手机的必须构成,完全可以根据需要在不改变发明的本质的范围内而省略。
处理器1280是手机的控制中心,利用各种接口和线路连接整个手机的各个部分,通过运行或执行存储在存储器1220内的软件程序和/或模块,以及调用存储在存储器1220内的数据,执行手机的各种功能和处理数据,从而对手机进行整体监控。可选的,处理器1280可包括一个或多个处理单元;优选的,处理器1280可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器1280中。
手机还包括给各个部件供电的电源1290(比如电池),优选的,电源可以通过电源管理系统与处理器1280逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。
尽管未示出,手机还可以包括摄像头、蓝牙模块等,在此不再赘述。
在本申请实施例中,该终端所包括的处理器1280还具有以下功能:
获取当前图像帧中动态对象所处的目标水体区域对应的水波图;
根据该当前图像帧中动态对象所处的目标水体区域对应的水波图,获取下一图像帧中动态对象所处的目标水体区域对应的法线贴图;
根据该下一图像帧中动态对象所处的目标水体区域对应的法线贴图,渲染下一图像帧中动态对象所处的目标水体区域的水波效果。
可选的,处理器1280还用于执行本申请实施例提供的图像渲染方法的任意一种实现方式的步骤,具体如下:
可选的,该处理器1280还用于执行:
当该动态对象在水中运动时,根据该动态对象对应的水波注入位移和水波扩散衰减矩阵,对该当前图像帧中动态对象所处的目标水体区域对应的水波图进行衰减处理得到衰减处理后的水波图;
根据该动态对象的位置偏移量,将该衰减处理后的水波图沿着该动态对象移动方向的反方向进行偏移,得到偏移处理后的水波图;
则该处理器1280用于执行根据该偏移处理后的水波图,获取下一图像帧中动态对象所处的目标水体区域对应的法线贴图。
可选的,该处理器1280还用于执行根据该动态对象的对象属性或该动态对象实施的动作类型中至少一项,获取该动态对象对应的水波注入位移。
可选的,该处理器1280还用于执行:
获取该动态对象对应的水波注入位移和水波扩散衰减矩阵的乘积,作为衰减参数矩阵;
将该衰减参数矩阵中的数值叠加至该当前图像帧中动态对象所处的目标水体区域对应的水波图中,得到衰减处理后的水波图。
可选的,该处理器1280还用于执行:
从当前图像帧中动态对象所处的目标水体区域对应的水波图中,读取水面高度信息,该水面高度信息包括该当前图像帧及其上一帧各自对应的水面高度位移;
根据该水面高度信息,获取下一图像帧中动态对象所处的目标水体区域对应的水波图;
根据该下一图像帧中动态对象所处的目标水体区域对应的水波图,获取该下一图像帧中动态对象所处的目标水体区域对应的法线贴图。
可选的,该处理器1280还用于执行:
将该水面高度信息作为波动方程参数,通过带阻尼的波动方程和该波动方程参数获得下一图像帧对应的水面高度位移;
根据该当前图像帧对应的水面高度位移和该下一图像帧对应的水面高度位移,生成下一图像帧中动态对象所处的目标水体区域对应的水波图。
可选的,该处理器1280还用于执行:
根据该下一图像帧中动态对象所处的目标水体区域对应的水波图,获取该下一图像帧中各水面顶点分别与其位置相邻的上边顶点和右边顶点相比的水面高度位移差值,根据该水面高度位移差值获取该下一图像帧中动态对象所处的目标水体区域对应的法线贴图;或者,
根据该下一图像帧中动态对象所处的目标水体区域对应的水波图,获取该下一图像帧中各水面顶点分别与其位置相邻的左边顶点和下边顶点相比的水面高度位移差值,根据该水面高度位移差值获取该下一图像帧中动态对象所处的目标水体区域对应的法线贴图。
可选的,该处理器1280还用于执行:
获取动态对象在图像帧中的图像位置,以该图像位置为该动态对象所处的目标水体区域的中心位置,以预设宽高为目标水体区域的宽高,获取在图像帧中动态对象所处的目标水体区域。
可选的,该水波图和该法线贴图采用两种颜色渲染贴图格式。
可选的,该图像帧包括游戏应用中的图像帧,该动态对象包括游戏角色对象。
本申请实施例还提供一种计算机可读存储介质,用于存储程序代码,该程序代码由处理器执行以实现前述各个实施例所述的一种图像渲染方法中的任意一种实施方式。
本申请实施例还提供一种包括指令的计算机程序产品,当其在计算机上运行时,使得计算机执行前述各个实施例所述的一种图像渲染方法中的任意一种实施方式。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统,装置和单元的具体工作过程,可以参考前述装置实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,该单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
该作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
该集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(英文全称:Read-Only Memory,英文缩写:ROM)、随机存取存储器(英文全称:Random Access Memory,英文缩写:RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
应当理解,在本申请中,“至少一个(项)”是指一个或者多个,“多个”是指两个或两个以上。“和/或”,用于描述关联对象的关联关系,表示可以存在三种关系,例如,“A和/或B”可以表示:只存在A,只存在B以及同时存在A和B三种情况,其中A,B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。“以下至少一项(个)”或其类似表达,是指这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b或c中的至少一项(个),可以表示:a,b,c,“a和b”,“a和c”,“b和c”,或“a和b和c”,其中a,b,c可以是单个,也可以是多个。
以上所述,以上实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例技术方案的精神和范围。

Claims (20)

  1. 一种图像渲染方法,其特征在于,应用于处理设备,包括:
    获取当前图像帧中动态对象所处的目标水体区域对应的水波图;
    根据所述当前图像帧中动态对象所处的目标水体区域对应的水波图,获取下一图像帧中动态对象所处的目标水体区域对应的法线贴图;
    根据所述下一图像帧中动态对象所处的目标水体区域对应的法线贴图,渲染下一图像帧中动态对象所处的目标水体区域的水波效果。
  2. 根据权利要求1所述方法,其特征在于,所述方法还包括:
    当所述动态对象在水中运动时,根据所述动态对象对应的水波注入位移和水波扩散衰减矩阵,对所述当前图像帧中动态对象所处的目标水体区域对应的水波图进行衰减处理得到衰减处理后的水波图;
    根据所述动态对象的位置偏移量,将所述衰减处理后的水波图沿着所述动态对象移动方向的反方向进行偏移,得到偏移处理后的水波图;
    则根据当前图像帧中动态对象所处的目标水体区域对应的水波图,获取下一图像帧中动态对象所处的目标水体区域对应的法线贴图,包括:
    根据所述偏移处理后的水波图,获取下一图像帧中动态对象所处的目标水体区域对应的法线贴图。
  3. 根据权利要求2所述方法,其特征在于,通过以下方式获取所述动态对象对应的水波注入位移:
    根据所述动态对象的对象属性或所述动态对象实施的动作类型中至少一项,获取所述动态对象对应的水波注入位移。
  4. 根据权利要求2所述方法,其特征在于,所述根据所述动态对象对应的水波注入位移和水波扩散衰减矩阵,对所述当前图像帧中动态对象所处的目标水体区域对应的水波图进行衰减处理得到衰减处理后的水波图,包括:
    获取所述动态对象对应的水波注入位移和水波扩散衰减矩阵的乘积,作为衰减参数矩阵;
    将所述衰减参数矩阵中的数值叠加至所述当前图像帧中动态对象所处的目标水体区域对应的水波图中,得到衰减处理后的水波图。
  5. 根据权利要求1至4中任一项所述方法,其特征在于,所述根据当前图像帧中动态对象所处的目标水体区域对应的水波图,获取下一图像帧中动态对象所处的目标水体区域对应的法线贴图,包括:
    从当前图像帧中动态对象所处的目标水体区域对应的水波图中,读取水面高度信息,所述水面高度信息包括所述当前图像帧及其上一帧各自对应的水面高度位移;
    根据所述水面高度信息,获取下一图像帧中动态对象所处的目标水体区域对应的水波图;
    根据所述下一图像帧中动态对象所处的目标水体区域对应的水波图,获取所述下一图像帧中动态对象所处的目标水体区域对应的法线贴图。
  6. 根据权利要求5所述方法,其特征在于,所述根据所述水面高度信息,获取下一图像帧中动态对象所处的目标水体区域对应的水波图,包括:
    将所述水面高度信息作为波动方程参数,通过带阻尼的波动方程和所述波动方程参数获得下一图像帧对应的水面高度位移;
    根据所述当前图像帧对应的水面高度位移和所述下一图像帧对应的水面高度位移,生成下一图像帧中动态对象所处的目标水体区域对应的水波图。
  7. 根据权利要求5所述方法,其特征在于,所述根据所述下一图像帧中动态对象所处的目标水体区域对应的水波图,获取所述下一图像帧中动态对象所处的目标水体区域对应的法线贴图,包括:
    根据所述下一图像帧中动态对象所处的目标水体区域对应的水波图,获取所述下一图像帧中各水面顶点分别与其位置相邻的上边顶点和右边顶点相比的水面高度位移差值,根据所述水面高度位移差值获取所述下一图像帧中动态对象所处的目标水体区域对应的法线贴图;或者,
    根据所述下一图像帧中动态对象所处的目标水体区域对应的水波图,获取所述下一图像帧中各水面顶点分别与其位置相邻的左边顶点和下边顶点相比的水面高度位移差值,根据所述水面高度位移差值获取所述下一图像帧中动态对象所处的目标水体区域对应的法线贴图。
  8. 根据权利要求1至4中任一项所述方法,其特征在于,通过以下方式获取图像帧中动态对象所处的目标水体区域:
    获取动态对象在图像帧中的图像位置,以所述图像位置为所述动态对象所处的目标水体区域的中心位置,以预设宽高为目标水体区域的宽高,获取在图像帧中动态对象所处的目标水体区域。
  9. 根据权利要求1至4中任一项所述方法,其特征在于,所述水波图和所述法线贴图采用两种颜色渲染贴图格式。
  10. 根据权利要求1至4中任一项所述方法,其特征在于,所述图像帧包括游戏应用中的图像帧,所述动态对象包括游戏角色对象。
  11. 一种处理设备,其特征在于,所述处理设备包括处理器以及存储器:
    所述存储器用于存储程序代码,并将所述程序代码传输给所述处理器;
    所述处理器用于执行:
    获取当前图像帧中动态对象所处的目标水体区域对应的水波图;
    根据所述当前图像帧中动态对象所处的目标水体区域对应的水波图,获取下一图像帧中动态对象所处的目标水体区域对应的法线贴图;
    根据所述下一图像帧中动态对象所处的目标水体区域对应的法线贴图,渲染下一图像帧中动态对象所处的目标水体区域的水波效果。
  12. 根据权利要求11所述设备,其特征在于,所述处理器还用于执行:
    当所述动态对象在水中运动时,根据所述动态对象对应的水波注入位移和水波扩散衰减矩阵,对所述当前图像帧中动态对象所处的目标水体区域对应的水波图进行衰减处理得到衰减处理后的水波图;
    根据所述动态对象的位置偏移量,将所述衰减处理后的水波图沿着所述动态对象移动方向的反方向进行偏移,得到偏移处理后的水波图;
    则所述处理器用于执行根据所述偏移处理后的水波图,获取下一图像帧中动态对象所处的目标水体区域对应的法线贴图。
  13. 根据权利要求12所述设备,其特征在于,所述处理器还用于执行根据所述动态对象的对象属性或所述动态对象实施的动作类型中至少一项,获取所述动态对象对应的水波注入位移。
  14. 根据权利要求12所述设备,其特征在于,所述处理器还用于执行:
    获取所述动态对象对应的水波注入位移和水波扩散衰减矩阵的乘积,作为衰减参数矩阵;
    将所述衰减参数矩阵中的数值叠加至所述当前图像帧中动态对象所处的目标水体区域对应的水波图中,得到衰减处理后的水波图。
  15. 根据权利要求11至14中任一项所述设备,其特征在于,所述处理器还用于执行:
    从当前图像帧中动态对象所处的目标水体区域对应的水波图中,读取水面高度信息,所述水面高度信息包括所述当前图像帧及其上一帧各自对应的水面高度位移;
    根据所述水面高度信息,获取下一图像帧中动态对象所处的目标水体区域对应的水波图;
    根据所述下一图像帧中动态对象所处的目标水体区域对应的水波图,获取所述下一图像帧中动态对象所处的目标水体区域对应的法线贴图。
  16. 根据权利要求15所述设备,其特征在于,所述处理器还用于执行:
    将所述水面高度信息作为波动方程参数,通过带阻尼的波动方程和所述波动方程参数获得下一图像帧对应的水面高度位移;
    根据所述当前图像帧对应的水面高度位移和所述下一图像帧对应的水面高度位移,生成下一图像帧中动态对象所处的目标水体区域对应的水波图。
  17. 根据权利要求15所述设备,其特征在于,所述处理器还用于执行:
    根据所述下一图像帧中动态对象所处的目标水体区域对应的水波图,获取所述下一图像帧中各水面顶点分别与其位置相邻的上边顶点和右边顶点相比的水面高度位移差值,根据所述水面高度位移差值获取所述下一图像帧中动态对象所处的目标水体区域对应的法线贴图;或者,
    根据所述下一图像帧中动态对象所处的目标水体区域对应的水波图,获取所述下一图像帧中各水面顶点分别与其位置相邻的左边顶点和下边顶点相比的水面高度位移差值,根据所述水面高度位移差值获取所述下一图像帧中动态对象所处的目标水体区域对应的法线贴图。
  18. 根据权利要求11至14中任一项所述设备,其特征在于,所述处理器还用于执行:
    获取动态对象在图像帧中的图像位置,以所述图像位置为所述动态对象所处的目标水体区域的中心位置,以预设宽高为目标水体区域的宽高,获取在图像帧中动态对象所处的目标水体区域。
  19. 根据权利要求11至14中任一项所述设备,其特征在于,所述水波图和所述法线贴图采用两种颜色渲染贴图格式。
  20. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质用于存储程序代码,所述程序代码由处理器执行以实现权利要求1-10任一项所述的图像渲染方法。
PCT/CN2019/120729 2018-12-07 2019-11-25 图像渲染方法、设备及存储介质 WO2020114271A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP19892415.1A EP3822918A4 (en) 2018-12-07 2019-11-25 IMAGE RENDERING APPARATUS AND METHOD, AND INFORMATION MEDIA
US17/178,437 US11498003B2 (en) 2018-12-07 2021-02-18 Image rendering method, device, and storage medium
US17/959,633 US11826649B2 (en) 2018-12-07 2022-10-04 Water wave rendering of a dynamic object in image frames

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811497482.0 2018-12-07
CN201811497482.0A CN109598777B (zh) 2018-12-07 2018-12-07 图像渲染方法、装置、设备及存储介质

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/178,437 Continuation US11498003B2 (en) 2018-12-07 2021-02-18 Image rendering method, device, and storage medium

Publications (1)

Publication Number Publication Date
WO2020114271A1 true WO2020114271A1 (zh) 2020-06-11

Family

ID=65961551

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/120729 WO2020114271A1 (zh) 2018-12-07 2019-11-25 图像渲染方法、设备及存储介质

Country Status (4)

Country Link
US (2) US11498003B2 (zh)
EP (1) EP3822918A4 (zh)
CN (1) CN109598777B (zh)
WO (1) WO2020114271A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112348881A (zh) * 2020-11-12 2021-02-09 网易(杭州)网络有限公司 图像渲染方法、装置和电子设备

Families Citing this family (22)

Publication number Priority date Publication date Assignee Title
CN109598777B (zh) 2018-12-07 2022-12-23 腾讯科技(深圳)有限公司 图像渲染方法、装置、设备及存储介质
CN110097619B (zh) * 2019-04-30 2022-12-13 腾讯科技(深圳)有限公司 应用程序中的动画效果实现方法、装置及设备
CN112214187B (zh) * 2019-07-11 2022-04-05 北京字节跳动网络技术有限公司 水波纹图像实现方法及装置
CN112396671B (zh) * 2019-08-16 2024-01-30 北京字节跳动网络技术有限公司 水波纹效果实现方法、装置、电子设备和计算机可读存储介质
CN110992456B (zh) * 2019-11-19 2021-09-07 浙江大学 一种基于位置动力学的雪崩模拟方法
CN111028323B (zh) * 2019-11-27 2023-08-15 深圳奇迹智慧网络有限公司 地图中水波纹的模拟方法、装置、设备和可读存储介质
CN111009026B (zh) * 2019-12-24 2020-12-01 腾讯科技(深圳)有限公司 对象渲染方法和装置、存储介质及电子装置
US11276227B2 (en) 2019-12-24 2022-03-15 Tencent Technology (Shenzhen) Company Limited Object rendering method and apparatus, storage medium, and electronic device using a simulated pre-integration map
CN111242838B (zh) * 2020-01-09 2022-06-03 腾讯科技(深圳)有限公司 模糊图像渲染方法和装置、存储介质及电子装置
CN111383311B (zh) * 2020-03-06 2024-03-01 网易(杭州)网络有限公司 法线贴图生成方法、装置、设备及存储介质
CN111508047B (zh) * 2020-04-21 2023-08-22 网易(杭州)网络有限公司 一种动画数据处理方法和装置
CN112435304B (zh) * 2020-07-20 2023-03-14 上海哔哩哔哩科技有限公司 水体交互贴图方法及系统
CN111986303A (zh) * 2020-09-09 2020-11-24 网易(杭州)网络有限公司 流体渲染方法、装置、存储介质及终端设备
CN112221150B (zh) * 2020-10-19 2023-01-10 珠海金山数字网络科技有限公司 一种虚拟场景中的涟漪仿真方法及装置
CN112446941A (zh) * 2020-11-27 2021-03-05 网易(杭州)网络有限公司 图像处理方法、装置、电子设备及存储介质
CN112465946B (zh) * 2020-12-08 2023-07-14 网易(杭州)网络有限公司 波纹的渲染方法、装置、电子设备及计算机可读介质
CN112860063B (zh) * 2021-02-02 2022-04-29 杭州电魂网络科技股份有限公司 交互水的实现方法、系统、电子装置和存储介质
CN113457132B (zh) * 2021-06-23 2024-03-01 北京达佳互联信息技术有限公司 对象投放方法、装置、电子设备及存储介质
CN113450443B (zh) * 2021-07-08 2023-03-24 网易(杭州)网络有限公司 海面模型的渲染方法和装置
CN114288647B (zh) * 2021-12-31 2022-07-08 深圳方舟互动科技有限公司 基于AI Designer的人工智能游戏引擎、游戏渲染方法及装置
CN116617658B (zh) * 2023-07-20 2023-10-20 腾讯科技(深圳)有限公司 一种图像渲染方法及相关装置
CN117274465B (zh) * 2023-11-22 2024-03-08 园测信息科技股份有限公司 匹配真实地理水域环境的水体渲染方法、系统、介质和设备

Citations (4)

Publication number Priority date Publication date Assignee Title
US20100045669A1 (en) * 2008-08-20 2010-02-25 Take Two Interactive Software, Inc. Systems and method for visualization of fluids
CN105913471A (zh) * 2016-04-06 2016-08-31 腾讯科技(深圳)有限公司 图片处理的方法和装置
CN105912234A (zh) * 2016-04-06 2016-08-31 腾讯科技(深圳)有限公司 虚拟场景的交互方法和装置
CN109598777A (zh) * 2018-12-07 2019-04-09 腾讯科技(深圳)有限公司 图像渲染方法、装置、设备及存储介质

Family Cites Families (37)

Publication number Priority date Publication date Assignee Title
JP2000200361A (ja) * 1998-08-07 2000-07-18 Sega Enterp Ltd 画像処理装置及び情報記録媒体
EP1136106A3 (en) * 2000-03-21 2001-12-05 Sony Computer Entertainment Inc. Entertainment apparatus, storage medium, and method of displaying an object
DE60112016T2 (de) * 2000-03-21 2006-04-20 Sony Computer Entertainment Inc. Unterhaltungsvorrichtung, speichermedium und verfahren zur wetterbestimmung
US6985148B2 (en) * 2001-12-13 2006-01-10 Microsoft Corporation Interactive water effects using texture coordinate shifting
US20060177122A1 (en) * 2005-02-07 2006-08-10 Sony Computer Entertainment Inc. Method and apparatus for particle manipulation using graphics processing
US8232999B2 (en) * 2008-01-22 2012-07-31 Dreamworks Animation Llc Fast oceans at near infinite resolution
US8204725B1 (en) * 2008-07-25 2012-06-19 Nvidia Corporation Real-time breaking waves for shallow water simulations
US9138649B2 (en) * 2008-10-08 2015-09-22 Sony Corporation Game control program, game device, and game control method adapted to control game where objects are moved in game field
CN102663245A (zh) * 2012-03-30 2012-09-12 福建天趣网络科技有限公司 3d游戏世界编辑器
SE537064C2 (sv) * 2012-05-04 2014-12-23 Cr Dev Ab Analys för kvantifiering av mikroskopisk anisotropisk diffusion
US9177419B2 (en) * 2012-06-27 2015-11-03 Pixar Advection of UV texture maps in fluid flows
CN103034765B (zh) * 2012-12-14 2015-08-05 天津大学 基于数值模拟的采空区注浆动态全过程仿真方法
US9811941B1 (en) * 2013-03-13 2017-11-07 Lucasfilm Entertainment Company Ltd. High resolution simulation of liquids
CN103474007B (zh) * 2013-08-27 2015-08-19 湖南华凯文化创意股份有限公司 一种互动式显示方法及系统
JP6513984B2 (ja) * 2015-03-16 2019-05-15 株式会社スクウェア・エニックス プログラム、記録媒体、情報処理装置及び制御方法
US10970843B1 (en) * 2015-06-24 2021-04-06 Amazon Technologies, Inc. Generating interactive content using a media universe database
US9940689B2 (en) * 2015-11-02 2018-04-10 Nvidia Corporation Latency-resistant sparse simulation technique, system and method
JP6732439B2 (ja) * 2015-12-03 2020-07-29 株式会社バンダイナムコエンターテインメント プログラム及び画像生成システム
JP6441843B2 (ja) * 2016-02-24 2018-12-19 株式会社カプコン ゲームプログラムおよびゲームシステム
JP6441844B2 (ja) * 2016-02-24 2018-12-19 株式会社カプコン ゲームプログラムおよびゲームシステム
JP6397436B2 (ja) * 2016-02-24 2018-09-26 株式会社カプコン ゲームプログラムおよびゲームシステム
WO2017174006A1 (zh) * 2016-04-06 2017-10-12 腾讯科技(深圳)有限公司 图片处理的方法和装置
CA2968589C (en) * 2016-06-10 2023-08-01 Square Enix Ltd. System and method for placing a character animation at a location in a game environment
US10073026B2 (en) * 2016-07-05 2018-09-11 The United States Of America, As Represented By The Secretary Of Commerce Optical particle sorter
CN106110656B (zh) * 2016-07-07 2020-01-14 网易(杭州)网络有限公司 在游戏场景计算路线的方法和装置
US10002456B2 (en) * 2016-07-08 2018-06-19 Wargaming.Net Limited Water surface rendering in virtual environment
CN106993200B (zh) * 2017-04-18 2019-05-31 腾讯科技(深圳)有限公司 一种数据的直播方法、相关设备及系统
US10744410B2 (en) * 2017-08-24 2020-08-18 Nintendo Co., Ltd. Storage medium, information processing apparatus, image processing method, and information processing system
CN109509243B (zh) * 2017-09-13 2022-11-11 腾讯科技(深圳)有限公司 一种液体仿真方法、液体交互方法及装置
US11534688B2 (en) * 2018-04-02 2022-12-27 Take-Two Interactive Software, Inc. Method and apparatus for enhanced graphics rendering in a video game environment
US11010509B2 (en) * 2018-05-23 2021-05-18 Nvidia Corporation Systems and methods for computer simulation of detailed waves for large-scale water simulation
US10950010B2 (en) * 2019-01-15 2021-03-16 Square Enix Ltd. Dynamic levels of destructive detail in electronic game display
US11625884B2 (en) * 2019-06-18 2023-04-11 The Calany Holding S. À R.L. Systems, methods and apparatus for implementing tracked data communications on a chip
CN110339562B (zh) * 2019-08-01 2023-09-15 腾讯科技(深圳)有限公司 虚拟对象的控制方法、装置、终端及存储介质
US11217002B2 (en) * 2020-02-28 2022-01-04 Weta Digital Limited Method for efficiently computing and specifying level sets for use in computer simulations, computer graphics and other purposes
US20220008826A1 (en) * 2020-07-07 2022-01-13 Electronic Arts Inc. Strand Simulation in Multiple Levels
US20220134222A1 (en) * 2020-11-03 2022-05-05 Nvidia Corporation Delta propagation in cloud-centric platforms for collaboration and connectivity

Non-Patent Citations (1)

Title
See also references of EP3822918A4 *

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN112348881A (zh) * 2020-11-12 2021-02-09 网易(杭州)网络有限公司 图像渲染方法、装置和电子设备
CN112348881B (zh) * 2020-11-12 2023-06-30 网易(杭州)网络有限公司 图像渲染方法、装置和电子设备

Also Published As

Publication number Publication date
US11498003B2 (en) 2022-11-15
US20230098249A1 (en) 2023-03-30
CN109598777A (zh) 2019-04-09
US11826649B2 (en) 2023-11-28
EP3822918A1 (en) 2021-05-19
US20210170278A1 (en) 2021-06-10
EP3822918A4 (en) 2021-12-15
CN109598777B (zh) 2022-12-23


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19892415

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019892415

Country of ref document: EP

Effective date: 20210215

NENP Non-entry into the national phase

Ref country code: DE